
Building Transformer-Based Natural Language Processing Applications

Data Science Hub / NVIDIA Workshop

Event Details

Date: Monday, October 24, 2022
Time: 8:30 a.m.-12:30 p.m.
Description

Transformer-based models, such as Bidirectional Encoder Representations from Transformers (BERT), have revolutionized natural language processing (NLP) by offering accuracy comparable to human baselines on benchmarks such as SQuAD for question answering, as well as on entity recognition, intent recognition, sentiment analysis, and other tasks. In this workshop, you’ll learn how to use Transformer-based NLP models for text classification and named-entity recognition (NER) tasks. You’ll also learn how to analyze various model features, constraints, and characteristics to determine which model is best suited for a particular use case based on metrics, domain specificity, and available resources.
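
The hands-on labs use NVIDIA's GPU-accelerated tooling; purely to illustrate the two tasks named above, the minimal sketch below uses the open-source Hugging Face transformers library (an illustrative assumption, not the workshop's official stack) to run text classification and NER with pre-trained BERT-family models.

    # Illustrative sketch only -- not workshop lab code. Assumes the Hugging Face
    # `transformers` package is installed; default pre-trained checkpoints are used.
    from transformers import pipeline

    # Text classification: a pre-trained sentiment model scores a sentence.
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformer models have made NLP applications far more accurate."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

    # Named-entity recognition: a pre-trained BERT-based NER model tags spans.
    ner = pipeline("ner", aggregation_strategy="simple")
    print(ner("NVIDIA is hosting this workshop on October 24, 2022."))
    # e.g. "NVIDIA" tagged as an organization; exact labels depend on the checkpoint.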

When: October 24-25, 2022, 8:30 a.m.-12:30 p.m. (CT).

Where: This workshop takes place virtually via Zoom. Registrants will receive the Zoom link in an email one week prior to the workshop.

Registration: https://www.eventbrite.com/e/building-transformer-based-natural-language-processing-applications-tickets-407036646567

Learning Objectives

  • Understand how text embedding techniques for NLP have rapidly evolved, from Word2Vec to recurrent neural network (RNN)-based embeddings to Transformers
  • See how Transformer architecture features, especially self-attention, are used to create language models without RNNs (a minimal self-attention sketch follows this list)
  • Understand how self-supervision improves on the Transformer architecture in BERT, Megatron, and other variants for superior NLP results
  • Leverage pre-trained, modern NLP models to solve multiple tasks such as text classification, NER, and question answering
  • Manage inference challenges and deploy refined models for live applications
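
For the self-attention objective above, the short PyTorch sketch below shows the scaled dot-product attention computation at the core of the Transformer architecture; the function name, tensor shapes, and random weights are illustrative assumptions, not taken from any workshop lab.

    # A minimal sketch of scaled dot-product self-attention (single head, no batching).
    import torch
    import torch.nn.functional as F

    def self_attention(x, w_q, w_k, w_v):
        """x: (seq_len, d_model); w_q/w_k/w_v: (d_model, d_k) projection matrices."""
        q, k, v = x @ w_q, x @ w_k, x @ w_v                   # project tokens to queries/keys/values
        scores = q @ k.transpose(0, 1) / k.shape[-1] ** 0.5   # scaled dot products between all token pairs
        weights = F.softmax(scores, dim=-1)                   # each token attends over every other token
        return weights @ v                                    # attention-weighted sum of values

    seq_len, d_model, d_k = 5, 16, 8
    x = torch.randn(seq_len, d_model)                         # toy token embeddings
    w_q, w_k, w_v = (torch.randn(d_model, d_k) for _ in range(3))
    print(self_attention(x, w_q, w_k, w_v).shape)             # torch.Size([5, 8])

Because every token attends to every other token in a single step, this mechanism replaces the sequential recurrence of RNNs, which is the point the objective above highlights.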

Prerequisites

  • Experience with Python coding and use of library functions and parameters
  • Fundamental understanding of a deep learning framework such as TensorFlow, PyTorch, or Keras
  • Basic understanding of neural networks

Suggested materials to satisfy prerequisites: Python Tutorial, Overview of Deep Learning Frameworks, PyTorch Tutorial, Deep Learning in a Nutshell, Deep Learning Demystified

Hardware Requirements: Desktop or laptop computer capable of running the latest version of Chrome or Firefox. Each participant will be provided with dedicated access to a fully configured, GPU-accelerated server in the cloud.

Agenda: To review the full workshop agenda, please visit https://www.nvidia.com/en-us/training/instructor-led-workshops/natural-language-processing/.

Certificate: Upon successful completion of the assessment, participants will receive an NVIDIA DLI certificate to recognize their subject matter competency and support professional career growth.

Cost
$10
