Building Transformer-Based Natural Language Processing Applications

REPRESENTATION, CLASSIFICATION, SCALING

Instructors

Luigi Troiano
University of Salerno

Francesco Gissi
University of Sannio

Vincenzo Benedetto
University of Sannio

Full-day Workshop

2020 EVENTS

In this workshop, you’ll learn how to use Transformer-based natural language processing models for text classification tasks, such as categorizing documents. You’ll also learn how to leverage Transformer-based models for named-entity recognition (NER) tasks and how to analyze various model features, constraints, and characteristics to determine which model is best suited for a particular use case based on metrics, domain specificity, and available resources.

By participating in this workshop, you’ll be able to:

  • Understand how text embeddings for NLP tasks have rapidly evolved, from Word2Vec to recurrent neural network (RNN)-based embeddings to Transformers

  • See how Transformer architecture features, especially self-attention, are used to create language models without RNNs

  • Use self-supervision to improve the Transformer architecture in BERT, Megatron, and other variants for superior NLP results

  • Leverage pre-trained, modern NLP models to solve multiple tasks such as text classification, NER, and question answering (a sketch follows this list)

  • Manage inference challenges and deploy refined models for live applications
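
For a concrete sense of the pre-trained-model objective above, the sketch below applies off-the-shelf Transformer models to text classification and NER. It is a minimal illustration assuming the Hugging Face transformers library and its default checkpoints, an illustrative stand-in rather than the workshop's own material; the sample sentences are invented.

    # Minimal sketch: pre-trained Transformers for classification and NER.
    # Assumes the Hugging Face `transformers` library and its default
    # checkpoints (an illustrative choice, not workshop material).
    from transformers import pipeline

    # Text classification: label a sentence with a pre-trained model.
    classifier = pipeline("sentiment-analysis")
    print(classifier("Self-attention makes these models remarkably effective."))

    # Named-entity recognition: tag people, places, and organizations.
    ner = pipeline("ner")
    print(ner("Luigi Troiano teaches at the University of Salerno."))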

Prerequisites:

  • Experience with Python coding and use of library functions and parameters

  • Fundamental understanding of a deep learning framework such as TensorFlow, PyTorch, or Keras

  • Basic understanding of neural networks

Tools, libraries, and frameworks: cuDF, CuPy, TensorFlow 2, and NVIDIA Triton™ Inference Server
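
Of these, cuDF is the data-preparation piece: it mirrors the pandas API on the GPU. Below is a minimal sketch, assuming a CUDA-capable machine with RAPIDS cuDF installed; the sample text is invented for illustration.

    # Minimal sketch: GPU text preprocessing with RAPIDS cuDF.
    # Assumes a CUDA-capable GPU with cuDF installed; sample text invented.
    import cudf

    # A small GPU dataframe of raw documents.
    df = cudf.DataFrame({"text": ["GPUs accelerate NLP pipelines.",
                                  "Transformers scale with data."]})

    # String operations run on the GPU and mirror the pandas API.
    df["clean"] = df["text"].str.lower().str.replace(".", "", regex=False)
    print(df)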

OCT 21
09h30 (CEST)
ONLINE

JAN 09, 2020
10h00
University of Sannio
Department of Engineering
Laboratorio Polifunzionale
C.so Garibaldi, 107
82100 Benevento (Italy)