NLP with Hugging Face

Generative AI

This project demonstrates the use of the Hugging Face Transformers library for Natural Language Processing (NLP) tasks. It begins with an introduction to the pipeline API and what happens behind the scenes when a pipeline runs. It then examines the architecture of transformer models and their tokenizers, showing how to handle multiple sequences and preprocess textual data effectively. Finally, it covers model fine-tuning with the Trainer API and demonstrates how to train models on GPUs and other hardware accelerators for improved performance and scalability.
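As a minimal sketch of the pipeline API described above (assuming the `transformers` package is installed and the default sentiment-analysis checkpoint can be downloaded), a pipeline bundles the tokenizer, model, and post-processing into a single call:

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; pinning the model avoids relying
# on the library's default-checkpoint warning (this checkpoint is the
# usual default for this task).
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# The pipeline tokenizes the input, runs the model, and converts logits
# into a labeled prediction with a confidence score.
result = classifier("Hugging Face makes NLP accessible.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': ...}]
```

Behind the scenes, this single call performs the tokenization, model forward pass, and output post-processing steps that the later sections of the project break apart and examine individually.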

Table of contents