English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 6h 3m | 790 MB
Everything you need to know about using the tools, libraries, and models at Hugging Face, from Transformers to RAG, LangChain, and Gradio.
Hugging Face in Action reveals how to get the absolute best out of everything Hugging Face, from accessing state-of-the-art models to building intuitive frontends for AI apps.
With Hugging Face in Action you’ll learn how to:
- Use Hugging Face Transformers and pipelines for NLP tasks (see the sketch after this list)
- Apply Hugging Face techniques to computer vision projects
- Manipulate Hugging Face Datasets for efficient data handling
- Train machine learning models with AutoTrain
- Implement AI agents for autonomous task execution
- Develop LLM-based applications using LangChain and LlamaIndex
- Construct LangChain applications visually with Langflow
- Create web-based user interfaces using Gradio
- Build locally running LLM-based applications with GPT4All
- Query local data using large language models
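For the first item, a minimal sketch of the Transformers pipeline API is below; the sentiment-analysis task and the checkpoint it names are illustrative assumptions rather than the book’s exact example.

```python
# Minimal sketch: a Transformers pipeline for an NLP task (sentiment analysis).
# Assumes `pip install transformers torch`; the checkpoint named here is an
# illustrative choice, not one prescribed by the book.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Hugging Face makes prototyping AI apps painless."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```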
Want a cutting-edge transformer library? Hugging Face’s open-source offering is best in class. Need somewhere to host your models? Hugging Face Spaces has you covered. Do your users need an intuitive frontend for your AI app? Hugging Face’s Gradio library makes it easy to build UIs using the Python skills you already have. In Hugging Face in Action you’ll learn how to take full advantage of all of Hugging Face’s amazing features to quickly and reliably prototype and productionize AI applications.
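As a concrete illustration of that last point, here is a minimal Gradio sketch (assuming `pip install gradio transformers`) that puts a web UI in front of a pipeline; the function name and title are illustrative, not taken from the book.

```python
# Minimal sketch: a Gradio web UI wrapping a Transformers pipeline.
# Assumes `pip install gradio transformers torch`; names are illustrative.
import gradio as gr
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")

def classify(text: str) -> str:
    result = sentiment(text)[0]
    return f"{result['label']} ({result['score']:.2f})"

demo = gr.Interface(fn=classify, inputs="text", outputs="text",
                    title="Sentiment demo")

if __name__ == "__main__":
    demo.launch()  # serves a local web UI, typically at http://127.0.0.1:7860
```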
Hugging Face is an incredible open-source ecosystem for AI engineers and data scientists, providing hundreds of thousands of pre-trained models, datasets, tools, and libraries. It’s also a central hub for collaborating on leading-edge AI research. Hugging Face is a massive platform, and this book will help you take full advantage of all it has to offer.
Hugging Face in Action teaches you how to build end-to-end AI systems using resources from the Hugging Face community. In it, you’ll create multiple projects, including an object detection model, a RAG Q&A application, an LLM-powered chatbot, and more. You’ll appreciate the clear, accessible explanations, along with thoughtful introductions to key technologies like LangChain, LlamaIndex, and Gradio.
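To give a flavor of the object-detection project, the sketch below runs a pretrained detector through the same pipeline API; the DETR checkpoint and the sample image URL are assumptions for illustration, not necessarily the book’s choices.

```python
# Minimal sketch: object detection with a pretrained model via a pipeline.
# Assumes `pip install transformers timm torch pillow`; the checkpoint and
# image URL are illustrative assumptions.
from transformers import pipeline

detector = pipeline("object-detection", model="facebook/detr-resnet-50")

results = detector("http://images.cocodataset.org/val2017/000000039769.jpg")
for obj in results:
    print(obj["label"], round(obj["score"], 3), obj["box"])
```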
What’s Inside
- How to navigate the huge Hugging Face library of models and tools
- How to run LLMs locally using GPT4All (see the sketch after this list)
- How to create web-based user interfaces using Gradio
- How to improve models using Hugging Face datasets
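For the GPT4All item above, here is a minimal sketch of local generation with the gpt4all Python bindings (assuming `pip install gpt4all`; the model filename is an illustrative assumption, and any GGUF model from the GPT4All catalog could be substituted):

```python
# Minimal sketch: running an LLM locally with the gpt4all Python bindings.
# Assumes `pip install gpt4all`; the model filename is an illustrative
# assumption -- the file is downloaded to the local cache on first use.
from gpt4all import GPT4All

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

with model.chat_session():
    reply = model.generate(
        "Explain what Hugging Face Spaces is in one sentence.",
        max_tokens=120,
    )
    print(reply)
```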
Table of Contents
1 Chapter 1. Introducing Hugging Face
2 Chapter 1. Hugging Face models
3 Chapter 1. Hugging Face Gradio Python library
4 Chapter 1. Understanding the Hugging Face mental model
5 Chapter 1. Summary
6 Chapter 2. Getting started
7 Chapter 2. Installing the Transformers library
8 Chapter 2. Installing the Hugging Face Hub package
9 Chapter 2. Summary
10 Chapter 3. Using Hugging Face transformers and pipelines for NLP tasks
11 Chapter 3. Working with the Transformers library
12 Chapter 3. Using transformers for NLP tasks
13 Chapter 3. Summary
14 Chapter 4. Using Hugging Face for computer vision tasks
15 Chapter 4. Object detection
16 Chapter 4. Image classification
17 Chapter 4. Image segmentation
18 Chapter 4. Video classification
19 Chapter 4. Summary
20 Chapter 5. Exploring, tokenizing, and visualizing Hugging Face datasets
21 Chapter 5. Tokenization in NLP
22 Chapter 5. Visualizing datasets
23 Chapter 5. Summary
24 Chapter 6. Fine-tuning pretrained models and working with multimodal models
25 Chapter 6. Working with multimodal models
26 Chapter 6. Summary
27 Chapter 7. Creating LLM-based applications using LangChain and LlamaIndex
28 Chapter 7. Introducing LangChain
29 Chapter 7. Connecting LLMs to your private data using LlamaIndex
30 Chapter 7. Summary
31 Chapter 8. Building LangChain applications visually using Langflow
32 Chapter 8. Creating a new Langflow project
33 Chapter 8. Asking questions on your own data
34 Chapter 8. Using your project programmatically
35 Chapter 8. Summary
36 Chapter 9. Programming agents
37 Chapter 9. Developing agents using smolagents
38 Chapter 9. Developing agents with LangChain
39 Chapter 9. Developing agents using LangGraph
40 Chapter 9. Summary
41 Chapter 10. Building a web-based UI using Gradio
42 Chapter 10. Working with widgets
43 Chapter 10. Creating a chatbot UI
44 Chapter 10. Summary
45 Chapter 11. Building locally running LLM-based applications using GPT4All
46 Chapter 11. Installing GPT4All
47 Chapter 11. Summary
48 Chapter 12. Using LLMs to query your local data
49 Chapter 12. Using LLMs to write code to analyze your data
50 Chapter 12. Summary
51 Chapter 13. Bridging LLMs to the real world with the Model Context Protocol
52 Chapter 13. Building an MCP server
53 Chapter 13. Testing the MCP server using Claude Desktop
54 Chapter 13. Trying third-party MCP servers
55 Chapter 13. Summary
