Update README.md

Maxime Labonne, 2 years ago
commit 8fd411e472

1 changed file with 18 additions and 0 deletions: README.md (+18 −0)

@@ -16,6 +16,8 @@ A step-by-step guide on how to get into large language models with learning reso
 
 ![](images/roadmap.png)
 
+---
+
 ### 1. Mathematics for Machine Learning
 
 Before mastering machine learning, it is important to understand the fundamental mathematical concepts that power these algorithms.
@@ -33,6 +35,8 @@ Before mastering machine learning, it is important to understand the fundamental
 - [Khan Academy - Calculus](https://www.khanacademy.org/math/calculus-1): An interactive course that covers all the basics of calculus.
 - [Khan Academy - Probability and Statistics](https://www.khanacademy.org/math/statistics-probability): Delivers the material in an easy-to-understand format.
 
+---
+
 ### 2. Python for Machine Learning
 
 Python is a powerful and flexible programming language that's particularly good for machine learning, thanks to its readability, consistency, and robust ecosystem of data science libraries.
@@ -50,6 +54,8 @@ Python is a powerful and flexible programming language that's particularly good
 - [freeCodeCamp - Machine Learning for Everybody](https://youtu.be/i_LwzRVP7bg): Practical introduction to different machine learning algorithms for beginners.
 - [Udacity - Intro to Machine Learning](https://www.udacity.com/course/intro-to-machine-learning--ud120): Free course that covers PCA and several other machine learning concepts.
 
+---
+
 ### 3. Neural Networks
 
 Neural networks are a fundamental part of many machine learning models, particularly in the realm of deep learning. To utilize them effectively, a comprehensive understanding of their design and mechanics is essential.
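To make "design and mechanics" concrete, here is a minimal sketch of the kind of model the PyTorch resources in this section build: a small feed-forward network with one forward and one backward pass. It assumes PyTorch is installed; the layer sizes and the random batch are arbitrary placeholders.

```python
import torch
import torch.nn as nn

# A tiny multi-layer perceptron: linear layers with a ReLU non-linearity.
class TinyMLP(nn.Module):
    def __init__(self, in_dim=784, hidden=128, out_dim=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = TinyMLP()
x = torch.randn(32, 784)                      # a batch of 32 fake inputs
logits = model(x)                             # forward pass
targets = torch.randint(0, 10, (32,))         # fake class labels
loss = nn.functional.cross_entropy(logits, targets)
loss.backward()                               # backpropagation fills .grad on the parameters
```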
@@ -66,6 +72,8 @@ Neural networks are a fundamental part of many machine learning models, particul
 - [Fast.ai - Practical Deep Learning](https://course.fast.ai/): Free course designed for people with coding experience who want to learn about deep learning.
 - [Patrick Loeber - PyTorch Tutorials](https://www.youtube.com/playlist?list=PLqnslRFeH2UrcDBWF5mfPGpqQDSta6VK4): Series of videos for complete beginners to learn about PyTorch.
 
+---
+
 ### 4. Natural Language Processing (NLP)
 
 NLP is a fascinating branch of artificial intelligence that bridges the gap between human language and machine understanding. From simple text processing to understanding linguistic nuances, NLP plays a crucial role in many applications like translation, sentiment analysis, chatbots, and much more.
@@ -83,6 +91,8 @@ NLP is a fascinating branch of artificial intelligence that bridges the gap betw
 - [Jake Tae - PyTorch RNN from Scratch](https://jaketae.github.io/study/pytorch-rnn/): Practical and simple implementation of RNN, LSTM, and GRU models in PyTorch.
 - [colah's blog - Understanding LSTM Networks](https://colah.github.io/posts/2015-08-Understanding-LSTMs/): A more theoretical article about the LSTM network.
 
+---
+
 ### 5. The Transformer Architecture
 
 The Transformer model, introduced in the "Attention is All You Need" paper, is a type of neural network architecture at the core of large language models.
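The central operation of that architecture is scaled dot-product attention, where each token builds a weighted sum of the other tokens' values. Below is a hedged PyTorch sketch of just that step, with toy tensor sizes and no masking or multiple heads.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5  # query-key similarity, scaled
    weights = F.softmax(scores, dim=-1)            # attention distribution over keys
    return weights @ v                             # weighted sum of value vectors

q = k = v = torch.randn(4, 8)   # self-attention over 4 tokens of dimension 8 (toy sizes)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)                # torch.Size([4, 8])
```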
@@ -100,6 +110,8 @@ The Transformer model, introduced in the "Attention is All You Need" paper, is a
 - [Introduction to the Transformer by Rachel Thomas](https://www.youtube.com/watch?v=AFkGPmU16QA): Provides a good intuition behind the main ideas of the Transformer architecture.
 - [Stanford CS224N - Transformers](https://www.youtube.com/watch?v=ptuGllU5SQQ): A more academic presentation of this architecture.
 
+---
+
 ### 6. Pre-trained Language Models
 
 Pre-trained models like BERT, GPT-2, and T5 are powerful tools that can handle tasks like sequence classification, text generation, text summarization, and question answering.
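As a taste of how little code those tasks take through the Hugging Face `pipeline` API covered by the resources in this section, here is a hedged sketch; running it downloads default checkpoints from the Hub, and the example strings are arbitrary.

```python
from transformers import pipeline

# Sequence classification with a default sentiment-analysis checkpoint.
classifier = pipeline("sentiment-analysis")
print(classifier("This roadmap is really helpful."))

# Text generation with GPT-2.
generator = pipeline("text-generation", model="gpt2")
print(generator("Large language models are", max_new_tokens=20))
```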
@@ -118,6 +130,8 @@ Pre-trained models like BERT, GPT-2, and T5 are powerful tools that can handle t
 - [Hugging Face - Transformers Notebooks](https://huggingface.co/docs/transformers/notebooks): List of official notebooks provided by Hugging Face.
 - [Hugging Face - Metrics](https://huggingface.co/metrics): All metrics on the Hugging Face hub.
 
+---
+
 ### 7. Advanced Language Modeling
 
 To fine-tune your skills, learn how to create embeddings with sentence transformers, store them in a vector database, and use parameter-efficient supervised learning or RLHF to fine-tune LLMs.
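A minimal sketch of the first step, creating embeddings with sentence transformers, is shown below; the vectors it produces are what you would then store in a vector database. It assumes the `sentence-transformers` package is installed, and the checkpoint name is just one commonly used small model.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
sentences = ["How do I fine-tune an LLM?", "What is LoRA?"]
embeddings = model.encode(sentences)               # one dense vector per sentence
print(util.cos_sim(embeddings[0], embeddings[1]))  # cosine similarity between the two
```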
@@ -134,6 +148,8 @@ To fine-tune your skills, learn how to create embeddings with sentence transform
 - [Hugging Face - PEFT](https://huggingface.co/blog/peft): Another library from Hugging Face implementing different techniques, such as LoRA.
 - [Efficient LLM training by Phil Schmid](https://www.philschmid.de/fine-tune-flan-t5-peft): Implementation of LoRA to fine-tune a Flan-T5 model.
 
+---
+
 ### 8. LMOps
 
 Finally, dive into Language Model Operations (LMOps), learning how to handle prompt engineering, build frameworks with LangChain and LlamaIndex, and optimize inference with weight quantization, pruning, distillation, and more.
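To give one concrete LMOps step, the sketch below loads a causal language model with 8-bit weight quantization through Transformers and bitsandbytes. It assumes both libraries and a CUDA GPU are available; `gpt2` is only a placeholder checkpoint.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "gpt2"  # placeholder; any causal LM on the Hub works the same way
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # store weights in 8 bits
    device_map="auto",
)
inputs = tokenizer("Quantization reduces memory by", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```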
@@ -151,4 +167,6 @@ Finally, dive into Language Model Operations (LMOps), learning how to handle pro
 - [Pinecone - LangChain AI Handbook](https://www.pinecone.io/learn/langchain-intro/): Excellent free book on how to master the LangChain library.
 - [A Primer to using LlamaIndex](https://gpt-index.readthedocs.io/en/latest/guides/primer.html): Official guides to learn more about LlamaIndex.
 
+---
+
 *Disclaimer: I am not affiliated with any sources listed here. This roadmap was inspired by the excellent [DevOps Roadmap](https://github.com/milanm/DevOps-Roadmap) from Milan Milanović and Romano Roth.*