Awesome-LLM

🔥 Large Language Models (LLMs) have taken the NLP community and the whole world by storm. Here is a comprehensive list of papers about large language models, especially those relating to ChatGPT. It also contains code, courses, and related websites, as shown below:

Milestone Papers

Year | Keywords | Institute | Paper | Publication
2017 | Transformers | Google | Attention Is All You Need | NeurIPS
2018 | GPT 1.0 | OpenAI | Improving Language Understanding by Generative Pre-Training |
2019 | BERT | Google | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | NAACL
2019 | T5 | Google | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | JMLR
2019 | GPT 2.0 | OpenAI | Language Models are Unsupervised Multitask Learners |
2020 | GPT 3.0 | OpenAI | Language Models are Few-Shot Learners | NeurIPS
2020 | Scaling Law | OpenAI | Scaling Laws for Neural Language Models |
2020 | Megatron-LM | NVIDIA | Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism |
2021 | LM-BFF | Princeton | Making Pre-trained Language Models Better Few-shot Learners | ACL
2021 | WebGPT | OpenAI | WebGPT: Improving the Factual Accuracy of Language Models through Web Browsing |
2021 | Codex | OpenAI | Evaluating Large Language Models Trained on Code |
2022 | Foundation Models | Stanford | On the Opportunities and Risks of Foundation Models |
2022 | HELM | Stanford | Holistic Evaluation of Language Models |
2022 | InstructGPT | OpenAI | Training language models to follow instructions with human feedback |
2022 | Emergent Abilities | Google | Emergent Abilities of Large Language Models | TMLR
2022 | CoT | Google | Chain-of-Thought Prompting Elicits Reasoning in Large Language Models | NeurIPS
2022 | PaLM | Google | PaLM: Scaling Language Modeling with Pathways |
2022 | Flan-T5 | Google | Scaling Instruction-Finetuned Language Models |
2022 | FLAN | Google | Finetuned Language Models are Zero-Shot Learners | ICLR
2022 | Chinchilla | DeepMind | An empirical analysis of compute-optimal large language model training | NeurIPS
2022 | BIG-bench | Google | Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models |
2022 | BLOOM | BigScience | BLOOM: A 176B-Parameter Open-Access Multilingual Language Model |
2022 | OPT | Meta | OPT: Open Pre-trained Transformer Language Models |
2022 | LaMDA | Google | LaMDA: Language Models for Dialog Applications |
2022 | Sparrow | DeepMind | Improving alignment of dialogue agents via targeted human judgements |
2022 | Retro | DeepMind | Improving language models by retrieving from trillions of tokens | ICML
2022 | METALM | Microsoft | Language Models are General-Purpose Interfaces |
2022 | Galactica | Meta | Galactica: A Large Language Model for Science |
2023 | Flan 2022 Collection | Google | The Flan Collection: Designing Data and Methods for Effective Instruction Tuning |
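
Some of the milestone techniques above can be tried directly. For instance, chain-of-thought prompting amounts to prepending worked, step-by-step exemplars to a question before sending it to a model. A minimal sketch of that prompt construction (the model call itself is omitted; the exemplar is a paraphrase of the well-known tennis-ball example, and the function name is my own):

```python
# Minimal sketch of chain-of-thought (CoT) prompt construction.
# Only the prompt is built here; calling an actual LLM is out of scope.

def build_cot_prompt(exemplars, question):
    """Prepend worked examples (question + step-by-step rationale) to a new question."""
    parts = []
    for q, rationale in exemplars:
        parts.append(f"Q: {q}\nA: {rationale}")
    parts.append(f"Q: {question}\nA:")  # leave the final answer for the model
    return "\n\n".join(parts)

exemplars = [
    ("Roger has 5 tennis balls. He buys 2 cans of 3 balls each. How many balls does he have?",
     "Roger started with 5 balls. 2 cans of 3 balls is 6 balls. 5 + 6 = 11. The answer is 11."),
]
prompt = build_cot_prompt(
    exemplars,
    "A baker has 3 trays of 4 muffins. How many muffins are there in total?",
)
```

The key idea, per the CoT paper, is that the exemplar's rationale nudges the model to emit its own intermediate reasoning steps before the final answer.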

ChatGPT Evaluation

  • Is ChatGPT a General-Purpose Natural Language Processing Task Solver? Link

  • Is ChatGPT A Good Translator? A Preliminary Study Link

Tools for Training LLMs

Tutorials about LLMs

  • [ICML 2022] Welcome to the "Big Model" Era: Techniques and Systems to Train and Serve Bigger Models Link

  • [NeurIPS 2022] Foundational Robustness of Foundation Models Link

  • [Andrej Karpathy] Let's build GPT: from scratch, in code, spelled out. Video|Code
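
The Karpathy video above builds a small GPT in PyTorch around scaled dot-product attention, the core operation of the Transformer paper listed earlier. As a rough, dependency-free illustration of that one operation (plain Python lists instead of tensors; this is a sketch, not his implementation):

```python
# Stdlib-only sketch of scaled dot-product attention (Vaswani et al., 2017):
# attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
import math

def softmax(xs):
    m = max(xs)                               # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """queries/keys/values: lists of equal-length float vectors.
    Returns one output vector (a weighted mix of values) per query."""
    d_k = len(keys[0])
    out = []
    for q in queries:
        # similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)             # attention weights sum to 1
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

queries = [[10.0, 0.0]]
keys = values = [[1.0, 0.0], [0.0, 1.0]]
out = attention(queries, keys, values)
# the query aligns strongly with the first key, so out[0] is close to values[0]
```

A real GPT adds learned projections for Q/K/V, multiple heads, and a causal mask; the tutorial covers all of those.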

Courses about LLMs

  • [Stanford] CS224N-Lecture 11: Prompting, Instruction Finetuning, and RLHF Slides

  • [Stanford] CS324-Large Language Models Homepage

  • [Stanford] CS25-Transformers United V2 Homepage

  • [李沐] InstructGPT paper deep dive Bilibili Youtube

  • [李沐] HELM: comprehensive language model evaluation Bilibili

  • [李沐] GPT, GPT-2, GPT-3 paper deep dive Bilibili Youtube

  • [Aston Zhang] Chain of Thought paper Bilibili Youtube

Useful Resources

  • [2023-02-16][Zhihu][Megvii] A conversation with Zhang Xiangyu of Megvii Research: ChatGPT's research value may be even greater Link
  • [2023-02-15][Zhihu][Zhang Jiajun] Conjectures on eight technical questions about ChatGPT Link
  • [2023-02-14][Stephen Wolfram] What Is ChatGPT Doing … and Why Does It Work? Link
  • [2023-02-13][Zhihu][Xiong Deyi] Twenty observations on ChatGPT Link
  • [2023-02-11][Zhihu][Liu Cong NLP] ChatGPT: what I've seen, heard, and felt Link
  • [2023-02-07][Forbes] The Next Generation Of Large Language Models Link
  • [2023-01-26][NVIDIA] What Are Large Language Models Used For? Link
  • [2023-01-18][Zhihu][Zhang Junlin] The road to AGI: technical essentials of large language models (LLMs) Link
  • [2023-01-06][Shayne Longpre] Major LLMs + Data Availability Link
  • [2021-10-26][Hugging Face] Large Language Models: A New Moore's Law Link

Contributing

This is an active repository and your contributions are always welcome!

I will keep some pull requests open if I'm not sure whether they are awesome enough for this list; you can vote for them by adding a 👍.


If you have any questions about this opinionated list, do not hesitate to contact me at chengxin1998@stu.pku.edu.cn.