Update README.md

angysaravia · 2 years ago
parent commit 4f4718258f
1 changed file with 2 additions and 2 deletions

README.md  +2 −2

@@ -6,12 +6,12 @@
 | 1) **Toolformer: Language Models Can Teach Themselves to Use Tools** - Toolformer - introduces language models that teach themselves to use external tools via simple API calls.    | [Paper](https://arxiv.org/abs/2302.04761), [Tweet](https://twitter.com/dair_ai/status/1624832248691191808?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)|
 | 2) **Describe, Explain, Plan and Select: Interactive Planning with Large Language Models Enables Open-World Multi-Task Agents** - Describe, Explain, Plan, and Select - proposes using language models for open-world game playing.| [Paper](https://arxiv.org/abs/2302.01560), [Tweet](https://twitter.com/dair_ai/status/1624832250717036548?s=20&t=ygX07dsAPDF8_jwrxZIo1Q) |
 | 3) **A Categorical Archive of ChatGPT Failures** - A Categorical Archive of ChatGPT Failures - a comprehensive analysis of ChatGPT failures for categories like reasoning, factual errors, maths, and coding. | [Paper](https://arxiv.org/abs/2302.03494), [Tweet](https://twitter.com/dair_ai/status/1624832252587700230?s=20&t=ygX07dsAPDF8_jwrxZIo1Q) |
-| 4) **Hard Prompts Made Easy: Gradient-Based Discrete Optimization for Prompt Tuning and Discovery** - Hard Prompts Made Easy - optimizing hard text prompts through efficient gradient-based optimization.  | [Paper](https://arxiv.org/abs/2302.03668), [Tweet]([XXX](https://twitter.com/dair_ai/status/1624832254588465156?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
+| 4) **Hard Prompts Made Easy: Gradient-Based Discrete Optimization for Prompt Tuning and Discovery** - Hard Prompts Made Easy - optimizes hard text prompts through efficient gradient-based optimization.  | [Paper](https://arxiv.org/abs/2302.03668), [Tweet](https://twitter.com/dair_ai/status/1624832254588465156?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
 | 5) **Data Selection for Language Models via Importance Resampling** - Data Selection for LMs - proposes a cheap and scalable data selection framework based on an importance resampling algorithm to improve the downstream performance of LMs. | [Paper](https://arxiv.org/abs/2302.03169), [Tweet](https://twitter.com/dair_ai/status/1624832256400302080?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
 | 6) **Structure and Content-Guided Video Synthesis with Diffusion Models** - Gen-1 - proposes an approach for structure and content-guided video synthesis with diffusion models.   | [Paper](https://arxiv.org/abs/2302.03011) , [Project](https://research.runwayml.com/gen1), [Tweet](https://twitter.com/dair_ai/status/1624832258296229889?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
 | 7) **A Multitask, Multilingual, Multimodal Evaluation of ChatGPT on Reasoning, Hallucination, and Interactivity** - Multitask, Multilingual, Multimodal Evaluation of ChatGPT - performs a more rigorous evaluation of ChatGPT on reasoning, hallucination, and interactivity. | [Paper](https://arxiv.org/abs/2302.04023), [Tweet](https://twitter.com/dair_ai/status/1624832260213026819?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
 | 8) **Noise2Music: Text-conditioned Music Generation with Diffusion Models** - Noise2Music - proposes diffusion models to generate high-quality 30-second music clips via text prompts.  | [Paper](https://arxiv.org/abs/2302.03917), [Project](https://google-research.github.io/noise2music/), [Tweet](https://twitter.com/dair_ai/status/1624832262163337220?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
-| 9) **Offsite-Tuning: Transfer Learning without Full Model** - Offsite-Tuning - introduces an efficient, privacy-preserving transfer learning framework to adapt foundational models to downstream data without access to the full model. | [Paper]((https://arxiv.org/abs/2302.04870), [Project](https://github.com/mit-han-lab/offsite-tuning), [Tweet](https://twitter.com/dair_ai/status/1624832264029831169?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
+| 9) **Offsite-Tuning: Transfer Learning without Full Model** - Offsite-Tuning - introduces an efficient, privacy-preserving transfer learning framework to adapt foundational models to downstream data without access to the full model. | [Paper](https://arxiv.org/abs/2302.04870), [Project](https://github.com/mit-han-lab/offsite-tuning), [Tweet](https://twitter.com/dair_ai/status/1624832264029831169?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
 | 10) **Zero-shot Image-to-Image Translation** - pix2pix-zero - proposes a model for zero-shot image-to-image translation.  | [Paper](https://arxiv.org/abs/2302.03027), [Project](https://pix2pixzero.github.io/), [Tweet](https://twitter.com/dair_ai/status/1624832265967607813?s=20&t=ygX07dsAPDF8_jwrxZIo1Q)  |
 
 ---