
Fixed all "Open in Colab" absolute paths

Connor Treacy, 4 months ago
Commit aa60f75d44
25 changed files with 27 additions and 28 deletions
  1. +1 −2  3p-integrations/aws/getting_started_llama_3_on_amazon_bedrock.ipynb
  2. +1 −2  3p-integrations/aws/prompt_engineering_with_llama_2_on_amazon_bedrock.ipynb
  3. +1 −1  3p-integrations/aws/react_llama_3_bedrock_wk.ipynb
  4. +2 −1  3p-integrations/groq/llama3_cookbook_groq.ipynb
  5. +1 −1  3p-integrations/langchain/langgraph_rag_agent.ipynb
  6. +1 −1  3p-integrations/langchain/langgraph_rag_agent_local.ipynb
  7. +1 −1  3p-integrations/langchain/langgraph_tool_calling_agent.ipynb
  8. +1 −1  3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L2_Tool_Calling.ipynb
  9. +1 −1  3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L3_Building_an_Agent_Reasoning_Loop.ipynb
  10. +1 −1  3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L4_Building_a_Multi-Document_Agent.ipynb
  11. +1 −1  3p-integrations/togetherai/knowledge_graphs_with_structured_outputs.ipynb
  12. +1 −1  3p-integrations/togetherai/pdf_to_podcast_using_llama_on_together.ipynb
  13. +1 −1  end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/AI_Agentic_Design_Patterns_with_AutoGen_L4_Tool_Use_and_Conversational_Chess.ipynb
  14. +1 −1  end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/AI_Agents_in_LangGraph_L1_Build_an_Agent_from_Scratch.ipynb
  15. +1 −1  end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/Building_Agentic_RAG_with_Llamaindex_L1_Router_Engine.ipynb
  16. +1 −1  end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/Functions_Tools_and_Agents_with_LangChain_L1_Function_Calling.ipynb
  17. +1 −1  end-to-end-use-cases/coding/text2sql/quickstart.ipynb
  18. +1 −1  end-to-end-use-cases/live_data.ipynb
  19. +1 −1  end-to-end-use-cases/responsible_ai/llama_guard/llama_guard_customization_via_prompting_and_fine_tuning.ipynb
  20. +1 −1  end-to-end-use-cases/responsible_ai/llama_guard/llama_guard_text_and_vision_inference.ipynb
  21. +1 −1  end-to-end-use-cases/video_summary.ipynb
  22. +1 −1  getting-started/Prompt_Engineering_with_Llama.ipynb
  23. +2 −2  getting-started/RAG/hello_llama_cloud.ipynb
  24. +1 −1  getting-started/build_with_Llama_3_2.ipynb
  25. +1 −1  getting-started/finetuning/quickstart_peft_finetuning.ipynb
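
The change is the same mechanical substitution in every notebook: the stale "Open In Colab" badge URL pointing at meta-llama/llama-recipes is replaced with the notebook's current path under meta-llama/llama-cookbook. Below is a minimal sketch of how such a bulk rewrite could be scripted; it is not part of this commit, and the script, URL constants, and the assumption that every badge lives in a markdown cell with list-of-strings `source` are illustrative only.

```python
# Hypothetical helper (not part of this commit) that rewrites old
# llama-recipes Colab badge URLs to each notebook's current location
# in llama-cookbook. Run from the repository root.
import json
import re
from pathlib import Path

OLD_PREFIX = "https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/"
NEW_BASE = "https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/"

def rewrite_badge_urls(notebook_path: Path) -> bool:
    """Point any old-style Colab badge at this notebook's repo-relative path."""
    nb = json.loads(notebook_path.read_text(encoding="utf-8"))
    new_url = NEW_BASE + notebook_path.as_posix()  # e.g. end-to-end-use-cases/live_data.ipynb
    changed = False
    for cell in nb.get("cells", []):
        # Badges only appear in markdown cells stored as a list of source lines.
        if cell.get("cell_type") != "markdown" or not isinstance(cell.get("source"), list):
            continue
        updated = [
            re.sub(re.escape(OLD_PREFIX) + r"\S+\.ipynb", new_url, line)
            for line in cell["source"]
        ]
        if updated != cell["source"]:
            cell["source"] = updated
            changed = True
    if changed:
        # indent=1 matches the usual nbformat on-disk layout.
        notebook_path.write_text(json.dumps(nb, indent=1) + "\n", encoding="utf-8")
    return changed

if __name__ == "__main__":
    for path in sorted(Path(".").rglob("*.ipynb")):
        if rewrite_badge_urls(path):
            print(f"updated {path}")
```

Run from the repository root, a script like this would rewrite each badge to the notebook's own location, which is the pattern visible in the diffs below.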

File diff suppressed because it is too large
+ 1 - 2
3p-integrations/aws/getting_started_llama_3_on_amazon_bedrock.ipynb


File diff suppressed because it is too large
+ 1 - 2
3p-integrations/aws/prompt_engineering_with_llama_2_on_amazon_bedrock.ipynb


File diff suppressed because it is too large
+ 1 - 1
3p-integrations/aws/react_llama_3_bedrock_wk.ipynb


+ 2 - 1
3p-integrations/groq/llama3_cookbook_groq.ipynb

@@ -7,7 +7,8 @@
    "source": [
     "# Llama 3 Cookbook with LlamaIndex and Groq\n",
     "\n",
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/llama_api_providers/llama3_cookbook_groq.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/groq/llama3_cookbook_groq.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "\n",
     "\n",
     "Meta developed and released the Meta [Llama 3](https://ai.meta.com/blog/meta-llama-3/) family of large language models (LLMs), a collection of pretrained and instruction tuned generative text models in 8 and 70B sizes. The Llama 3 instruction tuned models are optimized for dialogue use cases and outperform many of the available open source chat models on common industry benchmarks.\n",
     "\n",

+ 1 - 1
3p-integrations/langchain/langgraph_rag_agent.ipynb

@@ -5,7 +5,7 @@
    "id": "6912ab05-f66a-40a9-a4a5-4deb80d2e0d9",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/langchain/langgraph_rag_agent.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/langchain/langgraph_rag_agent.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
3p-integrations/langchain/langgraph_rag_agent_local.ipynb

@@ -5,7 +5,7 @@
    "id": "1f53f753-12c6-4fac-b910-6e96677d8a49",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/langchain/langgraph_rag_agent_local.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/langchain/langgraph_rag_agent_local.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
3p-integrations/langchain/langgraph_tool_calling_agent.ipynb

@@ -5,7 +5,7 @@
    "id": "8ac4ba3b-c438-4f2e-8f52-39846beb5642",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/langchain/langgraph_tool_calling_agent.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/langchain/langgraph_tool_calling_agent.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L2_Tool_Calling.ipynb

@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L2_Tool_Calling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L2_Tool_Calling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "This notebook ports the DeepLearning.AI short course [Building Agentic RAG with Llamaindex Lesson 2 Tool Calling](https://learn.deeplearning.ai/courses/building-agentic-rag-with-llamaindex/lesson/3/tool-calling) to using Llama 3. It shows how to use Llama 3 to not only pick a function to execute, but also infer an argument to pass through the function.\n",
     "\n",

+ 1 - 1
3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L3_Building_an_Agent_Reasoning_Loop.ipynb

@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L3_Building_an_Agent_Reasoning_Loop.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L3_Building_an_Agent_Reasoning_Loop.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "This notebook ports the DeepLearning.AI short course [Building Agentic RAG with Llamaindex Lesson 3 Building an Agent Reasoning Loop](https://learn.deeplearning.ai/courses/building-agentic-rag-with-llamaindex/lesson/4/building-an-agent-reasoning-loop) to using Llama 3. It shows how to define a complete agent reasoning loop to reason over tools and multiple steps on a complex question the user asks about a single document while maintaining memory.\n",
     "\n",

+ 1 - 1
3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L4_Building_a_Multi-Document_Agent.ipynb

@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L4_Building_a_Multi-Document_Agent.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/llamaindex/dlai_agentic_rag/Building_Agentic_RAG_with_Llamaindex_L4_Building_a_Multi-Document_Agent.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "This notebook ports the DeepLearning.AI short course [Building Agentic RAG with Llamaindex Lesson 4 Building a Multi-Document Agent](https://learn.deeplearning.ai/courses/building-agentic-rag-with-llamaindex/lesson/5/building-a-multi-document-agent) to using Llama 3. It shows how to use an agent to handle multiple documents and increasing degrees of complexity.\n",
     "\n",

+ 1 - 1
3p-integrations/togetherai/knowledge_graphs_with_structured_outputs.ipynb

@@ -5,7 +5,7 @@
    "metadata": {},
    "source": [
     "# Generating Knowledge Graphs with LLMs and Structured Outputs\n",
-    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/togetherai/knowledge_graphs_with_structured_outputs.ipynb)"
+    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/togetherai/knowledge_graphs_with_structured_outputs.ipynb)"
    ]
   },
   {

+ 1 - 1
3p-integrations/togetherai/pdf_to_podcast_using_llama_on_together.ipynb

@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/3p_integrations/togetherai/pdf_to_podcast_using_llama_on_together.ipynb)"
+    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/3p-integrations/togetherai/pdf_to_podcast_using_llama_on_together.ipynb)"
    ]
   },
   {

+ 1 - 1
end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/AI_Agentic_Design_Patterns_with_AutoGen_L4_Tool_Use_and_Conversational_Chess.ipynb

@@ -5,7 +5,7 @@
    "id": "7a4b75bb-d60a-41e3-abca-1ca0f0bf1201",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/quickstart/agents/DeepLearningai_Course_Notebooks/AI_Agentic_Design_Patterns_with_AutoGen_L4_Tool_Use_and_Conversational_Chess.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/AI_Agentic_Design_Patterns_with_AutoGen_L4_Tool_Use_and_Conversational_Chess.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/AI_Agents_in_LangGraph_L1_Build_an_Agent_from_Scratch.ipynb

@@ -5,7 +5,7 @@
    "id": "de56ee05-3b71-43c9-8cbf-6ad9b3233f38",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/quickstart/agents/DeepLearningai_Course_Notebooks/AI_Agents_in_LangGraph_L1_Build_an_Agent_from_Scratch.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/AI_Agents_in_LangGraph_L1_Build_an_Agent_from_Scratch.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/Building_Agentic_RAG_with_Llamaindex_L1_Router_Engine.ipynb

@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/quickstart/agents/DeepLearningai_Course_Notebooks/Building_Agentic_RAG_with_Llamaindex_L1_Router_Engine.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/Building_Agentic_RAG_with_Llamaindex_L1_Router_Engine.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/Functions_Tools_and_Agents_with_LangChain_L1_Function_Calling.ipynb

@@ -5,7 +5,7 @@
    "id": "2ba1b4ef-3b96-4e7e-b5d0-155b839db73c",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/quickstart/agents/DeepLearningai_Course_Notebooks/Functions_Tools_and_Agents_with_LangChain_L1_Function_Calling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/agents/DeepLearningai_Course_Notebooks/Functions_Tools_and_Agents_with_LangChain_L1_Function_Calling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
end-to-end-use-cases/coding/text2sql/quickstart.ipynb

@@ -5,7 +5,7 @@
    "id": "e8cba0b6",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/use_cases/coding/text2sql/quickstart.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>  \n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/coding/text2sql/quickstart.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>  \n",
     "\n",
     "## Quick Demo of Text2SQL Using Llama 3.3\n",
     "\n",

+ 1 - 1
end-to-end-use-cases/live_data.ipynb

@@ -5,7 +5,7 @@
    "id": "30eb1704-8d76-4bc9-9308-93243aeb69cb",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/use_cases/LiveData.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/live_data.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "## This demo app shows:\n",
     "* How to use LlamaIndex, an open source library to help you build custom data augmented LLM applications\n",

+ 1 - 1
end-to-end-use-cases/responsible_ai/llama_guard/llama_guard_customization_via_prompting_and_fine_tuning.ipynb

@@ -15,7 +15,7 @@
    "source": [
     "# Llama Guard 3 Customization: Taxonomy Customization, Zero/Few-shot prompting, Evaluation and Fine Tuning \n",
     "\n",
-    "<a target=\"_blank\" href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/responsible_ai/llama_guard/llama_guard_customization_via_prompting_and_fine_tuning.ipynb\">\n",
+    "<a target=\"_blank\" href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/responsible_ai/llama_guard/llama_guard_customization_via_prompting_and_fine_tuning.ipynb\">\n",
     "  <img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
     "</a>\n",
     "\n",

+ 1 - 1
end-to-end-use-cases/responsible_ai/llama_guard/llama_guard_text_and_vision_inference.ipynb

@@ -7,7 +7,7 @@
    "source": [
     "# Llama Guard 3 Text & Vision update\n",
     "\n",
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/responsible_ai/llama_guard/llama_guard_text_and_vision_inference.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/responsible_ai/llama_guard/llama_guard_text_and_vision_inference.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "In this notebook we show simple inference scripts using the [transformers](https://github.com/huggingface/transformers) library, from HuggingFace. We showcase how to load the 1B text only and 11B vision models and run inference on simple inputs. For details on the models, refer to their corresponding model cards:\n",
     "* [Llama Guard 3 1B](https://github.com/meta-llama/PurpleLlama/blob/main/Llama-Guard3/1B/MODEL_CARD.md)\n",

+ 1 - 1
end-to-end-use-cases/video_summary.ipynb

@@ -5,7 +5,7 @@
    "id": "30b1235c-2f3e-4628-9c90-30385f741550",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/use_cases/VideoSummary.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/end-to-end-use-cases/video_summary.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "## This demo app shows:\n",
     "* How to use LangChain's YoutubeLoader to retrieve the caption in a YouTube video\n",

+ 1 - 1
getting-started/Prompt_Engineering_with_Llama.ipynb

@@ -5,7 +5,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/quickstart/Prompt_Engineering_with_Llama_3.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/getting-started/Prompt_Engineering_with_Llama.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "# Prompt Engineering with Llama\n",
     "\n",

+ 2 - 2
getting-started/RAG/hello_llama_cloud.ipynb

@@ -5,7 +5,7 @@
    "id": "1c1ea03a-cc69-45b0-80d3-664e48ca6831",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/use_cases/RAG/HelloLlamaCloud.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/getting-started/RAG/hello_llama_cloud.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
     "\n",
     "## This demo app shows:\n",
     "* How to run Llama 3.1 in the cloud hosted on Replicate\n",
@@ -37,7 +37,7 @@
     "!pip install sentence-transformers\n",
     "!pip install faiss-cpu\n",
     "!pip install bs4\n",
-    "!pip install replicate",
+    "!pip install replicate\n",
     "!pip install langchain-community"
    ]
   },

+ 1 - 1
getting-started/build_with_Llama_3_2.ipynb

@@ -5,7 +5,7 @@
    "id": "42939a0f",
    "metadata": {},
    "source": [
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/quickstart/build_with_Llama_3_2.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/recipes/quickstart/build_with_Llama_3_2.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {

+ 1 - 1
getting-started/finetuning/quickstart_peft_finetuning.ipynb

@@ -8,7 +8,7 @@
     "Copyright (c) Meta Platforms, Inc. and affiliates.\n",
     "This software may be used and distributed according to the terms of the Llama 2 Community License Agreement.\n",
     "\n",
-    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/quickstart/finetuning/quickstart_peft_finetuning.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+    "<a href=\"https://colab.research.google.com/github/meta-llama/llama-cookbook/blob/main/getting-started/finetuning/quickstart_peft_finetuning.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
    ]
   },
   {