
fix broken link errors

dloman118 · 10 months ago
commit c6a714b4d8

+ 1 - 1
recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb

@@ -328,7 +328,7 @@
    "source": [
     "The two key parameters we need to include in our chat completion are `tools=tools` and `tool_choice=\"auto\"`, which provides the model with the available tools we've just defined and tells it to use one if appropriate (`tool_choice=\"auto\"` gives the LLM the option of using any, all or none of the available functions. To mandate a specific function call, we could use `tool_choice={\"type\": \"function\", \"function\": {\"name\":\"create_order\"}}`). \n",
     "\n",
-    "When the LLM decides to use a tool, the response is *not* a conversational chat, but . From there, we can execute the LLM-identified tool with the LLM-identified parameters, and feed the response *back* to the LLM for a second request so that it can respond with appropriate context from the tool it just used:"
+    "When the LLM decides to use a tool, the response is *not* a conversational chat, but a JSON object containing the tool choice and tool parameters. From there, we can execute the LLM-identified tool with the LLM-identified parameters, and feed the response *back* to the LLM for a second request so that it can respond with appropriate context from the tool it just used:"
    ]
   },
   {
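For reference, the request → execute-tool → second-request loop described in that cell looks roughly like the following with the Groq Python client. This is a minimal sketch, not code from the notebook: the model name, the user message, and the `create_order` signature are illustrative stand-ins for the notebook's definitions.

```python
import json
from groq import Groq  # assumes the official groq Python client is installed

client = Groq()  # reads GROQ_API_KEY from the environment

# Stand-in for the notebook's create_order tool; parameters here are hypothetical.
def create_order(product_id: str, quantity: int) -> dict:
    return {"status": "created", "product_id": product_id, "quantity": quantity}

tools = [{
    "type": "function",
    "function": {
        "name": "create_order",
        "description": "Create an order for a product",
        "parameters": {
            "type": "object",
            "properties": {
                "product_id": {"type": "string"},
                "quantity": {"type": "integer"},
            },
            "required": ["product_id", "quantity"],
        },
    },
}]

messages = [{"role": "user", "content": "Order 2 units of product ABC-123."}]

# First request: with tool_choice="auto" the model may answer directly
# or return a tool call instead of a conversational reply.
response = client.chat.completions.create(
    model="llama3-70b-8192",  # illustrative model name
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
message = response.choices[0].message

if message.tool_calls:
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)  # tool parameters arrive as a JSON string
    result = create_order(**args)               # execute the LLM-identified tool

    # Feed the tool output back for a second request so the model can
    # respond with context from the tool it just used.
    messages.append(message)
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "name": call.function.name,
        "content": json.dumps(result),
    })
    final = client.chat.completions.create(model="llama3-70b-8192", messages=messages)
    print(final.choices[0].message.content)
```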

+ 2 - 1
recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb

@@ -57,11 +57,12 @@
    ]
   },
   {
+   "attachments": {},
    "cell_type": "markdown",
    "id": "4c18688b-178f-439d-90a4-590f99ade11f",
    "metadata": {},
    "source": [
-    "A Groq API Key is required for this demo - you can generate one for free [here](https://console.groq.com/keys). We will be using Pinecone as our vector database, which also requires an API key (you can create one index for a small project there for free on their Starter plan), but will also show how it works with [Chroma DB](https://www.trychroma.com/), a free open source alternative that stores vector embeddings in memory. We will also use the Llama3 8b model for this demo."
+    "A Groq API Key is required for this demo - you can generate one for free [here](https://console.groq.com/). We will be using Pinecone as our vector database, which also requires an API key (you can create one index for a small project there for free on their Starter plan), but will also show how it works with [Chroma DB](https://www.trychroma.com/), a free open source alternative that stores vector embeddings in memory. We will also use the Llama3 8b model for this demo."
    ]
   },
   {
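As a quick illustration of the in-memory Chroma DB option mentioned in that cell, here is a minimal sketch assuming the `chromadb` package and its default embedding function; the documents, ids, and query text are placeholders, not the notebook's data:

```python
import chromadb  # assumes the chromadb package; uses its default embedding function

client = chromadb.Client()  # in-memory client, nothing is persisted to disk
collection = client.create_collection("presidential_speeches")

# Placeholder documents; the notebook embeds chunks of real speeches instead.
collection.add(
    documents=[
        "Ask not what your country can do for you...",
        "Four score and seven years ago...",
    ],
    ids=["jfk-inaugural", "gettysburg"],
)

# Similarity search over the stored embeddings.
results = collection.query(query_texts=["civic duty"], n_results=1)
print(results["documents"][0])
```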

+ 4 - 0
recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md

@@ -12,6 +12,10 @@ A simple application that allows users to interact with a conversational chatbot
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Chatbot-with-Conversational-Memory-on-LangChain) or run it on the command line with `python main.py`

+ 5 - 2
recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md

@@ -12,9 +12,12 @@ The [CrewAI](https://docs.crewai.com/) Machine Learning Assistant is a command l
 
 - **LangChain Integration**: Incorporates LangChain to facilitate natural language processing and enhance the interaction between the user and the machine learning assistant.
 
-
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
-You can [fork and run this application on Replit](https://replit.com/@GroqCloud/CrewAI-Machine-Learning-Assistant) or run it on the command line with `python main.py`. You can upload a sample .csv to the same directory as ```main.py``` to give the application a head start on your ML problem. The application will output a Markdown file including python code for your ML use case to the same directory as main.py.
+<!-- markdown-link-check-enable -->
+
+You can [fork and run this application on Replit](https://replit.com/@GroqCloud/CrewAI-Machine-Learning-Assistant) or run it on the command line with `python main.py`. You can upload a sample .csv to the same directory as `main.py` to give the application a head start on your ML problem. The application will output a Markdown file including python code for your ML use case to the same directory as main.py.

+ 4 - 0
recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md

@@ -12,6 +12,10 @@ A simple application that allows users to interact with a conversational chatbot
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Groq-Quickstart-Conversational-Chatbot) or run it on the command line with `python main.py`.

+ 4 - 0
recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md

@@ -18,6 +18,10 @@ The function calling in this application is handled by the Groq API, abstracted
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Groqing-the-Stock-Market-Function-Calling-with-Llama3) or run it on the command line with `python main.py`.

+ 4 - 0
recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md

@@ -14,4 +14,8 @@ A simple application that allows users to interact with a conversational chatbot
 
 ##Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
+
+<!-- markdown-link-check-enable -->

+ 4 - 0
recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md

@@ -22,8 +22,12 @@ The main script of the application is [main.py](./main.py). Here's a brief overv
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example outside of this Repl. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You would also need your own [Pinecone](https://www.pinecone.io/) index with presidential speech embeddings to run this code locally. You can create a Pinecone API key and one index for a small project for free on their Starter plan, and visit [this Cookbook post](https://github.com/groq/groq-api-cookbook/blob/dan/replit-conversion/presidential-speeches-rag/presidential-speeches-rag.ipynb) for more info on RAG and a guide to uploading these embeddings to a vector database
 
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Presidential-Speeches-RAG-with-Pinecone) or run it on the command line with `python main.py`.

+ 4 - 0
recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md

@@ -38,8 +38,12 @@ A well-crafted system prompt is essential for building a functional Text-to-SQL
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Building-a-Text-to-SQL-app-with-Groqs-JSON-mode) or run it on the command line with `python main.py`.
 
 ## Customizing with Your Own Data

+ 4 - 0
recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md

@@ -36,8 +36,12 @@ The verified SQL queries and their descriptions are stored in YAML files located
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Execute-Verified-SQL-Queries-with-Function-Calling) or run it on the command line with `python main.py`.
 
 ## Customizing with Your Own Data