@@ -1,6 +1,14 @@
{
"cells": [
{
+ "cell_type": "markdown",
+ "id": "8ac4ba3b-c438-4f2e-8f52-39846beb5642",
+ "metadata": {},
+ "source": [
+ "<a href=\"https://colab.research.google.com/github/meta-llama/llama-recipes/blob/main/recipes/use_cases/agents/langchain/langgraph-tool-calling-agent.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ ]
+ },
+ {
"cell_type": "code",
"execution_count": null,
"id": "974c2eb0-4844-4e0d-ae91-91571d070a3f",
@@ -30,8 +38,8 @@
"\n",
"It allows for more customization than agent executor:\n",
"\n",
- "1) It allows us to define `nodes` for our assistant (which decides whether to call a tool) and our actions (tool calls)\n",
- "2) It allows us to define specific `edges` that connect these nodes (e.g., based upon whether a tool call is decided)\n",
+ "1) It allows us to define `nodes` for our assistant (which decides whether to call a tool) and our actions (tool calls).\n",
+ "2) It allows us to define specific `edges` that connect these nodes (e.g., based upon whether a tool call is decided).\n",
"3) It enables `cycles`, where we can call our assistant in a loop until a stopping condition.\n",
"\n",
"\n",
@@ -42,7 +50,7 @@
"\n",
"As before, we'll use [Tavily](https://tavily.com/#api) for web search.\n",
"\n",
- "We'll use Replicate for various multi-modal capabilities.\n",
+ "We'll use [Replicate](https://replicate.com/), which offers a free-to-try API key, for various multi-modal capabilities.\n",
"\n",
"We can review LangChain LLM integrations that support tool calling [here](https://python.langchain.com/docs/integrations/chat/).\n",
"\n",
@@ -250,9 +258,9 @@
"source": [
"### Assistant \n",
"\n",
- "This is llama3, with tool-calling, using [Groq](https://python.langchain.com/v0.1/docs/integrations/chat/groq/).\n",
+ "This is Llama 3, with tool-calling, using [Groq](https://python.langchain.com/v0.1/docs/integrations/chat/groq/).\n",
"\n",
- "We bind the available tools to the llm. \n",
+ "We bind the available tools to Llama 3. \n",
"\n",
"And we further specify the available tools in our assistant prompt."
]
@@ -815,7 +823,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.11.9"
+ "version": "3.10.14"
}
},
"nbformat": 4,