Removing references to Llama 2 specifically

Beto de Paola, 1 year ago
commit 751732b15a

+ 3 - 3
recipes/llama_api_providers/examples_with_aws/getting_started_llama2_on_amazon_bedrock.ipynb

@@ -6,8 +6,8 @@
     "id": "lbfIu_3eEaAh"
    },
    "source": [
-    "# Using Amazon Bedrock with Llama 2\n",
-    "Use this notebook to quickly get started with Llama 2 on Bedrock. You can access the Amazon Bedrock API using the AWS Python SDK.\n",
+    "# Using Amazon Bedrock with Llama\n",
+    "Use this notebook to quickly get started with Llama on Bedrock. You can access the Amazon Bedrock API using the AWS Python SDK.\n",
     "\n",
     "In this notebook, we will give you some simple code to confirm to get up and running with the AWS Python SDK, setting up credentials, looking up the list of available Meta Llama models, and using bedrock to inference.\n",
     "\n",
@@ -243,7 +243,7 @@
     "prompt_1 = \"Explain black holes to 8th graders\"\n",
     "prompt_2 = \"Tell me about llamas\"\n",
     "\n",
-    "# Let's now run the same prompt with Llama 2 13B and 70B to compare responses\n",
+    "# Let's now run the same prompt with Llama 3 8B and 70B to compare responses\n",
     "print(\"\\n=======LLAMA-3-8B====PROMPT 1================>\", prompt_1)\n",
     "response_8b_prompt1 = invoke_model(bedrock_runtime, 'meta.llama3-8b-instruct-v1:0', prompt_1, 256)\n",
     "print(\"\\n=======LLAMA-3-70B====PROMPT 1================>\", prompt_1)\n",
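For context, the `invoke_model` helper called in the changed cells is defined earlier in the notebook and not shown in this diff. A minimal sketch of what such a helper looks like is below; the request-body fields (`prompt`, `max_gen_len`, `temperature`, `top_p`) and the `generation` response key follow Bedrock's schema for Meta Llama models, and the parameter defaults are assumptions, not values taken from this diff:

```python
import json


def build_llama_body(prompt, max_gen_len=256, temperature=0.5, top_p=0.9):
    """Build the JSON request body Amazon Bedrock expects for Meta Llama models.

    Defaults here are illustrative assumptions, not the notebook's values.
    """
    return json.dumps({
        "prompt": prompt,
        "max_gen_len": max_gen_len,
        "temperature": temperature,
        "top_p": top_p,
    })


def invoke_model(bedrock_runtime, model_id, prompt, max_gen_len):
    """Send a prompt to the given Llama model ID and return the generated text.

    bedrock_runtime is a boto3 'bedrock-runtime' client, e.g.
    boto3.client("bedrock-runtime", region_name="us-east-1").
    """
    response = bedrock_runtime.invoke_model(
        modelId=model_id,
        body=build_llama_body(prompt, max_gen_len),
    )
    # The response body is a streaming object containing a JSON document.
    return json.loads(response["body"].read())["generation"]
```

With a helper of this shape, the calls in the diff (`invoke_model(bedrock_runtime, 'meta.llama3-8b-instruct-v1:0', prompt_1, 256)`) resolve to a single Bedrock `InvokeModel` request per prompt.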