
Update the streaming API from str to bool based on API backend update

Chester Hu 10 months ago
Commit
44a7074292

+ 4 - 4
recipes/llama_api_providers/Azure_API_example/azure_api_example.ipynb

@@ -96,7 +96,7 @@
     "Streaming allows the generated tokens to be sent as data-only server-sent events whenever they become available.  \n",
     "This is extremely important for interactive applications such as chatbots, so the user is always engaged.  \n",
     "\n",
-    "To use streaming, simply set `\"stream\":\"True\"` as part of the request payload.  \n",
+    "To use streaming, simply set `\"stream\":True` as part of the request payload.  \n",
     "In the streaming mode, the REST API response will be different from non-streaming mode.\n",
     "\n",
     "Here is an example: "
@@ -108,7 +108,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "!curl -X POST -L https://your-endpoint.inference.ai.azure.com/v1/chat/completions -H 'Content-Type: application/json' -H 'Authorization: your-auth-key' -d '{\"messages\":[{\"content\":\"You are a helpful assistant.\",\"role\":\"system\"},{\"content\":\"Who wrote the book Innovators dilemma?\",\"role\":\"user\"}], \"max_tokens\": 500, \"stream\": \"True\"}'"
+    "!curl -X POST -L https://your-endpoint.inference.ai.azure.com/v1/chat/completions -H 'Content-Type: application/json' -H 'Authorization: your-auth-key' -d '{\"messages\":[{\"content\":\"You are a helpful assistant.\",\"role\":\"system\"},{\"content\":\"Who wrote the book Innovators dilemma?\",\"role\":\"user\"}], \"max_tokens\": 500, \"stream\": True}'"
    ]
   },
   {
@@ -170,7 +170,7 @@
     "            {\"role\":\"user\", \"content\":\"Who wrote the book Innovators dilemma?\"}], \n",
     "        \"max_tokens\": 500,\n",
     "        \"temperature\": 0.9,\n",
-    "        \"stream\": \"True\",\n",
+    "        \"stream\": True,\n",
     "}\n",
     "\n",
     "body = str.encode(json.dumps(data))\n",
@@ -230,7 +230,7 @@
     "            {\"role\":\"user\", \"content\":\"Who wrote the book Innovators dilemma?\"}],\n",
     "        \"max_tokens\": 500,\n",
     "        \"temperature\": 0.9,\n",
-    "        \"stream\": \"True\"\n",
+    "        \"stream\": True\n",
     "}\n",
     "\n",
     "\n",