Removed duplicate output responses #77


Open · wants to merge 1 commit into `master`
127 changes: 7 additions & 120 deletions anthropic_api_fundamentals/04_parameters.ipynb
@@ -155,7 +155,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Of course, if we try generating a poem again with a larger value for `max_tokens`, we'll likely get an entire poem:"
"Of course, if we try generating a poem again with a larger value for `max_tokens`, we'll likely get an entire poem. This is what Claude generated with `max_tokens` set to 500:"
]
},
{
@@ -209,35 +209,6 @@
"print(longer_poem_response.content[0].text)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This is what Claude generated with `max_tokens` set to 500:\n",
"\n",
"```\n",
"Here is a poem for you:\n",
"\n",
"Whispers of the Wind\n",
"\n",
"The wind whispers softly,\n",
"Caressing my face with care.\n",
"Its gentle touch, a fleeting breath,\n",
"Carries thoughts beyond compare.\n",
"\n",
"Rustling leaves dance in rhythm,\n",
"Swaying to the breeze's song.\n",
"Enchanting melodies of nature,\n",
"Peaceful moments linger long.\n",
"\n",
"The wind's embrace, a soothing balm,\n",
"Calms the restless soul within.\n",
"Embracing life's fleeting moments,\n",
"As the wind's sweet song begins.\n",
"```\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -276,7 +247,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"It's also important to know that increasing `max_tokens` does not ensure that Claude actually generates a specific number of tokens. If we ask Claude to write a joke and set `max_tokens` to 1000, we'll almost certainly get a response that is much shorter than 1000 tokens."
"It's also important to know that increasing `max_tokens` does not ensure that Claude actually generates a specific number of tokens. If we ask Claude to write a joke and set `max_tokens` to 1000, we'll almost certainly get a response that is much shorter than 1000 tokens. In the example below, we ask Claude to \"Tell me a joke\" and give `max_tokens` a value of 1000. It generated this joke:"
]
},
{
@@ -330,23 +301,6 @@
"print(response.usage.output_tokens)"
]
},
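The distinction the notebook draws here (a ceiling, not a target) can be sketched without calling the API. This is a minimal illustration, not Anthropic's implementation; the `stop_reason` values `"end_turn"` and `"max_tokens"` are the ones the Messages API reports, but the response dicts below are hypothetical stand-ins:

```python
# Minimal sketch (no API call): max_tokens is a ceiling, not a target.
# A response that finishes naturally reports stop_reason "end_turn";
# it reports "max_tokens" only when it actually hit the ceiling.

def was_truncated(stop_reason: str) -> bool:
    """Return True if generation stopped because it hit max_tokens."""
    return stop_reason == "max_tokens"

# Hypothetical values mirroring the joke example above:
short_joke = {"stop_reason": "end_turn", "output_tokens": 55}
cut_off = {"stop_reason": "max_tokens", "output_tokens": 1000}

print(was_truncated(short_joke["stop_reason"]))  # False: finished well under the ceiling
print(was_truncated(cut_off["stop_reason"]))     # True: ran into the limit
```

Checking `stop_reason` like this is the reliable way to tell whether a response was cut short, rather than comparing `output_tokens` against the limit you set.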
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In the above example, we ask Claude to \"Tell me a joke\" and give `max_tokens` a value of 1000. It generated this joke: \n",
"\n",
"```\n",
"Here's a classic dad joke for you:\n",
"\n",
"Why don't scientists trust atoms? Because they make up everything!\n",
"\n",
"How was that? I tried to keep it clean and mildly amusing. Let me know if you'd like to hear another joke.\n",
"```\n",
"\n",
"That generated content was only 55 tokens long. We gave Claude a ceiling of 1000 tokens, but that doesn't mean it will generate 1000 tokens."
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -363,7 +317,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's take a look at how the number of tokens generated by Claude can impact performance. The following function asks Claude to generate a very long dialogue between two characters three different times, each with a different value for `max_tokens`. It then prints out how many tokens were actually generated and how long the generation took."
"Let's take a look at how the number of tokens generated by Claude can impact performance. The following function asks Claude to generate a very long dialogue between two characters three different times, each with a different value for `max_tokens`. It then prints out how many tokens were actually generated and how long the generation took. If you run the code, the exact values you get will likely differ, but here's one example output:"
]
},
{
@@ -425,19 +379,6 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"If you run the code, the exact values you get will likely differ, but here's one example output: \n",
"\n",
"```\n",
"Number of tokens generated: 100\n",
"Execution Time: 1.51 seconds\n",
"\n",
"Number of tokens generated: 1000\n",
"Execution Time: 8.33 seconds\n",
"\n",
"Number of tokens generated: 3433\n",
"Execution Time: 28.80 seconds\n",
"```\n",
"\n",
"As you can see, **the more tokens that Claude generates, the longer it takes!**"
]
},
Expand All @@ -463,7 +404,7 @@
"\n",
"Another important parameter we haven't seen yet is `stop_sequences`, which allows us to provide the model with a set of strings that, when encountered in the generated response, cause the generation to stop. They are essentially a way of telling Claude, \"if you generate this sequence, stop generating anything else!\"\n",
"\n",
"Here's an example of a request that does not include a `stop_sequence`:"
"Here's an example of a request that does not include `stop_sequences`. The code below asks Claude to generate a JSON object representing a person. Here's an example output Claude generated:"
]
},
{
@@ -508,28 +449,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The above code asks Claude to generate a JSON object representing a person. Here's an example output Claude generated: \n",
"\n",
"```\n",
"Here's an example of a JSON object representing a person with a name, email, and phone number:\n",
"\n",
"{\n",
" \"name\": \"John Doe\",\n",
" \"email\": \"[email protected]\",\n",
" \"phoneNumber\": \"123-456-7890\"\n",
"}\n",
"\n",
"\n",
"In this example, the JSON object has three key-value pairs:\n",
"\n",
"1. \"name\": The person's name, which is a string value of \"John Doe\".\n",
"2. \"email\": The person's email address, which is a string value of \"[email protected]\".\n",
"3. \"phoneNumber\": The person's phone number, which is a string value of \"123-456-7890\".\n",
"\n",
"You can modify the values to represent a different person with their own name, email, and phone number.\n",
"```\n",
"\n",
"Claude did generate the requested object, but also included an explanation afterwards. If we wanted Claude to stop generating as soon as it generated the closing \"}\" of the JSON object, we could modify the code to include the `stop_sequences` parameter."
"Claude did generate the requested object, but also included an explanation afterwards. If we wanted Claude to stop generating as soon as it generated the closing \"}\" of the JSON object, we could modify the code to include the `stop_sequences` parameter. The model generated the following output:"
]
},
{
@@ -565,16 +485,6 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The model generated the following output:\n",
"\n",
"```\n",
"Here's a JSON object representing a person with a name, email, and phone number:\n",
"{\n",
" \"name\": \"John Doe\",\n",
" \"email\": \"[email protected]\",\n",
" \"phone\": \"555-1234\"\n",
"\n",
"```\n",
"**IMPORTANT NOTE:** Notice that the resulting output does **not** include the \"}\" stop sequence itself. If we wanted to use and parse this as JSON, we would need to add the closing \"}\" back in."
]
},
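The stop-sequence behavior described above can be sketched locally. This is a pure-Python illustration of the semantics (stop at the earliest match, exclude the matched sequence, report which sequence fired) under the assumption that the API behaves as the notebook describes; it is not how the API is actually implemented:

```python
# Minimal sketch of stop_sequences semantics (pure Python, no API call):
# generation stops at the earliest occurrence of any stop sequence, and the
# matched sequence itself is NOT included in the returned text.

def apply_stop_sequences(text, stop_sequences):
    """Return (truncated_text, matched_sequence_or_None)."""
    earliest, matched = None, None
    for seq in stop_sequences:
        idx = text.find(seq)
        if idx != -1 and (earliest is None or idx < earliest):
            earliest, matched = idx, seq
    if matched is None:
        return text, None
    return text[:earliest], matched

raw = '{\n  "name": "John Doe",\n  "email": "john@example.com"\n}\nIn this example...'
truncated, seq = apply_stop_sequences(raw, ["}"])
print(seq)        # "}" -- mirrors the stop_sequence property on the response
print(truncated)  # everything up to, but not including, the closing "}"
```

This also mirrors why the closing `"}"` has to be added back before parsing the result as JSON.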
@@ -638,7 +548,7 @@
"source": [
"We can provide multiple stop sequences. In the event that we provide multiple, the model will stop generating as soon as it encounters any of the stop sequences. The resulting `stop_sequence` property on the response Message will tell us which exact `stop_sequence` was encountered. \n",
"\n",
"The function below asks Claude to write a poem and stop if it ever generates the letters \"b\" or \"c\". It does this three times:"
"The function below asks Claude to write a poem and stop if it ever generates the letters \"b\" or \"c\". It does this three times. Here's an example output:"
]
},
{
Expand Down Expand Up @@ -681,14 +591,6 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Here's an example output: \n",
"\n",
"```\n",
"Response 1 stopped because stop_sequence. The stop sequence was c\n",
"Response 2 stopped because stop_sequence. The stop sequence was b\n",
"Response 3 stopped because stop_sequence. The stop sequence was b\n",
"```\n",
"\n",
"The first time through, Claude stopped writing the poem because it generated the letter \"c\". The following two times, it stopped because it generated the letter \"b\". Would you ever do this? Probably not!"
]
},
@@ -722,7 +624,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's try a quick demonstration. Take a look at the function below. Using a temperature of 0 and then a temperature of 1, we make three requests to Claude, asking it to \"Come up with a name for an alien planet. Respond with a single word.\" "
"Let's try a quick demonstration. Take a look at the function below. Using a temperature of 0 and then a temperature of 1, we make three requests to Claude, asking it to \"Come up with a name for an alien planet. Respond with a single word.\" Here's an example result of running the function (your specific results may vary):"
]
},
{
Expand Down Expand Up @@ -777,21 +679,6 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"This is the result of running the above function (your specific results may vary): \n",
"\n",
"```\n",
"Prompting Claude three times with temperature of 0\n",
"================\n",
"Response 1: Xendor.\n",
"Response 2: Xendor.\n",
"Response 3: Xendor.\n",
"Prompting Claude three times with temperature of 1\n",
"================\n",
"Response 1: Xyron.\n",
"Response 2: Xandar.\n",
"Response 3: Zyrcon.\n",
"```\n",
"\n",
"Notice that with a temperature of 0, all three responses are the same. Note that even with a temperature of 0.0, the results will not be fully deterministic. However, there is a clear difference when compared to the results with a temperature of 1. Each response was a completely different alien planet name. "
]
},
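The effect temperature has on sampling can be sketched in a few lines. This is a generic temperature-scaled softmax, a common convention in which temperature 0 is treated as greedy (argmax) decoding; it is an illustration of the general technique, not Claude's actual sampler, and the logits are made-up scores for three hypothetical candidate tokens:

```python
import math

# Minimal sketch of temperature sampling: logits are divided by the
# temperature before the softmax, so low temperatures sharpen the
# distribution toward the top token and high temperatures flatten it.

def softmax_with_temperature(logits, temperature):
    if temperature == 0:
        # Temperature 0 is conventionally treated as greedy decoding:
        # all probability mass goes to the highest-scoring token.
        best = max(range(len(logits)), key=lambda i: logits[i])
        return [1.0 if i == best else 0.0 for i in range(len(logits))]
    exps = [math.exp(score / temperature) for score in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # hypothetical scores for three candidate tokens
print(softmax_with_temperature(logits, 0))    # all mass on the first token
print(softmax_with_temperature(logits, 1.0))  # mass spread across all three
```

This is why temperature 0 gave the same planet name three times while temperature 1 produced a different name on each run, and why even near-zero temperatures only make repeats likely rather than guaranteed.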