
chore: update versions #8663

Merged: 1 commit merged from changeset-release/main into main on Jul 12, 2024

Conversation

@pngwn (Member) commented on Jun 28, 2024

This PR was opened by the Changesets release GitHub action. When you're ready to do a release, you can merge this PR and the packages will be published to npm automatically. If you're not ready to do a release yet, that's fine; whenever you add more changesets to main, this PR will be updated.

Releases

@gradio/[email protected]

Fixes

[email protected]

Fixes

[email protected]

Highlights

Support message format in chatbot 💬 (#8422 4221290)

gr.Chatbot and gr.ChatInterface now support the Messages API, which is fully compatible with LLM API providers such as Hugging Face Text Generation Inference, OpenAI's chat completions API, and Llama.cpp server.

Building Gradio applications around these LLM solutions is now even easier!

gr.Chatbot and gr.ChatInterface now have a type parameter that accepts two values: 'tuples' and 'messages'. If set to 'tuples', the default chatbot data format is expected. If set to 'messages', a list of dictionaries with content and role keys is expected. See below:

def chat_greeter(msg, history):
    history.append({"role": "assistant", "content": "Hello!"})
    return history
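
As a quick illustration (not part of the original changeset), a minimal sketch of wiring a chat function into gr.ChatInterface with type="messages"; the reply is returned as a plain string, which works with either format, and history arrives as a list of role/content dictionaries:

import gradio as gr

def echo(message, history):
    # With type="messages", history is a list of {"role": ..., "content": ...} dicts.
    return f"You said: {message}"

demo = gr.ChatInterface(echo, type="messages")

if __name__ == "__main__":
    demo.launch()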

Additionally, gradio now exposes a gr.ChatMessage dataclass you can use for IDE type hints and auto completion.
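
For instance, a minimal sketch using only the fields shown in the demo below:

from gradio import ChatMessage

# The dataclass mirrors the dict format, so IDEs can autocomplete role, content, and metadata.
greeting = ChatMessage(role="assistant", content="Hello!")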


Tool use in Chatbot 🛠️

The Gradio Chatbot can now natively display tool usage and intermediate thoughts common in Agent and chain-of-thought workflows!

If you are using the new "messages" format, simply add a metadata key whose value is a dictionary containing a title key. The assistant message will then be displayed in an expandable message box showing the result of a tool call or intermediate step.

import time

import gradio as gr
from gradio import ChatMessage


def generate_response(history):
    history.append(
        ChatMessage(role="user", content="What is the weather in San Francisco right now?")
    )
    yield history
    time.sleep(0.25)

    history.append(
        ChatMessage(
            role="assistant",
            content="In order to find the current weather in San Francisco, I will need to use my weather tool.",
        )
    )
    yield history
    time.sleep(0.25)

    # Messages with a metadata title render as expandable boxes in the Chatbot.
    history.append(
        ChatMessage(
            role="assistant",
            content="API Error when connecting to weather service.",
            metadata={"title": "💥 Error using tool 'Weather'"},
        )
    )
    yield history
    time.sleep(0.25)

    history.append(ChatMessage(role="assistant", content="I will try again"))
    yield history
    time.sleep(0.25)

    history.append(
        ChatMessage(
            role="assistant",
            content="Weather 72 degrees Fahrenheit with 20% chance of rain.",
            metadata={"title": "🛠️ Used tool 'Weather'"},
        )
    )
    yield history
    time.sleep(0.25)

    history.append(
        ChatMessage(role="assistant", content="Now that the API succeeded I can complete my task.")
    )
    yield history
    time.sleep(0.25)

    history.append(
        ChatMessage(
            role="assistant",
            content="It's a sunny day in San Francisco with a current temperature of 72 degrees Fahrenheit and a 20% chance of rain. Enjoy the weather!",
        )
    )
    yield history


with gr.Blocks() as demo:
    chatbot = gr.Chatbot(type="messages")
    button = gr.Button("Get San Francisco Weather")
    button.click(generate_response, chatbot, chatbot)

if __name__ == "__main__":
    demo.launch()

[Demo video: tool-box-demo]

Thanks @freddyaboulton!

Features

Fixes

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Highlights

Support message format in chatbot 💬 (#8422 4221290): see the full highlight notes under [email protected] above. Thanks @freddyaboulton!

Fixes

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Features

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Fixes

Dependency updates

@gradio/[email protected]

Features

  • #8733 fb0daf3 - Improvements to gr.Examples: adds events as attributes and documents them, and adds the sample_labels and visible properties (see the sketch below). Thanks @abidlabs!
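
A minimal sketch of the new properties (the sample_labels and visible argument names are assumed from the changeset description above, not verified against the gr.Examples docs):

import gradio as gr

with gr.Blocks() as demo:
    name = gr.Textbox(label="Name")
    gr.Examples(
        examples=[["Alice"], ["Bob"]],
        inputs=[name],
        sample_labels=["Greet Alice", "Greet Bob"],  # assumed: custom display label per example (#8733)
        visible=True,                                # assumed: Examples can now be shown or hidden
    )

if __name__ == "__main__":
    demo.launch()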

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Features

@gradio/[email protected]

Fixes

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Fixes

Dependency updates

@gradio/[email protected]

Features

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Features

Dependency updates

@gradio/[email protected]

Features

Dependency updates

@gradio/[email protected]

Fixes

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Fixes

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Fixes

Dependency updates

@gradio/[email protected]

Highlights

Support message format in chatbot 💬 (#8422 4221290): see the full highlight notes under [email protected] above. Thanks @freddyaboulton!

Features

Dependency updates

@gradio/[email protected]

Fixes

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Fixes

@gradio/[email protected]

Dependency updates

[email protected]

Highlights

Support message format in chatbot 💬 (#8422 4221290): see the full highlight notes under [email protected] above. Thanks @freddyaboulton!

Features

Fixes

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Dependency updates

@gradio/[email protected]

Features

Fixes

Dependency updates

@gradio/[email protected]

Dependency updates

@pngwn added the no-visual-update (skip chromatic deployment and tests), flaky-tests (runs flaky tests that use the HF API), and windows-tests (run backend tests on Windows; applied by default to the changeset release PR) labels on Jun 28, 2024
@gradio-pr-bot (Contributor) commented on Jun 28, 2024

🪼 branch checks and previews

Name        Status        URL
Spaces      ready!        Spaces preview
Website     ready!        Website preview
Storybook   building...
🦄 Changes  skipped!      Workflow log

Install Gradio from this PR

pip install https://gradio-builds.s3.amazonaws.com/7bdd359c16cde77d0c61eb4e1577f7e2b67d7af2/gradio-4.37.2-py3-none-any.whl

Install Gradio Python Client from this PR

pip install "gradio-client @ git+https://github.com/gradio-app/gradio@7bdd359c16cde77d0c61eb4e1577f7e2b67d7af2#subdirectory=client/python"

Install Gradio JS Client from this PR

npm install https://gradio-builds.s3.amazonaws.com/7bdd359c16cde77d0c61eb4e1577f7e2b67d7af2/gradio-client-1.3.0.tgz

@pngwn force-pushed the changeset-release/main branch 13 times, most recently from 7e881ab to 7eaf014 on July 8, 2024 at 11:05
@pngwn force-pushed the changeset-release/main branch 2 times, most recently from 34a3fc8 to 5e8a180 on July 9, 2024 at 23:54
@pngwn force-pushed the changeset-release/main branch 10 times, most recently from 18460fd to 2f898e3 on July 11, 2024 at 13:15
@pngwn force-pushed the changeset-release/main branch 12 times, most recently from 60e0d35 to e567844 on July 12, 2024 at 17:38
@abidlabs (Member) commented:

Tested locally and everything lgtm!

@abidlabs merged commit 1b74e21 into main on Jul 12, 2024
8 of 9 checks passed
@abidlabs deleted the changeset-release/main branch on July 12, 2024 at 18:54