
GoogleAIGeminiChatGenerator

This component enables chat completion using Google Gemini models.

| | |
| --- | --- |
| **Name** | GoogleAIGeminiChatGenerator |
| **Path** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/google_ai |
| **Most common position in a pipeline** | After a DynamicChatPromptBuilder |
| **Mandatory input variables** | "messages": a list of ChatMessage objects representing the chat |
| **Output variables** | "replies": a list of alternative replies of the model to the input chat |

GoogleAIGeminiChatGenerator supports gemini-pro, gemini-ultra, and gemini-pro-vision models.

Prompting with images requires gemini-pro-vision, while function calling requires gemini-pro.

Parameters Overview

GoogleAIGeminiChatGenerator uses a Google Makersuite API key for authentication. You can pass this key through the api_key parameter or set it as the GOOGLE_API_KEY environment variable (recommended).

To get an API key, visit the Google Makersuite website.
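The precedence between an explicitly passed key and the environment variable can be sketched as follows. This is a minimal illustration, not the integration's actual code; `resolve_api_key` is a hypothetical helper:

```python
import os
from typing import Optional


def resolve_api_key(explicit_key: Optional[str] = None) -> str:
    """Illustrative helper (not part of the integration): prefer an explicitly
    passed key, then fall back to the GOOGLE_API_KEY environment variable."""
    key = explicit_key or os.environ.get("GOOGLE_API_KEY")
    if not key:
        raise ValueError("Set GOOGLE_API_KEY or pass an api_key explicitly.")
    return key


os.environ["GOOGLE_API_KEY"] = "<MY_API_KEY>"
print(resolve_api_key())          # falls back to the environment variable
print(resolve_api_key("direct"))  # an explicit key takes precedence
```

Preferring the environment variable keeps the key out of source code and version control, which is why it is the recommended option.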

Usage

To begin working with GoogleAIGeminiChatGenerator, install the google-ai-haystack package:

```shell
pip install google-ai-haystack
```

On its own

Basic usage:

```python
import os
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.google_ai import GoogleAIGeminiChatGenerator

os.environ["GOOGLE_API_KEY"] = "<MY_API_KEY>"
gemini_chat = GoogleAIGeminiChatGenerator()

messages = [ChatMessage.from_user("Tell me the name of a movie")]
res = gemini_chat.run(messages)

print(res["replies"][0].content)
>>> The Shawshank Redemption

messages += res["replies"] + [ChatMessage.from_user("Who's the main actor?")]
res = gemini_chat.run(messages)

print(res["replies"][0].content)
>>> Tim Robbins
```
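The multi-turn pattern above — extend the history with the model's replies plus the next user message before calling run again — can be sketched with plain data, independent of the Gemini API. The SimpleMessage class and fake_generator function below are illustrative stand-ins for ChatMessage and the generator, not Haystack code:

```python
from dataclasses import dataclass


@dataclass
class SimpleMessage:
    """Illustrative stand-in for haystack.dataclasses.ChatMessage."""
    role: str
    content: str


def fake_generator(messages):
    """Stand-in for gemini_chat.run: echoes the last message back."""
    return {"replies": [SimpleMessage("assistant", f"reply to: {messages[-1].content}")]}


history = [SimpleMessage("user", "Tell me the name of a movie")]
res = fake_generator(history)

# Keep the full conversation: the model's replies first, then the follow-up.
history += res["replies"] + [SimpleMessage("user", "Who's the main actor?")]
res = fake_generator(history)

print(len(history))  # 3
```

Accumulating the replies into the history is what gives the model the context to resolve follow-ups like "Who's the main actor?".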

When chatting with Gemini Pro, you can also easily use function calls. First, define the function locally:

```python
from google.ai.generativelanguage import FunctionDeclaration, Tool

get_current_weather_func = FunctionDeclaration(
    name="get_current_weather",
    description="Get the current weather in a given location",
    parameters={
        "type_": "OBJECT",
        "properties": {
            "location": {"type_": "STRING", "description": "The city and state, e.g. San Francisco, CA"},
            "unit": {
                "type_": "STRING",
                "enum": [
                    "celsius",
                    "fahrenheit",
                ],
            },
        },
        "required": ["location"],
    },
)

tool = Tool(function_declarations=[get_current_weather_func])
```

For demonstration purposes, we’re also creating a function that will always tell us the same thing:

```python
def get_current_weather(location: str, unit: str = "celsius"):
    return {"weather": "sunny", "temperature": 21.8, "unit": unit}
```
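Since the model returns the call arguments as a dictionary, they unpack directly into this local function. A quick standalone check (the sample args dict is hypothetical model output):

```python
def get_current_weather(location: str, unit: str = "celsius"):
    return {"weather": "sunny", "temperature": 21.8, "unit": unit}


# Sample arguments, as the model might return them for the declared function:
args = {"location": "Berlin, Germany", "unit": "celsius"}
weather = get_current_weather(**args)
print(weather)  # {'weather': 'sunny', 'temperature': 21.8, 'unit': 'celsius'}
```

The `**args` unpacking works because the FunctionDeclaration's parameter names match the function's signature, with "location" required and "unit" optional.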

Create a new instance of GoogleAIGeminiChatGenerator to set the tools:

```python
import os
from haystack_integrations.components.generators.google_ai import GoogleAIGeminiChatGenerator

os.environ["GOOGLE_API_KEY"] = "<MY_API_KEY>"

gemini_chat = GoogleAIGeminiChatGenerator(model="gemini-pro", tools=[tool])
```

And then ask a question:

```python
from haystack.dataclasses import ChatMessage

messages = [ChatMessage.from_user("What is the temperature in celsius in Berlin?")]
res = gemini_chat.run(messages=messages)

print(res["replies"][0].content)
>>> {'location': 'Berlin, Germany', 'unit': 'metric'}

weather = get_current_weather(**res["replies"][0].content)

messages += res["replies"] + [ChatMessage.from_function(content=weather, name="get_current_weather")]

res = gemini_chat.run(messages=messages)
print(res["replies"][0].content)
>>> 21.8 Celsius
```

In a pipeline

```python
import os
from haystack.components.builders import DynamicChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack_integrations.components.generators.google_ai import GoogleAIGeminiChatGenerator


# no parameter init, we don't use any runtime template variables
prompt_builder = DynamicChatPromptBuilder()

os.environ["GOOGLE_API_KEY"] = "<MY_API_KEY>"
gemini_chat = GoogleAIGeminiChatGenerator()

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("gemini", gemini_chat)
pipe.connect("prompt_builder.prompt", "gemini.messages")

location = "Rome"
messages = [ChatMessage.from_user("Tell me briefly about {{location}} history")]
res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location}, "prompt_source": messages}})

print(res["gemini"]["replies"][0].content)
```

>>> - **753 B.C.:** Traditional date of the founding of Rome by Romulus and Remus.
>>> - **509 B.C.:** Establishment of the Roman Republic, replacing the Etruscan monarchy.
>>> - **492-264 B.C.:** Series of wars against neighboring tribes, resulting in the expansion of the Roman Republic's territory.
>>> - **264-146 B.C.:** Three Punic Wars against Carthage, resulting in the destruction of Carthage and the Roman Republic becoming the dominant power in the Mediterranean.
>>> - **133-73 B.C.:** Series of civil wars and slave revolts, leading to the rise of Julius Caesar.
>>> - **49 B.C.:** Julius Caesar crosses the Rubicon River, starting the Roman Civil War.
>>> - **44 B.C.:** Julius Caesar is assassinated, leading to the Second Triumvirate of Octavian, Mark Antony, and Lepidus.
>>> - **31 B.C.:** Battle of Actium, where Octavian defeats Mark Antony and Cleopatra, becoming the sole ruler of Rome.
>>> - **27 B.C.:** The Roman Republic is transformed into the Roman Empire, with Octavian becoming the first Roman emperor, known as Augustus.
>>> - **1st century A.D.:** The Roman Empire reaches its greatest extent, stretching from Britain to Egypt.
>>> - **3rd century A.D.:** The Roman Empire begins to decline, facing internal instability, invasions by Germanic tribes, and the rise of Christianity.
>>> - **476 A.D.:** The last Western Roman emperor, Romulus Augustulus, is overthrown by the Germanic leader Odoacer, marking the end of the Roman Empire in the West.
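Before the message reaches Gemini, the prompt builder renders the {{location}} placeholder from the template variables. A minimal stand-in for that substitution step (the real DynamicChatPromptBuilder uses Jinja2 templating; render_template here is purely illustrative) looks like:

```python
def render_template(template: str, variables: dict) -> str:
    """Illustrative stand-in for the prompt builder's Jinja2 rendering."""
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", str(value))
    return template


prompt = render_template("Tell me briefly about {{location}} history", {"location": "Rome"})
print(prompt)  # Tell me briefly about Rome history
```

This is why the pipeline input nests "template_variables" under "prompt_builder": the builder, not the generator, resolves the placeholders.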

Related Links

Check out the API reference in the GitHub repo or in our docs.