Maintained by deepset

Integration: Anthropic

Use Anthropic Models with Haystack

Authors
deepset


Overview

This integration supports Anthropic Claude models provided through Anthropic’s own inferencing infrastructure. For a full list of available models, check out the Anthropic Claude documentation.

Haystack 2.0

You can use Anthropic models with AnthropicGenerator and AnthropicChatGenerator.

The currently available models are:

  • claude-2.1
  • claude-3-haiku-20240307
  • claude-3-sonnet-20240229 (default)
  • claude-3-opus-20240229

Installation

pip install anthropic-haystack

Usage

Based on your use case, you can choose between AnthropicGenerator or AnthropicChatGenerator to work with Anthropic models. To learn more about the difference, visit the Generators vs Chat Generators guide.
Before using these components, make sure to set the ANTHROPIC_API_KEY environment variable.
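For example, on Linux or macOS you can set the variable in your shell before running the examples below (the key value here is a placeholder, not a real key):

```shell
# Set the Anthropic API key for the current shell session (placeholder value).
export ANTHROPIC_API_KEY="your-api-key-here"

# Verify that it is set.
echo "$ANTHROPIC_API_KEY"
```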

Using AnthropicChatGenerator

Below is an example RAG pipeline that answers a predefined question using the contents of a URL pointing to Anthropic's prompt engineering guide. We fetch the contents of the URL and generate an answer with the AnthropicChatGenerator.

# To run this example, you will need to set an `ANTHROPIC_API_KEY` environment variable.

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.converters import HTMLToDocument
from haystack.components.fetchers import LinkContentFetcher
from haystack.components.generators.utils import print_streaming_chunk
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

from haystack_integrations.components.generators.anthropic import AnthropicChatGenerator

messages = [
    ChatMessage.from_system("You are a prompt expert who answers questions based on the given documents."),
    ChatMessage.from_user(
        "Here are the documents:\n"
        "{% for d in documents %} \n"
        "    {{d.content}} \n"
        "{% endfor %}"
        "\nAnswer: {{query}}"
    ),
]

rag_pipeline = Pipeline()
rag_pipeline.add_component("fetcher", LinkContentFetcher())
rag_pipeline.add_component("converter", HTMLToDocument())
rag_pipeline.add_component("prompt_builder", ChatPromptBuilder(variables=["documents"]))
rag_pipeline.add_component(
    "llm",
    AnthropicChatGenerator(
        api_key=Secret.from_env_var("ANTHROPIC_API_KEY"),
        streaming_callback=print_streaming_chunk,
    ),
)


rag_pipeline.connect("fetcher", "converter")
rag_pipeline.connect("converter", "prompt_builder")
rag_pipeline.connect("prompt_builder.prompt", "llm.messages")

question = "When should we use prompt engineering and when should we fine-tune?"
rag_pipeline.run(
    data={
        "fetcher": {"urls": ["https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview"]},
        "prompt_builder": {"template_variables": {"query": question}, "template": messages},
    }
)

Using AnthropicGenerator

Below is an example of using AnthropicGenerator:

from haystack_integrations.components.generators.anthropic import AnthropicGenerator

client = AnthropicGenerator()
response = client.run("What's Natural Language Processing? Be brief.")
print(response)

>>{'replies': ['Natural language processing (NLP) is a branch of artificial intelligence focused on enabling
>>computers to understand, interpret, and manipulate human language. The goal of NLP is to read, decipher,
>> understand, and make sense of the human languages in a manner that is valuable.'], 'meta': {'model':
>> 'claude-2.1', 'index': 0, 'finish_reason': 'end_turn', 'usage': {'input_tokens': 18, 'output_tokens': 58}}}
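The result is a plain dictionary, so you can pull out the reply and the token usage directly. Here is a small sketch using a response shaped like the output above (the values are copied from that example, not from a live API call):

```python
# A response dictionary shaped like the example output above (no API call made).
response = {
    "replies": [
        "Natural language processing (NLP) is a branch of artificial intelligence ..."
    ],
    "meta": {
        "model": "claude-2.1",
        "index": 0,
        "finish_reason": "end_turn",
        "usage": {"input_tokens": 18, "output_tokens": 58},
    },
}

# Extract the generated answer and the total token count.
answer = response["replies"][0]
usage = response["meta"]["usage"]
total_tokens = usage["input_tokens"] + usage["output_tokens"]
print(total_tokens)  # 76
```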

Haystack 1.x

You can use Anthropic Claude in your Haystack 1.x pipelines with the PromptNode, which can also be used with an Agent.

Installation (1.x)

pip install farm-haystack[inference]

Usage (1.x)

You can use Anthropic models in various ways:

Using Claude with PromptNode

To use Claude for prompting and generating answers, initialize a PromptNode with the model name, your Anthropic API key and a prompt template. You can then use this PromptNode in a question answering pipeline to generate answers based on the given context.

Below is an example of a PromptNode that uses a custom PromptTemplate:

from haystack.nodes import PromptTemplate, PromptNode

prompt_text = """
Answer the following question.
Question: {query}
Answer:
"""

prompt_template = PromptTemplate(prompt=prompt_text)

prompt_node = PromptNode(
    model_name_or_path="claude-2",
    default_prompt_template=prompt_template,
    api_key="YOUR_ANTHROPIC_API_KEY",
    max_length=768,
    model_kwargs={"stream": True},
)
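At query time, the `{query}` placeholder in the template is filled in with the incoming question, roughly the way Python's `str.format` works. The real PromptTemplate has its own rendering logic; this is only an illustrative sketch:

```python
prompt_text = """
Answer the following question.
Question: {query}
Answer:
"""

# Illustrative only: approximate how PromptTemplate substitutes {query}.
filled = prompt_text.format(query="What's Natural Language Processing?")
print(filled)
```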

Using Claude for Agents

To use Claude with an Agent, simply provide a PromptNode that uses Claude to the Agent:

from haystack.agents import Agent
from haystack.nodes import PromptNode

prompt_node = PromptNode(model_name_or_path="claude-2", api_key="YOUR_ANTHROPIC_API_KEY", stop_words=["Observation:"])
agent = Agent(prompt_node=prompt_node)