Getting Started

Chat with datasources

With Superagent you can easily create an AI Assistant that has access to your private data through a concept we call Datasources. Superagent supports several different types of datasources. Attaching a datasource to an Assistant enables that Assistant to read and analyze that data.

Step-by-step guide

Note that you usually only need to create the llm object once and re-use it for subsequent agents you create.


1. Start by configuring an LLM and creating an agent:

```python
import os
from superagent.client import Superagent

client = Superagent(
    base_url="https://api.beta.superagent.sh",
    token=os.environ["SUPERAGENT_API_KEY"]
)

llm = client.llm.create(request={
    "provider": "OPENAI",
    "apiKey": os.environ["OPENAI_API_KEY"]
})

agent = client.agent.create(request={
    "name": "Earnings Assistant",
    "description": "An Assistant that's an expert at analyzing earnings reports",
    "avatar": "",
    "isActive": True,
    "initialMessage": "Hi there! How can I help you?",
    "llmModel": "GPT_3_5_TURBO_16K_0613",
    "prompt": "You are an expert at analyzing earnings reports."
})

client.agent.add_llm(agent_id=agent.data.id, llm_id=llm.data.id)
```
2. Now let's create a datasource by uploading Tesla's Q3 2023 earnings report to Superagent. Note that the `description` key tells the Assistant when this datasource should be used. Be careful what you put in your description, as it may affect the Assistant's behaviour.

```python
datasource = client.datasource.create(request={
    "name": "Tesla Q3 2023",
    "description": "Useful for answering questions about Tesla's Q3 2023 earnings report",
    "type": "PDF",
    "url": "https://digitalassets.tesla.com/tesla-contents/image/upload/IR/TSLA-Q3-2023-Update-3.pdf"
})

# Connect the datasource to the Agent
client.agent.add_datasource(
    agent_id=agent.data.id,
    datasource_id=datasource.data.id
)
```
3. Now we can start chatting with the Assistant:

```python
prediction = client.agent.invoke(
    agent_id=agent.data.id,
    input="What was Tesla's revenue?",
    enable_streaming=False,
    session_id="my_session_id"
)

print(prediction.data.get("output"))
```
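Since every invocation takes the same handful of arguments, it can be convenient to wrap the call above in a small helper for follow-up questions. A minimal sketch, assuming the `invoke` signature shown above; the `ask` helper is our own name, not part of the SDK, and re-using one `session_id` presumably keeps the questions in a single conversation:

```python
def ask(client, agent_id, question, session_id="my_session_id"):
    """Send one question to the Assistant and return its text output.

    Calls client.agent.invoke exactly as in the step above; passing the
    same session_id on each call keeps the questions in one session.
    """
    prediction = client.agent.invoke(
        agent_id=agent_id,
        input=question,
        enable_streaming=False,
        session_id=session_id,
    )
    return prediction.data.get("output")
```

With this in place, a short back-and-forth is just repeated `ask(client, agent.data.id, ...)` calls.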

That's it! You can attach one or many datasources to your Assistants by following the same pattern. Be aware that you may run into rate limiting when adding many large datasources at once.
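Attaching several datasources is just the create-and-attach pattern in a loop. A minimal sketch, assuming hypothetical report names and URLs; the `build_request` and `attach_all` helpers are our own, and the pause between uploads is one simple way to stay under any rate limits:

```python
import time

def build_request(name, url):
    # Payload in the same shape as the single-datasource example above.
    return {
        "name": name,
        "description": f"Useful for answering questions about {name}",
        "type": "PDF",
        "url": url,
    }

def attach_all(client, agent_id, reports):
    # Create each datasource and attach it to the agent, pausing briefly
    # between uploads to soften potential rate limiting.
    for name, url in reports:
        datasource = client.datasource.create(request=build_request(name, url))
        client.agent.add_datasource(
            agent_id=agent_id,
            datasource_id=datasource.data.id,
        )
        time.sleep(1)
```

For example, `attach_all(client, agent.data.id, [("Tesla Q2 2023", "https://example.com/q2.pdf")])` would upload and attach one more report.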

Full code

```python
import os
from superagent.client import Superagent

client = Superagent(
    base_url="https://api.beta.superagent.sh",
    token=os.environ["SUPERAGENT_API_KEY"]
)

llm = client.llm.create(request={
    "provider": "OPENAI",
    "apiKey": os.environ["OPENAI_API_KEY"]
})

agent = client.agent.create(request={
    "name": "Earnings Assistant",
    "description": "An Assistant that's an expert at analyzing earnings reports",
    "avatar": "https://tesla.com/logo.png",  # Replace with a real image
    "isActive": True,
    "llmModel": "GPT_3_5_TURBO_16K_0613",
    "initialMessage": "Hi there, how can I help you?",
    "prompt": "You are an expert at analyzing earnings reports.\nUse the earnings reports provided to answer any questions."
})

datasource = client.datasource.create(request={
    "name": "Tesla Q3 2023",
    "description": "Useful for answering questions about Tesla's Q3 2023 earnings report",
    "type": "PDF",
    "url": "https://digitalassets.tesla.com/tesla-contents/image/upload/IR/TSLA-Q3-2023-Update-3.pdf"
})

# Connect the datasource and LLM to the Agent
client.agent.add_datasource(
    agent_id=agent.data.id,
    datasource_id=datasource.data.id
)
client.agent.add_llm(agent_id=agent.data.id, llm_id=llm.data.id)

# Invoke the Assistant
prediction = client.agent.invoke(
    agent_id=agent.data.id,
    input="What was Tesla's revenue?",
    enable_streaming=False,
    session_id="my_session_id"
)

print(prediction.data.get("output"))

# Tesla's revenue was 24 billion USD according to the earnings report.
```

Replit template

We've created a Replit template for this guide that you can run directly.