I Tried AISuite by Andrew Ng, and It Is GREAT!

Pankaj Singh | Last Updated: 17 Dec, 2024
9 min read

Andrew Ng recently released AISuite, an open-source Python package designed to streamline the use of large language models (LLMs) across multiple providers. This innovative tool simplifies the complexities of working with diverse LLMs by allowing seamless switching between models with a simple “provider:model” string. By significantly reducing integration overhead, AISuite enhances flexibility and accelerates application development, making it an invaluable resource for developers navigating the dynamic landscape of AI. In this article, we will see how effective it is.

AISuite by Andrew Ng

In this article, you will learn about AISuite, a Python library created by Andrew Ng’s team. We will explain how AISuite helps you easily work with different large language models (LLMs) and why it is useful for AI projects.

What is AISuite?

AISuite is an open-source project led by Andrew Ng, designed to make working with multiple large language model (LLM) providers easier and more efficient. Available on GitHub, it provides a simple, unified interface that allows seamless switching between LLMs using HTTP endpoints or SDKs, following OpenAI’s interface. This tool is ideal for students, educators, and developers, offering consistent and hassle-free interactions across various providers.

Supported by a team of open-source contributors, AISuite bridges the gap between different LLM frameworks. It enables users to integrate and compare models from providers like OpenAI, Anthropic, and Meta’s Llama with ease. The tool simplifies tasks such as generating text, conducting analyses, and building interactive systems. With features like streamlined API key management, customizable client configurations, and an intuitive setup, AISuite supports both simple applications and complex LLM-based projects.
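To make the core idea concrete before the full walkthrough, here is a minimal sketch of that switching in action, using the same client calls demonstrated later in this article (it assumes your provider API keys are already set as environment variables, which the next section covers):

import aisuite as ai

client = ai.Client()

messages = [{"role": "user", "content": "Say hello in one sentence."}]

# Swapping providers only requires changing the "provider:model" string.
for model in ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20241022"]:
    response = client.chat.completions.create(model=model, messages=messages)
    print(f"{model} -> {response.choices[0].message.content}")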

Implementation of AISuite

1. Install Necessary Libraries

!pip install openai
!pip install aisuite[all]
  • !pip install openai: Installs the OpenAI Python library, which is required to interact with OpenAI’s GPT models.
  • !pip install aisuite[all]: Installs AISuite along with optional dependencies needed to support multiple LLM providers.

2. Set API Keys for Authentication

import os
from getpass import getpass

os.environ['OPENAI_API_KEY'] = getpass('Enter your OPENAI API key: ')
os.environ['ANTHROPIC_API_KEY'] = getpass('Enter your ANTHROPIC API key: ')
  • os.environ: Sets environment variables to securely store the API keys required to access LLM services.
  • getpass(): Prompts the user to enter their OpenAI and Anthropic API keys securely (without displaying the input).
  • These keys authenticate your requests to the respective platforms (an alternative .env-based setup is sketched below).
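If you would rather not retype keys every session, a common alternative is to load them from a local .env file. This sketch assumes the separate python-dotenv package; the rest of the article sticks with the getpass approach above:

!pip install python-dotenv
from dotenv import load_dotenv

# Reads OPENAI_API_KEY / ANTHROPIC_API_KEY from a .env file in the
# working directory into os.environ, so no interactive prompt is needed.
load_dotenv()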

Also read: How to Generate Your Own OpenAI API Key and Add Credits?

3. Initialize the AISuite Client

import aisuite as ai

client = ai.Client()

This initializes an instance of the AISuite client, allowing interaction with multiple LLMs in a standardized way.
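The environment-variable route above is what this article uses throughout. If you prefer passing keys explicitly, the client can also be configured at construction time; the following is a minimal sketch, assuming a provider_configs argument keyed by provider name as described in the project's repository:

# Sketch only (assumed argument): keys passed explicitly instead of
# being read from environment variables.
client = ai.Client(provider_configs={
    "openai": {"api_key": "sk-..."},
    "anthropic": {"api_key": "sk-ant-..."},
})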

4. Define the Prompt (Messages)

messages = [
   {"role": "system", "content": "Talk using Pirate English."},
   {"role": "user", "content": "Tell a joke in 1 line."}
]
  • The messages list defines a conversation input:
    • role: "system": Provides instructions to the model (e.g., "Talk using Pirate English").
    • role: "user": Represents the user's query (e.g., "Tell a joke in 1 line").
  • This prompt ensures the responses follow a pirate theme and include a one-line joke.

5. Query the OpenAI Model

response = client.chat.completions.create(model="openai:gpt-4o", messages=messages, temperature=0.75)
print(response.choices[0].message.content)
  • model="openai:gpt-4o": Specifies the OpenAI GPT-4o model.
  • messages=messages: Sends the prompt defined earlier to the model.
  • temperature=0.75: Controls the randomness of the response. A higher temperature results in more creative outputs, while lower values produce more deterministic responses (a quick side experiment below shows this).
  • response.choices[0].message.content: Extracts the text content of the model’s response.
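To see the effect of temperature for yourself, you can rerun the same prompt at both extremes. This uses the same client and messages defined above; outputs will naturally vary between runs:

# Same prompt at a deterministic setting and a creative one
for t in (0.0, 1.0):
    response = client.chat.completions.create(
        model="openai:gpt-4o", messages=messages, temperature=t
    )
    print(f"temperature={t}: {response.choices[0].message.content}")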

6. Query the Anthropic Model

response = client.chat.completions.create(model="anthropic:claude-3-5-sonnet-20241022", messages=messages, temperature=0.75)
print(response.choices[0].message.content)
  • model="anthropic:claude-3-5-sonnet-20241022": Specifies the Anthropic Claude 3.5 Sonnet model.
  • The remaining parameters are identical to the OpenAI query. This demonstrates how AISuite enables easy switching between providers by changing the model parameter.

7. Query the Ollama Model

response = client.chat.completions.create(model="ollama:llama3.1:8b", messages=messages, temperature=0.75)
print(response.choices[0].message.content)
  • model="ollama:llama3.1:8b": Specifies the Llama 3.1 8B model served by Ollama. Note that Ollama runs models locally, so this call assumes an Ollama server is running on your machine (by default at http://localhost:11434) with the llama3.1:8b model already pulled.
  • Again, the parameters and logic are consistent, showcasing how AISuite provides a unified interface across providers.

Output

Why did the pirate go to school? To improve his "arrrrrrr-ticulation"!

Arrr, why don't pirates take a shower before they walk the plank? Because
they'll just wash up on shore later! 🏴‍☠️

Why did the scurvy dog's parrot go to the doctor? Because it had a fowl
temper, savvy?

Create a Chat Completion

!pip install openai
!pip install aisuite[all]

import os
from getpass import getpass
import aisuite as ai

os.environ['OPENAI_API_KEY'] = getpass('Enter your OPENAI API key: ')

client = ai.Client()

provider = "openai"
model_id = "gpt-4o"

messages = [
   {"role": "system", "content": "You are a helpful assistant."},
   {"role": "user", "content": "Give me a tabular comparison of RAG and AGENTIC RAG"},
]

response = client.chat.completions.create(
   model=f"{provider}:{model_id}",
   messages=messages,
)
print(response.choices[0].message.content)

Output

Certainly! Below is a tabular comparison of Retrieval-Augmented Generation
(RAG) and Agentic RAG.

| Feature | RAG | Agentic RAG |
|------------------------|--------------------------------------------------|----------------------------------------------------|
| Definition | A framework that combines retrieval from external documents with generation. | An extension of RAG that incorporates actions based on external interactions and dynamic decision-making. |
| Components | - Retrieval System (e.g., a search engine or document database) <br> - Generator (e.g., a language model) | - Retrieval System <br> - Generator <br> - Agentic Layer (action-taking and interaction controller) |
| Functionality | Retrieves relevant documents and generates responses based on prompted inputs combined with the retrieved information. | Adds the capability to take actions based on interactions, such as interacting with APIs, controlling devices, or dynamically gathering more information. |
| Use Cases | - Knowledge-based question answering <br> - Content summarization <br> - Open-domain dialogue systems | - Autonomous agents <br> - Interactive systems <br> - Decision-making applications <br> - Systems requiring context-based actions |
| Interaction | Limited to the input retrieval and output generation cycle. | Can interact with external systems or interfaces to gather data, execute tasks, and alter the environment based on objective functions. |
| Complexity | Generally simpler as it combines retrieval with generation without taking actions beyond generating text. | More complex due to its ability to interact with and modify the state of external environments. |
| Example of Application | Answering complex questions by retrieving parts of documents and synthesizing them into coherent answers. | Implementing a virtual assistant capable of performing tasks like scheduling appointments by accessing calendars, or a chatbot that manages customer service queries through actions. |
| Flexibility | Limited to the available retrieval corpus and generation model capabilities. | More flexible due to action-oriented interactions that can adapt to dynamic environments and conditions. |
| Decision-Making Ability | Limited decision-making based on static retrieval and generation. | Enhanced decision-making through dynamic interaction and adaptive behavior. |

This comparison outlines the foundational differences and capabilities
between traditional RAG systems and the more advanced, interaction-capable
Agentic RAG frameworks.

Each Model Uses a Different Provider

1. Installing and Importing Libraries

!pip install aisuite[all]
from pprint import pprint as pp

# Custom pprint function that wraps the built-in pretty-printer
# and allows a custom output width
def pprint(data, width=80):
    pp(data, width=width)
  • Installs the aisuite library with all optional dependencies.
  • Imports Python’s built-in pretty-printer (aliased to pp) and defines a custom pprint function that allows a custom output width; it is used later to format model responses for readability.

2. Setting Up API Keys

import os
from getpass import getpass
os.environ['GROQ_API_KEY'] = getpass('Enter your GROQ API key: ')

Prompts the user to input their GROQ API key, which is stored in the environment variable GROQ_API_KEY.

3. Initializing the AI Client

import aisuite as ai
client = ai.Client()

Initializes an AI client using the aisuite library to interact with different models.

4. Chat Completions

messages = [
    {"role": "system", "content": "You are a helpful agent, who answers with brevity."},
    {"role": "user", "content": 'Hi'},
]
response = client.chat.completions.create(model="groq:llama-3.2-3b-preview", messages=messages)
print(response.choices[0].message.content)

Output

How can I assist you?
  • Defines a chat with two messages:
    • A system message that sets the tone or behavior of the AI (concise responses).
    • A user message as input.
  • Sends the messages to the AI model groq:llama-3.2-3b-preview and prints the model’s response.

5. Function to Send Queries

def ask(message, sys_message="You are a helpful agent.",
         model="groq:llama-3.2-3b-preview"):
    client = ai.Client()
    messages = [
        {"role": "system", "content": sys_message},
        {"role": "user", "content": message}
    ]
    response = client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content
ask("Hi. what is capital of Japan?")

Output

'Hello. The capital of Japan is Tokyo.'
  • ask is a reusable function to send queries to the model.
  • Accepts:
    • message: The user’s query.
    • sys_message: Optional system instruction.
    • model: Specifies the AI model.
  • Sends the input and returns the AI’s response (see the usage sketch below).
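Because the system message and model are both parameters, the same helper can be redirected without touching its body. For example, using the OpenAI model ID shown earlier in this article:

print(ask("Summarize RAG in one sentence.",
          sys_message="You are a terse technical writer.",
          model="openai:gpt-4o"))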

6. Using Multiple APIs

os.environ['OPENAI_API_KEY'] = getpass('Enter your OPENAI API key: ')
os.environ['ANTHROPIC_API_KEY'] = getpass('Enter your ANTHROPIC API key: ')
print(ask("Who is your creator?"))
print(ask('Who is your creator?', model='anthropic:claude-3-5-sonnet-20240620'))
print(ask('Who is your creator?', model='openai:gpt-4o'))

Output

I was created by Meta AI, a leading artificial intelligence research
organization. My knowledge was developed from a large corpus of text, which
I use to generate human-like responses to user queries.

I was created by Anthropic.

I was developed by OpenAI, an organization that focuses on artificial
intelligence research and deployment.
  • Prompts the user for OpenAI and Anthropic API keys.
  • Sends a query (“Who is your creator?”) to different models:
    • groq:llama-3.2-3b-preview
    • anthropic:claude-3-5-sonnet-20240620
    • openai:gpt-4o
  • Prints the response from each model, showing how different systems interpret the same query.

7. Querying Multiple Models

models = [
    'llama-3.1-8b-instant',
    'llama-3.2-1b-preview',
    'llama-3.2-3b-preview',
    'llama3-70b-8192',
    'llama3-8b-8192'
]
ret = []
for x in models:
    ret.append(ask('Write a short one sentence explanation of the origins of AI?', model=f'groq:{x}'))
  • A list of different model identifiers (models) is defined.
  • Loops through each model and queries it with:
    • Write a short one sentence explanation of the origins of AI?
  • Stores responses in the list ret (a more defensive variant of this loop is sketched below).
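When looping over several hosted models, individual calls can fail (rate limits, retired preview model IDs, and so on). Here is an optional, more defensive variant of the same loop:

ret = []
for x in models:
    try:
        ret.append(ask('Write a short one sentence explanation of the origins of AI?',
                       model=f'groq:{x}'))
    except Exception as e:  # e.g., a rate limit or an unavailable model ID
        ret.append(f'request failed: {e}')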

8. Displaying Model Responses

for idx, x in enumerate(ret):
    pprint(models[idx] + ': \n ' + x + ' ')
  • Loops through the stored responses.
  • Formats and prints the model’s name along with its response, making it easy to compare outputs.

Output

('llama-3.1-8b-instant: \n'
 ' The origins of Artificial Intelligence (AI) date back to the 1956 Dartmouth '
 'Summer Research Project on Artificial Intelligence, where a group of '
 'computer scientists, led by John McCarthy, Marvin Minsky, Nathaniel '
 'Rochester, and Claude Shannon, coined the term and laid the foundation for '
 'the development of AI as a distinct field of study. ')

('llama-3.2-1b-preview: \n'
 ' The origins of Artificial Intelligence (AI) date back to the mid-20th '
 'century, when the first computer programs, which mimicked human-like '
 'intelligence through algorithms and rule-based systems, were developed by '
 'renowned mathematicians and computer scientists, including Alan Turing, '
 'Marvin Minsky, and John McCarthy in the 1950s. ')

('llama-3.2-3b-preview: \n'
 ' The origins of Artificial Intelligence (AI) date back to the 1950s, with '
 'the Dartmouth Summer Research Project on Artificial Intelligence, led by '
 'computer scientists John McCarthy, Marvin Minsky, and Nathaniel Rochester, '
 'marking the birth of AI as a formal field of research. ')

('llama3-70b-8192: \n'
 ' The origins of Artificial Intelligence (AI) can be traced back to the 1950s '
 'when computer scientist Alan Turing proposed the Turing Test, a method for '
 'determining whether a machine could exhibit intelligent behavior equivalent '
 'to, or indistinguishable from, that of a human. ')

('llama3-8b-8192: \n'
 ' The origins of Artificial Intelligence (AI) can be traced back to the '
 '1950s, when computer scientists DARPA funded the development of the first AI '
 'programs, such as the Logical Theorist, which aimed to simulate human '
 'problem-solving abilities and learn from experience. ')

Models provide varied responses to the query about the origins of AI, reflecting their training and reasoning capabilities. For instance:

  • Some models reference the Dartmouth Summer Research Project on AI.
  • Others mention Alan Turing or early DARPA-funded AI programs.

Key Features and Takeaways

  • Modularity: The script uses reusable functions (ask) to make querying efficient and customisable.
  • Multi-Model Interaction: Showcases the ability to interact with various AI systems, including Groq, OpenAI, and Anthropic.
  • Comparative Analysis: Facilitates comparison of responses across models for insights into their strengths and biases.
  • Real-Time Inputs: Supports dynamic input for API keys, ensuring secure integration.

This script is an excellent starting point for exploring different AI model capabilities and understanding their unique behaviours.
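As a closing sketch, the pieces above combine naturally into a single comparison helper. This simply reuses the ask function and the provider:model strings already shown in this article; nothing here is a new AISuite feature:

# Send one question to several "provider:model" strings and print each
# answer side by side for quick comparison.
def compare(question, models):
    for model in models:
        print(f"--- {model} ---")
        print(ask(question, model=model))

compare("Define AGI in one sentence.",
        ["groq:llama-3.2-3b-preview",
         "openai:gpt-4o",
         "anthropic:claude-3-5-sonnet-20240620"])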

Conclusion

AISuite is an essential tool for anyone navigating the world of large language models. It empowers users to harness the best of multiple AI providers while simplifying development and fostering innovation. Its open-source nature and thoughtful design underscore its potential as a cornerstone of modern AI application development.

It accelerates development and enhances flexibility by enabling seamless switching between models from providers like OpenAI, Anthropic, and Meta with minimal integration effort. Ideal for both simple and complex applications, AISuite supports modular workflows, API key management, and real-time multi-model comparisons. Its ease of use, scalability, and streamlined cross-provider interactions make it an invaluable resource for developers, researchers, and educators working with diverse LLMs in an evolving AI landscape.

If you are looking for a generative AI course online, explore the GenAI Pinnacle Program.

Frequently Asked Questions

Q1. What is AISuite?

Ans. AISuite is an open-source Python package created by Andrew Ng to streamline working with multiple large language models (LLMs) from various providers. It provides a unified interface for switching between models, simplifying integration and accelerating development.

Q2. Which providers are supported by AISuite?

Ans. AISuite currently supports the following providers: OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, HuggingFace, and Ollama.

Q3. Can I use AISuite with multiple AI providers at once?

Ans. Yes, AISuite supports querying multiple models from different providers simultaneously. You can send the same query to different models and compare their responses.

Q4. What is the key feature of AISuite?

Ans. AISuite’s key feature is its modularity and ability to integrate multiple LLMs into a single workflow. It also simplifies API key management and allows easy switching between models, facilitating quick comparisons and experimentation.

Q5. How do I install AISuite?

Ans. To install AISuite and necessary libraries, run:
!pip install aisuite[all]
!pip install openai

Hi, I am Pankaj Singh Negi - Senior Content Editor | Passionate about storytelling and crafting compelling narratives that transform ideas into impactful content. I love reading about technology revolutionizing our lifestyle.

Responses From Readers


Does aisuite support gemini-pro? I see they have support for google what ever that means, but little in the way of example provider strings.
