Quick Start - Python

A quick example of using Interlify in Python.

Register

You can register for free on the Interlify registration page. Skip this step if you already have an Interlify account.

Log in, and you will be redirected to the dashboard.

Create Tools

In Interlify, tools are function definitions: each tool represents a single interaction (one API call) with your internal service.

Interlify's cutting-edge AI can understand your API specification (in OpenAPI format) and generate the tools for you. You just need to import the OpenAPI specification in YAML or JSON format.
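For illustration, a minimal OpenAPI fragment like the one below (a hypothetical weather endpoint, not part of Interlify) is the kind of input the generator expects. Each operation (here `getWeather`) would become one tool:

```yaml
openapi: 3.0.0
info:
  title: Weather Service
  version: 1.0.0
paths:
  /weather:
    get:
      operationId: getWeather
      summary: Get current weather for a city
      parameters:
        - name: city
          in: query
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Current weather data
```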

To create tools, follow these steps:

  1. On the Dashboard page, go to the Tools section.

  2. In the Create Tools section, click the "Select File" button to import your OpenAPI file, then click "Generate Tools". Your tools will be ready in seconds!

You may need to refresh the page to see the created tools.

You can check each tool's details by clicking the tool. You can update the tool details or delete the tool.

Create a Project

A project contains a group of tools that can be fetched by the client and provided to Large Language Models.

To create a project, follow these steps:

  1. On the Dashboard page, go to the Projects section.

  2. Click the "New Project +" button to create a new project. Enter a project name and select the tools you want to group under this project, then click the "Save" button.

The project setup is done!

Create an API Key

An API key is associated with your account. It is required to use Interlify in your code and is used to verify your identity.

To create an API key, follow these steps:

  1. On the Dashboard page, go to the API Keys section.

  2. In the "Generate new API Key" section, enter your API key name, then click the "Create +" button. An API key will be generated for your account.

An API key can be used across multiple projects.

Add Interlify to your code

Install Interlify via pip

Install the Interlify client in your project:

pip install interlify

Then use the Interlify client with a few lines of code:

import json

from openai import OpenAI
from interlify import Interlify


client = OpenAI()


# Initialize the client
interlify = Interlify(
    api_key="YOUR_API_KEY",
    project_id="YOUR_PROJECT_ID",
    auth_headers=[
        {"Authorization": "YOUR_API_ACCESS_TOKEN"}
    ]
)

# Prepare tools
tools = interlify.get_tools()

messages = [
    {"role": "user", "content": "What is the weather like in Paris today?"}
]

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    # Use tools
    tools=tools
)

response_message = completion.choices[0].message
tool_calls = response_message.tool_calls

messages.append(response_message)


for tool_call in tool_calls:
    function_name = tool_call.function.name

    function_args = json.loads(tool_call.function.arguments)

    # Call the tool using interlify
    function_response = interlify.call_tool(function_name, function_args)

    messages.append(
        {
            "role": "tool",
            "content": str(function_response),
            "tool_call_id": tool_call.id,
        }
    )

final_response = client.chat.completions.create(
    model="gpt-4o", messages=messages, tools=tools, tool_choice="auto"
)

print(final_response.choices[0].message.content)
 

Explanation

In the code above, Interlify does the following:

  1. Instantiate the client
interlify = Interlify(
    api_key="YOUR_API_KEY", 
    project_id="YOUR_PROJECT_ID", 
    auth_headers=[
        {"Authorization": "YOUR_API_ACCESS_TOKEN"}
        ]
    )

The YOUR_API_ACCESS_TOKEN is the entire string value of the Authorization header. Your service will use this string to authorize the LLM to access protected resources.

For the Bearer token format: Authorization: Bearer <YOUR_TOKEN>

The YOUR_API_ACCESS_TOKEN should be "Bearer <YOUR_TOKEN>".

For the Basic token format: Authorization: Basic <base64(username:password)>

The YOUR_API_ACCESS_TOKEN should be "Basic <YOUR_ENCODED_CREDENTIALS>".
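As a sketch, both header values can be built in Python like this (the token and credentials are placeholders):

```python
import base64

# Bearer format: the header value is the word "Bearer" plus your token.
token = "YOUR_TOKEN"  # placeholder
bearer_value = f"Bearer {token}"

# Basic format: the header value is "Basic" plus base64(username:password).
username, password = "username", "password"  # placeholders
encoded = base64.b64encode(f"{username}:{password}".encode()).decode()
basic_value = f"Basic {encoded}"

print(bearer_value)  # Bearer YOUR_TOKEN
print(basic_value)   # Basic dXNlcm5hbWU6cGFzc3dvcmQ=
```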

If your API does not require a token, you can omit auth_headers, so the code would be:

interlify = Interlify(
    api_key="YOUR_API_KEY", 
    project_id="YOUR_PROJECT_ID"
    )
  2. Prepare the tools
tools = interlify.get_tools()
  3. Provide tools to the LLM
completion = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=tools
)
  4. Call the tool
function_response = interlify.call_tool(function_name, function_args)
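For context, the tools list passed to the model follows the OpenAI function-calling schema. Below is a sketch of what one entry could look like; the `getWeather` name and its fields are illustrative placeholders, not Interlify's actual output:

```python
# Illustrative shape of one entry in the tools list (OpenAI function-calling
# schema). In practice, Interlify generates these from your OpenAPI spec.
weather_tool = {
    "type": "function",
    "function": {
        "name": "getWeather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

tools = [weather_tool]
```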

That's it!

GL & HF!

Currently, Interlify works with OpenAI-compatible clients and models, such as OpenAI, DeepSeek, and Groq.

If you have any questions, ideas, requests, or feedback, we highly recommend visiting our Build Together page to share them with us. We are eager to hear from you!