Empowering APIs Using Function Calling with OpenAI

Freddy Domínguez
10 min read · Oct 6, 2023


Transform your custom APIs into awesome functionality using LLMs

Ever since OpenAI opened up their API to the public, many developers have been building on ChatGPT’s responses. This is exciting, but it comes with challenges, especially for engineers who need to work with data in different formats like JSON, XML, GSON, HTML, and more. The main challenge is turning this data into useful information for users.

The article is structured into 5 segments:

  1. User’s Intention and chat context
  2. Spotify API and Python
  3. Function calling with Spotify API
  4. Cope with follow-up questions
  5. Think Big (Moving from Function to Functionality)

1. User’s Intention and chat context

To handle this, many people use regex patterns or break the text down. Often, the tricky part in a chat conversation is understanding what the user wants to do next. The end user might want the model to answer their question using the chat history, or they might want to fetch new information from sources like APIs. That’s when function calling becomes important, because it helps capture the user’s real intention.

This is an awesome feature in practice because it allows you to create a seamless workflow for responding to the user’s intention with accurate data, while also avoiding hallucinations.

Remember that the most important thing is to provide the LLM with context that is as essential and precise as possible; this approach ensures that the LLM generates competent responses and avoids hallucinations.

2. Spotify API and Python

Using function calls to integrate LLMs with other tools and systems comes with inherent risks. It is essential to understand the potential dangers associated with function calling and to take precautions to use it responsibly. That is why I will use a simple and effective API as an example.

We will be utilizing the Spotify API for this purpose. This API offers comprehensive results for queries that inspect audio content stored on the platform, including songs, soundtracks, shows, and their respective episodes. To provide a dynamic example, I will employ the Spotipy Python library[3], which wraps the whole REST API provided by Spotify.

First of all, we set our environment variables in a .env file:

# .env file
OPENAI_API_KEY={openai_key}
SPOTIPY_CLIENT_ID={spotipy_client_key}
SPOTIPY_CLIENT_SECRET={spotipy_secret_key}

You can generate the API keys from OpenAI and Spotify. Next, we set up our REST clients with the following code:

import json
import openai
import spotipy
from spotipy.oauth2 import SpotifyClientCredentials
from dotenv import load_dotenv

load_dotenv()
client_credentials_manager = SpotifyClientCredentials()
sp = spotipy.Spotify(client_credentials_manager=client_credentials_manager)

Next, we define our function, list_latest_episode_of_spotify_show, to retrieve data from the Spotify API[2]. It provides a list of podcasts based on some arguments. Notice we use ‘ES’ as the market because we search only in that market; it is possible that some podcasts have no episodes there.

def list_latest_episode_of_spotify_show(show_name, limit=40):
    """Get the latest episodes of a given show or episode by name."""
    result_fetch = sp.search(show_name, limit=limit, offset=0, type='episode', market='ES')
    # transform the dictionary into a simple list of shows
    final_list = []
    for item in result_fetch['episodes']['items']:
        if show_name.lower() in item['description'].lower():
            final_list.append({'release_date': item['release_date'],
                               'episode_name': item['name'],
                               'url': item['external_urls']['spotify']})
    # sort by release date, newest first
    final_list = sorted(final_list, key=lambda x: x['release_date'], reverse=True)
    return json.dumps(final_list[:limit])

Testing our function with “Globant” and a limit of 3, we get the following output:

// list_latest_episode_of_spotify_show('Globant',3) was invoked
[{"release_date": "2023-06-26",
"episode_name": "Qu\u00e9 es el Learning Match - Santos Videla (Globant)",
"url": "https://open.spotify.com/episode/568pXcS72Cg9R8RlLsQCzP"},
{"release_date": "2023-06-20",
"episode_name": "Argentinian tech giant Globant plans to double down on Europe: A chat with Fernando Matzkin, who's leading the charge",
"url": "https://open.spotify.com/episode/2e7tBHgZhlsnekzoCf1xus"},
{"release_date": "2023-06-13",
"episode_name": "Entrevista a Andrey Luj\u00e1n, Managing Director de Globant en Chile",
"url": "https://open.spotify.com/episode/1lYESmQLnB4PABIqH4xJlO"}]

It’s alive! Our external API Python function is working as expected.

3. Function calling with Spotify API

After defining the function, the next step is to outline its structure to ensure that the LLM can comprehend when it needs to be called. This is where the creation of a dictionary comes into play. This dictionary serves as an explanation of how our function behaves.

Let’s take a look at the definition of functions:

functions = [
    {
        "name": "list_latest_episode_of_spotify_show",
        "description": "Get the latest episodes from Spotify shows or episodes in a given name",
        "parameters": {
            "type": "object",
            "properties": {
                "show_name": {
                    "type": "string",
                    "description": "The name of the show or episode, e.g., Apple, Twitter, Amazon",
                },
                "limit": {
                    "type": "integer",
                    "description": "The maximum number of items to fetch",
                },
            },
            "required": ["show_name"],
        },
    }
]

Let me explain it. To begin, the dictionary’s name must align precisely with the name we assigned to the function. A description is also required; this sentence states the function’s intended purpose. In addition, we need to detail the function’s parameters, which typically take the form of an object type. The object’s properties describe the argument names, their respective types, and corresponding descriptions. In our scenario, we work with two parameters: “show_name,” the name of the show, and “limit,” the desired episode count. Lastly, it is essential to include the list of non-optional parameters, which plays an important role in preventing potential exceptions. Of course, you can also use default values, which can help reduce the number of tokens.
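Since the model’s arguments arrive as a JSON string, it can be worth validating them against this schema before invoking anything. Below is a minimal sketch; validate_arguments is a hypothetical helper (not part of the OpenAI SDK) written only to illustrate the idea:

```python
import json

# Hypothetical helper (not part of the OpenAI SDK): check the model's
# JSON arguments against our schema before invoking the real function.
TYPE_MAP = {"string": str, "integer": int, "number": float, "boolean": bool}

def validate_arguments(schema, raw_arguments):
    """Parse raw JSON arguments and verify them against the function schema."""
    args = json.loads(raw_arguments)
    params = schema["parameters"]
    for name in params.get("required", []):
        if name not in args:
            raise ValueError(f"missing required parameter: {name}")
    for name, value in args.items():
        spec = params["properties"].get(name)
        if spec is None:
            raise ValueError(f"unexpected parameter: {name}")
        if not isinstance(value, TYPE_MAP[spec["type"]]):
            raise TypeError(f"{name} should be of type {spec['type']}")
    return args

schema = {
    "name": "list_latest_episode_of_spotify_show",
    "parameters": {
        "type": "object",
        "properties": {
            "show_name": {"type": "string"},
            "limit": {"type": "integer"},
        },
        "required": ["show_name"],
    },
}
args = validate_arguments(schema, '{"show_name": "Globant", "limit": 10}')
```

A failed check lets us re-prompt the model instead of crashing on a bad invocation.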

Remember, nowadays, the more unnecessary tokens you use, the more money you waste.

So let’s start. We define our prompt with ChatCompletion, passing the defined functions, a list of dictionaries, as a parameter.

messages = [{'role': 'user', 'content': 'Hello, I need your help to find podcasts in my Spotify'}]
user_message = "What are the latest ten episodes of Globant?"
messages.append({"role": "user", "content": user_message})
completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=messages,
    functions=functions
)

When we check the OpenAI result, we notice that the content is null. This implies that the LLM assumes the next step is to invoke a function named ‘list_latest_episode_of_spotify_show’ with the arguments ‘show_name’ set to ‘Globant’ and ‘limit’ set to 10.

<OpenAIObject at 0x7ff57ea42890> JSON: {
  "role": "assistant",
  "content": null,
  "function_call": {
    "name": "list_latest_episode_of_spotify_show",
    "arguments": "{\n  \"show_name\": \"Globant\",\n  \"limit\": 10\n}"
  }
}

By processing the output, we can inspect each value separately:

response_function_calling = completion.choices[0].message
function_name = response_function_calling['function_call']['name']
print(function_name)  # list_latest_episode_of_spotify_show
# json.loads is safer than eval for parsing the model's JSON arguments
arguments = json.loads(response_function_calling['function_call']['arguments'])
show_name = arguments['show_name']
print(show_name)  # Globant
limit = arguments['limit']
print(limit)  # 10

Having selected the function and its arguments, we are ready to invoke it, as we have all the information to do so. Let’s proceed by defining a dictionary that represents all available functions in our chat environment:

available_functions = {
    'list_latest_episode_of_spotify_show': list_latest_episode_of_spotify_show
}
function_response = available_functions[function_name](
    show_name=show_name,
    limit=limit,
)

from pprint import pprint
pprint(function_response, width=150)

Here’s the resulting output:

('[{"release_date": "2023-06-26", "episode_name": "Qu\\u00e9 es el Learning Match - Santos Videla (Globant)", "url": '
'"https://open.spotify.com/episode/568pXcS72Cg9R8RlLsQCzP"}, {"release_date": "2023-06-20", "episode_name": "Argentinian tech giant Globant plans '
'to double down on Europe: A chat with Fernando Matzkin, who\'s leading the charge", "url": '
'"https://open.spotify.com/episode/2e7tBHgZhlsnekzoCf1xus"}, {"release_date": "2023-06-13", "episode_name": "Entrevista a Andrey Luj\\u00e1n, '
'Managing Director de Globant en Chile", "url": "https://open.spotify.com/episode/1lYESmQLnB4PABIqH4xJlO"}, {"release_date": "2023-02-17", '
'"episode_name": "C\\u00f3mo sigue la deuda en pesos, cu\\u00e1ndo llegan los billetes de $2.000 y Globant rompe r\\u00e9cords", "url": '
'"https://open.spotify.com/episode/3jusDSalZOXIpT29jmlaPS"}, {"release_date": "2023-01-13", "episode_name": "#4 Elena Morettini, Global Head '
'Sustainable Business en Globant", "url": "https://open.spotify.com/episode/6uzo9K9ydjWjlRuLU6AN8u"}, {"release_date": "2023-01-11", '
'"episode_name": "Alerta por los activos de los bancos, cu\\u00e1nto crecer\\u00e1 el PBI en 2023 y adquisici\\u00f3n de Globant", "url": '
'"https://open.spotify.com/episode/5ArNty2xDDuueBot8j6w6z"}, {"release_date": "2022-10-12", "episode_name": "Q&A con Nicol\\u00e1s Dujovne, los '
'planes de Migoya para Globant y el ajuste de las jubilaciones en 2023", "url": "https://open.spotify.com/episode/2HTdIPM0aSuO9JjkHnldzS"}, '
'{"release_date": "2022-06-03", "episode_name": "Una alianza que potencia la experiencia de cliente, YPF y Globant.", "url": '
'"https://open.spotify.com/episode/47pXZNEZqkT4H6TiIOlrsT"}, {"release_date": "2022-04-19", "episode_name": "El caso Globant: \\u00bfC\\u00f3mo se '
'construy\\u00f3 un unicornio argentino?", "url": "https://open.spotify.com/episode/4LAgh6AUVRKGoDJJmjDueO"}, {"release_date": "2022-04-13", '
'"episode_name": "#16 - Por uma nova forma de aprendizado: case FIAP e Globant ", "url": '
'"https://open.spotify.com/episode/3gR76J5tN22FeL38ZgcJ1j"}]')

4. Cope with follow-up questions

According to the OpenAI documentation, function calling enables the model to provide structured information that can be used to call the functions in the code. Additionally, it allows an API call to describe functions to the model, which can then intelligently generate a JSON object containing arguments for those functions.

Those functions and their arguments provide us with the capability to determine which code to execute in order to meet the user’s latest input requirements. Why is this so important? It’s simply because of follow-up questions such as “What occurs in Latin America?” or “On Twitter?” or “And in the last year?”

By utilizing function calling, we let the LLM generate and pass the correct parameters to functions, facilitating their invocation.

Our Python function has two parameters: the former is non-optional, and the latter is optional with a default value of 40. It appears as follows:

def list_latest_episode_of_spotify_show(show_name, limit=40):
...

To use ChatCompletion in Python, we must pass the functions structure as an argument in the OpenAI API:

messages = [{'role': 'user', 'content': 'Hello, I need your help to find the latest podcast of Bitcoin in Spotify'},
            {'role': 'assistant', 'content': 'Here are the last 20 episodes of Bitcoin: ...'}]
user_message = "and the latest ten of Twitter?"
messages.append({"role": "user", "content": user_message})
completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=messages,
    functions=functions
)

After the chat task completes, it returns an OpenAIObject that contains either the “content” data, representing the general chat output, or the “function_call” data, containing the details of the next function to be called along with its arguments.

// output of chatCompletion
<OpenAIObject at 0x7ff57ea42890> JSON: {
  "role": "assistant",
  "content": null,
  "function_call": {
    "name": "list_latest_episode_of_spotify_show",
    "arguments": "{\n  \"show_name\": \"Twitter\",\n  \"limit\": 10\n}"
  }
}
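The branching this implies — plain answer versus function call — can be sketched as a small helper. route_completion is a hypothetical name of mine, operating on the assistant message as a plain dict:

```python
import json

def route_completion(message):
    """Decide the next step from an assistant message: answer or function call."""
    if message.get("function_call"):
        call = message["function_call"]
        # the model's arguments arrive as a JSON string
        return ("call", call["name"], json.loads(call["arguments"]))
    return ("answer", message.get("content"))

# mirrors the output shown above
msg = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "list_latest_episode_of_spotify_show",
        "arguments": "{\"show_name\": \"Twitter\", \"limit\": 10}",
    },
}
decision = route_completion(msg)
```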

As we can observe, the LLM is capable of interpreting the user’s intention and ensuring that the function has the required arguments to be invoked in the subsequent steps. It can even handle different input text, for example: “What about the 10 from Twitter?”, “and the last 3 episodes?”, “for IBM”, and “now, provide me with the latest one.”

Furthermore, we can define a function to retrieve the latest user intent in the chat in the form of a question, using the last 10 interactions of that chat. Why is this beneficial? Because numerous tools and indexes, like LangChain and LlamaIndex, have been developed to answer questions, and this approach lets us plug into those resources with great simplicity and ease.
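One way to sketch that idea, assuming we let the model itself rewrite the follow-up: build a prompt from the last 10 turns and ask for a standalone question. build_intent_prompt is a hypothetical helper; its output would then be sent to openai.ChatCompletion.create as usual.

```python
# Hypothetical sketch: condense the last turns of the chat into a prompt
# asking the model to rewrite the user's follow-up as a standalone question.
def build_intent_prompt(chat_history, window=10):
    """Build messages asking for a standalone version of the last user turn."""
    recent = chat_history[-window:]
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in recent)
    return [
        {"role": "system",
         "content": ("Rewrite the user's last message as a single standalone "
                     "question, using the conversation below for context.")},
        {"role": "user", "content": transcript},
    ]

history = [
    {"role": "user", "content": "Find the latest podcasts about Bitcoin"},
    {"role": "assistant", "content": "Here are the latest Bitcoin episodes: ..."},
    {"role": "user", "content": "and the latest ten of Twitter?"},
]
intent_messages = build_intent_prompt(history)
# intent_messages can now be passed to the chat completion endpoint
```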

5. Think Big (Moving from Function to Functionality)

Let’s enhance the functionality of our Spotify feature in our chat.

First, we save our function in a file. I include a magic command because I am working in a Jupyter Notebook:

%%writefile spotify_utils.py
import json
import spotipy
from spotipy.oauth2 import SpotifyClientCredentials

client_credentials_manager = SpotifyClientCredentials()
sp = spotipy.Spotify(client_credentials_manager=client_credentials_manager)

def list_latest_episode_of_spotify_show(show_name, limit=40):
    """Get the latest episodes of a given show or episode by name."""
    result_fetch = sp.search(show_name, limit=limit, offset=0, type='episode', market='ES')
    final_list = []
    for item in result_fetch['episodes']['items']:
        if show_name.lower() in item['description'].lower():
            final_list.append({'release_date': item['release_date'],
                               'episode_name': item['name'],
                               'url': item['external_urls']['spotify']})
    final_list = sorted(final_list, key=lambda x: x['release_date'], reverse=True)
    return json.dumps(final_list[:limit])

The code above writes the function to the spotify_utils.py file and reads the Spotify development credentials.

We can test it by using the getattr function to dynamically access a function from the spotify_utils module and then execute it with the provided arguments:

import spotify_utils

# function_name and arguments come from the parsed function_call above
function_object = getattr(spotify_utils, function_name)
function_response = function_object(**arguments)
# ('[{"release_date": "2023-06-26", "episode_name": "Q .....
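Dynamic access is convenient, but dispatching on a model-produced name deserves a guard. Here is a hedged sketch with an explicit allow-list; dispatch and fake_show are hypothetical names, and the stand-in function avoids a real Spotify call:

```python
# Sketch: never dispatch a model-produced function name blindly.
# An allow-list keeps the model from invoking arbitrary attributes.
def dispatch(namespace, function_name, arguments, allowed):
    """Invoke function_name with arguments only if explicitly allow-listed."""
    if function_name not in allowed:
        raise ValueError(f"function not allowed: {function_name}")
    return namespace[function_name](**arguments)

# stand-in for the real Spotify function, for demonstration only
def fake_show(show_name, limit=40):
    return f"{show_name}:{limit}"

result = dispatch({"fake_show": fake_show}, "fake_show",
                  {"show_name": "Globant", "limit": 3}, {"fake_show"})
```

With a real module you would pass vars(spotify_utils) (or a curated dict) as the namespace.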

Now that we know how to dynamically access a function, we can generalize it for several purposes using our external API. Let’s define a general function called askToSpotifyBot, which takes only the user input:

def askToSpotifyBot(user_input, functions=functions):
    messages = [{'role': 'system', 'content': """You are an assistant helping me find podcasts on Spotify and returning the entire list.
Please reply in markdown format and add the link at the end of each episode using [Listen here](url)"""}]
    messages.append({"role": "user", "content": user_input})
    completion = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,
        functions=functions
    )
    response_function_calling = completion.choices[0].message
    if response_function_calling is not None and response_function_calling.get('function_call'):
        function_name = response_function_calling['function_call']['name']
        function_args = json.loads(response_function_calling["function_call"]["arguments"])
        function_response = available_functions[function_name](**function_args)
        _response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "user", "content": user_input},
                response_function_calling,
                {
                    "role": "function",
                    "name": function_name,
                    "content": function_response,
                },
            ],
            functions=functions
        )
        return _response.choices[0]["message"]["content"].strip()
    else:
        return completion.choices[0]["message"]["content"].strip()

In the askToSpotifyBot function, we begin by initializing the system message. Next, we add the user’s input to the messages list. After that, we invoke the gpt-3.5-turbo model for the initial completion. If a function call is detected, it is invoked, and the resulting function response is passed back through ChatCompletion for interpretation. Otherwise, the initial completion is returned as a normal completion.

Finally, let’s see our final results in the markdown placeholder for 3 samples:

from IPython.display import Markdown
# User input 1 - Simple Question
Markdown(askToSpotifyBot("What are the latest 7 episodes of Twitter? Be concise"))

The output (snapshot):

Markdown output — prompt: What are the latest 7 episodes of Twitter? Be concise

Sample 2:

# User input 2 - A Question + condition
Markdown(askToSpotifyBot("What are the latest 8 episodes of Globant? Group by year"))

The output 2:

Markdown output — prompt: What are the latest 8 episodes of Globant? Group by year

Sample 3:

# User input 3 - Question + condition + replace output information 
Markdown(askToSpotifyBot("What are the latest 15 episodes of Bitcoin? Group by month and change 'Listen here' link to 'Escuchame'"))

The output 3:

Markdown output — prompt: What are the latest 15 episodes of Bitcoin? Group by month and change ‘Listen here’ link to ‘Escuchame’

Conclusion

Function calling is an extraordinary feature that allows us to create complex workflows for utilizing data from external sources. Additionally, it enables us to capture user intentions and extract potentially crucial and specific information. This is especially valuable in cases where the user input is not sufficiently clear, such as with follow-up questions.

You can find the Google Colab Notebook here[4].

Thank you for being here. Please share your views in the comments; if any mistakes are found, the article will be updated.

Reference

  1. https://platform.openai.com/docs/guides/gpt/function-calling
  2. https://developer.spotify.com/documentation/web-api
  3. https://spotipy.readthedocs.io/en/2.22.1/
  4. https://colab.research.google.com/github/romellfudi/medium/blob/main/Function_calling_with_OpenAI_API.ipynb
