How to Create an Agent Without Frameworks

5 min read · Tutorials

Have you ever wanted to create your own AI agent—one that not only chats intelligently but can also perform real tasks like calling functions, querying data, or executing calculations?

With the latest capabilities from OpenAI, you can now build powerful tool-using agents entirely with the official openai Python library—no frameworks, no extra dependencies, just clean and direct control over the logic.

In this step-by-step guide, you’ll learn how to:

  • Create and describe custom tools (functions)
  • Let the model choose and call them automatically
  • Handle tool responses and generate useful replies

Whether you want to build a chatbot, a virtual assistant, or the backend of an AI-powered app, this guide will give you everything you need to get started with OpenAI’s function calling system.

Let’s dive in.


✅ 1. Setup

🔧 Install the OpenAI Python SDK

bash
pip install openai

🧪 Import and Configure

python
import openai
import json
import os  # needed if you load the key from an environment variable

openai.api_key = "sk-..."  # or use: openai.api_key = os.getenv("OPENAI_API_KEY")
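
If you prefer the client-based interface introduced in openai v1.x, the equivalent setup looks roughly like this (a minimal sketch; it assumes your key is exported as the OPENAI_API_KEY environment variable):

python
import os
from openai import OpenAI

# The client reads OPENAI_API_KEY from the environment by default;
# passing it explicitly is optional.
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

Calls then become client.chat.completions.create(...) instead of openai.chat.completions.create(...); the rest of the tutorial works the same either way.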

🛠️ 2. Define Your Tools (Functions + JSON Schemas)

Each tool consists of:

  • A Python function that performs the task.
  • A JSON schema describing its parameters so the model knows how to use it.
python
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny and 23°C."

def calculate_sum(a: int, b: int) -> int:
    return a + b

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather in a given city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"}
                },
                "required": ["city"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "calculate_sum",
            "description": "Calculate the sum of two integers.",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "integer"},
                    "b": {"type": "integer"}
                },
                "required": ["a", "b"]
            }
        }
    }
]
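
As the number of tools grows, the if/elif dispatch used in the next section can get unwieldy. One option, sketched here as an illustration (the available_functions and call_tool names are not part of the original code), is a simple name-to-function mapping:

python
# Hypothetical helper: map tool names to the Python callables defined above.
available_functions = {
    "get_weather": get_weather,
    "calculate_sum": calculate_sum,
}

def call_tool(name: str, args: dict) -> str:
    """Look up a tool by name, call it, and always return a string result."""
    func = available_functions.get(name)
    if func is None:
        return "Tool not implemented."
    return str(func(**args))

With this in place, the dispatch in the agent loop below could shrink to result = call_tool(name, args), though the tutorial keeps the explicit if/elif for clarity.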

🧠 3. Agent Logic

This function handles:

  • The first call to OpenAI to decide whether to call a tool,
  • Executing the tool if needed,
  • A follow-up call to return a final response using the tool's result.
python
def run_agent():
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    while True:
        user_input = input("User: ")
        messages.append({"role": "user", "content": user_input})
        # First call: decide what to do
        response = openai.chat.completions.create(
            model="gpt-4o-mini",  # or "gpt-4o"
            messages=messages,
            tools=tools,
            tool_choice="auto"
        )
        message = response.choices[0].message
        if message.tool_calls:
            # Append the assistant message (with its tool calls) once, before any tool results
            messages.append(message)
            for tool_call in message.tool_calls:
                name = tool_call.function.name
                args = json.loads(tool_call.function.arguments)
                # Call the corresponding tool
                if name == "get_weather":
                    result = get_weather(**args)
                elif name == "calculate_sum":
                    result = str(calculate_sum(**args))
                else:
                    result = "Tool not implemented."
                # Append the tool result to messages
                messages.append({
                    "role": "tool",
                    "tool_call_id": tool_call.id,
                    "name": name,
                    "content": result
                })
            # Second call: generate final response with tool output
            follow_up = openai.chat.completions.create(
                model="gpt-4o-mini",
                messages=messages
            )
            final_message = follow_up.choices[0].message
            print("Assistant:", final_message.content)
            messages.append({"role": "assistant", "content": final_message.content})
        else:
            print("Assistant:", message.content)
            messages.append({"role": "assistant", "content": message.content})

🚀 4. Run the Agent

python
if __name__ == "__main__":
    run_agent()

💬 Example Conversation

txt
User: What's the weather in London?
Assistant: The weather in London is sunny and 23°C.
txt
User: Can you add 15 and 27?
Assistant: The sum of 15 and 27 is 42.


Complete code

python
import openai
import json
import os  # needed if you load the key from an environment variable

# 🔐 Set your API key
openai.api_key = "sk-..."  # Replace with your actual key or use os.getenv("OPENAI_API_KEY")

# 🛠️ Define the tools
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny and 23°C."

def calculate_sum(a: int, b: int) -> int:
    return a + b

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather in a given city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"}
                },
                "required": ["city"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "calculate_sum",
            "description": "Calculate the sum of two integers.",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "integer"},
                    "b": {"type": "integer"}
                },
                "required": ["a", "b"]
            }
        }
    }
]

# 🤖 Run the agent loop
def run_agent():
    messages = [{"role": "system", "content": "You are a helpful assistant."}]

    while True:
        user_input = input("User: ")
        messages.append({"role": "user", "content": user_input})

        # First model call (tool selection)
        response = openai.chat.completions.create(
            model="gpt-4o-mini",  # or "gpt-4o"
            messages=messages,
            tools=tools,
            tool_choice="auto"
        )

        message = response.choices[0].message

        if message.tool_calls:
            # Append the assistant message (with its tool calls) once, before any tool results
            messages.append(message)
            for tool_call in message.tool_calls:
                name = tool_call.function.name
                args = json.loads(tool_call.function.arguments)

                # Call the appropriate tool
                if name == "get_weather":
                    result = get_weather(**args)
                elif name == "calculate_sum":
                    result = str(calculate_sum(**args))
                else:
                    result = "Unknown tool."

                # Append tool response
                messages.append({
                    "role": "tool",
                    "tool_call_id": tool_call.id,
                    "name": name,
                    "content": result
                })

            # Second model call (final response)
            follow_up = openai.chat.completions.create(
                model="gpt-4o-mini",
                messages=messages
            )
            final_message = follow_up.choices[0].message
            print("Assistant:", final_message.content)
            messages.append({"role": "assistant", "content": final_message.content})

        else:
            print("Assistant:", message.content)
            messages.append({"role": "assistant", "content": message.content})

# 🚀 Launch
if __name__ == "__main__":
    run_agent()

💡 FAI – Frequently Asked Ideas (for Extension and Improvement)

1. Add More Tools

Examples (a currency-conversion sketch follows this list):

  • Currency conversion
  • External API calls
  • Send emails
  • Query a database
  • Summarize documents
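
For instance, a currency-conversion tool would follow exactly the same pattern as get_weather and calculate_sum: a Python function plus a JSON schema entry appended to tools. The function name and the fixed rate below are purely illustrative:

python
def convert_currency(amount: float, from_currency: str, to_currency: str) -> str:
    # Illustrative only: a real implementation would call an exchange-rate API.
    rate = 0.92  # hypothetical fixed demo rate
    return f"{amount} {from_currency} is roughly {amount * rate:.2f} {to_currency} (demo rate)."

tools.append({
    "type": "function",
    "function": {
        "name": "convert_currency",
        "description": "Convert an amount from one currency to another.",
        "parameters": {
            "type": "object",
            "properties": {
                "amount": {"type": "number"},
                "from_currency": {"type": "string"},
                "to_currency": {"type": "string"}
            },
            "required": ["amount", "from_currency", "to_currency"]
        }
    }
})

Remember to add a matching branch (or dictionary entry) in the dispatch logic so the agent can actually execute the new tool.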

2. Persist User State

Save user state or history using one of the following (a JSON-file sketch follows this list):

  • In-memory Python objects
  • Local file (e.g., JSON, SQLite)
  • A real database (PostgreSQL, MongoDB, etc.)
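
A minimal sketch of the local-file option, assuming the messages list from run_agent and a hypothetical history.json path:

python
import json

HISTORY_FILE = "history.json"  # hypothetical path

def save_history(messages: list) -> None:
    # Keep only plain-dict messages; SDK message objects would need converting first.
    serializable = [m for m in messages if isinstance(m, dict)]
    with open(HISTORY_FILE, "w") as f:
        json.dump(serializable, f, indent=2)

def load_history() -> list:
    try:
        with open(HISTORY_FILE) as f:
            return json.load(f)
    except FileNotFoundError:
        return [{"role": "system", "content": "You are a helpful assistant."}]

Call load_history() at the start of run_agent and save_history(messages) after each turn to keep conversations across restarts.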

3. Stream Responses

Use stream=True to display output token-by-token—ideal for live UIs or chat interfaces.
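
A minimal streaming sketch using the same chat.completions API (the hard-coded prompt is just for illustration, and it assumes the API key is configured as in the setup section):

python
import openai

stream = openai.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a short joke."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()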

4. Support Multiple Tool Calls

Handle multiple tool_calls in a loop and pass them back all at once for a richer response.
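
The key detail, already shown in the agent loop above, is ordering: append the assistant message that contains tool_calls once, then one tool message per tool_call_id, and only then make the follow-up call. A sketch, reusing the hypothetical call_tool dispatcher from section 2:

python
import json

def handle_tool_calls(message, messages: list) -> None:
    """Append the assistant message once, then one tool message per tool_call."""
    messages.append(message)  # the assistant message containing all tool_calls
    for tool_call in message.tool_calls:
        args = json.loads(tool_call.function.arguments)
        result = call_tool(tool_call.function.name, args)  # hypothetical dispatcher
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "name": tool_call.function.name,
            "content": result,
        })
    # A single follow-up chat.completions.create call can now answer
    # using all of the tool results at once.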

5. Async Support

If integrating into a web app (FastAPI, Flask with asyncio), you can adapt this logic to use await and async HTTP calls.
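
A minimal async sketch, assuming openai v1.x (which provides AsyncOpenAI) and that OPENAI_API_KEY is set in the environment:

python
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

async def ask(prompt: str) -> str:
    response = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(asyncio.run(ask("Hello!")))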

6. Frontend Integration

Wrap the agent in a Flask, FastAPI, or Streamlit server and serve a chat UI connected to this backend.
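
A rough FastAPI sketch (the /chat path and request shape are hypothetical, and it assumes fastapi and uvicorn are installed and the API key is configured as above). It serves a single non-tool completion; the full agent loop can be plugged in behind the same endpoint:

python
from fastapi import FastAPI
from pydantic import BaseModel
import openai

app = FastAPI()

class ChatRequest(BaseModel):
    message: str  # hypothetical request shape

@app.post("/chat")  # hypothetical endpoint path
def chat(req: ChatRequest):
    # Simplified: one completion without tools; swap in the agent logic as needed.
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": req.message}],
    )
    return {"reply": response.choices[0].message.content}

Run it with uvicorn app:app --reload (assuming the file is named app.py) and point any chat UI at the /chat endpoint.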

Ready to Scale Your Data Collection?

Join thousands of businesses using ScrapeGraphAI to automate their web scraping needs. Start your journey today with our powerful API.
