Enhancing Unity Experiences with Azure OpenAI for XR — Part 1

Elanthirayan

To create a FastAPI application that integrates with Azure OpenAI, you can follow the steps below. First, ensure the necessary packages are installed: fastapi and uvicorn for building and serving the API, and openai for interacting with Azure OpenAI. Here’s how you can structure the application:

Install Required Packages:

Make sure you have fastapi, uvicorn, and openai installed. You can install them using pip if you haven't already:

pip install fastapi uvicorn openai
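
Note that the AzureOpenAI client class used below ships with the 1.x line of the openai SDK. If you hit an import error, pinning the package versions is one way to be safe (the exact minimum version below is only an assumption; any recent 1.x release should work):

pip install fastapi uvicorn "openai>=1.0"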

Python Code (main.py)

from fastapi import FastAPI, HTTPException
from openai import AzureOpenAI

app = FastAPI()

# Azure OpenAI configuration
endpoint = "Your_Endpoint"  # Replace with your Azure OpenAI endpoint
key = "Your_Key"            # Replace with your Azure API key
model_name = "gpt-4o"       # Replace with your model (deployment) name

client = AzureOpenAI(
    azure_endpoint=endpoint,
    api_version="2024-02-01",
    api_key=key,
)

@app.get("/")
def read_root():
    return {"Hello": "World"}

@app.get("/ask_ai/{question}")
def ask_ai(question: str):
    try:
        # Send the question to Azure OpenAI as a single-turn chat completion
        completion = client.chat.completions.create(
            model=model_name,
            messages=[
                {
                    "role": "user",
                    "content": question,
                },
            ],
        )
        answer = completion.choices[0].message.content
        return {"question": question, "answer": answer}
    except Exception as e:
        # Surface any API or configuration error as an HTTP 500 response
        raise HTTPException(status_code=500, detail=str(e))

Let’s break down the code above. It is a basic FastAPI application that sends questions to a deployed Azure OpenAI chat model and returns the answers.

Explanation:

  • FastAPI Setup: Imports FastAPI from the fastapi module and sets up a basic FastAPI application (app).
  • Azure OpenAI Configuration: Sets up the Azure OpenAI client using your Azure endpoint, API version, and API key.
  • Endpoints:
      • /: Returns a simple "Hello World" message.
      • /ask_ai/{question}: Accepts a question as a path parameter (question) and sends it to Azure OpenAI for completion. Returns the question and the AI's answer in JSON format.
  • Error Handling: Catches exceptions and raises an HTTP 500 error with the exception details.
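
If you want to sanity-check the app before wiring in real credentials, a small sketch using FastAPI's TestClient can exercise the root endpoint locally. It assumes the code above is saved as main.py and that the httpx package (used by TestClient in recent FastAPI versions) is installed:

from fastapi.testclient import TestClient

from main import app  # assumes the code above is saved as main.py

test_client = TestClient(app)

def test_read_root():
    # The root endpoint needs no Azure credentials, so this runs fully offline
    response = test_client.get("/")
    assert response.status_code == 200
    assert response.json() == {"Hello": "World"}

Running this with pytest should pass without ever calling Azure.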

Run Your FastAPI Application:

Save the code above as main.py and start the application with:

uvicorn main:app --host 0.0.0.0 --port 8000
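
During development you may also find uvicorn's auto-reload flag handy, so the server restarts whenever you edit main.py:

uvicorn main:app --reload --host 0.0.0.0 --port 8000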

Access Your API:

Once your FastAPI application is running, you can access the endpoints:

  • http://localhost:8000/: Returns the {"Hello": "World"} message.
  • http://localhost:8000/ask_ai/{question}: Replace {question} with your actual question to get the AI's answer.
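
Because the question travels in the URL path, spaces and other special characters must be URL-encoded. A quick way to try the endpoint from the command line (the question text here is just an example):

curl "http://localhost:8000/ask_ai/What%20is%20FastAPI%3F"

The response is a JSON object with question and answer fields, where the answer text depends on the model and prompt.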

Make sure to replace "Your_Endpoint", "Your_Key", and "gpt-4o" with your actual Azure OpenAI endpoint, API key, and model name respectively. This setup allows you to integrate Azure OpenAI into a FastAPI application for answering questions dynamically.
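
Hardcoding the endpoint and key is fine for a quick demo, but a common alternative is to read them from environment variables so secrets stay out of source control. A minimal sketch, assuming you export AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, and AZURE_OPENAI_DEPLOYMENT yourself (these variable names are a convention chosen here, not something the SDK requires):

import os

from openai import AzureOpenAI

endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]  # e.g. https://<resource>.openai.azure.com/
key = os.environ["AZURE_OPENAI_API_KEY"]
model_name = os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o")

client = AzureOpenAI(
    azure_endpoint=endpoint,
    api_version="2024-02-01",
    api_key=key,
)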

To integrate this FastAPI setup into Unity, please refer to the next part of this blog series. https://elanthirayan.medium.com/enhancing-unity-experiences-with-azure-openai-f-bridging-virtual-worlds-with-intelligent-7d33959440a7
