Important
This feature is in preview.
This article shows how to use Azure OpenAI in Fabric with the OpenAI Python SDK and with SynapseML.
Prerequisites
The OpenAI Python SDK isn't installed in the default runtime, so you need to install it first. Change the environment to Runtime version 1.3 or higher.
%pip install -U openai
Chat
Create a new cell in your Fabric notebook for this code, separate from the cell that installs the OpenAI library in the previous step. GPT-4.1 and GPT-4.1-mini are language models optimized for conversational interfaces. The example here demonstrates simple chat completion operations and isn't intended as a tutorial.
Note
Different versions of the OpenAI Python SDK use different method names and parameters. Refer to the official documentation for the version you're using.
import openai

# With openai>=1.0, chat completions are created with openai.chat.completions.create;
# for Azure OpenAI, the model value is the deployment name.
response = openai.chat.completions.create(
    model="gpt-4.1",
    messages=[
        {
            "role": "user",
            "content": """Analyze the following text and return a JSON array of issue insights.
Each item must include:
- issue_brief (1 sentence)
- scenario
- severity (high | medium | low)
- verbatim_quotes (list)
- recommended_fix
Text:
We booked the hotel room in advance for our family trip. The check-in was great, however the room service was slow and the pool was closed.
Return JSON only.
""",
        }
    ],
)
print(f"{response.choices[0].message.role}: {response.choices[0].message.content}")
Output
assistant: [
    {
        "issue_brief": "Room service was slow during the stay.",
        "scenario": "Guests experienced delays in receiving room service after check-in.",
        "severity": "medium",
        "verbatim_quotes": [
            "the room service was slow"
        ],
        "recommended_fix": "Improve staffing or training for room service to ensure timely delivery of services."
    },
    {
        "issue_brief": "The hotel pool was unavailable during the stay.",
        "scenario": "Guests were unable to use the pool because it was closed.",
        "severity": "medium",
        "verbatim_quotes": [
            "pool was closed"
        ],
        "recommended_fix": "Notify guests in advance about facility closures and provide alternative amenities or compensation if possible."
    }
]
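Because the prompt asks the model to return JSON only, the reply in response.choices[0].message.content can be parsed directly with the standard library. A minimal sketch, using a trimmed copy of the sample output above (the insights and high_or_medium names are hypothetical, not part of the SDK):

```python
import json

# A trimmed copy of the model reply shown in the output above.
reply = """[
    {"issue_brief": "Room service was slow during the stay.",
     "severity": "medium",
     "verbatim_quotes": ["the room service was slow"]},
    {"issue_brief": "The hotel pool was unavailable during the stay.",
     "severity": "medium",
     "verbatim_quotes": ["pool was closed"]}
]"""

# Parse the JSON text into Python objects, then filter by severity.
insights = json.loads(reply)
high_or_medium = [
    item["issue_brief"]
    for item in insights
    if item["severity"] in ("high", "medium")
]
print(high_or_medium)
```

In a real notebook you would pass response.choices[0].message.content to json.loads instead of the hard-coded string; wrapping the call in a try/except for json.JSONDecodeError is prudent, since the model isn't guaranteed to emit valid JSON every time.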
Embeddings
Create a new cell in your Fabric notebook for this code, separate from the cell that installs the OpenAI library. An embedding is a data representation format that machine learning models and algorithms can easily use. It encodes the semantic meaning of a text as a vector of floating-point numbers. The distance between two embeddings in the vector space is related to the semantic similarity of the two original inputs: if two texts are similar, their vector representations should also be similar.
The example here shows how to obtain embeddings and isn't intended as a tutorial.
response = openai.embeddings.create(
    input="The food was delicious and the waiter...",
    model="text-embedding-ada-002",  # or another embedding model
)
print(response)
Output
CreateEmbeddingResponse(
    data=[
        Embedding(
            embedding=[
                0.0022756962571293116,
                -0.009305915795266628,
                0.01574261300265789,
                ...
                -0.015387134626507759,
                -0.019424352794885635,
                -0.0028009789530187845
            ],
            index=0,
            object='embedding'
        )
    ],
    model='text-embedding-ada-002',
    object='list',
    usage=Usage(
        prompt_tokens=8,
        total_tokens=8
    )
)
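The distance relationship described earlier is commonly measured with cosine similarity between two embedding vectors. A minimal sketch in plain Python; the toy vectors below are made-up illustrations, not real model output (real vectors from response.data[0].embedding have many more dimensions):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity = dot(a, b) / (|a| * |b|); values near 1.0
    # mean the vectors point in almost the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embeddings of three texts.
v1 = [0.1, 0.3, -0.2]
v2 = [0.1, 0.29, -0.21]   # nearly the same direction as v1
v3 = [-0.3, 0.05, 0.4]    # points elsewhere

print(cosine_similarity(v1, v2))  # close to 1.0: semantically similar
print(cosine_similarity(v1, v3))  # much lower: semantically dissimilar
```

With real embeddings you would compare the embedding lists returned for two different input texts; libraries such as numpy or scipy offer vectorized equivalents when comparing many pairs.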
Related content
- Use prebuilt Text Analytics in Fabric with REST API
- Use prebuilt Text Analytics in Fabric with SynapseML
- Use prebuilt Azure AI Translator in Fabric with REST API
- Use prebuilt Azure AI Translator in Fabric with SynapseML
- Use prebuilt Azure OpenAI in Fabric with REST API
- Use prebuilt Azure OpenAI in Fabric with SynapseML and Python SDK