ShīnChvën ✨

Convert Azure OpenAI Service API to OpenAI Service API

Azure OpenAI Service provides access to OpenAI models like GPT and DALL-E on the Microsoft Azure cloud with higher performance and availability.

However, its API is not compatible with the original OpenAI API. Fortunately, it's easy to set up a proxy that converts the Azure OpenAI Service API to the OpenAI API.
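To see why a proxy is needed, compare the endpoint shapes. This is a sketch: the resource and deployment names are placeholders, and the URL patterns are my summary of the two services, not code from the proxy itself. OpenAI selects the model in the request body, while Azure OpenAI encodes the resource and deployment name in the URL and requires an api-version query parameter.

```javascript
// OpenAI: one fixed endpoint; the model is chosen in the JSON body.
const openaiUrl = "https://api.openai.com/v1/chat/completions";

// Azure OpenAI: the resource and deployment name are part of the URL,
// and the API version is passed as a query parameter.
function azureUrl(resourceName, deployName, apiVersion) {
  return (
    `https://${resourceName}.openai.azure.com/openai/deployments/${deployName}` +
    `/chat/completions?api-version=${apiVersion}`
  );
}

console.log(azureUrl("my-resource", "my-gpt35-deployment", "2023-05-15"));
```

The proxy's job is essentially to translate requests aimed at the first URL shape into the second.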

haibbo/cf-openai-azure-proxy is a Cloudflare Worker script that proxies OpenAI-style requests to Azure OpenAI Service. It is open source, so you don't have to worry about your data or API secret being stolen.

This article will show you how to set up this proxy.

1. Create a Cloudflare account

If you don't have a Cloudflare account, you can create one for free at Cloudflare.

2. Create a Cloudflare worker

After you have created a Cloudflare account, click Workers & Pages in the left sidebar, then click Create Application.

Inside the Create Application page, click the Create Worker button, then click Deploy to initialize a new worker. The worker name you see in this step will become part of the worker URL, so you can change it to whatever you like.

When it is done, you will see an Edit code button; click it to open the worker editor.

3. Install the proxy script

In the worker editor, you will see a worker.js file. Delete all of its content and paste in the contents of cf-openai-azure-proxy/cf-openai-azure-proxy.js.

Then find the lines shown in the following snippet and replace the variables with your own values.

// The name of your Azure OpenAI Resource.
const resourceName="Your Azure OpenAI Resource Name"

// The deployment name you chose when you deployed the model.
// Mapping Azure OpenAI Service model deployment name with OpenAI model name.
const mapper = { 
    'gpt-3.5-turbo': "Your GPT-3.5 model deployment name",
    'gpt-4': "Your GPT-4 model deployment name"
};

const apiVersion="2023-05-15"; //  Azure OpenAI API version, "2023-05-15" or newer.

// ...
// ...
// ...
// skip some code

const deployName = mapper[modelName] || 'Your model deployment name'; // set a default model deployment name here to avoid possible errors.
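To illustrate how the mapper and the default work together, here is a small sketch of the lookup the worker performs. The deployment names are placeholders, and `gpt-4o` is just an example of a model you haven't mapped.

```javascript
// Placeholder deployment names — substitute your own.
const mapper = {
  "gpt-3.5-turbo": "my-gpt35-deployment",
  "gpt-4": "my-gpt4-deployment",
};

// A mapped model resolves to its deployment name...
const known = mapper["gpt-4"] || "my-default-deployment";

// ...while an unmapped model falls back to the default, so requests
// for models you forgot to map still reach a valid deployment.
const unknown = mapper["gpt-4o"] || "my-default-deployment";

console.log(known, unknown);
```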

Finally, click the Save and Deploy button to deploy the worker.

4. Test the proxy

Set the worker URL as the API base URL in your third-party OpenAI API client, use your Azure OpenAI key as the API key, and it should work.
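If you'd rather test it by hand, the request looks exactly like a call to the official OpenAI API, just aimed at your worker. This is a sketch: the worker URL and API key below are placeholders, and `buildChatRequest` is a helper I made up for illustration.

```javascript
// Placeholder — substitute your own worker URL.
const workerUrl = "https://<your-worker-name>.<your-subdomain>.workers.dev";

function buildChatRequest(apiKey, userMessage) {
  // The request shape is identical to OpenAI's /v1/chat/completions.
  return {
    url: `${workerUrl}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Send your Azure OpenAI key as a standard Bearer token,
        // just as an OpenAI client would.
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: "gpt-3.5-turbo", // mapped to your Azure deployment by the worker
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

const req = buildChatRequest("your-azure-openai-api-key", "Hello!");
// To actually send it: fetch(req.url, req.options)
console.log(req.url);
```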

If you find this article useful, please consider donating to me.