OpenAI, a renowned leader in AI research, offers an API that enables developers to leverage its powerful language models. This article provides a step-by-step guide to setting up an OpenAI reverse proxy with Nginx on an Ubuntu 22.04 machine, using a subdomain and a free Let’s Encrypt SSL certificate. This setup allows you to efficiently integrate AI capabilities into applications like Janitor AI, Venus AI and more.
Also Check: How to setup OpenAI Reverse Proxy with Docker in Hugging Face
Benefits of an OpenAI Reverse Proxy
An OpenAI reverse proxy built with Nginx offers several advantages:
- Performance: By configuring a reverse proxy, you can cache OpenAI API responses, reducing latency and improving overall performance for your users.
- Scalability: The reverse proxy acts as an intermediary between your application and the OpenAI API, enabling you to scale your AI integration seamlessly.
- Security: A reverse proxy can add an extra layer of security by shielding sensitive API keys and protecting your backend infrastructure from direct external access.
Let’s start configuring the OpenAI reverse proxy with Nginx.
Prerequisites
- A Linux machine (Ubuntu 22.04 is used here) with a public IP address, so that you can point a subdomain to it and install SSL.
- A user with sudo privileges or root access.
Initial Setup
Start by updating the packages to the latest version available.
sudo apt update
sudo apt upgrade -y
Install Nginx for OpenAI Reverse Proxy
You can install Nginx with a single command.
sudo apt install nginx
Verify the Nginx installation using the command below.
sudo service nginx status
You will see an output of the status of Nginx (active or failed).
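If Nginx is active, you can also confirm that it is serving requests by fetching the default welcome page locally:

curl -I http://localhost

A working installation returns an HTTP/1.1 200 OK response.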
Configure OpenAI Reverse Proxy with Nginx
Now remove the default configuration that ships with the Nginx installation.
sudo rm -rf /etc/nginx/sites-enabled/default
sudo rm -rf /etc/nginx/sites-available/default
Create a new configuration for OpenAI reverse proxy.
Create a new file inside the Nginx sites-available directory.
sudo nano /etc/nginx/sites-available/reverse-proxy.conf
Copy the entire configuration below into the editor.
Make sure to replace the following placeholders:
- OPENAI_API_KEY with the API key you get from the OpenAI platform.
- YOUR_DOMAIN_NAME with your domain name.
proxy_ssl_server_name on;

server {
    listen 80;
    server_name YOUR_DOMAIN_NAME;
    proxy_set_header Host api.openai.com;
    proxy_http_version 1.1;
    proxy_busy_buffers_size 512k;
    proxy_buffers 4 512k;
    proxy_buffer_size 256k;

    location ~* ^/v1/((engines/.+/)?(?:chat/completions|completions|edits|moderations|answers|embeddings))$ {
        proxy_pass https://api.openai.com;
        proxy_set_header Authorization "Bearer OPENAI_API_KEY";
        proxy_set_header Content-Type "application/json";
        proxy_set_header Connection "";
        client_body_buffer_size 4m;
    }
}
Hit CTRL + X, then Y, then ENTER to save and exit the editor.
The next step is to enable the newly created Nginx configuration. If you want response caching, adjust the configuration first as described in the optional section below.
Configuring Proxy Cache (Optional)
You can also configure caching to improve performance. Simply replace the configuration above with the one below.
proxy_ssl_server_name on;

proxy_cache_path /server_cache levels=1:2 keys_zone=openai_cache:10m max_size=1g inactive=4d use_temp_path=off;

log_format cache_log '$remote_addr - $remote_user [$time_local] '
                     '"$request" $status $body_bytes_sent '
                     '"$http_referer" "$http_user_agent" '
                     'Cache: $upstream_cache_status';

server {
    listen 80;
    server_name YOUR_DOMAIN_NAME;
    proxy_set_header Host api.openai.com;
    proxy_http_version 1.1;
    proxy_busy_buffers_size 512k;
    proxy_buffers 4 512k;
    proxy_buffer_size 256k;

    location ~* ^/v1/((engines/.+/)?(?:chat/completions|completions|edits|moderations|answers|embeddings))$ {
        proxy_pass https://api.openai.com;
        proxy_set_header Authorization "Bearer OPENAI_API_KEY";
        proxy_set_header Content-Type "application/json";
        proxy_set_header Connection "";
        proxy_cache openai_cache;
        proxy_cache_methods POST;
        proxy_cache_key "$request_method|$request_uri|$request_body";
        proxy_cache_valid 200 4d;
        proxy_cache_valid 404 1m;
        proxy_read_timeout 8m;
        proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;
        proxy_cache_background_update on;
        proxy_cache_lock on;
        access_log /dev/stdout cache_log;
        proxy_ignore_headers Cache-Control;
        add_header X-Cache-Status $upstream_cache_status;
        client_body_buffer_size 4m;
    }
}
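Note that proxy_cache_path points to /server_cache. Depending on your system, this directory may not exist yet; if Nginx fails to start or cannot write the cache, create the directory and give the Nginx user (www-data on Ubuntu) ownership of it:

sudo mkdir -p /server_cache
sudo chown www-data:www-data /server_cache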
Enable Nginx Configuration for OpenAI Reverse Proxy
sudo ln -s /etc/nginx/sites-available/reverse-proxy.conf /etc/nginx/sites-enabled/reverse-proxy.conf
Test the Nginx configuration.
sudo nginx -t
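If the configuration is valid, the output will look similar to:

nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful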
Restart Nginx for the changes to take effect.
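sudo service nginx restart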
Also Read: How to Setup and Use Janitor AI API with OpenAI Reverse Proxy
Secure the Setup with Free SSL
Now we will install a free Let’s Encrypt SSL certificate to secure your requests.
Install Certbot and its Nginx plugin using the command below.

sudo apt install certbot python3-certbot-nginx
Now you can obtain and install the SSL certificate using the certbot command.
Make sure to replace your email and domain name with the real ones.
Important: Your domain should point to the IP address of your server, otherwise the SSL installation will fail.
sudo certbot --nginx --redirect --no-eff-email --agree-tos -m your-email@example.com -d yourdomain.com
Certbot will now obtain the certificate, install it, and configure Nginx to redirect HTTP traffic to HTTPS.
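Let’s Encrypt certificates are valid for 90 days, and the certbot package typically sets up automatic renewal for you. You can optionally verify that renewal will work with a dry run:

sudo certbot renew --dry-run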
Verify OpenAI Reverse Proxy With Nginx
Your Nginx server is now configured to proxy requests to the OpenAI API. To test it, send a request to your domain at one of the proxied endpoints, such as /v1/chat/completions.
The proxied endpoints are listed below.
- POST /v1/chat/completions
- POST /v1/completions
- POST /v1/edits
- POST /v1/embeddings
- POST /v1/moderations
- POST /v1/answers
If you make a request to any of these endpoints, the proxy forwards it to the OpenAI API and returns the response.
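For example, assuming yourdomain.com is the domain you configured and using gpt-3.5-turbo as an example model, a quick test with curl could look like this. The Authorization header is injected by Nginx, so the client does not need to send the API key:

curl https://yourdomain.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello!"}]}'

A successful request returns a JSON response containing the model’s reply.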
Conclusion
That’s it! You’ve successfully set up an OpenAI API reverse proxy using Nginx on Ubuntu 22.04. You have also installed and configured SSL to protect your API key and requests.