Setup OpenAI Reverse Proxy With NGINX for ChatGPT
Set up an OpenAI reverse proxy with NGINX, with step-by-step instructions for seamless integration into applications like Janitor AI and other services.
OpenAI, a renowned leader in AI research, offers an API that enables developers to leverage its powerful language models. This article provides a step-by-step guide to setting up an OpenAI reverse proxy with NGINX on an Ubuntu 22.04 machine, using a subdomain secured with a free Let’s Encrypt SSL certificate. This setup allows you to efficiently integrate AI capabilities into applications like Janitor AI, Venus AI, and more.
Benefits of an OpenAI Reverse Proxy
An OpenAI reverse proxy, in combination with NGINX, offers several advantages:
- Performance: By configuring a reverse proxy, you can cache OpenAI API responses, reducing latency and improving overall performance for your users.
- Scalability: The reverse proxy acts as an intermediary between your application and the OpenAI API, enabling you to scale your AI integration seamlessly.
- Security: A reverse proxy can add an extra layer of security by shielding sensitive API keys and protecting your backend infrastructure from direct external access.
Let’s begin configuring the OpenAI reverse proxy with NGINX.
Prerequisites
- A Linux machine with an external IP address, so that you can configure a subdomain and install SSL.
- A user with sudo privileges or root access.
Initial Setup
Start by updating the packages to the latest version available.
sudo apt update
sudo apt upgrade -y
Install NGINX for OpenAI Reverse Proxy
You can install NGINX easily with a single command.
sudo apt install nginx
Verify the NGINX installation using the below command.
sudo service nginx status
You will see an output of the status of NGINX (active or failed).
Configure OpenAI Reverse Proxy With NGINX
Now, you need to remove the default configuration that ships with the NGINX installation.
sudo rm -rf /etc/nginx/sites-enabled/default
sudo rm -rf /etc/nginx/sites-available/default
Create a new configuration for OpenAI reverse proxy.
Create a new file inside the NGINX sites-available directory.
sudo nano /etc/nginx/sites-available/reverse-proxy.conf
Copy the entire configuration listed below into the editor, making sure to replace the following placeholders:
- OPENAI_API_KEY with the API key you get from the OpenAI platform.
- YOUR_DOMAIN_NAME with your domain name.
/etc/nginx/sites-available/reverse-proxy.conf:

```nginx
proxy_ssl_server_name on;

server {
    listen 80;
    server_name YOUR_DOMAIN_NAME;

    proxy_http_version 1.1;
    proxy_set_header Host api.openai.com;
    proxy_busy_buffers_size 512k;
    proxy_buffers 4 512k;
    proxy_buffer_size 256k;

    location ~* ^\/v1\/((engines\/.+\/)?(?:chat\/completions|completions|edits|moderations|answers|embeddings))$ {
        proxy_pass https://api.openai.com;
        proxy_set_header Authorization "Bearer OPENAI_API_KEY";
        proxy_set_header Content-Type "application/json";
        proxy_set_header Connection '';
        client_body_buffer_size 4m;
    }
}
```
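The location block above uses a case-insensitive regular expression to whitelist only specific API paths. As a quick illustration (the pattern below is transcribed from the NGINX config, with the slashes unescaped as Python does not require escaping them), this sketch shows which request paths the proxy will forward:

```python
import re

# Pattern transcribed from the NGINX location block; ~* means case-insensitive.
PATTERN = re.compile(
    r"^/v1/((engines/.+/)?(?:chat/completions|completions|edits|moderations|answers|embeddings))$",
    re.IGNORECASE,
)

def is_proxied(path: str) -> bool:
    """Return True if the proxy forwards this request path to the OpenAI API."""
    return PATTERN.match(path) is not None

print(is_proxied("/v1/chat/completions"))           # True
print(is_proxied("/v1/engines/davinci/completions"))  # True: legacy engines path
print(is_proxied("/v1/models"))                     # False: not whitelisted
```

Any path outside this whitelist falls through to NGINX's default handling instead of reaching the OpenAI API.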
Hit CTRL + X, then Y, followed by ENTER to save the file and exit the editor.
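As an optional hardening step not covered in the original walkthrough (the file path below is only an assumption; adjust to taste), you can keep the API key out of the main config by moving the Authorization header into a separate, tightly permissioned include file:

```nginx
# /etc/nginx/openai_key.conf -- hypothetical path; make it root-owned, e.g. chmod 600
proxy_set_header Authorization "Bearer OPENAI_API_KEY";
```

Then, inside the location block, replace the inline Authorization line with include /etc/nginx/openai_key.conf; so the key lives in a single file that can be locked down and rotated independently of the rest of the configuration.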
Configuring Proxy Cache (Optional)
You can also configure caching for performance if you need to; simply replace the configuration you added above with the one below. Note that because responses are cached for up to four days (proxy_cache_valid 200 4d), identical requests will be served the same cached answer during that window.
/etc/nginx/sites-available/reverse-proxy.conf:

```nginx
proxy_ssl_server_name on;
proxy_cache_path /server_cache levels=1:2 keys_zone=openai_cache:10m max_size=1g inactive=4d use_temp_path=off;

log_format cache_log '$remote_addr - $remote_user [$time_local] '
                     '"$request" $status $body_bytes_sent '
                     '"$http_referer" "$http_user_agent" '
                     'Cache: $upstream_cache_status';

server {
    listen 80;
    server_name YOUR_DOMAIN_NAME;

    proxy_http_version 1.1;
    proxy_set_header Host api.openai.com;
    proxy_busy_buffers_size 512k;
    proxy_buffers 4 512k;
    proxy_buffer_size 256k;

    location ~* ^\/v1\/((engines\/.+\/)?(?:chat\/completions|completions|edits|moderations|answers|embeddings))$ {
        proxy_pass https://api.openai.com;
        proxy_set_header Authorization "Bearer OPENAI_API_KEY";
        proxy_set_header Content-Type "application/json";
        proxy_set_header Connection '';
        proxy_cache openai_cache;
        proxy_cache_methods POST;
        proxy_cache_key "$request_method|$request_uri|$request_body";
        proxy_cache_valid 200 4d;
        proxy_cache_valid 404 1m;
        proxy_read_timeout 8m;
        proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;
        proxy_cache_background_update on;
        proxy_cache_lock on;
        access_log /dev/stdout cache_log;
        proxy_ignore_headers Cache-Control;
        add_header X-Cache-Status $upstream_cache_status;
        client_body_buffer_size 4m;
    }
}
```
Enable NGINX Configuration for OpenAI Reverse Proxy
sudo ln -s /etc/nginx/sites-available/reverse-proxy.conf /etc/nginx/sites-enabled/reverse-proxy.conf
Test NGINX configuration.
sudo nginx -t
Restart NGINX for the changes to take effect.
sudo systemctl restart nginx
Secure the Setup With Free SSL
Now we will install a free Let’s Encrypt SSL certificate to secure your requests.
Install Certbot using the below command.
sudo apt install python3-certbot-nginx
Now you can install SSL using the certbot command.
Make sure to replace your email and domain name with the real ones.
Important: Your domain should point to the IP address of your server; otherwise, SSL installation fails.
sudo certbot --nginx --redirect --no-eff-email --agree-tos -m yourmail@mail.com -d yourdomain.com
Certbot will now obtain and install the SSL certificate for you.
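Let’s Encrypt certificates expire after 90 days; the certbot package typically sets up automatic renewal via a systemd timer or cron job. As a sanity check (this requires certbot and an issued certificate on your server, so it is only a sketch here), you can simulate a renewal without actually replacing the certificate:

```shell
sudo certbot renew --dry-run
```

If the dry run succeeds, automatic renewal should work unattended.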
Verify OpenAI Reverse Proxy With NGINX
Now, you have your NGINX server configured to work with the OpenAI API. To test whether it works, form the request URL from your domain and a path such as /v1/chat/completions.
The supported endpoints are listed below:
- POST /v1/chat/completions
- POST /v1/completions
- POST /v1/edits
- POST /v1/embeddings
- POST /v1/moderations
- POST /v1/answers
If you make a request to one of these endpoints, you will receive the corresponding response from the OpenAI API.
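As a quick end-to-end check (replace yourdomain.com with your actual domain; this is a sketch that requires your live deployment), you can call the proxy with curl. Since the proxy injects the Authorization header, no API key is needed on the client side:

```shell
curl https://yourdomain.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

A JSON chat completion in the response confirms that NGINX, the API key injection, and SSL are all working together.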
Conclusion
That’s it! You’ve successfully set up an OpenAI API reverse proxy using NGINX on Ubuntu 22.04. You have also installed and configured SSL to protect your API key and requests.
Published at DZone with permission of Pappin Vijak. See the original article here.