Mastering NGINX: Configurations for Load Balancing, Proxying, SSL, Caching, and More

1. NGINX as a Load Balancer

This configuration distributes incoming traffic across multiple backend servers, using NGINX's default round-robin method:

http {
    upstream backend {
        server backend1.example.com;
        server backend2.example.com;
        server backend3.example.com;
    }

    server {
        listen 80;

        location / {
            proxy_pass http://backend;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }
}
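
NGINX defaults to round-robin balancing. A minimal sketch of common variations, reusing the same hypothetical backend hosts (the weight and backup flag here are illustrative assumptions, not recommendations):

upstream backend {
    least_conn;                            # send each request to the server with the fewest active connections
    server backend1.example.com weight=3;  # receives roughly three times the traffic of the others
    server backend2.example.com;
    server backend3.example.com backup;    # only used when the primary servers are unavailable
}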

2. NGINX as a Reverse Proxy

This setup proxies requests to a single backend server:

server {
    listen 80;

    location / {
        proxy_pass http://backend.example.com;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
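
In practice the proxied connection is often tuned with timeouts and buffering as well. A hedged sketch of such additions inside the same location block (the values are illustrative):

location / {
    proxy_pass http://backend.example.com;
    proxy_set_header Host $host;

    proxy_connect_timeout 5s;    # time allowed to establish a connection to the backend
    proxy_read_timeout 60s;      # time allowed between two successive reads from the backend
    proxy_send_timeout 60s;      # time allowed between two successive writes to the backend
    proxy_buffering on;          # buffer backend responses before sending them to the client
}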

3. NGINX as a Web Server

This configuration serves static files directly:

server {
    listen 80;
    server_name example.com;

    root /var/www/html;
    index index.html index.htm;

    location / {
        try_files $uri $uri/ =404;
    }
}
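
Static sites usually also benefit from long-lived browser caching for assets. One possible sketch, assuming the assets live under the same document root (the extension list and lifetime are illustrative):

location ~* \.(css|js|png|jpg|jpeg|gif|svg|woff2)$ {
    expires 30d;                       # let browsers cache these assets for 30 days
    add_header Cache-Control "public";
    access_log off;                    # skip access logging for asset hits
}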

4. NGINX as an SSL Termination Proxy

This setup handles HTTPS requests and proxies them to an HTTP backend server:

server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    location / {
        proxy_pass http://backend.example.com;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}
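
A companion server block on port 80 is typically added so plain HTTP requests are redirected to HTTPS, and the accepted TLS versions are often restricted. A sketch:

server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;   # redirect all plain HTTP traffic to HTTPS
}

# Inside the 443 server block, modern deployments commonly also set:
# ssl_protocols TLSv1.2 TLSv1.3;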

5. NGINX as a Cache

This configuration caches responses from a proxied backend server:

proxy_cache_path /data/nginx/cache levels=1:2 keys_zone=my_cache:10m max_size=1g inactive=60m use_temp_path=off;

server {
    listen 80;

    location / {
        proxy_cache my_cache;
        proxy_pass http://backend.example.com;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        add_header X-Cache-Status $upstream_cache_status;
    }
}
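
Cache behaviour is usually refined with validity periods and stale-serving rules. A hedged sketch of directives that could be added alongside proxy_cache in the same location block (the durations are illustrative):

location / {
    proxy_cache my_cache;
    proxy_pass http://backend.example.com;

    proxy_cache_valid 200 302 10m;                 # cache successful responses for 10 minutes
    proxy_cache_valid 404 1m;                      # cache not-found responses briefly
    proxy_cache_use_stale error timeout updating;  # serve stale content while the backend is down or being refreshed
    proxy_cache_lock on;                           # let only one request populate a missing cache entry at a time
    add_header X-Cache-Status $upstream_cache_status;
}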

6. NGINX as a WebSocket Proxy

This setup proxies WebSocket traffic:

server {
    listen 80;

    location /ws/ {
        proxy_pass http://backend.example.com;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
        proxy_set_header Host $host;
    }
}
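
When the same endpoint serves both WebSocket and plain HTTP clients, the Connection header is commonly derived from $http_upgrade via a map block in the http context. A sketch of that pattern:

http {
    # "upgrade" when the client requests a WebSocket, "close" otherwise
    map $http_upgrade $connection_upgrade {
        default upgrade;
        ''      close;
    }

    server {
        listen 80;

        location /ws/ {
            proxy_pass http://backend.example.com;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection $connection_upgrade;
            proxy_set_header Host $host;
            proxy_read_timeout 3600s;   # keep long-lived WebSocket connections open
        }
    }
}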

7. NGINX for Rate Limiting

This configuration limits each client IP address to 10 requests per second:

http {
    limit_req_zone $binary_remote_addr zone=one:10m rate=10r/s;

    server {
        listen 80;

        location / {
            limit_req zone=one burst=5;
            proxy_pass http://backend.example.com;
        }
    }
}
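
The burst parameter queues short spikes above the 10 r/s rate. Adding nodelay forwards queued requests immediately instead of pacing them, and limit_req_status changes the rejection code from the default 503. A sketch of these variations (the burst size is illustrative):

location / {
    limit_req zone=one burst=20 nodelay;   # absorb bursts of up to 20 requests without delaying them
    limit_req_status 429;                  # reply with 429 Too Many Requests when the limit is exceeded
    proxy_pass http://backend.example.com;
}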

Each role can be customized further depending on your application’s requirements. Make sure to test your configuration (for example with nginx -t) before deploying it to production.
