nginx is cutting off the end of dynamic pages and caching them
Solution 1
Check the permissions of the directories set by these directives:
client_body_temp_path
proxy_temp_path
fastcgi_temp_path
It is likely that one or more of these directories is not writable by nginx. These temp directories act as buffers when nginx handles responses larger than its in-memory buffers; if nginx cannot write to them, it returns only the part of the content it already holds in memory.
You should also see errors similar to the one below in your error log; if not, check your log level and log path again.
2013/10/07 11:01:09 [crit] 3307#0: *33 open() "/var/lib/nginx/tmp/proxy/2/00/0000000002" failed (13: Permission denied) while reading upstream
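As a sketch, the check can be scripted. The paths below are Debian-style defaults and an assumption on my part; your build's actual values come from `nginx -V`:

```shell
#!/bin/sh
# Sketch: report whether each nginx temp directory exists and is writable.
# NOTE: the paths are assumed Debian-style defaults -- run
#   nginx -V 2>&1 | tr ' ' '\n' | grep temp-path
# to see the values compiled into your build.
# Run this as the worker user (e.g. `sudo -u www-data sh check_tmp.sh`),
# since root can write almost anywhere and would mask the problem.
check_writable() {
    dir="$1"
    if [ ! -d "$dir" ]; then
        echo "missing: $dir"
    elif [ -w "$dir" ]; then
        echo "writable: $dir"
    else
        echo "NOT writable: $dir"
    fi
}

for d in /var/lib/nginx/tmp/client_body \
         /var/lib/nginx/tmp/proxy \
         /var/lib/nginx/tmp/fastcgi; do
    check_writable "$d"
done
```

If a directory turns up unwritable, fixing ownership (for example `chown -R www-data:www-data /var/lib/nginx`) is usually enough; the right user depends on the `user` directive in your nginx.conf.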
Solution 2
This may be related to the bug discussed at http://www.ruby-forum.com/topic/4080504 ; try updating nginx.
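To see whether an upgrade is even needed, compare the running version against the release containing the fix. A small helper (hypothetical; it only assumes the stable `nginx -v` banner format, `nginx version: nginx/X.Y.Z`, printed to stderr):

```shell
#!/bin/sh
# Sketch: pull the bare version number out of the `nginx -v` banner so it
# can be compared with the release that contains the fix.
nginx_version() {
    # $1: the banner line, e.g. "nginx version: nginx/1.4.1"
    printf '%s\n' "$1" | sed 's|.*nginx/||'
}

# In practice: nginx_version "$(nginx -v 2>&1)"
nginx_version "nginx version: nginx/1.4.1"   # prints 1.4.1
```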
oliver nadj
Updated on September 18, 2022

Comments
-
oliver nadj almost 2 years
I moved one of my old sites from an Apache server to an nginx server. Everything works fine, but the site has some long content (a +100k generated HTML file).
My first attempt was to disable chunked transfer encoding, but that did not help.
Here is my nginx config:
$ cat /etc/nginx/nginx.conf
user www-data;
worker_processes 1;

error_log /var/log/nginx/error.log;
pid /var/run/nginx.pid;

events {
    worker_connections 1024;
}

http {
    include /etc/nginx/mime.types;

    sendfile on;
    keepalive_timeout 65;
    tcp_nodelay on;

    gzip on;
    gzip_static on;
    gzip_http_version 1.0;
    gzip_disable "MSIE [1-6]\.";
    gzip_vary on;
    gzip_comp_level 1;
    gzip_proxied any;
    gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript application/javascript text/x-js;
    gzip_buffers 16 8k;

    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}

$ cat /etc/nginx/sites-enabled/example.com
server {
    listen 443 ssl;
    server_name example.com;

    access_log /var/log/nginx/example.com.access.log;
    error_log /var/log/nginx/example.com.error.log;

    charset iso-8859-2;
    root /var/www/public/example.com;

    chunkin off;
    chunked_transfer_encoding off;

    location ~ ^.+\.php {
        fastcgi_split_path_info ^((?U).+\.php)(/?.+)$;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_param SCRIPT_NAME $fastcgi_script_name;
        fastcgi_param PATH_INFO $fastcgi_path_info;
        fastcgi_pass 127.0.0.1:9000;
        include fastcgi_params;
    }

    location / {
        index index.php;
        try_files $uri /index.php?$args;
    }

    ssl_certificate /etc/nginx/ssl/server.crt;
    ssl_certificate_key /etc/nginx/ssl/server.key;
}
There are some weird things going on. Firebug shows me the page is being cached, but I don't know why.
UPDATE:
Finally, I can reproduce the issue with the following PHP script:
<?php
//error_reporting(E_ALL ^ E_NOTICE ^ E_DEPRECATED); // the whole content printed as expected
error_reporting(E_ALL & ~E_DEPRECATED); // truncated content

header("Content-Type: text/plain; charset=iso-8859-2");

$i = 500000;
while ($i) {
    $i--;
    printf("%10s", $i);
    if (!($i % 50)) {
        echo "\n";
    }
    $a = $undefined;
}
This script runs and terminates normally if I exclude E_NOTICE from error reporting.
-
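The two masks in the script differ only in whether E_NOTICE stays set, which can be verified with plain shell arithmetic; the constant values (E_NOTICE = 8, E_DEPRECATED = 8192, E_ALL = 32767 on PHP 5.4+) are taken from the PHP manual:

```shell
#!/bin/sh
# Sketch: reproduce the two error_reporting masks from the script above
# using the numeric values of PHP's error constants.
E_NOTICE=8
E_DEPRECATED=8192
E_ALL=32767   # PHP 5.4+; older versions use a smaller value

with_notices=$(( E_ALL & ~E_DEPRECATED ))                 # the mask that truncates
without_notices=$(( (E_ALL ^ E_NOTICE) ^ E_DEPRECATED ))  # the mask that works

echo $(( with_notices & E_NOTICE ))     # 8 -> notices still enabled
echo $(( without_notices & E_NOTICE ))  # 0 -> notices excluded
```

With notices enabled, every `$a = $undefined;` in the loop emits an undefined-variable notice, which presumably is what pushes the response into nginx's disk buffering and trips the permission problem from Solution 1.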
oliver nadj about 11 yearsI just opened a ticket, trac.nginx.org/nginx/ticket/373, and it seems I have a problem with an old version of PHP.
-
oliver nadj about 11 yearsThx @Andrei, I updated nginx to 1.4.1 but the issue still exists.
-
Andrei Mikhaltsov about 11 yearsDoes turning off gzip in nginx have any effect on size or anything?
-
flunder over 9 yearsAfter searching for hours for how to fix a WordPress chunked transfer encoding issue, this finally solved it. Bless!