Nginx stalled cache updating
For now we’ll keep both options enabled, and later we should run benchmarks with different file types and sizes to determine optimal gzip usage on a larger scale.

Nowadays the only viable option to run PHP on Nginx is via FastCGI, using the PHP FastCGI Process Manager (PHP-FPM). In our case there are two distinct scenarios for response time. The simple one is when a static file is requested (e.g.
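As a rough sketch, enabling gzip in the Nginx configuration might look like the following. The directive values here are illustrative assumptions, not the article's measured settings:

```nginx
# Hypothetical nginx.conf fragment -- values chosen for illustration only.
http {
    # Compress responses on the fly.
    gzip            on;
    gzip_comp_level 5;   # trade-off between CPU cost and compression ratio
    gzip_types      text/plain text/css application/json application/javascript;

    # Serve pre-compressed .gz files when present
    # (requires Nginx built with the gzip_static module).
    gzip_static     on;
}
```

Benchmarking different `gzip_comp_level` values against typical file sizes would show where the CPU cost of compression outweighs the bandwidth savings.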
Both were built on Windows 7 Ultimate (32-bit and 64-bit for the respective versions of Nginx) using Cygwin.

Even search engines dislike slow servers and decrease their ranking. In our article a few months ago we asked what is the fastest web server in the world. The results, combined with other arguments (open source, ease of use, security), led us to decide on Nginx as our preferred general web server for new web services.

With 100 concurrent requests arriving all the time, the server resources are quickly saturated, so we did another benchmark using only 10 concurrent requests and 80 requests in total:

$ ab -n 80 -c 10 -g
Requests per second: 34.04 [#/sec] (mean)
Time per request: 293.776 [ms] (mean)

$ ab -n 80 -c 10 -g
Requests per second: 26.50 [#/sec] (mean)
Time per request: 377.311 [ms] (mean)

Comparing the 6 milliseconds of a static page to the roughly 300 milliseconds of a PHP page tells us that serving PHP is about 50 times heavier, and an obvious target for our optimization.

As the virtual server in question has two CPU cores, the first thing to do was to match the Nginx worker process count to the core count, which we changed in the configuration. All of these optimizations are likely to affect the total request time by just a few milliseconds, so with the static-file benchmark even small changes were more prominent, but changes on the scale of 5 ms to 4 ms are still really tiny. The option multi_accept makes the worker process accept all new connections at once instead of serving one at a time. Again, the results are minimal.
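The two event-loop tweaks described above could be sketched as the following configuration fragment, assuming the two-core virtual server mentioned in the text (the directives are standard Nginx; the values are assumptions for illustration):

```nginx
# Match the worker process count to the two CPU cores of the virtual server.
worker_processes 2;

events {
    # Accept all pending connections in one go instead of one per
    # event-loop iteration.
    multi_accept on;
}
```

Since Nginx workers are event-driven and non-blocking, more workers than cores rarely helps; matching the core count avoids unnecessary context switching.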
The jumps between 4 and 5 for the two changes reflect the point where the response time is rounded to 5 instead of 4 milliseconds.