nginx php5-fpm upstream timed out (110: Connection timed out) while connecting to upstream


Firstly, reduce the PHP-FPM pm.max_requests setting to 100; you want the PHP worker processes to be recycled much sooner than every 10,000 requests.
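
For example, in the [www] pool section of your php-fpm.conf that would look something like this (100 is the suggested threshold; tune it to your workload):

    [www]
    ; recycle each worker after 100 requests rather than 10,000
    pm.max_requests = 100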

Secondly, you've only got one PHP process running with lots of children. This is fine for development, but in production you want more PHP processes, each with fewer children, so that if any one of them goes down for whatever reason the others can take up the slack. So, rather than a ratio of 1:50 as you have now, go for a ratio of 10:5. This will be much more stable.
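
In php-fpm terms that roughly means each instance runs a small, fixed set of children; the mode and values below are illustrative, not taken from your config:

    ; per-instance pool settings (illustrative values, not tuned for your hardware)
    pm = static
    pm.max_children = 5
    pm.max_requests = 100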

To achieve this you may want to look at something like supervisor to manage your PHP processes. We use this in production and it has really helped increase our uptime and reduce the amount of time we spend managing/monitoring the servers. Here's an example of our config:

/etc/php5/php-fpm.conf:

    [global]
    daemonize = no

    [www]
    listen = /tmp/php.socket

/etc/supervisor.d/php-fpm.conf:

    [program:php]
    user=root
    command=/usr/sbin/php-fpm -c /etc/php5/php.ini -y /etc/php5/php-fpm.conf
    numprocs=10
    ; %(process_num)s must appear in process_name when numprocs > 1
    process_name=%(program_name)s_%(process_num)02d
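
Once that program definition is in place, supervisor can be told to pick it up and start the processes; these are the standard supervisorctl commands rather than anything specific to this set-up:

    supervisorctl reread
    supervisorctl update
    supervisorctl status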

/etc/nginx/conf/php.backend:

    upstream backend {
        server unix:/tmp/php.socket;
    }
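
Your server block then passes PHP requests to that upstream with something along these lines (the fastcgi_param line is the usual default, not copied from your config):

    location ~ \.php$ {
        include         fastcgi_params;
        fastcgi_param   SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass    backend;
    }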

EDIT:

As with all server set-ups, don't rely on guess-work to track down where your issues are. I recommend installing Munin along with the various PHP(-FPM) and Nginx plugins; these will help you track hard statistics on requests, response times, memory usage, disk accesses, thread/process levels... all essential when tracking down where the issues are.

In addition, as I mentioned in a comment below, adding both server- and client-side caching to your set-up, even at a modest level, can aid in providing a better experience for users, whether it's using nginx's native caching support or something more specific like varnishd. Even the most dynamic of sites/apps have many static elements which can be held in memory & served faster. Serving these from cache can help reduce the load overall and ensure that those elements which absolutely need to be dynamic have all the resources they need when they need them.
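
As a rough sketch of nginx's native FastCGI caching (the zone name, path and timings below are placeholders, not values from your set-up):

    # in the http block: define a cache zone (path and sizes are placeholders)
    fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=phpcache:10m max_size=256m inactive=10m;

    # in the location passing to PHP: cache successful responses briefly
    fastcgi_cache       phpcache;
    fastcgi_cache_key   "$scheme$request_method$host$request_uri";
    fastcgi_cache_valid 200 301 302 1m;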