I have an e-commerce system that is a complete PHP spaghetti mess.
I am running way Way WAY too many database queries. I want to share how I am tackling this mess.
I am using the MySQL general log to record the queries that are being run, writing them to both a table and a file. At first I was using this to see which queries were run most often, and started caching those in memcache.
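For reference, where the general log ends up is controlled by log_output; something like this writes it to both the mysql.general_log table and a file (the path is just an example):

SET GLOBAL log_output = 'TABLE,FILE';
SET GLOBAL general_log_file = '/var/log/mysql/general.log';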
Today I decided: why don’t I find out which pages are causing the most queries and optimize those, instead of individual queries? For example, my first test showed my product page took 110 queries.
1) I am executing this https://gist.github.com/1958901 on every page when I initialize the DB.
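I won’t paste the gist here, but the idea is a throwaway statement that tags the connection with the page being rendered, so it shows up in that thread’s log entries. Roughly something like this (the URI is just a placeholder; the real version is in the gist):

SET @page = '/product/123';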
2) I turn on the general log for a few minutes:
set global general_log=1;
(wait a few minutes)
set global general_log=0;
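One other note: the general_log table just keeps growing, so if you sample more than once it helps to clear it out before turning the log back on, so each run starts clean:

TRUNCATE TABLE mysql.general_log;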
3) I am using this query to find out which threads are running the most queries:
SELECT general_log.thread_id, count(*) as c from general_log where general_log.event_time like '2012-03-02%' and argument like 'SELECT%' group by thread_id order by c DESC limit 10
4) I can then use the thread_id to find out what it was doing. The @page from the gist above will be one of the first queries, so I know where to start analyzing.
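For example, to pull everything a given thread did, in order (12345 is just a placeholder for a thread_id from the query above):

SELECT event_time, argument from general_log where thread_id = 12345 order by event_time;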
I hope this helps someone else.