11 Dec 2014

My Most Useful MySQL Performance Tools

Here is a list of the tools I find most useful when doing performance audits.
Please note, I am writing this mainly for myself: I sometimes end up hunting for these commands in my other blog post about mastering indexing, and a few of them have changed over the years, so keeping them in one place should save me time.

Regular Slow Log Report
pt-query-digest slow_query.log  >slow.txt
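All of the reports below assume the slow query log is actually being written. A minimal runtime setup (the 0.5 s threshold is just an illustration - pick your own; add the same settings to my.cnf to survive a restart):

```sql
-- Enable the slow query log until the next server restart
SET GLOBAL slow_query_log = ON;
SET GLOBAL long_query_time = 0.5;               -- log statements slower than 0.5 s
SET GLOBAL log_queries_not_using_indexes = ON;  -- optional, and can be noisy
```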

All Queries (that use indexes) for a certain table
pt-query-digest slow_query.log  --filter '($event->{fingerprint} =~ m/^(?:select|update|delete)/) &&  ($event->{arg} =~ m/mytable /)' --limit=100% >mytable.txt

Longest running SELECT queries - the most painful queries, with the response time % right next to each.
pt-query-digest slow_query.log  --filter '($event->{fingerprint} =~ m/^(?:select)/)'  --order-by=Query_time:max > select.txt

Filter the slow log for all Select queries for a certain table
pt-query-digest slow_query.log  --filter '($event->{fingerprint} =~ m/^(?:select)/) &&  ($event->{arg} =~ m/mytable /)' --no-report --output=slowlog >mytable.log

Find unused indexes from pre-filtered table's logs  
pt-index-usage mytable.log --host 127.0.0.1 --tables mytable >mytable_indexes.txt 


Find Top 15 Largest tables on a server (use with caution) - (from www.mysqlperformanceblog.com)
-----------------------------
delimiter $$
create procedure dba.largest()
begin

SELECT CONCAT(TABLE_SCHEMA, '.', TABLE_NAME) AS TABLE_NAME,
CONCAT(ROUND(TABLE_ROWS / 1000000, 2), 'M') ROWS,
CONCAT(ROUND(DATA_LENGTH / ( 1024 * 1024 * 1024 ), 2), 'G') DATA,
CONCAT(ROUND(INDEX_LENGTH / ( 1024 * 1024 * 1024 ), 2), 'G') IDX,
CONCAT(ROUND(( DATA_LENGTH + INDEX_LENGTH ) / ( 1024 * 1024 * 1024 ), 2), 'G') TOTAL_SIZE,
ROUND(INDEX_LENGTH / DATA_LENGTH, 2) IDXFRAC
FROM INFORMATION_SCHEMA.TABLES
ORDER BY DATA_LENGTH + INDEX_LENGTH DESC
LIMIT 15;
end $$
delimiter ;
---------------------------------
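Once the procedure is created, running the report is just:

```sql
CALL dba.largest();
-- and when it is no longer needed:
-- DROP PROCEDURE dba.largest;
```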

ToDo:
Replay the filtered log against a test database with the Percona Toolkit log player to test index improvements before touching production.


3 comments:

  1. Still using slow query log analysis? :-) Why not VividCortex?

  2. I like sticking to old technologies. Hence, MySQL and relational dbs.

  3. I have been using MONyog - MySQL Monitoring Tool and am pretty amazed by it. It is an agent-less monitoring tool, yet it is still able to access and work with the slow query log of the MySQL server. Earlier I used to write cron jobs for similar tasks.

    Not just the slow query log - it also pulls data from the general query log and the MySQL error log and presents it to the user in a very intuitive way. It uses queries like SHOW GLOBAL VARIABLES, SHOW GLOBAL STATUS, SHOW SLAVE STATUS, SHOW FULL PROCESSLIST etc. to get data about the MySQL server status at intervals specified by the user.

    From MySQL 5.5 onwards it makes use of the Performance Schema database to fetch this data with almost negligible overhead. I am surprised at the amount of data it provides to the DBA. You should definitely give it a try.
