Optimizing a PHP application in 5 minutes

Optimization is the stage that comes after the main development of an application is completed. Optimizing means reducing loading and execution times, improving the user experience by making the application respond more quickly.
When the time comes to optimize your web application, don’t go blindly searching for the bottleneck. Often there are no obvious improvable points in a codebase, and your assumptions about the causes of slowness can be wrong. Profiling is the activity of discovering what is forcing your application to run slower than expected, by analyzing the code execution and tracking where time is spent.
If your language of choice is PHP, fortunately this process takes only 5 minutes.

Minute 1: Xdebug installation
In a shell, accessed directly or via SSH on the server that runs the PHP application, type:

sudo pecl install xdebug

Obviously, if you already have Xdebug on your development server you can skip this step. The pecl binary should create a minimal configuration for you, but if you do not see a line referencing xdebug.so in your php.ini, add it yourself:
zend_extension=/path/to/xdebug.so
The Xdebug extension provides many utilities to PHP developers. One of them is the profiling of code execution.

Minute 2: Xdebug profiling configuration
Under the directive that loads Xdebug in php.ini, add the following lines:

xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp

These directives tell Xdebug to enable the profiler at the start of every PHP script and to write cachegrind files to /tmp after each script has finished running. Cachegrind files are lists of all the function calls made during the script execution, along with their source and information on the elapsed time. Make sure there is enough disk space in the folder you choose to keep them, since their size can quickly reach hundreds of megabytes, and remember to disable the profiling directives once you have finished your optimization work.
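A note for newer setups: the directive names above are those of Xdebug 2, the current version when this article was written. If your server runs Xdebug 3, the profiler is enabled with the renamed settings instead:

```ini
; Xdebug 3 equivalents of the two Xdebug 2 profiler directives above.
xdebug.mode = profile
xdebug.output_dir = /tmp
```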

Minute 3: load a page of your choice
I hope this does not take an entire minute; otherwise a long optimization phase will be mandatory.

Minute 4: installing webgrind
At http://code.google.com/p/webgrind/ you can download webgrind, a web application created to interpret cachegrind files, which are not human-readable. There are other tools for reading cachegrind files, but I prefer a portable web application, since wherever there is PHP, webgrind can be installed.
Simply decompress the package into a folder on your webserver and load its path in the browser. Webgrind is written in PHP 5 and has no dependencies to configure.

Minute 5: loading a cachegrind file and observing the result
Select from the webgrind menu Show 90% of [select a cachegrind file] in percent|milliseconds, and hit Update. After the file has been loaded and analyzed, a list of functions and methods similar to the following will be shown:

Every function comes with a color that distinguishes between userland functions (green), internal PHP functions (red) and language constructs (gray), such as require_once(). You can probably optimize directly only the green functions, but other tools such as the APC cache can improve the constructs as well.
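The gray constructs cannot be rewritten in userland code, but an opcode cache removes the cost of recompiling scripts (and their require_once targets) on every request. APC was the usual choice at the time; in modern PHP the bundled OPcache plays the same role. A minimal php.ini sketch, assuming the extension is available in your build:

```ini
; Opcode cache: compiled scripts are kept in shared memory between requests,
; so require_once() no longer pays the compilation cost on every hit.
zend_extension=opcache.so
opcache.enable=1
```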
The numbers crunched by webgrind comprise the Invocation Count (the number of times the function has been called), the Total Self Cost and the Total Inclusive Cost. The latter is the time elapsed between a function call and the instant it returns a value; the former is the effective time spent in the function body during execution, excluding calls to other functions. You should find obvious optimization spots by observing Total Self Cost, and in fact webgrind’s default ordering uses this metric.
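Since cachegrind files are plain text, the self/inclusive distinction can be checked by hand. The following shell sketch builds a tiny, simplified cachegrind file (the function names and costs are invented, and real Xdebug output carries extra headers and compressed positions) and uses awk to aggregate both metrics: cost lines under an fn= header add to self cost, while the cost line that follows a calls= record is time spent in the callee, which counts only toward inclusive cost.

```shell
# Create a minimal, hand-written cachegrind file (illustrative only).
cat > /tmp/cachegrind.out.demo <<'EOF'
events: Time
fl=index.php
fn=main
1 100
cfn=helper
calls=1 0 0
2 250
fl=index.php
fn=helper
5 250
EOF

# Aggregate self and inclusive cost per function.
awk '
/^fn=/    { fn = substr($0, 4); next }            # start of a function block
/^calls=/ { in_call = 1; next }                   # next cost line belongs to a callee
/^[0-9]+ [0-9]+$/ {
    if (in_call) { incl[fn] += $2; in_call = 0 }  # callee time: inclusive only
    else         { self[fn] += $2; incl[fn] += $2 }
    next
}
END { for (f in self) printf "%s self=%d inclusive=%d\n", f, self[f], incl[f] }
' /tmp/cachegrind.out.demo | sort > /tmp/cachegrind.summary
cat /tmp/cachegrind.summary
```

Here main spends 100 time units in its own body but 350 including the call to helper, which is exactly the Total Self Cost versus Total Inclusive Cost split that webgrind displays.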
I added a long, no-op for loop to the constructor of the NakedService class to simulate a bottleneck like a slow query or a call to a web service. See for yourself what happens when I profile again:

Look for places in your code that can be made faster not with static analysis, but by enabling the profiler during real execution.
I hope the 5 minutes have been well spent. Profiling applications is often necessary, and it should not be difficult once you have this infrastructure in place.
