I have a (large) software project in a git repository. There is a benchmark suite that spits out lots of numbers (time per benchmark, allocations done per benchmark) I’d like to track.
What is the best way to get a grip on intended and unintended performance changes in such a setup?
(Ideally, I think, there is a piece of free software that takes a git repository, a script to obtain the numbers, schedules jobs for new commit, also checks old commits when idle and presents a slick web interface with lots of graphs and analysis tools.)
I have a limited amount of experience with Jenkins. One thing I really liked about it was the dashboard display that had been set up by my colleagues, using plugins for Jira, Subversion, and Clover (code coverage). Jenkins supports quite a number of plugins and you can develop them yourself.
I know that Hudson offers similar extensibility.
I believe that the best approach for your case is using Jenkins with the Performance Plugin. It has nice graphs and can be used both with JMeter and SoapUI.
The performance of the tests themselves also matters: according to Martin Fowler, it is better to run fast unit tests on every commit and slower integration and performance tests every few hours.
> What is the best way to get a grip on intended and unintended performance changes in such a setup?
I don’t know anything about your benchmark suite, but as far as I understand the question, this can be implemented as an automatic test (I assume you already have a test suite with integration or acceptance tests in place). What you need is:
- a way to record the performance values in machine-readable form
- a way to store the recorded values as reference values
- a way to compare the values from “the latest run” with the reference values and, if those values differ “too much”, to flag this as a “failed test”
Maybe you have to implement these three things with some scripts of your own, but as long as your test suite allows unattended execution, it should not be too hard.
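As a rough illustration, the three steps above could be sketched as follows. Everything here is an assumption to adapt to your setup: the JSON layout, the file names `reference.json` and `latest.json`, and the 10% tolerance are all placeholders, not a prescription.

```python
#!/usr/bin/env python3
"""Compare the latest benchmark results against stored reference values.

Sketch under these assumptions: the benchmark suite can emit its numbers
as JSON, e.g. {"sort": {"time_s": 1.0, "allocations": 100}}, and
"too much" is a relative tolerance you tune per project (or per metric).
"""
import json
import sys

TOLERANCE = 0.10  # flag a metric that regresses by more than 10%


def load(path):
    """Read one recorded benchmark run from a JSON file."""
    with open(path) as f:
        return json.load(f)


def compare(reference, latest, tolerance=TOLERANCE):
    """Return a list of (benchmark, metric, ref_value, new_value) regressions."""
    regressions = []
    for bench, ref_metrics in reference.items():
        new_metrics = latest.get(bench)
        if new_metrics is None:
            continue  # benchmark removed or renamed; decide your own policy
        for metric, ref_value in ref_metrics.items():
            new_value = new_metrics.get(metric)
            if new_value is None:
                continue
            if ref_value > 0 and (new_value - ref_value) / ref_value > tolerance:
                regressions.append((bench, metric, ref_value, new_value))
    return regressions


if __name__ == "__main__" and len(sys.argv) == 3:
    # Usage: compare_perf.py reference.json latest.json
    failures = compare(load(sys.argv[1]), load(sys.argv[2]))
    for bench, metric, old, new in failures:
        print(f"REGRESSION {bench}/{metric}: {old} -> {new}")
    sys.exit(1 if failures else 0)  # non-zero exit = "failed test"
```

The non-zero exit code is what lets any build server treat a regression like any other failed test.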
So whenever your regular (integration) test suite is run (for example, as part of a “nightly build”), you also run the performance test suite, and you will be informed about unwanted changes in the same manner as about any other failed test.
Keep in mind that you have to define what “too much” means. Also keep in mind that performance results can be extremely hardware dependent, so if you have to run the test suite on different machines, or if the build server is doing too much other work in parallel during the tests, expect the results to differ.
So before throwing some arbitrary tool at the data to produce nicely coloured charts that you have to inspect manually after each run, check whether that is really appropriate for your case, or whether a fully automatic approach without any graphics suits your needs better.
Of course, charts may be helpful for analysing the root cause whenever performance drops, but just for detecting a deviation I would avoid anything that needs manual intervention.
EDIT: Rereading your question, I am wondering whether you are using an automatic build server, or whether you currently do all builds locally. If you have no build server, that may indeed be the first thing to start with (independently of your performance tests). Afterwards you can integrate some automatic tests on that platform, and then the performance tests. If you don’t want to set up such a system, the minimum you need is a single script that checks out the complete source code into a clean directory and compiles everything in one step. Afterwards you can try to integrate the automatic run of the performance tests. But I guess a server system is more appropriate, because otherwise you will block your workstation for a certain amount of time with each test run.
Some inspiration for what is possible can be found in these existing performance dashboards. Unfortunately, not all of them are free software, and none of them seems to be ready to use by others:
- https://datazilla.mozilla.org/
- http://goperfd.appspot.com/perf
- http://speed.pypy.org/
- https://chromeperf.appspot.com/