During WOPR22, we took an informal survey of the tools practitioners had used regularly over the preceding year. It is worth remembering that tools are often chosen for us by non-practitioners, for reasons other than fitness for purpose. This survey meets no standard of statistical significance and omits many well-known tools; it reflects only the collective experience of the attendees. Some of us use a wide variety of tools as we move from project to project; others use the same tools regularly.
This work is the product of all of the Workshop on Performance and Reliability (WOPR22) attendees. WOPR22 was held May 21-23, 2014, in Malmö, Sweden, on the topic of “Early Performance Testing”. Participants in the workshop included Fredrik Fristedt, Andy Hohenner, Paul Holland, Martin Hynie, Emil Johansson, Maria Kedemo, John Meza, Eric Proegler, Bob Sklar, Paul Stapleton, Andy Still, Neil Taitt, and Mais Tawfik Ashkar.
| JMeter (x5) | Excel (All the x times) | Fiddler |
| LoadRunner (x4) | DynaTrace | Charles Proxy |
| BlazeMeter (x2) | Ajax DynaTrace | WebPageTest (scripting, too) |
| Visual Studio (x2) | Graphite | Wireshark |
| The Grinder | VisualVM | Shunra (WAN Emulation) |
| eggPlant Performance | qExplorer/Queue Explorer | |
| Silk Performer | New Relic | |
| Load UI Pro/Soap UI Pro/Other UI Pro | Nmon | |
| Spirent Avalanche | Task Manager | |
| AWR/Statspack/Oracle Trace/TX Prof | | |