WOPR9

WOPR9 was held in Redmond, Washington, USA on September 27-29, 2007, and was hosted by Microsoft.

Attendees

Henry Amistadi, Scott Barber, Goranka Bjedov, David Bronner, Rob Carlson, Ankur Chadda, Ross Collard, Dan Downing, Ed Glas, Dawn Haynes, Paul Holland, Rauli Kaksonen, Joon Lee, Richard Leeke, Wilson Mar, Jude McQuaid, Derek Mead, Michael Pearl, Jim Pierson, Eric Proegler, Raymond Rivest, Scott Snyder, Talli Somekh, Roland Stens, Nate Whitmore, Nick Wolf

Theme: Pushing the boundaries of performance testing tools, or when the performance test tool won’t let you do…

Questions we are interested in exploring at WOPR9 include: What limitations do we run into when using our favorite performance test tool? By sharing experiences and working to understand the issues encountered at the boundaries of our tool sets, we will learn how others dealt with them and generate ideas to apply in our own environments.

WOPR9 will explore all aspects of the limits of performance testing tools, including protocols, reporting, analysis, high-volume transactions, integration, coordination with other tools, and budgetary constraints due to licensing models.

Given the topic and the issues under discussion, WOPR9 will be highly technical. However, experiences that directly link tool capabilities to the achievement of your business goals are very welcome as well.

Some questions we are likely to address include:

In which situations was the vanilla performance testing tool not adequate?

What was the cause?

How did you manage to make it work?

If you did not manage to make it work, what was the key issue?

What skills did you need to have?

Were you able to find alternative tools?

Did you need to extend the functionality by programming your own solution?

What category of issues did you run into?

Protocol support?

Extremely high transaction rates?

Critical timing issues?

Too much result data generated?

Canned reports inadequate or not timely?

Bugs in the tool?

Integration with third-party solutions?

Result analysis tools lacking?

What did you do to solve your issues?

What was the business benefit of your solution?

What support did you receive from management for your work?

WOPR9 will give working practitioners (testers) a chance to examine, discuss and brainstorm these questions.
