WOPR20


WOPR20 was held in New York City on May 14-16, 2013, and was hosted by Liquidnet. Ross Collard was the Content Owner.

Attendees

Bob Binder, Goranka Bjedov, Becky Clinard, Ross Collard, Dan Downing, Rada Fey, Mieke Gievers, Neil Gray, Julian Harty, Matthew Heusser, Doug Hoffman, Andy Hohenner, Paul Holland, Richard Leeke, Jude McQuaid, John Meza, Ammre Mohammad, Eric Proegler, Anna Royzman, Simon "Peter" Schrijver, Mais Tawfik Al-Khoury Ashkar, Mark Tomlinson, Thomas Vaniotis, Glenn Wang, Julie Xie

Theme: The Leading Edge of Performance and Reliability

A key responsibility for every professional is keeping their skills current. Easier said than done! For starters, what counts as current? And what about preparing for the future?

WOPR20 examines trends and innovations in performance testing and related fields, from the perspective of what is relevant to working practitioners, by addressing these kinds of questions:

  • How have your engineering, deployment and testing methods changed recently? How is that going?
  • What kinds of tools are you using in automating testing now?
  • What are you using them for now that you didn’t before?
  • What kinds of quality, reliability, and performance objectives do your organization and your customers look for?
  • How do you engineer and test for these objectives? How do you report against these objectives?
  • How is your organization thinking about quality from a strategy perspective? Have you tried to create “centers of excellence” to drive quality in your organization? How did they turn out?
  • What sorts of problems are you solving today that you didn’t have before? How are you solving them? What new ideas do you have that you haven’t tried yet?

Background

Ten years ago, the first WOPR conference brought together practitioners in performance and reliability. Since then, platforms have evolved almost beyond imagination: virtualization, cloud infrastructure, and the explosion of mobile and tablet devices have changed everything about how web applications are built and delivered. Today, you can buy computing power and network bandwidth that would have cost millions ten years ago, if you could buy them at all, and put them in your pocket.

The counterbalance to Moore’s Law is Wirth’s Law: “Software is getting slower more rapidly than hardware becomes faster.” Software architects assemble stacks on top of platforms to build complex applications. Software re-use is the norm and mash-ups are increasingly widespread: software stacks built on increasing layers of abstraction are easier and “faster” to develop, but less efficient, and usually harder to debug. More ambitious solutions connect more and more disparate technologies with ever-larger datasets.

Software methods and tools have evolved significantly; some say radically. Agile processes are now widely used. Traditional separations between development and operations roles are being smudged and erased. Test automation by developers has become an integral part of many development processes: there is more automation, and it is easier to develop.

After years of stagnation in testing tools, innovation has recently picked back up. Older tools like LoadRunner have been adapted to new uses such as predictive modeling; newer automation libraries and frameworks like Selenium and Cucumber help extend automation more widely across our projects. New load tools with different architectures and licensing models, such as NeoLoad, SOASTA, and BlazeMeter, have changed what kinds of tests can be run.

Practices are still evolving to keep up.

Contexts For Experience Reports

These contexts are likely to yield interesting experiences, though there are surely more we have not considered. In your proposal, please identify which category (or "Other") you think your experience best fits.

New Architectures and Contexts

  • Outsourced infrastructure
  • High frequency trading systems
  • AI-like apps such as Contextual and Location-Based Search
  • Video streaming

Application Performance and Reliability

  • High-availability, always-on 24×365.25 systems
  • Enterprise-wide, end-to-end performance and reliability
  • Managing SLAs for third-party architecture components (Cloud based platforms, etc.)

Testing Tools and Environments

  • Innovative tool uses and adaptations
  • New tools and tool features
  • Automation frameworks

Testing Methodologies

  • Agile performance testing
  • Session-based exploratory testing
  • Predictive Analytics
  • Extending integration/end-to-end testing with performance testing
  • Outsourcing very high volume performance testing

Organization and Culture

  • Performance centers of excellence

Work Products

Matt Heusser wrote an article for CIO Magazine about WOPR20.

Eric Proegler wrote a blog post about WOPR20. 

Neil Gray wrote a blog post about WOPR20.

Private workshop collaboration space here.

