WOPR27

The Workshop on Performance and Reliability (WOPR) announces its 27th Workshop. WOPR27 will be hosted by Medidata in New York on October 24-26, 2018. The traditional Pre-WOPR Dinner will take place on Tuesday, October 23rd.

Julian Harty is the Content Owner for WOPR27.

How Production Data, Logging, and Analytics are Used in Performance and Reliability

We are often asked to improve the performance of our systems. Beyond system resource metrics and response times, we seek out, aggregate, and analyze measurable, meaningful data from those systems. We do so to assess how they perform, how they scale, how they fail, how they serve our customers, and how our customers choose to interact with them. We then, hopefully, use this data to improve the performance of our systems and, perhaps, of our work.

Inspecting data and analytics from deployed contexts provides powerful feedback for many stakeholders, including capacity-planning and product teams. What are our customers, systems, and software telling us with the data they create?

We are always looking for new measurements and signals to help us better understand our systems, our customers’ expectations, and our work. We use data to try to answer meaningful questions like:

  • How long are people engaged with our applications?
  • How long does our development team take to completely deliver a unit of customer value?
  • What versions are our customers still using?
  • What operating systems and devices are they using?
  • How patient are our customers when waiting for help?
  • What is our reliability in delivering multimedia and complex interactions over last-mile conditions?
  • When do our customers decide to buy? And when do our customers decide to leave?
  • How does our software perform in specific geographic regions?
  • What should we build next, and how should we test it?

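As a small illustration of the kind of analysis behind the first question above, here is a minimal sketch that estimates per-session engagement time from timestamped events. The event list, session-gap threshold, and field layout are all hypothetical; real telemetry would arrive through a logging pipeline, not an inline list.

```python
from datetime import datetime, timedelta

# Hypothetical app events: (user_id, ISO timestamp).
events = [
    ("u1", "2018-10-24T09:00:00"), ("u1", "2018-10-24T09:05:00"),
    ("u1", "2018-10-24T13:00:00"),  # long gap -> new session
    ("u2", "2018-10-24T10:00:00"), ("u2", "2018-10-24T10:20:00"),
]

SESSION_GAP = timedelta(minutes=30)  # inactivity that ends a session

def session_durations(events):
    """Split each user's timestamped events into sessions and
    return the duration of every session, in minutes."""
    by_user = {}
    for user, ts in events:
        by_user.setdefault(user, []).append(datetime.fromisoformat(ts))
    durations = []
    for stamps in by_user.values():
        stamps.sort()
        start = prev = stamps[0]
        for t in stamps[1:]:
            if t - prev > SESSION_GAP:  # gap too long: close session
                durations.append((prev - start).total_seconds() / 60)
                start = t
            prev = t
        durations.append((prev - start).total_seconds() / 60)
    return durations

print(sorted(session_durations(events)))  # per-session minutes
```

The choice of gap threshold is itself an analytics question: too short and one visit splinters into many sessions, too long and separate visits merge.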
Please come share your experiences in collecting, analyzing, and making sense of such data and analytics.

We want to hear how you have used production data and analytics to change, improve, tune, or otherwise modify your systems, along with any resulting changes to the methodologies and processes by which your team creates and manages those changes.

Stories (experience reports) about how we ourselves perform are also of interest, yet seldom studied by us; at an organisational level, however, it is a topic for business leaders, HR, and the Board. Harvard Business Review, among others, researches the human-performance aspects, considering ways people can use data to improve how they work. Your experiences assessing your own performance are also welcome at this WOPR.

Software development provides rich seams of data we can analyse. Buse and Zimmermann described some of this in their work Analytics for Software Development, and IEEE Software has published two issues on the theme of Software Analytics. Much less has been published or discussed about ways we can measure, design, deploy, use, collect, or analyse data in its many and varied forms related to software testing, particularly regarding the performance and robustness of software and systems.

This WOPR’s theme is data, and the analytics we may be able to usefully perform with it. We encourage your experience reports and insights into data and analysis, for example:

  • On modelling test data based on analytics of real world usage, user-supplied-data, and user-related data.
  • On building observability into systems, and the design and implementation of logging and telemetry (including analytics).
  • On using analytics to assess the performance and reliability of our systems, over time and during incident response.
  • On our performance, and the comparisons with feedback from the field (both human-initiated and digital).
  • On obviating some testing based on confidence gained from production use.
  • On ways data about how the software is actually used has provided insights that improved our subsequent testing.
  • On the complex and faceted challenges of balancing user-choice, privacy, ethics, and preserving confidentiality compared with the potential value of collecting and mining the data.
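For the observability topic above, a minimal structured-logging sketch can make the idea concrete: emit each event as one JSON object so downstream analytics can aggregate latency, error rates, and so on. The field names here (event, request_id, duration_ms) are illustrative choices, not a standard.

```python
import json
import logging
import time
import uuid

# One JSON object per log record keeps the output machine-readable.
logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("telemetry")

def handle_request(name, work):
    """Run `work`, then log a structured record with its outcome
    and duration. Returns the work's result (None on failure)."""
    request_id = str(uuid.uuid4())
    start = time.perf_counter()
    try:
        result = work()
        status = "ok"
    except Exception:
        result, status = None, "error"
    log.info(json.dumps({
        "event": "request_handled",
        "request_id": request_id,
        "name": name,
        "status": status,
        "duration_ms": round((time.perf_counter() - start) * 1000, 2),
    }))
    return result

handle_request("checkout", lambda: sum(range(1000)))
```

Designing which fields to emit, and which to deliberately omit, is exactly the privacy-versus-value trade-off raised in the last bullet.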

Different domains may bring specific sources of information and particular challenges, such as confidentiality, privacy, personal safety, and other considerations. For instance, mobile apps and their app stores provide rich channels of data and feedback for particular apps, while medicine and drug development is highly regulated, so the data can be extremely sensitive.

An example for a mobile app

As an illustration of using data analytics, this diagram shows a timeline for our work and various sources of information and feedback in a mobile context.

The past provides an evolving archive of our work and includes artifacts (software releases, tests, bugs, human perceptions, and so on). Much of the feedback we receive ‘today’ is on past work. If our software sends data we can analyse it to learn more about how our software is used and behaves. Timely feedback about how our software is currently behaving may help us take practical decisions that improve the service for users. We can use data from the various sources to help shape our work related to future releases and future practices. We can also refine our software to collect increasingly relevant data while pruning the collection of unnecessary or ill-advised/ill-chosen data.

Conference Location and Dates

WOPR27 will be hosted by Medidata in New York on October 24-26, 2018. The traditional Pre-WOPR Dinner will take place on Tuesday, October 23rd.

If you would like to attend WOPR27, please submit your application soon. We will begin sending invitations in late August.

About WOPR

WOPR is a peer workshop for practitioners to share experiences in system performance and reliability, allow people interested in these topics to network with their peers, and to help build a community of professionals with common interests. Participants are asked to share first-person experience reports which are then discussed by the group. More information about Experience Reports is available at http://www.performance-workshop.org/experience-reports/.

WOPR is not vendor-centric, consultant-centric, or end user-centric, but strives to accommodate a mix of viewpoints and experiences. We are looking for people who are interested in system performance, reliability, testing, and quality assurance.

WOPR has been running since 2003, and over the years has included many of the world’s most skillful and well-known performance testers and engineers. To learn more about WOPR, visit our About page, connect at LinkedIn and Facebook, or follow @WOPR_Workshop.

Costs

WOPR is not-for-profit. We do ask WOPR participants to help us offset expenses, as employers greatly benefit from the learning their employees can get from WOPR. The expense-sharing amount for WOPR27 is $300 USD. If you are invited to the workshop, you will be asked to pay the expense-sharing fee to indicate acceptance of your invitation. We are happy to discuss the fee if needed.

Applying for WOPR

WOPR conferences are invitation-only and sometimes over-subscribed. For WOPR27, we plan to limit attendance to about 20 people. We usually have more applications and presentations than can fit into the workshop; not everyone who submits a presentation will be invited to WOPR, and not everyone invited to WOPR will be asked to present.

Our selection criteria are weighted heavily towards practitioners, and interesting ideas expressed in WOPR applications. We welcome anyone with relevant experiences or interests. We reserve seats to identify and support promising up-and-comers. Please apply, and see what happens.

The WOPR organizers will select presentations, and invitees will be notified by email beginning in late August. You can apply for WOPR27 here.
