Seth’s Big List of Publications


Interview: Software Test Pro Radio: Me on Data-Driven Quality

More fun chatting with Mark Tomlinson about DDQ.

Create Your Roadmap to Data-Driven Quality

  • Presentation to STPCon
  • April 15, 2014

This talk walks you through creating a roadmap for leveraging production or near-production data to assess the quality of your product.

An earlier version of this talk was also presented as

Your Path To Data-Driven Quality–ALM Forum 2014

  • Presentation to ALM Forum 2014
  • April 3, 2014

Interview: PerfBytes podcast: Data Driven Performance

Answer me this: how many, how much, averages, standard deviations, frequency distributions, or just plain yes or no? It’s very hard to know whether your testing effort truly has worth if you don’t take a few steps into a few scientific techniques for data analysis. Seth Eliot joins us for a conversation about Data Driven Performance as we learn about the science behind application data, metrics and instrumentation, and how to leverage an understanding of the real world in load testing. We chat about HiPPOs, the International Space Station, zero gravity, motorcycle maintenance and mucus creation.
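The summary statistics the podcast blurb rattles off (averages, standard deviations, frequency distributions) can be computed directly from raw measurements. As a minimal illustrative sketch (the response times below are invented, not from the episode), here is how they might look for a batch of load-test latencies:

```python
import statistics
from collections import Counter

# Hypothetical response times (ms) collected during a load-test run.
response_times = [112, 98, 130, 101, 97, 250, 115, 105, 99, 310]

mean = statistics.mean(response_times)    # the average
stdev = statistics.stdev(response_times)  # sample standard deviation

# Frequency distribution: count responses in 100 ms buckets.
buckets = Counter((t // 100) * 100 for t in response_times)

print(f"mean={mean:.1f} ms, stdev={stdev:.1f} ms")
for lo in sorted(buckets):
    print(f"{lo}-{lo + 99} ms: {buckets[lo]} requests")
```

The frequency distribution is the interesting part here: the mean alone hides the slow outliers at 250 ms and 310 ms, which is exactly the point of going beyond "just plain yes or no."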

Mashing up Data-Driven Quality and Exploratory Testing

  • April 2014
  • Article for Software Test Professionals

Do It In Production – Testing Where It Counts

(link includes video and PPT deck)

  • Presentation to TestBash 2.0
  • March 22, 2013

Bing analyzes petabytes of data per day. Facebook instruments everything. Amazon “unplugs” entire data centers on a regular basis. Why? To improve quality! And it’s all done in production.

This talk covers

  • Measurement
  • Big Data
  • Why TiP?
  • TiP / UFT Balance
  • Passive vs. Active Methodologies
  • Experimentation
  • Intro to Data Science


A to Z Testing in Production: Industry Leading Techniques to Leverage Big Data for Quality

  • Presentation to STPCon Fall 2012
  • October 16, 2012

Testing in production (TiP) is a set of software methodologies that derive quality assessments not from test results run in a lab but from where your services actually run – in production. The big data pipe from real users and production environments can be used in a way that leverages the diversity of production while mitigating risks to end users. That diversity lets us exercise code paths and use cases that we could not reproduce in our test lab or did not anticipate in our test planning.

This session introduces test managers and architects to TiP and gives these decision makers the tools to develop a TiP strategy for their service.  Methodologies like Controlled Test Flights, Synthetic Test in Production, Load/Capacity Test in Production, Data Mining, Destructive Testing and more are illustrated with examples from Microsoft, Netflix, Amazon, and Google.  Participants will see how these strategies boost ROI by moving focus to live site operations as their signal for quality.

At STPCon Spring 2012 (March 28, 2012) I gave a previous version of this talk (see it here).  The latest version is the best one, but the previous version has audio.


Back to the Future: Where We’re Going, We Don’t Need… Test Cases

  • Oct 9, 2012
  • Article for Software Test Professionals

Testing in Production Mindmap

  • August 6, 2012
  • For Ministry of Testing (Software Testing Club)

The Future of Software Testing

In late 2011 and early 2012, I wrote a three-part series on The Future of Testing for The Testing Planet newsletter:

  1. Testing in Production
  2. TestOps
  3. The Cloud

Leaping into The Cloud: Rewards, Risks, and Mitigations

  • Presentation to Better Software Conference West (with Ken Johnston)
  • June 13, 2012

The cloud has rapidly gone from “that thing I should know something about” to the “centerpiece of our corporate IT five-year strategy.” However, cloud computing is still in its infancy. Sure, the marketing materials presented by cloud providers tout huge cost savings and service level improvements—but they gloss over the many risks such as data loss, security leaks, gaps in availability, and application migration costs. Ken Johnston and Seth Eliot share new research on the successful migrations of corporate IT and web-based companies to the cloud. Ken and Seth lay out the risks to consider and explore the rewards the cloud has to offer when companies employ sound architecture and design approaches. Discover the foibles of poor architecture and design, and how to mitigate these challenges through a novel Test Oriented Architecture (TOA) approach. Take back insights from industry leaders—Microsoft, Amazon, Facebook, and Netflix—that have jumped into the cloud so that your organization does not slam to the ground when it takes the leap.

A previous version of this talk was presented in November 2011.


Testing the Limits With Microsoft’s Seth Eliot

  • June 2011 interview with uTest

Quality in the Cloud: The New Role of TestOps

  • Online Article for Software Test Professionals
  • March 2012

TWiST #85 – Testing in Production, Part I

  • This Week in Software Testing Podcast – March 2012

How the Cloud Changes Software Production

  • Better Software Magazine
  • March 31, 2011


Ditch the Requirements – Focus on the Customer Instead

  • Software Test Professionals
  • February 1, 2011


Testing in Production, Your Key to Engaging Customers

  • Presentation to STPCon – Software Test Professionals Conference 2011
  • March 23, 2011

Seth Eliot will show you how to use Testing in Production (TiP) to align your software development to your customers’ needs and discover those unarticulated needs that drive emotional attachment and market share. Seth will demonstrate the tools you can use to TiP and get direct, actionable feedback from actual users. Feature lists do not drive customer attachment; meeting key needs does. Seth maintains that getting prototypes and product in front of real users is crucial to uncover features that meet these key needs and to quantify how much of an impact they will have. Understanding this impact is important since evidence shows that more than half of the ideas that we think will improve the user experience actually fail to do so—and some actually make it worse. Techniques like Online Experimentation and Exposure Control enable you to find what works and what doesn’t. Production, however, can be a dangerous place to test, so these techniques must also limit any potential negative impact on users. Seth shares several examples from software leaders like Microsoft, Amazon.com, and Google to show how Testing in Production with real users will enable you to realize better software quality.


Tracking Users’ Clicks and Submits: Tradeoffs between User Experience and Data Loss

  • Microsoft
  • September 1, 2010


Testing with Real Users: User Interaction and Beyond, with Online Experimentation

  • Presentation to the Better Software Conference 2010
  • June 9, 2010

Evidence shows that more than half of the ideas that we think will improve the user experience actually fail to do so—and some actually make it worse. Instead of guessing, why not measure what your real users like and don’t like? Controlled, online experiments (A/B tests being the simplest version) are a proven way to make data-driven decisions about what works and what doesn’t. Seth Eliot shares numerous examples of online experimentation within Microsoft to test new user interfaces with their customers. Seth shows how special frameworks, such as Microsoft’s ExP (Experimentation Platform), can also move testing into the high-value realm of testing-in-production. In addition to new features and designs, Microsoft tests the impact of new code in production. By employing online experimentation, you can control how and when new, potentially dangerous code is exposed to users. Exposure control enables you to reap the benefits of testing in production while limiting the potential negative impact on your customers and users.
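To make the A/B test idea in this abstract concrete: the core statistical question is whether variant B’s conversion rate differs from variant A’s by more than chance would explain. The sketch below is my own minimal illustration using a standard two-proportion z-test with invented numbers; it is not Microsoft’s ExP implementation.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate differ from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 10,000 users per variant.
z = two_proportion_z(conv_a=1000, n_a=10000, conv_b=1100, n_b=10000)

# |z| > 1.96 corresponds to significance at the 5% level (two-sided).
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")
```

With these made-up counts (10.0% vs. 11.0% conversion), z comes out just above the 1.96 threshold, echoing the abstract’s point: intuition alone cannot tell a real improvement from noise, but measurement can.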


Method for Metallographically Revealing Intermetallic Formation at Galfan/Steel Interfaces

  • Materials Characterization, Volume 30, Issue 4, Pages 295-297
  • June 1, 1993