DOI: 10.1109/ICSE-SEIP.2019.00010

Safe velocity: a practical guide to software deployment at scale using controlled rollout

Published: 27 May 2019

Abstract

Software companies are increasingly adopting novel approaches to ensure that their products perform correctly, improve user experience, and assure quality. Two approaches that have significantly impacted product development are controlled experiments (concurrent experiments with different variations of the same product) and phased rollouts (deployments to smaller audiences, called rings, before deploying broadly). Although each approach is powerful in isolation, product teams gain the most benefit when the two are integrated. Intuitively, combining them may seem trivial; in practice and at large scale, it is difficult. For example, it requires careful data analysis to correctly handle exposed populations, determine the duration of exposure, and identify the differences between the populations. All of this is needed to optimize the likelihood of successful deployments, maximize learning, and minimize potential harm to users of the products. In this paper, based on case study research at Microsoft, we introduce controlled rollout (CRL), which applies controlled experimentation to each ring of a traditional phased rollout. We describe its implementation in several products used by hundreds of millions of users, along with the complexities encountered and overcome. In particular, we explain strategies for selecting the length of the rollout period, choosing the metrics of focus, and defining the pass criterion for each ring. Finally, we evaluate the effectiveness of CRL by examining hundreds of controlled rollouts at Microsoft Office. With our work, we hope to help other companies optimize their software deployment practices.
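
To make the ring-gating idea concrete, here is a minimal sketch in Python of how a pass criterion could be evaluated inside each ring of a controlled rollout: a controlled experiment runs within the ring, a guardrail metric is compared between treatment and control, and the rollout advances to the next ring only if no significant regression is detected. The gate shown (Welch's t-test with an assumed significance level and regression threshold) and all names in the code are illustrative assumptions, not the implementation described in the paper.

# Illustrative sketch (not from the paper): gate each ring of a phased
# rollout with a controlled-experiment check on one guardrail metric.
# All names, thresholds, and the choice of Welch's t-test are assumptions.
from dataclasses import dataclass
from scipy import stats


@dataclass
class RingResult:
    ring: str                # e.g. "dogfood", "insiders" (hypothetical ring names)
    control: list[float]     # per-user metric values observed in the control variant
    treatment: list[float]   # per-user metric values observed in the treatment variant


def ring_passes(result: RingResult, alpha: float = 0.05,
                max_regression: float = 0.01) -> bool:
    """Pass the ring unless the treatment shows a statistically significant
    relative regression larger than max_regression on the metric."""
    # Welch's two-sample t-test between treatment and control populations.
    _, p_value = stats.ttest_ind(result.treatment, result.control, equal_var=False)
    control_mean = sum(result.control) / len(result.control)
    treatment_mean = sum(result.treatment) / len(result.treatment)
    relative_delta = (treatment_mean - control_mean) / control_mean
    # Fail only on a significant regression beyond the allowed threshold.
    return not (p_value < alpha and relative_delta < -max_regression)


def controlled_rollout(results: list[RingResult]) -> str:
    """Evaluate rings in order; halt the rollout at the first failing ring."""
    for result in results:
        if not ring_passes(result):
            return f"halt rollout at ring '{result.ring}'"
    return "promote to broad deployment"

In a real pipeline, the per-ring exposure duration, the set of metrics, and the pass criteria would be product-specific; choosing them is exactly the tuning the paper discusses.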




Published In

ICSE-SEIP '19: Proceedings of the 41st International Conference on Software Engineering: Software Engineering in Practice
May 2019
339 pages


Publisher

IEEE Press


Author Tags

  1. controlled experiment
  2. controlled rollout
  3. phased rollout
  4. software development

Qualifiers

  • Research-article

Conference

ICSE '19


Cited By

  • (2024) "A/B testing", Journal of Systems and Software, vol. 211, no. C. DOI: 10.1016/j.jss.2024.112011. Online publication date: 2-Jul-2024.
  • (2023) "A/B Integrations: 7 Lessons Learned from Enabling A/B Testing as a Product Feature", Proceedings of the 45th International Conference on Software Engineering: Software Engineering in Practice, pp. 304-314. DOI: 10.1109/ICSE-SEIP58684.2023.00033. Online publication date: 17-May-2023.
  • (2022) "UX research in the software industry", Proceedings of the 21st Brazilian Symposium on Human Factors in Computing Systems, pp. 1-13. DOI: 10.1145/3554364.3559126. Online publication date: 17-Oct-2022.
  • (2022) "Rapid Regression Detection in Software Deployments through Sequential Testing", Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 3336-3346. DOI: 10.1145/3534678.3539099. Online publication date: 14-Aug-2022.
