A systematic methodology for OS benchmark characterization

Published: 01 October 2013
DOI: 10.1145/2513228.2513234

Abstract

Benchmarks are a common and important means of evaluating operating systems, but choosing which benchmarks to use requires careful consideration. A seemingly minor change in system configuration or input set can drastically alter a benchmark's characteristics and lead to misleading or incorrect results. Some OS benchmark suites also include many programs with very similar characteristics, which can bias results for or against certain kernel behavior. We therefore need to determine the characteristics of benchmark programs in order to assemble an appropriate benchmark suite and to interpret measured results precisely and correctly. Although many tools have been developed to profile an OS and characterize its run-time behavior, the data they collect are often large and complex, and analyzing such a volume of measurements to determine the characteristics of a benchmark suite is time consuming, labor intensive, and error prone. In this work, we propose a systematic methodology that uses machine-learning techniques to automatically analyze and characterize OS benchmark programs from traced kernel events. We found that the resulting characterization of OS behavior helps developers choose appropriate applications for benchmarking operating systems.
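The abstract does not spell out the exact pipeline, but the idea of characterizing benchmarks from traced kernel events can be sketched as follows: represent each benchmark as a vector of kernel-event counts (collected, for example, with a tracer such as LTTng or OProfile), normalize the vectors, reduce their dimensionality, and cluster them so that benchmarks with near-identical kernel-level behavior become apparent. The Python sketch below illustrates that idea only; the benchmark names, event categories, counts, and cluster count are hypothetical placeholders, not the authors' data or method.

# Minimal sketch: grouping benchmarks by kernel-event behavior.
# Each row of `event_counts` is one benchmark; each column is the count of one
# traced kernel event class. All values below are illustrative placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

benchmarks = ["iostone", "lmbench-bw_mem", "lmbench-lat_ctx", "webserver"]
event_features = ["read", "write", "sched_switch", "page_fault", "irq"]

# Hypothetical per-benchmark kernel-event counts (rows align with `benchmarks`).
event_counts = np.array([
    [9.2e5, 8.7e5, 1.1e4, 3.0e3, 2.1e3],
    [1.0e3, 1.2e3, 9.5e2, 8.8e5, 1.5e3],
    [2.0e3, 1.8e3, 7.6e5, 2.5e3, 1.9e3],
    [4.1e5, 3.9e5, 2.2e5, 6.0e4, 9.0e4],
])

# Standardize so no single high-frequency event dominates the distance metric.
scaled = StandardScaler().fit_transform(event_counts)

# Project onto a few principal components to collapse correlated event counts.
reduced = PCA(n_components=2).fit_transform(scaled)

# Cluster benchmarks with similar kernel-level behavior.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)

for name, label in zip(benchmarks, labels):
    print(f"{name}: cluster {label}")

Benchmarks that fall into the same cluster would be candidates for pruning when assembling a compact suite that does not over-represent one kind of kernel behavior.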


Cited By

  • (2018) A comparative analysis of wearables operating systems based on functional constraints. 2018 15th IEEE Annual Consumer Communications & Networking Conference (CCNC), pp. 1-6. DOI: 10.1109/CCNC.2018.8319168. Online publication date: Jan 2018.


Published In

RACS '13: Proceedings of the 2013 Research in Adaptive and Convergent Systems
October 2013
529 pages
ISBN: 9781450323482
DOI: 10.1145/2513228
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. benchmark
  2. multi-core
  3. operating system
  4. scalability

Qualifiers

  • Research-article


Conference

RACS '13: Research in Adaptive and Convergent Systems
October 1-4, 2013
Montreal, Quebec, Canada

Acceptance Rates

RACS '13 Paper Acceptance Rate: 73 of 317 submissions, 23%
Overall Acceptance Rate: 393 of 1,581 submissions, 25%
