DOI: 10.5555/2819419.2819423
Research article

Evaluating bug finders: test and measurement of static code analyzers

Published: 16 May 2015

Abstract

Software static analysis is one of many options for finding bugs in software. Like compilers, static analyzers take a program as input. This paper covers tools that examine source code without executing it and output bug reports. Static analysis is a complex and generally undecidable problem. Most tools resort to approximation to overcome these obstacles, which sometimes leads to incorrect results. Therefore, tool effectiveness needs to be evaluated. Several characteristics of the tools should be examined. First, what types of bugs can they find? Second, what proportion of bugs do they report? Third, what percentage of findings is correct? These questions can be answered by one or more metrics. But to calculate these metrics, we need test cases with certain characteristics: statistical significance, ground truth, and relevance. Test cases with all three attributes are out of reach, but combinations of any two suffice to calculate the metrics.
The results in this paper were collected during Static Analysis Tool Exposition (SATE) V, where participants ran 14 static analyzers on the test sets we provided and submitted their reports to us for analysis. Support for most bug classes varied considerably across tools. Some tools discovered significantly more bugs than others or generated mostly accurate warnings, while others reported incorrect findings more frequently. Using the metrics, an evaluator can compare candidates and select the tool that best aligns with his or her objectives. In addition, our results confirm that the bugs tools find most often are among the most common and important bugs in software. We also observed that code complexity is a major hindrance for static analyzers, and we detail which code constructs tools handle well and which impede their analysis.
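
To make the abstract's metrics concrete, the short Python sketch below shows one way precision (the percentage of findings that are correct) and recall (the proportion of known bugs that are reported) could be computed for a single tool run, assuming ground truth is available as a set of known bug locations. The file names, line numbers, and matching scheme are hypothetical illustrations, not part of the SATE V methodology.

# Minimal sketch: precision and recall for one static analyzer run.
# Assumes each warning and each known bug is identified by a
# (file, line) pair; all data here is hypothetical.

ground_truth = {("parse.c", 42), ("util.c", 108), ("net.c", 7)}    # known bugs
tool_warnings = {("parse.c", 42), ("util.c", 108), ("main.c", 3)}  # tool output

true_positives = tool_warnings & ground_truth

precision = len(true_positives) / len(tool_warnings)  # correct findings / all findings
recall = len(true_positives) / len(ground_truth)      # bugs found / all known bugs

print(f"precision = {precision:.2f}, recall = {recall:.2f}")  # 0.67, 0.67

In practice, deciding whether a warning matches a known bug is harder than this set intersection suggests: locations shift between program versions, and one bug may span several statements. That difficulty is part of why test cases with ground truth are so costly to obtain.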

Published In

COUFLESS '15: Proceedings of the First International Workshop on Complex faUlts and Failures in LargE Software Systems
May 2015
87 pages

Publisher

IEEE Press

Author Tags

  1. software assurance
  2. software faults
  3. software vulnerability
  4. static analysis tools

Conference

ICSE '15