Verisim: Formal analysis of network simulations
Pages 2–13
Abstract
Why are there so few successful "real-world" programming and testing tools based on academic research? This talk focuses on program analysis tools, and proposes a surprisingly simple explanation with interesting ramifications.
For a tool aimed at developers or testers to be successful, people must use it, and use it to help accomplish their existing tasks rather than as an end in itself. If the tool does not help them get their job done, or the effort to learn or use it is too great, users will not perceive enough value, and the tool will not see significant use, even if it is free.
This talk focuses on the often-overlooked consequences of this seemingly basic statement in two major areas: program analysis, and the work beyond core analysis that must be done to make a successful tool. Examples will be drawn from tools that have been successfully used in industry (sold commercially, and developed for internal use).
Information
Published In
September 2000, 211 pages
Copyright © 2000 ACM.
Publisher
Association for Computing Machinery
New York, NY, United States
Publication History
Published: 01 August 2000
Published in SIGSOFT Volume 25, Issue 5