Detecting unknown inconsistencies in web applications

Published: 30 October 2017

Abstract

Although there has been increasing demand for more reliable web applications, JavaScript bugs abound in web application code. In response, researchers have proposed automated fault detection tools that statically analyze web application code to find bugs. While useful, these tools either target only a limited set of bugs based on predefined rules, or they miss bugs caused by cross-language interactions, which occur frequently in web application code. To address this problem, we present an anomaly-based inconsistency detection approach, implemented in a tool called Holocron. The main novelty of our approach is that it does not look for hard-coded inconsistency classes. Instead, it applies subtree pattern matching to infer inconsistency classes, and association rule mining to detect inconsistencies that occur both within a single language and between two languages. We evaluated Holocron, and it successfully detected 51 previously unreported inconsistencies, including 18 bugs and 33 code smells, in 12 web applications.
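To make the mining step concrete: the abstract describes detecting inconsistencies by mining association rules over co-occurring code elements and flagging deviations from high-confidence rules. The following is a minimal, hypothetical sketch of that idea (it is not Holocron's implementation; the transaction encoding, thresholds, and identifier names are all illustrative). Each "transaction" is a set of identifiers that co-occur in some code unit; a transaction that contains a rule's antecedent but not its consequent is reported as an anomaly candidate.

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.5, min_confidence=0.9):
    """Mine pairwise association rules (lhs -> rhs) from sets of items."""
    n = len(transactions)
    item_count = {}
    pair_count = {}
    for t in transactions:
        for item in t:
            item_count[item] = item_count.get(item, 0) + 1
        for a, b in combinations(sorted(t), 2):
            pair_count[(a, b)] = pair_count.get((a, b), 0) + 1
    rules = []
    for (a, b), count in pair_count.items():
        if count / n < min_support:
            continue  # pair occurs too rarely to trust
        for lhs, rhs in ((a, b), (b, a)):
            confidence = count / item_count[lhs]
            if confidence >= min_confidence:
                rules.append((lhs, rhs, confidence))
    return rules

def find_violations(transactions, rules):
    """A transaction containing a rule's antecedent but not its
    consequent deviates from the norm: an anomaly candidate."""
    violations = []
    for idx, t in enumerate(transactions):
        for lhs, rhs, confidence in rules:
            if lhs in t and rhs not in t:
                violations.append((idx, lhs, rhs, confidence))
    return violations

# Illustrative cross-language example: an HTML id that is normally
# referenced by a matching JavaScript selector, missing in one unit.
transactions = [
    {"id:menu", "js:#menu"},
    {"id:menu", "js:#menu"},
    {"id:menu", "js:#menu"},
    {"id:menu"},  # deviant: declares the id but never selects it
]
rules = mine_rules(transactions, min_support=0.5, min_confidence=0.7)
print(find_violations(transactions, rules))
# -> [(3, 'id:menu', 'js:#menu', 0.75)]
```

Because the rules are inferred from the code base itself rather than hard-coded, the same mechanism can surface inconsistency classes within one language or across an HTML/JavaScript boundary, which is the property the abstract emphasizes.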


Cited By

  • Diversity-based web test generation. In Proceedings of the 2019 27th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE), pp. 142–153. DOI: 10.1145/3338906.3338970. Online publication date: 12 August 2019.

Published In

ASE '17: Proceedings of the 32nd IEEE/ACM International Conference on Automated Software Engineering
October 2017
1033 pages
ISBN:9781538626849

Publisher

IEEE Press

Author Tags

  1. JavaScript
  2. cross-language interactions
  3. fault detection

Acceptance Rates

Overall Acceptance Rate 82 of 337 submissions, 24%
