
US20200065525A1 - Method and apparatus for generating privacy profiles - Google Patents

Method and apparatus for generating privacy profiles

Info

Publication number
US20200065525A1
US20200065525A1
Authority
US
United States
Prior art keywords
privacy
web session
irregularity
web
session data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/673,262
Inventor
Travis Spence Powell
Nadav Caspi
Robert I. Wenig
Wolf Herda
Gerard Dietrich
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acoustic LP
Original Assignee
Acoustic LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acoustic LP filed Critical Acoustic LP
Priority to US16/673,262
Assigned to TEALEAF TECHNOLOGY, INC. reassignment TEALEAF TECHNOLOGY, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CASPI, NADAV, DIETRICH, GERARD, HERDA, WOLF, POWELL, TRAVIS SPENCE, WENIG, ROBERT I.
Assigned to ACOUSTIC, L.P. reassignment ACOUSTIC, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: INTERNATIONAL BUSINESS MACHINES CORPORATION
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION reassignment INTERNATIONAL BUSINESS MACHINES CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TEALEAF TECHNOLOGY, INC.
Publication of US20200065525A1
Legal status: Abandoned


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6263Protecting personal data, e.g. for financial or medical purposes during internet communication, e.g. revealing personal data from cookies
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/604Tools and structures for managing or administering access control systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/10Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/20Network architectures or network communication protocols for network security for managing network security; network security policies in general
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2117User registration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2141Access rights, e.g. capability lists, access control lists, access tables, access matrices

Definitions

  • Monitoring and replay systems may capture web session data, such as web pages sent from a web application server to a client computer and user interface events entered into the web pages at the client computer.
  • the captured web session data may be used to replay and analyze user experiences during the web sessions.
  • the replayed web sessions may be used to identify problems users may have while navigating through web pages during the web session.
  • Sensitive personal information may be entered into the web pages during the web sessions.
  • the web sessions may involve on-line purchases of products and/or services.
  • users may need to enter social security numbers, passwords, credit card numbers, bank account numbers, health information, stock information, home addresses, or the like, or any combination thereof.
  • Government privacy regulations may prohibit the retention of certain personal information or limit the retention of the personal information to certified entities. These privacy regulations may require monitoring and replay systems to filter sensitive personal information before storing the captured web session data in a database for subsequent replay analysis.
  • FIG. 1 depicts an example of a system for filtering information from captured web session data.
  • FIG. 2 depicts an example of a privacy processing system.
  • FIG. 3 depicts examples of privacy metrics generated by the privacy processing system.
  • FIG. 4 depicts an example process for comparing privacy metrics with privacy profiles.
  • FIG. 5 depicts an example process for generating privacy metrics.
  • FIG. 6 depicts an example process for generating privacy profiles.
  • FIG. 7 depicts an example of a process for detecting irregular privacy metrics.
  • FIGS. 8A and 8B depict example graphs displaying average execution times for privacy rules.
  • FIGS. 9A and 9B depict example graphs displaying percentage of successful completions for privacy rules.
  • FIGS. 10A, 10B and 10C depict examples of replayed web sessions showing correct and incorrect privacy filtering.
  • FIG. 11 depicts an example computing device for implementing the privacy processing system.
  • FIG. 1 depicts a web session 100 conducted between a web application 104 operating on a web server 102 and a computing device 110 .
  • Web application 104 may support any type of online web session such as online purchases, online financial or medical services, social networking, etc. Of course, these are just examples, and any type of electronic web based transaction or activity may be performed using web application 104 .
  • Computing device 110 may comprise a Personal Computer (PC), laptop computer, wireless Personal Digital Assistant (PDA), cellular telephone, smart phone, tablet, or any other wired or wireless device that accesses and exchanges information with web application 104 . Any number of computing devices 110 may conduct different web sessions 100 with web application 104 at any geographical location and at any time of day.
  • Network connection 108 may comprise any combination of connections over an Internet network, a wireless network, a WiFi network, a telephone network, a Public Switched Telephone Network (PSTN), a cellular network, a cable network, a Wide Area Network (WAN), a Local Area Network (LAN), or the like, or any combination thereof.
  • a web browser or web application 118 operating on computing device 110 may send Hyper Text Transfer Protocol (HTTP) requests to web application 104 over network connection 108 .
  • Web application 104 may send back one or more of web pages 106 in response to the HTTP requests and computing device 110 may display the web pages via the web browser or application 118 on a computer screen 116 .
  • web browser or mobile application 118 may display an electronic web page 112 that contains fields 114 A- 114 C for entering a user name, password, and social security number, respectively.
  • Web application 104 may send additional web pages 106 and/or responses to computing device 110 in response to the information entered into fields 114 .
  • Web session monitors 122 may capture web session data 124 during web session 100 .
  • Web session data 124 may include network data transferred over network connection 108 between computing device 110 and web application 104 and user interface events generated on computing device 110 .
  • web session data 124 may comprise the Hyper Text Transfer Protocol (HTTP) requests and other data requests sent from computing device 110 to web application 104 and the Hyper Text Markup Language (HTML) web pages 106 and other responses sent back to computing device 110 from web application 104 .
  • Some of web session data 124 may include user interface events entered by a user into computing device 110 , such as mouse clicks, keystrokes, alpha-numeric data, or the like, or any combination thereof.
  • user interface events may comprise data entered into fields 114 of web page 112 or may comprise selections of icons or links on web page 112 .
  • Other web session data 124 may comprise webpage logic/code sent by web application 104 along with web pages 106 to computing device 110 that further determines the different states or operations in the web pages. Some of the web session data may be generated locally on computing device 110 and never sent over network connection 108. For example, the control logic within web page 112 may change the state of web page 112 in response to user inputs without sending any data back to web application 104. In another example, a batch data transfer of only completed information in web page 112 may be transferred back to web application 104 over network connection 108.
  • some web session data 124 may comprise document object model (DOM) events within the web pages. For example, changes in the DOM of displayed web page 106 may be captured by UI event monitor 122 A as some of web session data 124 .
  • web session data 124 may comprise operating parameters or any other logged data in computing device 110 and/or web server 102 .
  • web session data 124 may comprise network bandwidth indicators, processor bandwidth indicators, network condition indicators, computer operating conditions, or the like, or any combination thereof.
  • network session monitor 122 B may capture the network data, such as web pages, requests, responses, and/or logic exchanged between computing device 110 and web application 104 over network connection 108 .
  • User interface (UI) monitor 122 A may capture the user interface events generated locally at computing device 110 .
  • UI monitor 122 A also may capture some or all of the network data exchanged between computing device 110 and web application 104 over network connection 108 .
  • UI event monitor 122 A and/or network session monitor 122 B may not capture all the web session data 124 and may only detect occurrences of some web session events.
  • monitors 122 A and 122 B may send unique identifiers identifying occurrences of certain web session events and may send timestamps indicating when the web session events were detected.
  • a user may enter a user name into field 114 A, a password into field 114 B and/or a social security number into field 114 C. Due to the security requirements discussed above, the password and/or social security number may need to be filtered before captured web session data 124 can be stored in a database 136 .
  • a privacy processing system 130 filters sensitive personal information, such as the password and/or social security number from captured web session data 124 . Filtering refers to any combination of removing, blocking, replacing, encrypting, hashing, or the like, information in web session data 124 . Privacy processing system 130 stores filtered web session data 138 in web session database 136 . A replay system 134 can then use the captured and now filtered web session data 138 to replay the original web session 100 without displaying sensitive personal information.
  • One example of a replay system 134 is described in U.S. Pat. No. 8,127,000, entitled: METHOD AND APPARATUS FOR MONITORING AND SYNCHRONIZING USER INTERFACE EVENTS WITH NETWORK DATA, issued Feb. 28, 2012, which is incorporated by reference in its entirety.
  • Privacy processing system 130 may apply privacy rules to the captured web session data 124 to remove the sensitive personal information.
  • Privacy profiles or privacy metadata may be generated for the privacy rules.
  • the privacy profiles may identify how often the privacy rules are called, how often the privacy rules successfully complete actions, and amounts of processing time required to execute the privacy rules.
  • the privacy profiles may detect privacy filtering problems, such as privacy rules that do not filter personal information or filter the wrong information from the web session data, or if certain patterns of data require abnormally large privacy resources, such as time or CPU usage. In addition, any big deviation in privacy resource usage may indicate a change to the website or in end user behavior.
  • Filtering and/or encrypting sensitive personal information in captured web session data may be computationally expensive. For example, a particular web site may service hundreds of millions of users and hundreds of millions of associated web sessions each day.
  • the privacy profiles may identify privacy rules that are written incorrectly or inefficiently and waste processing bandwidth. These privacy rules can then be rewritten so they more efficiently search and filter personal information from the millions of web sessions.
  • Privacy processing system 130 may detect other web session events or states that may impact user experiences during web sessions 100 .
  • the privacy profiles may identify incorrect user actions, virus attacks, errors or gaps in the logic of the source web application, etc.
  • privacy processing system 130 not only generates quantitative privacy filtering metrics but may also identify other general problems with the web sessions.
  • FIG. 2 depicts in more detail one example of privacy processing system 130 .
  • a privacy rule parser 148 may apply privacy rules 150 to web session data 124 captured from web sessions 100 . Different rules 150 may be applied to different web pages and different data captured during web session 100 . For example, a first rule 150 A may search for a particular web page in captured web session data 124 that may contain a social security number. A second rule 150 B may search for different fields in several different web pages in web session data 124 that may contain a credit card number.
  • Rules 150 A and 150 B may filter the web session data by replacing, blocking, hashing, encrypting, etc. sensitive personal information, such as social security numbers or credit card numbers.
  • Filtered web session data 138 is stored in database 136 and can then be used for subsequent replay and analysis by replay system 134 .
  • a privacy profiler 152 may generate privacy profiles 158 for privacy rules 150 .
  • privacy profiler 152 may track the number of times each rule 150 is called when filtering web session data 124 , the number of times each rule 150 succeeded in filtering information in web session data 124 , and/or the amount of time required to execute rules 150 while filtering web session data 124 .
  • Privacy profiler 152 may generate privacy profiles 158 by aggregating the privacy metrics for rules 150 .
  • privacy profiler 152 may calculate an average number of times over the last five minutes that rule 150 A is called while filtering web session data 124 .
  • the aggregated privacy metrics may be used as a baseline or “profile” of typical or normal privacy filtering behavior.
  • Security metrics outside of privacy profile thresholds may be indicative of privacy filtering problems or other web session problems. For example, there may be a substantial change in the average number of times a particular privacy rule is called for each captured web session. In another example, there may be a substantial change in the amount of time required to successfully complete execution of a privacy rule.
  • Privacy rule parser 148 may only call rule 150 A when particular web page names or field names are identified in captured web session data 124 . If the enterprise operating the web application changes the web page name or field name, privacy rule parser 148 may no longer call rule 150 A to filter data in the renamed web page. Accordingly, a social security number entered into the renamed web page may no longer be filtered from web session data 124 compromising the overall privacy and security of filtered web session data 138 .
  • a change in the amount of time required to successfully complete privacy rules 150 also may indicate a privacy filtering problem.
  • web browsers, web pages, and/or web page logic may change or reformat data entered into the web pages.
  • the changed or reformatted data may cause privacy rule parser 148 to unintentionally call privacy rules 150 for web pages that do not contain personal information or may cause rules 150 to filter the wrong data.
  • These incorrect or unintentional filtering operations may waste processing bandwidth and/or remove web session data needed for accurately replaying the captured web sessions.
  • a change in the privacy metrics may indicate an abnormal web session.
  • a denial of service attack or other bot attacks may substantially change the number or percentage of times a particular rule 150 is called or the amount of time required to successfully complete the privacy rule.
  • Privacy profiles 158 can identify privacy metric changes and isolate privacy filtering or web session problems.
  • a privacy event processor 154 may display graphs 162 on a computing device 160 for privacy profiles 158 .
  • graphs 162 may identify the average privacy metric values for different privacy rules. Any substantial deviations in graphs 162 may indicate privacy filtering problems and/or web session problems.
  • a user may direct privacy event processor 154 to display the privacy profiles for different dimensions, such as for particular privacy rules, privacy rule parameters, web session categories, or web browsers. For example, the user may direct privacy event processor 154 to display privacy metrics associated with different desktop web browsers and mobile applications.
  • Computing device 160 may use replay system 134 to further isolate privacy filtering irregularities. For example, a user may compare replayed filtered web sessions prior to large privacy metric changes with replayed web sessions after the large privacy metric changes. Any filtering problems identified during replay can then be corrected by modifying the associated rules 150 in privacy rule parser 148 .
  • FIG. 3 depicts in more detail examples of privacy rules 150 and privacy metrics 176 used and generated by the privacy rule parser.
  • a first rule 150 A may comprise a test 172 A that looks for a particular set of numbers, text, values, parameters, etc. For example, test 172 A may test for a first set of three numbers separated from a second set of two numbers by a space or dash. Test 172 A may then look for a third set of four numbers separated by a second space or dash from the second set of two numbers.
  • rule 150 A may trigger an action 174 A.
  • the action 174 A for rule 150 A may replace the detected sequence of numbers with an “X”.
  • Examples of other actions may include only “X-ing” out some of the numbers in the identified sequence or using a hash algorithm to encrypt the number sequence.
  • the encrypted number sequence might not be decrypted, but could still be used with the hash algorithm to confirm association of the number with a particular user. Any other actions may be used for filtering information.
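  • The test-and-action structure of rule 150A described above might be sketched as follows. This is a minimal illustration, not the patent's implementation: the regular expression, the "X" replacement, and the truncated SHA-256 hash are all assumptions standing in for the three-two-four digit test 172A, the masking action 174A, and the hash-based action.

```python
import hashlib
import re

# Illustrative pattern for test 172A: three digits, a space or dash,
# two digits, a space or dash, then four digits (an SSN-like layout).
SSN_PATTERN = re.compile(r"\b\d{3}[ -]\d{2}[ -]\d{4}\b")

def mask_action(match: re.Match) -> str:
    # Action 174A: replace every digit in the detected sequence with an
    # "X", preserving separators so the page layout still replays.
    return re.sub(r"\d", "X", match.group(0))

def hash_action(match: re.Match) -> str:
    # Alternative action: a one-way hash that cannot be decrypted but can
    # still confirm association of the number with a particular user.
    return hashlib.sha256(match.group(0).encode()).hexdigest()[:16]

def apply_rule(text: str, action=mask_action) -> tuple[str, int]:
    # Returns the filtered text and the number of completed actions,
    # the latter feeding the "# of completed actions" metric.
    return SSN_PATTERN.subn(action, text)

filtered, completions = apply_rule("ssn=123-45-6789")
# filtered == "ssn=XXX-XX-XXXX", completions == 1
```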
  • a set of privacy metrics 176 A may be generated by the privacy rule parser in association with privacy rule 150 A.
  • privacy metrics 176 may include web session identifiers identifying the particular web sessions where rule 150 A is called.
  • a web page identifier may identify particular web pages in the web session data where rule 150 A is called.
  • a field identifier may identify particular fields in the captured web session data where rule 150 A is called for filtering data entered into the fields.
  • a web browser identifier may identify a particular web browser or application used during the associated web session where rule 150 A was called.
  • a time stamp metric may identify when the web sessions producing the captured web session data took place and/or may identify when rule 150 A was called for filtering the captured web session data.
  • a “# of calls” metric may identify the number of times rule 150 A was called while filtering the captured web session data or the number of times rule 150 A was called for individual web sessions, web page, etc.
  • a “# of completed actions” metric may identify the number of times rule 150 A successfully completed an associated filtering action. For example, the # of completed actions may identify the number of times rule 150 A replaced the sequence in test 172 A with the X's in action 174 A.
  • An execution time metric may identify the amount of processing time required to complete rule 150 A for a web session, web page, etc.
  • a second rule 150 B may comprise a test 172 B that also parses the web session data for a particular set of numbers, text, values, parameters, etc.
  • test 172 B may test for a sequence of numbers associated with a credit card number. The number sequence may be four sets of four numbers separated by spaces or dashes.
  • satisfaction of test 172 B may initiate an associated action 174 B that replaces the first twelve numbers of the detected sequence of sixteen numbers with “X's”.
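  • Rule 150B can be sketched in the same hedged way; the pattern below assumes four sets of four digits separated by spaces or dashes, with action 174B keeping only the last four digits visible:

```python
import re

# Illustrative pattern for test 172B: four sets of four digits
# separated by spaces or dashes (a common credit card layout).
CARD_PATTERN = re.compile(r"\b(\d{4}[ -]\d{4}[ -]\d{4})([ -]\d{4})\b")

def mask_card(match: re.Match) -> str:
    # Action 174B: X out the first twelve digits, keep the last four.
    return re.sub(r"\d", "X", match.group(1)) + match.group(2)

masked = CARD_PATTERN.sub(mask_card, "card 4111-1111-1111-1234")
# masked == "card XXXX-XXXX-XXXX-1234"
```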
  • Another similar set of privacy metrics 176 B may be generated for rule 150 B.
  • Rules 150 A and 150 B are just examples of any variety of tests and actions that may be applied to the web session data. Rules 150 may be applied to any combination of web session data. For example, rules 150 may only be called for particular web pages that query a user for sensitive personal information, such as a social security number. Other rules 150 may be called to filter data associated with other sensitive data captured on other web pages.
  • FIG. 4 depicts an example of privacy profiles 158 generated by privacy profiler 152 in FIG. 3 from the privacy metrics.
  • Privacy profiles 158 may identify different statistical dimensions associated with filtering the web session data. For example, a first privacy profile 158 A may identify the average number of times any of the rules in the privacy rule parser were called while filtering the web session data. Privacy profile 158 A may be derived for different categories, such as for time periods, web sessions, web pages, web browsers and/or mobile web applications.
  • a privacy profile 158 B may identify the average successful completion rate for the rules indicating the average percentage of times the privacy rules successfully filter information.
  • a privacy profile 158 C may identify an average amount of time required for all of the rules to filter the web session data for web sessions, web pages, browsers, etc.
  • a privacy profile 158 D may identify the average number of times a particular rule #1 is called while filtering the web session data.
  • a privacy profile 158 E may identify the successful completion rate for rule #1.
  • Privacy profiles 158 may be associated with any other privacy rules and collected privacy dimensions.
  • the privacy profiler may generate aggregated average privacy metrics 180 for aggregation periods. For example, the privacy profiler may aggregate the number of times different rules are called, aggregate the percentage of times the rules successfully complete filtering actions, and aggregate the amounts of time required for the rules to complete execution.
  • the privacy metrics may be aggregated for some selectable time period, such as five minutes, and the aggregated values averaged to generate privacy metrics 180 .
  • Privacy event processor 154 may compare privacy metrics 180 for each aggregated time period with privacy profiles 158 .
  • a privacy metric notification 182 may be generated for any privacy metrics 180 that are outside of threshold ranges for privacy profiles 158 .
  • the privacy profiler may determine standard deviations for the values in privacy profiles 158 .
  • Privacy event processor 154 may send notification 182 or create an entry in a log file for any of the privacy metrics 180 that are outside of the standard deviations for associated privacy profiles 158 .
  • rule #1 may have successfully completed 100% of the time over the last five minute aggregation period.
  • the average successful completion rate in privacy profile 158 E for rule #1 may be 80 percent and the standard deviation may be +/−4%.
  • the threshold range for privacy profile 158 E may be between 76% and 84%. Since the successful completion rate for rule #1 in privacy metrics 180 is outside of the threshold range for privacy profile 158 E, privacy event processor 154 may generate a notification 182 or generate an entry in a file or table identifying privacy metric 180 for rule #1 as an outlier.
  • Privacy event processor 154 also may automatically determine differences in the web session data associated with different privacy metrics.
  • the captured web session data may include Document Object Models (DOMs) for the web pages.
  • Privacy event processor 154 may detect privacy metrics outside of the privacy profile thresholds.
  • the DOMs for filtered web pages having privacy metrics within the privacy profile thresholds may be compared with the DOMs for filtered web pages with privacy metrics outside of the privacy profile thresholds.
  • the DOM differences may be identified and sent to the operator.
  • the DOM differences may identify a web page with a changed name that may have prevented rule #1 from correctly triggering.
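  • A simplified sketch of that DOM comparison: collect the field names from a page whose metrics fell within the profile thresholds and from one whose metrics fell outside them, then report the symmetric difference. The `name`-attribute comparison and the XML parsing are illustrative assumptions; a real DOM diff would be richer:

```python
from xml.etree import ElementTree

def field_names(page: str) -> set[str]:
    # Collect every "name" attribute in the captured page markup.
    root = ElementTree.fromstring(page)
    return {el.get("name") for el in root.iter() if el.get("name")}

def dom_differences(good_page: str, bad_page: str) -> set[str]:
    # Names present in one page but not the other, e.g. a renamed field
    # that prevented a privacy rule from triggering.
    return field_names(good_page) ^ field_names(bad_page)

diff = dom_differences(
    '<form><input name="ssn"/></form>',
    '<form><input name="social"/></form>',
)
# diff == {"ssn", "social"}
```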
  • FIG. 5 depicts an example process for generating privacy metrics.
  • privacy rules are applied to captured web session data received from the web session monitoring systems.
  • any number of web sessions may be continuously, periodically, or randomly monitored and the associated web session data sent to the privacy processing system.
  • Privacy rules called or triggered during the privacy filtering process are identified in operation 202 .
  • a rule may be called when a particular web page is identified in the web session data and the rule may not be called when that web page is never opened during the captured web session.
  • privacy metrics are generated for the privacy rules.
  • each web session and web page may have an associated identifier.
  • the privacy rule parser may identify the web sessions and/or web pages where the rules are triggered.
  • the privacy rule parser may identify any other privacy metrics associated with the rules, such as a time of day when the rules are triggered, a type of web browser used on the client site where the rule was triggered, etc.
  • the privacy rule parser may determine if the rules are successfully completed. For example, the privacy rules may be triggered whenever a particular web page is identified. The triggered rule may then execute a test to identify any data in the web page satisfying a particular condition and/or matching a particular value, sequence, location, etc. If the test is satisfied, the rule performs an associated action. For example, the action may replace a matching combination of numbers with X's. Replacement of the matching combination of numbers is identified as a successful completion of the rule.
  • the amounts of time required to complete the privacy rules may be identified. For example, privacy rules may need to apply an associated test to all of the data associated with one or more web pages.
  • the privacy rule parser may track the amount of processing time required to parse through the one or more web pages. The time can be tracked on any other variety of dimensions or categories, such as for periods of time, web sessions, web pages, particular fields in the web pages, etc.
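  • The call, completion, and execution-time tracking described in operations 202 through 208 can be sketched as a small counter class. The class and metric names are illustrative, and the rule is assumed to return its filtered text together with its completed-action count:

```python
import time
from collections import defaultdict

class RuleMetrics:
    """Per-rule counters: # of calls, # of completed actions, and the
    accumulated execution time spent parsing the web session data."""

    def __init__(self):
        self.counters = defaultdict(
            lambda: {"calls": 0, "completions": 0, "exec_time": 0.0}
        )

    def run_rule(self, rule_id: str, rule_fn, page_text: str) -> str:
        stats = self.counters[rule_id]
        stats["calls"] += 1                      # rule was triggered
        start = time.perf_counter()
        filtered, n_actions = rule_fn(page_text)  # (text, action count)
        stats["exec_time"] += time.perf_counter() - start
        if n_actions > 0:
            stats["completions"] += 1            # rule filtered something
        return filtered
```

A usage sketch: `RuleMetrics().run_rule("rule_1", some_rule, page)` filters one page while updating the metrics that are later sent to the privacy profiler.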
  • the privacy metrics are sent to the privacy profiler.
  • FIG. 6 depicts an example process for generating privacy profiles.
  • the privacy profiler receives the privacy metrics from the privacy rule parser in operation 230 .
  • the privacy metrics are aggregated in operation 232 for a selectable time period.
  • the privacy profiler may count the total number of times a particular rule is triggered during an aggregation period.
  • the aggregation period may be any selectable time period, such as seconds, minutes, hours, days, etc.
  • the privacy metrics may be aggregated for different dimensions, such as for all privacy rules, individual privacy rules, privacy rule calls, privacy rule completions, privacy rule execution times, web sessions, web page, etc.
  • Operation 234 determines when the aggregation period has completed.
  • the aggregation period may be five minutes and the privacy profiler may count the number of times each privacy rule is triggered during the five-minute aggregation period.
  • averages may be calculated for certain privacy metrics.
  • the privacy profiler may calculate the average number of times the privacy rules are triggered for each web session, the average completion rate for the privacy rules, and the average execution times for the privacy rules.
  • Operation 238 stores the aggregated averaged privacy metrics in a database as the privacy profiles.
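The aggregation and averaging steps in operations 230-238 can be sketched as follows, assuming raw metrics arrive as simple (rule name, completed, execution time) records. The record layout and function name are illustrative assumptions.

```python
from collections import defaultdict

# Illustrative aggregation of raw privacy metrics into a profile entry
# for one aggregation period (e.g. five minutes). One record is assumed
# per rule call: (rule_name, completed, exec_time_ms).
def aggregate_metrics(records, sessions):
    by_rule = defaultdict(list)
    for rule_name, completed, exec_ms in records:
        by_rule[rule_name].append((completed, exec_ms))
    profile = {}
    for rule_name, calls in by_rule.items():
        n = len(calls)
        profile[rule_name] = {
            "calls": n,
            "calls_per_session": n / sessions,
            "completion_rate": 100.0 * sum(c for c, _ in calls) / n,
            "avg_exec_ms": sum(t for _, t in calls) / n,
        }
    return profile

records = [("rule3", True, 2.4), ("rule3", False, 2.6), ("rule1", True, 4.5)]
profile = aggregate_metrics(records, sessions=2)
print(profile["rule3"])  # 2 calls, 50% completion rate, 2.5 ms average
```

Each profile entry corresponds to one reference point for the selected period; storing a sequence of such entries yields the day- or week-long privacy profiles discussed below.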
  • FIG. 7 depicts an example process for automatically identifying irregularities in the privacy filtering process or irregularities in the captured web sessions.
  • the privacy event processor may receive new privacy metrics from the privacy profiler.
  • the new privacy metrics may be associated with the last five minutes of privacy filtering by the privacy rule parser.
  • the new privacy metrics may be compared with previously generated privacy profiles. For example, as described above, the average execution time for a particular privacy rule over the last five minutes may be compared with the average execution times identified with the rule in the privacy profiles.
  • the privacy event processor may send a notification in operation 256 for any of the most recent privacy metrics that extend outside of a threshold range of the privacy profiles in operation 254 . For example, an email may be sent to a system operator or an entry may be added to a log file identifying the particular rule, associated privacy metrics, and any other associated web session information, such as time, web session, web page, etc.
  • the new privacy metrics may be added to the existing privacy profiles.
  • the privacy profiles may track the average execution times of the rules over entire days, weeks, months, or years.
  • the new privacy metrics may identify the next time period for the privacy profile.
  • the new privacy metrics may be further accumulated with other accumulated and averaged privacy metrics in the privacy profiles. For example, all of the privacy metrics for a last hour may be accumulated and averaged to generate one reference point in a day, week, or month long privacy profile.
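The comparison and notification steps in operations 250-256 might look like the following sketch, assuming the aggregated metrics are stored as nested dictionaries keyed by rule and metric name. The 40% tolerance band is an invented example value, since the threshold range is described as selectable.

```python
# Hedged sketch of the irregularity check: compare the most recent
# aggregation period against the stored privacy profile and flag any
# metric outside a tolerance band around its baseline value.
def find_irregularities(profile, new_metrics, tolerance=0.40):
    alerts = []
    for rule_name, baseline in profile.items():
        current = new_metrics.get(rule_name)
        if current is None:
            continue
        for metric, base_value in baseline.items():
            if base_value == 0:
                continue
            deviation = abs(current[metric] - base_value) / base_value
            if deviation > tolerance:
                alerts.append((rule_name, metric, base_value, current[metric]))
    return alerts  # e.g. written to a log file or emailed to an operator

profile = {"rule3": {"avg_exec_ms": 2.5, "completion_rate": 60.0}}
latest = {"rule3": {"avg_exec_ms": 1.0, "completion_rate": 62.0}}
for alert in find_irregularities(profile, latest):
    print("irregularity:", alert)
```

With these example numbers only the execution-time drop is flagged, mirroring the rule #3 scenario discussed for FIG. 8A.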
  • FIGS. 8A and 8B depict examples of privacy metrics displayed for different privacy rules.
  • Selection boxes 302 may be used for selecting different privacy metrics or privacy dimensions for displaying on an electronic page 300 .
  • a selection box 302 A may select a parameter or dimension for displaying on the vertical Y-axis and a selection box 302 B may select a parameter or dimension for displaying along a horizontal X-axis.
  • selection box 302 A may select the vertical Y-axis to represent the average execution time required for the privacy processing system to complete execution of different privacy rules for captured web session data.
  • Selection box 302 B may select the horizontal X-axis to represent a particular time period for displaying the average execution times, such as for a particular day.
  • FIG. 8A shows a range for the average execution time on the Y-axis of between 0.0 milliseconds (ms) and 5.0 ms and a time range on the X-axis of between 8:00 am and 3:00 pm.
  • other privacy dimensions and time ranges may be displayed on the X and Y axes.
  • a selection box 302 C may select the privacy rules for displaying associated privacy metrics. For example, selection box 302 C may select all rules #1, #2, and #3 for displaying associated average execution times.
  • a selection box 302 D may select a web session category for displaying associated privacy metrics. For example, selection box 302 D may select privacy metrics to be displayed for captured web sessions, captured web pages within the captured web sessions, etc.
  • the privacy processing system displays three lines 304 , 306 , and 308 representing changes in the average execution times over a particular day for rules #1, #2, and #3, respectively.
  • line 304 stays relatively constant at around 4.5 ms
  • line 306 stays relatively constant at around 3.5 ms.
  • Normal variations may be expected in the average execution times due to different user activities during the web sessions. For example, users may navigate through different web pages during the web sessions and may or may not complete transactions during those web sessions.
  • different types and amounts of aggregated data may be captured for different individual web sessions that may or may not trigger the execution of certain privacy rules and may vary the amounts of time required to execute the privacy rules.
  • Line 308 shows a substantial change in the average execution time for rule #3 sometime after 11:00 am. Up until around 11:00 am the average execution time is around 2.5 ms and after 11:00 am the average execution time drops to around 1.0 ms.
  • the change in the average execution time may indicate a problem with rule #3.
  • the web application may have changed a web page name or field name that was previously used for triggering rule #3. As a result, rule #3 may no longer be called for the renamed web page and personal information in the renamed web page may no longer be filtered by rule #3.
  • Line 308 identifies a potential filtering problem associated with rule #3.
  • An operator may replay some of the web sessions captured after 11:00 am to determine if rule #3 is correctly filtering personal information from the captured web session data. For example, the operator may determine if rule #3 is removing a social security number from a particular web page in the replayed web session data.
  • FIG. 8B depicts another example of privacy metrics displayed by the privacy processing system.
  • the operator may decide to investigate in more detail the change in the average execution time for rule #3. Either via entries in selection boxes 302 or by selecting line 308 , the privacy processing system may display bar graphs showing other privacy metrics for rule #3 associated with different web pages within the web sessions. For example, the operator may select rule #3 in selection box 302 C and select a web page category in selection box 302 D.
  • the privacy processing system may display different bar graphs 320 , 322 , and 324 each associated with a different web page that may have been displayed during the captured web sessions.
  • bar graph 320 may be associated with a log-in page for the web session where a user logs into a web account
  • bar graph 322 may be associated with an accounts web page where a user enters address information
  • bar graph 324 may be associated with a checkout page where the user completes a transaction for purchasing a product or service.
  • a first solid line bar graph may represent the average execution time for rule #3 at 11:00 am and a second dashed line bar graph may represent the average execution time for rule #3 at 12:00 pm.
  • Bar graph 320 shows that the average execution time for privacy rule #3 for the log-in web page did not change substantially from 11:00 am to 12:00 pm and bar graph 322 shows that the average execution time for privacy rule #3 for the accounts web page did not change substantially from 11:00 am to 12:00 pm.
  • bar graph 324 shows that the average execution time for rule #3 when applied to the checkout web page substantially decreased from around 2.5 ms at 11:00 am to around 0.5 ms at 12:00 pm.
  • the operator may use the replay system or may use other search software to then determine if rule #3 is correctly filtering personal information captured by the checkout web page. For example, by replaying some of the web sessions captured after 12:00 pm, the operator may determine that rule #3 is not filtering credit card numbers from the captured web session data. This would provide an early warning of a privacy breach.
  • FIGS. 9A and 9B depict another example of how the privacy processing system may display privacy metrics that identify irregularities in the privacy filtering process.
  • selection box 302 A selects the vertical Y-axis to represent the successful completion rate for different rules.
  • the percentage completion rate may indicate the percentage of times a particular rule was called or triggered and then successfully completed an associated action, such as replacing or encrypting a sequence of numbers.
  • FIG. 9A shows successful completion percentages between 60% and 100%.
  • Selection box 302 B selects the time period for displaying the successful completion rates between 7:00 am and 1:00 pm.
  • Selection box 302 C selects all rules #1, #2, and #3 for displaying associated completion rates and selection box 302 D selects web sessions as the category of web session data for displaying associated completion rates.
  • the privacy processing system displays three lines 340 , 342 , and 344 showing the changes over a particular day for the completion rates for rules #1, #2, and #3, respectively.
  • line 340 shows that privacy rule #1 stays relatively constant at around 90%
  • line 342 shows that privacy rule #2 stays relatively constant at around 80%.
  • Variations in the completion rate also may be expected due to the different user activities during the web sessions. Again, users may navigate through different web pages during the web sessions and may or may not complete transactions during those web sessions. For example, some users may enter credit card information into web pages during web sessions that later during privacy filtering may trigger certain privacy rules and allow the triggered privacy rules to complete associated actions. Users in other web sessions may never enter credit card numbers into web pages and thus prevent some privacy rules from completing their associated actions.
  • Line 344 shows a substantial increase in the completion rate for privacy rule #3 sometime between 9:00 am and 10:00 am. Up until around 9:30 am the completion rate for privacy rule #3 is around 60% and after 10:00 am the completion rate for rule #3 increases to over 80%.
  • the increase in the completion rate may indicate a problem with privacy rule #3. For example, modifications to a web page may cause rule #3 to unintentionally replace all of the data entered into the web page. As a result, privacy rule #3 may remove web session data needed for properly replaying and analyzing captured web sessions.
  • line 344 identifies a potential filtering problem associated with privacy rule #3.
  • the operator may again replay some web sessions captured after 10:00 am to determine if rule #3 is filtering the correct information from the captured web session data.
  • FIG. 9B depicts other privacy metrics displayed by the privacy processing system.
  • the operator may decide to investigate the change in the completion rate for privacy rule #3. Either via entries in selection boxes 302 or by selecting line 344 , the privacy processing system may display bar graphs showing privacy metrics for different web browsers used during the web sessions. For example, the operator may select privacy rule #3 in selection box 302 C and select browsers in selection box 302 D.
  • the privacy processing system may display different bar graphs 350 , 352 , and 354 each associated with a different web browser or web application that may have been used during the captured web sessions.
  • bar graph 350 may be associated with a mobile web browser or mobile application used on mobile devices
  • bar graph 352 may be associated with a desktop web browser used on a personal computer
  • bar graph 354 may be associated with a second type of desktop web browser used on personal computers.
  • a first solid line bar graph may represent the completion rates for privacy rule #3 at 9:00 am and a second dashed line bar graph may represent the completion rates for privacy rule #3 at 12:00 pm.
  • Bar graphs 352 and 354 show that the completion rates associated with the two desktop browsers did not change substantially between 9:00 am and 10:00 am. This may indicate that the web sessions conducted with the two desktop browsers and the privacy filtering associated with the browsers are both operating normally.
  • bar graph 350 shows that the completion rate for privacy rule #3 for captured web sessions associated with the mobile browser increased substantially from around 60% at 9:00 am to around 85% at 10:00 am.
  • the operator may again use the replay system or other software to verify privacy rule #3 is filtering the correct information in the captured web sessions. For example, replaying some of the 10:00 am mobile browser web sessions may determine privacy rule #3 is filtering the wrong data.
  • the test algorithm may interpret rule #3 differently for data originating from a mobile web browser, causing the data to be handled differently.
  • FIGS. 10A-10C depict examples of how the replay system may confirm proper privacy filtering of captured web session data.
  • FIG. 10A depicts a mobile device 370 having a screen displaying an electronic web page 374 used to enter personal information for completing a credit card transaction.
  • a user may enter a name into a name field 372 A, enter a street address into an address field 372 B, enter a town and zip code into a city field 372 C, and enter a credit card number into a credit card field 372 D.
  • a monitoring system may capture and send the personal information entered into fields 372 to the privacy processing system.
  • FIG. 10B shows a replayed web session after the captured web session data was filtered by the privacy processing system.
  • the web session may be replayed on computing device 160 and may replay electronic web page 374 .
  • the capture and filtering of the web session data may have happened around 9:00 am.
  • FIG. 10B may represent a properly filtered web session where only the first eight digits of the credit card number were replaced with X.
  • FIG. 10C shows a second replayed web session after the captured web session data was filtered by the privacy processing system.
  • the filtering of the captured web session data may have happened around 10:00 am.
  • FIG. 10C may represent incorrectly filtered web session data where all of the information captured in electronic web page 374 is replaced with X's.
  • the replay system can be used to further investigate and identify privacy filtering problems that may have originally been identified by comparing privacy metrics with the privacy profiles.
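Beyond manual replay, a simple automated scan could flag residual card-like digit runs in already-filtered session data, catching the FIG. 10C failure mode in the opposite direction (data that should have been masked but was not). The 13-16 digit heuristic below is an assumption for illustration, not part of the described replay system.

```python
import re

# Heuristic scan of filtered web session fields for digit runs that
# look like unmasked payment card numbers (13-16 digits, optionally
# separated by spaces or dashes). Pattern choice is an assumption.
CARD_LIKE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def residual_card_numbers(filtered_fields):
    """Return the names of fields that still contain card-like digits."""
    return [name for name, value in filtered_fields.items()
            if CARD_LIKE.search(value)]

# A correctly filtered page (FIG. 10B style) vs. an unfiltered leak.
ok_page = {"credit_card": "XXXXXXXX6789"}
bad_page = {"credit_card": "4111 1111 1111 1111"}
print(residual_card_numbers(ok_page))   # no leaks flagged
print(residual_card_numbers(bad_page))  # the credit_card field is flagged
```

Such a scan could run alongside the metric-based checks, giving a second, content-level signal that a privacy rule stopped firing.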
  • FIG. 11 shows a computing device 1000 that may be used for operating the privacy processing system and performing any combination of the privacy processing operations discussed above.
  • the computing device 1000 may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • computing device 1000 may be a personal computer (PC), a tablet, a Personal Digital Assistant (PDA), a cellular telephone, a smart phone, a web appliance, or any other machine or device capable of executing instructions 1006 (sequential or otherwise) that specify actions to be taken by that machine.
  • computing device 1000 may include any collection of devices or circuitry that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the operations discussed above.
  • Computing device 1000 may be part of an integrated control system or system manager, or may be provided as a portable electronic device configured to interface with a networked system either locally or remotely via wireless transmission.
  • Processors 1004 may comprise a central processing unit (CPU), a graphics processing unit (GPU), programmable logic devices, dedicated processor systems, micro controllers, or microprocessors that may perform some or all of the operations described above. Processors 1004 may also include, but may not be limited to, an analog processor, a digital processor, a microprocessor, multi-core processor, processor array, network processor, etc.
  • Processors 1004 may execute instructions or “code” 1006 stored in any one of memories 1008 , 1010 , or 1020 .
  • the memories may store data as well. Instructions 1006 and data can also be transmitted or received over a network 1014 via a network interface device 1012 utilizing any one of a number of well-known transfer protocols.
  • Memories 1008 , 1010 , and 1020 may be integrated together with computing device 1000 , for example RAM or FLASH memory disposed within an integrated circuit microprocessor or the like.
  • the memory may comprise an independent device, such as an external disk drive, storage array, or any other storage devices used in database systems.
  • the memory and processing devices may be operatively coupled together, or in communication with each other, for example by an I/O port, network connection, etc. such that the processing device may read a file stored on the memory.
  • Some memory may be "read only" by design (ROM), by virtue of permission settings, or not.
  • Other examples of memory may include, but may not be limited to, WORM, EPROM, EEPROM, FLASH, etc., which may be implemented in solid state semiconductor devices.
  • Other memories may comprise moving parts, such as a conventional rotating disk drive. All such memories may be "machine-readable" in that they may be readable by a processing device.
  • Computer-readable storage medium may include all of the foregoing types of memory, as well as new technologies that may arise in the future, as long as they may be capable of storing digital information in the nature of a computer program or other data, at least temporarily, in such a manner that the stored information may be “read” by an appropriate processing device.
  • the term “computer-readable” may not be limited to the historical usage of “computer” to imply a complete mainframe, mini-computer, desktop, wireless device, or even a laptop computer. Rather, “computer-readable” may comprise storage medium that may be readable by a processor, processing device, or any computing system. Such media may be any available media that may be locally and/or remotely accessible by a computer or processor, and may include volatile and non-volatile media, and removable and non-removable media.
  • Computing device 1000 can further include a video display 1016, such as a liquid crystal display (LCD) or a cathode ray tube (CRT), and a user interface 1018, such as a keyboard, mouse, touch screen, etc. All of the components of computing device 1000 may be connected together via a bus 1002 and/or network.

Abstract

A privacy processing system may use privacy rules to filter sensitive personal information from web session data. The privacy processing system may generate privacy profiles or privacy metadata that identifies how often the privacy rules are called, how often the privacy rules successfully complete actions, and the processing time required to execute the privacy rules. The privacy profiles may be used to detect irregularities in the privacy filtering process that may be associated with a variety of privacy filtering and web session problems.

Description

    RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 13/658,155, filed Oct. 23, 2012, which is herein incorporated by reference in its entirety.
  • BACKGROUND
  • Monitoring and replay systems may capture web session data, such as web pages sent from a web application server to a client computer and user interface events entered into the web pages at the client computer. The captured web session data may be used to replay and analyze user experiences during the web sessions. For example, the replayed web sessions may be used to identify problems users may have had while navigating through web pages during the web session.
  • Sensitive personal information may be entered into the web pages during the web sessions. For example, the web sessions may involve on-line purchases of products and/or services. In order to complete the on-line transactions, users may need to enter social security numbers, passwords, credit card numbers, bank account numbers, health information, stock information, home addresses, or the like, or any combination thereof.
  • Government privacy regulations may prohibit the retention of certain personal information or limit the retention of the personal information to certified entities. These privacy regulations may require monitoring and replay systems to filter sensitive personal information before storing the captured web session data in a database for subsequent replay analysis.
  • Current monitoring and replay systems attempt to remove sensitive personal information. However, some personal information may not be successfully filtered from the captured web session data. For example, a web application may change the name of a web page or the name of a field in the web page that was previously used for triggering the privacy rules that filter the sensitive personal information. If the sensitive personal information is not filtered, some or all of the captured web session data may need to be destroyed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 depicts an example of a system for filtering information from captured web session data.
  • FIG. 2 depicts an example of a privacy processing system.
  • FIG. 3 depicts examples of privacy metrics generated by the privacy processing system.
  • FIG. 4 depicts an example process for comparing privacy metrics with privacy profiles.
  • FIG. 5 depicts an example process for generating privacy metrics.
  • FIG. 6 depicts an example process for generating privacy profiles.
  • FIG. 7 depicts an example of a process for detecting irregular privacy metrics.
  • FIGS. 8A and 8B depict example graphs displaying average execution times for privacy rules.
  • FIGS. 9A and 9B depict example graphs displaying percentage of successful completions for privacy rules.
  • FIGS. 10A, 10B and 10C depict examples of replayed web sessions showing correct and incorrect privacy filtering.
  • FIG. 11 depicts an example computing device for implementing the privacy processing system.
  • DETAILED DESCRIPTION
  • FIG. 1 depicts a web session 100 conducted between a web application 104 operating on a web server 102 and a computing device 110. Web application 104 may support any type of online web session such as online purchases, online financial or medical services, social networking, etc. Of course, these are just examples, and any type of electronic web based transaction or activity may be performed using web application 104.
  • Computing device 110 may comprise a Personal Computer (PC), laptop computer, wireless Personal Digital Assistant (PDA), cellular telephone, smart phone, tablet, or any other wired or wireless device that accesses and exchanges information with web application 104. Any number of computing devices 110 may conduct different web sessions 100 with web application 104 at any geographical location and at any time of day.
  • Computing device 110 may communicate with web application 104 over a network connection 108. Network connection 108 may comprise any combination of connections over an Internet network, a wireless network, a WiFi network, a telephone network, a Public Switched Telephone Network (PSTN), a cellular network, a cable network, a Wide Area Network (WAN), a Local Area Network (LAN), or the like, or any combination thereof.
  • In one example, a web browser or web application 118 operating on computing device 110 may send Hyper Text Transfer Protocol (HTTP) requests to web application 104 over network connection 108. Web application 104 may send back one or more of web pages 106 in response to the HTTP requests and computing device 110 may display the web pages via the web browser or application 118 on a computer screen 116. For example, web browser or mobile application 118 may display an electronic web page 112 that contains fields 114A-114C for entering a user name, password, and social security number, respectively. Web application 104 may send additional web pages 106 and/or responses to computing device 110 in response to the information entered into fields 114.
  • Web session monitors 122 may capture web session data 124 during web session 100. Web session data 124 may include network data transferred over network connection 108 between computing device 110 and web application 104 and user interface events generated on computing device 110. For example, web session data 124 may comprise the Hyper Text Transfer Protocol (HTTP) requests and other data requests sent from computing device 110 to web application 104 and the Hyper Text Markup Language (HTML) web pages 106 and other responses sent back to computing device 110 from web application 104.
  • Some of web session data 124 may include user interface events entered by a user into computing device 110, such as mouse clicks, keystrokes, alpha-numeric data, or the like, or any combination thereof. For example, some of the user interface events may comprise data entered into fields 114 of web page 112 or may comprise selections of icons or links on web page 112.
  • Other web session data 124 may comprise webpage logic/code sent by web application 104 along with web pages 106 to computing device 110 that further determine the different states or operations in the web pages. Some of the web session data may be generated locally on processing device 110 and never sent over network connection 108. For example, the control logic within web page 112 may change the state of web page 112 in response to user inputs without sending any data back to web application 104. In another example, a batch data transfer of only completed information in web page 112 may be transferred back to web application 104 over network connection 108.
  • In another example, some web session data 124 may comprise document object model (DOM) events within the web pages. For example, changes in the DOM of displayed web page 106 may be captured by UI event monitor 122A as some of web session data 124. In yet another example, web session data 124 may comprise operating parameters or any other logged data in computing device 110 and/or web server 102. For example, web session data 124 may comprise network bandwidth indicators, processor bandwidth indicators, network condition indicators, computer operating conditions, or the like, or any combination thereof.
  • In one example, network session monitor 122B may capture the network data, such as web pages, requests, responses, and/or logic exchanged between computing device 110 and web application 104 over network connection 108. User interface (UI) monitor 122A may capture the user interface events generated locally at computing device 110. In another example, UI monitor 122A also may capture some or all of the network data exchanged between computing device 110 and web application 104 over network connection 108.
  • In yet another example, UI event monitor 122A and/or network session monitor 122B may not capture all the web session data 124 and may only detect occurrences of some web session events. In this example, monitors 122A and 122B may send unique identifiers identifying occurrences of certain web session events and may send timestamps indicating when the web session events were detected.
  • Examples of systems for capturing and/or identifying web session data and events are described in U.S. Pat. No. 6,286,030 issued Sep. 4, 2001, entitled: Systems and Methods for Recording and Visually Recreating Sessions in a Client-Server Environment now reissued as U.S. Pat. No. RE41903; U.S. Pat. No. 8,127,000 issued Feb. 28, 2012, entitled: Method and Apparatus for Monitoring and Synchronizing User Interface Events with Network Data; and U.S. patent application Ser. No. 13/419,179 filed Mar. 13, 2012, entitled: Method and Apparatus for Intelligent Capture of Document Object Model Events which are all herein incorporated by reference in their entireties.
  • During web session 100, a user may enter a user name into field 114A, a password into field 114B and/or a social security number into field 114C. Due to the security requirements discussed above, the password and/or social security number may need to be filtered before captured web session data 124 can be stored in a database 136.
  • A privacy processing system 130 filters sensitive personal information, such as the password and/or social security number from captured web session data 124. Filtering refers to any combination of removing, blocking, replacing, encrypting, hashing, or the like, information in web session data 124. Privacy processing system 130 stores filtered web session data 138 in web session database 136. A replay system 134 can then use the captured and now filtered web session data 138 to replay the original web session 100 without displaying sensitive personal information. One example of a replay system 134 is described in U.S. Pat. No. 8,127,000, entitled: METHOD AND APPARATUS FOR MONITORING AND SYNCHRONIZING USER INTERFACE EVENTS WITH NETWORK DATA, issued Feb. 28, 2012 which has been incorporated by reference in its entirety.
  • Privacy processing system 130 may apply privacy rules to the captured web session data 124 to remove the sensitive personal information. Privacy profiles or privacy metadata may be generated for the privacy rules. For example, the privacy profiles may identify how often the privacy rules are called, how often the privacy rules successfully complete actions, and amounts of processing time required to execute the privacy rules. The privacy profiles may detect privacy filtering problems, such as privacy rules that do not filter personal information or filter the wrong information from the web session data, or if certain patterns of data require abnormally large privacy resources, such as time or CPU usage. In addition, any big deviation in privacy resource usage may indicate a change to the website or in end user behavior.
  • Filtering and/or encrypting sensitive personal information in captured web session data may be computationally expensive. For example, a particular web site may service hundreds of millions of users and hundreds of millions of associated web sessions each day. The privacy profiles may identify privacy rules that are written incorrectly or inefficiently and waste processing bandwidth. The identified privacy rules can then be rewritten so they more efficiently search for and filter personal information from the millions of web sessions.
  • Privacy processing system 130 may detect other web session events or states that may impact user experiences during web sessions 100. For example, the privacy profiles may identify incorrect user actions, virus attacks, errors or gaps in the logic of the source web application, etc. Thus, privacy processing system 130 not only generates quantitative privacy filtering metrics but may also identify other general problems with the web sessions.
  • FIG. 2 depicts in more detail one example of privacy processing system 130. A privacy rule parser 148 may apply privacy rules 150 to web session data 124 captured from web sessions 100. Different rules 150 may be applied to different web pages and different data captured during web session 100. For example, a first rule 150A may search for a particular web page in captured web session data 124 that may contain a social security number. A second rule 150B may search for different fields in several different web pages in web session data 124 that may contain a credit card number.
  • Rules 150A and 150B may filter the web session data by replacing, blocking, hashing, encrypting, etc. sensitive personal information, such as social security numbers or credit card numbers. Filtered web session data 138 is stored in database 136 and can then be used for subsequent replay and analysis by replay system 134.
  • A privacy profiler 152 may generate privacy profiles 158 for privacy rules 150. For example, privacy profiler 152 may track the number of times each rule 150 is called when filtering web session data 124, the number of times each rule 150 succeeded in filtering information in web session data 124, and/or the amount of time required to execute rules 150 while filtering web session data 124.
  • Privacy profiler 152 may generate privacy profiles 158 by aggregating the privacy metrics for rules 150. For example, privacy profiler 152 may calculate an average number of times over the last five minutes that rule 150A is called while filtering web session data 124. The aggregated privacy metrics may be used as a baseline or “profile” of typical or normal privacy filtering behavior. Privacy metrics outside of privacy profile thresholds may be indicative of privacy filtering problems or other web session problems. For example, there may be a substantial change in the average number of times a particular privacy rule is called for each captured web session. In another example, there may be a substantial change in the amount of time required to successfully complete execution of a privacy rule.
  • These changes in the privacy metrics may be caused by changes in the web application or web pages used by the web application. As mentioned above, a user may enter personal information into particular fields within particular web pages. Privacy rule parser 148 may only call rule 150A when particular web page names or field names are identified in captured web session data 124. If the enterprise operating the web application changes the web page name or field name, privacy rule parser 148 may no longer call rule 150A to filter data in the renamed web page. Accordingly, a social security number entered into the renamed web page may no longer be filtered from web session data 124, compromising the overall privacy and security of filtered web session data 138.
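For illustration, the page-name gating described above can be sketched as follows. This is a simplified sketch, not the patent's implementation; the rule name, page names, and payload format are hypothetical.

```python
import re

# Hypothetical registry: a privacy rule is only called when the
# captured page's name matches a name the rule was registered for.
RULE_PAGES = {"rule_150A": {"account_setup.html"}}

def mask_ssn(text):
    # Simplified filtering action: replace a 3-2-4 digit sequence with X's.
    return re.sub(r"\b\d{3}[-\s]\d{2}[-\s]\d{4}\b", "XXX-XX-XXXX", text)

def filter_hit(page_name, payload):
    # If the enterprise renames the page, the rule silently stops
    # firing and the social security number passes through unfiltered.
    if page_name in RULE_PAGES["rule_150A"]:
        return mask_ssn(payload)
    return payload
```

Replaying sessions captured after such a rename would show the unfiltered value, which is the failure mode the privacy profiles are designed to surface.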
  • In another example, a change in the amount of time required to successfully complete privacy rules 150 also may indicate a privacy filtering problem. For example, web browsers, web pages, and/or web page logic may change or reformat data entered into the web pages. The changed or reformatted data may cause privacy rule parser 148 to unintentionally call privacy rules 150 for web pages that do not contain personal information or may cause rules 150 to filter the wrong data. These incorrect or unintentional filtering operations may waste processing bandwidth and/or remove web session data needed for accurately replaying the captured web sessions.
  • In yet another example, a change in the privacy metrics may indicate an abnormal web session. For example, a denial of service attack or other bot attacks may substantially change the number or percentage of times a particular rule 150 is called or the amount of time required to successfully complete the privacy rule. Privacy profiles 158 can identify privacy metric changes and isolate privacy filtering or web session problems.
  • A privacy event processor 154 may display graphs 162 on a computing device 160 for privacy profiles 158. For example, graphs 162 may identify the average privacy metric values for different privacy rules. Any substantial deviations in graphs 162 may indicate privacy filtering problems and/or web session problems. A user may direct privacy event processor 154 to display the privacy profiles for different dimensions, such as for particular privacy rules, privacy rule parameters, web session categories, or web browsers. For example, the user may direct privacy event processor 154 to display privacy metrics associated with different desk-top web browsers and mobile applications.
  • Computing device 160 may use replay system 134 to further isolate privacy filtering irregularities. For example, a user may compare replayed filtered web sessions prior to large privacy metric changes with replayed web sessions after the large privacy metric changes. Any filtering problems identified during replay can then be corrected by modifying the associated rules 150 in privacy rule parser 148.
  • FIG. 3 depicts in more detail examples of privacy rules 150 and privacy metrics 176 used and generated by the privacy rule parser. A first rule 150A may comprise a test 172A that looks for a particular set of numbers, text, values, parameters, etc. For example, test 172A may test for a first set of three numbers separated from a second set of two numbers by a space or dash. Test 172A may then look for a third set of four numbers separated by a second space or dash from the second set of two numbers.
  • Upon detection of the number sequence, rule 150A may trigger an action 174A. In one example, the action 174A for rule 150A may replace the detected sequence of numbers with an “X”. Examples of other actions may include only “X-ing” out some of the numbers in the identified sequence or using a hash algorithm to encrypt the number sequence. In one example, the encrypted number sequence might not be decrypted, but could still be used with the hash algorithm to confirm association of the number with a particular user. Any other actions may be used for filtering information.
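A minimal sketch of test 172A and two possible actions follows; the pattern and function names are illustrative assumptions, including the one-way hash variant described above.

```python
import hashlib
import re

# Test 172A style: three digits, two digits, four digits, separated
# by a space or dash.
SSN_PATTERN = re.compile(r"\b(\d{3})[-\s](\d{2})[-\s](\d{4})\b")

def action_replace(text):
    # Action 174A style: replace the entire detected sequence with X's.
    return SSN_PATTERN.sub("XXX-XX-XXXX", text)

def action_hash(text):
    # Alternative action: one-way hash. The number cannot be recovered,
    # but hashing a candidate number again still confirms association
    # with a particular user.
    return SSN_PATTERN.sub(
        lambda m: hashlib.sha256(m.group(0).encode()).hexdigest()[:16],
        text)
```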
  • A set of privacy metrics 176A may be generated by the privacy rule parser in association with privacy rule 150A. For example, privacy metrics 176 may include web session identifiers identifying the particular web sessions where rule 150A is called. A web page identifier may identify particular web pages in the web session data where rule 150A is called. A field identifier may identify particular fields in the captured web session data where rule 150A is called for filtering data entered into the fields.
  • A web browser identifier may identify a particular web browser or application used during the associated web session where rule 150A was called. A time stamp metric may identify when the web sessions producing the captured web session data took place and/or may identify when rule 150A was called for filtering the captured web session data. A “# of calls” metric may identify the number of times rule 150A was called while filtering the captured web session data or the number of times rule 150A was called for individual web sessions, web page, etc. A “# of completed actions” metric may identify the number of times rule 150A successfully completed an associated filtering action. For example, the # of completed actions may identify the number of times rule 150A replaced the sequence in test 172A with the X's in action 174A. An execution time metric may identify the amount of processing time required to complete rule 150A for a web session, web page, etc.
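The per-rule metrics listed above could be collected in a record such as the following sketch; the field names and types are hypothetical, as the patent does not prescribe a data layout.

```python
from dataclasses import dataclass

# Hypothetical record mirroring privacy metrics 176A: identifiers for
# where the rule fired, plus call, completion, and timing counters.
@dataclass
class PrivacyMetrics:
    rule_id: str
    session_id: str
    page_id: str
    field_id: str = ""
    browser_id: str = ""
    timestamp: float = 0.0
    num_calls: int = 0
    num_completed_actions: int = 0
    execution_time_ms: float = 0.0

# One record per rule invocation site, updated as the rule runs.
m = PrivacyMetrics(rule_id="150A", session_id="s-42", page_id="checkout")
m.num_calls += 1
m.num_completed_actions += 1
```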
  • A second rule 150B may comprise a test 172B that also parses the web session data for a particular set of numbers, text, values, parameters, etc. In this example, test 172B may test for a sequence of numbers associated with a credit card number. The number sequence may be four sets of four numbers separated by spaces or dashes. In one example, satisfaction of test 172B may initiate an associated action 174B that replaces the first twelve numbers of the detected sequence of sixteen numbers with “X's”. Another similar set of privacy metrics 176B may be generated for rule 150B.
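Test 172B and action 174B might be sketched as follows, again with an illustrative pattern that masks the first twelve digits and keeps the last group of four.

```python
import re

# Test 172B style: four sets of four digits separated by spaces or dashes.
CARD_PATTERN = re.compile(r"\b(?:\d{4}[-\s]){3}(\d{4})\b")

def mask_card(text):
    # Action 174B style: X out the first twelve of the sixteen digits
    # and keep the last four for reference.
    return CARD_PATTERN.sub(lambda m: "XXXX-XXXX-XXXX-" + m.group(1), text)
```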
  • Rules 150A and 150B are just examples of any variety of tests and actions that may be applied to the web session data. Rules 150 may be applied to any combination of web session data. For example, rules 150 may only be called for particular web pages that query a user for sensitive personal information, such as a social security number. Other rules 150 may be called to filter data associated with other sensitive data captured on other web pages.
  • FIG. 4 depicts an example of privacy profiles 158 generated by privacy profiler 152 in FIG. 3 from the privacy metrics. Privacy profiles 158 may identify different statistical dimensions associated with filtering the web session data. For example, a first privacy profile 158A may identify the average number of times any of the rules in the privacy rule parser were called while filtering the web session data. Privacy profile 158A may be derived for different categories, such as for time periods, web sessions, web pages, web browsers and/or mobile web applications.
  • A privacy profile 158B may identify the average successful completion rate for the rules indicating the average percentage of times the privacy rules successfully filter information. A privacy profile 158C may identify an average amount of time required for all of the rules to filter the web session data for web sessions, web pages, browsers, etc.
  • A privacy profile 158D may identify the average number of times a particular rule #1 is called while filtering the web session data. A privacy profile 158E may identify the successful completion rate for rule #1. Privacy profiles 158 may be associated with any other privacy rules and collected privacy dimensions.
  • The privacy profiler may generate aggregated average privacy metrics 180 for aggregation periods. For example, the privacy profiler may aggregate the number of times different rules are called, aggregate the percentage of times the rules successfully complete filtering actions, and aggregate the amounts of time required for the rules to complete execution. The privacy metrics may be aggregated for some selectable time period, such as five minutes, and the aggregated values averaged to generate privacy metrics 180.
  • Privacy event processor 154 may compare privacy metrics 180 for each aggregated time period with privacy profiles 158. A privacy metric notification 182 may be generated for any privacy metrics 180 that are outside of threshold ranges for privacy profiles 158. For example, the privacy profiler may determine standard deviations for the values in privacy profiles 158. Privacy event processor 154 may send notification 182 or create an entry in a log file for any of the privacy metrics 180 that are outside of the standard deviations for associated privacy profiles 158.
  • For example, rule #1 may have successfully completed 100% of the time over the last five minute aggregation period. The average successful completion rate in privacy profile 158E for rule #1 may be 80 percent and the standard deviation may be +/−4%. Thus, the threshold range for privacy profile 158E may be between 76% and 84%. Since the successful completion rate for rule #1 in privacy metrics 180 is outside of the threshold range for privacy profile 158E, privacy event processor 154 may generate a notification 182 or generate an entry in a file or table identifying privacy metric 180 for rule #1 as an outlier.
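The threshold arithmetic in this example can be sketched as follows; `outside_profile` is a hypothetical helper, and the sample history is chosen so that its mean is 80% with a 4% standard deviation, reproducing the 76%-84% range above.

```python
import statistics

def outside_profile(history, new_value, num_devs=1.0):
    # Flag an aggregated metric that falls outside the profile's
    # mean +/- one standard deviation threshold range.
    mean = statistics.mean(history)
    dev = statistics.pstdev(history)
    low, high = mean - num_devs * dev, mean + num_devs * dev
    return not (low <= new_value <= high)

# Completion-rate history averaging 80% with a +/-4% standard
# deviation; a new 100% completion rate lands outside 76%-84%
# and would trigger a notification 182.
history = [76, 84, 76, 84]
```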
  • Privacy event processor 154 also may automatically determine differences in the web session data associated with different privacy metrics. For example, the captured web session data may include Document Object Models (DOMs) for the web pages. Privacy event processor 154 may detect privacy metrics outside of the privacy profile thresholds. The DOMs for filtered web pages having privacy metrics within the privacy profile thresholds may be compared with the DOMs for filtered web pages with privacy metrics outside of the privacy profile thresholds. The DOM differences may be identified and sent to the operator. For example, the DOM differences may identify a web page with a changed name that may have prevented rule #1 from correctly triggering.
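One possible sketch of this DOM comparison uses a plain line diff rather than any particular DOM library; the helper name and sample markup are illustrative.

```python
import difflib

def dom_differences(dom_in_range, dom_outlier):
    # Diff the serialized DOM of a page whose privacy metrics stayed
    # inside the profile thresholds against one whose metrics did not,
    # keeping only the changed lines for the operator.
    diff = difflib.unified_diff(dom_in_range.splitlines(),
                                dom_outlier.splitlines(), lineterm="")
    return [line for line in diff
            if line.startswith(("+", "-"))
            and not line.startswith(("+++", "---"))]

# A renamed form could explain why a rule keyed to the old name
# stopped triggering.
before = '<form name="checkout"><input name="ssn"></form>'
after_page = '<form name="purchase"><input name="ssn"></form>'
changes = dom_differences(before, after_page)
```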
  • FIG. 5 depicts an example process for generating privacy metrics. In operation 200, privacy rules are applied to captured web session data received from the web session monitoring systems. As mentioned above any number of web sessions may be continuously, periodically, or randomly monitored and the associated web session data sent to the privacy processing system.
  • Privacy rules called or triggered during the privacy filtering process are identified in operation 202. For example, a rule may be called when a particular web page is identified in the web session data and the rule may not be called when that web page is never opened during the captured web session. In operation 204, privacy metrics are generated for the privacy rules. For example, each web session and web page may have an associated identifier. The privacy rule parser may identify the web sessions and/or web pages where the rules are triggered. The privacy rule parser may identify any other privacy metrics associated with the rules, such as a time of day when the rules are triggered, a type of web browser used on the client site where the rule was triggered, etc.
  • In operation 206, the privacy rule parser may determine if the rules are successfully completed. For example, the privacy rules may be triggered whenever a particular web page is identified. The triggered rule may then execute a test to identify any data in the web page satisfying a particular condition and/or matching a particular value, sequence, location, etc. If the test is satisfied, the rule performs an associated action. For example, the action may replace a matching combination of numbers with X's. Replacement of the matching combination of numbers is identified as a successful completion of the rule.
  • In operation 208, the amounts of time required to complete the privacy rules may be identified. For example, privacy rules may need to apply an associated test to all of the data associated with one or more web pages. The privacy rule parser may track the amount of processing time required to parse through the one or more web pages. The time can be tracked on any other variety of dimensions or categories, such as for periods of time, web sessions, web pages, particular fields in the web pages, etc. In operation 210, the privacy metrics are sent to the privacy profiler.
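The timing measurement of operation 208 might be sketched as a simple wrapper around a rule invocation; the helper name and sample payload are illustrative.

```python
import time

def timed_rule(rule_fn, payload):
    # Track the processing time required to run one privacy rule over
    # a page's data; returns the filtered result and elapsed time in ms.
    start = time.perf_counter()
    result = rule_fn(payload)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms

# Hypothetical rule: mask a trailing card fragment.
masked, elapsed = timed_rule(lambda t: t.replace("1234", "XXXX"),
                             "card ending 1234")
```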
  • FIG. 6 depicts an example process for generating privacy profiles. The privacy profiler receives the privacy metrics from the privacy rule parser in operation 230. The privacy metrics are aggregated in operation 232 for a selectable time period. For example, the privacy profiler may count the total number of times a particular rule is triggered during an aggregation period. The aggregation period may be any selectable time period, such as seconds, minutes, hours, days, etc. The privacy metrics may be aggregated for different dimensions, such as for all privacy rules, individual privacy rules, privacy rule calls, privacy rule completions, privacy rule execution times, web sessions, web page, etc.
  • Operation 234 determines when the aggregation period has completed. For example, the aggregation period may be five minutes and the privacy profiler may count the number of times each privacy rule is triggered during the five minute aggregation period.
  • In operation 236, averages may be calculated for certain privacy metrics. For example, the privacy profiler may calculate the average number of times the privacy rules are triggered for each web session, the average completion rate for the privacy rules, and the average execution times for the privacy rules. Operation 238 stores the aggregated averaged privacy metrics in a database as the privacy profiles.
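The aggregation and averaging steps of operations 232-238 might be sketched as follows; the bucketing scheme and sample data are assumptions for illustration.

```python
from collections import defaultdict

def aggregate_profiles(metrics, period_s=300):
    # Bucket per-call metrics into fixed aggregation periods (five
    # minutes here), then compute per-rule call counts and average
    # execution times for each bucket.
    buckets = defaultdict(list)
    for ts, rule_id, exec_ms in metrics:
        buckets[(int(ts // period_s), rule_id)].append(exec_ms)
    return {key: (len(vals), sum(vals) / len(vals))
            for key, vals in buckets.items()}

# (timestamp in seconds, rule id, execution time in ms); hypothetical.
sample = [(0, "rule1", 2.0), (100, "rule1", 4.0), (400, "rule1", 1.0)]
profiles = aggregate_profiles(sample)
```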
  • FIG. 7 depicts an example process for automatically identifying irregularities in the privacy filtering process or irregularities in the captured web sessions. In operation 250, the privacy event processor may receive new privacy metrics from the privacy profiler. For example, the new privacy metrics may be associated with the last five minutes of privacy filtering by the privacy rule parser.
  • In operation 252, the new privacy metrics may be compared with previously generated privacy profiles. For example, as described above, the average execution time for a particular privacy rule over the last five minutes may be compared with the average execution times identified with the rule in the privacy profiles. In operation 254, the privacy event processor determines whether any of the most recent privacy metrics extend outside of a threshold range of the privacy profiles, and if so, sends a notification in operation 256. For example, an email may be sent to a system operator or an entry may be added to a log file identifying the particular rule, associated privacy metrics, and any other associated web session information, such as time, web session, web page, etc.
  • In operation 258, the new privacy metrics may be added to the existing privacy profiles. For example, the privacy profiles may track the average execution times of the rules over entire days, weeks, months, years. The new privacy metrics may identify the next time period for the privacy profile. In one example, the new privacy metrics may be further accumulated with other accumulated and averaged privacy metrics in the privacy profiles. For example, all of the privacy metrics for a last hour may be accumulated and averaged to generate one reference point in a day, week, or month long privacy profile.
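Folding a new aggregation period into an existing profile can be sketched as an incremental running mean; representing a profile as a (count, mean) tuple is a hypothetical simplification.

```python
def update_profile(profile, new_avg):
    # Fold the newest aggregation period's average into a running
    # profile kept as (number of periods, running mean).
    count, mean = profile
    count += 1
    mean += (new_avg - mean) / count
    return count, mean

# Three successive five-minute completion-rate averages.
profile = (0, 0.0)
for period_avg in [80.0, 84.0, 76.0]:
    profile = update_profile(profile, period_avg)
```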
  • FIGS. 8A and 8B depict examples of privacy metrics displayed for different privacy rules. Selection boxes 302 may be used for selecting different privacy metrics or privacy dimensions for displaying on an electronic page 300. A selection box 302A may select a parameter or dimension for displaying on the vertical Y-axis and a selection box 302B may select a parameter or dimension for displaying along a horizontal X-axis.
  • For example, selection box 302A may select the vertical Y-axis to represent the average execution time required for the privacy processing system to complete execution of different privacy rules for captured web session data. Selection box 302B may select the horizontal X-axis to represent a particular time period for displaying the average execution times, such as for a particular day. FIG. 8A shows a range for the average execution time on the Y-axis of between 0.0 milliseconds (ms) and 5.0 ms and a time range on the X-axis of between 8:00 am and 3:00 pm. Of course other privacy dimensions and time ranges may be displayed on the X and Y axes.
  • A selection box 302C may select the privacy rules for displaying associated privacy metrics. For example, selection box 302C may select all rules #1, #2, and #3 for displaying associated average execution times. A selection box 302D may select a web session category for displaying associated privacy metrics. For example, selection box 302D may select privacy metrics to be displayed for captured web sessions, captured web pages within the captured web sessions, etc.
  • Based on the entries in selection boxes 302, the privacy processing system displays three lines 304, 306, and 308 representing changes in the average execution times over a particular day for rules #1, #2, and #3, respectively. In this example, line 304 stays relatively constant at around 4.5 ms and line 306 stays relatively constant at around 3.5 ms. Normal variations may be expected in the average execution times due to different user activities during the web sessions. For example, users may navigate through different web pages during the web sessions and may or may not complete transactions during those web sessions. Thus, different types and amounts of aggregated data may be captured for different individual web sessions that may or may not trigger the execution of certain privacy rules and may vary the amounts of time required to execute the privacy rules.
  • Line 308 shows a substantial change in the average execution time for rule #3 sometime after 11:00 am. Up until around 11:00 am the average execution time is around 2.5 ms and after 11:00 am the average execution time drops to around 1.0 ms. The change in the average execution time may indicate a problem with rule #3. For example, the web application may have changed a web page name or field name that was previously used for triggering rule #3. As a result, rule #3 may no longer be called for the renamed web page and personal information in the renamed web page may no longer be filtered by rule #3.
  • Line 308 identifies a potential filtering problem associated with rule #3. An operator may replay some of the web sessions captured after 11:00 am to determine if rule #3 is correctly filtering personal information from the captured web session data. For example, the operator may determine if rule #3 is removing a social security number from a particular web page in the replayed web session data.
  • FIG. 8B depicts another example of privacy metrics displayed by the privacy processing system. The operator may decide to investigate in more detail the change in the average execution time for rule #3. Either via entries in selection boxes 302 or by selecting line 308, the privacy processing system may display bar graphs showing other privacy metrics for rule #3 associated with different web pages within the web sessions. For example, the operator may select rule #3 in selection box 302C and select a web page category in selection box 302D.
  • In response, the privacy processing system may display different bar graphs 320, 322, and 324 each associated with a different web page that may have been displayed during the captured web sessions. For example, bar graph 320 may be associated with a log-in page for the web session where a user logs into a web account, bar graph 322 may be associated with an accounts web page where a user enters address information, and bar graph 324 may be associated with a checkout page where the user completes a transaction for purchasing a product or service.
  • A first solid line bar graph may represent the average execution time for rule #3 at 11:00 am and a second dashed line bar graph may represent the average execution time for rule #3 at 12:00 pm. Bar graph 320 shows that the average execution time for privacy rule #3 for the log-in web page did not change substantially from 11:00 am and 12:00 pm and bar graph 322 shows that the average execution time for privacy rule #3 for the accounts web page did not change substantially from 11:00 am to 12:00 pm. However, bar graph 324 shows that the average execution time for rule #3 when applied to the checkout web page substantially decreased from around 2.5 ms at 11:00 am to around 0.5 ms at 12:00 pm.
  • The operator may use the replay system or may use other search software to then determine if rule #3 is correctly filtering personal information captured by the checkout web page. For example, by replaying some of the web sessions captured after 12:00 pm, the operator may determine that rule #3 is not filtering credit card numbers from the captured web session data. This would provide an early warning of a privacy breach.
  • FIGS. 9A and 9B depict another example of how the privacy processing system may display privacy metrics that identify irregularities in the privacy filtering process. In this example, selection box 302A selects the vertical Y-axis to represent the successful completion rate for different rules. As explained above, the percentage completion rate may indicate the percentage of times a particular rule was called or triggered and then successfully completed an associated action, such as replacing or encrypting a sequence of numbers.
  • FIG. 9A shows successful completion percentages between 60% and 100%. Selection box 302B selects the time period for displaying the successful completion rates between 7:00 am and 1:00 pm. Selection box 302C selects all rules #1, #2, and #3 for displaying associated completion rates and selection box 302D selects web sessions as the category of web session data for displaying associated completion rates.
  • Based on the entries in selection boxes 302, the privacy processing system displays three lines 340, 342, and 344 showing the changes over a particular day for the completion rates for rules #1, #2, and #3, respectively. In this example, line 340 shows that privacy rule #1 stays relatively constant at around 90% and line 342 shows that privacy rule #2 stays relatively constant at around 80%.
  • Variations in the completion rate also may be expected due to the different user activities during the web sessions. Again, users may navigate through different web pages during the web sessions and may or may not complete transactions during those web sessions. For example, some users may enter credit card information into web pages during web sessions that later during privacy filtering may trigger certain privacy rules and allow the triggered privacy rules to complete associated actions. Users in other web sessions may never enter credit card numbers into web pages and thus prevent some privacy rules from completing their associated actions.
  • Line 344 shows a substantial increase in the completion rate for privacy rule #3 sometime between 9:00 am and 10:00 am. Up until around 9:30 am the completion rate for privacy rule #3 is around 60% and after 10:00 am the completion rate for rule #3 increases to over 80%. The increase in the completion rate may indicate a problem with privacy rule #3. For example, modifications to a web page may cause rule #3 to unintentionally replace all of the data entered into the web page. As a result, privacy rule #3 may remove web session data needed for properly replaying and analyzing captured web sessions.
  • Thus, line 344 identifies a potential filtering problem associated with privacy rule #3. The operator may again replay some web sessions captured after 10:00 am to determine if rule #3 is filtering the correct information from the captured web session data.
  • FIG. 9B depicts other privacy metrics displayed by the privacy processing system. The operator may decide to investigate the change in the completion rate for privacy rule #3. Either via entries in selection boxes 302 or by selecting line 344, the privacy processing system may display bar graphs showing security metrics for different web browsers used during the web sessions. For example, the operator may select privacy rule #3 in selection box 302C and select browsers in selection box 302D.
  • In response, the privacy processing system may display different bar graphs 350, 352, and 354 each associated with a different web browser or web application that may have been used during the captured web sessions. For example, bar graph 350 may be associated with a mobile web browser or mobile application used on mobile devices, bar graph 352 may be associated with a desk-top web browser used on a personal computer, and bar graph 354 may be associated with a second type of desk-top web browser used on personal computers.
  • A first solid line bar graph may represent the completion rates for privacy rule #3 at 9:00 am and a second dashed line bar graph may represent the completion rates for privacy rule #3 at 10:00 am. Bar graphs 352 and 354 show that the completion rates associated with the two desktop browsers did not change substantially between 9:00 am and 10:00 am. This may indicate that the web sessions conducted with the two desktop browsers and the privacy filtering associated with the browsers are both operating normally. However, bar graph 350 shows that the completion rate for privacy rule #3 for captured web sessions associated with the mobile browser increased substantially from around 60% at 9:00 am to around 85% at 10:00 am.
  • The operator may again use the replay system or other software to verify privacy rule #3 is filtering the correct information in the captured web sessions. For example, replaying some of the 10:00 am mobile browser web sessions may determine privacy rule #3 is filtering the wrong data. The test algorithm may interpret rule #3 differently for data originating from a mobile web browser, causing the data to be handled differently.
  • FIGS. 10A-10C depict examples of how the replay system may confirm proper privacy filtering of captured web session data. FIG. 10A depicts a mobile device 370 having a screen displaying an electronic web page 374 used to enter personal information for completing a credit card transaction. In this example, a user may enter a name into a name field 372A, enter a street address into an address field 372B, enter a town and zip code into a city field 372C, and enter a credit card number into a credit card field 372D. As explained above, a monitoring system may capture and send the personal information entered into fields 372 to the privacy processing system.
  • FIG. 10B shows a replayed web session after the captured web session data was filtered by the privacy processing system. The web session may be replayed on computing device 160 and may replay electronic web page 374. The capture and filtering of the web session data may have happened around 9:00 am. FIG. 10B may represent a properly filtered web session where only the first eight digits of the credit card number were replaced with X's.
  • FIG. 10C shows a second replayed web session after the captured web session data was filtered by the privacy processing system. The filtering of the captured web session data may have happened around 10:00 am. FIG. 10C may represent incorrectly filtered web session data where all of the information captured in electronic web page 374 is replaced with X's. Thus, the replay system can be used to further investigate and identify privacy filtering problems that may have originally been identified by comparing privacy metrics with the privacy profiles.
  • Hardware and Software
  • FIG. 11 shows a computing device 1000 that may be used for operating the privacy processing system and performing any combination of the privacy processing operations discussed above. The computing device 1000 may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. In other examples, computing device 1000 may be a personal computer (PC), a tablet, a Personal Digital Assistant (PDA), a cellular telephone, a smart phone, a web appliance, or any other machine or device capable of executing instructions 1006 (sequential or otherwise) that specify actions to be taken by that machine.
  • While only a single computing device 1000 is shown, the computing device 1000 may include any collection of devices or circuitry that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the operations discussed above. Computing device 1000 may be part of an integrated control system or system manager, or may be provided as a portable electronic device configured to interface with a networked system either locally or remotely via wireless transmission.
  • Processors 1004 may comprise a central processing unit (CPU), a graphics processing unit (GPU), programmable logic devices, dedicated processor systems, microcontrollers, or microprocessors that may perform some or all of the operations described above. Processors 1004 may also include, but may not be limited to, an analog processor, a digital processor, a microprocessor, multi-core processor, processor array, network processor, etc.
  • Some of the operations described above may be implemented in software and other operations may be implemented in hardware. One or more of the operations, processes, or methods described herein may be performed by an apparatus, device, or system similar to those as described herein and with reference to the illustrated figures.
  • Processors 1004 may execute instructions or “code” 1006 stored in any one of memories 1008, 1010, or 1020. The memories may store data as well. Instructions 1006 and data can also be transmitted or received over a network 1014 via a network interface device 1012 utilizing any one of a number of well-known transfer protocols.
  • Memories 1008, 1010, and 1020 may be integrated together with processing device 1000, for example RAM or FLASH memory disposed within an integrated circuit microprocessor or the like. In other examples, the memory may comprise an independent device, such as an external disk drive, storage array, or any other storage devices used in database systems. The memory and processing devices may be operatively coupled together, or in communication with each other, for example by an I/O port, network connection, etc. such that the processing device may read a file stored on the memory.
  • Some memory may be "read only" by design (ROM), by virtue of permission settings, or not. Other examples of memory may include, but may not be limited to, WORM, EPROM, EEPROM, FLASH, etc., which may be implemented in solid state semiconductor devices. Other memories may comprise moving parts, such as a conventional rotating disk drive. All such memories may be "machine-readable" in that they may be readable by a processing device.
  • “Computer-readable storage medium” (or alternatively, “machine-readable storage medium”) may include all of the foregoing types of memory, as well as new technologies that may arise in the future, as long as they may be capable of storing digital information in the nature of a computer program or other data, at least temporarily, in such a manner that the stored information may be “read” by an appropriate processing device. The term “computer-readable” may not be limited to the historical usage of “computer” to imply a complete mainframe, mini-computer, desktop, wireless device, or even a laptop computer. Rather, “computer-readable” may comprise storage medium that may be readable by a processor, processing device, or any computing system. Such media may be any available media that may be locally and/or remotely accessible by a computer or processor, and may include volatile and non-volatile media, and removable and non-removable media.
  • Computing device 1000 can further include a video display 1016, such as a liquid crystal display (LCD) or a cathode ray tube (CRT), and a user interface 1018, such as a keyboard, mouse, touch screen, etc. All of the components of computing device 1000 may be connected together via a bus 1002 and/or network.
  • For the sake of convenience, operations may be described as various interconnected or coupled functional blocks or diagrams. However, there may be cases where these functional blocks or diagrams may be equivalently aggregated into a single logic device, program or operation with unclear boundaries.
  • Having described and illustrated the principles of a preferred embodiment, it should be apparent that the embodiments may be modified in arrangement and detail without departing from such principles. Claim is made to all modifications and variations coming within the spirit and scope of the following claims.
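The comparison of aggregated privacy metrics against a privacy profile described above, where a threshold deviation between two time periods signals an irregularity, might be sketched as follows. This is a hedged illustration, not the patented implementation; the function name, the use of a mean as the statistical metric, and the default 50% threshold are assumptions.

```python
from statistics import mean

# Hedged sketch of the threshold-deviation comparison described above:
# aggregate a privacy rule's filter counts over a baseline period and a
# recent period, and flag an irregularity when the relative deviation
# reaches a threshold. Function name, mean aggregation, and the default
# 50% threshold are illustrative assumptions.

def detect_irregularity(baseline_counts, recent_counts, threshold=0.5):
    """Return True when the aggregated metrics deviate by >= threshold."""
    baseline = mean(baseline_counts)
    recent = mean(recent_counts)
    if baseline == 0:
        # A rule that never fired before but fires now is irregular.
        return recent > 0
    return abs(recent - baseline) / baseline >= threshold
```

For example, a rule that normally filters about 100 fields per hour but suddenly filters only about 10 would be flagged for further investigation, such as by replaying the affected web session with the replay system described above.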

Claims (25)

1-20. (canceled)
21. A privacy processing system comprising:
a set of privacy rules, wherein each of the privacy rules is for application in filtering sensitive personal information from web session data;
a set of privacy profiles, wherein each of the privacy profiles includes metrics associated with application of one or more of the privacy rules to web session data; and
a privacy event processor for:
using one or more of the privacy profiles, identifying an irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data associated with a web session; and
upon identification of the irregularity, sending an electronic notification relating to the identified irregularity.
22. The system of claim 21, wherein the web session includes an online purchase of a product or service, and wherein the irregularity is associated with a filtering problem that includes a privacy rule that does not filter sensitive personal information comprising information associated with completing the online purchase.
23. The system of claim 21, wherein the web session includes a credit card purchase, and wherein the irregularity is associated with a filtering problem that includes a privacy rule that does not filter credit card information associated with the purchase.
24. The system of claim 21, comprising a privacy profiler for generating the privacy profiles.
25. The system of claim 21, wherein the privacy profiles identify how often the privacy rules are called, how often the privacy rules successfully complete actions, and amounts of processing time required to execute the privacy rules.
26. The system of claim 21, wherein application of the privacy rules in filtering sensitive personal information from web session data comprises using the privacy rules to remove sensitive personal information.
27. The system of claim 21, wherein the metrics are privacy metrics including aggregated metrics relating to privacy rule use over time.
28. The system of claim 21, wherein the metrics are privacy metrics that include aggregated metrics relating to privacy rule use over time, and wherein comparison of aggregated metrics is used in identifying the irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data.
29. The system of claim 21, wherein the metrics are privacy metrics that include aggregated metrics relating to privacy rule use over time, and wherein comparison of aggregated metrics is used in identifying the irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data, and wherein identifying the irregularity comprises, for a first privacy rule:
determining a first statistical metric associated with use of the first privacy rule in filtering sensitive personal information from web session data over a first period of time;
determining a second statistical metric associated with use of the first privacy rule in filtering sensitive personal information from web session data over a second period of time; and
comparing the first statistical metric to the second statistical metric in determining whether a threshold deviation has been reached between the first statistical metric and the second statistical metric, wherein reaching the threshold deviation is associated with identifying the irregularity.
30. The system of claim 21, wherein the privacy rules are for application in filtering sensitive personal information from the web session data before storing web session data, of the web session data, in a database for subsequent replay analysis.
31. The system of claim 21, wherein the privacy processing system is for replaying the web session with the sensitive personal information filtered therefrom in identifying the irregularity.
32. The system of claim 21, wherein the irregularity is associated with a filtering problem.
33. The system of claim 21, wherein the irregularity is associated with a filtering problem that includes a privacy rule that does not filter sensitive personal information from web session data.
34. The system of claim 21, wherein the irregularity is associated with a filtering problem that includes a privacy rule that does not filter sensitive personal information from web session data as required by one or more government privacy regulations.
35. The system of claim 21, wherein the notification is sent to be received or accessed by an operator of at least a portion of the privacy processing system.
36. The system of claim 21, wherein the notification comprises an email.
37. The system of claim 21, wherein the notification comprises a log entry accessible by an operator of at least a portion of the privacy processing system.
38. A method comprising:
generating a set of privacy rules for application in filtering sensitive personal information from web session data;
generating a set of privacy profiles, wherein each of the privacy profiles includes metrics associated with application of one or more of the privacy rules to web session data;
using one or more of the privacy profiles, identifying an irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data associated with a web session; and
upon identification of the irregularity, sending an electronic notification relating to the identified irregularity.
39. The method of claim 38, comprising identifying the irregularity, wherein the web session includes an online purchase of a product or service, and wherein the irregularity is associated with a filtering problem that includes a privacy rule that does not filter sensitive personal information comprising information associated with completing the online purchase.
40. The method of claim 38, comprising identifying the irregularity, wherein the web session includes a credit card transaction, and wherein the irregularity is associated with a filtering problem that includes a privacy rule that does not filter credit card information associated with completing the transaction.
41. The method of claim 38, wherein the irregularity is determined to be associated with Document Object Model (DOM) changes made to filtered web pages.
42. The method of claim 38, wherein the metrics are privacy metrics that include aggregated metrics relating to privacy rule use over time, and wherein comparison of aggregated metrics is used in identifying the irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data.
43. The method of claim 38, wherein the metrics are privacy metrics that include aggregated metrics relating to privacy rule use over time, and wherein comparison of aggregated metrics is used in identifying the irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data, and wherein identifying the irregularity comprises, for a first privacy rule:
determining a first statistical metric associated with use of the first privacy rule in filtering sensitive personal information from web session data over a first period of time;
determining a second statistical metric associated with use of the first privacy rule in filtering sensitive personal information from web session data over a second period of time; and
comparing the first statistical metric to the second statistical metric in determining whether a threshold deviation has been reached between the first statistical metric and the second statistical metric, wherein reaching the threshold deviation is associated with identifying the irregularity.
44. A non-transitory computer readable medium or media containing instructions for executing a method comprising:
generating a set of privacy rules for application in filtering sensitive personal information from web session data;
generating a set of privacy profiles, wherein each of the privacy profiles includes metrics associated with application of one or more of the privacy rules to web session data;
using one or more of the privacy profiles, identifying an irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data associated with a web session; and
upon identification of the irregularity, sending an electronic notification of the identified irregularity;
wherein the web session includes an online purchase of a product or service, and wherein the irregularity is associated with a filtering problem that includes a privacy rule that does not filter sensitive personal information comprising information associated with completing the online purchase; and
wherein the metrics are privacy metrics that include aggregated metrics relating to privacy rule use over time, and wherein comparison of aggregated metrics is used in identifying the irregularity in application of one or more of the privacy rules in filtering sensitive personal information from web session data, and wherein identifying the irregularity comprises, for a first privacy rule of the privacy rules:
determining a first statistical metric associated with use of the first privacy rule in filtering sensitive personal information from web session data over a first period of time;
determining a second statistical metric associated with use of the first privacy rule in filtering sensitive personal information from web session data over a second period of time; and
comparing the first statistical metric to the second statistical metric in determining whether a threshold deviation has been reached between the first statistical metric and the second statistical metric, wherein reaching the threshold deviation is associated with identifying the irregularity.
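The per-rule metrics recited in the claims above, how often a privacy rule is called, how often it successfully completes its action, and the processing time it consumes, could be aggregated in a structure along these lines. The class and field names are hypothetical, offered only to illustrate the kind of profile data involved.

```python
from dataclasses import dataclass

# Hypothetical aggregation structure for the privacy metrics recited in
# claim 25: call counts, successfully completed actions, and processing
# time per privacy rule. All names here are illustrative.

@dataclass
class RuleProfile:
    rule_id: str
    calls: int = 0
    completed_actions: int = 0
    total_processing_ms: float = 0.0

    def record(self, succeeded: bool, elapsed_ms: float) -> None:
        """Fold one application of the rule into the running profile."""
        self.calls += 1
        if succeeded:
            self.completed_actions += 1
        self.total_processing_ms += elapsed_ms

    @property
    def success_rate(self) -> float:
        return self.completed_actions / self.calls if self.calls else 0.0
```

Profiles of this kind, accumulated over different time periods, would supply the aggregated statistical metrics whose comparison is recited in the claims.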

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/673,262 US20200065525A1 (en) 2012-10-23 2019-11-04 Method and apparatus for generating privacy profiles

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/658,155 US9536108B2 (en) 2012-10-23 2012-10-23 Method and apparatus for generating privacy profiles
US15/040,178 US10474840B2 (en) 2012-10-23 2016-02-10 Method and apparatus for generating privacy profiles
US16/673,262 US20200065525A1 (en) 2012-10-23 2019-11-04 Method and apparatus for generating privacy profiles

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/040,178 Continuation US10474840B2 (en) 2012-10-23 2016-02-10 Method and apparatus for generating privacy profiles

Publications (1)

Publication Number Publication Date
US20200065525A1 true US20200065525A1 (en) 2020-02-27

Family

ID=50486644

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/658,155 Active 2035-04-10 US9536108B2 (en) 2012-10-23 2012-10-23 Method and apparatus for generating privacy profiles
US15/040,178 Active 2034-03-29 US10474840B2 (en) 2012-10-23 2016-02-10 Method and apparatus for generating privacy profiles
US16/673,262 Abandoned US20200065525A1 (en) 2012-10-23 2019-11-04 Method and apparatus for generating privacy profiles


Country Status (2)

Country Link
US (3) US9536108B2 (en)
CN (1) CN103825774B (en)

US10896394B2 (en) 2016-06-10 2021-01-19 OneTrust, LLC Privacy management systems and methods
US10503926B2 (en) 2016-06-10 2019-12-10 OneTrust, LLC Consent receipt management systems and related methods
US11238390B2 (en) 2016-06-10 2022-02-01 OneTrust, LLC Privacy management systems and methods
US11228620B2 (en) 2016-06-10 2022-01-18 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11227247B2 (en) 2016-06-10 2022-01-18 OneTrust, LLC Data processing systems and methods for bundled privacy policies
US11416109B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Automated data processing systems and methods for automatically processing data subject access requests using a chatbot
US11294939B2 (en) 2016-06-10 2022-04-05 OneTrust, LLC Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11418492B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing systems and methods for using a data model to select a target data asset in a data migration
US11295316B2 (en) 2016-06-10 2022-04-05 OneTrust, LLC Data processing systems for identity validation for consumer rights requests and related methods
US12045266B2 (en) 2016-06-10 2024-07-23 OneTrust, LLC Data processing systems for generating and populating a data inventory
US10776514B2 (en) 2016-06-10 2020-09-15 OneTrust, LLC Data processing systems for the identification and deletion of personal data in computer systems
US10565397B1 (en) 2016-06-10 2020-02-18 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US11636171B2 (en) 2016-06-10 2023-04-25 OneTrust, LLC Data processing user interface monitoring systems and related methods
US11057356B2 (en) 2016-06-10 2021-07-06 OneTrust, LLC Automated data processing systems and methods for automatically processing data subject access requests using a chatbot
US10484868B2 (en) * 2017-01-17 2019-11-19 International Business Machines Corporation Configuring privacy policies by formulating questions and evaluating responses
US10831509B2 (en) 2017-02-23 2020-11-10 Ab Initio Technology Llc Dynamic execution of parameterized applications for the processing of keyed network data streams
US11947978B2 (en) 2017-02-23 2024-04-02 Ab Initio Technology Llc Dynamic execution of parameterized applications for the processing of keyed network data streams
US9965648B1 (en) * 2017-04-06 2018-05-08 International Business Machines Corporation Automatic masking of sensitive data
US10839098B2 (en) 2017-04-07 2020-11-17 International Business Machines Corporation System to prevent export of sensitive data
US10013577B1 (en) 2017-06-16 2018-07-03 OneTrust, LLC Data processing systems for identifying whether cookies contain personally identifying information
US11769077B2 (en) * 2017-10-13 2023-09-26 Koninklijke Philips N.V. Methods and systems to characterize the user of a personal care device
CN107798250B (en) * 2017-10-13 2021-08-24 平安科技(深圳)有限公司 Sensitive information shielding rule issuing method, application server and computer readable storage medium
US10635825B2 (en) 2018-07-11 2020-04-28 International Business Machines Corporation Data privacy awareness in workload provisioning
US10803202B2 (en) 2018-09-07 2020-10-13 OneTrust, LLC Data processing systems for orphaned data identification and deletion and related methods
US11144675B2 (en) 2018-09-07 2021-10-12 OneTrust, LLC Data processing systems and methods for automatically protecting sensitive data within privacy management systems
US11544409B2 (en) 2018-09-07 2023-01-03 OneTrust, LLC Data processing systems and methods for automatically protecting sensitive data within privacy management systems
US11374889B2 (en) 2019-03-20 2022-06-28 Infoarmor, Inc. Unsubscribe and delete automation
US11232229B2 (en) 2019-03-20 2022-01-25 Infoarmor, Inc. Unsubscribe and delete automation
US11196693B2 (en) 2019-03-20 2021-12-07 Allstate Insurance Company Unsubscribe automation
WO2022011142A1 (en) 2020-07-08 2022-01-13 OneTrust, LLC Systems and methods for targeted data discovery
WO2022026564A1 (en) 2020-07-28 2022-02-03 OneTrust, LLC Systems and methods for automatically blocking the use of tracking tools
US11475165B2 (en) 2020-08-06 2022-10-18 OneTrust, LLC Data processing systems and methods for automatically redacting unstructured data from a data subject access request
WO2022060860A1 (en) 2020-09-15 2022-03-24 OneTrust, LLC Data processing systems and methods for detecting tools for the automatic blocking of consent requests
US11526624B2 (en) 2020-09-21 2022-12-13 OneTrust, LLC Data processing systems and methods for automatically detecting target data transfers and target data processing
US11841972B2 (en) * 2020-10-20 2023-12-12 Avaya Management L.P. System and method to safeguarding sensitive information in cobrowsing session
WO2022099023A1 (en) 2020-11-06 2022-05-12 OneTrust, LLC Systems and methods for identifying data processing activities based on data discovery results
US11687528B2 (en) 2021-01-25 2023-06-27 OneTrust, LLC Systems and methods for discovery, classification, and indexing of data in a native computing system
WO2022170047A1 (en) 2021-02-04 2022-08-11 OneTrust, LLC Managing custom attributes for domain objects defined within microservices
US20240111899A1 (en) 2021-02-08 2024-04-04 OneTrust, LLC Data processing systems and methods for anonymizing data samples in classification analysis
US11601464B2 (en) 2021-02-10 2023-03-07 OneTrust, LLC Systems and methods for mitigating risks of third-party computing system functionality integration into a first-party computing system
WO2022178089A1 (en) 2021-02-17 2022-08-25 OneTrust, LLC Managing custom workflows for domain objects defined within microservices
WO2022178219A1 (en) 2021-02-18 2022-08-25 OneTrust, LLC Selective redaction of media content
US20240311497A1 (en) 2021-03-08 2024-09-19 OneTrust, LLC Data transfer discovery and analysis systems and related methods
US11562078B2 (en) 2021-04-16 2023-01-24 OneTrust, LLC Assessing and managing computational risk involved with integrating third party computing functionality within a computing system
US11775695B2 (en) * 2021-08-03 2023-10-03 International Business Machines Corporation Image redaction for a display device
US20230090108A1 (en) * 2021-09-23 2023-03-23 Quantum Metric, Inc. Systematic identification and masking of private data for replaying user sessions
US11620142B1 (en) 2022-06-03 2023-04-04 OneTrust, LLC Generating and customizing user interfaces for demonstrating functions of interactive user environments

Family Cites Families (200)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8801628D0 (en) 1988-01-26 1988-02-24 British Telecomm Evaluation system
CA2038244A1 (en) 1990-04-19 1991-10-20 Arthur D. Markowitz Hand held computer terminal
US5577254A (en) 1993-06-29 1996-11-19 Bull Hn Information Systems Inc. Method and apparatus for capturing the presentation of an interactive user session, monitoring, replaying and joining sessions
US5825880A (en) 1994-01-13 1998-10-20 Sudia; Frank W. Multi-step digital signature method and system
JPH07306803A (en) 1994-03-24 1995-11-21 At & T Global Inf Solutions Internatl Inc Secret protective face of computer-resource storage part
US5564043A (en) 1994-03-24 1996-10-08 At&T Global Information Solutions Launching computer program upon download of data created by program
EP0674283A3 (en) 1994-03-24 1996-03-27 At & T Global Inf Solution Ordering and downloading resources from computerized repositories.
US6026403A (en) 1994-03-24 2000-02-15 Ncr Corporation Computer system for management of resources
US5721906A (en) 1994-03-24 1998-02-24 Ncr Corporation Multiple repositories of computer resources, transparent to user
US6732358B1 (en) 1994-03-24 2004-05-04 Ncr Corporation Automatic updating of computer software
US5715314A (en) 1994-10-24 1998-02-03 Open Market, Inc. Network sales system
GB2295299B (en) 1994-11-16 1999-04-28 Network Services Inc Enterpris Enterprise network management method and apparatus
NZ306846A (en) 1995-06-05 2000-01-28 Certco Llc Digital signing method using partial signatures
US7272639B1 (en) 1995-06-07 2007-09-18 Soverain Software Llc Internet server access control and monitoring systems
WO1996041289A2 (en) 1995-06-07 1996-12-19 Electronic Data Systems Corporation System and method for electronically auditing point-of-sale transactions
US5832496A (en) 1995-10-12 1998-11-03 Ncr Corporation System and method for performing intelligent analysis of a computer database
US6085223A (en) 1995-10-20 2000-07-04 Ncr Corporation Method and apparatus for providing database information to non-requesting clients
US5930786A (en) 1995-10-20 1999-07-27 Ncr Corporation Method and apparatus for providing shared data to a requesting client
US5717879A (en) 1995-11-03 1998-02-10 Xerox Corporation System for the capture and replay of temporal data representing collaborative activities
US5774552A (en) 1995-12-13 1998-06-30 Ncr Corporation Method and apparatus for retrieving X.509 certificates from an X.500 directory
US5751962A (en) 1995-12-13 1998-05-12 Ncr Corporation Object-based systems management of computer networks
US5848396A (en) 1996-04-26 1998-12-08 Freedom Of Information, Inc. Method and apparatus for determining behavioral profile of a computer user
US5857188A (en) 1996-04-29 1999-01-05 Ncr Corporation Management of client requests in a client-server environment
US5845124A (en) 1996-05-01 1998-12-01 Ncr Corporation Systems and methods for generating and displaying a symbolic representation of a network model
US5894516A (en) 1996-07-10 1999-04-13 Ncr Corporation Broadcast software distribution
US5933601A (en) 1996-09-30 1999-08-03 Ncr Corporation Method for systems management of object-based computer networks
US5870559A (en) 1996-10-15 1999-02-09 Mercury Interactive Software system and associated methods for facilitating the analysis and management of web sites
US6295550B1 (en) 1996-10-23 2001-09-25 Ncr Corporation Session creation mechanism for collaborative network navigation
US5809250A (en) 1996-10-23 1998-09-15 Intel Corporation Methods for creating and sharing replayable modules representive of Web browsing session
US5889860A (en) 1996-11-08 1999-03-30 Sunhawk Corporation, Inc. Encryption system with transaction coded decryption key
US5848412A (en) 1996-11-19 1998-12-08 Ncr Corporation User controlled browser identification disclosing mechanism
US5969632A (en) 1996-11-22 1999-10-19 Diamant; Erez Information security method and apparatus
US5903652A (en) 1996-11-25 1999-05-11 Microsoft Corporation System and apparatus for monitoring secure information in a computer network
US6006228A (en) 1996-12-11 1999-12-21 Ncr Corporation Assigning security levels to particular documents on a document by document basis in a database
US6115742A (en) 1996-12-11 2000-09-05 At&T Corporation Method and apparatus for secure and auditable metering over a communications network
WO1998036520A1 (en) 1997-02-13 1998-08-20 Secure Transaction Solutions, Llc Cryptographic key split combiner
DE19712262B4 (en) 1997-03-24 2006-06-08 Siemens Ag Method for routing signaling messages in a signaling network
US5905868A (en) 1997-07-22 1999-05-18 Ncr Corporation Client/server distribution of performance monitoring data
US5951652A (en) 1997-10-06 1999-09-14 Ncr Corporation Dependable data element synchronization mechanism
US6779030B1 (en) 1997-10-06 2004-08-17 Worldcom, Inc. Intelligent network
US5954798A (en) 1997-10-06 1999-09-21 Ncr Corporation Mechanism for dependably managing web synchronization and tracking operations among multiple browsers
US6035332A (en) 1997-10-06 2000-03-07 Ncr Corporation Method for monitoring user interactions with web pages from web server using data and command lists for maintaining information visited and issued by participants
US5951643A (en) 1997-10-06 1999-09-14 Ncr Corporation Mechanism for dependably organizing and managing information for web synchronization and tracking among multiple browsers
US5941957A (en) 1997-10-06 1999-08-24 Ncr Corporation Dependable web page synchronization mechanism
US6418439B1 (en) 1997-11-12 2002-07-09 Ncr Corporation Computer system and computer implemented method for translation of information into multiple media variations
US6317794B1 (en) 1997-11-12 2001-11-13 Ncr Corporation Computer system and computer implemented method for synchronization of simultaneous web views
US6151601A (en) 1997-11-12 2000-11-21 Ncr Corporation Computer architecture and method for collecting, analyzing and/or transforming internet and/or electronic commerce data for storage into a data storage area
US6151584A (en) 1997-11-20 2000-11-21 Ncr Corporation Computer architecture and method for validating and collecting and metadata and data about the internet and electronic commerce environments (data discoverer)
US6286046B1 (en) 1997-12-22 2001-09-04 International Business Machines Corporation Method of recording and measuring e-business sessions on the world wide web
US6169997B1 (en) 1998-04-29 2001-01-02 Ncr Corporation Method and apparatus for forming subject (context) map and presenting Internet data according to the subject map
US6714931B1 (en) 1998-04-29 2004-03-30 Ncr Corporation Method and apparatus for forming user sessions and presenting internet data according to the user sessions
JP4064060B2 (en) 1998-05-15 2008-03-19 ユニキャスト・コミュニケーションズ・コーポレイション Technology for implementing network-distributed interstitial web advertisements that are initiated by the browser and invisible to the user using ad tags embedded in reference web pages
US6182097B1 (en) 1998-05-21 2001-01-30 Lucent Technologies Inc. Method for characterizing and visualizing patterns of usage of a web site by network users
US6658453B1 (en) 1998-05-28 2003-12-02 America Online, Incorporated Server agent system
US6745243B2 (en) 1998-06-30 2004-06-01 Nortel Networks Limited Method and apparatus for network caching and load balancing
US6286030B1 (en) 1998-07-10 2001-09-04 Sap Aktiengesellschaft Systems and methods for recording and visually recreating sessions in a client-server environment
US6286098B1 (en) 1998-08-28 2001-09-04 Sap Aktiengesellschaft System and method for encrypting audit information in network applications
US6253203B1 (en) 1998-10-02 2001-06-26 Ncr Corporation Privacy-enhanced database
US6489980B1 (en) 1998-12-29 2002-12-03 Ncr Corporation Software apparatus for immediately posting sharing and maintaining objects on a web page
US6502086B2 (en) 1999-01-04 2002-12-31 International Business Machines Corporation Mapping binary objects in extended relational database management systems with relational registry
US6397256B1 (en) 1999-01-27 2002-05-28 International Business Machines Corporation Monitoring system for computers and internet browsers
US6334110B1 (en) 1999-03-10 2001-12-25 Ncr Corporation System and method for analyzing customer transactions and interactions
US20040078464A1 (en) 1999-09-16 2004-04-22 Rajan Sreeranga P. Method and apparatus for enabling real time monitoring and notification of data updates for WEB-based data synchronization services
US7293281B1 (en) 1999-10-25 2007-11-06 Watchfire Corporation Method and system for verifying a client request
US6850975B1 (en) 1999-11-29 2005-02-01 Intel Corporation Web site monitoring
GB2357680B (en) 2000-03-14 2002-02-13 Speed Trap Com Ltd Monitoring operation of services provided over a network
US7260837B2 (en) 2000-03-22 2007-08-21 Comscore Networks, Inc. Systems and methods for user identification, user demographic reporting and collecting usage data usage biometrics
US7260774B2 (en) 2000-04-28 2007-08-21 Inceptor, Inc. Method & system for enhanced web page delivery
US7043546B2 (en) * 2000-04-28 2006-05-09 Agilent Technologies, Inc. System for recording, editing and playing back web-based transactions using a web browser and HTML
US20020070953A1 (en) 2000-05-04 2002-06-13 Barg Timothy A. Systems and methods for visualizing and analyzing conditioned data
US7580996B1 (en) 2000-05-31 2009-08-25 International Business Machines Corporation Method and system for dynamic update of an application monitoring agent using a non-polling mechanism
US20020006591A1 (en) * 2000-07-07 2002-01-17 Hugens John R. Method and apparatus for mixing combustion gases
US7278105B1 (en) 2000-08-21 2007-10-02 Vignette Corporation Visualization and analysis of user clickpaths
ATE342546T1 (en) 2000-08-24 2006-11-15 Nice Systems Ltd SYSTEM AND METHOD FOR COLLECTING BROWSER SESSIONS AND USER ACTIONS
US20020056091A1 (en) 2000-09-13 2002-05-09 Bala Ravi Narayan Software agent for facilitating electronic commerce transactions through display of targeted promotions or coupons
US6671687B1 (en) 2000-09-29 2003-12-30 Ncr Corporation Method and apparatus for protecting data retrieved from a database
WO2002037334A1 (en) 2000-10-30 2002-05-10 Elias Arts Corporation System and method for performing content experience management
US7231606B2 (en) * 2000-10-31 2007-06-12 Software Research, Inc. Method and system for testing websites
US20020083183A1 (en) 2000-11-06 2002-06-27 Sanjay Pujare Conventionally coded application conversion system for streamed delivery and execution
US6766333B1 (en) 2000-11-08 2004-07-20 Citrix Systems, Inc. Method and apparatus for synchronizing a user interface element displayed on a client and a software application component executing on a web server
WO2002044923A1 (en) 2000-11-30 2002-06-06 Webtone Technologies, Inc. Web session collaboration
US7353269B2 (en) * 2000-12-21 2008-04-01 Fujitsu Limited Network monitoring system
US7461369B2 (en) * 2001-03-30 2008-12-02 Bmc Software, Inc. Java application response time analyzer
US7076495B2 (en) 2001-04-26 2006-07-11 International Business Machines Corporation Browser rewind and replay feature for transient messages by periodically capturing screen images
AUPR464601A0 (en) 2001-04-30 2001-05-24 Commonwealth Of Australia, The Shapes vector
US6944660B2 (en) 2001-05-04 2005-09-13 Hewlett-Packard Development Company, L.P. System and method for monitoring browser event activities
US7197559B2 (en) * 2001-05-09 2007-03-27 Mercury Interactive Corporation Transaction breakdown feature to facilitate analysis of end user performance of a server system
US6589980B2 (en) 2001-05-17 2003-07-08 Wyeth Substituted 10,11-benzo[b]fluoren-10-ones as estrogenic agents
CN1647070A (en) 2001-06-22 2005-07-27 诺萨·欧莫贵 System and method for knowledge retrieval, management, delivery and presentation
US7165105B2 (en) 2001-07-16 2007-01-16 Netgenesis Corporation System and method for logical view analysis and visualization of user behavior in a distributed computer network
US20040100507A1 (en) 2001-08-24 2004-05-27 Omri Hayner System and method for capturing browser sessions and user actions
US6877007B1 (en) 2001-10-16 2005-04-05 Anna M. Hentzel Method and apparatus for tracking a user's interaction with a resource supplied by a server computer
US7631035B2 (en) 2002-01-09 2009-12-08 Digital River, Inc. Path-analysis toolbar
GB2384581A (en) 2002-01-25 2003-07-30 Hewlett Packard Co Reusing web session data
US7219138B2 (en) 2002-01-31 2007-05-15 Witness Systems, Inc. Method, apparatus, and system for capturing data exchanged between a server and a user
US20050066037A1 (en) 2002-04-10 2005-03-24 Yu Song Browser session mobility system for multi-platform applications
JP2004029939A (en) 2002-06-21 2004-01-29 Hitachi Ltd Communication proxy device and service providing method using the same device
US7627872B2 (en) 2002-07-26 2009-12-01 Arbitron Inc. Media data usage measurement and reporting systems and methods
US8015259B2 (en) 2002-09-10 2011-09-06 Alan Earl Swahn Multi-window internet search with webpage preload
US8375286B2 (en) 2002-09-19 2013-02-12 Ancestry.com Operations, Inc. Systems and methods for displaying statistical information on a web page
US7716322B2 (en) 2002-09-23 2010-05-11 Alcatel-Lucent Usa Inc. Automatic exploration and testing of dynamic Web sites
US7305470B2 (en) 2003-02-12 2007-12-04 Aol Llc Method for displaying web user's authentication status in a distributed single login network
US7565425B2 (en) 2003-07-02 2009-07-21 Amazon Technologies, Inc. Server architecture and methods for persistently storing and serving event data
US7646762B2 (en) 2003-08-06 2010-01-12 Motorola, Inc. Method and apparatus for providing session data to a subscriber to a multimedia broadcast multicast service
NZ527621A (en) 2003-08-15 2005-08-26 Aspiring Software Ltd Web playlist system, method, and computer program
US7665019B2 (en) 2003-09-26 2010-02-16 Nbor Corporation Method for recording and replaying operations in a computer environment using initial conditions
US7430597B2 (en) 2003-09-30 2008-09-30 Toshiba Corporation System and method for tracking web-based sessions
WO2005045673A2 (en) 2003-11-04 2005-05-19 Kimberly-Clark Worldwide, Inc. Testing tool for complex component based software systems
US20060184410A1 (en) 2003-12-30 2006-08-17 Shankar Ramamurthy System and method for capture of user actions and use of capture data in business processes
US20050198315A1 (en) 2004-02-13 2005-09-08 Wesley Christopher W. Techniques for modifying the behavior of documents delivered over a computer network
US20050188080A1 (en) 2004-02-24 2005-08-25 Covelight Systems, Inc. Methods, systems and computer program products for monitoring user access for a server application
US7584435B2 (en) 2004-03-03 2009-09-01 Omniture, Inc. Web usage overlays for third-party web plug-in content
US7690040B2 (en) 2004-03-10 2010-03-30 Enterasys Networks, Inc. Method for network traffic mirroring with data privacy
US20050216856A1 (en) 2004-03-23 2005-09-29 Matti Michael C System and method for displaying information on an interface device
US8010553B2 (en) 2004-04-05 2011-08-30 George Eagan Knowledge archival and recollection systems and methods
US7278092B2 (en) 2004-04-28 2007-10-02 Amplify, Llc System, method and apparatus for selecting, displaying, managing, tracking and transferring access to content of web pages and other sources
US7467402B2 (en) 2004-08-24 2008-12-16 Whitehat Security, Inc. Automated login session extender for use in security analysis systems
US7325040B2 (en) 2004-08-30 2008-01-29 University Of Utah Research Foundation Locally operated desktop environment for a remote computing system
US20060075088A1 (en) 2004-09-24 2006-04-06 Ming Guo Method and System for Building a Client-Side Stateful Web Application
US20060117055A1 (en) 2004-11-29 2006-06-01 John Doyle Client-based web server application verification and testing system
US7593603B1 (en) 2004-11-30 2009-09-22 Adobe Systems Incorporated Multi-behavior image correction tool
US7975000B2 (en) 2005-01-27 2011-07-05 Fmr Llc A/B testing of a webpage
US20060230105A1 (en) 2005-04-06 2006-10-12 Ericom Software B 2001 Ltd Method of providing a remote desktop session with the same look and feel as a local desktop
US20070027749A1 (en) * 2005-07-27 2007-02-01 Hewlett-Packard Development Company, L.P. Advertisement detection
US7983961B1 (en) * 2005-10-27 2011-07-19 Alvin Chang Methods and apparatus for marketing profiling while preserving user privacy
US20070106692A1 (en) 2005-11-10 2007-05-10 International Business Machines Corporation System and method for recording and replaying a session with a web server without recreating the actual session
US8225376B2 (en) 2006-07-25 2012-07-17 Facebook, Inc. Dynamically generating a privacy summary
US20070174429A1 (en) 2006-01-24 2007-07-26 Citrix Systems, Inc. Methods and servers for establishing a connection between a client system and a virtual machine hosting a requested computing environment
KR100805170B1 (en) 2006-03-09 2008-02-21 엔씨소프트 재팬 가부시키 가이샤 Apparatus and Method for Changing Web Design
US20070226314A1 (en) 2006-03-22 2007-09-27 Sss Research Inc. Server-based systems and methods for enabling interactive, collabortive thin- and no-client image-based applications
US7861176B2 (en) 2006-04-13 2010-12-28 Touchcommerce, Inc. Methods and systems for providing online chat
US20070255754A1 (en) 2006-04-28 2007-11-01 James Gheel Recording, generation, storage and visual presentation of user activity metadata for web page documents
US7908551B2 (en) 2006-06-29 2011-03-15 Google Inc. Dynamically generating customized user interfaces
US8949406B2 (en) 2008-08-14 2015-02-03 International Business Machines Corporation Method and system for communication between a client system and a server system
US8127000B2 (en) 2006-06-30 2012-02-28 Tealeaf Technology, Inc. Method and apparatus for monitoring and synchronizing user interface events with network data
US8583772B2 (en) * 2008-08-14 2013-11-12 International Business Machines Corporation Dynamically configurable session agent
US8868533B2 (en) 2006-06-30 2014-10-21 International Business Machines Corporation Method and apparatus for intelligent capture of document object model events
US20080052377A1 (en) 2006-07-11 2008-02-28 Robert Light Web-Based User-Dependent Customer Service Interaction with Co-Browsing
WO2008024706A2 (en) 2006-08-21 2008-02-28 Crazy Egg, Inc. Visual web page analytics
US7861213B2 (en) 2006-09-05 2010-12-28 Oracle International Corporation Mechanism for developing AJax applications using java swing framework and method for using the same
WO2008039784A2 (en) 2006-09-25 2008-04-03 Compete, Inc. Website analytics
WO2008039870A2 (en) 2006-09-26 2008-04-03 Accoona Corp Apparatuses, methods and systems for an information comparator preview generator
US7941852B2 (en) * 2006-10-04 2011-05-10 Symantec Corporation Detecting an audio/visual threat
US20080209311A1 (en) 2006-12-29 2008-08-28 Alex Agronik On-line digital image editing with wysiwyg transparency
US20080294974A1 (en) 2007-05-24 2008-11-27 Nokia Corporation Webpage history view
US20090013347A1 (en) 2007-06-11 2009-01-08 Gulrukh Ahanger Systems and methods for reporting usage of dynamically inserted and delivered ads
US20090019133A1 (en) 2007-07-13 2009-01-15 Stephen Brimley System, method and computer program for updating a web page in a web browser
US20100211638A1 (en) 2007-07-27 2010-08-19 Goojet Method and device for creating computer applications
US7809525B2 (en) 2007-07-31 2010-10-05 International Business Machines Corporation Automatic configuration of robotic transaction playback through analysis of previously collected traffic patterns
US20090037517A1 (en) 2007-08-02 2009-02-05 Randall Wayne Frei Method and system to share content between web clients
US8042055B2 (en) 2007-08-31 2011-10-18 Tealeaf Technology, Inc. Replaying captured network interactions
US9906549B2 (en) 2007-09-06 2018-02-27 Microsoft Technology Licensing, Llc Proxy engine for custom handling of web content
US8543683B2 (en) 2007-09-26 2013-09-24 Microsoft Corporation Remote monitoring of local behavior of network applications
US9063979B2 (en) 2007-11-01 2015-06-23 Ebay, Inc. Analyzing event streams of user sessions
US20090132957A1 (en) 2007-11-21 2009-05-21 Sharp Laboratories Of America Inc. System and method for selecting thumbnails in a multi-page document
US20090138554A1 (en) 2007-11-26 2009-05-28 Giuseppe Longobardi Controlling virtual meetings with a feedback history
US20090247193A1 (en) 2008-03-26 2009-10-01 Umber Systems System and Method for Creating Anonymous User Profiles from a Mobile Data Network
US20090254529A1 (en) 2008-04-04 2009-10-08 Lev Goldentouch Systems, methods and computer program products for content management
US20100007603A1 (en) 2008-07-14 2010-01-14 Sony Ericsson Mobile Communications Ab Method and apparatus for controlling display orientation
US20100058285A1 (en) 2008-08-28 2010-03-04 Microsoft Corporation Compositional view of imperative object model
US20100070929A1 (en) 2008-09-12 2010-03-18 International Business Machines Corporation Method for Automatically Constructing Pageflows by Analysing Traversed Breadcrumbs
US20100169792A1 (en) 2008-12-29 2010-07-01 Seif Ascar Web and visual content interaction analytics
US7962547B2 (en) 2009-01-08 2011-06-14 International Business Machines Corporation Method for server-side logging of client browser state through markup language
US8266673B2 (en) * 2009-03-12 2012-09-11 At&T Mobility Ii Llc Policy-based privacy protection in converged communication networks
US8554630B2 (en) 2009-03-20 2013-10-08 Ad-Vantage Networks, Llc Methods and systems for processing and displaying content
US8930818B2 (en) 2009-03-31 2015-01-06 International Business Machines Corporation Visualization of website analytics
US9934320B2 (en) 2009-03-31 2018-04-03 International Business Machines Corporation Method and apparatus for using proxy objects on webpage overlays to provide alternative webpage actions
US20100268694A1 (en) * 2009-04-17 2010-10-21 Laurent Denoue System and method for sharing web applications
US8751628B2 (en) 2009-05-05 2014-06-10 Suboti, Llc System and method for processing user interface events
WO2010141748A1 (en) 2009-06-03 2010-12-09 The Sagecos Group, Inc. Using layers to construct graphical user interfaces
US9215212B2 (en) * 2009-06-22 2015-12-15 Citrix Systems, Inc. Systems and methods for providing a visualizer for rules of an application firewall
US7627648B1 (en) 2009-07-23 2009-12-01 Yahoo! Inc. Customizing and storing web pages for mobile on web
US8634390B2 (en) 2009-10-14 2014-01-21 Verizon Patent And Licensing Inc. Systems and methods for wireless local area network based control of a set-top box device
US8433733B2 (en) 2010-01-13 2013-04-30 Vmware, Inc. Web application record-replay system and method
CN101827102B (en) 2010-04-20 2013-01-30 中国人民解放军理工大学指挥自动化学院 Data prevention method based on content filtering
US8381088B2 (en) 2010-06-22 2013-02-19 Microsoft Corporation Flagging, capturing and generating task list items
US8533532B2 (en) * 2010-06-23 2013-09-10 International Business Machines Corporation System identifying and inferring web session events
JP5325169B2 (en) 2010-06-25 2013-10-23 Hitachi, Ltd. Web application operation reproduction method and system
US9304591B2 (en) 2010-08-10 2016-04-05 Lenovo (Singapore) Pte. Ltd. Gesture control
WO2012020868A1 (en) 2010-08-13 2012-02-16 LG Electronics Inc. Mobile terminal, display device and method for controlling same
US8856688B2 (en) 2010-10-11 2014-10-07 Facebook, Inc. Pinch gesture to navigate application layers
US8881031B2 (en) 2010-11-22 2014-11-04 Ayu Technology Solutions Llc Systems and methods for facilitating media connections
US20120157122A1 (en) 2010-12-17 2012-06-21 Research In Motion Limited Mobile communication device for retrieving geospatial data
CN103384864B (en) 2011-02-22 2016-12-14 International Business Machines Corporation Method and system for rendering content
CN102694942B (en) 2011-03-23 2015-07-15 Toshiba Corporation Image processing apparatus, method for displaying operation manner, and method for displaying screen
CN102254120B (en) * 2011-08-09 2014-05-21 Huawei Digital Technologies (Chengdu) Co., Ltd. Method, system and related device for detecting malicious code
US20130069947A1 (en) 2011-09-20 2013-03-21 International Business Machines Corporation Calculating display settings and accurately rendering an object on a display
US8997017B2 (en) 2011-10-21 2015-03-31 International Business Machines Corporation Controlling interactions via overlaid windows
WO2013075324A1 (en) 2011-11-25 2013-05-30 Microsoft Corporation Image attractiveness based indexing and searching
US8976955B2 (en) 2011-11-28 2015-03-10 Nice-Systems Ltd. System and method for tracking web interactions with real time analytics
US20140115506A1 (en) 2012-02-29 2014-04-24 William Brandon George Systems And Methods For Measurement Of User Interface Actions
US8832846B2 (en) * 2012-05-11 2014-09-09 Verizon Patent And Licensing Inc. Methods and systems for providing a notification of a compliance level of an application with respect to a privacy profile associated with a user
US9113033B2 (en) 2012-08-28 2015-08-18 Microsoft Technology Licensing, Llc Mobile video conferencing with digital annotation
US9147058B2 (en) 2012-10-12 2015-09-29 Apple Inc. Gesture entry techniques
US9635094B2 (en) 2012-10-15 2017-04-25 International Business Machines Corporation Capturing and replaying application sessions using resource files
US9536108B2 (en) * 2012-10-23 2017-01-03 International Business Machines Corporation Method and apparatus for generating privacy profiles
US9535720B2 (en) 2012-11-13 2017-01-03 International Business Machines Corporation System for capturing and replaying screen gestures
US10474735B2 (en) 2012-11-19 2019-11-12 Acoustic, L.P. Dynamic zooming of content with overlays
AU2015360683B2 (en) 2014-12-08 2019-11-21 Rehabilitation Institute Of Chicago Powered and passive assistive device and related methods
US10304084B2 (en) * 2016-07-06 2019-05-28 Hiro Media Ltd. Real-time monitoring of ads inserted in real-time into a web page

Also Published As

Publication number Publication date
US10474840B2 (en) 2019-11-12
US9536108B2 (en) 2017-01-03
US20140115712A1 (en) 2014-04-24
CN103825774B (en) 2017-08-15
US20160162704A1 (en) 2016-06-09
CN103825774A (en) 2014-05-28

Similar Documents

Publication Publication Date Title
US20200065525A1 (en) Method and apparatus for generating privacy profiles
US20240037225A1 (en) Systems and methods for detecting resources responsible for events
US11570209B2 (en) Detecting and mitigating attacks using forged authentication objects within a domain
CN110798472B (en) Data leakage detection method and device
US8832265B2 (en) Automated analysis system for modeling online business behavior and detecting outliers
WO2015043491A1 (en) Method and system for performing security verification on login of internet account
US9105035B2 (en) Method and apparatus for customer experience segmentation based on a web session event variation
US20140325661A1 (en) Systems, methods, apparatuses, and computer program products for forensic monitoring
CA2934627C (en) Communications security
CN106126388A (en) Method for monitoring events, rule engine device and rule engine system
CN104573904A (en) Data visualization system for monitoring user and software behaviors during network transactions
WO2015047922A1 (en) Automated risk tracking through compliance testing
Massa et al. A fraud detection system based on anomaly intrusion detection systems for e-commerce applications
US20230262081A1 (en) Intelligent alert customization in a backup and recovery activity monitoring system
CN111371643B (en) Authentication method, device and storage medium
US9723017B1 (en) Method, apparatus and computer program product for detecting risky communications
US12105808B2 (en) Automated trust center for real-time security and compliance monitoring
CN114024867B (en) Network anomaly detection method and device
US9261951B2 (en) Systems and methods for managing security data
US11743274B2 (en) Systems and methods for fraud management
RU2693646C1 (en) Method and system for selecting offers for a user based on analysis of the user's actions
CN116723047A (en) Flow restriction processing method, device, electronic equipment and medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: ACOUSTIC, L.P., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:050909/0606

Effective date: 20190628

Owner name: TEALEAF TECHNOLOGY, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:POWELL, TRAVIS SPENCE;CASPI, NADAV;WENIG, ROBERT I.;AND OTHERS;REEL/FRAME:050909/0585

Effective date: 20121022

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TEALEAF TECHNOLOGY, INC.;REEL/FRAME:050926/0498

Effective date: 20130531

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION