
US20160080406A1 - Detecting anomalous activity from accounts of an online service - Google Patents


Info

Publication number
US20160080406A1
Authority
US
United States
Prior art keywords
events
profile
recent
accounts
online service
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/945,010
Inventor
Art Sadovsky
Rustam Lalkaka
Vivek Sharma
Rajmohan Rajagopalan
Alexander MacLeod
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Priority to US14/945,010
Assigned to MICROSOFT CORPORATION. Assignment of assignors' interest (see document for details). Assignors: SHARMA, VIVEK; SADOVSKY, ART; LALKAKA, RUSTAM; MACLEOD, ALEXANDER; RAJAGOPALAN, RAJMOHAN
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC. Assignment of assignors' interest (see document for details). Assignor: MICROSOFT CORPORATION
Publication of US20160080406A1
Legal status: Abandoned


Classifications

    • H04L 63/1408: Network architectures or network communication protocols for network security; detecting or protecting against malicious traffic by monitoring network traffic
    • H04L 63/1416: Event detection, e.g. attack signature detection
    • H04L 63/1425: Traffic logging, e.g. anomaly detection
    • H04L 67/22: Tracking the activity of the user (legacy code; see H04L 67/535)
    • H04L 67/306: Network services; user profiles
    • H04L 67/535: Network services; tracking the activity of the user
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI], based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06Q 20/4016: Transaction verification involving fraud or risk level assessment in transaction processing
    • H04W 12/12: Security arrangements; detection or prevention of fraud

Definitions

  • Anomaly detection is used for determining when patterns in data do not match an expected pattern. For example, credit card companies may use anomaly detection to help detect fraudulent activity relating to a customer's credit card.
  • Detecting anomalous activity that is associated with an online service can be challenging and time consuming. For example, there is typically an extremely large amount of data relating to the operation of the online service that may need to be analyzed. Instead of processing this large amount of data, many online services detect anomalous activity by determining when a predefined event occurs on a single machine or a few machines of the online service. For example, an online service may create a rule that detects anomalous activity when network traffic exceeds some predetermined level or when a large number of processes are started in a short time period.
  • Anomalous activity is detected using event information that is monitored from accounts performing activities within an online service.
  • anomalous activity is detected by determining when a baseline profile that represents “normal” activity for the online service is different from a recent profile that represents the “current” activity from accounts in the online service.
  • anomalous activity may be detected when an event, such as a create account event, is occurring more frequently in the recent profile as compared to the occurrence of the create account event in the baseline profile.
  • the event information that is used in detecting anomalous activity may include all or a portion of events that are monitored by the online service.
  • the events used for anomaly detection may include all or a portion of security events (e.g., any event that changes permissions, such as creating accounts, changing permissions of one or more accounts, logging into the online service or logging out of the online service . . . ) as well as other types of events (e.g., system events, hardware events . . . ).
  • An authorized user may configure the events to monitor and/or the events for one or more event categories may be automatically selected.
  • the accounts that are monitored for anomalous activity may include all of or a portion of the accounts of the online service.
  • the accounts that are monitored for anomalous activity may be operator accounts (e.g., accounts that have permissions to create, modify, and delete accounts for other users or groups of users) or other types of accounts (e.g., user accounts, privileged accounts, and the like).
  • different activities may be performed. For example, an account may be blocked from performing operations, an account may be locked, one or more reports may be automatically generated and provided to one or more users to show activity that may be considered anomalous activity, and the like. Different types of reports may be generated. For example, one report may rank the accounts based on the level of anomalous activity detected whereas another report may provide more detailed information for one or more of the accounts.
  • FIG. 1 shows an overview of a system for detecting anomalous activity from accounts of an online service
  • FIG. 2 shows a more detailed system for detecting anomalous activity from accounts of an online service
  • FIG. 3 shows different profiles including event information and weighting of events used in detecting anomalous activity from accounts within an online service
  • FIG. 4 shows example Graphical User Interfaces for viewing and configuring events relating to anomaly detection
  • FIG. 5 shows exemplary reports showing detected anomalous activity for one or more accounts within the online service
  • FIG. 6 illustrates a process for detecting anomalous activity in an online service by comparing a baseline profile to a recent profile
  • FIG. 7 illustrates a process for configuring and storing event information within a baseline profile and a recent profile
  • FIG. 8 illustrates an exemplary online system for detecting anomalous activity
  • FIGS. 9, 10A, 10B and 11 and the associated descriptions provide a discussion of a variety of operating environments in which embodiments of the invention may be practiced.
  • online services typically examine a small subset of the events that are logged from a single machine (or a small number of machines) in the service in order to detect anomalous activity. For example, instead of examining the events from each machine in an online service, the online service selects one or two machines to monitor for anomalous activity.
  • An online service may also establish hard coded rules to detect anomalous activity. For example, an online service may create a rule to identify a specific type of anomalous activity.
  • an anomaly detector instead of creating individual rules to detect anomalous activity, automatically detects anomalous activity based on events that are monitored and that are obtained from the different computing devices and accounts in the online service.
  • Anomalous activity may be detected using any number of monitored events from accounts of the online system without creating individual rules. Instead of looking for a predetermined event or condition to occur in order to determine when a specific type of anomalous activity is occurring, anomalous activity is detected by comparing a frequency of events that represent the “normal” behavior for the online service with the frequency of the events that occur during a recent period of time in the online service. When the frequency of the events between the baseline profile and the recent profile are different, anomalous activity may be indicated. This anomalous activity is then provided to one or more users in the form of a report. According to an embodiment, a report is delivered that includes information related to the anomalous activity detected in the online service.
  • FIG. 1 shows an overview of a system for detecting anomalous activity from accounts of an online service.
  • system 100 includes online service 110, anomaly detector 26, baseline profile 155, recent profile 165, display 115 and display 125.
  • Anomaly detector 26 is configured to detect anomalous activity that is occurring from accounts that are associated with online service 110 .
  • Anomalous activity is activity that deviates from the expected or normal behavior of the online service.
  • anomalous activity may be detected by anomaly detector 26 in response to an unusually large number of requests from accounts in the online service to: create new accounts; change permissions; start processes; and the like.
  • anomaly detector 26 determines when a frequency of events that is stored in baseline profile 155 deviates from the frequency of the events in recent profile 165 .
  • baseline profile 155 may indicate that during normal operation of online service 110, two accounts are created in a typical day.
  • when recent profile 165 shows that substantially more accounts than this have recently been created, anomaly detector 26 determines that anomalous activity may be occurring in online service 110 and generates anomaly report 130 that is shown displayed on display 125.
  • anomaly report 130 shows a message that includes information showing the anomalous activity (e.g., “Account 1 created 10 new accounts”) and also shows the normal activity (e.g., “Creating 2 accounts is normal activity”).
  • the report may be provided to one or more users to show activity that may be considered anomalous activity. Different reports may be generated. For example, one report may rank the accounts based on anomalous activity whereas another report provides more detailed information for one or more accounts
  • Anomaly detector 26 may use all or a portion of monitored (e.g., logged) events of online service in order to detect anomalous activity. Instead of monitoring just one or two different events that occur on a single computing device in the online service and determining when a predetermined condition is met, anomaly detector 26 may use any number of the logged events from any number of accounts within the online service to detect anomalous activity. For example, anomaly detector may use each of the logged security events that occur within the online service (e.g., any event that changes permissions, such as creating accounts, changing permissions of one or more accounts, . . . ) in order to detect anomalous activity.
  • an authorized user may configure the events that are used by anomaly detector 26 to detect anomalous activity.
  • a Graphical User Interface (GUI) for event configuration 117 is displayed and is configured to receive event selection and configuration information from a user. For example, the user may select all or a portion of the events that are monitored by the online service for one or more accounts.
  • display 115 shows that the authorized user has selected to use change permission events and add account events but not to use login events when detecting anomalous activity.
  • Other configuration information may also be received, such as, but not limited to, weighting information, aggregation profiles, and the like (see the FIGURES and related discussion below).
  • anomaly detector 26 uses a default algorithm weighting process. According to an embodiment, the default algorithm weighting process assigns events with a higher weight (that is, are more anomalous) the more rare the events are determined to occur based on the occurrence of the events in the baseline profile.
  • Online service 110 may be a cloud based and/or enterprise based service that may be configured to provide services, such as productivity services (e.g., messaging, collaboration, spreadsheets, documents, presentations, charts, and the like). More details regarding anomaly detection are provided below.
  • FIG. 2 shows a more detailed system 200 for detecting anomalous activity from accounts of an online service.
  • system 200 includes application 262, application 272, tablet computing device 260, computing device 270 and online service 110 that includes anomaly detector 26, baseline profiles 210, recent profiles 220, accounts 230, computing devices 240 and log 250.
  • anomaly detector 26 is configured to detect anomalous activity by using monitored event information from accounts 230 that are associated with online service 110 .
  • the accounts that are monitored within the online service for anomalous activity may be operator accounts that have permissions to create, modify, and delete accounts for other users or groups of users.
  • Other accounts, such as user accounts, may also be monitored for anomalous activity.
  • anomaly detector 26 detects anomalous activity by comparing frequencies of past events that are included in baseline profiles 210 with the frequencies of recent events that are included in recent profiles 220 .
  • Anomalous activity may be detected by anomaly detector 26 when the recent event information for an account shows that one or more events are occurring more frequently as compared to the occurrence of the events as recorded in a baseline profile.
  • Anomaly detector 26 may use various methods in determining when anomalous activity is occurring within online service 110.
  • anomaly detector 26 creates a frequency profile for each account or group of accounts that is associated with a profile.
  • Anomaly detector 26 compares the frequency profile for the events that are associated with the baseline profile with the frequency profile for the events that are associated with the recent profiles. For example, when the frequency profile that is associated with the recent profile indicates that one or more events are occurring more frequently as compared to during “normal” operation as indicated by the baseline profile, anomalous activity is detected.
  • a frequency is determined for each event that is included in a profile.
  • the baseline profile shows that event 1 accounts for 10% of the events, event 2 accounts for 25% of the overall events, event 3 accounts for 50% of the events, and event 4 accounts for 15% of the events.
  • the recent profile shows that event 1 accounts for 20% of the events, event 2 accounts for 15% of the overall events, event 3 accounts for 50% of the events, and event 4 accounts for 15% of the events.
  • anomaly detector 26 detects that anomalous activity is occurring since the frequency of event 1 is larger in the recent profile as compared to the baseline profile.
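  • The comparison just described can be sketched in a few lines of code. The following is a minimal Python sketch, not the patent's implementation; all function and variable names are hypothetical. It derives a frequency profile from raw event records and flags any event whose share of recent activity exceeds its share in the baseline.

```python
from collections import Counter

def frequency_profile(events):
    """Map each event type to its fraction of all observed events."""
    counts = Counter(events)
    total = sum(counts.values())
    return {event: count / total for event, count in counts.items()}

def anomalous_events(baseline, recent):
    """Return events occurring more frequently in the recent profile
    than in the baseline profile, with both frequencies."""
    return {event: (baseline.get(event, 0.0), freq)
            for event, freq in recent.items()
            if freq > baseline.get(event, 0.0)}

# The example frequencies from the description above:
baseline = {"event 1": 0.10, "event 2": 0.25, "event 3": 0.50, "event 4": 0.15}
recent   = {"event 1": 0.20, "event 2": 0.15, "event 3": 0.50, "event 4": 0.15}
print(anomalous_events(baseline, recent))   # {'event 1': (0.1, 0.2)}
```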
  • Other methods of detecting anomalous activity from accounts in the online service may be used. For example, other statistical methods may be used, such as: comparing a number of times each event occurs within a predetermined time period; aggregating the frequencies of some events and comparing the aggregated frequencies; or some other statistical method.
  • different events may be assigned different weights such that an increase in an occurrence of a more heavily weighted event within the monitored accounts indicates anomalous activity before an increase in an occurrence of an event that is not as heavily weighted.
  • events that occur less frequently may automatically or manually be assigned a higher weight such that an increase in the occurrence of the event will more quickly indicate anomalous activity.
  • login events may be set to a lower weight as compared to creating account events since login events are generally common events in an online service.
  • the weights may be assigned automatically and/or manually.
  • Upon detection of anomalous activity by anomaly detector 26, different actions may be performed. For example, one or more reports may be automatically created and delivered (e.g., anomaly report 265), one or more accounts may be locked such that future activity is stopped, certain actions may be prevented from occurring within the online service, and the like.
  • a report may be generated and provided to one or more users to show activity that may be considered anomalous activity. Different reports may be generated. For example, anomaly report 265 ranks the accounts based on detected anomalous activity whereas another report provides more detailed information for one or more accounts (see FIG. 5 and related discussion).
  • Baseline profiles 210 include one or more baseline profiles that include event information that relates to the “normal” or “typical” operation of online service 110 .
  • Each baseline profile 210 includes event information for a predetermined period of time of the online service (e.g., one week, two weeks, one month, two months . . . ).
  • the baseline profiles 210 may be updated according to a predetermined schedule (e.g., daily, weekly . . . ) or using some other method (e.g., upon request).
  • an authorized user may configure the event information that is included in the baseline profile, the period of time for the event information that is included in the baseline profile, as well as an update schedule. For example, the authorized user may configure the baseline profile to store the last month of event information for one or more different types of accounts of the online service and be updated daily.
  • Recent profiles 220 include one or more recent profiles that include event information that relates to events that occur in a recent period of time of online service 110 .
  • a recent profile may include event information for one or more accounts that was generated by activity that occurred within the last day of the service, a last number of hours within the online service, and the like.
  • the event information that is included in a recent profile is more recent event information as compared to the event information that is included in the baseline profile(s) 210 .
  • the recent event information may be used to update the baseline profiles.
  • the baseline profiles may be updated with recent event information each day or some other period of time. In this way, the baseline profiles may include event information from a rolling window of time (e.g., the last month, two months, three months . . . ) of events that are generated by accounts in the online service 110 .
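  • One plausible way to implement the rolling window just described is to keep per-day event counts and let the oldest day expire as each new day is added; updating the baseline each day then amounts to folding in that day's logged events. This is a hedged sketch; the patent does not prescribe a data structure, and all names below are illustrative.

```python
from collections import Counter, deque
from datetime import date

class RollingBaseline:
    """Aggregate event counts over a fixed window of recent days."""

    def __init__(self, window_days=30):
        # deque(maxlen=...) silently drops the oldest day when full
        self.days = deque(maxlen=window_days)

    def add_day(self, day: date, events):
        """Fold one day of raw events into the window."""
        self.days.append((day, Counter(events)))

    def counts(self):
        """Current baseline counts across the whole window."""
        total = Counter()
        for _, day_counts in self.days:
            total += day_counts
        return total
```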
  • Log 250 is configured to store event information that is generated by one or more accounts of online service 110 .
  • log 250 stores records of events, including security events that are generated by accounts in the online service.
  • log 250 includes records of login/logout activity and other security-related events that are generated by one or more accounts 230 that occurred on one or more computing devices 240 that are associated with online service 110 .
  • the log data that is stored in log 250 includes information that is specified by a system's audit policy. Generally, an authorized user may configure the system to record different events and types of events in log 250.
  • events may be logged and used by anomaly detector when detecting anomalous activity in the online service.
  • Some example categories of events that may be logged include: account logon events, account management events, directory service access events, object access events, policy changes, privileged operations, system events, and the like. For example, each time an object is created, accessed, changed or deleted; an event may be generated and logged. A list of more detailed events that may be logged and used in the detection of anomalous activity may be found in the discussion of FIG. 8 .
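  • The logged records described above carry at least a type, a category, a time, the originating account, and a result. A hedged sketch of such a record and of an audit-policy filter follows; the field and category names are illustrative and are not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LogEvent:
    event_type: str      # e.g., "account_created", "process_start"
    category: str        # e.g., "account_management", "object_access"
    timestamp: datetime  # when the event occurred
    account: str         # which account generated the event
    result: str          # e.g., "success", "failure"

# The audit policy determines which categories are recorded in the log.
AUDIT_POLICY = {"account_logon", "account_management",
                "object_access", "policy_change", "system"}

def should_log(event: LogEvent) -> bool:
    """Record the event only if its category is enabled by the policy."""
    return event.category in AUDIT_POLICY
```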
  • FIG. 3 shows different profiles including event information and weighting of events used in detecting anomalous activity from accounts within an online service.
  • each profile includes event information for all or a portion of the events that are monitored and/or logged that originate from accounts of the online service.
  • the following profiles are illustrated for exemplary purposes and are not intended to be limiting. While each profile illustrated shows four different events, more or fewer events may be included in a profile. For example, a profile may include hundreds or thousands of different types of events.
  • Profile 310 shows an aggregate baseline profile for a number of accounts of the online service.
  • profile 310 may include event information that is generated from many different types of accounts that are within an online service or may include all or a portion of a specific type of account.
  • an aggregate profile is created that includes each operator account of the online service.
  • operator account refers to accounts that have permissions to create, modify, and delete accounts for other users or groups of users.
  • Other types of accounts within the online service may also be included (e.g., user accounts).
  • profile 310 includes event information for four different events (Event 1, Event 2, Event 3 and Event 4). While four events are shown, there may be fewer events or more events that are included to detect anomalous activity.
  • the events may include all or a portion of security events (See discussion in FIG. 8 for a more detailed list of security events) as well as any other event that may be monitored by the online service (e.g., machine actions, user actions, and the like).
  • Each event that is included in a profile includes different event information.
  • the event information may include information, such as: a type of the event; a time the event occurred; what account generated the event; a result of the event; and the like.
  • an event may be a process start event that was started by account 2 at 11:06 AM that resulted in a specific process being started.
  • the profile may include a number of times each event that is monitored occurs within a specified period of time (e.g. within a day, a week, . . . ).
  • event 1 occurs 20% of the time
  • event 2 occurs 5% of the time
  • event 3 occurs 50% of the time
  • event 4 occurs 25% of the time.
  • events that occur more frequently are typically common events such as logging into the service, logging off of the service, and the like.
  • each event may be weighted the same or differently from other events that are being monitored and used in detecting any anomalous activity.
  • each event may be weighted to a same value, such as 1, or may be weighted using some other criteria.
  • the weighting is based on the frequency of the event within a profile. For example, less frequently occurring events would be weighted more heavily as compared to more frequently occurring events. Events that typically occur less often, such as creating an account, may be considered more indicative of anomalous activity as compared to common events, such as login or logoff events.
  • the relative weight for each event is automatically determined by dividing the percentage of the highest occurring event by the percentage of the occurrence of each event in the profile.
  • each weight may be based on a type of event and/or each weight may be manually configured by an authorized user (See FIG. 4 for an example Graphical User Interface for configuring weights). The weights may be configured manually or automatically.
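  • As a concrete illustration of the automatic weighting described above: dividing the share of the most frequent event by each event's own share gives rare events proportionally larger weights. The sketch below is hypothetical; the exponent parameter reflects the configurable calculation (e.g., the square of the inverse frequency) mentioned in the discussion of FIG. 4 below.

```python
def event_weights(profile, exponent=1.0):
    """Weight each event by (highest share / its own share) ** exponent.

    exponent=1.0 matches the division described above; exponent=2.0
    corresponds to a squared-inverse-frequency variant."""
    highest = max(profile.values())
    return {event: (highest / share) ** exponent
            for event, share in profile.items() if share > 0}

# Using the aggregate baseline shares from profile 310 above:
profile_310 = {"event 1": 0.20, "event 2": 0.05, "event 3": 0.50, "event 4": 0.25}
print(event_weights(profile_310))
# {'event 1': 2.5, 'event 2': 10.0, 'event 3': 1.0, 'event 4': 2.0}
```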
  • Baseline profile 340 is an exemplary baseline profile for a single account of the online service.
  • each account includes a baseline profile that may be used to determine the “normal” behavior for an account.
  • baseline profile 340 includes the same event information as aggregate baseline profile 310 .
  • the frequency of the occurrence of the events, however, for baseline profile 340 may be different as compared to the aggregate baseline profile 310 .
  • the percentage of occurrence for event 1 and event 3 are the same, but the percentages for event 2 and event 4 are different.
  • the percentage of occurrence for event 2 is greater in the baseline profile 340 as compared to the occurrence for event 2 in the baseline profile 310 (10% compared to 5%).
  • the percentage of occurrence for event 4 is less in the baseline profile 340 as compared to the occurrence for event 4 in the baseline profile 310 (20% compared to 25%).
  • anomalous activity may be detected using baseline profiles relating to a single account and/or detected using one or more aggregate baseline profiles.
  • a newly created account would not have an established baseline profile.
  • the recent profile for the newly created account may be compared against an aggregate profile or a different account in order to detect anomalous activity.
  • Recent profile 370 is an exemplary recent profile for a single account of the online service.
  • a recent profile may be created for a single account and/or an aggregation of accounts.
  • recent profile 370 includes the same event information as aggregate baseline profile 310 .
  • the occurrence of the events for recent profile 370 is different as compared to both the aggregate baseline profile 310 as well as baseline profile 340. Comparing recent profile 370 to either of the baseline profiles (310, 340) indicates anomalous activity is occurring for the account. For example, the percentage of occurrence of event 2 in the recent profile is 2 times greater than baseline profile 340 indicates and 4 times greater than the aggregate baseline profile 310 indicates.
  • FIG. 4 shows example Graphical User Interfaces (GUIs) for viewing and configuring events relating to anomaly detection.
  • the GUIs shown in FIG. 4 are for illustrative purposes and are not intended to be limiting.
  • the first GUI that is shown is an event configuration 410 display that shows an exemplary GUI for selecting and configuring events to be monitored for anomaly detection for an online service.
  • The event categories 415 section shows different categories of events that may be selected for monitoring. While four event categories are shown, many other event categories may be displayed. According to an embodiment, each event and category of event that is monitored by the online service may be selected for use in anomaly detection.
  • the event categories that are shown in event configuration 410 include account logon events, account management events, object access events and policy change events. As illustrated, user 430 is selecting the account management events category.
  • when an event category, such as the account management events category, is selected, a more detailed view of each of the events that are included in that event category is displayed.
  • account management events 420 shows the different events that are contained within the selected account management events category.
  • a user may select or deselect the individual events of the selected event category to include in anomaly detection. For example, a user may select the account management event category and then deselect a couple of the events from being used in the anomaly detection.
  • Display 440 shows an exemplary display including an event picker 442 that displays a list of different events that may be configured and used in anomaly detection.
  • user 450 has selected Event 2.
  • a more detailed view 460 of configuration is displayed.
  • Event 2 is shown to be set to “On” indicating that Event 2 is being monitored for anomaly detection.
  • a user may also turn off Event 2 by selecting the On/Off user interface element.
  • a weight of “0.3” has also been set for Event 2.
  • an authorized user may configure the weighting of different events such that different events are given more weight in detecting anomalous activity.
  • a default weight for an event may be set.
  • the specific calculation may also be configurable. For example, the weight may be the square of the inverse frequency, or some other calculation related to the inverse.
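  • The configuration captured by this GUI could be modeled as a small per-event record. The sketch below is hypothetical; it simply mirrors the On/Off toggle and the weight shown in view 460 above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventConfig:
    name: str
    monitored: bool = True          # the On/Off toggle in view 460
    weight: Optional[float] = None  # None means "use the default weighting"

# Mirroring the example above: Event 2 is On with a weight of 0.3.
event_2 = EventConfig(name="Event 2", monitored=True, weight=0.3)
```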
  • FIG. 5 shows exemplary reports showing detected anomalous activity for one or more accounts within the online service.
  • Display 510 shows an example report that indicates possible anomalous activity of different accounts.
  • Anomaly report 512 is illustrated for exemplary purposes and is not intended to be limiting. Many other types of reports showing anomalous activity may be created and displayed. As illustrated, report 512 lists a portion of the accounts being monitored for anomalous activity, ordered from highest detected anomalous activity to lowest. Anomaly report 512 shows only the accounts for which anomalous activity has been detected; accounts that are operating normally, and for which anomalous activity is not detected, are not shown in anomaly report 512. According to an embodiment, any account that has a value above some predefined number (threshold) has potential anomalous activity.
  • a value above the predefined number indicates that at least one event that is being monitored is currently occurring more frequently as compared to the frequency of occurrence that is included in the baseline profile.
  • a user may select one of the displayed accounts to obtain more information relating to the detected anomalous activity. As shown, user 520 selects Account 10 from anomaly report 512 to obtain a more detailed view of the account as shown in display 540 .
  • Display 540 illustrates an example of more detailed information that may be displayed for an account in which anomalous activity is detected. As illustrated, the more detailed information shows the events that indicate anomalous activity as well as the baseline profile for the events. In display 540 , two different events being monitored are showing anomalous activity. In the current example, Event 2 is occurring five times more than what is considered normal (0.5 as compared to 0.1). Event 3 is occurring twice as often as normal event occurrence (0.2 compared to 0.1).
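  • A simple way to produce both the ranking in anomaly report 512 and the per-event drill-down in display 540 is to give each account a score that sums how far each event's recent frequency exceeds its baseline frequency. The patent does not define a specific scoring formula, so the sketch below is an assumption, with hypothetical names throughout.

```python
def account_score(baseline, recent, weights=None):
    """Sum the (optionally weighted) excess of recent over baseline frequency."""
    weights = weights or {}
    return sum(weights.get(event, 1.0) *
               max(0.0, freq - baseline.get(event, 0.0))
               for event, freq in recent.items())

def anomaly_report(profiles, threshold=0.0):
    """Rank accounts scoring above the threshold, highest first."""
    scored = [(account, account_score(base, rec))
              for account, (base, rec) in profiles.items()]
    return sorted([(a, s) for a, s in scored if s > threshold],
                  key=lambda pair: pair[1], reverse=True)

# Account 10 from display 540: event 2 at 0.5 vs. 0.1, event 3 at 0.2 vs. 0.1
profiles = {"Account 10": ({"event 2": 0.1, "event 3": 0.1},
                           {"event 2": 0.5, "event 3": 0.2})}
print(anomaly_report(profiles))   # [('Account 10', 0.5)]
```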
  • FIGS. 6 and 7 illustrate processes for anomaly detection using account information from an online service.
  • the logical operations of various embodiments are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system.
  • the implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention.
  • the logical operations illustrated and making up the embodiments described herein are referred to variously as operations, structural devices, acts or modules.
  • These operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, or any combination thereof. While the operations are shown in a particular order, the order of the operations may change or the operations may be performed in parallel, depending on the implementation.
  • FIG. 6 illustrates a process 600 for detecting anomalous activity in an online service by comparing a baseline profile to a recent profile.
  • the baseline profile may be a baseline profile for a single account or may be an aggregated baseline profile that includes event information from more than one account of the online service.
  • each tenant within an online service may have a separate baseline profile for each account that is monitored for anomalous activity.
  • the baseline profile includes event information that is generated from one or more accounts and indicates the “normal” operation of an account or accounts.
  • the profiles include event information that relates to security events that are associated with one or more accounts of the online service.
  • the event information may be configured by an authorized user and/or automatically selected based on the monitored events of the online system.
  • the online service may automatically select all or a portion of the monitored or logged events in the online service.
  • the recent profile includes event information that is obtained from a recent period of time.
  • a recent period of time may include a time period such as, but not limited to, the last hour, two hours, three hours, four hours, eight hours, day, and the like.
  • the recent profile includes event information from one or more accounts for the last six hours.
  • the recent profile may include a single account of the online service and/or an aggregate of accounts of the online service.
  • the baseline profile is compared to the recent profile to detect anomalous activity of the online service.
  • the comparison of the recent profile with the baseline profile may be performed using different methods.
  • the comparison includes determining when the frequency of one or more of the events being monitored is different between the baseline profile and the recent profile.
  • a weighting factor is used to adjust the importance of an occurrence of an event. For example, an increase in the number of accounts being created (a more heavily weighed event) for an account may be considered more of an indication of anomalous activity as compared to an increase in the number of windows being opened (a less heavily weighted event) by an account.
  • an action may be performed based on the detected anomalous activity.
  • One or more actions may occur.
  • the actions that are performed relate to notifying one or more users of the detected anomalous activity and/or attempting to stop or limit the detected anomalous activity.
  • one or more reports may be created and delivered (operation 650), one or more accounts may be locked such that future anomalous activity from the one or more accounts is stopped, certain actions may be prevented from occurring from one or more accounts, and the like.
  • anomalous activity that is detected is reported.
  • a summary report may be created and delivered that ranks each account according to its detected anomalous activity.
  • the accounts are ranked according to a level of detected anomalous activity for the account.
  • a more detailed report for each account may also be created for delivery and/or viewing.
  • the more detailed report may be accessed by selecting the account from within the summary report of the anomalous activity.
  • the process then flows to an end operation and returns to processing other actions.
  • FIG. 7 illustrates a process 700 for configuring and storing event information within a baseline profile and a recent profile.
  • the process moves to operation 710 , where the events to monitor and use in anomaly detection are determined.
  • the determination may be automatically and/or manually performed. For example, instead of manually selecting each event to monitor, each event that is currently being monitored and logged may be used in detecting anomalous activity.
  • an authorized user may select the events to be monitored. This selection may be in place of or in addition to the events that are currently being used in the anomaly detection.
  • a GUI is displayed that allows the authorized user to select the events to monitor and use in the anomaly detection.
  • the events that are determined to be used in the anomaly detection for the online service are configured.
  • the configuration may include manual configuration as well as automatic configuration.
  • an authorized user may set weightings to associate with the different events.
  • the weightings for the events that are being monitored are automatically set to different values based on the percentage of times the event occurs.
  • events that occur more frequently are considered to be less important in detecting anomalous activity as compared to events that typically occur less frequently in the online service.
  • an event that creates an account may be weighted more heavily as compared to a login event.
  • the configuration information is stored.
  • the configuration information may be stored in one or more locations.
  • the configuration information may be stored in a data store of the online service and/or in an external data store from the online service.
  • the event information from different accounts in the online service is obtained.
  • the event information is obtained from a log that records event information that is associated with the different accounts of the online service.
  • the event information may be obtained automatically or manually. For example, the event information may be obtained every one hour, two hours, six hours, daily, and the like.
  • Different accounts and types of accounts may be monitored for the different events.
  • operator accounts of the online service are monitored and data from the monitoring is logged by the online service. Other types of accounts may also be monitored. For example, an authorized user may configure the online service to monitor events for each account of the online service or select a subset of accounts to monitor.
  • the event information may include different types of information.
  • the event information may include information such as but not limited to: a type of event, a time the event occurred, a result of the event, and the like.
  • profiles are updated with the event information.
  • One or more profiles, such as baseline profiles and recent profiles, may be updated with the event information.
  • a baseline profile and a recent profile for each account that is being monitored are updated.
  • An aggregate baseline profile and an aggregate recent profile are also updated.
  • the aggregate profiles include event information that is generated from more than one account.
  • the aggregate profiles include account information for each account of a tenant of the online service. For example, separate aggregate profiles are maintained for each of the different customers that are serviced by the online service. As discussed, the different profiles may be used to detect anomalous activity of the online service.
  • the process then flows to an end operation and returns to processing other actions.
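  • The per-account and per-tenant aggregate profiles described in process 700 could be maintained as two maps, one keyed by account and one keyed by tenant, updated from each batch of obtained log records. This is a hypothetical sketch; the patent does not specify a storage layout.

```python
from collections import Counter, defaultdict

def update_profiles(records, per_account, per_tenant):
    """Fold (tenant, account, event_type) records into per-account
    profiles and per-tenant aggregate profiles."""
    for tenant, account, event_type in records:
        per_account[(tenant, account)][event_type] += 1
        per_tenant[tenant][event_type] += 1  # aggregate across the tenant

per_account = defaultdict(Counter)
per_tenant = defaultdict(Counter)
update_profiles([("tenant 1", "account 1", "create_account"),
                 ("tenant 1", "account 2", "login")],
                per_account, per_tenant)
print(per_tenant["tenant 1"])  # Counter({'create_account': 1, 'login': 1})
```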
  • FIG. 8 illustrates an exemplary online system that includes anomaly detection.
  • system 1000 includes service 1010, data store 1045, touch screen input device 1050 (e.g., a slate), smart phone 1030 and display device 1080.
  • service 1010 is a cloud based and/or enterprise based service that may be configured to provide services, such as productivity services (e.g. spreadsheets, documents, presentations, charts, messages, and the like).
  • the service may be interacted with using different types of input/output. For example, a user may use speech input, touch input, hardware based input, and the like.
  • Functionality of one or more of the services/applications provided by service 1010 may also be configured as a client/server based application.
  • service 1010 is a multi-tenant service that provides resources 1015 and services to any number of tenants (e.g., Tenants 1-N).
  • Multi-tenant service 1010 is a cloud based service that provides resources/services 1015 to tenants subscribed to the service and maintains each tenant's data separately and protected from other tenant data.
  • System 1000 as illustrated comprises a touch screen input device 1050 (e.g., a slate/tablet device) and smart phone 1030 that detect when a touch input has been received (e.g., a finger touching or nearly touching the touch screen).
  • the touch screen may include one or more layers of capacitive material that detects the touch input.
  • Other sensors may be used in addition to or in place of the capacitive material.
  • Infrared (IR) sensors may be used.
  • the touch screen is configured to detect objects that are in contact with or above a touchable surface. Although the term “above” is used in this description, it should be understood that the orientation of the touch panel system is irrelevant.
  • the touch screen may be configured to determine locations of where touch input is received (e.g. a starting point, intermediate points and an ending point). Actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, by a vibration sensor or microphone coupled to the touch panel.
  • sensors to detect contact include pressure-based mechanisms, micro-machined accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, and inductive sensors.
  • smart phone 1030, touch screen input device 1050, and device 1080 each include an application (1031, 1051, 1081).
  • touch screen input device 1050, smart phone 1030, and display device 1080 show exemplary displays 1052/1032/1082 showing the use of an application and a display relating to anomaly detection.
  • Data may be stored on a device (e.g., smart phone 1030, slate 1050) and/or at some other location (e.g., network data store 1045).
  • Data store 1045, or some other store, may be used to store data sets, event information, baseline profiles, recent profiles, as well as other data.
  • the applications used by the devices may be client based applications, server based applications, and cloud based applications and/or some combination.
  • display device 1080 is a device such as a MICROSOFT XBOX coupled to a display.
  • Anomaly detector 26 is configured to perform operations relating to detecting anomalous activity as described herein. While anomaly detector 26 is shown within service 1010 , the functionality of the anomaly detector may be included in other locations (e.g. on smart phone 1030 and/or slate device 1050 and/or device 1080 ).
  • Some example events that may be monitored and included in the baseline profile and the recent profile include, but are not limited to: the operating system is starting up; the operating system is shutting down; an authentication package has been loaded by the Local Security Authority; a trusted logon process has been registered with the Local Security Authority; internal resources allocated for the queuing of audit messages have been exhausted, leading to the loss of some audits; a notification package has been loaded by the Security Account Manager; invalid use of a port; the system time was changed; a monitored security event pattern has occurred; administrator recovered system from CrashOnAuditFail; a security package has been loaded by the Local Security Authority; an account was successfully logged on; an account failed to log on; user/device claims information; an account was logged off; user initiated logoff; a logon was attempted using explicit credentials; a replay attack was detected; a handle to an object was requested; a registry value was modified; the handle to an object was closed; a handle to an object was requested with intent to delete; an object was deleted; an operation was performed on an object
  • Custom events may also be monitored.
  • a log file records a custom event whenever the online service detects the custom event from any user of that online service.
  • the embodiments and functionalities described herein may operate via a multitude of computing systems including, without limitation, desktop computer systems, wired and wireless computing systems, mobile computing systems (e.g., mobile telephones, netbooks, tablet or slate type computers, notebook computers, and laptop computers), handheld devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, and mainframe computers.
  • embodiments and functionalities described herein may operate over distributed systems (e.g., cloud-based computing systems), where application functionality, memory, data storage and retrieval and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet.
  • User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which they are projected.
  • Interaction with the multitude of computing systems with which embodiments of the invention may be practiced includes keystroke entry, touch screen entry, voice or other audio entry, gesture entry where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures for controlling the functionality of the computing device, and the like.
  • FIGS. 9-11 and the associated descriptions provide a discussion of a variety of operating environments in which embodiments of the invention may be practiced.
  • the devices and systems illustrated and discussed with respect to FIGS. 9-11 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing embodiments of the invention, described herein.
  • FIG. 9 is a block diagram illustrating physical components (i.e., hardware) of a computing device 1100 with which embodiments of the invention may be practiced.
  • the computing device components described below may be suitable for the computing devices described above.
  • the computing device 1100 may include at least one processing unit 1102 and a system memory 1104 .
  • the system memory 1104 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories.
  • the system memory 1104 may include an operating system 1105 and one or more program modules 1106 suitable for running software applications 1120 such as the anomaly detector 26 .
  • the operating system 1105 may be suitable for controlling the operation of the computing device 1100 .
  • embodiments of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and are not limited to any particular application or system.
  • This basic configuration is illustrated in FIG. 9 by those components within a dashed line 1108 .
  • the computing device 1100 may have additional features or functionality.
  • the computing device 1100 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
  • additional storage is illustrated in FIG. 9 by a removable storage device 1109 and a non-removable storage device 1110 .
  • program modules 1106 may perform processes including, but not limited to, one or more of the stages of the methods and processes illustrated in the figures.
  • Other program modules may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc.
  • embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors.
  • embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 9 may be integrated onto a single integrated circuit.
  • Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit.
  • the functionality, described herein, with respect to the anomaly detector 26 may be operated via application-specific logic integrated with other components of the computing device 1100 on the single integrated circuit (chip).
  • Embodiments of the invention may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies.
  • embodiments of the invention may be practiced within a general purpose computer or in any other circuits or systems.
  • the computing device 1100 may also have one or more input device(s) 1112 such as a keyboard, a mouse, a pen, a sound input device, a touch input device, etc.
  • the output device(s) 1114 such as a display, speakers, a printer, etc. may also be included.
  • the aforementioned devices are examples and others may be used.
  • the computing device 1100 may include one or more communication connections 1116 allowing communications with other computing devices 1118. Examples of suitable communication connections 1116 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
  • Computer readable media may include computer storage media.
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules.
  • the system memory 1104, the removable storage device 1109, and the non-removable storage device 1110 are all computer storage media examples (i.e., memory storage).
  • Computer storage media may include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 1100. Any such computer storage media may be part of the computing device 1100.
  • Computer storage media does not include a carrier wave or other propagated or modulated data signal.
  • Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
  • modulated data signal may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal.
  • communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
  • FIGS. 10A and 10B illustrate a mobile computing device 1200, for example, a mobile telephone, a smart phone, a tablet personal computer, a laptop computer, and the like, with which embodiments of the invention may be practiced.
  • With reference to FIG. 10A, one embodiment of a mobile computing device 1200 for implementing the embodiments is illustrated.
  • The mobile computing device 1200 is a handheld computer having both input elements and output elements.
  • The mobile computing device 1200 typically includes a display 1205 and one or more input buttons 1210 that allow the user to enter information into the mobile computing device 1200.
  • The display 1205 of the mobile computing device 1200 may also function as an input device (e.g., a touch screen display).
  • An optional side input element 1215 allows further user input.
  • The side input element 1215 may be a rotary switch, a button, or any other type of manual input element.
  • The mobile computing device 1200 may incorporate more or fewer input elements.
  • The display 1205 may not be a touch screen in some embodiments.
  • The mobile computing device 1200 is a portable phone system, such as a cellular phone.
  • The mobile computing device 1200 may also include an optional keypad 1235.
  • The optional keypad 1235 may be a physical keypad or a “soft” keypad generated on the touch screen display.
  • The output elements include the display 1205 for showing a graphical user interface (GUI), a visual indicator 1220 (e.g., a light emitting diode), and/or an audio transducer 1225 (e.g., a speaker).
  • GUI graphical user interface
  • The mobile computing device 1200 incorporates a vibration transducer for providing the user with tactile feedback.
  • The mobile computing device 1200 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., an HDMI port) for sending signals to or receiving signals from an external device.
  • FIG. 10B is a block diagram illustrating the architecture of one embodiment of a mobile computing device. That is, the mobile computing device 1200 can incorporate a system 1202 (i.e., an architecture) to implement some embodiments.
  • The system 1202 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players).
  • The system 1202 is integrated as a computing device, such as an integrated personal digital assistant (PDA) and wireless phone.
  • One or more application programs 1266 may be loaded into the memory 1262 and run on or in association with the operating system 1264 .
  • Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth.
  • The system 1202 also includes a non-volatile storage area 1268 within the memory 1262.
  • The non-volatile storage area 1268 may be used to store persistent information that should not be lost if the system 1202 is powered down.
  • The application programs 1266 may use and store information in the non-volatile storage area 1268, such as e-mail or other messages used by an e-mail application, and the like.
  • A synchronization application (not shown) also resides on the system 1202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 1268 synchronized with corresponding information stored at the host computer.
  • Other applications may be loaded into the memory 1262 and run on the mobile computing device 1200, including the anomaly detector 26 as described herein.
  • The system 1202 has a power supply 1270, which may be implemented as one or more batteries.
  • The power supply 1270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
  • The system 1202 may also include a radio 1272 that performs the function of transmitting and receiving radio frequency communications.
  • The radio 1272 facilitates wireless connectivity between the system 1202 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio 1272 are conducted under control of the operating system 1264. In other words, communications received by the radio 1272 may be disseminated to the application programs 1266 via the operating system 1264, and vice versa.
  • The visual indicator 1220 may be used to provide visual notifications, and/or an audio interface 1274 may be used for producing audible notifications via the audio transducer 1225.
  • The visual indicator 1220 is a light emitting diode (LED) and the audio transducer 1225 is a speaker.
  • The LED may be programmed to remain on indefinitely until the user takes action, to indicate the powered-on status of the device.
  • The audio interface 1274 is used to provide audible signals to and receive audible signals from the user.
  • The audio interface 1274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation.
  • The microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below.
  • The system 1202 may further include a video interface 1276 that enables an operation of an on-board camera to record still images, video streams, and the like.
  • A mobile computing device 1200 implementing the system 1202 may have additional features or functionality.
  • The mobile computing device 1200 may also include additional data storage devices (removable and/or non-removable) such as magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 10B by the non-volatile storage area 1268.
  • The mobile computing device 1200 may also include a peripheral device port 1230.
  • Data/information generated or captured by the mobile computing device 1200 and stored via the system 1202 may be stored locally on the mobile computing device 1200 , as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 1272 or via a wired connection between the mobile computing device 1200 and a separate computing device associated with the mobile computing device 1200 , for example, a server computer in a distributed computing network, such as the Internet.
  • Data/information may be accessed through the mobile computing device 1200 via the radio 1272 or via a distributed computing network.
  • Data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
  • FIG. 11 illustrates an embodiment of an architecture of an exemplary system, as described above.
  • Content developed, interacted with, or edited in association with the anomaly detector 26 may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 1322 , a web portal 1324 , a mailbox service 1326 , an instant messaging store 1328 , or a social networking site 1330 .
  • The anomaly detector 26 may use any of these types of systems or the like for enabling data utilization, as described herein.
  • A server 1320 may provide the anomaly detector 26 to clients.
  • The server 1320 may be a web server providing the anomaly detector 26 over the web.
  • The server 1320 may provide the anomaly detector 26 over the web to clients through a network 1315.
  • The client computing device may be implemented as the computing device 1100 and embodied in a personal computer, a tablet computing device 1310 and/or a mobile computing device 1200 (e.g., a smart phone). Any of these embodiments of the client computing device 1100, 1310, 1200 may obtain content from the store 1316.
  • Embodiments of the present invention are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention.
  • The functions/acts noted in the blocks may occur out of the order shown in any flowchart.
  • Two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Computing Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Human Computer Interaction (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Debugging And Monitoring (AREA)
  • Alarm Systems (AREA)

Abstract

Anomalous activity is detected using event information that is received from accounts from within an online service. Generally, anomalous activity is detected by comparing a baseline profile that includes past event information for accounts of the online service with a recent profile that includes recent event information for the accounts. Anomalous activity is detected when the recent profile shows that one or more events are occurring more frequently as compared to the occurrence of the events in the associated baseline profile. The events that are recorded and used in the anomaly detection may include all or a portion of the events that are monitored by the online service. One or more reports may also be automatically generated and provided to one or more users to show activity that may be considered anomalous activity.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of application Ser. No. 14/134,575, filed Dec. 19, 2013, which is incorporated herein by reference in its entirety.
  • BACKGROUND
  • Anomaly detection is used for determining when patterns in data do not match an expected pattern. For example, credit card companies may use anomaly detection to help detect fraudulent activity relating to a customer's credit card. An online service may create a rule to detect anomalous activity when network traffic exceeds a predetermined threshold. Detecting anomalous activity that is associated with an online service can be challenging and time consuming. For example, there is typically an extremely large amount of data relating to the operation of the online service that may need to be analyzed. Instead of processing this large amount of data, many online services detect anomalous activity by determining when a predefined event occurs on a single machine or a few machines of the online service. For example, the predefined event may occur in response to network traffic for the online service exceeding some predetermined level or when a large number of processes are started in a short time period.
  • SUMMARY
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
  • Anomalous activity is detected using event information that is monitored from accounts performing activities within an online service. Generally, anomalous activity is detected by determining when a baseline profile that represents “normal” activity for the online service is different from a recent profile that represents the “current” activity from accounts in the online service. For example, anomalous activity may be detected when an event, such as a create account event, is occurring more frequently in the recent profile as compared to the occurrence of the create account event in the baseline profile. The event information that is used in detecting anomalous activity may include all or a portion of events that are monitored by the online service. For example, the events used for anomaly detection may include all or a portion of security events (e.g., any event that changes permissions, such as creating accounts, changing permissions of one or more accounts, logging into the online service or logging out of the online service . . . ) as well as other types of events (e.g., system events, hardware events . . . ). An authorized user may configure the events to monitor and/or the events for one or more event categories may be automatically selected. The accounts that are monitored for anomalous activity may include all or a portion of the accounts of the online service. For example, the accounts that are monitored for anomalous activity may be operator accounts (e.g., accounts that have permissions to create, modify, and delete accounts for other users or groups of users) or other types of accounts (e.g., user accounts, privileged accounts, and the like). In response to detecting anomalous activity, different activities may be performed. For example, an account may be blocked from performing operations, an account may be locked, one or more reports may be automatically generated and provided to one or more users to show activity that may be considered anomalous activity, and the like. Different types of reports may be generated. For example, one report may rank the accounts based on the level of anomalous activity detected whereas another report may provide more detailed information for one or more of the accounts.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an overview of a system for detecting anomalous activity from accounts of an online service;
  • FIG. 2 shows a more detailed system for detecting anomalous activity from accounts of an online service;
  • FIG. 3 shows different profiles including event information and weighting of events used in detecting anomalous activity from accounts within an online service;
  • FIG. 4 shows example Graphical User Interfaces for viewing and configuring events relating to anomaly detection;
  • FIG. 5 shows exemplary reports showing detected anomalous activity for one or more accounts within the online service;
  • FIG. 6 illustrates a process for detecting anomalous activity in an online service by comparing a baseline profile to a recent profile;
  • FIG. 7 illustrates a process for configuring and storing event information within a baseline profile and a recent profile;
  • FIG. 8 illustrates an exemplary online system for detecting anomalous activity; and
  • FIGS. 9, 10A, 10B and 11 and the associated descriptions provide a discussion of a variety of operating environments in which embodiments of the invention may be practiced.
  • DETAILED DESCRIPTION
  • Due to the sheer number of events that may be logged, online services typically examine a small subset of the events that are logged from a single machine (or a small number of machines) in the service in order to detect anomalous activity. For example, instead of examining the events from each machine in an online service, the online service selects one or two machines to monitor for anomalous activity. An online service may also establish hard-coded rules to detect anomalous activity. For example, an online service may create a rule to identify a specific type of anomalous activity. According to embodiments of the invention, instead of creating individual rules to detect anomalous activity, an anomaly detector automatically detects anomalous activity based on events that are monitored and that are obtained from the different computing devices and accounts in the online service. Anomalous activity may be detected using any number of monitored events from accounts of the online system without creating individual rules. Instead of looking for a predetermined event or condition to occur in order to determine when a specific type of anomalous activity is occurring, anomalous activity is detected by comparing a frequency of events that represent the “normal” behavior for the online service with the frequency of the events that occur during a recent period of time in the online service. When the frequencies of the events in the baseline profile and the recent profile are different, anomalous activity may be indicated. This anomalous activity is then provided to one or more users in the form of a report. According to an embodiment, a report is delivered that includes information related to the anomalous activity detected in the online service.
  • Referring now to the drawings, in which like numerals represent like elements, various embodiments will be described.
  • FIG. 1 shows an overview of a system for detecting anomalous activity from accounts of an online service. As illustrated, system 100 includes online service 110, anomaly detector 26, baseline profile 155, recent profile 165, display 115 and display 125.
  • Anomaly detector 26 is configured to detect anomalous activity that is occurring from accounts that are associated with online service 110. Anomalous activity is activity that deviates from the expected or normal behavior of the online service. For example, anomalous activity may be detected by anomaly detector 26 in response to an unusually large number of requests from accounts in the online service to: create new accounts; change permissions; start processes; and the like. In order to determine when the anomalous activity of online service 110 is occurring, anomaly detector 26 determines when a frequency of events that is stored in baseline profile 155 deviates from the frequency of the events in recent profile 165. For example, baseline profile 155 may indicate that during normal operation of online service 110, two accounts are created in a typical day. Today, however, recent profile 165 shows that ten different accounts were created by a particular account in the online service. In response to comparing baseline profile 155 to recent profile 165 and determining that the frequencies are different, anomaly detector 26 determines that anomalous activity may be occurring in online service 110 and generates anomaly report 130 that is shown displayed on display 125.
  • In the current example, anomaly report 130 shows a message that includes information showing the anomalous activity (e.g., “Account 1 created 10 new accounts”) and also shows the normal activity (e.g., “Creating 2 accounts is normal activity”). The report may be provided to one or more users to show activity that may be considered anomalous activity. Different reports may be generated. For example, one report may rank the accounts based on anomalous activity whereas another report provides more detailed information for one or more accounts.
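  • By way of a non-limiting illustration only, the comparison in this example may be sketched in Python as follows; the event name, the counts, and the report message in this sketch are hypothetical and are not taken from any particular embodiment.

```python
# Minimal sketch of the baseline-vs-recent comparison described above.
# The event name and counts are hypothetical illustrations.

baseline_profile = {"create_account": 2}   # ~2 account creations per day is "normal"
recent_profile = {"create_account": 10}    # observed today for a particular account

def is_anomalous(event, baseline, recent):
    """Flag an event whose recent count exceeds its baseline count."""
    return recent.get(event, 0) > baseline.get(event, 0)

if is_anomalous("create_account", baseline_profile, recent_profile):
    print("Account 1 created 10 new accounts; creating 2 accounts is normal activity.")
```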
  • Anomaly detector 26 may use all or a portion of the monitored (e.g., logged) events of the online service in order to detect anomalous activity. Instead of monitoring just one or two different events that occur on a single computing device in the online service and determining when a predetermined condition is met, anomaly detector 26 may use any number of the logged events from any number of accounts within the online service to detect anomalous activity. For example, anomaly detector 26 may use each of the logged security events that occur within the online service (e.g., any event that changes permissions, such as creating accounts, changing permissions of one or more accounts, . . . ) in order to detect anomalous activity.
  • According to an embodiment, an authorized user may configure the events that are used by anomaly detector 26 to detect anomalous activity. In the current example, a Graphical User Interface (GUI) for event configuration 117 is displayed and is configured to receive event selection and configuration information from a user. For example, the user may select all or a portion of the events that are monitored by the online service for one or more accounts. In the current example, display 115 shows that the authorized user has selected to use change permission events and add account events but not to use login events when detecting anomalous activity. Other configuration information may also be received, such as, but not limited to, weighting information, aggregation profiles, and the like (see the FIGURES and related discussion below). When users do not specify a configuration, anomaly detector 26 uses a default algorithm weighting process. According to an embodiment, the default algorithm weighting process assigns a higher weight (that is, treats as more anomalous) to events that are determined to occur more rarely, based on the occurrence of the events in the baseline profile.
  • Online service 110 may be a cloud based and/or enterprise based service that may be configured to provide services, such as productivity services (e.g., messaging, collaboration, spreadsheets, documents, presentations, charts, and the like). More details regarding anomaly detection are provided below.
  • FIG. 2 shows a more detailed system 200 for detecting anomalous activity from accounts of an online service.
  • As illustrated, system 200 includes application 262, application 272, tablet computing device 260, computing device 270 and online service 110 that includes anomaly detector 26, baseline profiles 210, recent profiles 220, accounts 230, computing devices 240 and log 250.
  • As discussed, anomaly detector 26 is configured to detect anomalous activity by using monitored event information from accounts 230 that are associated with online service 110. For example, the accounts that are monitored within the online service for anomalous activity may be operator accounts that have permissions to create, modify, and delete accounts for other users or groups of users. Other accounts, such as user accounts, may also be monitored for anomalous activity.
  • According to an embodiment, anomaly detector 26 detects anomalous activity by comparing frequencies of past events that are included in baseline profiles 210 with the frequencies of recent events that are included in recent profiles 220. A profile (e.g., a baseline profile or a recent profile) may include event information for a single account of the online service or may include aggregated event information that is obtained from all or a portion of the accounts of the online service. Anomalous activity may be detected by anomaly detector 26 when the recent event information for an account shows that one or more events are occurring more frequently as compared to the occurrence of the events as recorded in a baseline profile.
  • Anomaly detector 26 may use various methods in determining when anomalous activity is occurring within online service 110. According to an embodiment, anomaly detector 26 creates a frequency profile for each account or group of accounts that is associated with a profile. Anomaly detector 26 compares the frequency profile for the events that are associated with the baseline profile with the frequency profile for the events that are associated with the recent profiles. For example, when the frequency profile that is associated with the recent profile indicates that one or more events are occurring more frequently as compared to during “normal” operation as indicated by the baseline profile, anomalous activity is detected.
  • According to an embodiment, a frequency is determined for each event that is included in a profile. For example, for purposes of explanation and not intended to be limiting, assume four different events are being monitored. The baseline profile shows that event 1 accounts for 10% of the events, event 2 accounts for 25% of the overall events, event 3 accounts for 50% of the events, and event 4 accounts for 15% of the events. The recent profile shows that event 1 accounts for 20% of the events, event 2 accounts for 15% of the overall events, event 3 accounts for 50% of the events, and event 4 accounts for 15% of the events. In response to comparing the baseline profile with the recent profile, anomaly detector 26 detects that anomalous activity is occurring since the frequency of event 1 is larger in the recent profile as compared to the baseline profile. Other methods of detecting anomalous activity from accounts in the online service may be used. For example, other statistical methods may be used, such as comparing a number of times each event occurs within a predetermined time period, aggregating the frequencies of some events and comparing the aggregated frequencies, or some other statistical method.
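  • As a non-limiting sketch of the frequency comparison just described, the following Python fragment computes per-event relative frequencies and flags events whose recent share exceeds their baseline share; the helper names and the tolerance parameter are assumptions for illustration.

```python
from collections import Counter

def frequency_profile(events):
    """Convert a list of event types into per-event relative frequencies."""
    counts = Counter(events)
    total = sum(counts.values())
    return {event: n / total for event, n in counts.items()}

def anomalous_events(baseline_freq, recent_freq, tolerance=0.0):
    """Return the events whose recent frequency exceeds their baseline frequency."""
    return [event for event, freq in recent_freq.items()
            if freq > baseline_freq.get(event, 0.0) + tolerance]

# The four-event example from the text:
baseline = {"event1": 0.10, "event2": 0.25, "event3": 0.50, "event4": 0.15}
recent = {"event1": 0.20, "event2": 0.15, "event3": 0.50, "event4": 0.15}
print(anomalous_events(baseline, recent))  # ['event1']
```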
  • According to an embodiment, different events may be assigned different weights such that an increase in an occurrence of a more heavily weighted event within the monitored accounts indicates anomalous activity before an increase in an occurrence of an event that is not as heavily weighted. According to an embodiment, events that occur less frequently may automatically or manually be assigned a higher weight such that an increase in the occurrence of the event will more quickly indicate anomalous activity. For example, login events may be set to a lower weight as compared to creating account events since login events are generally common events in an online service. The weights may be assigned automatically and/or manually.
  • Upon detection of anomalous activity by anomaly detector 26, different actions may be performed. For example, one or more reports may be automatically created and delivered (e.g., anomaly report 265), one or more accounts may be locked such that future activity is stopped, certain actions may be prevented from occurring within the online service, and the like. A report may be generated and provided to one or more users to show activity that may be considered anomalous activity. Different reports may be generated. For example, anomaly report 265 ranks the accounts based on detected anomalous activity whereas another report provides more detailed information for one or more accounts (See FIG. 5 and related discussion).
  • Baseline profiles 210 include one or more baseline profiles that include event information that relates to the “normal” or “typical” operation of online service 110. Each baseline profile 210 includes event information for a predetermined period of time of the online service (e.g., one week, two weeks, one month, two months . . . ). The baseline profiles 210 may be updated according to a predetermined schedule (e.g., daily, weekly . . . ) or using some other method (e.g., upon request). According to an embodiment, an authorized user may configure the event information that is included in the baseline profile, the period of time for the event information that is included in the baseline profile, as well as an update schedule. For example, the authorized user may configure the baseline profile to store the last month of event information for one or more different types of accounts of the online service and be updated daily.
  • Recent profiles 220 include one or more recent profiles that include event information that relates to events that occur in a recent period of time of online service 110. For example, a recent profile may include event information for one or more accounts that was generated by activity that occurred within the last day of the service, a last number of hours within the online service, and the like. Generally, the event information that is included in a recent profile is more recent event information as compared to the event information that is included in the baseline profile(s) 210. The recent event information may be used to update the baseline profiles. For example, the baseline profiles may be updated with recent event information each day or some other period of time. In this way, the baseline profiles may include event information from a rolling window of time (e.g., the last month, two months, three months . . . ) of events that are generated by accounts in the online service 110.
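  • A rolling-window update of the kind described above might be sketched as follows; the record shape (a dict with a "timestamp" field holding a datetime) and the window length are assumptions for illustration.

```python
from datetime import datetime, timedelta

def update_baseline(baseline_events, recent_events, window_days=30):
    """Merge recent events into the baseline, then drop events that have
    aged out of the rolling window (e.g., the last month)."""
    cutoff = datetime.now() - timedelta(days=window_days)
    return [event for event in baseline_events + recent_events
            if event["timestamp"] >= cutoff]
```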
  • Log 250 is configured to store event information that is generated by one or more accounts of online service 110. According to an embodiment, log 250 stores records of events, including security events that are generated by accounts in the online service. For example, log 250 includes records of login/logout activity and other security-related events that are generated by one or more accounts 230 that occurred on one or more computing devices 240 that are associated with online service 110. According to an embodiment, the log data that is stored in log 250 includes information that is specified by a system's audit policy. Generally, an authorized user may configure the system to record different events and types of events in log 250.
  • Many types of events may be logged and used by anomaly detector when detecting anomalous activity in the online service. Some example categories of events that may be logged include: account logon events, account management events, directory service access events, object access events, policy changes, privileged operations, system events, and the like. For example, each time an object is created, accessed, changed or deleted, an event may be generated and logged. A list of more detailed events that may be logged and used in the detection of anomalous activity may be found in the discussion of FIG. 8.
  • FIG. 3 shows different profiles including event information and weighting of events used in detecting anomalous activity from accounts within an online service.
  • As discussed herein, each profile includes event information for all or a portion of the events that are monitored and/or logged that originate from accounts of the online service. The following profiles are illustrated for exemplary purposes and are not intended to be limiting. While each profile illustrated shows four different events, more or fewer events may be included in a profile. For example, a profile may include hundreds or thousands of different types of events.
  • Profile 310 shows an aggregate baseline profile for a number of accounts of the online service. For example, profile 310 may include event information that is generated from many different types of accounts that are within an online service or may include all or a portion of a specific type of account. According to an embodiment, an aggregate profile is created that includes each operator account of the online service. As discussed, the term operator account refers to accounts that have permissions to create, modify, and delete accounts for other users or groups of users. Other types of accounts within the online service may also be included (e.g., user accounts).
  • In the current example, profile 310 includes event information for four different events (Event 1, Event 2, Event 3 and Event 4). While four events are shown, fewer or more events may be included to detect anomalous activity. For example, the events may include all or a portion of security events (See discussion in FIG. 8 for a more detailed list of security events) as well as any other event that may be monitored by the online service (e.g., machine actions, user actions, and the like). Each event that is included in a profile includes different event information. The event information may include information, such as: a type of the event; a time the event occurred; what account generated the event; a result of the event; and the like. For example, an event may be a process start event that was started by account 2 at 11:06 AM that resulted in a specific process being started. According to another embodiment, the profile may include a number of times each event that is monitored occurs within a specified period of time (e.g., within a day, a week, . . . ).
  • The percentages that are shown below the event information in profile 310, as well as profiles 340 and 370, show what percentage of time each of the events occurs. In the current example for aggregate profile 310, event 1 occurs 20% of the time, event 2 occurs 5% of the time, event 3 occurs 50% of the time and event 4 occurs 25% of the time. Generally, events that occur more frequently are typically common events such as logging into the service, logging off of the service, and the like.
  • As discussed, each event may be weighted the same as or differently from other events that are being monitored and used in detecting any anomalous activity. For example, each event may be weighted to a same value, such as 1, or may be weighted using some other criteria. According to an embodiment, the weighting is based on the frequency of the event within a profile. For example, less frequently occurring events would be weighted more heavily as compared to more frequently occurring events. Events that typically occur less often, such as creating an account, may be considered more indicative of anomalous activity as compared to common events, such as login or logoff events. According to an embodiment, the relative weight for each event is automatically determined by dividing the percentage of the highest occurring event by the percentage of the occurrence of each event in the profile. In the current example shown in profile 310, the automatically determined weighting would result in the weight for event 1 being 2.5 times higher than the weight for event 3, the weight for event 2 being 10 times higher than the weight for event 3, and the weight for event 4 being 2 times higher than the weight for event 3. Other weighting methods may be used. For example, each weight may be based on a type of event and/or each weight may be manually configured by an authorized user (See FIG. 4 for an example Graphical User Interface for configuring weights). The weights may be configured manually or automatically.
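  • The automatic relative weighting just described (dividing the percentage of the highest occurring event by the percentage of each event) may be sketched as follows; the function name is an assumption, and the numbers reuse the example percentages from profile 310.

```python
def relative_weights(frequencies):
    """Weight each event by (highest frequency) / (event frequency), so the
    most frequent event gets weight 1 and rarer events get larger weights."""
    top = max(frequencies.values())
    return {event: top / freq for event, freq in frequencies.items()}

profile_310 = {"event1": 0.20, "event2": 0.05, "event3": 0.50, "event4": 0.25}
print(relative_weights(profile_310))
# {'event1': 2.5, 'event2': 10.0, 'event3': 1.0, 'event4': 2.0}
```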
  • Baseline profile 340 is an exemplary baseline profile for a single account of the online service. According to an embodiment, each account includes a baseline profile that may be used to determine the “normal” behavior for an account. In the current example, baseline profile 340 includes the same event information as aggregate baseline profile 310. The frequency of the occurrence of the events, however, for baseline profile 340 may be different as compared to the aggregate baseline profile 310. As illustrated, the percentage of occurrence for event 1 and event 3 are the same, but the percentages for event 2 and event 4 are different. The percentage of occurrence for event 2 is greater in the baseline profile 340 as compared to the occurrence for event 2 in the baseline profile 310 (10% compared to 5%). The percentage of occurrence for event 4 is less in the baseline profile 340 as compared to the occurrence for event 4 in the baseline profile 310 (20% compared to 25%). As discussed herein, anomalous activity may be detected using baseline profiles relating to a single account and/or detected using one or more aggregate baseline profiles. In some cases, a newly created account would not have an established baseline profile. In this case, the recent profile for the newly created account may be compared against an aggregate profile or a different account in order to detect anomalous activity.
  • Recent profile 370 is an exemplary recent profile for a single account of the online service. A recent profile may be created for a single account and/or an aggregation of accounts. In the current example, recent profile 370 includes the same event information as aggregate baseline profile 310. The occurrence of the events for recent profile 370 is different as compared to both the aggregate baseline profile 310 and baseline profile 340. Comparing recent profile 370 to either of the baseline profiles (310, 340) indicates anomalous activity is occurring for the account. For example, the percentage of occurrence of event 2 in the recent profile is 2 times greater than baseline profile 340 indicates and 4 times greater than the aggregate baseline profile 310 indicates.
  • FIG. 4 shows example Graphical User Interfaces (GUIs) for viewing and configuring events relating to anomaly detection. The GUIs shown in FIG. 4 are for illustrative purposes and are not intended to be limiting.
  • The first GUI that is shown is an event configuration 410 display that shows an exemplary GUI for selecting and configuring events to be monitored for anomaly detection for an online service. The event categories 415 section shows different categories of events that may be selected for monitoring. While four event categories are shown, many other event categories may be displayed. According to an embodiment, each event and category of event that is monitored by the online service may be selected for use in anomaly detection. In the current example, the event categories that are shown in event configuration 410 include account logon events, account management events, object access events and policy change events. As illustrated, a user 430 is selecting the account management events category.
  • In response to receiving a selection of an event category, such as the account management events, a more detailed view of each of the events that are included in the event category is displayed. In the current example, account management events 420 shows the different events that are contained within the selected account management events category. According to an embodiment, a user may select or deselect the individual events of the selected event category to include in anomaly detection. For example, a user may select the account management event category and then deselect a couple of the events from being used in the anomaly detection.
  • Display 440 shows an exemplary display including an event picker 442 that displays a list of different events that may be configured and used in anomaly detection. In the current example, user 450 has selected event 2. In response to selecting Event 2, a more detailed view 460 of configuration is displayed. In the current example, Event 2 is shown to be set to “On” indicating that Event 2 is being monitored for anomaly detection. A user may also turn off Event 2 by selecting the On/Off user interface element. A weight of “0.3” has also been set for Event 2. According to an embodiment, an authorized user may configure the weighting of different events such that different events are given more weight in detecting anomalous activity. A default weight for an event may be set. According to an embodiment, the default weight for each event is set to vary inversely with the frequency of that event in the baseline profile. For example, if Event 1 occurs with frequency 10% in the baseline profile, it may have a weight of 1/0.1=10. If Event 1 occurs with frequency 50% in the baseline profile, it may have a weight of 1/0.5=2. The specific calculation may also be configurable. For example, the weight may be the square of the inverse frequency, or some other calculation related to the inverse.
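  • The default inverse-frequency weight and the configurable alternative mentioned above may be sketched as follows; the exponent parameter is merely one assumed way to express “the square of the inverse frequency.”

```python
def default_weight(frequency, exponent=1.0):
    """Default event weight varies inversely with the event's baseline
    frequency: exponent=1.0 gives 1/f, exponent=2.0 gives (1/f) squared."""
    return (1.0 / frequency) ** exponent

print(default_weight(0.1))       # 10.0  (an event occurring 10% of the time)
print(default_weight(0.5))       # 2.0   (an event occurring 50% of the time)
print(default_weight(0.5, 2.0))  # 4.0   (square of the inverse frequency)
```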
  • FIG. 5 shows exemplary reports showing detected anomalous activity for one or more accounts within the online service.
  • Display 510 shows an example report that indicates possible anomalous activity of different accounts. Anomaly report 512 is illustrated for exemplary purposes and is not intended to be limiting. Many other types of reports showing anomalous activity may be created and displayed. As illustrated, report 512 lists a portion of the accounts being monitored for anomalous activity in the order from highest detected anomalous activity to lowest detected anomalous activity. Anomaly report 512 shows the accounts that have detected anomalous activity. For example, accounts that are operating normally, and in which anomalous activity is not detected, are not shown in the anomaly report. According to an embodiment, any account that has a value above some predefined number (threshold) has potential anomalous activity. According to an embodiment, a value above the predefined number (threshold) indicates that at least one event that is being monitored is currently occurring more frequently as compared to the frequency of occurrence that is included in the baseline profile. In the current example, a user may select one of the displayed accounts to obtain more information relating to the detected anomalous activity. As shown, user 520 selects Account 10 from anomaly report 512 to obtain a more detailed view of the account as shown in display 540.
  • Display 540 illustrates an example of more detailed information that may be displayed for an account in which anomalous activity is detected. As illustrated, the more detailed information shows the events that indicate anomalous activity as well as the baseline profile for the events. In display 540, two different events being monitored are showing anomalous activity. In the current example, Event 2 is occurring five times as often as what is considered normal (0.5 as compared to 0.1). Event 3 is occurring twice as often as the normal event occurrence (0.2 as compared to 0.1).
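  • A hypothetical sketch of the summary ranking described above is shown below; the score values, the threshold, and the account names are assumptions for illustration rather than data from the figures.

```python
def anomaly_report(account_scores, threshold=1.0):
    """Rank accounts whose anomaly score exceeds the threshold, highest first.
    Accounts operating normally (score <= threshold) are omitted."""
    flagged = [(account, score) for account, score in account_scores.items()
               if score > threshold]
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

scores = {"Account 10": 5.0, "Account 3": 2.0, "Account 7": 0.4}
for account, score in anomaly_report(scores):
    print(account, score)  # Account 10 first, Account 3 second; Account 7 omitted
```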
  • FIGS. 6 and 7 illustrate processes for anomaly detection using account information from an online service. When reading the discussion of the routines presented herein, it should be appreciated that the logical operations of various embodiments are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention. Accordingly, the logical operations illustrated and making up the embodiments described herein are referred to variously as operations, structural devices, acts or modules. These operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, and any combination thereof. While the operations are shown in a particular order, the order of the operations may change, or the operations may be performed in parallel, depending upon the implementation.
  • FIG. 6 illustrates a process 600 for detecting anomalous activity in an online service by comparing a baseline profile to a recent profile.
  • After a start operation, the process moves to operation 610, where one or more baseline profiles are accessed. The baseline profile may be a baseline profile for a single account or may be an aggregated baseline profile that includes event information from more than one account of the online service. For example, each tenant within an online service may have a separate baseline profile for each account that is monitored for anomalous activity. As discussed, the baseline profile includes event information that is generated from one or more accounts and indicates the “normal” operation of an account or accounts. According to an embodiment, the profiles include event information that relates to security events that are associated with one or more accounts of the online service. The event information may be configured by an authorized user and/or automatically selected based on the monitored events of the online system. For example, the online service may automatically select all or a portion of the monitored or logged events in the online service.
  • Flowing to operation 620, a recent profile is accessed. The recent profile includes event information that is obtained from a recent period of time. A recent period of time may include a time period, such as, but not limited to the last hour, two hours, three hours, four hours, eight hours, day, and the like. According to an embodiment, the recent profile includes event information from one or more accounts for the last six hours. The recent profile may include a single account of the online service and/or an aggregate of accounts of the online service.
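  • Assembling a recent profile over, for example, the last six hours may be sketched as follows; the log record shape (dicts with "timestamp" and "event_type" fields) is an assumption for illustration.

```python
from collections import Counter
from datetime import datetime, timedelta

def recent_profile(log_records, hours=6):
    """Count events per type for the recent window (e.g., the last six hours)."""
    cutoff = datetime.now() - timedelta(hours=hours)
    return Counter(record["event_type"] for record in log_records
                   if record["timestamp"] >= cutoff)
```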
  • Transitioning to operation 630, the baseline profile is compared to the recent profile to detect anomalous activity of the online service. The comparison of the recent profile with the baseline profile may be performed using different methods. According to an embodiment, the comparison includes determining when the frequency of one or more of the events being monitored is different between the baseline profile and the recent profile. According to an embodiment, a weighting factor is used to adjust the importance of an occurrence of an event. For example, an increase in the number of accounts being created (a more heavily weighted event) for an account may be considered more of an indication of anomalous activity as compared to an increase in the number of windows being opened (a less heavily weighted event) by an account.
  • Moving to operation 640, an action may be performed based on the detected anomalous activity. One or more actions may occur. The actions that are performed relate to notifying one or more users of the detected anomalous activity and/or attempting to stop or limit the detected anomalous activity. For example, one or more reports may be created and delivered (operation 650), one or more accounts may be locked such that future anomalous activity from the one or more accounts is stopped, certain actions may be prevented from occurring from one or more accounts, and the like.
  • Flowing to operation 650, anomalous activity that is detected is reported. There are many different methods that may be used to report anomalous activity. For example, a summary report may be created and delivered that ranks each account according to its detected anomalous activity. According to an embodiment, the accounts are ranked according to a level of detected anomalous activity for the account. A more detailed report for each account may also be created for delivery and/or viewing. According to an embodiment, the more detailed report may be accessed by selecting the account from within the summary report of the anomalous activity.
  • The process then flows to an end operation and returns to processing other actions.
  • FIG. 7 illustrates a process 700 for configuring and storing event information within a baseline profile and a recent profile.
  • After a start operation, the process moves to operation 710, where the events to monitor and use in anomaly detection are determined. The determination may be automatically and/or manually performed. For example, instead of manually selecting each event to monitor, each event that is currently being monitored and logged may be used in detecting anomalous activity. As another example, an authorized user may select the events to be monitored. This selection may be in place of or in addition to the events that are currently being used in the anomaly detection. According to an embodiment, a GUI is displayed that allows the authorized user to select the events to monitor and use in the anomaly detection.
  • Flowing to operation 720, the events that are determined to be used in the anomaly detection for the online service are configured. The configuration may include manual configuration as well as automatic configuration. For example, an authorized user may set weightings to associate with the different events. According to an embodiment, the weightings for the events that are being monitored are automatically set to different values based on the percentage of times the event occurs. In this example, events that occur more frequently are considered to be less important in detecting anomalous activity as compared to events that typically occur less frequently in the online service. For example, an event that creates an account may be weighted more heavily as compared to a login event.
  • Transitioning to operation 730, the configuration information is stored. The configuration information may be stored in one or more locations. For example, the configuration information may be stored in a data store of the online service and/or in an external data store from the online service.
  • Flowing to operation 740, the event information from different accounts in the online service is obtained. According to an embodiment, the event information is obtained from a log that records event information that is associated with the different accounts of the online service. The event information may be obtained automatically or manually. For example, the event information may be obtained every one hour, two hours, six hours, daily, and the like. Different accounts and types of accounts may be monitored for the different events. According to an embodiment, operator accounts of the online service are monitored and data from the monitoring is logged by the online service. Other types of accounts may also be monitored. For example, an authorized user may configure to monitor events for each account of the online service or select a subset of accounts to monitor. The event information may include different types of information. For example, the event information may include information such as, but not limited to: a type of event, a time the event occurred, a result of the event, and the like.
  • Transitioning to operation 750, profiles are updated with the event information. One or more profiles, such as baseline profiles and recent profiles, may be updated with the event information. According to an embodiment, a baseline profile and a recent profile for each account that is being monitored is updated. An aggregate baseline profile and an aggregate recent profile are also updated. The aggregate profiles include event information that is generated from more than one account. According to an embodiment, the aggregate profiles include account information for each account of a tenant of the online service. For example, separate aggregate profiles are maintained for each of the different customers that are serviced by the online service. As discussed, the different profiles may be used to detect anomalous activity of the online service.
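  • Maintaining a separate aggregate profile for each tenant, as described above, may be sketched as follows; the record fields are assumptions for illustration.

```python
from collections import Counter, defaultdict

def update_aggregate_profiles(log_records):
    """Aggregate event counts per tenant so that each customer of the online
    service keeps its own aggregate profile, separate from other tenants."""
    profiles = defaultdict(Counter)
    for record in log_records:
        profiles[record["tenant_id"]][record["event_type"]] += 1
    return profiles
```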
  • The process then flows to an end operation and returns to processing other actions.
  • FIG. 8 illustrates an exemplary online system that includes anomaly detection. As illustrated, system 1000 includes service 1010, data store 1045, touch screen input device 1050 (e.g., a slate), smart phone 1030 and display device 1080.
  • As illustrated, service 1010 is a cloud based and/or enterprise based service that may be configured to provide services, such as productivity services (e.g. spreadsheets, documents, presentations, charts, messages, and the like). The service may be interacted with using different types of input/output. For example, a user may use speech input, touch input, hardware based input, and the like. Functionality of one or more of the services/applications provided by service 1010 may also be configured as a client/server based application.
  • As illustrated, service 1010 is a multi-tenant service that provides resources 1015 and services to any number of tenants (e.g. Tenants 1-N). Multi-tenant service 1010 is a cloud based service that provides resources/services 1015 to tenants subscribed to the service and maintains each tenant's data separately and protected from other tenant data.
  • System 1000 as illustrated comprises a touch screen input device 1050 (e.g. a slate/tablet device) and smart phone 1030 that detects when a touch input has been received (e.g. a finger touching or nearly touching the touch screen). Any type of touch screen may be utilized that detects a user's touch input. For example, the touch screen may include one or more layers of capacitive material that detects the touch input. Other sensors may be used in addition to or in place of the capacitive material. For example, Infrared (IR) sensors may be used. According to an embodiment, the touch screen is configured to detect objects that are in contact with or above a touchable surface. Although the term “above” is used in this description, it should be understood that the orientation of the touch panel system is irrelevant. The term “above” is intended to be applicable to all such orientations. The touch screen may be configured to determine locations of where touch input is received (e.g. a starting point, intermediate points and an ending point). Actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, by a vibration sensor or microphone coupled to the touch panel. A non-exhaustive list of examples for sensors to detect contact includes pressure-based mechanisms, micro-machined accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, and inductive sensors.
  • According to an embodiment, smart phone 1030, touch screen input device 1050, and device 1080 each include an application (1031, 1051, 1081).
  • As illustrated, touch screen input device 1050, smart phone 1030, and display device 1080 show exemplary displays 1052/1032/1082 showing the use of an application and a display relating to anomaly detection. Data may be stored on a device (e.g., smart phone 1030, slate 1050) and/or at some other location (e.g., network data store 1045). Data store 1045, or some other store, may be used to store data sets, event information, baseline profiles, recent profiles, as well as other data. The applications used by the devices may be client based applications, server based applications, cloud based applications, and/or some combination. According to an embodiment, display device 1080 is a device such as a MICROSOFT XBOX coupled to a display.
  • Anomaly detector 26 is configured to perform operations relating to detecting anomalous activity as described herein. While anomaly detector 26 is shown within service 1010, the functionality of the anomaly detector may be included in other locations (e.g. on smart phone 1030 and/or slate device 1050 and/or device 1080).
  • Some example events that may be monitored and included in the baseline profile and the recent profile include, but are not limited to: the operating system is starting up; the operating system is shutting down; An authentication package has been loaded by the Local Security Authority; A trusted logon process has been registered with the Local Security Authority; Internal resources allocated for the queuing of audit messages have been exhausted, leading to the loss of some audits; A notification package has been loaded by the Security Account Manager; Invalid use of a port; the system time was changed; A monitored security event pattern has occurred; Administrator recovered system from CrashOnAuditFail; A security package has been loaded by the Local Security Authority; An account was successfully logged on; An account failed to log on; User/Device claims information; An account was logged off; User initiated logoff; A logon was attempted using explicit credentials; A replay attack was detected; A handle to an object was requested; A registry value was modified; The handle to an object was closed; A handle to an object was requested with intent to delete; An object was deleted; A handle to an object was requested; An operation was performed on an object; An attempt was made to access an object; An attempt was made to create a hard link; An attempt was made to create an application client context; An application attempted an operation; An application client context was deleted; An application was initialized; Permissions on an object were changed; An application attempted to access a blocked ordinal; Special privileges assigned to new logon; A privileged service was called; An operation was attempted on a privileged object; Security Identifiers (SIDs) were filtered; A new process has been created; A process has exited; An attempt was made to duplicate a handle to an object; Indirect access to an object was requested; Backup of data protection master key was attempted; Recovery of data protection master key was attempted; Protection of auditable protected data was attempted; Unprotection of auditable protected data was attempted; A primary token was assigned to process; A service was installed in the system; A scheduled task was created, deleted, enabled, disabled or updated; A user right was assigned or removed; A user right was removed; A new trust was created to a domain; A trust to a domain was removed; Kerberos policy was changed; Encrypted data recovery policy was changed; The audit policy on an object was changed; Trusted domain information was modified; System security access was granted or removed to or from an account; System audit policy was changed; A user account was created, changed or enabled; An attempt was made to change or reset an account's password; An attempt was made to reset an accounts password; A user account was disabled or deleted; A user account was deleted; A security-enabled global group was created, changed or deleted; A member was added to a security-enabled global group or removed from a security-enabled global group; A security-enabled local global group was deleted, changed or created; A member was added to a security-enabled local group or removed from a security-enabled local group; Domain Policy was changed; A user account was locked out; A computer account was created, changed or deleted; A security-disabled local group was created or changed; A member was added to a security-disabled local group or removed from a security-disabled local group; A 
security-disabled local group was deleted; A member was added to a security-disabled global group or removed from a security-disabled global group; A security-disabled global group was deleted; A security-enabled universal group was created or changed; A member was added to or removed from a security-enabled universal group; A security-enabled universal group was deleted; A member was added to or removed from a security-disabled universal group; A security-disabled universal group was deleted; A group's type was changed; SID History was added to an account; An attempt to add SID History to an account failed; A user account was unlocked; A Kerberos authentication ticket was requested; A Kerberos service ticket was requested or renewed; Kerberos pre-authentication failed; A Kerberos authentication ticket request failed; A Kerberos service ticket request failed; An account was mapped for logon; An account could not be mapped for logon; The domain controller attempted to validate the credentials for an account; The domain controller failed to validate the credentials for an account; A session was reconnected or disconnected; The Access Control List was set on accounts which are members of administrators groups; The name of an account was changed; The password hash of an account was accessed; A basic application group was created, changed or deleted; A member was added to a basic application group or removed from a basic application group; A non-member was added to a basic application group or removed from a basic application group; A Lightweight Directory Access Protocol (LDAP) query group was created or deleted; The Password Policy Checking API was called; An attempt was made to set the Directory Services Restore Mode administrator password; An attempt was made to query the existence of a blank password for an account; RPC detected an integrity violation while decrypting an incoming message; Auditing settings on an object were changed; Proposed Central Access Policy does not grant the same access permissions as the current Central Access Policy; Central Access Policies on the machine have been changed; A namespace collision was detected; A trusted forest information entry was added, removed or modified; The certificate manager denied a pending certificate request; Certificate Services received a resubmitted certificate request; Certificate Services revoked a certificate; Certificate Services received a request to publish the certificate revocation list (CRL); Certificate Services published the certificate revocation list (CRL); A certificate request extension changed; One or more certificate request attributes changed; Certificate Services received a request to shut down; Certificate Services backup started or completed; Certificate Services restore started or completed; Certificate Services started or stopped; The security permissions for Certificate Services changed; Certificate Services retrieved an archived key; Certificate Services imported a certificate into its database; The audit filter for Certificate Services changed; Certificate Services received a certificate request; Certificate Services approved a certificate request and issued a certificate; Certificate Services denied a certificate request; Certificate Services set the status of a certificate request to pending; The certificate manager settings for Certificate Services changed; A configuration entry changed in Certificate Services; A property of Certificate Services changed; Certificate Services
archived a key; Certificate Services imported and archived a key; Certificate Services published the CA certificate to Active Directory Domain Services; One or more rows have been deleted from the certificate database; Role separation enabled; Certificate Services loaded a template; A Certificate Services template was updated; Certificate Services template security was updated; The Per-user audit policy table was created; An attempt was made to register a security event source; An attempt was made to unregister a security event source; A CrashOnAuditFail value has changed; Special Groups Logon table modified; The local policy settings for the Trusted Platform Module (TPM) Base Services (TBS) were changed; The group policy settings for the TBS were changed; Resource attributes of the object were changed; Per User Audit Policy was changed; Central Access Policy on the object was changed; An Active Directory replica source naming context was established, removed, or modified; An Active Directory replica destination naming context was modified; Synchronization of a replica of an Active Directory naming context has begun or ended; Attributes of an Active Directory object were replicated; Replication failure begins or ends; A lingering object was removed from a replica; The following policy was active when the Firewall started; A rule was listed when the Firewall started; A change has been made to the Firewall exception list, such as a rule being added, modified or deleted; Firewall settings were restored to the default values; A Firewall setting has changed; A rule has been ignored because its major version number was not recognized by the Firewall; Parts of a rule have been ignored because its minor version number was not recognized by the Firewall; A rule has been ignored by the Firewall because it could not parse the rule; Firewall Group Policy settings have changed; The Firewall has changed the active profile; The Firewall did not apply the following rule; The Firewall did not apply the following rule because the rule referred to items not configured on this computer; Special groups have been assigned to a new logon; Internet Protocol Security (IPsec) Services was started; IPsec Services was disabled; IPsec Services encountered a potentially serious failure; An IPsec Main Mode security association was established; An IPsec Main Mode negotiation failed; An IPsec Quick Mode negotiation failed; An IPsec Main Mode security association ended; IPsec dropped an inbound packet that failed an integrity check; IPsec dropped an inbound packet that failed a replay check; IPsec dropped an inbound clear text packet that should have been secured; IPsec received a packet from a remote computer with an incorrect Security Parameter Index (SPI); During Main Mode negotiation, IPsec received an invalid negotiation packet; During Quick Mode negotiation, IPsec received an invalid negotiation packet; During Extended Mode negotiation, IPsec received an invalid negotiation packet; IPsec Main Mode and Extended Mode security associations were established; An IPsec Extended Mode negotiation failed; and the state of a transaction has changed.
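  • Many of the events enumerated above correspond to entries in an operating-system security log. As a hedged illustration of how a monitored subset of such events might be configured (receiving configuration information for the monitored events is recited in the claims below), the following sketch keys events by numeric identifier; the identifiers and the record shape are assumptions made for this example, not values given in the specification.

    # Illustrative only: a configurable set of monitored events keyed by a
    # numeric event identifier, as a security log might expose them. The
    # identifiers below are assumptions for this sketch.
    MONITORED_EVENTS = {
        4624: "An account was successfully logged on",
        4625: "An account failed to log on",
        4672: "Special privileges assigned to new logon",
        4720: "A user account was created",
        4724: "An attempt was made to reset an account's password",
    }

    def filter_monitored(log_entries, monitored=MONITORED_EVENTS):
        """Keep only log entries whose event identifier is configured for
        inclusion in the baseline and recent profiles. Entries are assumed
        to be dicts carrying an 'event_id' field (an assumption made here)."""
        return [entry for entry in log_entries if entry.get("event_id") in monitored]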
  • Custom events may also be monitored. According to an embodiment, a log file records a custom event whenever the online service detects that event for any user of the online service.
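  • As one minimal sketch of such a custom-event log, assuming a JSON-lines record format whose field names are chosen for this illustration only:

    import json
    import time

    # Hypothetical append-only custom-event log; the record format and
    # field names are assumptions made for this illustration.
    def log_custom_event(path, account, event_name, details=None):
        record = {
            "timestamp": time.time(),
            "account": account,
            "event": event_name,
            "details": details or {},
        }
        with open(path, "a", encoding="utf-8") as fh:
            fh.write(json.dumps(record) + "\n")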
  • The embodiments and functionalities described herein may operate via a multitude of computing systems including, without limitation, desktop computer systems, wired and wireless computing systems, mobile computing systems (e.g., mobile telephones, netbooks, tablet or slate type computers, notebook computers, and laptop computers), handheld devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, and mainframe computers.
  • In addition, the embodiments and functionalities described herein may operate over distributed systems (e.g., cloud-based computing systems), where application functionality, memory, data storage and retrieval, and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet. User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which they are projected. Interactions with the multitude of computing systems with which embodiments of the invention may be practiced include keystroke entry, touch screen entry, voice or other audio entry, gesture entry where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures to control the functionality of the computing device, and the like.
  • FIGS. 9-11 and the associated descriptions provide a discussion of a variety of operating environments in which embodiments of the invention may be practiced. However, the devices and systems illustrated and discussed with respect to FIGS. 9-11 are for purposes of example and illustration and are not limiting of the vast number of computing device configurations that may be utilized for practicing embodiments of the invention described herein.
  • FIG. 9 is a block diagram illustrating physical components (i.e., hardware) of a computing device 1100 with which embodiments of the invention may be practiced. The computing device components described below may be suitable for the computing devices described above. In a basic configuration, the computing device 1100 may include at least one processing unit 1102 and a system memory 1104. Depending on the configuration and type of computing device, the system memory 1104 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories. The system memory 1104 may include an operating system 1105 and one or more program modules 1106 suitable for running software applications 1120 such as the anomaly detector 26. The operating system 1105, for example, may be suitable for controlling the operation of the computing device 1100. Furthermore, embodiments of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and are not limited to any particular application or system. This basic configuration is illustrated in FIG. 9 by those components within a dashed line 1108. The computing device 1100 may have additional features or functionality. For example, the computing device 1100 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 9 by a removable storage device 1109 and a non-removable storage device 1110.
  • As stated above, a number of program modules and data files may be stored in the system memory 1104. While executing on the processing unit 1102, the program modules 1106 (e.g., the anomaly detector 26) may perform processes including, but not limited to, one or more of the stages of the methods and processes illustrated in the figures. Other program modules that may be used in accordance with embodiments of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided design application programs, etc.
  • Furthermore, embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 9 may be integrated onto a single integrated circuit. Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality, all of which are integrated (or "burned") onto the chip substrate as a single integrated circuit. When operating via an SOC, the functionality described herein with respect to the anomaly detector 26 may be operated via application-specific logic integrated with other components of the computing device 1100 on the single integrated circuit (chip). Embodiments of the invention may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the invention may be practiced within a general purpose computer or in any other circuits or systems.
  • The computing device 1100 may also have one or more input device(s) 1112 such as a keyboard, a mouse, a pen, a sound input device, a touch input device, etc. Output device(s) 1114, such as a display, speakers, a printer, etc., may also be included. The aforementioned devices are examples and others may be used. The computing device 1100 may include one or more communication connections 1116 allowing communications with other computing devices 1118. Examples of suitable communication connections 1116 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
  • The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 1104, the removable storage device 1109, and the non-removable storage device 1110 are all computer storage media examples (i.e., memory storage). Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 1100. Any such computer storage media may be part of the computing device 1100. Computer storage media does not include a carrier wave or other propagated or modulated data signal.
  • Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
  • FIGS. 10A and 10B illustrate a mobile computing device 1200, for example, a mobile telephone, a smart phone, a tablet personal computer, a laptop computer, and the like, with which embodiments of the invention may be practiced. With reference to FIG. 10A, one embodiment of a mobile computing device 1200 for implementing the embodiments is illustrated. In a basic configuration, the mobile computing device 1200 is a handheld computer having both input elements and output elements. The mobile computing device 1200 typically includes a display 1205 and one or more input buttons 1210 that allow the user to enter information into the mobile computing device 1200. The display 1205 of the mobile computing device 1200 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 1215 allows further user input. The side input element 1215 may be a rotary switch, a button, or any other type of manual input element. In alternative embodiments, mobile computing device 1200 may incorporate more or fewer input elements. For example, the display 1205 may not be a touch screen in some embodiments. In yet another alternative embodiment, the mobile computing device 1200 is a portable phone system, such as a cellular phone. The mobile computing device 1200 may also include an optional keypad 1235. Optional keypad 1235 may be a physical keypad or a "soft" keypad generated on the touch screen display. In various embodiments, the output elements include the display 1205 for showing a graphical user interface (GUI), a visual indicator 1220 (e.g., a light emitting diode), and/or an audio transducer 1225 (e.g., a speaker). In some embodiments, the mobile computing device 1200 incorporates a vibration transducer for providing the user with tactile feedback. In yet another embodiment, the mobile computing device 1200 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., an HDMI port) for sending signals to or receiving signals from an external device.
  • FIG. 10B is a block diagram illustrating the architecture of one embodiment of a mobile computing device. That is, the mobile computing device 1200 can incorporate a system 1202 (i.e., an architecture) to implement some embodiments. In one embodiment, the system 1202 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players). In some embodiments, the system 1202 is integrated as a computing device, such as an integrated personal digital assistant (PDA) and wireless phone.
  • One or more application programs 1266 may be loaded into the memory 1262 and run on or in association with the operating system 1264. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 1202 also includes a non-volatile storage area 1268 within the memory 1262. The non-volatile storage area 1268 may be used to store persistent information that should not be lost if the system 1202 is powered down. The application programs 1266 may use and store information in the non-volatile storage area 1268, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on the system 1202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 1268 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 1262 and run on the mobile computing device 1200, including the anomaly detector 26 as described herein.
  • The system 1202 has a power supply 1270, which may be implemented as one or more batteries. The power supply 1270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
  • The system 1202 may also include a radio 1272 that performs the function of transmitting and receiving radio frequency communications. The radio 1272 facilitates wireless connectivity between the system 1202 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio 1272 are conducted under control of the operating system 1264. In other words, communications received by the radio 1272 may be disseminated to the application programs 1266 via the operating system 1264, and vice versa.
  • The visual indicator 1220 may be used to provide visual notifications, and/or an audio interface 1274 may be used for producing audible notifications via the audio transducer 1225. In the illustrated embodiment, the visual indicator 1220 is a light emitting diode (LED) and the audio transducer 1225 is a speaker. These devices may be directly coupled to the power supply 1270 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 1260 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 1274 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 1225, the audio interface 1274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with embodiments of the present invention, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 1202 may further include a video interface 1276 that enables an operation of an on-board camera to record still images, video stream, and the like.
  • A mobile computing device 1200 implementing the system 1202 may have additional features or functionality. For example, the mobile computing device 1200 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 10B by the non-volatile storage area 1268. Mobile computing device 1200 may also include peripheral device port 1230.
  • Data/information generated or captured by the mobile computing device 1200 and stored via the system 1202 may be stored locally on the mobile computing device 1200, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 1272 or via a wired connection between the mobile computing device 1200 and a separate computing device associated with the mobile computing device 1200, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed through the mobile computing device 1200 via the radio 1272 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
  • FIG. 11 illustrates an embodiment of an architecture of an exemplary system, as described above. Content developed, interacted with, or edited in association with the anomaly detector 26 may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 1322, a web portal 1324, a mailbox service 1326, an instant messaging store 1328, or a social networking site 1330. The anomaly detector 26 may use any of these types of systems or the like for enabling data utilization, as described herein. A server 1320 may provide the anomaly detector 26 to clients. As one example, the server 1320 may be a web server providing the anomaly detector 26 over the web. The server 1320 may provide the anomaly detector 26 over the web to clients through a network 1315. By way of example, the client computing device may be implemented as the computing device 1100 and embodied in a personal computer, a tablet computing device 1310 and/or a mobile computing device 1200 (e.g., a smart phone). Any of these embodiments of the client computing device 1100, 1310, 1200 may obtain content from the store 1316.
  • Embodiments of the present invention, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
  • The description and illustration of one or more embodiments provided in this application are not intended to limit or restrict the scope of the invention as claimed in any way. The embodiments, examples, and details provided in this application are considered sufficient to convey possession and enable others to make and use the best mode of the claimed invention. The claimed invention should not be construed as being limited to any embodiment, example, or detail provided in this application. Regardless of whether shown and described in combination or separately, the various features (both structural and methodological) are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the spirit of the broader aspects of the general inventive concept embodied in this application that do not depart from the broader scope of the claimed invention.

Claims (20)

What is claimed is:
1. A method for detecting anomalous activity in an online service, comprising:
accessing a baseline profile comprising past event information related to events that originate from accounts of the online service;
accessing a recent profile comprising recent event information related to recent events that originate from accounts of the online service;
comparing a frequency of the past event information in the baseline profile to a frequency of the recent event information in the recent profile to determine when anomalous activity is detected in the online service; and
reporting the anomalous activity when detected.
2. The method of claim 1, wherein the past event information and the recent event information comprise security events of the online service.
3. The method of claim 1, wherein the recent profile comprises recent event information drawn from between a few hours and a day of activity-related data.
4. The method of claim 1, further comprising periodically accessing event information from accounts of the online service and updating the baseline profile and the recent profile using the accessed information.
5. The method of claim 1, wherein comparing the frequency of the past event information in the baseline profile to the frequency of the recent event information in the recent profile to determine when anomalous activity is detected in the online service comprises determining when the frequency of at least one of the events in the recent event information is larger than the frequency of the at least one of the events in the past event information.
6. The method of claim 1, further comprising automatically associating a weight with each of the events that are included in the baseline profile and the recent profile.
7. The method of claim 1, wherein the baseline profile and the recent profile comprises event information that is obtained from operator accounts that have permissions to create, modify, and delete accounts for users.
8. The method of claim 1, further comprising receiving configuration information relating to configuring events to monitor and include in the recent profile and the baseline profile.
9. The method of claim 1, wherein reporting the anomalous activity when detected comprises automatically creating a report that ranks each account being monitored according to a detected level of anomalous activity and displaying more detailed information relating to the anomalous activity of at least one of the accounts when determined.
10. A computer-readable storage medium storing computer-executable instructions for detecting anomalous activity in an online service, comprising:
accessing a baseline profile comprising past event information related to events comprising security events that originate from accounts of the online service;
accessing a recent profile comprising recent event information related to recent events comprising the security events that originated within a last day from accounts of the online service;
comparing the baseline profile to the recent profile and when the baseline profile and the recent profile are different then detecting anomalous activity; and
reporting the anomalous activity when detected.
11. The computer-readable storage medium of claim 10, further comprising periodically accessing a system log to obtain event information from accounts of the online service and updating the baseline profile and the recent profile using the accessed information.
12. The computer-readable storage medium of claim 10, wherein comparing the baseline profile to the recent profile comprises comparing a frequency of occurrence of the recent event information with a frequency of occurrence of the past event information.
13. The computer-readable storage medium of claim 10, further comprising associating a weight with each of the events that are included in the baseline profile and the recent profile that affects how much an occurrence of the event is used in detecting the anomalous activity.
14. The computer-readable storage medium of claim 10, wherein the baseline profile comprises past event information from operator accounts.
15. The computer-readable storage medium of claim 10, further comprising displaying a Graphical User Interface (GUI) and receiving configuration information relating to configuring the events from the GUI.
16. The computer-readable storage medium of claim 10, wherein reporting the anomalous activity when detected comprises automatically creating a report and delivering the report and displaying more detailed information relating to the anomalous activity of at least one of the accounts in response to a selection of an account within the report.
17. A system for detecting anomalous activity in an online service, comprising:
a processor and memory;
an operating environment executing using the processor; and
an anomaly detector that is configured to perform actions comprising:
accessing a baseline profile comprising past event information related to events comprising security events that originate from accounts of the online service;
accessing a recent profile comprising recent event information related to recent events comprising the security events that originated within a last day from accounts of the online service;
comparing frequencies of past event information in the baseline profile to frequencies of the recent event information in the recent profile to determine when anomalous activity is detected in the online service; and
reporting the anomalous activity when detected.
18. The system of claim 17, further comprising periodically accessing a system log to obtain event information from accounts of the online service and updating the baseline profile and the recent profile using the accessed information.
19. The system of claim 17, further comprising associating a weight with each of the events that are included in the baseline profile and the recent profile that affects how much an occurrence of the event is used in detecting the anomalous activity.
20. The system of claim 17, further comprising displaying a Graphical User Interface (GUI) and receiving configuration information relating to configuring the events from the GUI.
US14/945,010 2013-12-19 2015-11-18 Detecting anomalous activity from accounts of an online service Abandoned US20160080406A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/945,010 US20160080406A1 (en) 2013-12-19 2015-11-18 Detecting anomalous activity from accounts of an online service

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/134,575 US9210183B2 (en) 2013-12-19 2013-12-19 Detecting anomalous activity from accounts of an online service
US14/945,010 US20160080406A1 (en) 2013-12-19 2015-11-18 Detecting anomalous activity from accounts of an online service

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/134,575 Continuation US9210183B2 (en) 2013-12-19 2013-12-19 Detecting anomalous activity from accounts of an online service

Publications (1)

Publication Number Publication Date
US20160080406A1 true US20160080406A1 (en) 2016-03-17

Family

ID=52358974

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/134,575 Active US9210183B2 (en) 2013-12-19 2013-12-19 Detecting anomalous activity from accounts of an online service
US14/945,010 Abandoned US20160080406A1 (en) 2013-12-19 2015-11-18 Detecting anomalous activity from accounts of an online service

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/134,575 Active US9210183B2 (en) 2013-12-19 2013-12-19 Detecting anomalous activity from accounts of an online service

Country Status (4)

Country Link
US (2) US9210183B2 (en)
EP (1) EP3085053A1 (en)
CN (1) CN105874767B (en)
WO (1) WO2015094873A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10084802B1 (en) * 2016-06-21 2018-09-25 Palantir Technologies Inc. Supervisory control and data acquisition
US10419931B1 (en) * 2016-08-25 2019-09-17 EMC IP Holding Company LLC Security for network computing environment using centralized security system
US10943023B2 (en) * 2016-06-16 2021-03-09 EMC IP Holding Company LLC Method for filtering documents and electronic device
US11100046B2 (en) * 2016-01-25 2021-08-24 International Business Machines Corporation Intelligent security context aware elastic storage
US11316851B2 (en) 2019-06-19 2022-04-26 EMC IP Holding Company LLC Security for network environment using trust scoring based on power consumption of devices within network
US11418529B2 (en) * 2018-12-20 2022-08-16 Palantir Technologies Inc. Detection of vulnerabilities in a computer network
US11546769B1 (en) * 2021-06-30 2023-01-03 Fortinet, Inc. NGFW (next generation firewall) security inspection over multiple sessions of message session relay protocol (MSRP) on a data communication network
US20230015269A1 (en) * 2021-07-15 2023-01-19 AVAST Software s.r.o. Data exfiltration detection
US11941155B2 (en) 2021-03-15 2024-03-26 EMC IP Holding Company LLC Secure data management in a network computing environment
WO2024144778A1 (en) * 2022-12-29 2024-07-04 Varonis Systems, Inc. Indicators of compromise of access

Families Citing this family (145)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10346620B2 (en) * 2004-02-06 2019-07-09 Early Warning Service, LLC Systems and methods for authentication of access based on multi-data source information
US10015134B2 (en) * 2011-12-29 2018-07-03 Verisign, Inc. Methods and systems for creating new domains
US10015153B1 (en) * 2013-12-23 2018-07-03 EMC IP Holding Company LLC Security using velocity metrics identifying authentication performance for a set of devices
US9842027B1 (en) * 2013-12-27 2017-12-12 EMC IP Holding Company LLC Intelligent application optimized backups
US10264025B2 (en) 2016-06-24 2019-04-16 Varmour Networks, Inc. Security policy generation for virtualization, bare-metal server, and cloud computing environments
US10091238B2 (en) 2014-02-11 2018-10-02 Varmour Networks, Inc. Deception using distributed threat detection
JP5613855B1 (en) * 2014-04-23 2014-10-29 株式会社 ディー・エヌ・エー User authentication system
US10122753B2 (en) * 2014-04-28 2018-11-06 Sophos Limited Using reputation to avoid false malware detections
US9917851B2 (en) 2014-04-28 2018-03-13 Sophos Limited Intrusion detection using a heartbeat
US11017330B2 (en) 2014-05-20 2021-05-25 Elasticsearch B.V. Method and system for analysing data
US10140309B2 (en) * 2014-06-10 2018-11-27 Alfresco Software, Inc. File tracking on client machines synchronized with a content management system repository
US9591006B2 (en) * 2014-09-18 2017-03-07 Microsoft Technology Licensing, Llc Lateral movement detection
US9749311B2 (en) 2014-09-24 2017-08-29 Oracle International Corporation Policy based compliance management and remediation of devices in an enterprise system
US10482404B2 (en) * 2014-09-25 2019-11-19 Oracle International Corporation Delegated privileged access grants
US10530790B2 (en) * 2014-09-25 2020-01-07 Oracle International Corporation Privileged session analytics
US9967278B2 (en) * 2014-10-21 2018-05-08 Proofpoint, Inc. Systems and methods for application security analysis
US9705896B2 (en) * 2014-10-28 2017-07-11 Facebook, Inc. Systems and methods for dynamically selecting model thresholds for identifying illegitimate accounts
US10510081B2 (en) * 2014-10-30 2019-12-17 Unisys Corporation Cargo air waybill audit
GB201915196D0 (en) 2014-12-18 2019-12-04 Sophos Ltd A method and system for network access control based on traffic monitoring and vulnerability detection using process related information
US9973505B2 (en) * 2015-01-14 2018-05-15 Samsung Electronics Co., Ltd. Method for controlling contents and electronic device thereof
WO2016118523A1 (en) * 2015-01-19 2016-07-28 InAuth, Inc. Systems and methods for trusted path secure communication
US10193929B2 (en) * 2015-03-13 2019-01-29 Varmour Networks, Inc. Methods and systems for improving analytics in distributed networks
EP3073718B1 (en) * 2015-03-27 2019-01-30 Deutsche Telekom AG Method for the individual prediction of the use and/or customisation of the use of a personalized telecommunication terminal to be operated by a user, telecommunication terminal, computer program and a computer program product
US9380027B1 (en) 2015-03-30 2016-06-28 Varmour Networks, Inc. Conditional declarative policies
CN106156149B (en) * 2015-04-14 2020-01-03 阿里巴巴集团控股有限公司 Data transfer method and device
US10193867B2 (en) 2015-05-27 2019-01-29 Ping Identity Corporation Methods and systems for API proxy based adaptive security
CN106302603A (en) * 2015-06-05 2017-01-04 腾讯科技(深圳)有限公司 The method and apparatus remotely deleting information
JP6341150B2 (en) * 2015-07-09 2018-06-13 京セラドキュメントソリューションズ株式会社 Image forming apparatus and abnormality management system for image forming apparatus
US10313212B2 (en) * 2015-09-22 2019-06-04 Veniam, Inc. Systems and methods for detecting and classifying anomalies in a network of moving things
JP5933797B1 (en) 2015-10-07 2016-06-15 株式会社ソリトンシステムズ Log information generating apparatus and program, and log information extracting apparatus and program
EP3377982A4 (en) * 2015-11-18 2019-07-31 Level 3 Communications, LLC Service activation system
US10191758B2 (en) 2015-12-09 2019-01-29 Varmour Networks, Inc. Directing data traffic between intra-server virtual machines
US10289693B2 (en) * 2015-12-30 2019-05-14 Dropbox, Inc. Techniques for providing user interface enhancements for online content management system version histories
US9680852B1 (en) 2016-01-29 2017-06-13 Varmour Networks, Inc. Recursive multi-layer examination for computer network security remediation
US10659466B2 (en) * 2016-03-22 2020-05-19 Microsoft Technology Licensing, Llc Secure resource-based policy
US10366241B2 (en) * 2016-03-30 2019-07-30 The Privacy Factor, LLC Systems and methods for analyzing, assessing and controlling trust and authentication in applications and devices
US10284567B2 (en) 2016-05-03 2019-05-07 Paypal, Inc. Targeted authentication queries based on detected user actions
US10104119B2 (en) * 2016-05-11 2018-10-16 Cisco Technology, Inc. Short term certificate management during distributed denial of service attacks
US10554614B2 (en) * 2016-06-23 2020-02-04 Cisco Technology, Inc. Utilizing service tagging for encrypted flow classification
US10755334B2 (en) 2016-06-30 2020-08-25 Varmour Networks, Inc. Systems and methods for continually scoring and segmenting open opportunities using client data and product predictors
US10702744B2 (en) * 2016-09-07 2020-07-07 ATA IT Services LLC Fitness based control of communications device
US11660504B2 (en) * 2016-09-07 2023-05-30 ATA IT Services LLC Fitness based control of communication device
US10218719B2 (en) * 2016-09-21 2019-02-26 Apple Inc. Credential modification notifications
US10574700B1 (en) * 2016-09-30 2020-02-25 Symantec Corporation Systems and methods for managing computer security of client computing machines
US10915622B2 (en) * 2016-10-18 2021-02-09 Microsoft Technology Licensing, Llc Detecting local user security-related anomalies using active scans
US10681012B2 (en) 2016-10-26 2020-06-09 Ping Identity Corporation Methods and systems for deep learning based API traffic security
US10191818B2 (en) * 2016-11-14 2019-01-29 Sap Se Filtered replication of data in distributed system of data centers
WO2018124672A1 (en) 2016-12-28 2018-07-05 Samsung Electronics Co., Ltd. Apparatus for detecting anomaly and operating method for the same
US10628590B2 (en) * 2017-01-24 2020-04-21 Salesforce.Com, Inc. Application security assessment
US10320800B2 (en) * 2017-03-13 2019-06-11 International Business Machines Corporation Fraud detection mechanism
US11621969B2 (en) 2017-04-26 2023-04-04 Elasticsearch B.V. Clustering and outlier detection in anomaly and causation detection for computing environments
US10986110B2 (en) 2017-04-26 2021-04-20 Elasticsearch B.V. Anomaly and causation detection in computing environments using counterfactual processing
US11783046B2 (en) * 2017-04-26 2023-10-10 Elasticsearch B.V. Anomaly and causation detection in computing environments
US10904289B2 (en) 2017-04-30 2021-01-26 Splunk Inc. Enabling user definition of custom threat rules in a network security system
US11032307B2 (en) * 2017-04-30 2021-06-08 Splunk Inc. User interface for defining custom threat rules in a network security system
US10999296B2 (en) 2017-05-15 2021-05-04 Forcepoint, LLC Generating adaptive trust profiles using information derived from similarly situated organizations
US9882918B1 (en) 2017-05-15 2018-01-30 Forcepoint, LLC User behavior profile in a blockchain
US10999297B2 (en) 2017-05-15 2021-05-04 Forcepoint, LLC Using expected behavior of an entity when prepopulating an adaptive trust profile
US10447718B2 (en) 2017-05-15 2019-10-15 Forcepoint Llc User profile definition and management
US10917423B2 (en) 2017-05-15 2021-02-09 Forcepoint, LLC Intelligently differentiating between different types of states and attributes when using an adaptive trust profile
US10943019B2 (en) 2017-05-15 2021-03-09 Forcepoint, LLC Adaptive trust profile endpoint
US10129269B1 (en) 2017-05-15 2018-11-13 Forcepoint, LLC Managing blockchain access to user profile information
US10623431B2 (en) * 2017-05-15 2020-04-14 Forcepoint Llc Discerning psychological state from correlated user behavior and contextual information
US10862927B2 (en) 2017-05-15 2020-12-08 Forcepoint, LLC Dividing events into sessions during adaptive trust profile operations
US10701094B2 (en) * 2017-06-22 2020-06-30 Oracle International Corporation Techniques for monitoring privileged users and detecting anomalous activities in a computing environment
US10264026B2 (en) * 2017-07-24 2019-04-16 Cyberark Software Ltd. Providing privileged access to non-privileged accounts
FR3069670A1 (en) * 2017-07-27 2019-02-01 Safran Identity and Security SOFTWARE FIREWALL
US11005892B2 (en) * 2017-09-17 2021-05-11 Allot Ltd. System, method, and apparatus of securing and managing internet-connected devices and networks
US12007941B2 (en) 2017-09-29 2024-06-11 Oracle International Corporation Session state tracking
EP4020282A1 (en) 2017-10-13 2022-06-29 Ping Identity Corporation Methods and apparatus for analyzing sequences of application programming interface traffic to identify potential malicious actions
US11979422B1 (en) * 2017-11-27 2024-05-07 Lacework, Inc. Elastic privileges in a secure access service edge
US10785190B2 (en) * 2017-12-13 2020-09-22 Adaptiv Networks Inc. System, apparatus and method for providing a unified firewall manager
US11438337B2 (en) * 2017-12-15 2022-09-06 Sap Se Multi-tenant support user cloud access
CN108133373A (en) * 2018-01-04 2018-06-08 交通银行股份有限公司 Seek the method and device for the adventure account for relating to machine behavior
US10860664B2 (en) 2018-03-19 2020-12-08 Roblox Corporation Data flood checking and improved performance of gaming processes
US10388286B1 (en) 2018-03-20 2019-08-20 Capital One Services, Llc Systems and methods of sound-based fraud protection
US10915587B2 (en) 2018-05-18 2021-02-09 Google Llc Data processing system for generating entries in data structures from network requests
US10867044B2 (en) * 2018-05-30 2020-12-15 AppOmni, Inc. Automatic computer system change monitoring and security gap detection system
US10404852B1 (en) 2018-06-01 2019-09-03 T-Mobile Usa, Inc. Control of real-time communication sessions via a communication privilege control (CPC) system
US10467310B1 (en) * 2018-06-02 2019-11-05 Romit Dey Selective online content removal based on activity history
US11218297B1 (en) * 2018-06-06 2022-01-04 Tripwire, Inc. Onboarding access to remote security control tools
RU2708355C1 (en) * 2018-06-29 2019-12-05 Акционерное общество "Лаборатория Касперского" Method of detecting malicious files that counteract analysis in isolated environment
US11017076B2 (en) * 2018-08-08 2021-05-25 Microsoft Technology Licensing, Llc Enhancing security using anomaly detection
US11165776B2 (en) * 2018-08-28 2021-11-02 International Business Machines Corporation Methods and systems for managing access to computing system resources
US10938853B1 (en) * 2018-08-29 2021-03-02 Amazon Technologies, Inc. Real-time detection and clustering of emerging fraud patterns
US11481241B2 (en) 2018-08-30 2022-10-25 Micron Technology, Inc. Virtual machine register in a computer processor
US11914726B2 (en) 2018-08-30 2024-02-27 Micron Technology, Inc. Access control for processor registers based on execution domains
US11500665B2 (en) 2018-08-30 2022-11-15 Micron Technology, Inc. Dynamic configuration of a computer processor based on the presence of a hypervisor
US10942863B2 (en) 2018-08-30 2021-03-09 Micron Technology, Inc. Security configurations in page table entries for execution domains using a sandbox application operation
US11182507B2 (en) * 2018-08-30 2021-11-23 Micron Technology, Inc. Domain crossing in executing instructions in computer processors
US11562315B2 (en) * 2018-08-31 2023-01-24 Accenture Global Solutions Limited Detecting an issue related to a report
US10999135B2 (en) * 2018-09-19 2021-05-04 Google Llc Fast provisioning in cloud computing environments
RU2720529C1 (en) * 2018-11-30 2020-04-30 Алибаба Груп Холдинг Лимитед Using table of nonces for resolving failure of parallel transactions with blockchains
US11675902B2 (en) * 2018-12-05 2023-06-13 Vmware, Inc. Security detection system with privilege management
US11188661B2 (en) * 2018-12-12 2021-11-30 Sap Se Semi-rule based high performance permission management
US11681710B2 (en) * 2018-12-23 2023-06-20 Microsoft Technology Licensing, Llc Entity extraction rules harvesting and performance
EP3678348A1 (en) 2019-01-04 2020-07-08 Ping Identity Corporation Methods and systems for data traffic based adpative security
US11140182B2 (en) 2019-01-11 2021-10-05 Optum, Inc. Predictive anomaly handling in a service provider system
CN109817347A (en) * 2019-01-15 2019-05-28 深圳市道通科技股份有限公司 Inline diagnosis platform, its right management method and Rights Management System
US11210407B2 (en) * 2019-01-25 2021-12-28 V440 Spółka Akcyjna Electronic communications device and messaging application therefor
US11102187B2 (en) * 2019-02-20 2021-08-24 Aetna Inc. Systems and methods for managing workflow transactions including protected personal data in regulated computing environments
US11283827B2 (en) * 2019-02-28 2022-03-22 Xm Cyber Ltd. Lateral movement strategy during penetration testing of a networked system
US11570213B2 (en) * 2019-04-03 2023-01-31 Cisco Technology, Inc. Collaborative security for application layer encryption
US11151576B2 (en) 2019-04-05 2021-10-19 At&T Intellectual Property I, L.P. Authorizing transactions using negative pin messages
US11126713B2 (en) * 2019-04-08 2021-09-21 Microsoft Technology Licensing, Llc Detecting directory reconnaissance in a directory service
US10997295B2 (en) 2019-04-26 2021-05-04 Forcepoint, LLC Adaptive trust profile reference architecture
US11863580B2 (en) 2019-05-31 2024-01-02 Varmour Networks, Inc. Modeling application dependencies to identify operational risk
US11310284B2 (en) 2019-05-31 2022-04-19 Varmour Networks, Inc. Validation of cloud security policies
US11290494B2 (en) 2019-05-31 2022-03-29 Varmour Networks, Inc. Reliability prediction for cloud security policies
US11290493B2 (en) 2019-05-31 2022-03-29 Varmour Networks, Inc. Template-driven intent-based security
US11711374B2 (en) 2019-05-31 2023-07-25 Varmour Networks, Inc. Systems and methods for understanding identity and organizational access to applications within an enterprise environment
US11575563B2 (en) 2019-05-31 2023-02-07 Varmour Networks, Inc. Cloud security management
US11226983B2 (en) * 2019-06-18 2022-01-18 Microsoft Technology Licensing, Llc Sub-scope synchronization
US11343257B2 (en) * 2019-06-27 2022-05-24 Microsoft Technology Licensing, Llc Extended domain platform for nonmember user account management
US11329987B2 (en) * 2019-07-08 2022-05-10 Bank Of America Corporation Protecting enterprise computing resources by implementing an optical air gap system
US11027196B2 (en) * 2019-09-04 2021-06-08 Take-Two Interactive Software, Inc. System and method for managing transactions in a multiplayer network gaming environment
US11936739B2 (en) * 2019-09-12 2024-03-19 Oracle International Corporation Automated reset of session state
US11461484B2 (en) 2019-12-30 2022-10-04 Imperva, Inc. Capturing contextual information for data accesses to improve data security
US20210233081A1 (en) * 2020-01-27 2021-07-29 Visa International Service Association Embedding inferred reaction correspondence from decline data
US11455532B2 (en) * 2020-03-18 2022-09-27 Optum Services (Ireland) Limited Single point facility utility sensing for monitoring welfare of a facility occupant
US11715106B2 (en) 2020-04-01 2023-08-01 Mastercard International Incorporated Systems and methods for real-time institution analysis based on message traffic
US11410178B2 (en) 2020-04-01 2022-08-09 Mastercard International Incorporated Systems and methods for message tracking using real-time normalized scoring
US11023607B1 (en) * 2020-04-03 2021-06-01 Imperva, Inc. Detecting behavioral anomalies in user-data access logs
CN111553700B (en) * 2020-05-07 2023-03-21 支付宝(杭州)信息技术有限公司 Payment risk identification method and device
US11676368B2 (en) 2020-06-30 2023-06-13 Optum Services (Ireland) Limited Identifying anomalous activity from thermal images
US11379775B2 (en) * 2020-07-14 2022-07-05 BankCard Services, LLC Computer-based information management system configured for automated and dynamic account analysis and methods thereof
US11321157B2 (en) * 2020-08-31 2022-05-03 Northrop Grumman Systems Corporation Method of operating a digital system operable in multiple operational states and digital system implementing such method
US11522863B2 (en) * 2020-10-29 2022-12-06 Shopify Inc. Method and system for managing resource access permissions within a computing environment
US12088583B2 (en) * 2020-11-11 2024-09-10 Hewlett Packard Enterprise Development Lp Permissions for backup-related operations
WO2022120840A1 (en) * 2020-12-11 2022-06-16 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for improving security
US11818152B2 (en) 2020-12-23 2023-11-14 Varmour Networks, Inc. Modeling topic-based message-oriented middleware within a security system
US11876817B2 (en) 2020-12-23 2024-01-16 Varmour Networks, Inc. Modeling queue-based message-oriented middleware relationships in a security system
US11943235B2 (en) 2021-01-04 2024-03-26 Saudi Arabian Oil Company Detecting suspicious user logins in private networks using machine learning
US11777978B2 (en) 2021-01-29 2023-10-03 Varmour Networks, Inc. Methods and systems for accurately assessing application access risk
US12050693B2 (en) 2021-01-29 2024-07-30 Varmour Networks, Inc. System and method for attributing user behavior from multiple technical telemetry sources
US11785015B2 (en) * 2021-02-24 2023-10-10 Bank Of America Corporation Information security system for detecting unauthorized access requests
US11895133B2 (en) * 2021-04-05 2024-02-06 Bank Of America Corporation Systems and methods for automated device activity analysis
US11716340B2 (en) * 2021-05-28 2023-08-01 Microsoft Technology Licensing, Llc Threat detection using cloud resource management logs
US12010125B2 (en) * 2021-06-29 2024-06-11 Microsoft Technology Licensing, Llc Anomaly detection in an application with delegate authorization
US11734316B2 (en) 2021-07-08 2023-08-22 Varmour Networks, Inc. Relationship-based search in a computing environment
KR102369960B1 (en) * 2021-07-30 2022-03-04 쿠팡 주식회사 Electronic apparatus for providing information based on existence of a user account and method thereof
WO2023069213A1 (en) * 2021-10-20 2023-04-27 Visa International Service Association Method, system, and computer program product for auto-profiling anomalies
US11748374B2 (en) * 2021-11-30 2023-09-05 Snowflake Inc. Replication group objects configuration in a network-based database system
US20230269262A1 (en) * 2022-02-24 2023-08-24 Microsoft Technology Licensing, Llc Detecting mass control plane operations
US20230267198A1 (en) * 2022-02-24 2023-08-24 Microsoft Technology Licensing, Llc Anomalous behavior detection with respect to control plane operations

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040185830A1 (en) * 1996-08-08 2004-09-23 Joao Raymond Anthony Apparatus and method for providing account security
US20080010678A1 (en) * 2004-09-17 2008-01-10 Jeff Burdette Authentication Proxy
US20080172316A1 (en) * 2007-01-12 2008-07-17 Adams Dean A Bank Card Fraud Detection And/Or Prevention Methods
US20090271343A1 (en) * 2008-04-25 2009-10-29 Anthony Vaiciulis Automated entity identification for efficient profiling in an event probability prediction system
US20100228580A1 (en) * 2009-03-04 2010-09-09 Zoldi Scott M Fraud detection based on efficient frequent-behavior sorted lists
US20120016633A1 (en) * 2010-07-16 2012-01-19 Andreas Wittenstein System and method for automatic detection of anomalous recurrent behavior
US20120137367A1 (en) * 2009-11-06 2012-05-31 Cataphora, Inc. Continuous anomaly detection based on behavior modeling and heterogeneous information analysis
US20130097709A1 (en) * 2011-10-18 2013-04-18 Mcafee, Inc. User behavioral risk assessment
US20150067845A1 (en) * 2013-08-27 2015-03-05 International Business Machines Corporation Detecting Anomalous User Behavior Using Generative Models of User Actions
US20150215329A1 (en) * 2012-07-31 2015-07-30 Anurag Singla Pattern Consolidation To Identify Malicious Activity
US9106687B1 (en) * 2011-11-01 2015-08-11 Symantec Corporation Mechanism for profiling user and group accesses to content repository
US9166993B1 (en) * 2013-07-25 2015-10-20 Symantec Corporation Anomaly detection based on profile history and peer history
US9710857B2 (en) * 2010-11-19 2017-07-18 Sap Se Detecting anomalous user activity

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5819226A (en) * 1992-09-08 1998-10-06 Hnc Software Inc. Fraud detection using predictive modeling
US20080275820A1 (en) * 2000-01-21 2008-11-06 Raymond Anthony Joao Apparatus and method for providing account security
US7971237B2 (en) 2003-05-15 2011-06-28 Verizon Business Global Llc Method and system for providing fraud detection for remote access services
US9407662B2 (en) 2005-12-29 2016-08-02 Nextlabs, Inc. Analyzing activity data of an information management system
US20070289013A1 (en) 2006-06-08 2007-12-13 Keng Leng Albert Lim Method and system for anomaly detection using a collective set of unsupervised machine-learning algorithms
US8234240B2 (en) 2007-04-26 2012-07-31 Microsoft Corporation Framework for providing metrics from any datasource
US20080270303A1 (en) * 2007-04-27 2008-10-30 Janice Zhou Method and system for detecting fraud in financial transactions
US9727440B2 (en) 2007-06-22 2017-08-08 Red Hat, Inc. Automatic simulation of virtual machine performance
US8230269B2 (en) 2008-06-17 2012-07-24 Microsoft Corporation Monitoring data categorization and module-based health correlations
US9397979B2 (en) 2009-04-22 2016-07-19 Hewlett Packard Enterprise Development Lp Router method and system
US8904241B2 (en) 2011-07-27 2014-12-02 Oracle International Corporation Proactive and adaptive cloud monitoring
US8646073B2 (en) 2011-05-18 2014-02-04 Check Point Software Technologies Ltd. Detection of account hijacking in a social network
US8745216B2 (en) 2011-11-17 2014-06-03 Infosys Limited Systems and methods for monitoring and controlling a service level agreement
CN102694696B (en) * 2012-05-14 2015-09-09 Computer Network Information Center, Chinese Academy of Sciences Method and device for DNS server anomaly detection

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040185830A1 (en) * 1996-08-08 2004-09-23 Joao Raymond Anthony Apparatus and method for providing account security
US20080010678A1 (en) * 2004-09-17 2008-01-10 Jeff Burdette Authentication Proxy
US20080172316A1 (en) * 2007-01-12 2008-07-17 Adams Dean A Bank Card Fraud Detection And/Or Prevention Methods
US20090271343A1 (en) * 2008-04-25 2009-10-29 Anthony Vaiciulis Automated entity identification for efficient profiling in an event probability prediction system
US20120101937A1 (en) * 2009-03-04 2012-04-26 Fair Isaac Corporation Fraud detection based on efficient frequent-behavior sorted lists
US20100228580A1 (en) * 2009-03-04 2010-09-09 Zoldi Scott M Fraud detection based on efficient frequent-behavior sorted lists
US20120137367A1 (en) * 2009-11-06 2012-05-31 Cataphora, Inc. Continuous anomaly detection based on behavior modeling and heterogeneous information analysis
US20120016633A1 (en) * 2010-07-16 2012-01-19 Andreas Wittenstein System and method for automatic detection of anomalous recurrent behavior
US9710857B2 (en) * 2010-11-19 2017-07-18 Sap Se Detecting anomalous user activity
US20130097709A1 (en) * 2011-10-18 2013-04-18 Mcafee, Inc. User behavioral risk assessment
US9106687B1 (en) * 2011-11-01 2015-08-11 Symantec Corporation Mechanism for profiling user and group accesses to content repository
US20150215329A1 (en) * 2012-07-31 2015-07-30 Anurag Singla Pattern Consolidation To Identify Malicious Activity
US9166993B1 (en) * 2013-07-25 2015-10-20 Symantec Corporation Anomaly detection based on profile history and peer history
US20150067845A1 (en) * 2013-08-27 2015-03-05 International Business Machines Corporation Detecting Anomalous User Behavior Using Generative Models of User Actions

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Daniel Barbará; ADAM: A Testbed for Exploring the Use of Data Mining in Intrusion Detection; ACM; Year: 2001; pages 15-24 *
David Kinny; A Methodology and Modelling Technique for Systems of BDI Agents; Google; pages 1-9 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11100046B2 (en) * 2016-01-25 2021-08-24 International Business Machines Corporation Intelligent security context aware elastic storage
US10943023B2 (en) * 2016-06-16 2021-03-09 EMC IP Holding Company LLC Method for filtering documents and electronic device
US11799877B2 (en) 2016-06-21 2023-10-24 Palantir Technologies Inc. Supervisory control and data acquisition
US10567404B2 (en) * 2016-06-21 2020-02-18 Palantir Technologies Inc. Supervisory control and data acquisition
US10084802B1 (en) * 2016-06-21 2018-09-25 Palantir Technologies Inc. Supervisory control and data acquisition
US10298603B2 (en) * 2016-06-21 2019-05-21 Palantir Technologies Inc. Supervisory control and data acquisition
US11109229B2 (en) 2016-08-25 2021-08-31 EMC IP Holding Company LLC Security for network computing environment using centralized security system
US10419931B1 (en) * 2016-08-25 2019-09-17 EMC IP Holding Company LLC Security for network computing environment using centralized security system
US11418529B2 (en) * 2018-12-20 2022-08-16 Palantir Technologies Inc. Detection of vulnerabilities in a computer network
US11882145B2 (en) 2018-12-20 2024-01-23 Palantir Technologies Inc. Detection of vulnerabilities in a computer network
US11316851B2 (en) 2019-06-19 2022-04-26 EMC IP Holding Company LLC Security for network environment using trust scoring based on power consumption of devices within network
US11941155B2 (en) 2021-03-15 2024-03-26 EMC IP Holding Company LLC Secure data management in a network computing environment
US11546769B1 (en) * 2021-06-30 2023-01-03 Fortinet, Inc. NGFW (next generation firewall) security inspection over multiple sessions of message session relay protocol (MSRP) on a data communication network
US20230007490A1 (en) * 2021-06-30 2023-01-05 Fortinet, Inc. Ngfw (next generation firewall) security inspection over multiple sessions of message session relay protocol (msrp) on a data communication network
US20230015269A1 (en) * 2021-07-15 2023-01-19 AVAST Software s.r.o. Data exfiltration detection
US11829509B2 (en) * 2021-07-15 2023-11-28 AVAST Software s.r.o. Data exfiltration detection
WO2024144778A1 (en) * 2022-12-29 2024-07-04 Varonis Systems, Inc. Indicators of compromise of access

Also Published As

Publication number Publication date
US20150180894A1 (en) 2015-06-25
WO2015094873A1 (en) 2015-06-25
EP3085053A1 (en) 2016-10-26
US9210183B2 (en) 2015-12-08
CN105874767B (en) 2019-03-26
CN105874767A (en) 2016-08-17

Similar Documents

Publication Publication Date Title
US9210183B2 (en) Detecting anomalous activity from accounts of an online service
US11930024B2 (en) Detecting behavior anomalies of cloud users
US10542021B1 (en) Automated extraction of behavioral profile features
US11075917B2 (en) Tenant lockbox
US11558388B2 (en) Provisional computing resource policy evaluation
US10375054B2 (en) Securing user-accessed applications in a distributed computing environment
US10567381B1 (en) Refresh token for credential renewal
US10250612B1 (en) Cross-account role management
US10652232B2 (en) Adaptive timeouts for security credentials
US9699173B1 (en) Incorrect password management
US11829796B2 (en) Automated rollback
EP3590247A1 (en) Security and compliance alerts based on content, activities, and metadata in cloud
US9038148B1 (en) Secret variation for network sessions
US20220368726A1 (en) Privilege assurance of computer network environments
US20200076789A1 (en) Ownership maintenance in a multi-tenant environment
CN114902224A (en) Reliable large-scale data center protection
US20200092165A1 (en) Honeypot asset cloning
US10721236B1 (en) Method, apparatus and computer program product for providing security via user clustering
US10904011B2 (en) Configuration updates for access-restricted hosts
US11411813B2 (en) Single user device staging
US11863563B1 (en) Policy scope management
US20220156375A1 (en) Detection of repeated security events related to removable media

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SADOVSKY, ART;LALKAKA, RUSTAM;SHARMA, VIVEK;AND OTHERS;SIGNING DATES FROM 20131211 TO 20131218;REEL/FRAME:037076/0073

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:037076/0142

Effective date: 20141014

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION