
GB2559821A - Secure access by behavior recognition - Google Patents

Secure access by behavior recognition

Info

Publication number
GB2559821A
GB2559821A GB1713774.6A GB201713774A
Authority
GB
United Kingdom
Prior art keywords
computer
connection
security
metadata
malicious
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1713774.6A
Other versions
GB201713774D0 (en)
Inventor
Mustafa Hashim Hani
Shivanand Sunil
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ava Security Ltd
Original Assignee
Jazz Networks Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jazz Networks Ltd filed Critical Jazz Networks Ltd
Publication of GB201713774D0 publication Critical patent/GB201713774D0/en
Publication of GB2559821A publication Critical patent/GB2559821A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/316User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/44Program or device authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/554Detecting local intrusion or implementing counter-measures involving event detection and direct action
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0876Network architectures or network communication protocols for network security for authentication of entities based on the identity of the terminal or configuration, e.g. MAC address, hardware or software configuration or device fingerprint
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/10Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/14Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1425Traffic logging, e.g. anomaly detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/20Network architectures or network communication protocols for network security for managing network security; network security policies in general
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/552Detecting local intrusion or implementing counter-measures involving long-term monitoring or reporting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2111Location-sensitive, e.g. geographical location, GPS
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2129Authenticate client device independently of the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/02Network architectures or network communication protocols for network security for separating internal from external traffic, e.g. firewalls
    • H04L63/0227Filtering policies

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computing Systems (AREA)
  • Signal Processing (AREA)
  • Social Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Power Engineering (AREA)
  • Computer And Data Communications (AREA)

Abstract

A method for assessing security of a connection of a computer network, such as determining whether the connection is malicious. The connection is a connection between a first computer 103a within the computer network and a second computer 105. The method comprises: obtaining metadata 301 associated with the first computer and/or a user of the first computer, and assessing 306, 307 or 309, 310 security of the connection by determining whether an event associated with the connection is malicious based on the metadata. The metadata may include factors such as computer location, login time, computer configuration, application(s) running, keyboard stroke(s), mouse movement pattern/speed, or mouse jitter. Current metadata may be compared to historical metadata and could be compared to metadata of other computers/users in the network. An arbitration server 101 may perform the comparison/assessment 306, 307, or it could be performed locally at the PC 309, 310. The arbitration server may also supply/enforce additional security policies, such as static policies to e.g. restrict particular applications or application versions known to be vulnerable from obtaining a connection. If determined to be malicious, a request for a new connection 304 may be denied or, similarly, an on-going connection may be disconnected.

Description

(54) Title of the Invention: Secure access by behavior recognition
Abstract Title: Assessing the security of a connection through consideration of metadata on the computer and/or user
[Representative drawing (front page): Fig. 3]
[Fig. 1: computer network 100 including firewall 104 and second computer 105]
[Fig. 2: combined signalling diagram and flow chart]
[Fig. 3: flowchart of the method performed by the security system]
[Fig. 4: arbitration server 101 comprising obtaining module 410, determining module 420, setting module 430, communication module 440, processor 480 and memory 490]
[Fig. 5: security agent 1030a comprising obtaining module 510, determining module 520, communication module 530, processor 580 and memory 590]
SECURE ACCESS BY BEHAVIOR RECOGNITION
TECHNICAL FIELD
Embodiments herein relate to providing an intelligent, distributed, security system protecting individual computers in a computer network by assessing the security of a connection.
BACKGROUND
When a connection is made from one computer system to another it is important to allow benign connections for usability while simultaneously denying malicious connections for network security. The connection may be denied either by preventing it from being established or by disconnecting an already established connection.
Industry standard security policies are generally based on the source and destination hosts of the connection and on information embedded in the packet headers of the communication. Existing security systems prevent connections using several different mechanisms: by information embedded in the packet headers themselves; by out-of-band information that associates the src/dst (source/destination) hosts embedded in packet headers with users; by inferring the application being used by processing the payload of the packets, also referred to as deep packet inspection; and by monitoring for certain signatures included in the packets, which is done by traditional Intrusion Detection Systems (IDS) and Intrusion Prevention Systems (IPS). Existing security systems may also prevent connections by applying machine learning and statistical techniques to connections occurring on the network and assessing security based on that. Distributed security systems are also known, where agents are installed on individual computers in a network and each agent performs a probabilistic assessment of the likelihood of a malicious event, i.e. an event that threatens the security of the connection, occurring. In response to identifying an intrusion or an attack the agent distributes a warning to other agents in the system, whereupon each individual computer updates its anti-viral software.
There are however several types of attacks that cannot be prevented by these security policies. These include hijacking of an application already running on a trusted system and initiating a new connection to another trusted system of higher security; an illegitimate user gaining physical access to a laptop and using it to initiate a new malicious connection; and a legitimate user using his own trusted system to download and leak confidential company data. These kinds of attacks look identical to legitimate requests and thus cannot be detected by inspecting packets.
SUMMARY
It is therefore an object of embodiments herein to provide an improved way of assessing the security of a connection in a computer network.
According to a first aspect of embodiments herein, the object is achieved by a method performed by a security system for assessing security of a connection of a computer network. The connection is a connection between a first computer within the computer network and a second computer.
The security system may comprise an arbitration server being configured to set security policies for the computer network. The security system may further comprise a security agent in the first computer, the security agent being configured to be in communication with the arbitration server. The arbitration server may e.g. transmit security policies to the security agent.
The security system, e.g. through the security agent in the first computer, obtains metadata associated with the first computer and/or a user of the first computer. The security agent may transmit the metadata to the arbitration server.
The security system assesses the security of the connection by determining whether an event associated with the connection is malicious based on the metadata.
The assessment of the security of the connection by determining whether an event associated with the connection is malicious based on the metadata may be performed by the arbitration server and/or by the security agent.
According to a second aspect of embodiments herein, the object is achieved by a security system (1010) in a computer network (100) for assessing security of a connection (115) of the computer network (100). The connection (115) is a connection between a first computer (103a) within the computer network (100) and a second computer (105).
The security system comprises:
an arbitration server (101), the arbitration server being configured to set security policies for the computer network (100);
a security agent (1030a) in the first computer (103a), the security agent (1030a) being configured to be in communication with the arbitration server (101), wherein the security system (1010) is configured to:
i. obtain metadata associated with the first computer (103a) and/or a user of the first computer (103a), and ii. assess security of the connection (115) by determining whether an event associated with the connection is malicious based on metadata associated with the first computer (103a) and/or a user of the first computer (103a) and further based on the security policies.
According to a further aspect of embodiments herein, the object is achieved by a computer program product comprising software instructions that, when executed in a processor, performs the method according to the first aspect above.
Since the security of the connection is assessed based on the metadata, it is possible to capture events related to the connection that on the face of it are not malicious for the computer network 100 based on port/IP analysis, but that are malicious nevertheless.
An advantage with embodiments herein is that they may be made more granular than current IP-address or connection based policies.
BRIEF DESCRIPTION OF THE DRAWINGS
Examples of embodiments herein are described in more detail with reference to attached drawings in which:
Figure 1 is a schematic block diagram illustrating embodiments of a security system in a computer network.
Figure 2 is a combined signalling diagram and flow chart illustrating a method according to embodiments herein.
Figure 3 is a flowchart depicting embodiments of a method performed by a security system in a computer network.
Figure 4 is a schematic block diagram illustrating embodiments of an arbitration server comprised in a security system in a computer network.
Figure 5 is a schematic block diagram illustrating embodiments of a security agent comprised in a security system in a computer network.
DETAILED DESCRIPTION
It should be noted that the following embodiments are not mutually exclusive. Components from one embodiment may be tacitly assumed to be present in another embodiment and it will be obvious to a person skilled in the art how those components may be used in the other exemplary embodiments.
Embodiments herein may be implemented in one or more computer networks.
Figure 1 illustrates a computer network 100. Figure 1 further illustrates an exemplary embodiment of a security system 1010. The security system comprises a plurality of personal computers 103a - 103d, one or more servers 102 and one or more arbitration servers 101 connected in the computer network 100. A thin security agent 1030a-1030d is running on each of the personal computers (PC), hereinafter also referred to as the PC-agent. Another thin security agent 1020 is running on the one or more servers 102, hereinafter also referred to as the server agent.
The security agents 1020, 1030a-1030d obtain and track metadata related to the server, the personal computer or a user controlling it. The metadata may be stored locally, e.g. in a memory on the computer 102, 103a-103d. However, the PC-agents and the server agents may also communicate the metadata to the one or more arbitration servers 101 over the computer network 100.
The security system 1010 may also comprise a firewall 104 which the PC-agents are able to traverse even if they are external personal computers, such as the computers 103c, 103d. This may e.g. be done by piggybacking the metadata on packets legitimately allowed by the firewall. If a PC-agent of a personal computer or a server agent of a server is turned off, paused or in any way not communicating with the arbitration server, the personal computer or server will not be allowed to connect to the network.
Static security policies may be set at the arbitration server 101. The static security policies may be enforced at the arbitration server 101, distributed to the PC-agents and server agents for local enforcement, or a combination of central and local enforcement of the security policies.
The security agent 1020, 1030a-1030d running on the computer has knowledge of metadata outside the scope of the connection 115. For example, the PC-agent running on the personal computer has knowledge of metadata outside the scope of the connection 115, such as a configuration of the personal computer and applications running on it.
Thus static security policies may be imposed on metadata parameters that relate to the specific computer, such as the personal computer 103a-103d and the user logged into the computer. Such static security policies may be made more granular than current IP-address or connection based security policies.
An exemplary parameter is the exact application being used on the personal computer. For example, a static security policy may disallow the selection of a certain web browser, word processor, e-mail program etc. Thus the security system 1010 may disallow the access even when the connection 115 itself seems legitimate. In order to overcome the access barring the user may be required to update the application to a version where a known security hole has been fixed.
As a further measure of security, on top of the static policies, the security system 1010 may use historical, e.g. earlier, metadata related to the computer network 100 and/or the user controlling it to assess the security of the connection 115. It may evaluate the history of the user and the computer network 100, e.g. how the history of the user compares to histories of other users and how the history of the computer 102, 103a-103d compares to histories of other computers in the computer network 100. For example, the security system 1010 may determine a statistical likelihood that an event on the computer 102, 103a-103d is malicious even if it looks identical to a benign event. In other words, the security system 1010 may approximate an intention of the user when deciding whether or not to deny a connection.
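As a purely illustrative sketch of how current metadata might be compared against a user's own history, the following Python example estimates how typical the current login hour is from past login hours. The statistic used (a Laplace-smoothed frequency over 24 hours) and the function name are assumptions for illustration only; the embodiments do not prescribe any particular likelihood model.

```python
from collections import Counter

def login_hour_likelihood(current_hour: int, historical_hours: list[int]) -> float:
    """Estimate how typical the current login hour is for this user, based
    purely on the user's own historical login hours.

    Returns a value in [0, 1]; low values suggest an unusual event."""
    if not historical_hours:
        return 0.5  # no history yet: neither suspicious nor trusted
    counts = Counter(historical_hours)
    # Laplace smoothing so unseen hours are unlikely but not impossible
    return (counts[current_hour] + 1) / (len(historical_hours) + 24)

# Example: a user who normally logs in during office hours
history = [9, 9, 10, 8, 9, 11, 10, 9, 8, 10]
print(login_hour_likelihood(9, history))   # relatively high -> looks benign
print(login_hour_likelihood(3, history))   # close to zero -> weight towards malicious
```

In a deployment this score would be only one input to the overall assessment, combined with the static security policies and the other metadata parameters described herein.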
As mentioned above, the arbitration server 101 has access to the metadata related to the computer network 100, i.e. all the history of the entire computer network 100. Thus the security system 1010 is in a position to allow/deny connections even if they match the static security policies.
Actions for assessing security of a connection 115 of the computer network 100, according to embodiments herein will now be described in relation to Figure 2 and with continued reference to Figure 1.
The connection 115 is a connection between a first computer 103a within the computer network 100 and a second computer 105.
As mentioned above, the arbitration server 101 is configured to set security policies for the computer network 100, and the security agent 1020, 1030a-1030d is configured to be in communication with the arbitration server 101.
In a scenario in which embodiments herein may be applied, an event related to the connection 115 has been recorded by the security system 1010. Such an event may be related to any one or more of:
- start of an application
- download of data over a connection from a computer outside an enterprise network
- upload of data over a connection to a computer outside an enterprise network
- download of data over a connection from a computer inside an enterprise network
- upload of data over a connection to a computer inside an enterprise network
- establishment of a new connection
- file access.
Action 201
In order to capture events related to the connection 115 that on the face of it are not malicious for the computer network 100 based on port/IP analysis, but that are malicious nevertheless, the security system 1010 obtains metadata associated with the first computer 103a and/or a user of the first computer 103a. For example, the security agent 1020, 1030a may obtain the metadata.
In some embodiments the first computer 103a is a personal computer, and the metadata comprises the metadata of the user of the personal computer.
The metadata associated with the first computer may comprise any one or more of:
- location data of the first computer and/or the second computer,
- login time of the user,
- a configuration of the first computer 103a, and
- an application running on the first computer 103a.
In some other embodiments the metadata associated with the user of the first computer comprises any one or more of:
- a keyboard stroke on a keyboard associated with the first computer 103a; and
- a speed of and/or a pattern of moving a mouse associated with the first computer; and
- jitter in movements of the mouse.
The security system 1010 may evaluate parameters relating to the user controlling the first computer 103a to assess the authenticity of the user. When the user first attempts to log into the computer network 100 the arbitration server 101 may receive information about the physical location of the first computer 103a. The arbitration server 101 may compare the present physical location of the first computer 103a with a previous physical location of the first computer 103a and/or user. If the present location is different from the previous location of the first computer 103a and/or user, the arbitration server 101 may estimate how likely it is that the user is working in that location. The arbitration server may then consider parameters such as the location history of the computer and/or user, the distance between the locations, the time between login attempts etc. to determine the likelihood that the computer and/or user actually is at the present location. The arbitration server 101 may receive information about the time, date, week day, whether or not it is a holiday, and the time of day. For example, it may be less likely that someone tries to log in at night, during holidays etc.
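The location check described above can be illustrated by a simple "plausible travel" test: given the previous and present login locations and times, is the implied speed physically reasonable? The sketch below is a hypothetical Python example; the haversine distance and the 900 km/h speed cap are assumptions, not part of the described embodiments.

```python
import math
from datetime import datetime

def plausible_travel(prev_lat, prev_lon, prev_time: datetime,
                     curr_lat, curr_lon, curr_time: datetime,
                     max_speed_kmh: float = 900.0) -> bool:
    """Return True if the user could plausibly have moved between the two
    login locations in the elapsed time (haversine distance vs. a speed cap)."""
    r = 6371.0  # Earth radius in km
    phi1, phi2 = math.radians(prev_lat), math.radians(curr_lat)
    dphi = math.radians(curr_lat - prev_lat)
    dlmb = math.radians(curr_lon - prev_lon)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    distance_km = 2 * r * math.asin(math.sqrt(a))
    hours = max((curr_time - prev_time).total_seconds() / 3600.0, 1e-6)
    return distance_km / hours <= max_speed_kmh

# London at 09:00, then a login from Sydney twenty minutes later: not plausible
print(plausible_travel(51.5, -0.12, datetime(2017, 2, 20, 9, 0),
                       -33.9, 151.2, datetime(2017, 2, 20, 9, 20)))  # False
```

A failed plausibility test would not necessarily deny the connection outright; it could instead lower the estimated likelihood that the user is at the present location and trigger the further authentication described below.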
The arbitration server 101 may also take into consideration historical user data to determine the likelihood that a certain user attempts to log in at a certain time.
Action 202
The arbitration server 101 may set security policies for the computer network 100.
For example, as mentioned above, static security policies may be set at the arbitration server 101. The static security policies may be enforced at the arbitration server 101, distributed to the PC-agents and server agents for local enforcement, or a combination of central and local enforcement of the security policies.
In one embodiment the server agents and the PC-agents may contact the arbitration server to download the latest policies. The policies may define that a policy is eligible for local enforcement. The policies may also define that a policy has to be resolved at the arbitration server.
As mentioned above, an exemplary parameter to which a security policy may be applied is the exact application being used on the personal computer. For example, a static security policy may disallow the selection of a certain web browser, word processor, e-mail program etc. In addition to the application being used, another policy parameter may be the version of the application being used. In the case of a phishing attempt, the security system 1010 may disallow access to download a file if the web browser is out-of-date. Thus the security system 1010 may disallow the access even when the connection 115 itself seems legitimate. In order to overcome the access barring the user may be required to update the application to a version where a known security hole has been fixed. Another exemplary parameter is the operating system (OS) of the computer. This parameter may take into account security vulnerabilities of the different operating systems.
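A static security policy keyed on application and version might, for example, be represented as in the following hypothetical Python sketch. The rule structure, names and version format are illustrative assumptions; the embodiments do not mandate any specific policy encoding.

```python
from dataclasses import dataclass

@dataclass
class StaticPolicy:
    application: str      # application/process name the rule applies to
    min_version: tuple    # lowest version allowed to open connections

    def allows(self, application: str, version: str) -> bool:
        if application != self.application:
            return True  # rule does not apply to other applications
        return tuple(int(p) for p in version.split(".")) >= self.min_version

# Hypothetical rule: browsers older than 58.0 may not open new connections
policy = StaticPolicy(application="web-browser", min_version=(58, 0))
print(policy.allows("web-browser", "57.3"))  # False -> deny, ask user to update
print(policy.allows("web-browser", "58.1"))  # True
print(policy.allows("mail-client", "1.0"))   # True (not covered by this rule)
```

Such a rule could equally be enforced centrally at the arbitration server 101 or locally by the security agent, consistent with the central/local enforcement options discussed above.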
Action 203
The arbitration server 101 may further communicate the security policies to the security agent 1030a in the first computer 103a in the computer network 100.
Local enforcement improves the assessment of the security of the connection since it is much faster: the security agent 1030a is able to assess the security of the connection 115, and possibly deny the connection 115 if the security agent determines that a certain event associated with the connection 115 is malicious, without consulting another device, such as the arbitration server 101 or another computer. Furthermore, at times when, for whatever reason, communication with the arbitration server 101 is not possible, the security agent 1030a is still able to take meaningful decisions. The alternative would be to either drop all connections or accept all connections - neither of which is ideal.
Action 204
The security system 1010 then assesses security of the connection 115 by determining whether an event associated with the connection is malicious based on the metadata. In other words, the security system 1010 may determine a value indicating the security of the connection 115. The value indicating the security of the connection 115 may be determined based on an event associated with the connection 115 in relation to the metadata.
In some embodiments assessing security of the connection 115 by determining whether the event associated with the connection 115 is malicious is further based on the security policies.
The determining may be performed by any one or more of:
- the arbitration server 101, and
- the security agent 1030a in the first computer 103a.
In some embodiments the determining whether the event associated with the connection is malicious is based on a comparison of current metadata with earlier metadata.
The determining whether the event associated with the connection 115 is malicious may comprise determining a statistical likelihood that the event is malicious by evaluating earlier metadata associated with the first computer 103a and/or a user of the first computer 103a.
In some embodiments determining the statistical likelihood that the event is malicious comprises comparing the metadata associated with the first computer with metadata associated with one or more further computers within the computer network 100.
Determining the statistical likelihood that an event on the computer network 100 is malicious may comprise comparing the metadata associated with the user of the personal computer with metadata associated with other users on the computer network 100.
Based on the above determining, the connection 115 may be denied either by preventing it from getting established or by disconnecting it if it is already established.
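One simple way to compare the first computer's metadata with metadata from other computers or users in the network, as described above, is to measure how common the observed value is among peers. The hypothetical Python sketch below does this for one feature (the remote host an application connects to); the feature choice and the interpretation of the score are illustrative assumptions only.

```python
def peer_rarity(feature_value: str, peer_values: list[str]) -> float:
    """Fraction of peer computers/users in the network that share the observed
    metadata value; low values mean the first computer stands out from its peers."""
    if not peer_values:
        return 1.0
    return sum(v == feature_value for v in peer_values) / len(peer_values)

# e.g. which remote host an application connects to, as collected from all PC-agents
peers = ["updates.example-os.com"] * 40 + ["files.partner.example"] * 8
print(peer_rarity("updates.example-os.com", peers))  # common -> likely benign
print(peer_rarity("203.0.113.77", peers))            # never seen -> weight towards malicious
```

A low rarity score by itself need not deny the connection; it can feed the statistical likelihood that the event is malicious together with the user's own history.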
After a successful login the PC-agent 1030a-1030d may continue to assess the authenticity of the user based on user behaviour. This may hinder an illegitimate user getting physical access to the computer 103a-103d and using it to initiate a new malicious connection that threatens the security of the computer network 100. The PC-agent 1030a-1030d may analyse and evaluate the user by physical parameters such as keyboard stroke analysis, speed of and pattern of moving the mouse, or jitter in mouse movements. The PC-agent 1030a-1030d may also analyse and evaluate the user by the applications that are running on the computer 103a-103d or other behavioural fingerprints associated with the user. When the PC-agent 1030a-1030d based on the behavioural parameters questions the authenticity of the user, the PC-agent 1030a-1030d informs the arbitration server 101 and the user may be disconnected from the computer network 100.
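A hypothetical sketch of such a behavioural check follows: current keystroke and mouse features are scored against the logged-in user's own baseline, and a large average deviation could be reported to the arbitration server 101. The chosen features, the z-score statistic and the dictionary layout are illustrative assumptions, not the patent's prescribed method.

```python
from statistics import mean, pstdev

def behaviour_deviation(current: dict, baseline: dict) -> float:
    """Average z-score of the current behavioural features (e.g. inter-keystroke
    interval, mouse speed, mouse jitter) against the logged-in user's baseline.
    A large deviation suggests someone else may be at the keyboard."""
    scores = []
    for feature, value in current.items():
        history = baseline.get(feature, [])
        if len(history) < 2:
            continue
        sigma = pstdev(history) or 1e-6  # avoid division by zero for flat histories
        scores.append(abs(value - mean(history)) / sigma)
    return mean(scores) if scores else 0.0

baseline = {"keystroke_interval_ms": [120, 130, 125, 118, 127],
            "mouse_speed_px_s": [400, 380, 420, 410, 395]}
print(behaviour_deviation({"keystroke_interval_ms": 124, "mouse_speed_px_s": 405}, baseline))  # small
print(behaviour_deviation({"keystroke_interval_ms": 260, "mouse_speed_px_s": 900}, baseline))  # large -> escalate
```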
If the arbitration server 101 questions the authenticity of the user, either at time of login or afterwards based on user behaviour, the arbitration server 101 may request the user to provide further authentication. The further authentication may include providing additional passwords, pin-code, secret question, or physical biometric credentials such as fingerprints, voice recognition, face recognition, retina scan or ear pattern.
The arbitration server 101 may also request multi-factor authentication where a message is sent to a personal device registered with the user, e.g. a phone, and requesting credentials from the phone. The credentials may include one or more of a password, pin-code, secret question, physical biometric parameters such as fingerprints, voice recognition, face recognition, retina scan and ear pattern. The security system 1010 may also use a GPS of the personal device to verify the location of the user.
The security system 1010 may further evaluate parameters relating to the computer 103a-103d itself to determine the likelihood that the personal computer 103a-103d has been or can be hijacked. This may hinder hijacking of an application already running on a trusted system and initiating a new connection to another trusted system of higher security. The PC-agent 1030a-1030d may analyse and evaluate how up to date other unrelated software is, the number of programs installed, the number of services running and in general how unusual the state of the computer 103a-103d is compared to the “normal” state. For example, in the case where a connection is coming from a web browser that is up to date, but the PC 103a-103d is running an instant messaging client that is riddled with vulnerabilities, the instant messaging client compromises the rest of the computer network 100. The “normal” state, or baseline state, may be ascertained from the history of the particular personal computer 103a-103d. The PC-agents 1030a-1030d of all the personal computers 103a-103d in the computer network 100 may over time provide their state to the arbitration server 101. The arbitration server 101 may then analyse and evaluate the computer network 100 as a whole and keep track of changes in similar computers and thus in similar security systems. The PC-agent 1030a-1030d may then compare the state of the specific personal computer 103a-103d with a baseline based on its similarity to many instances of other systems' states as received from the arbitration server 101.
The security system 1010 may further evaluate parameters associated with an application itself. This may for example hinder a legitimate user using his own trusted computer network 100 to download and leak confidential company data. Abnormal file access patterns, such as high numbers of accesses or access to drives not usually accessed, may for example imply corporate espionage. The PC-agent 1030a-1030d may analyse and evaluate how likely this application has been known to initiate connections to certain locations, how many documents it has opened, and other attributes that can be associated with even a particular version of the application, to account for changes in behavior after a software update. For example, a personal computer 103a running a certain OS is likely to periodically connect to a relevant server for OS updates, whereas it may be less likely that another personal computer 103b running another OS would connect to that particular server. The PC-agents 1030a-1030d of all the personal computers 103a-103d in the computer network over time provide application specific data to the arbitration server 101. The arbitration server 101 may then analyse and evaluate instances of the application running on different computers in the computer network 100. The PC-agent 1030a-1030d may then gather the application specific data exclusively from the same computer, or may compare the application specific data of the first computer 103a to many instances of the application running on different computers 103b-103d as provided by the arbitration server 101.
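As one possible illustration of flagging abnormal file access, the hypothetical sketch below compares the recent access rate and the set of drives touched against what is usual for the application; the specific threshold, inputs and function name are assumptions for illustration only.

```python
def abnormal_file_access(accesses_last_hour: int, drives_touched: set,
                         usual_hourly_accesses: float, usual_drives: set,
                         ratio_threshold: float = 5.0) -> bool:
    """Flag file activity that is far above the application's usual rate or that
    touches drives the user does not normally access (possible data exfiltration)."""
    too_many = accesses_last_hour > ratio_threshold * max(usual_hourly_accesses, 1.0)
    new_drives = bool(drives_touched - usual_drives)
    return too_many or new_drives

print(abnormal_file_access(30, {"C:"}, usual_hourly_accesses=25, usual_drives={"C:"}))        # False
print(abnormal_file_access(800, {"C:", "S:"}, usual_hourly_accesses=25, usual_drives={"C:"}))  # True -> investigate
```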
Actions for assessing security of a connection 115 of the computer network 100, according to further embodiments herein will now be described in relation to Figure 3 and with continued reference to Figure 1.
Action 301
The security agent 1030a may obtain metadata associated with the first computer 103a and/or the user of the first computer 103a during some time. This may be referred to as historic metadata or the history of the first computer 103a and/or the user of the first computer 103a.
Action 302
The security agent 1030a may communicate the metadata associated with the first computer 103a and/or the user of the first computer 103a to the arbitration server 101.
Action 303
The arbitration server 101 may store the communicated metadata from the first computer 103a and also corresponding metadata from other computers on the computer network 100.
Action 304
The security agent 1030a initiates the connection 115 to the second computer 105. In some cases the connection is established.
Action 305
The security agent 1030a may communicate further metadata, e.g. new or current metadata, to the arbitration server 101. The further metadata is associated with the first computer 103a and/or the user of the first computer 103a during the initiation of the connection 115.
Action 306
In some embodiments the arbitration server 101 compares the further metadata with the historic metadata.
Action 307
The arbitration server 101 may determine, e.g. based on the comparison in action 306, that an event associated with the connection 115 is malicious.
Action 308
In some embodiments the arbitration server 101 communicates security policies to the security agent 1030a. The security policies may be based on metadata associated with the computers 103a-103d on the computer network 100.
Action 309
In some embodiments the security agent 1030a compares the further metadata with the historic metadata. Then action 306 may not be performed.
Action 310
The security agent 1030a may determine, e.g. based on the comparison in action 309, that an event associated with the connection 115 is malicious. Then action 307 may not be performed.
Action 311
In some embodiments the security agent 1030a communicates the determination to the arbitration server 101. In this way the arbitration server 101 may update its historic metadata and/or its security policies.
Action 312
As mentioned above, in some embodiments the arbitration server 101 requests further authentication from the first computer 103a and/or the user.
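The sequence of actions 304-312 above can be summarised, purely as an illustrative sketch, in the following Python outline of a local decision taken by the security agent 1030a. The verdict names, metadata keys and the fallback behaviour when the arbitration server 101 is unreachable are assumptions for illustration, not requirements of the embodiments.

```python
from enum import Enum

class Verdict(Enum):
    ALLOW = "allow"
    DENY = "deny"
    ASK_FOR_MORE_AUTH = "ask_for_more_authentication"

def assess_connection(further_metadata: dict, historic_metadata: dict,
                      local_policies: list, can_reach_arbitration_server: bool) -> Verdict:
    """Rough shape of actions 304-312: compare the metadata observed while the
    connection is initiated against stored history, locally if possible,
    otherwise by escalating to the arbitration server."""
    # Actions 306/309: compare further (current) metadata with historic metadata
    suspicious = any(
        further_metadata.get(key) not in historic_metadata.get(key, [])
        for key in ("location", "application", "remote_host")
    )
    # Action 308: locally enforced static policies may veto the connection outright
    if any(policy(further_metadata) is False for policy in local_policies):
        return Verdict.DENY
    if not suspicious:
        return Verdict.ALLOW
    # Action 312: escalate rather than silently allow a questionable event
    return Verdict.ASK_FOR_MORE_AUTH if can_reach_arbitration_server else Verdict.DENY

history = {"location": ["Oslo"], "application": ["web-browser"],
           "remote_host": ["intranet.example"]}
print(assess_connection({"location": "Oslo", "application": "web-browser",
                         "remote_host": "intranet.example"},
                        history, local_policies=[], can_reach_arbitration_server=True))  # Verdict.ALLOW
```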
Some specific embodiments of a method of assessing security of the connection 115 will now be described.
The method may comprise setting, in at least one arbitration server 101, security policies for the computer network 100;
communicating the security policies to at least one security agent 1030a in at least one computer 103a in the computer network 100;
determining the statistical likelihood that an event on the computer network 100 is malicious by evaluating the history of the computer 103a.
Determining the statistical likelihood that an event on the computer network 100 is malicious may comprise comparing the history of the computer 103a with the history of other computers 103b-103d on the computer network 100.
In some embodiments the computer 103a is a personal computer, and the evaluating the history of the computer 103a includes evaluating the history of a user of the computer.
In some embodiments determining the statistical likelihood that the event on the computer network 100 is malicious comprises comparing the history of the user of the computer with the history of other users on the computer network 100.
To perform the method actions, described above in relation to Figure 2, the security system 1010 comprises an arbitration server 101, the arbitration server being configured to set security policies for the computer network 100. The security system 1010 further comprises a security agent 1030a in the first computer 103a, the security agent 1030a being configured to be in communication with the arbitration server 101.
The security system 1010 is configured to obtain metadata associated with the first computer 103a and/or a user of the first computer 103a. For example, the security agent 1030a may be configured to obtain the metadata from the first computer 103a. The security agent 1030a may further be configured to transmit the metadata to the arbitration server 101 which then is configured to obtain the metadata. Other security agents in the computer network 100 may also transmit corresponding metadata to the arbitration server 101, which may be configured to obtain these corresponding metadata.
The arbitration server 101 may comprise the following arrangement depicted in Figure 4. The security agent 1030a may comprise the following arrangement depicted in Figure 5.
The security system 1010 may be configured to obtain the metadata by means of an obtaining module 410, 510 being part of the arbitration server 101 and/or the security agent 1030a. The obtaining module 410, 510 may be implemented, at least in part, by a processor 480, 580 in the arbitration server 101 and/or the security agent 1030a.
The security system 1010 is further configured to assess security of the connection 115 by determining whether an event associated with the connection is malicious based on metadata associated with the first computer 103a and/or a user of the first computer 103a. The determination may be further based on the security policies.
The security policies, which also may be referred to as security policy rules, may be consulted to check if the connection 115 should be allowed. A few examples may be:
• MATCH: Computer 103a talking to computer 103b
  o RESPONSE: ALLOW
• MATCH: Process ‘ssh-client’ (on any computer) talking to process ‘ssh-server’ (on any computer)
  o RESPONSE: ASK FOR 2 FACTOR AUTHORISATION
• MATCH: User John talking to process ‘ssh-server’ (on any computer)
  o RESPONSE: ALLOW
At a more advanced level, matching may be very dynamic such as:
• MATCH connections from applications with known vulnerabilities.
• MATCH connections to suspicious IPs.
Both these cases require that the arbitration server 101 is connected to one or more up-to-date repositories, some public, some private, in order to determine whether a match is valid. For example, Chrome v49 may be good today but, in a matter of hours once a vulnerability is discovered, would be invalid.
Responses may also be very dynamic, beyond allow/deny/2 factor authorisation, such as:
• Take a selfie/picture on security agent 1030a in the source computer to do face authorisation.
• Require one or more uninvolved but relevant users to allow the user.
• Gather data on 1030a, which may comprise detailed information about open applications, all active connections, all open files, etc. This may be used to further investigate a connection that an administrator wants to understand.
These responses may be:
• Precanned responses that any customer may use.
• Responses developed by a customer using APIs.
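The MATCH/RESPONSE examples above could be modelled, for instance, as an ordered list of rules evaluated until one matches, as in the hypothetical Python sketch below; the rule representation and field names are illustrative assumptions rather than the actual policy engine of the embodiments.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class PolicyRule:
    match: Callable[[dict], bool]   # predicate over connection metadata
    response: str                   # e.g. "ALLOW", "DENY", "2FA", or a custom action

def evaluate(connection: dict, rules: list, default: str = "DENY") -> str:
    """Return the response of the first rule whose MATCH predicate fires,
    mirroring the MATCH/RESPONSE examples above."""
    for rule in rules:
        if rule.match(connection):
            return rule.response
    return default

rules = [
    PolicyRule(lambda c: c["src"] == "103a" and c["dst"] == "103b", "ALLOW"),
    PolicyRule(lambda c: c["process"] == "ssh-client" and c["dst_process"] == "ssh-server",
               "ASK FOR 2 FACTOR AUTHORISATION"),
    PolicyRule(lambda c: c["user"] == "John" and c["dst_process"] == "ssh-server", "ALLOW"),
]
print(evaluate({"src": "103a", "dst": "103b", "process": "web-browser",
                "dst_process": "http-server", "user": "Alice"}, rules))  # ALLOW
```

Dynamic matches (e.g. applications with known vulnerabilities) would simply be predicates that consult the up-to-date repositories mentioned above before returning a result.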
In some embodiments the arbitration server 101 is configured to assess the security of the connection 115 by determining whether an event associated with the connection is malicious.
In some other embodiments the security agent 1030a in the first computer 103a, is configured to assess the security of the connection 115 by determining whether an event associated with the connection is malicious.
The security system 1010 may be configured to perform the determining by means of a determining module 420, 520 being part of the arbitration server 101 and/or the security agent 1030a. The determining module 420, 520 may be implemented, at least in part, by the processor 480, 580 in the arbitration server 101 and/or the security agent 1030a.
The arbitration server 101 may further comprise a setting module 430 configured to set the security policies.
In some embodiments each of the arbitration server 101 and the security agent 1030a further comprises a communication module 440, 530 configured to communicate with the corresponding communication module 440, 530.
The embodiments herein may be implemented through one or more processors, such as the processor 480 in the arbitration server 101 depicted in Figure 4 and the processor 580 in the security agent 1030a depicted in Figure 5, together with computer program code for performing the functions and actions of the embodiments herein. The program code mentioned above may also be provided as a computer program product, for instance in the form of a data carrier carrying computer program code for performing the embodiments herein when being loaded into the arbitration server 101 and the security agent 1030a. One such carrier may be in the form of a CD-ROM disc. It is however feasible with other data carriers such as a memory stick. The computer program code may furthermore be provided as pure program code on a server and downloaded to the arbitration server 101 and the security agent 1030a.
Thus, the methods according to the embodiments described herein for the arbitration server 101 and the security agent 1030a may be implemented by means of a computer program product, comprising instructions, i.e., software code portions, which, when executed on at least one processor, cause the at least one processor to carry out the actions described herein, as performed by the arbitration server 101 and the security agent 1030a. The computer program product may be stored on a computer-readable storage medium. The computer-readable storage medium, having stored thereon the computer program, may comprise the instructions which, when executed on at least one processor, cause the at least one processor to carry out the actions described herein, as performed by the arbitration server 101 and the security agent 1030a. In some embodiments, the computer-readable storage medium may be a non-transitory computer-readable storage medium.
The arbitration server 101 and the security agent 1030a may further each comprise a memory 490, 590, comprising one or more memory units. The memory 490, 590 is arranged to be used to store obtained information such as metadata associated with the first computer 103a and/or the user and computer program code to perform the methods herein when being executed in the arbitration server 101 and the security agent 1030a.
When using the word "comprise" or "comprising" it shall be interpreted as non-limiting, i.e. meaning "consist at least of".
Modifications and other embodiments of the disclosed embodiments will come to mind to one skilled in the art having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiment(s) is/are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of this disclosure. Although specific terms may be employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
Therefore, the above embodiments should not be taken as limiting the scope, which is defined by the appended claims.
Also note that terminology such as a first computer and a second computer should be considered to be non-limiting and does in particular not imply a certain hierarchical relation between the two.

Claims (17)

1. A method for assessing security of a connection (115) of a computer network (100), wherein the connection (115) is a connection between a first computer (103a) within the computer network (100) and a second computer (105), the method comprising:
obtaining (201) metadata associated with the first computer (103a) and/or a user of the first computer (103a), and assessing (204) security of the connection (115) by determining whether an event associated with the connection is malicious based on the metadata.
2. The method according to claim 1, wherein the determining whether the event associated with the connection (115) is malicious is based on a comparison of current metadata with earlier metadata.
3. The method according to any of the claims 1-2, wherein the determining whether the event associated with the connection (115) is malicious comprises determining a statistical likelihood that the event is malicious by evaluating earlier metadata associated with the first computer (103a) and/or a user of the first computer (103a).
4. The method according to claim 3, wherein determining the statistical likelihood that the event is malicious comprises comparing the metadata associated with the first computer (103a) with metadata associated with one or more further computers within the computer network (100).
5. Method according to any of the claims 3-4, wherein determining the statistical likelihood that an event on the computer network (100) is malicious comprises comparing the metadata associated with the user of the personal computer with metadata associated with other users on the computer network (100).
6. The method according to any of the claims 1-5, wherein the metadata associated with the first computer 103a comprises any one or more of:
- Location data of the first computer 103a and/or second computer 105,
- Login time of the user,
- a configuration of the first computer 103a, and
- an application running on the first computer 103a.
7. The method according to any of the claims 1-6, wherein the metadata associated with the user of the first computer comprises any one or more of:
- a keyboard stroke on a keyboard associated with the first computer (103a);
- a speed of and/or a pattern of moving a mouse associated with the first computer (103a); and
- jitter in movements of the mouse.
8. The method according to any of the claims 1-7, wherein the determining is performed by any one or more of:
- an arbitration server (101), the arbitration server (101) being configured to set security policies for the computer network (100); and
- a security agent (1030a) in the first computer (103a), the security agent being configured to be in communication with the arbitration server (101).
9. The method according to any of the claims 1-8, further comprising:
- setting (202), by the arbitration server (101), security policies for the computer network (100); and
- communicating (203) the security policies to the security agent (1030a) in the first computer (103a) in the computer network (100), and wherein the assessing (204) security of the connection (115) by determining whether an event associated with the connection (115) is malicious is further based on the security policies.
10. A security system (1010) in a computer network (100) for assessing security of a connection (115) of the computer network (100), wherein the connection (115) is a connection between a first computer (103a) within the computer network (100) and a second computer (105), wherein the security system (1010) is configured to:
i. obtain metadata associated with the first computer (103a) and/or a user of the first computer (103a), and ii. assess security of the connection (115) by determining whether an event associated with the connection is malicious based on the metadata.
11. The security system (1010) according to claim 10, wherein the security system (1010) is further configured to determine whether the event associated with the connection (115) is malicious based on a comparison of current metadata with earlier metadata.
12. The security system (1010) according to any of the claims 10-11, wherein the security system (1010) is further configured to determine whether the event associated with the connection (115) is malicious by determining a statistical likelihood that the event is malicious by evaluating earlier metadata associated with the first computer (103a) and/or a user of the first computer (103a).
13. The security system (1010) according to any of the claims 10-12, wherein the security system (1010) is further configured to determine the statistical likelihood that the event is malicious by comparing the metadata associated with the first computer with metadata associated with one or more further computers within the computer network (100).
14. The security system (1010) according to any of the claims 10-13, wherein the security system (1010) is further configured to determine the statistical likelihood that an event on the computer network (100) is malicious by comparing the metadata associated with the user of the personal computer with metadata associated with other users on the computer network (100).
15. The security system (1010) according to any of the claims 10-14, wherein the security system (1010) comprises:
- an arbitration server (101), the arbitration server being configured to set security policies for the computer network (100); and
- a security agent (1030a) in the first computer (103a), the security agent (1030a) being configured to be in communication with the arbitration server (101), wherein the security system (1010) is further configured to assess security of the connection (115) by determining whether the event associated with the connection (115) is malicious further based on the security policies.
16. The security system (1010) according to any of the claims 10-15, wherein the
arbitration server (101) is configured to assess the security of the connection (115) by determining whether the event associated with the connection (115) is malicious, and/or wherein the security agent (1030a) in the first computer (103a) is configured to assess the security of the connection (115) by determining whether the event associated with the connection is malicious.
17. A computer program product comprising software instructions that, when executed in a processor, performs the method of any of claims 1 to 9.
Intellectual Property Office — Application No: GB1713774.6, Examiner: Adam Tucker
GB1713774.6A 2017-02-20 2017-08-29 Secure access by behavior recognition Withdrawn GB2559821A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NO20170249A NO20170249A1 (en) 2017-02-20 2017-02-20 Secure access by behavior recognition

Publications (2)

Publication Number Publication Date
GB201713774D0 GB201713774D0 (en) 2017-10-11
GB2559821A true GB2559821A (en) 2018-08-22

Family

ID=58772615

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1713774.6A Withdrawn GB2559821A (en) 2017-02-20 2017-08-29 Secure access by behavior recognition

Country Status (2)

Country Link
GB (1) GB2559821A (en)
NO (1) NO20170249A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050268342A1 (en) * 2004-05-14 2005-12-01 Trusted Network Technologies, Inc. System, apparatuses, methods and computer-readable media for determining security status of computer before establishing network connection second group of embodiments-claim set II
US20070118756A2 (en) * 2003-07-01 2007-05-24 Securityprofiling, Inc. Policy-protection proxy
US20090158404A1 (en) * 2007-12-17 2009-06-18 International Business Machines Corporation Apparatus, system, and method for user authentication based on authentication credentials and location information
US20100055370A1 (en) * 2008-08-26 2010-03-04 Basf Se Adhesive composition for self-adhesive redetachable articles based on adhesive polymers and organic nanoparticles
US20100211996A1 (en) * 2008-12-26 2010-08-19 Mcgeehan Ryan Preventing phishing attacks based on reputation of user locations
US7801985B1 (en) * 2007-03-22 2010-09-21 Anchor Intelligence, Inc. Data transfer for network interaction fraudulence detection

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050203881A1 (en) * 2004-03-09 2005-09-15 Akio Sakamoto Database user behavior monitor system and method
WO2014205421A1 (en) * 2013-06-21 2014-12-24 Arizona Board Of Regents For The University Of Arizona Automated detection of insider threats
US10043006B2 (en) * 2015-06-17 2018-08-07 Accenture Global Services Limited Event anomaly analysis and prediction
US9537880B1 (en) * 2015-08-19 2017-01-03 Palantir Technologies Inc. Anomalous network monitoring, user behavior detection and database system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070118756A2 (en) * 2003-07-01 2007-05-24 Securityprofiling, Inc. Policy-protection proxy
US20050268342A1 (en) * 2004-05-14 2005-12-01 Trusted Network Technologies, Inc. System, apparatuses, methods and computer-readable media for determining security status of computer before establishing network connection second group of embodiments-claim set II
US7801985B1 (en) * 2007-03-22 2010-09-21 Anchor Intelligence, Inc. Data transfer for network interaction fraudulence detection
US20090158404A1 (en) * 2007-12-17 2009-06-18 International Business Machines Corporation Apparatus, system, and method for user authentication based on authentication credentials and location information
US20100055370A1 (en) * 2008-08-26 2010-03-04 Basf Se Adhesive composition for self-adhesive redetachable articles based on adhesive polymers and organic nanoparticles
US20100211996A1 (en) * 2008-12-26 2010-08-19 Mcgeehan Ryan Preventing phishing attacks based on reputation of user locations

Also Published As

Publication number Publication date
NO20170249A1 (en) 2018-08-21
GB201713774D0 (en) 2017-10-11

Similar Documents

Publication Publication Date Title
US11997117B2 (en) Intrusion detection using a heartbeat
US11775622B2 (en) Account monitoring
US11722516B2 (en) Using reputation to avoid false malware detections
US9917864B2 (en) Security policy deployment and enforcement system for the detection and control of polymorphic and targeted malware
US9654489B2 (en) Advanced persistent threat detection
US8789202B2 (en) Systems and methods for providing real time access monitoring of a removable media device
US20160127417A1 (en) Systems, methods, and devices for improved cybersecurity
US8266672B2 (en) Method and system for network identification via DNS
US20090247125A1 (en) Method and system for controlling access of computer resources of mobile client facilities
US11765590B2 (en) System and method for rogue device detection
US10320829B1 (en) Comprehensive modeling and mitigation of security risk vulnerabilities in an enterprise network
US20210329459A1 (en) System and method for rogue device detection
KR100825726B1 (en) Apparatus and method for user's privacy ? intellectual property protection of enterprise against denial of information
US10313384B1 (en) Mitigation of security risk vulnerabilities in an enterprise network
US20230421579A1 (en) Traffic scanning with context-aware threat signatures
KR20100067383A (en) Server security system and server security method
GB2559821A (en) Secure access by behavior recognition
Alwahedi et al. Security in mobile computing: attack vectors, solutions, and challenges
Kim et al. A Study on the Security Requirements Analysis to build a Zero Trust-based Remote Work Environment
US20230319116A1 (en) Signature quality evaluation
US20240007440A1 (en) Persistent IP address allocation for virtual private network (VPN) clients
WO2024003539A1 (en) Persistent ip address allocation for virtual private network (vpn) clients

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)