
US20190253883A1 - A device, computer program and method - Google Patents

A device, computer program and method Download PDF

Info

Publication number
US20190253883A1
US20190253883A1 (application number US16/336,470)
Authority
US
United States
Prior art keywords
user
movement
authenticating
ambient sound
authenticate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/336,470
Inventor
Conor Aylward
Hugo EMBRECHTS
Dimitri Torfs
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TORFS, DIMITRI, EMBRECHTS, HUGO, AYLWARD, CONOR
Publication of US20190253883A1 publication Critical patent/US20190253883A1/en
Legal status: Abandoned

Classifications

    • H04W12/00508
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/316User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/36User authentication by graphic or iconic representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • G06V40/11Hand-related biometrics; Hand pose recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/60Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V40/67Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
    • H04W12/00504
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • H04W12/0605
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • H04W12/065Continuous authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/30Security of mobile devices; Security of mobile applications
    • H04W12/33Security of mobile devices; Security of mobile applications using wearable devices, e.g. using a smartwatch or smart-glasses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60Context-dependent security
    • H04W12/65Environment-dependent, e.g. using captured environmental data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60Context-dependent security
    • H04W12/68Gesture-dependent or behaviour-dependent
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/083Network architectures or network communication protocols for network security for authentication of entities using passwords
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0876Network architectures or network communication protocols for network security for authentication of entities based on the identity of the terminal or configuration, e.g. MAC address, hardware or software configuration or device fingerprint
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/08Access security
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60Context-dependent security
    • H04W12/61Time-dependent

Definitions

  • the present technique relates to a device, computer program and method.
  • in some instances where sound is used to authenticate the user, that sound information may be hacked, revealing sensitive information about the user or his or her environment.
  • a device for authenticating a user comprising a sensor configured to measure the movement of a user in response to the interaction of the user with a displayed image and controller circuitry configured to authenticate the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
  • FIG. 1 shows a device 100 according to embodiments of the present disclosure
  • FIG. 2 shows a system 200 according to embodiments of the present disclosure
  • FIG. 3 shows a schematic diagram of a virtual keyboard of embodiments
  • FIG. 4 shows a schematic diagram of a template storing user information according to embodiments
  • FIG. 5 shows a flow diagram explaining the authentication process according to embodiments of the disclosure
  • FIG. 6 shows a schematic diagram of a template according to another embodiment of the disclosure.
  • FIG. 7 shows a state diagram explaining the mechanism for maintaining the confidence score.
  • FIG. 1 shows a device 100 according to embodiments of the disclosure.
  • the device 100 is a wearable device such as a fitness band or smartwatch which the user wears that comprises a controller 105 .
  • the device 100 is not so limited and may be any device 100 with which the user interacts, such as a hub like the Xperia® Agent.
  • the controller 105 may be implemented as controller circuitry comprising hardware that is configured to perform certain method steps.
  • the method steps are defined by computer readable code that is stored within storage 130 attached to the controller 105 .
  • the storage 130 may be optically readable storage or may be solid state storage or the like.
  • the transceiver 110 comprises circuitry that allows the device 100 to communicate with other devices and/or a network. This communication, in embodiments, will be wireless and may be performed using WiFi, Bluetooth, NFC, cellular communication or the like. An antenna 112 is provided to facilitate such communication.
  • the microphone 135 detects the sound from the location of the device 100 . This sound may be a voice command from a user or, in an embodiment, may be the ambient sound of the device 100 .
  • the “ambient sound” is a term known to the skilled person and means the background sound which is present at the location of the device 100 but which is not an instruction to the device 100 .
  • the microphone 135 may be embodied as microphone circuitry and may be a capacitive or a resistive type microphone.
  • sensors 125 are connected to the controller 105 . These sensors may be embodied as modules or circuitry located within the device 100 that perform certain functions and quantify certain physical or environmental conditions presented to or asserted on the device 100 . Examples of sensors include accelerometers, barometers, gyroscopes and the like. In embodiments, other sensors include image sensors that capture an image of the surroundings of the device 100 . These types of sensors are known to the skilled person.
  • the user output module 120 may be a display, or connected to a display, that provides a visual output.
  • the user output module 120 may be a haptic feedback device that presents the user with a specific vibration indicating a certain output.
  • any output that can be understood by the user can be provided by the user output module 120 .
  • the user input module 115 may be a touch screen wherein the user instructs the device 100 to perform certain functions using a touch screen mechanism.
  • the user input module 115 may be an image sensor (which may be the same as or different from one embodied as a module in sensor 125) that captures an image of the user interacting with an object overlaid on an augmented reality display.
  • the user input module 115 is an image sensor that captures the position of the user's hand and acts as a gesture recognition module. That is, the movement and position of the user's hand will be captured and certain actions performed in response to the captured movement and position.
  • the device 100 is used as a device for authenticating the user.
  • the authentication in embodiments, is performed by analyzing the movement and/or physical (sometimes referred to as “physiological”) traits of the user when interacting with a displayed image of an object.
  • the object may be provided on the user output module 120 as, for example, a 3D object in free-space. In this case, the provision of the object in 3D allows the user to have a large variation in interaction with the virtual object.
  • the object may be displayed on a surface.
  • the Xperia® Projector projects objects onto a surface such as a desk or a wall.
  • FIG. 2 describes a system 200 according to embodiments of the disclosure.
  • the system of embodiments of the disclosure includes a resource 205 which may be a server located in a cloud provided by a cloud based service provider.
  • the resource 205 may be a controlling device which is located in a network to which the device 100 is connected.
  • the resource 205 may be located on the cloud and may provide services to the device 100 such as authentication or storage of user profiles.
  • the resource 205 may be located on a local network and the resource 205 may be a hub that contains equivalent information.
  • An example of this hub is an Xperia Agent or the like.
  • the hub may be used to authenticate the user before granting access to the device 100 . In this instance, the user will interact with a virtual object created by the hub and in dependence upon the interaction, the user will be granted access to the device 100 .
  • the device 100 is connected to the resource 205 via a network 210 .
  • the network 210 may therefore be a local area network, a wide area network or the internet.
  • a virtual keyboard 300 is shown.
  • the virtual keyboard is the displayed image with which the user interacts.
  • the disclosure is not so limited as will be explained later.
  • although the virtual keyboard 300 in FIG. 3 is a numeric keypad that includes numbers 0-9, the keypad may contain letters, symbols, colours, shapes or any kind of pattern. As the skilled person would also appreciate, the numeric keypad is presented to the user in numerical sequence. However, it will be appreciated that the keypad may be randomised so that the numbers are not presented in numerical order. This reduces the likelihood of a third party determining a user's personal identification number (PIN hereinafter).
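  • As an illustration of the randomised layout mentioned above, the following is a minimal sketch; the 4x3 grid shape and the use of Python's random module are assumptions made for illustration, not details taken from the patent:

```python
import random

def randomised_keypad(keys="0123456789", rows=4, cols=3):
    """Return a keypad layout with the key labels shuffled.

    Shuffling the labels each time the keypad is displayed makes it harder
    for an onlooker to infer the PIN from hand positions alone.
    """
    labels = random.sample(list(keys), len(keys))
    # Pad the grid with blanks if there are fewer labels than cells.
    labels += [" "] * (rows * cols - len(labels))
    return [labels[r * cols:(r + 1) * cols] for r in range(rows)]

for row in randomised_keypad():
    print(" ".join(row))
```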
  • the virtual keyboard 300 is displayed using a projection mechanism that projects the virtual keyboard 300 onto a surface with which the user interacts.
  • the virtual keyboard is not a physical keyboard but is projected by the device 100 (or another device) onto a surface.
  • the device may present the user with a virtual keyboard (as one example of an object) in augmented reality space and the user will interact with the object.
  • the mechanism by which the object is projected or displayed is known and so will not be explained in any detail.
  • a user's hand is shown interacting with the virtual keyboard 300 .
  • an image sensor within the device 100 captures an image of the user's hand and the controller 105 performs object recognition on the user's hand.
  • the image sensor may be an array of sensors which may be used to capture depth information as is known. This provides increased flexibility with regard to interaction with a virtual object.
  • the controller 105 identifies the user's fingers on the detected hand and determines the angle between the thumb and the forefinger at a certain position of the hand. This is denoted in FIG. 3 as ϕ.
  • the controller 105 identifies other physical traits of the user's hand in a certain position such as the angle between the forefinger and the middle finger, the angle between the middle finger and the thumb, or other physical traits like the number of fingers on a hand and the length of the user's fingers (either absolute length or relative length), size and proportion of the user's palm and so on.
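  • A minimal sketch of how such finger angles could be computed from 2D hand landmarks is given below; the landmark coordinates and function names are illustrative assumptions, not the patent's actual detection pipeline:

```python
import math

def angle_between(origin, point_a, point_b):
    """Angle (in degrees) at `origin` between the rays towards `point_a` and `point_b`."""
    ax, ay = point_a[0] - origin[0], point_a[1] - origin[1]
    bx, by = point_b[0] - origin[0], point_b[1] - origin[1]
    dot = ax * bx + ay * by
    mag = math.hypot(ax, ay) * math.hypot(bx, by)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

# Hypothetical landmark positions (in pixels) produced by an object-recognition step.
wrist = (100.0, 200.0)
thumb_tip = (60.0, 150.0)
forefinger_tip = (110.0, 120.0)

phi = angle_between(wrist, thumb_tip, forefinger_tip)
print(f"thumb/forefinger angle: {phi:.1f} degrees")
```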
  • the device 100 recognizes other physical traits of the user's hand such as skin colour, blemishes on the hand such as moles or scars or the like. This may be achieved using pattern matching whereby the captured image is compared with a stored template of the user's hand which will be explained later.
  • the device 100 recognizes which hand is being used to interact with the virtual keyboard by identifying the position of the thumb and determining whether the user's right hand or left hand is being used. In other words, the device 100 recognizes which hand is dominant for the user.
  • the device also performs object tracking which identifies movement of the user's hand as the user interacts with the keyboard.
  • The specific method for performing object tracking is known and will not be explained for brevity.
  • the device 100 identifies how the user rotates their hand and wrist and how much rotation in the wrist has occurred when the user enters the PIN on the virtual keyboard. More generally, the device 100 detects the movement of the user and how the user interacts with the displayed object.
  • the device analyses the amount of time that the user's forefinger (which, in embodiments, the user will use to press the virtual keyboard 300) hovers over each key. So, the device 100 tracks the user's hand over the virtual keyboard 300 and measures the movement of the user's hand over the virtual keyboard 300. Further, the physical characteristics of the user's hand, such as the angle between the user's respective fingers, are also analyzed. In other words, the manner in which the user interacts with the virtual keyboard 300 is analyzed by the device 100. So, the device 100 determines the speed at which the user's hand moves over the keyboard and the amount of time that the user hovers over each key when pressing the key.
  • the method by which the user interacts with the virtual keyboard 300 is unique to the user. This is difficult for an unauthorized third party to copy. It is envisaged that the analysis of the user's movement and interaction with the displayed object may be used solely to authenticate the user. Alternatively, the analysis of the user's movement and interaction with the displayed object may be used as an additional form of authentication to the entry of a PIN or other passcode. In other words, in order to authenticate the user, the user must enter the correct PIN or other passcode in the correct manner. This improves known techniques of authentication, which are liable to spoofing where only a PIN or passcode is entered.
  • the keypad will be placed at a similar position within the user's field of view each time the keypad is displayed. This is to ensure consistency of the hand position between consecutive captured movements. In other words, placing the keypad in the lower half of the user's field of view may produce different hand movements compared with placing it in the upper half of the user's field of view.
  • FIG. 4 shows a table 400 that is stored within the storage 130 of device 100 .
  • the contents of the table 400 are a template that defines an authorized user and is populated during a training phase of the system for any one object interaction.
  • a user who is known to be authentic is presented with a virtual keyboard or other displayed object.
  • the authentic user then trains the system by interacting with the displayed object one or more times.
  • a second object interaction may be the user putting a virtual key in a virtual lock.
  • only a single object interaction will be described.
  • the table 400 has a user identity column 405 which identifies each user uniquely.
  • the first user is identified as “User 1”.
  • a number of parameters are associated with that user. These are also stored in table 400 which may be embodied as a database.
  • the disclosure is not so limited.
  • the user characteristics within the table form a unique user signature or behaviour.
  • the confidence in an authentication score for the user may be maintained in the internal thresholds and states of a machine learning or neural network model.
  • the inputs are selected that best correlate to the output to authenticate the user. This means that the inputs to the neural networks for one user may be very different from those for another user. So, and as will be appreciated, there is not one algorithm used for all users; rather, many algorithm variations, combined with differences in user inputs, are used to authenticate between many users.
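  • One way to read this per-user input selection is to rank candidate characteristics by how strongly each one separates the enrolled user from other users. The correlation-based ranking below is an assumed illustration, not the patent's algorithm, and the feature names are hypothetical:

```python
import math
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def rank_features(samples, labels):
    """Rank each feature by the absolute correlation of its values with the
    label (1 = authentic user, 0 = other user)."""
    scores = {name: abs(pearson([s[name] for s in samples], labels))
              for name in samples[0]}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

samples = [
    {"thumb_angle": 22.0, "hover_1": 0.30},   # authentic user
    {"thumb_angle": 21.5, "hover_1": 0.32},   # authentic user
    {"thumb_angle": 35.0, "hover_1": 0.31},   # other user
]
labels = [1, 1, 0]
print(rank_features(samples, labels))
```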
  • the first parameter is a password, PIN or passcode that includes numbers, alphanumeric characters and the like. In the example of FIG. 4 , this is “1234” and is stored in the column “PIN” 410 .
  • the movement and physical characteristic of the user is stored in column 415 .
  • the physical traits of the user when entering the passcode or PIN during the training phase are stored. For example, the angle between the user's thumb and the first finger is identified as 22° and the angle between the user's second finger and the thumb is identified as 87°. This is identified using object detection during the training phase. This is stored in row 420.
  • Other physical parameters and traits are stored within column 415.
  • the time over which the user hovers before pressing each number of their PIN is noted in row 425 .
  • the time of hover over number 1 in the PIN is 0.3 seconds and the time hovering over number 2 is 0.4 seconds.
  • Also stored within column 415 are other physical characteristics of the user, such as the wrist rotation in row 430 and even other physical characteristics such as colour of skin and skin blemishes.
  • the wrist rotation is 42°.
  • the dominant hand is stored in row 435 and is, in this case, the right hand.
  • the purpose of the table 400 is to store the template of the user's interaction with the virtual keyboard 300 .
  • the template is derived during the training phase, where not only is a PIN or passcode determined and stored in column 410, but the physical characteristics and traits of the user and how the user interacts with the virtual keyboard are also stored.
  • This template is stored securely in the device 100 .
  • the table or template 400 may be stored in the resource 205 or on the Cloud.
  • the contents of the table 400 may be encrypted for additional security.
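  • A minimal sketch of what the stored template of table 400 might look like in code follows; the field names, the JSON serialisation and the note about encryption are assumptions made for illustration:

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class UserTemplate:
    user_id: str                                        # column 405
    pin: str                                            # column 410
    finger_angles: dict = field(default_factory=dict)   # row 420, degrees
    hover_times: dict = field(default_factory=dict)     # row 425, seconds per key
    wrist_rotation: float = 0.0                         # row 430, degrees
    dominant_hand: str = "right"                        # row 435

    def to_bytes(self) -> bytes:
        """Serialise the template; in practice this blob could be encrypted
        before being kept in storage 130 or sent to the resource 205."""
        return json.dumps(asdict(self)).encode("utf-8")

template = UserTemplate(
    user_id="User 1",
    pin="1234",
    finger_angles={"thumb_forefinger": 22.0, "thumb_second_finger": 87.0},
    hover_times={"1": 0.3, "2": 0.4},
    wrist_rotation=42.0,
)
print(template.to_bytes())
```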
  • a user may be authenticated. This is during the authentication phase where a user interacts with the displayed virtual keyboard 300 (or other object). During this phase, the device 100 identifies the PIN code or passcode that is entered by the user. In addition, or alternatively, the physical traits of the user such as the position of the forefinger relative to the thumb and the amount of time taken by the user to hover over a particular key is also identified and compared with the stored template 400 . It is on this basis that the user is authenticated as will be explained.
  • the movement of the user may be used to authenticate the user alone.
  • the user may be authenticated if the movement of the user during entry of a passcode is the same as the movement of the user 415 stored within table 400 .
  • the entered passcode should be the same as that stored in column 410 .
  • the measured movements must be within a predetermined threshold of the stored movement. For example, for the user to be authenticated to a first level of confidence, the angle between the thumb and forefinger must be within 0.5° of that stored in the table 400 . However, if the user is to be authenticated to a second, higher, level of confidence, the angle between the thumb and forefinger must be within 0.3°.
  • the level of confidence may be set by the user or by the resource 205 . So, for more sensitive information such as access to banking information where a high level of confidence is required, the user would be authenticated to the second level of confidence. However, if the user simply wants access to non-sensitive information such as stored music, the first level of confidence will suffice.
  • the level of confidence may be increased by providing multiple authentication techniques. For example, for highly sensitive data such as medical data, a third, even higher, level of confidence may be required. In this instance, the PIN entered by the user will match the PIN stored in column 410 and the angle between the user's thumb and forefinger will be within 0.3° of the stored value.
  • various other levels of confidence may be derived using the other physical characteristics.
  • the hover time over the various keys may be used in conjunction with the various angles between fingers to generate numerous confidence levels.
  • some physical characteristics are very particular to a user and so higher levels of weighting may be applied to these characteristics.
  • skin blemishes are very particular to a particular person, and are quite reliably detected.
  • the dominant hand of a user is less unique to the user. Therefore, a high weighting may be applied to the skin blemish characteristic compared to the dominant hand characteristic.
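  • A sketch of how the graded confidence levels and per-characteristic weights could be combined is shown below; the 0.5° and 0.3° tolerances repeat the figures given in the text, while the weighting values and function names are assumptions:

```python
def angle_confidence(measured_deg, stored_deg):
    """Map the thumb/forefinger angle error onto confidence levels:
    within 0.5 degrees -> level 1, within 0.3 degrees -> level 2, otherwise 0."""
    error = abs(measured_deg - stored_deg)
    if error <= 0.3:
        return 2
    if error <= 0.5:
        return 1
    return 0

def weighted_match_score(measured, stored, weights):
    """Weighted agreement (0..1) across several characteristics; distinctive
    traits (e.g. skin blemishes) carry more weight than weak ones (e.g. dominant hand)."""
    total = sum(weights.values())
    matched = sum(w for name, w in weights.items()
                  if measured.get(name) == stored.get(name))
    return matched / total

stored = {"dominant_hand": "right", "skin_blemish": "mole_near_thumb"}
measured = {"dominant_hand": "right", "skin_blemish": "mole_near_thumb"}
weights = {"dominant_hand": 1.0, "skin_blemish": 4.0}

print(angle_confidence(22.2, 22.0))                      # 2: the second, higher level
print(weighted_match_score(measured, stored, weights))   # 1.0: all traits agree
```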
  • FIG. 5 shows a flow diagram explaining the authentication process associated with this embodiment.
  • the process 600 begins when a user 601 sends a request to resource 205 . This may be for access to sensitive information such as via a banking application. This is noted in the request resource step 605 .
  • the resource 205 will then present an authentication challenge to the device 100 in step 610 .
  • the type of challenge and the level of confidence required will be defined by the resource 205 .
  • the authentication challenge is the entry of a PIN or passcode.
  • the disclosure is in no way limited to this and other authentication challenges may include measuring how the user inserts a key into a lock or interacts with shapes such as building blocks or how the user presses a certain array of coloured buttons or any kind of interaction with an image or virtual device.
  • the selection of the authentication challenge may be specific to the resource 205 .
  • an online store may request that a passcode or PIN may be entered in order for the user's identity to be authenticated.
  • the resource may randomly choose an authentication challenge that has already been performed by the user during the training phase or uses the same motion as a challenge for which training has already taken place.
  • This authentication challenge is presented to the device 100 , in embodiments along with the level of confidence required and the device 100 generates the challenge in step 615 .
  • the device 100 presents the user with the virtual keyboard 300 .
  • the user then interacts with the object in 620 . This is shown in FIG. 3 whereby the user enters a PIN on the virtual keyboard 300 .
  • the user's interaction is measured in step 625 .
  • the device 100 captures the user's physical traits when interacting with the virtual keyboard.
  • the captured behaviour is then either compared with the user profile stored within the device 100 or, in this case, is sent in step 630 to the resource 205 for comparison with the stored table 400 .
  • the captured behaviour is compared against the template stored in table 400 either in the device 100 or, in this case, resource 205 .
  • the comparison with the template validates the behaviour of the user as being that of the user 1 in steps 635 . If the behaviour is validated in step 635 , then the authentication of the user is complete. It should be noted here that the authentication is completed to the required level of confidence. As explained above, for example, for the user to be authenticated to a first level of confidence, the angle between the thumb and forefinger must be within 0.5° of that stored in the table 400 . However, if the user is to be authenticated to a second, higher, level of confidence, the angle between the thumb and forefinger must be within 0.3°.
  • a success message 640 is then provided to the device 100, and the resource 205 returns the required data, such as authorization that the transaction is complete, or returns the content stored within the resource 205. This occurs in step 645, and the resource or the success of the authentication is then displayed to the user 601 in step 650. The process then ends.
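  • The exchange of FIG. 5 can be summarised as in the sketch below; the stored values, function names and the in-memory stand-ins for the resource 205 and device 100 are assumptions used only to show the order of the steps:

```python
# Hypothetical template held by the resource 205 (or the device 100).
STORED_TEMPLATE = {"pin": "1234", "thumb_forefinger_angle": 22.0}

def choose_challenge(required_level):
    # Step 610: the resource picks a challenge and the confidence level needed.
    return {"type": "virtual_keyboard", "level": required_level}

def measure_interaction():
    # Steps 615-625: the device displays the keyboard and measures the user's
    # entry of the PIN together with the physical traits of the hand.
    return {"pin": "1234", "thumb_forefinger_angle": 22.2}

def validate(behaviour, required_level):
    # Steps 630-635: compare the captured behaviour against the stored template.
    tolerance = 0.3 if required_level >= 2 else 0.5
    angle_ok = abs(behaviour["thumb_forefinger_angle"]
                   - STORED_TEMPLATE["thumb_forefinger_angle"]) <= tolerance
    return behaviour["pin"] == STORED_TEMPLATE["pin"] and angle_ok

challenge = choose_challenge(required_level=2)           # steps 605-610
behaviour = measure_interaction()                         # steps 615-625
authenticated = validate(behaviour, challenge["level"])   # steps 630-635
print("success" if authenticated else "rejected")         # steps 640-650
```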
  • the disclosure is not so limited. In fact, some movement information or physical traits may be provided by a different wearable device. For example, the wrist rotation may be measured by a wearable wrist strap.
  • authentication of the user is performed by determining the proximity of the device 100 to another device that is already known to be close to the user. In order to determine whether the other device is close to the user, a behaviometric fingerprint, or biometric fingerprint, is taken on that device.
  • the device that is known to be close to the user may be attached to the user or may be embedded within the user (for example under the skin of the user).
  • ambient sound is detected whereby the content of the sound is captured in both devices (that is the device which is known to be close to the user and the device 100 ) and compared. In the event that the sounds are the same, it is determined that the devices are close together and therefore the authentication of the user is complete.
  • the ambient sound contains a large amount of personal information such as content of conversation and people's voices within that conversation. Additionally, environmental sounds such as announcements can indicate the location of the user. This may be compromised and may risk the security of the user. It is an aim of the present disclosure to address this.
  • the disclosure uses the energy in the audio signals to authenticate the user.
  • the raw audio is not compared but rather the energy content in the audio is compared.
  • the energy content is sometimes referred to as the sound volume or sound intensity. This reduces the information content within the audio signal but provides enough information that authentication based on the proximity of the devices can be performed. This has the additional benefit of low hardware requirements, which reduces the cost and complexity of devices and battery usage.
  • a continuous confidence score may be determined and a state diagram such as that shown in FIG. 6 and FIG. 7 is maintained.
  • a user identity 505 stores a unique identifier for each user. This is stored in column 505 .
  • a device identifier is also stored in column 510 . This uniquely identifies each device associated with the user. In this case, there are three devices uniquely identified as device 1, device 2, device 3. There is also an additional device (not shown) that is known to be close to the user. As noted above, this device may be attached to the user or embedded within the user. The location of each of device 1, device 2 and device 3 which was determined as explained below is then stored in column 515 . In this example, the location of each of device 1, 2 and 3 is provided relative to the device known to be close to the user. In other words, in the example table in FIG. 6 , device 1 is noted as being close to the device known to be close to user 1 and device 2 is located as being not close to the device known to be close to user 1. Device 3 is also located close to the device known to be close to user 1.
  • a confidence score is also provided in confidence score column 520 . This provides a certain level of confidence of the nature of the location of the device. In the example of FIG. 6 , the device 100 is 85% confident that device 1 is close to user 1. Similarly, device 100 is 82% confident that device 2 is not close to user 1 and 75% confident that device 3 is close to user 1. An explanation of the determination of the confidence score and the location of the device now follows.
  • Each piece of wearable technology (shown in FIG. 6 as device 1, 2, and 3) contains a microphone 135 .
  • the microphone 135 captures the ambient noise at regular intervals. For example, the microphone 135 captures 0.5 second samples of the ambient sound at 1 second intervals. The captured ambient sound is passed to a controller for processing.
  • the controller in each device 1, 2 and 3, converts the captured ambient sound to a time series of sound intensities using a known technique.
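  • A sketch of reducing captured audio to such a time series of sound intensities follows; the sample rate, the 0.5 second frame and the use of an RMS measure are assumptions about one plausible implementation:

```python
import math

def intensity_series(samples, sample_rate=8000, frame_seconds=0.5):
    """Collapse raw audio samples into one RMS intensity value per frame.

    The resulting series keeps only the loudness envelope, so little or no
    information about speech content or speaker identity survives."""
    frame_len = int(sample_rate * frame_seconds)
    series = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        series.append(math.sqrt(sum(x * x for x in frame) / frame_len))
    return series

# Two seconds of a synthetic signal that gets louder over time.
audio = [0.01 * (i // 4000) for i in range(16000)]
print(intensity_series(audio))   # four intensity values, one per 0.5 s frame
```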
  • the time series of sound intensities is passed to an authentication device 100 which maintains table 500 .
  • the table is stored within device 100 .
  • the table 500 is stored for authentication purposes as will be explained later.
  • the transmission of a time series of sound intensities is useful as it contains little or no information about the user or the environment in which the user is located. This means that even if the time series of sound intensities were hacked, no information would be compromised.
  • the time series of sound intensities is cross-correlated.
  • the cross-correlation is performed between each device and the device we know is close to the user.
  • the time series of sound intensities from device 1 is cross-correlated with the time series of sound intensities from the device we know is close to the user.
  • the time series of sound intensities from device 2 and separately device 3 is cross-correlated with the time series of sound intensities from the device we know is close to the user.
  • Cross correlation is a known technique and will not be described in any detail hereinafter.
  • the time series will be converted to a frequency domain representation using a Fast Fourier Transform (FFT) or the like.
  • the output of the cross-correlation will determine how similar the ambient sounds are at the sample time. Where the level of similarity is at or above a threshold value, a continuous similarity score will be increased by an amount. Alternatively, where the level of similarity is below the threshold, the similarity score will be decreased by an amount.
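  • A sketch of this comparison and the continuous score update is given below; normalised cross-correlation at zero lag is used as an assumed similarity measure, the 85% match threshold and the score threshold of 80 follow the example figures in the text, and the increment size is an assumption:

```python
import math

def normalised_correlation(a, b):
    """Normalised cross-correlation (at zero lag) of two intensity series."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def update_score(score, series_device, series_reference,
                 match_threshold=0.85, step=5):
    """Increment the continuous similarity score on a match, decrement otherwise."""
    if normalised_correlation(series_device, series_reference) >= match_threshold:
        return score + step
    return max(0, score - step)

reference = [0.00, 0.01, 0.02, 0.03]     # device known to be close to the user
candidate = [0.00, 0.012, 0.019, 0.031]  # device 1
score = update_score(78, candidate, reference)
print(score, "close" if score >= 80 else "not close")
```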
  • in the state diagram 700, the proximity between the device known to be close to the user and, say, device 1 starts at “not close” 705.
  • the sound intensity from the device known to be close to the user and device 1 is received at the device 100 periodically.
  • device 100 may be one of these devices measuring the sound intensity.
  • the sound intensity is received every 0.5 seconds.
  • other periods are envisaged and these periods may or may not be regular.
  • This is step 720 in the state diagram 700 .
  • a decision has to be made. Specifically, the controller 105 determines whether the cross-correlation of the sound intensity results in a match between the two received sound intensities. In this case, the match may be that the output of the cross-correlation is at or above a threshold of, say, 85%. If the sound intensities match, the “signals match increment score” path is followed to step 730 . The continuous score associated with device 1 is incremented. In step 730 , a decision is made. Specifically, the continuous score associated with device 1 is reviewed. In the event that the continuous score is at or above a threshold of, say, 80, device 1 is determined to be close to the device known to be close to the user. On the other hand, if the continuous score is below the threshold of, in this case, 80, then device 1 is deemed to be not close to the device which is known to be close to the user.
  • in step 725, if the output of the cross-correlation is below the threshold of, in this case, 85%, the signals are deemed not to match and the continuous score is not incremented.
  • the path 715 is followed to the determination in 705 that device 1 is not close to the device known to be close to the user.
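  • The state behaviour of FIG. 7 can be restated as a small loop, as sketched below; the class name and increment size are assumptions, while the score threshold of 80 follows the example in the text:

```python
class ProximityState:
    """Maintains the "close" / "not close" state of FIG. 7 for one device."""

    def __init__(self, score_threshold=80, step=5):
        self.score = 0
        self.score_threshold = score_threshold
        self.step = step
        self.state = "not close"          # starting state 705

    def on_sample(self, signals_match):
        # Decision 725: increment the continuous score on a match,
        # otherwise reduce it (but never below zero).
        if signals_match:
            self.score += self.step
        else:
            self.score = max(0, self.score - self.step)
        # Decision 730: compare the continuous score against the threshold.
        self.state = "close" if self.score >= self.score_threshold else "not close"
        return self.state

tracker = ProximityState()
for match in [True] * 16 + [False]:       # a run of matching samples, then one miss
    state = tracker.on_sample(match)
print(tracker.score, state)               # the score drops back below the threshold
```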
  • Described embodiments may be implemented in any suitable form including hardware, software, firmware or any combination of these. Described embodiments may optionally be implemented at least partly as computer software running on one or more data processors and/or digital signal processors.
  • the elements and components of any embodiment may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the disclosed embodiments may be implemented in a single unit or may be physically and functionally distributed between different units, circuitry and/or processors.
  • a device for authenticating a user comprising a sensor configured to measure the movement of a user in response to the interaction of the user with a displayed image and controller circuitry configured to authenticate the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
  • a device comprising storage configured to store a user profile having associated therewith the stored movement and wherein the controller circuitry is configured to compare the movement of the user with the stored movement in the user profile, and in the event of a positive comparison, authenticating the user.
  • a device according to clause 1 or 2, wherein the sensor is an image sensor or a wearable sensor located on the user's body.
  • controller circuitry is further configured to authenticate the user in response to a positive comparison with entry of a predefined code associated with the user on the virtual keyboard.
  • controller circuitry is configured to authenticate the user in accordance with a physiological trait of the user.
  • a device for authenticating a user comprising microphone circuitry configured to capture the ambient sound over a predetermined period, and controller circuitry configured to: measure the energy of the ambient sound over a predefined period; compare the time series of the measured energy with a received time series of the measure of the energy of the ambient sound from a second device; and authenticate the user in the event of a positive comparison.
  • the microphone circuitry is configured to capture the ambient sound over a time period and to update a comparison result for each time period and to authenticate the user in the event that the comparison result reaches a predetermined threshold.
  • a method of authenticating a user comprising measuring, using a sensor, the movement of a user in response to the interaction of the user with a displayed image and authenticating the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
  • a method according to clause 11, comprising storing a user profile having associated therewith a stored movement and comparing the movement of the user with the stored movement in the user profile, and in the event of a positive comparison, authenticating the user.
  • a method according to clause 15, comprising authenticating the user in response to a positive comparison with entry of a predefined code associated with the user on the virtual keyboard.
  • a method of authenticating a user comprising capturing, using microphone circuitry, the ambient sound over a predetermined period, and the method comprising measuring the energy of the ambient sound over a predefined period; comparing the time series of the measured energy with a received time series of the measure of the energy of the ambient sound from a second device; and authenticating the user in the event of a positive comparison.
  • a method according to clause 19, comprising capturing the ambient sound over a time period and updating a comparison result for each time period and to authenticate the user in the event that the comparison result reaches a predetermined threshold.
  • a computer program product comprising computer readable instructions which, when loaded onto a computer, configure the computer to perform the method according to any one of clauses 11 to 20.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Collating Specific Patterns (AREA)

Abstract

A device for authenticating a user is described that comprises a sensor configured to measure the movement of a user in response to the interaction of the user with a displayed image and controller circuitry configured to authenticate the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.

Description

    BACKGROUND Field of the Disclosure
  • The present technique relates to a device, computer program and method.
  • Description of the Related Art
  • The “background” description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in the background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present technique.
  • Authentication of a user is an important issue in modern technology. With more and more sensitive information being stored, hackers and malicious attackers are becoming more sophisticated in circumventing known authentication techniques. This is particularly the case with new emerging technologies such as wearable technology and augmented and virtual reality, where the constrained interfaces make it difficult to authenticate the user with certainty.
  • Additionally, in some instances where sound is used to authenticate the user, that sound information may be hacked revealing sensitive information about the user or his or her environment.
  • It is an aim of the disclosure to address these two issues.
  • SUMMARY
  • According to one aspect of the disclosure, there is provided a device for authenticating a user, comprising a sensor configured to measure the movement of a user in response to the interaction of the user with a displayed image and controller circuitry configured to authenticate the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
  • The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
  • FIG. 1 shows a device 100 according to embodiments of the present disclosure;
  • FIG. 2 shows a system 200 according to embodiments of the present disclosure
  • FIG. 3 shows a schematic diagram of a virtual keyboard of embodiments;
  • FIG. 4 shows a schematic diagram of a template storing user information according to embodiments;
  • FIG. 5 shows a flow diagram explaining the authentication process according to embodiments of the disclosure;
  • FIG. 6 shows a schematic diagram of a template according to another embodiment of the disclosure; and
  • FIG. 7 shows a state diagram explaining the mechanism for maintaining the confidence score.
  • DESCRIPTION OF THE EMBODIMENTS
  • Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views.
  • FIG. 1 shows a device 100 according to embodiments of the disclosure.
  • In embodiments of the disclosure, the device 100 is a wearable device such as a fitness band or smartwatch which the user wears that comprises a controller 105. Of course, the device 100 is not so limited and may be any device 100 with which the user interacts, such as a hub like the Xperia® Agent.
  • The controller 105 may be implemented as controller circuitry comprising hardware that is configured to perform certain method steps. The method steps are defined by computer readable code that is stored within storage 130 attached to the controller 105. The storage 130 may be optically readable storage or may be solid state storage or the like.
  • Also connected to the controller 105 is a transceiver 110. The transceiver comprises circuitry that allows the device 100 to communicate with other devices and/or a network. This communication, in embodiments, will be wireless and may be performed using WiFi, Bluetooth, NFC, cellular communication or the like. An antenna 112 is provided to facilitate such communication.
  • Additionally attached to the controller 105 is a microphone 135. The microphone 135 detects the sound from the location of the device 100. This sound may be a voice command from a user or, in an embodiment, may be the ambient sound of the device 100. The “ambient sound” is a term known to the skilled person and means the background sound which is present at the location of the device 100 but which is not an instruction to the device 100. The microphone 135 may be embodied as microphone circuitry and may be a capacitive or a resistive type microphone.
  • Additionally connected to the controller 105 are sensors 125. These sensors may be embodied as modules or circuitry located within the device 100 that perform certain functions and quantify certain physical or environmental conditions presented to or asserted on the device 100. Examples of sensors include accelerometers, barometers, gyroscopes and the like. In embodiments, other sensors include image sensors that capture an image of the surroundings of the device 100. These types of sensors are known to the skilled person.
  • Additionally connected to the controller 105 is a user output module 120. The user output module may be a display, or connected to a display, that provides a visual output. An example of this is if the device 100 is a headset, such as an augmented reality headset, whereby the user output module 120 is a head-up display where a graphic is overlaid over a real world scene. Additionally or alternatively, the user output module 120 may be a haptic feedback device that presents the user with a specific vibration indicating a certain output. However, any output that can be understood by the user can be provided by the user output module 120.
  • Additionally connected to the controller 105 is a user input module 115. The user input module 115 may be a touch screen wherein the user instructs the device 100 to perform certain functions using a touch screen mechanism. Alternatively, or additionally, the user input module 115 may be an image sensor (which may be the same as or different from one embodied as a module in sensor 125) that captures an image of the user interacting with an object overlaid on an augmented reality display. For example, in this particular embodiment, the user input module 115 is an image sensor that captures the position of the user's hand and acts as a gesture recognition module. That is, the movement and position of the user's hand will be captured and certain actions performed in response to the captured movement and position.
  • In particular, in embodiments of the present disclosure, the device 100 is used as a device for authenticating the user. As will be explained, the authentication, in embodiments, is performed by analyzing the movement and/or physical (sometimes referred to as “physiological”) traits of the user when interacting with a displayed image of an object. The object may be provided on the user output module 120 as, for example, a 3D object in free-space. In this case, the provision of the object in 3D allows the user to have a large variation in interaction with the virtual object. Alternatively, the object may be displayed on a surface. For example, the Xperia® Projector projects objects onto a surface such as a desk or a wall.
  • FIG. 2 describes a system 200 according to embodiments of the disclosure. The system of embodiments of the disclosure includes a resource 205 which may be a server located in a cloud provided by a cloud based service provider. Alternatively, the resource 205 may be a controlling device which is located in a network to which the device 100 is connected. In other words, the resource 205 may be located on the cloud and may provide services to the device 100 such as authentication or storage of user profiles. Alternatively, the resource 205 may be located on a local network and the resource 205 may be a hub that contains equivalent information. An example of this hub is an Xperia Agent or the like. In examples, the hub may be used to authenticate the user before granting access to the device 100. In this instance, the user will interact with a virtual object created by the hub and in dependence upon the interaction, the user will be granted access to the device 100.
  • The device 100 is connected to the resource 205 via a network 210. The network 210 may therefore be a local area network, a wide area network or the internet.
  • The operation of embodiments of the disclosure will now be described.
  • Referring to FIG. 3, a virtual keyboard 300 is shown. In embodiments, the virtual keyboard is the displayed image with which the user interacts. However, the disclosure is not so limited as will be explained later.
  • Further, although the virtual keyboard 300 in FIG. 3 is a numeric keypad that includes numbers 0-9, the keypad may contain letters, symbols, colours, shapes or any kind of pattern. As the skilled person would also appreciate, the numeric keypad is presented to the user in numerical sequence. However, it will be appreciated that the keypad may be randomised so that the numbers are not presented in numerical order. This reduces the likelihood of a third party determining a user's personal identification number (PIN hereinafter).
  • The user interacts with the virtual keyboard 300 using their hand or a stylus or pointer device. As noted above, in embodiments, the virtual keyboard 300 is displayed using a projection mechanism that projects the virtual keyboard 300 onto a surface with which the user interacts. In other words, the virtual keyboard is not a physical keyboard but is projected by the device 100 (or another device) onto a surface. An example of this is the Xperia projector which projects or displays an object onto a surface such as a desk and the user interacts with the object. Alternatively, in the context of augmented reality or virtual reality, the device may present the user with a virtual keyboard (as one example of an object) in augmented reality space and the user will interact with the object. The mechanism by which the object is projected or displayed is known and so will not be explained in any detail.
  • Referring back to FIG. 3, in embodiments of the disclosure, a user's hand is shown interacting with the virtual keyboard 300. In this instance, an image sensor within the device 100 captures an image of the user's hand and the controller 105 performs object recognition on the user's hand. Of course, it is envisaged that the image sensor may be an array of sensors which may be used to capture depth information as is known. This provides increased flexibility with regard to interaction with a virtual object. The controller 105 identifies the user's fingers on the detected hand and determines the angle between the thumb and the forefinger at a certain position of the hand. This is denoted in FIG. 3 as ϕ.
  • Additionally, the controller 105 identifies other physical traits of the user's hand in a certain position such as the angle between the forefinger and the middle finger, the angle between the middle finger and the thumb, or other physical traits like the number of fingers on a hand and the length of the user's fingers (either absolute length or relative length), size and proportion of the user's palm and so on.
  • Additionally, the device 100 recognizes other physical traits of the user's hand such as skin colour, blemishes on the hand such as moles or scars or the like. This may be achieved using pattern matching whereby the captured image is compared with a stored template of the user's hand which will be explained later.
  • Moreover, the device 100 recognizes which hand is being used to interact with the virtual keyboard by identifying the position of the thumb and determining whether the user's right hand or left hand is being used. In other words, the device 100 recognizes which hand is dominant for the user.
  • The device also performs object tracking which identifies movement of the user's hand as the user interacts with the keyboard. The specific method for performing object tracking is known and will not be explained for brevity. However, the device 100 identifies how the user rotates their hand and wrist and how much rotation in the wrist has occurred when the user enters the PIN on the virtual keyboard. More generally, the device 100 detects the movement of the user and how the user interacts with the displayed object.
  • As the user moves his or her hand over the virtual keyboard 300, the device analyses the amount of time that the user's forefinger (which, in embodiments, the user will use to press the virtual keyboard 300) hovers over each key. So, the device 100 tracks the user's hand over the virtual keyboard 300 and measures the movement of the user's hand over the virtual keyboard 300. Further, the physical characteristics of the user's hand, such as the angles between the user's respective fingers, are also analyzed. In other words, the manner in which the user interacts with the virtual keyboard 300 is analyzed by the device 100. So, the device 100 determines the speed at which the user's hand moves over the keyboard and the amount of time that the user hovers over each key when pressing the key. The method by which the user interacts with the virtual keyboard 300 is unique to the user and is difficult for an unauthorized third party to copy. It is envisaged that the analysis of the user's movement and interaction with the displayed object may be used solely to authenticate the user. Alternatively, the analysis of the user's movement and interaction with the displayed object may be used as an additional form of authentication to the entry of a PIN or other passcode. In other words, in order to authenticate the user, the user must enter the correct PIN or other passcode in the correct manner. This improves known techniques of authentication, which are liable to spoofing where only a PIN or passcode is entered.
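  • A minimal sketch of how the hover time over each key could be accumulated from tracked fingertip positions follows. The key layout, data format and sample values are assumptions made purely for illustration.

```python
from collections import defaultdict

def hover_times(samples, key_regions):
    """Accumulate how long the forefinger hovers over each virtual key.

    samples     -- list of (timestamp_seconds, (x, y)) fingertip positions
    key_regions -- dict mapping key label -> (x_min, y_min, x_max, y_max)
    Returns a dict of key label -> total hover time in seconds.
    """
    totals = defaultdict(float)
    for (t0, pos), (t1, _) in zip(samples, samples[1:]):
        for key, (x0, y0, x1, y1) in key_regions.items():
            if x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1:
                totals[key] += t1 - t0  # time spent over this key
                break
    return dict(totals)

# Illustrative key bounding boxes (pixels) and tracked fingertip samples
keys = {"1": (0, 0, 50, 50), "2": (50, 0, 100, 50)}
samples = [(0.0, (20, 20)), (0.3, (70, 20)), (0.7, (75, 25))]
print(hover_times(samples, keys))  # approximately {'1': 0.3, '2': 0.4}
```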
  • In addition, the keypad will be placed at a similar position within the user's field of view each time the keypad is displayed. This is to ensure consistency of the hand position between consecutive captured movements. In other words, placing the keypad in the lower half of the user's field of view may produce different hand movements from those produced when the keypad is placed in the upper half of the user's field of view.
  • FIG. 4 shows a table 400 that is stored within the storage 130 of device 100. The contents of the table 400 form a template that defines an authorized user and are populated during a training phase of the system for any one object interaction. During the training phase, a user who is known to be authentic is presented with a virtual keyboard or other displayed object. The authentic user then trains the system by interacting with the displayed object one or more times. Of course, it is envisaged that there may be two or more different object interactions stored within the table. For example, a second object interaction may be the user putting a virtual key in a virtual lock. However, for the purposes of clarity, only a single object interaction will be described.
  • The table 400 has a user identity column 405 which identifies each user uniquely. In the example of FIG. 4, the first user is identified as "User 1". A number of parameters are associated with that user. These are also stored in table 400, which may be embodied as a database. Of course, the disclosure is not so limited. For example, in the field of neural networks and machine learning, the user characteristics within the table form a unique user signature or behaviour. The confidence in an authentication score for the user may be maintained in the internal thresholds and states of a machine learning or neural network model. In the case of a neural network, the inputs that best correlate to the output used to authenticate the user are selected. This means that the inputs to the neural network for one user may be very different to those for another user. So, as will be appreciated, there is not one algorithm used for all users; rather, many algorithm variations, combined with differences in user inputs, are used to authenticate between many users.
  • The first parameter is a password, PIN or passcode that includes numbers, alphanumeric characters and the like. In the example of FIG. 4, this is “1234” and is stored in the column “PIN” 410.
  • Additionally associated with the user profile are the movement and physical characteristics of the user. These are stored in column 415. In the example, the physical traits of the user when entering the passcode or PIN during the training phase are stored. For example, the angle between the user's thumb and the first finger is identified as 22° and the angle between the user's second finger and the thumb is identified as 87°. These angles are identified using object detection during the training phase and are stored in row 420.
  • Other physical parameters and traits are stored within column 415. For example, the time over which the user hovers before pressing each number of their PIN is noted in row 425. For example, the time of hover over number 1 in the PIN is 0.3 seconds and the time hovering over number 2 is 0.4 seconds.
  • Additionally noted in column 415 are other physical characteristics of the user such as the wrist rotation in row 430 and even other physical characteristics such as colour of skin and skin blemishes. In the example embodiment, the wrist rotation is 42°. Finally, the dominant hand of the user is noted in row 435 which is in this case, the right hand.
  • The purpose of the table 400 is to store the template of the user's interaction with the virtual keyboard 300. As noted above, the template is derived during the training phase, where not only is a PIN or passcode determined and stored in column 410, but the physical characteristics and traits of the user and the manner in which the user interacts with the virtual keyboard are also stored. This template is stored securely in the device 100. Alternatively or additionally, the table or template 400 may be stored in the resource 205 or on the Cloud. The contents of the table 400 may be encrypted for additional security.
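  • A minimal sketch of how the contents of table 400 might be represented in software is shown below. The field names mirror the rows and columns described above; the record structure itself is an assumption made for illustration, and in practice the template could equally be held in a database or an encrypted store as described.

```python
from dataclasses import dataclass, field

@dataclass
class UserTemplate:
    """One entry of table 400: the enrolled behaviour of an authorised user."""
    user_id: str                                            # column 405
    pin: str                                                # column 410
    finger_angles_deg: dict = field(default_factory=dict)   # row 420
    hover_times_s: dict = field(default_factory=dict)       # row 425
    wrist_rotation_deg: float = 0.0                         # row 430
    dominant_hand: str = "right"                            # row 435

# Values taken from the example of FIG. 4
template = UserTemplate(
    user_id="User 1",
    pin="1234",
    finger_angles_deg={"thumb-forefinger": 22.0, "second finger-thumb": 87.0},
    hover_times_s={"1": 0.3, "2": 0.4},
    wrist_rotation_deg=42.0,
    dominant_hand="right",
)
```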
  • After populating the table during the training phase, a user may be authenticated. This occurs during the authentication phase, where a user interacts with the displayed virtual keyboard 300 (or other object). During this phase, the device 100 identifies the PIN code or passcode that is entered by the user. In addition, or alternatively, the physical traits of the user, such as the position of the forefinger relative to the thumb and the amount of time taken by the user to hover over a particular key, are also identified and compared with the stored template 400. It is on this basis that the user is authenticated, as will be explained.
  • It is important to note here that the movement of the user alone may be used to authenticate the user. In other words, the user may be authenticated if the movement of the user during entry of a passcode is the same as the stored movement of the user in column 415 of table 400. Of course, for added security, the entered passcode should also be the same as that stored in column 410.
  • In order to authenticate the user, the measured movements must be within a predetermined threshold of the stored movement. For example, for the user to be authenticated to a first level of confidence, the angle between the thumb and forefinger must be within 0.5° of that stored in the table 400. However, if the user is to be authenticated to a second, higher, level of confidence, the angle between the thumb and forefinger must be within 0.3°. The level of confidence may be set by the user or by the resource 205. So, for more sensitive information such as access to banking information where a high level of confidence is required, the user would be authenticated to the second level of confidence. However, if the user simply wants access to non-sensitive information such as stored music, the first level of confidence will suffice.
  • In addition, the level of confidence may be increased by providing multiple authentication techniques. For example, for highly sensitive data such as medical data, a third, even higher, level of confidence may be required. In this instance, the PIN entered by the user will match the PIN stored in column 410 and the angle between the user's thumb and forefinger will be within 0.3° of the stored value.
  • It should be noted here that various other levels of confidence may be derived using the other physical characteristics. For example, the hover time over the various keys may be used in conjunction with the various angles between fingers to generate numerous confidence levels. In addition, some physical characteristics are very particular to a user and so higher levels of weighting may be applied to these characteristics. For example, skin blemishes are highly particular to an individual, and are quite reliably detected. On the other hand, the dominant hand of a user is less unique to the user. Therefore, a high weighting may be applied to the skin blemish characteristic compared to the dominant hand characteristic.
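  • The sketch below illustrates how the tolerances and weightings described above might be combined. The 0.5° and 0.3° tolerances and the requirement of a PIN match at the third level are taken from the description; the weighting values and function names are purely illustrative.

```python
# Tolerance on the thumb-forefinger angle for each confidence level (degrees).
ANGLE_TOLERANCE = {1: 0.5, 2: 0.3, 3: 0.3}

def authenticate(measured_angle, stored_angle, level, entered_pin=None, stored_pin=None):
    """Authenticate to the requested level; level 3 additionally requires a PIN match."""
    angle_ok = abs(measured_angle - stored_angle) <= ANGLE_TOLERANCE[level]
    if level < 3:
        return angle_ok
    return angle_ok and entered_pin == stored_pin

# Illustrative weighting: highly user-specific traits (e.g. skin blemishes)
# carry more weight than weakly specific ones (e.g. dominant hand).
WEIGHTS = {"skin_blemish": 0.5, "finger_angle": 0.3, "hover_time": 0.15, "dominant_hand": 0.05}

def weighted_score(matches):
    """matches: dict of characteristic name -> bool (within tolerance or not)."""
    return sum(WEIGHTS[name] for name, ok in matches.items() if ok)

print(authenticate(22.2, 22.0, level=2))                              # True
print(weighted_score({"skin_blemish": True, "dominant_hand": True}))  # 0.55
```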
  • FIG. 5 shows a flow diagram explaining the authentication process associated with this embodiment.
  • The process 600 begins when a user 601 sends a request to resource 205. This may be for access to sensitive information such as via a banking application. This is noted in the request resource step 605.
  • The resource 205 will then present an authentication challenge to the device 100 in step 610. The type of challenge and the level of confidence required will be defined by the resource 205.
  • In embodiments already discussed, the authentication challenge is the entry of a PIN or passcode. However, the disclosure is in no way limited to this, and other authentication challenges may include measuring how the user inserts a key into a lock, interacts with shapes such as building blocks, presses a certain array of coloured buttons, or any other kind of interaction with an image or virtual device. The selection of the authentication challenge may be specific to the resource 205. For example, an online store may request that a passcode or PIN be entered in order for the user's identity to be authenticated. Alternatively, the resource may randomly choose an authentication challenge that has already been performed by the user during the training phase or may use the same motion as a challenge for which training has already taken place.
  • This authentication challenge is presented to the device 100, in embodiments along with the level of confidence required and the device 100 generates the challenge in step 615. In the specific embodiment described, the device 100 presents the user with the virtual keyboard 300.
  • The user then interacts with the object in 620. This is shown in FIG. 3 whereby the user enters a PIN on the virtual keyboard 300.
  • During this time, the user's interaction is measured in step 625. In other words, the device 100 captures the user's physical traits when interacting with the virtual keyboard.
  • The captured behaviour is then either compared with the user profile stored within the device 100 or, in this case, is sent in step 630 to the resource 205 for comparison with the stored table 400. In other words, the captured behaviour is compared against the template stored in table 400 either in the device 100 or, in this case, resource 205.
  • The comparison with the template validates the behaviour of the user as being that of user 1 in step 635. If the behaviour is validated in step 635, then the authentication of the user is complete. It should be noted here that the authentication is completed to the required level of confidence. As explained above, for example, for the user to be authenticated to a first level of confidence, the angle between the thumb and forefinger must be within 0.5° of that stored in the table 400; for the user to be authenticated to a second, higher, level of confidence, the angle must be within 0.3°. A success indication 640 is then provided to the device 100 and the resource 205 returns the required data, such as authorization that the transaction is complete, or returns the content stored within the resource 205. This occurs in step 645 and the resource or the success of the authentication is then displayed to the user 601 in step 650. The process then ends.
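  • Purely as an illustration of the message flow of process 600, a sketch follows in which each step is a call on a hypothetical user, device or resource object; none of these method names is defined by the disclosure.

```python
def process_600(user, device, resource):
    request = user.request_resource()                        # step 605
    challenge, level = resource.issue_challenge(request)     # step 610
    device.generate_challenge(challenge)                     # step 615 (e.g. display keyboard 300)
    interaction = user.interact(challenge)                   # step 620
    measured = device.measure_interaction(interaction)       # step 625
    resource.receive_behaviour(measured)                     # step 630
    if resource.validate_against_template(measured, level):  # step 635
        device.notify_success()                              # step 640
        content = resource.return_content()                  # step 645
        device.display(content)                              # step 650
        return content
    return None
```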
  • Although the above describes the image sensor within the device 100 capturing the user's movement, the disclosure is not so limited. In fact, some movement information or physical traits may be provided by a different wearable device. For example, the wrist rotation may be measured by a wearable wrist strap.
  • In a different embodiment of the disclosure, authentication of the user is performed by determining the proximity of the device 100 to another device that is already known to be close to the user. In order to determine whether the other device is close to the user, a behaviometric or biometric fingerprint is taken on that device. Moreover, the device that is known to be close to the user may be attached to the user or may be embedded within the user (for example under the skin of the user). In known techniques, ambient sound is detected, whereby the content of the sound is captured by both devices (that is, the device which is known to be close to the user and the device 100) and compared. In the event that the sounds are the same, it is determined that the devices are close together and therefore the authentication of the user is complete.
  • However, this has several disadvantages. Firstly, the ambient sound contains a large amount of personal information, such as the content of conversation and the voices of the people within that conversation. Additionally, environmental sounds such as announcements can indicate the location of the user. This information may be compromised and may risk the security of the user. It is an aim of the present disclosure to address this.
  • Broadly speaking, the disclosure uses the energy in the audio signals to authenticate the user. In other words, the raw audio is not compared; rather, the energy content in the audio is compared. The energy content is sometimes referred to as the sound volume or sound intensity. This reduces the information content within the audio signal but provides enough information that the proximity of the device to the user can be authenticated. This has an additional benefit of low hardware requirements, which reduces the cost and complexity of devices and battery usage. In order to ensure that the authentication is not compromised, a continuous confidence score may be determined and a table and state diagram such as those shown in FIG. 6 and FIG. 7 are maintained.
  • Referring to FIG. 6, a table 500 has a user identity column 505 which stores a unique identifier for each user.
  • A device identifier is also stored in column 510. This uniquely identifies each device associated with the user. In this case, there are three devices uniquely identified as device 1, device 2 and device 3. There is also an additional device (not shown) that is known to be close to the user. As noted above, this device may be attached to the user or embedded within the user. The location of each of device 1, device 2 and device 3, determined as explained below, is then stored in column 515. In this example, the location of each of devices 1, 2 and 3 is provided relative to the device known to be close to the user. In other words, in the example table in FIG. 6, device 1 is noted as being close to the device known to be close to user 1 and device 2 is noted as being not close to the device known to be close to user 1. Device 3 is also noted as being close to the device known to be close to user 1.
  • A confidence score is also provided in confidence score column 520. This provides a level of confidence in the determined location of the device. In the example of FIG. 6, the device 100 is 85% confident that device 1 is close to user 1. Similarly, device 100 is 82% confident that device 2 is not close to user 1 and 75% confident that device 3 is close to user 1. An explanation of the determination of the confidence score and the location of the device now follows.
  • Each piece of wearable technology (shown in FIG. 6 as device 1, 2, and 3) contains a microphone 135. The microphone 135 captures the ambient noise at regular intervals. For example, the microphone 135 captures 0.5 second samples of the ambient sound at 1 second intervals. The captured ambient sound is passed to a controller for processing.
  • The controller in each of devices 1, 2 and 3 converts the captured ambient sound to a time series of sound intensities using a known technique. The time series of sound intensities is passed to an authentication device 100 which maintains table 500. The table 500 is stored within device 100 for authentication purposes, as will be explained later. The transmission of a time series of sound intensities is useful as it contains little or no information about the user or the environment in which the user is located. This means that even if the time series of sound intensities were intercepted, no personal information would be compromised.
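  • A minimal sketch of the conversion from captured audio frames to a time series of sound intensities is given below, using the root-mean-square energy of each frame; the use of NumPy and the exact energy measure are assumptions made for illustration.

```python
import numpy as np

def intensity_series(frames):
    """frames: a list of 1-D NumPy arrays, each holding one 0.5 second audio
    sample captured at 1 second intervals. Returns one RMS intensity per
    frame, so the transmitted series carries no speech content."""
    return np.array([np.sqrt(np.mean(frame.astype(float) ** 2)) for frame in frames])
```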
  • When received by the authentication device 100, the time series of sound intensities are cross-correlated. The cross-correlation is performed between each device and the device known to be close to the user. For example, in the example of FIG. 6, the time series of sound intensities from device 1 is cross-correlated with the time series of sound intensities from the device known to be close to the user. Similarly, the time series of sound intensities from device 2, and separately device 3, is cross-correlated with the time series of sound intensities from the device known to be close to the user. Cross-correlation is a known technique and will not be described in any detail hereinafter.
  • It is envisaged that prior to cross-correlation, the time series will be converted to a frequency domain representation using a Fast Fourier Transform (FFT) or the like.
  • The output of the cross-correlation will determine how similar the ambient sounds are at the sample time. Where the level of similarity is at or above a threshold value, a continuous similarity score will be increased by an amount. Alternatively, where the level of similarity is below the threshold, the similarity score will be decreased by an amount.
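  • The following sketch shows one way the similarity between two intensity series could be evaluated by normalised cross-correlation and used to raise or lower a continuous similarity score. The 85% match threshold appears in the description below; the scaling of the correlation and the increment sizes are illustrative.

```python
import numpy as np

def similarity(a, b):
    """Peak of the normalised cross-correlation between two equal-length
    intensity series (scaled so a perfect match gives roughly 1.0)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return (np.correlate(a, b, mode="full") / len(a)).max()

def update_score(score, a, b, match_threshold=0.85, step=5, floor=0, ceiling=100):
    """Increase the continuous score when the series match, decrease it otherwise."""
    if similarity(a, b) >= match_threshold:
        return min(ceiling, score + step)
    return max(floor, score - step)
```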
  • Referring to FIG. 7, a state diagram 700 explaining the mechanism for maintaining the confidence score is shown. In FIG. 7, the state diagram 700 starts with the proximity between the device known to be close to the user and, say, device 1 set to "not close" 705. The sound intensity from the device known to be close to the user and from device 1 is received at the device 100 periodically. Of course, device 100 may be one of these devices measuring the sound intensity. In examples, the sound intensity is received every 0.5 seconds. Of course, other periods are envisaged and these periods may or may not be regular. This is step 720 in the state diagram 700.
  • In step 725 a decision has to be made. Specifically, the controller 105 determines whether the cross-correlation of the sound intensity results in a match between the two received sound intensities. In this case, the match may be that the output of the cross-correlation is at or above a threshold of, say, 85%. If the sound intensities match, the “signals match increment score” path is followed to step 730. The continuous score associated with device 1 is incremented. In step 730, a decision is made. Specifically, the continuous score associated with device 1 is reviewed. In the event that the continuous score is at or above a threshold of, say, 80, device 1 is determined to be close to the device known to be close to the user. On the other hand, if the continuous score is below the threshold of, in this case, 80, then device 1 is deemed to be not close to the device which is known to be close to the user.
  • Returning now to step 725, if the output of the cross-correlation is below the threshold of, in this case, 85%, the signals are deemed not to match and the continuous score is not incremented. The path 715 is followed to the determination in 705 that device 1 is not close to the device known to be close to the user.
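  • A minimal sketch of state diagram 700 for a single device follows. The 80-point score threshold and the match decision at step 725 are taken from the description; the increment size is illustrative.

```python
class ProximityState:
    """Tracks the 'close' / 'not close' state of state diagram 700 for one device."""

    def __init__(self, score_threshold=80):
        self.score = 0
        self.score_threshold = score_threshold
        self.state = "not close"                  # starting state 705

    def on_intensity_pair(self, signals_match):
        """signals_match: True when the cross-correlation of the two
        sound-intensity series is at or above the 85% threshold (decision 725)."""
        if signals_match:
            self.score += 5                       # increment amount is illustrative
            # Decision 730: compare the continuous score with the threshold
            self.state = "close" if self.score >= self.score_threshold else "not close"
        else:
            self.state = "not close"              # path 715: score not incremented
        return self.state
```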
  • Numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practiced otherwise than as specifically described herein.
  • In so far as embodiments of the disclosure have been described as being implemented, at least in part, by software-controlled data processing apparatus, it will be appreciated that a non-transitory machine-readable medium carrying such software, such as an optical disk, a magnetic disk, semiconductor memory or the like, is also considered to represent an embodiment of the present disclosure.
  • It will be appreciated that the above description for clarity has described embodiments with reference to different functional units, circuitry and/or processors. However, it will be apparent that any suitable distribution of functionality between different functional units, circuitry and/or processors may be used without detracting from the embodiments.
  • Described embodiments may be implemented in any suitable form including hardware, software, firmware or any combination of these. Described embodiments may optionally be implemented at least partly as computer software running on one or more data processors and/or digital signal processors. The elements and components of any embodiment may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the disclosed embodiments may be implemented in a single unit or may be physically and functionally distributed between different units, circuitry and/or processors.
  • Although the present disclosure has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in any manner suitable to implement the technique.
  • Embodiments of the present technique can generally be described by the following numbered clauses:
  • 1. A device for authenticating a user, comprising a sensor configured to measure the movement of a user in response to the interaction of the user with a displayed image and controller circuitry configured to authenticate the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
  • 2. A device according to clause 1, comprising storage configured to store a user profile having associated therewith the stored movement and wherein the controller circuitry is configured to compare the movement of the user with the stored movement in the user profile, and in the event of a positive comparison, authenticating the user.
  • 3. A device according to clause 1 or 2, wherein the sensor is an image sensor or a wearable sensor located on the user's body.
  • 4. A device according to any preceding clause, wherein the displayed image is a virtual object associated with the user and the stored movement is associated with the virtual object.
  • 5. A device according to any preceding clause, wherein the displayed image is one of a virtual keyboard.
  • 6. A device according to clause 5, wherein the controller circuitry is further configured to authenticate the user in response to a positive comparison with entry of a predefined code associated with the user on the virtual keyboard.
  • 7. A device according to clause 6, wherein the predefined code is a personal identification number.
  • 8. A device according to any preceding clause, wherein the controller circuitry is configured to authenticate the user in accordance with a physiological trait of the user.
  • 9. A device for authenticating a user, comprising microphone circuitry configured to capture the ambient sound over a predetermined period, and controller circuitry configured to: measure the energy of the ambient sound over a predefined period; compare the time series of the measured energy with a received time series of the measure of the energy of the ambient sound from a second device; and authenticate the user in the event of a positive comparison.
  • 10. A device according to clause 9, wherein the microphone circuitry is configured to capture the ambient sound over a time period and to update a comparison result for each time period and to authenticate the user in the event that the comparison result reaches a predetermined threshold.
  • 11. A method of authenticating a user, comprising measuring, using a sensor, the movement of a user in response to the interaction of the user with a displayed image and authenticating the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
  • 12. A method according to clause 11, comprising storing a user profile having associated therewith a stored movement and comparing the movement of the user with the stored movement in the user profile, and in the event of a positive comparison, authenticating the user.
  • 13. A method according to clause 11 or 12, wherein the sensor is an image sensor or a wearable sensor located on the user's body.
  • 14. A method according to any one of clauses 11 to 13, wherein the displayed image is a virtual object associated with the user and the stored movement is associated with the virtual object.
  • 15. A method according to any one of clauses 11 to 14, wherein the displayed image is one of a virtual keyboard.
  • 16. A method according to clause 15, comprising authenticating the user in response to a positive comparison with entry of a predefined code associated with the user on the virtual keyboard.
  • 17. A method according to clause 16, wherein the predefined code is a personal identification number.
  • 18. A method according to any one of clauses 11 to 17, comprising authenticating the user in accordance with a physiological trait of the user.
  • 19. A method of authenticating a user, comprising capturing, using microphone circuitry, the ambient sound over a predetermined period, and the method comprising measuring the energy of the ambient sound over a predefined period; comparing the time series of the measured energy with a received time series of the measure of the energy of the ambient sound from a second device; and authenticating the user in the event of a positive comparison.
  • 20. A method according to clause 19, comprising capturing the ambient sound over a time period and updating a comparison result for each time period and to authenticate the user in the event that the comparison result reaches a predetermined threshold.
  • 21. A computer program product comprising computer readable instructions which, when loaded onto a computer, configures the computer to perform the method according to any one of clauses 11 to 20.

Claims (21)

1. A device for authenticating a user, comprising a sensor configured to measure the movement of a user in response to the interaction of the user with a displayed image and controller circuitry configured to authenticate the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
2. A device according to claim 1, comprising storage configured to store a user profile having associated therewith the stored movement and wherein the controller circuitry is configured to compare the movement of the user with the stored movement in the user profile, and in the event of a positive comparison, authenticating the user.
3. A device according to claim 1, wherein the sensor is an image sensor or a wearable sensor located on the user's body.
4. A device according to claim 1, wherein the displayed image is a virtual object associated with the user and the stored movement is associated with the virtual object.
5. A device according to claim 1, wherein the displayed image is one of a virtual keyboard.
6. A device according to claim 5, wherein the controller circuitry is further configured to authenticate the user in response to a positive comparison with entry of a predefined code associated with the user on the virtual keyboard.
7. A device according to claim 6, wherein the predefined code is a personal identification number.
8. A device according to claim 1, wherein the controller circuitry is configured to authenticate the user in accordance with a physiological trait of the user.
9. A device for authenticating a user, comprising microphone circuitry configured to capture the ambient sound over a predetermined period, and controller circuitry configured to: measure the energy of the ambient sound over a predefined period; compare the time series of the measured energy with a received time series of the measure of the energy of the ambient sound from a second device; and authenticate the user in the event of a positive comparison.
10. A device according to claim 9, wherein the microphone circuitry is configured to capture the ambient sound over a time period and to update a comparison result for each time period and to authenticate the user in the event that the comparison result reaches a predetermined threshold.
11. A method of authenticating a user, comprising measuring, using a sensor, the movement of a user in response to the interaction of the user with a displayed image and authenticating the user in response to a positive comparison between the movement of the user and a stored movement associated with the user.
12. A method according to claim 11, comprising storing a user profile having associated therewith a stored movement and comparing the movement of the user with the stored movement in the user profile, and in the event of a positive comparison, authenticating the user.
13. A method according to claim 11, wherein the sensor is an image sensor or a wearable sensor located on the user's body.
14. A method according to claim 11, wherein the displayed image is a virtual object associated with the user and the stored movement is associated with the virtual object.
15. A method according to claim 11, wherein the displayed image is one of a virtual keyboard.
16. A method according to claim 15, comprising authenticating the user in response to a positive comparison with entry of a predefined code associated with the user on the virtual keyboard.
17. A method according to claim 16, wherein the predefined code is a personal identification number.
18. A method according to claim 11, comprising authenticating the user in accordance with a physiological trait of the user.
19. A method of authenticating a user, comprising capturing, using microphone circuitry, the ambient sound over a predetermined period, and the method comprising measuring the energy of the ambient sound over a predefined period; comparing the time series of the measured energy with a received time series of the measure of the energy of the ambient sound from a second device; and authenticating the user in the event of a positive comparison.
20. A method according to claim 19, comprising capturing the ambient sound over a time period and updating a comparison result for each time period and to authenticate the user in the event that the comparison result reaches a predetermined threshold.
21. A computer program product comprising computer readable instructions which, when loaded onto a computer, configures the computer to perform the method according to claim 11.
US16/336,470 2016-09-28 2017-09-07 A device, computer program and method Abandoned US20190253883A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP16191189 2016-09-28
EP16191189.6 2016-09-28
PCT/EP2017/072513 WO2018059905A1 (en) 2016-09-28 2017-09-07 A device, computer program and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/072513 A-371-Of-International WO2018059905A1 (en) 2016-09-28 2017-09-07 A device, computer program and method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/057,763 Division US12015925B2 (en) 2016-09-28 2022-11-22 Device, computer program and method

Publications (1)

Publication Number Publication Date
US20190253883A1 true US20190253883A1 (en) 2019-08-15

Family

ID=57047039

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/336,470 Abandoned US20190253883A1 (en) 2016-09-28 2017-09-07 A device, computer program and method
US18/057,763 Active US12015925B2 (en) 2016-09-28 2022-11-22 Device, computer program and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/057,763 Active US12015925B2 (en) 2016-09-28 2022-11-22 Device, computer program and method

Country Status (4)

Country Link
US (2) US20190253883A1 (en)
EP (1) EP3520456A1 (en)
CN (1) CN109804652A (en)
WO (1) WO2018059905A1 (en)


Also Published As

Publication number Publication date
US12015925B2 (en) 2024-06-18
EP3520456A1 (en) 2019-08-07
WO2018059905A1 (en) 2018-04-05
US20230080732A1 (en) 2023-03-16
CN109804652A (en) 2019-05-24
