
WO2021192061A1 - Authentication control device, authentication system, authentication control method, and recording medium - Google Patents

Authentication control device, authentication system, authentication control method, and recording medium Download PDF

Info

Publication number
WO2021192061A1
Authority
WO
WIPO (PCT)
Prior art keywords
authentication
face
image
user
input
Prior art date
Application number
PCT/JP2020/013145
Other languages
French (fr)
Japanese (ja)
Inventor
樹輝 阿久津
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社
Priority to JP2022509854A priority Critical patent/JPWO2021192061A5/en
Priority to PCT/JP2020/013145 priority patent/WO2021192061A1/en
Publication of WO2021192061A1 publication Critical patent/WO2021192061A1/en

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • This disclosure relates to an authentication control device, an authentication system, an authentication control method, and a recording medium.
  • Patent Document 1 describes a system that performs normal payment processing if authentication (verification) using identification information is successful even when face recognition fails.
  • An object of the present disclosure is to provide an authentication control device, an authentication system, an authentication control method, and a recording medium capable of correctly authenticating the person to be recognized even when a third party other than that person (for example, a third party standing in line behind, or a third party crossing the shooting range of the camera) appears in the captured image.
  • In one aspect, the authentication control device includes: a photographing control means for causing a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition means for acquiring the image captured by the camera; a face detection means for detecting, from the acquired image, the face area connected to the finger performing the input operation on the input means; and an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face detection means.
  • In one aspect, the authentication system includes: an input means; a camera that captures an image including a user who is performing an input operation on the input means with a finger; an image acquisition means for acquiring the image captured by the camera; a face detection means for detecting, from the acquired image, the face area connected to the finger performing the input operation on the input means; and an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face detection means.
  • In one aspect, the authentication control method includes: a photographing control step of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition step of acquiring the image captured by the camera; a face detection step of detecting, from the image acquired in the image acquisition step, the face area connected to the finger performing the input operation on the input means; and an authentication control step of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected in the face detection step.
  • In one aspect, the recording medium is a computer-readable recording medium on which a program is recorded, the program causing an electronic device including at least one processor to execute: a photographing control process of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition process of acquiring the image captured by the camera; a face detection process of detecting, from the image acquired in the image acquisition process, the face area connected to the finger performing the input operation on the input means; and an authentication control process of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected in the face detection process.
  • According to the present disclosure, an authentication control device, an authentication system, an authentication control method, and a recording medium capable of correctly authenticating the person to be recognized can be provided.
  • FIG. 1 is a schematic configuration diagram of the authentication control device 20.
  • As shown in FIG. 1, the authentication control device 20 includes: a shooting control means 22b that causes the camera 31 to capture an image including a user (person to be authenticated) who is performing an input operation on the input means 33 with a finger; an image acquisition means 22c that acquires the image captured by the camera 31; a face detection means 22k that detects, from the acquired image, the face area connected to the finger performing the input operation on the input means 33; and an authentication control means 22d that causes the authentication device 10, which executes face authentication, to execute face authentication of the user included in the face area detected by the face detection means 22k.
  • FIG. 2 is a flowchart of an example of the operation of the authentication control device 20.
  • the photographing control means 22b causes the camera 31 to take an image including a user who is performing an input operation with a finger on the input means 33 (step S10).
  • the image acquisition means 22c acquires the image captured by the camera 31 (step S11).
  • the face detection means 22k detects the face area to which the fingers performing the input operation on the input means 33 are connected from the image acquired in step S11 (step S12).
  • the authentication control means 22d causes the authentication device 10 that executes face authentication to execute the face authentication of the user included in the face area detected in step S12 (step S13).
  • Thereby, even if a third party other than the person to be authenticated (for example, a third party standing in line behind, or a third party crossing the shooting range of the camera 31) appears in the image captured by the camera 31, the person to be recognized can be correctly authenticated.
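The flow of FIG. 2 (steps S10 to S13) can be sketched as follows. The class and method names are hypothetical illustrations; the patent defines abstract means, not an API.

```python
# Minimal sketch of the flow in FIG. 2 (steps S10-S13).
# All names here are assumptions for illustration only.
class AuthenticationController:
    def __init__(self, camera, face_detector, authenticator):
        self.camera = camera                # shooting control / image acquisition (22b, 22c)
        self.face_detector = face_detector  # face detection means (22k)
        self.authenticator = authenticator  # authentication device (10)

    def authenticate_operator(self):
        image = self.camera.capture()       # S10-S11: shoot and acquire the image
        # S12: detect the face area connected to the finger on the input means
        face_area = self.face_detector.detect_operator_face(image)
        if face_area is None:
            return False
        # S13: have the authentication device run face authentication on that area
        return self.authenticator.verify(face_area)
```

The controller itself holds no recognition logic; it only orchestrates the camera, the face detector, and the external authentication device, mirroring the separation of means in the claims.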
  • In the second embodiment, a photographing control unit is used as the photographing control means 22b, an image acquisition unit is used as the image acquisition means 22c, and an authentication control unit is used as the authentication control means 22d.
  • A payment terminal is used as the information processing terminal 30, and a touch panel is used as the input means 33 (hereinafter referred to as the touch panel 33).
  • FIG. 3 is a block diagram showing the configuration of the authentication system 1 according to the second embodiment.
  • the authentication system 1 includes an authentication device 10, an authentication control device 20, and a payment terminal 30 that can communicate with each other via a network NW (for example, the Internet).
  • FIG. 4 is a schematic configuration diagram of the authentication device 10.
  • the authentication device 10 includes a storage unit 11, a control unit 12, a memory 13, and a communication unit 14.
  • the storage unit 11 is, for example, a non-volatile storage unit such as a hard disk device or a ROM.
  • the program 11a and the face information DB 11b are stored in the storage unit 11.
  • Program 11a is a program executed by the control unit 12 (processor).
  • In the face information DB 11b, user IDs and the face feature information of the corresponding users are stored (registered) in association with each other.
  • In response to a face recognition request received from the outside (for example, the authentication control device 20), the authentication device 10 collates the face image or face feature information included in the request with the face feature information of each user, and returns the verification result to the requester.
  • control unit 12 includes a processor.
  • the processor is, for example, a CPU (Central Processing Unit). There may be one processor or multiple processors.
  • By executing the program 11a read from the storage unit 11 into the memory 13, the processor functions as an image acquisition unit 12a, a face detection unit 12b, a feature point extraction unit 12c, a registration unit 12d, and an authentication unit 12e. Some or all of these may be implemented in hardware.
  • the image acquisition unit 12a acquires an image including the user's face.
  • the image acquisition unit 12a acquires an image received by the communication unit 14.
  • the images received by the communication unit 14 include a registration image transmitted from a user terminal (not shown) and an authentication (verification) image transmitted from the authentication control device 20.
  • the face detection unit 12b detects a face region from the image acquired by the image acquisition unit 12a and outputs it to the feature point extraction unit 12c.
  • the feature point extraction unit 12c extracts feature points (for example, facial feature points such as eyes, nose, and mouth edge) from the face region detected by the face detection unit 12b.
  • When the image acquired by the image acquisition unit 12a is an image for registration, the feature point extraction unit 12c outputs the face feature information to the registration unit 12d.
  • the face feature information is a set of extracted feature points.
  • When the image is an image for authentication, the feature point extraction unit 12c outputs the face feature information to the authentication unit 12e.
  • the registration unit 12d newly issues a user ID when registering facial feature information.
  • the registration unit 12d registers the issued user ID and the face feature information extracted from the image for registration in the face information DB 11b in association with each other.
  • the authentication unit 12e collates the face feature information extracted from the face area detected from the authentication image with the face feature information in the face information DB 11b.
  • the authentication unit 12e returns to the authentication control device 20 whether or not the facial feature information matches.
  • the presence or absence of matching of facial feature information corresponds to the success or failure of authentication.
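The collation performed by the authentication unit 12e (matching extracted face feature information against the face information DB 11b) can be sketched as follows. The patent does not specify a similarity metric; cosine similarity, the threshold value, and the function name are all assumptions for illustration.

```python
import math

def match_face(probe, face_db, threshold=0.8):
    """Collate a probe feature vector against registered vectors (face info DB).

    Returns the matching user ID, or None when no registered face matches,
    i.e. when face authentication fails. The metric and threshold are
    illustrative assumptions, not part of the patent text.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    best_id, best_score = None, threshold
    for user_id, registered in face_db.items():
        score = cosine(probe, registered)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id
```

Returning the matched user ID on success and None on failure mirrors steps S25 to S27, where the identified user ID or a failure result is returned to the authentication control device 20.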
  • the communication unit 14 is a communication device that communicates with the authentication control device 20 via the network NW.
  • FIG. 5 is a flowchart of an example of the operation of the authentication device 10 (face information registration process).
  • the authentication device 10 acquires an image (image for registration) including the user's face included in the face information registration request (step S10).
  • the authentication device 10 receives the face information registration request from the user terminal (not shown) via the network NW.
  • the authentication device 10 (face detection unit 12b) detects the face area from the registration image acquired in step S10 (step S11).
  • the authentication device 10 (feature point extraction unit 12c) extracts facial feature points from the face region detected in step S11 (step S12), and outputs face feature information to the registration unit 12d.
  • the authentication device 10 (registration unit 12d) issues a user ID, associates the user ID with the face feature information, and registers the user ID in the face information DB 11b (step S13).
  • the authentication device 10 may receive face feature information from a face authentication terminal or the like and register it in the face information DB 11b in association with the user ID.
  • FIG. 6 is a flowchart of an example of the operation (face recognition processing) of the authentication device 10.
  • the authentication device 10 acquires an image (image for authentication) including the user's face included in the face authentication request (step S20).
  • the authentication device 10 receives the face recognition request from the authentication control device 20 via the network NW.
  • The authentication device 10 (face detection unit 12b) detects the face region from the authentication image acquired in step S20 (step S21).
  • the feature point extraction unit 12c extracts facial feature points from the face region detected in step S21 (step S22).
  • the authentication device 10 may receive face feature information from the authentication control device 20.
  • the authentication device 10 (authentication unit 12e) collates the acquired face feature information with the face information DB 11b (step S23).
  • When the collation succeeds (step S24: Yes), the authentication unit 12e identifies the user ID of the user whose face feature information matches (step S25), and returns to the authentication control device 20 that the face authentication was successful, together with the identified user ID (step S26).
  • When the collation fails (step S24: No), the authentication unit 12e returns to the authentication control device 20 that the face authentication has failed (step S27).
  • FIG. 7 is a schematic configuration diagram of the authentication control device 20.
  • the authentication control device 20 is an information processing device that performs authentication control processing, and is, for example, a server device realized by a computer.
  • the authentication control device 20 includes a storage unit 21, a control unit 22, a memory 23, and a communication unit 24.
  • the storage unit 21 is a non-volatile storage unit such as a hard disk device or a ROM.
  • the program 21a and the personal authentication information 21b are stored in the storage unit 21.
  • Program 21a is a program executed by the control unit 22 (processor).
  • the personal authentication information 21b is information in which the user ID (plurality) and the authentication information of the user are associated with each other.
  • the authentication information is, for example, a PIN (Personal Identification Number).
  • the authentication information is referred to as a registered PIN.
  • the PIN can be called, for example, PIN information or a PIN code.
  • a PIN is a simple code having about 10 digits or less, which is composed of a combination of a plurality of elements (for example, numbers, letters, and symbols).
  • One user ID is assigned to one user.
  • one PIN is assigned to a plurality of users.
  • One PIN may be assigned to one user.
  • control unit 22 includes a processor.
  • the processor is, for example, a CPU (Central Processing Unit). There may be one processor or multiple processors.
  • The processor executes the program 21a read from the storage unit 21 into the memory 23 (for example, RAM), thereby functioning as a display control unit 22a, a photographing control unit 22b, an image acquisition unit 22c, an authentication control unit 22d, a face authentication result acquisition unit 22e, an authentication information acquisition unit 22f, a collation unit 22g, a processing control unit 22h, a feature point detection unit 22j, and a face area detection unit 22k. Some or all of these may be implemented in hardware.
  • the display control unit 22a displays, for example, the PIN input reception screen G1 (authentication information input screen, see FIG. 8) on the display surface of the display unit 32 of the payment terminal 30. Specifically, the display control unit 22a transmits a screen display instruction for displaying the PIN input reception screen G1 to the payment terminal 30 via the communication unit 24. When displaying another screen, for example, the payment execution inquiry screen G2 (see FIG. 11), the same screen display instruction is transmitted to the payment terminal 30 via the communication unit 24.
  • the PIN input reception screen G1 and the payment execution inquiry screen G2 will be described later.
  • The shooting control unit 22b activates the camera 31 of the payment terminal 30 in response to the input operation of the user U1 on the touch panel 33 of the payment terminal 30, and causes it to capture an image including the face of the user U1. Specifically, when the communication unit 24 receives an input start detection notification (a notification that the user U1 has started inputting the PIN) transmitted from the payment terminal 30, the shooting control unit 22b transmits a shooting instruction to the payment terminal 30 via the communication unit 24.
  • The image acquisition unit 22c acquires an image (hereinafter also referred to as a captured image) captured by the camera 31 activated by the shooting control unit 22b. Specifically, the communication unit 24 receives the captured image transmitted from the payment terminal 30, and the image acquisition unit 22c acquires the captured image received by the communication unit 24.
  • the authentication control unit 22d causes the authentication device 10 that executes face authentication to execute the face authentication of the user U1 included in the captured image acquired by the image acquisition unit 22c. Specifically, the authentication control unit 22d transmits the captured image acquired by the image acquisition unit 22c to the authentication device 10 via the communication unit 24. Instead of the captured image, the face region (or the feature point extracted from the captured image) detected from the captured image may be transmitted to the authentication device 10.
  • the face authentication result acquisition unit 22e acquires the result of the face authentication executed by the authentication device 10. Specifically, the communication unit 24 receives the face authentication result transmitted from the authentication device 10, and the face authentication result acquisition unit 22e acquires the face authentication result received by the communication unit 24.
  • the authentication information acquisition unit 22f acquires a PIN (hereinafter referred to as an input PIN) input via the touch panel 33 of the payment terminal 30. Specifically, the communication unit 24 receives the input PIN transmitted from the payment terminal 30, and the authentication information acquisition unit 22f acquires the input PIN received by the communication unit 24.
  • the collation unit 22g collates the input PIN acquired by the authentication information acquisition unit 22f with the registered PIN (see FIG. 7) stored in the storage unit 21.
  • The processing control unit 22h causes the payment terminal 30, which executes the payment processing, to execute the payment processing when the face authentication by the authentication device 10 is successful and the collation results by the collation unit 22g match. Specifically, the processing control unit 22h transmits a payment processing instruction to the payment terminal 30 via the communication unit 24.
  • The settlement (payment) process is an example of the predetermined process of the present disclosure.
  • the feature point detection unit 22j detects the physical feature points (for example, the feature points of the skeleton) of the person included in the captured image from the captured image acquired by the image acquisition unit 22c.
  • Skeleton feature points can be detected by existing techniques (eg, OpenPose®).
  • the detected feature points include, for example, fingers, wrists, elbows, shoulders, neck, face (nose, ears, eyes), hips, knees, and ankles.
  • FIG. 13 is an example of a captured image (hereinafter referred to as a captured image I) acquired by the image acquisition unit 22c.
  • FIG. 14 is an example of a feature point and a face region detected from the captured image I of FIG.
  • In FIG. 14, the symbols P1R, P2R, P2L, P3R, P3L, P4, and P5 represent the detected feature points, and the symbols A1 and A2 represent the detected face regions.
  • Since the photographed image I includes two users, the user U1 who is performing an input operation on the touch panel 33 of the payment terminal 30 by hand and the user U2 who is standing in line behind the user U1, the feature points of the user U1 (for example, wrist P1R, elbows P2R and P2L, shoulders P3R and P3L, neck P4, face P5) and the feature points of the user U2 are detected from the photographed image I (see FIG. 13), as shown in FIG. 14.
  • That is, feature points are detected for each person included in the captured image.
  • the feature points may or may not be displayed on the captured image.
  • the face area detection unit 22k detects the face area to which the fingers performing the input operation on the touch panel 33 of the payment terminal 30 are connected from the captured image acquired by the image acquisition unit 22c. For example, from the captured image I shown in FIG. 13, the face area A1 shown in FIG. 14 is detected as the face area to which the fingers performing the input operation are connected to the touch panel 33 of the payment terminal 30.
  • the process of detecting the face region to which the fingers are connected will be described in more detail in the following example of the operation of the authentication control device 20 (mainly, the face detection process by the face region detection unit 22k).
  • FIG. 15 is a flowchart of an example of the operation of the authentication control device 20 (mainly the face detection process by the face area detection unit 22k).
  • An example in which the face area A1 shown in FIG. 14 is detected, from the captured image I shown in FIG. 13, as the face area connected to the finger performing the input operation on the touch panel 33 of the payment terminal 30 will be described.
  • the image acquisition unit 22c acquires a captured image captured by the camera 31 activated by the imaging control unit 22b (step S40). Here, it is assumed that the captured image I shown in FIG. 13 has been acquired.
  • the feature point detection unit 22j detects the feature points of the skeleton of the person included in the captured image from the captured image acquired in step S40 (step S41).
  • Thereby, the feature points of the user U1 shown in FIG. 14 (wrist P1R, elbows P2R and P2L, shoulders P3R and P3L, neck P4, face P5) and the feature points of the user U2 (omitted in FIG. 14) are detected.
  • the face area detection unit 22k detects the face area from the captured image acquired in step S40 (step S42).
  • This face region can be detected by existing technology.
  • the face area detection unit 22k detects the finger closest to the touch panel 33 of the payment terminal 30 (step S43). For example, the wrist feature point (or finger feature point) having the shortest distance L (see FIG. 14) from the lower side of the captured image I is detected.
  • the wrist feature point P1R of the user U1 is detected as the wrist feature point having the shortest distance L from the lower side of the captured image I.
  • Among the face regions detected in step S42, the face area detection unit 22k determines the face region closest to the facial feature point of the user having the finger (finger feature point) closest to the touch panel 33 detected in step S43 (step S44).
  • The determined face area A1 is the face area connected to the finger performing the input operation on the touch panel 33 of the payment terminal 30, that is, the face area of the person to be authenticated who is inputting a PIN to perform payment processing.
  • In this way, even when a plurality of persons appear in the captured image, the face area connected to the finger performing the input operation on the touch panel 33 of the payment terminal 30 (here, the face area A1 of the user U1) can be detected (determined).
  • steps S40 to S44 are not limited to this order.
  • the execution order of step S41 and step S42 may be reversed.
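The selection logic of steps S43 and S44 can be sketched as follows. The data layout (keypoint dictionaries, face-region dictionaries) and the use of the image's lower edge as a proxy for the touch panel's position (distance L in FIG. 14) are illustrative assumptions, not part of the patent text.

```python
def select_operator_face(keypoints_per_person, face_regions, image_height):
    """Pick the face region of the person operating the touch panel.

    S43: find the person whose wrist keypoint has the shortest distance L
    to the lower side of the captured image (closest to the touch panel);
    coordinates are (x, y) with y growing downward.
    S44: among the detected face regions, return the one nearest that
    person's face keypoint. The dictionary layout is an assumption.
    """
    # S43: wrist with the shortest distance to the image's lower edge
    operator = min(
        keypoints_per_person,
        key=lambda p: image_height - p["wrist"][1],
    )
    # S44: face region nearest the operator's face keypoint
    fx, fy = operator["face"]
    return min(
        face_regions,
        key=lambda r: (r["cx"] - fx) ** 2 + (r["cy"] - fy) ** 2,
    )
```

With two detected persons, the person whose wrist is lowest in the frame (the one touching the panel) wins step S43, so a bystander standing behind, whose wrist sits higher in the image, never contributes the selected face region.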
  • FIG. 8 is an external view of the payment terminal 30, and FIG. 9 is a schematic configuration diagram.
  • the payment terminal 30 is an information processing device including a camera 31, a display unit 32, a touch panel 33, a storage unit 34, a control unit 35, a memory 36, and a communication unit 37.
  • the payment terminal 30 is installed in a store, for example.
  • the authentication device 10 and the authentication control device 20 may be installed in the same store or in a place remote from the store.
  • the payment terminal 30 executes face authentication and PIN verification (two-factor authentication) before the payment process. Then, when the face authentication is successful and the PIN verification results match, the payment terminal 30 performs the payment process. Since publicly known technology can be used for the processing before and after the two-factor authentication such as the input (reading) of the purchased product and the payment processing itself, detailed description thereof will be omitted.
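The two-factor gate described above (payment is executed only when face authentication succeeds and the PIN verification results match) can be sketched as follows; the function name and the dictionary holding registered PINs are hypothetical.

```python
def authorize_payment(face_auth_ok, face_user_id, input_pin, registered_pins):
    """Two-factor gate: allow payment only when face authentication succeeded
    AND the input PIN matches the registered PIN for that user.

    `registered_pins` (a user-ID-to-PIN mapping) is an assumed stand-in for
    the personal authentication information 21b.
    """
    if not face_auth_ok:
        return False  # face authentication failed: no payment
    return registered_pins.get(face_user_id) == input_pin
```

Either factor failing blocks the payment processing instruction, matching the behavior of the processing control unit 22h.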
  • the camera 31 captures an image including the face of the user U1.
  • The camera 31 is attached to the upper part of the frame of the display unit 32 so that it can photograph, from the front or substantially from the front, the face of the user U1 who is inputting the PIN via the touch panel 33, that is, the user U1 facing the display surface of the display unit 32 on which the PIN input reception screen G1 is displayed (and the camera 31 arranged in its vicinity).
  • the display unit 32 is, for example, a display such as a display with a touch panel.
  • a display with a touch panel is also called a touch screen display.
  • the PIN input reception screen G1 is displayed on the display surface of the display unit 32.
  • the PIN input reception screen G1 includes an image g symbolizing each element constituting the PIN, which the finger of the user U1 should follow via the touch panel 33.
  • Each image g is displayed on the display surface of the display unit 32 in a state of being arranged in a predetermined pattern. For example, the images g are displayed arranged in a 3×3 grid pattern (see FIG. 8).
  • the touch panel 33 is an input device operated by the user U1.
  • the touch panel 33 is arranged so as to cover the display surface of the display unit 32.
  • the touch panel 33 is used, for example, to input the PIN by the finger of the user U1.
  • The user U1 inputs the PIN by moving a finger in contact with the touch panel 33 and tracing, in order from the beginning of the PIN, the images g corresponding to the elements constituting the PIN.
  • Since the PIN is a simple code of about 10 digits or less composed of a combination of a plurality of elements (for example, numbers, letters, and symbols), and can be input merely by moving the finger to trace the corresponding images g as many times as the PIN has elements, the burden of PIN input on the user U1 is reduced.
  • The PIN may also be input by tapping, via the touch panel 33, a software keyboard (not shown) displayed on the display surface of the display unit 32, or via a physical keyboard (not shown) arranged in the vicinity of the display unit 32.
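The trace-style input above (a finger path over the grid of images g converted to a PIN) can be sketched as follows. The grid labels, cell size, and duplicate-merging rule are assumptions for illustration; the patent does not prescribe them.

```python
# Hypothetical 3x3 grid of images g; labels and cell size are assumptions.
GRID = [["1", "2", "3"],
        ["4", "5", "6"],
        ["7", "8", "9"]]

def trace_to_pin(touch_points, cell_size=100):
    """Convert a traced finger path (list of (x, y) touch coordinates)
    into a PIN string.

    Consecutive samples landing in the same cell are merged, so dwelling
    on one image g emits a single PIN element.
    """
    pin = []
    for x, y in touch_points:
        row, col = int(y // cell_size), int(x // cell_size)
        if 0 <= row < 3 and 0 <= col < 3:
            element = GRID[row][col]
            if not pin or pin[-1] != element:
                pin.append(element)
    return "".join(pin)
```

A path that dwells in the top-left cell, then moves right one cell and down one cell, would yield the three-element PIN corresponding to those cells.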
  • the storage unit 34 is a non-volatile storage unit such as a hard disk device or a ROM.
  • the program 34a is stored in the storage unit 34.
  • Program 34a is a program executed by the control unit 35 (processor).
  • control unit 35 includes a processor.
  • the processor is, for example, a CPU (Central Processing Unit). There may be one processor or multiple processors.
  • the processor functions as a display control unit 35a, an input state detection unit 35b, a shooting control unit 35c, and a payment processing unit 35d by executing a program 34a read from the storage unit 34 into the memory 36 (for example, RAM). Some or all of these may be implemented in hardware.
  • When the communication unit 37 receives a screen display instruction (an instruction for displaying the PIN input reception screen G1) transmitted from the authentication control device 20, the display control unit 35a displays the PIN input reception screen G1 (see FIG. 8) on the display surface of the display unit 32.
  • The PIN input reception screen G1 may be displayed based on information stored in advance in the storage unit 34 of the payment terminal 30, or may be displayed based on information received from the authentication control device 20 together with the screen display instruction.
  • the input state detection unit 35b detects the input state of the PIN.
  • the PIN input state detected by the input state detection unit 35b is, for example, a PIN input start and a PIN input end.
  • the shooting control unit 35c activates the camera 31 to shoot an image including the face of the user U1.
  • the camera 31 is activated by supplying power to the camera 31 or shifting the camera 31 from the sleep state to the normal state.
  • the shooting control unit 35c stops the activation of the camera 31 when the shooting is completed.
  • the activation of the camera 31 is stopped by stopping the power supply to the camera 31 or putting the camera 31 into the sleep state.
  • the payment processing unit 35d executes the payment processing when, for example, the communication unit 37 receives the payment processing instruction transmitted from the authentication control device 20. That is, the payment process is completed based on the payment information associated with the face-authenticated user U1.
  • the communication unit 37 is a communication device that communicates with the authentication control device 20 via the network NW.
  • FIG. 10 is a sequence diagram of the authentication system 1 (first operation example).
  • The authentication control device 20 (display control unit 22a) transmits a screen display instruction for displaying the PIN input reception screen G1 to the payment terminal 30 via the communication unit 24 (step S10).
  • When the screen display instruction is received, the payment terminal 30 displays the PIN input reception screen G1 (see FIG. 8) on the display surface of the display unit 32 (step S11).
  • the payment terminal 30 activates the camera 31 in response to the input operation of the user U1 on the touch panel 33, and captures an image including the face of the user U1 (steps S12 to S16).
  • The user U1 starts inputting the PIN assigned to himself or herself via the touch panel 33 (step S12). For example, the user U1 inputs the PIN by moving a finger in contact with the touch panel 33 and tracing, in order from the beginning of the PIN, the images g corresponding to the elements constituting the PIN.
  • the payment terminal 30 (input state detection unit 35b) detects the start of PIN input.
  • the payment terminal 30 transmits an input start detection notification to the authentication control device 20 (step S13).
  • When the communication unit 24 receives the input start detection notification, the authentication control device 20 (shooting control unit 22b) transmits a shooting instruction to the payment terminal 30 via the communication unit 24 (step S14).
  • The payment terminal 30 (shooting control unit 35c) activates the camera 31 (step S15) and captures an image including the face of the user U1 who is inputting the PIN via the touch panel 33 (step S16).
  • The number of captured images may be one or more.
  • The shooting control unit 35c deactivates the camera 31 when the shooting is completed.
  • In this way, the payment terminal 30 activates the camera 31 in response to the input operation of the user U1 on the touch panel 33 to capture an image including the face of the user U1.
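The capture lifecycle described above (activate the camera when PIN input starts, capture one or more images, then deactivate once shooting is done) can be sketched as a small state machine. This is an illustrative sketch only, not the patent's implementation; the class and method names are hypothetical.

```python
class CameraController:
    """Hypothetical sketch of the capture lifecycle in steps S12-S16:
    the camera is active only while a PIN input is in progress."""

    def __init__(self):
        self.active = False
        self.captured_images = []

    def on_input_start(self):
        # Steps S13-S15: input start detected -> activate the camera.
        self.active = True

    def capture(self, frame):
        # Step S16: capture one or more images while the camera is active.
        if not self.active:
            raise RuntimeError("camera is not active")
        self.captured_images.append(frame)

    def on_capture_done(self):
        # The camera is deactivated as soon as shooting is completed,
        # which saves power compared with keeping it always on.
        self.active = False
        return list(self.captured_images)


cam = CameraController()
cam.on_input_start()
cam.capture("frame-with-face")
images = cam.on_capture_done()
print(len(images), cam.active)  # 1 False
```

The point of the design choice is that the camera never runs outside an input session, which both reduces power consumption and narrows the window in which bystanders can be photographed.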
  • the payment terminal 30 (communication unit 37) transmits the captured image I captured in step S16 to the authentication control device 20 (step S17).
  • The shooting control unit 35c may end the shooting by the camera 31 (or deactivate the camera 31) at the timing when the captured image is transmitted.
  • the authentication control device 20 receives the captured image I transmitted in step S17, and the image acquisition unit 22c acquires the captured image I received by the communication unit 24.
  • The authentication control device 20 (face area detection unit 22k) executes the face detection process (see FIG. 15) to detect the face area of the user U1, that is, the face area connected to the finger performing the input operation on the touch panel 33 of the payment terminal 30 (step S1700).
  • As a result, the face area A1 (see FIG. 14) is detected as the face area connected to the finger performing the input operation on the touch panel 33 of the payment terminal 30.
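The idea of picking only the face connected to the operating finger can be illustrated with skeleton keypoints: among all persons detected in the image, choose the one whose wrist keypoint lies nearest the image edge closest to the touch panel, and return that person's face region. The following is a minimal sketch under assumed data structures (the per-person keypoint dictionaries are hypothetical, not the patent's format):

```python
def select_face_area(persons, image_height):
    """Return the face box of the person whose wrist is nearest the
    lower side of the captured image I (i.e., nearest the touch panel,
    assuming the camera sits above the display as in FIG. 8).

    `persons` is a list of dicts like:
      {"wrist": (x, y), "face_box": (x1, y1, x2, y2)}
    """
    best = None
    best_dist = float("inf")
    for person in persons:
        _, wy = person["wrist"]
        dist = image_height - wy  # distance L from the lower side of the image
        if dist < best_dist:
            best_dist = dist
            best = person
    return best["face_box"] if best else None


persons = [
    {"wrist": (210, 450), "face_box": (180, 60, 260, 160)},  # user at the terminal
    {"wrist": (400, 250), "face_box": (380, 40, 440, 110)},  # third party behind
]
print(select_face_area(persons, image_height=480))  # (180, 60, 260, 160)
```

Because a bystander's wrist is necessarily farther from the panel-side edge than the wrist of the user actually touching the panel, only the operating user's face area is forwarded to face authentication.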
  • The authentication control device 20 (authentication control unit 22d) transmits, to the authentication device 10 via the communication unit 24, a face authentication request for face authentication of the user U1 included in the face area A1 (face image) detected in step S1700 (step S18).
  • This face recognition request includes the detected face area A1.
  • When the communication unit 14 receives the face authentication request transmitted in step S18, the authentication device 10 executes the face authentication process (see FIG. 6) (step S19).
  • The authentication device 10 (authentication unit 12e) transmits the authentication result, via the communication unit 14, to the authentication control device 20 that transmitted the face authentication request (step S20).
  • Here, it is assumed that, as the authentication result, a notification of successful authentication and the user ID of the authenticated user U1 are transmitted to the authentication control device 20.
  • The authentication control device 20 receives the face authentication result and the user ID transmitted in step S20, and the face authentication result acquisition unit 22e acquires the face authentication result received by the communication unit 24.
  • the payment terminal 30 transmits the input PIN to the authentication control device 20 (step S22).
  • The authentication control device 20 receives the input PIN transmitted in step S22, and the authentication information acquisition unit 22f acquires the input PIN received by the communication unit 24. The authentication control device 20 (collation unit 22g) then executes PIN verification (step S23).
  • This PIN verification is a process of collating the input PIN acquired by the authentication information acquisition unit 22f with the registered PIN stored in the storage unit 21 (the registered PIN associated with the acquired user ID).
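The PIN verification of step S23 amounts to a lookup-and-compare: fetch the registered PIN stored for the user ID returned by face authentication, then collate it with the input PIN. A minimal sketch (the storage layout and names are hypothetical):

```python
# Hypothetical stand-in for the registered PINs in the storage unit 21,
# keyed by user ID.
registered_pins = {"user-001": "4927", "user-002": "8016"}


def verify_pin(user_id, input_pin):
    """Step S23: collate the input PIN with the registered PIN
    associated with the face-authenticated user ID."""
    return registered_pins.get(user_id) == input_pin


print(verify_pin("user-001", "4927"))  # True
print(verify_pin("user-001", "0000"))  # False
```

Only when both this collation and the face authentication succeed is the payment processing instruction issued.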
  • When the face authentication is successful (that is, when the face authentication result acquired by the face authentication result acquisition unit 22e indicates successful authentication) and the PIN verification in step S23 is also successful, the authentication control device 20 transmits a payment processing instruction to the payment terminal 30 via the communication unit 24 (step S24).
  • Upon receiving the payment processing instruction, the payment terminal 30 displays a payment execution inquiry screen (see FIG. 11), asking whether or not to execute the payment, on the display surface of the display unit 32 (step S25).
  • FIG. 11 is an example of the settlement execution inquiry screen G2.
  • When the user U1 inputs a payment execution instruction, the payment terminal 30 executes the payment process (step S27).
  • The shooting control unit 35c may end the shooting by the camera 31 (or deactivate the camera 31) at the timing when the user U1 taps the payment button B via the touch panel 33.
  • When the face authentication or the PIN verification fails, the processes of steps S24 to S27 are not executed. In this case, the display unit 32 displays, for example, an indication that the authentication has failed.
  • Note that the feature points of the skeleton detected in the face authentication process (for example, the wrist P1R, the elbows P2R and P2L, the shoulders P3R and P3L, the neck P4, and the face P5 in FIG. 14) may be superimposed on the authentication target person (the user performing an input operation with a finger on the touch panel 33 of the payment terminal 30) in the image taken in step S16 and displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33) (see FIG. 14). At that time, if a third party other than the authentication target person, such as one lined up behind the authentication target person, is reflected in the image taken in step S16 (not shown), the feature points of the body parts of the third party may be left undisplayed.
  • Alternatively, when a third party other than the authentication target person lined up behind the authentication target person is reflected in the image taken in step S16 (not shown), the feature points of the body parts detected in the authentication process (step S19) may be superimposed on both the authentication target person and the third party in the image and displayed on the touch panel screen. In that case, the feature points superimposed on the authentication target person may be highlighted (while the feature points superimposed on the third party are not highlighted).
  • For example, the color of the feature points of the body parts superimposed on the authentication target person in the image taken in step S16 may be changed before being displayed on the touch panel screen.
  • In addition, an indication that the face of the frontmost user (the user operating the touch panel 33) has been detected based on the skeleton information may be displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33).
  • According to the first operation example, the face authentication (step S19) is performed before the PIN verification (step S23), so that the face authentication can be completed quickly.
  • Next, the operation (second operation example) of the authentication system 1 having the above configuration will be described.
  • FIG. 12 is a sequence diagram of the authentication system 1 (second operation example).
  • In the first operation example, the face authentication process is executed using the face feature information of all user IDs registered in the face information DB 11b, whereas in the second operation example, the face authentication process is executed using the face feature information of only those user IDs, among the user IDs registered in the face information DB 11b, that are associated with the input PIN. Since the processing of steps S10 to S22 is the same as in the first operation example, the same reference numerals are given and the description thereof is omitted. Hereinafter, steps S30 to S37 following step S22 are mainly described.
  • The authentication control device 20 receives the captured image transmitted in step S17, the image acquisition unit 22c acquires the captured image received by the communication unit 24, and the face area detection unit 22k executes the face detection process (see FIG. 15) (step S1700). Further, the authentication control device 20 receives the input PIN transmitted in step S22, and the authentication information acquisition unit 22f acquires the input PIN received by the communication unit 24.
  • Next, the authentication control device 20 executes PIN verification (step S30).
  • This PIN verification is a process of collating the input PIN acquired by the authentication information acquisition unit 22f with the registered PINs (all PINs) stored in the storage unit 21, and extracting from the storage unit 21 the user IDs associated with the input PIN.
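In the second operation example the roles are reversed: the input PIN is first used to extract candidate user IDs, and face matching then runs only against that subset. A minimal sketch under a hypothetical data layout:

```python
# Hypothetical registered PINs in the storage unit 21, keyed by user ID.
registered_pins = {
    "user-001": "4927",
    "user-002": "4927",  # PINs need not be unique across users
    "user-003": "8016",
}


def extract_candidates(input_pin):
    """Step S30: return all user IDs whose registered PIN matches the input PIN."""
    return [uid for uid, pin in registered_pins.items() if pin == input_pin]


candidates = extract_candidates("4927")
print(candidates)  # ['user-001', 'user-002']
# The face authentication (step S32) then uses only the face feature
# information of these candidates, e.g. 100 entries instead of 1000.
```

Narrowing the gallery this way turns a 1:N face search over the whole DB into a search over a much smaller candidate set, which is what improves the authentication accuracy noted below.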
  • When a user ID is extracted as a result of the PIN verification in step S30, the authentication control device 20 transmits, to the authentication device 10 via the communication unit 24, a face authentication request for face authentication of the user U1 included in the face area A1 (face image) detected by the face detection process of the face area detection unit 22k (step S31).
  • This face authentication request includes the detected face area A1 and the user ID extracted in step S30.
  • Upon receiving the face authentication request, the authentication device 10 executes the face authentication process (see FIG. 6) (step S32). At that time, the authentication device 10 executes the face authentication process using the face feature information of the user IDs extracted in step S30 (for example, 100 people) instead of the face feature information of all user IDs (for example, 1000 people).
  • Next, the authentication device 10 (authentication unit 12e) transmits the authentication result to the authentication control device 20 that transmitted the face authentication request (step S33).
  • Here, it is assumed that, as the authentication result, a notification of successful authentication is transmitted to the authentication control device 20.
  • The authentication control device 20 receives the face authentication result transmitted in step S33, and the face authentication result acquisition unit 22e acquires the face authentication result received by the communication unit 24.
  • the authentication control device 20 (communication unit 24) transmits a payment processing instruction to the payment terminal 30 when the result of the acquired face recognition is that the authentication is successful (step S34).
  • When the payment terminal 30 (display control unit 35a) receives the payment processing instruction transmitted in step S34, it displays a payment execution inquiry screen (see FIG. 11), asking whether or not to execute the payment, on the display unit 32 (step S35).
  • When the user U1 inputs a payment execution instruction, for example, by tapping the payment button B on the payment execution inquiry screen (see FIG. 11) via the touch panel 33, the payment terminal 30 executes the payment process (step S37).
  • When no user ID is extracted as a result of the PIN verification in step S30, or when the face authentication fails (that is, when the face authentication result acquired by the face authentication result acquisition unit 22e indicates that the authentication has failed), the processes of steps S34 to S37 are not executed. In this case, the display unit 32 displays, for example, an indication that the authentication has failed.
  • Note that the feature points of the skeleton detected in the face authentication process (for example, the wrist P1R, the elbows P2R and P2L, the shoulders P3R and P3L, the neck P4, and the face P5 in FIG. 14) may be superimposed on the authentication target person (the user performing an input operation with a finger on the touch panel 33 of the payment terminal 30) in the image taken in step S16 and displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33) (see FIG. 14). At that time, if a third party other than the authentication target person, such as one lined up behind the authentication target person, is reflected in the image taken in step S16 (not shown), the feature points of the body parts of the third party may be left undisplayed.
  • Alternatively, when a third party other than the authentication target person lined up behind the authentication target person is reflected in the image taken in step S16 (not shown), the feature points of the body parts detected in the authentication process (step S32) may be superimposed on both the authentication target person and the third party in the image and displayed on the touch panel screen. In that case, the feature points superimposed on the authentication target person may be highlighted (while the feature points superimposed on the third party are not highlighted).
  • For example, the color of the feature points of the body parts superimposed on the authentication target person in the image taken in step S16 may be changed before being displayed on the touch panel screen.
  • In addition, an indication that the face of the frontmost user (the user operating the touch panel 33) has been detected based on the skeleton information may be displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33).
  • According to the second operation example, the face authentication (step S32) can be performed after narrowing down the candidates by the PIN verification (step S30) (for example, from 1000 people to 100 people in the above example), so that the authentication accuracy can be improved.
  • As described above, according to the second embodiment, the user U1 can be correctly authenticated even when a third party other than the user U1 (the authentication target person), for example, the user U2 lined up behind or a third party crossing the shooting range of the camera 31, is reflected in the image captured by the camera 31 (see FIG. 13).
  • This is because erroneous detection of the face area of a third party who is not connected to the finger performing the input operation (the input operation for payment) on the touch panel 33 of the payment terminal 30 is suppressed, and the face area of the user U1, who is inputting the PIN in order to perform the payment process, can be correctly detected.
  • As a result, the face authentication by the authentication device 10 can be executed correctly.
  • Further, the camera 31 is activated at an appropriate timing according to the user's input operation on the touch panel 33 (for example, the timing at which the user U1 starts the PIN input via the touch panel 33), and the face of the user U1 can be photographed.
  • Therefore, the power consumption can be reduced compared with the case where the camera 31 is always activated.
  • Further, by activating the camera 31 at an appropriate timing according to the user's input operation on the touch panel 33 (that is, at the timing when authentication is required), as in the second embodiment, it is possible to suppress the reflection of a third party other than the authentication target person (for example, a third party lined up behind the authentication target person or a third party crossing the shooting range of the camera 31).
  • Further, since the face of the user U1 is photographed at a timing corresponding to the user's input operation on the touch panel 33 (for example, the timing at which the user U1 starts the PIN input via the touch panel 33), the face of the user U1, who is inputting the PIN via the touch panel 33 and is therefore facing (looking at) the display surface of the display unit 32 on which the PIN input reception screen G1 is displayed (and the camera 31 arranged in the vicinity thereof), can be photographed from the front or almost from the front. That is, the face of the user U1 can be photographed at an angle suitable for face authentication. This can be expected to improve the authentication accuracy of the face authentication.
  • In the second embodiment, an example has been described in which the wrist feature point (or finger feature point) having the shortest distance L (see FIG. 14) from the lower side of the captured image I is detected as the finger closest to the touch panel 33 of the payment terminal 30, assuming that the camera 31 is attached to the upper part of the frame of the display unit 32 (see FIG. 8); however, the present invention is not limited to this.
  • For example, the wrist feature point (or finger feature point) having the shortest distance from the upper side of the captured image I may be detected as the finger closest to the touch panel 33 of the payment terminal 30. Similarly, depending on the mounting position of the camera 31, the wrist feature point (or finger feature point) having the shortest distance from the left side or from the right side of the captured image I may be detected as the finger closest to the touch panel 33 of the payment terminal 30.
  • Alternatively, the wrist feature point (or finger feature point) of the finger touching the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33) may be detected as the finger closest to the touch panel 33 of the payment terminal 30.
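The variants above differ only in which image edge the wrist distance is measured against, which in turn depends on where the camera is mounted. A sketch parameterizing that choice (function and variable names are hypothetical):

```python
def distance_to_edge(point, edge, width, height):
    """Distance from a wrist/finger feature point to the image edge
    nearest the touch panel, selected by the camera mounting position."""
    x, y = point
    if edge == "bottom":  # camera above the display, as in FIG. 8
        return height - y
    if edge == "top":     # camera below the display
        return y
    if edge == "left":
        return x
    if edge == "right":
        return width - x
    raise ValueError(f"unknown edge: {edge}")


# Pick the person whose wrist is nearest the panel-side edge.
wrists = {"u1": (200, 460), "u2": (400, 240)}
nearest = min(wrists, key=lambda k: distance_to_edge(wrists[k], "bottom", 640, 480))
print(nearest)  # u1
```

Whatever edge is used, the selection rule stays the same: the person whose wrist (or finger) is geometrically closest to the touch panel is taken to be the operating user.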
  • In the second embodiment, an example has been described in which the timing corresponding to the user's input operation on the touch panel 33 (the timing at which the camera 31 is activated to photograph the face of the user U1) is the timing at which the user U1 starts the PIN input via the touch panel 33; however, the present invention is not limited to this.
  • For example, the timing corresponding to the user's input operation on the touch panel 33 may be any timing after the start of the PIN input and before the end of the PIN input.
  • In this case as well, the face of the user U1, who is inputting the PIN via the touch panel 33 and is therefore facing (looking at) the display surface of the display unit 32 on which the PIN input reception screen G1 is displayed (and the camera 31 arranged in the vicinity thereof), can be photographed from the front or substantially from the front. That is, the face of the user U1 can be photographed at an angle suitable for face authentication. This can be expected to improve the authentication accuracy of the face authentication.
  • Alternatively, the timing corresponding to the user's input operation on the touch panel 33 may be the timing at which the user U1 finishes the PIN input, or a later timing.
  • In the second embodiment, an example has been described in which the camera 31 is attached to the upper part of the frame of the display unit 32 (see FIG. 8); however, the present invention is not limited to this. That is, the camera 31 may be attached anywhere as long as it can photograph at an angle suitable for face authentication. For example, it may be attached to the left or right portion of the frame of the display unit 32, or to a structure (for example, a wall or a pillar) installed in the vicinity of the payment terminal 30 or elsewhere.
  • The predetermined process is not limited to the payment process; it may be a process of opening a gate or a door through which the user U1 passes, or another process.
  • In the second embodiment, an example has been described in which the authentication system 1 is composed of the authentication device 10, the authentication control device 20, and the payment terminal 30 capable of communicating with each other via the network NW (for example, the Internet); however, the present invention is not limited to this. For example, all or part of the configuration or functions of the authentication device 10 and the authentication control device 20 may be added to the payment terminal 30. Further, all or part of the configuration or functions of the payment terminal 30 and the authentication device 10 may be added to the authentication control device 20.
  • Non-transitory computer-readable media include various types of tangible storage media. Examples of non-transitory computer-readable media include magnetic recording media (for example, flexible disks, magnetic tapes, and hard disk drives), magneto-optical recording media (for example, magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memories (for example, mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, and RAM (Random Access Memory)).
  • The program may also be supplied to the computer by various types of transitory computer-readable media. Examples of transitory computer-readable media include electric signals, optical signals, and electromagnetic waves. The transitory computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire or an optical fiber, or via a wireless communication path.
  • (Appendix 1) An authentication control device comprising: an imaging control means that causes a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition means for acquiring the image captured by the camera; a face area detection means for detecting, from the image acquired by the image acquisition means, a face area to which the finger performing the input operation on the input means is connected; and an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face area detection means.
  • (Appendix 2) The authentication control device according to Appendix 1, wherein the input means is an input means for causing the user to input authentication information assigned to the user, the authentication control device further comprising: an authentication information acquisition means for acquiring the authentication information input via the input means; and a collation means for collating the authentication information acquired by the authentication information acquisition means with registered authentication information.
  • (Appendix 3) The authentication control device according to Appendix 2, wherein the predetermined process is a payment process or a process of opening a gate or a door through which the user passes.
  • (Appendix 5) The authentication control device according to Appendix 4, wherein the input means is a touch panel arranged so as to cover the display surface of a display unit.
  • (Appendix 6) The authentication control device according to Appendix 5, further comprising a display control means that displays, on the display surface of the display unit, an authentication information input screen including images symbolizing the respective elements constituting the authentication information that the user's finger should trace via the touch panel.
  • (Appendix 7) The authentication control device according to any one of Appendices 1 to 6, further comprising a feature point detection means for detecting at least one feature point of the user's skeleton from the image acquired by the image acquisition means, wherein the face area detection means detects, from the image acquired by the image acquisition means, the face area connected, via the feature points detected by the feature point detection means, to the finger performing the input operation on the input means.
  • (Appendix 8) An authentication system comprising: an input means; a camera that captures an image including a user who is performing an input operation with a finger on the input means; an image acquisition means for acquiring the image captured by the camera; a face area detection means for detecting, from the image acquired by the image acquisition means, a face area to which the finger performing the input operation on the input means is connected; an authentication device for executing face authentication; and an authentication control means for causing the authentication device to execute face authentication of the user included in the face area detected by the face area detection means.
  • (Appendix 9) The authentication system according to Appendix 8, comprising an information processing device, an authentication control device, and the authentication device capable of communicating with each other via a network, wherein the input means and the camera are provided in the information processing device.
  • (Appendix 10) An authentication control method comprising: an imaging control step of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition step of acquiring the image captured by the camera; a face detection step of detecting, from the image acquired by the image acquisition step, a face area to which the finger performing the input operation on the input means is connected; and an authentication control step of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face detection step.
  • (Appendix 11) A computer-readable recording medium recording a program for causing an electronic device including at least one processor to execute: an imaging control process of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition process of acquiring the image captured by the camera; a face area detection process of detecting, from the image acquired by the image acquisition process, a face area to which the finger performing the input operation on the input means is connected; and an authentication control process of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face area detection process.
  • Authentication system 10 Authentication device 11 Storage unit 11a Program 11b Face information DB 12 Control unit 12a Image acquisition unit 12b Face detection unit 12c Feature point extraction unit 12d Registration unit 12e Authentication unit 13 Memory 14 Communication unit 20 Authentication control device 21 Storage unit 21a Program 21b Personal authentication information 21b2 Authentication information 22 Control unit 22a Display control unit 22b Imaging control unit (imaging control means) 22c Image acquisition unit (image acquisition means) 22d Authentication control unit (authentication control means) 22e Authentication information acquisition department (face authentication result acquisition department) 22g Collation unit 22h Completion processing unit 22j Feature point detection unit 22k Face area detection unit 23 Memory 24 Communication unit 30 Payment terminal (information processing terminal) 31 Camera 32 Display 33 Touch panel (input means) 34 Storage unit 34a Program 35 Control unit 35a Display control unit 35b Input status detection unit 35c Shooting control unit 35d Payment processing unit 36 Memory 37 Communication unit B Payment button G1 PIN input reception screen G2 Inquiry screen NW network g Image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

An authentication control device (20) according to the present disclosure is provided with: an image capture control means (22b) which causes a camera (31) to capture an image including a user who is performing an input operation on an input means (33) with fingers; an image acquisition means (22c) which acquires the image captured by the camera; a face detection means (22k) which detects, from the image acquired by the image acquisition means, a face region to which the fingers performing the input operation on the input means are connected; and an authentication control means (22d) which causes an authentication device (10) that performs face authentication to perform face authentication of the user included in the face region detected by the face detection means.

Description

Authentication control device, authentication system, authentication control method, and recording medium
 This disclosure relates to an authentication control device, an authentication system, an authentication control method, and a recording medium.
 Patent Document 1, for example, describes a system that performs, in parallel, face authentication based on face information extracted from an image of the authentication target person captured by a camera and authentication (verification) based on identification information input by the authentication target person; the system executes normal payment processing when the face authentication succeeds, and also executes normal payment processing when the face authentication fails but the authentication (verification) based on the identification information succeeds.
Japanese Unexamined Patent Application Publication No. 2019-36888
 However, in the system described in Patent Document 1, when a third party other than the authentication target person (for example, a third party lined up behind or a third party crossing the shooting range of the camera) is reflected in the image captured by the camera, the authentication target person may not be authenticated correctly.
 In view of the above problem, an object of the present disclosure is to provide an authentication control device, an authentication system, an authentication control method, and a recording medium capable of correctly authenticating the authentication target person even when a third party other than the authentication target person (for example, a third party lined up behind or a third party crossing the shooting range of the camera) is reflected in the image captured by the camera.
 An authentication control device according to a first aspect of the present disclosure includes: an imaging control means that causes a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition means for acquiring the image captured by the camera; a face detection means for detecting, from the image acquired by the image acquisition means, a face area to which the finger performing the input operation on the input means is connected; and an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face detection means.
 An authentication system according to a second aspect of the present disclosure includes: an input means; a camera that captures an image including a user who is performing an input operation with a finger on the input means; an image acquisition means for acquiring the image captured by the camera; a face detection means for detecting, from the image acquired by the image acquisition means, a face area to which the finger performing the input operation on the input means is connected; and an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face detection means.
 An authentication control method according to a third aspect of the present disclosure includes: an imaging control step of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means; an image acquisition step of acquiring the image captured by the camera; a face detection step of detecting, from the image acquired by the image acquisition step, a face area to which the finger performing the input operation on the input means is connected; and an authentication control step of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face detection step.
 A recording medium according to a fourth aspect of the present disclosure is a computer-readable recording medium storing a program that causes an electronic device including at least one processor to execute: imaging control processing of causing a camera to capture an image including a user who is performing an input operation on an input means with his or her fingers; image acquisition processing of acquiring the image captured by the camera; face detection processing of detecting, from the image acquired in the image acquisition processing, the face region connected to the fingers performing the input operation on the input means; and authentication control processing of causing an authentication device that executes face authentication to execute face authentication of the user included in the face region detected in the face detection processing.
 The present invention provides an authentication control device, an authentication system, an authentication control method, and a recording medium capable of correctly authenticating the person to be authenticated even when a third party other than that person (for example, a third party standing in line behind the user or a third party crossing the shooting range of the camera) appears in the image captured by the camera.
FIG. 1 is a schematic configuration diagram of the authentication control device 20.
FIG. 2 is a flowchart of an example of the operation of the authentication control device 20.
FIG. 3 is a block diagram showing the configuration of the authentication system according to the second embodiment.
FIG. 4 is a schematic configuration diagram of the authentication device 10.
FIG. 5 is a flowchart of an example of the operation (face information registration processing) of the authentication device 10.
FIG. 6 is a flowchart of an example of the operation (face authentication processing) of the authentication device 10.
FIG. 7 is a schematic configuration diagram of the authentication control device 20.
FIG. 8 is an external view of the payment terminal 30.
FIG. 9 is a schematic configuration diagram of the payment terminal 30.
FIG. 10 is a sequence diagram of the authentication system 1 (first operation example).
FIG. 11 is an example of the payment execution inquiry screen G2.
FIG. 12 is a sequence diagram of the authentication system 1 (second operation example).
FIG. 13 is an example of a captured image acquired by the image acquisition unit 22c.
FIG. 14 is an example of the feature points and face regions detected from the captured image of FIG. 13.
FIG. 15 is a flowchart of an example of the operation of the authentication control device 20 (mainly the face detection processing by the face region detection unit 22k).
 (Embodiment 1)
 First, a configuration example of the authentication control device 20 constituting the authentication system of the first embodiment will be described with reference to FIG. 1.
 FIG. 1 is a schematic configuration diagram of the authentication control device 20.
 As shown in FIG. 1, the authentication control device 20 includes: an imaging control means 22b for causing a camera 31 to capture an image including a user (the person to be authenticated) who is performing an input operation on an input means 33 with his or her fingers; an image acquisition means 22c for acquiring the image captured by the camera 31; a face detection means 22k for detecting, from the image acquired by the image acquisition means 22c, the face region connected to the fingers performing the input operation on the input means 33; and an authentication control means 22d for causing an authentication device 10 that executes face authentication to execute face authentication of the user included in the face region detected by the face detection means 22k.
 Next, an example of the operation of the authentication control device 20 configured as described above will be described.
 FIG. 2 is a flowchart of an example of the operation of the authentication control device 20.
 First, the imaging control means 22b causes the camera 31 to capture an image including the user who is performing an input operation on the input means 33 with his or her fingers (step S10).
 Next, the image acquisition means 22c acquires the image captured by the camera 31 (step S11).
 Next, the face detection means 22k detects, from the image acquired in step S11, the face region connected to the fingers performing the input operation on the input means 33 (step S12).
 Next, the authentication control means 22d causes the authentication device 10, which executes face authentication, to execute face authentication of the user included in the face region detected in step S12 (step S13).
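Steps S10 to S13 above can be expressed as a minimal runnable sketch. The classes and method names below (`Camera`, `FaceDetector`, `AuthDevice`, `detect_operator_face`, `verify_face`) are illustrative stand-ins assumed for this sketch only; the disclosure does not define a concrete API.

```python
# Minimal sketch of steps S10-S13 of FIG. 2. The classes below are
# assumed stand-ins so the control flow is runnable; in the disclosure
# these components communicate over a network NW.

class Camera:
    """S10: captures an image including the user operating the input means."""
    def capture(self):
        return {"pixels": "..."}  # placeholder image object

class FaceDetector:
    """S12: finds the face region connected to the operating fingers."""
    def detect_operator_face(self, image):
        return {"x": 10, "y": 5, "w": 40, "h": 40}  # placeholder region

class AuthDevice:
    """S13: executes face authentication (authentication device 10)."""
    def verify_face(self, image, face_region):
        return face_region is not None

def control_authentication(camera, detector, auth_device):
    image = camera.capture()                       # S10/S11: capture, acquire
    region = detector.detect_operator_face(image)  # S12: operator's face
    if region is None:
        return False                               # no operator in the image
    return auth_device.verify_face(image, region)  # S13: delegate to device 10
```

The point of the flow is that only the face region tied to the operating hand is handed to the authentication device, so bystanders in the frame never reach step S13.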
 As described above, according to the first embodiment, the person to be authenticated can be correctly authenticated even when a third party other than that person (for example, a third party standing in line behind the user or a third party crossing the shooting range of the camera 31) appears in the image captured by the camera 31.
 (Embodiment 2)
 Hereinafter, the authentication system 1 will be described in detail as the second embodiment of the present invention. In the following, an imaging control unit is used as the imaging control means 22b and is referred to as the imaging control unit 22b. Likewise, an image acquisition unit is used as the image acquisition means 22c (hereinafter, the image acquisition unit 22c), an authentication control unit is used as the authentication control means 22d (hereinafter, the authentication control unit 22d), a payment terminal is used as the information processing terminal 30 (hereinafter, the payment terminal 30), and a touch panel is used as the input means 33 (hereinafter, the touch panel 33).
 FIG. 3 is a block diagram showing the configuration of the authentication system 1 according to the second embodiment.
 The authentication system 1 includes an authentication device 10, an authentication control device 20, and a payment terminal 30 that can communicate with one another via a network NW (for example, the Internet).
 First, a configuration example of the authentication device 10 will be described.
 FIG. 4 is a schematic configuration diagram of the authentication device 10.
 As shown in FIG. 4, the authentication device 10 includes a storage unit 11, a control unit 12, a memory 13, and a communication unit 14.
 The storage unit 11 is, for example, a non-volatile storage unit such as a hard disk device or a ROM. The storage unit 11 stores a program 11a and a face information DB 11b.
 The program 11a is a program executed by the control unit 12 (processor). In the face information DB 11b, user IDs and the facial feature information of the corresponding users are stored (registered) in association with each other. In response to a face authentication request received from outside (for example, from the authentication control device 20), the authentication device 10 collates the face image or facial feature information included in the request against the facial feature information of each user and returns the collation result to the requester.
 Although not shown, the control unit 12 includes a processor, for example a CPU (Central Processing Unit); there may be one processor or more than one. By executing the program 11a read from the storage unit 11 into the memory 13 (for example, a RAM), the processor functions as an image acquisition unit 12a, a face detection unit 12b, a feature point extraction unit 12c, a registration unit 12d, and an authentication unit 12e. Some or all of these may be implemented in hardware.
 The image acquisition unit 12a acquires an image including the user's face. For example, the image acquisition unit 12a acquires an image received by the communication unit 14. The images received by the communication unit 14 include images for registration transmitted from a user terminal (not shown) and images for authentication (collation) transmitted from the authentication control device 20.
 The face detection unit 12b detects a face region from the image acquired by the image acquisition unit 12a and outputs it to the feature point extraction unit 12c.
 The feature point extraction unit 12c extracts feature points (for example, facial feature points such as the eyes, nose, and mouth corners) from the face region detected by the face detection unit 12b.
 When the image acquired by the image acquisition unit 12a is an image for registration, the feature point extraction unit 12c outputs the facial feature information to the registration unit 12d. Here, the facial feature information is the set of extracted feature points. On the other hand, when the image acquired by the image acquisition unit 12a is an image for authentication, the feature point extraction unit 12c outputs the facial feature information to the authentication unit 12e.
 When registering facial feature information, the registration unit 12d issues a new user ID. The registration unit 12d registers the issued user ID and the facial feature information extracted from the image for registration in the face information DB 11b in association with each other.
 The authentication unit 12e collates the facial feature information extracted from the face region detected in the image for authentication against the facial feature information in the face information DB 11b, and returns whether the facial feature information matches to the authentication control device 20. Whether the facial feature information matches corresponds to the success or failure of the authentication.
 The communication unit 14 is a communication device that communicates with the authentication control device 20 via the network NW.
 Next, an example of the operation of the authentication device 10 (face information registration processing) will be described.
 FIG. 5 is a flowchart of an example of the operation of the authentication device 10 (face information registration processing).
 First, the authentication device 10 (image acquisition unit 12a) acquires the image including the user's face (image for registration) contained in a face information registration request (step S10). For example, the authentication device 10 (communication unit 14) receives the face information registration request from a user terminal (not shown) via the network NW.
 Next, the authentication device 10 (face detection unit 12b) detects a face region from the image for registration acquired in step S10 (step S11). Next, the authentication device 10 (feature point extraction unit 12c) extracts facial feature points from the face region detected in step S11 (step S12) and outputs the facial feature information to the registration unit 12d. Finally, the authentication device 10 (registration unit 12d) issues a user ID and registers the user ID and the facial feature information in the face information DB 11b in association with each other (step S13). Note that the authentication device 10 may receive facial feature information from a face authentication terminal or the like and register it in the face information DB 11b in association with a user ID.
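The registration of step S13 can be sketched as follows. The sequential "U###" ID format and the dict-based DB layout are assumptions introduced for illustration; the disclosure does not specify how user IDs are issued or how the face information DB 11b is stored.

```python
import itertools

# Sketch of step S13 of FIG. 5: issue a new user ID and register it in
# the face information DB together with the extracted facial feature
# information. The "U###" format is an assumed ID scheme.

_id_counter = itertools.count(1)

def register_face(face_info_db, facial_features):
    user_id = f"U{next(_id_counter):03d}"    # issue a new user ID
    face_info_db[user_id] = facial_features  # register the associated pair
    return user_id
```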
 Next, an example of the operation of the authentication device 10 (face authentication processing) will be described.
 FIG. 6 is a flowchart of an example of the operation of the authentication device 10 (face authentication processing).
 First, the authentication device 10 (image acquisition unit 12a) acquires the image including the user's face (image for authentication) contained in a face authentication request (step S20). For example, the authentication device 10 (communication unit 14) receives the face authentication request from the authentication control device 20 via the network NW. Next, the authentication device 10 (face detection unit 12b) detects a face region from the image for authentication acquired in step S20 (step S21). Next, the feature point extraction unit 12c extracts facial feature points from the face region detected in step S21 (step S22). Alternatively, the authentication device 10 may receive facial feature information from the authentication control device 20. Next, the authentication device 10 (authentication unit 12e) collates the acquired facial feature information against the face information DB 11b (step S23). When the facial feature information matches (step S24: Yes), the authentication unit 12e identifies the user ID of the user whose facial feature information matched (step S25) and returns the success of the face authentication together with the identified user ID to the authentication control device 20 (step S26). When no matching facial feature information exists (step S24: No), the authentication unit 12e returns the failure of the face authentication to the authentication control device 20 (step S27).
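The collation of steps S23 to S27 can be sketched as follows. The in-memory DB layout and the matching criterion (exact equality of feature sets) are assumptions for illustration; a real implementation would use a similarity measure over the feature points, which the disclosure leaves open.

```python
# Sketch of steps S23-S27 of FIG. 6. The face information DB is modeled
# as a dict {user_id: feature_set}; exact set equality stands in for a
# real face-feature similarity test.

def authenticate(face_info_db, query_features):
    for user_id, registered_features in face_info_db.items():
        if registered_features == query_features:          # S23/S24: collate
            return {"success": True, "user_id": user_id}   # S25/S26: matched
    return {"success": False, "user_id": None}             # S27: no match
```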
 Next, a configuration example of the authentication control device 20 will be described.
 FIG. 7 is a schematic configuration diagram of the authentication control device 20.
 The authentication control device 20 is an information processing device that performs authentication control processing, for example a server device realized by a computer.
 As shown in FIG. 7, the authentication control device 20 includes a storage unit 21, a control unit 22, a memory 23, and a communication unit 24.
 The storage unit 21 is a non-volatile storage unit such as a hard disk device or a ROM. The storage unit 21 stores a program 21a and personal authentication information 21b.
 The program 21a is a program executed by the control unit 22 (processor). The personal authentication information 21b associates user IDs with the authentication information of the corresponding users. The authentication information is, for example, a PIN (Personal Identification Number); hereinafter, the registered authentication information is referred to as a registered PIN. A PIN may also be called PIN information or a PIN code. A PIN is a simple code of about ten digits or fewer composed of a combination of elements (for example, numbers, letters, and symbols). One user ID is assigned to each user, whereas one PIN may be assigned to a plurality of users. A PIN may also be assigned to a single user.
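The personal authentication information 21b can be modeled as follows; the table layout is an assumption for illustration. Because one PIN may be shared by several users, the PIN alone narrows the search to a candidate set rather than identifying a single user, which is why the system combines it with face authentication.

```python
# Assumed model of the personal authentication information 21b: each
# record associates one user ID with one registered PIN, and the same
# PIN may appear for several users.

personal_auth_info = {
    "U001": "4821",
    "U002": "4821",   # the same PIN shared by another user
    "U003": "7759",
}

def users_with_pin(auth_info, input_pin):
    """Return all user IDs whose registered PIN matches the input PIN."""
    return sorted(uid for uid, pin in auth_info.items() if pin == input_pin)
```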
 Although not shown, the control unit 22 includes a processor, for example a CPU (Central Processing Unit); there may be one processor or more than one. By executing the program 21a read from the storage unit 21 into the memory 23 (for example, a RAM), the processor functions as a display control unit 22a, an imaging control unit 22b, an image acquisition unit 22c, an authentication control unit 22d, a face authentication result acquisition unit 22e, an authentication information acquisition unit 22f, a collation unit 22g, a processing control unit 22h, a feature point detection unit 22j, and a face region detection unit 22k. Some or all of these may be implemented in hardware.
 The display control unit 22a causes, for example, the display surface of the display unit 32 of the payment terminal 30 to display a PIN input reception screen G1 (an authentication information input screen; see FIG. 8). Specifically, the display control unit 22a transmits a screen display instruction for displaying the PIN input reception screen G1 to the payment terminal 30 via the communication unit 24. When another screen, for example the payment execution inquiry screen G2 (see FIG. 11), is to be displayed, a similar screen display instruction is transmitted to the payment terminal 30 via the communication unit 24. The PIN input reception screen G1 and the payment execution inquiry screen G2 will be described later.
 The imaging control unit 22b activates the camera 31 of the payment terminal 30 in response to an input operation by the user U1 on the touch panel 33 of the payment terminal 30 and causes it to capture an image including the face of the user U1. Specifically, when the communication unit 24 receives an input start detection notification (a notification that the user U1 has started inputting a PIN) transmitted from the payment terminal 30, the imaging control unit 22b transmits an imaging instruction to the payment terminal 30 via the communication unit 24.
 The image acquisition unit 22c acquires an image (hereinafter also referred to as a captured image) captured by the camera 31 activated by the imaging control unit 22b. Specifically, the communication unit 24 receives the captured image transmitted from the payment terminal 30, and the image acquisition unit 22c acquires the captured image received by the communication unit 24.
 The authentication control unit 22d causes the authentication device 10, which executes face authentication, to execute face authentication of the user U1 included in the captured image acquired by the image acquisition unit 22c. Specifically, the authentication control unit 22d transmits the captured image acquired by the image acquisition unit 22c to the authentication device 10 via the communication unit 24. Instead of the captured image, the face region detected from the captured image (or the feature points extracted from the face region) may be transmitted to the authentication device 10.
 The face authentication result acquisition unit 22e acquires the result of the face authentication executed by the authentication device 10. Specifically, the communication unit 24 receives the face authentication result transmitted from the authentication device 10, and the face authentication result acquisition unit 22e acquires the face authentication result received by the communication unit 24.
 The authentication information acquisition unit 22f acquires the PIN input via the touch panel 33 of the payment terminal 30 (hereinafter referred to as the input PIN). Specifically, the communication unit 24 receives the input PIN transmitted from the payment terminal 30, and the authentication information acquisition unit 22f acquires the input PIN received by the communication unit 24.
 The collation unit 22g collates the input PIN acquired by the authentication information acquisition unit 22f against the registered PINs stored in the storage unit 21 (see FIG. 7).
 When the face authentication by the authentication device 10 succeeds and the collation by the collation unit 22g finds a match, the processing control unit 22h causes the payment terminal 30, which executes payment processing, to execute the payment processing. Specifically, the processing control unit 22h transmits a payment processing instruction to the payment terminal 30 via the communication unit 24. The payment processing is an example of the predetermined processing of the present invention.
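The two-factor decision made by the processing control unit 22h reduces to the following check; the boolean interface is an assumed simplification of the network exchanges described above.

```python
# Sketch of the decision made by the processing control unit 22h:
# the payment processing instruction is sent only when BOTH factors pass.

def should_execute_payment(face_auth_succeeded, pin_matches):
    """Face authentication (device 10) and PIN collation (unit 22g)
    must both succeed before payment processing is triggered."""
    return face_auth_succeeded and pin_matches
```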
 The feature point detection unit 22j detects, from the captured image acquired by the image acquisition unit 22c, the physical feature points (for example, skeletal feature points) of the persons included in the captured image. Skeletal feature points can be detected by existing techniques (for example, OpenPose (registered trademark)). The detected feature points include, for example, the fingers, wrists, elbows, shoulders, neck, face (nose, ears, eyes), hips, knees, and ankles.
 An example of the detected feature points will be described. FIG. 13 is an example of a captured image (hereinafter referred to as captured image I) acquired by the image acquisition unit 22c. FIG. 14 is an example of the feature points and face regions detected from the captured image I of FIG. 13. In FIG. 14, reference signs P1R, P2R, P2L, P3R, P3L, P4, and P5 denote the detected feature points, and reference signs A1 and A2 denote the detected face regions.
 When two persons, a user U1 who is performing an input operation on the touch panel 33 of the payment terminal 30 with his or her fingers and a user U2 standing in line behind the user U1, are included in the captured image I (see FIG. 13), the feature points of the user U1 (for example, wrist P1R, elbows P2R and P2L, shoulders P3R and P3L, neck P4, face P5) and the feature points of the user U2 (omitted in FIG. 14) are detected, as shown in FIG. 14. That is, feature points are detected for each person included in the captured image. The feature points may or may not be displayed on the captured image.
 The face region detection unit 22k detects, from the captured image acquired by the image acquisition unit 22c, the face region connected to the fingers performing the input operation on the touch panel 33 of the payment terminal 30. For example, from the captured image I shown in FIG. 13, the face region A1 shown in FIG. 14 is detected as the face region connected to the fingers performing the input operation on the touch panel 33 of the payment terminal 30. The processing for detecting the face region connected to the fingers is described in more detail in the following example of the operation of the authentication control device 20 (mainly the face detection processing by the face region detection unit 22k).
 Next, an example of the operation of the authentication control device 20 (mainly the face detection processing by the face region detection unit 22k) will be described.
 FIG. 15 is a flowchart of an example of the operation of the authentication control device 20 (mainly the face detection processing by the face region detection unit 22k). Here, an example is described in which the face region A1 shown in FIG. 14 is detected from the captured image I shown in FIG. 13 as the face region connected to the fingers performing the input operation on the touch panel 33 of the payment terminal 30.
 First, the image acquisition unit 22c acquires a captured image captured by the camera 31 activated by the imaging control unit 22b (step S40). Here, it is assumed that the captured image I shown in FIG. 13 has been acquired.
 Next, the feature point detection unit 22j detects, from the captured image acquired in step S40, the skeletal feature points of the persons included in the captured image (step S41). Here, it is assumed that the feature points of the user U1 shown in FIG. 14 (wrist P1R, elbows P2R and P2L, shoulders P3R and P3L, neck P4, face P5) and the feature points of the user U2 (omitted in FIG. 14) are detected from the captured image I shown in FIG. 13.
 Next, the face region detection unit 22k detects face regions from the captured image acquired in step S40 (step S42). The face regions can be detected by an existing technique. Here, it is assumed that the face region A1 of the user U1 and the face region A2 of the user U2 shown in FIG. 14 are detected.
 Next, the face region detection unit 22k detects the finger closest to the touch panel 33 of the payment terminal 30 (step S43). For example, the wrist feature point (or finger feature point) with the shortest distance L (see FIG. 14) to the lower edge of the captured image I is detected. Here, it is assumed that the wrist feature point P1R of the user U1 is detected as the wrist feature point with the shortest distance L to the lower edge of the captured image I.
 Next, from among the face regions detected in step S42, the face region detection unit 22k determines the face region closest to the facial feature point of the user having the finger (finger feature point) closest to the touch panel 33 detected in step S43 (step S44). Here, it is assumed that, of the face regions A1 and A2 detected in step S42, the face region A1 closest to the facial feature point P5 of the user U1, who has the finger (finger feature point P1R) closest to the touch panel 33, is determined. The determined face region A1 is the face region connected to the fingers performing the input operation on the touch panel 33 of the payment terminal 30 (that is, the face region of the user U1, the person to be authenticated who is inputting the PIN for the payment processing).
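Steps S41 to S44 can be sketched as follows. The keypoint and face-region data structures, and the use of distance to the lower edge of the image as a proxy for proximity to the touch panel, follow the example above; the concrete representation is an assumption for illustration.

```python
import math

# Sketch of steps S41-S44 of FIG. 15: select the face region belonging
# to the person whose wrist is closest to the touch panel, approximated
# here by the lower edge of the image (largest y coordinate).

def detect_operator_face(people, face_regions, image_height):
    """people: one dict {"wrist": (x, y), "face": (x, y)} per person,
    from skeletal keypoints (e.g. OpenPose) detected in step S41;
    face_regions: (x, y, w, h) boxes detected independently in S42."""
    # S43: person whose wrist has the shortest distance to the lower edge
    operator = min(people, key=lambda p: image_height - p["wrist"][1])
    # S44: face region whose center is closest to that person's face point
    fx, fy = operator["face"]
    def center_dist(box):
        x, y, w, h = box
        return math.hypot(x + w / 2 - fx, y + h / 2 - fy)
    return min(face_regions, key=center_dist)
```

With this linkage, a bystander's face region (A2) is never selected even if it is larger or sharper in the image, because it is not connected to the operating hand.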
 以上のようにして、画像取得部22cにより取得された撮影画像から、決済端末30のタッチパネル33に対して入力操作を行っている手指がつながっている顔領域(ここでは、ユーザU1の顔領域A1)を検出(決定)することができる。 As described above, from the captured image acquired by the image acquisition unit 22c, the face area to which the fingers performing the input operation on the touch panel 33 of the payment terminal 30 are connected (here, the face area A1 of the user U1). ) Can be detected (determined).
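The selection logic of steps S42 to S44 can be sketched as follows. This is an illustrative Python sketch under simplifying assumptions, not the disclosed implementation: the function name, the data layout, and the use of a single wrist keypoint per person are hypothetical, and a real system would obtain the face areas and skeleton feature points from a detector.

```python
# Pick the face area of the person operating the touch panel: the person
# whose wrist feature point has the shortest distance to the lower edge of
# the captured image (the touch panel lies below the camera's field of view).
def select_operator_face(image_height, people):
    """people: list of dicts with 'face_box' = (x, y, w, h) and
    'wrist' = (x, y), both in image pixel coordinates."""
    def dist_to_bottom(person):
        # Distance L from the wrist keypoint to the image's lower edge.
        return image_height - person['wrist'][1]
    operator = min(people, key=dist_to_bottom)
    return operator['face_box']

# Illustrative data: user U1 has a hand near the panel, user U2 stands behind.
people = [
    {'face_box': (40, 10, 30, 30), 'wrist': (55, 170)},   # user U1
    {'face_box': (120, 20, 28, 28), 'wrist': (130, 90)},  # user U2
]
operator_face = select_operator_face(180, people)
```

In the figure's terms, the person whose wrist feature point (P1R) has the shortest distance L to the lower edge of the image is taken to be the operator, and that person's face area (A1) is the one passed on to face authentication.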
Note that steps S40 to S44 need not be performed in this order. For example, steps S41 and S42 may be executed in reverse order.
Next, a configuration example of the payment terminal 30 will be described.
FIG. 8 is an external view of the payment terminal 30, and FIG. 9 is a schematic configuration diagram.
As shown in FIGS. 8 and 9, the payment terminal 30 is an information processing device including a camera 31, a display unit 32, a touch panel 33, a storage unit 34, a control unit 35, a memory 36, and a communication unit 37. The payment terminal 30 is installed, for example, in a store. The authentication device 10 and the authentication control device 20 may be installed in the same store or at a location remote from the store.
The payment terminal 30 executes face authentication and PIN verification (two-factor authentication) before payment processing. If the face authentication succeeds and the PIN verification results match, the payment terminal 30 performs the payment processing. Known techniques can be used for the processing before and after the two-factor authentication, such as entry (reading) of the purchased items and the payment processing itself, so a detailed description is omitted.
The camera 31 captures an image including the face of user U1. For example, as shown in FIG. 8, the camera 31 is attached to the upper part of the frame of the display unit 32 so that it can capture, from the front or roughly from the front, the face of user U1 entering a PIN via the touch panel 33, that is, the face of user U1 facing the display surface of the display unit 32 on which the PIN input reception screen G1 is displayed (and the camera 31 arranged near that surface).
The display unit 32 is, for example, a display with a touch panel, also called a touch screen display. The PIN input reception screen G1, for example, is displayed on the display surface of the display unit 32. As shown in FIG. 8, the PIN input reception screen G1 includes images g, each symbolizing one of the elements constituting the PIN, that the finger of user U1 is to trace via the touch panel 33. The images g are displayed on the display surface of the display unit 32 arranged in a predetermined pattern, for example a 3 x 3 grid (see FIG. 8).
The touch panel 33 is an input device operated by user U1 and is arranged so as to cover the display surface of the display unit 32. The touch panel 33 is used, for example, for entering a PIN with the fingers of user U1. User U1 enters the PIN by moving a finger across the touch panel 33 while keeping it in contact with the panel, tracing (passing through) the images g corresponding to the elements of the PIN in order from the first element. Here, the PIN is a simple code of about ten digits or fewer composed of a combination of elements (for example, numerals, letters, and symbols), and it can be entered with just a few finger movements tracing the corresponding images g, which reduces the burden of PIN entry on user U1.
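The pattern-style entry described above can be sketched as follows. This is an illustrative sketch only: the cell size, the 3 x 3 row-major labeling, and the sampling of the touch path are assumptions for the example, not details taken from the specification.

```python
# Convert a traced finger path on a 3x3 grid of images g into a PIN string:
# each touch sample is mapped to the grid cell it falls in, and consecutive
# samples in the same cell are collapsed into a single PIN element.
def trace_to_pin(touch_points, cell=100, labels='123456789'):
    seq = []
    for x, y in touch_points:
        idx = (y // cell) * 3 + (x // cell)  # row-major cell index 0..8
        ch = labels[idx]
        if not seq or seq[-1] != ch:         # ignore movement within one cell
            seq.append(ch)
    return ''.join(seq)

# Finger moves from cell "1" through cell "2" down into cell "5".
path = [(10, 10), (60, 20), (150, 30), (160, 40), (170, 160)]
```

A call such as `trace_to_pin(path)` would thus recover the element sequence the finger traced, in order from the first element of the PIN.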
The PIN may instead be entered by tapping a software keyboard (not shown) displayed on the display surface of the display unit 32 via the touch panel 33, or via a physical keyboard (not shown) arranged near the display unit 32.
The storage unit 34 is a non-volatile storage unit such as a hard disk device or a ROM, and stores a program 34a.
The program 34a is executed by the control unit 35 (processor).
Although not shown, the control unit 35 includes a processor, for example a CPU (Central Processing Unit). There may be one processor or several. By executing the program 34a read from the storage unit 34 into the memory 36 (for example, a RAM), the processor functions as a display control unit 35a, an input state detection unit 35b, a shooting control unit 35c, and a payment processing unit 35d. Some or all of these may be implemented in hardware.
When the communication unit 37 receives a screen display instruction transmitted from the authentication control device 20 (an instruction to display the PIN input reception screen G1), for example, the display control unit 35a displays the PIN input reception screen G1 (see FIG. 8) on the display surface of the display unit 32. The PIN input reception screen G1 may be displayed based on information stored in advance in the storage unit 34 of the payment terminal 30, or based on information received from the authentication control device 20 together with the screen display instruction.
The input state detection unit 35b detects the input state of the PIN, for example the start and the end of PIN entry.
When the communication unit 37 receives a shooting instruction transmitted from the authentication control device 20, for example, the shooting control unit 35c activates the camera 31 and captures an image including the face of user U1. The camera 31 is activated, for example, by supplying power to it or by switching it from a sleep state to a normal state. When shooting ends, the shooting control unit 35c deactivates the camera 31, for example by cutting its power supply or by putting it into the sleep state.
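The on-demand camera control of the shooting control unit 35c can be sketched as follows. The class names and the string returned by `capture` are stand-ins for a real device driver; this is a hedged illustration of the activate-capture-deactivate cycle, not the disclosed implementation.

```python
# The camera is kept asleep except between the shooting instruction and the
# end of capture, mirroring the power-saving behavior of unit 35c.
class Camera:
    def __init__(self):
        self.state = 'sleep'
    def wake(self):                      # supply power / leave sleep state
        self.state = 'active'
    def sleep(self):                     # cut power / enter sleep state
        self.state = 'sleep'
    def capture(self):
        if self.state != 'active':
            raise RuntimeError('camera is not active')
        return 'face-image'             # placeholder for an actual frame

class ShootingController:
    def __init__(self, camera):
        self.camera = camera
    def on_shoot_instruction(self, shots=1):
        self.camera.wake()
        frames = [self.camera.capture() for _ in range(shots)]
        self.camera.sleep()              # deactivate as soon as shooting ends
        return frames

cam = Camera()
frames = ShootingController(cam).on_shoot_instruction(shots=2)
```

The point of the sketch is that the camera is guaranteed to be back in the sleep state once the captured frames are handed over, which matches the power-consumption and privacy rationale discussed later in this embodiment.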
When the communication unit 37 receives a payment processing instruction transmitted from the authentication control device 20, for example, the payment processing unit 35d executes the payment processing. That is, it completes the payment based on the payment information associated with the face-authenticated user U1.
The communication unit 37 is a communication device that communicates with the authentication control device 20 via the network NW.
Next, an example of the operation of the authentication system 1 configured as above (first operation example) will be described.
FIG. 10 is a sequence diagram of the authentication system 1 (first operation example).
As shown in FIG. 10, the authentication control device 20 (display control unit 22a) first transmits a screen display instruction for displaying the PIN input reception screen G1 to the payment terminal 30 via the communication unit 24 (step S10).
Next, when the communication unit 37 receives the screen display instruction transmitted in step S10, the payment terminal 30 (display control unit 35a) displays the PIN input reception screen G1 (see FIG. 8) on the display surface of the display unit 32 (step S11).
Next, the payment terminal 30 activates the camera 31 in response to an input operation by user U1 on the touch panel 33 and captures an image including the face of user U1 (steps S12 to S16).
Specifically, user U1 first starts entering the PIN assigned to him or her via the touch panel 33 (step S12). For example, user U1 enters the PIN by moving a finger across the touch panel 33 while keeping it in contact with the panel, tracing the images g corresponding to the elements of the PIN in order from the first element. The payment terminal 30 (input state detection unit 35b) detects the start of the PIN entry.
Next, when the input state detection unit 35b detects the start of PIN entry, the payment terminal 30 (communication unit 37) transmits an input start detection notification to the authentication control device 20 (step S13).
Next, when the communication unit 24 receives the input start detection notification transmitted in step S13, the authentication control device 20 (shooting control unit 22b) transmits a shooting instruction to the payment terminal 30 via the communication unit 24 (step S14).
Next, when the communication unit 37 receives the shooting instruction transmitted in step S14, the payment terminal 30 (shooting control unit 35c) activates the camera 31 (step S15) and captures an image including the face of user U1, who is entering the PIN via the touch panel 33 (step S16). Here, it is assumed that the captured image I shown in FIG. 13 has been captured. One image or several images may be captured. When shooting ends, the shooting control unit 35c deactivates the camera 31.
As described above, the payment terminal 30 activates the camera 31 in response to the input operation by user U1 on the touch panel 33 and captures an image including the face of user U1.
Next, the payment terminal 30 (communication unit 37) transmits the captured image I taken in step S16 to the authentication control device 20 (step S17). At this point, the shooting control unit 35c may end shooting with the camera 31 (or deactivate the camera 31) at the timing at which the captured image is transmitted.
Next, the authentication control device 20 (communication unit 24) receives the captured image I transmitted in step S17, and the image acquisition unit 22c acquires the captured image I received by the communication unit 24.
Next, the authentication control device 20 (face area detection unit 22k) executes the face detection process (see FIG. 15) to detect the face area of user U1 connected to the fingers performing the input operation on the touch panel 33 of the payment terminal 30 (step S1700). Here, it is assumed that, as a result of this face detection process, the face area A1 of user U1 (see FIG. 14) has been detected as the face area connected to the fingers performing the input operation on the touch panel 33 of the payment terminal 30.
Next, the authentication control device 20 (authentication control unit 22d) transmits a face authentication request, requesting face authentication of user U1 included in the face area A1 (face image) detected in step S1700, to the authentication device 10 via the communication unit 24 (step S18). This face authentication request includes the detected face area A1.
Next, when the communication unit 14 receives the face authentication request transmitted in step S18, the authentication device 10 executes the face authentication process (see FIG. 6) (step S19).
Next, the authentication device 10 (authentication unit 12e) transmits the authentication result to the authentication control device 20, the sender of the face authentication request, via the communication unit 14 (step S20). Here, it is assumed that an indication that the authentication succeeded and the user ID of the authenticated user U1 are transmitted to the authentication control device 20 as the authentication result.
Next, the authentication control device 20 (communication unit 24) receives the face authentication result and the user ID transmitted in step S20, and the face authentication result acquisition unit 22e acquires the face authentication result received by the communication unit 24.
When the input state detection unit 35b detects the end of PIN entry, the payment terminal 30 (communication unit 37) transmits the entered PIN to the authentication control device 20 (step S22). The authentication control device 20 (communication unit 24) receives the entered PIN transmitted in step S22, and the authentication information acquisition unit 22f acquires the entered PIN received by the communication unit 24.
Next, the authentication control device 20 (verification unit 22g) executes PIN verification (step S23). This PIN verification is a process of comparing the entered PIN acquired by the authentication information acquisition unit 22f with the registered PIN stored in the storage unit 21 (the registered PIN associated with the acquired user ID).
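The verification of step S23 can be sketched as follows. The data layout of the registered-PIN store is an assumption for illustration, and the use of a constant-time comparison (`hmac.compare_digest`) is a common hardening choice, not something mandated by the specification.

```python
import hmac

# Illustrative stand-in for the registered PINs held in storage unit 21,
# keyed by the user ID returned by face authentication.
registered_pins = {'user-U1': '2580'}

def verify_pin(user_id, input_pin, store=registered_pins):
    """Compare the entered PIN with the registered PIN for user_id."""
    expected = store.get(user_id)
    if expected is None:
        return False
    # Constant-time comparison avoids leaking match length via timing.
    return hmac.compare_digest(expected, input_pin)
```

Payment proceeds (step S24) only when this check returns a match and the face authentication result indicates success.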
Next, if the face authentication succeeded (that is, the face authentication result acquired by the face authentication result acquisition unit 22e indicates success) and the PIN verification in step S23 produced a match, the authentication control device 20 transmits a payment processing instruction to the payment terminal 30 via the communication unit 24 (step S24).
Next, when the communication unit 37 receives the payment processing instruction transmitted in step S24, the payment terminal 30 (display control unit 35a) displays a payment execution inquiry screen (see FIG. 11), asking whether to execute the payment, on the display surface of the display unit 32 (step S25). FIG. 11 shows an example of the payment execution inquiry screen G2.
Next, when user U1 confirms the payment, for example by tapping the payment button B on the payment execution inquiry screen via the touch panel 33, the payment terminal 30 (payment processing unit 35d) executes the payment processing (step S27). At this point, the shooting control unit 35c may end shooting with the camera 31 (or deactivate the camera 31) at the timing at which user U1 taps the payment button B via the touch panel 33.
If the face authentication failed (that is, the face authentication result acquired by the face authentication result acquisition unit 22e indicates failure), or if the PIN verification in step S23 did not produce a match, the processing of steps S24 to S27 is not executed. In this case, the display unit 32 displays, for example, an indication that the authentication failed.
In the course of executing the processing of steps S19 to S24, the skeleton feature points detected in the face authentication process (step S19) (for example, the wrist P1R, elbows P2R and P2L, shoulders P3R and P3L, neck P4, and face P5 in FIG. 14) may be superimposed on the person to be authenticated (the user performing the input operation with his or her fingers on the touch panel 33 of the payment terminal 30) in the image captured in step S16 and displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33) (see FIG. 14). In that case, if a third party other than the person to be authenticated, lined up behind that person, appears in the image captured in step S16 (not shown), the feature points of that third party's body may be left undisplayed. Alternatively, if such a third party appears in the image captured in step S16 (not shown), the body feature points detected in the authentication process (step S19) may be superimposed on both the person to be authenticated and the third party in the image captured in step S16 and displayed on the touch panel screen.
In that case, the body feature points superimposed on the person to be authenticated may be highlighted (the body feature points superimposed on the third party need not be highlighted). Furthermore, when the authentication in the authentication process (step S19) succeeds, the color of the body feature points superimposed on the person to be authenticated in the image captured in step S16 may be changed on the touch panel screen. Alternatively, in the course of executing the processing of steps S19 to S24, an indication that the face of the frontmost user (the one operating the touch panel 33) has been detected from the skeleton information may be displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33).
As described above, according to the first operation example of the authentication system 1, the face authentication (step S19) is performed before the PIN verification (step S23), so the face authentication can be carried out quickly.
Next, another example of the operation of the authentication system 1 configured as above (second operation example) will be described.
FIG. 12 is a sequence diagram of the authentication system 1 (second operation example).
In the first operation example, the face authentication process is executed using the face feature information of all the user IDs registered in the face information DB 11b, whereas in the second operation example it is executed using the face feature information of only those registered user IDs that are associated with the entered PIN. The processing of steps S10 to S22 is the same as in the first operation example, so the same reference numerals are used and the description is omitted. The following description focuses on steps S30 to S37, which follow step S22.
First, as a premise, the authentication control device 20 (communication unit 24) receives the captured image transmitted in step S17, the image acquisition unit 22c acquires the captured image received by the communication unit 24, and the face area detection unit 22k executes the face detection process (see FIG. 15) (step S1700). The authentication control device 20 (communication unit 24) also receives the entered PIN transmitted in step S22, and the authentication information acquisition unit 22f acquires the entered PIN received by the communication unit 24.
Next, the authentication control device 20 (verification unit 22g) executes PIN verification (step S30). This PIN verification is a process of comparing the entered PIN acquired by the authentication information acquisition unit 22f with all the registered PINs stored in the storage unit 21 and extracting from the storage unit 21 the user IDs associated with the entered PIN.
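The extraction of step S30 can be sketched as follows. Because a PIN is short and not necessarily unique, several users may share it; the table contents below are illustrative assumptions, not data from the specification.

```python
# Illustrative stand-in for storage unit 21: registered PIN per user ID.
pin_table = {
    'user-U1': '2580',
    'user-U2': '1379',
    'user-U3': '2580',   # PINs are short, so collisions are expected
}

def extract_candidates(input_pin, table=pin_table):
    """Return the user IDs whose registered PIN matches the entered PIN."""
    return sorted(uid for uid, pin in table.items() if pin == input_pin)
```

The resulting candidate list (for example, 100 IDs out of 1000 enrolled) is what step S31 attaches to the face authentication request.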
Next, when user IDs have been extracted as a result of the PIN verification in step S30, the authentication control device 20 (authentication control unit 22d) transmits a face authentication request, requesting face authentication of user U1 included in the face area A1 (face image) detected by the face detection process of the face area detection unit 22k, to the authentication device 10 via the communication unit 24 (step S31). This face authentication request includes the detected face area A1 and the user IDs extracted in step S30.
Next, when the communication unit 14 receives the face authentication request transmitted in step S31, the authentication device 10 executes the face authentication process (see FIG. 6) (step S32). In doing so, the authentication device 10 uses not the face feature information of all user IDs (for example, 1000 people) but only the face feature information of the user IDs extracted in step S30 (for example, 100 people).
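The narrowed match of step S32 can be sketched as follows. The specification does not state the matching method; cosine similarity over feature vectors and the 0.9 threshold are assumptions chosen for the example, and the gallery contents are illustrative.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def authenticate(probe, gallery, candidates, threshold=0.9):
    """Score the probe face feature against only the candidate user IDs
    extracted by PIN verification, not the whole gallery."""
    best_id, best_score = None, -1.0
    for uid in candidates:                 # narrowed subset only
        score = cosine(probe, gallery[uid])
        if score > best_score:
            best_id, best_score = uid, score
    return best_id if best_score >= threshold else None

# Illustrative enrolled face feature vectors (face information DB 11b).
gallery = {'user-U1': [1.0, 0.0], 'user-U2': [0.0, 1.0], 'user-U3': [0.7, 0.7]}
```

Scoring fewer, PIN-consistent templates is what underlies the accuracy improvement claimed for this operation example: the chance of a near-match with an unrelated enrollee shrinks with the candidate set.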
Next, the authentication device 10 (authentication unit 12e) transmits the authentication result to the authentication control device 20, the sender of the face authentication request (step S33). Here, it is assumed that an indication that the authentication succeeded is transmitted to the authentication control device 20 as the authentication result.
Next, the authentication control device 20 (communication unit 24) receives the face authentication result transmitted in step S33, and the face authentication result acquisition unit 22e acquires the face authentication result received by the communication unit 24.
Next, when the acquired face authentication result indicates that the authentication succeeded, the authentication control device 20 (communication unit 24) transmits a payment processing instruction to the payment terminal 30 (step S34).
Next, when the payment terminal 30 (display control unit 35a) receives the payment processing instruction transmitted in step S34, it displays the payment execution inquiry screen (see FIG. 11), asking whether to execute the payment, on the display unit 32 (step S35).
Next, when user U1 confirms the payment, for example by tapping the payment button B on the payment execution inquiry screen (see FIG. 11) via the touch panel 33, the payment terminal 30 executes the payment processing (step S37).
If no user ID is extracted as a result of the PIN verification in step S30, or if the face authentication failed (that is, the face authentication result acquired by the face authentication result acquisition unit 22e indicates failure), the processing of steps S34 to S37 is not executed. In this case, the display unit 32 displays, for example, an indication that the authentication failed.
In the course of executing the processing of steps S32 to S34, the skeleton feature points detected in the face authentication process (step S32) (for example, the wrist P1R, elbows P2R and P2L, shoulders P3R and P3L, neck P4, and face P5 in FIG. 14) may be superimposed on the person to be authenticated (the user performing the input operation with his or her fingers on the touch panel 33 of the payment terminal 30) in the image captured in step S16 and displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33) (see FIG. 14). In that case, if a third party other than the person to be authenticated, lined up behind that person, appears in the image captured in step S16 (not shown), the feature points of that third party's body may be left undisplayed. Alternatively, if such a third party appears in the image captured in step S16 (not shown), the body feature points detected in the authentication process (step S32) may be superimposed on both the person to be authenticated and the third party in the image captured in step S16 and displayed on the touch panel screen.
In that case, the body feature points superimposed on the person to be authenticated may be highlighted (the body feature points superimposed on the third party need not be highlighted). Furthermore, when the authentication in the authentication process (step S32) succeeds, the color of the body feature points superimposed on the person to be authenticated in the image captured in step S16 may be changed on the touch panel screen. Alternatively, in the course of executing the processing of steps S32 to S34, an indication that the face of the frontmost user (the one operating the touch panel 33) has been detected from the skeleton information may be displayed on the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33).
 以上説明したように、認証システム1の動作の一例(第2動作例)によれば、PIN照合(ステップS30)により絞り込んだ上で(例えば、上記例では1000名から100名に絞り込んだ上で)、顔認証(ステップS32)を実施できるため認証精度を向上することができる。 As described above, according to an example of the operation of the authentication system 1 (second operation example), after narrowing down by PIN verification (step S30) (for example, in the above example, after narrowing down from 1000 to 100 people). ), Face authentication (step S32) can be performed, so that the authentication accuracy can be improved.
 As described above, according to the second embodiment, even when a third party other than user U1 (the person to be authenticated) appears in the image captured by the camera 31 (for example, a user U2 standing in line behind U1 or a third party crossing the camera's shooting range; see FIG. 13), user U1 can be correctly authenticated.
 That is, because the face area detection unit 22k detects the face area connected to the fingers that are performing the input operation (the input operation for payment) on the touch panel 33 of the payment terminal 30, erroneous detection of the face area of a third party whose fingers are not performing that input operation is suppressed, and the face area of user U1, the person to be authenticated who is entering the PIN for the payment process, can be correctly detected. As a result, face authentication by the authentication device 10 can be executed correctly.
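As a minimal sketch of the selection that the face area detection unit 22k performs: among the people detected in the captured image, take the one whose wrist keypoint is closest to the image edge nearest the panel (the bottom edge when the camera 31 is mounted at the top of the display frame, see FIG. 8) and return that person's face region. The person/keypoint data layout here is an assumption for illustration:

```python
def select_operator_face(persons, image_height):
    """Return the face region of the person actually touching the panel.

    Each person dict carries a wrist keypoint and a face bounding box.
    The operator is taken to be the person whose wrist is closest to the
    lower edge of the captured image I (distance L in FIG. 14), i.e. the
    hand connected to the face that should be authenticated.
    """
    def dist_to_lower_edge(person):
        _, wy = person["wrist"]          # y grows downward from the top edge
        return image_height - wy

    operator = min(persons, key=dist_to_lower_edge)
    return operator["face_region"]
```

Only the returned face region would then be passed to the authentication device 10, so bystanders in the frame never reach the face matcher.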
 Further, according to the second embodiment, the camera 31 is activated at an appropriate timing corresponding to the user's input operation on the touch panel 33 (for example, the timing at which user U1 starts entering the PIN via the touch panel 33) to photograph the face of user U1. This makes it possible to reduce power consumption compared with keeping the camera 31 running at all times. Moreover, if the camera 31 were always running, third parties other than the person to be authenticated (for example, third parties standing in line behind that person or crossing the shooting range of the camera 31) would be captured more often, which is undesirable from the viewpoint of privacy protection. By activating the camera 31 at an appropriate timing corresponding to the user's input operation on the touch panel 33 (that is, only when authentication is required), as in the second embodiment, the capture of third parties other than the person to be authenticated can be suppressed.
 Further, according to the second embodiment, because the face of user U1 is photographed at a timing corresponding to the user's input operation on the touch panel 33 (for example, the timing at which user U1 starts entering the PIN via the touch panel 33), the face of user U1 while entering the PIN via the touch panel 33, that is, while facing (gazing at) the display surface of the display unit 32 on which the PIN input reception screen G1 is displayed (and the camera 31 arranged near it), can be photographed from the front or substantially from the front. In other words, the face of user U1 can be photographed at an angle suitable for face recognition, which can be expected to improve the accuracy of face authentication.
 Next, modified examples will be described.
 In the second embodiment above, an example was described in which the wrist feature point (or finger feature point) with the shortest distance L (see FIG. 14) to the lower edge of the captured image I is detected as the finger closest to the touch panel 33 of the payment terminal 30 (for the case where the camera 31 is attached to the upper part of the frame of the display unit 32; see FIG. 8), but the present invention is not limited to this.
 For example, when the camera 31 is attached to the lower part of the frame of the display unit 32 (not shown), the wrist feature point (or finger feature point) with the shortest distance to the upper edge of the captured image I may be detected as the finger closest to the touch panel 33 of the payment terminal 30. When the camera 31 is attached to the right part of the frame of the display unit 32 (not shown), the wrist feature point (or finger feature point) with the shortest distance to the left edge of the captured image I may be detected as the finger closest to the touch panel 33 of the payment terminal 30. When the camera 31 is attached to the left part of the frame of the display unit 32 (not shown), the wrist feature point (or finger feature point) with the shortest distance to the right edge of the captured image I may be detected as the finger closest to the touch panel 33 of the payment terminal 30. Further, when the camera 31 is attached to the touch panel screen (the display surface of the display unit 32 covered by the touch panel 33) (not shown), the wrist feature point (or finger feature point) touching the touch panel 33 may be detected as the finger closest to the touch panel 33 of the payment terminal 30.
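The four mounting cases above can be folded into a single helper; the `mount` strings and coordinate convention (origin at the top-left, y growing downward) are illustrative assumptions:

```python
def distance_to_panel_edge(keypoint, mount, width, height):
    """Distance from a wrist (or finger) keypoint to the image edge that
    corresponds to the touch panel, given where the camera is mounted.

    Camera at the top of the display frame -> the panel appears along the
    bottom edge of the image; bottom -> top edge; right -> left edge;
    left -> right edge. The wrist with the smallest distance is taken as
    the hand operating the panel.
    """
    x, y = keypoint
    if mount == "top":
        return height - y
    if mount == "bottom":
        return y
    if mount == "right":
        return x
    if mount == "left":
        return width - x
    raise ValueError("unknown mount position: %r" % mount)
```

The face-area selection then stays identical across mountings; only this distance function changes with the camera position.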
 Further, in the second embodiment above, an example was described in which the timing corresponding to the user's input operation on the touch panel 33 (the timing at which the camera 31 is activated to photograph the face of user U1) is the timing at which user U1 starts entering the PIN via the touch panel 33, but this is not limiting. For example, this timing may be any timing after the PIN input has started and before it has finished.
 In this way, the face of user U1 while entering the PIN via the touch panel 33, that is, while facing (gazing at) the display surface of the display unit 32 on which the PIN input reception screen G1 is displayed (and the camera 31 arranged near it), can be photographed from the front or substantially from the front. In other words, the face of user U1 can be photographed at an angle suitable for face recognition, which can be expected to improve the accuracy of face authentication.
 Further, the timing corresponding to the user's input operation on the touch panel 33 (the timing at which the camera 31 is activated to photograph the face of user U1) may be the timing at which user U1 finishes entering the PIN, or any later timing.
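The timing variants above (activate the camera at the first PIN key press, or when the PIN is complete) can be sketched as a small trigger object; the callback interface, timing labels, and fixed PIN length are assumptions made for this sketch:

```python
class CameraTrigger:
    """Activate the camera exactly once at a configurable point in PIN
    entry: "start" (first key press) or "end" (last digit entered)."""

    def __init__(self, start_camera, when="start", pin_length=4):
        self.start_camera = start_camera  # callback that powers on the camera
        self.when = when
        self.pin_length = pin_length
        self.keys_entered = 0
        self.fired = False

    def on_key(self, digit):
        """Call for every digit the user enters on the touch panel."""
        self.keys_entered += 1
        at_start = (self.when == "start" and self.keys_entered == 1)
        at_end = (self.when == "end" and self.keys_entered == self.pin_length)
        if (at_start or at_end) and not self.fired:
            self.start_camera()
            self.fired = True
```

Keeping the camera off until one of these trigger points fires is what yields the power and privacy benefits discussed above.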
 Further, in the second embodiment above, an example was described in which the camera 31 is attached to the upper part of the frame of the display unit 32 (see FIG. 8), but this is not limiting. The camera 31 may be attached anywhere as long as it can photograph the face at an angle suitable for face recognition. For example, it may be attached to the left or right part of the frame of the display unit 32, to a structure (for example, a wall or a pillar) installed near the payment terminal 30, or elsewhere.
 Further, in the second embodiment above, an example was described in which the predetermined process executed when face authentication by the authentication device 10 succeeds and the verification result of the verification unit 22g indicates a match is a payment process, but this is not limiting. For example, the predetermined process may be a process of opening a gate or door through which user U1 passes, or some other process.
 Further, in the second embodiment above, an example was described in which the authentication system 1 is composed of the authentication device 10, the authentication control device 20, and the payment terminal 30, which can communicate with one another via a network NW (for example, the Internet), but this is not limiting.
 For example, all or part of the configuration or functions of the authentication device 10 and the authentication control device 20 may be added to the payment terminal 30. Likewise, all or part of the configuration or functions of the payment terminal 30 and the authentication device 10 may be added to the authentication control device 20.
 In the first and second embodiments above, the program can be stored using various types of non-transitory computer readable media and supplied to a computer. Non-transitory computer readable media include various types of tangible storage media. Examples of non-transitory computer readable media include magnetic recording media (for example, flexible disks, magnetic tapes, and hard disk drives), magneto-optical recording media (for example, magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memories (for example, mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, and RAM (Random Access Memory)). The program may also be supplied to the computer by various types of transitory computer readable media. Examples of transitory computer readable media include electric signals, optical signals, and electromagnetic waves. A transitory computer readable medium can supply the program to the computer via a wired communication path such as an electric wire or optical fiber, or via a wireless communication path.
 The numerical values given in the above embodiments are all examples, and other appropriate values may of course be used.
 The above embodiments are merely illustrative in every respect. The present invention is not to be construed as limited by the description of the above embodiments, and can be practiced in various other forms without departing from its spirit or essential characteristics.
 Some or all of the above embodiments may also be described as in the following appendices, but are not limited to the following.
 (Appendix 1)
 An authentication control device comprising:
 an imaging control means for causing a camera to capture an image including a user who is performing an input operation with a finger on an input means;
 an image acquisition means for acquiring the image captured by the camera;
 a face area detection means for detecting, from the image acquired by the image acquisition means, a face area connected to the finger performing the input operation on the input means; and
 an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face area detection means.
 (Appendix 2)
 The authentication control device according to Appendix 1, wherein the input means is an input means for causing the user to input authentication information assigned to the user,
 the device further comprising:
 an authentication information acquisition means for acquiring the authentication information input via the input means;
 a verification means for verifying the authentication information acquired by the authentication information acquisition means against registered authentication information; and
 a process control means for causing a processing means that executes a predetermined process to execute the predetermined process when the face authentication by the authentication device succeeds and the verification result of the verification means indicates a match.
 (Appendix 3)
 The authentication control device according to Appendix 2, wherein the predetermined process is a payment process or a process of opening a gate or door through which the user passes.
 (Appendix 4)
 The authentication control device according to any one of Appendices 1 to 3, wherein the input means is an input means for causing the user to input authentication information with a finger.
 (Appendix 5)
 The authentication control device according to Appendix 4, wherein the input means is a touch panel arranged so as to cover the display surface of a display unit.
 (Appendix 6)
 The authentication control device according to Appendix 5, further comprising a display control unit that causes the display surface of the display unit to display an authentication information input screen including images symbolizing the respective elements constituting the authentication information, which the user's finger is to trace via the touch panel.
 (Appendix 7)
 The authentication control device according to any one of Appendices 1 to 6, further comprising a feature point detection means for detecting at least one feature point of the user's skeleton from the image acquired by the image acquisition means,
 wherein the face area detection means detects, from the image acquired by the image acquisition means, a face area connected, via a feature point detected by the feature point detection means, to the finger performing the input operation on the input means.
 (Appendix 8)
 An authentication system comprising:
 an input means;
 a camera that captures an image including a user who is performing an input operation with a finger on the input means;
 an image acquisition means for acquiring the image captured by the camera;
 a face area detection means for detecting, from the image acquired by the image acquisition means, a face area connected to the finger performing the input operation on the input means; and
 an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face area detection means.
 (Appendix 9)
 The authentication system according to Appendix 8, comprising an information processing device, an authentication control device, and the authentication device, which can communicate with one another via a network,
 wherein the input means and the camera are provided in the information processing device, and
 the image acquisition means, the face area detection means, and the authentication control means are provided in the authentication control device.
 (Appendix 10)
 An authentication control method comprising:
 an imaging control step of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means;
 an image acquisition step of acquiring the image captured by the camera;
 a face area detection step of detecting, from the image acquired in the image acquisition step, a face area connected to the finger performing the input operation on the input means; and
 an authentication control step of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected in the face area detection step.
 (Appendix 11)
 A computer readable recording medium recording a program for causing an electronic device having at least one processor to execute:
 an imaging control process of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means;
 an image acquisition process of acquiring the image captured by the camera;
 a face area detection process of detecting, from the image acquired in the image acquisition process, a face area connected to the finger performing the input operation on the input means; and
 an authentication control process of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected in the face area detection process.
1 authentication system
10 authentication device
11 storage unit
11a program
11b face information DB
12 control unit
12a image acquisition unit
12b face detection unit
12c feature point extraction unit
12d registration unit
12e authentication unit
13 memory
14 communication unit
20 authentication control device
21 storage unit
21a program
21b personal authentication information
21b2 authentication information
22 control unit
22a display control unit
22b imaging control unit (imaging control means)
22c image acquisition unit (image acquisition means)
22d authentication control unit (authentication control means)
22e authentication information acquisition unit (face authentication result acquisition unit)
22g verification unit
22h completion processing unit
22j feature point detection unit
22k face area detection unit
23 memory
24 communication unit
30 payment terminal (information processing terminal)
31 camera
32 display unit
33 touch panel (input means)
34 storage unit
34a program
35 control unit
35a display control unit
35b input state detection unit
35c imaging control unit
35d payment processing unit
36 memory
37 communication unit
B payment button
G1 PIN input reception screen
G2 inquiry screen
NW network
g image

Claims (11)

  1.  An authentication control device comprising:
     an imaging control means for causing a camera to capture an image including a user who is performing an input operation with a finger on an input means;
     an image acquisition means for acquiring the image captured by the camera;
     a face area detection means for detecting, from the image acquired by the image acquisition means, a face area connected to the finger performing the input operation on the input means; and
     an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face area detection means.
  2.  The authentication control device according to claim 1, wherein the input means is an input means for causing the user to input authentication information assigned to the user,
     the device further comprising:
     an authentication information acquisition means for acquiring the authentication information input via the input means;
     a verification means for verifying the authentication information acquired by the authentication information acquisition means against registered authentication information; and
     a process control means for causing a processing means that executes a predetermined process to execute the predetermined process when the face authentication by the authentication device succeeds and the verification result of the verification means indicates a match.
  3.  The authentication control device according to claim 2, wherein the predetermined process is a payment process or a process of opening a gate or door through which the user passes.
  4.  The authentication control device according to any one of claims 1 to 3, wherein the input means is an input means for causing the user to input authentication information with a finger.
  5.  The authentication control device according to claim 4, wherein the input means is a touch panel arranged so as to cover the display surface of a display unit.
  6.  The authentication control device according to claim 5, further comprising a display control unit that causes the display surface of the display unit to display an authentication information input screen including images symbolizing the respective elements constituting the authentication information, which the user's finger is to trace via the touch panel.
  7.  The authentication control device according to any one of claims 1 to 6, further comprising a feature point detection means for detecting at least one feature point of the user's skeleton from the image acquired by the image acquisition means,
     wherein the face area detection means detects, from the image acquired by the image acquisition means, a face area connected, via a feature point detected by the feature point detection means, to the finger performing the input operation on the input means.
  8.  An authentication system comprising:
     an input means;
     a camera that captures an image including a user who is performing an input operation with a finger on the input means;
     an image acquisition means for acquiring the image captured by the camera;
     a face area detection means for detecting, from the image acquired by the image acquisition means, a face area connected to the finger performing the input operation on the input means; and
     an authentication control means for causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected by the face area detection means.
  9.  The authentication system according to claim 8, comprising an information processing device, an authentication control device, and the authentication device, which can communicate with one another via a network,
     wherein the input means and the camera are provided in the information processing device, and
     the image acquisition means, the face area detection means, and the authentication control means are provided in the authentication control device.
  10.  An authentication control method comprising:
     an imaging control step of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means;
     an image acquisition step of acquiring the image captured by the camera;
     a face area detection step of detecting, from the image acquired in the image acquisition step, a face area connected to the finger performing the input operation on the input means; and
     an authentication control step of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected in the face area detection step.
  11.  A computer readable recording medium recording a program for causing an electronic device having at least one processor to execute:
     an imaging control process of causing a camera to capture an image including a user who is performing an input operation with a finger on an input means;
     an image acquisition process of acquiring the image captured by the camera;
     a face area detection process of detecting, from the image acquired in the image acquisition process, a face area connected to the finger performing the input operation on the input means; and
     an authentication control process of causing an authentication device that executes face authentication to execute face authentication of the user included in the face area detected in the face area detection process.
PCT/JP2020/013145 2020-03-24 2020-03-24 Authentication control device, authentication system, authentication control method, and recording medium WO2021192061A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022509854A JPWO2021192061A5 (en) 2020-03-24 Authentication control device, authentication system, authentication control method and program
PCT/JP2020/013145 WO2021192061A1 (en) 2020-03-24 2020-03-24 Authentication control device, authentication system, authentication control method, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/013145 WO2021192061A1 (en) 2020-03-24 2020-03-24 Authentication control device, authentication system, authentication control method, and recording medium

Publications (1)

Publication Number Publication Date
WO2021192061A1 true WO2021192061A1 (en) 2021-09-30

Family

ID=77891218

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/013145 WO2021192061A1 (en) 2020-03-24 2020-03-24 Authentication control device, authentication system, authentication control method, and recording medium

Country Status (1)

Country Link
WO (1) WO2021192061A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11308473B2 (en) * 2018-04-20 2022-04-19 Banks And Acquirers International Holding Device for determining a transactional device, corresponding method and computer program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004355377A (en) * 2003-05-29 2004-12-16 Toshiba Corp Face image collation system and its method for distributing face image data
JP2006079382A (en) * 2004-09-10 2006-03-23 Hitachi Omron Terminal Solutions Corp Information processor, method for detecting illegal person and automatic teller machine
EP2103255A2 (en) * 2007-01-09 2009-09-23 Pavel Anatolievich Zaitsev Biometric information carrier
JP2014211838A (en) * 2013-04-22 2014-11-13 富士通株式会社 Biometric authentication apparatus, biometric authentication system, and biometric authentication method
US20170243062A1 (en) * 2007-06-11 2017-08-24 Jeffrey A. Matos Apparatus and method for verifying the identity of an author and a person receiving information
WO2019187421A1 (en) * 2018-03-29 2019-10-03 Sony Corporation Information processing device, information processing method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MORI, SHOTA ET AL.: "Face and hand region extraction including concealment using distance information for sign language recognition", IPSJ SIG TECHNICAL REPORTS, 15 April 2013 (2013-04-15), pages 1 - 6 *

Also Published As

Publication number Publication date
JPWO2021192061A1 (en) 2021-09-30

Similar Documents

Publication Publication Date Title
US20210224370A1 (en) Systems and methods for executing electronic transactions using biometric data
JP5169257B2 (en) Automatic transaction apparatus and automatic transaction system
US20210029112A1 (en) Taptic authentication system and method
KR101795556B1 (en) Personal Authentication Methods and Apparatus using Face Recognition and Face Motion Pattern Recognition
JP7509216B2 (en) Input control device, input system, input control method and input control program
EP3786820B1 (en) Authentication system, authentication device, authentication method, and program
TW202029030A (en) Authentication system, authentication device, authentication method, and program
WO2021192061A1 (en) Authentication control device, authentication system, authentication control method, and recording medium
WO2021192101A1 (en) Authentication control device, information processing device, authentication system, authentication control method, and recording medium
TWI762065B (en) Authentication system, authentication device, authentication method, and program product
JP6584855B2 (en) Input support apparatus, input support method, and program
WO2021192102A1 (en) Authentication control device, authentication system, authentication control method, and recording medium
TWI748415B (en) System and method for executing transaction based on a mobile communication device
JP6891355B1 (en) Authentication system, authentication device, authentication method, and program
TWM626411U (en) Cardless finance transaction system and its host server
JP6852508B2 (en) Mobile terminal devices, their authentication methods, and programs
JP6668013B2 (en) Judgment device, judgment method and program
WO2023199455A1 (en) Identification system, entry/exit management system, and pos system
JP6907426B1 (en) Authentication system, authentication method, and program
JP2011123729A (en) Authentication system, human body communication terminal device, and host device
WO2016059710A1 (en) Authentication system, authentication device, authentication method, and authentication program
JP2015225486A (en) Biological information registration system and information acquisition device
US20230308435A1 (en) Authentication system, authentication terminal, method for controlling authentication system, method for controlling authentication terminal, and storage medium
JP2022031099A (en) Input system, input program, and input method
Das et al. A Palm Index and Hand Gesture-Based User Authentication System

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20927121

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022509854

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 20927121

Country of ref document: EP

Kind code of ref document: A1