
WO2023229052A1 - Authentication system, control device, and computer program - Google Patents


Info

Publication number
WO2023229052A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
value
information
orientation
coordinate
Application number
PCT/JP2023/020731
Other languages
French (fr)
Japanese (ja)
Inventor
誠司 渡邉
Original Assignee
ノラシステムス合同会社
Application filed by ノラシステムス合同会社
Publication of WO2023229052A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • This disclosure relates to an authentication system.
  • The present disclosure also relates to a control device used in an authentication system and to a computer program executed by the control device. Further, the present disclosure relates to a control device that generates authentication information used for authentication by an authentication system, and to a computer program executed by that control device.
  • Patent Document 1 discloses a face authentication device and a face authentication method.
  • The face authentication device identifies a person by extracting feature points from the face area of the person to be recognized and by calculating and comparing the degree of similarity with dictionary images of each registrant's face registered in advance.
  • Face data of each person is stored in a recording medium as a dictionary image.
  • This disclosure aims to prevent individuals from being identified even if data stored for use in authentication is leaked.
  • The authentication system includes: an authentication data acquisition device that acquires biometric data for biometric authentication regarding a part of the body of a person to be authenticated; and a control device that acquires first information related to the biometric authentication of the person to be authenticated from an information storage medium owned by the person to be authenticated, acquires second information related to the biometric authentication of the person to be authenticated from a storage device, and biometrically authenticates the person to be authenticated based on the biometric data acquired from the authentication data acquisition device, the first information acquired from the information storage medium, and the second information acquired from the storage device.
  • The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data, obtained in advance, regarding a part of the body of the person to be authenticated.
  • The position/orientation change value is a value obtained by changing at least one of the position and the orientation of the at least one data from a first state to a second state different from the first state.
  • The second information includes coordinate values of each of the plurality of data.
  • The coordinate value of the at least one data for which the position/orientation change value is set is a coordinate value related to the second state.
  • The position/orientation change value is set to a value that makes it impossible to determine the correlation between the plurality of data in the biometric data when the plurality of data are arranged based on the coordinate values included in the second information.
  • The control device creates reference data by arranging the plurality of data in the first state based on the first information and the second information, and performs authentication processing based on the reference data and the biometric data for biometric authentication.
  • A control device includes: an input unit that accepts biometric data for biometric authentication regarding a part of the body of the person to be authenticated, receives first information regarding the biometric authentication of the person to be authenticated from an information storage medium owned by the person to be authenticated, and receives second information regarding the biometric authentication of the person to be authenticated from a storage device; and a processor.
  • The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data, obtained in advance, regarding a part of the body of the person to be authenticated.
  • The position/orientation change value is a value obtained by changing at least one of the position and the orientation of the at least one data from a first state to a second state different from the first state.
  • The second information includes coordinate values of each of the plurality of data.
  • The coordinate value of the at least one data for which the position/orientation change value is set is a coordinate value related to the second state.
  • The position/orientation change value is set to a value that makes it impossible to determine the correlation between the plurality of data in the biometric data when the plurality of data are arranged based on the coordinate values included in the second information.
  • The processor creates reference data by arranging the plurality of data in the first state based on the first information and the second information, and performs authentication processing based on the reference data and the biometric data for biometric authentication.
  • A computer program executed by a control device causes, when executed, the control device to: accept biometric data for biometric authentication regarding a part of the body of the person to be authenticated; receive first information regarding the biometric authentication of the person to be authenticated from an information storage medium owned by the person to be authenticated; and receive second information regarding the biometric authentication of the person to be authenticated from a storage device.
  • The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data, obtained in advance, regarding a part of the body of the person to be authenticated.
  • The position/orientation change value is a value obtained by changing at least one of the position and the orientation of the at least one data from a first state to a second state different from the first state.
  • The second information includes coordinate values of each of the plurality of data.
  • The coordinate value of the at least one data for which the position/orientation change value is set is a coordinate value related to the second state.
  • The position/orientation change value is set to a value that makes it impossible to determine the correlation between the plurality of data in the biometric data when the plurality of data are arranged based on the coordinate values included in the second information.
  • With this configuration, the information storage medium stores, as the first information, the position/orientation change value of at least one of the plurality of data, but does not store the coordinate values of the data. The storage device stores the coordinate values of the data as the second information, but does not store the position/orientation change value. As a result, the first state of the plurality of data is unknown, and the plurality of data cannot be restored to the original biometric data using only the authentication information stored in either the information storage medium or the storage device alone. Therefore, even if either the first information stored in the information storage medium or the second information stored in the storage device is leaked, the individual cannot be identified.
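  • As an informal illustration (not part of the patent text), this split can be sketched in Python as follows; the names FirstInfo, SecondInfo, and restore_first_state are hypothetical, and the position/orientation change is limited to a simple shift for brevity. The example values reuse the worked numbers for data B8 (No. F007) given later in the description.

```python
# Minimal sketch (assumption: each extracted "data" is a set of 3-D points).
# FirstInfo holds only the position/orientation change values (here: a shift);
# SecondInfo holds only the post-change (second-state) coordinates.
# Neither record alone is enough to rebuild the original arrangement.
from dataclasses import dataclass
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]

@dataclass
class FirstInfo:                      # stored on the user's medium (e.g. NFC tag)
    shifts: Dict[str, Point]          # data id -> shift value

@dataclass
class SecondInfo:                     # stored in the server-side storage device
    coords: Dict[str, List[Point]]    # data id -> second-state coordinates

def restore_first_state(first: FirstInfo, second: SecondInfo) -> Dict[str, List[Point]]:
    """Rebuild the first-state coordinates by undoing each shift."""
    restored = {}
    for data_id, points in second.coords.items():
        sx, sy, sz = first.shifts.get(data_id, (0.0, 0.0, 0.0))
        restored[data_id] = [(x - sx, y - sy, z - sz) for (x, y, z) in points]
    return restored

# Per the worked example for data B8 (No. F007): shift {X-2 Y-5 Z2} and
# second-state origin (7.0, 2.0, 4.0), so the first-state origin is (9.0, 7.0, 2.0).
first = FirstInfo(shifts={"F007": (-2.0, -5.0, 2.0)})
second = SecondInfo(coords={"F007": [(7.0, 2.0, 4.0)]})
print(restore_first_state(first, second))   # {'F007': [(9.0, 7.0, 2.0)]}
```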
  • The position/orientation change value may include a shift value.
  • The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value.
  • The coordinate values related to the second state include the second coordinate value of the at least one data for which the shift value is set.
  • The control device may create the reference data by arranging the plurality of data at the first coordinate values based on the first information and the second information.
  • With this configuration, the information storage medium stores the shift value as the first information but does not store the coordinate values, while the storage device stores the coordinate values as the second information but does not store the shift value. Therefore, even if the first information or the second information becomes known to another person, that person cannot restore the plurality of data.
  • The position/orientation change value may include a rotation inversion value.
  • The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by performing at least one of a rotation process and an inversion process about the origin of the at least one data.
  • The coordinate values related to the second state include the coordinate value of the origin after the change to the second orientation.
  • The control device may create the reference data by performing, based on the first information and the second information, at least one of a rotation process and an inversion process on the at least one data and arranging the plurality of data in the first orientation.
  • With this configuration, the information storage medium stores the rotation inversion value as the first information but does not store the coordinate value of the origin that serves as the base axis of the rotation and inversion processes, while the storage device stores that origin coordinate value as the second information but does not store the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, that person cannot restore the plurality of data.
  • The position/orientation change value may include both a shift value and a rotation inversion value.
  • The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value.
  • The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by performing at least one of a rotation process and an inversion process about the origin of the at least one data.
  • The coordinate values related to the second state include the coordinate value of the origin of the at least one data after it has been shifted to the second coordinate value and changed to the second orientation.
  • The control device may create the reference data by arranging the plurality of data at the first coordinate values and in the first orientation based on the first information and the second information.
  • With this configuration, the information storage medium stores the shift value and the rotation inversion value as the first information, but stores neither the coordinate values of the data nor the coordinate value of the origin that serves as the base axis of the rotation and inversion processes. The storage device stores those coordinate values as the second information, but stores neither the shift value nor the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, that person cannot restore the plurality of data.
  • The plurality of data may be data that is extracted from the biometric data, obtained in advance, regarding a part of the body of the person to be authenticated, after being divided into sizes that do not allow identification of the characteristics of the person to be authenticated. This makes it possible to prevent individuals from being identified even if the plurality of data becomes known to others.
  • The plurality of data may be extracted so as not to constitute a continuous area of the person to be authenticated when arranged in the first state.
  • As a result, the distance between the pieces of data is unknown, so the plurality of data cannot be restored and individuals cannot be identified.
  • the authentication system may further include a first information acquisition device that acquires first information regarding biometric authentication of the person to be authenticated from the information storage medium.
  • the authentication system includes the storage device,
  • the storage device may also store second information regarding biometric authentication of a person to be authenticated who is different from the person to be authenticated.
  • Even if the second information in the storage device becomes known to another person, the second information of the person to be authenticated cannot be identified among the second information of multiple people, so the restoration of the data of the person to be authenticated by another person can be prevented.
  • the first information includes a part of a plurality of segments constituting the plurality of data
  • the second information may include the remainder of the plurality of segments. This makes it possible to prevent individuals from being identified even if the first information or second information is leaked.
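  • A minimal sketch of this segment split, not taken from the patent; the even/odd interleaving rule and the function names are assumptions chosen only to show that both halves are needed to rebuild the data.

```python
# Split the segments that make up one piece of data between the first
# information (user's medium) and the second information (storage device),
# then recombine them at authentication time.
from typing import List, Tuple

def split_segments(segments: List[bytes]) -> Tuple[List[bytes], List[bytes]]:
    """Assumed rule: even-indexed segments go to the first information,
    odd-indexed segments to the second information."""
    part_for_first_info = segments[0::2]
    part_for_second_info = segments[1::2]
    return part_for_first_info, part_for_second_info

def recombine(first_part: List[bytes], second_part: List[bytes]) -> List[bytes]:
    """Interleave the two parts back into the original segment order."""
    segments: List[bytes] = []
    for i in range(len(first_part) + len(second_part)):
        source = first_part if i % 2 == 0 else second_part
        segments.append(source[i // 2])
    return segments

segments = [b"seg0", b"seg1", b"seg2", b"seg3", b"seg4"]
p1, p2 = split_segments(segments)
assert recombine(p1, p2) == segments   # both parts are required to restore the data
```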
  • A control device includes: an input unit that accepts biometric data regarding a part of the body of a target person; and a processor that generates information used for biometric authentication of the target person based on the biometric data and outputs the information to a storage device. The processor sets a position/orientation change value for at least one of a plurality of data extracted from the biometric data.
  • The position/orientation change value is a value obtained by changing at least one of the position and the orientation of the at least one data from a first state to a second state different from the first state.
  • The position/orientation change value is set to a value that makes it impossible to determine the correlation between the plurality of data in the biometric data when the plurality of data are arranged based on the coordinate values of each of the plurality of data, including the coordinate value related to the second state of the at least one data for which the position/orientation change value is set. The processor generates the information including the coordinate values of each of the plurality of data and outputs that information as second information.
  • A computer program executed by a control device causes, when executed, the control device to: accept biometric data regarding a part of the body of a target person; generate information used for biometric authentication of the target person based on the biometric data and output the information to a storage device; and set a position/orientation change value for at least one of a plurality of data extracted from the biometric data.
  • The position/orientation change value is a value obtained by changing at least one of the position and the orientation of the at least one data from a first state to a second state different from the first state.
  • The position/orientation change value is set to a value that makes it impossible to determine the correlation between the plurality of data in the biometric data when the plurality of data are arranged based on the coordinate values of each of the plurality of data, including the coordinate value related to the second state of the at least one data for which the position/orientation change value is set. The information including the coordinate values of each of the plurality of data is generated and output.
  • With this configuration, the storage device stores, as authentication information used for biometric authentication, the coordinate value related to the second state of the at least one data for which the position/orientation change value is set. As a result, the plurality of data cannot be restored to the original biometric data using only the authentication information stored in the storage device. Therefore, even if the authentication information stored in the storage device is leaked, the individual cannot be identified.
  • The processor may generate first information including the position/orientation change value set for the at least one of the plurality of data and output the first information to an information storage medium owned by the target person.
  • With this configuration, the information storage medium stores, as the first information, the position/orientation change value of at least one of the plurality of data, but does not store values indicating the position and orientation states of the data. As a result, the plurality of data cannot be restored to the original biometric data using only the authentication information stored in the information storage medium. Therefore, even if the first information stored in the information storage medium is leaked, individuals cannot be identified.
  • The position/orientation change value may include a shift value.
  • The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value.
  • The coordinate values of each of the plurality of data may include the second coordinate value of the at least one data for which the shift value is set.
  • With this configuration, the information storage medium stores the shift value as the first information but does not store the coordinate values, while the storage device stores the coordinate values as the second information but does not store the shift value. Therefore, even if the first information or the second information becomes known to another person, that person cannot restore the plurality of data.
  • The position/orientation change value may include a rotation inversion value.
  • The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by performing at least one of a rotation process and an inversion process about the origin of the at least one data.
  • The coordinate values of each of the plurality of data may include the coordinate value of the origin of the at least one data after the change to the second orientation.
  • With this configuration, the information storage medium stores the rotation inversion value as the first information but does not store the coordinate value of the origin that serves as the base axis of the rotation and inversion processes, while the storage device stores that origin coordinate value as the second information but does not store the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, that person cannot restore the plurality of data.
  • The position/orientation change value may include both a shift value and a rotation inversion value.
  • The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value.
  • The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by performing at least one of a rotation process and an inversion process about the origin of the at least one data.
  • The coordinate values of each of the plurality of data may include the coordinate value of the origin of the at least one data after it has been shifted to the second coordinate value and changed to the second orientation.
  • With this configuration, the information storage medium stores the shift value and the rotation inversion value as the first information, but stores neither the coordinate values of the data nor the coordinate value of the origin that serves as the base axis of the rotation and inversion processes. The storage device stores those coordinate values as the second information, but stores neither the shift value nor the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, that person cannot restore the plurality of data.
  • The processor may include a part of the plurality of segments constituting the plurality of data in the first information and include the remainder of the plurality of segments in the second information. This makes it possible to prevent individuals from being identified even if the first information or the second information is leaked.
  • The processor may divide the biometric data into sizes that do not allow identification of characteristics of the target person and extract the plurality of data. This makes it possible to prevent individuals from being identified even if the plurality of data becomes known to others.
  • The processor may extract the plurality of data so that, when arranged in the first state, they do not constitute a continuous area of the person to be authenticated. As a result, the distance between the pieces of data is unknown, so the plurality of data cannot be restored and individuals cannot be identified.
  • FIG. 1 illustrates the functional configuration of the authentication system.
  • FIG. 2 illustrates the configuration of the imaging system.
  • FIG. 3 illustrates the flow of the generation process of the first information and the second information executed by the processor of the control device.
  • FIG. 4 illustrates an example of areas from which a plurality of data used for authentication are extracted from a target person's face.
  • FIG. 5 illustrates data corresponding to the areas extracted from imaging data of the target person's face.
  • FIG. 6 illustrates each data in the XYZ coordinate system.
  • FIG. 7 illustrates a database stored in the storage device.
  • FIG. 8 illustrates data extracted from two-dimensional image data in the XY coordinate system.
  • FIG. 9 illustrates a state in which the extracted data of FIG. 8 has been subjected to shift and conversion processing.
  • FIG. 10 is a diagram for explaining an image data extraction method for three-dimensional image data.
  • FIG. 11 illustrates the flow of the authentication process executed by the processor of the control device.
  • FIG. 12 shows an example in which the data of FIG. 6 is shifted so as to be placed at second coordinate values in the XYZ coordinate system.
  • FIG. 13 shows an example in which the data of FIG. 6 is restored by being arranged at first coordinate values in the XYZ coordinate system.
  • FIG. 1 illustrates the functional configuration of an authentication system 10 according to this embodiment.
  • The authentication system 10 can be used to authenticate a person to be authenticated 50 (FIG. 2) as a user of a system that performs a predetermined process or operation and to permit the person to be authenticated 50 to use that system.
  • An example of a system that performs predetermined processing or operation is a payment system.
  • the authentication performed by the authentication system 10 is biometric authentication using a part of the body of the person 50 to be authenticated. Examples of biometric authentication include face authentication, iris authentication, fingerprint authentication, or vein authentication. In the following description, a case will be described in which the authentication system 10 performs face authentication.
  • the authentication system 10 includes an imaging device 11, a reading device 12, and a control device 13.
  • the imaging device 11 and the reading device 12 are installed in the imaging system 14 and are integrally configured.
  • the imaging device 11 and the reading device 12 may not be configured integrally but may be configured as separate bodies.
  • the imaging system 14 can be connected to the communication network 20 wirelessly or by wire, and is connected to the control device 13 via the communication network 20. Note that the imaging system 14 is appropriately placed at a location where the person to be authenticated 50 performs authentication, and a plurality of imaging systems 14 (not shown) are connected to the control device 13.
  • the imaging device 11 is configured to acquire image data for face authentication of the person to be authenticated 50 by capturing an image of the face of the person to be authenticated.
  • Image data for face authentication is an example of biometric data for biometric authentication.
  • the imaging device 11 is an example of an authentication data acquisition device.
  • the imaging device 11 is configured to acquire two-dimensional image data, three-dimensional image data, or three-dimensional shape data as imaging data.
  • the imaging device 11 includes a plurality of light source units 111, a plurality of cameras 112, and a projection device 113.
  • the light source section 111 is configured to emit light toward the person to be authenticated 50.
  • the light source unit 111 includes one or more light sources that emit light at a predetermined wavelength from visible light to invisible light.
  • The plurality of cameras 112 are infrared cameras or cameras capable of detecting light of wavelengths other than infrared, and include imaging elements such as CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensors.
  • the projection device 113 is configured to irradiate infrared rays or visible light toward the person 50 to be authenticated in order to obtain three-dimensional shape data of the person 50 to be authenticated.
  • the projection device 113 projects a pattern of infrared or visible light made up of a plurality of dots onto the person to be authenticated 50 .
  • Two-dimensional image data or three-dimensional image data is acquired by, for example, the light source section 111 and the camera 112.
  • two-dimensional image data of the person to be authenticated 50 is acquired by capturing an image of the person to be authenticated 50 irradiated with light by the light source unit 111 using the camera 112 .
  • the three-dimensional image data is generated, for example, by a photogrammetry technique based on a plurality of two-dimensional image data captured by the cameras 112 at different positions.
  • the three-dimensional image data is generated by normalizing a plurality of two-dimensional images of the person to be authenticated 50 captured by the cameras 112 at different positions into three-dimensional images.
  • the three-dimensional image data may be generated by projecting and displaying two-dimensional image data, which will be described later, on the three-dimensional shape data.
  • the three-dimensional shape data is acquired by, for example, the camera 112 and the projection device 113.
  • three-dimensional shape data of the person to be authenticated 50 is acquired by capturing an image of the person to be authenticated 50 irradiated with infrared rays or visible light by the projection device 113 using the camera 112 .
  • Three-dimensional shape data is generated as a collection of points having a plurality of coordinate values, obtained by using the camera 112 to image a pattern of infrared rays or visible light made up of a plurality of dots projected onto the face of the person to be authenticated 50 by the projection device 113. Note that the three-dimensional shape data may be acquired by a sensor other than the camera 112 and the projection device 113.
  • the reading device 12 is configured to acquire first information I1 regarding face authentication of the person to be authenticated 50 from the information storage medium 30 held by the person to be authenticated 50.
  • the reading device 12 is an example of a first information acquisition device.
  • the first information I1 is used when performing face authentication of the person to be authenticated 50, which will be described later.
  • Examples of the information storage medium 30 include a smartphone and an NFC (Near Field Communication) tag, which is a type of RFID (Radio Frequency IDentification) tag. Note that when a smartphone is used as the information storage medium 30, a medium other than the smartphone used as the information storage medium 30 is used as the storage device 40, which will be described later.
  • the reading device 12 is configured to read the first information I1 from the information storage medium 30 in a contact or non-contact manner.
  • the reading device 12 includes a two-dimensional code reader 121 that reads a two-dimensional code.
  • An example of a two-dimensional code is a QR code (Quick Response Code) (registered trademark).
  • the first information I1 is stored as a two-dimensional code in a smartphone owned by the person to be authenticated 50, and the two-dimensional code reader 121 reads the first information I1.
  • the reading device 12 includes an NFC reader 122.
  • the first information I1 is stored in an NFC tag owned by the person to be authenticated 50, and the first information I1 is read by the NFC reader 122.
  • the NFC tag is possessed by the person to be authenticated 50 in various shapes and devices, such as a card, a sticker, and an NFC-compatible smartphone.
  • the NFC tag may be implanted in the authenticated person 50 as an NFC microchip.
  • the imaging system 14 further includes a display section 114 and a sensor 115.
  • the display unit 114 is configured to display imaging data of the person to be authenticated 50 acquired by the imaging device 11.
  • the display unit 114 includes, for example, a liquid crystal display, an organic EL display, or the like.
  • the display unit 114 may include a capacitive touch sensor or a resistive touch sensor attached to the entire or a part of the front surface of the display.
  • a guide (frame) G for arranging the face of the person to be authenticated 50 may be displayed on the display unit 114.
  • the sensor 115 is, for example, a proximity sensor or an ambient light sensor.
  • the proximity sensor detects that the face of the person to be authenticated 50 is located within a predetermined range with respect to the imaging system 14 .
  • the imaging device 11 is configured to image the person to be authenticated 50 when the proximity sensor detects that the face of the person to be authenticated 50 is located within a predetermined range with respect to the imaging system 14.
  • the ambient light sensor detects the brightness around the imaging system 14.
  • The light source unit 111 may be configured so that the brightness of the light source is automatically adjusted to an optimal value based on the surrounding brightness detected by the ambient light sensor.
  • the imaging system 14 is not limited to the configuration illustrated in FIG. 2.
  • a smartphone or a personal digital assistant may be used as the imaging system 14.
  • the smartphone may also be used as the information storage medium 30. That is, the imaging system 14 and the information storage medium 30 may be integrated and realized as a function of a smartphone.
  • control device 13 is connected to the storage device 40.
  • the control device 13 and the storage device 40 may be installed in the same server device (not shown) or may be connected via the communication network 20.
  • the storage device 40 stores second information I2 regarding face authentication of the person to be authenticated 50.
  • the second information I2 is used when performing face authentication of the person to be authenticated 50, which will be described later.
  • the control device 13 includes an input section 131, a processor 132, and an output section 133.
  • The input unit 131 is configured as an interface capable of acquiring the imaging data of the face of the person to be authenticated 50 acquired by the imaging device 11 and the first information I1 regarding face authentication of the person to be authenticated 50 acquired by the reading device 12. The input unit 131 is further configured as an interface capable of acquiring the second information I2 related to face authentication of the person to be authenticated 50 from the storage device 40.
  • the processor 132 performs an authentication process of facially authenticating the person to be authenticated 50 based on the imaging data, the first information I1, and the second information I2. In the authentication process, the processor 132 first constructs reference data for face authentication of the person to be authenticated 50 based on the first information I1 and the second information I2.
  • The first information I1 and the second information I2 are each generated based on a plurality of data corresponding to different areas of the face of the person to be authenticated 50, extracted from image data obtained in advance by capturing an image of the face of the person to be authenticated.
  • The first information I1 includes a position/orientation change value set for at least one of the plurality of extracted data when the imaging data is mapped to a predetermined coordinate system (for example, an XYZ coordinate system).
  • the position/orientation change value is a value obtained by changing at least one of the position and orientation of at least one piece of data from a first state to a second state.
  • the second information I2 includes coordinate values of each of the plurality of data.
  • the coordinate value of at least one piece of data set with a position/orientation change value is a coordinate value related to the second state.
  • the coordinate values of data for which no position/orientation change value is set are coordinate values related to the first state.
  • the second information I2 includes coordinate values related to the second state of each of the plurality of data.
  • the position/orientation change value is set to a value that makes it impossible to determine the correlation between the plurality of pieces of biometric data when the plurality of pieces of data are arranged based on the coordinate values included in the second information I2.
  • the position/orientation change value includes, for example, a shift value.
  • the shift value is a value obtained by shifting the position of at least one piece of data from its coordinate value (hereinafter referred to as the first coordinate value) to a coordinate value different from the coordinate value (hereinafter referred to as the second coordinate value).
  • the coordinate values related to the second state of the second information I2 include the second coordinate values of at least one data set with a shift value.
  • the coordinate values related to the first state of the second information I2 include the first coordinate values of data for which no shift value is set.
  • The first coordinate value is a value such that, when the plurality of data of the second information I2 are arranged in a predetermined coordinate system (for example, the XYZ coordinate system) based on the first coordinate values, the correlation between the plurality of data is correctly reproduced as imaging data of the face of the person to be authenticated 50 (target person 51).
  • The second coordinate value is set to a value such that, when the plurality of data of the second information I2 are arranged in the predetermined coordinate system (for example, the XYZ coordinate system) based on the second coordinate values, the correlation between the plurality of data cannot be correctly reproduced as imaging data of the face of the person to be authenticated 50 (target person 51).
  • In other words, the shift value is set to a value that makes it impossible to determine the correlation between the plurality of data in the imaging data when the plurality of data of the second information I2 are arranged based on the coordinate values of the second information I2.
  • the position/orientation change value includes, for example, a rotation inversion value.
  • the rotation reversal value is a value obtained by changing the orientation of at least one data from a first orientation to a second orientation by executing at least one of rotation processing and reversal processing with the origin of at least one data as a base axis.
  • The coordinate values related to the second state of the second information I2 include the coordinate value of the origin after the change to the second orientation.
  • The coordinate values related to the second state further include the coordinate values of the forming points that form the data after the change to the second orientation.
  • The coordinate values related to the first state of the second information I2 include the coordinate values of data that has not been changed to the second orientation.
  • When the imaging data is divided into areas of various shapes formed by a plurality of forming points and extracted, as described later, the coordinate values of such unchanged data include the coordinate values of the plurality of forming points that form that data.
  • The position/orientation change value may include both a shift value and a rotation inversion value.
  • In this case, the coordinate values related to the second state of the second information I2 include the coordinate value of the origin of the at least one data after it has been shifted to the second coordinate value and changed to the second orientation.
  • The coordinate values related to the first state of the second information I2 include the first coordinate values of data whose position and orientation have not been changed.
  • Based on the first information I1 and the second information I2, the processor 132 restores the plurality of data of the second information I2 in a predetermined coordinate system (the coordinate system to which the previously acquired imaging data described above is applied, for example the XYZ coordinate system). For example, when the position/orientation change value includes a shift value, the first coordinate value of each data for which a shift value is set is calculated based on the shift value of the first information I1 and the coordinate value of the second information I2, and the plurality of data are each placed at their first coordinate values in the predetermined coordinate system.
  • When the position/orientation change value includes a rotation inversion value, rotation/inversion processing is performed on the at least one data based on the rotation inversion value of the first information I1 and the coordinate value of the second information I2, and the plurality of data, including the rotated or inverted data, are each arranged in the first orientation in the predetermined coordinate system.
  • When the position/orientation change value includes both a shift value and a rotation inversion value, the first coordinate value of each data is calculated based on the shift value and the rotation inversion value of the first information I1 and the coordinate value of the second information I2, and the plurality of data are arranged at the first coordinate values and in the first orientation.
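  • A minimal sketch of this restoration step, under several assumptions not stated in the patent: a single data piece, an XY-plane rotation angle plus an optional YZ-plane inversion as the rotation inversion value, and the inversion applied before the rotation during registration. The function name restore_data is hypothetical.

```python
import math
from typing import List, Tuple

Point3 = Tuple[float, float, float]

def restore_data(points_2nd: List[Point3], origin_2nd: Point3, shift: Point3,
                 xy_angle_deg: float, flip_yz: bool) -> List[Point3]:
    """Undo the registered transform: first undo the XY-plane rotation, then the
    YZ-plane inversion, relative to the stored origin, and finally undo the shift
    of the origin to return every point to its first-state coordinates."""
    ox, oy, oz = origin_2nd
    sx, sy, sz = shift
    theta = math.radians(-xy_angle_deg)              # inverse of the registered rotation
    first_origin = (ox - sx, oy - sy, oz - sz)       # origin restored to its first coordinate
    restored = []
    for (x, y, z) in points_2nd:
        rx, ry, rz = x - ox, y - oy, z - oz          # position relative to the origin
        ux = rx * math.cos(theta) - ry * math.sin(theta)
        uy = rx * math.sin(theta) + ry * math.cos(theta)
        if flip_yz:
            ux = -ux                                 # undo the mirror across the YZ plane
        restored.append((first_origin[0] + ux, first_origin[1] + uy, first_origin[2] + rz))
    return restored

# Data B8 (No. F007): shift {X-2 Y-5 Z2}, XY rotation 90 degrees, YZ inversion;
# the stored origin (7.0, 2.0, 4.0) and one illustrative forming point are restored.
print(restore_data(points_2nd=[(7.0, 3.0, 4.0)], origin_2nd=(7.0, 2.0, 4.0),
                   shift=(-2.0, -5.0, 2.0), xy_angle_deg=90.0, flip_yz=True))
```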
  • The processor 132 performs the authentication process based on the constructed reference data and the imaging data of the face of the person to be authenticated 50. That is, the processor 132 performs the authentication process by comparing the restored plurality of data with the imaging data of the face of the person to be authenticated 50 acquired by the imaging device 11.
  • the processor 132 outputs the authentication result from the output unit 133 to a system that performs predetermined processing or operation. Thereby, the person to be authenticated 50 can perform a payment procedure using, for example, a settlement (payment) system. Further, the processor 132 outputs the authentication result from the output unit 133 to the imaging system 14. For example, the authentication result is displayed on the display unit 114.
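  • The patent does not specify a particular matching algorithm; the sketch below only illustrates the comparison and decision step with a placeholder similarity measure and an assumed acceptance threshold.

```python
from typing import Callable, Dict, List, Tuple

Point3 = Tuple[float, float, float]

def point_match_ratio(a: List[Point3], b: List[Point3], tol: float = 0.5) -> float:
    """Placeholder similarity: fraction of corresponding points within a tolerance."""
    hits = sum(1 for p, q in zip(a, b)
               if all(abs(pc - qc) <= tol for pc, qc in zip(p, q)))
    return hits / max(len(a), 1)

def authenticate(reference: Dict[str, List[Point3]],
                 captured: Dict[str, List[Point3]],
                 similarity: Callable[[List[Point3], List[Point3]], float] = point_match_ratio,
                 threshold: float = 0.9) -> bool:
    """Compare each restored data piece with the corresponding region of the freshly
    captured imaging data and accept only if the average similarity is high enough."""
    scores = [similarity(reference[k], captured[k]) for k in reference if k in captured]
    return bool(scores) and sum(scores) / len(scores) >= threshold

reference = {"F007": [(9.0, 7.0, 2.0)]}
captured = {"F007": [(9.1, 6.9, 2.0)]}
print(authenticate(reference, captured))   # True with the placeholder measure
```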
  • The first information I1 and the second information I2 are generated, for example, by the control device 13 of the authentication system 10 and are stored in the information storage medium 30 and the storage device 40, respectively.
  • FIG. 3 illustrates the flow of the generation process of the first information I1 and the second information I2 executed by the processor 132 of the control device 13.
  • the processor 132 acquires imaging data obtained by imaging the face of the subject 51 who is registering authentication information (i.e., first information I1, second information I2) used for face authentication (STEP 1).
  • the face of a subject 51 is imaged by the imaging device 11.
  • the face of the subject 51 may be imaged by an imaging device different from the imaging device 11.
  • In this case, since the data restored at the time of authentication is compared with the imaging data captured by the imaging device 11 at the time of authentication, it is preferable that the image be captured under the same imaging conditions as those of the imaging device 11.
  • the processor 132 extracts a plurality of data used for authentication from the image data of the face of the subject 51 (STEP 2). Specifically, the processor 132 divides the captured image data of the face of the subject 51 into a plurality of pieces of data, and extracts the divided data as data used for authentication. Each piece of data is extracted to a size that makes it impossible to recognize the facial features of the subject 51 (that is, the individual cannot be identified) by itself.
  • FIG. 4 shows areas A1 to A9 from which a plurality of pieces of data used for authentication are extracted from the face of the target person 51.
  • FIG. 5 shows data B1 to B9 corresponding to areas A1 to A9 extracted from the imaging data of the face of the subject 51.
  • data B1 to B9 illustrated in FIG. 5 are data extracted from three-dimensional shape data as imaging data.
  • each of the data B1 to B9 is shown as a predetermined shape surrounded by a line, but in reality, it is composed of a collection of formation points having one or more coordinate values.
  • each of the data B1 to B9 is extracted from the imaging data by dividing it into areas A1 to A9 programmed in advance.
  • the size of each area A1 to A9 is appropriately set so that an individual cannot be identified from each data B1 to B9.
  • Each area may be set to be discontinuous with adjacent areas, as with areas A1, A2, and A5 to A9, or may be set to be continuous with an adjacent area, as with areas A3 and A4.
  • When an area includes an element that can identify the facial features of the target person 51, the area may be further divided into sizes that do not allow identification of the facial features of the subject 51, and data corresponding to each divided area may be extracted.
  • Examples of elements that can identify the facial features of the subject 51 include distinctive marks such as moles, tattoos, and scars, or the iris of the eyeball (in the case of two-dimensional or three-dimensional image data), and features such as deep scars on the face (in the case of three-dimensional shape data).
  • When the data B1 to B9 contain elements that can identify the facial features of the subject 51, those data may be subjected to predetermined image processing (such as blacking out) or may be excluded from extraction as data for authentication.
  • The shapes of the data B1 to B9 are not limited to the shapes shown in FIG. 5; for example, one forming point may constitute one area, or the data may be extracted after being divided into areas of various shapes formed by a plurality of forming points.
  • the processor 132 sets a shift value for at least one of the plurality of extracted data B1 to B9 (STEP 3).
  • shift values are set for all data B1 to B9.
  • a second coordinate value different from the first coordinate value of each data B1 to B9 is set, and a shift value from the first coordinate value to the second coordinate value is calculated.
  • the shift values of each data B1 to B9 may be set without setting the second coordinate values.
  • the second coordinate value is calculated from the first coordinate value and the shift value.
  • the shift value or the second coordinate value is randomly determined by the program.
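  • A minimal sketch of the random shift assignment in STEP 3; the value range, the non-zero constraint, and the function name are assumptions, not part of the patent.

```python
import random
from typing import Dict, Iterable, Tuple

Shift = Tuple[int, int, int]

def assign_shift_values(data_ids: Iterable[str], low: int = -10, high: int = 10) -> Dict[str, Shift]:
    """Randomly pick a non-zero shift value (dX, dY, dZ) for each data piece; the
    second coordinate value is then the first coordinate value plus this shift."""
    shifts: Dict[str, Shift] = {}
    for data_id in data_ids:
        shift = (0, 0, 0)
        while shift == (0, 0, 0):                        # make sure the data actually moves
            shift = (random.randint(low, high),
                     random.randint(low, high),
                     random.randint(low, high))
        shifts[data_id] = shift
    return shifts

first_coordinate = (9, 7, 2)                              # illustrative first coordinate value
shift = assign_shift_values(["F007"])["F007"]
second_coordinate = tuple(c + d for c, d in zip(first_coordinate, shift))
print(shift, second_coordinate)
```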
  • FIG. 6 illustrates each data B1 to B9 in the XYZ coordinate system.
  • formation points P1 to P9 forming each data B1 to B9 and their first coordinate values PA1 to PA9 are shown.
  • each of the data B1 to B9 has an origin set therein, which serves as a reference axis for rotation processing and reversal processing, which will be described later.
  • the coordinate position of the origin of each data item B1 to B9 is randomly determined by the program.
  • formation points P1 to P9 are set as the origin set to each data B1 to B9.
  • Data B8 shows a second coordinate value PB8 of the formation point and the origin, and a shift value S8 from the first coordinate value PA8 to the second coordinate value PB8.
  • For the other data, each formation point and origin is likewise changed from the first coordinate values PA1 to PA7 and PA9 to the second coordinate values PB1 to PB7 and PB9 (not shown) based on the set shift values S1 to S7 and S9.
  • When each of the data B1 to B9 is composed of two or more forming points, it also includes forming points (not shown) other than the forming points P1 to P9; each such forming point has a first coordinate value, and a second coordinate value is also set for it.
  • an XYZ coordinate system is used as the predetermined coordinate system, but an encrypted coordinate system, a polar coordinate system, etc. may also be adopted.
  • An encrypted coordinate system is a coordinate system in which the coordinate values on the XY axes or the XYZ axes are intentionally arranged irregularly.
  • Polar coordinate systems include circular coordinate systems, cylindrical coordinate systems, spherical coordinate systems, and the like.
  • the processor 132 performs a predetermined conversion process on the data B1 to B9 (STEP 4).
  • the predetermined conversion process includes, for example, at least one of a rotation process and an inversion process.
  • In this example, rotation processing is performed to rotate the orientations of the data B1 to B9 by predetermined angles in the XYZ coordinate system using the origins P1 to P9 as the base axes.
  • the predetermined angle is randomly set by the program.
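  • A minimal sketch of the conversion process in STEP 4 for a single data piece, assuming the forming points are expressed relative to the origin and that the conversion consists of an optional YZ-plane inversion followed by a random XY-plane rotation; the angle set {0, 90, 180, 270} is an assumption.

```python
import math
import random
from typing import List, Tuple

Point3 = Tuple[float, float, float]

def convert_data(rel_points: List[Point3]) -> Tuple[List[Point3], int, bool]:
    """Apply an optional YZ-plane inversion followed by a random XY-plane rotation
    about the origin of the data; return the converted points together with the
    rotation angle and inversion flag (the rotation inversion value)."""
    angle_deg = random.choice([0, 90, 180, 270])          # assumed set of angles
    flip_yz = random.choice([True, False])
    theta = math.radians(angle_deg)
    converted = []
    for (x, y, z) in rel_points:
        if flip_yz:
            x = -x                                        # mirror across the YZ plane
        rx = x * math.cos(theta) - y * math.sin(theta)
        ry = x * math.sin(theta) + y * math.cos(theta)
        converted.append((rx, ry, z))
    return converted, angle_deg, flip_yz

points, angle, flipped = convert_data([(0.0, 1.0, 0.0), (1.0, 0.0, 0.0)])
print(points, angle, flipped)
```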
  • The processor 132 associates the shift values S1 to S9 of the data B1 to B9 and the information of the conversion process (for example, the rotation inversion value of the rotation process and/or the inversion process) with the identification information of each of the data B1 to B9, and outputs them to the information storage medium 30 of the subject 51 as the first information I1 (STEP 5).
  • That is, the shift values S1 to S9 of the data B1 to B9 and the conversion processing information are stored in the information storage medium 30 in association with the identification numbers of the data B1 to B9.
  • The "identification information" is a unique ID that identifies each piece of saved data, and is randomly generated by the program for each target person 51 so that the identification information of the multiple pieces of extracted data does not consist of consecutive numbers.
  • the first information I1 is output to the information storage medium 30 via the imaging system 14.
  • the imaging system 14 includes a writing device 15.
  • the two-dimensional code reader 121 shown in FIG. 2 may also have a function of reading a two-dimensional code as the writing device 15.
  • the NFC reader 122 may also have an NFC data writing function as the writing device 15.
  • Processor 132 outputs first information I1 to imaging system 14 via communication network 20.
  • the writing device 15 of the imaging system 14 outputs the first information I1 to the information storage medium 30 of the subject 51.
  • The first information I1 is transmitted, for example, by short-range wireless communication from the two-dimensional code reader 121 (see the broken line in FIG. 1) and is loaded into the smartphone of the target person 51 in the form of a two-dimensional code.
  • When the information storage medium 30 held by the target person 51 is an NFC tag, the first information I1 is output by short-range wireless communication (see the broken line in FIG. 1) from the NFC reader 122, which has a writing function, to the NFC tag owned by the target person 51 or to the NFC microchip embedded in the subject 51.
  • The first information I1 of data B8 stored in the information storage medium 30 is No. F007_SFT{X-2Y-5Z2}_FLP{YES_YZ}_TRN{YES_XY90YZ0XZ0}_END (1).
  • F007 is an identification number (an example of identification information) assigned to data B8.
  • SFT is the shift value S8 of data B8.
  • the shift values S8 set in the data B8 are X-2, Y-5, and Z2. This indicates that the shift value from the first coordinate value PA8 to the second coordinate value PB8 is shifted by -2 in the X-axis direction, -5 in the Y-axis direction, and +2 in the Z-axis direction.
  • the values indicated by FLP and TRN are rotation inversion values indicating the conversion process, FLP indicates the presence or absence of inversion and the direction of inversion, and TRN indicates the presence or absence of rotation and the rotation angle.
  • data B8 indicates that there is YZ plane inversion. That is, it shows that the data B8 is reversed in the YZ plane with respect to the origin P8. Further, in this example, data B8 indicates that the data B8 has been rotated by 90 degrees in the XY plane direction, 0 degree in the YZ plane direction, and 0 degree in the XZ plane direction with respect to the origin P8.
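  • A small parser for the record format illustrated in (1); the field layout follows the example above, but the regular expressions are an assumed reading of that format rather than a specification from the patent.

```python
import re
from typing import Dict

def parse_first_info(record: str) -> Dict[str, object]:
    """Parse a first-information record in the style of example (1)."""
    ident = re.search(r"No\.\s*([A-Za-z0-9]+)", record).group(1)
    sft = re.search(r"SFT\{X(-?\d+)Y(-?\d+)Z(-?\d+)\}", record)
    flp = re.search(r"FLP\{(YES|NO)(?:_(\w+))?\}", record)
    trn = re.search(r"TRN\{(YES|NO)(?:_XY(-?\d+)YZ(-?\d+)XZ(-?\d+))?\}", record)
    return {
        "id": ident,
        "shift": tuple(int(v) for v in sft.groups()),
        "flip": flp.group(2) if flp.group(1) == "YES" else None,
        "rotation": ({"XY": int(trn.group(2)), "YZ": int(trn.group(3)), "XZ": int(trn.group(4))}
                     if trn.group(1) == "YES" else None),
    }

record = "No. F007_SFT{X-2Y-5Z2}_FLP{YES_YZ}_TRN{YES_XY90YZ0XZ0}_END"
print(parse_first_info(record))
# {'id': 'F007', 'shift': (-2, -5, 2), 'flip': 'YZ', 'rotation': {'XY': 90, 'YZ': 0, 'XZ': 0}}
```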
  • The processor 132 associates the coordinate values of the formation points forming each of the data B1 to B9 and the coordinate values of the origins P1 to P9, which are the base axes of the rotation processing, with the identification information of each of the data B1 to B9, and outputs them to the storage device 40 as the second information I2 (STEP 6).
  • In this example, because shift values are set for all of the data B1 to B9, the second coordinate values PB1 to PB9 of the data B1 to B9 and the second coordinate values PB1 to PB9 of the origins P1 to P9 are included in the second information I2.
  • For data for which no shift value is set, the first coordinate value is included in the second information I2. Note that the output processing of the second information I2 in STEP 6 may be performed before the output processing of the first information I1 in STEP 5.
  • FIG. 7 illustrates a database stored in the storage device 40.
  • each data item B1 to B9 having a second coordinate value is assigned an identification number, and the second coordinate values of one or more forming points and the origin are stored together.
  • For data B8, the identification number No. F007 is assigned and is saved together with the second coordinate value PB8 of the formation point P8 and the origin P8.
  • In FIG. 7, only the second coordinate value PB8 of the formation point P8 and origin P8 of data B8 is shown, but for the other data B1 to B7 and B9, the second coordinate values PB1 to PB7 and PB9 of the formation points P1 to P7 and P9 and of the origins P1 to P7 and P9 are similarly stored.
  • In FIG. 7, each of the data B1 to B9 is shown as having a predetermined shape, but in reality each is expressed as the second coordinate values of its one or more forming points and of its origin, written with alphanumeric characters and symbols.
  • The second information I2 of data B8 can be expressed as No. F007_{X9.0Y5.0Z2X7.0Y2.0Z4.0X9.0Y0Z2X11.0Y-1.0Z0.5}_ORG{X7.0Y2.0Z4.0} (2).
  • the numerical values written following F007 indicate the second coordinate values of a plurality of formation points (four formation points in this example) that constitute data B8. Further, the value indicated by ORG is the origin of the data B8, and indicates the second coordinate value of the origin P8, which is the base axis of the conversion process (rotation process and inversion process in this example). Although there are actually more formation points in the data B8, for the sake of simplicity, only the second coordinate values of four formation points including the origin will be shown and explained.
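  • A companion parser for the record format illustrated in (2); the run-together X/Y/Z coordinate notation is taken from the example above, and the regular expression is an assumed reading of it.

```python
import re
from typing import Dict, List, Tuple

COORD = re.compile(r"X(-?\d+(?:\.\d+)?)Y(-?\d+(?:\.\d+)?)Z(-?\d+(?:\.\d+)?)")

def parse_second_info(record: str) -> Dict[str, object]:
    """Parse a second-information record in the style of example (2)."""
    ident = re.search(r"No\.\s*([A-Za-z0-9]+)", record).group(1)
    points_part = re.search(r"_\{([^}]*)\}", record).group(1)
    origin_part = re.search(r"ORG\{([^}]*)\}", record).group(1)
    points: List[Tuple[float, ...]] = [
        tuple(float(v) for v in m) for m in COORD.findall(points_part)]
    origin = tuple(float(v) for v in COORD.findall(origin_part)[0])
    return {"id": ident, "points": points, "origin": origin}

rec = ("No. F007_{X9.0Y5.0Z2X7.0Y2.0Z4.0X9.0Y0Z2X11.0Y-1.0Z0.5}"
       "_ORG{X7.0Y2.0Z4.0}")
print(parse_second_info(rec))
# {'id': 'F007', 'points': [(9.0, 5.0, 2.0), (7.0, 2.0, 4.0), (9.0, 0.0, 2.0),
#  (11.0, -1.0, 0.5)], 'origin': (7.0, 2.0, 4.0)}
```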
  • The data B1 to B9 of the subject 51 are further stored in the database in association with the region where the imaging data was acquired (for example, JPN: Japan) and with the version of the computer program of the control device used when the data B1 to B9 were acquired.
  • the storage device 40 stores only data B1 to B9 of the subject 51, but data of other subjects may be stored in the same manner.
  • The storage device 40 can also be configured to store dummy data (artificially created fake data) that is not associated with any target person, thereby preventing only the data of the target person 51 from being stored.
  • In the above example, the processor 132 sets shift values for all of the data B1 to B9 extracted from the imaging data in STEP 3 of FIG. 3.
  • However, a shift value may be set for at least one of the data B1 to B9.
  • Alternatively, the configuration may be such that only the rotation process or the inversion process in STEP 4 is executed without setting shift values for any of the data. If no shift value is set, then, for example, in the first information I1 stored in the information storage medium 30 expressed in (1) above, the shift value in the SFT item is zero, and the rotation inversion values are set in the FLP and TRN items.
  • In that case, in the second information I2 stored in the storage device 40 represented by (2) above, the numerical values written following the identification number No. F007 indicate the second coordinate values of the plurality of formation points and the origin that constitute the data B8.
  • Similarly, in the above example the processor 132 performs a predetermined conversion process on all of the data B1 to B9 extracted from the imaging data in STEP 4 of FIG. 3.
  • However, the predetermined conversion process may be performed on at least one of the data B1 to B9.
  • Alternatively, the configuration may be such that only the shift value setting in STEP 3 is performed without converting any of the data. If the conversion process is not executed, then, for example, in the first information I1 stored in the information storage medium 30 expressed in (1) above, NO is shown in the FLP and TRN items, and the shift value is set in the SFT item.
  • In that case as well, predetermined image processing (blacking out, etc.) or processing so that the data is not extracted as authentication data may be performed as appropriate.
  • the first information I1 and the second information I2 may be generated based on the two-dimensional image data or three-dimensional image data.
  • As shown in FIG. 8, the processor 132 extracts data from two-dimensional image data, three-dimensional image data, or three-dimensional image data generated from three-dimensional shape data and two-dimensional image data, with each area defined as a block whose side on the XY coordinate axes is the minimum coding unit of 8 pixels, or a block whose side is a multiple of 8 pixels. In FIG. 8, an area surrounded by one square indicates one data area.
  • The coordinate value of the first pixel PX of each data is set as the coordinate value at which that data is arranged. Further, the first pixel PX of each data is set as the origin, which is the base axis of rotation processing and inversion processing. That is, the coordinate values of the first pixel PX are both the coordinate values of the data and the coordinate values of the origin. Note that in FIG. 8, data B17 and the first pixel PX, which is its origin, are shown enlarged as an example of extracted data. When iris or other identifying information is included, as in data B17, and iris recognition is not used, the program may be configured to automatically fill in (mask) the iris and other such information.
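  • A minimal sketch of this block extraction using NumPy, assuming a grayscale image whose height and width are multiples of the block size; recording only the top-left (first) pixel coordinate as the key and origin follows the description above.

```python
import numpy as np
from typing import Dict, Tuple

def extract_blocks(image: np.ndarray, block: int = 8) -> Dict[Tuple[int, int], np.ndarray]:
    """Split a 2-D image into block x block areas; each key is the XY coordinate of
    the first (top-left) pixel PX of the area, which also serves as the origin for
    the later rotation/inversion processing."""
    height, width = image.shape
    areas: Dict[Tuple[int, int], np.ndarray] = {}
    for y in range(0, height - block + 1, block):
        for x in range(0, width - block + 1, block):
            areas[(x, y)] = image[y:y + block, x:x + block].copy()
    return areas

image = np.zeros((64, 64), dtype=np.uint8)   # stand-in for captured face image data
areas = extract_blocks(image)
print(len(areas))                            # 64 areas of 8 x 8 pixels
```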
  • the processor 132 sets a shift value for at least one of the plurality of extracted data. That is, a shift value of the coordinate value of the first pixel PX is set for at least one of the plurality of data.
  • the shift value like the three-dimensional shape data, is randomly determined by the program.
  • the processor 132 performs a predetermined conversion process (rotation process or inversion process) on at least one of the data. That is, at least one of the plurality of data is subjected to rotation processing or inversion processing using the first pixel PX as the origin.
  • When rotation or inversion processing is performed, the coordinate values of the origin (the data coordinate values) remain the same, but the state of the image displayed within the image data changes (it is rotated or inverted).
  • That is, the coordinate values of the origin after the rotation or inversion processing are the coordinate values of the first pixel PX, the same as the coordinate values of the origin before the processing (the data coordinate values); a sketch of this step follows below.
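  The rotation/inversion step can be sketched as follows. The function name, the tile layout, and the reading of "Y-axis inversion" as a left-right mirror are assumptions continuing the previous sketch; the source only describes the processing in prose.

```python
import numpy as np

def apply_conversion(tile, flip_axis=None, rot_deg=0):
    """Apply inversion and/or clockwise rotation to one block.

    The pixel array is transformed, but the recorded origin (the first
    pixel PX coordinate) is deliberately left untouched: the data
    coordinate value stays the same while the displayed image changes."""
    px = tile["pixels"]
    if flip_axis == "Y":          # mirror about the Y axis (left-right)
        px = np.fliplr(px)
    elif flip_axis == "X":        # mirror about the X axis (up-down)
        px = np.flipud(px)
    if rot_deg % 360:
        # np.rot90 rotates counter-clockwise, so negate for clockwise degrees
        px = np.rot90(px, k=(-rot_deg // 90) % 4)
    tile["pixels"] = px
    tile["FLP"] = flip_axis or "NO"   # recorded in the first information I1
    tile["TRN"] = rot_deg
    return tile
```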
  • The processor 132 associates the shift value and the rotation inversion value of each data with the identification information of that data, and outputs them to the information storage medium 30 of the subject 51 as the first information I1.
  • the processor 132 outputs the coordinate values of each data (that is, the coordinate values of the origin) to the storage device 40 as the second information I2.
  • The coordinate value of data for which a shift value is set is the second coordinate value (that is, the second coordinate value of the origin), whereas the coordinate value of data for which no shift value is set is the first coordinate value (that is, the first coordinate value of the origin); in the latter case, the first coordinate value is included in the second information I2 as the data coordinate value.
  • the shift value may not be set for the data, and only the rotation process or the inversion process in STEP 4 may be executed.
  • the configuration may be such that in STEP 4, only the shift value setting in STEP 3 is performed without performing a predetermined conversion process on the data.
  • FIG. 9 shows data B17 that has been subjected to shift processing, rotation processing, and inversion processing.
  • Data B17 is inverted about the Y axis with respect to the first pixel PX, which is the origin, rotated clockwise by 90 degrees in the XY plane, and then placed so that the coordinate values of the first pixel PX become X: -20, Y: -10. The XY coordinate values of the first pixel PX are then output to the storage device 40 as the second information I2.
  • the coordinate values of the origin after rotation processing and inversion processing are performed are the coordinate values of the first pixel PX.
  • In this case, the information regarding data B17 in the second information I2 is No. F017_{X-20Y-10}_IMG{FFDA 000C 0301...7517}_END ... (3). Note that in the information shown in (3) above, some of the IMG information is omitted.
  • The number shown on the far left, No. F017, is an identification number (an example of identification information) assigned to data B17. Following the identification number, the second coordinate value of the first pixel PX is shown, which is both the second coordinate value of data B17 and the second coordinate value of the origin that serves as the base axis of the conversion process (rotation process or inversion process). Further, the value indicated by IMG indicates a segment that includes color information, etc. of each pixel constituting the block of data B17 (in this example, a block of 80 pixels vertically by 80 pixels horizontally).
  • The first information I1 of data B17 is No. F017_SFT{X-10Y-12}_FLP{YES_Y}_TRN{YES_90}_IMG{FFD8 FFE0 0010...FFDB...FFC0...FFC4...FFC4...FFC4...FFD9}_END ... (4). Note that in the information shown in (4) above, some of the IMG information is omitted.
  • Following the identification number, the shift value of data B17 is shown. The shift values set for data B17 are X: -10 and Y: -12, indicating that the first pixel PX of data B17 is shifted from the first coordinate value to the second coordinate value by -10 in the X-axis direction and -12 in the Y-axis direction.
  • The FLP and TRN items indicate that data B17 is inverted about the Y axis around the first pixel PX, which is the origin, and rotated clockwise by 90 degrees in the XY plane around that same origin (a parsing sketch of records (3) and (4) follows below).
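  The example records (3) and (4) suggest a simple field layout. The following parser is only a rough sketch: it assumes that the delimiters rendered as "⁇" in the extracted text are braces and that fields are NAME{value} tokens joined by underscores after the identification number; the real wire format is shown only by example.

```python
import re

def parse_record(record):
    """Very rough parser for records shaped like examples (3) and (4)."""
    ident = record.split("_", 1)[0]      # e.g. "F017"
    fields = dict(re.findall(r"([A-Z]+)\{([^}]*)\}", record))
    return ident, fields

ident, fields = parse_record(
    "F017_SFT{X-10Y-12}_FLP{YES_Y}_TRN{YES_90}_IMG{FFD8 FFE0 0010}_END")
# fields -> {'SFT': 'X-10Y-12', 'FLP': 'YES_Y', 'TRN': 'YES_90', 'IMG': 'FFD8 FFE0 0010'}
```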
  • A JFIF file is composed of data units called segments, and the first information I1 may include information on some of the segments that make up the JFIF file.
  • the value indicated by IMG includes a segment that includes information such as the compression method and the number of vertical and horizontal pixels regarding the data B17.
  • The image data cannot be restored from the first information I1 alone because the data is incomplete. Moreover, because the coordinate position at which the first pixel, which is the origin of the image data, is to be placed is also unknown from the first information I1 alone, the image data cannot be accurately restored. Therefore, even if either the first information I1 or the second information I2 is leaked, it is possible to prevent an individual from being identified.
  • The first information I1 of the data B8 extracted from three-dimensional image data is expressed, for example, as No. F007_SFT{X-2Y-5Z2}_FLP{YES_YZ}_TRN{YES_XY90YZ0XZ0}_IMG{FFD8 FFE0 0010...FFDB...FFC0...FFC4...FFC4...FFC4...FFD9}..._END... (5). Note that in the information shown in (5) above, some of the IMG information is omitted.
  • The second information I2 of the data B8 extracted from three-dimensional image data is expressed, for example, as No. F007_{X9.0Y5.0Z2X7.0Y2.0Z4.0X9.0Y0Z2X11.0Y-1.0Z0.5}_ORG{X7.0Y2.0Z4.0}_IMG{X4.5Y7.0_FFDA 001A 0101...55 26}..._END... (6). Note that in the information shown in (6) above, some of the IMG information is omitted.
  • Three-dimensional image data is data in which two-dimensional image data is projected and displayed on the XY coordinate axes of three-dimensional shape data.
  • In this case, the first information I1 may include information such as the shift value and rotation inversion value of the three-dimensional shape data, as well as segment information indicating the compression method and the numbers of vertical and horizontal pixels of the two-dimensional image data.
  • The second information I2 may include, in addition to information on the second coordinate values of the origin and of the plurality of formation points that constitute the three-dimensional shape data, segment information indicating the XY coordinate position of the first pixel, which is the origin of the two-dimensional image data projected and displayed on the XY coordinate axes of the three-dimensional shape data, and the color information of each pixel.
  • From the XY coordinate values of the three-dimensional shape data of data B8 with identification number No. F007, the position of each block of the two-dimensional image data to be projected and displayed on the three-dimensional shape data can be calculated.
  • the image data of data B8 corresponds to, for example, a rectangular block indicated by a thick frame.
  • The XY position (origin) of the thick-framed block and the information in the image data are separated into the first information I1 and the second information I2 of identification number No. F007 and stored in the information storage medium 30 and the storage device 40, respectively.
  • the block position and size of the image data are calculated based on the XY coordinate values of the three-dimensional shape data.
  • pixels located on the XY coordinates of the three-dimensional shape data within the block of image data are specified. For example, as shown by the circled numbers in FIG. 10, 30 pixels (an example simplified for purposes of explanation) are identified.
  • the XY coordinate values (X4.5, Y7.0) with the first pixel of the block as the origin are stored in the storage device 40 as second information I2. Further, shift values, conversion processing information (rotation processing and inversion processing), and various parameters of image information are stored in the information storage medium 30 as first information I1.
  • Since one block of image data is composed of 8 × 8 pixels, the second information I2 may include segment information indicating the color information of each pixel forming the image data.
  • In the second information I2, the value indicated by IMG includes the color information of each pixel. Note that the color information of the other pixels within the block of image data that are not located on the XY coordinates of the three-dimensional shape data is extracted as zero (black, with R = 0, G = 0, B = 0). R = 0, G = 0, B = 0 is just an example, and other colors may be used (a masking sketch follows below).
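  A minimal sketch of the color-extraction rule above, assuming the block is an RGB NumPy array and that the pixel positions lying on the 3-D shape coordinates are supplied as a list; both assumptions are illustrative, not from the source.

```python
import numpy as np

def extract_projected_colors(block_rgb, on_shape_coords):
    """Keep colour information only for pixels lying on the XY coordinates
    of the three-dimensional shape data; every other pixel is written out
    as R=0, G=0, B=0 (black), mirroring the rule described above."""
    out = np.zeros_like(block_rgb)
    for x, y in on_shape_coords:
        out[y, x] = block_rgb[y, x]
    return out

block = np.random.randint(0, 256, size=(8, 8, 3), dtype=np.uint8)
masked = extract_projected_colors(block, [(0, 0), (3, 5), (7, 7)])
```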
  • FIG. 11 illustrates the flow of authentication processing executed by the processor 132 of the control device 13.
  • the processor 132 acquires image data of the person to be authenticated 50 captured by the imaging device 11 (STEP 11). For example, when a person to be authenticated 50 using a payment system or the like is located within a predetermined range with respect to the imaging system 14, the face of the person to be authenticated 50 is captured by the imaging device 11. Note that acquisition of the imaging data in STEP 11 may be performed after STEP 12 or STEP 13, which will be described later.
  • the processor 132 acquires the first information I1 read by the reading device 12 and stored in the information storage medium 30 owned by the person to be authenticated 50 (STEP 12). For example, when the person to be authenticated 50 brings a smartphone displaying a two-dimensional code close to the two-dimensional code reader 121, the two-dimensional code reader 121 reads the first information I1.
  • the processor 132 acquires second information I2 regarding the authentication of the person to be authenticated 50 from the storage device 40 (STEP 13). Specifically, the processor 132 acquires data corresponding to the identification information of the authentication data included in the first information I1 from the storage device 40.
  • the processor 132 restores the authentication data of the person to be authenticated 50 on a predetermined coordinate system based on the first information I1 and the second information I2 (STEP 14).
  • As the predetermined coordinate system, the coordinate system used when generating the first information I1 and the second information I2 is used.
  • Information on the coordinate system may be included in the first information I1.
  • the processor 132 arranges the data B8 in a predetermined coordinate system based on the information on the second coordinate value of the data B8 included in the second information I2. Specifically, the data B8 is arranged based on all the second coordinate values of the plurality of forming points that constitute the data B8 included in the second information I2. Then, based on the information of the shift value S8 corresponding to the data B8 included in the first information I1, the position of the data B8 arranged in the coordinate system is moved to the first coordinate value. Specifically, all of the plurality of formation points forming the data B8 are moved to the first coordinate value.
  • In FIGS. 12 and 13, only data B8 is shown, and only its first coordinate value PA8, second coordinate value PB8, and shift value S8 are shown.
  • Next, using the origin P8 included in the second information I2 as the reference axis and the conversion processing information included in the first information I1 (for example, the rotation inversion value of the rotation or inversion processing), the rotation and inversion of data B8 are adjusted. Thereby, as illustrated in FIG. 13, data B8 is placed (restored) at the correct position and correct angle on the coordinate system (the same position and angle as in FIG. 6). The two-dimensional image data is similarly restored to the position and angle it had before the shift and conversion processing (not shown); a restoration sketch follows below.
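  The restoration in STEP 14 can be sketched as the inverse of the earlier sketches. The record keys (`coord`, `SFT`, `FLP`, `TRN`) and the tile layout continue the assumptions made above and are not taken from the source.

```python
import numpy as np

def restore_block(tile, first_info, second_info):
    """Undo the stored transformations for one block (sketch of STEP 14)."""
    # 1. place the block at the second coordinate taken from I2
    x2, y2 = second_info["coord"]
    # 2. move it back to the first coordinate using the shift value from I1
    sx, sy = first_info["SFT"]
    x1, y1 = x2 - sx, y2 - sy
    # 3. undo rotation, then inversion, around the origin (first pixel PX)
    px = tile["pixels"]
    if first_info.get("TRN", 0) % 360:
        # counter-clockwise rotation undoes the earlier clockwise rotation
        px = np.rot90(px, k=(first_info["TRN"] // 90) % 4)
    if first_info.get("FLP") == "Y":
        px = np.fliplr(px)
    elif first_info.get("FLP") == "X":
        px = np.flipud(px)
    tile["pixels"] = px
    tile["origin"] = (x1, y1)
    return tile
```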
  • the processor 132 repeats the process in STEP 14 until the restoration of all data B1 to B9 is completed (YES in STEP 15). As a result, all data B1 to B9 are rearranged (restored) on the coordinate system (see FIG. 6). This generates reference data.
  • The processor 132 then performs authentication processing based on the restored data and the imaging data acquired in STEP 11 (STEP 16). Specifically, the processor 132 compares the restored reference data with the imaged data, and establishes authentication when the degree of matching between the two exceeds a threshold value. In other words, the processor 132 authenticates the person to be authenticated 50, who caused the imaging device 11 to acquire imaging data of his or her face, as a registered user when the degree of matching between the two exceeds the threshold value (a toy matching sketch follows below).
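  The source only requires "a degree of matching" compared against a threshold and does not define the metric. The following toy comparison (1 minus the mean absolute pixel difference) and the threshold value are therefore purely assumptions used to illustrate the decision step.

```python
import numpy as np

def matching_degree(reference, captured):
    """Toy similarity score in [0, 1]; not the metric used by the patent."""
    a = reference.astype(np.float64) / 255.0
    b = captured.astype(np.float64) / 255.0
    return 1.0 - float(np.mean(np.abs(a - b)))

THRESHOLD = 0.95  # hypothetical value

def authenticate(reference, captured, threshold=THRESHOLD):
    return matching_degree(reference, captured) >= threshold
```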
  • the control device 13 acquires the first information I1 from the information storage medium 30 and acquires the second information I2 from the storage device 40. Then, the control device 13 restores the plurality of data B1 to B9 on a predetermined coordinate system based on the first information I1 and the second information I2.
  • In the information storage medium 30, the shift values S1 to S9 of the plurality of data B1 to B9 are stored as the first information I1, whereas the coordinate values of the plurality of data B1 to B9 are not stored.
  • the storage device 40 stores coordinate values of a plurality of data B1 to B9 as second information I2, but does not store shift values. As a result, it is not possible to restore a plurality of pieces of data to the original captured data using only the authentication information stored in either the information storage medium 30 or the storage device 40. Therefore, even if either the first information I1 stored in the information storage medium 30 or the second information I2 stored in the storage device 40 is leaked, it is possible to prevent an individual from being identified.
  • rotation processing and inversion processing are performed on the plurality of data B1 to B9.
  • the information storage medium 30 stores, as first information I1, the rotation reversal values of the rotation processing and reversal processing, but the coordinate values of the origins P1 to P9, which are the base axes of the rotation processing and reversal processing, are not stored.
  • the storage device 40 stores, as second information I2, the coordinate values of the origins P1 to P9, which are the base axes of rotation processing and reversal processing, but does not store rotation reversal values. Therefore, even if the first information I1 or the second information I2 is known to another person, it is possible to prevent the other person from restoring the plurality of data.
  • the plurality of data B1 to B9 are data extracted from the imaging data of the subject 51 (person 50 to be authenticated) after being divided into a size that does not allow the characteristics of the subject 51 (person 50 to be authenticated) to be specified.
  • the two-dimensional image data is extracted data that is divided into sizes (blocks) in which the characteristics of the subject 51 (authenticated person 50) cannot be specified.
  • the plurality of data B1, B2, B5 to B9 are data extracted so as not to constitute a continuous area of the person to be authenticated 50 when combined.
  • the two-dimensional image data is data extracted so that the image data does not constitute a continuous area.
  • the storage device 40 may also store second information I2 regarding biometric authentication of a person to be authenticated who is different from the person to be authenticated 50.
  • Even if the second information I2 in the storage device 40 becomes known to another person, the second information I2 of the person to be authenticated 50 cannot be identified from among the second information I2 of multiple people, and it is therefore possible to prevent the data of the person to be authenticated 50 from being restored.
  • the processor 132 of the control device 13 having the various functions described above can be realized by a general-purpose microprocessor that operates in cooperation with a general-purpose memory.
  • general-purpose microprocessors include CPUs, MPUs, and GPUs.
  • general-purpose memory include ROM and RAM.
  • the ROM may store a computer program that executes the above-described processing.
  • the computer program may include an artificial intelligence (AI) program.
  • AI program is a program (trained model) constructed by supervised or unsupervised machine learning (particularly deep learning) using a multilayer neural network.
  • ROM is an example of a non-transitory computer-readable medium.
  • the general-purpose microprocessor specifies at least a part of the computer program stored on the ROM, loads it on the RAM, and executes the above-described processing in cooperation with the RAM.
  • the computer program described above may be preinstalled in the general-purpose memory, or may be downloaded from an external server device via a wireless communication network and then installed in the general-purpose memory.
  • the processor 132 may be realized by a dedicated integrated circuit such as a microcontroller, ASIC, or FPGA that is capable of executing the computer program described above. In this case, the above-mentioned computer program is preinstalled in the memory element included in the dedicated integrated circuit.
  • Each processor may also be implemented by a combination of a general purpose microprocessor and a special purpose integrated circuit.
  • one formation point among the plurality of formation points forming the data is described as the origin, but the origin does not have to be the formation point forming the data.
  • the authentication system 10 performs face authentication based on image data of the face of the person to be authenticated 50 acquired using the imaging device 11.
  • the authentication system 10 may be configured to perform biometric authentication other than face authentication.
  • authentication system 10 may be configured to perform fingerprint authentication.
  • fingerprint authentication may be performed based on image data of the fingerprint of the person to be authenticated 50 acquired by the imaging device 11.
  • Alternatively, the imaging system 14 may be equipped with a fingerprint sensor instead of the imaging device 11; the sensor emits a laser or sound wave toward the finger of the person to be authenticated 50 and receives the laser or sound wave reflected from the finger, and image data, shape data, etc. of the fingerprint of the person to be authenticated 50 may be obtained by analyzing the received signal.
  • the authentication system 10 may be configured to perform vein authentication.
  • In this case, the imaging device 11 may be used to irradiate the finger of the person to be authenticated 50 with near-infrared rays, the vein pattern may be imaged by a camera, and vein authentication may be performed based on the imaging data of the vein pattern.
  • the authentication system 10 may be configured to perform iris authentication.
  • the authentication of the person to be authenticated 50 is successful when one authentication method, which is face authentication, is successful.
  • authentication processing based on the account information of the person to be authenticated 50 may be performed. That is, the processor 132 determines whether the account information (including a password, etc.) input by the person to be authenticated 50 matches the pre-registered account information stored in the storage device 40. Then, the processor 132 may determine that the authentication of the person to be authenticated 50 is successful when the authentication based on the account information is successful and the face authentication is successful.
  • fingerprint authentication or vein authentication may be performed.
  • the processor 132 may determine that the authentication of the person to be authenticated 50 is successful when both face authentication and fingerprint authentication or vein recognition are successful.
  • Instead of face authentication, it may be determined that the authentication of the person to be authenticated 50 is successful when both fingerprint authentication and vein authentication are successful (a sketch of such combination policies follows below).
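  The combinations described in the preceding items can be expressed as a small decision function. The policy names are illustrative labels introduced here, not terminology from the source.

```python
def overall_result(face_ok=False, fingerprint_ok=False, vein_ok=False,
                   account_ok=False, policy="face_only"):
    """Combine individual authentication results under a few of the
    policies described above; policy names are assumptions."""
    if policy == "face_only":
        return face_ok
    if policy == "account_and_face":
        return account_ok and face_ok
    if policy == "face_and_finger_or_vein":
        return face_ok and (fingerprint_ok or vein_ok)
    if policy == "finger_and_vein":
        return fingerprint_ok and vein_ok
    raise ValueError(f"unknown policy: {policy}")
```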
  • the imaging system 14 is connected to the control device 13 via the communication network 20.
  • the imaging system 14 and the control device 13 may be configured integrally.
  • The integrated configuration of the imaging system 14 and the control device 13 may be realized by, for example, a smartphone, and data restoration and authentication processing may be performed in the smartphone. According to such a configuration, at the time of authentication, only the second information I2 is received from the storage device 40 via the communication network 20, without transmitting the first information I1 held by the person to be authenticated 50 over the communication network 20, and the data can be restored and authenticated within the smartphone. Thereby, it is possible to prevent the first information I1 from leaking to the outside.
  • the information storage medium 30, the imaging system 14, and the control device 13 may be integrally configured.
  • The data may also be configured to maintain confidentiality by using various known encryption and decryption methods, such as public-key or common-key (symmetric-key) cryptography (see the sketch below).
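  The source only refers to known public-key or common-key methods without naming one. As one hedged example, a common-key scheme could be applied to a first-information record before it is written to the two-dimensional code or medium; the sketch below uses the third-party `cryptography` package (Fernet), which is an assumption and not the method mandated by the source.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # common (symmetric) key
f = Fernet(key)

first_info = b"F017_SFT{X-10Y-12}_FLP{YES_Y}_TRN{YES_90}_END"
token = f.encrypt(first_info)        # ciphertext written to the medium
assert f.decrypt(token) == first_info
```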
  • shift values are set for all data extracted from biometric data.
  • a shift value may be set for at least one piece of data.
  • the data to which the shift value is set is randomly determined by the program.
  • In this case, in the first information I1 stored in the information storage medium 30 and expressed in (1) above, each of the X, Y, and Z values of SFT{X0Y0Z0} becomes zero, and the origin and coordinate values are stored as the first coordinate values at the time of extraction.
  • Similarly, for two-dimensional image data, the shift value SFT{X0Y0} of the first information I1 becomes zero, and the origin and coordinate values of the second information I2 stored in the storage device 40 and represented by (3) above are stored as the first coordinate values at the time of extraction.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

According to the present invention, first information includes position and orientation change values that are set for at least one piece of data among a plurality of pieces of data that are extracted from pre-acquired biological data pertaining to a portion of the body of a subject to be authenticated. The position and orientation change values are values obtained by changing, from a first state to a second state, at least one of the position and orientation of the at least one piece of data. Second information includes a coordinate value of each of the plurality of pieces of data. The coordinate value of the at least one piece of data for which the position and orientation change values are set is related to the second state. A control device performs an authentication process on the basis of biological data for biometric authentication and reference data of the plurality of pieces of data created on the basis of the first information and the second information.

Description

Authentication system, control device, and computer program
 This disclosure relates to an authentication system. The present disclosure also relates to a control device used in an authentication system and a computer program executed by the control device. Further, the present disclosure relates to a control device that generates authentication information used for authentication by an authentication system, and a computer program executed by the control device.
 Patent Document 1 discloses a face authentication device and a face authentication method. The face authentication device identifies a person by extracting feature points from the face area of the person to be recognized and by calculating and comparing the degree of similarity with dictionary images of each registrant's face registered in advance. In the face authentication device of Patent Document 1, face data of each person is stored in a recording medium as dictionary images.
Japanese Patent Application Publication No. 9-251534
 However, in recent years, incidents have occurred in which various types of information have been leaked through various methods, and from the perspective of protecting personal information, there have been various issues in safely storing such data for a long period of time.
 This disclosure prevents individuals from being identified even if data stored for use in authentication is leaked.
The authentication system according to the first aspect of the present disclosure includes:
an authentication data acquisition device that acquires biometric data for biometric authentication regarding a part of the body of a person to be authenticated;
a control device that acquires first information regarding the biometric authentication of the person to be authenticated from an information storage medium possessed by the person to be authenticated, acquires second information regarding the biometric authentication of the person to be authenticated from a storage device, and performs authentication processing that biometrically authenticates the person to be authenticated based on the biometric data for biometric authentication acquired from the authentication data acquisition device, the first information acquired from the information storage medium, and the second information acquired from the storage device.
The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data regarding a part of the body of the person to be authenticated that has been obtained in advance;
The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
The second information includes coordinate values of each of the plurality of data,
The coordinate value of the at least one data set with the position/orientation change value is a coordinate value related to the second state,
The position and orientation change value is set to a value such that the correlation between the plurality of data in the biometric data cannot be determined when the plurality of data are arranged based on the coordinate values included in the second information.
The control device includes:
creating reference data by arranging the plurality of data in the first state based on the first information and the second information;
Authentication processing is performed based on the reference data and the biometric data for biometric authentication acquired by the authentication data acquisition device.
A control device according to a second aspect of the present disclosure includes:
an input unit that receives biometric data for biometric authentication regarding a part of the body of the person to be authenticated, receives first information regarding the biometric authentication of the person to be authenticated from an information storage medium possessed by the person to be authenticated, and receives second information regarding the biometric authentication of the person to be authenticated from a storage device; and
a processor configured to perform an authentication process for biometrically authenticating the person to be authenticated based on the biometric data for biometric authentication, the first information, and the second information;
The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data regarding a part of the body of the person to be authenticated that has been obtained in advance;
The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
The second information includes coordinate values of each of the plurality of data,
The coordinate value of the at least one data set with the position/orientation change value is a coordinate value related to the second state,
The position and orientation change value is set to a value such that the correlation between the plurality of data in the biometric data cannot be determined when the plurality of data are arranged based on the coordinate values included in the second information.
The processor includes:
creating reference data by arranging the plurality of data in the first state based on the first information and the second information;
Authentication processing is performed based on the reference data and the biometric data for biometric authentication.
A computer program executed by a control device according to a third aspect of the present disclosure includes:
The execution causes the control device to:
accept biometric data for biometric authentication regarding a part of the body of the person to be authenticated;
receiving first information regarding the biometric authentication of the person to be authenticated from an information storage medium owned by the person to be authenticated;
receiving second information regarding biometric authentication of the person to be authenticated from a storage device;
The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data regarding a part of the body of the person to be authenticated that has been obtained in advance;
The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
The second information includes coordinate values of each of the plurality of data,
The coordinate value of the at least one data set with the position/orientation change value is a coordinate value related to the second state,
The position and orientation change value is set to a value such that the correlation between the plurality of data in the biometric data cannot be determined when the plurality of data are arranged based on the coordinate values included in the second information,
creating reference data by arranging the plurality of data in the first state based on the first information and the second information;
Authentication processing is performed based on the reference data and the biometric data for biometric authentication.
 According to the first to third aspects described above, the information storage medium stores, as the first information, the position/orientation change value of at least one of the plurality of data, but does not store the coordinate values of the data. The storage device stores, as the second information, the coordinate values of the data, but does not store the position/orientation change value. As a result, with only the authentication information stored in either the information storage medium or the storage device, the first state of the plurality of data is unknown and the plurality of data cannot be restored to the original biometric data. Therefore, even if either the first information stored in the information storage medium or the second information stored in the storage device is leaked, it is possible to prevent an individual from being identified.
The position and orientation change value includes a shift value,
The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
The coordinate values related to the second state include the second coordinate values of the at least one data set with the shift value,
The control device includes:
Reference data may be created by arranging the plurality of pieces of data at the first coordinate values based on the first information and the second information.
 According to the above configuration, the information storage medium stores the shift value as the first information, but does not store the coordinate value. The storage device stores the coordinate value as the second information, but does not store the shift value. Therefore, even if the first information or the second information becomes known to another person, it is possible to prevent the plurality of data from being restored by another person.
The position and orientation change value includes a rotation inversion value,
The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing with the origin of the at least one data as the base axis,
The coordinate values related to the second state include the coordinate values of the origin after changing to the second posture,
The control device may execute at least one of rotation processing and inversion processing on the at least one data based on the first information and the second information, and create the reference data by arranging the plurality of data in the first orientation.
 According to the above configuration, the information storage medium stores the rotation inversion value as the first information, but does not store the coordinate values of the origin, which is the base axis of the rotation processing and inversion processing. The storage device stores, as the second information, the coordinate values of the origin, which is the base axis of the rotation processing and inversion processing, but does not store the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, it is possible to prevent the plurality of data from being restored by another person.
The position and orientation change value includes a shift value and a rotation inversion value,
The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing with the origin of the at least one data as the base axis,
The coordinate values related to the second state include the coordinate values of the origin of the at least one data after shifting to the second coordinate values and changing to the second posture,
The control device includes:
Reference data may be created by arranging the plurality of data at the first coordinate values and the first orientation based on the first information and the second information.
 According to the above configuration, the information storage medium stores the shift value and the rotation inversion value as the first information, but does not store the coordinate values of the data or the coordinate values of the origin, which is the base axis of the rotation processing and inversion processing. The storage device stores, as the second information, the coordinate values of the data and the coordinate values of the origin, but does not store the shift value or the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, it is possible to prevent the plurality of data from being restored by another person.
 The plurality of data may be data extracted from the biometric data regarding a part of the body of the person to be authenticated, acquired in advance, after being divided into a size from which the characteristics of the person to be authenticated cannot be identified. This makes it possible to prevent the individual from being identified even if the plurality of data become known to others.
 The plurality of data may also be data extracted so as not to constitute a continuous area of the person to be authenticated when arranged in the first state. As a result, the distance between the pieces of data is unknown, so the plurality of data cannot be restored and the individual cannot be identified.
 The authentication system may further include a first information acquisition device that acquires the first information regarding the biometric authentication of the person to be authenticated from the information storage medium.
The authentication system may include the storage device, and the storage device may also store second information regarding biometric authentication of a person to be authenticated who is different from the person to be authenticated. As a result, even if the second information in the storage device becomes known to another person, the second information of the person to be authenticated cannot be identified from among the second information of multiple people, and restoration of the data of the person to be authenticated by another person can be prevented.
The first information includes a part of a plurality of segments constituting the plurality of data,
The second information may include the remainder of the plurality of segments. This makes it possible to prevent individuals from being identified even if the first information or second information is leaked.
A control device according to a fourth aspect of the present disclosure includes:
an input unit that accepts biometric data regarding a part of the subject's body;
a processor that generates information used for biometric authentication of the subject based on the biometric data and outputs the information to a storage device;
the processor sets a position/orientation change value for at least one of the plurality of data extracted from the biological data;
The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
The position/orientation change value is set to a value such that the correlation between the plurality of data in the biometric data cannot be determined when the plurality of data are arranged based on the coordinate values of each of the plurality of data, including the coordinate value related to the second state of the at least one data for which the position/orientation change value is set.
The processor generates the information including coordinate values of each of the plurality of data, and outputs the information as second information to the storage device.
A computer program executed by a control device according to a fifth aspect of the present disclosure includes:
The execution causes the control device to:
Accept biometric data regarding a part of the target person's body,
generating information used for biometric authentication of the subject based on the biometric data and outputting the information to a storage device;
setting a position/orientation change value for at least one of the plurality of data extracted from the biological data;
The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
The position/orientation change value is set to a value such that the correlation between the plurality of data in the biometric data cannot be determined when the plurality of data are arranged based on the coordinate values of each of the plurality of data, including the coordinate value related to the second state of the at least one data for which the position/orientation change value is set;
The information including the coordinate values of each of the plurality of data is generated, and the information is output to the storage device as second information.
 According to the fourth and fifth aspects described above, the storage device stores, as authentication information used for biometric authentication, the coordinate value related to the second state of the at least one data for which the position/orientation change value is set. As a result, the plurality of data cannot be restored to the original biometric data using only the authentication information stored in the storage device. Therefore, even if the authentication information stored in the storage device is leaked, it is possible to prevent the individual from being identified.
 The processor may also generate first information including the position/orientation change value set for the at least one of the plurality of data, and output the first information to an information storage medium possessed by the subject.
 According to such a configuration, the information storage medium stores, as the first information, the position/orientation change value of at least one of the plurality of data, but does not store values indicating the position and orientation state of the data. As a result, the plurality of data cannot be restored to the original biometric data using only the authentication information stored in the information storage medium. Therefore, even if the first information stored in the information storage medium is leaked, it is possible to prevent an individual from being identified.
The position and orientation change value includes a shift value,
The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
The coordinate value of each of the plurality of data may include the second coordinate value of the at least one data set with the shift value.
 According to the above configuration, the information storage medium stores the shift value as the first information, but does not store the coordinate value. The storage device stores the coordinate value as the second information, but does not store the shift value. Therefore, even if the first information or the second information becomes known to another person, it is possible to prevent the plurality of data from being restored by another person.
The position and orientation change value includes a rotation inversion value,
The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing with the origin of the at least one data as the base axis,
The coordinate value of each of the plurality of data may include the coordinate value of the origin of the at least one data after changing to the second attitude.
 According to the above configuration, the information storage medium stores the rotation inversion value as the first information, but does not store the coordinate values of the origin, which is the base axis of the rotation processing and inversion processing. The storage device stores, as the second information, the coordinate values of the origin, which is the base axis of the rotation processing and inversion processing, but does not store the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, it is possible to prevent the plurality of data from being restored by another person.
The position and orientation change value includes a shift value and a rotation inversion value,
The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing with the origin of the at least one data as the base axis,
The coordinate value of each of the plurality of data may include a coordinate value of the origin of the at least one data after shifting to the second coordinate value and changing to the second attitude.
 According to the above configuration, the information storage medium stores the shift value and the rotation inversion value as the first information, but does not store the coordinate values of the data or the coordinate values of the origin, which is the base axis of the rotation processing and inversion processing. The storage device stores, as the second information, the coordinate values of the data and the coordinate values of the origin, but does not store the shift value or the rotation inversion value. Therefore, even if the first information or the second information becomes known to another person, it is possible to prevent the plurality of data from being restored by another person.
 Further, the processor may include a part of the plurality of segments constituting the plurality of data in the first information, and include the remainder of the plurality of segments in the second information. This makes it possible to prevent individuals from being identified even if the first information or the second information is leaked.
 Further, the processor may divide the biometric data into sizes from which the characteristics of the subject cannot be identified and extract them as the plurality of data. This makes it possible to prevent individuals from being identified even if the plurality of data become known to others.
 Further, the processor may extract the plurality of data so that, when arranged in the first state, they do not constitute a continuous area of the person to be authenticated. As a result, the distance between the pieces of data is unknown, so the plurality of data cannot be restored and the individual cannot be identified.
 According to the present disclosure, even if data stored for use in authentication is leaked, individuals can be prevented from being identified.
FIG. 1 illustrates the functional configuration of the authentication system according to the present embodiment.
FIG. 2 illustrates the configuration of the imaging system.
FIG. 3 illustrates the flow of the generation processing of the first information and the second information executed by the processor of the control device.
FIG. 4 illustrates areas of the subject's face from which a plurality of data used for authentication are extracted.
FIG. 5 illustrates data corresponding to the areas extracted from the imaging data of the subject's face.
FIG. 6 illustrates each data in the XYZ coordinate system.
FIG. 7 illustrates a database stored in the storage device.
FIG. 8 illustrates data extracted from two-dimensional image data in the XY coordinate system.
FIG. 9 illustrates a state in which the extracted data of FIG. 8 has been subjected to shift and conversion processing.
FIG. 10 is a diagram for explaining a method of extracting image data from three-dimensional image data.
FIG. 11 illustrates the flow of the authentication processing executed by the processor of the control device.
FIG. 12 shows an example in which the data of FIG. 6 are arranged at the second coordinate values in the XYZ coordinate system, i.e., in the shifted state.
FIG. 13 shows an example in which the data of FIG. 6 are restored by being arranged at the first coordinate values in the XYZ coordinate system.
 Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. FIG. 1 illustrates the functional configuration of an authentication system 10 according to the present embodiment. The authentication system 10 can be used to authenticate a person to be authenticated 50 (FIG. 2) as a user of a system that performs a predetermined process or operation and to permit that person to use the system. An example of such a system is a settlement (payment) system. The authentication performed by the authentication system 10 is biometric authentication using a part of the body of the person to be authenticated 50. Examples of biometric authentication include face authentication, iris authentication, fingerprint authentication, and vein authentication. The following description deals with the case in which the authentication system 10 performs face authentication.
 As illustrated in FIG. 1, the authentication system 10 includes an imaging device 11, a reading device 12, and a control device 13. In this example, the imaging device 11 and the reading device 12 are mounted in an imaging system 14 and configured integrally. Alternatively, the imaging device 11 and the reading device 12 may be configured as separate bodies.
 As illustrated in FIG. 1, the imaging system 14 can be connected to a communication network 20 wirelessly or by wire and is connected to the control device 13 via the communication network 20. The imaging system 14 is placed as appropriate at a location where the person to be authenticated 50 performs authentication, and a plurality of imaging systems 14 (not shown) are connected to the control device 13.
 The imaging device 11 is configured to acquire imaging data for face authentication of the person to be authenticated 50 by capturing an image of that person's face. Imaging data for face authentication is an example of biometric data for biometric authentication, and the imaging device 11 is an example of an authentication data acquisition device. The imaging device 11 is configured to acquire two-dimensional image data, three-dimensional image data, or three-dimensional shape data as the imaging data. For example, as shown in FIG. 2, the imaging device 11 includes a plurality of light source units 111, a plurality of cameras 112, and a projection device 113.
 The light source unit 111 is configured to emit light toward the person to be authenticated 50. For example, the light source unit 111 includes one or more light sources that emit light of a predetermined wavelength in the range from visible to invisible light. The cameras 112 are infrared cameras or cameras capable of detecting light of wavelengths other than infrared, and are built around image sensors such as CCD (Charge-Coupled Device) or CMOS (complementary MOS) sensors. The projection device 113 is configured to irradiate the person to be authenticated 50 with infrared or visible light in order to acquire three-dimensional shape data of that person. For example, the projection device 113 projects a pattern of infrared or visible light composed of a plurality of dots onto the person to be authenticated 50.
 Two-dimensional image data or three-dimensional image data is acquired by, for example, the light source units 111 and the cameras 112. Specifically, two-dimensional image data of the person to be authenticated 50 is acquired by imaging, with a camera 112, the person illuminated by a light source unit 111. Three-dimensional image data is generated, for example, by a photogrammetry technique based on a plurality of two-dimensional images captured by cameras 112 at different positions; specifically, it is generated by normalizing a plurality of two-dimensional images of the person to be authenticated 50 captured from different camera positions into a three-dimensional image. Alternatively, the three-dimensional image data may be generated by projecting and displaying two-dimensional image data, described later, onto the three-dimensional shape data.
 The three-dimensional shape data is acquired by, for example, the cameras 112 and the projection device 113. Specifically, three-dimensional shape data of the person to be authenticated 50 is acquired by imaging, with a camera 112, the person irradiated with infrared or visible light by the projection device 113. For example, by capturing with a camera 112 the dot pattern of infrared or visible light projected onto the face of the person to be authenticated 50 by the projection device 113, three-dimensional shape data is generated as a set of points each holding coordinate values. The three-dimensional shape data may also be acquired by a sensor other than the camera 112 and the projection device 113.
 As illustrated in FIG. 1, the reading device 12 is configured to acquire first information I1 relating to face authentication of the person to be authenticated 50 from an information storage medium 30 carried by that person. The reading device 12 is an example of a first information acquisition device. The first information I1 is used when performing face authentication of the person to be authenticated 50, as described later. Examples of the information storage medium 30 include a smartphone and an NFC (Near Field Communication) tag, which is a type of RFID (Radio Frequency IDentification) tag. When a smartphone is used as the information storage medium 30, a medium other than that smartphone is used as the storage device 40 described later.
 Specifically, the reading device 12 is configured to read the first information I1 from the information storage medium 30 in a contact or contactless manner. In this example, as shown in FIG. 2, the reading device 12 includes a two-dimensional code reader 121 that reads a two-dimensional code such as a QR Code (Quick Response Code, registered trademark). For example, the first information I1 is stored as a two-dimensional code on a smartphone carried by the person to be authenticated 50 and is read by the two-dimensional code reader 121. The reading device 12 also includes an NFC reader 122. For example, the first information I1 is stored in an NFC tag carried by the person to be authenticated 50 and is read by the NFC reader 122. The NFC tag may be carried by the person to be authenticated 50 in various forms, such as a card, a sticker, or an NFC-compatible smartphone. Alternatively, the NFC tag may be implanted in the person to be authenticated 50 as an NFC microchip.
 In this example, the imaging system 14 further includes a display unit 114 and a sensor 115. The display unit 114 is configured to display the imaging data of the person to be authenticated 50 acquired by the imaging device 11 and is composed of, for example, a liquid crystal or organic EL display. The display unit 114 may include a capacitive or resistive touch sensor attached to all or part of the front surface of the display. A guide (frame) G for positioning the face of the person to be authenticated 50 may be displayed on the display unit 114.
 The sensor 115 is, for example, a proximity sensor or an ambient light sensor. The proximity sensor detects that the face of the person to be authenticated 50 is located within a predetermined range of the imaging system 14; for example, the imaging device 11 is configured to image the person to be authenticated 50 when the proximity sensor makes that detection. The ambient light sensor detects the brightness around the imaging system 14; for example, the light source unit 111 may be configured so that the luminance of its light source is automatically set to an optimal value based on the ambient brightness detected by the ambient light sensor.
 The imaging system 14 is not limited to the configuration illustrated in FIG. 2. For example, a smartphone or a personal digital assistant may be used as the imaging system 14. When a smartphone is used as the imaging system 14, that smartphone may also be used as the information storage medium 30; that is, the imaging system 14 and the information storage medium 30 may be integrated and realized as functions of a single smartphone.
 Returning to FIG. 1, the control device 13 is connected to a storage device 40. The control device 13 and the storage device 40 may be mounted in the same server device (not shown) or may be connected via the communication network 20. The storage device 40 stores second information I2 relating to face authentication of the person to be authenticated 50. The second information I2 is used when performing face authentication of the person to be authenticated 50, as described later.
 The control device 13 includes an input unit 131, a processor 132, and an output unit 133. The input unit 131 is configured as an interface capable of acquiring the imaging data of the face of the person to be authenticated 50 acquired by the imaging device 11 and the first information I1 relating to face authentication acquired by the reading device 12. The input unit 131 is also configured as an interface capable of acquiring the second information I2 relating to face authentication of the person to be authenticated 50 from the storage device 40.
 The processor 132 performs authentication processing that face-authenticates the person to be authenticated 50 based on the imaging data, the first information I1, and the second information I2. In the authentication processing, the processor 132 first constructs reference data for face authentication of the person to be authenticated 50 based on the first information I1 and the second information I2.
 Specifically, the first information I1 and the second information I2 are generated from a plurality of data extracted from imaging data obtained in advance by capturing the face of the person to be authenticated 50, each piece of data corresponding to a different area of the face. The first information I1 includes a position/orientation change value set for at least one of the extracted data when the imaging data is mapped into a predetermined coordinate system (for example, an XYZ coordinate system). The position/orientation change value is a value by which at least one of the position and the orientation of the at least one piece of data is changed from a first state to a second state. The second information I2 includes the coordinate values of each of the plurality of data: for data to which a position/orientation change value has been applied, the coordinate values relate to the second state, and for data to which no position/orientation change value has been applied, the coordinate values relate to the first state. When position/orientation change values are set for all of the data, the second information I2 contains the coordinate values of each piece of data in its second state. The position/orientation change values are chosen so that, when the plurality of data are arranged according to the coordinate values contained in the second information I2, the correlation among the data within the biometric data cannot be determined.
 The position/orientation change value includes, for example, a shift value. The shift value is the amount by which the position of at least one piece of data is shifted from its original coordinate value (hereinafter, the first coordinate value) to a different coordinate value (hereinafter, the second coordinate value). In this case, the coordinate values of the second information I2 relating to the second state include the second coordinate values of the data to which a shift value has been applied, and the coordinate values relating to the first state include the first coordinate values of the data to which no shift value has been applied.
 The first coordinate values are values that, when the data contained in the second information I2 are arranged in the predetermined coordinate system (for example, the XYZ coordinate system) according to those values, correctly reproduce the correlation among the plurality of data constituting the imaging data of the face of the person to be authenticated 50 (subject 51). The second coordinate values (or the shift values), in contrast, are set so that arranging the data of the second information I2 in the predetermined coordinate system according to the second coordinate values does not correctly reproduce that correlation.
 In other words, the shift values are set so that, when the data of the second information I2 are arranged according to the coordinate values in the second information I2, the correlation among the plurality of data in the imaging data cannot be determined.
 Alternatively, the position/orientation change value includes, for example, a rotation/inversion value. The rotation/inversion value is a value by which the orientation of at least one piece of data is changed from a first orientation to a second orientation by performing at least one of a rotation process and an inversion process about the origin of that data. In this case, the coordinate values of the second information I2 relating to the second state include the coordinate value of the origin after the change to the second orientation. When three-dimensional shape data is acquired as the imaging data and, as described later, the imaging data is divided into areas of various shapes formed by a plurality of forming points, the coordinate values relating to the second state further include the coordinate values of the forming points of the data after the change to the second orientation. The coordinate values relating to the first state include the coordinate values of the data whose orientation has not been changed; for three-dimensional shape data divided into such areas, these are the coordinate values of the forming points of that data.
 Alternatively, the position/orientation change value may include both a shift value and a rotation/inversion value. In this case, the coordinate values of the second information I2 relating to the second state include the coordinate value of the origin of the at least one piece of data after it has been shifted to the second coordinate value and changed to the second orientation, and the coordinate values relating to the first state include the first coordinate values of the data whose position and orientation have not been changed.
 Based on the first information I1 and the second information I2, the processor 132 restores the data of the second information I2 in the predetermined coordinate system (the coordinate system to which the previously acquired imaging data was mapped, for example the XYZ coordinate system). When the position/orientation change value includes a shift value, the processor calculates the first coordinate value of each shifted piece of data from the shift value in the first information I1 and the coordinate value in the second information I2, and places each piece of data at its first coordinate value. When the position/orientation change value includes a rotation/inversion value, the processor performs rotation/inversion processing on the relevant data based on the rotation/inversion value in the first information I1 and the coordinate value in the second information I2, so that all of the data, including the rotated/inverted pieces, are arranged in the first orientation. When the position/orientation change value includes both a shift value and a rotation/inversion value, the processor calculates the first coordinate value of each piece of data from the shift value, the rotation/inversion value, and the coordinate values, returns the relevant data to the first orientation by rotation/inversion, and then places the data at their first coordinate values; alternatively, the data may first be placed at their first coordinate values and then returned to the first orientation. In this way, the reference data for face authentication is constructed, which is compared with the imaging data of the face of the person to be authenticated 50 acquired by the imaging device 11 at the time of authentication.
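 As a rough illustration of this restoration step, the following sketch rebuilds one piece of data from its two halves of information. It is not the claimed implementation: the helper name restore_piece is hypothetical, SciPy is used only for the rotation arithmetic, and it assumes that the registered transform applied the inversion before the rotation.

    import numpy as np
    from scipy.spatial.transform import Rotation

    # Mirror matrices about the coordinate planes through the data's own origin.
    FLIP = {"YZ": np.diag([-1., 1., 1.]),
            "XZ": np.diag([1., -1., 1.]),
            "XY": np.diag([1., 1., -1.])}

    def restore_piece(points, origin, shift, flips, angles_deg):
        """Return the forming points of one piece of data in its first state.

        points, origin : second-state coordinate values taken from the second information I2.
        shift          : shift value (first -> second) taken from the first information I1.
        flips          : list of mirrored planes, e.g. ["YZ"], from the first information I1.
        angles_deg     : (XY, YZ, XZ) rotation angles from the first information I1.
        """
        pts = np.asarray(points, float) - np.asarray(origin, float)
        # Undo the rotation: the (XY, YZ, XZ) plane angles are treated here as
        # rotations about the Z, X and Y axes respectively.
        rot = Rotation.from_euler("zxy", angles_deg, degrees=True)
        pts = rot.inv().apply(pts)
        # Undo the inversion: mirror matrices are their own inverse.
        for plane in flips:
            pts = pts @ FLIP[plane]
        pts = pts + np.asarray(origin, float)
        # Undo the shift: first coordinate value = second coordinate value - shift value.
        return pts - np.asarray(shift, float)

 For data B8 in expression (1) described later, the arguments would be shift (-2, -5, 2), flips ["YZ"], and angles_deg (90, 0, 0); placing every restored piece in the same XYZ coordinate system yields the reference data compared with the live image.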
 The processor 132 then performs the authentication processing based on the constructed face authentication data and the imaging data of the face of the person to be authenticated 50. That is, the processor 132 performs the authentication processing by comparing the restored plurality of data with the imaging data of the face acquired by the imaging device 11.
 The processor 132 outputs the authentication result from the output unit 133 to the system that performs the predetermined process or operation, allowing the person to be authenticated 50 to, for example, complete a payment procedure in a settlement (payment) system. The processor 132 also outputs the authentication result from the output unit 133 to the imaging system 14, where it is displayed, for example, on the display unit 114.
 The first information I1 and the second information I2 are generated, for example, by the control device 13 of the authentication system 10 and stored in the information storage medium 30 and the storage device 40, respectively.
 FIG. 3 illustrates the flow of the process of generating the first information I1 and the second information I2 executed by the processor 132 of the control device 13.
 First, the processor 132 acquires imaging data obtained by capturing the face of a subject 51 who is registering the authentication information used for face authentication (that is, the first information I1 and the second information I2) (STEP 1). For example, as shown in FIG. 2, the face of the subject 51 is imaged by the imaging device 11. The face of the subject 51 may instead be imaged by an imaging device other than the imaging device 11; in that case, because the data restored at authentication time is compared with imaging data captured by the imaging device 11, it is preferable that the image be captured under imaging conditions equivalent to those of the imaging device 11.
 Next, the processor 132 extracts a plurality of data used for authentication from the imaging data of the face of the subject 51 (STEP 2). Specifically, the processor 132 divides the imaging data of the face into a plurality of pieces and extracts the divided pieces as the data used for authentication. Each piece is extracted at a size at which the facial features of the subject 51 cannot be recognized from that piece alone (that is, the individual cannot be identified).
 For example, FIG. 4 shows areas A1 to A9 of the face of the subject 51 from which the plurality of data used for authentication are extracted, and FIG. 5 shows data B1 to B9 corresponding to the areas A1 to A9 extracted from the imaging data of the face. The data B1 to B9 illustrated in FIG. 5 are extracted from three-dimensional shape data serving as the imaging data. Although each of the data B1 to B9 is drawn as a shape enclosed by a line, each actually consists of a set of forming points, each holding one or more coordinate values.
 For example, as shown in FIGS. 4 and 5, the data B1 to B9 are extracted by dividing the imaging data into the pre-programmed areas A1 to A9. The size of each area A1 to A9 is set appropriately so that an individual cannot be identified from the corresponding data B1 to B9. As shown in FIG. 4, an area may be set so as to be discontinuous with adjacent areas, like areas A1, A2 and A5 to A9, or continuous with an adjacent area, like areas A3 and A4. By making areas A3 and A4 small, even someone looking at the data B3 and B4 extracted from them cannot tell which part of the face they come from, at what angle they are oriented, or which other data they join to.
 If an area contains an element from which a distinguishing feature of the face of the subject 51 can be identified, the area may be divided into sizes at which that feature can no longer be identified, and the data corresponding to the divided areas may be extracted. Examples of such elements include moles, foreign elements such as tattoos or scars, the iris of the eye (in the case of two-dimensional or three-dimensional image data), and deep facial scars (in the case of three-dimensional shape data). If the data B1 to B9 contain such elements, predetermined image processing (such as blacking out) may be applied, or the affected data may not be extracted as authentication data at all. The shapes of the data B1 to B9 are not limited to those in FIG. 5; for example, a single forming point may constitute one area, or the data may be extracted by dividing the imaging data into areas of various shapes formed by a plurality of forming points.
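 A minimal sketch of this extraction step follows, assuming the imaging data is available as a point cloud of forming points and that the pre-programmed areas are given as axis-aligned boxes; both are assumptions for illustration, and the embodiment does not prescribe this representation.

    import numpy as np

    def extract_pieces(points, areas):
        """Split a face point cloud into small pieces, one per pre-programmed area.

        points : (N, 3) array of forming points of the three-dimensional shape data.
        areas  : list of (lo, hi) pairs, each giving the (x, y, z) bounds of one area.
        Returns one (M_i, 3) array per area; areas are kept small and mostly
        non-adjacent so that no single piece reveals the subject's identity.
        """
        pts = np.asarray(points, float)
        pieces = []
        for lo, hi in areas:
            lo, hi = np.asarray(lo, float), np.asarray(hi, float)
            mask = np.all((pts >= lo) & (pts <= hi), axis=1)
            pieces.append(pts[mask])
        return pieces

 Areas containing identifying elements (moles, the iris, deep scars) would be subdivided further or masked before this step.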
 Next, the processor 132 sets a shift value for at least one of the extracted data B1 to B9 (STEP 3). In this example, shift values are set for all of the data B1 to B9. Specifically, in the predetermined coordinate system, a second coordinate value different from the first coordinate value is set for each of the data B1 to B9, and the shift value from the first coordinate value to the second coordinate value is calculated. Alternatively, the shift value of each of the data B1 to B9 may be set without setting the second coordinate value, in which case the second coordinate value is calculated from the first coordinate value and the shift value. The shift values or second coordinate values are determined at random by the program.
 For example, FIG. 6 illustrates the data B1 to B9 in the XYZ coordinate system, showing the forming points P1 to P9 of the data B1 to B9 and their first coordinate values PA1 to PA9. Each of the data B1 to B9 is also assigned an origin that serves as the reference axis for the rotation and inversion processing described later; the coordinate position of this origin within each piece of data is determined at random by the program. In this example, the forming points P1 to P9 are set as the origins of the data B1 to B9. For data B8, the figure shows the second coordinate value PB8 of its forming point and origin and the shift value S8 from the first coordinate value PA8 to the second coordinate value PB8. In the same way, the forming points and origins of the data B1 to B7 and B9 are shifted from the first coordinate values PA1 to PA7 and PA9 to the second coordinate values PB1 to PB7 and PB9 (not shown) according to the set shift values S1 to S7 and S9. When a piece of data consists of two or more forming points, it also contains forming points (not shown) other than the points P1 to P9, and each forming point has its own first coordinate value and is likewise assigned a second coordinate value.
 In this example, an XYZ coordinate system is used as the predetermined coordinate system, but an encrypted coordinate system, a polar coordinate system, or the like may be adopted instead. An encrypted coordinate system is a coordinate system in which the coordinate values along the XY or XYZ axes are intentionally arranged irregularly, for example in an irregular order such as 3, 1, 2, 5, ... known only to its creator. Polar coordinate systems include circular, cylindrical, and spherical coordinate systems.
 Next, the processor 132 performs a predetermined conversion process on the data B1 to B9 (STEP 4). The conversion process includes, for example, at least one of a rotation process and an inversion process. For example, a rotation process is performed that rotates the data B1 to B9 by predetermined angles in the XY, YZ, and XZ planes about the origins P1 to P9. The angles are set at random by the program.
 The conversion process of STEP 4 may instead be performed between STEP 2 and STEP 3.
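 The registration-side processing of STEP 3 and STEP 4 might be sketched as follows. This is illustrative only: the random ranges, the flip-before-rotation order, and the record layout are assumptions rather than part of the embodiment.

    import numpy as np
    from scipy.spatial.transform import Rotation

    FLIP = {"YZ": np.diag([-1., 1., 1.]),
            "XZ": np.diag([1., -1., 1.]),
            "XY": np.diag([1., 1., -1.])}
    rng = np.random.default_rng()

    def register_piece(points, data_id):
        """Apply a random shift and a random rotation/inversion to one extracted piece.

        Returns (first_info, second_info): the first record holds only the shift and
        rotation/inversion values, the second only the transformed coordinate values.
        """
        pts = np.asarray(points, float)
        origin = pts[rng.integers(len(pts))].copy()      # origin chosen at random within the piece
        shift = rng.uniform(-10, 10, size=3).round(1)    # random shift value (placeholder range)
        flips = ["YZ"] if rng.random() < 0.5 else []     # random inversion
        angles = rng.choice([0, 90, 180, 270], size=3)   # random (XY, YZ, XZ) rotation angles
        local = pts - origin
        for plane in flips:                              # inversion about the origin
            local = local @ FLIP[plane]
        local = Rotation.from_euler("zxy", angles, degrees=True).apply(local)
        second_pts = local + origin + shift              # second coordinate values
        first_info = {"id": data_id, "shift": shift.tolist(),
                      "flips": flips, "angles": angles.tolist()}
        second_info = {"id": data_id, "points": second_pts,
                       "origin": (origin + shift)}
        return first_info, second_info

 In the embodiment, the shift values, origins, inversion directions, and rotation angles are all chosen at random by the program; the concrete ranges above are placeholders.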
 Next, the processor 132 associates the shift values S1 to S9 of the data B1 to B9 and the information on the conversion process (for example, the rotation/inversion values of the rotation and/or inversion processing) with the identification information of each of the data B1 to B9, and outputs them as the first information I1 to the information storage medium 30 of the subject 51 (STEP 5). As a result, the shift values S1 to S9 and the conversion process information are stored in the information storage medium 30 in association with the identification numbers of the data B1 to B9. The "identification information" is a unique ID identifying each stored piece of data; for each subject 51, the identification information of the extracted data is generated at random by the program so that the IDs are not sequential.
 For example, the first information I1 is output to the information storage medium 30 via the imaging system 14. In this case, as shown in FIG. 1, the imaging system 14 includes a writing device 15. For example, the two-dimensional code reader 121 shown in FIG. 2 may, as the writing device 15, also have a function of causing a two-dimensional code to be loaded onto the medium, and the NFC reader 122 may, as the writing device 15, also have an NFC data writing function. The processor 132 outputs the first information I1 to the imaging system 14 via the communication network 20, and the writing device 15 of the imaging system 14 outputs the first information I1 to the information storage medium 30 of the subject 51.
 When the information storage medium 30 held by the subject 51 is a smartphone, the first information I1 is loaded onto the smartphone, for example in the form of a two-dimensional code, from the two-dimensional code reader 121 having this loading function, via short-range wireless communication (see the broken line in FIG. 1). When the information storage medium 30 is an NFC tag, the first information I1 is output from the NFC reader 122 having a writing function, via short-range wireless communication (see the broken line in FIG. 1), to the NFC tag owned by the subject 51 or to an NFC microchip implanted in the subject 51.
 For example, the first information I1 of data B8 stored in the information storage medium 30 can be expressed as:
No.F007_SFT{X-2Y-5Z2}_FLP{YES_YZ}_TRN{YES_XY90YZ0XZ0}_END ... (1)
 The number No.F007 at the far left is the identification number (an example of identification information) assigned to data B8. The value following SFT is the shift value S8 of data B8; in this example it is X-2, Y-5, Z2, meaning that the shift from the first coordinate value PA8 to the second coordinate value PB8 is -2 in the X-axis direction, -5 in the Y-axis direction, and +2 in the Z-axis direction. The values following FLP and TRN are the rotation/inversion values describing the conversion process: FLP indicates whether inversion was performed and in which direction, and TRN indicates whether rotation was performed and through which angles. In this example, data B8 has been inverted about the YZ plane with the origin P8 as the reference, and has been rotated about the origin P8 by 90 degrees in the XY plane, 0 degrees in the YZ plane, and 0 degrees in the XZ plane.
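 A small helper that emits a record in the layout of expression (1) could look like this; the grammar is inferred from the single example above, so the exact encoding (in particular the FLP{NO}/TRN{NO} forms) is an assumption.

    def encode_first_info(data_id, shift, flips, angles):
        """Build a first-information record such as
        No.F007_SFT{X-2Y-5Z2}_FLP{YES_YZ}_TRN{YES_XY90YZ0XZ0}_END."""
        sx, sy, sz = shift
        sft = f"SFT{{X{sx:g}Y{sy:g}Z{sz:g}}}"
        flp = f"FLP{{YES_{''.join(flips)}}}" if flips else "FLP{NO}"
        ax, ay, az = angles
        trn = f"TRN{{YES_XY{ax:g}YZ{ay:g}XZ{az:g}}}" if any(angles) else "TRN{NO}"
        return f"No.{data_id}_{sft}_{flp}_{trn}_END"

    # Reproduces expression (1):
    print(encode_first_info("F007", (-2, -5, 2), ["YZ"], (90, 0, 0)))
    # -> No.F007_SFT{X-2Y-5Z2}_FLP{YES_YZ}_TRN{YES_XY90YZ0XZ0}_END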
 Next, the processor 132 associates the coordinate values of the forming points of each of the data B1 to B9 and the coordinate values of the origins P1 to P9 serving as the reference axes for the rotation processing with the identification information of each piece of data, and outputs them as the second information I2 to the storage device 40 (STEP 6). In this example, the second coordinate values PB1 to PB9 of the data B1 to B9 and of the origins P1 to P9 are included in the second information I2 as the coordinate values of the data and the origins. If a shift value is not set for some of the data B1 to B9, the first coordinate values of that data are included in the second information I2 instead. The output of the second information I2 in STEP 6 may also be performed before the output of the first information I1 in STEP 5.
 FIG. 7 illustrates the database stored in the storage device 40. As the second information I2, each of the data B1 to B9, holding its second coordinate values, is stored under its identification number together with the second coordinate values of its one or more forming points and its origin. For example, data B8 is stored under the identification number No.F007 together with the second coordinate value PB8 of the forming point P8 and the origin P8. Although FIG. 7 shows only the second coordinate value PB8 of the forming point P8 and origin P8 of data B8, the other data B1 to B7 and B9 are stored in the same way with the second coordinate values PB1 to PB7 and PB9 of their forming points P1 to P7, P9 and origins P1 to P7, P9. In FIG. 7 the data B1 to B9 are drawn with definite shapes, but in practice the second coordinate values of the forming points and origin of each piece of data are expressed as alphanumeric symbols.
 For example, the second information I2 of data B8 can be expressed as:
No.F007_{X9.0Y5.0Z2X7.0Y2.0Z4.0X9.0Y0Z2X11.0Y-1.0Z0.5}_ORG{X7.0Y2.0Z4.0} ... (2)
 The numerical values following the identification number No.F007 are the second coordinate values of the forming points of data B8 (four forming points in this example). The value following ORG is the second coordinate value of the origin P8 of data B8, which serves as the reference axis for the conversion process (here, the rotation and inversion processing). Data B8 actually contains many more forming points, but for simplicity only the second coordinate values of four points, including the origin, are shown and described here.
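 The corresponding record of expression (2) could be assembled along the same lines; again, the layout is inferred from one example, and the numeric formatting of the original is not specified.

    def encode_second_info(data_id, points, origin):
        """Build a second-information record in the style of
        No.F007_{X9.0Y5.0Z2X7.0Y2.0Z4.0...}_ORG{X7.0Y2.0Z4.0}."""
        def xyz(p):
            x, y, z = p
            return f"X{x:g}Y{y:g}Z{z:g}"
        body = "".join(xyz(p) for p in points)   # second coordinate values of the forming points
        return f"No.{data_id}_{{{body}}}_ORG{{{xyz(origin)}}}"

    # Prints a record in the layout of expression (2); the exact numeric formatting
    # of the original (e.g. "9.0" versus "9") is not reproduced here.
    print(encode_second_info(
        "F007",
        [(9.0, 5.0, 2), (7.0, 2.0, 4.0), (9.0, 0, 2), (11.0, -1.0, 0.5)],
        (7.0, 2.0, 4.0)))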
 In this example, as shown in FIG. 7, the database further stores the data B1 to B9 of the subject 51 in association with the region in which the imaging data was acquired (for example, JPN: Japan) and the version of the computer program of the control device used when the data B1 to B9 were acquired. Although FIG. 7 shows only the data B1 to B9 of the subject 51 stored in the storage device 40, data of other subjects may be stored in the same way. Furthermore, the storage device 40 can be configured to store dummy data (artificially created fake data) that is not associated with any subject, so that it does not hold data of the subject 51 alone.
 In the embodiment described above, the processor 132 sets shift values for the data B1 to B9 extracted from the imaging data in STEP 3 of FIG. 3. However, a shift value may instead be set for only at least one of the data B1 to B9. Alternatively, no shift values may be set for any of the data, and only the rotation or inversion processing of STEP 4 may be performed. When no shift value is set, the first information I1 stored in the information storage medium 30 as in expression (1) above has a shift value of zero in the SFT item, while the FLP and TRN items carry their set values. In the second information I2 stored in the storage device 40 as in expression (2) above, the numerical values following the identification number No.F007 indicate the second coordinate values of the forming points and origin of data B8.
 Likewise, in the embodiment described above, the processor 132 performs the predetermined conversion process on the data B1 to B9 extracted from the imaging data in STEP 4 of FIG. 3. However, the conversion process may be performed on only at least one of the data B1 to B9. Alternatively, no conversion process may be performed on any of the data, and only the shift value setting of STEP 3 may be performed. When no conversion process is performed, the first information I1 stored in the information storage medium 30 as in expression (1) above shows NO in the FLP and TRN items, while a shift value is set in the SFT item.
 In the embodiment described above, predetermined image processing (such as blacking out), or processing that prevents certain data from being extracted as authentication data, may also be applied as appropriate between STEP 2 and STEP 3 or in STEP 4.
 The embodiment described above deals with generating the authentication information from imaging data in the form of three-dimensional shape data. When authentication is performed based on two-dimensional image data or three-dimensional image data, however, the first information I1 and the second information I2 may be generated from the two-dimensional or three-dimensional image data.
 For example, in the extraction of the plurality of data in STEP 2 of FIG. 3, the processor 132 extracts, as illustrated in FIG. 8, pieces of data from two-dimensional image data, three-dimensional image data, or the image data of three-dimensional image data generated from three-dimensional shape data and two-dimensional image data, with each area on the XY coordinate axes being a block of the minimum coding unit of 8 pixels per side, or a block whose side is a multiple of 8 pixels. In FIG. 8, each area enclosed by a square represents one piece of data.
 The coordinate value of the first pixel PX of each piece of data is set as the coordinate value at which that piece of data is placed, and the first pixel PX is also set as the origin serving as the reference axis for the rotation and inversion processing. That is, the coordinate value of the first pixel PX is both the coordinate value of the data and the coordinate value of the origin. In FIG. 8, data B17 and its origin, the first pixel PX, are shown enlarged as an example of extracted data. When data such as B17 contains information such as the iris and iris recognition is not used, the program may be configured to automatically paint over that information.
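 For the two-dimensional case, the block extraction might be sketched as follows, assuming the image is held as an H x W x 3 array; the 80-pixel block size merely mirrors the example of data B17 and is not prescribed by the embodiment.

    import numpy as np

    def extract_blocks(image, block=80):
        """Cut an image into square blocks whose side is a multiple of 8 pixels.

        Returns a list of (origin_xy, block_pixels) pairs, where origin_xy is the
        XY coordinate of the first pixel PX of the block (its placement coordinate
        and the reference axis for later rotation/inversion).
        """
        assert block % 8 == 0, "blocks are built from 8-pixel minimum coding units"
        h, w = image.shape[:2]
        pieces = []
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                pieces.append(((x, y), image[y:y + block, x:x + block].copy()))
        return pieces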
 Next, in the shift value setting of STEP 3, the processor 132 sets a shift value for at least one of the extracted pieces of data; that is, it sets a shift value for the coordinate value of the first pixel PX of at least one piece. As with the three-dimensional shape data, the shift values are determined at random by the program.
 Next, in the data conversion of STEP 4, the processor 132 performs the predetermined conversion process (rotation or inversion) on at least one piece of data; that is, it rotates or inverts at least one piece about its first pixel PX as the origin. In the case of image data, rotation or inversion does not change the coordinate value of the origin (the coordinate value of the data); only the image displayed within the data changes (is rotated or inverted). In other words, the coordinate value of the origin after rotation or inversion is the coordinate value of the first pixel PX, the same as before the rotation or inversion.
 Next, in the first information output of STEP 5, the processor 132 associates the shift value and rotation/inversion value of each piece of data with its identification information and outputs them to the information storage medium 30 of the subject 51 as the first information I1.
 Next, in the second information output of STEP 6, the processor 132 outputs the coordinate value of each piece of data (that is, the coordinate value of its origin) to the storage device 40 as the second information I2. For data to which a shift value has been applied, the second coordinate value (that is, the second coordinate value of the origin) is included in the second information I2 as the coordinate value of the data; for data to which no shift value has been applied, the first coordinate value (that is, the first coordinate value of the origin) is included.
 In STEP 3, no shift value may be set for the data, and only the rotation or inversion processing of STEP 4 may be performed. Alternatively, in STEP 4, no conversion process may be performed on the data, and only the shift value setting of STEP 3 may be performed.
 FIG. 9 shows data B17 after the shift, rotation, and inversion processing. For example, data B17 is inverted about the Y axis with the first pixel PX, its origin, as the reference axis, rotated 90 degrees clockwise in the XY plane, and its first pixel PX is then placed at the shifted coordinate value of X-20, Y-10. The XY coordinate value of the first pixel PX is then output to the storage device 40 as the second information I2. In the case of image data, the coordinate value of the origin after the rotation and inversion processing remains the coordinate value of the first pixel PX.
 Specifically, for example, the information on data B17 in the second information I2 can be expressed as:
No.F017_{X-20Y-10}_IMG{FFDA 000C 0301...7517}_END ... (3)
Part of the IMG information is omitted from expression (3).
 The number No.F017 at the far left is the identification number (an example of identification information) assigned to data B17. Following the identification number is the second coordinate value of the first pixel PX, which is both the second coordinate value of data B17 and the second coordinate value of the origin serving as the reference axis for the conversion process (rotation or inversion). The value following IMG is the segment containing, among other things, the color information of each pixel making up the block of data B17 (in this example, a block of 80 by 80 pixels).
 For example, the first information I1 of data B17 can be expressed as:
No.F017_SFT{X-10Y-12}_FLP{YES_Y}_TRN{YES_90}_IMG{FFD8 FFE0 0010...FFDB...FFC0...FFC4...FFC4...FFC4...FFD9}_END ... (4)
Part of the IMG information is omitted from expression (4).
 Following the identification number No.F017 are the shift value of data B17, the presence or absence and direction of inversion, and the presence or absence and angle of the rotation processing (in-plane rotation). Specifically, the shift values set for data B17 are X-10 and Y-12, meaning that the first pixel PX of data B17 is shifted from its first coordinate value to its second coordinate value by -10 in the X-axis direction and -12 in the Y-axis direction. Data B17 has also been inverted about the Y axis with its first pixel PX as the reference axis and rotated 90 degrees clockwise in the XY plane about that pixel.
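 The inversion and rotation applied to data B17 can be reproduced with elementary array operations. This is a sketch only: numpy's flip and rot90 stand in for whatever image routines the embodiment actually uses, and the pixel-axis conventions are an assumption.

    import numpy as np

    def transform_block(block, flip_y=True, quarter_turns_cw=1):
        """Mirror a pixel block about the Y axis, then rotate it clockwise in the XY plane."""
        out = np.flip(block, axis=1) if flip_y else block   # Y-axis inversion: left/right mirror
        return np.rot90(out, k=-quarter_turns_cw)           # np.rot90 is CCW, so negate for CW

    # Data B17: Y-axis inversion and a 90-degree clockwise XY rotation about its first
    # pixel PX; the pixel's shifted XY coordinate (X-20, Y-10 in expression (3)) is what
    # goes into the second information I2, while the shift and rotation/inversion values
    # of expression (4) go into the first information I1.
    block_b17 = np.zeros((80, 80, 3), dtype=np.uint8)       # stand-in for the 80 x 80 pixel block
    transformed = transform_block(block_b17)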
 なお、本例においては2次元画像データをJFIFファイルとして説明するが、JFIFファイルはセグメントと呼ばれるデータ単位から構成されており、第一情報I1には、JFIFファイルを構成する一部のセグメントの情報を含んでもよい。例えば、上記(4)で表されるように、第一情報I1において、IMGで示された値に、データB17に関する圧縮方式や縦横のピクセル数などの情報を含んだセグメントが含まれている。 In this example, two-dimensional image data will be explained as a JFIF file, but a JFIF file is composed of data units called segments, and the first information I1 includes information on some segments that make up the JFIF file. May include. For example, as expressed in (4) above, in the first information I1, the value indicated by IMG includes a segment that includes information such as the compression method and the number of vertical and horizontal pixels regarding the data B17.
 In this way, by separately saving the coordinate values of the image data and parts of the segments containing the compression method and color information (RGB values) of the image data as either the first information I1 or the second information I2, the image data cannot be restored from the first information I1 alone because that data is incomplete. Moreover, with the first information I1 alone, the coordinate position at which the first pixel, the origin of the image data, should be placed is also unknown, so the image data cannot be restored accurately. Therefore, even if either the first information I1 or the second information I2 is leaked, an individual can be prevented from being identified.
 Further, the first information I1 of the data B8 extracted for the three-dimensional image data is expressed, for example, as
No.F007_SFT{X-2Y-5Z2}_FLP{YES_YZ}_TRN{YES_XY90YZ0XZ0}_IMG{FFD8 FFE0 0010…FFDB…FFC0…FFC4…FFC4…FFC4…FFD9}…_END ・・・(5)
Note that part of the IMG information is omitted in (5).
 Further, the second information I2 of the data B8 extracted for the three-dimensional image data is expressed, for example, as
No.F007_{X9.0Y5.0Z2X7.0Y2.0Z4.0X9.0Y0Z2X11.0Y-1.0Z0.5}_ORG{X7.0Y2.0Z4.0}_IMG{X4.5Y7.0_FFDA 001A 0101…5526}…_END ・・・(6)
Note that part of the IMG information is omitted in (6).
 Three-dimensional image data is data in which two-dimensional image data is projected and displayed on the XY coordinate axes of the three-dimensional shape data. For example, as expressed in (5) above, the first information I1 may include, in addition to information such as the shift value and the rotation/inversion value of the three-dimensional shape data, segment information indicating the compression method, the numbers of vertical and horizontal pixels, and the like of the two-dimensional image data. Also, as expressed in (6) above, the second information I2 may include, in addition to the second coordinate values of the plurality of formation points constituting the three-dimensional shape data and of its origin, segment information indicating the XY coordinate position of the first pixel, which is the origin of the two-dimensional image data projected and displayed onto the three-dimensional shape data in the XYZ coordinate system, and the color information of each pixel.
 For example, as illustrated in FIG. 10, the position of each block of the two-dimensional image data to be projected and displayed on the three-dimensional shape data can be calculated from the XY coordinate values of the three-dimensional shape data of data B8 with identification information No.F007. Specifically, the image data of data B8 corresponds to, for example, the rectangular block indicated by the thick frame.
 The XY position (origin) of the thick-framed block and the information within the image data are separated into the first information I1 and the second information I2 of identification information No.F007 and stored in the information storage medium 30 and the storage device 40, respectively.
 具体的には、まず、3次元形状データのXY座標値を基に、画像データのブロック位置とサイズを算出する。そして、画像データのブロック内で3次元形状データのXY座標上に位置するピクセルを特定する。例えば、図10の丸で囲った数字で示されるように、30個のピクセル(あくまで説明のために簡素化した一例)が特定される。 Specifically, first, the block position and size of the image data are calculated based on the XY coordinate values of the three-dimensional shape data. Then, pixels located on the XY coordinates of the three-dimensional shape data within the block of image data are specified. For example, as shown by the circled numbers in FIG. 10, 30 pixels (an example simplified for purposes of explanation) are identified.
 続いて、前記ブロックの第一ピクセルを原点としてXY座標値(X4.5、Y7.0)を第二情報I2として記憶装置40に保存する。また、シフト値、変換処理情報(回転処理や反転処理)、画像情報の各種パラメータを第一情報I1として情報保存媒体30に保存する。 Subsequently, the XY coordinate values (X4.5, Y7.0) with the first pixel of the block as the origin are stored in the storage device 40 as second information I2. Further, shift values, conversion processing information (rotation processing and inversion processing), and various parameters of image information are stored in the information storage medium 30 as first information I1.
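 A minimal sketch of this split, assuming plain dictionaries as containers; the key names are illustrative, and the actual stored form is the record syntax shown in (3) to (6).

    def split_block_info(data_id, first_pixel_xy, shift_xy, flip_axis, rotation_deg, image_params):
        # Second information: block origin (XY of the first pixel) -> storage device 40.
        second_info = {"id": data_id, "origin_xy": first_pixel_xy}
        # First information: shift value, conversion parameters, and image parameters
        # -> information storage medium 30 carried by the subject.
        first_info = {"id": data_id, "shift": shift_xy, "flip": flip_axis,
                      "rotation": rotation_deg, "image_params": image_params}
        return first_info, second_info

    first_info, second_info = split_block_info("F007", (4.5, 7.0), (-2, -5), "YZ", 90, {"block": "8x8"})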
 なお、本例においては、画像データは1ブロックが8×8ピクセルから構成される場合について説明したが、抽出する3次元形状データのサイズに合わせて1ブロックの大きさを変えてもよい(例えば、1ブロック=80×80ピクセル)。 In this example, the case where one block of image data is composed of 8 x 8 pixels has been explained, but the size of one block may be changed according to the size of the three-dimensional shape data to be extracted (for example, , 1 block = 80 x 80 pixels).
 また、第二情報I2は、画像データを構成する各ピクセルの色情報などを示すセグメントの情報を含んでもよい。例えば、上記(6)で表されるように、第二情報I2において、IMGで示された値が各ピクセルの色情報などを含んでいる。なお、画像データのブロック内で3次元形状データのXY座標上に位置しないその他のピクセルは色情報をゼロ(RゼロGゼロBゼロの黒色)として抽出する。RゼロGゼロBゼロは一例であり、他の色で抽出してもよい。 Further, the second information I2 may include segment information indicating color information of each pixel forming the image data. For example, as expressed in (6) above, in the second information I2, the value indicated by IMG includes color information of each pixel. Note that color information of other pixels that are not located on the XY coordinates of the three-dimensional shape data within the block of image data is extracted as zero (black with R zero G zero B zero). R zero G zero B zero is just an example, and other colors may be extracted.
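 A rough sketch of this masking step, with an illustrative function name; pixels of the block that are not among the kept positions are written out as black.

    def mask_block(block_rgb, kept_positions):
        # block_rgb: rows of (R, G, B) tuples; kept_positions: set of (row, col) indices of
        # pixels that lie on the XY coordinates of the three-dimensional shape data.
        return [[px if (r, c) in kept_positions else (0, 0, 0)   # other pixels become black
                 for c, px in enumerate(row)]
                for r, row in enumerate(block_rgb)]

    # 8 x 8 block of white pixels; only three positions are kept in this toy example.
    block = [[(255, 255, 255)] * 8 for _ in range(8)]
    masked = mask_block(block, {(0, 0), (3, 4), (7, 7)})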
 次に、図11~図13を用いて、上述のように生成された認証情報(第一情報I1、第二情報I2)を用いた認証処理について説明する。図11は、制御装置13のプロセッサ132において実行される認証処理の流れを例示している。 Next, an authentication process using the authentication information (first information I1, second information I2) generated as described above will be explained using FIGS. 11 to 13. FIG. 11 illustrates the flow of authentication processing executed by the processor 132 of the control device 13.
 まず、プロセッサ132は、撮像装置11により撮像された被認証者50の撮像データを取得する(STEP11)。例えば、決済(支払い)システムなどを利用する被認証者50が撮像システム14に対して所定の範囲内に位置すると、撮像装置11により被認証者50の顔が撮像される。なお、STEP11の撮像データの取得は、後述するSTEP12またはSTEP13の後に行われてもよい。 First, the processor 132 acquires image data of the person to be authenticated 50 captured by the imaging device 11 (STEP 11). For example, when a person to be authenticated 50 using a payment system or the like is located within a predetermined range with respect to the imaging system 14, the face of the person to be authenticated 50 is captured by the imaging device 11. Note that acquisition of the imaging data in STEP 11 may be performed after STEP 12 or STEP 13, which will be described later.
 続いて、プロセッサ132は、読取装置12により読み取られた被認証者50が所持する情報保存媒体30に保存された第一情報I1を取得する(STEP12)。例えば、被認証者50が二次元コードを表示したスマートフォンを二次元コードリーダ121に近づけると、二次元コードリーダ121により第一情報I1が読み取られる。 Subsequently, the processor 132 acquires the first information I1 read by the reading device 12 and stored in the information storage medium 30 owned by the person to be authenticated 50 (STEP 12). For example, when the person to be authenticated 50 brings a smartphone displaying a two-dimensional code close to the two-dimensional code reader 121, the two-dimensional code reader 121 reads the first information I1.
 続いて、プロセッサ132は、記憶装置40から被認証者50の認証に関する第二情報I2を取得する(STEP13)。具体的には、プロセッサ132は、第一情報I1に含まれる認証用のデータの識別情報に対応するデータを記憶装置40から取得する。 Subsequently, the processor 132 acquires second information I2 regarding the authentication of the person to be authenticated 50 from the storage device 40 (STEP 13). Specifically, the processor 132 acquires data corresponding to the identification information of the authentication data included in the first information I1 from the storage device 40.
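 A small sketch of this lookup, assuming the storage device returns records in the textual form of expression (3); the helper name and the in-memory list standing in for the storage device 40 are illustrative.

    def fetch_second_info(stored_records, data_id):
        # Return the second-information record whose identification number matches.
        for record in stored_records:
            if record.startswith(f"No.{data_id}_"):
                return record
        return None

    storage = ["No.F017_{X-20Y-10}_IMG{FFDA 000C 0301…7517}_END"]
    print(fetch_second_info(storage, "F017"))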
 続いて、プロセッサ132は、第一情報I1と第二情報I2に基づいて、被認証者50の認証用のデータを所定の座標系上に復元する(STEP14)。所定の座標系としては、第一情報I1および第二情報I2を生成する際に用いられた座標系が使用される。座標系の情報は、第一情報I1に含まれていてもよい。 Subsequently, the processor 132 restores the authentication data of the person to be authenticated 50 on a predetermined coordinate system based on the first information I1 and the second information I2 (STEP 14). As the predetermined coordinate system, the coordinate system used when generating the first information I1 and the second information I2 is used. Information on the coordinate system may be included in the first information I1.
 FIGS. 12 and 13 show an example of restoring the data B8 of FIG. 6 in the XYZ coordinate system. First, as illustrated in FIG. 12, the processor 132 arranges data B8 in a predetermined coordinate system based on the second coordinate value information of data B8 included in the second information I2. Specifically, data B8 is arranged based on all of the second coordinate values of the plurality of formation points constituting data B8 that are included in the second information I2. Then, based on the information of the shift value S8 corresponding to data B8 included in the first information I1, the position of data B8 arranged in the coordinate system is moved to the first coordinate values. Specifically, all of the plurality of formation points constituting data B8 are moved to the first coordinate values. Note that FIGS. 12 and 13 show only data B8, and show only the first coordinate value PA8, the second coordinate value PB8, and the shift value S8 of the formation point set as the origin among the plurality of formation points constituting data B8.
 Subsequently, as illustrated in FIG. 13, the rotation and inversion of data B8 are adjusted about the origin P8 included in the second information I2 as the base axis, in accordance with the conversion processing information included in the first information I1 (for example, the rotation/inversion value of the rotation processing or inversion processing). As a result, as illustrated in FIG. 13, data B8 is placed (restored) at the correct position and the correct angle on the coordinate system (the same position and angle as in FIG. 6). The two-dimensional image data is likewise placed (restored) at its position and angle before the shift and conversion processing (not shown).
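 The following simplified two-dimensional sketch outlines the restoration of one block. It assumes the registration side recorded the shift as (second coordinate value) minus (first coordinate value), an inversion about the Y axis through the origin, and a clockwise in-plane rotation about the origin, applied in the order inversion, rotation, shift; restoration undoes them in reverse order. The function name and these conventions are assumptions for illustration.

    import math

    def restore_points(points, origin_second, shift, rotation_deg_cw, flipped_y):
        # origin_second: second coordinate value of the origin (from the second information)
        # shift: recorded shift value (from the first information)
        ox = origin_second[0] - shift[0]          # origin back at its first coordinate value
        oy = origin_second[1] - shift[1]
        a = math.radians(rotation_deg_cw)         # undoing a clockwise rotation = rotating CCW
        restored = []
        for (x, y) in points:
            x, y = x - shift[0], y - shift[1]     # 1) undo the shift
            dx, dy = x - ox, y - oy               # 2) undo the rotation about the origin
            x = ox + dx * math.cos(a) - dy * math.sin(a)
            y = oy + dx * math.sin(a) + dy * math.cos(a)
            if flipped_y:                         # 3) undo the Y-axis inversion about the origin
                x = ox - (x - ox)
            restored.append((x, y))
        return restored

    # Example: undo a 90-degree clockwise rotation, a Y-axis inversion, and a (-10, -12) shift.
    print(restore_points([(-20.0, -10.0)], (-20.0, -10.0), (-10, -12), 90, True))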
 プロセッサ132は、全てのデータB1~B9の復元が終了する(STEP15においてYES)まで、STEP14の処理を繰り返す。これにより、全てのデータB1~B9が座標系上に再配置(復元)される(図6参照)。これにより参照データが生成される。 The processor 132 repeats the process in STEP 14 until the restoration of all data B1 to B9 is completed (YES in STEP 15). As a result, all data B1 to B9 are rearranged (restored) on the coordinate system (see FIG. 6). This generates reference data.
 Subsequently, the processor 132 performs the authentication processing based on the restored data and the imaging data acquired in STEP11 (STEP16). Specifically, the processor 132 compares the restored reference data with the imaging data. The processor 132 establishes authentication when the degree of matching between the two exceeds a threshold value. In other words, when the degree of matching exceeds the threshold value, the processor 132 authenticates, as a registered user, the person to be authenticated 50 whose face image data was acquired by the imaging device 11.
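 A hedged sketch of this matching decision; the similarity measure (cosine similarity of flattened feature vectors) is a stand-in chosen for illustration, since the embodiment does not prescribe a particular matching algorithm.

    import math

    def authenticate(reference_vec, captured_vec, threshold=0.95):
        # Authentication succeeds only if the matching score exceeds the threshold.
        dot = sum(a * b for a, b in zip(reference_vec, captured_vec))
        norm = math.sqrt(sum(a * a for a in reference_vec)) * math.sqrt(sum(b * b for b in captured_vec))
        score = dot / norm if norm else 0.0
        return score > threshold

    print(authenticate([1.0, 0.2, 0.7], [1.0, 0.21, 0.69]))   # True for near-identical vectors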
 上記したように、本実施形態に係る認証システム10によれば、制御装置13は、情報保存媒体30から第一情報I1を取得し、記憶装置40から第二情報I2を取得する。そして制御装置13は、第一情報I1と第二情報I2に基づいて、複数のデータB1~B9を所定の座標系上に復元する。 As described above, according to the authentication system 10 according to the present embodiment, the control device 13 acquires the first information I1 from the information storage medium 30 and acquires the second information I2 from the storage device 40. Then, the control device 13 restores the plurality of data B1 to B9 on a predetermined coordinate system based on the first information I1 and the second information I2.
 情報保存媒体30には、第一情報I1として、複数のデータB1~B9のシフト値S1~S9が保存されており、他方、複数のデータB1~B9の座標値は保存されていない。また、記憶装置40には、第二情報I2として、複数のデータB1~B9の座標値が保存されているが、シフト値は保存されていない。これにより、情報保存媒体30または記憶装置40のいずれか一方に記憶された認証情報だけでは、複数のデータを元の撮像データに復元できない。したがって、情報保存媒体30に保存された第一情報I1または記憶装置40に保存された第二情報I2のいずれか一方の情報が流出しても個人が特定されることを防止できる。 In the information storage medium 30, the shift values S1 to S9 of the plurality of data B1 to B9 are stored as the first information I1, and on the other hand, the coordinate values of the plurality of data B1 to B9 are not stored. Further, the storage device 40 stores coordinate values of a plurality of data B1 to B9 as second information I2, but does not store shift values. As a result, it is not possible to restore a plurality of pieces of data to the original captured data using only the authentication information stored in either the information storage medium 30 or the storage device 40. Therefore, even if either the first information I1 stored in the information storage medium 30 or the second information I2 stored in the storage device 40 is leaked, it is possible to prevent an individual from being identified.
 さらに、本実施形態においては、複数のデータB1~B9に対して、回転処理および反転処理が実行されている。情報保存媒体30には、第一情報I1として、回転処理および反転処理の回転反転値が保存されているが、回転処理および反転処理の基軸となる原点P1~P9の座標値は保存されていない。また、記憶装置40には、第二情報I2として、回転処理および反転処理の基軸となる原点P1~P9の座標値は保存されているが、回転反転値は保存されていない。したがって、第一情報I1または第二情報I2が他人に知られた場合でも、他人により複数のデータを復元することを防ぐことができる。 Furthermore, in this embodiment, rotation processing and inversion processing are performed on the plurality of data B1 to B9. The information storage medium 30 stores, as first information I1, the rotation reversal values of the rotation processing and reversal processing, but the coordinate values of the origins P1 to P9, which are the base axes of the rotation processing and reversal processing, are not stored. . Further, the storage device 40 stores, as second information I2, the coordinate values of the origins P1 to P9, which are the base axes of rotation processing and reversal processing, but does not store rotation reversal values. Therefore, even if the first information I1 or the second information I2 is known to another person, it is possible to prevent the other person from restoring the plurality of data.
 また、複数のデータB1~B9は、対象者51(被認証者50)の撮像データから対象者51(被認証者50)の特徴を特定できないサイズに分割されて抽出されたデータである。そして、2次元画像データも同様に、対象者51(被認証者50)の特徴を特定できないサイズ(ブロック)に分割されて抽出されたデータである。これにより、複数のデータB1~B9が他人に知られた場合でも、個人が特定されることを防止できる。 Further, the plurality of data B1 to B9 are data extracted from the imaging data of the subject 51 (person 50 to be authenticated) after being divided into a size that does not allow the characteristics of the subject 51 (person 50 to be authenticated) to be specified. Similarly, the two-dimensional image data is extracted data that is divided into sizes (blocks) in which the characteristics of the subject 51 (authenticated person 50) cannot be specified. Thereby, even if the plurality of data B1 to B9 are known to others, it is possible to prevent the individual from being identified.
 また、複数のデータB1、B2、B5~B9は、合成された場合に被認証者50の連続した領域を構成しないように抽出されたデータである。そして、2次元画像データも同様に、画像データが連続した領域を構成しないように抽出されたデータである。これにより、合成するデータ間の距離が不明なため、複数のデータを復元することができず、個人を特定することができない。 Furthermore, the plurality of data B1, B2, B5 to B9 are data extracted so as not to constitute a continuous area of the person to be authenticated 50 when combined. Similarly, the two-dimensional image data is data extracted so that the image data does not constitute a continuous area. As a result, the distance between the pieces of data to be combined is unknown, making it impossible to restore multiple pieces of data and making it impossible to identify individuals.
 The storage device 40 may also store second information I2 related to the biometric authentication of persons other than the person to be authenticated 50. With this arrangement, even if the second information I2 in the storage device 40 becomes known to a third party, the second information I2 of the person to be authenticated 50 cannot be singled out from the second information I2 of the multiple persons, so a third party can be prevented from restoring the data of the person to be authenticated 50.
 これまで説明した様々な機能を有する制御装置13のプロセッサ132は、汎用メモリと協働して動作する汎用マイクロプロセッサにより実現されうる。汎用マイクロプロセッサとしては、CPU、MPU、GPUが例示されうる。汎用メモリとしては、ROMやRAMが例示されうる。この場合、ROMには、上述した処理を実行するコンピュータプログラムが記憶されうる。コンピュータプログラムは、人工知能(AI)プログラムを含んでもよい。AIプログラムは、多層のニューラルネットワークを用いた教師有りまたは教師なし機械学習(特に、ディープラーニング)によって構築されたプログラム(学習済みモデル)である。ROMは、非一時的なコンピュータ可読媒体の一例である。汎用マイクロプロセッサは、ROM上に記憶されたコンピュータプログラムの少なくとも一部を指定してRAM上に展開し、RAMと協働して上述した処理を実行する。上記のコンピュータプログラムは、汎用メモリにプリインストールされてもよいし、無線通信ネットワークを介して外部サーバ装置からダウンロードされた後、汎用メモリにインストールされてもよい。 The processor 132 of the control device 13 having the various functions described above can be realized by a general-purpose microprocessor that operates in cooperation with a general-purpose memory. Examples of general-purpose microprocessors include CPUs, MPUs, and GPUs. Examples of general-purpose memory include ROM and RAM. In this case, the ROM may store a computer program that executes the above-described processing. The computer program may include an artificial intelligence (AI) program. An AI program is a program (trained model) constructed by supervised or unsupervised machine learning (particularly deep learning) using a multilayer neural network. ROM is an example of a non-transitory computer-readable medium. The general-purpose microprocessor specifies at least a part of the computer program stored on the ROM, loads it on the RAM, and executes the above-described processing in cooperation with the RAM. The computer program described above may be preinstalled in the general-purpose memory, or may be downloaded from an external server device via a wireless communication network and then installed in the general-purpose memory.
 プロセッサ132は、マイクロコントローラ、ASIC、FPGAなどの上記のコンピュータプログラムを実行可能な専用集積回路によって実現されてもよい。この場合、当該専用集積回路に含まれる記憶素子に上記のコンピュータプログラムがプリインストールされる。各プロセッサは、汎用マイクロプロセッサと専用集積回路の組合せによっても実現されうる。 The processor 132 may be realized by a dedicated integrated circuit such as a microcontroller, ASIC, or FPGA that is capable of executing the computer program described above. In this case, the above-mentioned computer program is preinstalled in the memory element included in the dedicated integrated circuit. Each processor may also be implemented by a combination of a general purpose microprocessor and a special purpose integrated circuit.
 上記の各実施形態は、本発明の理解を容易にするための例示にすぎない。上記の各実施形態に係る構成は、本発明の趣旨を逸脱しなければ、適宜に変更・改良されうる。 Each of the above embodiments is merely an illustration to facilitate understanding of the present invention. The configurations according to each of the embodiments described above may be modified and improved as appropriate without departing from the spirit of the present invention.
 上記の実施形態において、データを構成する複数の形成点のうち一つの形成点を原点として説明しているが、原点は、データを構成する形成点である必要はない。 In the above embodiment, one formation point among the plurality of formation points forming the data is described as the origin, but the origin does not have to be the formation point forming the data.
 上記の実施形態において、認証システム10は、撮像装置11を用いて取得された被認証者50の顔の撮像データに基づいて顔認証する例について説明した。しかしながら、認証システム10は、顔認証以外の生体認証を行うように構成されうる。例えば、認証システム10は、指紋認証を行うように構成されうる。具体的には、撮像装置11により取得された被認証者50の指の指紋の撮像データに基づいて指紋認証が行われてもよい。あるいは、撮像システム14には撮像装置11の代わりに指紋センサが搭載され、レーザまたは音波を被認証者50の指に向けて照射し、指から反射されたレーザまたは音波を受信し、受信した信号を解析することにより被認証者50の指の指紋の画像データおよび形状データなどを取得してもよい。また、顔認証や指紋認証の代わりに、認証システム10は、静脈認証を行うように構成されうる。具体的には、撮像装置11を用いて被認証者50の指に近赤外線を照射してカメラにより静脈パターンを撮像し、静脈パターンの撮像データに基づいて静脈認証が行われてもよい。さらに、顔認証、指紋認証または静脈認証の代わりに、認証システム10は、虹彩認証を行うように構成されてもよい。 In the above embodiment, an example was described in which the authentication system 10 performs face authentication based on image data of the face of the person to be authenticated 50 acquired using the imaging device 11. However, the authentication system 10 may be configured to perform biometric authentication other than face authentication. For example, authentication system 10 may be configured to perform fingerprint authentication. Specifically, fingerprint authentication may be performed based on image data of the fingerprint of the person to be authenticated 50 acquired by the imaging device 11. Alternatively, the imaging system 14 is equipped with a fingerprint sensor instead of the imaging device 11, emits a laser or sound wave toward the finger of the person to be authenticated 50, receives the laser or sound wave reflected from the finger, and receives the received signal. Image data, shape data, etc. of the fingerprint of the person to be authenticated 50 may be obtained by analyzing the information. Furthermore, instead of face authentication or fingerprint authentication, the authentication system 10 may be configured to perform vein authentication. Specifically, the imaging device 11 may be used to irradiate the finger of the person to be authenticated 50 with near-infrared rays, the vein pattern may be imaged by a camera, and vein authentication may be performed based on the imaging data of the vein pattern. Furthermore, instead of face authentication, fingerprint authentication, or vein authentication, the authentication system 10 may be configured to perform iris authentication.
 上記の実施形態において、顔認証である一つの認証方法が成立した場合に、被認証者50の認証が成立したと判断している。しかしながら、認証方法は一つだけでなく、複数の認証方法に基づいて、被認証者50の認証が成立したと判断してもよい。例えば、顔認証に加えて、被認証者50のアカウント情報に基づく認証処理が行われてもよい。すなわち、プロセッサ132は、被認証者50により入力されたアカウント情報(パスワードなどを含む)が記憶装置40に保存されている予め登録されたアカウント情報と一致するかを判断する。そして、プロセッサ132は、アカウント情報による認証が成立し、且つ顔認証が成立した場合に、被認証者50の認証が成立したと判断してもよい。 In the above embodiment, it is determined that the authentication of the person to be authenticated 50 is successful when one authentication method, which is face authentication, is successful. However, it may be determined that the authentication of the person to be authenticated 50 is successful based on not only one authentication method but also a plurality of authentication methods. For example, in addition to face authentication, authentication processing based on the account information of the person to be authenticated 50 may be performed. That is, the processor 132 determines whether the account information (including a password, etc.) input by the person to be authenticated 50 matches the pre-registered account information stored in the storage device 40. Then, the processor 132 may determine that the authentication of the person to be authenticated 50 is successful when the authentication based on the account information is successful and the face authentication is successful.
 Alternatively, fingerprint authentication or vein authentication may be performed in addition to face authentication. For example, the processor 132 may determine that the authentication of the person to be authenticated 50 is successful when both face authentication and fingerprint authentication or vein authentication are successful. Alternatively, instead of face authentication, it may be determined that the authentication of the person to be authenticated 50 is successful when both fingerprint authentication and vein authentication are successful.
 In the above embodiment, the imaging system 14 is connected to the control device 13 via the communication network 20. However, the imaging system 14 and the control device 13 may be configured integrally. The integrated configuration of the imaging system 14 and the control device 13 may be realized by, for example, a smartphone, in which case the data restoration and authentication processing may be performed within the smartphone. With such a configuration, the first information I1 held by the person to be authenticated 50 is not transmitted to the communication network 20 at the time of authentication; only the second information I2 is received from the storage device 40 via the communication network 20, and the data can be restored and the authentication processing performed within the smartphone. This prevents the first information I1 from leaking to the outside. The information storage medium 30, the imaging system 14, and the control device 13 may also be configured integrally.
 In the above embodiment, when the first information I1 or the second information I2 is output from the control device 13 to the imaging system 14 or the storage device 40 via the communication network 20, the system may be configured to maintain the confidentiality of the data by using various known encryption and decryption schemes such as public-key and shared-key cryptography.
 In the above embodiment, a shift value is set for every piece of data extracted from the biometric data. However, the system may be configured so that a shift value is set for, for example, at least one piece of data, with the data to which a shift value is assigned being chosen at random by the program. If no shift value is set for data B8, the first information I1 stored in the information storage medium 30 and expressed in (1) above has all of the XYZ values of SFT{X0Y0Z0} equal to zero, and the second information I2 stored in the storage device 40 and expressed in (2) above keeps the origin and coordinate values as the first coordinate values at the time of extraction. Likewise, for the two-dimensional data expressed in (4) above, if no shift value is set, the shift value SFT{X0Y0} in the first information I1 is zero, and the origin and coordinate values of the second information I2 stored in the storage device 40 and expressed in (3) above are also kept as the first coordinate values at the time of extraction.
 Even with this configuration, however, it cannot be determined which of the data stored in the storage device 40 has a shift value set, so an individual can be prevented from being identified even if the first information I1 is leaked. In addition, compared with the case where shift values are set for all of the data stored in the storage device 40, the amount of data processed during the authentication processing is reduced, so faster authentication processing can be expected.
 Likewise, even with this configuration, it cannot be determined whether rotation processing or inversion processing has been applied to the data stored in the storage device 40, so an individual can be prevented from being identified even if the second information I2 is leaked. In addition, compared with the case where shift values are set and rotation processing and inversion processing are applied to the data stored in the storage device 40, the amount of data processed during the authentication processing is reduced, so faster authentication processing can be expected.
 本出願は、2022年5月26日出願の日本特許出願2022−086406号に基づくものであり、その内容はここに参照として取り込まれる。 This application is based on Japanese Patent Application No. 2022-086406 filed on May 26, 2022, the contents of which are incorporated herein by reference.

Claims (33)

  1.  被認証者の身体の一部に関する生体認証用の生体データを取得する認証用データ取得装置と、
     前記被認証者により所持される情報保存媒体から前記被認証者の生体認証に関する第一情報を取得し、記憶装置から前記被認証者の生体認証に関する第二情報を取得し、前記認証用データ取得装置から取得された生体認証用の前記生体データと前記情報保存媒体から取得された前記第一情報と前記記憶装置から取得された前記第二情報とに基づいて前記被認証者を生体認証する認証処理を行う制御装置と、を備えており、
     前記第一情報は、予め取得された前記被認証者の身体の一部に関する生体データから抽出された複数のデータの少なくとも一つのデータに対して設定された位置姿勢変更値を含み、
     前記位置姿勢変更値は、前記少なくとも一つのデータの位置および姿勢の少なくとも一方を第一状態から前記第一状態とは異なる第二状態へ変化させた値であり、
     前記第二情報は、前記複数のデータの各々の座標値を含み、
     前記位置姿勢変更値が設定された前記少なくとも一つのデータの前記座標値は、前記第二状態に関連する座標値であり、
     前記位置姿勢変更値は、前記第二情報に含まれる前記座標値に基づいて前記複数のデータ同士を配置した場合に前記生体データにおける前記複数のデータ同士の相関関係が判別できない値に設定されており、
     前記制御装置は、
     前記第一情報および前記第二情報に基づいて前記複数のデータを前記第一状態で配置して参照データを作成し、
     前記参照データ、および前記認証用データ取得装置により取得された前記生体認証用の前記生体データに基づいて認証処理を行う、認証システム。
    an authentication data acquisition device that acquires biometric data for biometric authentication regarding a part of the body of a person to be authenticated;
    a control device that acquires first information related to the biometric authentication of the person to be authenticated from an information storage medium carried by the person to be authenticated, acquires second information related to the biometric authentication of the person to be authenticated from a storage device, and performs authentication processing that biometrically authenticates the person to be authenticated based on the biometric data for biometric authentication acquired from the authentication data acquisition device, the first information acquired from the information storage medium, and the second information acquired from the storage device,
    The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data regarding a part of the body of the person to be authenticated that has been obtained in advance;
    The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
    The second information includes coordinate values of each of the plurality of data,
    The coordinate value of the at least one data set with the position/orientation change value is a coordinate value related to the second state,
    The position and orientation change value is set to a value such that, when the plurality of data are arranged based on the coordinate values included in the second information, the correlation between the plurality of data in the biometric data cannot be determined,
    The control device includes:
    creating reference data by arranging the plurality of data in the first state based on the first information and the second information;
    An authentication system that performs authentication processing based on the reference data and the biometric data for biometric authentication acquired by the authentication data acquisition device.
  2.  前記位置姿勢変更値は、シフト値を含み、
     前記シフト値は、前記少なくとも一つのデータの位置を、第一座標値から前記第一座標値とは異なる第二座標値までシフトした値であり、
     前記第二状態に関連する座標値は、前記シフト値が設定された前記少なくとも一つのデータの前記第二座標値を含み、
     前記制御装置は、
     前記第一情報および前記第二情報に基づいて、前記複数のデータを前記第一座標値で配置して参照データを作成する、請求項1に記載の認証システム。
    The position and orientation change value includes a shift value,
    The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
    The coordinate values related to the second state include the second coordinate values of the at least one data set with the shift value,
    The control device includes:
    The authentication system according to claim 1, wherein reference data is created by arranging the plurality of data at the first coordinate values based on the first information and the second information.
  3.  前記位置姿勢変更値は、回転反転値を含み、
     前記回転反転値は、前記少なくとも一つのデータの姿勢を、前記少なくとも一つのデータの原点を基軸とした回転処理および反転処理の少なくとも一つの実行により、第一姿勢から前記第一姿勢とは異なる第二姿勢へ変化した値であり、
     前記第二状態に関連する座標値は、前記第二姿勢に変化した後の前記原点の座標値を含み、
     前記制御装置は、
     前記第一情報および前記第二情報に基づいて、前記少なくとも一つのデータに対して回転処理および反転処理の少なくとも一つを実行し、前記複数のデータを前記第一姿勢で配置して、参照データを作成する、請求項1に記載の認証システム。
    The position and orientation change value includes a rotation inversion value,
    The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing about the origin of the at least one data,
    The coordinate values related to the second state include the coordinate values of the origin after changing to the second posture,
    The control device includes:
    The authentication system according to claim 1, wherein reference data is created by executing at least one of rotation processing and inversion processing on the at least one data based on the first information and the second information and arranging the plurality of data in the first orientation.
  4.  前記位置姿勢変更値は、シフト値および回転反転値を含み、
     前記シフト値は、前記少なくとも一つのデータの位置を、第一座標値から前記第一座標値とは異なる第二座標値までシフトした値であり、
     前記回転反転値は、前記少なくとも一つのデータの姿勢を、前記少なくとも一つのデータの原点を基軸とした回転処理および反転処理の少なくとも一つの実行により、第一姿勢から前記第一姿勢とは異なる第二姿勢へ変化した値であり、
     前記第二状態に関連する座標値は、前記第二座標値へシフトし且つ前記第二姿勢に変化した後の前記少なくとも一つのデータの前記原点の座標値を含み、
     前記制御装置は、
     前記第一情報および前記第二情報に基づいて、前記複数のデータを前記第一座標値および前記第一姿勢で配置して参照データを作成する、請求項1に記載の認証システム。
    The position and orientation change value includes a shift value and a rotation inversion value,
    The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
    The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing about the origin of the at least one data,
    The coordinate values related to the second state include the coordinate values of the origin of the at least one data after shifting to the second coordinate values and changing to the second posture,
    The control device includes:
    The authentication system according to claim 1, wherein reference data is created by arranging the plurality of data at the first coordinate values and the first orientation based on the first information and the second information.
  5.  The authentication system according to any one of claims 1 to 4, wherein the plurality of data are data extracted from the biometric data, acquired in advance, regarding the part of the body of the person to be authenticated, divided into a size from which the characteristics of the person to be authenticated cannot be identified.
  6.  The authentication system according to any one of claims 1 to 4, wherein the plurality of data are data extracted so as not to constitute a continuous region of the person to be authenticated when arranged in the first state.
  7.  前記情報保存媒体から前記被認証者の生体認証に関する第一情報を取得する第一情報取得装置をさらに備える、請求項1から請求項4のいずれか一項に記載の認証システム。 The authentication system according to any one of claims 1 to 4, further comprising a first information acquisition device that acquires first information regarding biometric authentication of the person to be authenticated from the information storage medium.
  8.  前記記憶装置を備えており、
     前記記憶装置は、前記被認証者とは異なる被認証者の生体認証に関する第二情報も記憶している、請求項1から請求項4のいずれか一項に記載の認証システム。
    comprising the storage device,
    The authentication system according to any one of claims 1 to 4, wherein the storage device also stores second information regarding biometric authentication of a person to be authenticated who is different from the person to be authenticated.
  9.  前記第一情報は、前記複数のデータを構成する複数のセグメントの一部を含んでおり、
     前記第二情報は、前記複数のセグメントの残りを含んでいる、請求項1から請求項4のいずれか一項に記載の認証システム。
    The first information includes a part of a plurality of segments constituting the plurality of data,
    The authentication system according to any one of claims 1 to 4, wherein the second information includes the remainder of the plurality of segments.
  10.  被認証者の身体の一部に関する生体認証用の生体データを受け付け、前記被認証者により所持される情報保存媒体から前記被認証者の生体認証に関する第一情報を受け付け、記憶装置から前記被認証者の生体認証に関する第二情報を受け付ける入力部と、
     生体認証用の前記生体データと、前記第一情報と、前記第二情報とに基づいて、前記被認証者を生体認証する認証処理を行うように構成されているプロセッサと、を備えており、
     前記第一情報は、予め取得された前記被認証者の身体の一部に関する生体データから抽出された複数のデータの少なくとも一つのデータに対して設定された位置姿勢変更値を含み、
     前記位置姿勢変更値は、前記少なくとも一つのデータの位置および姿勢の少なくとも一方を第一状態から前記第一状態とは異なる第二状態へ変化させた値であり、
     前記第二情報は、前記複数のデータの各々の座標値を含み、
     前記位置姿勢変更値が設定された前記少なくとも一つのデータの前記座標値は、前記第二状態に関連する座標値であり、
     前記位置姿勢変更値は、前記第二情報に含まれる前記座標値に基づいて前記複数のデータ同士を配置した場合に前記生体データにおける前記複数のデータ同士の相関関係が判別できない値に設定されており、
     前記プロセッサは、
     前記第一情報および前記第二情報に基づいて前記複数のデータを前記第一状態で配置して参照データを作成し、
     前記参照データ、および前記生体認証用の前記生体データに基づいて認証処理を行う、制御装置。
    an input unit that receives biometric data for biometric authentication regarding a part of the body of a person to be authenticated, receives first information related to the biometric authentication of the person to be authenticated from an information storage medium carried by the person to be authenticated, and receives second information related to the biometric authentication of the person to be authenticated from a storage device;
    a processor configured to perform an authentication process for biometrically authenticating the person to be authenticated based on the biometric data for biometric authentication, the first information, and the second information;
    The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data regarding a part of the body of the person to be authenticated that has been obtained in advance;
    The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
    The second information includes coordinate values of each of the plurality of data,
    The coordinate value of the at least one data set with the position/orientation change value is a coordinate value related to the second state,
    The position and orientation change value is set to a value such that, when the plurality of data are arranged based on the coordinate values included in the second information, the correlation between the plurality of data in the biometric data cannot be determined,
    The processor includes:
    creating reference data by arranging the plurality of data in the first state based on the first information and the second information;
    A control device that performs authentication processing based on the reference data and the biometric data for biometric authentication.
  11.  前記位置姿勢変更値は、シフト値を含み、
     前記シフト値は、前記少なくとも一つのデータの位置を、第一座標値から前記第一座標値とは異なる第二座標値までシフトした値であり、
     前記第二状態に関連する座標値は、前記シフト値が設定された前記少なくとも一つのデータの前記第二座標値を含み、
     前記プロセッサは、
     前記第一情報および前記第二情報に基づいて、前記複数のデータを前記第一座標値で配置して参照データを作成する、請求項10に記載の制御装置。
    The position and orientation change value includes a shift value,
    The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
    The coordinate values related to the second state include the second coordinate values of the at least one data set with the shift value,
    The processor includes:
    The control device according to claim 10, wherein reference data is created by arranging the plurality of data at the first coordinate values based on the first information and the second information.
  12.  前記位置姿勢変更値は、回転反転値を含み、
     前記回転反転値は、前記少なくとも一つのデータの姿勢を、前記少なくとも一つのデータの原点を基軸とした回転処理および反転処理の少なくとも一つの実行により、第一姿勢から前記第一姿勢とは異なる第二姿勢へ変化した値であり、
     前記第二状態に関連する座標値は、前記第二姿勢に変化した後の前記原点の座標値を含み、
     前記プロセッサは、
     前記第一情報および前記第二情報に基づいて、前記少なくとも一つのデータに対して回転処理および反転処理の少なくとも一つを実行し、前記複数のデータを前記第一姿勢で配置して、参照データを作成する、請求項10に記載の制御装置。
    The position and orientation change value includes a rotation inversion value,
    The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing about the origin of the at least one data,
    The coordinate values related to the second state include the coordinate values of the origin after changing to the second posture,
    The processor includes:
    The control device according to claim 10, wherein reference data is created by executing at least one of rotation processing and inversion processing on the at least one data based on the first information and the second information and arranging the plurality of data in the first orientation.
  13.  前記位置姿勢変更値は、シフト値および回転反転値を含み、
     前記シフト値は、前記少なくとも一つのデータの位置を、第一座標値から前記第一座標値とは異なる第二座標値までシフトした値であり、
     前記回転反転値は、前記少なくとも一つのデータの姿勢を、前記少なくとも一つのデータの原点を基軸とした回転処理および反転処理の少なくとも一つの実行により、第一姿勢から前記第一姿勢とは異なる第二姿勢へ変化した値であり、
     前記第二状態に関連する座標値は、前記第二座標値へシフトし且つ前記第二姿勢に変化した後の前記少なくとも一つのデータの前記原点の座標値を含み、
     前記プロセッサは、
     前記第一情報および前記第二情報に基づいて、前記複数のデータを前記第一座標値および前記第一姿勢で配置して参照データを作成する、請求項10に記載の制御装置。
    The position and orientation change value includes a shift value and a rotation inversion value,
    The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
    The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing about the origin of the at least one data,
    The coordinate values related to the second state include the coordinate values of the origin of the at least one data after shifting to the second coordinate values and changing to the second posture,
    The processor includes:
    The control device according to claim 10, wherein reference data is created by arranging the plurality of data at the first coordinate values and the first orientation based on the first information and the second information.
  14.  The control device according to any one of claims 10 to 13, wherein the plurality of data are data extracted from the biometric data, acquired in advance, regarding the part of the body of the person to be authenticated, divided into a size from which the characteristics of the person to be authenticated cannot be identified.
  15.  前記複数のデータは、前記第一状態で配置された時に前記被認証者の連続した領域を構成しないように抽出されたデータである、請求項10から請求項13のいずれか一項に記載の制御装置。 14. The plurality of pieces of data are data extracted so as not to constitute a continuous area of the person to be authenticated when placed in the first state. Control device.
  16.  前記第一情報は、前記複数のデータを構成する複数のセグメントの一部を含んでおり、
     前記第二情報は、前記複数のセグメントの残りを含んでいる、請求項10から請求項13のいずれか一項に記載の制御装置。
    The first information includes a part of a plurality of segments constituting the plurality of data,
    The control device according to any one of claims 10 to 13, wherein the second information includes the remainder of the plurality of segments.
  17.  制御装置により実行されるコンピュータプログラムであって、
     実行されることにより、前記制御装置に、
     被認証者の身体の一部に関する生体認証用の生体データを受け付けさせ、
     前記被認証者により所持される情報保存媒体から前記被認証者の生体認証に関する第一情報を受け付けさせ、
     記憶装置から前記被認証者の生体認証に関する第二情報を受け付けさせ、
     前記第一情報は、予め取得された前記被認証者の身体の一部に関する生体データから抽出された複数のデータの少なくとも一つのデータに対して設定された位置姿勢変更値を含み、
     前記位置姿勢変更値は、前記少なくとも一つのデータの位置および姿勢の少なくとも一方を第一状態から前記第一状態とは異なる第二状態へ変化させた値であり、
     前記第二情報は、前記複数のデータの各々の座標値を含み、
     前記位置姿勢変更値が設定された前記少なくとも一つのデータの前記座標値は、前記第二状態に関連する座標値であり、
     前記位置姿勢変更値は、前記第二情報に含まれる前記座標値に基づいて前記複数のデータ同士を配置した場合に前記生体データにおける前記複数のデータ同士の相関関係が判別できない値に設定されており、
     前記第一情報および前記第二情報に基づいて前記複数のデータを前記第一状態で配置して参照データを作成させ、
     前記参照データ、および前記生体認証用の前記生体データに基づいて認証処理を行わせる、コンピュータプログラム。
    A computer program executed by a control device, the computer program comprising:
    The execution causes the control device to:
    accept biometric data for biometric authentication regarding a part of the body of the person to be authenticated;
    receiving first information regarding the biometric authentication of the person to be authenticated from an information storage medium owned by the person to be authenticated;
    receiving second information regarding biometric authentication of the person to be authenticated from a storage device;
    The first information includes a position/orientation change value set for at least one of a plurality of data extracted from biometric data regarding a part of the body of the person to be authenticated that has been obtained in advance;
    The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
    The second information includes coordinate values of each of the plurality of data,
    The coordinate value of the at least one data set with the position/orientation change value is a coordinate value related to the second state,
    The position and orientation change value is set to a value such that, when the plurality of data are arranged based on the coordinate values included in the second information, the correlation between the plurality of data in the biometric data cannot be determined,
    creating reference data by arranging the plurality of data in the first state based on the first information and the second information;
    A computer program that causes an authentication process to be performed based on the reference data and the biometric data for biometric authentication.
  18.  対象者の身体の一部に関する生体データを受け付ける入力部と、
     前記生体データに基づいて前記対象者の生体認証に用いられる情報を生成し、前記情報を記憶装置へ出力するプロセッサと、を備えており、
     前記プロセッサは、前記生体データから抽出された複数のデータの少なくとも一つのデータに対して位置姿勢変更値を設定し、
     前記位置姿勢変更値は、前記少なくとも一つのデータの位置および姿勢の少なくとも一方を第一状態から前記第一状態とは異なる第二状態へ変化させた値であり、
     前記位置姿勢変更値は、前記位置姿勢変更値が設定された前記少なくとも一つのデータの前記第二状態に関連する座標値を含む前記複数のデータの各々の座標値に基づいて前記複数のデータ同士を配置した場合に前記生体データにおける前記複数のデータ同士の相関関係が判別できない値に設定されており、
     前記プロセッサは、前記複数のデータの各々の座標値を含む前記情報を生成し、前記情報を第二情報として前記記憶装置へ出力する、制御装置。
    an input unit that accepts biometric data regarding a part of the subject's body;
    a processor that generates information used for biometric authentication of the subject based on the biometric data and outputs the information to a storage device;
    the processor sets a position/orientation change value for at least one of the plurality of data extracted from the biological data;
    The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
    The position/orientation change value is set to a value such that, when the plurality of data are arranged based on the coordinate values of each of the plurality of data, including the coordinate value related to the second state of the at least one data for which the position/orientation change value is set, the correlation between the plurality of data in the biometric data cannot be determined,
    The processor generates the information including coordinate values of each of the plurality of data, and outputs the information to the storage device as second information.
  19.  前記プロセッサは、前記複数のデータの少なくとも一つのデータに対して設定された前記位置姿勢変更値を含む第一情報を生成し、前記第一情報を前記対象者が所持する情報保存媒体へ出力する、請求項18に記載の制御装置。 The processor generates first information including the position/orientation change value set for at least one of the plurality of data, and outputs the first information to an information storage medium owned by the subject. 19. The control device according to claim 18.
  20.  前記位置姿勢変更値は、シフト値を含み、
     前記シフト値は、前記少なくとも一つのデータの位置を、第一座標値から前記第一座標値とは異なる第二座標値までシフトした値であり、
     前記複数のデータの各々の座標値には、前記シフト値が設定された前記少なくとも一つのデータの前記第二座標値が含まれる、請求項18または請求項19に記載の制御装置。
    The position and orientation change value includes a shift value,
    The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
    The control device according to claim 18 or 19, wherein the coordinate value of each of the plurality of data includes the second coordinate value of the at least one data set with the shift value.
  21.  前記位置姿勢変更値は、回転反転値を含み、
     前記回転反転値は、前記少なくとも一つのデータの姿勢を、前記少なくとも一つのデータの原点を基軸とした回転処理および反転処理の少なくとも一つの実行により、第一姿勢から前記第一姿勢とは異なる第二姿勢へ変化した値であり、
     前記複数のデータの各々の座標値には、前記第二姿勢に変化した後の前記少なくとも一つのデータの原点の座標値が含まれる、請求項18または請求項19に記載の制御装置。
    The position and orientation change value includes a rotation inversion value,
    The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing about the origin of the at least one data,
    The control device according to claim 18 or 19, wherein the coordinate value of each of the plurality of data includes the coordinate value of the origin of the at least one data after changing to the second attitude.
  22.  前記位置姿勢変更値は、シフト値および回転反転値を含み、
     前記シフト値は、前記少なくとも一つのデータの位置を、第一座標値から前記第一座標値とは異なる第二座標値までシフトした値であり、
     前記回転反転値は、前記少なくとも一つのデータの姿勢を、前記少なくとも一つのデータの原点を基軸とした回転処理および反転処理の少なくとも一つの実行により、第一姿勢から前記第一姿勢とは異なる第二姿勢へ変化した値であり、
     前記複数のデータの各々の座標値には、前記第二座標値へシフトし且つ前記第二姿勢に変化した後の前記少なくとも一つのデータの前記原点の座標値が含まれる、請求項18または請求項19に記載の制御装置。
    The position and orientation change value includes a shift value and a rotation inversion value,
    The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
    The rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by executing at least one of rotation processing and inversion processing about the origin of the at least one data,
    19. The coordinate value of each of the plurality of data includes the coordinate value of the origin of the at least one data after shifting to the second coordinate value and changing to the second attitude. 20. The control device according to item 19.
  23.  前記プロセッサは、前記複数のデータを構成する複数のセグメントの一部を前記第一情報に含め、前記複数のセグメントの残りを前記第二情報に含める、請求項19に記載の制御装置。 The control device according to claim 19, wherein the processor includes a part of the plurality of segments constituting the plurality of data in the first information, and includes the remainder of the plurality of segments in the second information.
  24.  前記プロセッサは、前記生体データを前記対象者の特徴を特定できないサイズに分割して前記複数のデータとして抽出する、請求項18または請求項19に記載の制御装置。 20. The control device according to claim 18 or 19, wherein the processor divides the biometric data into sizes in which characteristics of the subject cannot be identified and extracts the plurality of data.
  25.  前記プロセッサは、前記複数のデータが前記第一状態で配置された時に前記対象者の連続した領域を構成しないように抽出する、請求項18または請求項19に記載の制御装置。 The control device according to claim 18 or 19, wherein the processor extracts the plurality of data so that they do not constitute a continuous region of the subject when arranged in the first state.
  26.  制御装置により実行されるコンピュータプログラムであって、
     実行されることにより、前記制御装置に、
     対象者の身体の一部に関する生体データを受け付けさせ、
     前記生体データに基づいて前記対象者の生体認証に用いられる情報を生成し、前記情報を記憶装置へ出力させ、
     前記生体データから抽出された複数のデータの少なくとも一つのデータに対して位置姿勢変更値を設定させ、
     前記位置姿勢変更値は、前記少なくとも一つのデータの位置および姿勢の少なくとも一方を第一状態から前記第一状態とは異なる第二状態へ変化させた値であり、
     前記位置姿勢変更値は、前記位置姿勢変更値が設定された前記少なくとも一つのデータの前記第二状態に関連する座標値を含む前記複数のデータの各々の座標値に基づいて前記複数のデータ同士を配置した場合に前記生体データにおける前記複数のデータ同士の相関関係が判別できない値に設定されており、
     前記複数のデータの各々の座標値を含む前記情報を生成し、前記情報を第二情報として前記記憶装置へ出力させる、コンピュータプログラム。
    A computer program executed by a control device, the computer program comprising:
    The execution causes the control device to:
    Accept biometric data regarding a part of the target person's body,
    generating information used for biometric authentication of the subject based on the biometric data and outputting the information to a storage device;
    setting a position/orientation change value for at least one of the plurality of data extracted from the biological data;
    The position and orientation change value is a value obtained by changing at least one of the position and orientation of the at least one data from a first state to a second state different from the first state,
    The position/orientation change value is set to a value such that, when the plurality of data are arranged based on the coordinate values of each of the plurality of data, including the coordinate value related to the second state of the at least one data for which the position/orientation change value is set, the correlation between the plurality of data in the biometric data cannot be determined,
    A computer program that generates the information including coordinate values of each of the plurality of data, and causes the information to be output to the storage device as second information.
  27.  実行されることにより、前記制御装置に、
     前記複数のデータの少なくとも一つのデータに対して設定された前記位置姿勢変更値を含む第一情報を生成し、前記第一情報を前記対象者が所持する情報保存媒体へ出力させる、請求項26に記載のコンピュータプログラム。
    The execution causes the control device to:
    26. Generating first information including the position/orientation change value set for at least one of the plurality of data, and outputting the first information to an information storage medium owned by the subject. The computer program described in .
  28.  前記位置姿勢変更値は、シフト値を含み、
     前記シフト値は、前記少なくとも一つのデータの位置を、第一座標値から前記第一座標値とは異なる第二座標値までシフトした値であり、
     前記複数のデータの各々の座標値には、前記シフト値が設定された前記少なくとも一つのデータの前記第二座標値が含まれる、請求項26または請求項27に記載のコンピュータプログラム。
    The position and orientation change value includes a shift value,
    The shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
    28. The computer program according to claim 26, wherein the coordinate value of each of the plurality of data includes the second coordinate value of the at least one data set with the shift value.
  29.  The computer program according to claim 26 or 27, wherein
     the position/orientation change value includes a rotation inversion value,
     the rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by performing at least one of rotation processing and inversion processing about the origin of the at least one data, and
     the coordinate value of each of the plurality of data includes the coordinate value of the origin of the at least one data after the change to the second orientation.
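The rotation/inversion of claim 29 re-orients a tile about its own origin, and only the origin's coordinate after the change is stored. The sketch below encodes the rotation inversion value as a (quarter-turns, flip) pair and uses NumPy's rot90 and fliplr; that encoding is an assumption made for illustration.

    import numpy as np

    def apply_rotation_inversion(tile, quarter_turns, flip):
        # Re-orient a tile about its own origin: rotate in 90-degree steps and
        # optionally mirror it. (quarter_turns, flip) plays the role of the
        # rotation inversion value.
        out = np.rot90(tile, k=quarter_turns % 4)
        return np.fliplr(out) if flip else out

    def undo_rotation_inversion(tile, quarter_turns, flip):
        # Invert the re-orientation: mirror first, then rotate back.
        out = np.fliplr(tile) if flip else tile
        return np.rot90(out, k=-(quarter_turns % 4))

    t = np.arange(16).reshape(4, 4)
    scrambled = apply_rotation_inversion(t, quarter_turns=3, flip=True)
    assert np.array_equal(undo_rotation_inversion(scrambled, 3, True), t)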
  30.  The computer program according to claim 26 or 27, wherein
     the position/orientation change value includes a shift value and a rotation inversion value,
     the shift value is a value obtained by shifting the position of the at least one data from a first coordinate value to a second coordinate value different from the first coordinate value,
     the rotation inversion value is a value obtained by changing the orientation of the at least one data from a first orientation to a second orientation different from the first orientation by performing at least one of rotation processing and inversion processing about the origin of the at least one data, and
     the coordinate value of each of the plurality of data includes the coordinate value of the origin of the at least one data after being shifted to the second coordinate value and changed to the second orientation.
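Claim 30 composes the two changes, and the stored coordinate is the tile's origin after both the shift and the re-orientation. A sketch of that composition, assuming the hypothetical apply_shift/undo_shift and apply_rotation_inversion/undo_rotation_inversion helpers from the two previous sketches are in scope:

    def scramble_tile(tile, first_coord, shift, quarter_turns, flip):
        # Apply both parts of a combined position/orientation change value and
        # return the re-oriented tile together with the origin coordinate in
        # the second state -- the only coordinate that would be stored as part
        # of the second information.
        moved_origin = apply_shift(first_coord, shift)
        reoriented = apply_rotation_inversion(tile, quarter_turns, flip)
        return reoriented, moved_origin

    def unscramble_tile(tile, second_coord, shift, quarter_turns, flip):
        # Undo both changes using the values held on the subject's medium.
        original = undo_rotation_inversion(tile, quarter_turns, flip)
        return original, undo_shift(second_coord, shift)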
  31.  The computer program according to claim 27, which, when executed, further causes the control device to:
     include a part of a plurality of segments constituting the plurality of data in the first information, and include the remainder of the plurality of segments in the second information.
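Claim 31 goes one step further by also splitting the data themselves: part of each tile's segments is carried by the subject and the rest is stored remotely, so neither record alone contains the whole fragment. The byte-level split below is only one possible reading of how segments might be divided, used here for illustration.

    def split_segments(payload, keep_on_card=8):
        # Split one tile's payload into two segments: the first bytes go into
        # the first information (subject's medium), the remainder into the
        # second information (storage device).
        return payload[:keep_on_card], payload[keep_on_card:]

    def join_segments(card_part, server_part):
        # Recombine the segments at authentication time.
        return card_part + server_part

    card_seg, server_seg = split_segments(b"example-tile-payload-bytes")
    assert join_segments(card_seg, server_seg) == b"example-tile-payload-bytes"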
  32.  The computer program according to claim 26 or 27, which, when executed, further causes the control device to:
     divide the biometric data into pieces of a size from which the characteristics of the subject cannot be identified and extract the pieces as the plurality of data.
  33.  The computer program according to claim 26 or 27, which, when executed, further causes the control device to:
     extract the plurality of data such that, when arranged in the first state, they do not constitute a continuous region of the subject.
PCT/JP2023/020731 2022-05-26 2023-05-26 Authentication system, control device, and computer program WO2023229052A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022086406A JP2023173879A (en) 2022-05-26 2022-05-26 Authentication system, control device, and computer program
JP2022-086406 2022-05-26

Publications (1)

Publication Number Publication Date
WO2023229052A1 (en)

Family

ID=88919346

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/020731 WO2023229052A1 (en) 2022-05-26 2023-05-26 Authentication system, control device, and computer program

Country Status (2)

Country Link
JP (1) JP2023173879A (en)
WO (1) WO2023229052A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001007802A (en) * 1999-06-21 2001-01-12 Fujitsu Ltd Method and device for ciphering and deciphering living body information, and principal authentication system utilizing living body information
JP2008129743A (en) * 2006-11-20 2008-06-05 Hitachi Ltd Featured value conversion method for biological information, device therefor, and user authentication system using the same
JP3230238U (en) * 2018-04-10 2021-01-14 ブラック ゴールド コイン インコーポレイテッドBlack Gold Coin, Inc. A system for securely storing electronic data

Also Published As

Publication number Publication date
JP2023173879A (en) 2023-12-07

Similar Documents

Publication Publication Date Title
US11936647B2 (en) Identity verification method and apparatus, storage medium, and computer device
US9076048B2 (en) Biometric identification, authentication and verification using near-infrared structured illumination combined with 3D imaging of the human ear
JP6452617B2 (en) Biometric iris matching system
US9613428B2 (en) Fingerprint authentication using stitch and cut
JP4351982B2 (en) Personal authentication method, apparatus and program
CN107995979A (en) Use the user's identification and/or certification for staring information
CN112639871B (en) Biometric authentication system, biometric authentication method, and recording medium
JP2007011667A (en) Iris authentication device and iris authentication method
CN113646806A (en) Image processing apparatus, image processing method, and recording medium storing program
CN109686011A (en) The user identification method of self-aided terminal and self-aided terminal
KR20100034843A (en) Method and apparatus for security using three-dimensional(3d) face recognition
JP5531585B2 (en) Biological information processing apparatus, biological information processing method, biological information processing system, and computer program for biological information processing
WO2023229052A1 (en) Authentication system, control device, and computer program
Samatha et al. Securesense: Enhancing person verification through multimodal biometrics for robust authentication
JP5279007B2 (en) Verification system, verification method, program, and recording medium
JP4900701B2 (en) Authentication system
US20240013574A1 (en) Age verification
KORICHI Biometrics and Information Security for a Secure Person Identification
Chaudhari et al. Prevention of spoof attacks in fingerprinting using histogram features
WO2018113803A1 (en) Multi-factor authentication method
Stanuch et al. Artificial database expansion based on hand position variability for palm vein biometric system
KR20240012626A (en) Authenticator capable of self-authentication and adult authentication
KR101792014B1 (en) Integrate module checking algorithm of finger vein and finger vein at the same time
KR101792020B1 (en) Integrate module checking algorithm of fingerprint1 and fingerprint2 at the same time
Ahmed et al. Data fusion-based multimodal biometric system for face recognition using manhattan distance penalty weight

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23811924

Country of ref document: EP

Kind code of ref document: A1