
WO2016157827A1 - Determination system, determination method, and determination program - Google Patents

Determination system, determination method, and determination program

Info

Publication number
WO2016157827A1
WO2016157827A1 (application PCT/JP2016/001641)
Authority
WO
WIPO (PCT)
Prior art keywords
registration
image
unit
determination
identification information
Prior art date
Application number
PCT/JP2016/001641
Other languages
English (en)
Japanese (ja)
Inventor
今井 豊 (Yutaka Imai)
Original Assignee
日本電気株式会社 (NEC Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 (NEC Corporation)
Priority to JP2017509259A (granted as patent JP6481754B2)
Publication of WO2016157827A1

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B42 — BOOKBINDING; ALBUMS; FILES; SPECIAL PRINTED MATTER
    • B42D — BOOKS; BOOK COVERS; LOOSE LEAVES; PRINTED MATTER CHARACTERISED BY IDENTIFICATION OR SECURITY FEATURES; PRINTED MATTER OF SPECIAL FORMAT OR STYLE NOT OTHERWISE PROVIDED FOR; DEVICES FOR USE THEREWITH AND NOT OTHERWISE PROVIDED FOR; MOVABLE-STRIP WRITING OR READING APPARATUS
    • B42D25/00 — Information-bearing cards or sheet-like structures characterised by identification or security features; Manufacture thereof
    • B42D25/20 — Information-bearing cards or sheet-like structures characterised by a particular use or purpose
    • B42D25/26 — Entrance cards; Admission tickets
    • B42D25/30 — Identification or security features, e.g. for preventing forgery

Definitions

  • the present invention relates to a determination system that can determine the authenticity of an object or the authenticity of an owner by using object authentication technology.
  • Authenticated items such as ID cards and concert tickets are subject to counterfeiting and theft. Therefore, a system that can confirm the authenticity of an object has been developed.
  • An example of a system that can determine the authenticity of an object is described in Patent Document 1.
  • the system of Patent Document 1 includes an input unit, a verification data generation unit, a verification reference data storage unit, and a verification unit, and operates as follows.
  • the input means inputs the image data for verification displayed on the ID card to be verified.
  • the collation data generation means converts the input image data into binary one-dimensional information.
  • the collation reference data storage means stores binary one-dimensional information serving as a collation reference.
  • the collation unit collates the binary one-dimensional information generated by the collation data generation unit with the binary one-dimensional information stored in the collation reference data storage unit.
  • the system as described above has a problem that it can notify whether the object is authentic, but cannot notify whether the owner of the object is authentic.
  • an object of the present invention is to provide a determination system capable of notifying the authenticity of an object and the authenticity of the owner of the object.
  • A determination system according to the present invention includes: a storage unit that stores registration identification information of a registration target object and a registered image of a specific location of the registration target object in association with each other; an acquisition unit that acquires a designated image of a location designated by a user in an image obtained by capturing a presentation target object, and acquires from the storage unit the registered image associated with the same registration identification information as the presentation identification information of the presentation target object; a determination unit that determines whether the acquired designated image and the registered image match; and a notification unit that notifies the determination result of the determination unit.
  • According to the present invention, it is possible to provide a determination system capable of notifying the authenticity of an object and the authenticity of the owner of the object.
  • It is a flowchart showing an example of the registration process of the second embodiment. It is a flowchart showing an example of the determination process of the second embodiment. It is a block diagram showing an example of the configuration of the third embodiment. It is a flowchart showing an example of the registration process of the third embodiment. It is a block diagram showing an example of the configuration of the fourth embodiment. It is an example of the information stored by the storage unit.
  • FIG. 1 is a block diagram showing a configuration example of the first embodiment.
  • the determination system 2000 includes a reception unit 20, a transmission unit 21, a storage unit 22, an acquisition unit 23, a determination unit 24, and a notification unit 25.
  • The storage unit 22 acquires the registration identification information of the registration target object and the registered image of the specific location of the registration target object received by the reception unit 20. Then, the storage unit 22 stores the acquired registration identification information and registered image in association with each other.
  • the registration identification information may be, for example, an identification number of an object to be registered, a manufacturing number, information about an owner, information about an article, and the like. Further, the registration identification information may not be information for specifying one registration object, but may be information that can narrow down one or a plurality of registration objects from all the registration objects.
  • the registration identification information may be input by a user of the system using an input device (not shown), or may be read from a registration object by a reading device (not shown). In that case, the receiving unit 20 receives registration identification information from an input device (not shown) or a reading device (not shown). Alternatively, the registration identification information may be input by a user of the system via an input unit (not shown) in the determination system 2000. In that case, the storage unit 22 acquires registration identification information from an input unit (not shown).
  • the registered image may be generated from an image of a registration object captured by an imaging device (not shown).
  • a processing unit (not shown) generates a registered image by cutting out a specific portion from an image obtained by capturing the registration target.
  • The reception unit 20 receives the registered image from an imaging device (not shown).
  • the registered image may be generated by an imaging device (not shown) imaging a specific part of the registration target. The specific location may be specified by the user or the system.
  • FIG. 2 shows an example of the registration object and the specific part.
  • FIG. 2 shows an image obtained by capturing a concert ticket 400 that is a registration object.
  • the registered image may be an image generated by cutting out the specific portion 40 from the image obtained by capturing the concert ticket 400.
  • the registered image may be an image obtained by capturing the specific portion 40 in the concert ticket 400.
  • “ID 2100” printed on the concert ticket 400 is an identification number of the concert ticket and corresponds to registration identification information. Therefore, the storage unit 22 acquires the registration identification information “2100” received by the receiving unit 20 and a registered image that is an image of the specific location 40, and stores them in association with each other.
  • this system notifies the owner of the registration target object of the location of the specific part of the registration target object.
  • the notification of the position of the specific location may be performed by a display unit (not shown).
  • a display unit may mark and display the position of a specific location on an image obtained by imaging a registration target.
  • the storage unit 22 stores the registration identification information “2100” and the registered image with the file name “bbb” in association with each other. That is, the registered image which is the image of the specific location 40 is stored in the storage unit 22 as the file name “bbb”.
  • The storage unit 22 may store two or more registered images in association with one piece of registration identification information. That is, the storage unit 22 may acquire one piece of registration identification information and two or more registered images received by the reception unit 20, and store them in association with each other. In FIG. 3, for example, the storage unit 22 stores the registration identification information “2101”, the registered image with the file name “ccc1”, and the registered image with the file name “ccc2” in association with each other.
  • the file name may be set by a processing unit (not shown) when the storage unit 22 stores a registered image, or may be set by a user of this system.
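The association between registration identification information and registered-image file names described above can be pictured as a small lookup table. The sketch below is an illustration only, not the patent's implementation; the class and method names are hypothetical:

```python
class StorageUnit:
    """Toy model of the storage unit 22: maps registration identification
    information to one or more registered-image file names (FIG. 3 style)."""

    def __init__(self):
        # registration identification information -> list of file names
        self._table = {}

    def store(self, registration_id, image_file):
        # Two or more registered images may be associated with one ID.
        self._table.setdefault(registration_id, []).append(image_file)

    def lookup(self, presentation_id):
        # Return the registered images whose registration identification
        # information equals the presentation identification information.
        return self._table.get(presentation_id, [])

storage = StorageUnit()
storage.store("2100", "bbb")
storage.store("2101", "ccc1")
storage.store("2101", "ccc2")
```

With this toy table, `storage.lookup("2101")` yields both file names, mirroring the case where one ID is associated with two registered images.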
  • the obtaining unit 23 obtains a designated image at a location designated by the user among images obtained by capturing the presentation object. That is, the acquisition unit 23 acquires the designated image received by the reception unit 20.
  • the designated image is an image of a part designated by the user among images obtained by capturing the presentation object.
  • FIG. 4 shows an example of the presentation object and the location designated by the user.
  • FIG. 4 shows an image obtained by capturing a concert ticket 500 that is an object to be presented. An image generated by cutting out a part designated by the user from the image obtained by capturing the concert ticket 500 is the designated image. If the location designated by the user (hereinafter referred to as the designated location) is the designated location 50, the designated image is an image of the designated location 50 in the image obtained by capturing the concert ticket 500. Similarly, if the user has designated the designated location 51, the designated image is an image of the designated location 51 in the image obtained by capturing the concert ticket 500.
  • the specified image may be generated by a generation unit (not shown) by cutting out a portion specified by the user from an image obtained by capturing the presentation target object.
  • A generation unit (not shown) acquires the position information of the location designated by the user, which is received by the reception unit 20. Furthermore, the generation unit (not shown) acquires the image of the presentation target object received by the reception unit 20. Then, the generation unit (not shown) generates the designated image by cutting out the designated location from the acquired image.
  • the acquisition unit 23 acquires a designated image from a generation unit (not shown).
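The cut-out step performed by the generation unit (not shown) can be illustrated roughly as follows. The helper name, the list-of-rows image representation, and the toy pixel values are all assumptions made for this sketch:

```python
def crop_region(image, corners):
    """Cut out the axis-aligned rectangle bounded by four (x, y) corner
    points from an image given as a list of pixel rows (hypothetical helper)."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    return [row[x0:x1] for row in image[y0:y1]]

# Toy 6x6 "image"; the designated location is the 2x2 block spanning
# corner points (1, 1), (3, 1), (1, 3), (3, 3).
image = [[10 * r + c for c in range(6)] for r in range(6)]
designated = crop_region(image, [(1, 1), (3, 1), (1, 3), (3, 3)])
```

In a real system the cropped region would of course come from a captured photograph rather than a list of integers; the point is only that the designated image is a sub-region of the presentation-object image.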
  • the acquisition unit 23 acquires a registration image associated with the same registration identification information as the presentation identification information of the presentation object from the storage unit 22. Specifically, the acquisition unit 23 acquires the presentation identification information received by the reception unit 20. Then, the acquisition unit 23 acquires a registration image associated with the same registration identification information as the acquired presentation identification information from the storage unit 22.
  • the presentation identification information may be anything as long as it can identify the presentation object. That is, the presentation identification information may be the same information as the registration identification information. That is, the identification number of an object to be presented, a manufacturing number, information about an owner, information about an article, and the like may be used. Further, the presentation identification information may not be information for specifying one presentation object, but may be information that can narrow down one or more presentation objects from all the presentation objects.
  • the presentation identification information may be input by a user of the system using an input device (not shown), or may be read from a presentation object by a reading device (not shown). In that case, the receiving unit 20 receives the presentation identification information from an input device (not shown) or a reading device (not shown). Alternatively, the presentation identification information may be input by a user of the system via an input unit (not shown) in the determination system 2000. In that case, the acquisition unit 23 acquires presentation identification information from an input unit (not shown).
  • The acquisition unit 23 acquires the presentation identification information “2100” received by the reception unit 20. Then, the acquisition unit 23 acquires from the storage unit 22 the registered image associated with the registration identification information “2100” that is the same as the presentation identification information. That is, when the storage unit 22 stores the information illustrated in FIG. 3, the acquisition unit 23 acquires the image with the file name “bbb” associated with the registration identification information “2100” from the storage unit 22.
  • the determination unit 24 determines whether the designated image acquired by the acquisition unit 23 matches the registered image acquired by the acquisition unit 23.
  • Various existing object authentication techniques can be used for the determination by the determination unit 24.
  • The determination unit 24 can determine the matching of images by, for example, a method using pattern matching, a method using graph matching, or a method using the Euclidean distance between feature amounts.
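As a rough illustration of the last of these methods, the sketch below compares two feature vectors by Euclidean distance. The feature values, the extractor they would come from, and the threshold are all invented for the example and are not taken from the patent:

```python
import math

def euclidean_distance(f1, f2):
    # Euclidean distance between two equal-length feature vectors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))

def images_match(designated_features, registered_features, threshold=1.0):
    # One way a determination unit could decide a match: feature vectors
    # closer than a threshold are treated as the same image.
    return euclidean_distance(designated_features, registered_features) <= threshold

registered = [0.12, 0.80, 0.33]   # features of the registered image (toy values)
same_item = [0.11, 0.79, 0.35]    # near-identical capture of the same location
other_item = [0.90, 0.10, 0.60]   # a different location or object
```

The threshold trades false accepts against false rejects; in practice it would be tuned on real feature data rather than fixed at 1.0.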
  • the determination unit 24 determines that the designated image and the registered image match, it can be said that the position of the designated portion and the specific portion are the same, and the presentation object and the registration object are the same. This is because even if the presentation object and the registration object are the same, if the position of the designated place and the specific place are different, the images to be judged are different and do not match. Moreover, even if the position of the designated place and the specific place is the same, if the presentation object and the registration object are not the same, the determination target images are different and do not match. Therefore, when the determination unit 24 determines that the specified image and the registered image match, the position of the specified location and the specific location are the same, and the presentation object and the registration object are the same.
  • Furthermore, the owner of the registration target object and the owner of the presentation target object are the same. This is because the present system notifies the owner of the registration target object of the position of the specific location in the registration target object, so that the owner of the registration target object knows the position of the specific location. Therefore, when the determination unit 24 determines that they match, it can be said that the owner of the presentation target object is the true owner (the owner at the time of registration) and that the presentation target object is the true object (the object at the time of registration).
  • For example, the determination unit 24 determines whether the designated image of the designated location 50 in the concert ticket 500 matches the registered image (file name “bbb”) of the specific location 40 in the concert ticket 400. When the determination unit 24 determines that they match, the designated location 50 and the specific location 40 are at the same position in the concert ticket, and furthermore, the concert ticket 400 and the concert ticket 500 are the same.
  • On the other hand, if the user designates the designated location 51, the determination unit 24 determines whether the designated image of the designated location 51 in the concert ticket 500 matches the registered image with the file name “bbb”.
  • When the determination unit 24 determines that they do not match, either the designated location 51 and the specific location 40 are not at the same position in the concert ticket, or the concert ticket 400 and the concert ticket 500 are not the same.
  • The notification unit 25 notifies the determination result of the determination unit 24. Specifically, the notification unit 25 acquires the determination result from the determination unit 24 and notifies the acquired result. For example, the notification unit 25 sends the determination result to a display (not shown) via the transmission unit 21. In the case of a match, the display (not shown) may show, for example, “Collation succeeded. Both the owner and the object are true.” In the case of a non-match, the display (not shown) may show, for example, “Collation failed. The owner or the object is false.”
  • the notification by the notification unit 25 is not limited to that displayed on the display.
  • the notification unit 25 may notify the determination result by causing the speaker (not shown) to send audio through the transmission unit 21.
  • the notification unit 25 may synthesize the voice with a voice synthesis unit (not shown).
  • FIG. 5 is a flowchart showing the registration process of the first embodiment.
  • the registration process is a process of the present system until the storage unit 22 stores the registration identification information of the registration target object and the registration image of the specific part of the registration target object in association with each other.
  • The storage unit 22 acquires the registration identification information of the registration target object and the registered image of the specific location of the registration target object received by the reception unit 20. Then, the storage unit 22 stores the acquired registration identification information and registered image in association with each other (S1).
  • FIG. 6 is a flowchart showing the determination process of the first embodiment.
  • The determination process is the processing of the present system from when the acquisition unit 23 acquires the designated image and the registered image until the notification by the notification unit 25 is performed.
  • the acquisition unit 23 acquires the designated image received by the reception unit 20.
  • The designated image is an image of the location designated by the user in the image obtained by capturing the presentation target object. Furthermore, the acquisition unit 23 acquires the registered image associated with the same registration identification information as the presentation identification information of the presentation target object from the storage unit 22. Specifically, the acquisition unit 23 acquires the presentation identification information received by the reception unit 20. Then, the acquisition unit 23 acquires the registered image associated with the same registration identification information as the acquired presentation identification information from the storage unit 22 (S2).
  • the determination unit 24 determines whether the designated image acquired by the acquisition unit 23 matches the registered image acquired by the acquisition unit 23 (S3).
  • The notification unit 25 acquires the determination result from the determination unit 24 and notifies the acquired determination result. In the case of a match, the notification unit 25 notifies the matching determination result (S4). The notification unit 25 sends the determination result to a display (not shown) via the transmission unit 21, and the display (not shown) may show, for example, “Collation succeeded. Both the owner and the object are true.” On the other hand, in the case of a non-match, the notification unit 25 notifies the non-matching determination result (S5). As in the case of a match, the notification unit 25 sends the determination result to a display (not shown) via the transmission unit 21, and the display (not shown) may show, for example, “Collation failed. The owner or the object is false.”
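The flow S2 to S5 described above can be sketched end to end as follows. The storage mapping, the plain-string stand-ins for images, and the comparison callback are all hypothetical; a real system would compare actual image data:

```python
def determination_process(storage, presentation_id, designated_image, match_fn):
    """Sketch of steps S2-S5 under assumed interfaces: `storage` maps
    registration identification information to a list of registered images,
    and `match_fn` decides whether two images match."""
    registered_images = storage.get(presentation_id, [])                   # S2
    if any(match_fn(designated_image, reg) for reg in registered_images):  # S3
        return "Collation succeeded. Both the owner and the object are true."  # S4
    return "Collation failed. The owner or the object is false."               # S5

# Toy storage mirroring FIG. 3; images stand in as plain strings here.
storage = {"2100": ["bbb"], "2101": ["ccc1", "ccc2"]}
```

Because the storage may hold several registered images per ID, the sketch accepts a match against any of them; the patent's single-image case is simply the one-element list.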
  • the first embodiment may have the configuration shown in FIG.
  • a determination system 1000 illustrated in FIG. 7 includes a storage unit 100, an acquisition unit 101, a determination unit 102, and a notification unit 103. That is, the first embodiment may not include the reception unit 20 and the transmission unit 21. Furthermore, each component in the determination system 1000 may be realized by any hardware and in any combination.
  • the storage unit 100 stores the registration identification information of the registration object and the registration image of the specific part of the registration object in association with each other.
  • The acquisition unit 101 acquires a designated image of a location designated by the user in an image obtained by capturing the presentation target object. Furthermore, the acquisition unit 101 acquires a registered image associated with the same registration identification information as the presentation identification information of the presentation target object from the storage unit 100.
  • the determination unit 102 determines whether the designated image acquired by the acquisition unit 101 matches the registered image.
  • the notification unit 103 notifies the determination result of the determination unit 102.
  • FIG. 8 is an example of a hardware configuration of the determination system 1001.
  • the communication control unit 8 communicates with an external device via a network.
  • The RAM 9 (Random Access Memory) has a capacity for storing various data necessary for realizing the present embodiment.
  • the large-capacity storage unit 10 is a storage medium that stores data such as a database necessary for realizing the present embodiment and an application program executed by the CPU 11 (Central Processing Unit) in a nonvolatile manner.
  • the CPU 11 is a processor for arithmetic control, and implements each functional unit of the present invention by executing a program. Specifically, the CPU 11 implements the acquisition unit 101, the determination unit 102, and the notification unit 103 of the present invention by executing a program.
  • the large-capacity storage unit 10 can realize the storage unit 100.
  • the CPU 11 can realize the acquisition unit 101, the determination unit 102, and the notification unit 103.
  • the association between each hardware configuration and each component is not limited to that shown in FIG. 8.
  • the acquisition unit 101, the determination unit 102, and the notification unit 103 may be realized by different CPUs.
  • FIG. 9 is a flowchart showing the operation of the determination system 1000.
  • the storage unit 100 stores the registration identification information of the registration object and the registration image of the specific part of the registration object in association with each other (S6).
  • the acquisition unit 101 acquires a designated image at a location designated by the user among images obtained by capturing the presentation target object. Furthermore, the acquisition unit 101 acquires a registration image associated with the same registration identification information as the presentation identification information of the presentation object from the storage unit 100 (S7).
  • the determination unit 102 determines whether the designated image acquired by the acquisition unit 101 matches the registered image (S8).
  • the notification unit 103 notifies the determination result of the determination unit 102 (S9).
  • the determination unit 102 determines whether or not the designated image at the location designated by the user in the image obtained by capturing the presentation object matches the registered image. When it is determined that the determination unit 102 matches, it can be said that the location designated by the user matches the specific location, and the registration target and the presentation target are the same. Therefore, this system can notify the authenticity of the object and the authenticity of the owner of the object.
  • the storage unit 22 stores the registration position information of a specific location in the registration target object and the registration identification information in association with each other. That is, the storage unit 22 stores the registration position information in association with the registration identification information and the registration image.
  • An example of the information stored in the storage unit 22 is shown in FIG. 10. In FIG. 10, for example, the storage unit 22 stores the registration identification information “2099”, the registered image with the file name “aaa”, and the registration position information (4, 2), (6, 2), (4, 4), (6, 4) in association with each other.
  • the registered position information is position information of a specific location in the registration object, and may be generated by a processing unit (not shown).
  • a processing unit first acquires an image obtained by imaging a registration target from an imaging unit (not shown).
  • a processing unit (not shown) generates position information of a specific portion in the image obtained by capturing the registration target object. For example, in FIG. 2, a specific portion is shown in an image obtained by capturing a concert ticket 400 that is a registration target.
  • a processing unit (not shown) acquires an image obtained by imaging the concert ticket 400 from an imaging device (not shown).
  • When the specific location 40 is designated, the processing unit (not shown) generates the coordinates of the specific location 40 using the lower left corner of the concert ticket 400 as the origin of the x-y coordinates. In the example of FIG. 2, the processing unit (not shown) generates coordinates whose four end points (x, y) are (10, 5), (12, 5), (10, 7), and (12, 7). Thereby, the processing unit (not shown) generates the registered position information of the specific location 40 as (10, 5), (12, 5), (10, 7), (12, 7). Then, the processing unit (not shown) stores the generated registered position information in the storage unit 22. Note that the position of the origin on the registration target object may be arbitrarily determined by the system designer or user.
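The coordinate generation just described can be illustrated with a small helper. The function name and the width/height parameterization are assumptions for this sketch, but the output reproduces the FIG. 2 example:

```python
def corner_coordinates(x, y, width, height):
    """Four end points of a rectangular specific location, with the
    lower-left corner of the object as the origin of the x-y coordinates
    (the origin choice is up to the system designer or user)."""
    return [(x, y), (x + width, y), (x, y + height), (x + width, y + height)]

# Specific location 40 of FIG. 2: lower-left corner at (10, 5), 2 units wide
# and 2 units tall, giving end points (10,5), (12,5), (10,7), (12,7).
registered_position = corner_coordinates(10, 5, 2, 2)
```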
  • The acquisition unit 23 acquires the registration position information associated with the same registration identification information as the presentation identification information from the storage unit 22. For example, as in the example of the first embodiment, assume that the presentation identification information is “2100”. Then, the acquisition unit 23 refers to, for example, the information of FIG. 10 stored in the storage unit 22, and acquires the registration position information (10, 5), (12, 5), (10, 7), (12, 7) associated with the registration identification information “2100” that is the same as the presentation identification information.
  • the acquisition unit 23 acquires specified position information of a specified image in an image obtained by imaging the presentation target.
  • the designated image is an image of a designated portion designated by the user among images obtained by capturing the presentation target object.
  • the designated position information may be generated by a processing unit (not shown) as in the case of the registered position information. That is, a processing unit (not shown) first acquires an image obtained by imaging a presentation target from an imaging unit (not shown). Then, a processing unit (not shown) generates position information of a designated location in an image obtained by capturing the designated object.
  • When the location designated by the user is the designated location 50, the processing unit (not shown) generates (10, 5), (12, 5), (10, 7), (12, 7) as the designated position information. On the other hand, when the location designated by the user is the designated location 51, the processing unit (not shown) generates (1, 1), (3, 1), (1, 3), (3, 3) as the designated position information. Then, the acquisition unit 23 acquires the designated position information from the processing unit (not shown).
  • the determination unit 24 determines whether the registered position information acquired by the acquisition unit 23 matches the specified position information. Specifically, the determination unit 24 acquires the registered position information and the designated position information from the acquisition unit 23 and determines whether or not they match. Then, in the case of matching determination, the determination unit 24 further determines whether or not the designated image and the registered image match.
  • the determination that the registered position information and the designated position information match does not require that the position information be completely the same.
  • the determination unit 24 may determine that there is a match when the difference in position information between the registered position information and the specified position information is a predetermined value or less.
  • the predetermined value may be arbitrarily set by a user or designer of the system.
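A tolerance-based comparison of this kind might look as follows; the per-axis comparison and the default tolerance are illustrative choices, not the patent's specification:

```python
def positions_match(registered, designated, tolerance=1.0):
    """Treat registered and designated position information as matching when
    every corresponding corner differs by at most `tolerance` in each axis;
    `tolerance` plays the role of the arbitrarily set predetermined value."""
    return all(abs(rx - dx) <= tolerance and abs(ry - dy) <= tolerance
               for (rx, ry), (dx, dy) in zip(registered, designated))

# Registered position information of the specific location 40 (FIG. 2 example).
reg = [(10, 5), (12, 5), (10, 7), (12, 7)]
```

With `reg` above, the designated location 50 matches while the designated location 51 at (1, 1), (3, 1), (1, 3), (3, 3) does not, as in the text's example.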
  • For example, the determination unit 24 acquires the registered position information (10, 5), (12, 5), (10, 7), (12, 7) and the designated position information (10, 5), (12, 5), (10, 7), (12, 7) from the acquisition unit 23, and determines that they match. Then, the determination unit 24 further determines whether or not the designated image matches the registered image.
  • the determination as to whether or not the designated image matches the registered image is the same as in the first embodiment, and a description thereof will be omitted.
  • On the other hand, the determination unit 24 determines that the registered position information (10, 5), (12, 5), (10, 7), (12, 7) and the designated position information (1, 1), (3, 1), (1, 3), (3, 3) do not match. In this case, the determination unit 24 may treat the result as a non-match without determining whether or not the designated image and the registered image match.
  • a processing unit may convert an image obtained by imaging a registration target object or an image obtained by imaging a presentation target object into a predetermined size when generating registration position information or designated position information. That is, the processing unit (not shown) may convert the size of each image so that the scales of the captured objects are unified.
  • When the imaging unit captures the registration target object or the presentation target object, the user of this system may capture each image so that the scale of each captured target object is unified. For example, the user may keep the distance from the imaging unit to each target object constant, or may keep the imaging magnification of the imaging unit constant. Thereby, the actual positions on each target object indicated by the coordinates of the registered position information and the designated position information are unified.
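Unifying the scale of captured images could also be done in software, for example with a simple nearest-neighbour resize as sketched below. This is an illustration under an assumed list-of-rows image representation, not the patent's method:

```python
def normalize_scale(image, target_w, target_h):
    """Nearest-neighbour resize so that captured objects share one scale,
    keeping coordinate-based position information comparable between the
    registration image and the presentation image."""
    src_h, src_w = len(image), len(image[0])
    return [[image[r * src_h // target_h][c * src_w // target_w]
             for c in range(target_w)]
            for r in range(target_h)]

# Toy 2x2 image upscaled to 4x4: each source pixel covers a 2x2 block.
small = [[1, 2], [3, 4]]
scaled = normalize_scale(small, 4, 4)
```

A production system would typically use a proper interpolating resize (e.g. bilinear) from an imaging library, but the effect on coordinate comparability is the same.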
  • FIG. 11 is a flowchart showing the registration process of the second embodiment.
  • the operation of the second embodiment differs from the first embodiment in that it has S10. Since other operations are the same as those in the first embodiment, description thereof will be omitted.
  • the storage unit 22 acquires registration position information from a processing unit (not shown), and stores the acquired registration position information and registration identification information in association with each other. Note that the storage of the registration identification information and registration image in S1 and the storage of registration position information in S10 may be performed at the same timing or at different timings. The storage unit 22 may store the registration position information and the registration identification information, and then store the registration identification information and the registered image in association with each other.
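The association performed by the storage unit 22 in S1 and S10 can be sketched as a simple keyed store: the registration identification information is the key under which the registered image and the registration position information are held. The patent does not prescribe any data structure; the dictionary, function names, and record layout below are illustrative assumptions.

```python
# illustrative stand-in for the storage unit 22
storage = {}

def store_registration(identification, registered_image, position_info):
    """Associate registration identification information with the
    registered image (S1) and the registration position information (S10)."""
    storage[identification] = {
        "image": registered_image,
        "position": position_info,
    }

def load_registration(identification):
    """Acquire the record associated with the given (presentation)
    identification information, or None if nothing is registered."""
    return storage.get(identification)

store_registration("2100", "<registered image>",
                   [(10, 5), (12, 5), (10, 7), (12, 7)])
print(load_registration("2100")["position"][0])  # (10, 5)
```

Because both values are stored under one key, it does not matter whether the image and the position information arrive at the same timing or at different timings, which mirrors the flexibility described above.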
  • FIG. 12 is a flowchart showing the determination process of the second embodiment.
  • the operation of the second embodiment differs from the first embodiment in that it has S11 and S12. Since other operations are the same as those in the first embodiment, description thereof will be omitted.
  • the acquisition unit 23 acquires registration position information associated with the same registration identification information as the presentation identification information received by the reception unit 20 from the storage unit 22. Furthermore, the acquisition unit 23 acquires specified position information of a specified image in an image obtained by imaging the presentation target object from a processing unit (not illustrated) (S11).
  • the determination unit 24 acquires the registered position information and the specified position information acquired by the acquisition unit 23. Then, the determination unit 24 determines whether or not the acquired registered position information matches the specified position information (S12). In the case of matching, the determination unit 24 determines whether the designated image matches the registered image (S3). In the case of the determination that they do not match in S12, the notification unit 25 notifies the determination result that they do not match (S5).
  • the determination unit 24 may pass to the notification unit 25 a determination result different from that in the case where the designated image and the registered image do not match.
  • in that case, the notification unit 25 may perform a notification different from the one given when the designated image and the registered image do not match.
  • the notification unit 25 may notify that the owner does not match when the registered position information and the designated position information do not match. Specifically, the notification unit 25 may give a notification such as "Collation failed. The owner is false." Further, the notification unit 25 may notify that the object does not match when the designated image and the registered image do not match. Specifically, the notification unit 25 may notify "Collation failed. The object is false."
  • the determination unit 24 determines the match between the designated image and the registered image only when the registered position information matches the designated position information. Therefore, in the present system, when the registered position information and the designated position information do not match, the images need not be compared, so the processing load is reduced and the determination result can be notified to the user earlier. Furthermore, when the registered position information and the designated position information do not match, the notification unit 25 can make a notification different from the one given when the designated image and the registered image do not match. Therefore, the user of this system can know whether the object is false or the owner of the object is false.
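The two-stage flow of the second embodiment (S12 position check first, S3 image check only on a position match, with distinct notifications) can be sketched as below. This is a minimal sketch: plain equality stands in for both the position comparison and the image-matching process of the first embodiment, and the message strings are illustrative, not the patent's wording.

```python
def determine(registered_pos, designated_pos, registered_img, designated_img):
    # stage 1 (S12): position check. A mismatch means the specific location
    # was designated at the wrong place, so the owner is judged false
    # without any image comparison, reducing processing load.
    if registered_pos != designated_pos:
        return "Collation failed. The owner is false."
    # stage 2 (S3): image check. Equality is a stand-in for the
    # image-matching process of the first embodiment.
    if registered_img != designated_img:
        return "Collation failed. The object is false."
    return "Collation succeeded."

print(determine([(10, 5)], [(1, 1)], "img-A", "img-A"))
# Collation failed. The owner is false.
```

Because the two failure branches return different results, the notification unit can tell the user which of the two authenticities (object or owner) failed.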
  • FIG. 13 is a block diagram illustrating a configuration example of the third embodiment.
  • the third embodiment is different from the first embodiment in that it includes a designation unit 26 and a generation unit 27.
  • the designation unit 26 acquires an image obtained by capturing the registration target received by the reception unit 20. Then, the designation unit 26 designates a specific location in the acquired image.
  • the image of the specific location is an image including information that can identify the registration target.
  • An image of the registration target may be captured by an imaging unit (not shown). In that case, the receiving unit 20 receives an image of the registration target from an imaging unit (not shown).
  • the information that can identify the registration object may be, for example, a pattern that is generated on the surface of a product or part when it is manufactured.
  • the pattern may be a pattern composed of colors or a pattern composed of irregularities on the surface of a product or the like.
  • An image including information that can identify a registration object is an image including a pattern generated on the surface of the product or part when the product or part is manufactured.
  • an image including information that can identify a registration object includes, for example, a region made of a material containing particles such as lame, a rough region that has been textured or sandblasted, a region made of metal, An image obtained by imaging an area made of fibers such as cloth may be used.
  • the image including information that can identify the registration target object may be an image that is captured after manufacturing a product or a part, and that is an image of scratches, distortions, stains, and the like unique to the registration target object.
  • FIG. 2 shows an image obtained by capturing a concert ticket 400 that is a registration object.
  • the concert ticket 400 includes regions 41, 42, and 43 made of a material containing glitter particles.
  • the designation unit 26 first detects an area made of a material containing glitter particles. That is, the designation unit 26 detects the areas 41, 42, and 43. Then, the designation unit 26 selects one of the detected areas, for example at random, and designates it as the specific location.
  • the features of the area detected by the designation unit 26 may be determined arbitrarily by the designer or user of the system. Further, the selection of the area by the designation unit 26 need not be random. For example, when a plurality of regions made of a material containing glitter particles are detected, the designation unit 26 may select the region most suitable for identifying the registration object.
  • the specification of the specific part by the specification unit 26 may be performed by the user.
  • the user inputs an arbitrary specific portion from an image obtained by capturing the registration target using an input unit (not shown) such as a mouse or a keyboard.
  • the designation unit 26 acquires information on a specific part input by the user from an input unit (not shown), and designates the specific part.
  • the generation unit 27 cuts out an image of the specific location designated by the designation unit 26 from the image obtained by capturing the registration target, and generates a registered image. Further, the generation unit 27 generates position information of the registered image (hereinafter referred to as registered image position information) in the image obtained by imaging the registration target. Specifically, in the present embodiment, the registered image position information is "(10, 5), (12, 5), (10, 7), (12, 7)".
  • the generation unit 27 cuts out, from the image obtained by imaging the registration target, the portion whose four end points have the coordinates (x, y) of (10, 5), (12, 5), (10, 7), and (12, 7), and generates the registered image. Then, the generation unit 27 stores the generated registered image in the storage unit 22.
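The cut-out step performed by the generation unit 27 (extracting the rectangle whose four end points are the registered image position information) can be sketched as a slice over an image represented as rows of pixels. The representation of the image as a nested list and the function name are illustrative assumptions; a real implementation would typically operate on an image array.

```python
def crop_registered_image(image, corners):
    """Cut out the rectangular part whose four end points are `corners`
    (a list of (x, y) coordinates) from `image`, given as a list of
    rows of pixel values."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    return [row[min(xs):max(xs) + 1] for row in image[min(ys):max(ys) + 1]]

# a 20x10 synthetic "image" whose pixel value encodes its coordinates
image = [[(x, y) for x in range(20)] for y in range(10)]
registered = crop_registered_image(
    image, [(10, 5), (12, 5), (10, 7), (12, 7)])
print(len(registered), len(registered[0]))  # 3 3
```

The resulting 3x3 region spans exactly the four end points of the embodiment's example, and the same function could also cut out the designated image from an image of the presentation target.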
  • FIG. 14 is a flowchart showing the registration process of the third embodiment.
  • the designation unit 26 acquires an image obtained by capturing the registration target received by the receiving unit 20 (S13). Then, the designation unit 26 designates a specific location in the acquired image (S14).
  • the generation unit 27 cuts out an image of a specific location designated by the designation unit 26 from an image obtained by capturing the registration target object, and generates a registration image (S15).
  • the storage unit 22 stores the registration image generated by the generation unit 27 and the registration identification information received by the reception unit 20 in association with each other (S1).
  • the designation unit 26 may receive the registration identification information from the receiving unit 20 together with the image obtained by capturing the registration object. Then, the generation unit 27 may store the registration identification information acquired from the designation unit 26 in the storage unit 22 together with the registered image.
  • in the third embodiment, the location designated by the designation unit 26 can be set as the specific location for each registration object. Since this system can thus use a location containing information that can identify the registration object as the specific location, the accuracy of the determination by the determination unit 24 is high. Thereby, the user of this system can know more accurately the authenticity of the object and the authenticity of the owner of the object.
  • FIG. 15 is a block diagram illustrating a configuration example of the fourth embodiment.
  • the fourth embodiment differs from the first embodiment in that it includes a display control unit 28 and a display unit 29.
  • the storage unit 22 stores information that associates the object category, the position information of the region, and the priority in the region. In other words, the storage unit 22 acquires and stores information associated with the object category, the position information of the region, and the priority in the region received by the receiving unit 20.
  • An example of information stored in the storage unit 22 is shown in FIG.
  • the table of FIG. 16 associates the object category, the position information of the area, and the priority in the area. For example, a region having four endpoints whose coordinates (x, y) are (0, 0), (3, 0), (0, 2), and (3, 2) has a priority of 1. On the other hand, a region having four endpoints whose coordinates (x, y) are (3, 0), (11, 0), (3, 2), and (11, 2) has a priority of 2.
  • the priority is a value indicating a degree suitable for a determination process by the determination unit 24.
  • the priority is a numerical value representing the level of discriminating power for identifying a registered object and the difficulty of deterioration over time.
  • a region having high discrimination power shows a large image difference between different registration objects, so when the presentation object and the registration object differ, the determination unit 24 is highly likely to determine that they do not match.
  • areas made of a material containing particles such as lame, rough areas that have been textured or sandblasted, areas made of metal, and areas made of fibers such as cloth have high discriminating power, and may therefore have a high priority.
  • in a region subject to deterioration over time, the image differs between registration and determination even for the same object, so even when the presentation object and the registration object are the same, the determination unit 24 is highly likely to determine that they do not match. For example, an area at a corner of an object, an edge area, or an area easily worn through structural use is likely to deteriorate over time, and may therefore have a low priority.
  • the object category is a category for classifying registered objects.
  • the object category may be a general name of an article to be registered, such as a concert ticket, an air ticket, a passport, or a license.
  • the object category may be a more detailed category than the general name of the article of the registration object.
  • the target object category may be a category in which the registration target object is classified by style or manufacturer, such as a concert ticket “type A” or a concert ticket “manufactured by XX”.
  • the designer or user of this system may transmit the information associating the object category, the position information of the area, and the priority in the area to the determination apparatus 202 from a device (not shown). That is, the information shown in FIG. 16 may be received by the receiving unit 20 from a device (not shown). Alternatively, a processing unit (not shown) in the determination apparatus 202 may evaluate, for each area of the image obtained by imaging the object, the level of discriminating power for identifying the registered object or the difficulty of deterioration over time as described above, and assign a priority to each area accordingly.
  • the display unit 29 displays an image obtained by capturing the registration target object.
  • the display unit 29 displays the priority by superimposing the priority on an area indicated by the position information associated with the priority among the images to be displayed.
  • the display by the display unit 29 is controlled by the display control unit 28.
  • the display control unit 28 acquires an image captured by the receiving unit 20 that is an image of the registration target. Then, the display control unit 28 acquires the object category of the registration object received by the receiving unit 20.
  • the object category may be input by a user of the system using an input device (not shown). In that case, the receiving unit 20 receives the object category input by the user from an input device (not shown).
  • the user may input the object category when inputting the registration identification information.
  • the user may input the object category "concert ticket" using an input device (not shown) when inputting the registration identification information (ID) "2100" of the concert ticket 400 that is the registration object.
  • the user may input the object category at a timing different from the input of the registration identification information.
  • the display control unit 28 acquires, from the storage unit 22, the position information of the area associated with the acquired object category and the priority in the area. Then, the display control unit 28 causes the display unit 29 to display the priority associated with each region superimposed on that region of the image obtained by capturing the registration target.
  • An example of display by the display unit 29 is shown in FIG. In FIG. 17, priority is superimposed on an image obtained by imaging a registration target. For example, as shown in FIG. 16, a region having four endpoints whose coordinates (x, y) are (0, 0), (3, 0), (0, 2), (3, 2) has a priority of 1. Therefore, the display control unit 28 causes the display unit 29 to display “1” in the area.
  • the user inputs an arbitrary specific portion from an image obtained by capturing the registration target object using an input unit (not shown) such as a mouse or a keyboard with reference to the display by the display unit 29.
  • the designation unit 26 acquires information on a specific location input by the user from an input unit (not shown) and designates the specific location.
  • FIG. 18 is a flowchart showing the operation of the registration process in the fourth embodiment.
  • the storage unit 22 acquires and stores the information, received by the receiving unit 20, that associates the object category, the position information of the region, and the priority in the region (S17).
  • the display control unit 28 acquires the image obtained by capturing the registration target received by the receiving unit 20. Then, the display control unit 28 acquires the object category of the registration target received by the receiving unit 20 (S18). Further, the display control unit 28 acquires, from the storage unit 22, the position information of the area associated with the acquired object category and the priority in the area. Then, the display control unit 28 causes the display unit 29 to display the priority associated with each region superimposed on that region of the image obtained by capturing the registration target (S19). The user refers to the display by the display unit 29 and inputs an arbitrary specific location from the image obtained by imaging the registration target using an input unit (not shown) such as a mouse or a keyboard (S20).
  • the designation unit 26 acquires information regarding the specific location input by the user and designates the specific location.
  • a processing unit (not shown) generates a registered image by cutting out a specific portion designated by the designation unit 26 from an image obtained by capturing the registration target.
  • the storage unit 22 stores a registration image generated by a processing unit (not shown) in association with registration identification information (S1).
  • the display unit 29 of the present system displays the priority superimposed on the image obtained by capturing the registration object. Therefore, the user of this system can grasp which areas of the image are suitable for designation as the specific location.
  • the display unit 29 and the display control unit 28 may not be provided.
  • An example of the operation of the designation unit 26 in that case will be described below.
  • the designation unit 26 designates the specific location so that the sum of the priorities associated with the areas included in the specific location is equal to or greater than a predetermined threshold. Specifically, the designation unit 26 first acquires the image obtained by capturing the registration target received by the reception unit 20 and the object category of the registration target. Then, the designation unit 26 acquires the position information of the area associated with the acquired object category and the priority in the area from the storage unit 22.
  • the designation unit 26 designates one or a plurality of specific locations from the image obtained by capturing the registration target object. At this time, the designation unit 26 designates the specific location so that the sum of the priorities associated with the area included in the specific location is equal to or greater than a predetermined threshold.
  • the predetermined threshold can be arbitrarily set by a designer or user of this system.
  • FIG. 19 shows the designation of a specific part by the designation unit 26.
  • FIG. 19 shows an image obtained by capturing a concert ticket 400 that is a registration target and the priority.
  • the predetermined threshold is 5.
  • the number of specific parts designated by the designation unit 26 is two.
  • the candidate location 52 which is a candidate for the specific location, includes an area having a priority of 3.
  • when a candidate location includes two or more areas associated with different priorities, the designation unit 26 may adopt the priority associated with the area occupying the largest proportion of the candidate location.
  • the priority associated with the region occupying the largest proportion of the candidate locations 54 is 2. That is, the designation unit 26 may set the priority associated with the area included in the candidate location 54 to 2.
  • the method for determining the priority when the candidate location includes two or more areas associated with different priorities is not limited to this.
  • the designating unit 26 may employ an average value of two or more different priorities, or may calculate the priorities according to the proportion of each area occupied by candidate locations.
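The largest-proportion rule and the threshold check described above can be sketched as follows. This is an illustrative sketch: rectangles are written as `(x0, y0, x1, y1)`, the function names are assumptions, and only the "adopt the priority of the area occupying the largest proportion" variant is implemented (not the average-value or proportional variants also mentioned).

```python
def overlap(a, b):
    """Overlap area of two axis-aligned rectangles (x0, y0, x1, y1)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return w * h if w > 0 and h > 0 else 0

def candidate_priority(candidate, priority_areas):
    """Adopt the priority of the area occupying the largest proportion
    of the candidate location. `priority_areas` is a list of
    (rectangle, priority) pairs."""
    rect, priority = max(priority_areas,
                         key=lambda ap: overlap(candidate, ap[0]))
    return priority if overlap(candidate, rect) > 0 else 0

def meets_threshold(candidates, priority_areas, threshold):
    """True when the priorities of the chosen specific locations sum to
    at least the predetermined threshold (5 in the embodiment)."""
    return sum(candidate_priority(c, priority_areas)
               for c in candidates) >= threshold

# areas in the style of FIG. 16: (0,0)-(3,2) has priority 1,
# (3,0)-(11,2) has priority 2
areas = [((0, 0, 3, 2), 1), ((3, 0, 11, 2), 2)]
candidate = (2, 0, 7, 2)  # straddles both areas; mostly in the second
print(candidate_priority(candidate, areas))  # 2
```

With two designated locations, `meets_threshold` corresponds to checking that their summed priorities reach the predetermined threshold before the designation is accepted.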
  • since the designation unit 26 designates the specific location so that the priority is equal to or higher than the threshold value, a specific location suitable for the determination process by the determination unit 24 is designated. Therefore, the accuracy of determination by the determination unit 24 is further improved, and the user's time and effort in designating a specific location are reduced.
  • FIG. 20 is a block diagram showing a configuration example of the fifth embodiment.
  • FIG. 20 shows a functional unit block diagram showing components and a hardware configuration example for realizing them.
  • the receiving unit 20 and the transmitting unit 21 may be realized by the communication control unit 1.
  • the acquisition unit 23, the determination unit 24, the notification unit 25, the designation unit 26, the generation unit 27, and the display control unit 28 may be realized by the CPU 2.
  • the storage unit 22 may be realized by the large-capacity storage unit 3.
  • the display unit 29 may be realized by the display 4.
  • the storage unit 22 may be in a device other than the determination device 203.
  • the present invention can be applied, for example, to an apparatus that discriminates the authenticity of an object for which the authenticity of itself and its owner is unknown.

Landscapes

  • Inspection Of Paper Currency And Valuable Securities (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a determination system and the like for notifying the authenticity of a target object and the authenticity of an owner of the target object. This determination system comprises: storage means for storing, in association with each other, registration identification information for an object to be registered and a registration image of a specified section of the object to be registered; acquisition means for acquiring a designated image of a section, designated by the user, of an image of an object to be presented, and acquiring, from the storage means, the registration image associated with the registration identification information that matches the presentation identification information for the object to be presented; determination means for determining whether or not the acquired designated image and the registration image match; and notification means for notifying the determination result of the determination means.
PCT/JP2016/001641 2015-03-30 2016-03-22 Système de détermination, procédé de détermination et programme de détermination WO2016157827A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017509259A JP6481754B2 (ja) 2015-03-30 2016-03-22 判定システム、判定方法、判定プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-069328 2015-03-30
JP2015069328 2015-03-30

Publications (1)

Publication Number Publication Date
WO2016157827A1 true WO2016157827A1 (fr) 2016-10-06

Family

ID=57004912

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/001641 WO2016157827A1 (fr) 2015-03-30 2016-03-22 Système de détermination, procédé de détermination et programme de détermination

Country Status (2)

Country Link
JP (1) JP6481754B2 (fr)
WO (1) WO2016157827A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106920318A (zh) * 2017-03-07 2017-07-04 深圳怡化电脑股份有限公司 一种纸币的鉴别方法及装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001216395A (ja) * 2000-02-03 2001-08-10 Michimasa Hatana 所持紙幣による認証システムとその応用
JP2004118643A (ja) * 2002-09-27 2004-04-15 Hitachi Ltd Idカード真偽判別補助装置
JP2010530095A (ja) * 2007-06-01 2010-09-02 カーベーアー−ジオリ ソシエテ アノニム セキュリティ文書、特に紙幣の認証

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001216395A (ja) * 2000-02-03 2001-08-10 Michimasa Hatana 所持紙幣による認証システムとその応用
JP2004118643A (ja) * 2002-09-27 2004-04-15 Hitachi Ltd Idカード真偽判別補助装置
JP2010530095A (ja) * 2007-06-01 2010-09-02 カーベーアー−ジオリ ソシエテ アノニム セキュリティ文書、特に紙幣の認証

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106920318A (zh) * 2017-03-07 2017-07-04 深圳怡化电脑股份有限公司 一种纸币的鉴别方法及装置

Also Published As

Publication number Publication date
JPWO2016157827A1 (ja) 2018-01-11
JP6481754B2 (ja) 2019-03-13

Similar Documents

Publication Publication Date Title
JP2020161149A (ja) 真贋判定方法、真贋判定システム、及びそのプログラム
JP2010526361A (ja) 特徴点ペアの指標化による高速指紋識別方法
KR20150024421A (ko) 검증 방법, 검증 시스템, 검증 장치, 및 그 프로그램
JP6245880B2 (ja) 情報処理装置および情報処理手法、プログラム
JP5765749B2 (ja) 個体識別情報生成装置、物品判定装置、物品判定システム及び方法
CN110738204B (zh) 一种证件区域定位的方法及装置
JP6810392B2 (ja) 個体識別装置
KR101274098B1 (ko) 물리적 특징정보를 활용한 원본 인증 시스템 및 방법
EP2736012A1 (fr) Système et programme d'identification d'objet
CN112465517A (zh) 防伪验证方法、装置及计算机可读存储介质
JP6236825B2 (ja) 自動販売機の販売商品認識装置及び販売商品認識方法、並びにコンピュータ・プログラム
CN112017352B (zh) 证件鉴伪方法、装置、设备及可读存储介质
JP6481754B2 (ja) 判定システム、判定方法、判定プログラム
JP6541226B2 (ja) 情報端末装置及びプログラム
JPWO2011010705A1 (ja) マーカ生成装置、マーカ生成検出システム、マーカ生成検出装置、マーカ、マーカ生成方法及びそのプログラム
CN111932281A (zh) 一种防伪检测方法及装置
JP6555338B2 (ja) 判定システム、判定方法および判定プログラム
JP2015045919A (ja) 画像認識方法及びロボット
JP2005228150A (ja) 画像照合装置
JP7130423B2 (ja) 部品情報管理システム、および部品情報管理プログラム
US10902584B2 (en) Detection of surface irregularities in coins
JP2009151445A (ja) 部分領域検出装置、対象物識別装置、及びプログラム
JP6204634B1 (ja) 形状弁別装置、形状弁別方法及び形状弁別プログラム
JP2017091252A (ja) 情報入力装置及び情報入力プログラム
JP2016018403A (ja) 画像処理装置、画像処理システム、画像処理方法及び画像処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16771719

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017509259

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16771719

Country of ref document: EP

Kind code of ref document: A1