
WO2022072580A1 - Encoding and decoding data, including within image data - Google Patents

Encoding and decoding data, including within image data

Info

Publication number
WO2022072580A1
WO2022072580A1, PCT/US2021/052778, US2021052778W
Authority
WO
WIPO (PCT)
Prior art keywords
data
marks
image
field
location
Prior art date
Application number
PCT/US2021/052778
Other languages
French (fr)
Inventor
William H. MAYBAUM
Original Assignee
SignaKey LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SignaKey LLC filed Critical SignaKey LLC
Publication of WO2022072580A1 publication Critical patent/WO2022072580A1/en
Priority to ZA2023/03526A priority Critical patent/ZA202303526B/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2347 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving video stream encryption
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0021 Image watermarking
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09C CIPHERING OR DECIPHERING APPARATUS FOR CRYPTOGRAPHIC OR OTHER PURPOSES INVOLVING THE NEED FOR SECRECY
    • G09C5/00 Ciphering apparatus or methods not provided for in the preceding groups, e.g. involving the concealment or deformation of graphic data such as designs, written or printed messages
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L9/3247 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials involving digital signatures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/172 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46 Embedding additional information in the video signal during the compression process
    • H04N19/467 Embedding additional information in the video signal during the compression process characterised by the embedded information being invisible, e.g. watermarking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/65 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using error resilience
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/835 Generation of protective data, e.g. certificates
    • H04N21/8358 Generation of protective data, e.g. certificates involving watermark
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2201/00 General purpose image data processing
    • G06T2201/005 Image watermarking
    • G06T2201/0051 Embedding of the watermark in the spatial domain

Definitions

  • This disclosure relates to encoding and decoding with respect to a code embedded within image data.
  • codes and encryption techniques are used to identify genuine products, data and things throughout a supply chain. Such codes and techniques may be used, for example, to inhibit or prevent introduction of counterfeit products in a supply chain, and/or to permit end users or consumers to verify the authenticity of a product.
  • a method of encoding data within an image includes determining an unencrypted code including at least one identifier, encrypting the unencrypted code to provide an encrypted code including multiple values, and embedding the values of the encrypted code into pixel data of selected pixels of the image.
  • the values are embedded into the pixel data as binary numbers that correspond to the values of the encrypted code.
  • the pixel data may include multiple channels for each pixel, each of the multiple channels including binary numbers, and the values are embedded into at least one channel of each of a plurality of selected pixels. In at least some implementations, the values are embedded into the lowest order bits of the binary numbers.
  • the encrypting step is done with a key that indicates which channels of which pixels are to be embedded with which of the values of the encrypted code.
  • a method of decoding data within an image includes steps of extracting from pixel data of the image an encrypted code including multiple values, and using an encryption key to decrypt the encrypted code to provide an unencrypted code including at least one identifier.
  • the extracting step may be accomplished with a key or map that indicates which channels within which pixels within the image include values of the encrypted code, and the order in which the values are arranged in the encrypted code.
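  • As a small illustration of the "multiple values" referred to above (helper names are assumed, not part of the disclosure), an encrypted code given as bytes can be split into a sequence of 2-bit values for embedding, and reassembled from them during decoding:

```python
# Sketch: splitting encrypted bytes into 2-bit values and reassembling them (assumed helpers).
def bytes_to_two_bit_values(data: bytes) -> list[int]:
    """Each byte yields four 2-bit values, most significant pair first."""
    return [(byte >> shift) & 0b11 for byte in data for shift in (6, 4, 2, 0)]

def two_bit_values_to_bytes(values: list[int]) -> bytes:
    out = bytearray()
    for i in range(0, len(values), 4):
        a, b, c, d = values[i:i + 4]
        out.append((a << 6) | (b << 4) | (c << 2) | d)
    return bytes(out)

encrypted = b"\xc5\x1a"                       # example encrypted bytes
values = bytes_to_two_bit_values(encrypted)   # [3, 0, 1, 1, 0, 1, 2, 2]
assert two_bit_values_to_bytes(values) == encrypted
```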
  • FIG. 1 is a component view of an encoding system according to an embodiment of the present disclosure
  • FIG. 2 is a component view of a decoding system according to an embodiment of the present disclosure
  • FIG. 3 is a flow chart to encode data to symbology according to an embodiment of the present disclosure
  • FIG. 4 is a flow chart to decode symbology to data according to an embodiment of the present disclosure
  • FIG. 5 is a flow chart to decode symbology to data according to an embodiment of the present disclosure
  • FIG. 6 is a field identification scheme according to an embodiment of the present disclosure
  • FIG. 7 is a selection of field row and column identifier marks according to an embodiment of the present disclosure.
  • FIG. 8 is a selection of data marks according to an embodiment of the present disclosure.
  • FIG. 9 is a selection of row and column identifier marks according to an embodiment of the present disclosure.
  • FIG. 10 is a field of data marks with an obliterated portion according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram of a method for embedding a code within image data of an image.
  • FIG. 12 is a diagram of a method for extracting a code from image data of an image.
  • Referring to FIG. 1, a component view of an encoding system is shown according to an embodiment of the present disclosure.
  • the encoding system 101 may include an input module 105, a database module 113, an encryption module 115, an error correction module 117, a symbology conversion module 119, and an output module 107.
  • the encoding system may be a single system, or may be two or more systems in communication with each other.
  • the encoding system may include one or more input devices, one or more output devices, one or more processors, and memory associated with the one or more processors.
  • the memory associated with the one or more processors may include, but is not limited to, memory associated with the execution of the modules, and memory associated with the storage of data.
  • the encoding system may also be associated with one or more networks, and may communicate with one or more additional systems via the one or more networks.
  • the modules may be implemented in hardware or software, or a combination of hardware and software.
  • the encoding system may also include additional hardware and/or software to allow the encoding system to access the input devices, the output devices, the processors, the memory, and the modules.
  • the modules may be associated with a different processor and/or memory, for example on distinct systems, and the systems may be located separately from one another.
  • the modules may be executed on the same system as one or more processes or services.
  • the modules may be operable to communicate with one another and to share information.
  • the modules are described as separate and distinct from one another, the functions of two or more modules may instead be executed in the same process, or in the same system.
  • the input module 105 may receive a request for data from a requestor 103.
  • the requestor 103 may be one or more individuals interacting with the input module 105 via an interface.
  • the interface may be a keyboard, computer mouse, trackpad, touch sensitive screen or film, or other device used to generate an input.
  • the input module 105 may also receive input over a network from another system.
  • the input module 105 may receive one or more signals from a computer over one or more networks.
  • the input module 105 may accept inputs regarding, for example, the total number of unique identifiers requested by the requestor 103, and/or other information such as information that may remain the same and be associated with one or more of the unique identifiers.
  • the requestor 103 may input data associated with a vendor identifier, and the vendor identifier may be associated with the unique identifier and converted into a mark field.
  • the input device may communicate with the input module 105 via a dedicated connection or any other type of connection.
  • the input device may be in communication with the input module 105 via a Universal Serial Bus ("USB") connection, via a serial or parallel connection to the input module 105, or via an optical or radio link to the input module 105.
  • Any communications protocol may be used to communicate between the input device and the input module 105.
  • a USB protocol or a Bluetooth protocol may be used.
  • the network may include one or more of: a local area network, a wide area network, a radio network such as a radio network using an IEEE 802.11x communications protocol, a cable network, a fiber network or other optical network, a token ring network, or any other kind of packet-switched network.
  • the network may include the Internet, or may include any other type of public or private network.
  • the use of the term "network" does not limit the network to a single style or type of network, or imply that one network is used.
  • a combination of networks of any communications protocol or type may be used. For example, two or more packet-switched networks may be used, or a packet-switched network may be in communication with a radio network.
  • the database module 113 may interact with one or more databases 111.
  • the database module 113 may communicate with a database 111 to request information, or to send information to be organized in the database 111.
  • the database module 113 may interact with a plurality of databases 111.
  • the plurality of databases 111 may be associated with different types of data. For example, and without limitation, one or more databases 111 may be associated with a particular product, or a particular manufacturer.
  • the database module 113 may be operable to interact with one or more databases 111 at the same time, and may be operable to retrieve or send data to each of the one or more databases 111.
  • the database 111 may store one or more pieces of data associated with unique identifiers, or may store other data related to the unique identifiers.
  • the database 111 may store information regarding manufacturer identification numbers, or part identification numbers.
  • the data may be organized in the database 111 in any way that allows access to the data.
  • the database 111 may be organized as a relational database 111, and may use relational database management system software.
  • the database 111 may be organized as a flat file, a spreadsheet, or any other type of data organization schema.
  • the database 111 may be stored on one or more systems that are associated with the encoding system, or the database 111 may be stored on the encoding system.
  • the database 111 and the encoding system may be in communication, for example and without limitation, via one or more networks.
  • the database 111 may be stored on the encoding system and be controlled by one or more processes executed on the encoding system, and the database module 113 may interface with the one or more processes to store and retrieve data in the database 111.
  • the encryption module 115 may receive an input and may encrypt the input.
  • the input may be in the form of data.
  • the input may be a string of alphanumeric characters.
  • the encrypted input may then be communicated to another module within the encoding system, or may be communicated to a system associated with the encoding system.
  • the encryption module 115 may use any method for encrypting data to encrypt the input.
  • the encryption module 115 may present a choice between, for example and without limitation, a block encryption using one or more implementations of the Advanced Encryption Standard ("AES") and/or a stream encryption. In one embodiment, more than one encryption method is used to encrypt the input.
  • the encryption module 115 may encrypt the data using one or more keys.
  • Each key may be substantially unique, so that each time the encryption module 115 receives an input, it may encrypt the input with a different key, or one key may be used to encrypt more than one input.
  • a key may be used to encrypt inputs that correspond to a particular manufacturer, or a particular manufactured item.
  • the keys may be transmitted to the database 111 to be associated with the input that the keys encrypt, or a particular key may have a unique identifier in the database 111, and the encryption module 115 may transmit the unique identifier of the key used to encrypt an input.
  • the encryption module 115 may also transmit the encryption method or methods to the database 111, to be associated with the input.
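  • As an illustration only (not the disclosed implementation), the sketch below encrypts an alphanumeric identifier string with AES in CTR mode using the third-party Python cryptography package; the disclosure permits any encryption method and key-management scheme, and the key/nonce handling shown here is an assumption.

```python
# Sketch: encrypting an identifier string with AES-CTR (third-party "cryptography" package assumed).
# The disclosure allows any encryption method; AES is only one option it names.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_identifier(plaintext: str, key: bytes, nonce: bytes) -> bytes:
    """Encrypt an alphanumeric identifier; key is 16/24/32 bytes, nonce is 16 bytes."""
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return encryptor.update(plaintext.encode("utf-8")) + encryptor.finalize()

def decrypt_identifier(ciphertext: bytes, key: bytes, nonce: bytes) -> str:
    decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    return (decryptor.update(ciphertext) + decryptor.finalize()).decode("utf-8")

key = os.urandom(32)      # a per-record key, one of the options described above
nonce = os.urandom(16)    # stored alongside the record (an assumption for this sketch)
ciphertext = encrypt_identifier("MFG123-PART456-0001", key, nonce)
assert decrypt_identifier(ciphertext, key, nonce) == "MFG123-PART456-0001"
```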
  • the error correction module 117 may receive an input, and may generate an output containing one or more error correcting elements.
  • the error correcting elements in the output may be used so that if a portion of the output is destroyed or lost, the input may still be able to be retrieved based at least in part on the remainder of the output and the remainder of the error correcting elements.
  • the error correcting code may be generated by an implementation of one or more algorithms to receive an input, and generate Reed-Solomon codes based, at least in part, on the input.
  • the implementation may be in either hardware or software, or a combination of hardware and software.
  • the error correction module 117 may receive an encrypted input from the encryption module 115, and may apply the error correcting code to the encrypted input.
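  • A minimal sketch of Reed-Solomon error correction, assuming the third-party reedsolo Python package (the disclosure does not mandate a particular implementation): parity symbols are appended so that the original bytes can be recovered even if part of the output is damaged.

```python
# Sketch: appending Reed-Solomon parity to an encrypted input (reedsolo package assumed).
from reedsolo import RSCodec

rsc = RSCodec(10)  # 10 parity bytes: corrects up to 5 unknown-position byte errors per block

encrypted_input = b"\x3a\x91\x07\x55\xc2\x18\x6f\x02\xee\x41"
protected = rsc.encode(encrypted_input)          # original bytes + 10 parity bytes

# Simulate damage to part of the output.
damaged = bytearray(protected)
damaged[0] ^= 0xFF
damaged[3] ^= 0x0F

# Recent reedsolo versions return (decoded message, message+parity, errata positions).
recovered = rsc.decode(bytes(damaged))[0]
assert bytes(recovered) == encrypted_input
```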
  • the symbology conversion module 119 may receive an input, and may generate a field of one or more marks in one or more orientations based at least in part on the input. In one embodiment, the symbology conversion module 119 may generate a field of marks that include data marks and location marks. The data marks and the location marks may be the same symbol in differing orientations, or may be different symbols. The marks that the symbology conversion module 119 uses may be located in the database 111, or may be located in another system. The symbology conversion module 119 may use different marks for different applications. For example, and without limitation, the symbology conversion module 119 may use one mark or one set of marks for each given manufacturer listed in the database 111. The symbology conversion module 119 may also use other marks that may be supplied from an external source.
  • the symbology conversion module 119 may receive the symbols to use for the encoding process from the database 111, or may request the symbols to use for the encoding process from the database 111.
  • the symbology conversion module 119, or another module may transmit, for example and without limitation, the unique identifier, or other data, to the database 111.
  • the database 111 may transmit one or more symbols to use for the symbology to the symbology conversion module 119.
  • the marks in a field may include data symbols and location symbols.
  • the data symbols and the location symbols are the same symbol rotated in different orientations.
  • Referring to FIG. 8, there is depicted a selection of data marks according to an embodiment of the present disclosure.
  • a symbol according to the example shown in FIG. 8 is depicted by two lines intersecting at an approximately fifty degree angle. Other intersection angles may be used, including a ninety degree and a forty-five degree angle.
  • FIG. 8 depicts four orientations of a data mark.
  • the first data mark 803 is oriented at zero degrees clockwise
  • the second data mark 805 is oriented at ninety degrees clockwise
  • the third data mark 807 is oriented at one hundred eighty degrees clockwise
  • the fourth data mark 809 is oriented at two hundred seventy degrees clockwise.
  • Four different data marks are shown as different orientations of a single mark, corresponding to four different values that the mark may represent as a data mark.
  • more orientations of a single mark may be used to correspond to additional values that the mark may represent.
  • the single mark depicted in FIG. 8 may be oriented in 45 degree increments, yielding eight positions instead of the four depicted in FIG. 8.
  • additional marks may be used in place of a specifically oriented mark.
  • the letters "A," "B," "C," and "D" may be used as marks, and the orientation of the letters may not be considered to encode additional data.
  • both additional marks and orientation information may be used to represent data values.
  • the letters "A," "B," "C," and "D" may be used as marks, and each of the letters may be oriented in four positions as depicted in FIG. 8. For four different marks presented, and four different orientations, a total of sixteen data values may be represented by a single mark in a specific orientation.
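  • The following sketch illustrates that combination using an assumed assignment (the disclosure fixes only the count, four marks times four orientations): each (mark, orientation) pair maps to one of sixteen data values.

```python
# Sketch: one possible assignment of sixteen data values to (mark shape, orientation) pairs.
# The specific table is an assumption; the disclosure only fixes the count (4 x 4 = 16).
MARKS = ["A", "B", "C", "D"]
ORIENTATIONS = [0, 90, 180, 270]  # degrees clockwise

def to_value(mark: str, orientation: int) -> int:
    """Map a mark in a specific orientation to a data value 0..15."""
    return MARKS.index(mark) * len(ORIENTATIONS) + ORIENTATIONS.index(orientation)

def from_value(value: int) -> tuple[str, int]:
    """Inverse mapping: data value 0..15 back to (mark, orientation)."""
    return MARKS[value // 4], ORIENTATIONS[value % 4]

assert to_value("A", 0) == 0
assert to_value("D", 270) == 15
assert from_value(to_value("C", 90)) == ("C", 90)
```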
  • FIG. 9 depicts a selection of location marks according to an embodiment of the present disclosure.
  • a symbol according to the example shown in FIG. 8 is depicted by two lines intersecting at an approximately fifty (50) degree angle. Of course, other intersection angles may be used, including a ninety degree and a forty-five (45) degree angle.
  • FIG. 9 depicts four orientations of a single location mark. In relation to an orientation line 901, the first location mark 903 is oriented at forty-five degrees clockwise, the second location mark 905 is oriented at one hundred thirty-five degrees clockwise, the third location mark 907 is oriented at two hundred twenty-five degrees clockwise, and the fourth location mark 909 is oriented at three hundred fifteen degrees clockwise.
  • additional location marks may be provided, as well as additional orientations of the same or different location marks.
  • the data marks and the location marks may be the same mark oriented to different angles.
  • the data marks of FIG. 8 may be oriented at right angles with respect to orientation line 801
  • the location marks of FIG. 9 may be oriented at 45 degree angles with respect to orientation line 901.
  • a field may include one or more marks arranged in an order to represent data values.
  • the field may be a fixed number of marks arranged in a pattern. For example, and as shown in FIG. 6, the field may be a number of marks arranged in a square array of nineteen marks across and nineteen marks down. Other amounts of marks, in other orientations and patterns, may also be used.
  • the size and shape of the field may be changed according to the medium that the field may be affixed to, or the amount of data that is required for a task. For example, if a task requires only that a small amount of data be encoded into a field, then the field may be smaller. If a task requires that a large amount of data be encoded into a field, or if a particular application may use additional error correction, then the number of marks to encode the field may increase, and the field may be larger.
  • the field may be divided into sections of data marks, representative sections of which are shown as elements 611, 613, 615, 619, and 621 in FIG. 6, and sections of location marks, representative sections of which are shown as elements 603, 605, 607, and 609.
  • the data marks may encode data information
  • the location marks may encode position within the field of marks.
  • the location marks may be arranged so that one or more location marks may identify the relative position of the location marks, and also the marks surrounding the location marks.
  • a representation of the position of location marks is shown in FIG. 6.
  • Data mark subfield 611 is, for example, bounded by a row of location marks 603, and a column of location marks 605.
  • the row of location marks alternates between noting a row location identifier, depicted by identifier 603b, and a column location identifier, depicted by identifier 603a.
  • the row of location marks 603, for example, indicates that the row of location marks 603 is the first row of location marks in the field, indicated by row identifier 603b.
  • the row of location marks 603 bounding data mark subsection 611 also indicates, for example, that it bounds the first column of data marks in the field, indicated by column identifier 603a.
  • the column of location marks 605 alternates between noting a column location identifier, depicted by identifier 605b, and a row location identifier, depicted by identifier 605a.
  • the column of location marks 605, for example, indicates that the column of location marks 605 is the first column of location marks in the field, indicated by column identifier 605b.
  • the column of location marks 605 bounding data mark subsection 611 also indicates, for example, that the column of location marks 605 bounds the first row of data marks in the field, indicated by row identifier 605a.
  • data mark subfield 613 is bounded by a row of location marks 603, and a column of location marks 609.
  • the row location identifier 603b that bounds the data mark subsection 613 may be the same as the row location identifier 603b that bounds the data mark subsection 611, to indicate that the row of location marks 603 is the same row bounding both data subsections 611 and 613.
  • the column of location marks 609 alternates between noting a column location identifier, depicted by 609b, and a row location identifier, depicted by identifier 609a.
  • the column of location marks 609 indicates that the column of location marks 609 is the second column of location marks in the field, indicated by column location identifier 609b.
  • the column of location marks 609 bounding data mark subsection 613 also indicates, for example, that the column of location marks 609 bounds the second row of data marks in the field, indicated by row location identifier 609a.
  • the row of generic identifiers depicted in 701 may be the same identifiers shown in the location rows and location columns of FIG. 6. That is, the first four columns of row 701 may identify the column location identifiers shown in the location rows and the location columns of FIG. 6. The fifth through eighth columns of row 701 may identify the row location identifiers shown in the location rows and the location columns of FIG. 6. Row location identifiers and column location identifiers are shown in rows 703 and 705.
  • the column location identifiers may comprise a horizontal line intersecting a vertical line. The thickness of the vertical line may indicate positional data.
  • a relatively thin vertical line may indicate that the column location identifier is the first column location identifier in the field
  • a relatively thicker vertical line may indicate that the column location identifier is the last column location identifier in the field.
  • the row location identifiers may comprise a vertical line intersecting a horizontal line.
  • the thickness of the horizontal line may indicate positional data.
  • a relatively thin horizontal line may indicate that the row location identifier is the first row location identifier in the field
  • a relatively thicker horizontal line may indicate that the row location identifier is the last row location identifier in the field.
  • frequency of lines may also convey positional information.
  • the column location identifiers may comprise a horizontal line intersecting one or more vertical lines, all of approximately the same thickness.
  • the number of vertical lines may indicate positional data.
  • one vertical line may indicate that the column location identifier is the first column location identifier in the field, and four vertical lines may indicate that the column location identifier is the last column location identifier in the field.
  • the row location identifiers may comprise a vertical line intersecting one or more horizontal lines, all of approximately the same thickness.
  • the number of horizontal lines may indicate positional data.
  • one horizontal line may indicate that the row location identifier is the first row location identifier in the field, and four horizontal lines may indicate that the row location identifier is the last row location identifier in the field.
  • Rows 703 and 705 of FIG. 7 are intended to be examples only.
  • the row location identifiers and the section location identifiers may be mixed, so that, for example, the row location identifiers of row 703 and the section location identifiers of row 705 may be used.
  • other marks or styles may be used. For example, and without limitation, a single mark with four orientations may be used to represent four different location information data points. Two marks, each with four orientations, may be used to convey information associated with the row and section location identifiers.
  • the output module 107 may receive an input, and may be in communication with the image output device 109 to present the one or more symbols. In one embodiment, the output module 107 may receive the one or more symbols from the symbology conversion module 119, and may transmit the one or more symbols to the image output device 109.
  • the output module 107 and the image output device 109 may be in communication with one another. For example, and without limitation, the output module 107 and the image output device 109 may be in communication via a network, or may be in communication via a dedicated connection, such as a cable or radio link.
  • the image output device 109 may create a representation of the field.
  • the image output device 109 may create a representation of the field in a physical form.
  • the image output device 109 may include one or more lasers and a device to position the lasers.
  • the image output device 109 may direct the beam of the laser onto a surface, and ablate or etch a representation of the field onto the surface.
  • the image output device 109 may include a printer.
  • the printer may print a representation of the field onto a medium or substrate, such as a piece of paper or a material with an adhesive backing (e.g., a sticker).
  • Referring to FIG. 2, a component view of a decoding system 201 according to an embodiment of the present disclosure is shown.
  • the decoding system 201 may include an input module 205, a database module 211, a decryption module 213, an error correction module 215, a symbology conversion module 217, and an output module 207.
  • the decoding system 201 may be a single system, or may be two or more systems in communication with each other.
  • the decoding system 201 may include one or more input devices, one or more output devices, one or more processors, and memory associated with the one or more processors.
  • the memory associated with the one or more processors may include, but is not limited to, memory associated with the execution of the modules, and memory associated with the storage of data.
  • the decoding system 201 may also be associated with one or more networks, and may communicate with one or more additional systems via the one or more networks.
  • the modules may be implemented in hardware or software, or a combination of hardware and software.
  • the decoding system 201 may also include additional hardware and/or software to allow the decoding system 201 to access the input devices, the output devices, the processors, the memory, and the modules.
  • the modules, or a combination of the modules may be associated with a different processor and/or memory, for example on distinct systems, and the systems may be located separately from one another.
  • the modules may be executed on the same system as one or more processes or services.
  • the modules may be operable to communicate with one another and to share information. Although the modules are described as separate and distinct from one another, the functions of two or more modules may instead be executed in the same process, or in the same system.
  • the input module 205 may be in communication with an image input device 203.
  • the input module 205 may be operable to receive signals from the image input device 203 and to transmit the signals to other modules in the decoding system 201.
  • the input module 205 may receive the image from the image input device 203, and may apply one or more transformations to the image to allow the image to be analyzed. For example, and without limitation, the input module 205 may resize, rotate, deskew, flatten, or otherwise process or clarify the image to allow the image to be further processed.
  • the image input device 203 may be operable to receive a field and to convert the field into signals that may be communicated to the input module 205.
  • the image input device 203 may be a charge-coupled device ("CCD") in a video capture device, a digital camera, or a scanner.
  • the CCD device may, for example, record or otherwise capture a video.
  • the field may pass in front of the CCD, and one or more images of the field may be captured by the video capture device.
  • the CCD device may also be mounted in a still-frame camera.
  • the still-frame camera may be operable to, for example and without limitation, capture images on a timed scale (e.g., every second or every minute) or to capture images based on a trigger (e.g., if an object breaks a beam of light).
  • the image input device 203 may also be a scanner.
  • the field may be placed on the scanner, and the scanner may capture a representation of the field.
  • the representation may be transmitted to the input module 205.
  • the database module 211 may interact with one or more databases 219.
  • the database module 211 may communicate with a database 219 to request information, or to send information to be organized in the database 219.
  • the database module 211 may interact with a plurality of databases 219.
  • the plurality of databases 219 may be associated with different types of data. For example, and without limitation, one or more databases 219 may be associated with a particular product, or a particular manufacturer.
  • the database module 211 may be operable to interact with one or more databases 219 at the same time, and may be operable to retrieve or send data to each of the one or more databases 219.
  • the database 219 may store one or more pieces of data associated with unique identifiers, or may store other data related to the unique identifiers.
  • the database 219 may store information regarding manufacturer identification numbers, or part identification numbers.
  • the data may be organized in the database 219 in any way that allows access to the data.
  • the database 219 may be organized as a relational database 219, and may use relational database management system software.
  • the database 219 may be organized as a flat file, a spreadsheet, or any other type of data organization schema.
  • the database 219 may be stored on one or more systems that are associated with the decoding system 201, or the database 219 may be stored on the decoding system 201.
  • the database 219 and the decoding system 201 may be in communication, for example and without limitation, via one or more networks.
  • the database 219 may be stored on the decoding system 201 and be controlled by one or more processes executed on the decoding system 201, and the database module 211 may interface with the one or more processes to store and retrieve data in the database 219.
  • the symbology conversion module 217 may be operable to receive an input from, for example and without limitation, the input module 205, and to decipher the field of marks represented in the field.
  • the symbology conversion module 217 may use character recognition methods to identify information regarding the one or more marks in the field.
  • the information may include, for example and without limitation, the size, type, style, position, orientation, and/or number of the marks in the field.
  • the information of the marks may convey data.
  • the symbology conversion module 217 may assign data values to the information received from the field of marks. For example, and without limitation, the symbology conversion module 217 may assign a value to a mark in a certain orientation.
  • the symbology conversion module 217 may assign a value of "A" to a mark having a zero degree orientation, and a value of "B" to a mark having a ninety degree orientation.
  • the symbology conversion module 217 may assign other values to remaining marks in the field, or remaining mark orientations in the field.
  • the symbology conversion module 217 may produce as an output a string of one or more alphanumeric characters representative of the information encoded by the marks in the field.
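  • A short sketch of this value assignment on the decode side, using the example mapping stated above ("A" for a zero degree orientation, "B" for ninety degrees) and extending it to the remaining two orientations as an assumption:

```python
# Sketch: turning recognized mark orientations into an alphanumeric output string.
# Only the 0- and 90-degree assignments are stated above; 180/270 are assumed here.
ORIENTATION_TO_CHAR = {0: "A", 90: "B", 180: "C", 270: "D"}

def field_to_string(recognized_orientations: list[int]) -> str:
    """recognized_orientations: data-mark orientations in field reading order."""
    return "".join(ORIENTATION_TO_CHAR[o] for o in recognized_orientations)

# Example: four data marks recognized at these orientations decode to "ABDC".
assert field_to_string([0, 90, 270, 180]) == "ABDC"
```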
  • the decryption module 213 may receive an input and may decrypt the input to create a decrypted output.
  • the input may be in the form of data.
  • the input may be a string of alphanumeric characters.
  • the decrypted output may then be communicated to another module within the decoding system 201, or may be communicated to a system associated with the decoding system 201.
  • the decryption module 213 may use any method for decrypting data to decrypt the input.
  • the decryption may use a method complementary to the one used to encrypt the input at the encoding system. In one embodiment, more than one decryption method is used to decrypt the input.
  • the decryption module 213 may decrypt the data using one or more keys.
  • Each key may be substantially unique, so that each time the decryption module 213 receives an input, it may decrypt the input with a different key, or one key may be used to decrypt more than one input.
  • a key may be used to decrypt inputs that correspond to a particular manufacturer, or a particular manufactured item.
  • the keys may be transmitted to the database 219 to be associated with the input that the keys encrypt, or a particular key may have a unique identifier in the database 219, and the decryption module 213 may transmit the unique identifier of the key used to decrypt the input.
  • the decryption module 213 may also transmit the decryption method or methods to the database 219, to be associated with the input.
  • the error correction module 215 may receive an input with one or more error correcting elements, and may generate an output containing data.
  • the error correcting code may be generated by an implementation of one or more algorithms to receive an input, and generate Reed-Solomon codes based, at least in part, on the input.
  • the implementation may be in either hardware or software, or a combination of hardware and software.
  • the output module 207 may receive an input and generate an output to be communicated to a requestor 209 or a requestor system.
  • the output module 207 may, for example and without limitation, receive the input and communicate the input to the database 219.
  • the input may be, but is not limited to, the data received from the mark field, converted into data by the symbology conversion module 217, decoded by the error correction module 215, and decrypted by the decryption module 213.
  • the output module 207 may communicate the data to the database 219 to query if the data is present within the database 219. If the data is present within the database 219, the field may represent data that was encoded by the encoding system. If the data is not present within the database 219, the field may represent data that was not encoded by the encoding system.
  • the field, and therefore the item that the field was affixed to may, for example and without limitation, be counterfeit, or the amount of the field recovered by the image capture device may not be sufficient to yield accurate data.
  • a decoding system 201 may be positioned near an encoding system, and may attempt to decode the fields of marks that the encoding system produces, to ensure that the encoding system is operating properly and that the data encoded in the field of marks may be properly decoded and processed.
  • the encoding system 101 may receive a request to encode data.
  • the request may be, for example and without limitation, a request for a number of unique identifiers with one or more additional data codes.
  • a manufacturer may desire to create records for a number of individual parts, all of a certain type. The manufacturer may request data records for each of the number of individual parts.
  • the encoding system 101, the database module 113, and the database 111 may generate records containing a unique identifier, the manufacturer's unique code, and the part's unique code.
  • the database 111 may store the unique identifiers and associate the unique identifiers with the part's unique code and the manufacturer's unique code.
  • the encoding system 101 may receive the request for data, and may generate the data records associated with the data request.
  • the data records may be stored in the database 111 and may be communicated to the encoding system 101 via the database module 113 or the input module 105.
  • the data records may be transmitted to the database 219 for the decoding system 201, or the database 111 for the encoding system 101 and the database 219 for the decoding system 201 may be the same database, or copies of the same database.
  • the data records may be encrypted.
  • the data records may be encrypted by the encryption module 115.
  • the encryption module 115 may use any encryption method to encrypt the data records, and may use more than one encryption method to encrypt the data records.
  • the encryption module 115 may also digitally sign the encrypted data records.
  • the key or keys used to digitally sign the encrypted data records may be generated for each data record, or the key or keys may be used by the encoding system 101 to digitally sign some or all of the data records.
  • the error correction module 117 may receive the encrypted and signed data records, and may add one or more error correcting codes to the encrypted and signed data records.
  • the error correction module 117 may use any error correcting algorithms to add the error correcting codes.
  • the addition of the error correcting codes may allow the data to be extracted if a portion of the code has been obliterated. More or fewer error correcting codes, or additional redundancy, may be added to the encrypted and signed data records depending on the anticipated wear of the field of marks. For example, where heavy wear is anticipated and much or most of the field of marks is expected to be obliterated, additional error correction and/or redundancy may be added to the encrypted and signed data records. Where light wear or no wear is expected, less error correction and/or redundancy may be added to the encrypted and signed data records.
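  • As a back-of-envelope sizing sketch (an assumption, not a rule stated in the disclosure): for Reed-Solomon coding over bytes, each parity symbol can repair one erased symbol whose position is known, and two parity symbols are needed per error at an unknown position, so the parity budget can be scaled to the anticipated wear of the field of marks.

```python
# Sketch: scaling Reed-Solomon parity to anticipated wear (assumed sizing heuristic).
import math

def parity_symbols_needed(message_len: int, expected_loss_fraction: float,
                          positions_known: bool = True) -> int:
    """Estimate parity symbols so a block survives the expected fraction of damage.

    positions_known=True models obliterated marks whose locations are identified
    (erasures); False models silent corruption (unknown-position errors).
    """
    lost = math.ceil(expected_loss_fraction * message_len)
    return lost if positions_known else 2 * lost

# Heavy anticipated wear: half of a 40-symbol message expected to be obliterated.
assert parity_symbols_needed(40, 0.5) == 20
# Light wear with unknown error positions: 5% of 40 symbols, doubled.
assert parity_symbols_needed(40, 0.05, positions_known=False) == 4
```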
  • the symbology conversion module 119 may convert the data with the error correction codes from the error correction module 117 and may generate a field of marks to represent the data with the error correction codes.
  • the symbology conversion module 119 may use marks of different types, or a single mark of different orientations, or other combinations of size, shape, type, or orientation of mark to encode the data with the error correction codes.
  • the symbology conversion module 119 may also add additional error correction.
  • the symbology conversion module 119 may add row location identifiers and column location identifiers to location rows and location columns. The addition of the location rows and the location columns may allow the decoding system 201 to locate the position and orientation of marks even when a portion of the field of marks is destroyed or obliterated.
  • the output module 107 may communicate the field of marks to the image output device 109.
  • the image output device 109 may affix the field of marks to an item or to another physical object.
  • the image output device 109 may etch a field of marks to a physical item.
  • the image output device 109 may print a field of marks to a medium, for example a sticker, a piece of paper, or a slide.
  • the output module 107 may communicate to the encoding system 101 that the field of marks representing a particular data record has been printed or etched, and the database 111 may record that the data record has been created. The method may then end, as represented in step 317.
  • the steps may be performed in any order.
  • the encoding system 101 may sign the data records, as represented in step 309 before encrypting the data records, as represented in step 307.
  • the encoding system 101 may create the error correcting codes, as represented in step 311, before either encrypting the data records, as represented in step 307, or signing the data records, as represented in step 309.
  • the decoding system 201 may receive a field of marks.
  • the field may be etched or otherwise affixed to an item, or the field may be provided and otherwise associated with an item (e.g., printed on a packing slip or on a box containing the item or items).
  • the field of marks may be input into the decoding system 201.
  • the field of marks may be imaged using a video camera or a still camera, either film-based or digital, or an article may be scanned using a scanner. Other methods of reading a mark into a system may also be used.
  • the field of marks may be drawn into a system, or the field of marks may be typed into the system or input into the system in another way. The scan of the field of marks may yield an image.
  • the image may be processed to allow the decoding system 201 to convert the field of marks into representative data.
  • Referring to FIG. 5, a flow chart 500 to decode symbology to data is shown according to an embodiment of the present disclosure.
  • the image may be subjected to planar projection, where a three dimensional image, such as from a video camera or a still camera, may be mapped to points on a two dimensional plane.
  • the image may be subjected to planar projection so that the marks present in the image may be detected and analyzed by the decoding system 201.
  • the image may be processed to find its orientation.
  • the field of marks may be imaged in any orientation, and the decoding system 201 may use one or more algorithms to determine the field's orientation in the image.
  • the image may be rotated so that the field is in substantially a specific orientation, so that the marks present in the image may be detected and decoded.
  • Orientation may be discovered by the decoding system 201 from, for example and without limitation, the row location identifiers and the column location identifiers present in rows and/or columns of location marks within the field of marks. If no field of marks can be found, the decoding system 201 may send an error reporting that fact, or may wait for another image to analyze.
  • the image may be processed to determine spatial detection.
  • Spatial detection may include, for example and without limitation, the relative size of the field of marks in the image, or the relative position of the field of marks in the image. The size and position of the field of marks may be adjusted to allow for more efficient processing of the image. For example, and without limitation, the field of marks may be centered in the image, and the field of marks may be resized to a specific size. Spatial detection and image resizing may occur to counter the effects of differing equipment that may be used to print the field of marks and the equipment that may read the field of marks. If an image contains a field of marks that is in an acceptable position and size, repositioning and/or resizing of the field of marks in the image may not be desired.
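  • The sketch below shows one way these preprocessing steps might be implemented with OpenCV, assuming the four corners of the field have already been located (for example from the location marks); the library and the corner-finding step are assumptions, not part of the disclosure.

```python
# Sketch: planar projection, reorientation, and resizing of a captured field image.
# Assumes OpenCV (cv2) and that the field's four corner points are already known.
import cv2
import numpy as np

def normalize_field(image: np.ndarray, corners: np.ndarray,
                    out_size: int = 512, rotations_90cw: int = 0) -> np.ndarray:
    """Warp the quadrilateral given by `corners` (TL, TR, BR, BL order, float32,
    shape (4, 2)) onto an out_size x out_size square, then rotate it into the
    expected orientation."""
    dst = np.array([[0, 0], [out_size - 1, 0],
                    [out_size - 1, out_size - 1], [0, out_size - 1]],
                   dtype=np.float32)
    matrix = cv2.getPerspectiveTransform(corners.astype(np.float32), dst)
    square = cv2.warpPerspective(image, matrix, (out_size, out_size))
    for _ in range(rotations_90cw % 4):
        square = cv2.rotate(square, cv2.ROTATE_90_CLOCKWISE)
    return square

# Example with a synthetic frame and arbitrary (assumed) corner coordinates.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
corners = np.array([[100, 80], [540, 120], [500, 400], [80, 360]], dtype=np.float32)
field = normalize_field(frame, corners, out_size=512, rotations_90cw=1)
assert field.shape[:2] == (512, 512)
```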
  • an image that has been processed may be analyzed to determine the orientation and position of the marks in the field of marks.
  • the orientation of each of the marks in the field of marks may be noted, and one or more data values may be associated with each orientation and type of mark in the field of marks.
  • the location rows and the location columns may also be noted, and the data in the row location identifiers and the column location identifiers may also be noted.
  • Raw data, indicating the position and orientation of the marks may be recovered from the image. If any of the field of marks has been obliterated, the decoding system 201 may note the points that have been obliterated. The decoding system 201 may decode the location rows and the location columns to determine the position of the obliterated data marks.
  • eighty percent of the field of marks may be obliterated, and the data encoded in the field of marks may still be completely decoded with the remaining twenty percent of the field of marks, if the twenty percent forms a continuous portion of the field of marks (i.e., is not in two or more unconnected sections).
  • A representation of a field of marks, with a section of the field of marks obliterated, is shown in FIG. 10.
  • Referring to FIG. 10, there is shown a field 1000 of data marks with an obliterated portion according to an embodiment of the present disclosure.
  • the field 1000 may include location columns 1005 and location rows 1003.
  • the location columns 1005 may be represented by vertical lines that increase in thickness from left to right across the field.
  • the location rows 1003 may be represented by horizontal lines that increase in thickness from top to bottom across the field.
  • the darker portion 1007 of the field of marks may be visible, and the remainder of the field 1000 may be obliterated or otherwise unreadable.
  • the detection features represented in steps 501, 503, and 505 may resize and reorient the field 1000 so that the field is able to be analyzed.
  • the decoding system 201 may find, based on the relative thickness of the location columns 1005, that the marks visible in the remaining portion of the field may be in a proper orientation, that is, right-to-left in the field.
  • the decoding system 201 may also analyze the location rows 1003 remaining in the visible portion of the field 1000, and may also find, based on the relative thickness of the location rows 1003, that the marks visible in the remaining portion of the field may be in the proper orientation.
  • the decoding system 201 may also analyze the location columns and the location rows and may determine, for example and without limitation, that the location rows and the location columns indicate that the visible portion of the field of marks comprises portions of the first, second, and third rows of marks, and portions of the second, third, fourth, and fifth columns of marks.
  • the decoding system 201 may use this information to determine the position of the marks in the visible portion of the field of marks. Based on the information, the decoding system 201 may determine that enough marks remain to properly decode the message.
  • steps 501, 503, and 505 may be performed in any order, and may also be performed in any order with respect to the steps depicted in FIG. 4.
  • the decoding system 201 may receive the raw data from the module or modules for image processing (not shown), and the error correction module 215 may use the values from the raw data to retrieve the data encoded in the field of marks. As represented in step 411, if the data may be retrieved from the field of marks, the method may continue to step 413. If the data cannot be retrieved from the field of marks, because, for example, a portion of the field of marks has been obliterated or may not have been scanned, the decoding system 201 may process and transmit an error condition, as represented in step 419.
  • the error condition may include an error message, or the decoding system 201 may send an error message to another system. The method may end as represented in step 423.
  • the decryption module 213 may attempt to validate the signature of the data, if the data has been signed by the encoding system 101.
  • the signature may be transmitted from the encoding system 101 to the decoding system 201, or the decoding system 201 may have a key that may be used to verify the signature of the data.
  • If the validation of the digital signature is successful, the method may continue to step 421. If the validation of the digital signature is not successful, the decoding system 201 may process and transmit an error condition, as represented in step 419.
  • the error condition may include an error message, or the decoding system 201 may send an error message to another system. The method may end as represented in step 423.
  • the data may be decrypted.
  • the data may be decrypted by the decryption module 213.
  • the decryption module 213 may use any decryption method to decrypt the data, and may use more than one decryption method to decrypt the data records.
  • the decryption module 213 may use the same decryption method or methods as the encryption module 115 of the encoding system 101 used to encrypt the data records.
  • the decrypted data records may be compared to the data records generated by the encoding system 101.
  • If the decrypted data record is found in the database 111, the decoding system 201 may indicate that the data record is valid and authentic. If the decrypted data record is not found in the database 111, then the decoding system 201 may indicate that the data record is not authentic.
  • the decoding system 201 may also use the database 219 associated with the decoding system 201, or may use another database to track data records.
  • the steps may be performed in any order.
  • the steps for the decoding system 201, represented in method 400, may be the reverse of the steps for the encoding system 101, represented in method 300; however, this is not a requirement.
  • the decoding system 201 may decrypt the data record, as represented in step 421, before validating the digital signature of the data record, as represented in step 415.
  • the decoding system 201 may decrypt the data record, as represented in step 421, before attempting error correction, as represented in step 409.
  • FIG. 11 illustrates an encoding process 2000 in which data, such as an encrypted code, is embedded within image data of an image file.
  • the source image 2002 is a 32-bit bitmap type file which is 109,000 pixels by 109,000 pixels (about 12 Megapixels).
  • each pixel is a color dictated by a string including information for red, green and blue channels (8 bits each), as well as an alpha channel (also 8 bits) for transparency or density.
  • Each of the red, green and blue channels has 256 levels/magnitudes/shades (generally denoted by numerals 0 through 255, giving about 16.8 million possible colors across the three color channels combined), and the alpha channel likewise includes 256 levels of transparency (also generally denoted by numerals 0 through 255).
  • Other image data may include channels for hue, saturation, brightness, in at least some images, and these other channels may likewise include 256 levels. The combination of data from each channel provides a particular color and density for a pixel within the image.
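  • For illustration, the sketch below unpacks and repacks one such 32-bit RGBA pixel; the byte order shown (red in the most significant byte) is an assumption, since the disclosure does not fix a particular in-memory layout.

```python
# Sketch: splitting a 32-bit pixel into its four 8-bit channels and recombining them.
# Byte order (R, G, B, A from most to least significant) is assumed for illustration.
def unpack_rgba(pixel: int) -> tuple[int, int, int, int]:
    return (pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

def pack_rgba(r: int, g: int, b: int, a: int) -> int:
    return (r << 24) | (g << 16) | (b << 8) | a

r, g, b, a = unpack_rgba(pack_rgba(192, 48, 12, 51))
assert (r, g, b, a) == (192, 48, 12, 51)      # each channel holds a level 0..255
```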
  • an unencrypted message or code 2004 may include multiple identifiers such as, but not limited to, manufacturer identification, part identification, date information and other information as desired. The information may be presented in any desired order, and, if desired, at least one of the identifiers may be unique to a particular product or thing.
  • the message or unencrypted code 2004 includes 2-bit values, such as 1-100-1-1, which may be provided from a database module (e.g. module 113 described above).
  • the encryption module (e.g. the module 115 described above) may then convert the base code or message to an encrypted code 2006 utilizing one or more keys.
  • the encrypted code 2006 may include any desired number of data/values, such as is represented by the quaternary values shown in FIG. 11.
  • the 2-bit value 00 relates to a quaternary value of 1
  • the 2-bit value 01 relates to a quaternary value of 2
  • the 2-bit value 10 relates to a quaternary value of 3
  • the 2-bit value 11 relates to a quaternary value of 4.
  • other values, for both the noted 2-bit values and the noted quaternary values, can be selected as desired
  • the converted, encrypted code 2006 may be provided to the output module (e.g. module 107 described above), which maps the encrypted code 2006 into the source image; the source image, with the encrypted code 2006 embedded therein, becomes a coded image 2008 (also referred to as a coded image file), which is the output from the encoding process 2000.
  • the mapping of the encrypted code 2006 into the source image 2002 may be done in any desired manner.
  • the value of any one of the various channels for a pixel in the source image 2002 may be modified, such as by being replaced with a two-bit value (00, 01, 10, 11) representing one of the quatnary values of the encrypted code.
  • a two-bit value (00, 01, 10, 11) representing one of the quatnary values of the encrypted code.
  • An example is described for a pixel having values 192, 48, 12, and 51 for red, green, blue and alpha channels, respectively.
  • These channel values may be represented as binary' values 11000000, 00110000, 00001 100 and 0011001 1, respectively.
  • One of these channel values may be changed to a binary' value representing a quatnary' value of the encrypted code (e.g. 0, 1, 2 or 3 in an example wherein the encrypted code uses quatnary' values 0 through 3).
  • the code is provided in the last 2 digits of the binary value for one or more channels of pixel data, as these are the lowest-order bits and will provide the lowest magnitude of change to the pixel data, and hence to the color/appearance of the pixel. In this way, the embedded code provides a minimal, if any, change from the original channel value.
  • the original value for the red channel in this example is 192, which in binary form is 11000000. No change to this value need be made if the encrypted code 2006 requires the quaternary value of zero in this position of the image data, because the last two binary digits are 00, which corresponds to a quaternary value of zero. If the encrypted code 2006 requires a quaternary value of 1 in this position of the image data, then the binary value for the red channel of this pixel is changed to 11000001. This changes the decimal value of the red channel from 192 to 193. If the encrypted code 2006 requires a quaternary value of 2 in this position of the image data, then the binary value for the red channel of this pixel is changed to 11000010.
  • Such changes to the image/pixel data may be done for any desired number of pixels, and the manipulated pixels can have one or more channels altered, as desired to embed a particular code.
  • To provide 256 quaternary values, only 512 bits of data are needed: two bits in each of the 256 channels used for the code.
  • the pixels and/or channels selected to be changed in this way may be the same for each image in a series of images, or the pixels may be randomly or arbitrarily selected, or selected based upon a key or keys used in the encryption process.
  • the sequence of pixels can also be different for each iteration, if desired. That is, the encoding process need not follow a particular sequence by row number or column number of pixels in the matrix of image pixels.
  • the channel of image data within a selected pixel that is changed from its base value to the code value, and the order in which the pixels are changed, can be preselected or chosen in any desired way.
  • the key(s) and any other data relevant to the data embedded into the coded image may be transmitted to a requestor separately from the image (e.g. within a database or otherwise, as desired), or such information may be embedded within the image.
  • FIG. 12 illustrates a decoding process 2010 that begins with receipt of a file including the coded image 2008 (e.g. at an image input device or input module at which the image file is received).
  • the file for the coded image 2008 is analyzed pursuant to the key(s) and other mapping information provided, pixel data in the relevant pixels/channels is captured and ordered according to the manner in which such data was embedded/mapped into the image, and the encrypted code 2006 is decrypted to provide the base message or code 2004 to a requestor (a minimal extraction sketch in Python is provided following this list).
  • Error correction methods may be employed to enable decoding of messages even without all of the needed data (e.g. due to corruption or some modification of the image file).
  • The processes of FIGS. 11 and 12 do not require visual capture of an encrypted code, such as described above with regard to the field of symbols, some of which may be rotated at different angles. Instead, the underlying image data is analyzed to provide, initially, the quaternary values, which may then be decoded to the base message/code 2004 which is provided to a requestor/recipient. In some implementations, the image may be captured by a camera and the resulting image data analyzed to provide the data for decoding, as described herein.
  • the encoding and decoding method can be utilized in any image type having multiple channels of pixel data.
  • RAW, BMP, and PNG are all lossless image formats, as is lossless TIFF, and there are others.
  • a 24-bit color image is one with 3 bytes of data per pixel, one byte each for the red, green, and blue channels, and these channels may be manipulated in the same manner.
  • compressed image file formats, or any other type of image format, may also be used so long as compression or alteration of the file data does not occur between the encryption and decryption steps, or so long as the compression/alteration is done in a manner consistent with decoding the data (i.e., wherein the changes made are understood and can be accounted for in decoding).
  • the input and output file types/formats should be the same so the encoded data is not altered in a manner not recognized during decoding. While numerical values are shown and described with regard to the unencrypted code 2004 and the encrypted code 2006, other characters or indicia may be used, with appropriate conversions to binary values for use within the pixel data as set forth herein.
  • the terms “for example,” “for instance,” “e.g.,” “such as,” and “like,” and the verbs “comprising,” “having,” “including,” and their other verb forms, when used in conjunction with a listing of one or more components or other items, are each to be construed as open-ended, meaning that the listing is not to be considered as excluding other, additional components or items.
  • Other terms are to be construed using their broadest reasonable meaning unless they are used in a context that requires a different interpretation.
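By way of a non-limiting illustration of the extraction step noted in the list above, the following Python sketch reads back the embedded quaternary values from a coded image. It assumes the Pillow imaging library and a hypothetical, pre-shared pixel/channel map standing in for the key(s) and mapping information; decryption of the recovered code is not shown.

    from PIL import Image

    # Hypothetical map derived from the key(s): ordered (x, y, channel_index) entries.
    CODE_MAP = [(10, 4, 0), (37, 91, 2), (250, 8, 3)]  # illustrative values only

    def extract_quaternary_values(path, code_map):
        """Recover the embedded 2-bit values from a losslessly stored coded image."""
        image = Image.open(path).convert("RGBA")
        pixels = image.load()
        values = []
        for x, y, channel in code_map:
            channel_value = pixels[x, y][channel]   # 0..255
            values.append(channel_value & 0b11)     # the low two bits carry the code
        return values                               # e.g. [0, 3, 1], in embedding order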

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A method of decoding data within an image includes extracting from pixel data of the image an encrypted code including multiple values, and using an encryption key to decrypt the encrypted code to provide an unencrypted code including at least one identifier.

Description

ENCODING AND DECODING DATA, INCLUDING WITHIN IMAGE DATA
REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application Serial No. 63/085,196, filed on September 30, 2020, the entire content of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This disclosure relates to encoding and decoding with respect to a code embedded within image data.
BACKGROUND
Various codes and encryption techniques are used to identify genuine products, data and things throughout a supply chain. Such codes and techniques may be used, for example, to inhibit or prevent introduction of counterfeit products in a supply chain, and/or to permit end users or consumers to verify the authenticity of a product.
SUMMARY
In at least some implementations, a method of encoding data within an image includes determining an unencrypted code including at least one identifier, encrypting the unencrypted code to provide an encrypted code including multiple values, and embedding the values of the encrypted code into pixel data of selected pixels of the image. In at least some implementations, the values are embedded into the pixel data as binary numbers that correspond to the values of the encrypted code. The pixel data may include multiple channels for each pixel, each of the multiple channels includes binary numbers, and the values are embedded into at least one channel of each of a plurality of selected pixels. In at least some implementations, the values are embedded into the lowest order bits of the binary numbers.
In at least some implementations, the encrypting step is done with a key that indicates which channels of which pixels are to be embedded with which of the values of the encrypted code.
In at least some implementations, a method of decoding data within an image includes steps of extracting from pixel data of the image an encrypted code including multiple values, and using an encryption key to decrypt the encrypted code to provide an unencrypted code including at least one identifier. The extracting step may be accomplished with a key or map that indicates which channels within which pixels within the image include values of the encrypted code, and the order in which the values are arranged in the encrypted code.
BRIEF DESCRIPTION OF THE DRAWINGS
The following detailed description of preferred embodiments and best mode will be set forth with reference to the accompanying drawings, in which:
FIG. 1 is a component view of an encoding system according to an embodiment of the present disclosure;
FIG. 2 is a component view of a decoding system according to an embodiment of the present disclosure;
FIG. 3 is a flow chart to encode data to symbology according to an embodiment of the present disclosure;
FIG. 4 is a flow chart to decode symbology to data according to an embodiment of the present disclosure;
FIG. 5 is a flow chart to decode symbology to data according to an embodiment of the present disclosure;
FIG. 6 is a field identification scheme according to an embodiment of the present disclosure;
FIG. 7 is a selection of field row and column identifier marks according to an embodiment of the present disclosure;
FIG. 8 is a selection of data marks according to an embodiment of the present disclosure;
FIG. 9 is a selection of row and column identifier marks according to an embodiment of the present disclosure;
FIG. 10 is a field of data marks with an obliterated portion according to an embodiment of the present disclosure;
FIG. 11 is a diagram of a method for embedding a code within image data of an image; and
FIG. 12 is a diagram of a method for extracting a code from image data of an image.
Corresponding reference characters indicate corresponding parts throughout the several views. The embodiments of the disclosure described herein are not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Rather, the embodiments selected for description have been chosen to enable one skilled in the art to practice the subject matter of the disclosure. Although the disclosure describes specific configurations of systems and methods to encode and decode data, it should be understood that the concepts presented herein may be used in other various configurations consistent with this disclosure.
DETAILED DESCRIPTION
Referring in more detail to the drawings, FIG. 1 shows a component view of an encoding system according to an embodiment of the present disclosure. The encoding system 101 may include an input module 105, a database module 113, an encryption module 115, an error correction module 117, a symbology conversion module 119, and an output module 107.
The encoding system may be a single system, or may be two or more systems in communication with each other. The encoding system may include one or more input devices, one or more output devices, one or more processors, and memory associated with the one or more processors. The memory associated with the one or more processors may include, but is not limited to, memory associated with the execution of the modules, and memory associated with the storage of data. The encoding system may also be associated with one or more networks, and may communicate with one or more additional systems via the one or more networks. The modules may be implemented in hardware or software, or a combination of hardware and software. The encoding system may also include additional hardware and/or software to allow the encoding system to access the input devices, the output devices, the processors, the memory, and the modules. The modules, or a combination of the modules, may be associated with a different processor and/or memory, for example on distinct systems, and the systems may be located separately from one another. In one embodiment, the modules may be executed on the same system as one or more processes or services. The modules may be operable to communicate with one another and to share information. Although the modules are described as separate and distinct from one another, the functions of two or more modules may instead be executed in the same process, or in the same system.
The input module 105 may receive a request for data from a requestor 103. The requestor 103 may be one or more individuals interacting with the input module 105 via an interface. For example, and without limitation, the interface may be a keyboard, computer mouse, trackpad, touch sensitive screen or film, or other device used to generate an input. The input module 105 may also receive input over a network from another system. For example, and without limitation, the input module 105 may receive one or more signals from a computer over one or more networks. The input module 105 may accept inputs regarding, for example, the total number of unique identifiers requested by the requestor 103, and/or other information such as information that may remain the same and be associated with one or more of the unique identifiers. For example, the requestor 103 may input data associated with a vendor identifier, and the vendor identifier may be associated with the unique identifier and converted into a mark field.
The input device may communicate with the input module 105 via a dedicated connection or any other type of connection. For example, and without limitation, the input device may be in communication with the input module 105 via a Universal Serial Bus ("USB") connection, via a serial or parallel connection to the input module 105, or via an optical or radio link to the input module 105. Any communications protocol may be used to communicate between the input device and the input module 105. For example, and without limitation, a USB protocol or a Bluetooth protocol may be used.
The network may include one or more of: a local area network, a wide area network, a radio network such as a radio network using an IEEE 802.11x communications protocol, a cable network, a fiber network or other optical network, a token ring network, or any other kind of packet-switched network. The network may include the Internet, or may include any other type of public or private network. The use of the term "network" does not limit the network to a single style or type of network, or imply that one network is used. A combination of networks of any communications protocol or type may be used. For example, two or more packet-switched networks may be used, or a packet-switched network may be in communication with a radio network.
The database module 113 may interact with one or more databases 111. The database module 113 may communicate with a database 111 to request information, or to send information to be organized in the database 111. In one embodiment, the database module 113 may interact with a plurality of databases 111. The plurality of databases 111 may be associated with different types of data. For example, and without limitation, one or more databases 111 may be associated with a particular product, or a particular manufacturer. The database module 113 may be operable to interact with one or more databases 111 at the same time, and may be operable to retrieve or send data to each of the one or more databases 111.
The database 111 may store one or more pieces of data associated with unique identifiers, or may store other data related to the unique identifiers. For example, the database 111 may store information regarding manufacturer identification numbers, or part identification numbers. The data may be organized in the database 111 in any way that allows access to the data. For example, and without limitation, the database 111 may be organized as a relational database 111, and may use relational database 111 management system software. In another embodiment, the database 111 may be organized as a flat file, a spreadsheet, or any other type of data organization schema. The database 111 may be stored on one or more systems that are associated with the encoding system, or the database 111 may be stored on the encoding system. The database 111 and the encoding system may be in communication, for example and without limitation, via one or more networks. For example, the database 111 may be stored on the encoding system and be controlled by one or more processes executed on the encoding system, and the database module 113 may interface with the one or more processes to store and retrieve data in the database 111.
The encryption module 115 may receive an input and may encrypt the input. The input may be in the form of data. For example, and without limitation, the input may be a string of alphanumeric characters. The encrypted input may then be communicated to another module within the encoding system, or may be communicated to a system associated with the encoding system. The encryption module 115 may use any method for encrypting data to encrypt the input. For example, the encryption module 115 may present a choice between, for example and without limitation, a block encryption using one or more implementations of the Advanced Encryption Standard ("AES") and/or a stream encryption. In one embodiment, more than one encryption method is used to encrypt the input. The encryption module 115 may encrypt the data using one or more keys. Each key may be substantially unique, so that each time the encryption module 115 receives an input, it may encrypt the input with a different key, or one key may be used to encrypt more than one input. For example, and without limitation, a key may be used to encrypt inputs that correspond to a particular manufacturer, or a particular manufactured item. The keys may be transmitted to the database 111 to be associated with the input that the keys encrypt, or a particular key may have a unique identifier in the database 111, and the encryption module 115 may transmit the unique identifier of the key used to encrypt an input. The encryption module 115 may also transmit the encryption method or methods to the database 111, to be associated with the input.
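By way of a non-limiting illustration only, the following Python sketch shows one form the block encryption mentioned above could take, using AES in GCM mode from the third-party cryptography package; the library, mode, key length and record format are assumptions for illustration, not requirements of this disclosure.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_record(plaintext: bytes, key: bytes):
        """Encrypt one data record; returns (nonce, ciphertext)."""
        nonce = os.urandom(12)                    # unique per encryption
        return nonce, AESGCM(key).encrypt(nonce, plaintext, None)

    def decrypt_record(nonce: bytes, ciphertext: bytes, key: bytes) -> bytes:
        return AESGCM(key).decrypt(nonce, ciphertext, None)

    key = AESGCM.generate_key(bit_length=128)     # e.g. one key per manufacturer
    nonce, ct = encrypt_record(b"MFR123|PART456|0001", key)
    assert decrypt_record(nonce, ct, key) == b"MFR123|PART456|0001"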
The error correction module 117 may receive an input, and may generate an output containing one or more error correcting elements. The error correcting elements in the output may be used so that if a portion of the output is destroyed or lost, the input may still be able to be retrieved based at least in part on the remainder of the output and the remainder of the error correcting elements. In one embodiment, the error correcting code may be generated by an implementation of one or more algorithms to receive an input, and generate Reed-Solomon codes based, at least in part, on the input. The implementation may be in either hardware or software, or a combination of hardware and software. In one embodiment, the error correction module 117 may receive an encrypted input from the encryption module 115, and may apply the error correcting code to the encrypted input.
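As a non-limiting sketch of Reed-Solomon error correction, the following Python example uses the third-party reedsolo package (an assumption; any hardware or software implementation could be used). Ten parity bytes are appended so the original bytes can be recovered even after several bytes of the output are damaged.

    from reedsolo import RSCodec

    rsc = RSCodec(10)                            # 10 parity bytes: corrects up to 5 byte errors
    encoded = rsc.encode(b"encrypted and signed record")

    corrupted = bytearray(encoded)
    corrupted[0] ^= 0xFF                         # simulate partial obliteration
    corrupted[7] ^= 0xFF

    # Recent reedsolo versions return (message, message_plus_parity, error_positions).
    repaired = rsc.decode(bytes(corrupted))[0]
    assert repaired == b"encrypted and signed record"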
The symbology conversion module 119 may receive an input, and may generate a field of one or more marks in one or more orientations based at least in part on the input. In one embodiment, the symbology conversion module 119 may generate a field of marks that include data marks and location marks. The data marks and the location marks may be the same symbol in differing orientations, or may be different symbols. The marks that the symbology conversion module 119 uses may be located in the database 111, or may be located in another system. The symbology conversion module 119 may use different marks for different applications. For example, and without limitation, the symbology conversion module 119 may use one mark or one set of marks for each given manufacturer listed in the database 111. The symbology conversion module 119 may also use other marks that may be supplied from an external source.
In one embodiment, the symbology conversion module 119 may receive the symbols to use for the encoding process from the database 111, or may request the symbols to use for the encoding process from the database 111. The symbology conversion module 119, or another module, may transmit, for example and without limitation, the unique identifier, or other data, to the database 111. The database 111 may transmit one or more symbols to use for the symbology to the symbology conversion module 119.
The marks in a field may include data symbols and location symbols. In one embodiment, the data symbols and the location symbols are the same symbol rotated in different orientations. Turning now to FIG. 8, there is depicted a selection of data marks according to an embodiment of the present disclosure. A symbol according to the example shown in FIG. 8 is depicted by two lines intersecting at an approximately fifty degree angle. Other intersection angles may be used, including a ninety degree and a forty-five degree angle. FIG. 8 depicts four orientations of a data mark. In relation to an orientation line 801, the first data mark 803 is oriented at zero degrees clockwise, the second data mark 805 is oriented at ninety degrees clockwise, the third data mark 807 is oriented at one hundred eighty degrees clockwise, and the fourth data mark 809 is oriented at two hundred seventy degrees clockwise. Four different data marks are shown as different orientations of a single mark, corresponding to four different values that the mark may represent as a data mark. In one embodiment, more orientations of a single mark may be used to correspond to additional values that the mark may represent. For example, the single mark depicted in FIG. 8 may be oriented in 45 degree increments, yielding eight positions instead of the four depicted in FIG. 8. Additional orientations are possible, depending on the sensitivity of the image output device 109 and/or the image input device 203, or the size requirements or space requirements of the encoded field. In one embodiment, additional marks may be used in place of a specifically oriented mark. For example, in the embodiment shown in FIG. 8, the letters "A," "B," "C," and "D" may be used as marks, and the orientation of the letters may not be considered to encode additional data. In one embodiment, both additional marks and orientation information may be used to represent data values. For example, the letters "A," "B," "C," and "D" may be used as marks, and each of the letters may be oriented in four positions as depicted in FIG. 8. For four different marks presented, and four different orientations, a total of sixteen data values may be represented by a single mark in a specific orientation.
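As a non-limiting sketch of how a mark's shape and orientation can jointly represent data, the following Python example maps a value from 0 to 15 onto one of four hypothetical glyphs in one of four clockwise orientations, consistent with the sixteen-value example above; the glyphs and the ordering are illustrative assumptions only.

    GLYPHS = ["A", "B", "C", "D"]        # hypothetical mark shapes
    ANGLES = [0, 90, 180, 270]           # clockwise rotations from the orientation line

    def value_to_mark(value: int):
        """Map a data value 0..15 to a (glyph, rotation in degrees) pair."""
        if not 0 <= value < len(GLYPHS) * len(ANGLES):
            raise ValueError("value out of range for this mark alphabet")
        return GLYPHS[value // len(ANGLES)], ANGLES[value % len(ANGLES)]

    print(value_to_mark(6))              # ('B', 180): glyph "B" rotated 180 degrees clockwise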
Turning now to FIG. 9, there is depicted a selection of location marks according to an embodiment of the present disclosure. A symbol according to the example shown in FIG. 9 is depicted by two lines intersecting at an approximately fifty (50) degree angle. Of course, other intersection angles may be used, including a ninety degree and a forty-five (45) degree angle. FIG. 9 depicts four orientations of a single location mark. In relation to an orientation line 901, the first location mark 903 is oriented at forty-five degrees clockwise, the second location mark 905 is oriented at one hundred thirty-five degrees clockwise, the third location mark 907 is oriented at two hundred twenty-five degrees clockwise, and the fourth location mark 909 is oriented at three hundred fifteen degrees clockwise. As shown above with respect to the data marks, additional location marks may be provided, as well as additional orientations of the same or different location marks. As shown in FIGS. 8 and 9, the data marks and the location marks may be the same mark oriented to different angles. The data marks of FIG. 8 may be oriented at right angles with respect to orientation line 801, and the location marks of FIG. 9 may be oriented at 45 degree angles with respect to orientation line 901.
A field may include one or more marks arranged in an order to represent data values. The field may be a fixed number of marks arranged in a pattern. For example, and as shown in FIG. 6, the field may be a number of marks arranged in a square array of nineteen marks across and nineteen marks down. Other amounts of marks, in other orientations and patterns, may also be used. The size and shape of the field may be changed according to the medium that the field may be affixed to, or the amount of data that is required for a task. For example, if a task requires only that a small amount of data be encoded into a field, then the field may be smaller. If a task requires that a large amount of data be encoded into a field, or if a particular application may use additional error correction, then the number of marks to encode the field may increase, and the field may be larger.
The field may be divided into sections of data marks, representative sections of which are shown as elements 611, 613, 615, 619, and 621 in FIG. 6, and sections of location marks, representative sections of which are shown as elements 603, 605, 607, and 609. The data marks may encode data information, and the location marks may encode position within the field of marks.
The location marks may be arranged so that one or more location marks may identify the relative position of the location marks, and also the marks surrounding the location marks. A representation of the position of location marks is shown in FIG. 6. Data mark subfield 611 is, for example, bounded by a row of location marks 603, and a column of location marks 605. The row of location marks alternates between noting a row location identifier, depicted by identifier 603b, and a column location identifier, depicted by identifier 603a. The row of location marks 603, for example, indicates that the row of location marks 603 is the first row of location marks in the field, indicated by row identifier 603b. The row of location marks 603 bounding data mark subsection 611 also indicates, for example, that it bounds the first column of data marks in the field, indicated by column identifier 603a.
Similarly, the column of location marks 605 alternates between noting a column location identifier, depicted by identifier 605b, and a row location identifier, depicted by identifier 605a. The column of location marks 605, for example, indicates that the column of location marks 605 is the first column of location marks in the field, indicated by column identifier 605b. The column of location marks 605 bounding data mark subsection 611 also indicates, for example, that the column of location marks 605 bounds the first row of data marks in the field, indicated by row identifier 605a.
Similarly, data mark subfield 613 is bounded by a row of location marks 603, and a column of location marks 609. The row location identifier 603b that bounds the data mark subsection 613 may be the same as the row location identifier 603b that bounds the data mark subsection 611, to indicate that the row of location marks 603 is the same row bounding both data subsections 611 and 613. The column of location marks 609 alternates between noting a column location identifier, depicted by 609b, and a row location identifier, depicted by identifier 609a. The column of location marks 609, for example, indicates that the column of location marks 609 is the second column of location marks in the field, indicated by column location identifier 609b. The column of location marks 609 bounding data mark subsection 613 also indicates, for example, that the column of location marks 609 bounds the second row of data marks in the field, indicated by row identifier 609a.
Turning now to FIG. 7, there is shown a selection of field row and column identifier marks according to an embodiment of the present disclosure. The row of generic identifiers depicted in 701 may be the same identifiers shown in the location rows and location columns of FIG. 6. That is, the first four columns of row 701 may identify the column location identifiers shown in the location rows and the location columns of FIG. 6. The fifth through eighth columns of row 701 may identify the row location identifiers shown in the location rows and the location columns of FIG. 6. Row location identifiers and column location identifiers are shown in rows 703 and 705. In row 703, the column location identifiers may comprise a horizontal line intersecting a vertical line. The thickness of the vertical line may indicate positional data. For example, a relatively thin vertical line may indicate that the column location identifier is the first column location identifier in the field, and a relatively thicker vertical line may indicate that the column location identifier is the last column location identifier in the field. In row 703, the row location identifiers may comprise a vertical line intersecting a horizontal line. The thickness of the horizontal line may indicate positional data. For example, a relatively thin horizontal line may indicate that the row location identifier is the first row location identifier in the field, and a relatively thicker horizontal line may indicate that the row location identifier is the last row location identifier in the field.
In another embodiment, and as shown in row 705 of FIG. 7, frequency of lines may also convey positional information. In row 705, the column location identifiers may comprise a horizontal line intersecting one or more vertical lines, all of approximately the same thickness. The number of vertical lines may indicate positional data. For example, one vertical line may indicate that the column location identifier is the first column location identifier in the field, and four vertical lines may indicate that the column location identifier is the last column location identifier in the field. In row 705, the row location identifiers may comprise a vertical line intersecting one or more horizontal lines, all of approximately the same thickness. The number of horizontal lines may indicate positional data. For example, one horizontal line may indicate that the row location identifier is the first row location identifier in the field, and four horizontal lines may indicate that the row location identifier is the last row location identifier in the field.
Rows 703 and 705 of FIG. 7 are intended to be examples only. In one embodiment, the row location identifiers and the column location identifiers may be mixed, so that, for example, the row location identifiers of row 703 and the column location identifiers of row 705 may be used. Additionally, other marks or styles may be used. For example, and without limitation, a single mark with four orientations may be used to represent four different location information data points. Two marks, each with four orientations, may be used to convey information associated with the row and column location identifiers.
The output module 107 may receive an input, and may be in communication with the image output device 109 to present the one or more symbols. In one embodiment, the output module 107 may receive the one or more symbols from the symbology conversion module 119, and may transmit the one or more symbols to the image output device 109. The output module 107 and the image output device 109 may be in communication with one another. For example, and without limitation, the output module 107 and the image output device 109 may be in communication via a network, or may be in communication via a dedicated connection, such as a cable or radio link.
The image output device 109 may create a representation of the field. For example, and without limitation, the image output device 109 may create a representation of the field in a physical form. In one embodiment, the image output device 109 may include one or more lasers and a device to position the lasers. The image output device 109 may direct the beam of the laser onto a surface, and ablate or etch a representation of the field onto the surface. In one embodiment, the image output device 109 may include a printer. The printer may print a representation of the field onto a medium or substrate, such as a piece of paper or a material with an adhesive backing (e.g., a sticker).

Turning now to FIG. 2, a component view of a decoding system 201 according to an embodiment of the present disclosure is shown. The decoding system 201 may include an input module 205, a database module 211, a decryption module 213, an error correction module 215, a symbology conversion module 217, and an output module 207. The decoding system 201 may be a single system, or may be two or more systems in communication with each other. The decoding system 201 may include one or more input devices, one or more output devices, one or more processors, and memory associated with the one or more processors. The memory associated with the one or more processors may include, but is not limited to, memory associated with the execution of the modules, and memory associated with the storage of data. The decoding system 201 may also be associated with one or more networks, and may communicate with one or more additional systems via the one or more networks. The modules may be implemented in hardware or software, or a combination of hardware and software. The decoding system 201 may also include additional hardware and/or software to allow the decoding system 201 to access the input devices, the output devices, the processors, the memory, and the modules. The modules, or a combination of the modules, may be associated with a different processor and/or memory, for example on distinct systems, and the systems may be located separately from one another. In one embodiment, the modules may be executed on the same system as one or more processes or services. The modules may be operable to communicate with one another and to share information. Although the modules are described as separate and distinct from one another, the functions of two or more modules may instead be executed in the same process, or in the same system.
The input module 205 may be in communication with an image input device 203. The input module 205 may be operable to receive signals from the image input device 203 and to transmit the signals to other modules in the decoding system 201. The input module 205 may receive the image from the image input device 203, and may apply one or more transformations to the image to allow the image to be analyzed. For example, and without limitation, the input module 205 may resize, rotate, deskew, flatten, or otherwise process or clarify the image to allow the image to be further processed.
The image input device 203 may be operable to receive a field and to convert the field into signals that may be communicated to the input module 205. For example, and without limitation, the image input device 203 may be a charge-coupled device ("CCD") in a video capture device, a digital camera, or a scanner. The CCD device may, for example, record or otherwise capture a video. The field may pass in front of the CCD, and one or more images of the field may be captured by the video capture device. The CCD device may also be mounted in a still-frame camera. The still-frame camera may be operable to, for example and without limitation, capture images on a timed scale (e.g., every second or every minute) or to capture images based on a trigger (e.g., if an object breaks a beam of light). The image input device 203 may also be a scanner. For example, and without limitation, the field may be placed on the scanner, and the scanner may capture a representation of the field. The representation may be transmitted to the input module 205.
The database module 211 may interact with one or more databases 219. The database module 211 may communicate with a database 219 to request information, or to send information to be organized in the database 219. In one embodiment, the database module 211 may interact with a plurality of databases 219. The plurality of databases 219 may be associated with different types of data. For example, and without limitation, one or more databases 219 may be associated with a particular product, or a particular manufacturer. The database module 211 may be operable to interact with one or more databases 219 at the same time, and may be operable to retrieve or send data to each of the one or more databases 219.
The database 219 may store one or more pieces of data associated with unique identifiers, or may store other data related to the unique identifiers. For example, the database 219 may store information regarding manufacturer identification numbers, or part identification numbers. The data may be organized in the database 219 in any way that allows access to the data. For example, and without limitation, the database 219 may be organized as a relational database 219, and may use relational database 219 management system software. In another embodiment, the database 219 may be organized as a flat file, a spreadsheet, or any other type of data organization schema. The database 219 may be stored on one or more systems that are associated with the decoding system 201, or the database 219 may be stored on the decoding system 201. The database 219 and the decoding system 201 may be in communication, for example and without limitation, via one or more networks. For example, the database 219 may be stored on the decoding system 201 and be controlled by one or more processes executed on the decoding system 201, and the database module 211 may interface with the one or more processes to store and retrieve data in the database 219.
The symbology conversion module 217 may be operable to receive an input from, for example and without limitation, the input module 205, and to decipher the field of marks represented in the field. The symbology conversion module 217 may use character recognition methods to identify information regarding the one or more marks in the field. The information may include, for example and without limitation, the size, type, style, position, orientation, and/or number of the marks in the field. The information of the marks may convey data. The symbology conversion module 217 may assign data values to the information received from the field of marks. For example, and without limitation, the symbology conversion module 217 may assign a value to a mark in a certain orientation. The symbology conversion module 217 may assign a value of "A" to a mark having a zero degree orientation, and a value of "B" to a mark having a ninety degree orientation. The symbology conversion module 217 may assign other values to remaining marks in the field, or remaining mark orientations in the field. The symbology conversion module 217 may produce as an output a string of one or more alphanumeric characters representative of the information encoded by the marks in the field.
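The following Python sketch illustrates, under simplifying assumptions, one way such an assignment could be made from a measured mark angle: angles near a multiple of ninety degrees are treated as data marks and assigned the characters "A" through "D", while angles near an odd multiple of forty-five degrees are treated as location marks. The tolerance and the character assignments are illustrative only.

    DATA_VALUES = {0: "A", 90: "B", 180: "C", 270: "D"}    # illustrative assignment

    def classify_mark(measured_angle: float, tolerance: float = 10.0):
        """Snap a measured orientation to the nearest 45-degree step and classify the mark."""
        angle = measured_angle % 360.0
        nearest = round(angle / 45.0) * 45 % 360
        error = abs(angle - nearest)
        if error > tolerance and error < 360 - tolerance:
            return "unknown", None
        if nearest % 90 == 0:
            return "data", DATA_VALUES[nearest]
        return "location", nearest                          # 45, 135, 225 or 315 degrees

    print(classify_mark(92.3))     # ('data', 'B')
    print(classify_mark(223.8))    # ('location', 225)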
The decryption module 213 may receive an input and may decrypt the input to create a decrypted output. The input may be in the form of data. For example, and without limitation, the input may be a string of alphanumeric characters. The encrypted input may then be communicated to another module within the decoding system 201, or may be communicated to a system associated with the decoding system 201. The decryption module 213 may use any method for decrypting data to decrypt the input. For example, the decryption may use a method complementary to the one used to encrypt the input at the encoding system 101. In one embodiment, more than one decryption method is used to decrypt the input. The decryption module 213 may decrypt the data using one or more keys. Each key may be substantially unique, so that each time the decryption module 213 receives an input, it may decrypt the input with a different key, or one key may be used to decrypt more than one input. For example, and without limitation, a key may be used to decrypt inputs that correspond to a particular manufacturer, or a particular manufactured item. The keys may be transmitted to the database 219 to be associated with the input that the keys encrypt, or a particular key may have a unique identifier in the database 219, and the decryption module 213 may transmit the unique identifier of the key used to decrypt the input. The decryption module 213 may also transmit the decryption method or methods to the database 219, to be associated with the input.
The error correction module 215 may receive an input with one or more error correcting elements, and may generate an output containing data. In one embodiment, the error correcting code may be generated by an implementation of one or more algorithms to receive an input, and generate Reed-Solomon codes based, at least in part, on the input. The implementation may be in either hardware or software, or a combination of hardware and software.

The output module 207 may receive an input and generate an output to be communicated to a requestor 209 or a requestor system. The output module 207 may, for example and without limitation, receive the input and communicate the input to the database 219. The input may be, but is not limited to, the data received from the mark field, converted into data by the symbology conversion module 217, decoded by the error correction module 215, and decrypted by the decryption module 213. The output module 207 may communicate the data to the database 219 to query if the data is present within the database 219. If the data is present within the database 219, the field may represent data that was encoded by the encoding system. If the data is not present within the database 219, the field may represent data that was not encoded by the encoding system. The field, and therefore the item that the field was affixed to, may, for example and without limitation, be counterfeit, or the amount of the field recovered by the image capture device may not be sufficient to yield accurate data. In one embodiment, a decoding system 201 may be positioned near an encoding system, and may attempt to decode the fields of marks that the encoding system produces, to ensure that the encoding system is operating properly and that the data encoded in the field of marks may be properly decoded and processed.
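As a non-limiting sketch of the authenticity check described above, the following Python example records issued identifiers in a small SQLite table and reports whether a decoded record was issued; the database engine, file name and schema are assumptions for illustration only.

    import sqlite3

    conn = sqlite3.connect("issued_records.db")   # hypothetical database of issued records
    conn.execute("CREATE TABLE IF NOT EXISTS records (unique_id TEXT PRIMARY KEY)")
    conn.execute("INSERT OR IGNORE INTO records VALUES (?)", ("MFR123|PART456|0001",))
    conn.commit()

    def is_authentic(decoded_record: str) -> bool:
        """Treat a field as authentic only if its decoded record was previously issued."""
        row = conn.execute(
            "SELECT 1 FROM records WHERE unique_id = ?", (decoded_record,)
        ).fetchone()
        return row is not None

    print(is_authentic("MFR123|PART456|0001"))    # True
    print(is_authentic("FAKE|RECORD|9999"))       # False: possibly counterfeit or misread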
Turning now to FIG. 3, a flow chart 300 to encode data to symbology is shown according to an embodiment of the present disclosure. The method may begin as represented in step 301. As represented in step 303, the encoding system 101 may receive a request to encode data. The request may be, for example and without limitation, a request for a number of unique identifiers with one or more additional data codes. For example, a manufacturer may desire to create records for a number of individual parts, all of a certain type. The manufacturer may request data records for each of the number of individual parts. The encoding system 101, the database module 113, and the database 111 may generate records containing a unique identifier, the manufacturer's unique code, and the part's unique code. The database 111 may store the unique identifiers and associate the unique identifiers with the part's unique code and the manufacturer's unique code.
As represented in step 305, the encoding system 101 may receive the request for data, and may generate the data records associated with the data request. The data records may be stored in the database 111 and may be communicated to the encoding system 101 via the database module 113 or the input module 105. In one embodiment, the data records may be transmitted to the database 219 for the decoding system 201, or the database 111 for the encoding system 101 and the database 219 for the decoding system 201 may be the same database, or copies of the same database.
As represented in step 307, the data records may be encrypted. The data records may be encrypted by the encryption module 115. The encryption module 115 may use any encryption method to encrypt the data records, and may use more than one encryption method to encrypt the data records. As represented in step 309, the encryption module 115 may also digitally sign the encrypted data records. The key or keys used to digitally sign the encrypted data records may be generated for each data record, or the key or keys may be used by the encoding system 101 to digitally sign some or all of the data records.
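As a non-limiting sketch of the encrypt-then-sign sequence of steps 307 and 309, the following Python example signs an already encrypted record with an Ed25519 key using the cryptography package; the signature scheme is an assumption, as the disclosure does not prescribe one, and the ciphertext is a placeholder.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    signing_key = Ed25519PrivateKey.generate()       # per-record or shared, per the text above

    encrypted_record = b"...ciphertext from step 307..."    # placeholder
    signature = signing_key.sign(encrypted_record)          # step 309

    # The decoding system can later verify with the corresponding public key.
    public_key = signing_key.public_key()
    try:
        public_key.verify(signature, encrypted_record)
        print("signature valid")
    except InvalidSignature:
        print("signature invalid")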
As represented in step 311, the error correction module 117 may receive the encrypted and signed data records, and may add one or more error correcting codes to the encrypted and signed data records. The error correction module 117 may use any error correcting algorithms to add the error correcting codes. The addition of the error correcting codes may allow the data to be extracted if a portion of the code has been obliterated. More or fewer error correcting codes, or additional redundancy, may be added to the encrypted and signed data records depending on the anticipated wear of the field of marks. For example, where heavy wear is anticipated and much or most of the field of marks is expected to be obliterated, additional error correction and/or redundancy may be added to the encrypted and signed data records. Where light wear or no wear is expected, less error correction and/or redundancy may be added to the encrypted and signed data records.
As represented in step 313, the symbology conversion module 119 may convert the data with the error correction codes from the error correction module 117 and may generate a field of marks to represent the data with the error correction codes. The symbology conversion module 119 may use marks of different types, or a single mark of different orientations, or other combinations of size, shape, type, or orientation of mark to encode the data with the error correction codes. The symbology conversion module 119 may also add additional error correction. For example, the symbology conversion module 119 may add row location identifiers and column location identifiers to location rows and location columns. The addition of the location rows and the location columns may allow the decoding system 201 to locate the position and orientation of marks even when a portion of the field of marks is destroyed or obliterated.
As represented in step 315, the output module 107 may communicate the field of marks to the image output device 109. The image output device 109 may affix the field of marks to an item or to another physical object. In one embodiment, the image output device 109 may etch a field of marks to a physical item. In one embodiment, the image output device 109 may print a field of marks to a medium, for example a sticker, a piece of paper, or a slide. The output module 107 may communicate to the encoding system 101 that the field of marks representing a particular data record has been printed or etched, and the database 111 may record that the data record has been created. The method may then end, as represented in step 317.
While the method 300 is represented as a series of steps, the steps may be performed in any order. For example, the encoding system 101 may sign the data records, as represented in step 309 before encrypting the data records, as represented in step 307. Additionally, the encoding system 101 may create the error correcting codes, as represented in step 311, before either encrypting the data records, as represented in step 307, or signing the data records, as represented in step 309.
Turning now to FIG. 4, a flow chart 400 to decode symbology to data is shown according to an embodiment of the present disclosure. The method may begin as represented in step 401. As represented in step 403, the decoding system 201 may receive a field of marks. The field may be etched or otherwise affixed to an item, or the field may be provided and otherwise associated with an item (e.g., printed on a packing slip or on a box containing the item or items).
As represented in step 405, the field of marks may be input into the decoding system 201. For example, and without limitation, the field of marks may be imaged using a video camera or a still camera, either film-based or digital, or an article may be scanned using a scanner. Other methods of reading a mark into a system may also be used. For example, the field of marks may be drawn into a system, or the field of marks may be typed into the system or input into the system in another way. The scan of the field of marks may yield an image.
As represented in step 407, and shown also with reference to FIG. 5, the image may be processed to allow the decoding system 201 to convert the field of marks into representative data.

Turning now to FIG. 5, a flow chart 500 to decode symbology to data is shown according to an embodiment of the present disclosure. As represented in step 501, the image may be subjected to planar projection, where a three dimensional image, such as from a video camera or a still camera, may be mapped to points on a two dimensional plane. The image may be subjected to planar projection so that the marks present in the image may be detected and analyzed by the decoding system 201.
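As a non-limiting sketch of the planar projection of step 501, the following Python example uses OpenCV (an assumed dependency) to warp the region bounded by four corner points of the field onto a square, two dimensional plane; locating the corner points themselves is not shown.

    import cv2
    import numpy as np

    def project_to_plane(image, corners, out_size=512):
        """Warp the quadrilateral `corners` (top-left, top-right, bottom-right,
        bottom-left in the captured image) onto a square, fronto-parallel plane."""
        src = np.float32(corners)
        dst = np.float32([[0, 0], [out_size, 0], [out_size, out_size], [0, out_size]])
        matrix = cv2.getPerspectiveTransform(src, dst)
        return cv2.warpPerspective(image, matrix, (out_size, out_size))

    # Illustrative usage with assumed corner coordinates:
    # flat = project_to_plane(frame, [(102, 80), (410, 95), (398, 402), (90, 388)])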
As represented in step 503, the image may be processed to find its orientation. The field of marks may be imaged in any orientation, and the decoding system 201 may use one or more algorithms to determine the field's orientation in the image. The image may be rotated so that the field is in substantially a specific orientation, so that the marks present in the image may be detected and decoded. Orientation may be discovered by the decoding system 201 by, for example and without limitation, the row identifiers and/or column identifiers that may be present in the row location identifiers and the column location identifiers present in rows and/or columns of location marks present within the field of marks. If no field of marks can be found, the decoding system 201 may send an error reporting that fact, or may wait for another image to analyze.
As represented in step 505, the image may be processed to determine spatial detection. Spatial detection may include, for example and without limitation, the relative size of the field of marks in the image, or the relative position of the field of marks in the image. The size and position of the field of marks may be adjusted to allow for more efficient processing of the image. For example, and without limitation, the field of marks may be centered in the image, and the field of marks may be resized to a specific size. Spatial detection and image resizing may occur to counter the effects of differing equipment that may be used to print the field of marks and the equipment that may read the field of marks. If an image contains a field of marks that is in an acceptable position and size, repositioning and/or resizing of the field of marks in the image may not be desired.
As represented in step 507, an image that has been processed may be analyzed to determine the orientation and position of the marks in the field of marks. The orientation of each of the marks in the field of marks may be noted, and one or more data values may be associated with each orientation and type of mark in the field of marks. The location rows and the location columns may also be noted, and the data in the row location identifiers and the column location identifiers may also be noted. Raw data, indicating the position and orientation of the marks, may be recovered from the image. If any of the field of marks has been obliterated, the decoding system 201 may note the points that have been obliterated. The decoding system 201 may decode the location rows and the location columns to determine the position of the obliterated data marks. In one embodiment, eighty percent of the field of marks may be obliterated, and the data encoded in the field of marks may be able to be completely decoded with the remaining twenty percent of the field of marks, if the twenty percent forms a continuous portion of the field of marks (i.e., is not in two or more unconnected sections). A representation of a field of marks, with a section of the field of marks obliterated, is shown in FIG. 10.

Turning now to FIG. 10, there is shown a field 1000 of data marks with an obliterated portion according to an embodiment of the present disclosure. The field 1000 may include location columns 1005 and location rows 1003. The location columns 1005 may be represented by vertical lines that increase in thickness from left to right across the field. The location rows 1003 may be represented by horizontal lines that increase in thickness from top to bottom across the field. The darker portion 1007 of the field of marks may be visible, and the remainder of the field 1000 may be obliterated or otherwise unreadable. The detection features represented in steps 501, 503, and 505 may resize and reorient the field 1000 so that the field is able to be analyzed. For example, the decoding system 201 may find, based on the relative thickness of the location columns 1005, that the marks visible in the remaining portion of the field may be in a proper orientation, that is, right-to-left in the field. The decoding system 201 may also analyze the location rows 1003 remaining in the visible portion of the field 1000, and may also find, based on the relative thickness of the location rows 1003, that the marks visible in the remaining portion of the field may be in the proper orientation. The decoding system 201 may also analyze the location columns and the location rows and may determine, for example and without limitation, that the location rows and the location columns indicate that the visible portion of the field of marks comprises portions of the first, second, and third rows of marks, and portions of the second, third, fourth, and fifth columns of marks. The decoding system 201 may use this information to determine the position of the marks in the visible portion of the field of marks. Based on the information, the decoding system 201 may determine that enough marks remain to properly decode the message.
The activities represented in steps 501, 503, and 505 may be performed in any order, and may also be performed in any order with respect to the steps depicted in FIG. 4.
As represented in step 409, the decoding system 201 may receive the raw data from the module or modules for image processing (not shown), and the error correction module 215 may use the values from the raw data to retrieve the data encoded in the field of marks. As represented in step 411, if the data may be retrieved from the field of marks, the method may continue to step 413. If the data cannot be retrieved from the field of marks, because, for example, a portion of the field of marks has been obliterated or may not have been scanned, the decoding system 201 may process and transmit an error condition, as represented in step 419. The error condition may include an error message, or the decoding system 201 may send an error message to another system. The method may end as represented in step 423.
As represented in step 415, if data can be retrieved from the field of marks, the decryption module 213 may attempt to validate the signature of the data, if the data has been signed by the encoding system 101. The signature may be transmitted from the encoding system 101 to the decoding system 201, or the decoding system 201 may have a key that may be used to verify the signature of the data. As represented in step 417, if the validation of the digital signature is successful, the method may continue to step 421. If the validation of the digital signature is not successful, the decoding system 201 may process and transmit an error condition, as represented in step 419. The error condition may include an error message, or the decoding system 201 may send an error message to another system. The method may end as represented in step 423.

As represented in step 421, if the data has been validated as digitally signed by the encoding system, the data may be decrypted. The data may be decrypted by the decryption module 213. The decryption module 213 may use any decryption method to decrypt the data, and may use more than one decryption method to decrypt the data records. For example, and without limitation, the decryption module 213 may use the same decryption method or methods as the encryption module 115 of the encoding system 101 used to encrypt the data records. The decrypted data records may be compared to the data records generated by the encoding system 101. If the decrypted data record is found in the database 111 in the encoding system 101, then the decoding system 201 may indicate that the data record is valid and authentic. If the decrypted data record is not found in the database 111, then the decoding system 201 may indicate that the data record is not authentic. The decoding system 201 may also use the database 219 associated with the decoding system 201, or may use another database to track data records.
While the method 400 is represented as a series of steps, the steps may be performed in any order. The steps for the decoding system 201, represented in method 400, may be the reverse of the steps for the encoding system 101, represented in method 300, however this is not a requirement. For example, the decoding system 201 may decrypt the data record, as represented in step 421, before validating the digital signature of the data record, as represented in step 415. Additionally, the decoding system 201 may decrypt the data record, as represented in step 421, before attempting error correction, as represented in step 409.
FIG. 11 illustrates an encoding process 2000 in which data, such as an encrypted code, is embedded within image data of an image file. In the example shown, the source image 2002 is a 32-bit bitmap type file which is 109,000 pixels by 109,000 pixels (about 12 Megapixels). In the initial, unaltered source image 2002, each pixel has a color dictated by a string including information for red, green and blue channels (8 bits each), as well as an alpha channel (also 8 bits) for transparency or density. Each of the red, green and blue channels has 256 levels/magnitudes/shades (generally denoted by numerals 0 through 255, giving about 16.8 million possible colors across the three channels), and the alpha channel likewise includes 256 levels of transparency (also generally denoted by numerals 0 through 255). In at least some images, other image data may include channels for hue, saturation, and brightness, and these other channels may likewise include 256 levels. The combination of data from each channel provides a particular color and density for a pixel within the image.
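For readers less familiar with multichannel pixel data, the following sketch unpacks a 32-bit RGBA value into its four 8-bit channels. The byte order shown (red in the high byte, alpha in the low byte) is an assumption for illustration only; actual bitmap formats vary in channel ordering.

```python
# Illustrative only: one common way a 32-bit RGBA pixel can be split into its
# four 8-bit channels. Channel ordering is an assumption, not patent-specified.

def unpack_rgba(pixel32):
    red   = (pixel32 >> 24) & 0xFF
    green = (pixel32 >> 16) & 0xFF
    blue  = (pixel32 >> 8)  & 0xFF
    alpha = pixel32 & 0xFF
    return red, green, blue, alpha

# -> (192, 48, 12, 51), the example pixel values used later in this description.
print(unpack_rgba(0xC0300C33))
```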
In the encoding process 2000, a code or more than one code is embedded in the image data for a desired number of pixels of a source image 2002. As described above, an unencrypted message or code 2004 may include multiple identifiers such as, but not limited to, manufacturer identification, part identification, date information and other information as desired. The information may be presented in any desired order, and, if desired, at least one of the identifiers may be unique to a particular product or thing. In at least some implementations, the message or unencrypted code 2004 includes 2-bit values, such as 1-100-1-1, which may be provided from a database module (e.g. module 113 described above). The encryption module (e.g. module 115 described above) may then convert the base code or message to an encrypted code 2006 utilizing one or more keys. The encrypted code 2006 may include any desired number of data/values, such as is represented by the quaternary values shown in FIG. 11. In one example, the 2-bit value 00 relates to a quaternary value of 1, the 2-bit value 01 relates to a quaternary value of 2, the 2-bit value 10 relates to a quaternary value of 3 and the 2-bit value 11 relates to a quaternary value of 4. Of course, other values may be used (both the noted 2-bit values and the noted quaternary values can be selected as desired); this is merely one example. The converted, encrypted code 2006 may be provided to the output module (e.g. module 107 described above), which maps the encrypted code 2006 into the source image which, with the encrypted code 2006 embedded therein, becomes a coded image 2008 (also referred to as a coded image file) that is the output from the encoding process 2000.
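A minimal sketch of the 2-bit-to-quaternary mapping just described (00→1, 01→2, 10→3, 11→4) follows. The helper name and the sample bit string are illustrative assumptions, not values taken from the patent.

```python
# Split a bit string into 2-bit groups and map each group to a quaternary
# value 1-4, per the example mapping described above.

def to_quaternary(bit_string):
    mapping = {"00": 1, "01": 2, "10": 3, "11": 4}
    return [mapping[bit_string[i:i + 2]] for i in range(0, len(bit_string), 2)]

print(to_quaternary("00011011"))  # -> [1, 2, 3, 4]
```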
The mapping of the encrypted code 2006 into the source image 2002 may be done in any desired manner. For example, the value of any one of the various channels for a pixel in the source image 2002 may be modified, such as by being replaced with a two-bit value (00, 01, 10, 11) representing one of the quaternary values of the encrypted code. An example is described for a pixel having values 192, 48, 12, and 51 for the red, green, blue and alpha channels, respectively. These channel values may be represented as the binary values 11000000, 00110000, 00001100 and 00110011, respectively. One of these channel values may be changed to a binary value representing a quaternary value of the encrypted code (e.g. 0, 1, 2 or 3 in an example wherein the encrypted code uses quaternary values 0 through 3).
In at least some implementations, the code is provided in the last two digits of the binary value for one or more channels of pixel data, as these are the lowest order bits and will produce the lowest magnitude of change to the pixel data, and hence to the color/appearance of the pixel. In this way, the embedded code provides a minimal change, if any, from the original channel value.
For example, the original value for the red channel in this example is 192, which in binary form is 11000000. No change to this value need be made if the encrypted code 2006 requires the quaternary value of zero in this position of the image data, because the last two binary digits are 00, which corresponds to a quaternary value of zero. If the encrypted code 2006 requires a quaternary value of 1 in this position of the image data, then the binary value for the red channel of this pixel is changed to 11000001. This changes the decimal value of the red channel from 192 to 193. If the encrypted code 2006 requires a quaternary value of 2 in this position of the image data, then the binary value for the red channel of this pixel is changed to 11000010. This changes the decimal value of the red channel from 192 to 194. And if the encrypted code 2006 requires a quaternary value of 3 in this position of the image data, then the binary value for the red channel of this pixel is changed to 11000011. This changes the decimal value of the red channel from 192 to 195. Each of these represents a relatively minor change to the red channel of one pixel in the image data (between values of 192 and 195), and these changes are not detectable by a person viewing the coded image 2008.
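The worked red-channel example above can be expressed compactly as a bit operation: clear the two lowest-order bits of the channel and insert the quaternary value. The sketch below reproduces the 192 → 192/193/194/195 progression; the function name is an illustrative assumption.

```python
# Replace the two lowest-order bits of an 8-bit channel value with a
# quaternary value 0-3, changing the channel by at most three levels.

def embed_in_channel(channel_value, quaternary_value):
    if not 0 <= quaternary_value <= 3:
        raise ValueError("quaternary value must be 0-3")
    return (channel_value & 0b11111100) | quaternary_value

# Reproduces the example: 192 stays 192 for 0 and becomes 193, 194, 195.
for q in range(4):
    print(q, embed_in_channel(192, q))
```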
Such changes to the image/pixel data may be done for any desired number of pixels, and the manipulated pixels can have one or more than one channel altered, as desired, to embed a particular code. To provide 256 quaternary values, only 512 bits of data are needed, with two bits per channel for 256 channels used for the code. The pixels and/or channels selected to be changed in this way may be the same for each image in a series of images, or the pixels may be randomly or arbitrarily selected, or selected based upon a key or keys used in the encryption process. The sequence of pixels can also be different for each iteration, if desired. That is, the encoding process need not follow a particular sequence by row number or column number of pixels in the matrix of image pixels. Thus, which pixels are changed, the channel of image data within a selected pixel that is changed from its base value to the code value, and the order in which the pixels are changed (i.e. their place within the code need not follow their position within the image pixel matrix) can all be preselected or chosen in any desired way. The key(s) and any other data relevant to the data embedded into the coded image may be transmitted to a requestor separately from the image (e.g. within a database or otherwise, as desired), or such information may be embedded within the image.
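The patent leaves the selection scheme open (same positions for every image, random, arbitrary, or key-based). As one assumed example of a key-based scheme, the sketch below seeds a pseudo-random shuffle with a shared key so that both the encoder and the decoder can regenerate the same ordered list of pixel/channel positions; the key value, image size, and function name are illustrative.

```python
# One possible (assumed, not patent-specified) key-based selection: a key seeds
# a pseudo-random shuffle that fixes which (x, y, channel) positions carry the
# code, and in what order.
import random

def select_positions(key, width, height, channels=4, count=256):
    rng = random.Random(key)
    all_positions = [(x, y, c) for x in range(width)
                                for y in range(height)
                                for c in range(channels)]
    rng.shuffle(all_positions)
    return all_positions[:count]

positions = select_positions(key="shared-secret", width=64, height=64)
print(positions[:3])  # same sequence on any machine seeded with the same key
```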
FIG. 12 illustrates a decoding process 2010 that begins with receipt of a file including the coded image 2008 (e.g. at an image input device or input module at which the image file is received). The file for the coded image 2008 is analyzed pursuant to the key(s) and other mapping information provided, the pixel data in the relevant pixels/channels is captured and ordered according to the manner in which such data was embedded/mapped into the image, and the encrypted code 2006 is decrypted to provide the base message or code 2004 to a requestor.
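As a counterpart to the embedding sketch above, the extraction step can be illustrated as reading back the two lowest-order bits of each selected channel in the key-determined order. The `get_channel` callable and the in-memory stand-in for an image are illustrative assumptions; a real implementation would read channel values from the coded image file.

```python
# Recover quaternary values (0-3) from the selected (x, y, channel) positions
# by masking off everything except the two lowest-order bits.

def extract_code(get_channel, positions):
    return [get_channel(x, y, c) & 0b11 for (x, y, c) in positions]

# Example with a dict standing in for pixel data of a coded image.
fake_image = {(0, 0, 0): 193, (0, 0, 1): 48, (0, 1, 2): 14}
values = extract_code(lambda x, y, c: fake_image[(x, y, c)],
                      [(0, 0, 0), (0, 0, 1), (0, 1, 2)])
print(values)  # -> [1, 0, 2]
```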
Error correction methods may be employed to enable decoding of messages even without all of the needed data (e.g. due to corruption or some modification of the image file).
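The patent does not name a particular error-correction scheme. Purely as a stand-in to show the idea, the sketch below repeats each quaternary value and recovers it by majority vote, so a limited number of corrupted positions can be tolerated; a practical system might instead use a stronger code.

```python
# Deliberately simple stand-in for error correction: repetition plus majority
# vote over each block of copies. Not the scheme described in the patent.
from collections import Counter

def encode_with_repetition(values, copies=3):
    return [v for v in values for _ in range(copies)]

def decode_with_majority(encoded, copies=3):
    decoded = []
    for i in range(0, len(encoded), copies):
        block = encoded[i:i + copies]
        decoded.append(Counter(block).most_common(1)[0][0])
    return decoded

sent = encode_with_repetition([1, 0, 2, 3])
sent[1] = 3                        # simulate one corrupted position
print(decode_with_majority(sent))  # -> [1, 0, 2, 3]
```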
The methods of FIGS. 11 and 12 do not require visual capture of an encrypted code, such as described above with regard to the field of symbols, some of which may be rotated at different angles. Instead, the underlying image data is analyzed to provide, initially, the quaternary values, which may then be decoded to the base message/code 2004 which is provided to a requestor/recipient. In some implementations, the image may be captured by a camera and the resulting image data analyzed to provide the data for decoding, as described herein.
While described above with reference to a 32-bit bitmap image, the encoding and decoding method can be utilized with any image type having multiple channels of pixel data. RAW, BMP, and PNG are all lossless image formats, as is lossless TIFF, and there are others. As another example, a 24-bit color image is one with 3 bytes of data for the red, green, and blue channels, and these channels may be manipulated in the same manner. Further, compressed image file formats, or any other type of image format, may also be used so long as compression or alteration of the file data does not occur between the encryption and decryption steps, or so long as the compression/alteration is done in a manner consistent with decoding the data (i.e., wherein changes made are understood and can be accounted for in decoding). That is, the input and output file types/formats should be the same so the encoded data is not altered in a manner not recognized during decoding. While numerical values are shown and described with regard to the unencrypted code 2004 and the encrypted code 2006, other characters or indicia may be used, with appropriate conversions to binary values for use within the pixel data as set forth herein.
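The lossless-format requirement above can be verified with a quick round-trip check. The sketch uses the Pillow library, which is an assumption for illustration and not part of the patent; it embeds a value in the low-order bits of one channel, saves to PNG, reloads, and confirms the value survives.

```python
# Round-trip check (Pillow assumed): a lossless format such as PNG should
# preserve embedded low-order bits through a save/load cycle.
from PIL import Image

img = Image.new("RGBA", (2, 2), (192, 48, 12, 51))
r, g, b, a = img.getpixel((0, 0))
img.putpixel((0, 0), ((r & 0b11111100) | 0b10, g, b, a))  # embed value 2

img.save("coded_roundtrip.png")              # lossless save
reloaded = Image.open("coded_roundtrip.png")
print(reloaded.getpixel((0, 0))[0] & 0b11)   # -> 2, the embedded value survives
```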
The forms of the invention herein disclosed constitute presently preferred embodiments and many other forms and embodiments are possible. It is not intended herein to mention all the possible equivalent forms or ramifications of the invention. It is understood that the terms used herein are merely descriptive, rather than limiting, and that various changes may be made without departing from the spirit or scope of the invention.
As used in this specification and claims, the terms “for example,” “for instance,” “e.g.,” “such as,” and “like,” and the verbs “comprising,” “having,” “including,” and their other verb forms, when used in conjunction with a listing of one or more components or other items, are each to be construed as open-ended, meaning that the listing is not to be considered as excluding other, additional components or items. Other terms are to be construed using their broadest reasonable meaning unless they are used in a context that requires a different interpretation.

Claims

1. A method of encoding data within an image, comprising the steps of: determining an unencrypted code including at least one identifier; encrypting the unencrypted code to provide an encrypted code including multiple values; and embedding the values of the encrypted code into pixel data of selected pixels of the image.
2. The method of claim 1 wherein the values are embedded into the pixel data as binary numbers that correspond to the values of the encrypted code.
3. The method of claim 2 wherein the pixel data includes multiple channels for each pixel, each of the multiple channels includes binary numbers, and the values are embedded into at least one channel of each of a plurality of selected pixels.
4. The method of claim 2 or 3 wherein the values are embedded into the lowest order bits of the binary numbers.
5. The method of claim 1 wherein the encrypting step is done with a key that provides which channels of which pixels are to be embedded with which of the values of the encrypted code.
6. A method of decoding data within an image, comprising the steps of: extracting from pixel data of the image an encrypted code including multiple values; and using an encryption key to decrypt the encrypted code to provide an unencrypted code including at least one identifier.
7. The method of claim 6 wherein the extracting step is accomplished with a key or map that indicates which channels within which pixels within the image include values of the encrypted code, and the order in which the values are arranged in the encrypted code.