US20230281344A1 - Secure digital media authentication and analysis - Google Patents
- Publication number
- US20230281344A1 (application US 17/962,242)
- Authority
- US
- United States
- Prior art keywords
- digital media
- media asset
- insurance
- image
- certified
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/64—Protecting data integrity, e.g. using checksums, certificates or signatures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/602—Providing cryptographic facilities or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/64—Protecting data integrity, e.g. using checksums, certificates or signatures
- G06F21/645—Protecting data integrity, e.g. using checksums, certificates or signatures using a third party
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q40/00—Finance; Insurance; Tax strategies; Processing of corporate or income taxes
- G06Q40/08—Insurance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/12—Applying verification of the received information
- H04L63/123—Applying verification of the received information received data contents, e.g. message integrity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/06—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols the encryption apparatus using shift registers or memories for block-wise or stream coding, e.g. DES systems or RC4; Hash functions; Pseudorandom sequence generators
- H04L9/0643—Hash functions, e.g. MD5, SHA, HMAC or f9 MAC
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/30—Public key, i.e. encryption algorithm being computationally infeasible to invert or user's encryption keys not requiring secrecy
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
- H04L9/3236—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using cryptographic hash functions
- H04L9/3239—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using cryptographic hash functions involving non-keyed hash functions, e.g. modification detection codes [MDCs], MD5, SHA or RIPEMD
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
- H04L9/3247—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials involving digital signatures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/50—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols using hash chains, e.g. blockchains or hash trees
Definitions
- the present invention generally relates to digital media processing. More specifically, the present technology relates to authentication, validation, and analysis of digital media, and generation of documentation using the digital media that may be used for insurance processes.
- User devices such as smartphones or tablets can capture photos or videos using camera software applications designed to interact with camera hardware embedded in the user device. Some of these camera applications store photo metadata along with the photo.
- viewers of photos or videos captured using such software applications traditionally have no way to know whether the photos or videos and associated metadata are genuine, or whether the photos or videos have been altered or manipulated in some way. This can be particularly problematic if such photos or videos are used in contexts such as the insurance industry, as manipulated photos, videos, or metadata could be used by malicious parties for fraudulent insurance claims.
- a method includes receiving a first digital media asset, a first information set associated with the first digital media asset, a second digital media asset, and a second information set associated with the second digital media asset. The method also includes verifying that the first digital media asset and the first information set are unaltered and verifying that the second digital media asset and the second information set are unaltered. The method also includes verifying that a first time of capture of the first digital media asset as identified in the first information set is before a second time of capture of the second digital media asset as identified in the second information set.
- the method also includes generating a document that includes the first digital media asset and the second digital media asset in response to verifying that the first time of capture is before the second time of capture and to verifying that the first digital media asset, the first information set, the second digital media asset, and the second information set are unaltered.
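For illustration only, the integrity and time-ordering checks described above can be sketched as follows. The `sha256` digest field, the `captured_at` ISO-8601 timestamp field, and the exception-based control flow are illustrative assumptions, not part of the claimed method:

```python
import hashlib
from datetime import datetime

def is_unaltered(asset_bytes: bytes, info: dict) -> bool:
    # Re-hash the asset and compare against the digest recorded at certification.
    return hashlib.sha256(asset_bytes).hexdigest() == info["sha256"]

def capture_time(info: dict) -> datetime:
    # Parse the capture timestamp recorded in the information set.
    return datetime.fromisoformat(info["captured_at"])

def generate_document(first_asset, first_info, second_asset, second_info):
    # Produce a (placeholder) document only if both assets verify as
    # unaltered and the first capture precedes the second.
    if not (is_unaltered(first_asset, first_info)
            and is_unaltered(second_asset, second_info)):
        raise ValueError("asset or information set has been altered")
    if not capture_time(first_info) < capture_time(second_info):
        raise ValueError("first capture is not before second capture")
    return {"before": first_info["captured_at"],
            "after": second_info["captured_at"]}
```

A real implementation would verify cryptographic signatures rather than bare digests, but the ordering logic would be the same.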
- FIG. 1 is a flow diagram illustrating operations of an image capture and certification system combined with an end-to-end mobile insurance system.
- FIG. 2 illustrates an image capture system that combines a camera image with a sensor data set from a sensor system.
- FIG. 3 illustrates exemplary functions and features of the image capture system.
- FIG. 4 identifies exemplary media metadata and certification data.
- FIG. 5 illustrates a network architecture through which digital media may be captured, certified, uploaded to a server network, verified, synchronized, and distributed to clients.
- FIG. 6 is a flow diagram illustrating techniques for media and metadata certification.
- FIG. 7 illustrates an exemplary web portal display for viewing a document with an integrated certified image.
- FIG. 8 illustrates a document including two certified digital media assets corresponding to different capture times.
- FIG. 9 illustrates operations for secure digital media processing.
- FIG. 10 illustrates an exemplary certified image within an exemplary image viewer application.
- FIG. 11 illustrates capture of an image at a media capture device and transfer of the image upon its certification to a computing device at which the certified image is viewed.
- FIG. 12 is a flow diagram illustrating operations for capturing media, certifying the media, securely transferring the certified media, securely storing the certified media, and securely broadcasting the certified media.
- FIG. 13 illustrates a modified version of a digital media asset with an overlaid glyph.
- FIG. 14 is a block diagram of an exemplary computing device that may be used to implement some aspects of the technology.
- Techniques for secure media processing may be used to certify a digital media asset by verifying that the digital media asset is authentic and has not been altered since capture.
- these secure media processing techniques may be used in the context of an enterprise ecosystem including a mobile application and a corresponding server system. Together the mobile application and the corresponding server system automatically generate quotes or identify insurance policies for users based on certified digital media depicting an object to be insured, and automatically generate and handle insurance claims based on additional certified digital media depicting the insured object after an event occurs that, for instance, causes the object to incur damage.
- the mobile application and corresponding server system can perform tasks of various types, including marketing, communications, selling, underwriting, providing digital copies of an insurance policy, binding the policy (e.g., through electronic signatures), processing payment transactions for an insurance policy, handling insurance claim loss notices, analytics, and adjustment processes, through to final payment and delivery of insurance claim damages.
- the mobile application covers capturing, processing, and certifying a digital media asset submitted for insurance purposes. Through these techniques, fraud is reduced in the insurance claim process due to the certification of the digital media asset.
- An artificial intelligence (AI) object recognition system and third-party databases may be utilized to determine the identity, condition, replacement cost, and value of the property, and to verify the validity of the insurance claim. Data transferred between the parties in the system is transmitted to a secure server and is securely accessible by the parties involved to provide transparency in the insurance process.
- the mobile application and corresponding server system may generate and/or identify an insurance policy, and perform other tasks related to the insurance policy, such as underwriting, filing an insurance claim, analyzing the insurance claim, the adjustment process, and processing of the final payment of insurance claim damages, along with policy renewals.
- types of insurance for which the mobile application and corresponding server system may be used include, but are not limited to: homeowners, business, auto, condo, commercial business, ID theft, motorcycle, pet, RV, renters, travel, jewelry, boat, flood, hurricane, marine ships and shipping, tornado, hail, wind, aviation, agriculture, earthquake, fire, and property and casualty insurance.
- FIG. 1 is a flow diagram illustrating operations of an image capture and certification system combined with an end-to-end mobile insurance system.
- a mobile application is downloaded on the user's mobile device.
- the mobile application may be downloaded from a website or other storage repository through a user search, or by clicking on an advertisement sent to the user via a website, email, or message.
- a mobile application is utilized in the lifecycle of obtaining insurance and filing an insurance claim.
- the mobile application may contain marketing tools; the ability to send and receive documents; tools to capture, certify, and organize digital media; the ability to automatically attach the digital media to insurance documents; support for electronic signatures and notary credentials; and tools to review the policies and the insurance claim process.
- the mobile application may be installed on any mobile device equipped with various sensors, such as one or more cameras, video recorders, audio-recording microphones, GPS or GNSS receivers, any other types of sensors discussed herein, or some combination thereof.
- a device may be a smartphone, a tablet computer, a personal digital assistant (PDA), a drone, an unmanned ground vehicle (UGV), a pocket PC, a satellite-camera-equipped phone, or a laptop.
- the mobile device can connect to the internet via Wi-Fi or a wired connection, or can communicate with a cellular tower.
- the mobile application may be downloaded on the mobile device via the cellular or internet connection, from an advertisement with a QR code, from a website, from the Apple or Google app store, or by directly searching for the application.
- a touchscreen display of the mobile device may display a graphical user interface (GUI) to assist a user seeking to obtain insurance in utilizing the mobile application.
- the GUI may contain various menu screens and menu buttons for the user to interact with.
- the GUI may provide a search bar to search for an insurance company and the company's history, rates, and customer reviews.
- the GUI may also provide an option to search insurance companies by price, reviews, or products the companies provide.
- the user searches for an insurance company and the insurance policies from the GUI of the mobile device using the mobile application.
- the user may view information regarding the policies and the insurance companies on the mobile application.
- the application may ask the user basic questions to assist in narrowing the options for the user.
- the user sends information relevant to the user and the property to be insured to the insurance company at step 130 .
- the mobile application and corresponding server system may assist in downloading and uploading various documents needed to obtain a quote.
- a menu option may present the user with links to various documents the user needs such that the documents may be downloaded to the mobile device.
- Another menu option may be presented to upload a document from the device memory.
- the menu options may be personalized for the different insurance companies from which the user is seeking insurance.
- a digital media asset which may include an image, video, audio, map, satellite photo, or some combination thereof, may also be uploaded via the application.
- the user utilizes the intelligent digital media capture system 131 and the digital media certification system 132 when sending the certified digital media to the insurance company to receive a quote.
- the user may take one or more digital images or videos of the property to be insured to document and verify the identity, value, and quality of the property for the purpose of obtaining an insurance quote and for the underwriting department to evaluate the property in its original state.
- Sensor data, such as the GPS coordinates where the digital media was captured, may be retrieved along with the digital media asset at the time of capture.
- Metadata related to the digital media asset, such as the author, date and time of capture, description, media title, and names of individuals involved, may also be retrieved.
- the sensor data and metadata may be used to verify assertions made by the user. For example, for house insurance, the GPS coordinates attached to the digital media asset support the user's assertion about where the house is located.
- the digital media asset may be certified by generating a hash digest of the digital media and encrypting the digital media and its hash value.
- the certified digital media asset guarantees that the digital media is unaltered and authentic.
- the encryption may include generating an asymmetric public key infrastructure key pair as further illustrated in FIG. 6 .
- Hash values of sensor data and the metadata of the digital media asset may similarly be generated and encrypted.
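As an illustrative sketch only, the digest-generation step can look like the following. The field names, the use of SHA-256, and the JSON serialization of sensor data and metadata are assumptions for illustration; the patent's actual scheme additionally signs the digests with an asymmetric PKI key pair (see FIG. 6), which is omitted here:

```python
import hashlib
import json

def certify(media_bytes: bytes, sensor_data: dict, metadata: dict) -> dict:
    # Digest each component separately, then bind them with a combined digest.
    media_digest = hashlib.sha256(media_bytes).hexdigest()
    sensor_digest = hashlib.sha256(
        json.dumps(sensor_data, sort_keys=True).encode()).hexdigest()
    meta_digest = hashlib.sha256(
        json.dumps(metadata, sort_keys=True).encode()).hexdigest()
    combined = hashlib.sha256(
        (media_digest + sensor_digest + meta_digest).encode()).hexdigest()
    # In the full scheme, `combined` would be signed with the private half
    # of an asymmetric key pair so any party holding the public key can
    # verify that media, sensor data, and metadata are all unaltered.
    return {"media": media_digest, "sensor": sensor_digest,
            "metadata": meta_digest, "combined": combined}
```

Any change to the media bytes, sensor data, or metadata changes the corresponding digest and therefore the combined digest, which is what makes post-capture alteration detectable.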
- the system may generate an information page based on the information associated with the digital media asset and store the information page at a network location.
- the server system may also generate an interactive glyph, such as a QR code, that encodes information pointing to the network location.
- the server system generates a modified digital media asset by overlaying or otherwise embedding the glyph onto the digital media asset.
- the server system outputs the modified digital media asset, for example by printing the modified digital media asset or transmitting the modified digital media asset to a display device that displays the modified digital media asset.
- a requesting device scans the glyph from the modified digital media asset, for example using a camera, and decodes from the glyph the information pointing to the network location. Once the requesting device decodes the information pointing to the network location, the requesting device requests the information page from the server system.
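The glyph's round trip can be sketched as encoding a pointer to the information page into a compact string that a QR-code generator could carry, then recovering that pointer after scanning. The payload format (base64-encoded JSON), the field names, and the example URL are illustrative assumptions, not the patent's specified encoding:

```python
import base64
import json

def encode_glyph_payload(asset_id: str, page_url: str) -> str:
    # Pack the information-page pointer into a URL-safe string suitable
    # for embedding in a QR code or similar glyph.
    pointer = {"id": asset_id, "url": page_url}
    return base64.urlsafe_b64encode(json.dumps(pointer).encode()).decode()

def decode_glyph_payload(payload: str) -> dict:
    # Recover the information-page pointer after the glyph is scanned.
    return json.loads(base64.urlsafe_b64decode(payload.encode()))
```

After decoding, the requesting device would issue an ordinary HTTP request for the recovered URL to retrieve the information page from the server system.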
- the server system serves the information page to the requesting device.
- the requesting device transmits communications to the server system, for example to leave a comment regarding the digital media asset, to indicate approval or disapproval of the digital media asset, or to edit the digital media asset.
- the server system receives the communication and modifies the information page to reflect comments, approvals, disapprovals, or edits.
- the system utilizes AI object recognition 141 and third-party database 142 to verify the information and assess the property to assist in obtaining a quote for the user.
- An artificial intelligence (AI) processor 141 may be used to determine the identity, value, age, condition, type, and scope of the property.
- the AI may recognize from the shape, pixel information, and the curvature of the lines that the digital image or video is of a blue car and that the car has dents and other imperfections.
- a third-party database 142 may be utilized in identifying and evaluating the property captured in the digital media.
- the system may search and connect to an appropriate third-party database automatically or an appraiser, an underwriter, or an adjuster may search for a third-party database manually.
- the AI may recognize that the digital media is an image of a Honda vehicle from the "H" of the front grille emblem and search the certified Honda database for the model and the year of the car. From the license plate of the vehicle, the AI may also obtain history, ownership, and registration information of the car from a database containing the history of the vehicle. From the make, model, year, history, and condition of the car, the AI may search a database with prices of cars in similar condition to determine the value of the car.
- the AI may recognize a digital media asset depicting a house and determine the address of the house and the date the image was taken from the sensor data and metadata of the digital media. The AI may then search the appropriate databases to obtain tax records and improvement records to determine the value of the house.
- a human may assist the AI by specifying that the image is of a car so that the server system may search an appropriate third-party database.
- Another third-party database may be utilized in comparing pricing of similar policies between insurance companies.
- a third-party database containing consumer opinions or reviews may also be utilized and displayed for the user.
- the user accepts a policy and the system transfers insurance documents and payment for the insurance policy between the user and the insurance company.
- the mobile application and corresponding server system may facilitate payments between the user and the insurance company. For example, once the customer chooses a policy from an insurance company, the mobile device may transmit the user's payment information to the insurance company by allowing the user to input credit card information on the GUI of the mobile application.
- the server system allows the user to transmit an electronic signature to bind the user.
- the mobile application and corresponding server system may also generate a certified insurance card once the policy is granted to the user. When the insurance company issues the insurance card to the user, the insurance card may be retrieved, downloaded, stored, and viewed from the mobile application. Using asymmetric encryption and a hash generator, similar to the digital media asset certification described above, the image of the insurance card may be certified such that the image is guaranteed to be authentic.
- the user may send the insurance claim form embedded with certified digital media at step 160 .
- the mobile application and corresponding server system may assist the user in filing an insurance claim in response to a loss with regard to the insured property.
- the mobile application allows the user to file an insurance claim by downloading the insurance claim form to the mobile device from the application, capturing certified digital media, and automatically populating the insurance document with the digital media. Similar to when the user uploads digital media to obtain a quote, the system generates a certified image by receiving the digital media asset from the user's mobile device, generating hash values for the digital media and the sensor data and metadata, encrypting the hash values, and overlaying a glyph on the digital media.
- the system may search for the certified digital media used at the time the user applied for the insurance to automatically attach the digital media to the insurance claim form.
- the insurance claim form containing certified media is then transferred to a secure server for the insurance claim adjuster to investigate the insurance claim.
- the user may also send other supporting documents to the insurance claim, such as estimates of financial loss, engineering reports, and police reports from the mobile application.
- an AI/ML object recognition system and third-party database may be used to compare the certified digital media submitted at the time the insurance quote was obtained with the certified digital media submitted when the insurance claim was filed, to verify its authenticity as discussed with respect to FIG. 8.
- the insurance claim is reviewed by the insurance company at step 170 .
- the insurance claim adjuster or any other individuals relevant to assessing the insurance claim may obtain access to the information that the user submitted, such as the insurance claim form and the certified digital media.
- the insurance claim adjuster or any other relevant individuals may also upload certified digital media or other documents to the cloud, such as digital media of damages the insurance claim adjuster personally inspected, video or audio of witness interviews, or other documents that the adjuster or other individuals submitted.
- the documents and the certified digital media may be viewable by all relevant individuals given access within the system. Furthermore, the user who filed the insurance claim may be allowed to view the status of the insurance claim via a claim status indicator in the mobile application. As documents and digital media are added and the insurance claim adjuster makes progress with the insurance claim, the user is allowed to view and receive updates regarding the insurance claim.
- the mobile application and corresponding server system may utilize the sensor and metadata of the digital image to verify the validity of the insurance claim.
- the system may verify the location of the car accident from the GPS coordinates in the sensor data of the digital media.
- the system may determine whether the two sets of digital media, one from before the insurance claim and one from after the insurance claim was made, were taken at the same property based on the sensor data and metadata of the digital media.
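One minimal sketch of the same-property check, assuming the sensor data carries GPS fixes as (latitude, longitude) pairs and assuming an arbitrary 100-metre tolerance (neither of which the patent specifies), is a great-circle distance comparison:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two GPS fixes.
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def same_property(fix_before, fix_after, tolerance_m=100.0):
    # True when the before-claim and after-claim capture locations
    # fall within the chosen tolerance of each other.
    return haversine_m(*fix_before, *fix_after) <= tolerance_m
```

A production system would likely combine this with GPS-accuracy estimates from the sensor data rather than a fixed tolerance.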
- the mobile application and corresponding server system may utilize artificial intelligence (AI) and/or machine learning (ML) image recognition, object recognition, and/or feature recognition algorithms to compare the set of digital media assets obtained at the time of the quote with the set of digital media assets obtained at the time the insurance claim is made.
- the recognition algorithms may determine the locations and the extent of the damage of the insured car by comparing the digital images of the car before and after the accident.
- the system may also utilize a third-party database to determine the validity of the insurance claim. For an insurance claim asserting flood damage to a house, the system may obtain the metadata and sensor data from the digital media of the house to determine the location and the date the digital image was taken.
- the AI may search a third-party database for weather data at the location of the property to determine the validity of the claim that there was a flood.
- a human assessor may assist the recognition algorithms by narrowing the search fields, specifying the terms for the search, or correcting the recognition algorithms.
- the insurance claim funds may be transferred to the user's financial institution, or via other modes selectable by the user of the mobile application, at step 180.
- the system can utilize the user's bank account information or a third-party payment system to transfer the funds between parties.
- FedEx or other carriers may be utilized by the system instead of a wire transfer.
- the mobile system spans efforts from marketing, selling, underwriting, binding a policy, insurance claim loss notices, analytics, records, cost estimates, and in-app and out-of-app communication records, through adjustment processes, to final payment of the insurance claim damages.
- the system incorporates capturing, processing, and certifying a digital media asset submitted for insurance purposes to reduce fraudulent insurance claims.
- An artificial-intelligence object recognition system and third-party databases may be utilized to determine the identity, condition, and value of the property, and to verify the validity of the insurance claim.
- Data transferred between the parties in the system is transmitted to a secure server and is accessible by the parties involved to provide transparency in the insurance process.
- FIG. 2 illustrates a digital media capture system that combines a camera image with a sensor data set from a sensor system.
- the camera image is taken using a camera embedded as a part of an image capture device with an intelligent image sensor 200 that includes a digital image sensor 210 as well as a number of sensors, such as one or more laser rangefinders, LIDAR sensors, inertial measurement units (IMUs), magnetometers, GPS receivers, GNSS receivers, accelerometers, gyroscopes, other sensors discussed herein, or some combination thereof.
- This device can be a camera such as a digital or analog point-and-shoot camera, a digital or analog single-lens reflex (“SLR”) camera, a digital or analog image-capturing telescope, a digital or analog image-capturing microscope, a commercial system used by a DMV, a badge system, or a digital or analog camcorder.
- the user device can be a portable consumer user device such as a smartphone, a tablet device, a laptop computer, a wearable device, a portable gaming console, or a portable media player device. Less mobile devices may also be used such as desktop computers, television systems, gaming consoles, and various types of conveyances.
- the image capture system is an example of a digital media capture device.
- a digital media capture device may capture images as discussed with respect to the image capture system. Images captured by the digital media capture device may include still images and/or one or more frames from a video. The digital media capture device may capture audio via a microphone. The digital media capture system may capture various sensor measurements from various sensors that may be part of, or coupled to, the digital media capture device.
- the metadata captured by the media capture system can include, for example, the exact location in latitude, longitude, camera heading, and elevation of the image capture device. Captured information may further or alternatively include the roll of the image capture device, the pitch of the image capture device, the yaw of the image capture device, the velocity and/or direction of the image capture device, the viewing angle of the image capture device, the azimuth and/or compass bearing of the image capture device. Information concerning the horizon angles of the image capture device and the inclination and declination of the image capture device may likewise be collected. Such metadata can be attached to both images and video.
- the digital data organization process of the image capture system may be controlled manually by a user or automatically by computer hardware/software control processes or mobile app. These may include organization by photos, video, audio, location, position, by image capture device, by user, by date, time, logged user, subscription user, or a number of other attributes of an image/video/audio/media file. Likewise, these images may be made searchable via these attributes in a network based (including “cloud based”) storage system as well as a local storage system. In some embodiments, the captured image/video/media can also be stored and organized by facial recognition means and subsequently searched or made searchable by facial recognition applications.
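The attribute-based organization and search described above can be sketched as follows. This is an illustrative standard-library Python sketch under assumed field names (`type`, `user`, `captured`, `location`); it is not an API from the disclosure.

```python
from collections import defaultdict
from datetime import date

# Hypothetical media records; the field names are illustrative assumptions.
assets = [
    {"id": 1, "type": "photo", "user": "alice", "captured": date(2020, 3, 20), "location": "site-A"},
    {"id": 2, "type": "video", "user": "bob",   "captured": date(2020, 3, 20), "location": "site-A"},
    {"id": 3, "type": "photo", "user": "alice", "captured": date(2019, 1, 6),  "location": "site-B"},
]

def group_by(records, attribute):
    """Group media records by one attribute (type, user, date, location, ...)."""
    groups = defaultdict(list)
    for record in records:
        groups[record[attribute]].append(record["id"])
    return dict(groups)

def search(records, **criteria):
    """Return records matching every given attribute=value pair."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

print(group_by(assets, "type"))   # {'photo': [1, 3], 'video': [2]}
print(search(assets, user="alice", location="site-A"))
```

The same grouping and filtering could run on the device, in the mobile app, or server-side in the cloud storage system.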
- FIG. 3 illustrates exemplary functions and features of the image capture system.
- the media capture system 300 may include the three-dimensional intelligent image sensor 200 of FIG. 2 , with internally embedded individual functions and/or features 300 for providing navigation, gyroscopic, processor, and communication technology. These various functionalities, which may be implemented through software, hardware, or a combination of the two, collect incident, event, and item images with sensor data.
- such information may include but is not limited to elevation, inclination, roll, pitch, yaw and bearing information.
- Other types of information might include position, time, altitude, zoom/telescope information, distance to/from a pictured object, object measurement in image or video, elevation angles, centimeter accurate XYZ position, global positioning system (GPS) coordinates, GPS differential corrections, 3-axis positions of an object in the image, real-time kinematics (RTK) sensor data, real-time network (RTN) survey data, cell triangulation data, satellite map location data, street view map location data, map pointer data, map compass data, roll/pitch/yaw data, and world geodetic system (WGS) or world geodetic datum (WGD) data.
- This collected information may also include custom notes, text, or recordings as well as custom image markers, logged data, object location attachments, or pointers to additional files.
- FIG. 4 identifies exemplary media metadata and certification data.
- the media certification system 400 can be used to certify media 410 (e.g., images, videos, audio, or some combination thereof), metadata 420 , and sensor data recorded by the image capture device.
- a certified media asset is one whose media data 410 , metadata 420 corresponding to a time of capture of the media, and sensor data corresponding to a time of capture of the media have been ensured to be authentic and unaltered via certification (e.g., according to the process 600 of FIG. 6 ).
- the sensor data may be considered part of the metadata 420 , part of the media 410 , or both.
- the metadata 420 may be considered part of the sensor data, the media data 410 , or both.
- the media certification system 400 of FIG. 4 can certify the media asset 410 by creating a new data form that will travel with each captured media asset 410 .
- the system can culminate data into the media data file 410 and organize it in a visual file that is part of the media file 410 .
- the data can be presented to a user through a traditional display device along with the media 410 or can be viewed through a user control.
- the certification will be highlighted by certain colors and integrated with systematically collected data at time of capture and loaded into the appropriate location.
- a certified media asset can be signified as the media overlaid by a marker such as a barber pole, a “certified image” stamp, or a “certified by [image certifying authority]” stamp, a quick response (QR) code, a barcode, another marker or watermark discussed herein, or some combination thereof.
- the certified media of FIG. 4 includes an example of a certification marker: a barber pole logo and a “certified by ImageKeeper” stamp.
- the media certification system 400 may be used to ensure that digital media data 410 is protected in an “original captured state.” That state can be certified by the media certification system 400 and/or a third-party auditor system. That data can then be organized and/or secured (e.g., encrypted).
- the media certification system may store the media assets in a network, cloud, or local storage system like those shown in FIG. 5 . The storage system may require a subscription before a user is able to interact with the data.
- Embodiments of the secure digital media processing and document generation techniques may be extended to other third-party applications or services included as an SDK or API.
- a user may share/post, via social networks and social media sites or mobile applications (e.g., Twitter, Facebook, Instagram, Pinterest, Myspace), a certified image/video/audio/media, in either a secure (e.g., encrypted) or unsecure format.
- a user may also have the ability to send certified images/video/media to another device/user via a messaging system (e.g., SMS, Apple iMessage, Google Hangouts, SnapChat, email, or another communication method) in a secured (e.g., encrypted) or unsecured manner.
- FIG. 5 illustrates a network architecture through which digital media may be captured, certified, uploaded to a server network, verified, synchronized, and distributed to clients.
- a mobile software application 515 is installed on a mobile user device 510 , such as a cell phone.
- the mobile application 515 may utilize the digital media capture system of the mobile device 510 to capture the digital media asset, sensor data, and metadata of the digital media asset.
- the digital media asset may include one or more image(s), video(s), sensor data, or a combination thereof.
- the digital media capture device 510 may be any type of computing device 1400 , such as a user mobile device, tablet, computer web camera, Alexa, Siri, a drone or UAV, a dash camera of a vehicle, a body camera, a surveillance camera, a medical imaging system, a dental camera, a satellite phone/camera, a wearable device, a phone, another type of device, or a combination thereof.
- the digital media certification system certifies the digital media captured by the mobile device 510 via the mobile insurance application 515 .
- the digital media and/or sensor data may be categorized by the categorization system based on type of media, time, location, other metadata associated with capture of the digital media, or some combination thereof.
- the media captured by the media capture device 510 may be automatically grouped and/or categorized into groups or categories by type, ID, time, address or GPS location, or other properties by the media capture device 510 and/or by the server system 540 .
- the media captured by the media capture device 510 may alternately or additionally be manually grouped and/or categorized into groups and/or categories by the user using the GUI. Audio and video capture may be transcribed automatically or by user selection and transferred to the secure cloud server 540 .
- the data is then autonomously sent to the internet/cloud system 530 where the digital data is filed, stored, and accessed through the web in a systematic or serialized format consistent with the image identification formed with the media capture device 510 (as seen on the right side of FIG. 5 ).
- Various information related to multiple users, such as a family or business, and their policies may be stored in user cloud storage associated with the server system 540 , or in local storage associated with the mobile device 510 .
- the user can access multiple policies purchased in the mobile application 515 and switch between them.
- Past insurance claims, history of payments, certificates of renewal, warranties for purchased home devices, terms of the insurance, and past quotes may be stored in the cloud.
- This data can be transferred over a wired or a wireless connection.
- Such connections may include cellular networks (e.g., 3G, 4G, LTE, or 5G), Wi-Fi, local area networks (LAN), wireless local area networks (WLAN), satellite networks, the Internet, any other wired and/or wireless network interfaces 520 discussed herein, or a combination thereof.
- the media capture device 510 can first synchronize its image and/or sensor data with a second device.
- for example, a camera device (e.g., a digital point-and-shoot camera) can synchronize with a user device such as a smartphone or wearable device, which can then form a connection to the internet/cloud system 530 .
- the internet/cloud system 530 can include one or more server systems 540 , which may be connected to each other.
- this internet/cloud system 530 is a wireless multiplexed system for securely storing digital data to and from mobile digital devices.
- the digital data (e.g., images, reports) are securely held in one central place, either in a hardware memory device, a server, or a data center.
- the digital data is stored in a distributed ledger, such as a blockchain.
- the authenticated certified media, its sensor data, metadata, as well as their encrypted hash values may be manually or automatically transferred to a blockchain system to ensure that the media and the related data are unaltered even if multiple parties have access to the media.
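As a minimal illustration of why a hash-chained ledger makes later alteration detectable, the following toy standard-library Python sketch commits each media hash to the previous entry's hash. The class name and entry layout are illustrative assumptions, not the blockchain system referenced above.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class HashChainLedger:
    """Toy append-only ledger: each entry commits to the previous entry's
    hash, so altering any stored media hash invalidates every later entry."""
    def __init__(self):
        self.entries = []  # list of (media_hash, entry_hash) tuples

    def append(self, media_bytes: bytes) -> str:
        media_hash = sha256_hex(media_bytes)
        prev = self.entries[-1][1] if self.entries else "0" * 64
        entry_hash = sha256_hex((prev + media_hash).encode())
        self.entries.append((media_hash, entry_hash))
        return entry_hash

    def verify(self) -> bool:
        prev = "0" * 64
        for media_hash, entry_hash in self.entries:
            if sha256_hex((prev + media_hash).encode()) != entry_hash:
                return False
            prev = entry_hash
        return True

ledger = HashChainLedger()
ledger.append(b"certified-media-bytes-1")
ledger.append(b"certified-media-bytes-2")
print(ledger.verify())  # True
```

Even if multiple parties can read the entries, rewriting any one of them breaks the chain and `verify()` fails.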
- the secure servers 540 may receive the public key transmitted by the certification system and use the public key to verify that the certified digital media data is authentic and unaltered.
- the data may be accessible through a web portal.
- This web portal may include image-editing tools, worldwide access, and collaboration mechanisms available to other relevant individuals. Security measures such as digital signatures, watermarking, encryption, physical access controls, and password credentials can be utilized throughout the system. Original digital data can be confirmed, saved, and protected through various technologies and system controls.
- the user of the mobile application 515 who seeks to be insured may collaborate remotely in real time with an appraiser, an underwriter, an adjuster, or any other relevant individual involved in the insurance process using a web portal.
- the web portal may be a website hosted at the cloud server system 540 or hosted elsewhere on a host server or directly in a data center. The web portal may provide access to media assets to third parties who are involved in the insurance and the claim process.
- the user may hold and aim the camera towards the property to be insured and the appraiser, underwriter, or an adjuster may operate the portal by clicking to capture the digital media asset remotely.
- the web portal may assist in recording a statement from the user or other relevant individuals, such as a witness, in a similar manner.
- the web portal may further provide collaboration between the relevant individuals using different mobile devices 510 with the mobile application 515 installed in the devices 510 by facilitating streaming videos or web meeting communication between the devices 510 . Those individuals may simultaneously share images, video, audio, or any other digital media asset via the mobile application 515 or the web portal.
- the certified media asset collaboration will be recorded and logged in the cloud system with its metadata such that the record will be recalled in the future. This creates a chain of custody from the instant the media is captured and assures the media is authentic and certified.
- An object recognition system may be utilized by the mobile insurance system to evaluate digital media obtained from the digital media capture system.
- a third-party database containing information about the insured property or other similar properties may be utilized in order to evaluate the condition, extent of the damage, and value of the property.
- FIG. 6 is a flow diagram illustrating techniques for media and metadata certification.
- the process 600 of FIG. 6 may be performed by the media capture device 510 , the internet/cloud system 530 , or some combination thereof.
- a media asset is captured by a sensor of a digital media capture device 510 , optionally with its metadata as well.
- the metadata may include, for example: latitude and longitude coordinates from a GNSS receiver or other positioning receiver; an identification of the media capture device 510 ; a timestamp identifying date and time of capture; altitude, heading, inclination, yaw, roll, and pitch at capture; a watermark; an annotation; any other data that might be found in image EXIF metadata; elevation or altitude; velocity at capture; path, speed, direction, and distance; weather conditions, barometer reading and change, dew point, humidity, sun angle, and temperature; compass heading; media certification status, annotation certification status, incident note certification status, and incident report certification status; event number, time, date, time zone, and title; media type (IR, multi-spectrum, lidar, UV, 2-dimensional, 3-dimensional); wind speed and wind direction; radar data; and cloud coverage.
- an asymmetric public key infrastructure (PKI) key pair, consisting of a private key and a corresponding public key, is generated by the media capture device 510 of step 605 or by the server 540 .
- the keys of the key pair may be RSA keys.
- Other types of asymmetric keys may be used.
- a digital signature is computed by generating a hash digest—optionally using a secure hash algorithm such as SHA-0, SHA-1, SHA-2, or SHA-3—of the captured media, and optionally of the metadata as well.
- the digital signature is encrypted with the private key.
- the media and/or metadata may also be encrypted using the private key.
- the private key is optionally destroyed at step 625 , or may simply never be written to non-volatile memory in the first place.
- the public key is published, either by sending it to the servers 540 , to an authentication server such as a certificate authority, or by otherwise sending it for publication in another publicly accessible and trusted network location.
- verification as to the authenticity of the media and metadata may occur by decrypting the encrypted digital signature using the public key before or after publication at step 630 , and verifying whether or not the hash digest stored as part of the decrypted digital signature matches a newly generated hash digest of the media. If the new hash matches the hash decrypted using the public key, then verification is successful, and the media asset has not been modified since capture (or at least since certification).
- if the new hash does not match the hash decrypted using the public key, then verification is unsuccessful, and the media asset has been modified since capture (or at least since certification). The same check can be performed using the metadata if a hash digest of the metadata is included in the digital signature.
- the verification as to the authenticity of the media and metadata at step 635 may also include decrypting the media asset and/or the metadata itself, if either or both were encrypted at step 620 . This verification may occur at the digital media capture device 510 —though it may instead or additionally be performed at the server 540 , for example before the server indexes the media as part of a cloud storage system accessible by client devices.
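The hash-and-compare portion of steps 615 and 635 can be sketched as follows. This standard-library Python sketch omits the private-key encryption and decryption of the digest (steps 620 and 635), and the metadata fields are illustrative assumptions.

```python
import hashlib
import json

def compute_digest(media: bytes, metadata: dict) -> str:
    """Hash the media bytes together with canonicalized metadata (step 615)."""
    canonical = json.dumps(metadata, sort_keys=True).encode()
    return hashlib.sha256(media + canonical).hexdigest()

# At certification time (illustrative media bytes and metadata fields):
media = b"\x89PNG...image-bytes..."
metadata = {"device_id": "cam-510", "timestamp": "2020-03-20T10:15:00Z"}
signed_digest = compute_digest(media, metadata)  # would then be encrypted with the private key

# At verification time (step 635): recompute the digest and compare it with
# the digest recovered by decrypting the digital signature.
assert compute_digest(media, metadata) == signed_digest              # unmodified: verification succeeds
assert compute_digest(media + b"tamper", metadata) != signed_digest  # any modification is detected
```

Because the digest covers both the media and the metadata, changing either one breaks the match.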
- a certified media dataset is generated by bundling the media, metadata, and the encrypted digital signature, for example in a zip file or other compressed archive file.
- the public key may also be bundled with them, though additional security may be provided by publishing it elsewhere to a trusted authentication server.
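The bundling step can be sketched with Python's standard `zipfile` module; the archive member names (`media.jpg`, `metadata.json`, `signature.bin`) are illustrative assumptions.

```python
import io
import zipfile

def bundle_certified_media(media: bytes, metadata_json: bytes, encrypted_signature: bytes) -> bytes:
    """Bundle media, metadata, and the encrypted digital signature into a
    compressed archive forming the certified media dataset."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
        archive.writestr("media.jpg", media)
        archive.writestr("metadata.json", metadata_json)
        archive.writestr("signature.bin", encrypted_signature)
    return buffer.getvalue()

dataset = bundle_certified_media(b"jpeg-bytes", b'{"device": "510"}', b"\x01\x02")
with zipfile.ZipFile(io.BytesIO(dataset)) as archive:
    print(archive.namelist())  # ['media.jpg', 'metadata.json', 'signature.bin']
```

The resulting bytes can be transmitted as a single unit to a server 540 or viewer device, with the public key transmitted alongside or published separately.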
- the certified media dataset (and optionally the public key) is transmitted to a secondary device, such as a server 540 or a viewer device (i.e., a client device).
- additional data besides the media asset and associated metadata may also be certified, either separately from the media asset or together with the certification of the media asset. If the additional data is certified together with the media asset, the hash and digital signatures at step 615 may be hashes of the media asset as well as the additional data, thereby certifying the media asset along with the additional data. If the additional data is certified separately from the media asset, the entire process 600 may be repeated, with the additional data treated as a media asset. Additional data may include alterations or annotations to a media asset, or at least a subset of a report that is generated based on the media asset, or at least a subset of a report that is generated to include the media asset.
- Metadata corresponding to the additional data may in some cases identify one or more author(s) of the additional data, and/or one or more devices on which the additional data was generated and/or certified, and/or from which the additional data was submitted to the server(s) 540 .
- a certain media asset can be associated with multiple additional data items, such as multiple notes, annotations, and/or reports by different authors, the same authors, or some combination thereof.
- the operations 600 of FIG. 6 illustrate data integrity precautions that can be taken.
- for all data (e.g., the media asset and/or additional data and/or metadata), security and integrity can be ensured via a Digital Signature that is made up of a Secure Hash Algorithm (SHA) digest, the time that the asset was captured, and the device 510 of origin. This allows the mobile app or server to detect changes due to storage or transmission errors as well as any attempt to manipulate or change the content of the asset.
- the Digital Signature can be encrypted with a public/private key-pair that is generated uniquely for that asset by the media capture device 510 .
- the private key can be destroyed by the media capture device 510 and/or never written to a disk or stored in a memory of the media capture device 510 or any other device; as such, this ensures that the asset cannot be re-signed and cannot be changed without those changes being detectable.
- media asset data such as image, video, audio, 3D distance measurements, or other sensor data are captured by a camera, microphone, and/or other sensors integrated with the digital media capture device 510 and/or sensors connected to the digital media capture device 510 in a wired or wireless manner.
- the digital media capture device 510 also generates and/or extracts metadata (e.g., EXIF metadata) corresponding to this captured media asset, for example identifying the digital media capture device 510 , a timestamp of capture, a date of capture, an author or owner of the digital media capture device 510 , and any other metadata.
- a digital signature is generated by generating a hash of both the captured media and at least some of this metadata.
- the digital signature may be a hash of the captured media, the timestamp, and an identifier of the digital media capture device 510 that captured the media.
- the hash may be computed using a secure hash algorithm (SHA), such as SHA-0, SHA-1, SHA-2, or SHA-3.
- the digital media capture device 510 and/or a second device that receives the media asset from the digital media capture device 510 may then generate a public and private key pair using a public key infrastructure (PKI), where the keys may be, for example, 1024- or 2048-bit RSA keys.
- the private key is used to encrypt the digital signature, and may then be deleted, erased, and/or destroyed, in some cases via overwriting for more security.
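In practice, "encrypting the digital signature with the private key" corresponds to RSA signing. The following sketch uses the third-party pyca/cryptography library (an assumption; the disclosure does not name a library), generating a per-asset key pair, signing the digest input, and discarding the private key.

```python
# Sketch assuming the third-party pyca/cryptography library.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Illustrative digest input: media bytes concatenated with metadata.
digest_input = b"media-bytes" + b'{"device_id": "510", "timestamp": "2020-03-20T10:15:00Z"}'

# Generate a per-asset 2048-bit RSA key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sign with the private key (sign() hashes digest_input with SHA-256 internally).
signature = private_key.sign(digest_input, padding.PKCS1v15(), hashes.SHA256())

# The private key object can now be discarded; only the public key is published,
# so the asset cannot be re-signed after any alteration.
del private_key

# Anyone holding the public key can verify; verify() raises on any alteration.
public_key.verify(signature, digest_input, padding.PKCS1v15(), hashes.SHA256())
try:
    public_key.verify(signature, digest_input + b"tampered", padding.PKCS1v15(), hashes.SHA256())
except InvalidSignature:
    print("tampering detected")
```

Deleting the in-memory key object approximates the destroy-the-private-key step; a hardened implementation would also avoid ever writing the key to non-volatile storage.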
- the certified media asset, meaning the media asset, the encrypted digital signature, and the (optionally encrypted) metadata, is uploaded to the cloud servers, in some cases along with the public key, optionally securely via HTTPS or another secure network transfer protocol.
- the public key may be uploaded to the same cloud server(s) or to a different system, such as a certificate authority (CA) server.
- Any server or client can retrieve the public key from the cloud server system 540 or CA server and decrypt the encrypted digital signature to verify that it matches a new hash generated using the media asset and/or metadata at a later time, thereby verifying that the media asset and metadata have not been changed since certification.
- the same certification process may be used for additional data based on the media asset, such as annotations, notes, and reports.
- such a verification check is performed at the media capture device 510 or second device before the media asset and metadata and encrypted digital signature and public key are sent by the media capture device 510 or second device to the server(s). In some cases, such a verification check is performed at the server after receipt of the certified media asset.
- Metadata may include, for example, time, location, media capture, orientation, media size, resolution, frame size, elevations, centimeter 3D GPS position, digital media capture device speed, heading, or some combination thereof.
- FIG. 7 illustrates an exemplary web portal display for viewing a document with an integrated certified image.
- the web portal may be used to view user submitted forms to receive a quote, insurance claim document, or any other document submitted via the mobile insurance application 515 .
- the document (here, an insurance claim document 710 ) may be a digital or physical document.
- the document can include certified digital media, a description of the digital media, or the insurance claim. This description of the digital media can include information such as an identification and/or contact information of the user who took the certified photo using the image capture device 510 and/or who wrote the description of the digital media or event, a location of the property or event, and a brief textual or graphical description of the property or event.
- the digital data organization process of the incident reporting system may be controlled manually by a user or automatically by computer hardware/software control processes.
- the web portal may be accessed through the mobile software application 515 as well as through a network page, such as a webpage accessible through the Internet.
- FIG. 8 illustrates a document including two certified digital media assets corresponding to different capture times.
- the document of FIG. 8 is an insurance claim document, and includes two certified images—a first certified image 805 and a second certified image 810 —that are certified via the process 600 of FIG. 6 .
- the first certified image 805 was captured on Jan. 6, 2019, as identified in the document and as determined from certified metadata of the first certified image 805 .
- the second certified image 810 was captured on Mar. 20, 2020 (after the capture of the first certified image), as identified in the document and as determined from certified metadata of the second certified image 810 .
- the first certified image 805 is captured before the second certified image 810 is captured, and depicts an object—a building—that is undamaged.
- the second certified image 810 is captured after the first certified image 805 is captured, and depicts the same object as the first certified image 805 does—a building—that is damaged.
- the first certified image 805 and the second certified image 810 may be captured by the same media capture device 510 .
- the media capture device 510 may be, for example, the mobile device 510 of a user associated with an insurance policy associated with the object—the building—depicted in the first certified image 805 and in the second certified image 810 .
- the media capture device 510 may alternately be associated with an insurance company or agent associated with the insurance policy associated with the object.
- the first certified image 805 and the second certified image 810 may be captured by different media capture devices 510 .
- the first certified image 805 may be an image that is captured before the insurance policy is generated, selected, identified, activated, put into effect, used for an insurance claim, or some combination thereof.
- the user may capture one or more images of an object—here, the building—certify the one or more images (which now include the first certified image 805 ), and send the one or more certified images to an insurance company so that the insurance company can identify the condition that the object is in at the time of capture of the first certified image 805 .
- the time of capture of the first certified image 805 may be before the insurance policy associated with the object is generated, selected, identified, activated, put into effect, used for an insurance claim, or some combination thereof.
- the second certified image 810 may be an image that is captured after the insurance policy is generated, selected, identified, activated, put into effect, or some combination thereof.
- the second certified image 810 is specifically captured in order to file an insurance claim, such as when the object that is depicted in the first certified image 805 and that is now associated with an insurance policy—here, the building—has incurred some damage.
- the building is illustrated as including damage—a crack—as depicted in the second certified image 810 , where the crack is missing in the first certified image 805 .
- the user may capture one or more new images of the object that is depicted in the first certified image 805 , certify the one or more new images (which now include the second certified image 810 ), and send the one or more new certified images to an insurance company so that the insurance company can identify the condition that the object is in at the time of capture of the second certified image 810 .
- the time of capture of the second certified image 810 may be after the insurance policy associated with the object is generated, selected, identified, activated, put into effect, or some combination thereof, and may be after damage has been incurred to the object over which a user wishes to file an insurance claim associated with the insurance policy.
- the object may be depicted in both the first certified image 805 and the second certified image 810 .
- the damage incurred by the object may be depicted in the second certified image 810 , but not the first certified image 805 .
- the server system 540 may automatically verify that the insurance claim is not fraudulent by automatically verifying a number of properties of the first certified image 805 , the second certified image 810 , metadata associated with capture of the first certified image 805 , metadata associated with capture of the second certified image 810 , sensor data associated with capture of the first certified image 805 and captured by sensors of the media capture device 510 that captured the first certified image 805 , sensor data associated with capture of the second certified image 810 and captured by sensors of the media capture device 510 that captured the second certified image 810 , or some combination thereof.
- the server system 540 may use an image recognition algorithm to identify that the object associated with the insurance policy is the object depicted in both the first certified image 805 and the second certified image 810 .
- the image recognition algorithm may identify features of the object as depicted in first certified image 805 , such as various corners and edges and colors of the building, and identify that the features match features found in the second certified image 810 .
- the image recognition algorithm may retrieve another image of the type of object insured from a database or other storage repository, for example by querying an image of a building or of the specific building, identifying features of the object as depicted in the retrieved image, and then identifying that the features match features found in the first certified image 805 and the second certified image 810 .
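The feature comparison can be illustrated with a toy sketch. Real systems match numeric descriptors (e.g., corner and edge descriptors extracted by a vision library), but set overlap conveys the idea; the feature names here are invented for illustration.

```python
def feature_match_ratio(features_a: set, features_b: set) -> float:
    """Fraction of features from image A also found in image B — a toy
    stand-in for descriptor matching in a real image recognition pipeline."""
    if not features_a:
        return 0.0
    return len(features_a & features_b) / len(features_a)

# Invented feature labels for the before/after images of the same building.
before = {"corner-NW", "corner-NE", "roofline", "door-frame", "window-grid"}
after = {"corner-NW", "corner-NE", "roofline", "door-frame", "window-grid", "crack-diagonal"}

# Every pre-claim feature reappears post-claim: same object in both images.
assert feature_match_ratio(before, after) == 1.0
# The damage feature appears only in the second image: new, claimable damage.
assert (after - before) == {"crack-diagonal"}
```

A real pipeline would threshold the match ratio rather than require exactly 1.0, to tolerate viewpoint and lighting changes.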
- This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from capturing images of two different objects in the two images, and trying to pass them off as the same object.
- the server system 540 may use the image recognition algorithm to verify that the damage incurred by the object that is depicted in the second certified image 810 —here, a crack in the building—is not depicted in the first certified image 805 .
- This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from trying to file an insurance claim on damage that was already present in the object when the user was filing the insurance policy.
- the server system 540 may use the image recognition algorithm to verify that the damage incurred by the object that is depicted in the second certified image 810 —here, a crack in the building—is not depicted in another certified image that has been received by the insurance company and/or that may be associated with a previous and/or different insurance claim. This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from trying to file multiple insurance claims on the same damage.
- the server system 540 may also verify, based on certified metadata associated with the first certified image 805 and the second certified image 810 , that the location at which the first certified image 805 was captured matches the location at which the second certified image 810 was captured, at least within a predetermined radius or distance to accommodate slight variations in media capture vantage point and variations in positioning functionality. If the insured object is a building or other stationary object, the locations of capture of these images should be the same, at least within a margin of error of a predetermined range or distance, which may for instance be between 0.1 miles and 10 miles, or between 0.1 kilometers and 10 kilometers.
- the server system 540 may also retrieve a location of the object from a database or other storage repository, such as one associated with a land title deed, and verify that the retrieved location of the object matches the locations of capture of the first certified image 805 and the second certified image 810 , at least within a margin of error of the predetermined range or distance. This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from capturing images of two different objects in the two images if the objects are in different places, even if they look identical, and trying to pass the images off as the same object.
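The within-a-radius comparison of capture locations can be computed with a great-circle (haversine) distance over the certified latitude/longitude metadata. The coordinates and the 1 km radius below are illustrative assumptions; the patent allows the margin to range from roughly 0.1 to 10 kilometers.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def captures_colocated(loc_a, loc_b, radius_km=1.0):
    """True if the two certified captures fall within the allowed radius."""
    return haversine_km(*loc_a, *loc_b) <= radius_km

policy_photo = (40.7580, -73.9855)  # hypothetical capture coordinates
claim_photo = (40.7585, -73.9850)   # roughly 70 m away
print(captures_colocated(policy_photo, claim_photo))  # True
```

The same function can compare each capture location against a location retrieved from a land-title record, as the passage describes.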
- the server system 540 may also verify, based on certified metadata associated with the first certified image 805 and the second certified image 810 , that the time of capture of the first certified image 805 is before the time of capture of the second certified image 810 . In other words, the server system 540 may also verify that the time of capture of the second certified image 810 is after the time of capture of the first certified image 805 . In some cases, the server system 540 may verify that the time of capture of the first certified image 805 is also before one or more times and/or dates associated with the insurance policy on the object being generated, selected, identified, activated, put into effect, used for an insurance claim, or some combination thereof.
- the server system 540 may verify that the time of capture of the second certified image 810 is also after one or more times and/or dates associated with the insurance policy on the object being generated, selected, identified, activated, put into effect, or some combination thereof. This verification process prevents fraudulent activity in an automated technological manner in that it ensures that images are captured when the user states that they are.
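The timestamp-ordering checks above reduce to a chain of comparisons over certified capture times and policy dates. This sketch assumes one particular combination of the orderings the passage permits (first capture before the policy takes effect; second capture between the policy taking effect and the claim being filed); the helper names and sample dates are hypothetical.

```python
from datetime import datetime, timezone

def timeline_is_plausible(first_capture, second_capture,
                          policy_effective, claim_filed):
    """Enforce the ordering described above: the 'before' photo predates
    the policy taking effect, and the 'damage' photo falls between the
    policy taking effect and the claim being filed."""
    return (first_capture <= policy_effective
            and policy_effective <= second_capture <= claim_filed)

ts = lambda s: datetime.fromisoformat(s).replace(tzinfo=timezone.utc)
print(timeline_is_plausible(
    ts("2020-01-05T09:00"), ts("2021-06-01T14:30"),
    ts("2020-01-10T00:00"), ts("2021-06-03T00:00")))  # True
```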
- the server system 540 may also verify that certified sensor data associated with capture of the first certified image 805 matches certified sensor data associated with capture of the second certified image 810 .
- the sensor data associated with both certified images may include data captured by a RADAR sensor, a LIDAR sensor, an infrared sensor, a microphone, or some combination thereof.
- This verification process prevents fraudulent activity in an automated technological manner in that it ensures that the same object is depicted in both the first certified image 805 and the second certified image 810 based on additional sensor data that could catch discrepancies that might not be caught otherwise through the images or other metadata alone.
- the first certified image 805 may be received by the server system 540 (or a separate server system 540 ) and stored there at or shortly after capture; that is, before the insurance policy on the object is used for an insurance claim, and before, during, or shortly after the insurance policy on the object is generated, selected, identified, activated, put into effect, or some combination thereof.
- FIG. 8 illustrates the first certified image 805 as coming from a server system 820 .
- the second certified image 810 may be received after receipt of the first certified image 805 , and therefore may be received from the media capture device 510 , such as the mobile device 830 illustrated in FIG. 8 .
- the first certified image 805 may be instead received from the media capture device 830 .
- the second certified image 810 may be instead received from the server system 820 .
- While the document of FIG. 8 includes two certified images 805 and 810 , and various verification processes are discussed above with respect to the two certified images 805 and 810 , other types of certified media may be used in the document and in the verification processes discussed above.
- one or both of the certified images 805 and 810 may be replaced with or supplemented with other forms of media, such as additional images, videos, audio, LIDAR point clouds, RADAR point clouds, infrared images or other captures, other types of media discussed herein, or some combination thereof.
- FIG. 9 illustrates operations for secure digital media processing.
- the process 900 of FIG. 9 is performed by the secure server system 540 and/or another portion of the internet/cloud system 530 of FIG. 5 .
- at step 905 , the server system 540 receives a first digital media asset and a first information set associated with the first digital media asset.
- at step 910 , the server system 540 receives a second digital media asset and a second information set associated with the second digital media asset.
- at step 915 , the server system 540 verifies that the first digital media asset and the first information set are unaltered.
- at step 920 , the server system 540 verifies that the second digital media asset and the second information set are unaltered.
- at step 925 , the server system 540 verifies that a first time of capture of the first digital media asset as identified in the first information set is before a second time of capture of the second digital media asset as identified in the second information set.
- at step 930 , the server system 540 generates a document that includes the first digital media asset and the second digital media asset in response to verifying that the first time of capture is before the second time of capture and to verifying that the first digital media asset, the first information set, the second digital media asset, and the second information set are unaltered.
- the document produced in step 930 may be an insurance claim document such as the insurance claim document of FIG. 8 .
- the verification steps 915 - 925 may be performed as discussed or substituted by any verification processes discussed with respect to FIG. 8 .
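The process 900 pipeline can be sketched end to end. This is a hedged stand-in: the patent elsewhere describes asymmetric signatures, but because the standard library lacks asymmetric primitives this sketch uses HMAC tags under a shared key as the integrity check, and encodes capture times as ISO 8601 strings so they compare lexicographically. All names and the key are assumptions.

```python
import hmac, hashlib, json

SECRET = b"device-provisioned-signing-key"  # placeholder for the real trust root

def certify(asset: bytes, info: dict) -> str:
    """Tag emitted at capture time, covering the asset and its info set."""
    payload = asset + json.dumps(info, sort_keys=True).encode()
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def process_900(first, second):
    """(asset, info, tag) triples in; claim document out, or None."""
    for asset, info, tag in (first, second):
        if not hmac.compare_digest(tag, certify(asset, info)):
            return None                       # altered asset or info set
    if first[1]["captured"] >= second[1]["captured"]:
        return None                           # bad time ordering
    return {"type": "insurance_claim",        # generate the document
            "before": first[0], "after": second[0]}

a_info = {"captured": "2020-01-05T09:00:00Z"}
b_info = {"captured": "2021-06-01T14:30:00Z"}
a = (b"img-a", a_info, certify(b"img-a", a_info))
b = (b"img-b", b_info, certify(b"img-b", b_info))
print(process_900(a, b) is not None)  # True: document generated
```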
- FIG. 10 illustrates an exemplary certified image within an exemplary image viewer application.
- the exemplary image viewer application 1010 displays an image 1020 and shows a “certified image” stamp or “certified by [image certifying authority]” stamp graphic 1050 in the upper left corner of the image 1020 , signifying that the image 1020 is a certified image and therefore is certified as authentic and unaltered.
- the exemplary image viewer 1010 also shows some image metadata along the bottom 1040 of the image 1020 and additional metadata in a metadata viewing box 1030 on the right-hand side of the image 1020 .
- This metadata is also certified as part of the certification process, and may in some cases also be marked with a “certified image” stamp or “certified by [image certifying authority]” stamp graphic itself to signify that it, too, is certified as authentic and unaltered.
- FIG. 11 illustrates capture of an image at a media capture device and transfer of the image upon its certification to a computing device at which the certified image is viewed.
- a media capture device 1110 captures an image 1150 and certifies the image 1150 upon capture.
- the certified image 1150 is transferred, optionally through the server(s) 540 , to a viewer device 1120 , which views the certified image 1150 in a viewer application 1130 .
- the viewer application 1130 may display meters corresponding to image data security levels, for instance identifying a degree of confidence in the authenticity and unaltered nature of the certified image 1150 .
- FIG. 12 is a flow diagram illustrating operations for capturing media, certifying the media, securely transferring the certified media, securely storing the certified media, and securely broadcasting the certified media.
- a digital media asset 1205 captured by a digital media capture device 510 is received by the media processing system 1210 , which may in some cases be the media capture device 510 , the internet/cloud system 530 , or some combination thereof.
- the digital media asset 1205 may be in the form of image data, audio data, video data, RADAR data, LIDAR data, SONAR data, SODAR data, sensor data from one or more other sensors, or some combination thereof.
- the media processing system may also receive metadata related to the digital media asset 1205 .
- the metadata may include location, orientation, and navigation attributes of the media capture device 510 , time of capture, date of capture, photographer, identity of the digital media capture device 510 , owner of the digital media capture device 510 , sensor data from one or more other sensors coupled to the digital media capture device 510 , or some combination thereof.
- when the digital media asset 1205 and its metadata are captured by the digital media capture device 510 , they are automatically certified instantly at (or within a short time period after) the press of the camera's shutter button (as in FIG. 6 ) and transmitted by the digital media capture device 510 to the internet/cloud system 530 , where the digital data is filed, stored, and accessed through the web in a systematic or serialized format.
- the system generates an information page, based on the information associated with the digital media asset, at a network location.
- the internet/cloud system 530 can include one or more server systems 540 , which may be connected to each other.
- this internet/cloud system 530 is a wireless multiplexed system for securely storing digital data sent to and from multiple mobile digital devices 510 .
- the digital data (e.g., images, reports) are securely held in one central place, whether in a hardware memory device, a server, or a data center.
- the digital media asset 1205 is embedded with a glyph that encodes information pointing to the internet/cloud system 530 .
- This glyph can be a quick response (QR) code, a barcode, Aztec code, brand logo, URL, or other identifier.
- the information pointing to the internet/cloud system can encode or otherwise include a URL.
- the system then generates a modified digital media asset by overlaying the glyph on or over the digital media asset 1205 .
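The glyph-embedding step amounts to encoding a network location and placing the glyph over a corner of the asset, as shown in FIG. 13. A minimal sketch follows; the URL scheme, margin, and placement are assumptions, and actually rendering a QR code would use a library such as `qrcode` together with an image library for compositing (not shown).

```python
def glyph_payload(base_url, asset_id):
    """Network location the glyph encodes (URL scheme is an assumption)."""
    return f"{base_url}/assets/{asset_id}"

def overlay_box(img_w, img_h, glyph_size, margin=16):
    """Bottom-right placement rectangle for the glyph, as in FIG. 13."""
    x = img_w - glyph_size - margin
    y = img_h - glyph_size - margin
    return (x, y, x + glyph_size, y + glyph_size)

print(glyph_payload("https://media.example.com", "abc123"))
print(overlay_box(1920, 1080, 200))  # (1704, 864, 1904, 1064)
```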
- the digital media asset 1205 may be protected through various technologies and system controls. Security measures such as digital signatures, watermarking, encryption, physical access controls, and password credentials can be utilized throughout the system. Original digital data can be confirmed, saved, and protected through various technologies and system controls.
- the system may generate a hash digest of the digital media asset and a unique key pair corresponding to the digital media asset.
- the unique key pair consists of a private key and a public key, in which the private key is used to encrypt both the hash digest and the metadata.
- the public key may be transmitted to a network server. The information pointing to the network location may require a passcode to access the digital data.
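The sign-with-private-key, verify-with-public-key flow described above can be illustrated with textbook RSA. This is strictly pedagogical: the key below is toy-sized and hand-picked, whereas a real system would use 2048-bit or larger keys from a vetted library such as `cryptography`, and the metadata encoding is an assumption.

```python
import hashlib

# Toy RSA key pair (textbook numbers, NOT secure).
P, Q = 61, 53
N = P * Q        # modulus 3233
E = 17           # public exponent
D = 2753         # private exponent: D*E = 1 (mod (P-1)*(Q-1))

def digest(asset: bytes, metadata: bytes) -> int:
    """Hash digest of the asset plus metadata, truncated to the toy modulus."""
    h = hashlib.sha256(asset + metadata).digest()
    return int.from_bytes(h, "big") % N

def sign(asset, metadata):
    """Encrypt the digest with the private key, as described above."""
    return pow(digest(asset, metadata), D, N)

def verify(asset, metadata, signature):
    """Decrypt with the public key and compare against a fresh digest."""
    return pow(signature, E, N) == digest(asset, metadata)

sig = sign(b"photo-bytes", b'{"captured":"2021-06-01"}')
print(verify(b"photo-bytes", b'{"captured":"2021-06-01"}', sig))      # True
print(verify(b"photo-bytes", b'{"captured":"2021-06-01"}', sig + 1))  # False
```

The public exponent plays the role of the public key transmitted to the network server: anyone holding it can confirm the asset is unaltered, but cannot forge new signatures.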
- a content provider may broadcast the information regarding the digital media by displaying the media as well as the glyph associated with the media during its broadcast on a display device.
- the content provider may be a television network, radio channel, newspaper, or magazine publisher.
- Viewers of the broadcast may request to access the digital media and its associated information by scanning the glyph from the broadcast with the viewer's device. This data can be transferred over a wired or a wireless connection or printed on photo, legal document, newspaper, magazine, or flyer.
- the system may generate a decrypted signature by decrypting the digital signature using the public key, generate a hash based on the digital media asset, and verify that the hash matches the decrypted signature.
- when a viewer of the broadcast requests this information, the viewer may be shown the information page corresponding to the digital media asset. In another embodiment, the viewer may be prompted to provide a passcode to access the information page containing the digital media asset and its metadata.
- the information page may include a certification that the digital media asset is unaltered based on the hash matching the decrypted signature.
- FIG. 13 illustrates a modified version of a digital media asset with an overlaid glyph.
- the server system 540 generates the glyph 1320 to encode a URL and/or other information identifying a network location of an information page.
- the server system 540 generates the modified digital media asset by generating a glyph 1320 and overlaying the glyph 1320 on the digital media asset 1310 , or by otherwise embedding the glyph 1320 into the digital media asset 1310 .
- the certified image 1310 may also be modified by overlaying certain metadata 1330 over the image 1310 .
- the glyph 1320 is located on the bottom right of the certified image 1310 as a quick response (QR) Code 1320 .
- a requesting device may interact with this image by scanning the QR Code 1320 with a camera or other scanner of the requesting device, such as a smart phone or tablet with a QR Code scanner. The requesting device may then decode the glyph to obtain the network location. The requesting device may go to the network location, for example in a browser, and request the information page from the servers 540 . The servers 540 then provide the information page or media to the requesting device.
- the modified digital media asset 1310 may alternately or additionally include a hyperlinked text or region of the digital media asset, such that a user viewing the modified digital media asset on a device can click or tap the hyperlinked text or region to go to the network location (e.g., URL) of the information page.
- the QR code may be the hyperlinked region of the modified digital media asset.
- the viewer using the digital device with a QR code can determine how much access the content provider may be given to the viewer's device when providing the digital media. That is, a viewer or administrator may set permissions so that only certain requesting devices, such as devices that have a passcode, may be able to access the information page.
- the secure digital media processing and document generation techniques discussed herein may be implemented in the general context of computer executable instructions via software located on and between a remotely operated user device (e.g., Smartphone, tablet, or other electronic device) with a computerized operating system.
- the user devices may be remotely addressable with secure login, with or without a password, biometric fingerprint, voice, retinal scan, or encrypted log-on web portal providing global internet access, whether via fiber, radio frequency, satellite, or data linking with a bi-directional cloud system and/or data center or location under user command.
- the web portal and/or viewer device mobile application can include annotation tools, allowing for creation of charts and graphics with color annotation incorporating picture in picture, with image numbers associated with each image brought into the web portal workspace.
- any interface with external data inputs, such as weather, news, Internet sites, or other sensor data, can be integrated into the workspace and into the image work areas.
- FIG. 14 illustrates an exemplary computing system 1400 that may be used to implement some aspects of the technology.
- any of the computing devices, computing systems, network devices, network systems, servers, and/or arrangements of circuitry described herein may include at least one computing system 1400 , or may include at least one component of the computer system 1400 identified in FIG. 14 .
- the computing system 1400 of FIG. 14 includes one or more processors 1410 and memory 1420 .
- Each of the processor(s) 1410 may refer to one or more processors, controllers, microcontrollers, central processing units (CPUs), graphics processing units (GPUs), arithmetic logic units (ALUs), accelerated processing units (APUs), digital signal processors (DSPs), application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or combinations thereof.
- Each of the processor(s) 1410 may include one or more cores, either integrated onto a single chip or spread across multiple chips connected or coupled together.
- Memory 1420 stores, in part, instructions and data for execution by processor 1410 .
- Memory 1420 can store the executable code when in operation.
- the system 1400 of FIG. 14 further includes a mass storage device 1430 , portable storage medium drive(s) 1440 , output devices 1450 , user input devices 1460 , a graphics display 1470 , and peripheral devices 1480 .
- processor unit 1410 and memory 1420 may be connected via a local microprocessor bus
- the mass storage device 1430 , peripheral device(s) 1480 , portable storage device 1440 , and display system 1470 may be connected via one or more input/output (I/O) buses.
- Mass storage device 1430 , which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 1410 . Mass storage device 1430 can store the system software for implementing some aspects of the subject technology for purposes of loading that software into memory 1420 .
- Portable storage device 1440 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disc, or digital video disc, to input and output data and code to and from the computer system 1400 of FIG. 14 .
- the system software for implementing aspects of the subject technology may be stored on such a portable medium and input to the computer system 1400 via the portable storage device 1440 .
- the memory 1420 , mass storage device 1430 , or portable storage 1440 may in some cases store sensitive information, such as transaction information, health information, or cryptographic keys, and may in some cases encrypt or decrypt such information with the aid of the processor 1410 .
- the memory 1420 , mass storage device 1430 , or portable storage 1440 may in some cases store, at least in part, instructions, executable code, or other data for execution or processing by the processor 1410 .
- Output devices 1450 may include, for example, communication circuitry for outputting data through wired or wireless means, display circuitry for displaying data via a display screen, audio circuitry for outputting audio via headphones or a speaker, printer circuitry for printing data via a printer, transcription or some combination thereof.
- the display screen may be any type of display discussed with respect to the display system 1470 .
- the printer may be inkjet, LaserJet, thermal, or some combination thereof.
- the output device circuitry 1450 may allow for transmission of data over an audio jack/plug, a microphone jack/plug, a universal serial bus (USB) port/plug, an Apple® Lightning® port/plug, an Ethernet port/plug, a fiber optic port/plug, a proprietary wired port/plug, a BLUETOOTH® wireless signal transfer, a BLUETOOTH® low energy (BLE) wireless signal transfer, an IBEACON® wireless signal transfer, a radio-frequency identification (RFID) wireless signal transfer, near-field communications (NFC) wireless signal transfer, dedicated short range communication (DSRC) wireless signal transfer, 802.11 Wi-Fi wireless signal transfer, wireless local area network (WLAN) signal transfer, Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Infrared (IR) communication wireless signal transfer, Public Switched Telephone Network (PSTN) signal transfer, Integrated Services Digital Network (ISDN) signal transfer, 3G/4G/5G/LTE cellular
- Input devices 1460 may include circuitry providing a portion of a user interface.
- Input devices 1460 may include an alpha-numeric keypad, such as a keyboard, for inputting alpha-numeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys.
- Input devices 1460 may include touch-sensitive surfaces as well, either integrated with a display as in a touchscreen, or separate from a display as in a trackpad. Touch-sensitive surfaces may in some cases detect localized variable pressure or force detection.
- the input device circuitry may allow for receipt of data over an audio jack, a microphone jack, a universal serial bus (USB) port/plug, an Apple® Lightning® port/plug, an Ethernet port/plug, a fiber optic port/plug, a proprietary wired port/plug, a wired local area network (LAN) port/plug, a BLUETOOTH® wireless signal transfer, a BLUETOOTH® low energy (BLE) wireless signal transfer, an IBEACON® wireless signal transfer, a radio-frequency identification (RFID) wireless signal transfer, near-field communications (NFC) wireless signal transfer, dedicated short range communication (DSRC) wireless signal transfer, 802.11 Wi-Fi wireless signal transfer, wireless local area network (WLAN) signal transfer, Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Infrared (IR) communication wireless signal transfer, Public Switched Telephone Network (PSTN) signal transfer, Integrated Services Digital Network (ISDN) signal transfer, 3G/4G
- Input devices 1460 may include receivers or transceivers used for positioning of the computing system 1400 as well. These may include any of the wired or wireless signal receivers or transceivers.
- a location of the computing system 1400 can be determined based on signal strength of signals as received at the computing system 1400 from three cellular network towers, a process known as cellular triangulation. Fewer than three cellular network towers can also be used—even one can be used—though the location determined from such data will be less precise (e.g., somewhere within a particular circle for one tower, somewhere along a line or within a relatively small area for two towers) than via triangulation. More than three cellular network towers can also be used, further enhancing the location's accuracy.
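The three-tower case described above can be solved by linearizing the three circle equations on a flat-plane approximation. This is a hedged sketch of the geometry only; real cellular positioning works with noisy range estimates from signal strength and would use a least-squares fit rather than an exact solve.

```python
def trilaterate(towers, distances):
    """Locate a point from its distances to three known towers by
    subtracting circle equations to obtain two linear equations."""
    (x1, y1), (x2, y2), (x3, y3) = towers
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the towers are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Towers at known positions; distances inferred from signal strength.
towers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [5.0, ((3 - 10) ** 2 + 16) ** 0.5, (9 + (4 - 10) ** 2) ** 0.5]
print(trilaterate(towers, dists))  # approximately (3.0, 4.0)
```

With only one or two towers the system of equations is underdetermined, which matches the passage: one tower constrains the device to a circle, two towers to at most two candidate points.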
- Similar positioning operations can be performed using proximity beacons, which might use short-range wireless signals such as BLUETOOTH® wireless signals, BLUETOOTH® low energy (BLE) wireless signals, IBEACON® wireless signals, personal area network (PAN) signals, microwave signals, radio wave signals, or other signals discussed above. Similar positioning operations can be performed using wired local area networks (LAN) or wireless local area networks (WLAN) where locations are known of one or more network devices in communication with the computing system 1400 such as a router, modem, switch, hub, bridge, gateway, or repeater.
- Input devices 1460 may include receivers or transceivers corresponding to one or more Global Navigation Satellite System (GNSS) constellations, such as the United States' Global Positioning System (GPS), the Russia-based Global Navigation Satellite System (GLONASS), or the China-based BeiDou Navigation Satellite System (BDS).
- Display system 1470 may include a liquid crystal display (LCD), a plasma display, an organic light-emitting diode (OLED) display, a low-temperature poly-silicon (LTPO) display, an electronic ink or “e-paper” display, a projector-based display, a holographic display, or another suitable display device.
- Display system 1470 receives textual and graphical information, and processes the information for output to the display device.
- the display system 1470 may include multiple-touch touchscreen input capabilities, such as capacitive touch detection, resistive touch detection, surface acoustic wave touch detection, or infrared touch detection. Such touchscreen input capabilities may or may not allow for variable pressure or force detection.
- Peripherals 1480 may include any type of computer support device to add additional functionality to the computer system.
- peripheral device(s) 1480 may include one or more additional output devices of any of the types discussed with respect to output device 1450 , one or more additional input devices of any of the types discussed with respect to input device 1460 , one or more additional display systems of any of the types discussed with respect to display system 1470 , one or more memories or mass storage devices or portable storage devices of any of the types discussed with respect to memory 1420 or mass storage 1430 or portable storage 1440 , a modem, a router, an antenna, a wired or wireless transceiver, a printer, a bar code scanner, a quick-response (“QR”) code scanner, a magnetic stripe card reader, an integrated circuit chip (ICC) card reader such as a smartcard reader or a EUROPAY®-MASTERCARD®-VISA® (EMV) chip card reader, a near field communication (NFC) reader, a document/image scanner, a visible light
- the components contained in the computer system 1400 of FIG. 14 can include those typically found in computer systems that may be suitable for use with some aspects of the subject technology and represent a broad category of such computer components that are well known in the art. That said, the computer system 1400 of FIG. 14 can be customized and specialized for the purposes discussed herein and to carry out the various operations discussed herein, with specialized hardware components, specialized arrangements of hardware components, and/or specialized software.
- Thus, the computer system 1400 of FIG. 14 can be a personal computer, a hand-held computing device, a telephone (“smartphone” or otherwise), a mobile computing device, a workstation, a server (on a server rack or otherwise), a minicomputer, a mainframe computer, a tablet computing device, a wearable device (such as a watch, a ring, a pair of glasses, or another type of jewelry or clothing or accessory), a video game console (portable or otherwise), an e-book reader, a media player device (portable or otherwise), a vehicle-based computer, another type of computing device, or some combination thereof.
- the computer system 1400 may in some cases be a virtual computer system executed by another computer system.
- the computer can also include different bus configurations, networked platforms, multi-processor platforms, etc.
- Various operating systems can be used including Unix®, Linux®, FreeBSD®, FreeNAS®, pfSense®, Windows®, Apple® Macintosh OS® (“MacOS®”), Palm OS®, Google® Android®, Google® Chrome OS®, Chromium® OS®, OPENSTEP®, XNU®, Darwin®, Apple® iOS®, Apple® tvOS®, Apple® watchOS®, Apple® audioOS®, Amazon® Fire OS®, Amazon® Kindle OS®, variants of any of these, other suitable operating systems, or combinations thereof.
- the computer system 1400 may also use a Basic Input/Output System (BIOS) or Unified Extensible Firmware Interface (UEFI) as a layer upon which the operating system(s) are run.
- the computer system 1400 may be part of a multi-computer system that uses multiple computer systems 1400 , each for one or more specific tasks or purposes.
- the multi-computer system may include multiple computer systems 1400 communicatively coupled together via at least one of a personal area network (PAN), a local area network (LAN), a wireless local area network (WLAN), a municipal area network (MAN), a wide area network (WAN), or some combination thereof.
- the multi-computer system may further include multiple computer systems 1400 from different networks communicatively coupled together via the internet (also known as a “distributed” system).
- Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution and that may be used in the memory 1420 , the mass storage 1430 , the portable storage 1440 , or some combination thereof.
- Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively.
- non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, a magnetic strip/stripe, any other magnetic storage medium, flash memory, memristor memory, any other solid-state memory, a compact disc read only memory (CD-ROM) optical disc, a rewritable compact disc (CD) optical disc, digital video disk (DVD) optical disc, a blu-ray disc (BDD) optical disc, a holographic optical disk, another optical medium, a secure digital (SD) card, a micro secure digital (microSD) card, a Memory Stick® card, a smartcard chip, a EMV chip, a subscriber identity module (SIM) card, a mini/micro/nano/pico SIM card, another integrated circuit (IC) chip/card, random access memory (RAM), static RAM (SRAM), dynamic RAM (DRAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only
- a bus 1490 carries the data to system RAM or another memory 1420 , from which a processor 1410 retrieves and executes the instructions.
- the instructions received by system RAM or another memory 1420 can optionally be stored on a fixed disk (mass storage device 1430 /portable storage 1440 ) either before or after execution by processor 1410 .
- Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.
- any process illustrated in any flow diagram herein or otherwise illustrated or described herein may be performed by a machine, mechanism, and/or computing system 1400 discussed herein, and may be performed automatically (e.g., in response to one or more triggers/conditions described herein), autonomously, semi-autonomously (e.g., based on received instructions), or a combination thereof.
- any action described herein as occurring in response to one or more particular triggers/conditions should be understood to optionally occur automatically in response to the one or more particular triggers/conditions.
Abstract
A server system for digital media processing may receive two digital media assets and information associated with capture of the two digital media assets, such as metadata or sensor data from the respective times of capture of the two digital media assets. The server system verifies that the two digital media assets and associated information are unaltered, and that the second of the two digital media assets is captured after the first. The server system then generates a document that incorporates the two digital media assets, such as an insurance claim document.
Description
- The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 16/837,944 filed Apr. 1, 2020, which is incorporated herein by reference in its entirety.
- The present invention generally relates to digital media processing. More specifically, the present technology relates to authentication, validation, and analysis of digital media, and generation of documentation using the digital media that may be used for insurance processes.
- User devices such as smartphones or tablets can capture photos or videos using camera software applications designed to interact with camera hardware embedded in the user device. Some of these camera applications store photo metadata along with the photo. However, viewers viewing photos or videos captured using such software applications traditionally have no way to know if the photos or videos and associated metadata are genuine, or if the photos or videos have been altered or manipulated in some way. This can be particularly problematic if such photos or videos are used in contexts such as the insurance industry, as manipulated photos, videos, or metadata could be used by malicious parties for fraudulent insurance claims. Thus, there is a need for improved techniques and technologies that combat digital media and metadata manipulation.
- Systems and methods for secure media processing are disclosed. In one example, a method is provided that includes receiving a first digital media asset, a first information set associated with the first digital media asset, a second digital media asset, and a second information set associated with the second digital media asset. The method also includes verifying that the first digital media asset and the first information set are unaltered and verifying that the second digital media asset and the second information set are unaltered. The method also includes verifying that a first time of capture of the first digital media asset, as identified in the first information set, is before a second time of capture of the second digital media asset, as identified in the second information set. The method also includes generating a document that includes the first digital media asset and the second digital media asset in response to verifying that the first time of capture is before the second time of capture and verifying that the first digital media asset, the first information set, the second digital media asset, and the second information set are unaltered.
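The claimed method can be illustrated with a short Python sketch. This is a non-normative illustration: the function name and the `digest` and `captured_at` fields of the information sets are assumptions for this example, not terms from the claims.

```python
import hashlib
from datetime import datetime, timezone

def sha256_hex(data: bytes) -> str:
    # Content fingerprint recorded at capture time and re-checked later.
    return hashlib.sha256(data).hexdigest()

def verify_and_generate(first_asset: bytes, first_info: dict,
                        second_asset: bytes, second_info: dict) -> dict:
    """Verify both assets are unaltered and correctly ordered in time,
    then generate a document incorporating both assets."""
    # Integrity: each asset's current digest must match the digest
    # recorded in its information set.
    for asset, info in ((first_asset, first_info), (second_asset, second_info)):
        if sha256_hex(asset) != info["digest"]:
            raise ValueError("digital media asset or information set altered")
    # Ordering: the second asset must be captured after the first.
    if not first_info["captured_at"] < second_info["captured_at"]:
        raise ValueError("second asset not captured after the first")
    # Generate the document (e.g., an insurance claim document).
    return {"type": "insurance_claim", "before": first_asset, "after": second_asset}

before, after = b"photo-at-quote-time", b"photo-at-claim-time"
doc = verify_and_generate(
    before,
    {"digest": sha256_hex(before),
     "captured_at": datetime(2020, 1, 1, tzinfo=timezone.utc)},
    after,
    {"digest": sha256_hex(after),
     "captured_at": datetime(2020, 6, 1, tzinfo=timezone.utc)},
)
```

Any alteration of an asset changes its digest, so the integrity check fails before the document is ever generated.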
- FIG. 1 is a flow diagram illustrating operations of an image capture and certification system combined with an end-to-end mobile insurance system.
- FIG. 2 illustrates an image capture system that combines a camera image with a sensor data set from a sensor system.
- FIG. 3 illustrates exemplary functions and features of the image capture system.
- FIG. 4 identifies exemplary media metadata and certification data.
- FIG. 5 illustrates a network architecture through which digital media may be captured, certified, uploaded to a server network, verified, synchronized, and distributed to clients.
- FIG. 6 is a flow diagram illustrating techniques for media and metadata certification.
- FIG. 7 illustrates an exemplary web portal display for viewing a document with an integrated certified image.
- FIG. 8 illustrates a document including two certified digital media assets corresponding to different capture times.
- FIG. 9 illustrates operations for secure digital media processing.
- FIG. 10 illustrates an exemplary certified image within an exemplary image viewer application.
- FIG. 11 illustrates capture of an image at a media capture device and transfer of the image, upon its certification, to a computing device at which the certified image is viewed.
- FIG. 12 is a flow diagram illustrating operations for capturing media, certifying the media, securely transferring the certified media, securely storing the certified media, and securely broadcasting the certified media.
- FIG. 13 illustrates a modified version of a digital media asset with an overlaid glyph.
- FIG. 14 is a block diagram of an exemplary computing device that may be used to implement some aspects of the technology.
- Techniques for secure media processing may be used to certify a digital media asset by verifying that the digital media asset is authentic and has not been altered since capture. In some cases, these secure media processing techniques may be used in the context of an enterprise ecosystem including a mobile application and a corresponding server system. Together, the mobile application and the corresponding server system automatically generate quotes or identify insurance policies for users based on certified digital media depicting an object to be insured, and automatically generate and handle insurance claims based on additional certified digital media depicting the insured object after an event occurs that, for instance, causes the object to incur damage.
- The mobile application and corresponding server system can perform tasks of various types, including marketing, communications, selling, underwriting, providing digital copies of an insurance policy, binding (e.g., through electronic signatures), processing payment for an insurance policy, handling insurance claim loss notices, analytics, and adjustment processes, through final payment and delivery of insurance claim damages. The mobile application covers capturing, processing, and certifying a digital media asset submitted for insurance purposes. Through these techniques, fraud in the insurance claim process is reduced due to the certification of digital media assets. An artificial intelligence (AI) object recognition system and third-party databases may be utilized to determine the identity, condition, replacement cost, and value of the property, and to verify the validity of the insurance claim. Data transferred between the parties in the system is transmitted to a secure server and is securely accessible by the parties involved to provide transparency in the insurance process.
- The mobile application and corresponding server system may generate and/or identify an insurance policy, and perform other tasks related to the insurance policy, such as underwriting, filing an insurance claim, analyzing the insurance claim, the adjustment process, and processing of the final payment of insurance claim damages, along with policy renewals. Types of insurance for which the mobile application and corresponding server system may be used include, but are not limited to: homeowners, business, auto, condo, commercial business, ID theft, motorcycle, pet, RV, renters, travel, jewelry, boat, flood, hurricane, marine ships and shipping, tornado, hail, wind, aviation, agriculture, earthquake, fire, and property and casualty insurance.
- FIG. 1 is a flow diagram illustrating operations of an image capture and certification system combined with an end-to-end mobile insurance system.
- At step 110, a mobile application is downloaded on the user's mobile device. The mobile application may be downloaded from a website or other storage repository through a user search, or by clicking on an advertisement sent to the user via a website, email, or message. The mobile application is utilized throughout the lifecycle of obtaining insurance and filing an insurance claim. It may contain marketing tools; the ability to send and receive documents; tools to capture, certify, and organize digital media and automatically attach it to insurance documents; support for electronic signatures and notary credentials; and tools to review policies and the insurance claim process.
- The mobile application may be installed on any mobile device equipped with various sensors, such as one or more cameras, video recorders, audio recording microphones, GPS or GNSS receivers, any other types of sensors discussed herein, or some combination thereof. Such a device may be a smartphone, a tablet computer, a personal digital assistant (PDA), a drone, an unmanned ground vehicle (UGV), a pocket PC, a satellite camera-equipped phone, or a laptop. The mobile device can connect to the internet via a Wi-Fi or wired connection, or communicate with a cellular tower. The mobile application may be downloaded to the mobile device via the cellular or internet connection, from an advertisement with a QR code, from a website, from the Apple or Google app store, or by directly searching for the application.
- A touchscreen display of the mobile device may display a graphical user interface (GUI) to assist a user seeking to obtain insurance in utilizing the mobile application. The GUI may contain various menu screens and menu buttons for the user to interact with. The GUI may provide a search bar to search for an insurance company and the company's history, rates, and customer reviews. The GUI may also provide an option to search insurance companies by price, reviews, or the products the companies provide.
- At step 120, the user searches for an insurance company and its insurance policies from the GUI of the mobile device using the mobile application. The user may view information regarding the policies and the insurance companies in the mobile application. The application may ask the user basic questions to help narrow the options. To receive further information and a personalized quote, the user sends information relevant to the user and the property to be insured to the insurance company at step 130. The mobile application and corresponding server system may assist in downloading and uploading the various documents needed to obtain a quote. A menu option may present the user with links to documents the user needs so that the documents may be downloaded to the mobile device. Another menu option may be presented to upload a document from the device memory. The menu options may be personalized for each insurance company from which the user is seeking insurance.
- A digital media asset, which may include an image, video, audio, map, satellite photo, or some combination thereof, may also be uploaded via the application. The user utilizes the intelligent digital media capture system 131 and the digital media certification system 132 when sending certified digital media to the insurance company to receive a quote. At step 131, the user may take one or more digital images or videos of the property to be insured to document and verify the identity, value, and quality of the property, both for the purpose of obtaining an insurance quote and for the underwriting department to evaluate the property in its original state. Sensor data, such as the GPS coordinates where the digital media was taken, may be retrieved along with the digital media asset at the time of capture. Metadata related to the digital media asset, such as the author, date and time of capture, description, media title, and names of individuals involved, may also be retrieved. The sensor data and metadata may be used to verify assertions made by the user. For example, for homeowners insurance, the GPS coordinates attached to the digital media asset support the user's assertion about where the house is located.
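The capture-time information set described above can be sketched as a small record. The field names below are hypothetical; the disclosure does not prescribe a particular schema.

```python
import hashlib
from datetime import datetime, timezone

def build_information_set(media_bytes: bytes, *, author: str,
                          latitude: float, longitude: float,
                          title: str = "", description: str = "") -> dict:
    """Bundle capture-time metadata and sensor data with a content digest
    so later verification can detect alteration of the media."""
    return {
        "digest": hashlib.sha256(media_bytes).hexdigest(),
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "author": author,
        "gps": {"lat": latitude, "lon": longitude},
        "title": title,
        "description": description,
    }

info = build_information_set(b"house-photo", author="J. Smith",
                             latitude=36.1699, longitude=-115.1398,
                             title="Front of insured house")
```

The digest ties the information set to the exact bytes of the captured media, which is what lets the server later confirm that neither has changed.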
- At step 132, the digital media asset may be certified by generating a hash digest of the digital media and encrypting the digital media and its hash value. A certified digital media asset guarantees that the digital media is unaltered and authentic. The encryption may include generating an asymmetric public key infrastructure (PKI) key pair, as further illustrated in FIG. 6. Hash values of the sensor data and the metadata of the digital media asset may similarly be generated and encrypted. The system may generate an information page based on the information associated with the digital media asset and store the information page at a network location. The server system may also generate an interactive glyph, such as a QR code, that encodes information pointing to the network location. The server system generates a modified digital media asset by overlaying or otherwise embedding the glyph onto the digital media asset, and outputs the modified digital media asset, for example by printing it or transmitting it to a display device. A requesting device scans the glyph from the modified digital media asset, for example using a camera, and decodes from the glyph the information pointing to the network location. The requesting device then requests the information page from the server system, and the server system serves the information page to the requesting device. In some cases, the requesting device transmits communications to the server system, for example to leave a comment regarding the digital media asset, to indicate approval or disapproval of the digital media asset, or to edit the digital media asset. The server system receives the communication and modifies the information page to reflect the comments, approvals, disapprovals, or edits.
- At step 140, the system utilizes AI object recognition 141 and third-party database 142 to verify the information and assess the property to assist in obtaining a quote for the user. An artificial intelligence (AI) processor 141 may be used to determine the identity, value, age, condition, type, and scope of the property. For example, the AI may recognize, from the shape, pixel information, and the curvature of the lines, that the digital image or video is of a blue car and that the car has dents and other imperfections.
- A third-party database 142 may be utilized in identifying and evaluating the property captured in the digital media. The system may search for and connect to an appropriate third-party database automatically, or an appraiser, underwriter, or adjuster may search for a third-party database manually. For example, the AI may recognize that the digital media is an image of a Honda vehicle from the "H" on the front grille emblem and search the certified Honda database for the model and year of the car. From the vehicle's license plate, the AI may also obtain history, ownership, and registration information from a database containing the history of the vehicle. From the make, model, year, history, and condition of the car, the AI may search a database of prices of cars in similar condition to determine the value of the car. In another example, the AI may recognize a digital media asset depicting a house and determine the address of the house and the date the image was taken from the sensor data and metadata of the digital media. The AI may then search the appropriate databases to obtain tax and improvement records to determine the value of the house. A human may assist the AI by specifying that the image is of a car so that the server system may search for an appropriate third-party database.
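The lookup flow above can be illustrated with a toy sketch. Everything here — the database contents, condition factors, and dollar figures — is invented for illustration; a real system would query live third-party databases rather than in-memory dictionaries.

```python
# Stand-in for a certified manufacturer/pricing database (hypothetical values).
MAKE_DATABASE = {
    ("honda", "civic", 2018): {"base_value": 15000},
}
# Stand-in condition adjustments an appraiser or model might apply.
CONDITION_FACTOR = {"excellent": 1.0, "good": 0.9, "dents": 0.75}

def estimate_value(labels: dict, condition: str) -> float:
    """Look up the recognized vehicle in the (hypothetical) database and
    adjust its base value for the observed condition."""
    record = MAKE_DATABASE[(labels["make"], labels["model"], labels["year"])]
    return record["base_value"] * CONDITION_FACTOR[condition]
```

In the patent's example, the recognition step would supply the `labels` (make, model, year) from the image; the database then anchors the valuation.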
- At step 150, the user accepts a policy and the system transfers insurance documents and payment for the insurance policy between the user and the insurance company. The mobile application and corresponding server system may facilitate payments between the user and the insurance company. For example, once the customer chooses a policy from an insurance company, the mobile device may transmit the payment information of the user to the insurance company by allowing the user to input the credit card information on the GUI of the mobile application. The server system allows the user to transmit electronic signature to bind the user. The mobile application and corresponding server system may also generate a certified insurance card once the policy is granted to the user. When the insurance company issues the insurance card to the user, the insurance card may be retrieved, downloaded, stored, and viewed from the mobile application. Using asymmetrical encryption and a hash generator similar to certifying a digital media asset above, the image of the insurance card may be certified such that the image is guaranteed to be authentic.
- When an event giving rise to the filing of an insurance claim occurs, the user may send the insurance claim form embedded with certified digital media at step 160. The mobile application and corresponding server system may assist the user in filing an insurance claim in response to a loss regarding the insured property. The mobile application allows the user to file an insurance claim by downloading the insurance claim form to the mobile device from the application, capturing certified digital media, and automatically populating the insurance document with that media. As when the user uploads digital media to obtain a quote, the system generates a certified image by receiving the digital media asset from the user's mobile device, generating hash values for the digital media and for the sensor data and metadata, encrypting the hash values, and overlaying a glyph on the digital media. The system may search for the certified digital media used at the time the user applied for the insurance to automatically attach that media to the insurance claim form. The insurance claim form containing certified media is then transferred to a secure server for the insurance claim adjuster to investigate the claim. The user may also send other documents supporting the insurance claim, such as estimates of financial loss, engineering reports, and police reports, from the mobile application.
- In assessing the validity of the insurance claim, an AI/ML object recognition system and third-party databases may be used to compare the certified digital media submitted at the time the insurance quote was obtained with the certified digital media submitted at the time the insurance claim was filed, as discussed with respect to FIG. 8. The insurance claim is reviewed by the insurance company at step 170. The insurance claim adjuster, or any other individuals relevant to assessing the insurance claim, may obtain access to the information that the user submitted, such as the insurance claim form and the certified digital media. The insurance claim adjuster or other relevant individuals may also upload certified digital media or other documents to the cloud, such as digital media of damage the adjuster personally inspected, video or audio of witness interviews, or other documents. The documents and the certified digital media may be viewable by all relevant individuals given access within the system. Furthermore, the user who filed the insurance claim may view the status of the claim via a claim status indicator in the mobile application. As documents and digital media are added and the insurance claim adjuster makes progress with the claim, the user can view and receive updates regarding the insurance claim.
- The mobile application and corresponding server system may utilize the sensor data and metadata of the digital image to verify the validity of the insurance claim. In a car accident claim, the system may verify the location of the accident from the GPS coordinates in the sensor data of the digital media. For an insured house, the system may determine whether the two sets of digital media, one from before and one from after the insurance claim was made, were taken at the same property based on the sensor data and metadata of the digital media.
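One way to implement the same-property check is to compare the GPS fixes recorded in the two information sets. The haversine formula and the 50-metre tolerance below are illustrative choices, not details from the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def same_property(gps_before, gps_after, tolerance_m=50.0):
    """True if the before and after captures lie within the tolerance."""
    return haversine_m(*gps_before, *gps_after) <= tolerance_m
```

A real deployment would tune the tolerance to GPS accuracy and parcel size, and could combine the GPS check with timestamp and heading metadata.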
- The mobile application and corresponding server system may utilize artificial intelligence (AI) and/or machine learning (ML) image recognition, object recognition, and/or feature recognition algorithms to compare the set of digital media assets obtained at the time of the quote with the set of digital media assets obtained at the time the insurance claim is made. For example, the recognition algorithms may determine the locations and extent of damage to the insured car by comparing the digital images of the car before and after the accident. The system may also utilize a third-party database to determine the validity of the insurance claim. For an insurance claim asserting flood damage to a house, the system may obtain the metadata and sensor data from the digital media of the house to determine the location and the date the digital image was taken. From this information, the AI may search a third-party database for weather data at the property's location to determine the validity of the claim that there was a flood. At any point, a human assessor may assist the recognition algorithms by narrowing the search fields, specifying search terms, or correcting the algorithms.
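The before/after comparison step can be illustrated with a toy grayscale diff. A real system would use trained recognition models on aligned, full-resolution images; this sketch only shows the idea of quantifying change between two captures.

```python
def changed_fraction(before, after, threshold=10):
    """Fraction of pixels whose grayscale value changed by more than the
    threshold between two same-sized images (given as lists of rows)."""
    changed = total = 0
    for row_b, row_a in zip(before, after):
        for pb, pa in zip(row_b, row_a):
            total += 1
            if abs(pb - pa) > threshold:
                changed += 1
    return changed / total

# A 2x2 toy "image": one corner darkens after the event (e.g., a dent).
before_img = [[100, 100], [100, 100]]
after_img = [[100, 100], [100, 30]]
```

The threshold absorbs minor exposure differences between the two capture sessions; regions exceeding it are candidates for closer damage assessment.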
- Once the insurance claim assessment is complete and the user is entitled to payment from the insurance company, the insurance claim funds may be transferred to the user's financial institution, or via other modes selectable by the user in the mobile application, at step 180. The system can utilize the bank account information or a third-party payment system to transfer the funds between parties. According to the user's preference, FedEx or other carriers may be utilized by the system instead of a wire transfer. The mobile system spans marketing, selling, underwriting, binding a policy, insurance claim loss notices, analytics, records, cost estimates, in-app and out-of-app communication records, and adjustment processes, through final payment of the insurance claim damage. The system incorporates capturing, processing, and certifying a digital media asset submitted for insurance purposes to reduce fraudulent insurance claims. An artificial intelligence object recognition system and third-party databases may be utilized to determine the identity, condition, and value of the property, and to verify the validity of the insurance claim. Data transferred between the parties in the system is transmitted to a secure server and is accessible by the parties involved to provide transparency in the insurance process.
- FIG. 2 illustrates a digital media capture system that combines a camera image with a sensor data set from a sensor system.
- The camera image is taken using a camera embedded as part of an image capture device with an intelligent image sensor 200 that includes a digital image sensor 210 as well as a number of sensors, such as one or more laser rangefinders, LIDAR sensors, inertial measurement units (IMUs), magnetometers, GPS receivers, GNSS receivers, accelerometers, gyroscopes, other sensors discussed herein, or some combination thereof. This device can be a camera such as a digital or analog point-and-shoot camera, a digital or analog single-lens reflex (SLR) camera, a digital or analog image-capturing telescope, a digital or analog image-capturing microscope, a commercial system used by a DMV, a badge system, or a digital or analog camcorder. Consumer electronic devices with imaging componentry may also be used. For example, the user device can be a portable consumer user device such as a smartphone, a tablet device, a laptop computer, a wearable device, a portable gaming console, or a portable media player device. Less mobile devices may also be used, such as desktop computers, television systems, gaming consoles, and various types of conveyances.
- The image capture system is an example of a digital media capture device. A digital media capture device may capture images as discussed with respect to the image capture system. Images captured by the digital media capture device may include still images and/or one or more frames from a video. The digital media capture device may capture audio via a microphone. The digital media capture system may also capture various sensor measurements from sensors that may be part of, or coupled to, the digital media capture device.
- The metadata captured by the media capture system can include, for example, the exact location in latitude, longitude, camera heading, and elevation of the image capture device. Captured information may further or alternatively include the roll of the image capture device, the pitch of the image capture device, the yaw of the image capture device, the velocity and/or direction of the image capture device, the viewing angle of the image capture device, the azimuth and/or compass bearing of the image capture device. Information concerning the horizon angles of the image capture device and the inclination and declination of the image capture device may likewise be collected. Such metadata can be attached to both images and video.
- The digital data organization process of the image capture system may be controlled manually by a user or automatically by computer hardware/software control processes or a mobile app. Media may be organized by photo, video, audio, location, position, image capture device, user, date, time, logged-in user, subscription user, or a number of other attributes of an image/video/audio/media file. Likewise, these media may be made searchable via these attributes in a network-based (including "cloud-based") storage system as well as a local storage system. In some embodiments, the captured image/video/media can also be stored and organized by facial recognition means and subsequently searched or made searchable by facial recognition applications.
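The attribute-based organization described above can be sketched with standard grouping. The asset records and attribute names here are hypothetical.

```python
from itertools import groupby

def organize(assets, attribute):
    """Group media assets by any capture attribute (type, date, location, ...).

    groupby requires sorted input, so the assets are sorted on the
    attribute first; the result maps each attribute value to its assets."""
    ordered = sorted(assets, key=lambda a: a[attribute])
    return {key: list(group)
            for key, group in groupby(ordered, key=lambda a: a[attribute])}

assets = [
    {"name": "roof.jpg", "type": "photo", "date": "2020-04-01"},
    {"name": "walkthrough.mp4", "type": "video", "date": "2020-04-01"},
    {"name": "garage.jpg", "type": "photo", "date": "2020-04-02"},
]
```

The same function serves every grouping the passage lists — by type, date, device, or user — simply by changing the `attribute` argument, which also makes the groups straightforward to index for search.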
- FIG. 3 illustrates exemplary functions and features of the image capture system.
- The media capture system 300 may include the three-dimensional intelligent image sensor 200 of FIG. 2, with internally embedded individual functions and/or features 300 providing navigation, gyroscopic, processor, and communication technology. These various functionalities, which may be implemented through software, hardware, or a combination of the two, collect incident, event, and item images with sensor data.
- As noted above, such information may include but is not limited to elevation, inclination, roll, pitch, yaw, and bearing information. Other types of information might include position, time, altitude, zoom/telescope information, distance to/from a pictured object, object measurements in an image or video, elevation angles, centimeter-accurate XYZ position, global positioning system (GPS) coordinates, GPS differential corrections, 3-axis positions of an object in the image, real-time kinematics (RTK) sensor data, real-time network (RTN) survey data, cell triangulation data, satellite map location data, street view map location data, map pointer data, map compass data, roll/pitch/yaw data, and world geodetic system (WGS) or world geodetic datum (WGD) data. This collected information may also include custom notes, text, or recordings, as well as custom image markers, logged data, object location attachments, or pointers to additional files.
FIG. 4 identifies exemplary media metadata and certification data. - The
media certification system 400 can be used to certify media 410 (e.g., images, videos, audio, or some combination thereof),metadata 420, and sensor data recorded by the image capture device. A certified media asset is one whosemedia data 410,metadata 420 corresponding to a time of capture of the media, and sensor data corresponding to a time of capture of the media have been ensured to be authentic and unaltered via certification (e.g., according to theprocess 600 ofFIG. 6 ). In some cases, the sensor data may be considered part of themetadata 420, part of themedia 410, or both. In some cases, themetadata 420 may be considered part of the sensor data, themedia data 410, or both. - The
media certification system 400 ofFIG. 4 can certify themedia asset 410 by creating a new data form that will travel with each capturedmedia asset 410. The system can culminate data into themedia data file 410 and organize it in a visual file that is part of themedia file 410. The data can be presented to a user through a traditional display device along with themedia 410 or can be viewed through a user control. The certification will be highlighted by certain colors and integrated with systematically collected data at time of capture and loaded into the appropriate location. A certified media asset can be signified as the media overlaid by a marker such as a barber pole, a “certified image” stamp, or a “certified by [image certifying authority]” stamp, a quick response (QR) code, a barcode, another marker or watermark discussed herein, or some combination thereof. The certified media ofFIG. 4 includes an example of certified marker, a barber pole logo and a “certified by ImageKeeper” stamp. - A user seeing such a marker can be confident that a media certifying authority has certified the authenticity of the underlying
digital media data 410,metadata 420, and/or sensor data. Themedia certification system 400 may be used to ensure thatdigital media data 410 is protected in an “original captured state.” That state can be certified by themedia capture system 400 and/or a third-party auditor system. That data can then be organized and/or secured (e.g., encrypted). The media certification system may store the media assets in network, cloud, or local area storage system like those shown inFIG. 5 . The storage system may require a subscription before a user is able to interact with the data. - Embodiments of the secure digital media processing and document generation techniques may be extended to other third-party applications or services included as an SDK or API. For example, a user may share/post, via social networks and social media sites or mobile applications (e.g., Twitter, Facebook, Instagram, Pinterest, Myspace), a certified image/video/audio/media, in either a secure (e.g., encrypted) or unsecure format. A user may also have the ability to send certified images/video/media to another device/user via a messaging system (e.g., SMS, Apple iMessage, Google Hangouts, SnapChat, email, or another communication method) in a secured (e.g., encrypted) or unsecured manner.
-
FIG. 5 illustrates a network architecture through which digital media may be captured, certified, uploaded to a server network, verified, synchronized, and distributed to clients. - A
mobile software application 515 is installed on amobile user device 510, such as a cell phone. Themobile application 515 may utilize the digital media capture system of themobile device 510 to capture the digital media asset, sensor data, and metadata of the digital media asset. The digital media asset may include one or more image(s), video(s), sensor data, or a combination thereof. The digitalmedia capture device 510 may be any type ofcomputing device 1400, such as a user mobile device, tablet, computer web camera, Alexa, Siri, a drone or UAV, a dash camera of a vehicle, a body camera, a surveillance camera, medical imaging systems, dental cameras, satellite phone/camera, a wearable device, a phone another type of device, or a combination thereof. The digital media certification system certifies the digital media captured by themobile device 510 via themobile insurance application 515. The digital media and/or sensor data may be categorized by the categorization system based on type of media, time, location, other metadata associated with capture of the digital media, or some combination thereof. The media captured by themedia capture device 510 may be automatically grouped and/or categorized into groups or categories by type, ID, time, address or GPS location, or other properties by themedia capture device 510 and/or by theserver system 540. The media captured by themedia capture device 510 may alternately or additionally be manually grouped and/or categorized into group and/or categories by the user using the GUI. Audio and video capture may be transcribed automatically or by user selection and transferred to thesecure cloud server 540. - The data is then autonomously sent to the internet/
cloud system 530, where the digital data is filed, stored, and accessed through the web in a systematic or serialized format consistent with image identification formed with the media capture device 510 (as seen on the right side of FIG. 5). Various information related to multiple users, such as a family or business, and their policies may be stored in user cloud storage associated with the server system 540, or in local storage associated with the mobile device 510. The user can access multiple policies purchased in the mobile application 515 and switch between policies. Past insurance claims, history of payments, certificates of renewal, warranties for home devices purchased, terms of the insurance, and past quotes may be stored in the cloud. - This data can be transferred over a wired or a wireless connection. Such connections may include cellular networks (e.g., 3G, 4G, LTE, or 5G), Wi-Fi, local area networks (LAN), wireless local area networks (WLAN), satellite networks, the Internet, any other wired and/or wireless network interfaces 520 discussed herein, or a combination thereof. In some embodiments, the
media capture device 510 can first synchronize its image and/or sensor data with a second device. For example, a camera device (e.g., a digital point-and-shoot camera) may first be required to synchronize its data with a user device such as a smartphone or wearable device, which can then form a connection to the internet/cloud system 530. - The internet/
cloud system 530 can include one or more server systems 540, which may be connected to each other. In one embodiment, this internet/cloud system 530 is a wireless multiplexed system for securely storing digital data to and from mobile digital devices. In another embodiment, the digital data (e.g., images, reports) is securely held in one central place, either in a hardware memory device, a server, or a data center. In another embodiment, the digital data is stored in a distributed ledger such as a blockchain. The authenticated certified media, its sensor data and metadata, as well as their encrypted hash values may be manually or automatically transferred to a blockchain system to ensure that the media and the related data are unaltered even if multiple parties have access to the media. The secure servers 540 may receive the public key transmitted by the certification system and use the public key to verify that the certified digital media data is authentic and unaltered. - Once the data is in the internet/
cloud system 530, the data may be accessible through a web portal. This web portal may include image-editing tools, worldwide access, and collaboration mechanisms available to other relevant individuals. Security measures such as digital signatures, watermarking, encryption, physical access controls, and password credentials can be utilized throughout the system. Original digital data can be confirmed, saved, and protected through various technologies and system controls. The user of the mobile application 515 who seeks to be insured may collaborate remotely in real time with an appraiser, underwriter, adjuster, or any other relevant individual involved in the insurance process using a web portal. The web portal may be a website hosted at the cloud server system 540 or hosted elsewhere on a host server or directly in a data center. The web portal may provide access to media assets to third parties who are involved in the insurance and claim processes. For example, the user may hold and aim the camera toward the property to be insured, and the appraiser, underwriter, or adjuster may operate the portal by clicking to capture the digital media asset remotely. In another example, the web portal may assist in recording a statement from the user or other relevant individuals, such as a witness, in a similar manner. The web portal may further provide collaboration between the relevant individuals using different mobile devices 510 with the mobile application 515 installed on the devices 510 by facilitating streaming video or web meeting communication between the devices 510. Those individuals may simultaneously share images, video, audio, or any other digital media asset via the mobile application 515 or the web portal. The certified media asset collaboration will be recorded and logged in the cloud system with its metadata such that the record can be recalled in the future. This creates a chain of custody from the instant the media is captured and assures the media is authentic and certified.
- An object recognition system may be utilized by the mobile insurance system to evaluate digital media obtained from the digital media capture system. In some embodiments, a third-party database containing information about the insured property or other similar properties may be utilized in order to evaluate the condition, extent of damage, and value of the property.
-
FIG. 6 is a flow diagram illustrating techniques for media and metadata certification. The process 600 of FIG. 6 may be performed by the media capture device 510, the internet/cloud system 530, or some combination thereof. - At step 605, a media asset is captured by a sensor of a digital
media capture device 510, optionally along with its metadata. The metadata may include, for example, latitude and longitude coordinates from a GNSS receiver or other positioning receiver, an identification of the media capture device 510, a timestamp identifying date and time of capture, an altitude at capture, a heading at capture, an inclination at capture, a yaw at capture, a roll at capture, a pitch at capture, a watermark, an annotation, any other data that might be found in image EXIF metadata, elevation or altitude, velocity at capture, path, speed, direction, distance, weather conditions, barometer reading and change, dew point, humidity, sun angle, temperature, compass heading, media certification status, annotation certification status, incident note certification status, incident report certification status, event number, time, date, time zone, title, media type (IR, multi-spectrum, lidar, UV, 2-dimensionality, 3-dimensionality), wind speed, wind direction, radar data, cloud coverage, visibility, flood data, any other metadata discussed herein, or combinations thereof. - At
step 610, an asymmetric public key infrastructure (PKI) key pair—with a private key and a corresponding public key—is generated by the media capture device 510 of step 605 or by the server 540. In some cases, the key pair may be an RSA key pair. Other types of asymmetric keys may be used. - At
step 615, a digital signature is computed by generating a hash digest—optionally using a secure hash algorithm such as SHA-0, SHA-1, SHA-2, or SHA-3—of the captured media, and optionally of the metadata as well. At step 620, the digital signature is encrypted with the private key. The media and/or metadata may also be encrypted using the private key. The private key is optionally destroyed at step 625, or may simply never be written to non-volatile memory in the first place. - At
step 630, the public key is published, either by sending it to the servers 540, to an authentication server such as a certificate authority, or by otherwise sending it for publication in another publicly accessible and trusted network location. At step 635, the authenticity of the media and metadata may be verified by decrypting the encrypted digital signature using the public key, before or after publication at step 630, and checking whether or not the hash digest stored as part of the decrypted digital signature matches a newly generated hash digest of the media. If the new hash matches the hash decrypted using the public key, then verification is successful, and the media asset has not been modified since capture (or at least since certification). If the new hash does not match the hash decrypted using the public key, then verification is unsuccessful, and the media asset has been modified since capture (or at least since certification). The same can be done using the metadata if a hash digest of the metadata is included in the digital signature. The verification of the authenticity of the media and metadata at step 635 may also include decrypting the media asset and/or the metadata itself, if either or both were encrypted at step 620. This verification may occur at the digital media capture device 510—though it may instead or additionally be performed at the server 540, for example before the server indexes the media as part of a cloud storage system accessible by client devices. - Once the authentication of step 635 succeeds, a certified media dataset is generated by bundling the media, metadata, and the encrypted digital signature, for example in a zip file or other compressed archive file. The public key may also be bundled with them, though additional security may be provided by publishing it elsewhere to a trusted authentication server. At
step 645, the certified media dataset (and optionally the public key) is transmitted to a secondary device, such as a server 540 or a viewer device (i.e., a client device). - In some cases, additional data besides the media asset and associated metadata may also be certified, either separately from the media asset or together with the certification of the media asset. If the additional data is certified together with the media asset, the hashes and digital signatures at
step 615 may be hashes of the media asset as well as the additional data, thereby certifying the media asset along with the additional data. If the additional data is certified separately from the media asset, the entire process 600 may be repeated, with the additional data treated as a media asset. Additional data may include alterations or annotations to a media asset, or at least a subset of a report that is generated based on the media asset, or at least a subset of a report that is generated to include the media asset. Metadata corresponding to the additional data may in some cases identify one or more author(s) of the additional data and/or one or more devices on which the additional data was generated and/or certified, and/or from which the additional data was submitted to the server(s) 540. In some cases, a certain media asset can be associated with multiple additional data items, such as multiple notes, annotations, and/or reports by different authors, the same authors, or some combination thereof. - In other words, the
operations 600 of FIG. 6 illustrate data integrity precautions that can be taken. For example, all data (e.g., media asset and/or additional data and/or metadata) can, in some embodiments, be secured in a local database with a globally unique identifier to ensure its integrity. The asset's security and integrity can be ensured via a Digital Signature that is made up of a Secure Hash Algorithm (SHA) digest, the time that the asset was captured, and the device 510 of origin. This allows the mobile app or server to detect changes due to storage or transmission errors as well as any attempt to manipulate or change the content of the asset. The Digital Signature can be encrypted with a public/private key pair that is generated uniquely for that asset by the media capture device 510. The private key can be destroyed by the media capture device 510 and/or never written to a disk or stored in a memory of the media capture device 510 or any other device; as such, this ensures that the asset cannot be re-signed and cannot be changed without those changes being detectable. - More specifically, media asset data, such as images, video, audio, 3D distance measurements, or other sensor data, are captured by a camera, microphone, and/or other sensors integrated with the digital
media capture device 510 and/or sensors connected to the digital media capture device 510 in a wired or wireless manner. The digital media capture device 510 also generates and/or extracts metadata (e.g., EXIF metadata) corresponding to this captured media asset, for example identifying the digital media capture device 510, a timestamp of capture, a date of capture, an author or owner of the digital media capture device 510, and any other metadata. A digital signature is generated by generating a hash of both the captured media and at least some of this metadata. For example, the digital signature may be a hash of the captured media, the timestamp, and an identifier of the digital media capture device 510 that captured the media. The hash may be computed using a secure hash algorithm (SHA), such as SHA-0, SHA-1, SHA-2, or SHA-3. The digital media capture device 510 and/or a second device that receives the media asset from the digital media capture device 510 may then generate a public and private key pair using a public key infrastructure (PKI), where the keys may be, for example, RSA 1024 or 2048 bit keys. The private key is used to encrypt the digital signature, and may then be deleted, erased, and/or destroyed, in some cases via overwriting for more security. The certified media asset—meaning the media asset, the encrypted digital signature, and the (optionally encrypted) metadata—is uploaded to the cloud servers, in some cases along with the public key, optionally securely via HTTPS or another secure network transfer protocol. The public key may be uploaded to the same cloud server(s) or to a different system, such as a certificate authority (CA) server. The media asset and its metadata are now certified.
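As a compact illustration of this sign-then-verify flow, the following sketch hashes the media plus metadata with SHA-256 and "encrypts" the digest with the private half of a deliberately tiny textbook RSA key pair. The small primes are for illustration only; as the text notes, a real deployment would use 1024/2048-bit keys generated by a vetted cryptography library.

```python
import hashlib

# Toy RSA parameters (tiny primes, illustration only -- never use in production).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (modular inverse)

def sign(media: bytes, metadata: bytes) -> list:
    """Hash the media and metadata, then 'encrypt' the digest byte-by-byte
    with the private key, producing the encrypted digital signature."""
    digest = hashlib.sha256(media + metadata).digest()
    return [pow(b, d, n) for b in digest]

def verify(media: bytes, metadata: bytes, signature: list) -> bool:
    """'Decrypt' the signature with the public key and compare the recovered
    digest against a freshly computed hash of the media and metadata."""
    recovered = bytes(pow(s, e, n) for s in signature)
    return recovered == hashlib.sha256(media + metadata).digest()
```

After signing, the private exponent `d` would be discarded, so the asset cannot be re-signed; any later change to the media or metadata makes `verify` return False.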
Any server or client can retrieve the public key from the cloud server system 540 or CA server and decrypt the encrypted digital signature to verify that it matches a new hash generated using the media asset and/or metadata at a later time, thereby verifying that the media asset and metadata have not been changed since certification. The same certification process may be used for additional data based on the media asset, such as annotations, notes, and reports. In some cases, such a verification check is performed at the media capture device 510 or second device before the media asset, metadata, encrypted digital signature, and public key are sent by the media capture device 510 or second device to the server(s). In some cases, such a verification check is performed at the server after receipt of the certified media asset. - Metadata may include, for example, time, location, media capture orientation, media size, resolution, frame size, elevation, centimeter 3D GPS position, digital media capture device speed, heading, or some combination thereof.
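The "certified media dataset" bundling described at step 645 (media, metadata, and encrypted digital signature packaged in one compressed archive) can be sketched with the standard library; the archive entry names here are illustrative, not specified by the text:

```python
import io
import json
import zipfile

def bundle_certified_asset(media: bytes, metadata: dict,
                           encrypted_signature: bytes) -> bytes:
    """Bundle the media asset, its metadata, and the encrypted digital
    signature into a single zip archive, returned as bytes for upload."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as archive:
        archive.writestr("asset.bin", media)
        archive.writestr("metadata.json", json.dumps(metadata))
        archive.writestr("signature.bin", encrypted_signature)
    return buf.getvalue()
```

A receiving server 540 or viewer device would unzip the archive, retrieve the public key, and re-run the hash comparison before indexing the asset.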
-
FIG. 7 illustrates an exemplary web portal display for viewing a document with an integrated certified image. - The web portal may be used to view user submitted forms to receive a quote, insurance claim document, or any other document submitted via the
mobile insurance application 515. The document—here, an insurance claim document 710—may be a digital or physical document. The document can include certified digital media, a description of the digital media, or the insurance claim. This description of the digital media can include information such as an identification and/or contact information of the user who took the certified photo using the image capture device 510 and/or who wrote the description of the digital media or event, a location of the property or event, and a brief textual or graphical description of the property or event. The digital data organization process of the incident reporting system may be controlled manually by a user or automatically by computer hardware/software control processes. The web portal may be accessed through the mobile software application 515 as well as through a network page, such as a webpage accessible through the Internet. -
FIG. 8 illustrates a document including two certified digital media assets corresponding to different capture times. - The document of
FIG. 8 is an insurance claim document, and includes two certified images—a first certified image 805 and a second certified image 810—that are certified via the process 600 of FIG. 6. The first certified image 805 was captured on Jan. 6, 2019, as identified in the document and as determined from certified metadata of the first certified image 805. The second certified image 810 was captured on Mar. 20, 2020 (after the capture of the first certified image), as identified in the document and as determined from certified metadata of the second certified image 810. - The first
certified image 805 is captured before the second certified image 810 is captured, and depicts an object—a building—that is undamaged. The second certified image 810 is captured after the first certified image 805 is captured, and depicts the same object as the first certified image 805 does—a building—that is damaged. In some cases, the first certified image 805 and the second certified image 810 may be captured by the same media capture device 510. The media capture device 510 may be, for example, a mobile device 510 of a user associated with an insurance policy associated with the object—the building—depicted in the first certified image 805 and in the second certified image 810. The media capture device 510 may alternately be associated with an insurance company or agent associated with the insurance policy associated with the object. In some cases, the first certified image 805 and the second certified image 810 may be captured by different media capture devices 510. - In some cases, the first
certified image 805 may be an image that is captured before the insurance policy is generated, selected, identified, activated, put into effect, used for an insurance claim, or some combination thereof. For example, the user may capture one or more images of an object—here, the building—certify the one or more images (which now include the first certified image 805), and send the one or more certified images to an insurance company so that the insurance company can identify the condition that the object is in at the time of capture of the first certified image 805. The time of capture of the first certified image 805 may be before the insurance policy associated with the object is generated, selected, identified, activated, put into effect, used for an insurance claim, or some combination thereof. - In some cases, the second
certified image 810 may be an image that is captured after the insurance policy is generated, selected, identified, activated, put into effect, or some combination thereof. In some cases, the second certified image 810 is specifically captured in order to file an insurance claim, such as when the object that is depicted in the first certified image 805 and that is now associated with an insurance policy—here, the building—has incurred some damage. In FIG. 8, the building is illustrated as including damage—a crack—as depicted in the second certified image 810, where the crack is missing in the first certified image 805. For example, once the object incurs damage that the user wishes to file an insurance claim over, the user may capture one or more new images of the object that is depicted in the first certified image 805, certify the one or more new images (which now include the second certified image 810), and send the one or more new certified images to an insurance company so that the insurance company can identify the condition that the object is in at the time of capture of the second certified image 810. The time of capture of the second certified image 810 may be after the insurance policy associated with the object is generated, selected, identified, activated, put into effect, or some combination thereof, and may be after damage has been incurred to the object over which a user wishes to file an insurance claim associated with the insurance policy. The object may be depicted in both the first certified image 805 and the second certified image 810. The damage incurred by the object may be depicted in the second certified image 810, but not the first certified image 805. - In reviewing an insurance claim, an insurance company might typically wish to verify that the insurance claim is not fraudulent. Here, the
server system 540 may automatically verify that the insurance claim is not fraudulent by automatically verifying a number of properties of the first certified image 805, the second certified image 810, metadata associated with capture of the first certified image 805, metadata associated with capture of the second certified image 810, sensor data associated with capture of the first certified image 805 and captured by sensors of the media capture device 510 that captured the first certified image 805, sensor data associated with capture of the second certified image 810 and captured by sensors of the media capture device 510 that captured the second certified image 810, or some combination thereof. - For instance, the
server system 540 may use an image recognition algorithm to identify that the object associated with the insurance policy is the object depicted in both the first certified image 805 and the second certified image 810. To do so, the image recognition algorithm may identify features of the object as depicted in the first certified image 805, such as various corners, edges, and colors of the building, and identify that the features match features found in the second certified image 810. The image recognition algorithm may retrieve another image of the type of object insured from a database or other storage repository, for example by querying an image of a building or of the specific building, identifying features of the object as depicted in the image from the search, and then identifying that the features match features found in the first certified image 805 and the second certified image 810. This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from capturing images of two different objects in the two images and trying to pass them off as the same object. - The
server system 540 may use the image recognition algorithm to verify that the damage incurred by the object that is depicted in the second certified image 810—here, a crack in the building—is not depicted in the first certified image 805. This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from trying to file an insurance claim on damage that was already present in the object when the user was filing the insurance policy. - The
server system 540 may use the image recognition algorithm to verify that the damage incurred by the object that is depicted in the second certified image 810—here, a crack in the building—is not depicted in another certified image that has been received by the insurance company and/or that may be associated with a previous and/or different insurance claim. This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from trying to file multiple insurance claims on the same damage. - The
server system 540 may also verify, based on certified metadata associated with the first certified image 805 and the second certified image 810, that the location at which the first certified image 805 was captured matches the location at which the second certified image 810 was captured, at least within a predetermined radius or distance to accommodate slight variations in media capture vantage point and variations in positioning functionality. If the insured object is a building or other stationary object, the locations of capture of these images should be the same, at least within a margin of error of a predetermined range or distance, which may for instance be between 0.1 miles and 10 miles, or between 0.1 kilometers and 10 kilometers. The server system 540 may also retrieve a location of the object from a database or other storage repository, such as one associated with a land title deed, and verify that the retrieved location of the object matches the locations of capture of the first certified image 805 and the second certified image 810, at least within a margin of error of the predetermined range or distance. This verification process prevents fraudulent activity in an automated technological manner in that it prevents a malicious user from capturing images of two different objects in the two images if the objects are in different places, even if they look identical, and trying to pass the images off as the same object. - The
server system 540 may also verify, based on certified metadata associated with the first certified image 805 and the second certified image 810, that the time of capture of the first certified image 805 is before the time of capture of the second certified image 810. In other words, the server system 540 may also verify that the time of capture of the second certified image 810 is after the time of capture of the first certified image 805. In some cases, the server system 540 may verify that the time of capture of the first certified image 805 is also before one or more times and/or dates associated with the insurance policy on the object being generated, selected, identified, activated, put into effect, used for an insurance claim, or some combination thereof. In some cases, the server system 540 may verify that the time of capture of the second certified image 810 is also after one or more times and/or dates associated with the insurance policy on the object being generated, selected, identified, activated, put into effect, or some combination thereof. This verification process prevents fraudulent activity in an automated technological manner in that it ensures that images are captured when the user states that they are. - The
server system 540 may also verify, based on certified sensor data associated with capture of the first certified image 805 and certified sensor data associated with capture of the second certified image 810, that the certified sensor data associated with capture of the first certified image 805 matches the certified sensor data associated with capture of the second certified image 810. For example, the sensor data associated with both certified images may include data captured by a RADAR sensor, a LIDAR sensor, an infrared sensor, a microphone, or some combination thereof. This verification process prevents fraudulent activity in an automated technological manner in that it ensures that the same object is depicted in both the first certified image 805 and the second certified image 810 based on additional sensor data that could catch discrepancies that might not be caught through the images or other metadata alone. - The first
certified image 805 may be received by the server system 540 (or a separate server system 540) and stored there at or shortly after capture; that is, before the insurance policy on the object is used for an insurance claim, and before, during, or shortly after the insurance policy on the object is generated, selected, identified, activated, put into effect, or some combination thereof. Thus, FIG. 8 illustrates the first certified image 805 as coming from a server system 820. The second certified image 810 may be received after receipt of the first certified image 805, and therefore may be received from the media capture device 510, such as the mobile device 830 illustrated in FIG. 8. However, in some cases, the first certified image 805 may instead be received from the media capture device 830. In some cases, the second certified image 810 may instead be received from the server system 820. - While the document of
FIG. 8 includes two certified images 805 and 810, other documents may include more or fewer certified images. -
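The location and capture-time consistency checks described above can be sketched as follows. The haversine formula gives the great-circle distance between the two certified capture locations, compared against the predetermined margin (here up to 10 km); the threshold and ISO-format timestamps are illustrative choices, not specified by the text:

```python
import math
from datetime import datetime

def within_capture_radius(lat1: float, lon1: float,
                          lat2: float, lon2: float,
                          max_km: float = 10.0) -> bool:
    """Return True if the two capture locations fall within max_km of each
    other, using the haversine great-circle distance (Earth radius 6371 km)."""
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dlam / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a)) <= max_km

def captured_in_order(first_iso: str, second_iso: str) -> bool:
    """Verify the first image's certified capture time precedes the second's."""
    return datetime.fromisoformat(first_iso) < datetime.fromisoformat(second_iso)
```

A failed check on either function would flag the claim for review rather than allowing the document to be generated.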
FIG. 9 illustrates operations for secure digital media processing. The process 900 of FIG. 9 is performed by the secure server system 540 and/or another portion of the internet/cloud system 530 of FIG. 5. - At
step 905, the server system 540 receives a first digital media asset and a first information set associated with the first digital media asset. At step 910, the server system 540 receives a second digital media asset and a second information set associated with the second digital media asset. - At
step 915, the server system 540 verifies that the first digital media asset and the first information set are unaltered. At step 920, the server system 540 verifies that the second digital media asset and the second information set are unaltered. - At
step 925, the server system 540 verifies that a first time of capture of the first digital media asset, as identified in the first information set, is before a second time of capture of the second digital media asset, as identified in the second information set. - At
step 930, the server system 540 generates a document that includes the first digital media asset and the second digital media asset in response to verifying that the first time of capture is before the second time of capture and verifying that the first digital media asset, the first information set, the second digital media asset, and the second information set are unaltered. - In some cases, the document produced in
step 930 may be an insurance claim document such as the insurance claim document of FIG. 8. The verification steps 915-925 may be performed as discussed, or substituted by any of the verification processes discussed with respect to FIG. 8. -
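Under the illustrative assumption that each certified asset arrives as a record carrying its data, the SHA-256 digest recorded at certification, and its certified capture time (this dict layout is a sketch, not the patent's format), steps 905-930 of process 900 can be outlined as:

```python
import hashlib
from datetime import datetime

def verify_unaltered(asset: dict) -> bool:
    """Steps 915/920: recompute the asset hash and compare it against the
    digest recorded at certification."""
    return hashlib.sha256(asset["data"]).hexdigest() == asset["sha256"]

def generate_claim_document(first: dict, second: dict) -> dict:
    """Steps 905-930: verify both received assets are unaltered, confirm the
    first capture time precedes the second, then assemble a document that
    includes both digital media assets."""
    if not (verify_unaltered(first) and verify_unaltered(second)):
        raise ValueError("digital media asset or information set altered")
    first_time = datetime.fromisoformat(first["captured_at"])
    second_time = datetime.fromisoformat(second["captured_at"])
    if not first_time < second_time:
        raise ValueError("first capture time must precede second capture time")
    return {"type": "insurance_claim", "assets": [first, second]}
```

The document is only generated when every check passes; any altered asset or out-of-order timestamp aborts the process, mirroring the conditional at step 930.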
FIG. 10 illustrates an exemplary certified image within an exemplary image viewer application. - The exemplary
image viewer application 1010 displays an image 1020 and shows a "certified image" stamp or "certified by [image certifying authority]" stamp graphic 1050 in the upper left corner of the image 1020, signifying that the image 1020 is a certified image and therefore is certified as authentic and unaltered. The exemplary image viewer 1010 also shows some image metadata along the bottom 1040 of the image 1020 and additional metadata in a metadata viewing box 1030 on the right-hand side of the image 1020. This metadata is also certified as part of the certification process, and may in some cases also be marked with a "certified image" stamp or "certified by [image certifying authority]" stamp graphic itself to signify that it, too, is certified as authentic and unaltered. -
FIG. 11 illustrates capture of an image at a media capture device and transfer of the image upon its certification to a computing device at which the certified image is viewed. - In particular, a
media capture device 1110 captures an image 1150 and certifies the image 1150 upon capture. The certified image 1150 is transferred, optionally through the server(s) 540, to a viewer device 1120, which views the certified image 1150 in a viewer application 1130. The viewer application 1130 may display meters corresponding to image data security levels, for instance identifying a degree of confidence in the authenticity and unaltered nature of the certified image 1150. -
FIG. 12 is a flow diagram illustrating operations for capturing media, certifying the media, securely transferring the certified media, securely storing the certified media, and securely broadcasting the certified media. - A
digital media asset 1205 captured by a digital media capture device 510, such as an image captured by a camera, is received by the media processing system 1210, which may in some cases be the media capture device 510, the internet/cloud system 530, or some combination thereof. The digital media asset 1205 may be in the form of image data, audio data, video data, RADAR data, LIDAR data, SONAR data, SODAR data, sensor data from one or more other sensors, or some combination thereof. The media processing system may also receive metadata related to the digital media asset 1205. The metadata may include location, orientation, navigation attributes, orientation of the media capture device 510, time of capture, date of capture, photographer, identity of the digital media capture device 510, owner of the digital media capture device 510, sensor data from one or more other sensors coupled to the digital media capture device 510, or some combination thereof. - When the
digital media asset 1205 and its metadata are captured by the digital media capture device 510, the digital media asset 1205 and its metadata are automatically certified instantly at (or within a short time period after) the press of the camera's shutter button (as in FIG. 6) and transmitted by the digital media capture device 510 to the internet/cloud system 530, where the digital data is filed, stored, and accessed through the web in a systematic or serialized format. The system generates an information page based on the information associated with the digital media asset on a network location. The internet/cloud system 530 can include one or more server systems 540, which may be connected to each other. In some cases, this internet/cloud system 530 is a wireless multiplexed system for securely storing digital data to and from multiple mobile digital devices 510. In some cases, the digital data (e.g., images, reports) are securely held in one central place, either by a hardware memory device, server, or a data center. - The
digital media asset 1205 is embedded with a glyph that encodes information pointing to the internet/cloud system 530. This glyph can be a quick response (QR) code, a barcode, an Aztec code, a brand logo, a URL, or another identifier. The information pointing to the internet/cloud system can encode or otherwise include a URL. The system then generates a modified digital media asset by overlaying the glyph on or over the digital media asset 1205. - Once the digital data is in the internet or cloud system, the
digital media asset 1205 may be protected through various technologies and system controls. Security measures such as digital signatures, watermarking, encryption, physical access controls, and password credentials can be utilized throughout the system. Original digital data can be confirmed, saved, and protected through various technologies and system controls. The system may generate a hash digest of the digital media asset and a unique key pair corresponding to the digital media asset. The unique key pair consists of a private key and a public key, in which the private key is used to encrypt both the hash digest and the metadata. The public key may be transmitted to a network server. The information pointing to the network location may require a passcode to access the digital data. - A content provider (e.g., the
servers 540 or a device that retrieves the modified digital media dataset and/or information page from the media capture device 510) may broadcast the information regarding the digital media by displaying the media, as well as the glyph associated with the media, during its broadcast on a display device. The content provider may be a television network, radio channel, newspaper, or magazine publisher. Viewers of the broadcast may request to access the digital media and its associated information by scanning the glyph from the broadcast with the viewer's device. This data can be transferred over a wired or a wireless connection or printed on a photo, legal document, newspaper, magazine, or flyer. When requested to provide the digital media data, the system may generate a decrypted signature by decrypting the digital signature using the public key, generate a hash based on the digital media asset, and verify that the hash matches the decrypted signature. - In one embodiment, when a viewer of the broadcast requests this information, the viewer may be shown the information page corresponding to the digital media asset. In another embodiment, the viewer may be prompted to provide a passcode to access the information page containing the digital media asset and its metadata. The information page may include a certification that the digital media asset is unaltered based on the hash matching the decrypted signature.
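The certify-and-verify round trip described in the two paragraphs above can be sketched in a few lines of Python. The patent describes an asymmetric key pair in which the private key encrypts the hash digest and the public key decrypts it for verification; because the Python standard library has no asymmetric-signature primitives, this sketch substitutes a keyed HMAC as a symmetric stand-in for that signature. The key, content, and metadata values are illustrative, not part of the disclosure:

```python
import hashlib
import hmac
import json

def hash_digest(content: bytes, metadata: dict) -> bytes:
    # Hash the media bytes together with a canonical (sorted-key) JSON
    # encoding of the metadata, so altering either one changes the digest.
    canonical = json.dumps(metadata, sort_keys=True).encode("utf-8")
    return hashlib.sha256(content + canonical).digest()

def certify(content: bytes, metadata: dict, key: bytes) -> bytes:
    # HMAC stand-in for encrypting the hash digest with the private key.
    return hmac.new(key, hash_digest(content, metadata), hashlib.sha256).digest()

def verify(content: bytes, metadata: dict, signature: bytes, key: bytes) -> bool:
    # Recompute the digest from the received asset and compare it against
    # the signature, mirroring the decrypt-and-compare step described above.
    expected = certify(content, metadata, key)
    return hmac.compare_digest(expected, signature)

key = b"certifying-authority-secret"            # hypothetical signing key
content = b"\xff\xd8\xff\xe0 example jpeg bytes"  # illustrative media bytes
metadata = {"device_id": "CAM-0042", "captured_at": "2022-10-07T12:00:00Z"}

signature = certify(content, metadata, key)
assert verify(content, metadata, signature, key)             # unaltered: passes
assert not verify(content + b"x", metadata, signature, key)  # altered: fails
```

The same comparison implements the information-page certification: the page can report the asset as unaltered only when the recomputed hash matches the stored signature.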
-
FIG. 13 illustrates a modified version of a digital media asset with an overlaid glyph. The server system 540 generates the glyph 1320 to encode a URL and/or other information identifying a network location of an information page. The server system 540 generates the modified digital media asset by generating a glyph 1320 and overlaying the glyph 1320 on the digital media asset 1310, or by otherwise embedding the glyph 1320 into the digital media asset 1310. The certified image 1310 may also be modified by overlaying certain metadata 1330 over the image 1310. - The
glyph 1320 is located on the bottom right of the certified image 1310 as a quick response (QR) Code 1320. A requesting device may interact with this image by scanning the QR Code 1320 with a camera or other scanner of the requesting device, such as a smart phone or tablet with a QR Code scanner. The requesting device may then decode the glyph to obtain the network location. The requesting device may go to the network location, for example in a browser, and request the information page from the servers 540. The servers 540 then provide the information page or media to the requesting device. - In some cases, the modified
digital media asset 1310 may alternately or additionally include hyperlinked text or a hyperlinked region of the digital media asset, such that a user viewing the modified digital media asset on a device can click or tap the hyperlinked text or region to go to the network location (e.g., URL) of the information page. In some cases, the QR code may be the hyperlinked region of the modified digital media asset. - The viewer using a digital device with a QR code scanner can determine how much access the content provider may be given to the viewer's device when providing the digital media. That is, a viewer or administrator may set permissions so that only certain requesting devices, for example devices that have a passcode, are able to access the information page.
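The glyph-scanning flow above (encode a network location into the glyph, decode it on the requesting device, then gate the information page behind an optional passcode) can be sketched as follows. The host name, query parameters, and helper names are hypothetical, chosen only for illustration; a real deployment would pair this with an actual QR encoder and web server:

```python
from typing import Optional
from urllib.parse import urlencode, urlparse, parse_qs
import hashlib
import hmac

BASE = "https://certs.example.com/info"  # hypothetical certifying-authority URL

def glyph_payload(asset_id: str) -> str:
    # The text a QR generator would encode into the glyph: the network
    # location of the asset's information page.
    return f"{BASE}?{urlencode({'id': asset_id})}"

def decode_glyph(payload: str) -> str:
    # What the requesting device does after its scanner extracts the text
    # from the QR code: recover the asset id from the URL.
    return parse_qs(urlparse(payload).query)["id"][0]

# Per-asset passcode hashes, so the server stores no passcode in the clear.
_passcodes: dict = {}

def set_passcode(asset_id: str, passcode: str) -> None:
    _passcodes[asset_id] = hashlib.sha256(passcode.encode()).digest()

def may_view(asset_id: str, passcode: Optional[str]) -> bool:
    required = _passcodes.get(asset_id)
    if required is None:
        return True   # no passcode set: information page is open
    if passcode is None:
        return False  # passcode required but none supplied
    return hmac.compare_digest(hashlib.sha256(passcode.encode()).digest(), required)

url = glyph_payload("IMG-1310")          # string encoded into the QR glyph
assert decode_glyph(url) == "IMG-1310"   # requesting device recovers the id
```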
- The secure digital media processing and document generation techniques discussed herein may be implemented in the general context of computer-executable instructions via software located on and running between remotely operated user devices (e.g., a smartphone, tablet, or other electronic device) with a computerized operating system. There may be multiple user devices interacting with a web portal, which may be local or may be geographically separated. The user devices may be remotely addressable with a secure login (with or without a password, biometric fingerprint, voice, or retinal scan) or an encrypted log-on to a web portal providing global internet access via fiber, radio frequency, satellite, or data link, with a bi-directional connection to a cloud system and/or data center or location under user command.
- The web portal and/or viewer device mobile application can include annotation tools, allowing for creation of charts and graphics with color annotation, incorporating picture-in-picture, with image numbers associated with each image brought into the web portal workspace. In addition to annotation tools, an interface to external data inputs, such as weather, news, Internet sites, and other sensor data, can be integrated into the workspace and into the image work areas.
-
FIG. 14 illustrates an exemplary computing system 1400 that may be used to implement some aspects of the technology. For example, any of the computing devices, computing systems, network devices, network systems, servers, and/or arrangements of circuitry described herein may include at least one computing system 1400, or may include at least one component of the computer system 1400 identified in FIG. 14. The computing system 1400 of FIG. 14 includes one or more processors 1410 and memory 1420. Each of the processor(s) 1410 may refer to one or more processors, controllers, microcontrollers, central processing units (CPUs), graphics processing units (GPUs), arithmetic logic units (ALUs), accelerated processing units (APUs), digital signal processors (DSPs), application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or combinations thereof. Each of the processor(s) 1410 may include one or more cores, either integrated onto a single chip or spread across multiple chips connected or coupled together. Memory 1420 stores, in part, instructions and data for execution by processor 1410. Memory 1420 can store the executable code when in operation. The system 1400 of FIG. 14 further includes a mass storage device 1430, portable storage medium drive(s) 1440, output devices 1450, user input devices 1460, a graphics display 1470, and peripheral devices 1480. - The components shown in
FIG. 14 are depicted as being connected via a single bus 1490. However, the components may be connected through one or more data transport means. For example, processor unit 1410 and memory 1420 may be connected via a local microprocessor bus, and the mass storage device 1430, peripheral device(s) 1480, portable storage device 1440, and display system 1470 may be connected via one or more input/output (I/O) buses. -
Mass storage device 1430, which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 1410. Mass storage device 1430 can store the system software for implementing some aspects of the subject technology for purposes of loading that software into memory 1420. -
Portable storage device 1440 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disc, or digital video disc, to input and output data and code to and from the computer system 1400 of FIG. 14. The system software for implementing aspects of the subject technology may be stored on such a portable medium and input to the computer system 1400 via the portable storage device 1440. - The
memory 1420, mass storage device 1430, or portable storage 1440 may in some cases store sensitive information, such as transaction information, health information, or cryptographic keys, and may in some cases encrypt or decrypt such information with the aid of the processor 1410. The memory 1420, mass storage device 1430, or portable storage 1440 may in some cases store, at least in part, instructions, executable code, or other data for execution or processing by the processor 1410. -
Output devices 1450 may include, for example, communication circuitry for outputting data through wired or wireless means, display circuitry for displaying data via a display screen, audio circuitry for outputting audio via headphones or a speaker, printer circuitry for printing data via a printer, transcription, or some combination thereof. The display screen may be any type of display discussed with respect to the display system 1470. The printer may be inkjet, LaserJet, thermal, or some combination thereof. In some cases, the output device circuitry 1450 may allow for transmission of data over an audio jack/plug, a microphone jack/plug, a universal serial bus (USB) port/plug, an Apple® Lightning® port/plug, an Ethernet port/plug, a fiber optic port/plug, a proprietary wired port/plug, a BLUETOOTH® wireless signal transfer, a BLUETOOTH® low energy (BLE) wireless signal transfer, an IBEACON® wireless signal transfer, a radio-frequency identification (RFID) wireless signal transfer, near-field communications (NFC) wireless signal transfer, dedicated short range communication (DSRC) wireless signal transfer, 802.11 Wi-Fi wireless signal transfer, wireless local area network (WLAN) signal transfer, Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Infrared (IR) communication wireless signal transfer, Public Switched Telephone Network (PSTN) signal transfer, Integrated Services Digital Network (ISDN) signal transfer, 3G/4G/5G/LTE cellular data network wireless signal transfer, ad-hoc network signal transfer, radio wave signal transfer, microwave signal transfer, infrared signal transfer, visible light signal transfer, ultraviolet light signal transfer, wireless signal transfer along the electromagnetic spectrum, or some combination thereof. Output devices 1450 may include any ports, plugs, antennae, wired or wireless transmitters, wired or wireless transceivers, or any other components necessary for or usable to implement the
communication types listed above, such as cellular Subscriber Identity Module (SIM) cards. -
Input devices 1460 may include circuitry providing a portion of a user interface. Input devices 1460 may include an alpha-numeric keypad, such as a keyboard, for inputting alpha-numeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys. Input devices 1460 may include touch-sensitive surfaces as well, either integrated with a display as in a touchscreen, or separate from a display as in a trackpad. Touch-sensitive surfaces may in some cases provide localized variable pressure or force detection. In some cases, the input device circuitry may allow for receipt of data over an audio jack, a microphone jack, a universal serial bus (USB) port/plug, an Apple® Lightning® port/plug, an Ethernet port/plug, a fiber optic port/plug, a proprietary wired port/plug, a wired local area network (LAN) port/plug, a BLUETOOTH® wireless signal transfer, a BLUETOOTH® low energy (BLE) wireless signal transfer, an IBEACON® wireless signal transfer, a radio-frequency identification (RFID) wireless signal transfer, near-field communications (NFC) wireless signal transfer, dedicated short range communication (DSRC) wireless signal transfer, 802.11 Wi-Fi wireless signal transfer, wireless local area network (WLAN) signal transfer, Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Infrared (IR) communication wireless signal transfer, Public Switched Telephone Network (PSTN) signal transfer, Integrated Services Digital Network (ISDN) signal transfer, 3G/4G/5G/LTE cellular data network wireless signal transfer, personal area network (PAN) signal transfer, wide area network (WAN) signal transfer, ad-hoc network signal transfer, radio wave signal transfer, microwave signal transfer, infrared signal transfer, visible light signal transfer, ultraviolet light signal transfer, wireless signal transfer along the electromagnetic spectrum, or some combination thereof. Input devices 1460 may include any ports,
plugs, antennae, wired or wireless receivers, wired or wireless transceivers, or any other components necessary for or usable to implement the communication types listed above, such as cellular SIM cards. -
Input devices 1460 may include receivers or transceivers used for positioning of the computing system 1400 as well. These may include any of the wired or wireless signal receivers or transceivers. For example, a location of the computing system 1400 can be determined based on signal strength of signals as received at the computing system 1400 from three cellular network towers, a process known as cellular triangulation. Fewer than three cellular network towers can also be used—even one can be used—though the location determined from such data will be less precise (e.g., somewhere within a particular circle for one tower, somewhere along a line or within a relatively small area for two towers) than via triangulation. More than three cellular network towers can also be used, further enhancing the location's accuracy. Similar positioning operations can be performed using proximity beacons, which might use short-range wireless signals such as BLUETOOTH® wireless signals, BLUETOOTH® low energy (BLE) wireless signals, IBEACON® wireless signals, personal area network (PAN) signals, microwave signals, radio wave signals, or other signals discussed above. Similar positioning operations can be performed using wired local area networks (LAN) or wireless local area networks (WLAN) where locations are known of one or more network devices in communication with the computing system 1400, such as a router, modem, switch, hub, bridge, gateway, or repeater. These may also include Global Navigation Satellite System (GNSS) receivers or transceivers that are used to determine a location of the computing system 1400 based on receipt of one or more signals from one or more satellites associated with one or more GNSS systems.
GNSS systems include, but are not limited to, the US-based Global Positioning System (GPS), the Russia-based Global Navigation Satellite System (GLONASS), the China-based BeiDou Navigation Satellite System (BDS), and the Europe-based Galileo GNSS. Input devices 1460 may include receivers or transceivers corresponding to one or more of these GNSS systems. -
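The cellular triangulation described above can be made concrete with a minimal trilateration sketch. The patent does not specify an algorithm; this is one standard approach, assuming the three tower positions are known and the distance to each tower has already been estimated from signal strength:

```python
def trilaterate(towers, distances):
    # towers: three known (x, y) tower positions; distances: estimated
    # ranges from the device to each tower (e.g., from signal strength).
    # Subtracting tower 1's circle equation from towers 2 and 3 cancels
    # the quadratic terms, leaving two linear equations in (x, y).
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero only if the three towers are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Device at (3, 4): ranges to towers at (0, 0), (10, 0), and (0, 10).
x, y = trilaterate([(0, 0), (10, 0), (0, 10)], [5.0, 65 ** 0.5, 45 ** 0.5])
# ≈ (3.0, 4.0)
```

With noisy real-world range estimates, a least-squares fit over more than three towers would be used instead, which is why the text above notes that additional towers further enhance accuracy.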
Display system 1470 may include a liquid crystal display (LCD), a plasma display, an organic light-emitting diode (OLED) display, a low-temperature poly-silicon (LTPO) display, an electronic ink or “e-paper” display, a projector-based display, a holographic display, or another suitable display device. Display system 1470 receives textual and graphical information, and processes the information for output to the display device. The display system 1470 may include multiple-touch touchscreen input capabilities, such as capacitive touch detection, resistive touch detection, surface acoustic wave touch detection, or infrared touch detection. Such touchscreen input capabilities may or may not allow for variable pressure or force detection. -
Peripherals 1480 may include any type of computer support device to add additional functionality to the computer system. For example, peripheral device(s) 1480 may include one or more additional output devices of any of the types discussed with respect to output device 1450, one or more additional input devices of any of the types discussed with respect to input device 1460, one or more additional display systems of any of the types discussed with respect to display system 1470, one or more memories or mass storage devices or portable storage devices of any of the types discussed with respect to memory 1420 or mass storage 1430 or portable storage 1440, a modem, a router, an antenna, a wired or wireless transceiver, a printer, a bar code scanner, a quick-response (“QR”) code scanner, a magnetic stripe card reader, an integrated circuit chip (ICC) card reader such as a smartcard reader or a EUROPAY®-MASTERCARD®-VISA® (EMV) chip card reader, a near field communication (NFC) reader, a document/image scanner, a visible light camera, a thermal/infrared camera, an ultraviolet-sensitive camera, a night vision camera, a light sensor, a phototransistor, a photoresistor, a thermometer, a thermistor, a battery, a power source, a proximity sensor, a laser rangefinder, a sonar transceiver, a radar transceiver, a lidar transceiver, a network device, a motor, an actuator, a pump, a conveyer belt, a robotic arm, a rotor, a drill, a chemical assay device, or some combination thereof. - The components contained in the
computer system 1400 of FIG. 14 can include those typically found in computer systems that may be suitable for use with some aspects of the subject technology and represent a broad category of such computer components that are well known in the art. That said, the computer system 1400 of FIG. 14 can be customized and specialized for the purposes discussed herein and to carry out the various operations discussed herein, with specialized hardware components, specialized arrangements of hardware components, and/or specialized software. Thus, the computer system 1400 of FIG. 14 can be a personal computer, a hand held computing device, a telephone (“smartphone” or otherwise), a mobile computing device, a workstation, a server (on a server rack or otherwise), a minicomputer, a mainframe computer, a tablet computing device, a wearable device (such as a watch, a ring, a pair of glasses, or another type of jewelry or clothing or accessory), a video game console (portable or otherwise), an e-book reader, a media player device (portable or otherwise), a vehicle-based computer, another type of computing device, or some combination thereof. The computer system 1400 may in some cases be a virtual computer system executed by another computer system. The computer can also include different bus configurations, networked platforms, multi-processor platforms, etc. Various operating systems can be used including Unix®, Linux®, FreeBSD®, FreeNAS®, pfSense®, Windows®, Apple® Macintosh OS® (“MacOS®”), Palm OS®, Google® Android®, Google® Chrome OS®, Chromium® OS®, OPENSTEP®, XNU®, Darwin®, Apple® iOS®, Apple® tvOS®, Apple® watchOS®, Apple® audioOS®, Amazon® Fire OS®, Amazon® Kindle OS®, variants of any of these, other suitable operating systems, or combinations thereof. The computer system 1400 may also use a Basic Input/Output System (BIOS) or Unified Extensible Firmware Interface (UEFI) as a layer upon which the operating system(s) are run. - In some cases, the
computer system 1400 may be part of a multi-computer system that uses multiple computer systems 1400, each for one or more specific tasks or purposes. For example, the multi-computer system may include multiple computer systems 1400 communicatively coupled together via at least one of a personal area network (PAN), a local area network (LAN), a wireless local area network (WLAN), a municipal area network (MAN), a wide area network (WAN), or some combination thereof. The multi-computer system may further include multiple computer systems 1400 from different networks communicatively coupled together via the internet (also known as a “distributed” system). - Some aspects of the subject technology may be implemented in an application that may be operable using a variety of devices. Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution and that may be used in the
memory 1420, the mass storage 1430, the portable storage 1440, or some combination thereof. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Some forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, a magnetic strip/stripe, any other magnetic storage medium, flash memory, memristor memory, any other solid-state memory, a compact disc read only memory (CD-ROM) optical disc, a rewritable compact disc (CD) optical disc, digital video disk (DVD) optical disc, a blu-ray disc (BDD) optical disc, a holographic optical disk, another optical medium, a secure digital (SD) card, a micro secure digital (microSD) card, a Memory Stick® card, a smartcard chip, an EMV chip, a subscriber identity module (SIM) card, a mini/micro/nano/pico SIM card, another integrated circuit (IC) chip/card, random access memory (RAM), static RAM (SRAM), dynamic RAM (DRAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash EPROM (FLASHEPROM), cache memory (L1/L2/L3/L4/L5/L16), resistive random-access memory (RRAM/ReRAM), phase change memory (PCM), spin transfer torque RAM (STT-RAM), another memory chip or cartridge, or a combination thereof. - Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a
processor 1410 for execution. A bus 1490 carries the data to system RAM or another memory 1420, from which a processor 1410 retrieves and executes the instructions. The instructions received by system RAM or another memory 1420 can optionally be stored on a fixed disk (mass storage device 1430/portable storage 1440) either before or after execution by processor 1410. Various forms of storage may likewise be implemented, as well as the necessary network interfaces and network topologies to implement the same. - While various flow diagrams provided and described above may show a particular order of operations performed by some embodiments of the subject technology, it should be understood that such order is exemplary. Alternative embodiments may perform the operations in a different order, combine certain operations, overlap certain operations, or some combination thereof. It should be understood that unless disclosed otherwise, any process illustrated in any flow diagram herein or otherwise illustrated or described herein may be performed by a machine, mechanism, and/or
computing system 1400 discussed herein, and may be performed automatically (e.g., in response to one or more triggers/conditions described herein), autonomously, semi-autonomously (e.g., based on received instructions), or a combination thereof. Furthermore, any action described herein as occurring in response to one or more particular triggers/conditions should be understood to optionally occur automatically in response to the one or more particular triggers/conditions. - The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.
Claims (21)
1. (canceled)
2. A method of media processing, the method comprising:
receiving a first digital media asset depicting an object;
analyzing the first digital media asset to recognize a first condition of the object as depicted in the first digital media asset;
receiving a second digital media asset depicting the object after receiving the first digital media asset;
analyzing the second digital media asset to recognize a second condition of the object as depicted in the second digital media asset; and
identifying an extent of difference between the first condition of the object as depicted in the first digital media asset and the second condition of the object as depicted in the second digital media asset.
3. The method of claim 2 , further comprising:
identifying an insurance policy for the object based on the first condition of the object as recognized in the first digital media asset; and
identifying an amount of an insurance claim payment associated with the insurance policy for the object to identify the extent of the difference.
4. The method of claim 3 , further comprising:
selecting the insurance policy for the object from a plurality of possible insurance policies based on respective characteristics of the plurality of possible insurance policies to identify the insurance policy for the object.
5. The method of claim 3 , further comprising:
analyzing the first digital media asset to recognize an object type of the object depicted in the first digital media asset, wherein identifying the insurance policy for the object is also based on the object type of the object.
6. The method of claim 5 , wherein the object type of the object is indicative of at least one of an identity of the object, a value of the object, an age of the object, a property scope of the object, an ownership history of the object, a make of the object, a model of the object, or a color of the object.
7. The method of claim 5 , further comprising:
querying a database using the object type to retrieve at least one attribute of the object, wherein identifying the insurance policy for the object is also based on the at least one attribute of the object.
8. The method of claim 3 , further comprising:
processing a transfer of funds between a plurality of accounts, the transfer of the funds associated with the insurance claim payment.
9. The method of claim 3 , further comprising:
identifying a payment to be made for the insurance policy for the object, wherein the payment is based on the first condition of the object as recognized in the first digital media asset; and
initiating the insurance policy based on an indication of processing of the payment.
10. The method of claim 2 , wherein the object is one of a vehicle or a structure.
11. The method of claim 2 , further comprising:
identifying damage to the object to identify the second condition of the object;
receiving weather data for an area associated with the damage to the object and for a time associated with the damage to the object; and
verifying that the damage to the object is consistent with the weather data.
12. The method of claim 2 , further comprising:
recognizing a characteristic of the object in both the first digital media asset and the second digital media asset to verify that the first digital media asset and the second digital media asset both depict the object.
13. The method of claim 2 , further comprising:
verifying that a first time of capture of the first digital media asset is before a second time of capture of the second digital media asset.
14. The method of claim 2 , further comprising:
verifying that the first digital media asset is unaltered based on a comparison between a first hash generated using the first digital media asset and a second hash decrypted from a first encrypted signature associated with the first digital media asset; and
verifying that the second digital media asset is unaltered based on a comparison between a third hash generated using the second digital media asset and a fourth hash decrypted from a second encrypted signature associated with the second digital media asset.
15. A system for media processing, the system comprising:
at least one memory; and
at least one processor coupled to the at least one memory and configured to:
receive a first digital media asset depicting an object;
analyze the first digital media asset to recognize a first condition of the object as depicted in the first digital media asset;
receive a second digital media asset depicting the object after receiving the first digital media asset;
analyze the second digital media asset to recognize a second condition of the object as depicted in the second digital media asset; and
identify an extent of difference between the first condition of the object as depicted in the first digital media asset and the second condition of the object as depicted in the second digital media asset.
16. The system of claim 15 , the at least one processor configured to:
identify an insurance policy for the object based on the first condition of the object as recognized in the first digital media asset; and
identify an amount of an insurance claim payment associated with the insurance policy for the object to identify the extent of the difference.
17. The system of claim 16 , the at least one processor configured to:
analyze the first digital media asset to recognize an object type of the object depicted in the first digital media asset, wherein identifying the insurance policy for the object is also based on the object type of the object.
18. The system of claim 15 , the at least one processor configured to:
identify damage to the object to identify the second condition of the object;
receive weather data for an area associated with the damage to the object and for a time associated with the damage to the object; and
verify that the damage to the object is consistent with the weather data.
19. The system of claim 15 , the at least one processor configured to:
recognize a characteristic of the object in both the first digital media asset and the second digital media asset to verify that the first digital media asset and the second digital media asset both depict the object.
20. The system of claim 15 , the at least one processor configured to:
verify that the first digital media asset is unaltered based on a comparison between a first hash generated using the first digital media asset and a second hash decrypted from a first encrypted signature associated with the first digital media asset; and
verify that the second digital media asset is unaltered based on a comparison between a third hash generated using the second digital media asset and a fourth hash decrypted from a second encrypted signature associated with the second digital media asset.
21. A non-transitory computer readable storage medium having embodied thereon a program, wherein the program is executable by a processor to perform a method of media processing, the method comprising:
receiving a first digital media asset depicting an object;
analyzing the first digital media asset to recognize a first condition of the object as depicted in the first digital media asset;
receiving a second digital media asset depicting the object after receiving the first digital media asset;
analyzing the second digital media asset to recognize a second condition of the object as depicted in the second digital media asset; and
identifying an extent of difference between the first condition of the object as depicted in the first digital media asset and the second condition of the object as depicted in the second digital media asset.
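Claim 20 above describes verifying that each digital media asset is unaltered by comparing a freshly generated hash against a hash decrypted from an encrypted signature associated with the asset. The comparison can be sketched as follows; this is an illustrative toy only, using SHA-256 and an XOR-based placeholder for the decrypt step, where a real system would use public-key signature verification (e.g., RSA-PSS or ECDSA), and all function names are hypothetical:

```python
import hashlib

def sign(asset: bytes, key: bytes) -> bytes:
    """Produce an 'encrypted signature': the asset hash XORed with a key.
    Toy stand-in for real public-key signing (e.g., RSA-PSS)."""
    digest = hashlib.sha256(asset).digest()
    return bytes(d ^ k for d, k in zip(digest, key))

def decrypt_signature(signature: bytes, key: bytes) -> bytes:
    """Recover the hash embedded in the signature (placeholder decrypt)."""
    return bytes(s ^ k for s, k in zip(signature, key))

def is_unaltered(asset: bytes, signature: bytes, key: bytes) -> bool:
    """Per claim 20: compare a freshly generated hash of the asset
    against the hash decrypted from its encrypted signature."""
    fresh_hash = hashlib.sha256(asset).digest()
    return fresh_hash == decrypt_signature(signature, key)

# Example: a signed asset verifies; a modified asset does not.
key = hashlib.sha256(b"shared-secret").digest()  # 32-byte toy key
original = b"first digital media asset bytes"
sig = sign(original, key)

print(is_unaltered(original, sig, key))          # True: asset matches signature
print(is_unaltered(original + b"x", sig, key))   # False: asset was altered
```

Because the signature binds the asset's hash at capture time, any later modification of the asset bytes changes the fresh hash and causes the comparison to fail.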
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/962,242 US20230281344A1 (en) | 2020-04-01 | 2022-10-07 | Secure digital media authentication and analysis |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/837,944 US11468198B2 (en) | 2020-04-01 | 2020-04-01 | Secure digital media authentication and analysis |
US17/962,242 US20230281344A1 (en) | 2020-04-01 | 2022-10-07 | Secure digital media authentication and analysis |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/837,944 Continuation US11468198B2 (en) | 2020-04-01 | 2020-04-01 | Secure digital media authentication and analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230281344A1 true US20230281344A1 (en) | 2023-09-07 |
Family
ID=77921859
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/837,944 Active 2040-12-28 US11468198B2 (en) | 2020-04-01 | 2020-04-01 | Secure digital media authentication and analysis |
US17/962,242 Pending US20230281344A1 (en) | 2020-04-01 | 2022-10-07 | Secure digital media authentication and analysis |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/837,944 Active 2040-12-28 US11468198B2 (en) | 2020-04-01 | 2020-04-01 | Secure digital media authentication and analysis |
Country Status (1)
Country | Link |
---|---|
US (2) | US11468198B2 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10783585B1 (en) | 2012-08-16 | 2020-09-22 | Allstate Insurance Company | Agent-facilitated claims damage estimation |
US8510196B1 (en) | 2012-08-16 | 2013-08-13 | Allstate Insurance Company | Feedback loop in mobile damage assessment and claims processing |
US10430885B1 (en) | 2012-08-16 | 2019-10-01 | Allstate Insurance Company | Processing insured items holistically with mobile damage assessment and claims processing |
US11532048B2 (en) | 2012-08-16 | 2022-12-20 | Allstate Insurance Company | User interactions in mobile damage assessment and claims processing |
US11481854B1 (en) | 2015-02-23 | 2022-10-25 | ImageKeeper LLC | Property measurement with automated document production |
US10282562B1 (en) | 2015-02-24 | 2019-05-07 | ImageKeeper LLC | Secure digital data collection |
US20210357533A1 (en) * | 2019-07-22 | 2021-11-18 | Andrew Duncan Britton | Runtime Signature Integrity |
US11526711B1 (en) * | 2020-05-20 | 2022-12-13 | State Farm Mutual Automobile Insurance Company | Synchronizing image data with either vehicle telematics data or infrastructure data pertaining to a road segment |
US11682025B2 (en) * | 2020-06-26 | 2023-06-20 | Digipraise LLC | Digital appraisal system providing computational attestation of appraisal integrity |
US11553105B2 (en) | 2020-08-31 | 2023-01-10 | ImageKeeper, LLC | Secure document certification and execution system |
US20220114672A1 (en) * | 2020-10-13 | 2022-04-14 | Mitchell International, Inc. | Systems and methods for processing paper bills |
US11568090B2 (en) * | 2021-05-17 | 2023-01-31 | Business Objects Software Ltd. | Digital signatures for analytics |
US20230134191A1 (en) * | 2021-11-04 | 2023-05-04 | State Farm Mutual Automobile Insurance Company | Systems and methods for insurance application processing |
WO2023128908A1 (en) * | 2021-12-30 | 2023-07-06 | Cloudpeer Global Teknoloji A. S. | A mobile electronic signature system |
US20240095405A1 (en) * | 2022-09-21 | 2024-03-21 | Sony Group Corporation | Electronic devices for media file verification |
Citations (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050036034A1 (en) * | 2003-08-15 | 2005-02-17 | Rea David D. | Apparatus for communicating over a network images captured by a digital camera |
US20060114338A1 (en) * | 2004-11-29 | 2006-06-01 | Rothschild Leigh M | Device and method for embedding and retrieving information in digital images |
CN101529797A (en) * | 2006-10-27 | 2009-09-09 | 国际商业机器公司 | System, device, method and program for authenticating communication partner by means of electronic certificate including personal information |
US20090279734A1 (en) * | 2008-05-09 | 2009-11-12 | Hartford Fire Insurance Company | System and method for assessing a condition of property |
US20120275651A1 (en) * | 2008-05-09 | 2012-11-01 | Hartford Fire Insurance Company | System and method for detecting potential property insurance fraud |
US20140114691A1 (en) * | 2012-10-23 | 2014-04-24 | InnovaPad, LP | Methods and Systems for the Integrated Collection of Data for Use in Incident Reports and Insurance Claims and to Related Methods of Performing Emergency Responder Cost Recovery |
WO2016028933A1 (en) * | 2014-08-19 | 2016-02-25 | Stempora Jeffrey | System for determining an underwriting risk, risk score, or price of insurance using sensor information |
US20160171622A1 (en) * | 2014-12-15 | 2016-06-16 | Loss of Use, Inc. | Insurance Asset Verification and Claims Processing System |
US20170293894A1 (en) * | 2016-04-06 | 2017-10-12 | American International Group, Inc. | Automatic assessment of damage and repair costs in vehicles |
US9836793B2 (en) * | 2009-12-31 | 2017-12-05 | Hartford Fire Insurance Company | Insurance processing system and method using mobile devices for proof of ownership |
US20180082378A1 (en) * | 2016-09-21 | 2018-03-22 | Allstate Insurance Company | Enhanced Image Capture and Analysis of Damaged Tangible Objects |
US20180089776A1 (en) * | 2009-08-14 | 2018-03-29 | Mousiki Inc. | System and method for acquiring, comparing and evaluating property conditions |
CN108040050A (en) * | 2017-12-12 | 2018-05-15 | 任天民 | A kind of primary photo identification method and application |
US20180293552A1 (en) * | 2017-04-11 | 2018-10-11 | Alibaba Group Holding Limited | Image-based vehicle maintenance plan |
US20190073641A1 (en) * | 2017-09-01 | 2019-03-07 | Allstate Insurance Company | Analyzing Images and Videos of Damaged Vehicles to Determine Damaged Vehicle Parts and Vehicle Asymmetries |
US10304137B1 (en) * | 2012-12-27 | 2019-05-28 | Allstate Insurance Company | Automated damage assessment and claims processing |
US10430886B1 (en) * | 2012-08-16 | 2019-10-01 | Allstate Insurance Company | Processing insured items holistically with mobile damage assessment and claims processing |
US10497250B1 (en) * | 2017-09-27 | 2019-12-03 | State Farm Mutual Automobile Insurance Company | Real property monitoring systems and methods for detecting damage and other conditions |
US20200226734A1 (en) * | 2019-01-14 | 2020-07-16 | Ford Global Technologies, Llc | Systems and methods for detecting and reporting vehicle damage events |
US10791265B1 (en) * | 2017-10-13 | 2020-09-29 | State Farm Mutual Automobile Insurance Company | Systems and methods for model-based analysis of damage to a vehicle |
US10949814B1 (en) * | 2019-05-09 | 2021-03-16 | Ccc Information Services Inc. | Intelligent vehicle repair estimation system |
US10956879B1 (en) * | 2013-03-15 | 2021-03-23 | United Services Automobile Association (Usaa) | Financial security indicator |
US10991049B1 (en) * | 2014-09-23 | 2021-04-27 | United Services Automobile Association (Usaa) | Systems and methods for acquiring insurance related informatics |
US11004187B2 (en) * | 2018-10-05 | 2021-05-11 | The Toronto-Dominion Bank | System and method for verifying image data of a vehicle |
US20210192629A1 (en) * | 2014-09-22 | 2021-06-24 | State Farm Mutual Automobile Insurance Company | Disaster damage analysis and loss mitigation implementing unmanned aerial vehicles (uavs) |
US11379886B1 (en) * | 2017-08-11 | 2022-07-05 | State Farm Mutual Automobile Insurance Company | Using machine learning techniques to calculate damage of vehicles involved in an accident |
US20230342855A1 (en) * | 2019-03-22 | 2023-10-26 | BlueOwl, LLC | Systems and methods for detecting insurance claim fraud by using image data validation |
US11861721B1 (en) * | 2013-09-10 | 2024-01-02 | Allstate Insurance Company | Maintaining current insurance information at a mobile device |
Family Cites Families (83)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AUPQ131399A0 (en) | 1999-06-30 | 1999-07-22 | Silverbrook Research Pty Ltd | A method and apparatus (NPAGE02) |
US4876597A (en) | 1987-09-04 | 1989-10-24 | Adt Security Systems, Inc. | Video observation systems |
US5343527A (en) | 1993-10-27 | 1994-08-30 | International Business Machines Corporation | Hybrid encryption method and system for protecting reusable software components |
US5553609A (en) | 1995-02-09 | 1996-09-10 | Visiting Nurse Service, Inc. | Intelligent remote visual monitoring system for home health care service |
US6122526A (en) | 1997-04-24 | 2000-09-19 | Eastman Kodak Company | Cellular telephone and electronic camera system with programmable transmission capability |
EP0760565B1 (en) | 1995-08-28 | 1998-07-08 | Ofra Feldbau | Apparatus and method for authenticating the dispatch and contents of documents |
EP0814611B1 (en) | 1996-06-17 | 2002-08-28 | Siemens Aktiengesellschaft | Communication system and method for recording and managing digital images |
US5737491A (en) | 1996-06-28 | 1998-04-07 | Eastman Kodak Company | Electronic imaging system capable of image capture, local wireless transmission and voice recognition |
GB2347576B (en) | 1998-01-07 | 2002-07-10 | Intel Corp | Automatic transfer of image information between imaging device and host system |
US6784925B1 (en) | 1998-03-24 | 2004-08-31 | Canon Kabushiki Kaisha | System to manage digital camera images |
US6487301B1 (en) | 1998-04-30 | 2002-11-26 | Mediasec Technologies Llc | Digital authentication with digital and analog documents |
US6278466B1 (en) | 1998-06-11 | 2001-08-21 | Presenter.Com, Inc. | Creating animation from a video |
US6847334B2 (en) | 1998-06-29 | 2005-01-25 | William Hayhurst | Mobile telecommunication device for simultaneously transmitting and receiving sound and image data |
US6161131A (en) | 1998-10-02 | 2000-12-12 | Garfinkle; Jeffrey | Digital real time postcards including information such as geographic location or landmark |
US6256059B1 (en) | 1999-01-07 | 2001-07-03 | Intel Corporation | Automatic transfer of image information between imaging device and host system |
US7239346B1 (en) | 1999-10-18 | 2007-07-03 | Priddy Dennis G | System and architecture that supports a multi-function semiconductor device between networks and portable wireless communications products |
US6662226B1 (en) | 2000-01-27 | 2003-12-09 | Inbit, Inc. | Method and system for activating and capturing screen displays associated with predetermined user interface events |
US6304211B1 (en) | 2000-02-16 | 2001-10-16 | Bertho Boman | System and method for measuring distance between two objects using received satellite transmitted data |
US6584564B2 (en) | 2000-04-25 | 2003-06-24 | Sigaba Corporation | Secure e-mail system |
US7034880B1 (en) | 2000-05-11 | 2006-04-25 | Eastman Kodak Company | System and camera for transferring digital images to a service provider |
US6963358B2 (en) | 2000-11-10 | 2005-11-08 | Dropfire, Inc. | Wireless digital camera adapter and systems and methods related thereto and for use with such an adapter |
JP4095243B2 (en) | 2000-11-28 | 2008-06-04 | キヤノン株式会社 | A storage medium storing a URL acquisition and processing system and method and a program for executing the method. |
US7028184B2 (en) | 2001-01-17 | 2006-04-11 | International Business Machines Corporation | Technique for digitally notarizing a collection of data streams |
WO2002061527A2 (en) | 2001-02-01 | 2002-08-08 | Mezrah Todd M | Online insurance sales platform |
JP2002344867A (en) | 2001-05-18 | 2002-11-29 | Fujitsu Ltd | Image data storage system |
US6751454B2 (en) | 2001-05-29 | 2004-06-15 | Leap Wireless International, Inc. | System and method for sampling audio recordings on a wireless communication device |
US6995789B2 (en) | 2001-06-26 | 2006-02-07 | Eastman Kodak Company | Method for managing images over a communication network |
US20030085989A1 (en) | 2001-11-07 | 2003-05-08 | Hiok-Nam Tay | Mobile communication device with instant image capture and transmission |
US7343049B2 (en) | 2002-03-07 | 2008-03-11 | Marvell International Technology Ltd. | Method and apparatus for performing optical character recognition (OCR) and text stitching |
US7093974B2 (en) | 2002-03-13 | 2006-08-22 | Ulrich Kienitz | Radiometer with digital imaging system |
US20040012811A1 (en) | 2002-07-22 | 2004-01-22 | Toshiba Tec Kabushiki Kaisha | System and method for taking, storing and printing digital pictures |
US20040174434A1 (en) | 2002-12-18 | 2004-09-09 | Walker Jay S. | Systems and methods for suggesting meta-information to a camera user |
US7149961B2 (en) | 2003-04-30 | 2006-12-12 | Hewlett-Packard Development Company, L.P. | Automatic generation of presentations from “path-enhanced” multimedia |
US20040218910A1 (en) | 2003-04-30 | 2004-11-04 | Chang Nelson L. | Enabling a three-dimensional simulation of a trip through a region |
US7526718B2 (en) | 2003-04-30 | 2009-04-28 | Hewlett-Packard Development Company, L.P. | Apparatus and method for recording “path-enhanced” multimedia |
US6906643B2 (en) * | 2003-04-30 | 2005-06-14 | Hewlett-Packard Development Company, L.P. | Systems and methods of viewing, modifying, and interacting with “path-enhanced” multimedia |
US7334186B2 (en) | 2003-04-30 | 2008-02-19 | Hewlett-Packard Development Company, L.P. | System and method for creation of video annotations |
JP2005143094A (en) | 2003-10-14 | 2005-06-02 | Canon Inc | Image verification system, management method, and storage medium readable by computer |
US7437419B2 (en) | 2003-11-12 | 2008-10-14 | Sony Computer Entertainment Inc. | System and method for effectively performing a streamlined photo distribution procedure |
US20050110880A1 (en) | 2003-11-26 | 2005-05-26 | Eastman Kodak Company | Method for correcting the date/time metadata in digital image files |
US7565183B2 (en) | 2004-11-15 | 2009-07-21 | Sony Ericsson Mobile Communications Ab | Mobile device with selectable camera position |
US20070008321A1 (en) | 2005-07-11 | 2007-01-11 | Eastman Kodak Company | Identifying collection images with special events |
US7702821B2 (en) | 2005-09-15 | 2010-04-20 | Eye-Fi, Inc. | Content-aware digital media storage device and methods of using the same |
US7621442B2 (en) | 2005-09-19 | 2009-11-24 | Silverbrook Research Pty Ltd | Printing a subscription using a mobile device |
US20070074035A1 (en) * | 2005-09-29 | 2007-03-29 | Tom Scanlon | Secure recordation for inspection systems and methods |
US8634712B1 (en) | 2007-03-06 | 2014-01-21 | Alex Robert Woods | Real time transmission of photographic images from portable handheld devices |
US8224178B2 (en) | 2007-03-06 | 2012-07-17 | Igotit Solutions, Llc | Real time transmission of photographic images from portable handheld devices |
US20150334257A1 (en) * | 2007-03-06 | 2015-11-19 | Capture Lock Inc. | Real time transmission of photographic images from portable handheld devices |
US8360841B2 (en) * | 2007-06-11 | 2013-01-29 | Igt | Facilitating a payout at a gaming device using audiovisual content |
US7934264B2 (en) | 2007-07-27 | 2011-04-26 | International Business Machines Corporation | Methods, systems, and computer program products for detecting alteration of audio or image data |
US20110276423A1 (en) * | 2007-08-07 | 2011-11-10 | Onenews Corporation | Systems and Methods for Content Communication |
US10318110B2 (en) | 2007-08-13 | 2019-06-11 | Oath Inc. | Location-based visualization of geo-referenced context |
US8897541B2 (en) * | 2009-09-14 | 2014-11-25 | Trimble Navigation Limited | Accurate digitization of a georeferenced image |
US20110137561A1 (en) | 2009-12-04 | 2011-06-09 | Nokia Corporation | Method and apparatus for measuring geographic coordinates of a point of interest in an image |
US8924302B2 (en) | 2009-12-11 | 2014-12-30 | Eoriginal, Inc. | System and method for electronic transmission, storage, retrieval and remote signing of authenticated electronic original documents |
US8868342B2 (en) | 2011-08-17 | 2014-10-21 | Abram L. Balloga | Orientation device and method |
US9228835B2 (en) | 2011-09-26 | 2016-01-05 | Ja Vad Gnss, Inc. | Visual stakeout |
US9582843B2 (en) | 2012-08-20 | 2017-02-28 | Tautachrome, Inc. | Authentication and validation of smartphone imagery |
JP6140991B2 (en) | 2012-12-04 | 2017-06-07 | オリンパス株式会社 | Server system, program, server system operating method, and terminal device |
DE102012223928A1 (en) | 2012-12-20 | 2014-06-26 | Hilti Aktiengesellschaft | Method and device for determining the location coordinates of a target object |
US8896709B2 (en) | 2012-12-26 | 2014-11-25 | Turner Broadcasting System, Inc. | Method and system for image and metadata management |
US10101465B2 (en) | 2012-12-28 | 2018-10-16 | Trimble Inc. | Electronic tape measure on a cellphone |
US9538336B2 (en) | 2012-12-28 | 2017-01-03 | Trimble Inc. | Performing data collection based on internal raw observables using a mobile data collection platform |
US9767299B2 (en) | 2013-03-15 | 2017-09-19 | Mymail Technology, Llc | Secure cloud data sharing |
US9818165B2 (en) | 2013-04-04 | 2017-11-14 | Conduent Business Services, Llc | Birth registration |
US8799799B1 (en) | 2013-05-07 | 2014-08-05 | Palantir Technologies Inc. | Interactive geospatial map |
US9288368B2 (en) * | 2013-10-08 | 2016-03-15 | Delightfit, Inc. | Video and map data synchronization for simulated athletic training |
US9521001B2 (en) | 2014-04-28 | 2016-12-13 | Adobe Systems Incorporated | Privacy preserving electronic document signature service |
US20160042767A1 (en) | 2014-08-08 | 2016-02-11 | Utility Associates, Inc. | Integrating data from multiple devices |
EP3021078B1 (en) | 2014-11-14 | 2018-09-26 | Leica Geosystems AG | Geodetic surveying system with virtual camera |
US10753918B2 (en) | 2014-12-15 | 2020-08-25 | Saudi Arabian Oil Company | Physical reservoir rock interpretation in a 3D petrophysical modeling environment |
US11973910B2 (en) | 2015-01-05 | 2024-04-30 | Musaed Ruzeg N. ALRAHAILI | System, apparatus, method and computer program product to set up a request for, generate, receive and send official communications |
US9767163B2 (en) | 2015-01-16 | 2017-09-19 | Stephen Brent Kass | Computerized technical authentication and grading system for collectible objects |
US10282562B1 (en) | 2015-02-24 | 2019-05-07 | ImageKeeper LLC | Secure digital data collection |
US9600146B2 (en) | 2015-08-17 | 2017-03-21 | Palantir Technologies Inc. | Interactive geospatial map |
US10468129B2 (en) | 2016-09-16 | 2019-11-05 | David Lyle Schneider | Biometric medical antifraud and consent system |
US20200184465A1 (en) | 2017-05-08 | 2020-06-11 | Security Matters Ltd. | A system for virtual currency based on blockchain architecture and physical marking |
US9984366B1 (en) | 2017-06-09 | 2018-05-29 | Hand Held Products, Inc. | Secure paper-free bills in workflow applications |
PL3461074T3 (en) | 2017-09-21 | 2023-03-06 | Lleidanetworks Serveis Telemàtics, S.A. | Method of certification of an electronic contract for electronic identification and trust services (eidas) |
US11212416B2 (en) | 2018-07-06 | 2021-12-28 | ImageKeeper LLC | Secure digital media capture and analysis |
CN109460732B (en) * | 2018-11-05 | 2020-11-10 | 泰康保险集团股份有限公司 | Image recognition-based application information processing method and device and readable storage medium |
CN110866224A (en) * | 2019-11-15 | 2020-03-06 | 支付宝(杭州)信息技术有限公司 | Picture right determining method and system based on block chain |
US11553105B2 (en) | 2020-08-31 | 2023-01-10 | ImageKeeper, LLC | Secure document certification and execution system |
- 2020-04-01: US application US16/837,944 filed; patent US11468198B2 (Active)
- 2022-10-07: US application US17/962,242 filed; publication US20230281344A1 (Pending)
Also Published As
Publication number | Publication date |
---|---|
US20210312561A1 (en) | 2021-10-07 |
US11468198B2 (en) | 2022-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11468198B2 (en) | Secure digital media authentication and analysis | |
US11212416B2 (en) | Secure digital media capture and analysis | |
US11831710B2 (en) | Tracking and certification of digital media via distributed ledger | |
US11550960B2 (en) | Secure digital data collection | |
US12008123B2 (en) | Systems and methods for sharing verified identity documents | |
US11838475B2 (en) | Secure document certification and execution system | |
US11764974B2 (en) | Method and system for certification and authentication of objects | |
US20170140174A1 (en) | Systems and Methods for Obtaining Authorization to Release Personal Information Associated with a User | |
WO2019246188A1 (en) | Atomic capture using a distributed ledger, for proof of authenticity and/or real time visual validation | |
AU2020412358B2 (en) | Tagging objects in augmented reality to track object data | |
US11436588B1 (en) | Location-based purchase verification for fund transfers | |
WO2018162989A1 (en) | Systems and methods for obtaining authorization to release personal information associated with a user | |
US20220408165A1 (en) | Interactive broadcast media content provider with direct audience interaction | |
US20220222760A1 (en) | Asset Verification Systems and/or Methods | |
JP2020522831A (en) | System and method for credibility and validity verification of crowdsourcing users | |
US20210374741A1 (en) | Compliance controller for the integration of legacy systems in smart contract asset control | |
US11582044B2 (en) | Systems and methods to timestamp and authenticate digital documents using a secure ledger | |
KR102669785B1 (en) | Non-fungible token generating method using global positioning system | |
US11977661B2 (en) | System for generating a digital handwritten signature using a mobile device | |
US20220129890A1 (en) | Compliance controller for the integration of legacy systems in smart contract asset control | |
US20230089680A1 (en) | Systems and Methods Using Cameras on Smartphones to Provide Provably Trusted and Authentic Photographs of Persons, Locations, Items, and Property |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: IMAGEKEEPER LLC, NEVADA |
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SPEASL, JERRY;PATTERSON, MICHAEL;ROBERTS, MARC;SIGNING DATES FROM 20200514 TO 20200515;REEL/FRAME:066483/0852 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |