US20180182087A1 - Tethered 3d scanner - Google Patents
- Publication number
- US20180182087A1 (application Ser. No. 15/902,349)
- Authority
- US
- United States
- Prior art keywords
- data collection
- collection system
- data
- laser pulses
- scanner
- Prior art date
- Legal status
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N22/00—Investigating or analysing materials by the use of microwaves or radio waves, i.e. electromagnetic waves with a wavelength of one millimetre or more
- G01N22/02—Investigating the presence of flaws
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0278—Product appraisal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0283—Price estimation or determination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q40/00—Finance; Insurance; Tax strategies; Processing of corporate or income taxes
- G06Q40/08—Insurance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/16—Real estate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/16—Real estate
- G06Q50/163—Real estate management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R23/00—Transducers other than those covered by groups H04R9/00 - H04R21/00
- H04R23/008—Transducers other than those covered by groups H04R9/00 - H04R21/00 using optical signals for detecting or generating sound
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
- B64U2101/32—UAVs specially adapted for particular uses or applications for imaging, photography or videography for cartography or topography
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/06—Illumination; Optics
- G01N2201/061—Sources
- G01N2201/06113—Coherent sources; lasers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/10—Scanning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/08—Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
Definitions
- This disclosure relates to 3D modeling, and in particular, to estimating the condition of a structure using 3D modeling.
- the present disclosure generally relates to a system or method for inspecting a structure to estimate the condition of the structure.
- After an accident or loss, property owners typically file claims with their insurance companies.
- The insurance companies assign appraisers to investigate the claims to determine the extent of damage and/or loss, document the damage, and provide their clients with appropriate compensation.
- Determining and documenting the extent of damage can be risky for the appraiser. For example, in a situation where a structure has experienced roof damage, appraisers typically climb onto the roof to evaluate the damage. Once on the roof they may sketch the damaged area of the roof in order to document the damage. In the alternative, appraisers might take a digital picture of the damaged area. In either scenario, the appraiser has exposed himself to a risk of falling. Afterwards, the picture is typically attached to an electronic claim file for future reference where it can be analyzed by an appraiser to estimate the extent of damage to the structure.
- the process for determining and documenting the extent of the damage can be inefficient and time consuming.
- significant paperwork and calculations may be involved in calculating compensation owed to the clients. For example, if an insurance appraiser takes photos on the roof of a client's building to assess a claim for roof damage from a hurricane, in order to calculate how much money should be paid to the client, the appraiser may have to come back to his office, research the client's property, research the cost of the damaged property and research repair costs. All of these steps are time consuming and both delay payment to the client and prevent the appraiser from assessing other client claims.
- an insurance appraiser may not have time to perform timely claim investigations of all the received claims. If claim investigations are not performed quickly, property owners may not receive compensation for their losses for long periods of time. Additionally, long delays when performing claim investigations can lead to inaccurate investigation results (e.g., the delay may lead to increased opportunity for fraud and/or may make it more difficult to ascertain the extent of damage at the time of the accident or loss).
- two-dimensional digital pictures or video of a roof or structure often provide inadequate detail for thorough inspection of a structure. Poor image quality resulting from camera movement or out-of-focus images can make it difficult to estimate the condition of a property based on an image. Even where image quality is adequate, poor angles or bad lighting may hide or exaggerate details important to estimating the condition of the structure, leading to inaccurate assessments of the structure's condition.
- a system and method for inspecting a structure and estimating the condition of the structure includes deploying one or more 3D scanners to scan a structure and generating, at the one or more 3D scanners, a plurality of 3D data points corresponding to points on the surface of the structure.
- the method further includes identifying coordinate sets, at the one or more 3D scanners, associated with each of the generated plurality of 3D data points.
- the method also includes storing a point cloud, comprising one or more of the generated plurality of 3D data points, to a memory.
- the method further includes causing a processor to construct a 3D model from the point cloud and storing the 3D model to the memory. Then, the processor analyzes the 3D model to identify features associated with the structure.
- the processor finally generates an estimate of the condition of the structure based on the identified features before storing the estimate to memory.
- the estimate of the condition of the structure may be used to calculate a financial cost estimate (representing, for example, a loss in value or a cost to repair damage).
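- In outline, the disclosed method is a pipeline: scan points, store a point cloud, build a 3D model, extract features, and estimate condition. The following self-contained Python sketch mirrors that flow with a deliberately crude feature (the spread of depth values as a stand-in for surface damage); the feature and threshold are invented for illustration and are not taken from the disclosure.

```python
import statistics

def estimate_condition(points):
    """Toy version of the disclosed pipeline: reduce a point cloud to a
    feature (here, depth spread as a crude surface-damage proxy) and map
    it to a condition estimate. Feature and threshold are illustrative."""
    depths = [z for (_x, _y, z) in points]
    roughness = statistics.pstdev(depths)        # feature extraction
    return "damaged" if roughness > 0.01 else "intact"

# Usage: a flat roof patch versus the same patch with a sunken region.
flat = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
dented = flat[:50] + [(x, y, z - 0.05) for (x, y, z) in flat[50:]]
print(estimate_condition(flat))      # intact
print(estimate_condition(dented))    # damaged
```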
- the 3D scanners may be contact 3D scanners (detecting 3D information via physical contact with a structure) or non-contact 3D scanners (detecting 3D information via light or sound, for example, reflected off of the structure).
- the contact 3D scanner detects 3D information by using a tactile sensor to detect an imprint left on a pad that was stamped on the surface or a roller that was rolled across the surface.
- the contact scanner detects 3D information by pulling, tapping or scraping objects on the structure (such as roof shingles).
- the 3D scanner utilizes an audio sensor to listen for an audio response to the tapping.
- the non-contact 3D scanners may detect sound or electromagnetic radiation (including white light, laser light, infrared light, ultraviolet light) to generate the 3D data points.
- the 3D scanner may identify coordinate sets associated with the 3D data points by detecting a projected light pattern or laser using triangulation methods or time-of-flight methods (timing how long it takes for a light to reflect off of a surface).
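- For concreteness, both ranging approaches reduce to simple geometry. The following is a minimal Python sketch under idealized assumptions (a perfectly timed pulse; known emitter and camera angles over a known baseline); it illustrates the math only and is not taken from the disclosure.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Time-of-flight: the pulse travels to the surface and back,
    so the one-way range is half the round trip."""
    return C * round_trip_s / 2.0

def triangulated_depth(baseline_m: float, angle_emit: float, angle_view: float) -> float:
    """Triangulation: emitter and camera sit a known baseline apart; the two
    measured angles (radians, from the baseline) fix the surface point."""
    # Law of sines in the emitter-camera-point triangle.
    third = math.pi - angle_emit - angle_view
    range_from_emitter = baseline_m * math.sin(angle_view) / math.sin(third)
    return range_from_emitter * math.sin(angle_emit)   # perpendicular depth

print(tof_distance(66.7e-9))                # ~10 m for a ~66.7 ns round trip
print(triangulated_depth(0.3, 1.2, 1.1))    # depth from a 30 cm baseline
```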
- the 3D scanners may also generate 3D data points by detecting a chemical sprayed onto the structure (wherein the chemical may pool in cracks or crevices, for example).
- the 3D scanners may be physically connected to (or may themselves be) stationary devices, flying devices, hovering devices, crawling devices or rolling devices.
- the 3D scanners may also be physically connected to (or may themselves be) a wirelessly controlled device or an autonomously controlled device.
- the processor that analyzes the 3D model to identify features associated with the structure is located in a data analysis system remotely located relative to the 3D scanners. In other instances, the processor that analyzes the 3D model may be in a system in close proximity to the 3D scanners.
- FIG. 1 a illustrates a block diagram of a property inspection system according to an embodiment of the present disclosure.
- FIG. 1 b illustrates a block diagram of a property inspection system according to a further embodiment of the present disclosure
- FIG. 2 illustrates a block diagram of a data collection system according to an embodiment of the present disclosure.
- FIG. 3 illustrates a block diagram of a data collection system according to an embodiment of the present disclosure.
- FIG. 4 illustrates a block diagram of a data collection system according to an embodiment of the present disclosure.
- FIG. 5 illustrates a block diagram of a data analysis system according to an embodiment of the present disclosure.
- FIG. 6 illustrates a flow chart of an example method for inspecting and analyzing the condition of a structure.
- FIG. 7 illustrates a flow chart of an exemplary method for detecting a point on a surface using a 3D scanner.
- FIG. 1 a illustrates a block diagram of a property inspection system 106 according to an exemplary embodiment.
- the property inspection system 106 is configured to inspect the structure 105 .
- the structure 105 may be any type of construction or object.
- the structure 105 may be a building, which may be residential, commercial, industrial, agricultural, educational, or of any other nature.
- the structure 105 may be personal property such as a vehicle, boat, aircraft, furniture, etc.
- the property inspection system 106 may include a number of modules, devices, systems, sub-systems, or routines.
- the property inspection system 106 includes a 3D scanning system or 3D scanner for generating 3D data, and may include a number of other sensing devices.
- the property inspection system 106 includes a data collection module or system (for scanning the structure 105 and collecting data) and a data analysis module or system (for analyzing the scanned or collected data).
- the property inspection system 106 may be utilized in a number of situations, but in the preferred embodiment, a user associated with an insurance company utilizes the property inspection system 106 for the purpose of estimating the condition of the structure 105 .
- an insurance policy-holder may file a claim because the policy-holder believes that the structure 105 is damaged.
- a user (e.g., an insurance company or claim adjuster) may then utilize the property inspection system 106 to investigate the claim.
- the user may be an appraiser appraising the structure 105 or an inspector inspecting the structure 105 .
- the property inspection system 106 inspects the structure 105 by scanning the structure 105 to detect information related to the structure 105 .
- the information may relate to any kind of audio, visual, tactile or thermal features associated with the structure 105 .
- the property inspection system 106 uses the detected information to generate data representative of one or more features associated with the structure 105 .
- the property inspection system 106 may scan the structure 105 and generate a full-color 3D model of the structure 105 .
- the property inspection system 106 analyzes the data to estimate the condition of the structure 105 . Based on the estimated condition of the structure, the property inspection system 106 may also determine that the structure 105 is damaged and may then automatically calculate a financial cost associated with the damage.
- the property inspection system 106 may determine that the roof of the structure 105 is damaged and then calculate how much it will cost to fix the roof.
- the property inspection system 106 may determine that a body panel, window, frame, or another surface associated with the vehicle, boat, or aircraft is damaged. The property inspection system 106 may calculate a cost to fix the body panel, window, frame, or other surface.
- FIG. 1 b illustrates a block diagram of a property inspection system 100 according to a further embodiment of the present disclosure.
- the property inspection system 100 includes a data collection module 101 , a network 102 , and a data analysis module 103 .
- the data collection module 101 and the data analysis module 103 are each communicatively connected to the network 102 .
- the data collection module 101 may be in direct wired or wireless communication with the data analysis module 103 .
- the data collection module 101 and the data analysis module 103 may exist on a single device or platform and may share components, hardware, equipment, or any other resources.
- the network 102 may be a single network, or may include multiple networks of one or more types (e.g., a public switched telephone network (PSTN), a cellular telephone network, a wireless local area network (WLAN), the Internet, etc.).
- the data collection module 101 scans a structure (such as structure 105 ) and generates data representing the scanned information.
- the data collection module is operable on a 3D scanning system such as the data collection system 201 shown in FIG. 2 .
- the generated data may represent a point cloud or 3D model of the scanned structure.
- the data collection module 101 transmits the generated data over the network 102 .
- the data analysis module 103 receives the generated data from the network 102 , where the data analysis module 103 operates to estimate the condition of the structure by analyzing the generated data. In some embodiments, estimating the condition of the structure may include comparing the generated data to reference data.
- the reference data may be any type of data that can provide a point of comparison for estimating the condition of the structure.
- the reference data may represent an image, model, or any previously collected or generated data relating to the same or a similar structure.
- the reference data may also represent stock images or models unrelated to the scanned structure.
- the data analysis module 103 may use the estimate of the condition of the structure to determine that the structure is damaged, and then may calculate an estimated cost correlated to the extent of the damage to the structure.
- the data collection module 101 wirelessly transmits, and the data analysis module 103 wirelessly receives, the generated data. While in the preferred embodiment the generated data represents a point cloud or 3D model of the scanned structure, the generated data may also correspond to any visual (2D or 3D), acoustic, thermal, or tactile characteristics of the scanned structure.
- the data collection module 101 may use one or more 3D scanners, image sensors, video recorders, light projectors, audio sensors, audio projectors, chemical sprays, chemical sensors, thermal sensors, or tactile sensors to scan the structure and generate the data.
- the network 102 may include one or more devices such as computers, servers, routers, modems, switches, hubs, or any other networking equipment.
- the data collection module 101 may be handled or operated by a person.
- the data collection module 101 may also be affixed to a locally or remotely controlled device.
- the data collection module 101 may also be affixed to a device that crawls or rolls along a surface; or a flying device, such as an unmanned aerial vehicle (“UAV”), airplane, or helicopter.
- The helicopter may be a multicopter with two or more rotors.
- the data collection module 101 may also be affixed to a projectile, balloon or satellite.
- FIG. 2 illustrates a block diagram of a data collection system 201 according to an embodiment of the present disclosure.
- the data collection system 201 is used to scan the structure 205 .
- the structure 205 may be any of the aforementioned structure types, such as a building, boat, vehicle, or aircraft.
- the data collection system 201 includes a processor 210 , a memory 215 , a user input interface 220 , a network interface 230 , a peripheral interface 235 , a system bus 250 , and a 3D scanner 285 .
- the 3D scanner 285 includes a tactile sensor 260 , an image sensor 265 , a light projector 270 , an audio sensor 275 , and an audio projector 280 .
- the 3D scanner 285 of the data collection system 201 may include only one of, or some subset of: the tactile sensor 260 , the image sensor 265 , the light projector 270 , the audio sensor 275 , and the audio projector 280 . Some embodiments may also have multiple tactile sensors, multiple image sensors, multiple light projectors, multiple audio sensors, or multiple audio projectors.
- the memory 215 may include volatile and/or non-volatile memory and may be removable or non-removable memory.
- the memory 215 may include computer storage media in the form of random access memory (RAM), read only memory (ROM), EEPROM, FLASH memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information.
- the network interface 230 may include an antenna, a port for wired connection, or both.
- the peripheral interface 235 may be a serial interface such as a Universal Serial Bus (USB) interface.
- the peripheral interface 235 may be a wireless interface for establishing wireless connection with another device.
- the peripheral interface 235 may be a short range wireless interface compliant with standards such as Bluetooth (operating in the 2400-2480 MHz frequency band) or Near Field Communication (operating in the 13.56 MHz frequency band).
- the 3D scanner 285 is a non-contact 3D scanner, which may be active (where the 3D scanner 285 emits radiation and detects the reflection of the radiation off of an object) or passive (where the 3D scanner 285 detects radiation that it did not emit).
- the 3D scanner 285 may be a contact 3D scanner that scans an object by coming into physical contact with the object.
- the 3D scanner may be a time-of-flight 3D scanner, a triangulation 3D scanner, a conoscopic 3D scanner, volumetric 3D scanner, a structured light 3D scanner, or a modulated light 3D scanner.
- the image sensor 265 may include any of a number of photosensor, photodiode, photomultiplier, or image sensor types, including charge-coupled-devices (CCD), complementary metal-oxide-semiconductors (CMOS), or some combination thereof.
- the image sensor 265 may be a single-camera setup. In other instances, the image sensor 265 may be a multi-camera setup.
- the light projector 270 may include one or more light sources and may project light in the frequency of either visible or invisible light (including infrared light and ultraviolet light). The light projector 270 may also project directional light such as a laser light.
- the light projector 270 may include, but is not limited to, LED, incandescent, fluorescent, high intensity discharge lamp, or laser light sources.
- the audio sensor may include any of a number of audio sensor or microphone types.
- the audio sensor may include one or more condenser microphones, dynamic microphones, piezoelectric microphones, fiber optic microphones, laser microphones, or MEMS microphones.
- the data collection system 201 may be held and operated by a person.
- the data collection system 201 may also be affixed to a remotely controlled device, such as a radio controlled device; a flying device; a device that rolls, drives, crawls, or climbs; a mechanical apparatus affixed to or near the structure 205 ; or a satellite.
- the processor 210 , the memory 215 , the user input interface 220 , the network interface 230 , the peripheral interface 235 , and the 3D scanner 285 are each communicatively connected to the system bus 250 .
- the tactile sensor 260 , the image sensor 265 , the light projector 270 , the audio sensor 275 , and the audio projector 280 are also communicatively connected to the system bus 250 .
- the tactile sensor 260 , the image sensor 265 , the light projector 270 , the audio sensor 275 , and the audio projector 280 communicate over a bus internal to the 3D scanner and are controlled by the 3D scanner.
- all or some of the elements in the data collection system 201 may be in contact with or close proximity to the structure 205 . In other embodiments of the invention, all or some of the aforementioned elements may be remotely located in relation to the structure 205 (for example, and as discussed later, the data collection system 201 may be affixed, in whole or in part, to a satellite in orbit).
- the processor 210 is configured to fetch and execute instructions stored in the memory 215 .
- the memory 215 is configured to store data such as operating system data or program data.
- the user input interface 220 is configured to receive user input and to transmit data representing the user input over the system bus 250 .
- the peripheral interface 235 is configured to communicatively connect to a peripheral device such as a computer.
- the network interface 230 is configured to communicatively connect to a network, such as the network 102 shown in FIG. 1 b , and wirelessly receive or transmit data using the network.
- the network interface 230 may receive and transmit data using a wired connection, such as Ethernet.
- the 3D scanner 285 is configured to receive control commands over the system bus 250 and scan an object such as the structure 205 to detect 3D characteristics of the scanned object.
- the 3D scanner 285 is further configured to transmit data representing a 3D data point, a point cloud or a 3D model (“3D data”) relating to the scanned object over the system bus 250 .
- the 3D scanner is further configured to use any of the tactile sensor 260 , the image sensor 265 , the light projector 270 , the audio sensor 275 , or the audio projector 280 to generate and transmit the 3D data.
- the tactile sensor 260 is configured to capture sensory information associated with a surface of the structure 205 (“tactile data”), such as shapes and features or topography of the surface, and transmit the tactile data over the system bus 250 .
- the image sensor 265 is configured to capture an image of the structure 205 and transmit data representing the image (“image data”) over the system bus 250 .
- the image sensor may receive visible light, invisible light (such as infrared or ultraviolet), or radiation in other parts of the electromagnetic spectrum (radio waves, microwaves, x-rays, gamma rays, etc.).
- subsurface features may be detected using radar.
- the transmitted image data may represent a thermal, color, infrared, or panchromatic image.
- the light projector 270 is configured to receive control commands over the system bus 250 from the 3D scanner 285 or the processor 210 , and is further configured to project light in the direction of the structure 205 .
- the audio sensor 275 is configured to receive an audio signal or sound waves reflected off of the structure 205 and transmit data representing the audio signal (“audio data”) over the system bus 250 .
- the audio projector 280 is configured to receive control commands over the system bus 250 or from the 3D scanner 285 and project a sound or audio signal in the direction of the structure 205 .
- the network interface 230 receives data representing a command to collect 3D information associated with the structure 205 (“3D capture command”).
- the network interface 230 transmits the 3D capture command over the system bus 250 to the processor 210 , where the 3D capture command data is received.
- the processor 210 transmits, over the system bus 250 , a signal (“3D capture signal”) instructing the 3D scanner 285 to detect 3D characteristics associated with an object.
- the 3D scanner 285 scans the structure 205 and generates data representing 3D characteristics of the structure 205 (“3D data”) corresponding to the collected 3D information. More particularly, in one embodiment the 3D scanner 285 projects a light pattern onto the structure 205 .
- the 3D scanner 285 then records the structure 205 and the projected light pattern.
- the 3D scanner 285 may then alter the projected light pattern or the area of the structure 205 on which the light pattern is projected.
- the 3D scanner 285 then records, for a second time, the structure 205 and projected light pattern. This process may be continuously repeated until a sufficient portion of the structure 205 has been scanned.
- the 3D scanner 285 analyzes the deformations associated with each of the recorded light patterns to identify coordinate sets associated with the structure 205 .
- Each coordinate set includes vertical, horizontal, and depth distance measurements (relative to the 3D scanner 285 ) of a particular point on the surface of the structure 205 .
- the 3D scanner 285 generates 3D data points representing each of the coordinate sets associated with the scanned points on the surface of the structure 205 .
- the 3D scanner 285 may normalize the coordinates for all of the collected 3D data points so that the 3D data points share a common coordinate system.
- the coordinates may be normalized by a processor external to the 3D scanner 285 .
- the 3D scanner 285 then stores a point cloud, constructed from the 3D data points, to memory 215 .
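- Since each scan pose yields sensor-relative coordinates, merging scans into one point cloud requires a rigid transform per pose, as the normalization step above describes. Below is a minimal numpy sketch of that step; the rotation and translation of each pose are assumed to be known (e.g., from the scanner's position tracking), which the disclosure does not specify.

```python
import numpy as np

def to_common_frame(points: np.ndarray, rotation: np.ndarray,
                    translation: np.ndarray) -> np.ndarray:
    """Map sensor-relative coordinates (horizontal, vertical, depth) into a
    shared frame: p_world = R @ p_sensor + t, applied row-wise."""
    return points @ rotation.T + translation

# Points measured relative to a scanner that sits 1 m along x, yawed 90°.
scan = np.array([[0.0, 0.0, 2.0],
                 [0.1, 0.0, 2.0]])                # sensor-relative points
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])                   # 90° rotation about z
t = np.array([1.0, 0.0, 0.0])                      # scanner position
print(to_common_frame(scan, R, t))                 # points in the world frame
```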
- the processor 210 operates to transmit the 3D data (i.e., the point cloud) to the network interface 230 , where the 3D data is transmitted over a network such as the network 102 shown in FIG. 1 b.
- the 3D data may represent a 3D model that was constructed by the processor 210 or the 3D scanner 285 .
- the 3D scanner may be a time-of-flight 3D scanner, where the round-trip time of a laser pulse is measured in order to determine the distance to a particular point on the structure 205 .
- the 3D scanner 285 may also be any type of triangulation 3D scanner that uses ordinary light or laser light.
- the 3D scanner 285 may use any one of or a combination of the tactile sensor 260 , the image sensor 265 , the light projector 270 , the audio sensor 275 , or the audio projector 280 in generating the 3D data.
- the tactile sensor 260 receives a signal from the 3D scanner 285 instructing the tactile sensor 260 to detect topographical features associated with a surface (“tactile capture signal”).
- the tactile sensor 260 receives the tactile capture signal and the tactile sensor 260 is exposed to a surface associated with the structure 205 .
- the tactile sensor 260 generates tactile data representing at least some of the shapes and features of the surface that the tactile sensor 260 was exposed to.
- the 3D scanner 285 then uses the tactile data to generate 3D data.
- the tactile sensor 260 may transmit the tactile data over the system bus 250 to the memory 215 where the tactile data is stored.
- the tactile sensor 260 may include, or be used with, a pad, mat, stamp, or surface that is depressed onto a surface associated with the structure 205 .
- the tactile sensor 260 may then be used to detect the imprint made on the pad.
- the pad may have an adhesive surface so that any objects on the surface of the structure 205 (such as a shingle) stick to the pad.
- the tactile sensor 260 may then detect the resistive force exerted by the object as the pad is pulled away from the structure 205 .
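- The adhesion check just described amounts to comparing a measured resistive force against an expected minimum. A toy sketch follows; the threshold value and units are invented for illustration and do not come from the disclosure.

```python
def adhesion_ok(peak_resistive_force_n: float, min_expected_n: float = 15.0) -> bool:
    """An object (e.g., a shingle) that releases the adhesive pad with too
    little resistance is likely loose; the 15 N default is illustrative."""
    return peak_resistive_force_n >= min_expected_n

print(adhesion_ok(22.0))   # True: well adhered
print(adhesion_ok(4.0))    # False: flag for closer inspection
```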
- the tactile sensor 260 may use a roller that is run across a surface of the structure 205 , wherein the shapes and features of the surface are temporarily imprinted on the roller and the tactile sensor 260 detects the shapes and features that have been temporarily imprinted on the roller.
- the image sensor 265 receives a signal (“image capture signal”) from the 3D scanner 285 instructing the image sensor 265 to capture reflected light or to capture an image.
- the image sensor 265 receives the image capture signal and the image sensor 265 is exposed to light reflected off of the structure 205 .
- the image sensor 265 generates image data representing at least part of an image of the structure 205 , wherein the image corresponds to the light that the image sensor 265 was exposed to.
- the 3D scanner 285 then uses the image data to generate 3D data.
- the image data may be transmitted over the system bus 250 to the memory 215 where the image data is stored.
- the 3D scanner 285 may also use image data corresponding to multiple previously captured images to generate the 3D data.
- the image sensor 265 may be utilized to capture 2D images.
- the 3D scanner 285 may use the image sensor 265 to capture 2D images in order to supplement the 3D data captured by the 3D scanner 285 .
- the data collection system 201 may use the image sensor 265 to capture 2D images independently of the 3D scanner 285 .
- the 2D image data may be transmitted to the memory 215 to be stored.
- the 2D image data may also be transmitted, via the network interface 230 , to a data analysis module such as the data analysis module 103 , where the 2D image data, or combination 2D-3D image data, may be analyzed to estimate the condition of the structure 205 .
- the image sensor 265 may be used to detect thermal characteristics associated with the structure 205 in addition to visual characteristics associated with the structure 205 (capturing infrared light, for example). Furthermore, in some embodiments the light reflected off of the structure 205 may originate from the light projector 270 , while in other embodiments the light may originate elsewhere. In the former case, the processor 210 or the 3D scanner 285 operates to transmit a command instructing the light projector 270 to generate light. The light projector 270 receives the command to generate light and projects light in the direction of the structure 205 .
- the light may be visible light, such as laser light or ordinary light emitted from an HID lamp; or invisible light, such as infrared light or ultraviolet light.
- the light projector 270 may also be configured to emit radiation in other frequencies of the electromagnetic spectrum (e.g., radio waves, microwaves, terahertz radiation, x-rays, or gamma rays).
- the light projector 270 may emit radio waves.
- the radio waves may reflect off the structure 205 and may be detected by an antenna (not shown) communicatively coupled to the data collection system 201 .
- the light projector and antenna may operate as a radar system, allowing the data collection system 201 to, for example, scan a subsurface associated with the structure 205 .
- the data collection system 201 may scan the subsurface associated with shingles, enabling a data analysis module to determine if the subsurface of the shingles is damaged.
- the audio sensor 275 receives a signal from the 3D scanner 285 instructing the audio sensor 275 to detect audio or sound waves (“audio capture signal”).
- the audio sensor 275 receives the audio capture signal and the audio sensor 275 is exposed to one or more audio signals or sound waves reflected off of the structure 205 .
- the audio sensor 275 generates audio data representing at least part of one of the audio signals that the audio sensor 275 was exposed to.
- the 3D scanner 285 then uses the audio data to generate 3D data. Alternatively, the audio data may then be transmitted over the system bus 250 from the audio sensor 275 to the memory 215 where the audio data is stored.
- the audio signals or sound waves received at the audio sensor 275 may originate from the audio projector 280 , while in other embodiments the audio signals may originate elsewhere.
- the processor 210 operates to transmit a command instructing the audio projector 280 to generate audio.
- the audio projector 280 receives the command to generate audio and emits one or more sound waves or audio signals in the direction of the structure 205 .
- the audio sensor 275 and the audio projector 280 may operate as a sonar system, allowing the data collection system 201 to, for example, scan a subsurface associated with the structure 205 .
- the data collection system 201 may scan the subsurface associated with shingles, enabling a data analysis module to determine if the subsurface of the shingles is damaged.
- the image capture signal, the audio capture signal, or the tactile capture signal may be received from the processor 210 , wherein the respective signal was generated in response to a capture command received by the processor 210 from the peripheral interface 235 , the network interface 230 , or the user input interface 220 .
- the processor 210 may also operate to transmit the image data, audio data, tactile data, or 3D data to the network interface 230 or the peripheral interface 235 to be transmitted to another device or system.
- the data collection system 201 may include a chemical spray device, or may be used in conjunction with a chemical spray device, wherein the chemical spray device sprays a chemical onto a surface of the structure 205 .
- the chemical may then be detected in order to help generate the image data or tactile data.
- the data collection system 201 may include or may be used in conjunction with a chemical detection sensor.
- the presence of the chemical may also be detected using the image sensor 265 .
- a visually distinct or luminescent chemical such as a phosphorescent or fluorescent chemical
- the image sensor 265 may then be used to detect the presence and extent of luminescence on the structure 205 .
- a black light may also be used in conjunction with the process of detecting the chemical.
- the degree of luminescence present on the structure 205 may be used to determine topographical features associated with the structure 205 and may be used by the 3D scanner in generating 3D data. For example, the degree of luminescence may indicate pooling or seeping at certain locations on the surface of the structure. Detecting the luminescent chemical may also reveal run-off or drainage patterns, which may indicate an uneven surface or a dent on the surface.
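- One plausible way to quantify the degree of luminescence from a captured image is to threshold pixel intensity and measure the bright fraction: pooled chemical in a crack shows up as a clustered bright region. A hedged numpy sketch, assuming a grayscale frame in which the fluorescing chemical is the brightest content; real pipelines would also localize the bright region (e.g., connected components).

```python
import numpy as np

def luminescent_fraction(gray_image: np.ndarray, threshold: float = 0.8) -> float:
    """Fraction of pixels brighter than `threshold` after scaling to [0, 1].
    A concentrated bright run can hint at chemical pooling in a crack or
    seam; the 0.8 cutoff is illustrative, not from the disclosure."""
    scaled = gray_image.astype(float) / gray_image.max()
    return float((scaled > threshold).mean())

# Toy frame: a dark roof patch with one bright streak of pooled chemical.
frame = np.full((8, 8), 30.0)
frame[3, 1:7] = 250.0                  # fluorescing line along a crack
print(luminescent_fraction(frame))     # ~0.09: small but concentrated area
```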
- the data collection system 201 may be configured to implement a data analysis method wherein the processor 210 accesses one or more of the image data, the audio data, the tactile data, or the 3D data on the memory 215 for analysis.
- the processor 210 may further operate to estimate the condition of the structure 205 based on said analysis.
- FIG. 3 illustrates a block diagram of a data collection system 301 according to an embodiment of the present disclosure.
- the data collection system 301 is configured to scan the structure 305 .
- the data collection system 301 includes a 3D scanner 385 , a flying device 310 , a base station 320 , an antenna 325 , and a tether 330 .
- the 3D scanner 385 includes an antenna 316 .
- the flying device 310 may be a balloon, airplane, helicopter, projectile, rocket, or any other device capable of flight, levitation, or gliding.
- the 3D scanner 385 is similar to the 3D scanner 285 and may also include one or more of: a tactile sensor similar to the tactile sensor 260 , an image sensor similar to the image sensor 265 , a light projector similar to the light projector 270 , an audio sensor similar to the audio sensor 275 , or an audio projector similar to the audio projector 280 .
- the base station 320 may include one or more of: a processor similar to the processor 210 , a memory similar to the memory 215 , a peripheral interface similar to the peripheral interface 235 , a user input interface similar to the user input interface 220 , or a network interface similar to the network interface 230 .
- the 3D scanner 385 is affixed to the flying device 310 .
- the 3D scanner 385 is tethered to the base station 320 .
- the antenna 316 of the 3D scanner 385 is in communication with the antenna 325 of the base station 320 .
- the flying device 310 is used to position the 3D scanner 385 at an elevation higher than at least part of the structure 305 .
- the tether 330 functions to keep the flying device 310 within the vicinity of the base station 320 by tethering the flying device 310 to the base station 320 .
- the tether 330 may provide power to the flying device 310 .
- the tether may also provide a communication channel between the flying device 310 and the base station 320 (and may replace the antennas 316 and 325 in certain embodiments).
- the 3D scanner 385 collects information associated with the structure 305 .
- the 3D scanner 385 scans the structure 305 and generates 3D data (e.g., 3D data points, a point cloud, or a 3D model). In some embodiments the 3D scanner 385 may collect image information, audio information, or tactile information as discussed with regard to the data collection system 201 . The 3D scanner 385 then uses the antenna 316 to transmit the collected information to the antenna 325 of the base station 320 . The base station 320 then transmits the collected information over a network such as network 102 shown in FIG. 1 b.
- the base station 320 may be affixed to the flying device 310 along with the 3D scanner 385 and the tether 330 may instead tether the data collection system 301 to an anchoring device or apparatus.
- the components of the data collection system 301 may communicate over a system bus such as the system bus 250 discussed with regard to FIG. 2 .
- the flying device 310 may operate to bring the 3D scanner 385 in contact with the structure 305 , or may drop the 3D scanner 385 onto the structure 305 .
- the flying device 310 may operate autonomously.
- the flying device 310 may also be controlled wirelessly by a remote device such as a radio control device.
- the 3D scanner 385 may be free of a connection to the tether 330 .
- the 3D scanner 385 may be held and operated by a person, while in others the 3D scanner 385 may be affixed to a mechanical apparatus located on or near the structure 305 .
- FIG. 4 illustrates a block diagram of a data collection system 401 according to an embodiment of the present disclosure.
- the data collection system 401 includes a 3D scanner 485 , a base station 420 , and a tether 430 .
- the 3D scanner 485 includes an antenna 416 and a roller 417 .
- the base station 420 includes an antenna 425 .
- the 3D scanner 485 may also include one or more of: a tactile sensor similar to the tactile sensor 260 , an image sensor similar to the image sensor 265 , a light projector similar to the light projector 270 , an audio sensor similar to the audio sensor 275 , an audio projector similar to the audio projector 280 , or a 3D scanner similar to the 3D scanner 285 .
- the base station 420 may include one or more of: a processor similar to the processor 210 , a memory similar to the memory 215 , a peripheral interface similar to the peripheral interface 235 , a user input interface similar to the user input interface 220 , or a network interface similar to the network interface 230 .
- the roller 417 of the 3D scanner 485 comes into contact with a surface of the structure 405 .
- the 3D scanner 485 is physically connected to the base station 420 by the tether 430 .
- the antenna 416 of the 3D scanner 485 is in communication with the antenna 425 of the base station 420 .
- the 3D scanner 485 is deployed on a surface associated with the structure 405 .
- the roller 417 comes into contact with the surface and rolls as the 3D scanner 485 moves.
- the roller 417 experiences a temporary imprint as it rolls, reflecting the shapes and features of the surface that it is rolling across.
- Sensors internal or external to the roller (such as the tactile sensor 260 of FIG. 2 ) detect the imprinted texture.
- the 3D scanner 485 generates tactile data representing the imprinted texture.
- the 3D scanner uses the tactile data to generate 3D data and uses the antenna 416 to transmit the 3D data to the antenna 425 of the base station 420 .
- the base station 420 may then transmit the 3D data over a network such as the network 102 shown in FIG. 1 b.
- the 3D scanner 485 may have mechanical feelers for contacting a surface associated with the structure 405 .
- the mechanical feelers may pull on an object associated with the surface (such as shingles on a roof) by gripping the object between opposable feelers in order to detect how strongly adhered to the surface the object is.
- the 3D scanner 485 may deploy a mechanical feeler with an adhesive surface that detects how strongly an object is adhered to the surface by applying the adhesive surface of the mechanical feeler to the object, pulling the mechanical feeler away from the object, and detecting the resistive force associated with the object.
- the 3D scanner 485 may deploy a mechanical feeler to physically manipulate the surface or an object on the surface (by tapping, pulling, or scraping, for example) and use an audio sensor (such as the audio sensor 275 , for example) to detect the audio response to the physical manipulation.
- the audio response may be analyzed (by the data analysis module 103 shown in FIG. 1 b , for example) and used in determining the condition of the structure 405 .
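- The audio response to such manipulation can be compared numerically; a loose or delaminated shingle tends to ring at lower frequencies than a well-adhered one. A minimal sketch that compares spectral centroids of recorded tap waveforms as a crude proxy; the synthetic signals and the interpretation are illustrative, not taken from the disclosure.

```python
import numpy as np

def spectral_centroid(waveform: np.ndarray, sample_rate: float) -> float:
    """Magnitude-weighted mean frequency of the tap response."""
    spectrum = np.abs(np.fft.rfft(waveform))
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / sample_rate)
    return float((freqs * spectrum).sum() / spectrum.sum())

# Toy taps: a well-adhered shingle rings high, a loose one rings low.
rate = 8000.0
t = np.arange(2048) / rate
solid = np.sin(2 * np.pi * 1200 * t) * np.exp(-20 * t)
loose = np.sin(2 * np.pi * 300 * t) * np.exp(-20 * t)
for name, tap in (("solid", solid), ("loose", loose)):
    print(name, round(spectral_centroid(tap, rate)))  # loose centroid is lower
```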
- In some embodiments, either or both of the data collection system 401 and the 3D scanner 485 may be unconnected to the tether 430.
- In another embodiment, the 3D scanner 485 may include a pad or a stamp instead of or in addition to the roller 417.
- The 3D scanner 485 may depress the stamp onto a surface of the structure 405.
- The features and shapes of the surface cause an imprint on the stamp and the sensing device detects the imprint using a tactile sensor such as the tactile sensor 260 shown in FIG. 2.
- As discussed previously with respect to the data collection system 201 shown in FIG. 2, the stamp or pad may also have an adhesive surface causing objects on the surface of the structure 405 to stick to the pad.
- The 3D scanner 485 may then detect the resistive force exerted by an object when the stamp or pad is pulled away from the surface of the structure 405.
- In an alternative embodiment, the entire data collection system 401 may be affixed to or included in the 3D scanner 485.
- In such an embodiment, the tether 430 may instead tether the 3D scanner 485 to an anchoring device or apparatus on or near the ground, the structure 405, or some other point of attachment.
- In a further embodiment, the 3D scanner 485 may be controlled by a device remotely located relative to the 3D scanner 485.
- In particular, the 3D scanner 485 may be wirelessly controlled (e.g., via radio frequency by a radio control device). In other embodiments the 3D scanner 485 may operate autonomously.
- FIG. 5 illustrates a block diagram of a data analysis system 503 according to an embodiment of the present disclosure.
- The data analysis system 503 includes a processor 510, a memory 515, a user input interface 520, a network interface 530, a peripheral interface 535, a video interface 540, and a system bus 550.
- The processor 510, memory 515, user input interface 520, network interface 530, peripheral interface 535, and video interface 540 are each communicatively connected to the system bus 550.
- The memory 515 may be any type of memory similar to the memory 215.
- The processor 510 may be any processor similar to the processor 210.
- The network interface 530 may be any network interface similar to the network interface 230.
- The peripheral interface 535 may be any peripheral interface similar to the peripheral interface 235.
- The user input interface 520 may be any user input interface similar to the user input interface 220.
- The video interface 540 is configured to communicate over the system bus 550 and transmit video signals to a display device such as a monitor.
- The network interface 530 receives 3D data points corresponding to a structure such as the structure 205 shown in FIG. 2.
- The network interface 530 transmits the received data over the system bus 550 to the memory 515.
- The processor 510 accesses the memory 515 to generate a first 3D model of the structure based on the 3D data points, wherein the edges and vertices associated with the model are derived from the 3D data points.
- The processor 510 may then make one or more comparisons between the first 3D model and one or more second models.
- The second models may represent previously received data relating to the same structure, or they may represent previously received data relating to similar structures.
- The second models may have been created specifically for the purpose of estimating the condition of a structure and may not relate to any actual physical structure.
- Based on the one or more comparisons, the processor 510 generates an estimate of the condition of the structure.
- The estimate of the condition of the structure is saved to the memory 515.
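- One plausible way to make such a comparison, sketched below under the assumption that both models are represented as point sets in a shared coordinate system, is to measure how far each scanned point deviates from the nearest point of a second (reference) model; the threshold is a hypothetical value, not one from the disclosure.

```python
import numpy as np
from scipy.spatial import cKDTree

def compare_to_reference(scanned_points, reference_points, damage_threshold=0.02):
    """Estimate condition by nearest-neighbor deviation from a reference model.

    Points are (N, 3) arrays; damage_threshold is an assumed deviation
    (meters) beyond which a point counts as damaged.
    """
    deviations, _ = cKDTree(reference_points).query(scanned_points)
    return {
        "mean_deviation_m": float(np.mean(deviations)),
        "max_deviation_m": float(np.max(deviations)),
        "damaged_fraction": float(np.mean(deviations > damage_threshold)),
    }
```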
- The network interface 530 may receive 2D image data or 2D-3D combination image data and may transmit the data to the memory 515.
- The processor 510 may identify features within the 2D images and/or 2D-3D combination images and may generate the estimate of the condition of the structure in accordance with the identified features.
- The processor 510 may determine, based on the generated estimate, that the structure has been damaged. The processor 510 may then operate to calculate (based on the condition of the structure and data relating to costs such as cost of supplies, materials, components and labor) an estimated financial cost associated with the damage. The estimated financial cost is then saved to the memory 515.
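- A minimal sketch of that cost calculation might look like the following, where the unit costs are placeholders standing in for the supply, material, component, and labor data mentioned above.

```python
def estimate_repair_cost(damaged_fraction, surface_area_m2,
                         material_cost_per_m2=35.0, labor_cost_per_m2=50.0):
    """Rough financial cost for the damaged portion of a surface.
    Unit costs are illustrative assumptions, not values from the disclosure.
    """
    damaged_area = damaged_fraction * surface_area_m2
    return damaged_area * (material_cost_per_m2 + labor_cost_per_m2)

# e.g., 15% of a 180 m^2 roof damaged -> 27 m^2 * 85.0 = 2295.0
print(estimate_repair_cost(0.15, 180.0))
```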
- The video interface 540 may be used to display: the first 3D model, any of the one or more second models, the estimate of the condition of the structure, or the estimated financial cost.
- The received data may also represent images, videos, sounds, thermal maps, pressure maps, or topographical maps, any of which may be displayed via the video interface 540.
- The received data may then be used to generate a 3D model.
- The received data may be compared to reference images, videos, sounds, thermal maps, pressure maps, or topographical maps to estimate the condition of the structure.
- FIG. 6 illustrates a flow chart of an example method 600 for inspecting and analyzing the condition of a structure.
- The method 600 may be implemented, in whole or in part, on one or more devices or systems such as those shown in the property inspection system 100 of FIG. 1, the data collection system 201 of FIG. 2, the data collection system 301 of FIG. 3, the data collection system 401 of FIG. 4, or the data analysis system 503 of FIG. 5.
- The method may be saved as a set of instructions, routines, programs, or modules on memory such as the memory 215 of FIG. 2 or the memory 515 of FIG. 5, and may be executed by a processor such as the processor 210 of FIG. 2 or the processor 510 of FIG. 5.
- The method 600 begins when a 3D scanner scans a structure, such as the structure 205 shown in FIG. 2, the structure 305 shown in FIG. 3, or the structure 405 shown in FIG. 4, and detects a point on the surface of the structure (block 605).
- The structure may be any kind of building or structure.
- The structure may be, for example, a single-family home, townhome, condominium, apartment, storefront, or retail space, and the structure may be owned, leased, possessed, or occupied by an insurance policy holder.
- The structure may also be any of the structure types discussed regarding FIG. 1, such as a vehicle, boat, or aircraft.
- The 3D scanner may be used to inspect the body panels, windows, frame, and other surfaces associated with the vehicle, boat, or aircraft.
- The 3D scanner identifies a coordinate set corresponding to each detected point on the surface of the structure (block 610).
- The coordinate set relates to vertical, horizontal, and depth distance measurements relative to the 3D scanner that detected the point.
- The 3D scanner then generates a 3D data point, corresponding to the detected point on the surface of the structure, that includes the corresponding coordinate data (block 615).
- The 3D data point may then be saved to memory.
- A decision is made thereafter to either stop scanning the structure or continue scanning the structure (block 620). If there is more surface area or more surface points to be scanned, the 3D scanner continues scanning the structure. Otherwise, the method 600 continues to block 625.
- The method 600 activates the 3D scanner, or a processor such as the processor 210 of FIG. 2 or the processor 510 of FIG. 5, to normalize the coordinate data for all of the generated 3D data points so that the 3D data points share a common coordinate system (block 625).
- The normalized 3D data points may then be saved to memory.
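- Assuming each scanner's pose in a shared frame is known, normalization can be a rigid transform per scanner, as in this illustrative sketch (names are hypothetical):

```python
import numpy as np

def normalize_points(points, rotation, translation):
    """Map scanner-relative 3D data points into a common coordinate system.

    rotation:    3x3 rotation matrix giving the scanner's orientation.
    translation: 3-vector giving the scanner's position in the common frame.
    """
    return points @ rotation.T + translation

# Hypothetical example: a scanner yawed 90 degrees and offset 2 m along x.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
print(normalize_points(np.array([[1.0, 0.0, 0.0]]), R, np.array([2.0, 0.0, 0.0])))
# -> approximately [[2.0, 1.0, 0.0]]
```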
- The 3D scanner, or a processor, operates to build a point cloud from the 3D data points (block 630). This may be done by sampling or filtering the 3D data points. Alternatively, all of the 3D data points may be used. In any event, the point cloud may then be saved to memory.
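- One common way to sample the 3D data points when building the point cloud, shown here purely as an illustrative sketch, is voxel-grid filtering: keep one point per cube of a chosen size so that dense regions do not dominate the cloud.

```python
import numpy as np

def build_point_cloud(points, voxel_size=0.01):
    """Downsample (N, 3) points by keeping one point per voxel_size cube.
    The 1 cm default voxel size is an assumption for illustration.
    """
    voxel_ids = np.floor(points / voxel_size).astype(np.int64)
    # unique over rows returns the first point index seen for each voxel
    _, keep = np.unique(voxel_ids, axis=0, return_index=True)
    return points[np.sort(keep)]
```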
- The 3D scanner or processor operates to construct a 3D model from the point cloud (block 635).
- The edges and vertices associated with the model are derived from the points in the point cloud.
- Any of a number of surface reconstruction algorithms may be used to generate the surface of the model. In certain embodiments the surface reconstruction may be skipped altogether and the raw point cloud may be subsequently used instead of the constructed 3D model.
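- As a sketch of one such algorithm, and only under the simplifying assumption that the scanned surface is roughly 2.5D (such as a roof seen from above), the points can be projected to the x-y plane and triangulated there:

```python
from scipy.spatial import Delaunay

def reconstruct_surface(points):
    """Triangulate a mostly-2.5D point cloud into a mesh.

    points: (N, 3) NumPy array. A general scanner would need a full 3D
    surface reconstruction algorithm; planar Delaunay triangulation is an
    illustrative stand-in for flat-ish surfaces.
    """
    triangulation = Delaunay(points[:, :2])  # triangulate in the x-y plane
    return points, triangulation.simplices   # vertices and (M, 3) faces
```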
- A processor such as the processor 210 of FIG. 2 or the processor 510 of FIG. 5 operates to analyze the 3D model (or point cloud) to estimate a condition of the structure (block 640). In some embodiments, this may include comparing the model to other models, wherein the other models relate to previously collected data corresponding to the same structure, or previously collected data corresponding to other structures. In the alternative, the other models may only exist for the purpose of analysis or estimation and may not correlate to any real structure.
- Based on the estimated condition of the structure, a processor operates to calculate a financial cost estimate corresponding to any damage to the structure (block 645).
- The financial cost estimate may correspond to the estimated cost for materials, labor, and other resources required to repair or refurbish the structure.
- After calculating a financial cost estimate, a processor operates to determine a claim assessment (block 650). The claim assessment may then be saved to memory. In some embodiments the claim assessment may be sent to a third party associated with the structure, such as a client holding an insurance policy on the structure. In other embodiments the claim assessment may be sent to an insurance agent for evaluation.
- FIG. 7 illustrates a flow chart of an exemplary method 700 for detecting a point on a surface using a 3D scanner.
- The method may be implemented by a 3D scanner, such as the 3D scanner 285 of FIG. 2 or the 3D scanner 385 of FIG. 3.
- The method 700 begins when a light source is deployed oriented toward a structure such as the structure 105, 205, 305, or 405 of FIG. 1, 2, 3, or 4, respectively (block 705).
- The light source may be a part of the 3D scanner, or it may be a separate device used in conjunction with the 3D scanner.
- The light source may be any type of light source, but in the preferred embodiment the light source is a laser that projects a dot or line. In other embodiments the light source may be a white light source that projects a pattern onto an object.
- A photosensor or image sensing device, such as the image sensor 265 of FIG. 2, is then deployed oriented toward the structure (block 710).
- The image sensing device may be part of the 3D scanner, or it may be a separate device used in conjunction with the 3D scanner.
- The image sensing device is capable of detecting and processing laser light. After the image sensing device has been deployed, the distance between the light source and the image sensing device is determined (block 715).
- The light source projects light onto a surface of the structure (block 720) and the image sensing device detects light reflected off of the surface of the structure (block 725).
- A first and second angle are determined (block 730 and block 735, respectively).
- The first angle includes the light source as an end point, the projected light beam or laser as a first side, and a line extending to the image sensing device as the second side of the angle.
- The second angle includes the image sensing device as an end point, the received light beam or laser as a first side, and a line extending to the light source as a second side of the angle.
- The position (including depth) of the surface reflecting the light is determined (block 740) using the distance discussed in relation to block 715, the first angle discussed in relation to block 730, and the second angle discussed in relation to block 735.
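- Blocks 715 through 740 describe a classic triangulation: the baseline distance and the two angles fix a triangle whose apex is the surface point. A minimal sketch using the law of sines (with the light source at the origin and the image sensing device at (baseline, 0)) follows; the names are illustrative.

```python
import math

def triangulate_point(baseline, angle_source, angle_sensor):
    """Locate the reflecting surface point from the block 715 distance and
    the block 730 / block 735 angles (in radians), via the law of sines.
    """
    angle_point = math.pi - angle_source - angle_sensor  # angle at the point
    # The side from the light source to the point is opposite the sensor's angle.
    range_from_source = baseline * math.sin(angle_sensor) / math.sin(angle_point)
    x = range_from_source * math.cos(angle_source)       # along the baseline
    depth = range_from_source * math.sin(angle_source)   # away from the baseline
    return x, depth

# 1 m baseline with 60-degree angles on both ends (an equilateral triangle):
print(triangulate_point(1.0, math.radians(60), math.radians(60)))
# -> approximately (0.5, 0.866)
```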
- The position of the surface reflecting the light is saved to memory as coordinate data included in a 3D data point (block 745).
- The coordinate data may be relative to the 3D scanner, or it may be normalized so that it is consistent with other saved 3D data points.
- The light source is adjusted so that the light is projected onto a different area on the surface of the property (block 750).
- A decision is then made to either continue scanning or stop scanning (block 755). If more of the structure needs to be scanned, the method returns to block 725, where the light from the adjusted light source is reflected off of the surface of the structure and detected. If the structure has been sufficiently scanned, the 3D scanner or a processor can begin the process of building a 3D model of the structure using the 3D data points.
- Modules may constitute either software modules (e.g., code implemented on a tangible, non-transitory machine-readable medium such as RAM, ROM, flash memory of a computer, hard disk drive, optical disk drive, tape drive, etc.) or hardware modules (e.g., an integrated circuit, an application-specific integrated circuit (ASIC), a field programmable logic array (FPLA)/field-programmable gate array (FPGA), etc.).
- A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example implementations, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- Any reference to "one implementation," "one embodiment," "an implementation," or "an embodiment" means that a particular element, feature, structure, or characteristic described in connection with the implementation is included in at least one implementation.
- The appearances of the phrase "in one implementation" or "in one embodiment" in various places in the specification are not necessarily all referring to the same implementation.
- Some implementations may be described using the expression "coupled" along with its derivatives. For example, some implementations may be described using the term "coupled" to indicate that two or more elements are in direct physical or electrical contact.
- The term "coupled," however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The implementations are not limited in this context.
- The terms "comprises," "comprising," "includes," "including," "has," "having" or any other variation thereof, are intended to cover a non-exclusive inclusion.
- A process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- The term "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Abstract
In a computer-implemented method and system for capturing the condition of a structure, the structure is scanned with a three-dimensional (3D) scanner. The 3D scanner generates 3D data. A point cloud or 3D model is constructed from the 3D data. The point cloud or 3D model is then analyzed to determine the condition of the structure.
Description
- This application is a continuation of (i) U.S. application Ser. No. 15/134,273, filed Apr. 20, 2016 and titled, “Tethered 3D Scanner,” which is a continuation of (ii) U.S. application Ser. No. 14/631,568, filed Feb. 25, 2015 and titled, “Laser-Based Methods and Systems for Capturing the Condition of A Physical Structure,” which is a continuation of (iii) U.S. application Ser. No. 14/496,802, filed on Sep. 25, 2014, and titled “Methods and Systems for Capturing the Condition of A Physical Structure via Detection of Electromagnetic Radiation,” which is a continuation of and claims priority to (iv) U.S. application Ser. No. 13/836,695, filed on Mar. 15, 2013, and titled “Methods and Systems for Capturing the Condition of A Physical Structure,” the entire disclosures of which are hereby expressly incorporated by reference.
- This disclosure relates to 3D modeling, and in particular, to estimating the condition of a structure using 3D modeling.
- The present disclosure generally relates to a system or method for inspecting a structure to estimate the condition of the structure. After an accident or loss, property owners typically file claims with their insurance companies. In response to these claims, the insurance companies assign an appraiser to investigate the claims to determine the extent of damage and/or loss, document the damage, and provide their clients with appropriate compensation.
- Determining and documenting the extent of damage can be risky for the appraiser. For example, in a situation where a structure has experienced roof damage, appraisers typically climb onto the roof to evaluate the damage. Once on the roof they may sketch the damaged area of the roof in order to document the damage. In the alternative, appraisers might take a digital picture of the damaged area. In either scenario, the appraiser has exposed himself to a risk of falling. Afterwards, the picture is typically attached to an electronic claim file for future reference where it can be analyzed by an appraiser to estimate the extent of damage to the structure.
- The process for determining and documenting the extent of the damage can be inefficient and time consuming. In addition to the time required to drive to and from the incident site and to perform the inspection itself, significant paperwork and calculations may be involved in calculating compensation owed to the clients. For example, if an insurance appraiser takes photos on the roof of a client's building to assess a claim for roof damage from a hurricane, in order to calculate how much money should be paid to the client, the appraiser may have to come back to his office, research the client's property, research the cost of the damaged property and research repair costs. All of these steps are time consuming and both delay payment to the client and prevent the appraiser from assessing other client claims.
- In situations where the insurance company has received a large number of claims in a short time period (e.g., when a town is affected by a hurricane, tornado, or other natural disaster), an insurance appraiser may not have time to perform timely claim investigations of all the received claims. If claim investigations are not performed quickly, property owners may not receive recovery for their losses for long periods of time. Additionally, long time delays when performing claim investigations can lead to inaccurate investigation results (e.g., the delay may lead to increased opportunity for fraud and/or may make it more difficult to ascertain the extent of damage at the time of the accident or loss).
- Finally, two-dimensional digital pictures or video of a roof or structure often provide inadequate detail for thorough inspection of a structure. Poor image quality resulting from camera movement or out-of-focus images can make it difficult to estimate the condition of a property based on an image. Even where image quality is adequate, poor angles or bad lighting may hide or exaggerate details important to estimating the condition of the structure, leading to inaccurate assessments of the structure's condition.
- A system and method for inspecting a structure and estimating the condition of the structure includes deploying one or more 3D scanners to scan a structure and generating, at the one or more 3D scanners, a plurality of 3D data points corresponding to points on the surface of the structure. The method further includes identifying coordinate sets, at the one or more 3D scanners, associated with each of the generated plurality of 3D data points. The method also includes storing a point cloud, comprising one or more of the generated plurality of 3D data points, to a memory. The method further includes causing a processor to construct a 3D model from the point cloud and storing the 3D model to the memory. Then, the processor analyzes the 3D model to identify features associated with the structure. The processor finally generates an estimate of the condition of the structure based on the identified features before storing the estimate to memory. In some embodiments the estimate of the condition of the structure may be used to calculate a financial cost estimate (representing, for example, a loss in value or a cost to repair damage).
- The 3D scanners may be contact 3D scanners (detecting 3D information via physical contact with a structure) or non-contact 3D scanners (detecting 3D information via light or sound, for example, reflected off of the structure). In some embodiments, the contact 3D scanner detects 3D information by using a tactile sensor to detect an imprint left on a pad that was stamped on the surface or a roller that was rolled across the surface. In other embodiments, the contact scanner detects 3D information by pulling, tapping or scraping objects on the structure (such as roof shingles). In some instances the 3D scanner utilizes an audio sensor to listen for an audio response to the tapping.
- The non-contact 3D scanners may detect sound or electromagnetic radiation (including white light, laser light, infrared light, ultraviolet light) to generate the 3D data points. The 3D scanner may identify coordinate sets associated with the 3D data points by detecting a projected light pattern or laser using triangulation methods or time-of-flight methods (timing how long it takes for a light to reflect off of a surface). The 3D scanners may also generate 3D data points by detecting a chemical sprayed onto the structure (wherein the chemical may pool in cracks or crevices, for example).
- The 3D scanners may be physically connected to (or may themselves be) stationary devices, flying devices, hovering devices, crawling devices or rolling devices. The 3D scanners may also be physically connected to (or may themselves be) a wirelessly controlled device or an autonomously controlled device.
- In some instances, the processor that analyzes the 3D model to identify features associated with the structure is located in a data analysis system remotely located relative to the 3D scanners. In other instances, the processor that analyzes the 3D model may be in a system in close proximity to the 3D scanners.
- FIG. 1a illustrates a block diagram of a property inspection system according to an embodiment of the present disclosure.
- FIG. 1b illustrates a block diagram of a property inspection system according to a further embodiment of the present disclosure.
- FIG. 2 illustrates a block diagram of a data collection system according to an embodiment of the present disclosure.
- FIG. 3 illustrates a block diagram of a data collection system according to an embodiment of the present disclosure.
- FIG. 4 illustrates a block diagram of a data collection system according to an embodiment of the present disclosure.
- FIG. 5 illustrates a block diagram of a data analysis system according to an embodiment of the present disclosure.
- FIG. 6 illustrates a flow chart of an example method for inspecting and analyzing the condition of a structure.
- FIG. 7 illustrates a flow chart of an exemplary method for detecting a point on a surface using a 3D scanner.
- FIG. 1a illustrates a block diagram of a property inspection system 106 according to an exemplary embodiment. The property inspection system 106 is configured to inspect the structure 105. The structure 105 may be any type of construction or object. In certain embodiments, the structure 105 may be a building, which may be residential, commercial, industrial, agricultural, educational, or of any other nature. In other embodiments the structure 105 may be personal property such as a vehicle, boat, aircraft, furniture, etc. The property inspection system 106 may include a number of modules, devices, systems, sub-systems, or routines. For example, the property inspection system 106 includes a 3D scanning system or 3D scanner for generating 3D data, and may include a number of other sensing devices. In some embodiments, the property inspection system 106 includes a data collection module or system (for scanning or collecting the structure 105) and a data analysis module or system (for analyzing the scanned or collected data). The property inspection system 106 may be utilized in a number of situations, but in the preferred embodiment, a user associated with an insurance company utilizes the property inspection system 106 for the purpose of estimating the condition of the structure 105. In one embodiment, an insurance policy-holder may file a claim because the policy-holder believes that the structure 105 is damaged. A user (e.g., an insurance company or claim adjuster) may then deploy the property inspection system 106 to inspect the structure 105 and estimate the condition of the structure 105. In other embodiments, the user may be an appraiser appraising the structure 105 or an inspector inspecting the structure 105.
- In operation, the property inspection system 106 inspects the structure 105 by scanning the structure 105 to detect information related to the structure 105. The information may relate to any kind of audio, visual, tactile or thermal features associated with the structure 105. The property inspection system 106 uses the detected information to generate data representative of one or more features associated with the structure 105. For example, and as further described below, the property inspection system 106 may scan the structure 105 and generate a full-color 3D model of the structure 105. The property inspection system 106 then analyzes the data to estimate the condition of the structure 105. Based on the estimated condition of the structure, the property inspection system 106 may also determine that the structure 105 is damaged and may then automatically calculate a financial cost associated with the damage. For example, the property inspection system 106 may determine that the roof of the structure 105 is damaged and then calculate how much it will cost to fix the roof. With regard to a vehicle, boat, or aircraft, the property inspection system 106 may determine that a body panel, window, frame, or another surface associated with the vehicle, boat, or aircraft is damaged. The property inspection system 106 may calculate a cost to fix the body panel, window, frame, or other surface.
- FIG. 1b illustrates a block diagram of a property inspection system 100 according to a further embodiment of the present disclosure. The property inspection system 100 includes a data collection module 101, a network 102, and a data analysis module 103. In the property inspection system 100, the data collection module 101 and the data analysis module 103 are each communicatively connected to the network 102. In alternative embodiments of the property inspection system 100, the data collection module 101 may be in direct wired or wireless communication with the data analysis module 103. Furthermore, in some embodiments the data collection module 101 and the data analysis module 103 may exist on a single device or platform and may share components, hardware, equipment, or any other resources. The network 102 may be a single network, or may include multiple networks of one or more types (e.g., a public switched telephone network (PSTN), a cellular telephone network, a wireless local area network (WLAN), the Internet, etc.).
- In operation of the property inspection system 100, the data collection module 101 scans a structure (such as structure 105) and generates data representing the scanned information. In certain embodiments, the data collection module is operable on a 3D scanning system such as the data collection system 201 shown in FIG. 2. The generated data may represent a point cloud or 3D model of the scanned structure. The data collection module 101 transmits the generated data over the network 102. The data analysis module 103 receives the generated data from the network 102, where the data analysis module 103 operates to estimate the condition of the structure by analyzing the generated data. In some embodiments, estimating the condition of the structure may include comparing the generated data to reference data. The reference data may be any type of data that can provide a point of comparison for estimating the condition of the structure. For example, the reference data may represent an image, model, or any previously collected or generated data relating to the same or a similar structure. The reference data may also represent stock images or models unrelated to the scanned structure. Furthermore, the data analysis module 103 may use the estimate of the condition of the structure to determine that the structure is damaged, and then may calculate an estimated cost correlated to the extent of the damage to the structure.
- In some embodiments of the property inspection system 100, the data collection module 101 wirelessly transmits, and the data analysis module 103 wirelessly receives, the generated data. While in the preferred embodiment the generated data represents a point cloud or 3D model of the scanned structure, the generated data may also correspond to any visual (2D or 3D), acoustic, thermal, or tactile characteristics of the scanned structure. The data collection module 101 may use one or more 3D scanners, image sensors, video recorders, light projectors, audio sensors, audio projectors, chemical sprays, chemical sensors, thermal sensors, or tactile sensors to scan the structure and generate the data. In some embodiments the network 102 may include one or more devices such as computers, servers, routers, modems, switches, hubs, or any other networking equipment.
- In further embodiments of the property inspection system 100, the data collection module 101 may be handled or operated by a person. The data collection module 101 may also be affixed to a locally or remotely controlled device. The data collection module 101 may also be affixed to a device that crawls or rolls along a surface; or a flying device, such as an unmanned aerial vehicle ("UAV"), airplane or helicopter. In some embodiments, the helicopter may be a multicopter with two or more rotors. The data collection module 101 may also be affixed to a projectile, balloon or satellite.
- FIG. 2 illustrates a block diagram of a data collection system 201 according to an embodiment of the present disclosure. The data collection system 201 is used to scan the structure 205. The structure 205 may be any of the aforementioned structure types, such as a building, boat, vehicle, or aircraft. The data collection system 201 includes a processor 210, a memory 215, a user input interface 220, a network interface 230, a peripheral interface 235, a system bus 250, and a 3D scanner 285. The 3D scanner 285 includes a tactile sensor 260, an image sensor 265, a light projector 270, an audio sensor 275, and an audio projector 280. In alternative embodiments, the 3D scanner 285 of the data collection system 201 may include only one of, or some subset of: the tactile sensor 260, the image sensor 265, the light projector 270, the audio sensor 275, and the audio projector 280. Some embodiments may also have multiple tactile sensors, multiple image sensors, multiple light projectors, multiple audio sensors, or multiple audio projectors.
- In certain embodiments of the memory 215 of the data collection system 201, the memory 215 may include volatile and/or non-volatile memory and may be removable or non-removable memory. For example, the memory 215 may include computer storage media in the form of random access memory (RAM), read only memory (ROM), EEPROM, FLASH memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information. The network interface 230 may include an antenna, a port for wired connection, or both.
- In some embodiments of the peripheral interface 235 of the data collection system 201, the peripheral interface 235 may be a serial interface such as a Universal Serial Bus (USB) interface. In other embodiments the peripheral interface 235 may be a wireless interface for establishing wireless connection with another device. For example, in some embodiments the peripheral interface 235 may be a short range wireless interface compliant with standards such as Bluetooth (operating in the 2400-2480 MHz frequency band) or Near Field Communication (operating in the 13.56 MHz frequency band).
- In the preferred embodiments of the 3D scanner 285 of the data collection system 201, the 3D scanner 285 is a non-contact 3D scanner, which may be active (where the 3D scanner 285 emits radiation and detects the reflection of the radiation off of an object) or passive (where the 3D scanner 285 detects radiation that it did not emit). In other embodiments the 3D scanner 285 may be a contact 3D scanner that scans an object by coming into physical contact with the object. The 3D scanner may be a time-of-flight 3D scanner, a triangulation 3D scanner, a conoscopic 3D scanner, a volumetric 3D scanner, a structured light 3D scanner, or a modulated light 3D scanner. The 3D scanner may use light detection and ranging (LIDAR), light field, stereoscopic, multi-camera, laser scanning, ultrasonic, x-ray, distance range system (laser or acoustic) technology, or some combination thereof. In typical embodiments, the 3D scanner 285 includes a controller, microcontroller or processor for controlling the 3D scanner 285 and included components. Furthermore, in certain embodiments the 3D scanner includes internal memory.
- In some embodiments of the 3D scanner 285 of the data collection system 201, the image sensor 265 may include any of a number of photosensor, photodiode, photomultiplier, or image sensor types, including charge-coupled-devices (CCD), complementary metal-oxide-semiconductors (CMOS), or some combination thereof. In some instances the image sensor 265 may be a single-camera setup. In other instances, the image sensor 265 may be a multi-camera setup. The light projector 270 may include one or more light sources and may project light in the frequency of either visible or invisible light (including infrared light and ultraviolet light). The light projector 270 may also project directional light such as a laser light. The light projector 270 may include, but is not limited to, LED, incandescent, fluorescent, high intensity discharge lamp, or laser light sources. The audio sensor may include any of a number of audio sensor or microphone types. For example, the audio sensor may include one or more condenser microphones, dynamic microphones, piezoelectric microphones, fiber optic microphones, laser microphones, or MEMS microphones.
- The data collection system 201 may be held and operated by a person. The data collection system 201 may also be affixed to a remotely controlled device, such as a radio controlled device; a flying device; a device that rolls, drives, crawls, or climbs; a mechanical apparatus affixed to or near the structure 205; or a satellite. The processor 210, the memory 215, the user input interface 220, the network interface 230, the peripheral interface 235, and the 3D scanner 285 are each communicatively connected to the system bus 250. In the preferred embodiment, the tactile sensor 260, the image sensor 265, the light projector 270, the audio sensor 275, and the audio projector 280 are also communicatively connected to the system bus 250. In certain embodiments, the tactile sensor 260, the image sensor 265, the light projector 270, the audio sensor 275, and the audio projector 280 communicate over a bus internal to the 3D scanner and are controlled by the 3D scanner.
- In some embodiments of the data collection system 201, all or some of the elements in the data collection system 201 may be in contact with or close proximity to the structure 205. In other embodiments of the invention, all or some of the aforementioned elements may be remotely located in relation to the structure 205 (for example, and as discussed later, the data collection system 201 may be affixed, in whole or in part, to a satellite in orbit). The processor 210 is configured to fetch and execute instructions stored in the memory 215. The memory 215 is configured to store data such as operating system data or program data. The user input interface 220 is configured to receive user input and to transmit data representing the user input over the system bus 250. The peripheral interface 235 is configured to communicatively connect to a peripheral device such as a computer. The network interface 230 is configured to communicatively connect to a network, such as the network 102 shown in FIG. 1b, and wirelessly receive or transmit data using the network. In alternative embodiments, the network interface 230 may receive and transmit data using a wired connection, such as Ethernet.
- The 3D scanner 285 is configured to receive control commands over the system bus 250 and scan an object such as the structure 205 to detect 3D characteristics of the scanned object. The 3D scanner 285 is further configured to transmit data representing a 3D data point, a point cloud or a 3D model ("3D data") relating to the scanned object over the system bus 250. The 3D scanner is further configured to use any of the tactile sensor 260, the image sensor 265, the light projector 270, the audio sensor 275, or the audio projector 280 to generate and transmit the 3D data. The tactile sensor 260 is configured to capture sensory information associated with a surface of the structure 205 ("tactile data"), such as shapes and features or topography of the surface, and transmit the tactile data over the system bus 250. The image sensor 265 is configured to capture an image of the structure 205 and transmit data representing the image ("image data") over the system bus 250. In certain embodiments, the image sensor may receive visible light, invisible light (such as infrared or ultraviolet), or radiation in other parts of the electromagnetic spectrum (radio waves, microwaves, x-rays, gamma rays, etc.). In some embodiments, for example, subsurface features may be detected using radar. The transmitted image data may represent a thermal, color, infrared, or panchromatic image. The light projector 270 is configured to receive control commands over the system bus 250 from the 3D scanner 285 or the processor 210, and is further configured to project light in the direction of the structure 205. The audio sensor 275 is configured to receive an audio signal or sound waves reflected off of the structure 205 and transmit data representing the audio signal ("audio data") over the system bus 250. The audio projector 280 is configured to receive control commands over the system bus 250 or from the 3D scanner 285 and project a sound or audio signal in the direction of the structure 205.
- In operation of the 3D scanner 285 of the data collection system 201, the network interface 230 receives data representing a command to collect 3D information associated with the structure 205 ("3D capture command"). The network interface 230 transmits the 3D capture command over the system bus 250 to the processor 210, where the 3D capture command data is received. The processor 210 then transmits, over the system bus 250, a signal ("3D capture signal") instructing the 3D scanner 285 to detect 3D characteristics associated with an object. The 3D scanner 285 scans the structure 205 and generates data representing 3D characteristics of the structure 205 ("3D data") corresponding to the collected 3D information. More particularly, in one embodiment the 3D scanner 285 projects a light pattern onto the structure 205. The 3D scanner 285 then records the structure 205 and the projected light pattern. The 3D scanner 285 may then alter the projected light pattern or the area of the structure 205 on which the light pattern is projected. The 3D scanner 285 then records, for a second time, the structure 205 and projected light pattern. This process may be continuously repeated until a sufficient portion of the structure 205 has been scanned.
- In further operation of the 3D scanner 285, the 3D scanner 285 analyzes the deformations associated with each of the recorded light patterns to identify coordinate sets associated with the structure 205. Each coordinate set includes vertical, horizontal, and depth distance measurements (relative to the 3D scanner 285) of a particular point on the surface of the structure 205. The 3D scanner 285 generates 3D data points representing each of the coordinate sets associated with the scanned points on the surface of the structure 205. In some embodiments (particularly in embodiments where the 3D scanner moves or uses sensors in multiple locations or positions), the 3D scanner 285 may normalize the coordinates for all of the collected 3D data points so that the 3D data points share a common coordinate system. In alternative embodiments, the coordinates may be normalized by a processor external to the 3D scanner 285. In any event, the 3D scanner 285 then stores a point cloud, constructed from the 3D data points, to the memory 215. The processor 210 operates to transmit the 3D data (i.e., the point cloud) to the network interface 230, where the 3D data is transmitted over a network such as the network 102 shown in FIG. 1b. In certain embodiments, the 3D data may represent a 3D model that was constructed by the processor 210 or the 3D scanner 285.
- In alternative embodiments of the 3D scanner 285, the 3D scanner may be a time-of-flight 3D scanner where the round trip time of a laser is identified in order to identify the distance to a particular point on the structure 205. The 3D scanner 285 may also be any type of triangulation 3D scanner that uses ordinary light or laser light. Furthermore, in some embodiments the 3D scanner 285 may use any one of or a combination of the tactile sensor 260, the image sensor 265, the light projector 270, the audio sensor 275, or the audio projector 280 in generating the 3D data.
- In operation of the tactile sensor 260 of the 3D scanner 285, the tactile sensor 260 receives a signal from the 3D scanner 285 instructing the tactile sensor 260 to detect topographical features associated with a surface ("tactile capture signal"). The tactile sensor 260 receives the tactile capture signal and the tactile sensor 260 is exposed to a surface associated with the structure 205. The tactile sensor 260 generates tactile data representing at least some of the shapes and features of the surface that the tactile sensor 260 was exposed to. The 3D scanner 285 then uses the tactile data to generate 3D data. Alternatively, the tactile sensor 260 may transmit the tactile data over the system bus 250 to the memory 215 where the tactile data is stored.
- In some embodiments of the tactile sensor 260 of the data collection system 201, the tactile sensor 260 may include, or be used with, a pad, mat, stamp, or surface that is depressed onto a surface associated with the structure 205. The tactile sensor 260 may then be used to detect the imprint made on the pad. Furthermore, the pad may have an adhesive surface so that any objects on the surface of the structure 205 (such as a shingle) stick to the pad. The tactile sensor 260 may then detect the resistive force exerted by the object as the pad is pulled away from the structure 205. In further embodiments, the tactile sensor 260 may use a roller that is run across a surface of the structure 205, wherein the shapes and features of the surface are temporarily imprinted on the roller and the tactile sensor 260 detects the shapes and features that have been temporarily imprinted on the roller.
- In operation of the image sensor 265 of the 3D scanner 285, the image sensor 265 receives a signal ("image capture signal") from the 3D scanner 285 instructing the image sensor 265 to capture reflected light or to capture an image. The image sensor 265 receives the image capture signal and the image sensor 265 is exposed to light reflected off of the structure 205. The image sensor 265 generates image data representing at least part of an image of the structure 205, wherein the image corresponds to the light that the image sensor 265 was exposed to. The 3D scanner 285 then uses the image data to generate 3D data. Alternatively, the image data may be transmitted over the system bus 250 to the memory 215 where the image data is stored. Furthermore, the 3D scanner 285 may also use image data corresponding to multiple previously captured images to generate the 3D data.
- In some embodiments, the image sensor 265 may be utilized to capture 2D images. In some embodiments the 3D scanner 285 may use the image sensor 265 to capture 2D images in order to supplement the 3D data captured by the 3D scanner 285. In other embodiments, the data collection system 201 may use the image sensor 265 to capture 2D images independently of the 3D scanner 285. The 2D image data may be transmitted to the memory 215 to be stored. The 2D image data may also be transmitted, via the network interface 230, to a data analysis module such as the data analysis module 103, where the 2D image data, or combination 2D-3D image data, may be analyzed to estimate the condition of the structure 205.
- In some embodiments of the image sensor 265, the image sensor 265 may be used to detect thermal characteristics associated with the structure 205 in addition to visual characteristics associated with the structure 205 (capturing infrared light, for example). Furthermore, in some embodiments the light reflected off of the structure 205 may originate from the light projector 270, while in other embodiments the light may originate elsewhere. In the former case, the processor 210 or the 3D scanner 285 operates to transmit a command instructing the light projector 270 to generate light. The light projector 270 receives the command to generate light and projects light in the direction of the structure 205. The light may be visible light, such as laser light or ordinary light emitted from an HID lamp; or invisible light, such as infrared light or ultraviolet light. In certain embodiments, the light projector 270 may also be configured to emit radiation in other frequencies of the electromagnetic spectrum (e.g., radio waves, microwaves, terahertz radiation, x-rays, or gamma rays). For example, the light projector 270 may emit radio waves. The radio waves may reflect off the structure 205 and may be detected by an antenna (not shown) communicatively coupled to the data collection system 201. In such an embodiment, the light projector and antenna may operate as a radar system, allowing the data collection system 201 to, for example, scan a subsurface associated with the structure 205. In one embodiment, for example, the data collection system 201 may scan the subsurface associated with shingles, enabling a data analysis module to determine if the subsurface of the shingles is damaged.
- In operation of the audio sensor 275 of the 3D scanner 285, the audio sensor 275 receives a signal from the 3D scanner 285 instructing the audio sensor 275 to detect audio or sound waves ("audio capture signal"). The audio sensor 275 receives the audio capture signal and the audio sensor 275 is exposed to one or more audio signals or sound waves reflected off of the structure 205. The audio sensor 275 generates audio data representing at least part of one of the audio signals that the audio sensor 275 was exposed to. The 3D scanner 285 then uses the audio data to generate 3D data. Alternatively, the audio data may then be transmitted over the system bus 250 from the audio sensor 275 to the memory 215 where the audio data is stored.
- In some embodiments of the audio sensor 275 of the data collection system 201, the audio signals or sound waves received at the audio sensor 275 may originate from the audio projector 280, while in other embodiments the audio signals may originate elsewhere. In the former case, the processor 210 operates to transmit a command instructing the audio projector 280 to generate audio. The audio projector 280 receives the command to generate audio and emits one or more sound waves or audio signals in the direction of the structure 205. In certain embodiments the audio sensor 275 and the audio projector 280 may operate as a sonar system, allowing the data collection system 201 to, for example, scan a subsurface associated with the structure 205. In one embodiment, for example, the data collection system 201 may scan the subsurface associated with shingles, enabling a data analysis module to determine if the subsurface of the shingles is damaged.
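- The ranging principle behind such a sonar arrangement reduces to halving the echo's round trip, as in this illustrative sketch (the speed value is the usual speed of sound in air, assumed here):

```python
def echo_distance(round_trip_seconds, speed_m_per_s=343.0):
    """One-way distance to a surface from a sonar-style echo: the signal
    travels out and back, so the distance is half the round trip.
    """
    return speed_m_per_s * round_trip_seconds / 2.0

print(echo_distance(0.02))  # 3.43 m for a 20 ms round trip
```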
- In alternative embodiments of the data collection system 201, the image capture signal, the audio capture signal, or the tactile capture signal may be received from the processor 210, wherein the respective signal was generated in response to a capture command received by the processor 210 from the peripheral interface 235, the network interface 230, or the user input interface 220. Likewise, the processor 210 may also operate to transmit the image data, audio data, tactile data, or 3D data to the network interface 230 or the peripheral interface 235 to be transmitted to another device or system.
- In further embodiments of the data collection system 201, the data collection system 201 may include a chemical spray device, or may be used in conjunction with a chemical spray device, wherein the chemical spray device sprays a chemical onto a surface of the structure 205. The chemical may then be detected in order to help generate the image data or tactile data. In such an embodiment, the data collection system 201 may include or may be used in conjunction with a chemical detection sensor. In some embodiments, the presence of the chemical may also be detected using the image sensor 265. For example, a visually distinct or luminescent chemical (such as a phosphorescent or fluorescent chemical) may be sprayed on the structure 205. The image sensor 265 may then be used to detect the presence and extent of luminescence on the structure 205. A black light may also be used in conjunction with the process of detecting the chemical. The degree of luminescence present on the structure 205 may be used to determine topographical features associated with the structure 205 and may be used by the 3D scanner in generating 3D data. For example, the degree of luminescence may indicate pooling or seeping at certain locations on the surface of the structure. Detecting the luminescent chemical may also reveal run-off or drainage patterns, which may indicate an uneven surface or a dent on the surface.
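- A hedged sketch of that detection step: given a grayscale image from the image sensor, flag the bright pixels where the luminescent chemical may have pooled. The intensity threshold is an assumed placeholder, not a value from the disclosure.

```python
import numpy as np

def luminescent_regions(gray_image, intensity_threshold=200):
    """Flag pixels whose intensity suggests pooled luminescent chemical.

    gray_image: 2D array of 0-255 intensities. Returns a boolean mask of
    bright (possibly pooled) pixels and the overall bright fraction.
    """
    mask = gray_image >= intensity_threshold
    return mask, float(mask.mean())
```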
- In further alternative embodiments of the data collection system 201, the data collection system 201 may be configured to implement a data analysis method wherein the processor 210 accesses one or more of the image data, the audio data, the tactile data, or the 3D data on the memory 215 for analysis. The processor 210 may further operate to estimate the condition of the structure 205 based on said analysis.
- FIG. 3 illustrates a block diagram of a data collection system 301 according to an embodiment of the present disclosure. The data collection system 301 is configured to scan the structure 305. The data collection system 301 includes a 3D scanner 385, a flying device 310, a base station 320, an antenna 325, and a tether 330. The 3D scanner 385 includes an antenna 316. The flying device 310 may be a balloon, airplane, helicopter, projectile, rocket, or any other device capable of flight, levitation, or gliding.
- In the preferred embodiment, the 3D scanner 385 is similar to the 3D scanner 285 and may also include one or more of: a tactile sensor similar to the tactile sensor 260, an image sensor similar to the image sensor 265, a light projector similar to the light projector 270, an audio sensor similar to the audio sensor 275, or an audio projector similar to the audio projector 280. The base station 320 may include one or more of: a processor similar to the processor 210, a memory similar to the memory 215, a peripheral interface similar to the peripheral interface 235, a user input interface similar to the user input interface 220, or a transmitter similar to the network interface 230.
- In the data collection system 301, the 3D scanner 385 is affixed to the flying device 310. In the data collection system 301, the 3D scanner 385 is tethered to the base station 320. The antenna 316 of the 3D scanner 385 is in communication with the antenna 325 of the base station 320.
- In operation of the data collection system 301, the flying device 310 is used to position the 3D scanner 385 at an elevation higher than at least part of the structure 305. The tether 330 functions to keep the flying device 310 within the vicinity of the base station 320 by tethering the flying device 310 to the base station 320. In some embodiments, the tether 330 may provide power to the flying device 310. The tether may also provide a communication channel between the flying device 310 and the base station 320 (and may replace the antennas 316 and 325). Once the 3D scanner 385 has reached the desired elevation, the 3D scanner 385 collects information associated with the structure 305. In the preferred embodiment, the 3D scanner 385 scans the structure 305 and generates 3D data (e.g., 3D data points, a point cloud, or a 3D model). In some embodiments the 3D scanner 385 may collect image information, audio information, or tactile information as discussed with regard to the data collection system 201. The 3D scanner 385 then uses the antenna 316 to transmit the collected information to the antenna 325 of the base station 320. The base station 320 then transmits the collected information over a network such as the network 102 shown in FIG. 1b.
- In alternative embodiments of the data collection system 301, the base station 320 may be affixed to the flying device 310 along with the 3D scanner 385 and the tether 330 may instead tether the data collection system 301 to an anchoring device or apparatus. In such an embodiment, the components of the data collection system 301 may communicate over a system bus such as the system bus 250 discussed with regard to FIG. 2.
- In further embodiments of the data collection system 301, the flying device 310 may operate to bring the 3D scanner 385 in contact with the structure 305, or may drop the 3D scanner 385 onto the structure 305. In some embodiments, the flying device 310 may operate autonomously. The flying device 310 may also be controlled wirelessly by a remote device such as a radio control device. Furthermore, in certain embodiments the 3D scanner 385 may be free of a connection to the tether 330. In some embodiments the 3D scanner 385 may be held and operated by a person, while in others the 3D scanner 385 may be affixed to a mechanical apparatus located on or near the structure 305.
FIG. 4 illustrates a block diagram of adata collection system 401 according to an embodiment of the present disclosure. Thedata collection system 401 includes a3D scanner 485, abase station 420, and atether 430. The3D scanner 485 includes anantenna 416 and aroller 417. Thebase station 420 includes anantenna 425. - The
3D scanner 485 may also include one or more of: a tactile sensor similar to thetactile sensor 260, an image sensor similar to theimage sensor 265, a light projector similar to thelight projector 270, an audio sensor similar to theaudio sensor 275, an audio projector similar to theaudio projector 280, or a 3D scanner similar to the3D scanner 285. Thebase station 420 may include one or more of: a processor similar to theprocess 210, a memory similar to thememory 215, a peripheral interface similar to theperipheral interface 230, a user input interface similar to theuser input interface 220, or a transmitter similar to thetransmitter 235. - In the
data collection system 401, the roller 417 of the 3D scanner 485 comes into contact with a surface of the structure 405. The 3D scanner 485 is physically connected to the base station 420 by the tether 430. The antenna 416 of the 3D scanner 485 is in communication with the antenna 425 of the base station 420. - In operation of the
data collection system 401, the 3D scanner 485 is deployed on a surface associated with the structure 405. The roller 417 comes into contact with the surface and rolls as the 3D scanner 485 moves. The roller 417 experiences a temporary imprint as it rolls, reflecting the shapes and features of the surface that it is rolling across. Sensors internal or external to the roller (such as the tactile sensor 260 of FIG. 2) detect the imprinted texture. The 3D scanner 485 generates tactile data representing the imprinted texture. The 3D scanner then uses the tactile data to generate 3D data and uses the antenna 416 to transmit the 3D data to the antenna 425 of the base station 420. The base station 420 may then transmit the 3D data over a network such as the network 102 shown in FIG. 1b.
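- As a concrete illustration of this tactile pipeline, the following minimal Python sketch maps roller-relative tactile samples to 3D data points. It is only one plausible implementation; the TactileSample fields, the sample values, and the fixed scanner origin are hypothetical and are not taken from the disclosure.

```python
# Hypothetical sketch: converting roller tactile samples into 3D data points.
# Assumes a tactile sensor that reports surface-relief depth (mm) on a regular
# grid as the roller advances; names like TactileSample are illustrative only.
from dataclasses import dataclass

@dataclass
class TactileSample:
    travel_mm: float      # distance the roller has advanced along the surface
    lateral_mm: float     # position across the roller's width
    relief_mm: float      # imprint depth measured by the tactile sensor

def tactile_to_points(samples, origin=(0.0, 0.0, 0.0)):
    """Map roller-relative tactile samples to 3D data points.

    The roller's travel axis becomes x, its width becomes y, and the
    imprint depth becomes z, offset by the scanner's known origin.
    """
    ox, oy, oz = origin
    return [(ox + s.travel_mm, oy + s.lateral_mm, oz + s.relief_mm)
            for s in samples]

# Example: three samples taken as the roller crosses a raised shingle edge.
scan = [TactileSample(0.0, 5.0, 0.0),
        TactileSample(1.0, 5.0, 2.3),   # 2.3 mm ridge detected
        TactileSample(2.0, 5.0, 0.1)]
print(tactile_to_points(scan))
```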
- In further embodiments of the 3D scanner 485, the 3D scanner 485 may have mechanical feelers for contacting a surface associated with the structure 405. The mechanical feelers may pull on an object associated with the surface (such as shingles on a roof) by gripping the object between opposable feelers in order to detect how strongly the object is adhered to the surface. Alternatively, the 3D scanner 485 may deploy a mechanical feeler with an adhesive surface that detects how strongly an object is adhered to the surface by applying the adhesive surface of the mechanical feeler to the object, pulling the mechanical feeler away from the object, and detecting the resistive force associated with the object. Furthermore, the 3D scanner 485 may deploy a mechanical feeler to physically manipulate the surface or an object on the surface (by tapping, pulling, or scraping, for example) and use an audio sensor (such as the audio sensor 275, for example) to detect the audio response to the physical manipulation. The audio response may be analyzed (by the data analysis module 103 shown in FIG. 1b, for example) and used in determining the condition of the structure 405. In some embodiments, either or both of the data collection system 401 and the 3D scanner 485 may be unconnected to the tether 430.
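- The tap-and-listen inspection described above can be illustrated with a brief sketch. A dominant-frequency check is one plausible way to classify the audio response; the 800 Hz threshold, the synthetic test clip, and the function names below are illustrative assumptions rather than part of the disclosure.

```python
# Hypothetical sketch: classifying a tap's audio response by its dominant
# frequency. A well-adhered shingle is assumed to ring at a higher pitch
# than a loose one; the 800 Hz threshold is illustrative only.
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency component of a mono audio clip."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def tap_sounds_loose(signal, sample_rate, threshold_hz=800.0):
    return dominant_frequency(signal, sample_rate) < threshold_hz

# Example with a synthetic 440 Hz decaying "thud" (classified as loose).
sr = 8000
t = np.arange(0, 0.25, 1.0 / sr)
clip = np.sin(2 * np.pi * 440.0 * t) * np.exp(-t * 20)
print(tap_sounds_loose(clip, sr))  # True
```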
- In another embodiment of the 3D scanner 485, the 3D scanner 485 may include a pad or a stamp instead of or in addition to the roller 417. The 3D scanner 485 may depress the stamp onto a surface of the structure 405. The features and shapes of the surface cause an imprint on the stamp, and the 3D scanner 485 detects the imprint using a tactile sensor such as the tactile sensor 260 shown in FIG. 2. As discussed previously with respect to the data collection system 201 shown in FIG. 2, the stamp or pad may also have an adhesive surface causing objects on the surface of the structure 405 to stick to the pad. The 3D scanner 485 may then detect the resistive force exerted by an object when the stamp or pad is pulled away from the surface of the structure 405. - In an alternative embodiment of the
data collection system 401, the entire data collection system 401 may be affixed to or included in the 3D scanner 485. In such an embodiment, the tether 430 may instead tether the 3D scanner 485 to an anchoring device or apparatus on or near the ground, the structure 405, or some other point of attachment. In a further embodiment, the 3D scanner 485 may be controlled by a device remotely located relative to the 3D scanner 485. In particular, the 3D scanner 485 may be wirelessly controlled (e.g., via radio frequency by a radio control device). In other embodiments the 3D scanner 485 may operate autonomously. -
FIG. 5 illustrates a block diagram of a data analysis system 503 according to an embodiment of the present disclosure. The data analysis system 503 includes a processor 510, a memory 515, a user input interface 520, a network interface 530, a peripheral interface 535, a video interface 540, and a system bus 550. The processor 510, memory 515, user input interface 520, network interface 530, peripheral interface 535, and video interface 540 are each communicatively connected to the system bus 550. The memory 515 may be any type of memory similar to the memory 215. Likewise, the processor 510 may be any processor similar to the processor 210, the network interface 530 may be any network interface similar to the network interface 230, the peripheral interface 535 may be any peripheral interface similar to the peripheral interface 235, and the user input interface 520 may be any user input interface similar to the user input interface 220. The video interface 540 is configured to communicate over the system bus 550 and transmit video signals to a display device such as a monitor. - In operation of the
data analysis system 503, the network interface 530 receives 3D data points corresponding to a structure such as the structure 205 shown in FIG. 2. The network interface 530 transmits the received data over the system bus 550 to the memory 515. The processor 510 accesses the memory 515 to generate a first 3D model of the structure based on the 3D data points, wherein the edges and vertices associated with the model are derived from the 3D data points. The processor 510 may then make one or more comparisons between the first 3D model and one or more second models. The second models may represent previously received data relating to the same structure, or they may represent previously received data relating to similar structures. Alternatively, the second models may have been created specifically for the purpose of estimating the condition of a structure and may not relate to any actual physical structure. Based on the one or more comparisons, the processor 510 generates an estimate of the condition of the structure. The estimate of the condition of the structure is saved to the memory 515. In some embodiments, the network interface 530 may receive 2D image data or 2D-3D combination image data and may transmit the data to the memory 515. The processor 510 may identify features within the 2D images and/or 2D-3D combination images and may generate the estimate of the condition of the structure in accordance with the identified features.
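- One plausible form of the comparison step is sketched below: the scanned points are scored against a reference model by nearest-neighbor deviation. The 10 mm tolerance and the damage-fraction score are illustrative assumptions; the disclosure does not prescribe a particular comparison metric.

```python
# Hypothetical sketch of the comparison step: score a scanned point cloud
# against a reference model by nearest-neighbor deviation. The 10 mm
# tolerance and the "damage fraction" score are illustrative assumptions.
import numpy as np

def condition_score(scan_pts, reference_pts, tolerance=0.010):
    """Fraction of scanned points lying farther than `tolerance` (meters)
    from any reference point; 0.0 suggests an undamaged surface."""
    scan = np.asarray(scan_pts, dtype=float)   # shape (N, 3)
    ref = np.asarray(reference_pts, dtype=float)  # shape (M, 3)
    # Brute-force nearest neighbor; a k-d tree would be used at scale.
    dists = np.sqrt(((scan[:, None, :] - ref[None, :, :]) ** 2).sum(axis=2))
    nearest = dists.min(axis=1)
    return float((nearest > tolerance).mean())

reference = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 0)]   # flat roof patch
scan = [(0, 0, 0), (0, 1, 0), (1, 0, 0.05), (1, 1, 0)]     # one 5 cm dent
print(condition_score(scan, reference))  # 0.25 -> one point of four deviates
```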
- In further operation of the data analysis system 503, the processor 510 may determine, based on the generated estimate, that the structure has been damaged. The processor 510 may then operate to calculate (based on the condition of the structure and data relating to costs, such as the cost of supplies, materials, components, and labor) an estimated financial cost associated with the damage. The estimated financial cost is then saved to the memory 515. The video interface 540 may be used to display: the first 3D model, any of the one or more second models, the estimate of the condition of the structure, or the estimated financial cost.
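- A minimal sketch of such a cost calculation follows. The per-area rates, the flat overhead, and the damaged-area input are placeholders; actual figures would come from the supplies, materials, components, and labor data mentioned above.

```python
# Hypothetical sketch of the cost step: price the damaged area from unit
# costs. The rates and the damaged-area input are illustrative assumptions.
def estimate_repair_cost(damaged_area_m2,
                         material_per_m2=35.0,
                         labor_per_m2=20.0,
                         fixed_overhead=150.0):
    """Rough financial cost: per-area material and labor plus a flat fee."""
    return fixed_overhead + damaged_area_m2 * (material_per_m2 + labor_per_m2)

# Example: 12 square meters of hail-damaged roofing.
print(f"${estimate_repair_cost(12.0):,.2f}")  # $810.00
```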
- In alternative embodiments of the data analysis system 503, the received data may also represent images, videos, sounds, thermal maps, pressure maps, or topographical maps, any of which may be displayed via the video interface 540. The received data may then be used to generate a 3D model. Alternatively, the received data may be compared to reference images, videos, sounds, thermal maps, pressure maps, or topographical maps to estimate the condition of the structure. -
FIG. 6 illustrates a flow chart of an example method 600 for inspecting and analyzing the condition of a structure. The method 600 may be implemented, in whole or in part, on one or more devices or systems such as those shown in the property inspection system 100 of FIG. 1, the data collection system 201 of FIG. 2, the data collection system 301 of FIG. 3, the data collection system 401 of FIG. 4, or the data analysis system 503 of FIG. 5. The method may be saved as a set of instructions, routines, programs, or modules on memory such as the memory 215 of FIG. 2 or the memory 515 of FIG. 5, and may be executed by a processor such as the processor 210 of FIG. 2 or the processor 510 of FIG. 5. - The
method 600 begins when a 3D scanner scans a structure, such as the structure 205 shown in FIG. 2, the structure 305 shown in FIG. 3, or the structure 405 shown in FIG. 4, and detects a point on the surface of the structure (block 605). The structure may be any kind of building or structure. The structure may be, for example, a single-family home, townhome, condominium, apartment, storefront, or retail space, and the structure may be owned, leased, possessed, or occupied by an insurance policy holder. The structure may also be any of the structure types discussed with regard to FIG. 1, such as a vehicle, boat, or aircraft. For such structures, the 3D scanner may be used to inspect the body panels, windows, frame, and other surfaces associated with the vehicle, boat, or aircraft. Next, the 3D scanner identifies a coordinate set corresponding to each detected point on the surface of the structure (block 610). The coordinate set relates to vertical, horizontal, and depth distance measurements relative to the 3D scanner that detected the point. - The 3D scanner then generates a 3D data point, corresponding to the detected point on the surface of the structure, that includes the corresponding coordinate data (block 615). The 3D data point may then be saved to memory. A decision is made thereafter to either stop scanning the structure or continue scanning the structure (block 620). If there is more surface area or more surface points to be scanned, the 3D scanner continues scanning the structure. Otherwise, the
method 600 continues to block 625.
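- Blocks 605 through 620 amount to a simple acquisition loop, sketched below. The FakeScanner stand-in and its pre-recorded readings are hypothetical; a real implementation would read from the 3D scanner hardware.

```python
# Hypothetical sketch of blocks 605-620: accumulate scanner-relative
# coordinate sets into 3D data points until the scan pass is complete.
def collect_data_points(scanner):
    points = []
    while not scanner.done():
        h, v, depth = scanner.detect_point()   # horizontal, vertical, depth
        points.append({"x": h, "y": v, "z": depth})
    return points

class FakeScanner:
    """Illustrative stand-in that replays three pre-recorded detections."""
    def __init__(self):
        self._readings = [(0.0, 0.0, 4.2), (0.1, 0.0, 4.3), (0.2, 0.0, 4.2)]
    def done(self):
        return not self._readings
    def detect_point(self):
        return self._readings.pop(0)

print(collect_data_points(FakeScanner()))
```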
- When it is determined that no further scanning is required, the method 600 activates the 3D scanner, or a processor such as the processor 210 of FIG. 2 or the processor 510 of FIG. 5, to normalize the coordinate data for all of the generated 3D data points so that the 3D data points share a common coordinate system (block 625). The normalized 3D data points may then be saved to memory. The 3D scanner, or a processor, operates to build a point cloud from the 3D data points (block 630). This may be done by sampling or filtering the 3D data points. Alternatively, all of the 3D data points may be used. In any event, the point cloud may then be saved to memory. - After the point cloud is saved, the 3D scanner or processor operates to construct a 3D model from the point cloud (block 635). The edges and vertices associated with the model are derived from the points in the point cloud. Any of a number of surface reconstruction algorithms may be used to generate the surface of the model. In certain embodiments the surface reconstruction may be skipped altogether and the raw point cloud may be subsequently used instead of the constructed 3D model.
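- The normalization and point-cloud steps (blocks 625 and 630) can be illustrated with a short sketch. The rigid-transform pose and the 5 cm voxel size used for filtering are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch of blocks 625-630: normalize scanner-relative points
# into a shared frame, then thin them into a point cloud by voxel sampling.
import numpy as np

def normalize(points, rotation, translation):
    """Apply a rigid transform taking scanner coordinates to a common frame."""
    return points @ np.asarray(rotation).T + np.asarray(translation)

def voxel_downsample(points, voxel=0.05):
    """Keep one representative point per occupied voxel (block 630 filter)."""
    keys = np.floor(points / voxel).astype(int)
    _, first = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first)]

pts = np.array([[0.0, 0.0, 4.2], [0.01, 0.0, 4.21], [1.0, 0.5, 4.0]])
identity = np.eye(3)
common = normalize(pts, identity, [10.0, 2.0, 0.0])  # scanner sat at (10,2,0)
print(voxel_downsample(common))  # first two points share a voxel -> merged
```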
- Next, a processor such as the processor 210 of FIG. 2 or the processor 510 of FIG. 5 operates to analyze the 3D model (or point cloud) to estimate a condition of the structure (block 640). In some embodiments, this may include comparing the model to other models, wherein the other models relate to previously collected data corresponding to the same structure, or previously collected data corresponding to other structures. In the alternative, the other models may only exist for the purpose of analysis or estimation and may not correlate to any real structure. - Based on the estimated condition of the structure, a processor operates to calculate a financial cost estimate corresponding to any damage to the structure (block 645). In some embodiments, the financial cost estimate may correspond to the estimated cost for materials, labor, and other resources required to repair or refurbish the structure.
- After calculating a financial cost estimate, a processor operates to determine a claim assessment (block 650). The claim assessment may then be saved to memory. In some embodiments the claim assessment may be sent to a third party associated with the structure, such as a client holding an insurance policy on the structure. In other embodiments the claim assessment may be sent to an insurance agent for evaluation.
-
FIG. 7 illustrates a flow chart of an exemplary method 700 for detecting a point on a surface using a 3D scanner. The method may be implemented by a 3D scanner, such as the 3D scanner 285 of FIG. 2 or the 3D scanner 385 of FIG. 3. - The
method 700 begins when a light source is deployed oriented toward a structure, such as one of the structures shown in FIG. 1, 2, 3, or 4 (block 705). The light source may be a part of the 3D scanner, or it may be a separate device used in conjunction with the 3D scanner. The light source may be any type of light source, but in the preferred embodiment the light source is a laser that projects a dot or line. In other embodiments the light source may be a white light source that projects a pattern onto an object. - A photosensor or image sensing device, such as the
image sensor 265 of FIG. 2, is then deployed oriented toward the structure (block 710). The image sensing device may be part of the 3D scanner, or it may be a separate device used in conjunction with the 3D scanner. In the preferred embodiment, the image sensing device is capable of detecting and processing laser light. After the image sensing device has been deployed, the distance between the light source and the image sensing device is determined (block 715). - The light source projects light onto a surface of the structure (block 720) and the image sensing device detects light reflected off of the surface of the structure (block 725). In order to identify the position of the surface reflecting the light, a first and a second angle are determined (blocks 730 and 735, respectively). The first angle has the light source as its vertex, the projected light beam or laser as a first side, and a line extending to the image sensing device as a second side. The second angle has the image sensing device as its vertex, the received light beam or laser as a first side, and a line extending to the light source as a second side. Finally, the position (including depth) of the surface reflecting the light is determined (block 740) using the distance determined at block 715, the first angle determined at block 730, and the second angle determined at block 735.
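- The geometry of blocks 715 through 740 is classic two-angle triangulation: with baseline d between the light source and the image sensing device, a first angle a at the light source, and a second angle b at the image sensing device, the law of sines gives the reflecting point's depth perpendicular to the baseline as z = d*sin(a)*sin(b)/sin(a+b). The sketch below works one example; the 0.5 m baseline and 60-degree angles are illustrative values only.

```python
# Hypothetical sketch of blocks 715-740: two-angle laser triangulation.
import math

def triangulate(baseline, angle_a, angle_b):
    """Return (x, z): offset along the baseline from the light source,
    and depth perpendicular to the baseline, both in the baseline's units."""
    z = baseline * math.sin(angle_a) * math.sin(angle_b) / math.sin(angle_a + angle_b)
    x = z / math.tan(angle_a)   # where the projected ray crosses that depth
    return x, z

# Example: 0.5 m baseline, both sight lines at 60 degrees.
x, z = triangulate(0.5, math.radians(60), math.radians(60))
print(round(x, 3), round(z, 3))  # 0.25 0.433 -> point centered, 0.433 m deep
```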
- The position of the surface reflecting the light is saved to memory as coordinate data included in a 3D data point (block 745). The coordinate data may be relative to the 3D scanner, or it may be normalized so that it is consistent with other saved 3D data points. After saving the coordinate data, the light source is adjusted so that the light is projected onto a different area on the surface of the property (block 750). A decision is then made to either continue scanning or stop scanning (block 755). If more of the structure needs to be scanned, the method returns to block 725, where the light from the adjusted light source is reflected off of the surface of the structure and detected. If the structure has been sufficiently scanned, the 3D scanner or a processor can begin the process of building a 3D model of the structure using the 3D data points.
- The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
- Discussions herein referring to an “appraiser,” “inspector,” “adjuster,” “claim representative” or the like are non-limiting. One skilled in the art will appreciate that any user associated with an insurance company or an insurance function may utilize one or more of the devices, systems, and methods disclosed in the foregoing description. One skilled in the art will further realize that any reference to a specific job title or role does not limit the disclosed devices, systems, or methods, or the type of user of said devices, systems, or methods.
- Certain implementations are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code implemented on a tangible, non-transitory machine-readable medium such as RAM, ROM, flash memory of a computer, hard disk drive, optical disk drive, tape drive, etc.) or hardware modules (e.g., an integrated circuit, an application-specific integrated circuit (ASIC), a field programmable logic array (FPLA)/field-programmable gate array (FPGA), etc.). A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example implementations, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.
- As used herein any reference to “one implementation,” “one embodiment,” “an implementation,” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the implementation is included in at least one implementation. The appearances of the phrase “in one implementation” or “in one embodiment” in various places in the specification are not necessarily all referring to the same implementation.
- Some implementations may be described using the expression “coupled” along with its derivatives. For example, some implementations may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The implementations are not limited in this context.
- As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
- In addition, the articles "a" and "an" are employed to describe elements and components of the implementations herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
- Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for a system and a process for inspecting a structure to estimate its condition through the principles disclosed herein. Thus, while particular implementations and applications have been illustrated and described, it is to be understood that the disclosed implementations are not limited to the precise construction and components disclosed herein. Various modifications, changes, and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation, and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
Claims (20)
1. A computer-implemented method of inspecting a structure, the method comprising:
deploying a three-dimensional (3D) data collection system including one or more 3D scanners;
physically tethering via a tether the 3D data collection system to a device on the ground;
transmitting, by a laser projector of the one or more 3D scanners, a plurality of laser pulses directed toward a surface of a structure;
detecting, at a laser light sensor of the one or more 3D scanners, the plurality of laser pulses after the plurality of laser pulses have reflected off of the surface of the structure; and
generating, by one or more processors, a 3D model of the surface of the structure based upon the detected plurality of laser pulses.
2. The computer-implemented method of claim 1 , further comprising:
automatically analyzing, by the one or more processors, the 3D model of the surface of the structure to identify one or more features of the surface of the structure; and
generating, by the one or more processors, an estimate of a condition of the surface of the structure based upon the identified one or more features.
3. The computer-implemented method of claim 1 , wherein deploying the 3D data collection system comprises:
moving a vehicle to which the 3D data collection system is affixed to a position so that the laser projector can transmit the plurality of laser pulses directed toward the surface of the structure.
4. The computer-implemented method of claim 3 , wherein moving the vehicle comprises moving the vehicle through the air.
5. The computer-implemented method of claim 1 , further comprising providing power to the 3D data collection system via the tether.
6. The computer-implemented method of claim 1 , further comprising transmitting to the 3D data collection system via the tether a control command, wherein the plurality of laser pulses are transmitted in response to the 3D data collection system receiving the control command.
7. The computer-implemented method of claim 1 , further comprising:
transmitting from the 3D data collection system via the tether data representing the plurality of laser pulses detected at the laser light sensor; and
receiving at the device via the tether the data representing the plurality of laser pulses
detected at the laser light sensor, wherein the device is a base station housing the one or more processors.
8. A computer system for inspecting a structure, the computer system comprising:
a three-dimensional (3D) data collection system including:
(i) a means for transmitting a plurality of laser pulses directed toward a surface of the structure; and
(ii) a means for detecting the plurality of laser pulses after the plurality of laser pulses have reflected off of the surface of the structure;
a means for physically tethering the 3D data collection system to a device on the ground; and
a means for generating a 3D model of the surface of the structure based upon the detected plurality of laser pulses.
9. The computer system of claim 8 , the computer system further comprising:
a means for automatically analyzing the 3D model of the surface of the structure to identify one or more features of the surface of the structure; and
a means for generating an estimate of a condition of the surface of the structure based upon the identified one or more features.
10. The computer system of claim 8 , wherein the 3D data collection system includes a vehicle; and
wherein the system further includes a means for moving the vehicle to a position where the plurality of laser pulses are transmitted toward the surface of the structure.
11. The computer system of claim 10 , wherein the means for moving the vehicle includes a
means for moving the vehicle on the ground.
12. The computer system of claim 8 , wherein the means for generating a 3D model of the
surface of the structure based upon the detected plurality of laser pulses is a base station; and
wherein the system further includes a means for transmitting from the 3D data collection
system to the base station data representing the detected plurality of laser pulses.
13. The computer system of claim 12 , wherein the means for automatically analyzing the 3D
model of the surface of the structure to identify one or more features of the surface of the structure is a server;
wherein the system further includes a means for transmitting the generated 3D model from the base station to the server.
14. A property inspection system comprising:
a three-dimensional (3D) data collection system including one or more 3D scanners, the one or more 3D scanners including:
(i) a laser projector for projecting a plurality of laser pulses directed toward a surface of a structure; and
(ii) a laser light sensor for detecting the plurality of laser pulses after the plurality of laser pulses have reflected off of the surface of the structure;
a tether, wherein a first end of the tether is physically connected to the 3D data collection system; and
a device physically connected to a second end of the tether such that the device is connected to the 3D data collection system via the tether.
15. The property inspection system of claim 14 , the system further comprising:
one or more processors; and
one or more memory devices communicatively connected to the one or more processors,
the one or more memory devices including instructions that, when executed, cause the one or more processors to:
(i) generate a 3D model of the surface of the structure based upon the detected plurality of laser pulses;
(ii) automatically analyze the 3D model of the surface of the structure to identify one or more features of the surface of the structure; and
(iii) generate an estimate of a condition of the surface of the structure based upon the identified one or more features.
16. The property inspection system of claim 14 , wherein the tether is configured to do one or more of the following:
deliver power to the 3D data collection system; or
facilitate communication between the 3D data collection system and the one or more processors.
17. The property inspection system of claim 14 , wherein the one or more processors
and the one or more memory devices share a physical platform with the 3D data collection system.
18. The property inspection system of claim 14 , wherein the tether includes a communication channel;
wherein the 3D data collection system is configured to transmit via the tether data representing the detected plurality of laser pulses; and
wherein a base station is configured to receive via the tether the data representing the detected plurality of laser pulses.
19. The property inspection system of claim 18 , wherein the 3D data collection
system includes an antenna, and wherein the 3D data collection system is configured to
wirelessly transmit to the base station via the antenna data representing the detected plurality of laser pulses.
20. The property inspection system of claim 14 , wherein the one or more processors
and the one or more memory devices are located within a server communicatively coupled to the
3D data collection system;
wherein the 3D data collection system is configured to wirelessly transmit data
representing the detected plurality of laser pulses; and
wherein the server is configured to receive the data representing the detected plurality of laser pulses.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/902,349 US20180182087A1 (en) | 2013-03-15 | 2018-02-22 | Tethered 3d scanner |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/836,695 US8872818B2 (en) | 2013-03-15 | 2013-03-15 | Methods and systems for capturing the condition of a physical structure |
US14/496,802 US9262788B1 (en) | 2013-03-15 | 2014-09-25 | Methods and systems for capturing the condition of a physical structure via detection of electromagnetic radiation |
US14/631,568 US9336552B1 (en) | 2013-03-15 | 2015-02-25 | Laser-based methods and systems for capturing the condition of a physical structure |
US15/134,273 US9959608B1 (en) | 2013-03-15 | 2016-04-20 | Tethered 3D scanner |
US15/902,349 US20180182087A1 (en) | 2013-03-15 | 2018-02-22 | Tethered 3d scanner |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/134,273 Continuation US9959608B1 (en) | 2013-03-15 | 2016-04-20 | Tethered 3D scanner |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180182087A1 true US20180182087A1 (en) | 2018-06-28 |
Family
ID=51525606
Family Applications (18)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/836,695 Active US8872818B2 (en) | 2013-03-15 | 2013-03-15 | Methods and systems for capturing the condition of a physical structure |
US14/496,840 Active US9292630B1 (en) | 2013-03-15 | 2014-09-25 | Methods and systems for capturing the condition of a physical structure via audio-based 3D scanning |
US14/496,802 Active US9262788B1 (en) | 2013-03-15 | 2014-09-25 | Methods and systems for capturing the condition of a physical structure via detection of electromagnetic radiation |
US14/631,558 Active US9131224B1 (en) | 2013-03-15 | 2015-02-25 | Methods and systems for capturing the condition of a physical structure via chemical detection |
US14/631,568 Active US9336552B1 (en) | 2013-03-15 | 2015-02-25 | Laser-based methods and systems for capturing the condition of a physical structure |
US14/820,328 Active US9958387B1 (en) | 2013-03-15 | 2015-08-06 | Methods and systems for capturing the condition of a physical structure via chemical detection |
US14/968,147 Active 2033-09-15 US10013708B1 (en) | 2013-03-15 | 2015-12-14 | Estimating a condition of a physical structure |
US14/997,154 Active US9519058B1 (en) | 2013-03-15 | 2016-01-15 | Audio-based 3D scanner |
US15/134,273 Active 2033-06-17 US9959608B1 (en) | 2013-03-15 | 2016-04-20 | Tethered 3D scanner |
US15/344,268 Active US9996970B2 (en) | 2013-03-15 | 2016-11-04 | Audio-based 3D point cloud generation and analysis |
US15/902,349 Abandoned US20180182087A1 (en) | 2013-03-15 | 2018-02-22 | Tethered 3d scanner |
US15/902,354 Active US10176632B2 (en) | 2013-03-15 | 2018-02-22 | Methods and systems for capturing the condition of a physical structure via chemical detection |
US15/935,989 Active US10242497B2 (en) | 2013-03-15 | 2018-03-26 | Audio-based 3D point cloud generation and analysis |
US15/975,836 Active 2033-03-22 US10679262B1 (en) | 2013-03-15 | 2018-05-10 | Estimating a condition of a physical structure |
US16/831,518 Active 2033-05-18 US11270504B2 (en) | 2013-03-15 | 2020-03-26 | Estimating a condition of a physical structure |
US16/831,547 Active 2033-05-19 US11295523B2 (en) | 2013-03-15 | 2020-03-26 | Estimating a condition of a physical structure |
US17/687,843 Active US11694404B2 (en) | 2013-03-15 | 2022-03-07 | Estimating a condition of a physical structure |
US18/198,760 Active US12039669B2 (en) | 2013-03-15 | 2023-05-17 | Estimating a condition of a physical structure |
Family Applications Before (10)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/836,695 Active US8872818B2 (en) | 2013-03-15 | 2013-03-15 | Methods and systems for capturing the condition of a physical structure |
US14/496,840 Active US9292630B1 (en) | 2013-03-15 | 2014-09-25 | Methods and systems for capturing the condition of a physical structure via audio-based 3D scanning |
US14/496,802 Active US9262788B1 (en) | 2013-03-15 | 2014-09-25 | Methods and systems for capturing the condition of a physical structure via detection of electromagnetic radiation |
US14/631,558 Active US9131224B1 (en) | 2013-03-15 | 2015-02-25 | Methods and systems for capturing the condition of a physical structure via chemical detection |
US14/631,568 Active US9336552B1 (en) | 2013-03-15 | 2015-02-25 | Laser-based methods and systems for capturing the condition of a physical structure |
US14/820,328 Active US9958387B1 (en) | 2013-03-15 | 2015-08-06 | Methods and systems for capturing the condition of a physical structure via chemical detection |
US14/968,147 Active 2033-09-15 US10013708B1 (en) | 2013-03-15 | 2015-12-14 | Estimating a condition of a physical structure |
US14/997,154 Active US9519058B1 (en) | 2013-03-15 | 2016-01-15 | Audio-based 3D scanner |
US15/134,273 Active 2033-06-17 US9959608B1 (en) | 2013-03-15 | 2016-04-20 | Tethered 3D scanner |
US15/344,268 Active US9996970B2 (en) | 2013-03-15 | 2016-11-04 | Audio-based 3D point cloud generation and analysis |
Family Applications After (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/902,354 Active US10176632B2 (en) | 2013-03-15 | 2018-02-22 | Methods and systems for capturing the condition of a physical structure via chemical detection |
US15/935,989 Active US10242497B2 (en) | 2013-03-15 | 2018-03-26 | Audio-based 3D point cloud generation and analysis |
US15/975,836 Active 2033-03-22 US10679262B1 (en) | 2013-03-15 | 2018-05-10 | Estimating a condition of a physical structure |
US16/831,518 Active 2033-05-18 US11270504B2 (en) | 2013-03-15 | 2020-03-26 | Estimating a condition of a physical structure |
US16/831,547 Active 2033-05-19 US11295523B2 (en) | 2013-03-15 | 2020-03-26 | Estimating a condition of a physical structure |
US17/687,843 Active US11694404B2 (en) | 2013-03-15 | 2022-03-07 | Estimating a condition of a physical structure |
US18/198,760 Active US12039669B2 (en) | 2013-03-15 | 2023-05-17 | Estimating a condition of a physical structure |
Country Status (2)
Country | Link |
---|---|
US (18) | US8872818B2 (en) |
CA (1) | CA2844320C (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10679262B1 (en) | 2013-03-15 | 2020-06-09 | State Farm Mutual Automobile Insurance Company | Estimating a condition of a physical structure |
US10997668B1 (en) | 2016-04-27 | 2021-05-04 | State Farm Mutual Automobile Insurance Company | Providing shade for optical detection of structural features |
Families Citing this family (105)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8311856B1 (en) | 2008-10-13 | 2012-11-13 | Allstate Insurance Company | Communication of insurance claim data |
US8265963B1 (en) * | 2008-10-13 | 2012-09-11 | Allstate Insurance Company | Communication of insurance claim data |
US8422825B1 (en) * | 2008-11-05 | 2013-04-16 | Hover Inc. | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US9672567B2 (en) * | 2012-06-29 | 2017-06-06 | Estimatics In The Fourth Dimensions, Llc | Damage assessment and reporting system |
US10360636B1 (en) | 2012-08-01 | 2019-07-23 | Allstate Insurance Company | System for capturing passenger and trip data for a taxi vehicle |
US9002719B2 (en) | 2012-10-08 | 2015-04-07 | State Farm Mutual Automobile Insurance Company | Device and method for building claim assessment |
US10713726B1 (en) | 2013-01-13 | 2020-07-14 | United Services Automobile Association (Usaa) | Determining insurance policy modifications using informatic sensor data |
US8931144B2 (en) | 2013-03-14 | 2015-01-13 | State Farm Mutual Automobile Insurance Company | Tethering system and method for remote device |
US9082015B2 (en) | 2013-03-15 | 2015-07-14 | State Farm Mutual Automobile Insurance Company | Automatic building assessment |
US8874454B2 (en) | 2013-03-15 | 2014-10-28 | State Farm Mutual Automobile Insurance Company | Systems and methods for assessing a roof |
US8818572B1 (en) | 2013-03-15 | 2014-08-26 | State Farm Mutual Automobile Insurance Company | System and method for controlling a remote aerial device for up-close inspection |
US10013507B2 (en) * | 2013-07-01 | 2018-07-03 | Here Global B.V. | Learning synthetic models for roof style classification using point clouds |
US10002339B2 (en) * | 2013-07-11 | 2018-06-19 | Fluor Technologies Corporation | Post-disaster assessment systems and methods |
US9947051B1 (en) | 2013-08-16 | 2018-04-17 | United Services Automobile Association | Identifying and recommending insurance policy products/services using informatic sensor data |
CN103489218B (en) * | 2013-09-17 | 2016-06-29 | 中国科学院深圳先进技术研究院 | Point cloud data quality automatic optimization method and system |
KR102241706B1 (en) * | 2013-11-13 | 2021-04-19 | 엘지전자 주식회사 | 3 dimensional camera and method for controlling the same |
AU2014262221C1 (en) | 2013-11-25 | 2021-06-10 | Esco Group Llc | Wear part monitoring |
US12100050B1 (en) | 2014-01-10 | 2024-09-24 | United Services Automobile Association (Usaa) | Electronic sensor management |
US11416941B1 (en) | 2014-01-10 | 2022-08-16 | United Services Automobile Association (Usaa) | Electronic sensor management |
US10552911B1 (en) | 2014-01-10 | 2020-02-04 | United Services Automobile Association (Usaa) | Determining status of building modifications using informatics sensor data |
US11087404B1 (en) | 2014-01-10 | 2021-08-10 | United Services Automobile Association (Usaa) | Electronic sensor management |
US9633050B2 (en) * | 2014-02-21 | 2017-04-25 | Wipro Limited | Methods for assessing image change and devices thereof |
US11847666B1 (en) | 2014-02-24 | 2023-12-19 | United Services Automobile Association (Usaa) | Determining status of building modifications using informatics sensor data |
US10614525B1 (en) | 2014-03-05 | 2020-04-07 | United Services Automobile Association (Usaa) | Utilizing credit and informatic data for insurance underwriting purposes |
US9153079B1 (en) * | 2014-03-18 | 2015-10-06 | Robert Bruce Wood | System and method of automated 3D scanning for vehicle maintenance |
US10204530B1 (en) | 2014-07-11 | 2019-02-12 | Shape Matrix Geometric Instruments, LLC | Shape-matrix geometric instrument |
US10410289B1 (en) | 2014-09-22 | 2019-09-10 | State Farm Mutual Automobile Insurance Company | Insurance underwriting and re-underwriting implementing unmanned aerial vehicles (UAVS) |
US10991049B1 (en) | 2014-09-23 | 2021-04-27 | United Services Automobile Association (Usaa) | Systems and methods for acquiring insurance related informatics |
EP3201562A4 (en) * | 2014-09-29 | 2018-04-11 | Sikorsky Aircraft Corporation | Apparatus for detecting corrosion in an article |
US11688014B1 (en) | 2014-10-02 | 2023-06-27 | United Services Automobile Association (Usaa) | Systems and methods for unmanned vehicle management |
US10102590B1 (en) | 2014-10-02 | 2018-10-16 | United Services Automobile Association (Usaa) | Systems and methods for unmanned vehicle management |
US9928553B1 (en) | 2014-10-09 | 2018-03-27 | State Farm Mutual Automobile Insurance Company | Method and system for generating real-time images of customer homes during a catastrophe |
US10134092B1 (en) * | 2014-10-09 | 2018-11-20 | State Farm Mutual Automobile Insurance Company | Method and system for assessing damage to insured properties in a neighborhood |
US9875509B1 (en) | 2014-10-09 | 2018-01-23 | State Farm Mutual Automobile Insurance Company | Method and system for determining the condition of insured properties in a neighborhood |
US9129355B1 (en) | 2014-10-09 | 2015-09-08 | State Farm Mutual Automobile Insurance Company | Method and system for assessing damage to infrastructure |
US10538325B1 (en) * | 2014-11-11 | 2020-01-21 | United Services Automobile Association | Utilizing unmanned vehicles to initiate and/or facilitate claims processing |
US10365646B1 (en) | 2015-01-27 | 2019-07-30 | United Services Automobile Association (Usaa) | Systems and methods for unmanned vehicle management |
US11532050B1 (en) | 2015-01-27 | 2022-12-20 | United Services Automobile Association (Usaa) | Unmanned vehicle service delivery |
MY190902A (en) | 2015-02-13 | 2022-05-18 | Esco Group Llc | Monitoring ground-engaging products for earth working equipment |
EP3265845A4 (en) * | 2015-03-05 | 2019-01-09 | Commonwealth Scientific and Industrial Research Organisation | Structure modelling |
US9877114B2 (en) * | 2015-04-13 | 2018-01-23 | DSCG Solutions, Inc. | Audio detection system and methods |
US10580199B2 (en) * | 2015-04-14 | 2020-03-03 | ETAK Systems, LLC | Systems and methods for data capture for telecommunications site modeling via a telescoping apparatus |
US10382975B2 (en) * | 2015-04-14 | 2019-08-13 | ETAK Systems, LLC | Subterranean 3D modeling at cell sites |
US10311565B2 (en) * | 2015-04-14 | 2019-06-04 | ETAK Systems, LLC | Cell site equipment verification using 3D modeling comparisons |
US9939810B1 (en) | 2015-04-17 | 2018-04-10 | United Services Automobile Association | Indoor drone flight awareness system |
US10489863B1 (en) | 2015-05-27 | 2019-11-26 | United Services Automobile Association (Usaa) | Roof inspection systems and methods |
US9894327B1 (en) * | 2015-06-22 | 2018-02-13 | State Farm Mutual Automobile Insurance Company | Systems and methods for remote data collection using unmanned vehicles |
FR3038109B1 (en) * | 2015-06-29 | 2018-07-27 | Airbus Operations (S.A.S.) | SYSTEM AND METHOD FOR LOCATING IMPACTS ON EXTERNAL SURFACE |
US9389314B1 (en) | 2015-07-27 | 2016-07-12 | State Farm Mutual Automobile Insurance Company | Subsurface imaging system and method for inspecting the condition of a structure |
US20170046865A1 (en) * | 2015-08-14 | 2017-02-16 | Lucasfilm Entertainment Company Ltd. | Animation motion capture using three-dimensional scanner data |
US9291544B1 (en) | 2015-08-24 | 2016-03-22 | State Farm Mutual Automobile Insurance Company | Tactile sensor system and method for inspecting the condition of a structure |
US11307042B2 (en) * | 2015-09-24 | 2022-04-19 | Allstate Insurance Company | Three-dimensional risk maps |
EP3165945B1 (en) * | 2015-11-03 | 2024-01-03 | Leica Geosystems AG | Surface measuring device for determining the 3d coordinates of a surface |
US10621744B1 (en) | 2015-12-11 | 2020-04-14 | State Farm Mutual Automobile Insurance Company | Structural characteristic extraction from 3D images |
WO2017103982A1 (en) * | 2015-12-14 | 2017-06-22 | 株式会社 ニコン・トリンブル | Defect detection apparatus and program |
US20170177748A1 (en) * | 2015-12-16 | 2017-06-22 | Wal-Mart Stores, Inc. | Residential Upgrade Design Tool |
CA2952098A1 (en) * | 2015-12-18 | 2017-06-18 | Wal-Mart Stores, Inc. | Apparatus and method for surveying premises of a customer |
CA2952484A1 (en) * | 2015-12-23 | 2017-06-23 | Wal-Mart Stores, Inc. | Apparatus and method for monitoring premises |
US9740200B2 (en) | 2015-12-30 | 2017-08-22 | Unmanned Innovation, Inc. | Unmanned aerial vehicle inspection system |
US9513635B1 (en) | 2015-12-30 | 2016-12-06 | Unmanned Innovation, Inc. | Unmanned aerial vehicle inspection system |
US10083616B2 (en) | 2015-12-31 | 2018-09-25 | Unmanned Innovation, Inc. | Unmanned aerial vehicle rooftop inspection system |
US10354386B1 (en) | 2016-01-27 | 2019-07-16 | United Services Automobile Association (Usaa) | Remote sensing of structure damage |
US10169856B1 (en) * | 2016-01-27 | 2019-01-01 | United Services Automobile Association (Usaa) | Laser-assisted image processing |
US10699347B1 (en) | 2016-02-24 | 2020-06-30 | Allstate Insurance Company | Polynomial risk maps |
WO2017151641A1 (en) * | 2016-02-29 | 2017-09-08 | Optecks, Llc | Aerial three-dimensional scanner |
US9975632B2 (en) | 2016-04-08 | 2018-05-22 | Drona, LLC | Aerial vehicle system |
DE102016206982B4 (en) * | 2016-04-25 | 2022-02-10 | Siemens Aktiengesellschaft | Airmobile for scanning an object and system for damage analysis of the object |
US11029352B2 (en) | 2016-05-18 | 2021-06-08 | Skydio, Inc. | Unmanned aerial vehicle electromagnetic avoidance and utilization system |
US9922412B1 (en) * | 2016-06-03 | 2018-03-20 | State Farm Mutual Automobile Insurance Company | System and method for assessing a physical structure |
US9870609B2 (en) * | 2016-06-03 | 2018-01-16 | Conduent Business Services, Llc | System and method for assessing usability of captured images |
CA2933860C (en) | 2016-06-23 | 2017-08-29 | Matthieu Grosfils | Systems and equipment for monitoring the contents of one or several compartment(s) in a medication distributor, fabrication method for the systems and equipment and corresponding methods for use |
US11181375B2 (en) * | 2016-06-30 | 2021-11-23 | Skydio, Inc. | Dynamically adjusting UAV flight operations based on thermal sensor data |
US10207820B2 (en) * | 2016-07-05 | 2019-02-19 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems for transporting, deploying, and docking unmanned aerial vehicles mountable on a ground vehicle |
GB2552019B (en) * | 2016-07-08 | 2020-01-08 | Rolls Royce Plc | Methods and apparatus for controlling at least one of a first robot and a second robot to collaborate within a system |
US10664750B2 (en) | 2016-08-10 | 2020-05-26 | Google Llc | Deep machine learning to predict and prevent adverse conditions at structural assets |
US9979813B2 (en) | 2016-10-04 | 2018-05-22 | Allstate Solutions Private Limited | Mobile device communication access and hands-free device activation |
US10264111B2 (en) | 2016-10-04 | 2019-04-16 | Allstate Solutions Private Limited | Mobile device communication access and hands-free device activation |
US11295218B2 (en) | 2016-10-17 | 2022-04-05 | Allstate Solutions Private Limited | Partitioning sensor based data to generate driving pattern map |
CN106442570B (en) * | 2016-11-23 | 2023-08-22 | 中国计量大学 | Device and method for detecting defects in pipeline and method for opening and setting camera |
US10771768B2 (en) | 2016-12-15 | 2020-09-08 | Qualcomm Incorporated | Systems and methods for improved depth sensing |
US10552981B2 (en) | 2017-01-16 | 2020-02-04 | Shapetrace Inc. | Depth camera 3D pose estimation using 3D CAD models |
WO2018143994A1 (en) | 2017-02-02 | 2018-08-09 | Hewlett-Packard Development Company, L.P. | Three-dimensional scanning with functional elements |
US11422725B2 (en) * | 2017-07-25 | 2022-08-23 | General Electric Company | Point-cloud dataset storage structure and method thereof |
US10338592B2 (en) * | 2017-08-24 | 2019-07-02 | Saudi Arabian Oil Company | High accuracy remote coordinate machine |
CN114777679A (en) | 2017-10-06 | 2022-07-22 | 先进扫描仪公司 | Generating one or more luminance edges to form a three-dimensional model of an object |
US11086315B2 (en) | 2017-10-26 | 2021-08-10 | 2KR Systems, LLC | Building rooftop intelligence gathering, decision-support and snow load removal system for protecting buildings from excessive snow load conditions, and automated methods for carrying out the same |
US10969521B2 (en) | 2017-10-26 | 2021-04-06 | 2KR Systems, LLC | Flexible networked array for measuring snow water equivalent (SWE) and system network for providing environmental monitoring services using the same |
JP6945478B2 (en) | 2018-03-05 | 2021-10-06 | 株式会社日立製作所 | Structure information management system and method |
FR3080839B1 (en) * | 2018-05-04 | 2020-04-17 | Airbus (S.A.S.) | SYSTEM AND METHOD FOR INSPECTING AN EXTERNAL SURFACE |
US10783230B2 (en) * | 2018-05-09 | 2020-09-22 | Shape Matrix Geometric Instruments, LLC | Methods and apparatus for encoding passwords or other information |
US10546371B1 (en) | 2018-08-22 | 2020-01-28 | William Pyznar | System and method for inspecting the condition of structures using remotely controlled devices |
US10708718B1 (en) | 2019-02-28 | 2020-07-07 | At&T Intellectual Property I, L.P. | Space characterization using electromagnetic fields |
CN109870464A (en) * | 2019-03-19 | 2019-06-11 | 陕西三星洁净工程有限公司 | Rod-like articles surface defects detection system and method based on machine vision |
DE102020107804A1 (en) * | 2019-04-26 | 2020-10-29 | Infineon Technologies Ag | Radar apparatus and method for detecting radar targets |
CN112017058A (en) * | 2019-05-30 | 2020-12-01 | 深圳市聚蜂智能科技有限公司 | Insurance loss assessment method and device, computer equipment and storage medium |
US11397909B2 (en) * | 2019-07-02 | 2022-07-26 | Tattle Systems Technology Inc. | Long term sensor monitoring for remote assets |
US11282288B2 (en) | 2019-11-20 | 2022-03-22 | Shape Matrix Geometric Instruments, LLC | Methods and apparatus for encoding data in notched shapes |
CN111368690B (en) * | 2020-02-28 | 2021-03-02 | 珠海大横琴科技发展有限公司 | Deep learning-based video image ship detection method and system under influence of sea waves |
CN111369609B (en) * | 2020-03-04 | 2023-06-30 | 山东交通学院 | Building local deformation analysis method based on point cloud curved surface feature constraint |
CN111508008B (en) * | 2020-04-08 | 2023-07-14 | 达闼机器人股份有限公司 | Point cloud registration method, electronic equipment and storage medium |
US11555693B2 (en) * | 2020-05-12 | 2023-01-17 | The Boeing Company | Measurement of surface profiles using unmanned aerial vehicles |
CN111986309B (en) * | 2020-07-24 | 2023-11-28 | 山东金东数字创意股份有限公司 | System and method for generating special film Pre-vis based on three-dimensional scanning |
US11782167B2 (en) | 2020-11-03 | 2023-10-10 | 2KR Systems, LLC | Methods of and systems, networks and devices for remotely detecting and monitoring the displacement, deflection and/or distortion of stationary and mobile systems using GNSS-based technologies |
CN113465524B (en) * | 2021-05-26 | 2023-07-28 | 中国水利水电第七工程局有限公司 | Real-time dam face deformation monitoring method for earth-rock dam filling based on point cloud data |
CN116661479B (en) * | 2023-07-28 | 2023-11-07 | 深圳市城市公共安全技术研究院有限公司 | Building inspection path planning method, equipment and readable storage medium |
Family Cites Families (148)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3170206A (en) | 1963-03-22 | 1965-02-23 | Gladwin Plastics Inc | Cable assembly |
US3767152A (en) | 1972-06-28 | 1973-10-23 | W Killinger | Tabletop clamp |
US3883926A (en) | 1973-11-23 | 1975-05-20 | Rodney Kent Reynolds | Flexible hanger clamp for electrical lamp socket |
US4956824A (en) * | 1989-09-12 | 1990-09-11 | Science Accessories Corp. | Position determination apparatus |
US5076079A (en) | 1990-01-22 | 1991-12-31 | Monoson David B | Anti-theft device for computers and the like |
US5035558A (en) | 1990-04-30 | 1991-07-30 | Prosen Gildo G | Cargo tie-down device |
US5207171A (en) | 1991-11-14 | 1993-05-04 | Westwood Iii Samuel M | Adjustable rope lock |
US5304809A (en) * | 1992-09-15 | 1994-04-19 | Luxtron Corporation | Luminescent decay time measurements by use of a CCD camera |
US5950169A (en) | 1993-05-19 | 1999-09-07 | Ccc Information Services, Inc. | System and method for managing insurance claim processing |
US6181837B1 (en) | 1994-11-18 | 2001-01-30 | The Chase Manhattan Bank, N.A. | Electronic check image storage and retrieval system |
US5730246A (en) | 1995-07-13 | 1998-03-24 | State Farm Mutual Automobile Insurance Co. | Roof inspection fall protection system |
JP3058248B2 (en) | 1995-11-08 | 2000-07-04 | キヤノン株式会社 | Image processing control device and image processing control method |
DE19609613A1 (en) | 1996-03-12 | 1997-09-18 | Vdo Luftfahrtgeraete Werk Gmbh | Procedure for detecting a collision risk and for avoiding collisions in aviation |
US5697001A (en) | 1996-04-24 | 1997-12-09 | Eastman Kodak Company | Camera for generating and recording object data with the recorded image |
US6460810B2 (en) | 1996-09-06 | 2002-10-08 | Terry Jack James | Semiautonomous flight director |
US5913479A (en) | 1996-09-18 | 1999-06-22 | Westwood, Iii; Samuel M. | Snap hook with pivotal gate |
US5875867A (en) | 1996-10-09 | 1999-03-02 | State Farm Mutual Automobile Insurance Co. | Fall restraint system and method useful for roof inspection |
WO1998019099A1 (en) | 1996-10-28 | 1998-05-07 | Meyer Ostrobrod | Quadrapod safety support |
JP4181667B2 (en) | 1998-09-04 | 2008-11-19 | キヤノン株式会社 | Image processing apparatus, image processing method, and recording medium |
US6266610B1 (en) | 1998-12-31 | 2001-07-24 | Honeywell International Inc. | Multi-dimensional route optimizer |
US7953615B2 (en) | 2000-04-03 | 2011-05-31 | Mitchell International, Inc. | System and method of administering, tracking and managing of claims processing |
CA2378342A1 (en) | 2000-04-20 | 2001-11-01 | General Electric Company | Method and system for graphically identifying replacement parts for generally complex equipment |
US7343307B1 (en) | 2000-06-23 | 2008-03-11 | Computer Sciences Corporation | Dynamic help method and system for an insurance claims processing system |
IL138695A (en) | 2000-09-26 | 2004-08-31 | Rafael Armament Dev Authority | Unmanned mobile device |
US6907324B2 (en) | 2000-10-11 | 2005-06-14 | Honeywell International Inc. | Instrument reference flight display system for horizon representation of direction to next waypoint |
US20020055861A1 (en) | 2000-11-08 | 2002-05-09 | King Daniel A. | Claiming system and method |
CA2438435A1 (en) | 2001-03-30 | 2002-10-10 | E.I. Du Pont De Nemours And Company | Automotive collision repair claims management method and system |
US7023432B2 (en) | 2001-09-24 | 2006-04-04 | Geomagic, Inc. | Methods, apparatus and computer program products that reconstruct surfaces from data point sets |
US20030200123A1 (en) | 2001-10-18 | 2003-10-23 | Burge John R. | Injury analysis system and method for insurance claims |
US20050051667A1 (en) | 2001-12-21 | 2005-03-10 | Arlton Paul E. | Micro-rotorcraft surveillance system |
US6694228B2 (en) | 2002-05-09 | 2004-02-17 | Sikorsky Aircraft Corporation | Control system for remotely operated vehicles for operational payload employment |
US7379978B2 (en) | 2002-07-19 | 2008-05-27 | Fiserv Incorporated | Electronic item management and archival system and method of operating the same |
US20050144189A1 (en) | 2002-07-19 | 2005-06-30 | Keay Edwards | Electronic item management and archival system and method of operating the same |
US7885829B2 (en) | 2002-08-07 | 2011-02-08 | Metropolitan Property And Casualty Insurance Company | System and method for identifying and assessing comparative negligence in insurance claims |
EP1563312B1 (en) | 2002-09-23 | 2008-05-07 | Captron Electronic GmbH | Measuring and stabilising system for machine-controllable vehicles |
US7451148B2 (en) | 2002-10-31 | 2008-11-11 | Computer Sciences Corporation | Method of modifying a business rule while tracking the modifications |
US7822660B1 (en) | 2003-03-07 | 2010-10-26 | Mantas, Inc. | Method and system for the protection of broker and investor relationships, accounts and transactions |
US20040243423A1 (en) | 2003-05-30 | 2004-12-02 | Decision Support Services | Automotive collision estimate audit system |
US7061401B2 (en) | 2003-08-07 | 2006-06-13 | BODENSEEWERK GERäTETECHNIK GMBH | Method and apparatus for detecting a flight obstacle |
US20050080649A1 (en) | 2003-10-08 | 2005-04-14 | Alvarez Andres C. | Systems and methods for automating the capture, organization, and transmission of data |
US20050108063A1 (en) | 2003-11-05 | 2005-05-19 | Madill Robert P.Jr. | Systems and methods for assessing the potential for fraud in business transactions |
US20050108065A1 (en) | 2003-11-18 | 2005-05-19 | Dorfstatter Walter A. | Method and system of estimating vehicle damage |
JP2005165733A (en) | 2003-12-03 | 2005-06-23 | Sony Corp | Information processing system, remote control device and method, controller and method, program, and recording medium |
US20050159889A1 (en) | 2004-01-20 | 2005-07-21 | Isaac Emad S. | Adaptive route guidance |
WO2005080914A1 (en) * | 2004-02-25 | 2005-09-01 | The University Of Tokyo | Shape measurement device and method thereof |
US7809587B2 (en) | 2004-05-07 | 2010-10-05 | International Business Machines Corporation | Rapid business support of insured property using image analysis |
US20050286080A1 (en) | 2004-06-29 | 2005-12-29 | Samsung Electronics Co., Ltd. | Apparatus and method of transmitting document |
US20060031103A1 (en) | 2004-08-06 | 2006-02-09 | Henry David S | Systems and methods for diagram data collection |
US7672543B2 (en) | 2005-08-23 | 2010-03-02 | Ricoh Co., Ltd. | Triggering applications based on a captured text in a mixed media environment |
WO2006047266A1 (en) | 2004-10-22 | 2006-05-04 | Agrios, Inc. | Systems and methods for automated vehicle image acquisition, analysis, and reporting |
WO2006074682A2 (en) | 2004-12-27 | 2006-07-20 | Swiss Reinsurance Company | Dynamic control system for the automated monitoring of the correlation between damage notifications and claims and associated method |
US20100004802A1 (en) | 2005-01-25 | 2010-01-07 | William Kress Bodin | Navigating UAVS with an on-board digital camera |
FR2887065B1 (en) | 2005-06-14 | 2007-07-20 | Airbus France SAS | Method and system for aiding the steering of a low altitude flying aircraft
US20060289233A1 (en) | 2005-06-23 | 2006-12-28 | Flaherty Brian J | Roofing safety system and method |
WO2007047953A2 (en) | 2005-10-20 | 2007-04-26 | Prioria, Inc. | System and method for onboard vision processing |
US7898651B2 (en) * | 2005-10-24 | 2011-03-01 | General Electric Company | Methods and apparatus for inspecting an object |
DE102005051799A1 (en) | 2005-10-27 | 2007-05-03 | Stefan Reich | Method and device for remote control and stabilization of unmanned aerial vehicles |
DE102005063082A1 (en) * | 2005-12-29 | 2007-07-05 | Robert Bosch Gmbh | Vehicle chassis optical measurement method that extracts surface profiles from the scanned surface structure and determines the spatial positions of surface points as position data from which chassis data are determined
US20070179868A1 (en) | 2006-01-17 | 2007-08-02 | Bozym William W | Electronic damaged vehicle estimating system that downloads part price data |
US7523910B2 (en) | 2006-06-08 | 2009-04-28 | Sean Thomas Moran | Device and method for the suspension of objects |
US8239220B2 (en) | 2006-06-08 | 2012-08-07 | Injury Sciences Llc | Method and apparatus for obtaining photogrammetric data to estimate impact severity |
US7458238B2 (en) | 2006-07-13 | 2008-12-02 | Stolk Frank M | Load binder locking device |
US20100231692A1 (en) * | 2006-07-31 | 2010-09-16 | Onlive, Inc. | System and method for performing motion capture and image reconstruction with transparent makeup |
EP2064676B1 (en) | 2006-09-21 | 2011-09-07 | Thomson Licensing | A method and system for three-dimensional model acquisition |
US20090138290A1 (en) | 2006-09-26 | 2009-05-28 | Holden Johnny L | Insurance adjustment through digital imaging system and method |
US7984500B1 (en) | 2006-10-05 | 2011-07-19 | Amazon Technologies, Inc. | Detecting fraudulent activity by analysis of information requests |
DE102006048578B4 (en) | 2006-10-13 | 2010-06-17 | Gerhard Witte | Method and apparatus for determining the change in the shape of a three-dimensional object |
US8025125B2 (en) | 2006-11-03 | 2011-09-27 | D B Industries, Inc. | Anchor assembly |
US9633426B2 (en) * | 2014-05-30 | 2017-04-25 | General Electric Company | Remote visual inspection image capture system and method |
FR2910876B1 (en) | 2007-01-02 | 2009-06-05 | Jannick Simeray | Helicopter with automatic pilot
US7979199B2 (en) | 2007-01-10 | 2011-07-12 | Honeywell International Inc. | Method and system to automatically generate a clearance request to deviate from a flight plan |
WO2008118977A1 (en) | 2007-03-26 | 2008-10-02 | Desert Research Institute | Data analysis process |
US20080255887A1 (en) | 2007-04-10 | 2008-10-16 | Autoonline Gmbh Informationssysteme | Method and system for processing an insurance claim for a damaged vehicle |
US20100094664A1 (en) | 2007-04-20 | 2010-04-15 | Carfax, Inc. | Insurance claims and rate evasion fraud system based upon vehicle history |
DE102007032084A1 (en) | 2007-07-09 | 2009-01-22 | Eads Deutschland Gmbh | Collision and conflict prevention system for autonomous unmanned aerial vehicles (UAVs)
US20090018717A1 (en) | 2007-07-11 | 2009-01-15 | Keith Reed | Vehicle auto-guidance memory |
US20090028003A1 (en) * | 2007-07-24 | 2009-01-29 | International Business Machines Corporation | Apparatus and method for sensing of three-dimensional environmental information |
US20090055226A1 (en) | 2007-08-20 | 2009-02-26 | American International Group, Inc. | Method and system for determining rates of insurance |
US8019447B2 (en) | 2007-09-14 | 2011-09-13 | The Boeing Company | Method and system to control operation of a device using an integrated simulation with a time shift option |
US8041637B1 (en) | 2007-12-05 | 2011-10-18 | United Services Automobile Association (Usaa) | Systems and methods for automated payment processing |
US8200025B2 (en) | 2007-12-07 | 2012-06-12 | University Of Ottawa | Image classification and search |
US8120376B2 (en) * | 2007-12-12 | 2012-02-21 | Novellus Systems, Inc. | Fault detection apparatuses and methods for fault detection of semiconductor processing tools |
FR2927262B1 (en) | 2008-02-13 | 2014-11-28 | Parrot | Method for controlling a rotary wing drone
WO2009129496A2 (en) | 2008-04-17 | 2009-10-22 | The Travelers Indemnity Company | A method of and system for determining and processing object structure condition information |
CA2734143C (en) * | 2008-08-15 | 2021-08-31 | Brown University | Method and apparatus for estimating body shape |
US8521339B2 (en) | 2008-09-09 | 2013-08-27 | Aeryon Labs Inc. | Method and system for directing unmanned vehicles |
JP5173721B2 (en) | 2008-10-01 | 2013-04-03 | キヤノン株式会社 | Document processing system, control method therefor, program, and storage medium |
US8265963B1 (en) | 2008-10-13 | 2012-09-11 | Allstate Insurance Company | Communication of insurance claim data |
US8731234B1 (en) | 2008-10-31 | 2014-05-20 | Eagle View Technologies, Inc. | Automated roof identification systems and methods |
US8392036B2 (en) | 2009-01-08 | 2013-03-05 | Raytheon Company | Point and go navigation system and method |
US20100228406A1 (en) | 2009-03-03 | 2010-09-09 | Honeywell International Inc. | UAV Flight Control Method And System |
US8380367B2 (en) | 2009-03-26 | 2013-02-19 | The University Of North Dakota | Adaptive surveillance and guidance system for vehicle collision avoidance and interception |
US8558194B2 (en) * | 2009-04-10 | 2013-10-15 | The Penn State Research Foundation | Interactive coatings, surfaces and materials |
JP5189679B2 (en) | 2009-04-15 | 2013-04-24 | パイオニア株式会社 | Active vibration noise control device |
FR2945630B1 (en) | 2009-05-14 | 2011-12-30 | Airbus France | Method and system for remotely inspecting a structure
US8977407B2 (en) | 2009-05-27 | 2015-03-10 | Honeywell International Inc. | Adaptive user interface for semi-automatic operation |
US20100302359A1 (en) | 2009-06-01 | 2010-12-02 | Honeywell International Inc. | Unmanned Aerial Vehicle Communication |
US8651440B2 (en) | 2009-07-08 | 2014-02-18 | Steven J. Hollinger | Portable multi-purpose mast for rapid, secure attachment to unsteady, inclined and irregular surfaces |
US8755923B2 (en) | 2009-12-07 | 2014-06-17 | Engineering Technology Associates, Inc. | Optimization system |
DE102009058802B4 (en) | 2009-12-18 | 2018-03-29 | Airbus Operations Gmbh | Arrangement for the combined representation of a real and a virtual model |
US20120013617A1 (en) * | 2009-12-30 | 2012-01-19 | Institute Of Automation, Chinese Academy Of Sciences | Method for global parameterization and quad meshing on point cloud |
US9558520B2 (en) | 2009-12-31 | 2017-01-31 | Hartford Fire Insurance Company | System and method for geocoded insurance processing using mobile devices |
US9097532B2 (en) | 2010-01-20 | 2015-08-04 | Honeywell International Inc. | Systems and methods for monocular airborne object detection |
CA2801486C (en) | 2010-02-01 | 2018-12-11 | Eagle View Technologies, Inc. | Geometric correction of rough wireframe models derived from photographs |
FR2957266B1 (en) | 2010-03-11 | 2012-04-20 | Parrot | Method and apparatus for remote control of a drone, in particular a rotary wing drone
US9014415B2 (en) | 2010-04-22 | 2015-04-21 | The University Of North Carolina At Charlotte | Spatially integrated aerial photography for bridge, structure, and environmental monitoring |
US20110302091A1 (en) | 2010-06-07 | 2011-12-08 | Robert Hornedo | Auto Repair Estimation System |
WO2012003512A2 (en) | 2010-07-02 | 2012-01-05 | Sandel Avionics, Inc. | Aircraft hover system and method |
FR2963356B1 (en) | 2010-07-29 | 2014-08-22 | Arjowiggins Security | Security structure incorporating phosphorescent and fluorescent compositions
US8645060B2 (en) | 2010-09-07 | 2014-02-04 | Qualcomm Incorporated | Positioning network availability and reliability based routing |
US8525830B2 (en) | 2010-09-17 | 2013-09-03 | The Boeing Company | Point cloud generation system |
US8120522B2 (en) | 2010-11-30 | 2012-02-21 | General Electric Company | System and method for inspecting a wind turbine blade |
US8401879B1 (en) | 2010-12-15 | 2013-03-19 | United Services Automobile Association (Usaa) | Umbrella travel insurance |
US8983806B2 (en) | 2011-01-11 | 2015-03-17 | Accurence, Inc. | Method and system for roof analysis |
DE102011010679A1 (en) | 2011-02-08 | 2012-08-09 | Eads Deutschland Gmbh | Unmanned aircraft with built-in collision warning system |
US20120215380A1 (en) | 2011-02-23 | 2012-08-23 | Microsoft Corporation | Semi-autonomous robot that supports multiple modes of navigation |
EP2691996A4 (en) * | 2011-03-30 | 2015-01-28 | Ambature Inc | Electrical, mechanical, computing, and/or other devices formed of extremely low resistance materials |
US8651206B2 (en) | 2011-03-31 | 2014-02-18 | Tobor Technology, Llc | Roof inspection systems and methods of use |
EP2511659A1 (en) | 2011-04-14 | 2012-10-17 | Hexagon Technology Center GmbH | Geodesic marking system for marking target points |
DE102011017564B4 (en) | 2011-04-26 | 2017-02-16 | Airbus Defence and Space GmbH | Method and system for inspecting a surface for material defects |
US8738198B2 (en) | 2011-05-26 | 2014-05-27 | Foster-Miller, Inc. | Robot surveillance system and method |
GB2496834B (en) | 2011-08-23 | 2015-07-22 | Toshiba Res Europ Ltd | Object location method and system |
EP2562682B1 (en) | 2011-08-24 | 2014-10-08 | DERMALOG Identification Systems GmbH | Method and device for capture of a fingerprint with authenticity recognition |
US8537338B1 (en) | 2011-08-24 | 2013-09-17 | Hrl Laboratories, Llc | Street curb and median detection using LIDAR data |
US20130233964A1 (en) | 2012-03-07 | 2013-09-12 | Aurora Flight Sciences Corporation | Tethered aerial system for data gathering |
US20130262153A1 (en) | 2012-03-28 | 2013-10-03 | The Travelers Indemnity Company | Systems and methods for certified location data collection, management, and utilization |
US9767598B2 (en) | 2012-05-31 | 2017-09-19 | Microsoft Technology Licensing, Llc | Smoothing and robust normal estimation for 3D point clouds |
US8775219B2 (en) | 2012-07-13 | 2014-07-08 | Northrop Grumman Systems Corporation | Spectral image classification of rooftop condition for use in property insurance |
US8510196B1 (en) | 2012-08-16 | 2013-08-13 | Allstate Insurance Company | Feedback loop in mobile damage assessment and claims processing |
US9002719B2 (en) | 2012-10-08 | 2015-04-07 | State Farm Mutual Automobile Insurance Company | Device and method for building claim assessment |
ITPR20130007A1 (en) | 2013-02-05 | 2014-08-06 | Method for tracking materials and healthcare goods with an RFID identification system, and containment area and/or structure implementing the method
US20140245165A1 (en) | 2013-02-28 | 2014-08-28 | Donan Engineering Co., Inc. | Systems and methods for collecting and representing attributes related to damage in a geographic area |
US8931144B2 (en) | 2013-03-14 | 2015-01-13 | State Farm Mutual Automobile Insurance Company | Tethering system and method for remote device |
US8872818B2 (en) | 2013-03-15 | 2014-10-28 | State Farm Mutual Automobile Insurance Company | Methods and systems for capturing the condition of a physical structure |
US9082015B2 (en) | 2013-03-15 | 2015-07-14 | State Farm Mutual Automobile Insurance Company | Automatic building assessment |
US8874454B2 (en) | 2013-03-15 | 2014-10-28 | State Farm Mutual Automobile Insurance Company | Systems and methods for assessing a roof |
US8818572B1 (en) | 2013-03-15 | 2014-08-26 | State Farm Mutual Automobile Insurance Company | System and method for controlling a remote aerial device for up-close inspection |
CN104107067A (en) | 2013-04-16 | 2014-10-22 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic diagnosis equipment and ultrasonic diagnosis method supporting multi-probe synchronous scanning |
ES2719492T3 (en) * | 2013-07-16 | 2019-07-10 | Polyrix Inc | Inspection system for inspecting an object and corresponding inspection method
US9759547B2 (en) * | 2014-08-19 | 2017-09-12 | The Boeing Company | Systems and methods for fiber placement inspection during fabrication of fiber-reinforced composite components |
JP6457072B2 (en) * | 2014-09-11 | 2019-01-23 | サイバーオプティクス コーポレーション | Integration of point clouds from multiple cameras and light sources in 3D surface shape measurement |
US10036801B2 (en) * | 2015-03-05 | 2018-07-31 | Big Sky Financial Corporation | Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array |
US10289770B2 (en) * | 2015-04-13 | 2019-05-14 | Bell Helicopter Textron Inc. | Rotorcraft component simulation using scan-based geometry |
CA2995850A1 (en) | 2015-08-31 | 2017-03-09 | Ryan Kottenstette | Systems and methods for analyzing remote sensing imagery |
WO2020102339A1 (en) | 2018-11-14 | 2020-05-22 | Cape Analytics, Inc. | Systems, methods, and computer readable media for predictive analytics and change detection from remotely sensed imagery |
EP4133235A4 (en) | 2020-04-10 | 2024-04-03 | Cape Analytics, Inc. | System and method for geocoding |
US11222426B2 (en) | 2020-06-02 | 2022-01-11 | Cape Analytics, Inc. | Method for property feature segmentation |
WO2022082007A1 (en) | 2020-10-15 | 2022-04-21 | Cape Analytics, Inc. | Method and system for automated debris detection |
2013
- 2013-03-15 US US13/836,695 patent/US8872818B2/en active Active

2014
- 2014-02-28 CA CA2844320A patent/CA2844320C/en active Active
- 2014-09-25 US US14/496,840 patent/US9292630B1/en active Active
- 2014-09-25 US US14/496,802 patent/US9262788B1/en active Active

2015
- 2015-02-25 US US14/631,558 patent/US9131224B1/en active Active
- 2015-02-25 US US14/631,568 patent/US9336552B1/en active Active
- 2015-08-06 US US14/820,328 patent/US9958387B1/en active Active
- 2015-12-14 US US14/968,147 patent/US10013708B1/en active Active

2016
- 2016-01-15 US US14/997,154 patent/US9519058B1/en active Active
- 2016-04-20 US US15/134,273 patent/US9959608B1/en active Active
- 2016-11-04 US US15/344,268 patent/US9996970B2/en active Active

2018
- 2018-02-22 US US15/902,349 patent/US20180182087A1/en not_active Abandoned
- 2018-02-22 US US15/902,354 patent/US10176632B2/en active Active
- 2018-03-26 US US15/935,989 patent/US10242497B2/en active Active
- 2018-05-10 US US15/975,836 patent/US10679262B1/en active Active

2020
- 2020-03-26 US US16/831,518 patent/US11270504B2/en active Active
- 2020-03-26 US US16/831,547 patent/US11295523B2/en active Active

2022
- 2022-03-07 US US17/687,843 patent/US11694404B2/en active Active

2023
- 2023-05-17 US US18/198,760 patent/US12039669B2/en active Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10679262B1 (en) | 2013-03-15 | 2020-06-09 | State Farm Mutual Automobile Insurance Company | Estimating a condition of a physical structure |
US11270504B2 (en) | 2013-03-15 | 2022-03-08 | State Farm Mutual Automobile Insurance Company | Estimating a condition of a physical structure |
US11295523B2 (en) | 2013-03-15 | 2022-04-05 | State Farm Mutual Automobile Insurance Company | Estimating a condition of a physical structure |
US11694404B2 (en) | 2013-03-15 | 2023-07-04 | State Farm Mutual Automobile Insurance Company | Estimating a condition of a physical structure |
US12039669B2 (en) | 2013-03-15 | 2024-07-16 | State Farm Mutual Automobile Insurance Company | Estimating a condition of a physical structure |
US10997668B1 (en) | 2016-04-27 | 2021-05-04 | State Farm Mutual Automobile Insurance Company | Providing shade for optical detection of structural features |
Also Published As
Publication number | Publication date |
---|---|
US9336552B1 (en) | 2016-05-10 |
US9131224B1 (en) | 2015-09-08 |
US10242497B2 (en) | 2019-03-26 |
CA2844320A1 (en) | 2014-09-15 |
US10679262B1 (en) | 2020-06-09 |
US20140267627A1 (en) | 2014-09-18 |
US11295523B2 (en) | 2022-04-05 |
US9262788B1 (en) | 2016-02-16 |
US9519058B1 (en) | 2016-12-13 |
US20180218537A1 (en) | 2018-08-02 |
US11694404B2 (en) | 2023-07-04 |
US9996970B2 (en) | 2018-06-12 |
US20170076493A1 (en) | 2017-03-16 |
US8872818B2 (en) | 2014-10-28 |
US20230290066A1 (en) | 2023-09-14 |
US12039669B2 (en) | 2024-07-16 |
US9292630B1 (en) | 2016-03-22 |
US10013708B1 (en) | 2018-07-03 |
US20220189117A1 (en) | 2022-06-16 |
US10176632B2 (en) | 2019-01-08 |
US9958387B1 (en) | 2018-05-01 |
US20200387939A1 (en) | 2020-12-10 |
US9959608B1 (en) | 2018-05-01 |
US20200387940A1 (en) | 2020-12-10 |
US11270504B2 (en) | 2022-03-08 |
US20180180543A1 (en) | 2018-06-28 |
CA2844320C (en) | 2021-02-23 |
Similar Documents
Publication | Title
---|---
US12039669B2 (en) | Estimating a condition of a physical structure
US10853931B2 (en) | System and method for structural inspection and construction estimation using an unmanned aerial vehicle
US10997668B1 (en) | Providing shade for optical detection of structural features
CN102763420B (en) | Depth camera compatibility
Taylor et al. | Automatic calibration of lidar and camera images using normalized mutual information
US20220171068A1 (en) | Damage detection and analysis using three-dimensional surface scans
Santise et al. | Preliminary tests of a new low-cost photogrammetric system
KR102542556B1 (en) | Method and system for real-time detection of major vegetation in wetland areas and location of vegetation objects using high-resolution drone video and deep learning object recognition technology
Mahmoudzadeh et al. | Kinect, a novel cutting edge tool in pavement data collection
JP6216353B2 (en) | Information identification system, information identification method, and program thereof
US20240378813A1 (en) | Estimating a condition of a physical structure
CN112561874A (en) | Blocking object detection method and device and monitoring camera
US11674891B1 (en) | Systems and methods for detecting properties relating to building components
JP7226553B2 (en) | Information processing device, data generation method, and program
KR102054438B1 (en) | Building Management Device, Building Management Autonomous Flight Device
Zainuddin et al. | 3D modelling method of high above ground rock art painting using multispectral camera
US20240134007A1 (en) | System and Method for Robotic Inspection
Elhassan | 3D Modeling of Indoor Building Geometry Using Unmanned Aerial Systems
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: STATE FARM MUTUAL AUTOMOBILE INSURANCE COMPANY, IL. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: FREEMAN, JAMES M.; SCHMIDGALL, ROGER D.; BOYER, PATRICK H.; AND OTHERS. REEL/FRAME: 045359/0059. Effective date: 20130315
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE