US20160249021A1 - 3d asset inspection - Google Patents
- Publication number
- US20160249021A1 (U.S. patent application Ser. No. 15/050,898)
- Authority
- US
- United States
- Prior art keywords
- data
- sensor
- probe
- camera
- images
- Prior art date
- Legal status: Abandoned (an assumption based on the listed status data, not a legal conclusion)
Classifications
- H04N7/185 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a single remote source from a mobile camera, e.g. for remote control
- G01B11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/002 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/894 — 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
- H04N13/0282
- H04N13/0289
- H04N5/23203
Definitions
- Embodiments of the present invention generally relate to the fields of three-dimensional (3D) imaging and inspection of physical assets.
- In particular, various embodiments relate to systems and methods for vertical structure inspection based on 3D data generated based on depth sensor information and 2D data generated based on imaging data captured by one or more video cameras.
- a probe is positioned to multiple data capture positions with reference to a physical asset.
- for each of the data capture positions: (i) data is collected regarding the physical asset by performing a data collection process including: (a) reading, by a central processing unit (CPU) of the probe, odometry data from one or more of an encoder and an inertial measurement unit (IMU) attached to or integrated with the probe; (b) capturing, by a camera attached to or integrated with the probe having a first view plane, one or more two-dimensional (2D) images; and (c) capturing, by a three-dimensional (3D) sensor attached to or integrated with the probe and having a second view plane overlapping that of the first view plane, one or more 3D sensor data frames; and (ii) performing a data synthesis process including: (d) linking, by the CPU, the odometry data, the one or more 2D images and the one or more 3D sensor data frames; (e) associating, by the CPU, the one or more 2D images and the one or more 3D sensor data frames with a physical point in real-world space based on the odometry data; and (f) facilitating subsequent ability on behalf of a user navigating the collected data to switch between a 2D view and a 3D view by forming, by the CPU, a set of points each containing both 3D data and 2D data by performing UV mapping based on the one or more 2D images, the one or more 3D sensor data frames and based on a known physical geometry of positioning of the camera relative to the 3D sensor.
- FIG. 1 is a perspective view of an asset inspection robot in accordance with an embodiment of the present invention.
- FIG. 2 is a block diagram conceptually illustrating various functional units of an asset inspection robot in accordance with an embodiment of the present invention.
- FIG. 3 is a block diagram conceptually illustrating an asset inspection system in accordance with an embodiment of the present invention.
- FIG. 4 is a flow diagram illustrating data collection processing in accordance with an embodiment of the present invention.
- FIG. 5 is a flow diagram illustrating data synthesis processing in accordance with an embodiment of the present invention.
- FIG. 6 is a flow diagram illustrating data augmentation processing in accordance with an embodiment of the present invention.
- FIG. 7 conceptually illustrates the linking of data captured from various sensors, including 2D cameras and 3D sensors, with a physical point that exists in the real world at a given time in accordance with an embodiment of the present invention.
- FIG. 8 is a block diagram that conceptually illustrates a method for using multiple cameras to collect images, video, or 3D data from a single nodal point in accordance with an embodiment of the present invention.
- FIG. 9 illustrates a user interface screen shot in accordance with an embodiment of the present invention.
- FIG. 10 is an exemplary computer system in which or with which embodiments of the present invention may be utilized.
- a tethered robot includes one or more 2D cameras, one or more 3D depth sensors, a GPS sensor, a gas sensor and a custom software package.
- the 2D and 3D sensors capture visual data of a structure, the GPS sensor orients the robot to the physical world, and the software creates an immersive representation of this data for assessment purposes.
- automated guided vehicles such as quadracopters, aerial or submersible drones or other remote flying or underwater vehicles may be used for inspection of telecommunications structures (e.g., base tower stations, antennas, masts, latticed towers and cell phone towers) and commercial/industrial structures (e.g., skyscrapers, bridges, platforms, water tanks, water processing systems, factories, oil rigs, solar paneling and other civil infrastructure).
- the inspection technologies described herein may be incorporated or integrated within crawling, legged, line following or wheeled robots. Therefore, the specific examples of transportation bodies presented and/or described herein are not intended to be limiting and are merely exemplary.
- sensors and/or cameras may be mounted on or off the central axis and may rotate about or relative to the body of the probe.
- Embodiments of the present invention include various steps, which will be described below.
- the steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps.
- the steps may be performed by a combination of hardware, software, firmware and/or by human operators.
- Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process.
- the machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
- embodiments of the present invention may also be downloaded as one or more computer program products, wherein the program may be transferred from a remote computer to a requesting computer by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
- the computer programming code may be used by executing the code directly from the machine-readable storage medium or by copying the code from the machine-readable storage medium into another machine-readable storage medium (e.g., a hard disk, RAM, etc.) or by transmitting the code on a network for remote execution.
- Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein.
- An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
- the code implementing various embodiments of the present invention is not so limited.
- the code may reflect other programming paradigms and/or styles, including, but not limited to, object-oriented programming (OOP), agent oriented programming, aspect-oriented programming, attribute-oriented programming (@OP), automatic programming, dataflow programming, declarative programming, functional programming, event-driven programming, feature oriented programming, imperative programming, semantic-oriented programming, genetic programming, logic programming, pattern matching programming and the like.
- 2D camera or the term “camera” generally refer to a device for recording visual images in the form of photographs, film and/or video signals.
- 3D sensor generally refers to a device using a remote sensing technology.
- a 3D sensor may measure distance by illuminating a target with a laser and analyzing the reflected light.
- Examples of 3D sensors include, but are not limited to, a LiDAR, time of flight camera, structured light camera or laser displacement sensor.
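- By way of a non-limiting illustration (not part of the original disclosure), the time-of-flight principle underlying such sensors reduces to converting a laser pulse's measured round-trip time into a range; the sketch below assumes illustrative names and values:

```python
# Minimal time-of-flight range calculation; names and the example
# round-trip time are illustrative, not taken from the patent.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_range_m(round_trip_seconds: float) -> float:
    """Range to the target: the pulse travels out and back, hence the /2."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# A pulse returning after ~13.3 nanoseconds indicates a target ~2 m away.
print(f"{tof_range_m(13.3e-9):.2f} m")
```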
- connection or coupling and related terms are used in an operational sense and are not necessarily limited to a direct connection or coupling.
- two devices may be coupled directly, or via one or more intermediary media or devices.
- devices may be coupled in such a way that information can be passed there between, while not sharing any physical connection with one another.
- connection or coupling exists in accordance with the aforementioned definition.
- probe generally refers to a physical structure with one or more 2D cameras and one or more 3D sensors attached to it.
- responsive includes completely or partially responsive.
- robot generally refers to a mechanical or electro-mechanical machine that may be guided or controlled by a computer program, electronic circuitry and/or a human operator.
- Non-limiting examples of robots include automated guided vehicles, such as quadracopters, aerial or submersible drones or other remote flying or underwater vehicles. Robots may move by flying, swimming, crawling, using legs, following lines or by being wheel-based.
- asset inspection robot 100 includes multiple sensors, including one or more 2D cameras 110 and one or more 3D sensors 120 .
- cameras and sensors 110 and 120 are arranged in such a way that the intersection of their respective view cones provides image coverage of the entire structure surface being inspected when the robot is moved vertically.
- cameras and sensors 110 and 120 may be arranged such that their nodal points are in vertical alignment so as to avoid parallax error and provide for proper registration when stitching the captured sensor data together.
- the robot body 150 is typically shaped in a way that provides for complete imaging with no sensor occlusion.
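- As a hypothetical sizing aid (the patent specifies no numeric values), the band of wall imaged by one camera, and hence an upper bound on the vertical step between capture positions, can be estimated from the camera's vertical field of view and the radial distance to the structure surface:

```python
import math

def vertical_coverage_m(vertical_fov_deg: float, wall_distance_m: float) -> float:
    """Height of the wall band covered by one camera at a given radial distance."""
    return 2.0 * wall_distance_m * math.tan(math.radians(vertical_fov_deg / 2.0))

# Assumed values: a 95-degree vertical FOV viewed 0.6 m from the wall.
band_m = vertical_coverage_m(95.0, 0.6)
step_m = 0.8 * band_m  # keep ~20% overlap between bands for stitching
```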
- sensors may be installed on robot 100 to facilitate collection of additional data.
- sensors include inertial measurement units (IMUs), orientation sensors, accelerometers, radar guns, metal detectors, voltage detectors, smoke detectors, humistors, flow sensors, depth gauges, gyroscopes, compasses, barometers, thermometers, proximity sensors, motion detectors and gas sensors.
- a gas sensor 130 may be employed to detect various gases within the structure being inspected.
- a sonar or laser 140 may be used to detect the bottom surface of the structure.
- Light sources 160 may also be integrated with or attached to robot 100 to provide lighting for the sensors.
- FIG. 2 is a block diagram conceptually illustrating various functional units of an asset inspection robot 200 in accordance with an embodiment of the present invention.
- a printed circuit board (PCB) 217 with a microcontroller (not shown) is installed inside robot 200 to control various electro-mechanical functions.
- a central processing unit (CPU) 218 is installed inside robot 200 to process sensor data from sensors (e.g., sensors 211 , 212 , 213 and/or 214 ) to perform various processing, which is described further below, to prepare the gathered sensor data for human analysis.
- CPU 218 is also responsible for writing the captured sensor data to a storage mechanism 220 .
- storage mechanism 220 may take a variety of forms and may be local or remote.
- non-limiting examples of appropriate mass storage systems may include one or more hard drives, magnetic tape drives, magneto-optical disc drives, optical disc drives and/or solid-state drives (SSDs).
- captured sensor data may be temporarily buffered in a random access memory (RAM) (not shown) within robot 200 and stored remotely from robot 200 by transmitting the captured sensor data by wired or wireless means to a remote site.
- Robot 200 may be powered by a battery 219 and/or may be connected to an external source of direct or alternating current.
- FIG. 3 is a block diagram conceptually illustrating an asset inspection system 300 in accordance with an embodiment of the present invention.
- a robot 310 may be suspended by one or more (e.g., 3 ) cables or belts 321 and deployed into the structure by a winch system 324 consisting of a motor (not shown) and controls 323 .
- Winch system 324 may be powered by a battery (not shown) or an external source of direct or alternating current.
- An operator may use controls 323 to automatically lower robot 310 into the structure to perform the inspection.
- Advantages of various embodiments of the present invention include, without limitation, that they may capture qualitative and quantitative inspection data in a way that is consistent, repeatable, and complete. By using a robot, a human does not have to enter the structure to perform the inspection, and inspections can be performed far more quickly than is typical using conventional inspection methods.
- FIGS. 4-6 collectively illustrate a process used to collect asset inspection data and augment this data with human observations for the purpose of analysis in accordance with an embodiment of the present invention.
- FIG. 4 is a flow diagram illustrating data collection processing in accordance with an embodiment of the present invention.
- the data collection process is described with reference to a tethered probe that uses a vertical arrangement of 2D and 3D sensors.
- the probe is moved into a position such that the bottom-most 2D camera is able to image the top-most part of the asset.
- odometry information is read from the encoders and recorded so that the probe can be oriented in real-world space.
- a 2D image or set of images (video) is captured from the bottom-most 2D camera.
- data from other sensors such as IMU data, gas detection data, or any other data type is also recorded.
- at decision block 460, it is determined whether the probe is at the bottom of the structure.
- data from a downward-facing 3D sensor or a laser dot projected onto the asset bottom may be used to detect whether the probe is at the bottom of the area to be inspected. If so, then data collection processing is complete and at block 470 the scan is terminated and the probe returns to the docked position. Otherwise, data collection processing continues by looping back to block 410 at which point the probe is moved so that the 2D and 3D sensor(s) directly above the aforementioned 2D and 3D sensor(s) are in a position that places their nodal points in the same position as the aforementioned nodal points as described further below with reference to FIG. 8. Data is captured and recorded from all sensors using blocks 420 through 460 as above until the bottom is detected.
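- The control flow of blocks 410 through 470 can be summarized as the following sketch; the probe interface is hypothetical, since the patent describes behavior rather than an API:

```python
def run_scan(probe):
    """Hypothetical driver for the FIG. 4 data collection loop."""
    probe.move_to_start()                        # block 410
    while True:
        probe.store({
            "odometry": probe.read_encoders(),   # block 420
            "images": probe.capture_2d(),        # block 430
            "depth": probe.capture_3d(),         # block 440 (overlapping view plane)
            "other": probe.read_other_sensors(), # block 450: IMU, gas, etc.
        })
        if probe.at_bottom():                    # decision block 460
            probe.return_to_dock()               # block 470
            return
        probe.step_to_next_nodal_position()      # loop back to block 410
```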
- FIG. 5 is a flow diagram illustrating data synthesis processing in accordance with an embodiment of the present invention.
- the collected data is synthesized for later use.
- the purpose of the data synthesis process is twofold. First, discrete data sets (2D image frames, 3D data frames, other sensor data, odometry and IMU data, real world data and the like) are linked together. Second, the data is prepared for presentation in software so that a human can augment and/or analyze the data.
- Odometry generally refers to encoder data, IMU data, and any other data collected that contribute to defining the position of the probe at the time of the first data set capture as well as the position of each sensor in relation to the probe itself and the other sensors.
- An example of an encoder is a shaft encoder that counts a fraction of a revolution of a motor shaft or a drive axle of a wheel. Using multiple such encoders in the context of a differentially steered robot with a pair of drive wheels and a castering tail or nose wheel, for example, allows both velocity and direction of travel (e.g., the heading in degrees) to be determined.
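- A minimal sketch of that computation, using standard differential-drive kinematics (the wheel radius, track width and encoder resolution below are assumed values, not from the patent):

```python
import math

# Assumed geometry and resolution for illustration only.
WHEEL_RADIUS_M = 0.05   # drive wheel radius
TRACK_WIDTH_M = 0.30    # distance between the two drive wheels
TICKS_PER_REV = 1024    # encoder ticks per shaft revolution

def odometry_step(left_ticks: int, right_ticks: int, dt_s: float):
    """Forward velocity (m/s) and heading change (degrees) for one interval."""
    d_left = 2 * math.pi * WHEEL_RADIUS_M * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS_M * right_ticks / TICKS_PER_REV
    velocity = (d_left + d_right) / (2 * dt_s)
    d_heading = math.degrees((d_right - d_left) / TRACK_WIDTH_M)
    return velocity, d_heading
```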
- the first 2D image captured is related to real-world space using odometry and sensor index information. If other 2D images (e.g., video) have been captured from the same point, these are also related to real-world space.
- A conceptual illustration of a process for linking data captured from various sensors with a physical point in the real world is described below with reference to FIG. 7.
- 3D data frames are related to 2D data. In one embodiment, this is performed using relative sensor position. Because the physical position of the 3D data sensors relative to the 2D cameras is a known physical geometry, each 3D data frame may be accurately related to a 2D data frame. Each point in a 3D data frame can be related to a pixel in the 2D data frame such that a set of points containing both 3D (X,Y,Z) data and 2D data (R, G, B, a) is formed. This process, called UV mapping, may be repeated for all matching 2D and 3D data.
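- A minimal sketch of such a UV-mapping step, assuming a pinhole camera model with made-up intrinsics and a fixed 3D-sensor-to-camera transform standing in for the probe's known sensor geometry:

```python
import numpy as np

# Assumed pinhole intrinsics (K) and sensor-to-camera transform (R, t);
# these are illustrative stand-ins, and alpha handling is omitted.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                    # rotation: 3D sensor frame -> camera frame
t = np.array([0.0, -0.10, 0.0])  # translation between the two sensors (meters)

def colorize_points(points_xyz: np.ndarray, image_rgb: np.ndarray) -> np.ndarray:
    """Attach (R, G, B) from the 2D image to each 3D point that projects into it."""
    cam = points_xyz @ R.T + t               # express points in the camera frame
    in_front = cam[:, 2] > 0                 # keep points ahead of the lens
    uvw = cam[in_front] @ K.T
    uv = np.rint(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    h, w, _ = image_rgb.shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    colors = image_rgb[uv[inside, 1], uv[inside, 0]]
    return np.hstack([points_xyz[in_front][inside], colors])  # (X, Y, Z, R, G, B)
```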
- any other sensor data is also related to the 2D and 3D data sets using odometry and sensor index.
- the data may be post-processed for display in software. This post-processing may include assembling the 2D images into cube maps or other environment maps, smoothing or blending the 2D images to compensate for exposure differences, concatenating various 3D data sets using odometry and/or algorithmic functions, smoothing other sensor data, etc.
- image post-processing may be performed prior to UV mapping.
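- As one simplified example of the exposure compensation mentioned above (real blending pipelines are considerably more involved), a per-image gain can be chosen so that the overlap shared by two adjacent captures matches in mean brightness:

```python
import numpy as np

def exposure_gain(overlap_ref: np.ndarray, overlap_adj: np.ndarray) -> float:
    """Gain that brings the adjacent image's overlap to the reference brightness."""
    return float(overlap_ref.mean()) / max(float(overlap_adj.mean()), 1e-6)

def apply_gain(image: np.ndarray, gain: float) -> np.ndarray:
    """Scale an 8-bit image by the computed gain, clipping to valid range."""
    return np.clip(image.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```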
- FIG. 6 is a flow diagram illustrating data augmentation processing in accordance with an embodiment of the present invention.
- data is displayed to the user through a software package during the data augmentation process.
- this software package projects the 2D and 3D data in an immersive environment that allows the user to navigate the asset as if the user was looking through a camera at the asset.
- An exemplary user interface screen, in accordance with an embodiment of the present invention, is illustrated in FIG. 9 .
- the user identifies a region of interest as it is being displayed or projected onto a screen.
- the view matrix of the software projection may be recorded.
- the user may enter qualitative data about one or more pixels that are currently projected on the screen.
- the qualitative data is written to a database by the software package and associated with both the view matrices and the appropriate 2D imagery (including frame numbers if the 2D imagery is video data).
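- One plausible shape for such records is sketched below; the patent does not specify a schema, so the table and column names are illustrative only:

```python
import sqlite3

# Illustrative record layout linking a note to a view matrix and 2D imagery.
conn = sqlite3.connect("inspection.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS observations (
    id            INTEGER PRIMARY KEY,
    inspection_id INTEGER NOT NULL,
    view_matrix   TEXT NOT NULL,    -- serialized view matrix of the projection
    image_id      INTEGER NOT NULL, -- which 2D capture the note refers to
    frame_number  INTEGER,          -- set when the 2D imagery is video
    pixel_u       INTEGER,
    pixel_v       INTEGER,
    note          TEXT              -- the user's qualitative description
)""")
conn.commit()
```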
- the user is provided with the ability to switch between 2D views, 3D (point cloud) views, flattened views, or any other view as desired. Because the pixels in these views are linked to one another the qualitative data will also be linked to specific view matrices and pixels in each view.
- the user may also measure the distance between one or more pixels in the software projection and another pixel in the same projection or a different projection. For example, if two pixels are selected, the distance between the two pixels in real-world space may be calculated by the software. If more than two pixels are selected, the resulting circle, oval, or polygon may be constructed by the software and relevant geometry in real-world space may be calculated. This data may be written to a database.
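- Since each selected pixel is linked to a 3D point, such measurement reduces to geometry on real-world coordinates; a minimal sketch:

```python
import math

def distance_m(p1, p2):
    """Real-world distance between the 3D points linked to two selected pixels."""
    return math.dist(p1, p2)

def polygon_perimeter_m(vertices):
    """Perimeter of the closed polygon formed by three or more selected pixels."""
    return sum(math.dist(a, b) for a, b in zip(vertices, vertices[1:] + vertices[:1]))

# Two pixels whose UV-mapped points are 0.4 m apart on the asset surface:
print(distance_m((0.0, 0.0, 1.2), (0.4, 0.0, 1.2)))  # 0.4
```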
- the resulting 2D images, 3D data, other sensor data, and augmented data may be used to assess the condition of the asset.
- the user may choose to compare the same point over time (either using video data or two different inspections), compare different points at the same time, compare different points over different times, or perform any other analysis desired.
- FIG. 7 conceptually illustrates the linking of data captured from various sensors, including 2D cameras and 3D sensors, with a physical point that exists in the real world at a given time in accordance with an embodiment of the present invention.
- a physical asset is made up of many points (e.g., physical point 710 ), which may be on a surface of the physical asset or which may be below the surface of the physical asset. These points exist at specific locations in the real-world. The physical nature of the points may change or evolve over time. For example, a component of the asset may break down.
- a 2D image 711 that includes physical point 710 is captured with a camera. This capture (or, in the case of video, series of captures) occurs in a known time period.
- 3D data 712 from a 3D sensor is captured.
- This 3D data also includes physical point 710 of the asset.
- the physical orientation of the 3D sensor with respect to the 2D camera is also known.
- other data points 713 may also be recorded within the same space and/or time that encompasses physical point 710 .
- a human may augment the data collected about physical point 710 in 711 , 712 , 713 with qualitative and quantitative observations 714 .
- Qualitative observations may include, but are not limited to, a textual description of point 710
- quantitative observations may include, but are not limited to, measurements of point 710 in relation to other points within the same inspection/asset.
- the data collected about physical point 710 in steps 711 , 712 , 713 , and 714 may be used to assess the condition of the asset.
- various comparisons among points may be made available to the end user. For example, a long-range inspection or multiple inspections of the same asset over time allows for comparison of attributes associated with physical point 710 over different time periods. Comparison of two different points 716 at the same time is possible from within the same inspection. Comparison of different points at different times 717 is also possible through long-range or multiple inspections. Other comparisons 718 , including, but not limited to, a comparison of physical point 710 to a reference model, may also be utilized to assess the condition of an asset.
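- A toy illustration of the point-over-time comparison (attribute names and values are invented for the example):

```python
# Toy snapshots of attributes recorded for the same physical point (e.g.,
# point 710) during two inspections; fields and values are hypothetical.
def compare_point(earlier: dict, later: dict) -> dict:
    """Return the attributes whose values changed between two inspections."""
    shared = earlier.keys() & later.keys()
    return {k: (earlier[k], later[k]) for k in shared if earlier[k] != later[k]}

t0 = {"crack_width_mm": 1.2, "corrosion": "light", "depth_m": 3.10}
t1 = {"crack_width_mm": 2.9, "corrosion": "moderate", "depth_m": 3.10}
print(compare_point(t0, t1))  # the crack widened and the corrosion worsened
```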
- FIG. 8 is a block diagram that conceptually illustrates a method for using multiple cameras to collect images, video, or 3D data from a single nodal point in accordance with an embodiment of the present invention.
- This simplified example uses vertically aligned 2D cameras; however, those skilled in the art will appreciate that additional sensors, e.g., 3D sensors, infrared cameras and the like, could be placed within the same alignment.
- the alignment does not have to be vertical. Horizontal alignment or alignment over any other straight vector may be employed.
- FIG. 8 provides a non-limiting concrete example of sensor attributes and relative positioning that may be employed to facilitate the process of co-relating captured sensor data.
- a robot 810 is moved vertically within the interior of a target asset 817 (e.g., a manhole).
- Robot 810 contains six cameras ( 811 , 812 , 813 , 814 , 815 , 816 ) each of which has a field of view greater than 90 degrees.
- Camera 811 is positioned so that its lens is facing directly up, camera 812 is positioned facing directly outward, camera 813 is positioned facing directly outward rotated 90 degrees with respect to camera 812, camera 814 is positioned facing directly outward rotated 90 degrees with respect to camera 813, camera 815 is positioned facing directly outward rotated 90 degrees with respect to camera 814, and camera 816 is positioned facing directly downward. All cameras are aligned such that their nodal points fall on the same vector. In other embodiments, fewer or more sensors could be used, but it is desirable to have a sufficient number of sensors and an arrangement thereof that allows complete imaging of target asset 817.
- an image is taken from camera 816 , then the robot 810 is moved downward by a known distance 819 between the nodal points of cameras 816 and 815 .
- an image is taken from camera 815 .
- This process repeats until images have been taken from all cameras 811 , 812 , 813 , 814 , 815 , 816 and the robot 810 has moved the distance between the nodal point of camera 816 and the nodal point of camera 811 .
- the entire process is repeated until the entire asset has been imaged.
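- The capture sequence of FIG. 8 can be sketched as follows; the camera names, ordering and nodal-point spacings are assumptions, since the patent describes behavior rather than an API:

```python
# Illustrative sequencing only: spacings stand in for the distance 819
# between adjacent nodal points and are hypothetical values.
CAMERAS = ["cam816", "cam815", "cam814", "cam813", "cam812", "cam811"]
NODAL_GAPS_M = [0.04, 0.04, 0.04, 0.04, 0.04]

def capture_from_shared_nodal_point(robot) -> None:
    """Fire each camera once its nodal point occupies the same position in space."""
    robot.capture(CAMERAS[0])              # downward-facing camera fires first
    for camera, gap_m in zip(CAMERAS[1:], NODAL_GAPS_M):
        robot.move_down(gap_m)             # the next camera's nodal point now
        robot.capture(camera)              # coincides with the previous one
```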
- FIG. 10 is an exemplary computer system 1000 in which or with which embodiments of the present invention may be utilized.
- Embodiments of the present disclosure include various steps, which have been described above. A variety of these steps may be performed by hardware components or may be tangibly embodied on a non-transitory computer-readable storage medium in the form of machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with instructions to perform these steps. Alternatively, the steps may be performed by a combination of hardware, software, and/or firmware.
- Computer system 1000 may represent or form a part of a client device (e.g., an end-user workstation, a laptop or desktop computer system), a server, a probe or a robot.
- Computer system 1000 may be part of a distributed computer system (not shown) in which various aspects and functions described herein are practiced.
- the distributed computer system may include one more additional computer systems (not shown) that exchange information with each other and/or computer system 1000 .
- the computer systems of the distributed computer system may be interconnected by, and may exchange data through, a communication network (not shown), which may include any communication network through which computer systems may exchange data.
- the computer systems and the network may use various methods, protocols and standards, including, among others, Fibre Channel, Token Ring, Ethernet, Wireless Ethernet, Bluetooth, Internet Protocol (IP), IPv6, Transmission Control Protocol (TCP)/IP, User Datagram Protocol (UDP), Delay-Tolerant Networking (DTN), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Network Management Protocol (SNMP), SMS, MMS, Signalling System No. 7 (SS7), JavaScript Object Notation (JSON), Simple Object Access Protocol (SOAP), Common Object Request Broker Architecture (CORBA), REST and Web Services.
- aspects and functions described herein may be implemented as specialized hardware and/or software components executing in one or more computer systems, such as computer system 1000 .
- Various aspects and functionality described herein may be located on a single computer system or may be distributed among multiple computer systems (e.g., a probe or robot, a server and an end-user workstation) connected to one or more communications networks.
- various aspects and functions may be distributed among one or more server computer systems configured to provide a service to one or more client computers, or to perform an overall task as part of a distributed system.
- aspects may be performed on a client-server or multi-tier system that includes components distributed among one or more server systems that perform various functions.
- aspects and functions described herein are not limited to executing on any particular system or group of systems. Further, aspects and functions may be implemented in software, hardware or firmware, or any combination thereof. Thus, aspects and functions may be implemented within methods, acts, systems, system elements and components using a variety of hardware and software configurations, and the various aspects and functions described herein are not limited to any particular distributed architecture, network, or communication protocol.
- Computer system 1000 may include a bus 1030 , a processor 1005 , communication port 1010 , a main memory 1015 , a removable storage media (not shown), a read only memory (ROM) 1020 and a mass storage device 1025 .
- Computer system 1000 may include more than one processor and more than one communication port.
- processor 1005 performs a series of instructions that result in manipulated data.
- Processor 1005 may be any type of processor, multiprocessor or controller.
- Some exemplary processors include commercially available processors such as an Intel Xeon, Itanium, Core, Celeron, or Pentium processor, an AMD Opteron processor, a Sun UltraSPARC or IBM Power5+ processor and an IBM mainframe chip.
- Processor 1005 is connected to other system components, including one or more memory devices representing main memory 1015 , ROM 1020 and mass storage device 1025 via bus 1030 .
- Main memory 1015 stores programs and data during operation of computer system 1000 .
- main memory 1015 may be a relatively high performance, volatile, random access memory (e.g., dynamic random access memory (DRAM) or static random access memory (SRAM)).
- main memory 1015 may include any device for storing data, such as a disk drive or other non-volatile storage device.
- Various examples may organize main memory 1015 into particularized and, in some cases, unique structures to perform the functions disclosed herein. These data structures may be sized and organized to store values for particular data and types of data.
- Bus 1030 may include one or more physical busses, for example, busses between components that are integrated within the same machine, but may include any communication coupling between system elements including specialized or standard computing bus technologies including, but not limited to, Integrated Drive Electronics (IDE), Small Computer System Interface (SCSI), Peripheral Component Interconnect (PCI) and InfiniBand.
- Bus 1030 enables communications of data and instructions, for example, to be exchanged between system components of computer system 1000 .
- computer system 1000 typically also includes one or more interface devices (not shown), e.g., input devices, output devices and combination input/output devices.
- Interface devices may receive input or provide output. More particularly, output devices may render information for external presentation.
- Input devices may accept information from external sources.
- Non-limiting examples of interface devices include keyboards, mouse devices, trackballs, microphones, touch screens, printing devices, display screens, speakers, network interface cards, etc.
- Interface devices allow computer system 1000 to exchange information and to communicate with external entities, e.g., users and other systems.
- Mass storage device 1025 includes a computer readable and writeable nonvolatile, or non-transitory, data storage medium in which instructions are stored that define a program or other object that is executed by processor 1005 .
- Mass storage device 1025 also may include information that is recorded, on or in, the medium, and that is processed by processor 1005 during execution of the program. More specifically, the information may be stored in one or more data structures specifically configured to conserve storage space or increase data exchange performance.
- the instructions may be persistently stored as encoded signals, and the instructions may cause processor 1005 to perform any of the functions described herein.
- the medium may, for example, be optical disk, magnetic disk or flash memory, among others.
- processor 1005 or some other controller causes data to be read from the nonvolatile recording medium into another memory, such as main memory 1015 , that allows for faster access to the information by processor 1005 than does the storage medium included in mass storage device 1025 .
- a variety of components may manage data movement between main memory 1015 , mass storage device 1025 and other memory elements and examples are not limited to particular data management components. Further, examples are not limited to a particular memory system or data storage system.
- communication port 1010 may include, but is not limited to, an RS-232 port for use with a modem based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports.
- Communication port 1010 may be chosen depending on a network, such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which computer system 1000 connects.
- communication ports 1010 may serve as interfaces with various sensors (not shown).
- Removable storage media can be any kind of external hard drive, floppy drive, IOMEGA® Zip Drive, Compact Disc-Read Only Memory (CD-ROM), Compact Disc-Re-Writable (CD-RW) or Digital Video Disk-Read Only Memory (DVD-ROM).
- While computer system 1000 is shown by way of example as one type of computer system upon which various aspects and functions may be practiced, aspects and functions are not limited to being implemented on computer system 1000.
- Various aspects and functions may be practiced on one or more computers having a different architecture or components than that shown in FIG. 10 .
- computer system 1000 may include specially programmed, special-purpose hardware, such as an application-specific integrated circuit (ASIC) tailored to perform a particular operation disclosed herein.
- another example may perform the same function using a grid of several general-purpose computing devices running MAC OS System X with Motorola PowerPC processors and several specialized computing devices running proprietary hardware and operating systems.
- Computer system 1000 may include an operating system (not shown) that manages at least a portion of the hardware elements included in computer system 1000 .
- a processor or controller such as the processor 1005 , executes the operating system.
- Non-limiting examples of operating systems for an end-user workstation or a server include a Windows-based operating system, such as the Windows NT, Windows 2000, Windows ME, Windows XP, Windows Vista or Windows 7 operating systems, available from Microsoft Corporation, a MAC OS System X operating system available from Apple Inc., one of many Linux-based operating system distributions, for example, the Enterprise Linux operating system available from Red Hat Inc., a Solaris operating system available from Sun Microsystems, or a UNIX operating system available from various sources. Many other operating systems may be used.
- Processor 1005 and operating system together define a computer platform for which application programs in high-level programming languages may be written. These applications may be executable, intermediate, bytecode or interpreted code, which communicates over a communication network, for example, the Internet, using a communication protocol, for example, TCP/IP. Similarly, aspects may be implemented using an object-oriented programming language, such as .Net, SmallTalk, Java, C++, Ada, or C# (C-Sharp). Other object-oriented programming languages may also be used. Alternatively, functional, scripting, or logical programming languages may be used.
- various aspects and functions may be implemented in a non-programmed environment, for example, documents created in Hypertext Markup Language (HTML), eXtensible Markup Language (XML) or other format that, when viewed in a window of a browser program, can render aspects of a graphical-user interface or perform other functions.
- HTML Hypertext Markup Language
- XML eXtensible Markup Language
- various examples may be implemented as programmed or non-programmed elements, or any combination thereof.
- a web page may be implemented using HTML while a data object called from within the web page may be written in C++.
- the examples are not limited to a specific programming language and any suitable programming language could be used.
- the functional components disclosed herein may include a wide variety of elements, e.g. specialized hardware, executable code, data structures or objects, that are configured to perform the functions described herein.
- the components disclosed herein may read parameters that affect the functions performed by the components. These parameters may be physically stored in any form of suitable memory including volatile memory (such as RAM) or nonvolatile memory (such as a magnetic hard drive). In addition, the parameters may be logically stored in a propriety data structure (such as a database or file defined by a user mode application) or in a commonly shared data structure (such as an application registry that is defined by an operating system). In addition, some examples provide for both system and user interfaces that allow external entities to modify the parameters and thereby configure the behavior of the components.
Abstract
Description
- This application claims the benefit of priority of U.S. Provisional Application No. 62/119,788, filed Feb. 23, 2015, which is hereby incorporated by reference in its entirety for all purposes.
- Contained herein is material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of the patent disclosure by any person as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all rights to the copyright whatsoever. Copyright © 2015-2016, Industrial Technology Group, LLC.
- 1. Field
- Embodiments of the present invention generally relate to the fields of three-dimensional (3D) imaging and inspection of physical assets. In particular, various embodiments relate to systems and methods for vertical structure inspection based on 3D data generated based on depth sensor information and 2D data generated based on imaging data captured by one or more video cameras.
- 2. Description of the Related Art
- Conventional methods of inspecting vertical structures, such as manholes, do not capture qualitative and quantitative inspection data in a way that is consistent, repeatable, and complete. One such conventional inspection method is manned entry of the structure relying on tape measure and note taking for data collection, which often leaves out critical information. Another such conventional inspection method relies on a human operating a camera attached to a pole from above the structure, also leaving out critical information. Another such conventional inspection method utilizes a robot with one or more panoramic cameras for data collection, which often results in distorted images and/or incomplete data.
- Systems and methods are described for physical asset inspection. According to one embodiment, a probe is positioned to multiple data capture positions with reference to a physical asset. For each of the data capture positions: (i) data is collected regarding the physical asset by performing a data collection process including: (a) reading, by a central processing unit (CPU) of the probe, odometry data from one or more of an encoder and an inertial measurement unit (IMU) attached to or integrated with the probe; (b) capturing, by a camera attached to or integrated with the probe having a first view plane, one or more two-dimensional (2D) images; and (c) capturing, by a three-dimensional (3D) sensor attached to or integrated with the probe and having a second view plane overlapping that of the first view plane, one or more 3D sensor data frames; and (ii) performing a data synthesis process including: (d) linking, by the CPU, the odometry data, the one or more 2D images and the one or more 3D sensor data frames; (e) associating, by the CPU, the one or more 2D images and the one or more 3D sensor data frames with a physical point in real-world space based on the odometry data; and (f) facilitating subsequent ability on behalf of a user navigating the collected data to switch between a 2D view and a 3D view by forming, by the CPU, a set of points each containing both 3D data and 2D data by performing UV mapping based on the one or more 2D images, the one or more 3D sensor data frames and based on a known physical geometry of positioning of the camera relative to the 3D sensor.
- Embodiments of the present invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
- FIG. 1 is a perspective view of an asset inspection robot in accordance with an embodiment of the present invention.
- FIG. 2 is a block diagram conceptually illustrating various functional units of an asset inspection robot in accordance with an embodiment of the present invention.
- FIG. 3 is a block diagram conceptually illustrating an asset inspection system in accordance with an embodiment of the present invention.
- FIG. 4 is a flow diagram illustrating data collection processing in accordance with an embodiment of the present invention.
- FIG. 5 is a flow diagram illustrating data synthesis processing in accordance with an embodiment of the present invention.
- FIG. 6 is a flow diagram illustrating data augmentation processing in accordance with an embodiment of the present invention.
- FIG. 7 conceptually illustrates the linking of data captured from various sensors, including 2D cameras and 3D sensors, with a physical point that exists in the real world at a given time in accordance with an embodiment of the present invention.
- FIG. 8 is a block diagram that conceptually illustrates a method for using multiple cameras to collect images, video, or 3D data from a single nodal point in accordance with an embodiment of the present invention.
- FIG. 9 illustrates a user interface screen shot in accordance with an embodiment of the present invention.
- FIG. 10 is an exemplary computer system in which or with which embodiments of the present invention may be utilized.
- Systems and methods are described for physical asset inspection. According to one embodiment, designed to assess manhole conditions, a tethered robot includes one or more 2D cameras, one or more 3D depth sensors, a GPS sensor, a gas sensor and a custom software package. The 2D and 3D sensors capture visual data of a structure, the GPS sensor orients the robot to the physical world, and the software creates an immersive representation of this data for assessment purposes.
- While various embodiments of the present invention are described with reference to a robot with winch control to provide vertical motion, it is to be understood that the 2D and 3D data capturing methodologies and software representation technologies described herein are equally applicable to alternative transportation bodies and mechanisms. As will be appreciated by those skilled in the art, the physical asset inspection functionality may be housed in whatever transportation body is most appropriate for the particular physical asset being inspected. For example, automated guided vehicles, such as quadracopters, aerial or submersible drones or other remote flying or underwater vehicles may be used for inspection of telecommunications structures (e.g., base tower stations, antennas, masts, latticed towers and cell phone towers) and commercial/industrial structures (e.g., skyscrapers, bridges, platforms, water tanks, water processing systems, factories, oil rigs, solar paneling and other civil infrastructure). Alternatively, the inspection technologies described herein may be incorporated or integrated within crawling, legged, line following or wheeled robots. Therefore, the specific examples of transportation bodies presented and/or described herein are not intended to be limiting and are merely exemplary. In some embodiments, the inspection technologies described herein may be attached to existing commercial equipment, such as closed-circuit television (CCTV) sewer cameras to augment data collection, for example.
- Furthermore, while, for convenience, various embodiments of the present invention may be described with reference to fixed-position sensors and cameras relative to the probe body, it is to be understood that in alternative embodiments the sensors and/or cameras may be mounted on or off the central axis and may rotate about or relative to the body of the probe.
- In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent, however, to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details. In other instances, well-known structures and devices are shown in block diagram form.
- Embodiments of the present invention include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, the steps may be performed by a combination of hardware, software, firmware and/or by human operators.
- Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware). Moreover, embodiments of the present invention may also be downloaded as one or more computer program products, wherein the program may be transferred from a remote computer to a requesting computer by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
- In various embodiments, the article(s) of manufacture (e.g., the computer program products) containing the computer programming code may be used by executing the code directly from the machine-readable storage medium or by copying the code from the machine-readable storage medium into another machine-readable storage medium (e.g., a hard disk, RAM, etc.) or by transmitting the code on a network for remote execution. Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
- Notably, while embodiments of the present invention may be described using modular programming terminology, the code implementing various embodiments of the present invention is not so limited. For example, the code may reflect other programming paradigms and/or styles, including, but not limited to, object-oriented programming (OOP), agent oriented programming, aspect-oriented programming, attribute-oriented programming (@OP), automatic programming, dataflow programming, declarative programming, functional programming, event-driven programming, feature oriented programming, imperative programming, semantic-oriented programming, genetic programming, logic programming, pattern matching programming and the like.
- Brief definitions of terms used throughout this application are given below.
- The phrase “2D camera” or the term “camera” generally refer to a device for recording visual images in the form of photographs, film and/or video signals.
- The phrase “3D sensor” generally refers to a device using a remote sensing technology. A 3D sensor may measure distance by illuminating a target with a laser and analyzing the reflected light. Examples of 3D sensors include, but are not limited to, a LiDAR, time of flight camera, structured light camera or laser displacement sensor.
- The terms “connected” or “coupled” and related terms are used in an operational sense and are not necessarily limited to a direct connection or coupling. Thus, for example, two devices may be coupled directly, or via one or more intermediary media or devices. As another example, devices may be coupled in such a way that information can be passed there between, while not sharing any physical connection with one another. Based on the disclosure provided herein, one of ordinary skill in the art will appreciate a variety of ways in which connection or coupling exists in accordance with the aforementioned definition.
- The phrases “in one embodiment,” “according to one embodiment,” and the like generally mean the particular feature, structure, or characteristic following the phrase is included in at least one embodiment of the present invention, and may be included in more than one embodiment of the present invention. Importantly, such phrases do not necessarily refer to the same embodiment.
- If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
- The term “probe” generally refers to a physical structure with one or more 2D cameras and one or more 3D sensors attached to it.
- The term “responsive” includes completely or partially responsive.
- The term “robot” generally refers to a mechanical or electro-mechanical machine that may be guided or controlled by a computer program, electronic circuitry and/or a human operator. Non-limiting examples of robots include automated guided vehicles, such as quadracopters, aerial or submersible drones or other remote flying or underwater vehicles. Robots may move by flying, swimming, crawling, using legs, following lines or by being wheel-based.
- Referring now to
FIG. 1 andFIG. 2 , anasset inspection robot 100 is illustrated in accordance with an embodiment of the present invention. In the current example,asset inspection robot 100 includes multiple sensors, including one ormore 2D cameras 110 and one or more 3D sensors 120. According to one embodiment, cameras andsensors 110 and 120 are arranged in such a way that the intersection of their respective view cones provides image coverage of the entire structure surface being inspected when the robot is moved vertically. Further, cameras andsensors 110 and 120 may be arranged such that their nodal points are in vertical alignment so as to avoid parallax error and provide for proper registration when stitching the captured sensor data together. Therobot body 150 is typically shaped in a way that provides for complete imaging with no sensor occlusion. - With continuing reference to
FIG. 1 andFIG. 2 , as will be appreciated by those skilled in the art, a variety of other sensors may be installed onrobot 100 to facilitate collection of additional data. Non-limiting examples of sensors include inertial measurement units (IMUs), orientation sensors, accelerometers, radar guns, metal detectors, voltage detectors, smoke detectors, humistors, flow sensors, depth gauges, gyroscopes, compasses, barometers, thermometers, proximity sensors, motion detectors and gas sensors. In one embodiment, agas sensor 130 may be employed to detect various gases within the structure being inspected. Additionally or alternatively, a sonar orlaser 140 may be used to detect the bottom surface of the structure.Light sources 160 may also be integrated with or attached torobot 100 to provide lighting for the sensors. -
FIG. 2 is a block diagram conceptually illustrating various functional units of anasset inspection robot 200 in accordance with an embodiment of the present invention. In the context of the present example, a printed circuit board (PCB) 217 with a microcontroller (not shown) is installed insiderobot 200 to control various electro-mechanical functions. A central processing unit (CPU) 218 is installed insiderobot 200 to process sensor data from sensors (e.g.,sensors CPU 218 is also responsible for writing the captured sensor data to astorage mechanism 220. As those skilled in the art will appreciate,storage mechanism 220 may take a variety of forms and may be local or remote. Depending upon the particular implementation, non-limiting examples of appropriate mass storage systems may include one or more hard drives, magnetic tape drives, magneto-optical disc drives, optical disc drives and/or solid-state drives (SSDs). In some embodiments, captured sensor data may be temporarily buffered in a random access memory (RAM) (not shown) withinrobot 200 and stored remotely fromrobot 200 by transmitting the captured sensor data by wired or wireless means to a remote site.Robot 200 may be powered by abattery 219 and/or may be connected to an external source of direct or alternating current. -
- FIG. 3 is a block diagram conceptually illustrating an asset inspection system 300 in accordance with an embodiment of the present invention. In the context of the present example, a robot 310 may be suspended by one or more (e.g., 3) cables or belts 321 and deployed into the structure by a winch system 324 consisting of a motor (not shown) and controls 323. Winch system 324 may be powered by a battery (not shown) or an external source of direct or alternating current. An operator may use controls 323 to automatically lower robot 310 into the structure to perform the inspection. Advantages of various embodiments of the present invention include, without limitation, that they may capture qualitative and quantitative inspection data in a way that is consistent, repeatable, and complete. By using a robot, a human does not have to enter the structure to perform the inspection, and inspections can be performed far more quickly than is typical using conventional inspection methods.
- FIGS. 4-6 collectively illustrate a process used to collect asset inspection data and augment this data with human observations for the purpose of analysis in accordance with an embodiment of the present invention. FIG. 4 is a flow diagram illustrating data collection processing in accordance with an embodiment of the present invention. In the context of the present example, the data collection process is described with reference to a tethered probe that uses a vertical arrangement of 2D and 3D sensors. At block 410, the probe is moved into a position such that the bottom-most 2D camera is able to image the top-most part of the asset. - At
block 420, odometry information is read from the encoders and recorded so that the probe can be oriented in real-world space. - At
block 430, a 2D image or set of images (video) is captured from the bottom-most 2D camera. - At
block 440, if there is a 3D data sensor whose view plane overlaps that of the previously mentioned 2D camera, one or more frames are recorded from this 3D data sensor. - At
block 450, data from other sensors, such as IMU data, gas detection data, or any other data type is also recorded. - At
decision block 460, it is determined whether the probe is at the bottom of the structure. According to one embodiment, data from a downward-facing 3D sensor or a laser dot projected onto the asset bottom may be used to detect whether the probe is at the bottom of the area to be inspected. If so, then data collection processing is complete and at block 470 the scan is terminated and the probe returns to the docked position. Otherwise, data collection processing continues by looping back to block 410, at which point the probe is moved so that the 2D and 3D sensor(s) directly above the aforementioned 2D and 3D sensor(s) are in a position that places their nodal points in the same position as the aforementioned nodal points, as described further below with reference to FIG. 8. Data is captured and recorded from all sensors using blocks 420 through 460 as above until the bottom is detected.
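- A minimal sketch of this collection loop (blocks 410 through 470) follows. The probe object and its methods (move_to_start, read_encoders, capture_2d, capture_3d, read_aux_sensors, at_bottom, dock, move_down) are hypothetical placeholders for whatever probe control API a given implementation exposes:

```python
def collect_scan(probe, step_m: float):
    """Sketch of the FIG. 4 data collection loop; the probe API is assumed."""
    records = []
    probe.move_to_start()                  # block 410: bottom-most camera images asset top
    while True:
        odometry = probe.read_encoders()   # block 420: orient probe in real-world space
        frame_2d = probe.capture_2d()      # block 430: image/video from bottom-most camera
        frame_3d = probe.capture_3d()      # block 440: 3D frame if view planes overlap
        aux = probe.read_aux_sensors()     # block 450: IMU, gas detection, etc.
        records.append({"odometry": odometry, "2d": frame_2d,
                        "3d": frame_3d, "aux": aux})
        if probe.at_bottom():              # block 460: downward 3D sensor or laser dot
            probe.dock()                   # block 470: terminate scan, return to dock
            return records
        probe.move_down(step_m)            # re-align nodal points (see FIG. 8)
```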
- FIG. 5 is a flow diagram illustrating data synthesis processing in accordance with an embodiment of the present invention. After all of the data has been collected from a scan during a data collection process (e.g., the data collection process described with reference to FIG. 4), in one embodiment, the collected data is synthesized for later use. According to one embodiment, the purpose of the data synthesis process is twofold. First, discrete data sets (2D image frames, 3D data frames, other sensor data, odometry and IMU data, real-world data and the like) are linked together. Second, the data is prepared for presentation in software so that a human can augment and/or analyze the data. - At
block 510, the odometry and sensor index are defined. Odometry generally refers to encoder data, IMU data, and any other data collected that contribute to defining the position of the probe at the time of the first data set capture, as well as the position of each sensor in relation to the probe itself and the other sensors. An example of an encoder is a shaft encoder that counts a fraction of a revolution of a motor shaft or a drive axle of a wheel. Using multiple such encoders in the context of a differentially steered robot with a pair of drive wheels and a castering tail or nose wheel, for example, allows both velocity and direction of travel (e.g., the heading in degrees) to be determined, as illustrated in the sketch below.
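- For concreteness, the standard differential-drive dead-reckoning formulas may be used here; they are not prescribed by this disclosure, and the parameter values below are illustrative:

```python
import math

def differential_drive_odometry(ticks_left: int, ticks_right: int,
                                ticks_per_rev: int, wheel_radius_m: float,
                                wheel_base_m: float, dt_s: float):
    """Velocity and heading change from a pair of wheel shaft encoders."""
    dist_left = 2.0 * math.pi * wheel_radius_m * ticks_left / ticks_per_rev
    dist_right = 2.0 * math.pi * wheel_radius_m * ticks_right / ticks_per_rev
    velocity = (dist_left + dist_right) / (2.0 * dt_s)        # forward speed, m/s
    heading_change = (dist_right - dist_left) / wheel_base_m  # radians, CCW positive
    return velocity, math.degrees(heading_change)

# Example: the right wheel turned slightly farther, so the robot veered left.
v, dtheta = differential_drive_odometry(1000, 1050, 2048, 0.05, 0.30, 1.0)
print(f"{v:.3f} m/s, heading change {dtheta:.1f} degrees")
```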
- At block 520, the first 2D image captured is related to real-world space using odometry and sensor index information. If other 2D images (e.g., video) have been captured from the same point, these are also related to real-world space. A conceptual illustration of a process for linking data captured from various sensors with a physical point in the real world is described below with reference to FIG. 7. - At
block 530, the first 3D data set captured is related to real-world space and to the corresponding 2D imagery using odometry and sensor index information. - At
block 540, any other sensor data is also related to the 2D and 3D data sets using odometry and sensor index. - At
decision block 550, it is determined whether there is further data to be processed. If so, then data synthesis processing continues by looping back to block 510; otherwise data synthesis is complete and post-processing for display is performed at block 560. After all of the data have been related to real-world space and to each other (linking), the data may be post-processed for display in software. This post-processing may include assembling the 2D images into cube maps or other environment maps, smoothing or blending the 2D images to compensate for exposure differences, concatenating various 3D data sets using odometry and/or algorithmic functions, smoothing other sensor data, etc. - Those skilled in the art will appreciate that while the steps of the data synthesis process are described in a particular order, the steps may be performed in a different order and some steps may not be performed at all. For example, in one embodiment, image post-processing may be performed prior to UV mapping.
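- As one hedged illustration of the post-processing named above, concatenating 3D data sets using odometry might reduce, in the simplest case, to shifting each frame's points by the probe depth recorded at capture time; a real implementation would also correct for probe rotation and sway using IMU data or algorithmic alignment such as ICP:

```python
import numpy as np

def concatenate_point_clouds(frames):
    """Merge per-capture 3D frames into one cloud using recorded odometry.

    frames: iterable of (points, depth_m) pairs, where points is an (N, 3)
    array in the sensor frame and depth_m is the probe's vertical offset
    at capture time. Assumes the sensor z-axis is aligned with the descent."""
    merged = []
    for points, depth_m in frames:
        shifted = np.asarray(points, dtype=float).copy()
        shifted[:, 2] -= depth_m  # translate frame down to its capture depth
        merged.append(shifted)
    return np.vstack(merged)

# Two frames captured 0.5 m apart along the descent axis.
cloud = concatenate_point_clouds([(np.random.rand(100, 3), 0.0),
                                  (np.random.rand(100, 3), 0.5)])
print(cloud.shape)  # (200, 3)
```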
-
FIG. 6 is a flow diagram illustrating data augmentation processing in accordance with an embodiment of the present invention. In the context of the present example, it is assumed that data is displayed to the user through a software package during the data augmentation process. According to one embodiment, this software package projects the 2D and 3D data in an immersive environment that allows the user to navigate the asset as if the user were looking through a camera at the asset. An exemplary user interface screen, in accordance with an embodiment of the present invention, is illustrated in FIG. 9. - At
block 610, the user identifies a region of interest as it is being displayed or projected onto a screen. - At
block 620, responsive to receiving an indication that the current software projection contains a region of interest to the user, the view matrix of the software projection may be recorded. - At
block 630, the user may enter qualitative data about one or more pixels that are currently projected on the screen. The qualitative data is written to a database by the software package and associated with both the view matrices and the appropriate 2D imagery (including frame numbers if the 2D imagery is video data). In one embodiment, the user is provided with the ability to switch between 2D views, 3D (point cloud) views, flattened views, or any other view as desired. Because the pixels in these views are linked to one another, the qualitative data will also be linked to the specific view matrices and pixels in each view. - At
block 640, the user may also measure the distance between one or more pixels in the software projection and another pixel in the same projection or a different projection. For example, if two pixels are selected, the distance between the two pixels in real-world space may be calculated by the software. If more than two pixels are selected, the resulting circle, oval, or polygon may be constructed by the software and the relevant geometry in real-world space may be calculated, as in the sketch below. This data may be written to a database.
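- Because each selected pixel is linked to a real-world 3D coordinate (see FIG. 7), the measurement itself is ordinary Euclidean geometry. The patent leaves the exact geometry open; returning a polygon perimeter for three or more points is one illustrative choice:

```python
import numpy as np

def measure_selection(points_xyz):
    """Distance for two selected points; polygon perimeter for three or more."""
    pts = np.asarray(points_xyz, dtype=float)
    if len(pts) == 2:
        return {"distance_m": float(np.linalg.norm(pts[1] - pts[0]))}
    # Closed polygon: sum edge lengths, wrapping the last vertex to the first.
    edges = np.linalg.norm(np.roll(pts, -1, axis=0) - pts, axis=1)
    return {"perimeter_m": float(edges.sum())}

print(measure_selection([(0, 0, 0), (3, 4, 0)]))             # {'distance_m': 5.0}
print(measure_selection([(0, 0, 0), (1, 0, 0), (0, 1, 0)]))  # perimeter ~3.414 m
```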
- At block 650, after the data has been augmented by the user, the resulting 2D images, 3D data, other sensor data, and augmented data may be used to assess the condition of the asset. The user may choose to compare the same point over time (either using video data or two different inspections), compare different points at the same time, compare different points over different times, or perform any other analysis desired.
- FIG. 7 conceptually illustrates the linking of data captured from various sensors, including 2D cameras and 3D sensors, with a physical point that exists in the real world at a given time in accordance with an embodiment of the present invention. - A physical asset is made up of many points (e.g., physical point 710), which may be on a surface of the physical asset or which may be below the surface of the physical asset. These points exist at specific locations in the real world. The physical nature of the points may change or evolve over time. For example, a component of the asset may break down.
- A
2D image 711 that includes physical point 710 is captured with a camera. This capture (or, in the case of video, a series of captures) occurs in a known time period. - Next,
3D data 712 from a 3D sensor is captured. This 3D data also includes physical point 710 of the asset. The physical orientation of the 3D sensor with respect to the 2D camera is also known. - In certain cases,
other data points 713 may also be recorded within the same space and/or time that encompasses physical point 710. - Lastly, a human may augment the data collected about
physical point 710 in 711, 712 and 713 with qualitative and quantitative observations 714. Qualitative observations may include, but are not limited to, a textual description of point 710, and quantitative observations may include, but are not limited to, measurements of point 710 in relation to other points within the same inspection/asset. - Together, the data collected about
physical point 710 in steps 711, 712, 713 and 714 may be used to make a comparison 715 of physical point 710 over different time periods. Comparison of two different points 716 at the same time is possible from within the same inspection. Comparison of different points at different times 717 is also possible through long-range or multiple inspections. Other comparisons 718, including, but not limited to, a comparison of physical point 710 to a reference model, may also be utilized to assess the condition of an asset. One illustrative way to link such records in software is sketched below.
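- The following data structure is hypothetical (field names are not taken from the disclosure) and simply shows how the 2D imagery 711, 3D data 712, other sensor data 713 and human observations 714 for one physical point might be held together for the comparisons described above:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class PointRecord:
    """Links everything captured about one physical point (FIG. 7)."""
    location_xyz: tuple                                         # point 710 in real-world space
    captured_at: float                                          # capture time, epoch seconds
    image_refs: List[str] = field(default_factory=list)         # 2D frames/pixels (711)
    cloud_refs: List[str] = field(default_factory=list)         # 3D data frames (712)
    sensor_data: Dict[str, Any] = field(default_factory=dict)   # other sensor data (713)
    observations: List[str] = field(default_factory=list)       # human augmentation (714)

def compare_over_time(records: List[PointRecord]) -> List[PointRecord]:
    """Comparison 715: order one point's records across inspections by time."""
    return sorted(records, key=lambda r: r.captured_at)
```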
- FIG. 8 is a block diagram that conceptually illustrates a method for using multiple cameras to collect images, video, or 3D data from a single nodal point in accordance with an embodiment of the present invention. This simplified example uses vertically aligned 2D cameras; however, those skilled in the art will appreciate that additional sensors, e.g., 3D sensors, infrared cameras and the like, could be placed within the same alignment. Furthermore, the alignment does not have to be vertical. Horizontal alignment or alignment over any other straight vector may be employed. - In general, it is desirable to collect data from a single nodal point with multiple sensors having overlapping fields of view.
FIG. 8 provides a non-limiting concrete example of sensor attributes and relative positioning that may be employed to facilitate the process of co-relating captured sensor data. In the context of FIG. 8, a robot 810 is moved vertically within the interior of a target asset 817 (e.g., a manhole). Robot 810 contains six cameras (811, 812, 813, 814, 815, 816), each of which has a field of view greater than 90 degrees. Camera 811 is positioned so that its lens is facing directly up, camera 812 is positioned facing directly outward, camera 813 is positioned facing directly outward rotated 90 degrees with respect to camera 812, camera 814 is positioned facing directly outward rotated 90 degrees with respect to camera 813, camera 815 is positioned facing directly outward rotated 90 degrees with respect to camera 814, and camera 816 is positioned facing directly downward. All cameras are aligned such that their nodal points fall on the same vector. In other embodiments, more or fewer sensors could be used, but it is desirable to have a sufficient number of sensors and an arrangement thereof that allows complete imaging of target asset 817. - During the data capture process, an image is taken from camera 816, then the robot 810 is moved downward by a known distance 819 between the nodal points of cameras 816 and 815. Next, an image is taken from camera 815. This process repeats until images have been taken from all
cameras 811, 812, 813, 814, 815, 816 and the robot 810 has moved the distance between the nodal point of camera 816 and the nodal point of camera 811. The entire process is repeated until the entire asset has been imaged. - It is important to note that the process above represents only 2D still images taken in sequence and spanning an entire camera-to-camera nodal point movement. In other embodiments, video capture may be used and camera data may be captured in any order and with any movement distance. These images and/or video may subsequently be arranged using odometry and presented to the user in a logical way.
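- A sketch of one such capture pass follows. The robot.move_down() and camera.capture() calls are hypothetical placeholders, and cameras are passed bottom-to-top (e.g., in the order 816, 815, 814, 813, 812, 811):

```python
def capture_pass(robot, cameras, nodal_gaps_m):
    """One FIG. 8 pass: capture from the bottom camera, then step the probe
    down by the gap to the next camera's nodal point (e.g., distance 819)
    so that successive captures share a nodal point.

    cameras: camera objects in bottom-to-top order.
    nodal_gaps_m: distances between adjacent nodal points, len(cameras) - 1."""
    images = []
    for i, camera in enumerate(cameras):
        images.append(camera.capture())
        if i < len(nodal_gaps_m):
            robot.move_down(nodal_gaps_m[i])
    return images
```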
-
FIG. 10 is an exemplary computer system 1000 in which or with which embodiments of the present invention may be utilized. Embodiments of the present disclosure include various steps, which have been described above. A variety of these steps may be performed by hardware components or may be tangibly embodied on a non-transitory computer-readable storage medium in the form of machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with instructions to perform these steps. Alternatively, the steps may be performed by a combination of hardware, software, and/or firmware.
- Computer system 1000 may represent or form a part of a client device (e.g., an end-user workstation, a laptop or desktop computer system), a server, a probe or a robot. Computer system 1000 may be part of a distributed computer system (not shown) in which various aspects and functions described herein are practiced. The distributed computer system may include one or more additional computer systems (not shown) that exchange information with each other and/or computer system 1000. The computer systems of the distributed computer system may be interconnected by, and may exchange data through, a communication network (not shown), which may include any communication network through which computer systems may exchange data. To exchange data using the communication network, the computer systems and the network may use various methods, protocols and standards, including, among others, Fibre Channel, Token Ring, Ethernet, Wireless Ethernet, Bluetooth, Internet Protocol (IP), IPv6, Transmission Control Protocol (TCP)/IP, User Datagram Protocol (UDP), Delay-Tolerant Networking (DTN), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Network Management Protocol (SNMP), SMS, MMS, Signalling System No. 7 (SS7), JavaScript Object Notation (JSON), Simple Object Access Protocol (SOAP), Common Object Request Broker Architecture (CORBA), REST and Web Services. To ensure data transfer is secure, the computer systems may transmit data via the network using a variety of security measures including, for example, Transport Layer Security (TLS), Secure Sockets Layer (SSL) or a Virtual Private Network (VPN). - Various aspects and functions described herein may be implemented as specialized hardware and/or software components executing in one or more computer systems, such as
computer system 1000. Various aspects and functionality described herein may be located on a single computer system or may be distributed among multiple computer systems (e.g., a probe or robot, a server and an end-user workstation) connected to one or more communications networks. For example, various aspects and functions may be distributed among one or more server computer systems configured to provide a service to one or more client computers, or to perform an overall task as part of a distributed system. Additionally, aspects may be performed on a client-server or multi-tier system that includes components distributed among one or more server systems that perform various functions. Consequently, the various aspects and functions described herein are not limited to executing on any particular system or group of systems. Further, aspects and functions may be implemented in software, hardware or firmware, or any combination thereof. Thus, aspects and functions may be implemented within methods, acts, systems, system elements and components using a variety of hardware and software configurations, and the various aspects and functions described herein are not limited to any particular distributed architecture, network, or communication protocol. -
Computer system 1000 may include a bus 1030, a processor 1005, a communication port 1010, a main memory 1015, removable storage media (not shown), a read only memory (ROM) 1020 and a mass storage device 1025. Those skilled in the art will appreciate that computer system 1000 may include more than one processor and more than one communication port. - To implement at least some of the aspects, functions and processes disclosed herein,
processor 1005 performs a series of instructions that result in manipulated data. Processor 1005 may be any type of processor, multiprocessor or controller. Some exemplary processors include commercially available processors such as an Intel Xeon, Itanium, Core, Celeron, or Pentium processor, an AMD Opteron processor, a Sun UltraSPARC or IBM Power5+ processor and an IBM mainframe chip. Processor 1005 is connected to other system components, including one or more memory devices representing main memory 1015, ROM 1020 and mass storage device 1025, via bus 1030.
- Main memory 1015 stores programs and data during operation of computer system 1000. Thus, main memory 1015 may be a relatively high-performance, volatile, random access memory (e.g., dynamic random access memory (DRAM) or static random access memory (SRAM)). However, main memory 1015 may include any device for storing data, such as a disk drive or other non-volatile storage device. Various examples may organize main memory 1015 into particularized and, in some cases, unique structures to perform the functions disclosed herein. These data structures may be sized and organized to store values for particular data and types of data. - Components of
computer system 1000 are coupled by an interconnection element, such as bus 1030. Bus 1030 may include one or more physical busses (for example, busses between components that are integrated within the same machine), but may include any communication coupling between system elements, including specialized or standard computing bus technologies such as Integrated Drive Electronics (IDE), Small Computer System Interface (SCSI), Peripheral Component Interconnect (PCI) and InfiniBand. Bus 1030 enables communications of data and instructions, for example, to be exchanged between system components of computer system 1000. - In the context of an end-user workstation, for example,
computer system 1000 typically also includes one or more interface devices (not shown), e.g., input devices, output devices and combination input/output devices. Interface devices may receive input or provide output. More particularly, output devices may render information for external presentation. Input devices may accept information from external sources. Non-limiting examples of interface devices include keyboards, mouse devices, trackballs, microphones, touch screens, printing devices, display screens, speakers, network interface cards, etc. Interface devices allow computer system 1000 to exchange information and to communicate with external entities, e.g., users and other systems.
- Mass storage device 1025 includes a computer-readable and writeable nonvolatile, or non-transitory, data storage medium in which instructions are stored that define a program or other object that is executed by processor 1005. Mass storage device 1025 also may include information that is recorded, on or in, the medium, and that is processed by processor 1005 during execution of the program. More specifically, the information may be stored in one or more data structures specifically configured to conserve storage space or increase data exchange performance. The instructions may be persistently stored as encoded signals, and the instructions may cause processor 1005 to perform any of the functions described herein. The medium may, for example, be optical disk, magnetic disk or flash memory, among others. In operation, processor 1005 or some other controller causes data to be read from the nonvolatile recording medium into another memory, such as main memory 1015, that allows for faster access to the information by processor 1005 than does the storage medium included in mass storage device 1025. A variety of components may manage data movement between main memory 1015, mass storage device 1025 and other memory elements, and examples are not limited to particular data management components. Further, examples are not limited to a particular memory system or data storage system. - In the context of an end-user workstation or a server, for example,
communication port 1010 may include, but is not limited to, an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. Communication port 1010 may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which computer system 1000 connects. In the context of a probe or a robot, communication ports 1010 may serve as interfaces with various sensors (not shown). - Removable storage media may include external hard drives, floppy drives, IOMEGA® Zip Drives, Compact Disc-Read Only Memory (CD-ROM), Compact Disc-Re-Writable (CD-RW) and Digital Video Disk-Read Only Memory (DVD-ROM) media.
- Although
computer system 1000 is shown by way of example as one type of computer system upon which various aspects and functions may be practiced, aspects and functions are not limited to being implemented on computer system 1000. Various aspects and functions may be practiced on one or more computers having a different architecture or components than that shown in FIG. 10. For instance, computer system 1000 may include specially programmed, special-purpose hardware, such as an application-specific integrated circuit (ASIC) tailored to perform a particular operation disclosed herein, while another example may perform the same function using a grid of several general-purpose computing devices running MAC OS System X with Motorola PowerPC processors and several specialized computing devices running proprietary hardware and operating systems. -
Computer system 1000 may include an operating system (not shown) that manages at least a portion of the hardware elements included in computer system 1000. In some examples, a processor or controller, such as the processor 1005, executes the operating system. Non-limiting examples of operating systems for an end-user workstation or a server include a Windows-based operating system, such as the Windows NT, Windows 2000, Windows Millennium Edition (Windows ME), Windows XP, Windows Vista or Windows 7 operating systems, available from Microsoft Corporation, a MAC OS System X operating system available from Apple Inc., one of many Linux-based operating system distributions, for example, the Enterprise Linux operating system available from Red Hat Inc., a Solaris operating system available from Sun Microsystems, or a UNIX operating system available from various sources. Many other operating systems may be used. -
Processor 1005 and the operating system together define a computer platform for which application programs in high-level programming languages may be written. These applications may be executable, intermediate, bytecode or interpreted code, which communicates over a communication network, for example, the Internet, using a communication protocol, for example, TCP/IP. Similarly, aspects may be implemented using an object-oriented programming language, such as .Net, SmallTalk, Java, C++, Ada, or C# (C-Sharp). Other object-oriented programming languages may also be used. Alternatively, functional, scripting, or logical programming languages may be used. - Additionally, various aspects and functions may be implemented in a non-programmed environment, for example, documents created in Hypertext Markup Language (HTML), eXtensible Markup Language (XML) or other formats that, when viewed in a window of a browser program, can render aspects of a graphical user interface or perform other functions. Further, various examples may be implemented as programmed or non-programmed elements, or any combination thereof. For example, a web page may be implemented using HTML while a data object called from within the web page may be written in C++. Thus, the examples are not limited to a specific programming language and any suitable programming language could be used. Accordingly, the functional components disclosed herein may include a wide variety of elements, e.g., specialized hardware, executable code, data structures or objects, that are configured to perform the functions described herein.
-
- In some examples, the components disclosed herein may read parameters that affect the functions performed by the components. These parameters may be physically stored in any form of suitable memory, including volatile memory (such as RAM) or nonvolatile memory (such as a magnetic hard drive). In addition, the parameters may be logically stored in a proprietary data structure (such as a database or file defined by a user-mode application) or in a commonly shared data structure (such as an application registry that is defined by an operating system). In addition, some examples provide for both system and user interfaces that allow external entities to modify the parameters and thereby configure the behavior of the components.
- Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
- While embodiments of the invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the invention.
Claims (16)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/050,898 US20160249021A1 (en) | 2015-02-23 | 2016-02-23 | 3d asset inspection |
US16/390,243 US11067388B2 (en) | 2015-02-23 | 2019-04-22 | 3D asset inspection |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562119788P | 2015-02-23 | 2015-02-23 | |
US15/050,898 US20160249021A1 (en) | 2015-02-23 | 2016-02-23 | 3d asset inspection |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/390,243 Continuation US11067388B2 (en) | 2015-02-23 | 2019-04-22 | 3D asset inspection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160249021A1 true US20160249021A1 (en) | 2016-08-25 |
Family
ID=56690644
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/050,898 Abandoned US20160249021A1 (en) | 2015-02-23 | 2016-02-23 | 3d asset inspection |
US16/390,243 Active US11067388B2 (en) | 2015-02-23 | 2019-04-22 | 3D asset inspection |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/390,243 Active US11067388B2 (en) | 2015-02-23 | 2019-04-22 | 3D asset inspection |
Country Status (1)
Country | Link |
---|---|
US (2) | US20160249021A1 (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170329297A1 (en) * | 2016-05-13 | 2017-11-16 | General Electric Company | Robotic repair or maintenance of an asset |
CN108124132A (en) * | 2017-12-22 | 2018-06-05 | 国家电网公司 | A kind of method for safety monitoring and device |
WO2019178311A1 (en) | 2018-03-15 | 2019-09-19 | Redzone Robotics, Inc. | Image processing techniques for multi-sensor inspection of pipe interiors |
WO2021185823A1 (en) | 2020-03-20 | 2021-09-23 | Spacepal | Apparatus and method for three-dimensional modelling of a shaft |
US11163052B2 (en) * | 2018-11-16 | 2021-11-02 | Koko Home, Inc. | System and method for processing multi-directional frequency modulated continuous wave wireless backscattered signals |
US11462330B2 (en) | 2017-08-15 | 2022-10-04 | Koko Home, Inc. | System and method for processing wireless backscattered signal using artificial intelligence processing for activities of daily life |
US11558717B2 (en) | 2020-04-10 | 2023-01-17 | Koko Home, Inc. | System and method for processing using multi-core processors, signals, and AI processors from multiple sources to create a spatial heat map of selected region |
US11719804B2 (en) | 2019-09-30 | 2023-08-08 | Koko Home, Inc. | System and method for determining user activities using artificial intelligence processing |
US11948441B2 (en) | 2019-02-19 | 2024-04-02 | Koko Home, Inc. | System and method for state identity of a user and initiating feedback using multiple sources |
US11971503B2 (en) | 2019-02-19 | 2024-04-30 | Koko Home, Inc. | System and method for determining user activities using multiple sources |
US11997455B2 (en) | 2019-02-11 | 2024-05-28 | Koko Home, Inc. | System and method for processing multi-directional signals and feedback to a user to improve sleep |
US12028776B2 (en) | 2020-04-03 | 2024-07-02 | Koko Home, Inc. | System and method for processing using multi-core processors, signals and AI processors from multiple sources to create a spatial map of selected region |
US12094614B2 (en) | 2017-08-15 | 2024-09-17 | Koko Home, Inc. | Radar apparatus with natural convection |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12118703B2 (en) * | 2020-07-02 | 2024-10-15 | Redzone Robotics, Inc. | Photo-realistic infrastructure inspection |
WO2023178389A1 (en) * | 2022-03-25 | 2023-09-28 | UAM Tec Pty Ltd | Visual analyser of confined pathways |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5107705A (en) * | 1990-03-30 | 1992-04-28 | Schlumberger Technology Corporation | Video system and method for determining and monitoring the depth of a bottomhole assembly within a wellbore |
US20030038756A1 (en) * | 2001-08-27 | 2003-02-27 | Blume Leo R. | Stacked camera system for environment capture |
US20030038814A1 (en) * | 2001-08-27 | 2003-02-27 | Blume Leo R. | Virtual camera system for environment capture |
US8467049B2 (en) * | 2006-09-15 | 2013-06-18 | RedzoneRobotics, Inc. | Manhole modeler using a plurality of scanners to monitor the conduit walls and exterior |
JP4885080B2 (en) | 2007-07-09 | 2012-02-29 | 任天堂株式会社 | Image processing program and image processing apparatus |
WO2011038170A2 (en) * | 2009-09-26 | 2011-03-31 | Halliburton Energy Services, Inc. | Downhole optical imaging tools and methods |
US9323250B2 (en) * | 2011-01-28 | 2016-04-26 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
US9457473B2 (en) | 2012-06-20 | 2016-10-04 | Irobot Corporation | Suspended robot systems and methods for using same |
-
2016
- 2016-02-23 US US15/050,898 patent/US20160249021A1/en not_active Abandoned
-
2019
- 2019-04-22 US US16/390,243 patent/US11067388B2/en active Active
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10518411B2 (en) * | 2016-05-13 | 2019-12-31 | General Electric Company | Robotic repair or maintenance of an asset |
US20170329297A1 (en) * | 2016-05-13 | 2017-11-16 | General Electric Company | Robotic repair or maintenance of an asset |
US10618168B2 (en) | 2016-05-13 | 2020-04-14 | General Electric Company | Robot system path planning for asset health management |
US11462330B2 (en) | 2017-08-15 | 2022-10-04 | Koko Home, Inc. | System and method for processing wireless backscattered signal using artificial intelligence processing for activities of daily life |
US12094614B2 (en) | 2017-08-15 | 2024-09-17 | Koko Home, Inc. | Radar apparatus with natural convection |
US11776696B2 (en) | 2017-08-15 | 2023-10-03 | Koko Home, Inc. | System and method for processing wireless backscattered signal using artificial intelligence processing for activities of daily life |
CN108124132A (en) * | 2017-12-22 | 2018-06-05 | 国家电网公司 | A kind of method for safety monitoring and device |
WO2019178311A1 (en) | 2018-03-15 | 2019-09-19 | Redzone Robotics, Inc. | Image processing techniques for multi-sensor inspection of pipe interiors |
EP3766038A4 (en) * | 2018-03-15 | 2021-12-15 | Redzone Robotics, Inc. | Image processing techniques for multi-sensor inspection of pipe interiors |
US11163052B2 (en) * | 2018-11-16 | 2021-11-02 | Koko Home, Inc. | System and method for processing multi-directional frequency modulated continuous wave wireless backscattered signals |
US11997455B2 (en) | 2019-02-11 | 2024-05-28 | Koko Home, Inc. | System and method for processing multi-directional signals and feedback to a user to improve sleep |
US11971503B2 (en) | 2019-02-19 | 2024-04-30 | Koko Home, Inc. | System and method for determining user activities using multiple sources |
US11948441B2 (en) | 2019-02-19 | 2024-04-02 | Koko Home, Inc. | System and method for state identity of a user and initiating feedback using multiple sources |
US12210087B2 (en) | 2019-09-30 | 2025-01-28 | Koko Home, Inc. | System and method for determining user activities using artificial intelligence processing |
US11719804B2 (en) | 2019-09-30 | 2023-08-08 | Koko Home, Inc. | System and method for determining user activities using artificial intelligence processing |
WO2021185823A1 (en) | 2020-03-20 | 2021-09-23 | Spacepal | Apparatus and method for three-dimensional modelling of a shaft |
BE1028155A1 (en) | 2020-03-20 | 2021-10-13 | Spacepal Bvba | DEVICE AND METHOD FOR THREE-DIMENSIONAL SHAFT MODELING |
US12028776B2 (en) | 2020-04-03 | 2024-07-02 | Koko Home, Inc. | System and method for processing using multi-core processors, signals and AI processors from multiple sources to create a spatial map of selected region |
US11736901B2 (en) | 2020-04-10 | 2023-08-22 | Koko Home, Inc. | System and method for processing using multi-core processors, signals, and AI processors from multiple sources to create a spatial heat map of selected region |
US11558717B2 (en) | 2020-04-10 | 2023-01-17 | Koko Home, Inc. | System and method for processing using multi-core processors, signals, and AI processors from multiple sources to create a spatial heat map of selected region |
Also Published As
Publication number | Publication date |
---|---|
US11067388B2 (en) | 2021-07-20 |
US20190242696A1 (en) | 2019-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11067388B2 (en) | 3D asset inspection | |
Asadi et al. | Vision-based integrated mobile robotic system for real-time applications in construction | |
US10788428B2 (en) | Positioning system for aerial non-destructive inspection | |
Ellenberg et al. | Bridge related damage quantification using unmanned aerial vehicle imagery | |
US11263761B2 (en) | Systems and methods for visual target tracking | |
US9639960B1 (en) | Systems and methods for UAV property assessment, data capture and reporting | |
Taylor et al. | Automatic calibration of lidar and camera images using normalized mutual information | |
Leingartner et al. | Evaluation of sensors and mapping approaches for disasters in tunnels | |
Berezowski et al. | Geomatic techniques in forensic science: A review | |
WO2022077296A1 (en) | Three-dimensional reconstruction method, gimbal load, removable platform and computer-readable storage medium | |
CA3089307A1 (en) | System and method for creating geo-localized enhanced floor plans | |
JP2021035833A (en) | Inspection system | |
CN112312113A (en) | Method, device and system for generating three-dimensional model | |
Tran et al. | Low-cost 3D scene reconstruction for response robots in real-time | |
Sa et al. | Inspection of pole-like structures using a visual-inertial aided vtol platform with shared autonomy | |
Nocerino et al. | Multi-camera system calibration of a low-cost remotely operated vehicle for underwater cave exploration | |
Liu et al. | Framework for automated UAV-based inspection of external building façades | |
Zhou et al. | Autonomous wireless sensor deployment with unmanned aerial vehicles for structural health monitoring applications | |
Neumann et al. | A rotating platform for swift acquisition of dense 3D point clouds | |
JP2024169573A (en) | Crane photography system and program | |
Hallermann et al. | The application of unmanned aerial vehicles for the inspection of structures | |
CN116203976A (en) | Indoor inspection method and device for transformer substation, unmanned aerial vehicle and storage medium | |
Tavasoli et al. | Autonomous post‐disaster indoor navigation and survivor detection using low‐cost micro aerial vehicles | |
JP7467206B2 (en) | Video management support system and video management support method | |
Birk et al. | 3d data collection at disaster city at the 2008 nist response robot evaluation exercise (rree) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INDUSTRIAL TECHNOLOGY GROUP, LLC, COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MCALEENAN, G. CHRISTOPHER;WICKE, MATTHEW T.;REEL/FRAME:037800/0595 Effective date: 20160223 |
|
AS | Assignment |
Owner name: R.S. TECHNICAL SERVICES, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INDUSTRIAL TECHNOLOGY GROUP, LLC;REEL/FRAME:042982/0180 Effective date: 20170630 |
|
AS | Assignment |
Owner name: SUBSITE, LLC, OKLAHOMA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:R.S. TECHNICAL SERVICES, INC.;REEL/FRAME:042994/0275 Effective date: 20170630 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: THE CHARLES MACHINE WORKS, INC., OKLAHOMA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SUBSITE, LLC;REEL/FRAME:051315/0711 Effective date: 20191217 |