WO2016107989A1 - Estimation of lower bounds for deviations of as-built structures from as-designed models - Google Patents
- Publication number: WO2016107989A1 (application PCT/FI2015/050960)
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/08—Construction
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
- G06T7/60—Analysis of geometric attributes
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
- G06T2207/30108—Industrial image inspection
- G06T2207/30132—Masonry; Concrete
Definitions
- the present disclosure generally relates to photogrammetry and, more particularly, to image-based detection and measuring of deviations against an as-designed Building Information Model (BIM).
- Detection and measuring of deviations between as-built and as-designed structures is of significant value for high-quality construction work.
- An optimal solution would allow the positioning of building elements to be checked against allowed tolerances already during installation, when possible errors are still easy to remedy.
- As-built documentation facilitates proper installation and later maintenance of building services equipment, as one knows where the equipment was actually installed.
- the inspection of geometries of the as-built structure typically involves reconstruction of a dense set of 3-D points from stereo or by laser scanning of the as-built structure, and thereafter modeling of the data into surface patches which are compared against the design.
- a method includes facilitating receipt of an image of an as-built structure.
- the image is a two-dimensional (2-D) image.
- the method also facilitates receipt of an as-designed model associated with the as-built structure.
- the as-designed model is a three-dimensional (3-D) model including a plurality of vertices and a plurality of lines connecting the plurality of vertices.
- the method determines a plurality of edge features in the image, and an exterior orientation of the image, including the plurality of edge features corresponding to the as-designed model, is performed to generate an oriented image.
- the method deforms the as-designed model, such that, upon projection of the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image. Based on the image, lower-bounds for deviations of the as-built structure from the as-designed model are determined. The deviations are determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
- an apparatus for determining lower bounds for deviations of an as-built structure from an as-designed model includes a memory and a processor.
- the memory stores image processing instructions and the processor is electronically coupled with the memory.
- the processor is configured to execute the image processing instructions stored in the memory to cause the apparatus to perform facilitating receipt of an image of an as-built structure.
- the image is a two-dimensional (2-D) image.
- the apparatus also facilitates receipt of an as-designed model associated with the as-built structure.
- the as-designed model is a three-dimensional (3-D) model including a plurality of vertices and a plurality of lines connecting the plurality of vertices.
- the apparatus further determines a plurality of edge features in the image and performs an exterior orientation of the image including the plurality of edge features corresponding to the as-designed model to generate an oriented image. Furthermore, the apparatus deforms the as-designed model, such that, upon projection of the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image. Based on the image, lower-bounds for deviations of the as-built structure from the as-designed model are determined. The deviations are determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
- a non-transitory, computer-readable storage medium is provided, storing computer-executable program instructions to implement a method for determining lower bounds for deviations of an as-built structure from an as-designed model.
- the method includes facilitating receipt of an image of an as-built structure.
- the image is a two-dimensional (2-D) image.
- the method further facilitates receipt of an as-designed model associated with the as-built structure.
- the as-designed model is a three-dimensional (3-D) model including a plurality of vertices and a plurality of lines connecting the plurality of vertices.
- the method determines a plurality of edge features in the image, and an exterior orientation of the image, including the plurality of edge features corresponding to the as-designed model, is performed to generate an oriented image. Also, the method deforms the as-designed model, such that, upon projection of the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image. Based on the image, lower-bounds for deviations of the as-built structure from the as-designed model are determined. The deviations are determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
- FIG. 1 illustrates a block diagram representation of an apparatus, in accordance with an example embodiment
- FIG. 2 is a flow diagram depicting a method for determining lower bounds for deviations of an as-built structure from an as-designed model (BIM), in accordance with an example embodiment
- FIGS. 3A and 3B show example representations of imaging geometries for a planar image and a spherical image, in accordance with an example embodiment
- FIG. 4 is a flow diagram depicting a method for extracting edge features from a spherical image, in accordance with an example embodiment
- FIG. 5A shows an example image representation of edge features extracted in an image of a real-construction site with selected part of BIM projected thereon with an initial orientation, in accordance with an example embodiment
- FIG. 5B shows an example image representation of the real-construction site of FIG. 5A with selected part of BIM projected thereon with a refined exterior orientation, in accordance with an example embodiment
- FIG. 5C shows an example image representation of the real-construction site of FIG. 5B after deforming the selected part of BIM to substantially match with the edge features in the image, in accordance with an example embodiment
- FIG. 6A shows an example image representation of edge features extracted in an image of a parking lot corner with selected part of BIM projected thereon with an initial orientation, in accordance with an example embodiment
- FIG. 6B shows an example image representation of the image of FIG. 6A with selected part of BIM projected thereon with a refined exterior orientation, in accordance with an example embodiment
- FIG. 6C shows an example image representation of the image of FIG. 6B after deforming the selected part of BIM to substantially match with the edge features in the image, in accordance with an example embodiment
- FIG. 7 shows a plotted representation of a deformed model in BIM coordinates with shading differences illustrating lower bounds of differences of the as-built structure against the BIM, in accordance with an example embodiment.
- the terms 'as-designed structure', 'as-designed model', 'building information model', and 'BIM' are used interchangeably throughout the description; these terms represent a 3-D model of any planned structure that is to be built, already built, or a work in progress.
- the term 'as-built structure' refers to any actual structure that is built such that its images can be captured. Examples of the images herein include planar images and/or spherical images including planar and/or spherical panoramic images of the as-built structure.
- FIG. 1 illustrates a block diagram of an apparatus 100 for adjusting a building information model to fit with an image, in accordance with an example embodiment of the present invention.
- the apparatus 100 is configured to determine lower-bounds for deviations of the as-built structure from the as-designed model.
- the apparatus 100 may be any computing or data processing machine, for example, a laptop computer, a tablet computer, a mobile phone, a server, and the like. It is noted that the apparatus 100 may include fewer or more components than those depicted in FIG. 1.
- the apparatus 100 may be implemented as a centralized device, or, alternatively, various components of the apparatus 100 may be deployed in a distributed manner while being operatively coupled to each other. In an embodiment, one or more components of the apparatus 100 may be implemented as a set of software layers on top of existing hardware systems.
- the apparatus 100 includes at least one processor, for example, a processor 102, and at least one memory, for example, a memory 104.
- the memory 104 is capable of storing machine executable instructions, particularly the image processing instructions.
- the processor 102 is capable of executing the stored machine executable instructions.
- the processor 102 may be embodied in a number of different ways.
- the processor 102 may be embodied as one or more of various processing devices, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
- the processor 102 utilizes computer program code to cause the apparatus 100 to perform one or more actions responsible for adjusting a building information model of a structure to fit with an image corresponding to as-built structure, and to calculate lower bound estimations for deviations between the as-built structure and the as-designed model (the building information model).
- the memory 104 may be embodied as one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination of one or more volatile memory devices and non-volatile memory devices.
- the memory 104 may be embodied as magnetic storage devices (such as hard disk drives, floppy disks, magnetic tapes, etc.), optical magnetic storage devices (e.g., magneto-optical disks), CD-ROM (compact disc read only memory), CD-R (compact disc recordable), CD-R/W (compact disc rewritable), DVD (Digital Versatile Disc), BD (Blu-ray® Disc), and semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).
- the apparatus 100 includes a user interface 106 (also referred to as UI 106) for providing an output and/or receiving an input.
- the user interface 106 is configured to be in communication with the processor 102 and the memory 104.
- Examples of the user interface 106 include, but are not limited to, an input interface and/or an output interface.
- Examples of the input interface may include, but are not limited to, a keyboard, a mouse, a joystick, a keypad, a touch screen, soft keys, a microphone, and the like.
- Examples of the output interface may include, but are not limited to, a display such as a light-emitting diode display, a thin-film transistor (TFT) display, a liquid crystal display, or an active-matrix organic light-emitting diode (AMOLED) display, a microphone, a speaker, ringers, vibrators, and the like.
- the processor 102 may include user interface circuitry configured to control at least some functions of one or more elements of the user interface 106, such as, for example, a speaker, a ringer, a microphone, a display, and/or the like.
- the processor 102 and/or the user interface circuitry may be configured to control one or more functions of the one or more elements of the user interface 106 through computer program instructions, for example, software and/or firmware, stored in a memory, for example, the memory 104, and/or the like, accessible to the processor 102.
- the apparatus 100 includes a camera module 108, for example including one or more digital cameras.
- the camera module 108 is configured to be in communication with the processor 102 and/or other components of the apparatus 100 to capture digital image frames, videos and/or other graphic media.
- the camera module 108 may include hardware and/or software necessary for taking various kinds of images, for example, planar images, spherical images, or planar panoramic or spherical panoramic images.
- the camera module 108 may include hardware, such as a lens and/or other optical component(s) such as one or more image sensors.
- Examples of one or more image sensors may include, but are not limited to, a complementary metal-oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, a backside illumination sensor (BSI) and the like.
- the camera module 108 may further include a processing element such as a co-processor that assists the processor 102 in processing image frame data and an encoder and/or a decoder for compressing and/or decompressing image frame data.
- the encoder and/or decoder may encode and/or decode according to a standard format, for example, a Joint Photographic Experts Group (JPEG) standard format.
- the various components of the apparatus 100 may communicate with each other via a centralized circuit system 110 to determine lower bounds for deviations of the as-built structure from the as-designed model.
- the centralized circuit system 110 may be various devices configured to, among other things, provide or enable communication between the components (102-108) of the apparatus 100.
- the centralized circuit system 110 may be a central printed circuit board (PCB) such as a motherboard, a main board, a system board, or a logic board.
- the centralized circuit system 110 may also, or alternatively, include other printed circuit assemblies (PCAs) or communication channel media.
- the memory 104 is configured to store image processing instructions for processing of the as-designed model (hereinafter also interchangeably referred to as 'building information model' or 'BIM') and the images corresponding to the as-built structure.
- the image processing instructions stored in the memory 104 are executable by the processor 102 for performing a method explained with reference to FIG. 2.
- FIG. 2 is a flowchart depicting an example method 200 for estimation of lower bounds of deviations of the as-built structure from the as-designed model, in accordance with an example embodiment.
- the method 200 depicted in the flow chart may be executed by, for example, the apparatus 100 of FIG. 1.
- the method 200 includes facilitating receipt of an image (I) of an as-built structure and an as-designed model (BIM) associated with the as-built structure.
- the as-built structure may include, but is not limited to, any man-made structure such as a building, a site, roads, etc., complete or partially built structures, foundations of any structure, etc.
- the BIM associated with the as-built structure is a 3-D model of the intended structure that is originally planned to be built.
- the BIM is represented as a 3-D wire-frame model with a plurality of vertices and a plurality of BIM lines (i.e., straight lines) joining the plurality of vertices.
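The wire-frame representation described above can be sketched minimally as vertices plus index pairs; the geometry (one rectangular corner with three lines emanating from it) and all names below are illustrative, not taken from the patent.

```python
import numpy as np

# Vertices as 3-D points; BIM lines as index pairs into the vertex array.
bim_vertices = np.array([
    [0.0, 0.0, 0.0],   # corner vertex at the object-coordinate origin
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
bim_lines = [(0, 1), (0, 2), (0, 3)]   # straight lines joining vertex pairs

# Direction vectors of the BIM lines departing from the corner vertex:
line_dirs = [bim_vertices[j] - bim_vertices[i] for i, j in bim_lines]
```

Such a structure is all the later steps need: projecting the lines into an image and comparing them against extracted edge features.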
- the image (I) of the as-built structure is captured by the camera module 108 present in or otherwise accessible to the apparatus 100.
- the image (I) may be prerecorded or stored in the apparatus 100, or may be received from sources external to the apparatus 100.
- the apparatus 100 is caused to receive the image (I) from an external storage medium such as a DVD, Compact Disc (CD), flash drive, or memory card, or from external storage locations through the Internet, Bluetooth®, and the like.
- the method 200 includes determining a plurality of edge features in the image (I).
- the plurality of edge features includes edge lines in those cases where the image (I) is a planar image, and the plurality of edge features includes edge curves in those cases where the image (I) is a spherical image.
- straight lines of the BIM are usually visible as edge lines in the planar image (Ip) or as edge curves in the spherical image (Is), and hence the edge features are determined from the image (I) so that the edge features and the BIM lines, after post-processing, can be utilized to compute lower bounds for the deviations of the as-built structure from the as-designed model.
- Various example embodiments of determination of the edge features are described later with reference to one or more of FIGS. 3A-3B to 6A-6C.
- the method 200 includes performing an exterior orientation of the image (I) comprising the plurality of edge features corresponding to the as-designed model to generate an oriented image.
- the exterior orientation of the image (I) is found by determining correspondences between the BIM lines and the edge features of the image (I).
- the exterior orientation of the planar image or planar panoramic image (Ip) can be achieved by line to line correspondences of the 3-D lines of the BIM and the edge lines of the planar image (Ip).
- the exterior orientation of the spherical image or spherical panoramic image (Is) can be achieved by estimating arc-to-line correspondences of the 3-D lines of the BIM and the arcs of the edge curves of the spherical image or spherical panoramic image (Is), by suitable algorithms such as a modified RANSAC algorithm.
- the method 200 includes deforming the as-designed model (BIM), such that, upon projecting the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image (I).
- deforming the BIM comprises adjusting positions of the plurality of vertices of the BIM such that a vector from a camera projection center to an individual vertex of the plurality of vertices is perpendicular to normals of projecting planes corresponding to one or more edge features of the plurality of edge features in the image (I), wherein the one or more edge features correspond to one or more lines of the plurality of lines departing from the individual vertex.
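The perpendicularity condition above can be sketched as a least-squares problem: stacking the measured projecting-plane normals, the ray from the projection center to the vertex is their common perpendicular, i.e. the right singular vector with the smallest singular value. The function name and the synthetic normals are invented for illustration.

```python
import numpy as np

def vertex_ray_direction(normals):
    """Least-squares direction v with n . v = 0 for all plane normals n."""
    N = np.asarray(normals, dtype=float)
    _, _, vt = np.linalg.svd(N)
    v = vt[-1]               # null-space (smallest-singular-value) direction
    return v / np.linalg.norm(v)

# Two projecting planes whose intersection line is the z-axis:
normals = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
v = vertex_ray_direction(normals)
# v is (0, 0, +/-1): both dot products with the normals vanish.
```

With more than two edge features meeting at a vertex the same SVD gives the best-fitting ray in the least-squares sense.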
- the method 200 includes determining, based on the image (I), lower-bounds for deviations of the as-built structure from the BIM.
- the deviations are determined corresponding to the plurality of vertices of the BIM based on the deformation of the BIM (performed at operation 220).
- FIGS. 3A and 3B represent image geometry and coordinates used for the explanation of various embodiments of the operations 205-225 of the method 200.
- Referring to FIG. 3A, an example representation 300 of the object coordinate system and the camera coordinate system corresponding to the planar image (Ip) is illustrated, and referring to FIG. 3B, an example representation 350 of the object coordinate system and the camera coordinate system corresponding to the spherical image (Is) is illustrated.
- In both figures, the rectangular object coordinate system is aligned with the BIM so that the origin of the object coordinate system (see, 310 and 360) is at a vertex of a rectangular corner of the BIM, and the coordinate axes (see, 302, 304 and 306, and 352, 354 and 356) are aligned with the BIM lines emanating from that vertex.
- the exterior orientation of the camera is given by a rotation matrix 'R' and a translation vector
- the object point is projected onto the image (I) by well-known equations of perspective projection (e.g., the projection of points Pc and Qc to points p and q, respectively).
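As an illustration of this projection step, one common convention for the perspective equations (transform to the camera frame via R and a translation, then divide by depth and scale by a focal length f) is sketched below; the exact convention and all numbers are assumptions, not the patent's formulas.

```python
import numpy as np

def project(P_obj, R, T, f=1.0):
    """Perspective projection of an object point under exterior orientation."""
    Pc = R @ (np.asarray(P_obj, float) - T)   # object -> camera frame
    x = f * Pc[0] / Pc[2]                      # divide by depth, scale by f
    y = f * Pc[1] / Pc[2]
    return np.array([x, y])

R = np.eye(3)                   # camera aligned with the object axes
T = np.array([0.0, 0.0, -5.0])  # camera 5 units behind the origin
p = project([1.0, 2.0, 0.0], R, T, f=1.0)
# P_c = (1, 2, 5), so p = (0.2, 0.4)
```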
- α is the rotation angle (azimuth) around the y-axis counted from the negative z-axis
- β is the rotation angle (elevation) around the once-rotated x-axis.
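One plausible reading of these angle conventions maps azimuth α (from the negative z-axis, around the y-axis) and elevation β (around the once-rotated x-axis) to a unit direction on the image sphere as below; this is a sketch under stated assumptions, not a reproduction of the patent's expression (1).

```python
import numpy as np

def sphere_direction(alpha, beta):
    """Unit direction for azimuth alpha and elevation beta (assumed convention)."""
    x = np.cos(beta) * np.sin(alpha)
    y = np.sin(beta)
    z = -np.cos(beta) * np.cos(alpha)   # alpha counted from the negative z-axis
    return np.array([x, y, z])

d = sphere_direction(0.0, 0.0)
# alpha = beta = 0 gives the negative z-axis (0, 0, -1).
```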
- the processor 102 is configured to, with the image processing instructions stored in the memory 104, and optionally with other components described herein, cause the apparatus 100 to perform the operation 210 of the method 200.
- the apparatus 100 is caused to determine a plurality of edge features in the image (I).
- In the embodiments where the image (I) is the planar image (Ip), the examples of the edge features are edge lines; and in the embodiments where the image (I) is the spherical image (Is), the examples of the edge features are edge curves.
- the edge lines can be extracted by any well-known method such as the Hough transform from edge pixels detected by any well-known method such as the Canny algorithm.
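The two steps named above (edge pixels, then a Hough vote over line parameters) can be sketched with a toy accumulator on synthetic edge pixels; a real pipeline would use library implementations (e.g., OpenCV's Canny and Hough functions), and all sizes here are invented.

```python
import numpy as np

# Synthetic "Canny output": a horizontal edge row y = 10, 60 pixels long.
edge_pixels = [(x, 10) for x in range(60)]

# Hough accumulator over (rho, theta): each edge pixel votes for every
# line x*cos(theta) + y*sin(theta) = rho passing through it.
thetas = np.deg2rad(np.arange(0, 180, 1))
rho_max = 80
acc = np.zeros((2 * rho_max + 1, len(thetas)), dtype=int)
for x, y in edge_pixels:
    rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
    acc[rhos + rho_max, np.arange(len(thetas))] += 1

r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
rho, theta_deg = r_idx - rho_max, np.degrees(thetas[t_idx])
# The dominant bin is rho = 10, theta = 90 degrees: the row y = 10.
```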
- the edge curves can be extracted as per a flow diagram illustrated in FIG. 4.
- a method 400 illustrates a flow diagram for extracting the edge features (edge curves) from the spherical image (Is), in accordance with an example embodiment.
- a straight line in the object space is projected to an arc of a circle on the image sphere of the spherical image (Is).
- the method 400 includes detecting Canny edge pixels in the spherical image (Is) using the Canny algorithm as set forth in J. Canny, "A Computational Approach to Edge Detection," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 8, no. 6, pp. 679-698, 1986.
- the method 400 includes grouping the Canny edge pixels into a plurality of connected components based on 8-connectivity of neighboring pixels. It is noted that each connected component may contain pixels from several BIM lines.
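The 8-connectivity grouping is a standard flood fill; a hedged sketch on a tiny synthetic edge mask follows (production code might instead use a library routine such as scipy.ndimage.label).

```python
from collections import deque
import numpy as np

def connected_components(edges):
    """Label True pixels into 8-connected components; return labels and count."""
    edges = np.asarray(edges, bool)
    labels = np.zeros(edges.shape, int)
    count = 0
    for start in zip(*np.nonzero(edges)):
        if labels[start]:
            continue
        count += 1
        labels[start] = count
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):       # all 8 neighbors (and self, a no-op)
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < edges.shape[0] and 0 <= cc < edges.shape[1]
                            and edges[rr, cc] and not labels[rr, cc]):
                        labels[rr, cc] = count
                        queue.append((rr, cc))
    return labels, count

img = np.zeros((5, 7), bool)
img[1, 1:4] = True          # one short edge run
img[3, 5] = True            # an isolated edge pixel
labels, n = connected_components(img)
# n == 2: the run and the isolated pixel are separate components.
```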
- the method 400 includes selecting a connected component of the plurality of connected components. Further, at 420, the method 400 includes dividing the connected component into one or more segments and merging the compatible segments using a region growing process. In an example embodiment, for the operation 420, the method 400 includes estimating, robustly at each edge pixel, the angle of the normal of a local edge line and the signed distance of the local edge line from the image origin. In an example, estimation of the angle and the signed distance includes computing a set of lines through two points, i.e., each edge pixel in a local neighborhood (of size M x M pixels) paired with the pixel in question. Further, the method 400 includes selecting the best line, which has the largest number of other edge pixels closer than a threshold T e to the line. The line parameters are refined by fitting a straight line to all edge pixels within the neighborhood that are closer than T e to the best line.
- the method 400 further includes performing region growing along edge pixels with similarity of the local line direction and signed distance from the origin as criteria for adding the next pixel to the segment of previous pixels. Since the edge is curved, the local line parameters of the next pixel are compared to the moving average of line parameters of a pre-determined number of previously added pixels. In an example, the pre-determined number of previously added pixels may be 20 pixels.
- the line directions are similar if the normal angles differ by less than a threshold T a , and the signed distances from the origin are similar if they differ by less than D max (D, a), where D is the distance of the moving average from the origin and a is the normal angle of the moving average.
- appropriate care is also taken with vertical lines, where the normal angle of the line has a discontinuity of π radians and the signed distance from the origin changes to the opposite sign with the same absolute value. If the next pixel is not compatible with the region grown by then, a new segment is started. In this manner, the connected component is divided into segments.
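The growing criteria above, including the vertical-line special case, can be sketched as a compatibility test on (normal angle, signed distance) pairs; the threshold names and values below are invented for illustration.

```python
import numpy as np

def compatible(a1, d1, a2, d2, T_a=np.deg2rad(5), T_d=2.0):
    """Two local line estimates match directly, or across the pi discontinuity
    where the normal angle jumps by pi and the signed distance flips sign."""
    direct = abs(a1 - a2) < T_a and abs(d1 - d2) < T_d
    flipped = abs(abs(a1 - a2) - np.pi) < T_a and abs(d1 + d2) < T_d
    return direct or flipped

# Same vertical line, parameters wrapped across the discontinuity:
ok = compatible(0.01, 50.0, np.pi - 0.01, -50.0)
```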
- the method 400 includes forming one or more arcs corresponding to one or more segments of the connected component using edge boundary pixels corresponding to the one or more segments. For instance, each segment of the connected component is selected, and for pixels of each segment, the coordinates on the image sphere are computed according to expression (1). Further, a plane is then fitted to these points with the origin of the sphere added to the point set the same number of times as there are points on the image sphere. It is noted that the unit normal n of the fitted plane represents the projecting plane computed from the image measurements (see, FIG. 3B).
- the end points, i.e., edge boundary pixels p and q, of an arc of a circle are determined by projecting the points on the sphere to the fitted plane, computing unit vectors from the projection center toward the projected points, selecting one vector as a reference and computing angles of the other vectors with respect to the reference, choosing the vectors of smallest and largest angle (possibly including the reference), and scaling them to the circle.
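The plane-fitting step can be sketched with an equivalent formulation that constrains the plane to pass through the sphere origin directly (the text instead achieves this by adding the origin to the point set repeatedly): the unit normal is the right singular vector of the point matrix with the smallest singular value. The data below is synthetic.

```python
import numpy as np

def fit_projecting_plane(points_on_sphere):
    """Unit normal n minimizing sum of (n . p_i)^2 for a plane through the origin."""
    P = np.asarray(points_on_sphere, float)
    _, _, vt = np.linalg.svd(P)
    n = vt[-1]
    return n / np.linalg.norm(n)

# Points on the great circle z = 0, i.e. an arc whose projecting plane
# is the xy-plane:
angles = np.linspace(0.1, 1.2, 15)
pts = np.stack([np.cos(angles), np.sin(angles), np.zeros_like(angles)], axis=1)
n = fit_projecting_plane(pts)
# n is (0, 0, +/-1): the normal of the arc's plane.
```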
- the method 400 checks whether all of the connected components are processed (i.e. the segments are formed, and thereby arcs are also formed). If all connected components are processed, the method 400 proceeds to 435, otherwise the method 400 goes back to 415, and a next connected component is selected for processing with the operations 420 and 425.
- compatible arcs of circles which satisfy an overlap criterion are merged one by one if the unit normals of their projecting planes are close to each other or point in nearly opposite directions. For instance, if the angle between the normals of the projecting planes of two arcs is below a threshold T n or above π - T n , the two arcs are merged.
- two arcs are defined to satisfy the overlap criterion if, for example, the projected and scaled arcs overlap each other or are apart from each other by no more than a tolerance T0.
- the projected and scaled arcs are obtained by orthogonally projecting the end point vectors of the arcs to a new projecting plane and scaling the projected vectors so that their end points are on the image sphere.
- the new projecting plane is computed with a new unit normal averaged from the normals of the old planes with the lengths of the old arcs divided by the focal length as weights.
- the end points of the new merged arc are given by those projected and radially scaled old end point vectors which have the smallest and largest angle with respect to one end point vector selected as a reference (or the reference itself).
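The merge of two compatible arcs described above can be sketched as follows. This is a simplified illustration with hypothetical names; the relative length weights stand in for the lengths-divided-by-focal-length weights, since the common focal-length factor cancels in the normalisation.

```python
import numpy as np

def merge_arcs(n1, ends1, len1, n2, ends2, len2):
    """Merge two compatible arcs on the image sphere (a sketch).

    The new projecting plane normal is a length-weighted average of the old
    normals (one normal is flipped if they point in near-opposite
    directions).  The old end points are projected onto the new plane and
    radially rescaled to the sphere; the extreme ones (by signed angle
    against a reference end point) become the end points of the merged arc.
    """
    if np.dot(n1, n2) < 0:                    # near-opposite normals: flip one
        n2 = -n2
    n = len1 * n1 + len2 * n2                 # length-weighted average normal
    n /= np.linalg.norm(n)
    E = np.vstack([ends1, ends2])             # all old end points, shape (4, 3)
    P = E - np.outer(E @ n, n)                # project onto the new plane
    U = P / np.linalg.norm(P, axis=1, keepdims=True)  # rescale to the sphere
    ref = U[0]
    ang = np.arctan2(np.cross(ref, U) @ n, U @ ref)
    return n, U[np.argmin(ang)], U[np.argmax(ang)]
```

Two overlapping arcs on the same great circle, for instance, merge into one arc spanning their union.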
- the result of extracting edge curves from a spherical panoramic image (Is) of a real construction site is illustrated in FIG. 5A.
- the extracted edge curves that are longer than a pre-determined number of pixels (e.g., 200 pixels) are shown by reference numerals 505.
- a selected part of the BIM projected on the spherical panoramic image (Is) according to an initial orientation is shown by reference numerals 510.
- the image of the real construction site shown in FIG. 5A may be constructed by stitching together a plurality of sub-images taken of the site.
- the result of extracting edge curves from another spherical image (an image of a corner of a parking hall) is illustrated in FIG. 6A.
- the extracted edge curves that are longer than 200 pixels are shown by reference numerals 605.
- a selected part of the BIM projected on the spherical image (Is) according to an initial orientation is shown by reference numerals 610.
- the processor 102 is configured to, with the image processing instructions stored in the memory 104, and optionally with other components described herein, cause the apparatus 100 to perform an exterior orientation of the image (I) comprising the plurality of edge features corresponding to the as-designed model (BIM) to generate an oriented image.
- the exterior orientation of a planar image (Ip) can be performed from line to line correspondences by any suitable methods known in the art.
- the exterior orientation of the spherical image (Is) or panoramic image is solved using determination of arc (e.g., arcs of Is) to line (e.g., BIM lines) correspondences.
- a modified RANSAC algorithm is used to determine the correspondences between the 3-D lines of the BIM and arcs of circles on the image sphere corresponding to the spherical image (Is).
- more than two pairs (S pairs) of BIM lines are first randomly selected, with the requirement that the two lines of each pair share the same vertex of the BIM as one of the end points of the lines.
- the pairs of lines constitute a set of K distinct lines, where 3 ≤ K ≤ 2S.
- differences between the directions of the normals, each parameterized by two angles, are also computed.
- an arc with a projecting plane normal is then randomly selected from the image sphere.
- each subset consists of arcs whose projecting plane normal direction angles differ by less than a threshold from the direction angles of a normal approximated from the direction angles of the normals of previously selected arcs and 3-D lines of the BIM.
- this approximated normal is obtained from the condition that the differences between the projecting plane normals of the arcs are similar to the differences between the projecting plane normals of the 3-D lines of the BIM (also referred to as '3-D BIM lines').
- the rotation R of the spherical image is solved using the normals and the direction vectors L k of the 3-D BIM lines by applying adaptive weighting to the technique set forth in Y. Liu, T.S. Huang, and O.D. Faugeras, "Determination of Camera Location from 2-D to 3-D Line and Point Correspondences," in IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, no. 1, pp. 28-37, 1990.
- the merit function to be minimized with respect to the rotation matrix parameterized by three angles is as per the following expression (3)
- the minimization of the merit function (f1) is solved using the Levenberg-Marquardt method, where an initial estimate for the rotation can be provided manually.
- the weights are equal to one at correspondences where the mean distance from the end points of the arc to the projecting plane of the transformed BIM line is below an adaptive threshold.
- the transformation of the BIM line means transformation to the camera coordinate system according to the current estimate for the rotation and translation.
- the adaptive threshold is chosen based on mean and standard deviation of arc to projecting plane distances at each of the K correspondences. The adaptive threshold gets tighter as the iteration proceeds similarly as set forth in Z. Zhang, "Iterative Point Matching for Registration of Free-Form Curves and Surfaces," International Journal of Computer Vision, vol. 13, no. 2, pp. 119-152, 1994, for registration of curves and surfaces.
- the weights are equal to zero at correspondences where the arc to projecting plane distances are larger than or equal to the adaptive threshold. It is noted that such a weighting scheme removes false correspondences and uses only reliable ones. In some example embodiments, however, the weighting can be ignored when the value of K is small, as it has more influence when K is large.
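Expression (3) is not reproduced in this text. In the cited Liu-Huang-Faugeras line-correspondence formulation, the residual at each correspondence is the projection of the rotated line direction onto the projecting plane normal, so a merit function of the assumed form f1 = Σk wk (nk · R Lk)² is minimised over three rotation angles. The sketch below assumes this form; it uses plain Gauss-Newton in place of Levenberg-Marquardt, takes fixed weights instead of the adaptive scheme, and all names (including the Euler-angle convention) are hypothetical.

```python
import numpy as np

def rot(a, b, c):
    """Rotation matrix from three angles (Z-Y-X Euler convention assumed)."""
    ca, sa, cb, sb, cc, sc = np.cos(a), np.sin(a), np.cos(b), np.sin(b), np.cos(c), np.sin(c)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cc, -sc], [0, sc, cc]])
    return Rz @ Ry @ Rx

def solve_rotation(normals, lines, weights, angles0, iters=50):
    """Minimise sum_k w_k (n_k . R L_k)^2 over three rotation angles by
    Gauss-Newton with a numerical Jacobian (damping omitted for brevity)."""
    N, L, w = map(np.asarray, (normals, lines, weights))
    sw = np.sqrt(w)

    def residuals(x):
        # rows of L @ R^T are the rotated line directions R L_k
        return sw * np.einsum('ki,ki->k', N, L @ rot(*x).T)

    x = np.asarray(angles0, dtype=float)
    for _ in range(iters):
        r = residuals(x)
        eps = 1e-6
        J = np.column_stack([(residuals(x + eps * np.eye(3)[j]) - r) / eps
                             for j in range(3)])
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-12:
            break
    return x, float(np.sum(residuals(x) ** 2))
```

Starting from an initial estimate near the true rotation (as the text allows via a manual estimate), the residual is driven to essentially zero on consistent synthetic data.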
- two or more intersections of 3-D BIM lines and intersections of circles of corresponding arcs on the image sphere corresponding to the spherical image (Is), are taken.
- let Vkl denote the vertex where the BIM lines k and l of one of the selected S pairs of 3-D BIM lines intersect.
- the corresponding arcs in the image sphere need not intersect, but the intersections of circles, which the arcs are part of, are used.
- the translation t may be represented as per the following expression (4):
- the 3-D lines of the BIM are transformed from the object to camera coordinate system.
- the projecting plane normals Ne k in the camera coordinates and the end points of respective arcs on the image sphere are computed.
- nu and NcP are close to each other if the angle between them is less than a threshold Ty or larger than π - Ty.
- the closest arc according to expression (6) may not always be the best solution, especially when there are several arcs close to each other or the arcs represent different parts of the same edge line. Consequently, in an example embodiment, it is determined whether there are other arcs Dq2 close to Dq (e.g., the mean distances of the end points of Dq from the projecting planes of Dq2 are less than a threshold), which have been previously matched with some other arcs Ap2 computed from the BIM such that the projecting plane normals of Ap2 are close to that of Ap (i.e., Dq2 is the arc of the image sphere corresponding to the BIM line from which Ap2 has been computed). Further, the closeness of Dq to Ap, and of Dq2 to Ap2, is measured by the mean distance of the end points of Dq and Dq2 from the projecting planes of Ap and Ap2, respectively. The closer of these two alternatives is kept for consideration and the other is deleted (or ignored). If Dq is deleted from consideration, then the second closest arc according to expression (6) is considered and the process is iterated until a compatible arc can be found.
- the best orientation estimate is considered to be obtained when the total number of arc to line correspondences (including the other ones as per expression (6) and the further considerations according to paragraph [0053], i.e., the paragraph following expression (6)) is maximized and, among the orientations with the maximum number of correspondences, the orientation which maximizes the merit function f2 in expression (5) is chosen.
- the result of exterior orientation of the spherical panoramic image (Is) is illustrated in an example representation 540 of FIG. 5B and in an example representation 625 of FIG. 6B.
- the BIM (see 510) projected onto the spherical image (Is) fits mostly well with the edge curves (see 505), except along the left border of the lacuna 520 (see region 522) and along the side of the lacuna 525 near the conduit elements 530 (see region 528).
- the processor 102 is configured to, with the image processing instructions stored in the memory 104, and optionally with other components described herein, cause the apparatus 100 to perform the operations 215 and 220.
- the apparatus 100 is caused to deform the as-designed model, such that, upon projecting the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image, and the apparatus 100 is further caused to determine lower-bounds for deviations of the as-built structure from the as-designed model, based on the image (I).
- the processor 102 is configured to determine the deviations corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
- an edge line extracted from a planar image (Ip) and an edge curve or an arc of an image sphere extracted from a spherical image (Is) can be jointly denoted as a 2-D feature of the image (I).
- a total of K number of 2-D feature to 3-D BIM line correspondences which yield the optimal exterior orientation, and a total of K' number of other feature to line correspondences using the optimal orientation, are established (e.g., determined during the operation 215). It should be noted that these K' other correspondences are obtained for the spherical image (Is) as described above in paragraphs [0052] and [0053]. For the planar image (Ip), K' other correspondences can be obtained using the same procedure by replacing the term 'arc' with 'line' and the term 'image sphere' with 'image plane'.
- let Uh be a set of indices of features that correspond to 3-D BIM lines that share PBh as one of the end points of the BIM line. In other words, Uh groups the features corresponding to BIM lines that depart from vertex PBh.
- the vector to each adjusted vertex PBh + ΔPBh should be perpendicular to the unit normals of the projecting planes of all features belonging to Uh.
- each weight is proportional to the length of overlap between the image feature and the feature computed from the corresponding BIM line when transformed to the same circle on the image sphere, similarly as in the overlap criterion in the case of the spherical image (Is), and correspondingly for the planar image (Ip).
- the set Uh typically contains zero to three features. In an example embodiment, if there are two or more features in Uh, then all three coordinates of ΔPBh are regarded as unknown. In an example embodiment, if there is only one feature in the set Uh, then the position of PBh is kept fixed in the direction of the 3-D BIM line corresponding to the feature in question, and changes are allowed only perpendicular to the BIM line direction. Due to the selection of the BIM coordinate system, each line of the BIM is usually parallel to one of the coordinate axes.
- fixing the movement in the direction of the BIM line corresponds to fixing one coordinate of PBh.
- if the set Uh is empty, all the coordinates of the vertex PBh are kept fixed.
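For a vertex with two or more features in Uh, the perpendicularity condition above reduces to a small weighted linear least-squares problem in the vertex shift. The sketch below illustrates this case only, in simplified form; the patent solves the full problem with the Levenberg-Marquardt algorithm and Lagrange multipliers, and the function name is hypothetical.

```python
import numpy as np

def adjust_vertex(P, normals, weights):
    """Least-squares vertex adjustment (a sketch for |U_h| >= 2).

    Finds the shift D minimising sum_i w_i (n_i . (P + D))^2, i.e. the
    weighted linear least-squares problem  A D = -A P  with rows
    sqrt(w_i) n_i^T.  When the system is underdetermined, lstsq returns
    the minimum-norm shift that makes the residual zero.
    """
    N = np.asarray(normals, dtype=float)
    sw = np.sqrt(np.asarray(weights, dtype=float))[:, None]
    A = sw * N                                  # weighted normal rows
    b = -(A @ np.asarray(P, dtype=float))
    D, *_ = np.linalg.lstsq(A, b, rcond=None)
    return D
```

The adjusted vertex P + D then lies (to numerical precision) in every projecting plane of the features in Uh.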
- the other feature to line correspondences for which one or both end points of the 3-D line have moved more than a threshold Tp are removed from consideration, and the corresponding deformations are set to zero.
- the Levenberg-Marquardt algorithm with Lagrange multipliers is applied again, with zero values as initial estimates, for a new set of unknown coordinates of ΔPBh determined based on the remaining feature to line correspondences.
- the two-step approach thus uses information from the 3-D object space to eliminate false feature to line correspondences. It is noted that when the scene contains 3-D lines at various depths, it is difficult to set an appropriate value for Ty, which describes the closeness of the normals of the projecting planes of the image feature and the 3-D line. However, the threshold Tp gives a depth-invariant criterion, which can be set based on knowledge about how much the vertices are expected to be misplaced at most.
- the solution with the reduced number of unknowns gives the deformation of the BIM, where the magnitudes of the deformations represent the lower bounds for deviations against the as-designed BIM at each vertex of the BIM.
- the result of the BIM adjustment for the spherical panoramic image (Is) is illustrated in an example representation 560 of FIG. 5C and in an example representation 650 of FIG. 6C.
- the BIM is deformed so that the BIM lines substantially fit the corresponding edge curves of the lacuna 520.
- the region 522 shown in FIG. 5B is adjusted and is hence not visible in FIG. 5C.
- FIG. 7 shows the deformed model in the BIM coordinates ( ⁇ , ⁇ , ⁇ ) with shading differences illustrating differences against the as-designed BIM.
- one or more algorithms include several thresholds which determine when two quantities are considered to be close to each other. It should however be noted that the determination of appropriate values for the parameters and thresholds depends on the scene content, such as the straightness of edge lines in the object space (M, Te, Ta), the resolution one wants to achieve by not merging adjacent edge curves, the closeness of the initial orientation to the true one, the resolution needed to separate arcs close to each other, and the magnitude of BIM deformation that is expected (Ty, Tp).
- a technical effect of one or more of the example embodiments disclosed herein is to provide methods for providing lower-bounds for deviations of the as-built structure from the as-designed model.
- Various example embodiments are capable of working with a single image (e.g., a planar or a spherical image) and are yet able to derive 3-D deformation information.
- the lower bound obtained can be applied to determine deviations exceeding tolerances and requiring further inspection.
- Various example embodiments operate on both planar and spherical images stitched from one or several concentric sub-images.
- spherical panoramic images contain features from all surroundings of the setup and may thus be more accurately oriented with the BIM.
- since orientations of the spherical images are solved by applying the concept of edge-based methods, the computational complexity and overhead are reduced significantly.
- Various embodiments described above may be implemented in software, hardware, application logic or a combination of software, hardware and application logic.
- the software, application logic and/or hardware may reside on at least one memory, at least one processor, an apparatus, or a non-transitory computer program product.
- the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media.
- a "computer-readable medium" may be any non-transitory media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of a system described and depicted in FIG. 1.
- a computer- readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
Abstract
A method and a system for determining lower bounds for deviations of an as-built structure from an as-designed model are provided. The method includes facilitating receipt of a 2-D image of an as-built structure and a 3-D as-designed model associated with the as-built structure, where the as-designed model includes a plurality of vertices and a plurality of lines. The method includes determining a plurality of edge features in the image and performing an exterior orientation of the image corresponding to the as-designed model to generate an oriented image. The as-designed model is deformed, such that, upon projection of the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model substantially fit with the plurality of edge features. Lower-bounds for deviations of the as-built structure from the as-designed model are determined corresponding to the plurality of vertices based on the deformation of the as-designed model.
Description
ESTIMATION OF LOWER BOUNDS FOR DEVIATIONS OF AS-BUILT STRUCTURES FROM AS-DESIGNED MODELS
TECHNICAL FIELD
[0001] The present disclosure generally relates to photogrammetry and, more particularly, to image-based detection and measuring of deviations against an as-designed Building Information Model (BIM).
CROSS-REFERENCE TO RELATED APPLICATIONS
[0002] This application claims priority to U.S. Provisional Application No. 62/098464, filed Dec. 31, 2014, titled "As-built Deformations of 3-D Building Information Model from Single Spherical Panoramic Image", by Olli Jokinen.
BACKGROUND
[0003] Detection and measuring of deviations between as-built and as-designed structures (e.g., buildings) is of great value for high-quality construction work. An optimal solution would allow checking of the positioning of building elements against allowed tolerances already during installation, when it is easy to remedy possible errors. At later stages, as-built documentation facilitates proper installation and later maintenance of building services equipment, as one knows where they were actually installed.

[0004] The building geometry of as-designed structures is usually represented in a Building Information Model (BIM) with vertices, lines, and surfaces. The inspection of geometries of the as-built structure typically involves reconstruction of a dense set of 3-D points from stereo or by laser scanning of the as-built structure, and thereafter modeling of the data into surface patches which are compared against the design. Processing of such a huge amount of data is computationally heavy and requires identification of corresponding surface patches between the design (BIM) and the reality (as-built structure).
[0005] Hence, techniques are needed that can reduce the computational overhead while studying deviations between the as-built structures and their corresponding as- designed models.
SUMMARY
[0006] Various methods, systems and computer readable mediums for determining lower bounds for deviations of an as-built structure from an as-designed model are disclosed. In an embodiment, a method includes facilitating receipt of an image of an as-built structure. The image is a two-dimensional (2-D) image. The method also facilitates receipt of an as-designed model associated with the as-built structure. The as- designed model is a three-dimensional (3-D) model including a plurality of vertices and a plurality of lines connecting the plurality of vertices. Furthermore, the method determines a plurality of edge features in the image and an exterior orientation of the image including the plurality of edge features corresponding to the as-designed model is performed to generate an oriented image. Also, the method deforms the as-designed model, such that, upon projection of the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image. Based on the image, lower-bounds for deviations of the as-built structure from the as-designed model are determined. The deviations are determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
[0007] In another embodiment, an apparatus for determining lower bounds for deviations of an as-built structure from an as-designed model is disclosed. The apparatus includes a memory and a processor. The memory stores image processing instructions and the processor is electronically coupled with the memory. The processor is configured to execute the image processing instructions stored in the memory to cause the apparatus to perform facilitating receipt of an image of an as-built structure. The image is a two- dimensional (2-D) image. The apparatus also facilitates receipt of an as-designed model associated with the as-built structure. The as-designed model is a three-dimensional (3-D) model including a plurality of vertices and a plurality of lines connecting the plurality of vertices. The apparatus further determines a plurality of edge features in the image and
performs an exterior orientation of the image including the plurality of edge features corresponding to the as-designed model to generate an oriented image. Furthermore, the apparatus deforms the as-designed model, such that, upon projection of the deformed as- designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image. Based on the image, lower- bounds for deviations of the as-built structure from the as-designed model are determined. The deviations are determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
[0008] In yet another embodiment, a non-transitory, computer-readable storage medium storing computer-executable program instructions to implement a method for determining lower bounds for deviations of an as-built structure from an as-designed model is disclosed. The method includes facilitating receipt of an image of an as-built structure. The image is a two-dimensional (2-D) image. The method further facilitates receipt of an as-designed model associated with the as-built structure. The as-designed model is a three-dimensional (3-D) model including a plurality of vertices and a plurality of lines connecting the plurality of vertices. Furthermore, the method determines a plurality of edge features in the image and an exterior orientation of the image including the plurality of edge features corresponding to the as-designed model is performed to generate an oriented image. Also, the method deforms the as-designed model, such that, upon projection of the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image. Based on the image, lower-bounds for deviations of the as-built structure from the as-designed model are determined. The deviations are determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
[0009] Other aspects and example embodiments are provided in the drawings and the detailed description that follows.
BRIEF DESCRIPTION OF THE FIGURES
[0010] For a more complete understanding of example embodiments of the present technology, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
FIG. 1 illustrates a block diagram representation of an apparatus, in accordance with an example embodiment;
FIG. 2 is a flow diagram depicting a method for determining lower bounds for deviations of an as-built structure from an as-designed model (BIM), in accordance with an example embodiment;
FIGS. 3 A and 3B show example representations of imaging geometries for a planar image and a spherical image, in accordance with an example embodiment;
FIG. 4 is a flow diagram depicting a method for extracting edge features from a spherical image, in accordance with an example embodiment;
FIG. 5A shows an example image representation of edge features extracted in an image of a real-construction site with selected part of BIM projected thereon with an initial orientation, in accordance with an example embodiment;
FIG. 5B shows an example image representation of the real-construction site of FIG. 5A with selected part of BIM projected thereon with a refined exterior orientation, in accordance with an example embodiment;
FIG. 5C shows an example image representation of the real-construction site of FIG. 5B after deforming the selected part of BIM to substantially match with the edge features in the image, in accordance with an example embodiment;
FIG. 6A shows an example image representation of edge features extracted in an image of a parking lot corner with selected part of BIM projected thereon with an initial orientation, in accordance with an example embodiment;
FIG. 6B shows an example image representation of the image of FIG. 6A with selected part of BIM projected thereon with a refined exterior orientation, in accordance with an example embodiment;
FIG. 6C shows an example image representation of the image of FIG. 6B after deforming the selected part of BIM to substantially match with the edge features in the image, in accordance with an example embodiment; and
FIG. 7 shows a plotted representation of a deformed model in BIM coordinates with shading differences illustrating lower bounds of differences of the as-built structure against the BIM, in accordance with an example embodiment.
[0011] The drawings referred to in this description are not to be understood as being drawn to scale except if specifically noted, and such drawings are only exemplary in nature.
DETAILED DESCRIPTION
[0012] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure can be practiced without these specific details. In other instances, apparatuses and methods are shown in block diagram form only in order to avoid obscuring the present disclosure.
[0013] Reference in this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearance of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
[0014] Moreover, although the following description contains many specifics for the purposes of illustration, anyone skilled in the art will appreciate that many variations and/or alterations to said details are within the scope of the present disclosure. Similarly, although many of the features of the present disclosure are described in terms of each other, or in conjunction with each other, one skilled in the art will appreciate that many of these features can be provided independently of other features. Accordingly, this description of the present disclosure is set forth without any loss of generality to, and without imposing limitations upon, the present disclosure.

[0015] The terms 'as-designed structure', 'as-designed model', 'building information model', and 'BIM' are used interchangeably throughout the description, and these terms represent a 3-D modeling of any planned structure that is to be built, or already
built, or a work in progress. Further, the term 'as-built structure' refers to any actual structure that is built such that its images can be captured. Examples of the images herein include planar images and/or spherical images including planar and/or spherical panoramic images of the as-built structure. Herein, for the purposes of the description, the term 'planar image' also includes 'planar panoramic image' unless the context suggests otherwise, and the terms 'planar image' and 'planar panoramic image' are jointly referred to as 'planar image (Ip)'. Similarly, the term 'spherical image' also includes 'spherical panoramic image' unless the context suggests otherwise, and the terms 'spherical image' and 'spherical panoramic image' are jointly referred to as 'spherical image (Is)'.

[0016] FIG. 1 illustrates a block diagram of an apparatus 100 for adjusting a building information model to fit with an image, in accordance with an example embodiment of the present invention. More specifically, the apparatus 100 is configured to determine lower-bounds for deviations of the as-built structure from the as-designed model.

[0017] It is understood that the apparatus 100 as illustrated and hereinafter described is merely illustrative of an apparatus that could benefit from embodiments of the disclosure and, therefore, should not be taken to limit the scope of the disclosure. The apparatus 100 may be any computing or data processing machine, for example, a laptop computer, a tablet computer, a mobile phone, a server, and the like. It is noted that the apparatus 100 may include fewer or more components than those depicted in FIG. 1. Moreover, the apparatus 100 may be implemented as a centralized device, or, alternatively, various components of the apparatus 100 may be deployed in a distributed manner while being operatively coupled to each other.
In an embodiment, one or more components of the apparatus 100 may be implemented as a set of software layers on top of existing hardware systems.
[0018] In at least one example embodiment, the apparatus 100 includes at least one processor for example, a processor 102, and at least one memory for example, a memory 104. The memory 104 is capable of storing machine executable instructions, particularly the image processing instructions. Further, the processor 102 is capable of executing the stored machine executable instructions. The processor 102 may be embodied in a number of different ways. In an embodiment, the processor 102 may be embodied as one or more of various processing devices, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), processing circuitry with or
without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In at least one example embodiment, the processor 102 utilizes computer program code to cause the apparatus 100 to perform one or more actions responsible for adjusting a building information model of a structure to fit with an image corresponding to the as-built structure, and to calculate lower bound estimations for deviations between the as-built structure and the as-designed model (the building information model).

[0019] The memory 104 may be embodied as one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination of one or more volatile memory devices and non-volatile memory devices. For example, the memory 104 may be embodied as magnetic storage devices (such as hard disk drives, floppy disks, magnetic tapes, etc.), optical magnetic storage devices (e.g., magneto-optical disks), CD-ROM (compact disc read only memory), CD-R (compact disc recordable), CD-R/W (compact disc rewritable), DVD (Digital Versatile Disc), BD (Blu-ray® Disc), and semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).
[0020] In at least one embodiment, the apparatus 100 includes a user interface 106 (also referred to as UI 106) for providing an output and/or receiving an input. The user interface 106 is configured to be in communication with the processor 102 and the memory 104. Examples of the user interface 106 include, but are not limited to, an input interface and/or an output interface. Examples of the input interface may include, but are not limited to, a keyboard, a mouse, a joystick, a keypad, a touch screen, soft keys, a microphone, and the like. Examples of the output interface may include, but are not limited to, a display such as light emitting diode display, thin-film transistor (TFT) display, liquid crystal display, active-matrix organic light-emitting diode (AMOLED) display, a microphone, a speaker, ringers, vibrators, and the like. In an example embodiment, the processor 102 may include user interface circuitry configured to control at least some functions of one or more elements of the user interface 106, such as, for example, a speaker, a ringer, a microphone, a display, and/or the like. The processor 102 and/or the user interface circuitry may be configured to control one or more functions of the one or more elements of the user interface 106 through computer program
instructions, for example, software and/or firmware, stored in a memory, for example, the memory 104, and/or the like, accessible to the processor 102.
[0021] In an example embodiment, the apparatus 100 includes a camera module 108, for example including one or more digital cameras. The camera module 108 is configured to be in communication with the processor 102 and/or other components of the apparatus 100 to capture digital image frames, videos and/or other graphic media. The camera module 108 may include hardware and/or software necessary for taking various kinds of images, for example, planar images, spherical images, or planar panoramic or spherical panoramic images. The camera module 108 may include hardware, such as a lens and/or other optical component(s) such as one or more image sensors. Examples of one or more image sensors may include, but are not limited to, a complementary metal- oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, a backside illumination sensor (BSI) and the like. In an example embodiment, the camera module 108 may further include a processing element such as a co-processor that assists the processor 102 in processing image frame data and an encoder and/or a decoder for compressing and/or decompressing image frame data. The encoder and/or decoder may encode and/or decode according to a standard format, for example, a Joint Photographic Experts Group (JPEG) standard format.
[0022] The various components of the apparatus 100, such as components (102- 108) may communicate with each other via a centralized circuit system 110 to determine lower bounds for deviations of the as-built structure from the as-designed model. The centralized circuit system 110 may be various devices configured to, among other things, provide or enable communication between the components (102-108) of the apparatus 100. In certain embodiments, the centralized circuit system 110 may be a central printed circuit board (PCB) such as a motherboard, a main board, a system board, or a logic board. The centralized circuit system 110 may also, or alternatively, include other printed circuit assemblies (PCAs) or communication channel media.
[0023] In at least one embodiment, the memory 104 is configured to store image processing instructions for processing of the as-designed model (hereinafter also interchangeably referred to as 'building information model' or 'BIM') and the images corresponding to the as-built structure. The image processing instructions stored in the memory 104 are executable by the processor 102 for performing a method explained with reference to FIG. 2.
[0024] FIG. 2 is a flowchart depicting an example method 200 for estimation of lower bounds of deviations of the as-built structure from the as-designed model, in accordance with an example embodiment. The method 200 depicted in the flow chart may be executed by, for example, the apparatus 100 of FIG. 1. It should be noted that to facilitate discussions of the flowchart of FIG. 2, certain operations are described herein as constituting distinct steps performed in a certain order. Such implementations are examples only and non-limiting in scope. Certain operations may be grouped together and performed in a single operation, and certain operations may be performed in an order that differs from the order employed in the examples set forth herein. Moreover, certain operations of the method 200 are performed in an automated fashion. These operations involve substantially no interaction with the user. Other operations of the method 200 may be performed in a manual fashion or semi-automatic fashion. These operations involve interaction with the user via one or more user interface presentations.
[0025] At 205, the method 200 includes facilitating receipt of an image (I) of an as-built structure and an as-designed model (BIM) associated with the as-built structure. Examples of the as-built structure may include, but are not limited to, any man-made structure such as a building, a site, roads, etc., complete or partially built structures, foundations of any structure, etc. In an example embodiment, the BIM associated with the as-designed structure is a 3-D model of the intended structure that is originally planned to be built. In an example, the BIM is represented as a 3-D wire-frame model with a plurality of vertices and a plurality of BIM lines (i.e., straight lines) joining the plurality of vertices. In an example embodiment, the image (I) of the as-built structure is captured by the camera module 108 present in or otherwise accessible to the apparatus 100. In some other example embodiments, the image (I) may be prerecorded or stored in the apparatus 100, or may be received from sources external to the apparatus 100. In such example embodiments, the apparatus 100 is caused to receive the image (I) from an external storage medium such as a DVD, Compact Disc (CD), flash drive, or memory card, or from external storage locations through the Internet, Bluetooth®, and the like.
[0026] At 210, the method 200 includes determining a plurality of edge features in the image (I). The plurality of edge features includes edge lines in those cases where the image (I) is a planar image, and the plurality of edge features includes edge curves in those cases where the image (I) is a spherical image. It should be noted that straight lines of the BIM are usually visible as edge lines in the planar image (Ip) or as edge curves in
the spherical image (Is), and hence the edge features are determined from the image (I) so that the edge features and the BIM lines, after the post processing, can be utilized to compute lower bounds for the deviations of the as-built structure from the as-designed model. Various example embodiments of determination of the edge features are described later with reference to one or more of FIGS. 3A-3B to 6A-6C.
[0027] At 215, the method 200 includes performing an exterior orientation of the image (I) comprising the plurality of edge features corresponding to the as-designed model to generate an oriented image. The exterior orientation of the image (I) is found by determining correspondences between the BIM lines and the edge features of the image (I). For instance, the exterior orientation of the planar image or planar panoramic image (Ip) can be achieved by line to line correspondences of the 3-D lines of the BIM and the edge lines of the planar image (Ip). Further, the exterior orientation of the spherical image or spherical panoramic image (Is) can be achieved by estimating arc to line correspondences of the 3-D lines of the BIM and the arcs of the edge curves of the spherical image or spherical panoramic image (Is), by suitable algorithms such as modified RANSAC algorithm. Various example embodiments of the exterior orientation are described later with reference to one or more of FIGS. 3A-3B to 6A-6C.
[0028] At 220, the method 200 includes deforming the as-designed model (BIM), such that, upon projecting the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image (I). In an example embodiment, deforming the BIM comprises adjusting positions of the plurality of vertices of the BIM such that a vector from a camera projection center to an individual vertex of the plurality of vertices is perpendicular to normals of projecting planes corresponding to one or more edge features of the plurality of edge features in the image (I), wherein the one or more edge features correspond to one or more lines of the plurality of lines departing from the individual vertex. Further, at 225, the method 200 includes determining, based on the image (I), lower-bounds for deviations of the as-built structure from the BIM. In an example embodiment, the deviations are determined corresponding to the plurality of vertices of the BIM based on the deformation of the BIM (performed at operation 220). Various example embodiments of the deformation of the BIM and determination of the lower bounds for the deviations are described further with reference to one or more of FIGS. 3A-3B to 7.
[0029] The operations 205-225 of the method 200, which are performed by the apparatus 100, will be described by jointly referring to FIGS. 3A-3B to 7, in which the FIGS. 3A and 3B represent image geometry and coordinates used for the explanation of various embodiments of the operations 205-225 of the method 200. [0030] Referring to FIG. 3A, an example representation 300 of the object coordinate system and camera coordinate system corresponding to the planar image (Ip) is illustrated, and referring to FIG. 3B, an example representation 350 of the object coordinate system and camera coordinate system corresponding to the spherical image (Is) is illustrated. In both FIGS. 3A and 3B, the rectangular object coordinate system XB, YB, ZB is aligned with the BIM so that the origin of the object coordinate system (see 310 in FIG. 3A and 360 in FIG. 3B) is at a vertex of a rectangular corner of the BIM, and the coordinate axes XB, YB, ZB (see 302, 304 and 306 in FIG. 3A, and 352, 354 and 356 in FIG. 3B) are aligned with the BIM lines emanating from that vertex. The camera coordinates XC, YC, ZC are related to the object coordinates by rC = R(rB − t), where rB and rC are the positions of a point in the object and camera coordinate systems, respectively. Herein, the exterior orientation of the camera is given by a rotation matrix R and a translation vector t.
[0031] As shown in FIG. 3A, in the case of the planar image (Ip), the object point is projected onto the image (I) by the well-known equations of perspective projection (e.g., the projection of points PC and QC to points p and q, respectively). Further, as shown in FIG. 3B, in the case of the spherical image (Is), the object point is projected onto the image sphere by r = c rC/|rC|, where c is the focal length of the camera. The coordinates of r = [x y z]T are related to the spherical coordinates θ, φ as per the following expression (1):
x = −c cos θ sin φ, y = c sin θ, z = −c cos θ cos φ (1)
where φ is the rotation angle (azimuth) around the y-axis counted from the negative z-axis, and θ is the rotation angle (elevation) around the once rotated x-axis.
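By way of a non-limiting numerical illustration, the mapping between camera coordinates and the image sphere may be sketched as follows (Python/NumPy; the function names, the unit focal length default, and the exact sign conventions are illustrative assumptions consistent with the angle definitions above, not part of the disclosure):

```python
import numpy as np

def project_to_sphere(r_c, c=1.0):
    """Project a camera-coordinate point onto the image sphere: r = c * rC / |rC|."""
    r_c = np.asarray(r_c, dtype=float)
    return c * r_c / np.linalg.norm(r_c)

def angles_to_sphere(theta, phi, c=1.0):
    """Expression (1): azimuth phi counted from the negative z-axis around y,
    elevation theta around the once rotated x-axis."""
    return np.array([-c * np.cos(theta) * np.sin(phi),
                      c * np.sin(theta),
                     -c * np.cos(theta) * np.cos(phi)])

def sphere_to_angles(r, c=1.0):
    """Recover (theta, phi) from a point on the image sphere."""
    x, y, z = r
    theta = np.arcsin(y / c)
    phi = np.arctan2(-x, -z)   # azimuth measured from the negative z-axis
    return theta, phi
```

A round trip through the two mappings recovers the original angles, which provides a quick consistency check on the chosen sign conventions.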
[0032] The processor 102 is configured to, with the image processing instructions stored in the memory 104, and optionally with other components described herein, cause the apparatus 100 to perform the operation 210 of the method 200. For example, the apparatus 100 is caused to determine a plurality of edge features in the image (I). In
the embodiment of image (I) being the planar image (Ip), the examples of the edge features are edge lines; and in the embodiments of the image (I) being the spherical image (Is), the examples of the edge features are edge curves.
[0033] In the embodiments of the image (I) being the planar image (Ip), the edge lines can be extracted by any well-known method, such as the Hough transform, from edge pixels detected by any well-known method, such as the Canny algorithm. In an example embodiment of the image (I) being the spherical image (Is), the edge curves can be extracted as per the flow diagram illustrated in FIG. 4.
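For the planar case, the Hough voting step mentioned above can be sketched in a few lines (a didactic NumPy-only sketch of the voting step only; production code would typically rely on library routines such as OpenCV's Canny and Hough implementations, and the resolution parameters here are arbitrary choices):

```python
import numpy as np

def hough_lines(edge_pixels, n_theta=180, rho_res=1.0):
    """Minimal Hough transform: each edge pixel votes for all (rho, theta)
    pairs satisfying rho = x cos(theta) + y sin(theta); the strongest
    accumulator cell gives the dominant edge line."""
    edge_pixels = np.asarray(edge_pixels)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    xs, ys = edge_pixels[:, 0], edge_pixels[:, 1]
    rho_max = np.hypot(xs.max(), ys.max()) + 1.0
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in edge_pixels:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + rho_max) / rho_res).astype(int)
        acc[idx, np.arange(n_theta)] += 1   # one vote per theta bin
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return i * rho_res - rho_max, thetas[j]
```

For example, 50 edge pixels on the vertical line x = 10 yield (rho, theta) close to (10, 0) in this parameterization.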
[0034] Referring now to FIG. 4, a method 400 illustrates a flow diagram for extracting the edge features (edge curves) from the spherical image (Is), in accordance with an example embodiment. In the case of a spherical image (Is), a straight line in the object space is projected to an arc of a circle on the image sphere of the spherical image (Is). The arc is defined as an intersection of the image sphere with the projecting plane having a unit normal defined as per the expression NC = (PC × QC)/||PC × QC||, and containing the projection center (see FIG. 3B). It is noted that in spherical coordinates, the arc is a curve with a slowly varying tangent vector.
[0035] At 405, the method 400 includes detecting Canny edge pixels in the spherical image (Is) using the Canny algorithm as set forth in J. Canny, "A Computational Approach to Edge Detection," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 8, no. 6, pp. 679-698, 1986. At 410, the method 400 includes grouping the Canny edge pixels into a plurality of connected components based on 8-connectivity of neighboring pixels. It is noted that each connected component may contain pixels from several BIM lines.
[0036] At 415, the method 400 includes selecting a connected component of the plurality of connected components. Further, at 420, the method 400 includes dividing the connected component into one or more segments and merging the compatible segments using a region growing process. In an example embodiment, for the operation 420, the method 400 includes estimating, robustly at each edge pixel, the angle of the normal of a local edge line and the signed distance of the local edge line from the image origin. In an example, estimation of the angle and the signed distance includes computing a set of lines through two points, i.e. each edge pixel in a local neighborhood (of size M x M pixels) and the pixel in question. Further, the method 400 includes selecting the best line, i.e. the line with the largest number of other edge pixels closer than a threshold Te to it. The line parameters are refined by fitting a straight line to all edge pixels within the neighborhood that are closer than Te to the best line.
[0037] In an example embodiment, for the operation 420, the method 400 further includes performing region growing along edge pixels, with similarity of the local line direction and signed distance from the origin as criteria for adding the next pixel to the segment of previous pixels. Since the edge is curved, the local line parameters of the next pixel are compared to the moving average of line parameters of a pre-determined number of previously added pixels. In an example, the pre-determined number of previously added pixels may be 20 pixels. In an example embodiment, the line directions are similar if the normal angles differ less than a threshold Ta, and the signed distances from the origin are similar if they differ less than D max(|cos(α + Ta) − cos α|, 1 − cos Ta), where D is the distance of the moving average from the origin and α is the normal angle of the moving average. In an example embodiment, if there are several neighboring edge pixels of the current pixel, then the most compatible of them is selected and the others are stored to be considered later, with appropriate moving averages of line parameters, once the branch of the selected pixel has been tracked to the end. In an example embodiment, appropriate care is also paid to vertical lines, where the normal angle of the line has a discontinuity of π radians and the signed distance from the origin changes to the opposite sign with the same absolute value. If the next pixel is not compatible with the region grown by then, a new segment is started. In this manner, the connected component is divided into segments.
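The pixel-compatibility test used by the region growing described above may be sketched as follows (illustrative Python; the parameter names are assumptions, and the vertical-line wraparound handling mentioned in the paragraph is omitted for brevity):

```python
import numpy as np

def compatible(angle, dist, avg_angle, avg_dist, T_a):
    """Test whether an edge pixel's local line (normal angle, signed distance
    from the origin) is compatible with the moving average of the current
    segment. Note: when the moving average passes through the origin (D = 0)
    the distance tolerance collapses to zero in this literal reading."""
    if abs(angle - avg_angle) >= T_a:
        return False                       # normal angles too different
    D = abs(avg_dist)                      # distance of moving average from origin
    a = avg_angle
    tol = D * max(abs(np.cos(a + T_a) - np.cos(a)), 1.0 - np.cos(T_a))
    return abs(dist - avg_dist) < tol
```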
[0038] At 425, the method 400 includes forming one or more arcs corresponding to one or more segments of the connected component using edge boundary pixels corresponding to the one or more segments. For instance, each segment of the connected component is selected, and for the pixels of each segment, the coordinates on the image sphere are computed according to expression (1). Further, a plane is then fitted to these points, with the origin of the sphere added to the point set the same number of times as there are points on the image sphere. It is noted that the unit normal n of the fitted plane represents the projecting plane computed from the image measurements (see FIG. 3B). In an example embodiment, the end points (i.e. edge boundary pixels) p and q of an arc of a circle are determined by projecting the points on the sphere to the fitted plane, computing unit vectors from the projection center toward the projected points, selecting
one vector as a reference and computing angles of other vectors with respect to the reference, choosing the vectors of smallest and largest angle (including possibly the reference), and scaling them to the circle.
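The plane fitting and arc end-point selection of this operation may be sketched as follows (illustrative NumPy; for simplicity the plane is constrained through the origin directly, via the smallest right singular vector, rather than by replicating the origin in the point set as described above):

```python
import numpy as np

def arc_from_points(pts, c=1.0):
    """Fit a projecting plane through the origin to points on the image sphere
    and return (plane normal, arc end point p, arc end point q)."""
    pts = np.asarray(pts, dtype=float)
    # Smallest right singular vector minimizes sum of squared point-plane distances
    # for a plane through the origin.
    _, _, vt = np.linalg.svd(pts)
    n = vt[-1]
    # Orthogonally project points onto the plane and renormalize onto the circle.
    proj = pts - np.outer(pts @ n, n)
    units = proj / np.linalg.norm(proj, axis=1, keepdims=True)
    ref = units[0]
    # Signed angle of each unit vector relative to the reference, about n.
    sin_a = units @ np.cross(n, ref)
    cos_a = units @ ref
    ang = np.arctan2(sin_a, cos_a)
    p = c * units[np.argmin(ang)]   # extreme angles give the arc end points
    q = c * units[np.argmax(ang)]
    return n, p, q
```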
[0039] At 430, the method 400 checks whether all of the connected components are processed (i.e. the segments are formed, and thereby arcs are also formed). If all connected components are processed, the method 400 proceeds to 435, otherwise the method 400 goes back to 415, and a next connected component is selected for processing with the operations 420 and 425.
[0040] Once all connected components are processed, at 435, compatible arcs of circles, which satisfy an overlap criterion, are merged one by one if the unit normals of their projecting planes are close to each other or point in near opposite directions. For instance, if the angle between the normals of the projecting planes of two arcs is below a threshold Tβ or above π − Tβ, the two arcs are merged. In an example, the arcs are defined to satisfy an overlap criterion, for example, whether the projected and scaled arcs overlap with each other or are apart from each other by not greater than a tolerance T0. Herein, the projected and scaled arcs are obtained by orthogonally projecting the end point vectors of the arcs to a new projecting plane and scaling the projected vectors so that their end points are on the image sphere. In an example, the new projecting plane is computed with a new unit normal averaged from the normals of the old planes, with the lengths of the old arcs divided by the focal length as weights. The end points of the new merged arc are given by those projected and radially scaled old end point vectors which have the smallest and largest angle with respect to one end point vector selected as a reference (or the reference itself).
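The merge test and the weighted averaging of normals described in this paragraph may be sketched as follows (illustrative Python; the function and variable names are assumptions, and the overlap criterion itself is omitted):

```python
import numpy as np

def normals_mergeable(n1, n2, T_beta):
    """Two arcs are merge candidates if the angle between the unit normals of
    their projecting planes is below T_beta or above pi - T_beta, i.e. the
    normals are nearly parallel or nearly anti-parallel."""
    return abs(float(np.dot(n1, n2))) > np.cos(T_beta)

def merged_normal(n1, n2, len1, len2, c):
    """New projecting-plane normal: average of the old unit normals weighted
    by arc length divided by focal length c (n2 flipped if anti-parallel)."""
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    if np.dot(n1, n2) < 0:
        n2 = -n2
    n = (len1 / c) * n1 + (len2 / c) * n2
    return n / np.linalg.norm(n)
```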
[0041] The result of extraction of edge curves from a spherical panoramic image (Is) of a real construction site is illustrated in FIG. 5A. In this example representation 500, the extracted edge curves that are longer than a pre-determined number of pixels (e.g., 200 pixels) are shown by reference numerals 505. Further, a selected part of the BIM projected on the spherical panoramic image (Is) according to an initial orientation is shown by reference numerals 510. The image of the real construction site shown in FIG. 5A may be constructed by stitching together a plurality of sub-images taken of the site. It should be understood that only some structures such as lacunas 520, 525, conduit elements 530 and elevator shaft 535 are exemplarily represented for the sake of description. Further, only some selected part of the BIM (see, 510) is projected on the
spherical image (Is) by manually setting an initial orientation between the spherical image (Is) and the BIM.
[0042] In another representation, result of extraction of edge curves from another spherical image (an image of a corner of a parking hall) is illustrated in FIG. 6A. In this example representation 600, the extracted edge curves that are longer than 200 pixels are shown by reference numerals 605. Further, a selected part of the BIM projected on the spherical image (Is) according to an initial orientation is shown by reference numerals 610.
[0043] Some example embodiments of performing the operation 215 of the method 200 are explained in the following description. For example, the processor 102 is configured to, with the image processing instructions stored in the memory 104, and optionally with other components described herein, cause the apparatus 100 to perform an exterior orientation of the image (I) comprising the plurality of edge features corresponding to the as-designed model (BIM) to generate an oriented image. In the embodiments of the image being the planar image (Ip), the exterior orientation of the planar image (Ip) can be performed from line to line correspondences by any suitable method known in the art.
[0044] In the embodiments of the image (I) being the spherical image (Is), the exterior orientation of the spherical image (Is) or panoramic image is solved using determination of arc (e.g., arcs of Is) to line (e.g., BIM lines) correspondences. In an example embodiment, a modified RANSAC algorithm is used to determine the correspondences between the 3-D lines of the BIM and arcs of circles on the image sphere corresponding to the spherical image (Is). In this example embodiment, several combinations of the correspondences are tested between the 3-D lines of the BIM and arcs of circles on the image sphere corresponding to the spherical image (Is), and an optimum correspondence from the several correspondences is determined and is used to compute the exterior orientation.
[0045] In an example embodiment, for each RANSAC iteration, two or more pairs of BIM lines (i.e., S ≥ 2 pairs of BIM lines) are first randomly selected with the requirement that the two lines of each pair share the same vertex of the BIM as one of the end points of the lines. The pairs of lines constitute a set of K distinct lines, where 3 ≤ K ≤ 2S. Further, in an example embodiment, unit normals Nk, k = 1,..., K, of their projecting
planes are computed. Furthermore, differences between the directions of the normals, parameterized by two direction angles Ωk, are also computed.
[0046] In an example embodiment, an arc with a projecting plane normal n1 is then randomly selected from the image sphere. Thereafter, arcs for k = 2,..., K, are also randomly and sequentially chosen from subsets of all the arcs. Each subset consists of arcs, the direction angles of the projecting plane normals of which differ less than a threshold Tω from the direction angles of a normal approximated from the direction angles of the normals of the previously selected arcs and the 3-D lines of the BIM. This approximated normal is obtained by the condition that the differences between the projecting plane normals of the arcs are similar to the differences between the projecting plane normals of the 3-D lines of the BIM (also referred to as '3-D BIM lines'). More precisely, the direction angle ωk of the approximated normal is given by the following expression (2): ωk = Σi=1,...,k−1 (ωi + Ωk − Ωi)/(k − 1) (2)
for k = 2,...,K, and same for the other direction angle.
[0047] In an example embodiment, given the set of K arc to line correspondences, the rotation R of the spherical image is solved using the normals and the direction vectors Lk of the 3-D BIM lines by applying adaptive weighting to the technique set forth in Y. Liu, T.S. Huang, and O.D. Faugeras, "Determination of Camera Location from 2-D to 3-D Line and Point Correspondences," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, no. 1, pp. 28-37, 1990. In an example embodiment, the merit function to be minimized with respect to the rotation matrix, parameterized by three angles, is as per the following expression (3):
f1 = Σk=1,...,K wk (nkT R Lk)^2 (3)

where wk are weights. In an example embodiment, the minimization of the merit function (f1) is solved using the Levenberg-Marquardt method, where an initial estimate for the rotation can be provided manually.
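The weighted minimization over the three rotation angles may be sketched with a small Levenberg-Marquardt loop as follows (illustrative NumPy; the Z-Y-X Euler parameterization, the damping schedule, and the finite-difference Jacobian are assumptions, since the disclosure only specifies three angles and the Levenberg-Marquardt method). Each rotated BIM-line direction Lk should lie in the projecting plane of its corresponding arc, i.e. be perpendicular to the arc's plane normal nk:

```python
import numpy as np

def rot_matrix(angles):
    """Rotation from three angles (Z-Y-X Euler; the parameterization is an assumption)."""
    a, b, g = angles
    Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(g), -np.sin(g)], [0, np.sin(g), np.cos(g)]])
    return Rz @ Ry @ Rx

def estimate_rotation(normals, directions, weights, x0, iters=100):
    """Levenberg-Marquardt minimization of sum_k wk (nk^T R Lk)^2."""
    x = np.asarray(x0, dtype=float)
    sw = np.sqrt(np.asarray(weights, dtype=float))

    def res(a):
        # Weighted residuals nk . (R Lk), one per correspondence.
        return sw * np.einsum('ki,ij,kj->k', normals, rot_matrix(a), directions)

    lam, h = 1e-3, 1e-7
    for _ in range(iters):
        r = res(x)
        J = np.stack([(res(x + h * e) - r) / h for e in np.eye(3)], axis=1)
        step = np.linalg.solve(J.T @ J + lam * np.eye(3), -J.T @ r)
        if np.linalg.norm(res(x + step)) < np.linalg.norm(r):
            x, lam = x + step, lam * 0.5   # accept step, relax damping
        else:
            lam *= 10.0                    # reject step, increase damping
    return x
```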
[0048] In an embodiment, in the expression (3), the weights wk are equal to one at correspondences where the mean distance from the end points of the arc to the projecting plane of the transformed BIM line is below an adaptive threshold. Herein, it is noted that the transformation of the BIM line means transformation to the camera coordinate system according to the current estimate for the rotation and translation. In an
example embodiment, the adaptive threshold is chosen based on the mean and standard deviation of the arc to projecting plane distances at each of the K correspondences. The adaptive threshold gets tighter as the iteration proceeds, similarly to the scheme set forth in Z. Zhang, "Iterative Point Matching for Registration of Free-Form Curves and Surfaces," International Journal of Computer Vision, vol. 13, no. 2, pp. 119-152, 1994, for registration of curves and surfaces.
[0049] Further, in the expression (3), the weights wk are equal to zero at correspondences where the arc to projecting plane distances are larger than or equal to the adaptive threshold. It is noted that such a weighting scheme removes false correspondences and uses only reliable ones. In some example embodiments, however, the weighting can be ignored when the value of K is small, as it has more influence when K is large.
[0050] In an example embodiment, for the estimation of the translation t, two or more intersections of 3-D BIM lines and the intersections of the circles of the corresponding arcs on the image sphere corresponding to the spherical image (Is) are taken. In an example representation, let Vkl denote the vertex where the BIM lines k and l of one of the selected S pairs of 3-D BIM lines intersect. The corresponding arcs in the image sphere need not intersect, but the intersections of the circles, which the arcs are part of, are used. The circles intersect at two points given by ±c(nk × nl)/||nk × nl||. These two vectors are rotated by R−1, and the one is selected for which the direction of the rotated vector is closer to Vkl. Let this intersection point be denoted by vkl (before rotation). The image point and the object point lie on the same image ray. Accordingly, in an example embodiment, the translation t may be represented as per the following expression (4): μkl vkl = R(Vkl − t) (4)
where the real number μkl and the three components of t are unknown. It is noted that in the scenario of S intersection correspondences, the number of equations is 3S with 3 + S unknowns, and hence the translation is solvable for S ≥ 2.
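Since each correspondence contributes one ray condition of the form μ v = R(V − t), which follows from rC = R(rB − t), the translation can be recovered from a single linear least-squares system, as the following sketch shows (illustrative NumPy; the stacking of the unknowns [t; μ1..μS] is an implementation choice, not specified by the disclosure):

```python
import numpy as np

def estimate_translation(R, V, v):
    """Stack mu_s * v_s = R (V_s - t), s = 1..S, as A [t; mu] = b with A of
    shape (3S, 3+S) and solve by least squares. V holds the intersection
    vertices (object coordinates), v the corresponding image-ray vectors
    (camera coordinates). Returns (t, mu)."""
    S = V.shape[0]
    A = np.zeros((3 * S, 3 + S))
    b = np.zeros(3 * S)
    for s in range(S):
        A[3*s:3*s+3, :3] = R          # R t term
        A[3*s:3*s+3, 3 + s] = v[s]    # mu_s v_s term
        b[3*s:3*s+3] = R @ V[s]       # R V_s
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]
```

With S = 3 generic correspondences the system has 9 equations and 6 unknowns, so the translation is overdetermined and the residual additionally indicates the consistency of the correspondences.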
[0051] Using the estimated rotation and translation, the 3-D lines of the BIM are transformed from the object to the camera coordinate system. The projecting plane normals NCk in the camera coordinates and the end points of the respective arcs on the image sphere are computed. In an example embodiment, a merit function evaluating the quality of the selected set of arc to BIM line correspondences is given by the following expression (5):
f2 = Σk=1,...,K |nkT NCk| / K (5)

provided the arcs related to nk and NCk satisfy the overlap criterion strictly with T0 = 0 for all k = 1,..., K (with the new projecting plane being the same as the projecting plane defined by NCk). [0052] Additionally, in an example embodiment, the quality of the estimated orientation is also evaluated by the number of other arc to line correspondences that appear besides those K correspondences which are used for the orientation estimation. In this example embodiment, the other 3-D lines of the BIM are transformed to the camera coordinate system, and the projecting plane normals NCp and the respective arcs Ap on the image sphere are computed. Further, for each arc Ap, those arcs Du of the image sphere, the projecting plane normals nu of which are close to NCp and which satisfy the overlap criterion with Ap strictly with T0 = 0, are searched. Herein, nu and NCp are close to each other if the angle between them is less than a threshold Tγ or larger than π − Tγ. In an example embodiment, the closest arc Dq may be chosen as a candidate arc to be considered to correspond to Ap and the respective 3-D line of the BIM; the closest arc Dq should satisfy the condition as per the following expression (6): q = arg maxu |nuT NCp| (6)
[0053] In some scenarios, the closest arc according to expression (6) may not always be the best solution, especially when there are several arcs close to each other or the arcs represent different parts of the same edge line. Consequently, in an example embodiment, it is determined whether there are other arcs Dq2 close to Dq (e.g., the mean distances of the end points of Dq from the projecting planes of the Dq2 are less than a threshold Tκ), which have been previously matched with some other arcs Ap2 computed from the BIM, such that the projecting plane normals of the Ap2 are close to that of Ap (i.e. the angle between the normals is less than a threshold Tμ or larger than π − Tμ), and the Ap2 satisfy the overlap criterion with Ap strictly with T0 = 0. In an example embodiment, if any of the Ap2 corresponds to one of the K BIM lines used for the orientation estimation, then Dq is deleted from consideration. Otherwise, the arc Ap3 of the arcs Ap2 which is closest to Dq (i.e. for which the absolute value of the dot product between the normals of the projecting planes is maximized) is searched. Consequently, it is studied which one is closer, Dq to Ap, or Dq3 to Ap3. Herein, Dq3 is the arc of the image sphere corresponding to the BIM line from which Ap3 has been computed. Further, the closeness between Dq and Ap, and between Dq3 and Ap3, is measured by the mean distance of the end points of Dq and Dq3 from the projecting planes of Ap and Ap3, respectively. The closer of these two alternatives is kept for consideration and the other is deleted (or ignored). If Dq is deleted from consideration, then the second closest arc according to expression (6) is considered, and the process is iterated until a compatible arc can be found. Further, if Dq3 is deleted, then the arc Ap3 is processed again to find another arc of the image sphere which yields a better overall solution. These considerations thus take into account the neighboring arc to line correspondences when establishing a new one. As a result, the algorithm is able to find the correct arc instead of mixing it up with another arc close to the correct one. It would be appreciated by those skilled in the art that the result of the algorithm does not depend on the order in which the BIM lines are processed.
[0054] In an example embodiment, the best orientation estimate is considered to be obtained when the total number of arc to line correspondences (including the other ones as per expression (6) and as per the further considerations of paragraph [0053], i.e. the paragraph following expression (6)) is maximized, and, within the maximum number of correspondences, the orientation which maximizes the merit function f2 of expression (5) is selected.
[0055] The result of the exterior orientation of the spherical panoramic image (Is) is illustrated in an example representation 540 of FIG. 5B and in an example representation 625 of FIG. 6B. Referring particularly to the example representation 540 of FIG. 5B, after the exterior orientation of the spherical image (Is) is performed, the BIM (see 510) projected onto the spherical image (Is) fits well with the edge curves (see 505), except along the left border of the lacuna 520 (see region 522) and along the side of the lacuna 525 near the conduit elements 530 (see region 528). [0056] Some example embodiments of performing the operations 220 and 225 of the method 200 are explained in the following description. For example, the processor 102 is configured to, with the image processing instructions stored in the memory 104, and optionally with other components described herein, cause the apparatus 100 to perform the operations 220 and 225. For instance, the apparatus 100 is caused to deform the as-designed model, such that, upon projecting the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image, and the apparatus 100 is further caused to determine lower-bounds for deviations of the as-built structure from the
as-designed model, based on the image (I). In an example embodiment, the processor 102 is configured to determine the deviations corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
[0057] In an example representation, an edge line extracted from a planar image (Ip) and an edge curve or an arc of an image sphere extracted from a spherical image (Is) can be jointly denoted as a 2-D feature of the image (I). In an example, it may be assumed that a total of K 2-D feature to 3-D BIM line correspondences, which yield the optimal exterior orientation, and a total of K' other feature to line correspondences using the optimal orientation, are established (e.g., determined during the operation 215). It should be noted that these K' other correspondences are obtained for the spherical image (Is) as described above in paragraphs [0052] and [0053] (i.e. the paragraph including expression (6) and the following paragraph), and for the planar image (Ip), the K' other correspondences can be obtained using the same procedure by replacing the term 'arc' with 'line' and the term 'image sphere' with 'image plane'. The positions of the vertices PBh, where h = 1,...,H, of the BIM (in BIM coordinates) are adjusted so that the K + K' BIM lines fit optimally with the corresponding edge features of the image (I).
[0058] In an example embodiment, for each vertex PBh, let Uh be the set of indices of features that correspond to 3-D BIM lines that share PBh as one of their end points. In other words, Uh groups the features corresponding to BIM lines that depart from the vertex PBh. In an example embodiment, after transforming to camera coordinates, the vector to each adjusted vertex PBh + ΔPBh should be perpendicular to the unit normals of the projecting planes of all features belonging to Uh. In an example embodiment, a merit function to be minimized, with respect to ΔPBh, h = 1,...,H, can be as per the following expression (7):
f3 = Σh=1..H Σk∈Uh wk (nk · PCh)²    (7),

where nk denotes the unit normal of the projecting plane of the feature with index k. In the above expression (7), PCh are the vertices of the BIM in the camera coordinates and wk are weights, and PCh can be given as per the following expression (8): PCh = R (PBh + ΔPBh − t)    (8).
Each weight (wk) is proportional to the length of overlap between the image feature and the feature computed from the corresponding BIM line when both are transformed onto the same circle on the image sphere, similarly to the overlap criterion used in the case of the spherical image (Is), and correspondingly for the planar image (Ip).
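Expressions (7) and (8) can be evaluated numerically as in the sketch below. This is a hedged illustration assuming the merit function sums weighted squared dot products between feature-plane unit normals and the transformed vertex positions; the function names and data layout are invented for the example.

```python
import numpy as np

def camera_coords(R, t, P_B, dP_B):
    # Expression (8): transform an adjusted BIM vertex to camera coordinates.
    return R @ (P_B + dP_B - t)

def merit(R, t, vertices_B, deltas, U, normals, weights):
    """Weighted sum of squared distances of adjusted vertices from the
    projecting planes of the features in each set U[h] (expression (7))."""
    total = 0.0
    for h, (P_B, dP) in enumerate(zip(vertices_B, deltas)):
        P_C = camera_coords(R, t, P_B, dP)
        for k in U[h]:
            total += weights[k] * float(normals[k] @ P_C) ** 2
    return total

R = np.eye(3)            # identity rotation, camera at the origin
t = np.zeros(3)
vertices = [np.array([1.0, 0.0, 0.0])]
deltas = [np.zeros(3)]
U = [[0]]                # vertex 0 is shared by one feature (index 0)
normals = [np.array([1.0, 0.0, 0.0])]
weights = [1.0]
print(merit(R, t, vertices, deltas, U, normals, weights))  # 1.0
```

With the vertex lying in the feature's projecting plane (normal perpendicular to the vertex vector), the merit contribution drops to zero, which is exactly the perpendicularity condition stated in paragraph [0058].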
[0059] The set Uh typically contains zero to three features. In an example embodiment, if there are two or more features in Uh, then all three coordinates of ΔPBh are regarded as unknown. In an example embodiment, if there is only one feature in the set Uh, then the position of PBh is kept fixed in the direction of the 3-D BIM line corresponding to the feature in question and changes are allowed only perpendicular to the BIM line direction. Due to the selection of the BIM coordinate system, each line of the BIM is usually parallel to one of the coordinate axes. It should be noted that fixing the movement in the direction of the BIM line then amounts to fixing one coordinate of PBh. In this example embodiment, for BIM lines that are non-parallel to any of the coordinate axes, constraint equations Lk · ΔPBh = 0, where Lk is the direction vector of the line, are introduced. In an example embodiment, if the set Uh is empty, all the coordinates of the vertex PBh are kept fixed.

[0060] Since all the projecting planes of features in the set Uh include the camera projection center and the vertex, the solution for the vertex position is not unique: all points along the ray from the camera to the vertex are valid solutions when there are two or more features in the set Uh. The same holds if there is only one feature in the set Uh, except that the degeneracy is along the intersection of the projecting plane and the plane perpendicular to the BIM line. Consequently, an additional constraint (PBh − t)T ΔPBh = 0 is introduced, which forces the correction to the vertex position to be perpendicular to the ray from the camera to the original vertex position. This latter constraint ensures a unique solution that has approximately a minimum norm (e.g., the point on the image ray from the camera to the adjusted vertex that is closest to PBh would be the exact minimum-norm solution).
In an example embodiment, the vertex position is thus corrected by a vector of minimum length, so that the distance between the adjusted vertex position and the original vertex position gives a lower bound for the magnitude of the deformation.
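The constraint (PBh − t)T ΔPBh = 0 can be illustrated by removing the component of a candidate correction along the camera-to-vertex ray, as in the following sketch. The names and inputs are hypothetical, and the exact solver in the text imposes the constraint via Lagrange multipliers rather than by projection as done here.

```python
import numpy as np

def constrain_to_ray_perpendicular(dP, P_B, t):
    """Remove the component of the correction dP along the ray from the
    camera center t to the vertex P_B, enforcing (P_B - t) . dP = 0.
    The norm of the result is a lower bound for the deformation magnitude."""
    ray = P_B - t
    ray = ray / np.linalg.norm(ray)
    dP_perp = dP - (dP @ ray) * ray
    return dP_perp, float(np.linalg.norm(dP_perp))

dP_perp, bound = constrain_to_ray_perpendicular(
    np.array([1.0, 0.0, 1.0]),   # candidate correction
    np.array([0.0, 0.0, 2.0]),   # original vertex position
    np.zeros(3))                 # camera projection center
print(dP_perp, bound)  # [1. 0. 0.] 1.0
```

The along-ray component of the correction is unobservable from a single image, which is why only the perpendicular part, and hence only a lower bound on the deformation, can be recovered.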
[0061] In an example embodiment, the apparatus 100 is caused to solve the minimization of the merit function in expressions (7) and (8) using the Levenberg-Marquardt algorithm with Lagrange multipliers and zero values as initial estimates for ΔPBh, h = 1,...,H. In this example embodiment, the other feature to line correspondences for which one or both end points of the 3-D line have moved more than a threshold Tp (i.e., ||ΔPBh|| ≥ Tp) are removed from consideration and the corresponding deformations are set to zero. Thereafter, the Levenberg-Marquardt algorithm with Lagrange multipliers is applied again with zero values as initial estimates for a new set of unknown coordinates ΔPBh determined based on the remaining feature to line correspondences. In an example embodiment, the two-step approach thus uses information from the 3-D object space to eliminate false feature to line correspondences. It is noted that when the scene contains 3-D lines at various depths, it is difficult to set an appropriate value for Tγ, which describes the closeness of the normals of the projecting planes of the image feature and the 3-D line. However, the threshold Tp gives a depth-invariant criterion, which can be set based on knowledge about how much the vertices are expected to be misplaced at most. In an example embodiment, the solution with the reduced number of unknowns gives the deformation of the BIM, where the magnitudes of the deformations represent the lower-bounds for deviations against the as-designed BIM at each vertex of the BIM.
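The two-step scheme of paragraph [0061] — solve, drop the correspondences whose vertex moves reach the threshold Tp, then re-solve with the reduced set — can be outlined as below. The `solve` callback stands in for the Levenberg-Marquardt solver with Lagrange multipliers and is an assumption of this sketch, as is the correspondence data layout.

```python
import numpy as np

def two_step_adjustment(solve, correspondences, T_p):
    """First solve with all correspondences, then discard those whose BIM
    line end points moved by at least T_p (||dP_Bh|| >= T_p), and re-solve
    with the reduced set; dropped vertices keep zero deformation."""
    deltas = solve(correspondences)
    kept = [c for c in correspondences
            if all(np.linalg.norm(deltas.get(h, np.zeros(3))) < T_p
                   for h in c["vertex_indices"])]
    return solve(kept), kept

# A mock solver returning fixed corrections: vertex 1 moves too far, so the
# correspondence using it is removed before the second solve.
fixed = {0: np.array([0.05, 0.0, 0.0]), 1: np.array([1.0, 0.0, 0.0])}
solve = lambda corrs: {h: fixed[h] for c in corrs for h in c["vertex_indices"]}
final, kept = two_step_adjustment(
    solve, [{"vertex_indices": [0]}, {"vertex_indices": [1]}], T_p=0.5)
print(len(kept), sorted(final))  # 1 [0]
```

Because the test is on vertex displacement in object space rather than on plane-normal closeness, it behaves the same for near and distant lines, matching the depth-invariance claim in the text.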
[0062] The result of BIM adjustment for the spherical panoramic image (Is) is illustrated in an example representation 560 of FIG. 5C and in an example representation 650 of FIG. 6C. Referring particularly to FIG. 5C, after the exterior orientation process, the BIM is deformed so that the BIM lines substantially fit the corresponding edge curves of the lacuna 520. For example, the region 522 shown in FIG. 5B is adjusted and is hence not visible in FIG. 5C. In this example representation 560, after adjusting the BIM, all the projected BIM lines of established arc to line correspondences match the edge curves of the spherical image (Is). Further, FIG. 7 shows the deformed model in the BIM coordinates (XB, YB, ZB) with shading differences illustrating differences against the as-designed BIM.
[0063] It should be noted that while describing various example embodiments, particularly for the spherical image (Is), one or more algorithms include several thresholds which determine when two quantities are considered to be close to each other. It should, however, be noted that the determination of appropriate values for the parameters and thresholds depends on the scene content, such as the straightness of edge lines in the object space (M, Te, Ta), the resolution one wants to achieve by not merging adjacent edge curves (Tβ, T0), the closeness of the initial orientation to the true one (Tω), the resolution needed to separate arcs close to each other (Tκ, Tμ), and the expected magnitude of BIM deformation (Tγ, Tp). Further, increasing the number K of feature to line correspondences may help to find the correct exterior orientation, although a too large K may incorrectly distribute possible deformations along some lines also to other lines which are actually correct. On the other hand, the disclosed adaptive weighting in the rotation estimation is intended to cope with these cases.
[0064] Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is to provide methods for determining lower-bounds for deviations of the as-built structure from the as-designed model. Various example embodiments are capable of working with a single image (e.g., a planar or a spherical image) and are yet able to derive 3-D deformation information. The lower bound obtained can be applied to determine deviations that exceed tolerances and require further inspection. Various example embodiments operate on planar as well as spherical images stitched from one or several concentric sub-images. It is noted that spherical panoramic images contain features from the entire surroundings of the setup and may thus be more accurately oriented with the BIM. As the orientations of the spherical images are solved by applying the concept of edge-based methods, the computational complexity and overhead are reduced significantly.
[0065] The present disclosure is described above with reference to block diagrams and flowchart illustrations of a method and device embodying the present disclosure. It will be understood that various blocks of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, may be implemented by a set of computer program instructions. These sets of instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the set of instructions, when executed on the computer or other programmable data processing apparatus, creates means for implementing the functions specified in the flowchart block or blocks. Other means for implementing the functions, including various combinations of hardware, firmware and software as described herein, may also be employed.
[0066] Various embodiments described above may be implemented in software, hardware, application logic or a combination of software, hardware and application logic. The software, application logic and/or hardware may reside on at least one memory, at least one processor, an apparatus, or a non-transitory computer program product. In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any non-transitory media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of such a system described and depicted in FIG. 1. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
[0067] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the present disclosure and its practical application, to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such are intended to cover the application or implementation without departing from the spirit or scope of the claims of the present disclosure.
Claims
What is claimed is: 1. A method, comprising:
facilitating receipt of:
an image of an as-built structure, the image being a two-dimensional (2-D) image, and
an as-designed model associated with the as-built structure, the as-designed model being a three-dimensional (3-D) model comprising a plurality of vertices and a plurality of lines connecting the plurality of vertices;
determining a plurality of edge features in the image;
performing an exterior orientation of the image comprising the plurality of edge features corresponding to the as-designed model to generate an oriented image;
deforming the as-designed model, such that, upon projecting the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image; and
determining, based on the image, lower-bounds for deviations of the as-built structure from the as-designed model, the deviations determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
2. The method as claimed in claim 1, wherein the image is a planar image.
3. The method as claimed in claim 2, wherein the plurality of edge features is a plurality of edge lines in the planar image, and wherein determining an edge line of the plurality of edge lines comprises detecting edge pixels in the planar image.
4. The method as claimed in claim 3, wherein performing the exterior orientation of the planar image comprises determining correspondences between the plurality of edge lines of the planar image and the plurality of lines of the as-designed model.
5. The method as claimed in claim 1, wherein deforming the as-designed model comprises adjusting positions of the plurality of vertices of the as-designed model such that a vector from a camera projection center to an individual vertex of the plurality of vertices is perpendicular to normals of projecting planes corresponding to one or more edge features of the plurality of edge features in the image, wherein the one or more edge features correspond to one or more lines of the plurality of lines departing from the individual vertex.
6. The method as claimed in claim 1, wherein the image is a spherical image.
7. The method as claimed in claim 6, wherein the plurality of edge features is a plurality of edge curves in the spherical image, and wherein determining the plurality of edge curves comprises:
determining edge pixels in the spherical image using a Canny edge detection algorithm;
grouping the edge pixels into a plurality of connected components based on 8-connectivity of neighbouring pixels;
forming one or more segments within each connected component of the plurality of connected components using a region growing process based on determining at each edge pixel, a local line direction and a signed distance from a camera origin;
forming a plurality of arcs corresponding to segments of the plurality of connected components using edge boundary pixels corresponding to the segments; and
merging overlapping arcs from among the plurality of arcs to determine the plurality of edge curves.
8. The method as claimed in claim 6, wherein the plurality of edge features is a plurality of edge curves, and wherein performing the exterior orientation of the image comprises determining correspondences between the plurality of edge curves of the spherical image and the plurality of lines of the as-designed model using a RANSAC algorithm.
9. The method as claimed in claim 6, wherein deforming the as-designed model comprises adjusting positions of the plurality of vertices of the as-designed model such that a vector from a camera projection center to an individual vertex of the plurality of vertices is perpendicular to normals of projecting planes corresponding to one or more edge features of the plurality of edge features in the image, wherein the one or more edge features correspond to one or more lines of the plurality of lines departing from the individual vertex.
10. An apparatus, comprising:
a memory to store image processing instructions; and
a processor electronically coupled with the memory, the processor configured to execute the image processing instructions stored in the memory to cause the apparatus to perform at least:
facilitating receipt of:
an image of an as-built structure, the image being a two-dimensional (2-D) image, and
an as-designed model associated with the as-built structure, the as-designed model being a three-dimensional (3-D) model comprising a plurality of vertices and a plurality of lines connecting the plurality of vertices;
determining a plurality of edge features in the image;
performing an exterior orientation of the image comprising the plurality of edge features corresponding to the as-designed model to generate an oriented image;
deforming the as-designed model, such that, upon projecting the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image; and
determining, based on the image, lower-bounds for deviations of the as-built structure from the as-designed model, the deviations determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
11. The apparatus as claimed in claim 10, wherein the image is a planar image.
12. The apparatus as claimed in claim 11, wherein the plurality of edge features is a plurality of edge lines in the planar image, and wherein determining an edge line of the plurality of edge lines comprises detecting edge pixels in the planar image.
13. The apparatus as claimed in claim 12, wherein for performing the exterior orientation of the planar image, the apparatus is further caused, at least in part to determine correspondences between the plurality of edge lines of the planar image and the plurality of lines of the as-designed model.
14. The apparatus as claimed in claim 10, wherein for deforming the as-designed model, the apparatus is further caused, at least in part to adjust positions of the plurality of vertices of the as-designed model such that a vector from a camera projection center to an individual vertex of the plurality of vertices is perpendicular to normals of projecting planes corresponding to one or more edge features of the plurality of edge features in the image, and wherein the one or more edge features correspond to one or more lines of the plurality of lines departing from the individual vertex.
15. The apparatus as claimed in claim 10, wherein the image is a spherical image.
16. The apparatus as claimed in claim 15, wherein the plurality of edge features is a plurality of edge curves in the spherical image, and wherein for determining the plurality of edge curves, the apparatus is further caused, at least in part to:
determine edge pixels in the spherical image using a Canny edge detection algorithm;
group the edge pixels into a plurality of connected components based on 8-connectivity of neighbouring pixels;
form one or more segments within each connected component of the plurality of connected components using a region growing process based on determining at each edge pixel, a local line direction and a signed distance from a camera origin;
form a plurality of arcs corresponding to segments of the plurality of connected components using edge boundary pixels corresponding to the segments; and
merge overlapping arcs from among the plurality of arcs to determine the plurality of edge curves.
17. The apparatus as claimed in claim 15, wherein the plurality of edge features is a plurality of edge curves, and wherein for performing the exterior orientation of the image, the apparatus is further caused, at least in part, to determine correspondences between the plurality of edge curves of the spherical image and the plurality of lines of the as-designed model using a RANSAC algorithm.
18. The apparatus as claimed in claim 15, wherein for deforming the as-designed model, the apparatus is further caused, at least in part to adjust positions of the plurality of vertices of the as-designed model such that a vector from a camera projection center to an individual vertex of the plurality of vertices is perpendicular to normals of projecting planes corresponding to one or more edge features of the plurality of edge features in the image, wherein the one or more edge features correspond to one or more lines of the plurality of lines departing from the individual vertex.
19. A non-transitory, computer-readable storage medium storing computer-executable program instructions to implement a method for determining lower bounds for deviations of an as-built structure from an as-designed model, the method comprising:
facilitating receipt of:
an image of an as-built structure, the image being a two-dimensional (2-D) image, and
an as-designed model associated with the as-built structure, the as-designed model being a three-dimensional (3-D) model comprising a plurality of vertices and a plurality of lines connecting the plurality of vertices;
determining a plurality of edge features in the image;
performing an exterior orientation of the image comprising the plurality of edge features corresponding to the as-designed model to generate an oriented image;
deforming the as-designed model, such that, upon projecting the deformed as-designed model onto the oriented image, the plurality of lines of the as-designed model fit substantially with the plurality of edge features in the image; and
determining, based on the image, lower-bounds for deviations of the as-built structure from the as-designed model, the deviations determined corresponding to the plurality of vertices of the as-designed model based on the deformation of the as-designed model.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462098464P | 2014-12-31 | 2014-12-31 | |
US62/098,464 | 2014-12-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016107989A1 (en) | 2016-07-07 |
Family
ID=56284345
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FI2015/050960 WO2016107989A1 (en) | 2014-12-31 | 2015-12-31 | Estimation of lower bounds for deviations of as-built structures from as-designed models |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016107989A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108133078A (en) * | 2017-12-01 | 2018-06-08 | 中国建筑第八工程局有限公司 | Quality control method for virtual-model guidance based on 720-degree panoramic photographs |
CN118735770A (en) * | 2024-09-03 | 2024-10-01 | 上海禹创数维技术有限公司 | Method, device, equipment, medium and program for integrating continuous panoramic images with BIM |
CN119048466A (en) * | 2024-08-21 | 2024-11-29 | 深圳亚太航空技术股份有限公司 | High-locking bolt detection method and device, electronic equipment and storage medium thereof |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6324299B1 (en) * | 1998-04-03 | 2001-11-27 | Cognex Corporation | Object image search using sub-models |
US20040068187A1 (en) * | 2000-04-07 | 2004-04-08 | Krause Norman M. | Computer-aided orthopedic surgery |
US6963338B1 (en) * | 1998-07-31 | 2005-11-08 | Cognex Corporation | Method for refining geometric description models using images |
US20090010489A1 (en) * | 2007-05-18 | 2009-01-08 | Mirko Appel | Method for comparison of 3D computer model and as-built situation of an industrial plant |
US20090322742A1 (en) * | 2008-06-25 | 2009-12-31 | Microsoft Corporation | Registration of street-level imagery to 3d building models |
US20110187713A1 (en) * | 2010-02-01 | 2011-08-04 | Eagle View Technologies, Inc. | Geometric correction of rough wireframe models derived from photographs |
US20130155058A1 (en) * | 2011-12-14 | 2013-06-20 | The Board Of Trustees Of The University Of Illinois | Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring |
WO2013106802A1 (en) * | 2012-01-12 | 2013-07-18 | Gehry Technologies, Inc. | Method and apparatus for determining and presenting differences between 3d models |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12106495B2 (en) | Three-dimensional stabilized 360-degree composite image capture | |
US12309333B2 (en) | Method and apparatus for scanning and printing a 3D object | |
US11568516B2 (en) | Depth-based image stitching for handling parallax | |
CN109671115B (en) | Image processing method and device using depth value estimation | |
US8447099B2 (en) | Forming 3D models using two images | |
US8452081B2 (en) | Forming 3D models using multiple images | |
Chaiyasarn et al. | Distortion-free image mosaicing for tunnel inspection based on robust cylindrical surface estimation through structure from motion | |
WO2016181687A1 (en) | Image processing device, image processing method and program | |
WO2013163579A2 (en) | Automatic adjustment of images | |
WO2014200625A1 (en) | Systems and methods for feature-based tracking | |
US20150199572A1 (en) | Object tracking using occluding contours | |
Concha et al. | Manhattan and Piecewise-Planar Constraints for Dense Monocular Mapping. | |
GB2567245A (en) | Methods and apparatuses for depth rectification processing | |
Trzeciak et al. | Dense 3D reconstruction of building scenes by ai-based camera–lidar fusion and odometry | |
WO2016107989A1 (en) | Estimation of lower bounds for deviations of as-built structures from as-designed models | |
JP2014102805A (en) | Information processing device, information processing method and program | |
Pathak et al. | Distortion-robust spherical camera motion estimation via dense optical flow | |
KR100961616B1 (en) | Omnidirectional Camera Correction Method and System Using Contour Matching | |
Xu et al. | Optical flow-based video completion in spherical image sequences | |
JP2006145419A (en) | Image processing method | |
Rossi et al. | Real-time reconstruction of underwater environments: from 2D to 3D | |
Schenk et al. | Guided sparse camera pose estimation | |
CN119027310A (en) | Panoramic image stitching method and device | |
Benziger | Stereo model setup and 3d data capture for ios programming environment | |
Mitchell et al. | A robust structure and motion replacement for bundle adjustment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15875310 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 15875310 Country of ref document: EP Kind code of ref document: A1 |