US11030823B2 - Adjustment of architectural elements relative to facades - Google Patents
Adjustment of architectural elements relative to facades
- Publication number
- US11030823B2
- Authority
- US
- United States
- Prior art keywords
- façade
- dimensional
- building model
- architectural element
- architectural
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06F16/283—Multi-dimensional databases or data warehouses, e.g. MOLAP or ROLAP
- G06F16/583—Retrieval characterised by using metadata automatically derived from the content
- G06F18/24—Classification techniques
- G06F30/00—Computer-aided design [CAD]
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
- G06K9/6267
- G06T17/05—Geographic models
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G06T7/12—Edge-based segmentation
- G06T2200/04—Indexing scheme for image data processing or generation, in general, involving 3D image data
- G06T2207/30184—Infrastructure
- G06T2210/04—Architectural design, interior design
- G06T2219/2016—Rotation, translation, scaling
Definitions
- the technology described herein relates generally to a system and method for constructing or reconstructing a multi-dimensional model using positions of architectural elements relative to facades based on known architectural standards.
- Existing 3D maps of urban cities provide 3D textured models of the buildings via aerial imagery or specialized camera-equipped vehicles.
- However, these 3D maps have limited texture resolution and geometry quality, suffer from inaccurate scaling, are expensive, time-consuming and difficult to update, and provide no robust real-time image data analytics for various consumer and commercial use cases.
- FIG. 1 illustrates one embodiment of a system architecture in accordance with the present disclosure
- FIG. 2 illustrates a flowchart representing one embodiment of a process for accurately rescaling a multi-dimensional building model in accordance with the present disclosure
- FIG. 3 illustrates a flowchart representing another embodiment of a process for accurately scaling/rescaling a multi-dimensional building model in accordance with the present disclosure
- FIG. 4 illustrates a flowchart representing one embodiment of a process for accurately scaling/rescaling/repositioning one or more planes of a multi-dimensional building model in accordance with the present disclosure
- FIG. 5 illustrates an example embodiment for identifying scale/scale error in a multi-dimensional building model using siding rows as the known architectural dimension in accordance with the present disclosure
- FIG. 6 illustrates an example embodiment for identifying scale/scale error in a multi-dimensional building model using a door as the known architectural dimension in accordance with the present disclosure
- FIG. 7 illustrates yet another example embodiment for identifying scale/scale error in a multi-dimensional building model using brick layout as the known architectural dimension in accordance with the present disclosure
- FIG. 8A-8D illustrate additional example embodiments for identifying scale/scale error in a multi-dimensional building model using roofing elements as the known architectural dimension in accordance with the present disclosure
- FIG. 9 illustrates yet another example embodiment for identifying scale error in a multi-dimensional building model using known positions of elements as the known architectural dimension in accordance with the present disclosure
- FIG. 10 illustrates an embodiment of a flowchart for improving the accuracy of the dimensions of a building model in accordance with the present disclosure
- FIG. 11 illustrates an embodiment of a flowchart for weighting various known architectural elements to scale dimensions and/or adjust scale dimensions of a multi-dimensional building model in accordance with the present disclosure
- FIG. 12 illustrates a diagrammatic representation of a machine in the example form of a computer system in accordance with the present disclosure.
- FIG. 1 illustrates one embodiment of system architecture in accordance with the present disclosure.
- image processing system 100 includes image processing servers 102 .
- Image database (DB) 104 and image processing servers 102 are coupled via a network channel 106 .
- the network channel 106 is a system for communication.
- Network channel 106 includes, for example, an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network.
- the network channel 106 includes any suitable network for any suitable communication interface.
- the network channel 106 can include an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these.
- One or more portions of one or more of these networks may be wired or wireless.
- the network channel 106 can be a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a 3G or 4G network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network).
- the network channel 106 uses standard communications technologies and/or protocols.
- the network channel 106 can include links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, CDMA, digital subscriber line (DSL), etc.
- the networking protocols used on the network channel 106 can include multiprotocol label switching (MPLS), the transmission control protocol/Internet protocol (TCP/IP), the User Datagram Protocol (UDP), the hypertext transport protocol (HTTP), the simple mail transfer protocol (SMTP), and the file transfer protocol (FTP).
- the data exchanged over the network channel 106 is represented using technologies and/or formats including the hypertext markup language (HTML) and the extensible markup language (XML).
- all or some of the links can be encrypted using conventional encryption technologies such as secure sockets layer (SSL).
- the image processing servers 102 include suitable hardware/software in the form of circuitry, logic gates, and/or code functions to process digital images, including, but not limited to, calculation of one or more image measurements according to an architectural feature measurement located within the images themselves.
- Capture device(s) 108 is in communication with image processing servers 102 for collecting digital images of building objects.
- Capture devices 108 are defined as digital devices for capturing images.
- the capture devices include, but are not limited to: a camera, a phone, a smartphone, a tablet, a video camera, a security camera, a closed-circuit television camera, a computer, a laptop, a webcam, wearable camera devices, photosensitive sensors, drone mounted imaging devices, equivalents or any combination thereof.
- Image processing system 100 also provides for viewer device 110 that is defined as a display device.
- viewer device 110 can be a computer with a monitor, a laptop, a smartphone, a tablet, a touch screen display, an LED array, a television set, a projector display, a wearable heads-up display of some sort, a remote display associated with a camera device, or any combination thereof.
- the viewer device includes display of one or more building facades and associated measurements, such as, for example, a mobile device, a conventional desktop personal computer having input devices such as a mouse, keyboard, joystick, or other such input devices enabling the input of data and interaction with the displayed images and associated measurements.
- ground-level images of a physical building are uploaded to image processing system 100 from a capture device.
- An uploaded image is, for example, a digital photograph of a physical building showing a façade (side) of the physical building.
- Image processing system 100 is used to generate accurately textured, 2D/3D building models based on the collected digital images.
- the textured, 2D/3D building models are generated using systems and methods, for example, as provided in U.S. Pat. Nos. 8,878,865 and 8,422,825, hereby incorporated by reference.
- third party sources of textured models can be substituted in the various embodiments described herein without departing from the scope of the technology described.
- a scaled 3D model can be used to calculate dimensions for building materials (e.g., siding for an exterior wall, exterior brick, a door, etc.) in a construction project.
- the calculated dimensions are likely to include error given the low resolution and potential for visibility errors (e.g., occlusions).
- a system and method are provided for dimensioning and/or correcting error in an untextured/textured multi-dimensional building model. Images of a multi-dimensional building model are used to identify scale and/or scale error by comparing to known architectural dimensions. Once scale or scale error is identified, the textured models are reconstructed with accurately scaled multi-dimensional building models.
- a measurement of one or more known architectural elements can be used to scale one or more image planes of a multi-dimensional model while it is being constructed or after construction is completed.
- scale errors can be determined based on one or more known architectural elements and used to rescale one or more image planes of the multi-dimensional model.
- FIG. 2 illustrates a flowchart representing one embodiment of a process for accurately scaling/rescaling a textured multi-dimensional building model in accordance with the present disclosure.
- In step 201, at least one digital image is retrieved (e.g., an image associated with a building object).
- a portion of the digital image is retrieved since the entire image (façade) may not be needed for scaling/scale error corrections.
- This portion, for example a front façade, may include a cut-out of a full 2D image that has been rectified and correlated to vertices of geometric planes/polygons that make up a portion of a 3D model.
- the portion may be a close-up of the front porch of a house that includes the front door.
- In step 202, known architectural elements of the digital image are identified.
- architectural elements are identified using known image or object recognition techniques, including those techniques of the US references incorporated herein by reference.
- the identification of architectural elements is accomplished using other approaches.
- the boundaries for rows of siding are automatically identified using line detection techniques (e.g., frequency domain filtering).
- In step 203, boundaries (e.g., of siding, bricks, etc.) are identified (defined) using unique feature detection methods that look for repeated patterns, such as consistent parallel lines or line intersections.
- boundaries for architectural elements are detected using unsupervised clustering algorithms based on learned, predictable patterns.
- boundaries can be manually marked up (e.g., by human observer).
- a measurement (e.g., dimensional ratios) of the architectural element is conducted in step 204 using image processing system 100 of FIG. 1 .
- siding rows are used as the known architectural element and the distance between siding rows (top boundary and bottom boundary) is measured in the multi-dimensional model. Pixels defining the boundaries of the architectural elements are identified within the multi-dimensional building model.
- a plurality of measurements is conducted to determine an average measurement between pixels representing the boundaries of a siding row.
- the calculated average measurement value of the known architectural element is compared to a threshold measurement (measurement including +/− threshold error) according to known architectural standard dimensions in step 205.
- the threshold measurement accounts for the inherent inaccuracy of the imagery (scale error) and provides a likely range of values that are used to correlate the average measurement value to an actual measurement value (real dimensions based on known architectural standard dimensions). For example, if the known architectural standard dimension for a solar panel is 10×10 (feet), the threshold will be established using, for example, +/−10% of the 10 feet (or up to 1 ft) in both directions (x and y). If the average measurement falls within the threshold measurement, it is assumed that the average measurement is likely to be the known architectural standard dimension. If the average measurement fails to fall within the threshold measurement, it is assumed that it does not apply to the known architectural standard or it is from a different standard dimension (e.g., 5×5, 15×15, etc.).
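As a minimal sketch of this threshold test (the function name and values are illustrative, not from the patent), a measured value can be correlated to a known standard dimension as follows:

```python
def within_threshold(avg_measurement, standard_dim, tolerance=0.10):
    """Return True when avg_measurement falls within +/- tolerance of
    standard_dim. Illustrates the step 205 comparison, where the
    tolerance absorbs inherent imagery scale error."""
    low = standard_dim * (1.0 - tolerance)
    high = standard_dim * (1.0 + tolerance)
    return low <= avg_measurement <= high

# 10 ft solar-panel edge with +/-10% tolerance -> accepted range 9.0..11.0 ft
print(within_threshold(10.4, 10.0))  # True
print(within_threshold(11.5, 10.0))  # False: likely a different standard size
```

A measurement rejected here would simply be compared against the next candidate standard dimension (e.g., a 5×5 or 15×15 panel).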
- the known architectural standard dimension is a distance between rows of siding on the façade of a building.
- the boundaries (top and bottom edges) for rows of siding applied to exterior walls of a building object are frequently separated by 6-10 inches (depending on the type of siding).
- a digital image having siding exposed on at least one exterior wall is provided that corresponds to a textured façade of a multi-dimensional building model.
- Image processing system 100 of FIG. 1 identifies the exterior siding using known computer vision techniques, defines the boundaries of the siding rows, correlates the pixels to the defined boundaries and measures the distance between the boundary defining pixels of adjacent rows.
- the distance between the boundaries is measured in a plurality of locations on the exterior wall to create an average measurement value.
- dimensional ratios are used to determine proper scaling.
- the average measurement value of the boundaries between the rows of siding is compared to known architectural standard dimensions of, for example, 6 inches, 7 inches, 8 inches, etc. separating each row from an adjacent row. For example, an average measurement of 6.63 inches is ambiguous as to whether it is actually 6 inches (approximately 10.5% error) or 7 inches (approximately 5.3% error) per the architectural dimension standards. In one embodiment, the average measurement must fall within a given threshold range of +/−10% (inherent orthogonal imagery error).
- an average measurement value of 6.63 inches is indicative that the siding may represent either the 6 inch or 7 inch architectural standard dimension.
- the average measurement of 6.63 inches (i.e., 10.5% error relative to 6 inches) is outside of the threshold and, therefore, the rows of siding are not correlated to a 6 inch distance between rows. While described using a +/−10% threshold, other thresholds are envisioned without departing from the scope of the technology described herein.
- In order for the average measurement to be correlated to an actual measurement of 7 inches, the average measurement would have to fall within the threshold range of 6.3 inches to 7.7 inches.
- the average measurement of 6.63 inches (i.e., 5.3% error) falls within the threshold, so it is determined that the distance between rows of siding has a high likelihood of an actual value of 7 inches.
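The siding-row disambiguation above can be sketched as a search over the candidate standard spacings (hypothetical helper, assuming the +/−10% tolerance used in the example):

```python
def correlate_to_standard(avg, standards, tolerance=0.10):
    """Return the standard dimensions whose +/- tolerance range
    contains the averaged measurement."""
    return [s for s in standards if abs(avg - s) / s <= tolerance]

# Siding rows averaging 6.63 inches: 6 in is 10.5% off (rejected),
# 7 in is 5.3% off (accepted), 8 in is 17% off (rejected).
print(correlate_to_standard(6.63, [6, 7, 8]))  # [7]
```

When more than one standard survives the tolerance check, the candidate with the smallest relative error would be the natural pick.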
- a door is used as the known architectural standard dimension.
- There are various sizes of doors used in the construction industry (e.g., single doors, French doors, etc.).
- a typical door size may be 30×80 (i.e., 30 inches wide by 80 inches high). It is understood by those skilled in the art that the technology described here includes, but is not limited to, commonly used door sizes (e.g., the most common widths are 28, 30 and 32 inches; heights are typically around 80 inches).
- a corresponding digital image is used to identify the door as an architectural element on at least one façade of the building model.
- an average measurement of the width and the height is determined and compared to the threshold.
- a width-to-height ratio or height-to-width ratio of the architectural element is determined and compared to a known ratio, including error threshold.
- the total area of the door is determined and used as the average measurement value. Using the comparison of the average measurement to the measurement with threshold error, the actual door size is determined based on the known door standard dimensions.
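One way to sketch the ratio comparison (the helper name and candidate list are assumptions; the text names 28, 30 and 32 inch widths at a typical 80 inch height):

```python
# Assumed door standards as (width, height) in inches.
DOOR_STANDARDS = [(28, 80), (30, 80), (32, 80)]

def match_door(measured_w, measured_h, tolerance=0.10):
    """Return the standard (width, height) whose width-to-height ratio
    is closest to the measured ratio within tolerance, else None."""
    measured_ratio = measured_w / measured_h
    best, best_err = None, tolerance
    for w, h in DOOR_STANDARDS:
        standard_ratio = w / h
        err = abs(measured_ratio - standard_ratio) / standard_ratio
        if err <= best_err:
            best, best_err = (w, h), err
    return best

# A door measured at roughly 29.5 x 79 model units (ratio ~0.373)
# best matches the 30x80 standard (ratio 0.375).
print(match_door(29.5, 79.0))  # (30, 80)
```

Because ratios are unit-free, this check works even before the model has a trustworthy absolute scale, which is exactly why the matched standard can then supply that scale.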
- brick sizes are used as the known architectural standard dimensions.
- There are various sizes of bricks used in the construction industry (e.g., standard, modular, Norman, Roman, jumbo, etc.).
- the size of the brick is used to extrapolate a wall size and identify error in a multi-dimensional building model.
- a typical brick dimension is 3 1/2 × 2 1/4 × 8 (depth (D) × height (H) × length (L), in inches).
- brick height and width is used to identify error in the building model.
- An average measurement of a distance between rows of bricks is compared to known architectural standard dimensions separating each row from a subsequent row.
- An average measurement value of the multi-dimensional building model's brick façade is determined and compared to the measurements, including threshold error values, for known architectural dimensions separating the brick rows. Threshold values are established for each of the brick types and the comparison is made between the average measurement value and the known architectural standard dimension +/− threshold error.
- In another embodiment, a brick's width, height, width-to-height ratio, or height-to-width ratio is compared against known dimensional architectural standards (with or without error thresholds).
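As an illustrative sketch of extrapolating a wall dimension from brick courses (the 3/8-inch mortar joint is an assumption; the text states only the 2 1/4 inch brick height):

```python
# Assumed nominal course height: 2 1/4 in brick plus a 3/8 in mortar joint.
BRICK_HEIGHT_IN = 2.25
MORTAR_JOINT_IN = 0.375

def wall_height_from_courses(num_courses,
                             brick_h=BRICK_HEIGHT_IN,
                             joint=MORTAR_JOINT_IN):
    """Extrapolate a wall height in inches from a count of brick rows
    (courses) detected on the facade texture."""
    return num_courses * (brick_h + joint)

# 37 counted courses of standard brick -> roughly an 8 ft wall.
print(wall_height_from_courses(37))  # 97.125
```

Comparing such an extrapolated height with the model's current wall height gives the scale error directly.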
- the determined actual dimension of a known architectural element is used to scale/rescale and reconstruct the multi-dimensional (2D/3D) building model.
- an untextured building model is scaled/rescaled using one of the vertices as an anchor point. For example, once a known architectural measurement is determined (e.g., based on the 30:80 (3:8) ratio of a front door), that scale (30 inches for the width of the door and 80 inches for the height of the door) is used for a corresponding door image measurement and thereafter is used to scale one or more image planes of the multi-dimensional model.
- a vertex is used as an anchor point and the length of one of the lines/edges corresponding to the vertex is reduced by 10%.
- the vertex is anchored (i.e., anchored in a real-world position).
- the dimensions and position of the remaining vertices and edges are adjusted accordingly to maintain the original geometry (angles of the vertices) of the building model.
- In another embodiment, a centroid (the geometric center of the building model) is used as the anchor point.
- the building model is textured based on the digital images with the original digital images used for textures.
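The anchored rescale described above can be sketched as a uniform scale about the anchor vertex (hypothetical helper; a uniform scale about a fixed point changes edge lengths by the scale factor while preserving all vertex angles, i.e., the original geometry):

```python
def rescale_about_anchor(vertices, anchor_index, scale):
    """Uniformly scale a list of 3D vertices about one anchored vertex,
    which stays fixed in its (real-world) position."""
    ax, ay, az = vertices[anchor_index]
    return [(ax + (x - ax) * scale,
             ay + (y - ay) * scale,
             az + (z - az) * scale) for x, y, z in vertices]

# Reduce an edge length by 10%, anchoring the first vertex:
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(rescale_about_anchor(verts, 0, 0.9))
# [(0.0, 0.0, 0.0), (0.9, 0.0, 0.0)]
```

Anchoring at a centroid instead would just mean passing the geometric center as the fixed point rather than one of the vertices.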
- FIG. 3 illustrates a flowchart representing another embodiment of a process for accurately scaling/rescaling a multi-dimensional building model in accordance with the present disclosure.
- In this embodiment, a random object in a field-of-view (e.g., foreground/background) of the building object is used for scaling one or more planes within a multi-dimensional model of the building object.
- In step 301, at least one digital image is retrieved (e.g., an image associated with a building object).
- the digital image may be an image of a house that includes the front yard surrounding a front façade of the house that is the subject of the multi-dimensional building model.
- foreground/background objects in a field of view of the building object are identified.
- foreground/background objects are identified using known image or object recognition techniques, including those techniques of the US references incorporated herein by reference.
- the identification of foreground/background objects is accomplished using other approaches. For example, parked automobiles, light fixtures, lamp posts, telephone poles, stop signs, play structures, tables, chairs, or lawn equipment are automatically identified using image recognition techniques.
- boundaries of these foreground/background objects are identified using unique feature detection methods that look for repeated patterns, such as, consistent parallel lines or line intersections.
- boundaries for foreground/background objects are detected using unsupervised clustering algorithms based on learned, predictable patterns.
- boundaries can be manually marked up (e.g., by human observer).
- a measurement (e.g., dimensional ratio) of the foreground/background object is conducted in step 304 using image processing system 100 of FIG. 1 .
- telephone poles within the digital image are used as the known foreground/background object and the known dimensional ratio (e.g., ratio of height-to-width) of a standard telephone pole is used to scale/rescale (step 306 ) one or more image planes of the associated multi-dimensional building model.
- a plurality of telephone pole ratios located within the image is averaged to determine an average ratio of a plurality of telephone poles.
- a distance from the pole to a plane of the building (house) is used to geometrically calculate a relative measurement relationship using known geometric image processing methods.
- the calculated average ratio value of the known foreground/background objects is compared to a threshold ratio (ratio with threshold error) according to known standard dimensions in step 305 .
- the threshold measurement accounts for the inherent inaccuracy of the imagery (scale error) and provides a likely range of values that are used to correlate the average ratio value to an actual ratio value (real dimensions based on known standard dimensions). For example, the known standard length for a class 6 telephone pole in the United States is about 40 ft. (12 m), of which about 6 ft. (2 m) is buried in the ground, so 34 ft. is used as the standard height; the threshold will be established using, for example, +/−10% of the 34 feet (or up to 3.4 ft.).
- the minimum width for a class 6, 40 foot pole is 9.07 inches measured at six feet from the butt end. Therefore, the width threshold will be established using, for example, +/−10% of the 9.07 inches (or up to 0.907 inches). If the average ratio falls within the threshold, it is assumed that the average measurement is likely to be the known standard dimension. If the average measurement fails to fall within the threshold, it is assumed that it does not apply to the known standard or it is from a different standard dimension (e.g., 50 ft, 60 ft, class 7, class 8, etc.).
- a width-to-height ratio or height-to-width ratio of the foreground/background object is determined and compared to a known threshold ratio.
- the total area of the object is determined and used as the average measurement value. Using the comparison of the average measurement to the threshold measurement, the actual object is determined based on the known standard dimensions.
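A sketch of validating a pole as a scale reference, using the class 6 figures quoted in the text (34 ft above-ground height, 9.07 inch minimum width); the helper name and joint width/height check are assumptions:

```python
# Class 6 pole figures from the text: ~34 ft above ground after burial,
# minimum width 9.07 in measured six feet from the butt end.
POLE_HEIGHT_FT = 34.0
POLE_WIDTH_IN = 9.07

def pole_matches_class6(measured_h_ft, measured_w_in, tolerance=0.10):
    """Check whether measured pole dimensions fall within +/- tolerance
    of the class 6 standard before using the pole to set model scale."""
    h_ok = abs(measured_h_ft - POLE_HEIGHT_FT) / POLE_HEIGHT_FT <= tolerance
    w_ok = abs(measured_w_in - POLE_WIDTH_IN) / POLE_WIDTH_IN <= tolerance
    return h_ok and w_ok

print(pole_matches_class6(33.0, 9.3))  # True
print(pole_matches_class6(45.0, 9.3))  # False: likely a taller pole class
```

A pole that fails both checks would be tested against the other classes and lengths (50 ft, 60 ft, class 7, class 8, etc.) before being discarded as a reference.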
- the determined actual dimension of a known architectural element is used to scale/rescale and reconstruct the multi-dimensional (2D/3D) building model.
- an untextured building model is scaled/rescaled using one of the vertices as an anchor point. For example, once a known measurement is determined, that scale (e.g., 34 feet) is used for a corresponding telephone pole image measurement and thereafter is used to scale one or more image planes of the multi-dimensional model (as corrected based on distance from the plane).
- a vertex is used as an anchor point and the length of one of the lines/edges corresponding to the vertex is reduced by 10%.
- the vertex is anchored (i.e., anchored in a real-world position).
- the dimensions and position of the remaining vertices and edges are adjusted accordingly to maintain the original geometry (angles of the vertices) of the building model.
- In another embodiment, a centroid (the geometric center of the building model) is used as the anchor point.
- the building model is textured based on the digital images with the original digital images used for textures.
- FIG. 4 illustrates a flowchart representing one embodiment of a process for accurately scaling/rescaling/repositioning one or more planes of a multi-dimensional building model in accordance with the present disclosure.
- errors in positioning of various image planes within a multi-dimensional model occur due to skew, obfuscation (i.e., hidden planes), or missing images of the subject plane.
- the model can be corrected by properly scaling/rescaling or moving selected planes (e.g., walls) to correct positions based on known relationships between the key known architectural features and associated planes in the multi-dimensional building model. For example, downspouts always follow an exterior wall edge.
- gables are symmetrical and therefore walls supporting the gables should also be positioned symmetrically (e.g., equidistant from the center of the gable). Therefore, if one half of a symmetrical architectural feature is properly identified and dimensioned (for example, one side of a gable), everything needed to draw the other side of the gable is known. Once known, the dimensions of the gable can be used to properly size other architectural features, for example, a garage door under the gable.
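The gable-symmetry observation can be sketched as a reflection of the measured half across the gable's center axis (hypothetical helper working on 2D façade coordinates):

```python
def mirror_gable_half(half_points, axis_x):
    """Reflect measured 2D facade points of one gable half across the
    vertical center axis (x = axis_x) to reconstruct the other half."""
    return [(2.0 * axis_x - x, y) for x, y in half_points]

# Left half measured from eave corner to ridge peak, center axis at x = 10:
left = [(0.0, 0.0), (10.0, 6.0)]
print(mirror_gable_half(left, 10.0))  # [(20.0, 0.0), (10.0, 6.0)]
```

The ridge point lies on the axis and maps to itself, while the eave corner lands symmetrically on the far side, giving the full gable outline from one measured half.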
- At least one digital image is retrieved (e.g., image associated with building object).
- the digital image may be an image of a front building facade.
- only a portion of the digital image is retrieved since the entire image (façade) may not be needed for scaling/scale error/plane positioning corrections.
- This portion, for example the front façade, may include a cut-out of a full 2D image that has been rectified and correlated to vertices of geometric planes/polygons that make up a portion of a 3D model.
- the portion may be a close-up of the outside wall edge of a house that includes a downspout.
- key known architectural elements are identified.
- architectural elements are identified using known image or object recognition techniques, including those techniques of the US references incorporated herein by reference.
- the identification of architectural elements is accomplished using other approaches.
- downspouts are automatically identified using line detection techniques (e.g., frequency domain filtering) or using unique feature detection methods that look for repeated patterns, such as, consistent parallel lines or line intersections.
- key known architectural elements are detected using unsupervised clustering algorithms based on learned, predictable patterns.
- key known architectural elements can be manually marked up (e.g., by human observer).
- An identification of the position (including boundaries) of the key known architectural element is conducted in step 403 using image processing system 100 of FIG. 1 .
- In one embodiment (see FIG. 9 and associated discussion), downspouts are used as the key known architectural element.
- the position of the known architectural element is determined, e.g., at least an inside edge (nearest the exterior wall plane).
- the determined position of the downspout is compared to a position of an exterior wall plane associated with the downspout.
- the comparison accounts for the inherent inaccuracy of the imagery and provides a likely position of the exterior wall plane (based on a known relationship between the downspout and the exterior wall plane), juxtaposed with the downspout (adjusting for a predetermined/calculated known gap between downspout and wall).
- In step 405, the determined actual position of the key known architectural element is used to scale/rescale/reposition image planes and reconstruct the multi-dimensional (2D/3D) building model. Once a scaled/rescaled/repositioned building model has been constructed/reconstructed, it is textured using the original digital images.
- FIG. 5 illustrates an example embodiment for identifying error in a building model using siding rows as the known architectural dimension in accordance with the present disclosure.
- Digital image 501 captures a building object showing a façade covered with siding. Using the techniques described previously, lines corresponding to the boundaries of each row of siding are determined. In one embodiment, the boundaries of the entire row of siding (e.g., left edge of the façade to right edge of the façade) are determined. In other embodiments, less than an entire row is used.
- Lines 502 and 503 define the upper and lower boundary of one row of siding.
- Measurement 504 determines the distance between the pixels defining the upper and lower boundary of the row of siding.
- the number of pixels between the upper and lower boundary of the row of siding is determined and, based on the resolution of each pixel, a measurement (e.g., feet (f), inches (in), meters (m), centimeters (cm), etc.) can be extrapolated.
- the boundaries of a portion of the siding row are determined.
- Lines 505 and 506 define the upper and lower boundaries of a portion of the row of siding.
- Measurement 507 determines the distance between the pixels defining the upper and lower boundaries of the row of siding.
- measurements 504 and 507 are averaged to determine an average measurement of the distance between rows of siding.
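Measurements 504 and 507 each reduce to a pixel count between boundaries multiplied by a per-pixel resolution; a sketch of the extrapolation and averaging (the resolution and pixel values are illustrative, not from the patent):

```python
def row_height_inches(upper_px, lower_px, inches_per_pixel):
    # Pixel distance between the row's boundaries, converted to inches.
    return abs(lower_px - upper_px) * inches_per_pixel

m_504 = row_height_inches(120, 160, 0.1)  # full-row boundaries
m_507 = row_height_inches(300, 342, 0.1)  # partial-row boundaries
avg_row = (m_504 + m_507) / 2             # averaged row measurement
```

The averaged measurement can then be compared to known siding-row standards to identify scale error, per the embodiments above.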
- the camera orientation relative to the facades of the building in the image is solved using known methods.
- the building facade orientation information is used to skew (i.e., rectify) the digital image to appear as if it was captured from directly in front of the corresponding façade of the building.
- the siding rows are parallel lines and do not converge to cause error in the average measurements across the length of the façade.
- the digital image is a rectified digital image created during model construction and texturing.
- FIG. 6 illustrates an example embodiment for identifying error in a building model using a door as the known architectural dimension in accordance with the present disclosure.
- Digital image 601 shows a front façade of a building object that includes a door.
- the boundaries of the known architectural feature (i.e., the door) are determined and a dimension (dimensional ratio) is extrapolated.
- the ratio between the door height and width is used to identify error and determine the actual dimensions of the door.
- the measurements are compared to existing known door dimensions and/or ratios, including threshold error rates, to identify a possible known architectural standards match.
- FIG. 6 is used for diagrammatic purposes; specific positioning and designation of dimensions may change from image to image and between distinct architectural features.
- the known door architectural standard may include or not include the door frame as shown.
- In various embodiments, measurements are taken of the storm door only, the exterior door only, or both.
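One way to realize the standards match with a threshold error rate is a nearest-ratio lookup; a sketch assuming a small hypothetical table of standard door sizes (width × height, inches — the table and function names are illustrative):

```python
# Hypothetical table of standard door sizes, width x height in inches.
DOOR_STANDARDS = {"30x80": (30, 80), "32x80": (32, 80), "36x80": (36, 80)}

def match_door(measured_w_px, measured_h_px, threshold=0.05):
    """Match a measured width:height pixel ratio against standard door
    ratios; return the closest standard within the threshold, else None."""
    ratio = measured_w_px / measured_h_px
    best, best_err = None, threshold
    for name, (w, h) in DOOR_STANDARDS.items():
        err = abs(ratio - w / h)
        if err < best_err:
            best, best_err = name, err
    return best
```

A door measuring 90 × 200 pixels has ratio 0.45 and matches the 36 × 80 standard; a ratio far from any standard returns no match, flagging the measurement as unreliable.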
- FIG. 7 illustrates yet another example embodiment for identifying error in a building model using a brick layout as the known architectural dimension in accordance with the present disclosure.
- Digital image 701 captures a façade of a building object showing exposed brick. In one embodiment, a portion of the façade is used to establish the top and bottom boundaries of a horizontal row of brick in a similar technique to the previously discussed siding rows.
- Portion 702 shows an exposed brick façade with bricks in the traditional offset pattern.
- Top boundary 703 and bottom boundary 704 are determined for a row of bricks.
- Average measurement 705 is determined using the difference between the pixels corresponding to top boundary 703 and bottom boundary 704 . The average measurement between top boundary 703 and bottom boundary 704 is compared to the known architectural standards for brick dimensions (see Table 1) to determine actual dimensions.
- the left boundary 706 and right boundary 707 of the brick are used to identify error and rescale and reconstruct the building model.
- Average measurement 708 is determined using the difference between the pixels corresponding to left boundary 706 and right boundary 707 and compared to the known architectural standards for brick dimensions. For smaller-dimensioned architectural features (e.g., bricks), averaging the error over a large number of bricks increases accuracy.
- simple ratios of width-to-height and height-to-width of the bricks are used (without a pixel analysis) to match to known architectural ratios of standard brick sizes (with or without a threshold error).
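Combining a Table 1 standard with per-brick averaging, the scale-error computation can be sketched as follows (resolution and pixel values are illustrative; the 2¼ in height is the Modular brick from Table 1):

```python
MODULAR_BRICK_HEIGHT_IN = 2.25  # Modular brick height (H), from Table 1

def brick_scale_error(pixel_heights, inches_per_pixel):
    """Average the measured brick-row height over many bricks, then
    compare to the known standard; the ratio is the model's scale error."""
    avg_px = sum(pixel_heights) / len(pixel_heights)
    return (avg_px * inches_per_pixel) / MODULAR_BRICK_HEIGHT_IN

# Noisy per-brick measurements: averaging many bricks reduces error.
err = brick_scale_error([23, 22, 24, 23, 23], 0.1)
```

Here the averaged measurement of 2.3 in against the 2.25 in standard implies the model is about 2% too large.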
- FIGS. 8A, 8B, 8C, and 8D collectively illustrate example embodiments for identifying error in scaling of a building model using roofing elements as the known architectural dimension in accordance with the present disclosure.
- shingles, gables, chimneys, vent pipes, dormers, skylights, solar panels, fans, gutters, fascia boards, rakes, etc. are used as known roofing elements (known architectural features).
- FIG. 8A illustrates an end view of a house with gable 801 , chimney 802 and vent pipe 803 .
- the boundaries of the known architectural features are determined and a dimension (ratio) is extrapolated based on H:W (height/width) or W:H (width/height).
- in other embodiments, angles (such as ∠1 or ∠2), three-dimensional ratios such as H:W:L (height:width:length), or other known dimensions (such as depth, radius, diameter, circumference, perimeter, surface area, area, etc.) are used.
- the ratio between the gable height and width is used to identify scale and determine the actual dimensions of the gable using techniques in any of the various embodiments described herein.
- the measurements are compared to existing known gable dimensional ratios, with or without threshold error rates, to identify possible known architectural standards matches.
- FIG. 8A is used for diagrammatic purposes; specific positioning and designation of dimensions may change from image to image and between distinct architectural features.
- the known chimney architectural standard may include or not include an upper collar or chimney cap.
- ratio or length measurements are taken for only one side of the gable and, because of known symmetry, are considered applicable for scaling an opposing side or positioning a supporting wall.
- the gable can be analyzed with or without an overhang, fascia, rake, etc.
- FIGS. 8B, 8C and 8D each illustrate various shingle patterns.
- the boundaries of the known architectural features are determined and a dimension (ratio) is extrapolated based on H:W (height/width) or W:H (width/height).
- shingle patterns can include multiple dimensional ratio analyses each covering identified unique shingle shapes within the shingle pattern.
- the ratios/measurements are compared to existing known shingle dimensional ratios, with or without scale error rates, to identify a possible known architectural standards match using techniques in any of the various embodiments described herein.
- FIGS. 8B, 8C and 8D are used for diagrammatic purposes; specific positioning, shapes, patterns, material types and designation of dimensions may change from image to image and with distinct architectural features.
- FIG. 9 illustrates yet another example embodiment for identifying positions, scale, or scale error in a multi-dimensional building model using identified relative positions of elements as the known architectural dimension.
- the model can be accurately dimensioned (scaled) by properly moving selected planes (e.g., walls) to correct positions, scaling or rescaling. For example, downspouts always follow an exterior wall edge. If the downspouts can be properly identified, proper placement of associated wall planes can be improved.
- exterior wall 901 is shown at a position 18 inches inset from roof overhang outer edge 902 .
- an inside edge of downspout 904 (plus the gap between gutter and exterior wall) is actually positioned at 18.8 inches from the overhang edge.
- the positioning of the downspout would suggest that the original exterior wall placement was too close to the overhang edge.
- the exterior wall plane in the model is simply moved to be adjacent to downspout 904 .
- the scale error between an accurate position measurement of 18.8 inches and the model placement of 18 inches can be used to scale/rescale the entire model.
- this scale error can be applied simply to the associated plane or architectural features in the plane in which the downspout was identified.
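The 18 inch versus 18.8 inch example reduces to a single ratio that can be applied to the whole model or only to the associated plane; a sketch (function name is hypothetical):

```python
def plane_scale_factor(measured_offset_in, model_offset_in):
    """Scale factor implied by the accurately measured downspout offset
    versus the model's current wall placement; per the embodiments above,
    it may be applied to the whole model or only the associated plane."""
    return measured_offset_in / model_offset_in

# Model placed the wall 18 in from the overhang; measurement says 18.8 in.
factor = plane_scale_factor(18.8, 18.0)
```

A factor above 1 indicates the model underestimated the offset, i.e., the original wall placement was too close to the overhang edge.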
- FIG. 10 illustrates an embodiment of a flowchart for improving the accuracy of the dimensions of a building model in accordance with the present disclosure.
- the process of identifying scale, scale error, scaling, rescaling and reconstructing is used to improve the accuracy of dimensioning a multi-dimensional building model.
- Process 1000 includes identification of scale errors in a multi-dimensional building model by retrieving a digital image in step 1001 that contains architectural elements (i.e., siding, brick, door, window, gables, roofing features, etc.).
- architectural elements are identified (e.g., doors).
- In step 1003, boundaries of the identified architectural element are defined and are used to measure the various dimensions/dimensional ratios of the architectural elements in step 1004.
- The measurement is compared to the known architectural element standard dimensions/dimensional ratios in step 1005 to determine the actual measurement.
- Steps 1002 through 1005 are repeatable ( 1009 ) in an identification and measurement cycle 1006 whose results can be mathematically combined (e.g., mean, median, etc.) over multiple similar or dissimilar architectural features.
- some additional objects include, but are not limited to, parked automobiles, light fixtures, distance between door knob and base of doors, lamp posts, telephone poles, stop signs, play structures, tables, chairs, or lawn equipment.
- Specific roof features include, but are not limited to, shingles, gables, eaves, dormers, gutters, chimney structures, pipe stacks, turbine vents and skylights.
- any known object (with identifiable dimensions) in the foreground or background can serve as a dimensional reference.
- the repeated comparison between the measurement of multiple selections of an architectural element (e.g., siding measurements in various locations within the image) and the known architectural standard dimensions established in step 1005 is fed into a weighted decision engine in step 1007 to determine an average scaling error.
- the weighted decision engine in step 1007 uses learned statistical analysis to improve scaling over time and measurements. As more statistical information is accumulated (learned), the weighted decision engine creates a more predictable result.
- the building model is rescaled and reconstructed according to the decision determined by the weighted decision engine in step 1007 .
- deep learning systems can be substituted for the weighted decision engine without departing from the scope of the technology described herein.
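One simple stand-in for the weighted decision engine of step 1007 is a weighted average of per-element scale errors, with weights reflecting how reliably each element has predicted true scale over past measurements (weights and error values below are illustrative):

```python
def weighted_scale_error(errors, weights):
    """Minimal stand-in for a weighted decision engine: combine
    per-element scale errors using weights that reflect how reliable
    each architectural element has proven over accumulated measurements."""
    return sum(e * w for e, w in zip(errors, weights)) / sum(weights)

# Doors proved more reliable than siding, so their error is weighed more:
# errors from door, siding, and brick measurements respectively.
combined = weighted_scale_error([1.04, 0.97, 1.01], [3.0, 1.0, 2.0])
```

As statistical information accumulates, the weights can be updated so the engine produces a progressively more predictable result, as the description notes; a learned model could replace the fixed weights entirely.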
- FIG. 11 illustrates an embodiment of a flowchart for weighing various known architectural elements to adjust scale in the dimensions of a building model in accordance with the present disclosure.
- Process 1100 includes identification of error in a multi-dimensional building model by retrieving a digital image in step 1101 that contains architectural elements.
- the repeated identification and measurement cycle 1102 is performed ( 1102 ( 1 ), 1102 ( 2 ) . . . 1102 ( n )) for multiple architectural elements (e.g., siding ( 1 ), brick ( 2 ), door ( 3 ), etc.) identified in the digital image retrieved in step 1101 .
- identification and measurement cycle 1106 is performed for each architectural feature to determine, in the weighted decision engine of step 1107, which architectural feature(s) would likely provide the most accurate rescaled building model in step 1108.
- measuring the front door may statistically prove a better gauge of scale error (e.g., closer to actual known measurements or having a higher frequency of correlation to known dimensions over multiple cycles) than siding scale determinations.
- a weighted decision engine is provided to determine the architectural element(s) that are most likely to produce an accurately scaled/rescaled and constructed/reconstructed ( 1104 ) building model based on using the least processing, providing the fastest cycle times, or proving more accurate over time.
- location of architectural elements may determine specific façade scaling. For example, if a door on a façade indicates an error (4% too large) and bricks on a side façade indicate an error in width (3% too narrow), the different facades could be rescaled separately.
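Per-façade correction from the example above (a door 4% too large on the front, bricks 3% too narrow on the side) can be sketched as independent scale factors applied façade by façade (the data layout and names are hypothetical):

```python
def rescale_facades(facade_scales, facades):
    """Apply a separately determined correction factor to each facade.

    `facades` maps facade name -> list of (x, y) vertices in that
    facade's local plane; each vertex is scaled by its own factor.
    """
    return {name: [(x * facade_scales[name], y * facade_scales[name])
                   for x, y in pts]
            for name, pts in facades.items()}

# Front door read 4% too large, side bricks 3% too narrow:
corrected = rescale_facades(
    {"front": 1 / 1.04, "side": 1 / 0.97},
    {"front": [(0.0, 0.0), (10.4, 0.0)], "side": [(0.0, 0.0), (9.7, 0.0)]},
)
```

Both façades land on consistent true dimensions even though their errors differ in sign and magnitude, which is the point of rescaling them separately.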
- Computer system 1200 includes a processor, memory, non-volatile memory, and an interface device. Various common components (e.g., cache memory) are omitted for illustrative simplicity.
- the computer system 1200 is intended to illustrate a hardware device on which any of the components depicted in the example of FIG. 1 (and any other components described in this specification) can be implemented.
- the computer system 1200 can be of any applicable known or convenient type.
- the components of the computer system 1200 can be coupled together via a bus or through some other known or convenient device.
- computer system 1200 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop, notebook or tablet computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a smartphone, a personal digital assistant (PDA), a server, or a combination of two or more of these.
- computer system 1200 may include one or more computer systems 1200 ; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks.
- one or more computer systems 1200 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein.
- one or more computer systems 1200 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein.
- One or more computer systems 1200 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
- the processor may be, for example, a conventional microprocessor such as an Intel Pentium microprocessor or Motorola PowerPC microprocessor.
- The terms "machine-readable (storage) medium" and "computer-readable (storage) medium" include any type of device that is accessible by the processor.
- the memory is coupled to the processor by, for example, a bus.
- the memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM).
- the memory can be local, remote, or distributed.
- the bus also couples the processor to the non-volatile memory and drive unit.
- the non-volatile memory is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software in the computer 1200 .
- the non-volatile storage can be local, remote, or distributed.
- the non-volatile memory is optional because systems can be created with all applicable data available in memory.
- a typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor.
- Software is typically stored in the non-volatile memory and/or the drive unit. Indeed, for large programs, it may not even be possible to store the entire program in the memory. Nevertheless, it should be understood that for software to run, if necessary, it is moved to a computer readable location appropriate for processing, and for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and local cache that, ideally, serves to speed up execution.
- a software program is assumed to be stored at any known or convenient location (from non-volatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable medium.”
- a processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
- the bus also couples the processor to the network interface device.
- the interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of the computer system 1200 .
- the interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g., "direct PC"), or other interfaces for coupling a computer system to other computer systems.
- the interface can include one or more input and/or output devices.
- the I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other input and/or output devices, including a display device.
- the display device can include, by way of example but not limitation, a cathode ray tube (CRT), liquid crystal display (LCD), or some other applicable known or convenient display device.
- controllers of any devices not depicted reside in the interface.
- the computer system 1200 can be controlled by operating system software that includes a file management system, such as a disk operating system.
- One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Wash., and their associated file management systems.
- Another example of operating system software with associated file management system software is the Linux™ operating system and its associated file management system.
- the file management system is typically stored in the non-volatile memory and/or drive unit and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile memory and/or drive unit.
- the technology as described herein may have also been described, at least in part, in terms of one or more embodiments.
- An embodiment of the technology as described herein is used herein to illustrate an aspect thereof, a feature thereof, a concept thereof, and/or an example thereof.
- a physical embodiment of an apparatus, an article of manufacture, a machine, and/or of a process that embodies the technology described herein may include one or more of the aspects, features, concepts, examples, etc. described with reference to one or more of the embodiments discussed herein.
- the embodiments may incorporate the same or similarly named functions, steps, modules, etc. that may use the same or different reference numbers and, as such, the functions, steps, modules, etc. may be the same or similar functions, steps, modules, etc. or different ones.
TABLE 1 — brick types and standard dimensions

| Brick Type | Actual Size D × H × L (inches) |
| --- | --- |
| Modular | 3½ × 2¼ × 7½ |
| Norman | 3½ × 2¼ × 11½ |
| Roman | 3½ × 1¼ × 11½ |
| Jumbo | 3½ × 2½ × 8 |
| Economy | 3½ × 3½ × 7½ |
| Engineer | 3½ × 2¾ × 7½ |
| King | 3 × 2¾ × 9¾ |
| Queen | 3 × 2¾ × 8 |
| Utility | 3½ × 3½ × 11½ |
Claims (14)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/578,964 US11030823B2 (en) | 2014-01-31 | 2019-09-23 | Adjustment of architectural elements relative to facades |
US16/846,260 US11017612B2 (en) | 2014-01-31 | 2020-04-10 | Multi-dimensional model dimensioning and scale error correction |
US17/306,787 US11676243B2 (en) | 2014-01-31 | 2021-05-03 | Multi-dimensional model reconstruction |
US18/307,246 US20230274390A1 (en) | 2014-01-31 | 2023-04-26 | Multi-dimensional model reconstruction |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461934541P | 2014-01-31 | 2014-01-31 | |
US14/610,850 US9478031B2 (en) | 2014-01-31 | 2015-01-30 | Scale error correction in a multi-dimensional model |
US15/332,481 US9830742B2 (en) | 2014-01-31 | 2016-10-24 | Scale error correction in a multi-dimensional model |
US15/400,718 US9830681B2 (en) | 2014-01-31 | 2017-01-06 | Multi-dimensional model dimensioning and scale error correction |
US15/817,620 US10475156B2 (en) | 2014-01-31 | 2017-11-20 | Multi-dimensional model dimensioning and scale error correction |
US16/538,386 US10515434B2 (en) | 2014-01-31 | 2019-08-12 | Adjustment of architectural elements relative to facades |
US16/578,964 US11030823B2 (en) | 2014-01-31 | 2019-09-23 | Adjustment of architectural elements relative to facades |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/538,386 Continuation US10515434B2 (en) | 2014-01-31 | 2019-08-12 | Adjustment of architectural elements relative to facades |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/846,260 Continuation US11017612B2 (en) | 2014-01-31 | 2020-04-10 | Multi-dimensional model dimensioning and scale error correction |
US17/306,787 Continuation US11676243B2 (en) | 2014-01-31 | 2021-05-03 | Multi-dimensional model reconstruction |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200020074A1 US20200020074A1 (en) | 2020-01-16 |
US11030823B2 true US11030823B2 (en) | 2021-06-08 |
Family
ID=58558609
Family Applications (9)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/400,718 Active US9830681B2 (en) | 2014-01-31 | 2017-01-06 | Multi-dimensional model dimensioning and scale error correction |
US15/817,620 Active US10475156B2 (en) | 2014-01-31 | 2017-11-20 | Multi-dimensional model dimensioning and scale error correction |
US15/817,755 Active US10297007B2 (en) | 2014-01-31 | 2017-11-20 | Multi-dimensional model dimensioning and scale error correction |
US16/280,169 Active US10453177B2 (en) | 2014-01-31 | 2019-02-20 | Multi-dimensional model dimensioning and scale error correction |
US16/538,386 Active US10515434B2 (en) | 2014-01-31 | 2019-08-12 | Adjustment of architectural elements relative to facades |
US16/578,964 Active US11030823B2 (en) | 2014-01-31 | 2019-09-23 | Adjustment of architectural elements relative to facades |
US16/846,260 Active US11017612B2 (en) | 2014-01-31 | 2020-04-10 | Multi-dimensional model dimensioning and scale error correction |
US17/306,787 Active US11676243B2 (en) | 2014-01-31 | 2021-05-03 | Multi-dimensional model reconstruction |
US18/307,246 Pending US20230274390A1 (en) | 2014-01-31 | 2023-04-26 | Multi-dimensional model reconstruction |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/400,718 Active US9830681B2 (en) | 2014-01-31 | 2017-01-06 | Multi-dimensional model dimensioning and scale error correction |
US15/817,620 Active US10475156B2 (en) | 2014-01-31 | 2017-11-20 | Multi-dimensional model dimensioning and scale error correction |
US15/817,755 Active US10297007B2 (en) | 2014-01-31 | 2017-11-20 | Multi-dimensional model dimensioning and scale error correction |
US16/280,169 Active US10453177B2 (en) | 2014-01-31 | 2019-02-20 | Multi-dimensional model dimensioning and scale error correction |
US16/538,386 Active US10515434B2 (en) | 2014-01-31 | 2019-08-12 | Adjustment of architectural elements relative to facades |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/846,260 Active US11017612B2 (en) | 2014-01-31 | 2020-04-10 | Multi-dimensional model dimensioning and scale error correction |
US17/306,787 Active US11676243B2 (en) | 2014-01-31 | 2021-05-03 | Multi-dimensional model reconstruction |
US18/307,246 Pending US20230274390A1 (en) | 2014-01-31 | 2023-04-26 | Multi-dimensional model reconstruction |
Country Status (1)
Country | Link |
---|---|
US (9) | US9830681B2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8977520B2 (en) * | 2010-10-21 | 2015-03-10 | Pictometry International Corp. | Computer system for automatically classifying roof elements |
US10867328B2 (en) * | 2016-05-03 | 2020-12-15 | Yembo, Inc. | Systems and methods for providing AI-based cost estimates for services |
US10555181B2 (en) * | 2017-05-03 | 2020-02-04 | ARRIS Enterprises, LLC | Determining transceiver height |
US11514644B2 (en) | 2018-01-19 | 2022-11-29 | Enphase Energy, Inc. | Automated roof surface measurement from combined aerial LiDAR data and imagery |
CN108961395B (en) * | 2018-07-03 | 2019-07-30 | 上海亦我信息技术有限公司 | A method of three dimensional spatial scene is rebuild based on taking pictures |
US11775700B2 (en) | 2018-10-04 | 2023-10-03 | Insurance Services Office, Inc. | Computer vision systems and methods for identifying anomalies in building models |
JP7052670B2 (en) * | 2018-10-18 | 2022-04-12 | 日本電信電話株式会社 | Ground-based estimation method, ground-based estimation device and program |
CN110176060B (en) * | 2019-04-28 | 2020-09-18 | 华中科技大学 | Dense three-dimensional reconstruction method and system based on multi-scale geometric consistency guidance |
JP7363096B2 (en) * | 2019-05-23 | 2023-10-18 | 富士フイルムビジネスイノベーション株式会社 | Image processing device and image processing program |
US11094135B1 (en) | 2021-03-05 | 2021-08-17 | Flyreel, Inc. | Automated measurement of interior spaces through guided modeling of dimensions |
CN114880741B (en) * | 2022-04-25 | 2023-01-31 | 清华大学 | Building structure component size design method and device embedded with domain knowledge |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3516856B2 (en) * | 1998-01-30 | 2004-04-05 | 富士重工業株式会社 | Outside monitoring device |
DE19926559A1 (en) * | 1999-06-11 | 2000-12-21 | Daimler Chrysler Ag | Method and device for detecting objects in the vicinity of a road vehicle up to a great distance |
US8346578B1 (en) * | 2007-06-13 | 2013-01-01 | United Services Automobile Association | Systems and methods for using unmanned aerial vehicles |
JP4875654B2 (en) | 2008-04-11 | 2012-02-15 | 三菱重工業株式会社 | Supercharger |
US9727834B2 (en) | 2011-06-08 | 2017-08-08 | Jerome Reyes | Remote measurement via on-site portable platform |
FI20125644L (en) * | 2012-06-12 | 2013-12-13 | Tekla Corp | Computer-aided modeling |
US20140023996A1 (en) * | 2012-07-18 | 2014-01-23 | F3 & Associates, Inc. | Three Dimensional Model Objects |
US9020191B2 (en) * | 2012-11-30 | 2015-04-28 | Qualcomm Incorporated | Image-based indoor position determination |
US8923650B2 (en) * | 2013-01-07 | 2014-12-30 | Wexenergy Innovations Llc | System and method of measuring distances related to an object |
FR3008805B3 (en) * | 2013-07-16 | 2015-11-06 | Fittingbox | METHOD FOR DETERMINING OCULAR MEASUREMENTS WITH A CONSUMER SENSOR |
US9613388B2 (en) * | 2014-01-24 | 2017-04-04 | Here Global B.V. | Methods, apparatuses and computer program products for three dimensional segmentation and textured modeling of photogrammetry surface meshes |
US9171403B2 (en) * | 2014-02-13 | 2015-10-27 | Microsoft Technology Licensing, Llc | Contour completion for augmenting surface reconstructions |
2017
- 2017-01-06 US US15/400,718 patent/US9830681B2/en active Active
- 2017-11-20 US US15/817,620 patent/US10475156B2/en active Active
- 2017-11-20 US US15/817,755 patent/US10297007B2/en active Active

2019
- 2019-02-20 US US16/280,169 patent/US10453177B2/en active Active
- 2019-08-12 US US16/538,386 patent/US10515434B2/en active Active
- 2019-09-23 US US16/578,964 patent/US11030823B2/en active Active

2020
- 2020-04-10 US US16/846,260 patent/US11017612B2/en active Active

2021
- 2021-05-03 US US17/306,787 patent/US11676243B2/en active Active

2023
- 2023-04-26 US US18/307,246 patent/US20230274390A1/en active Pending
Patent Citations (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4798028A (en) | 1987-11-30 | 1989-01-17 | Pinion John A | Downspout trap and clean out |
US5973697A (en) | 1997-01-27 | 1999-10-26 | International Business Machines Corporation | Method and system for providing preferred face views of objects in a three-dimensional (3D) environment in a display in a computer system |
US20030052896A1 (en) | 2000-03-29 | 2003-03-20 | Higgins Darin Wayne | System and method for synchronizing map images |
US7218318B2 (en) | 2000-12-14 | 2007-05-15 | Nec Corporation | Server and client for improving three-dimensional air excursion and method and programs thereof |
US20030014224A1 (en) | 2001-07-06 | 2003-01-16 | Yanlin Guo | Method and apparatus for automatically generating a site model |
US20040196282A1 (en) | 2003-02-14 | 2004-10-07 | Oh Byong Mok | Modeling and editing image panoramas |
US7814436B2 (en) | 2003-07-28 | 2010-10-12 | Autodesk, Inc. | 3D scene orientation indicator system with scene orientation change capability |
US20060037279A1 (en) | 2004-07-30 | 2006-02-23 | Dean Onchuck | Dormer calculator |
US8040343B2 (en) | 2005-03-02 | 2011-10-18 | Navitime Japan Co., Ltd. | Map display device and map display method |
US20080221843A1 (en) | 2005-09-01 | 2008-09-11 | Victor Shenkar | System and Method for Cost-Effective, High-Fidelity 3D-Modeling of Large-Scale Urban Environments |
US8098899B2 (en) | 2005-11-14 | 2012-01-17 | Fujifilm Corporation | Landmark search system for digital camera, map data, and method of sorting image data |
US20070168153A1 (en) | 2006-01-13 | 2007-07-19 | Digicontractor Corporation | Method and apparatus for photographic measurement |
WO2007147830A1 (en) | 2006-06-19 | 2007-12-27 | Jochen Hummel | Method for producing a three-dimensional computer model of a town |
US20100074532A1 (en) | 2006-11-21 | 2010-03-25 | Mantisvision Ltd. | 3d geometric modeling and 3d video content creation |
US20090043504A1 (en) | 2007-05-31 | 2009-02-12 | Amrit Bandyopadhyay | System and method for locating, tracking, and/or monitoring the status of personnel and/or assets both indoors and outdoors |
US20100214291A1 (en) | 2007-07-27 | 2010-08-26 | ETH Zürich | Computer system and method for generating a 3d geometric model |
US8350850B2 (en) | 2008-03-31 | 2013-01-08 | Microsoft Corporation | Using photo collections for three dimensional modeling |
US20100045869A1 (en) | 2008-08-19 | 2010-02-25 | Sony Computer Entertainment Europe Ltd. | Entertainment Device, System, and Method |
US20100114537A1 (en) | 2008-10-31 | 2010-05-06 | Eagle View Technologies, Inc. | Concurrent display systems and methods for aerial roof estimation |
US20100110074A1 (en) | 2008-10-31 | 2010-05-06 | Eagle View Technologies, Inc. | Pitch determination systems and methods for aerial roof estimation |
US8139111B2 (en) | 2008-12-04 | 2012-03-20 | The Boeing Company | Height measurement in a perspective image |
US20100201682A1 (en) * | 2009-02-06 | 2010-08-12 | The Hong Kong University Of Science And Technology | Generating three-dimensional façade models from images |
US20120041722A1 (en) * | 2009-02-06 | 2012-02-16 | The Hong Kong University Of Science And Technology | Generating three-dimensional models from images |
US8390617B1 (en) | 2009-06-04 | 2013-03-05 | Google Inc. | Visualizing oblique images |
US20110029897A1 (en) | 2009-07-31 | 2011-02-03 | Siemens Corporation | Virtual World Building Operations Center |
US20110064312A1 (en) | 2009-09-14 | 2011-03-17 | Janky James M | Image-based georeferencing |
US20130195362A1 (en) | 2009-09-14 | 2013-08-01 | Trimble Navigation Limited | Image-based georeferencing |
WO2011079241A1 (en) | 2009-12-23 | 2011-06-30 | Tomtom International Bv | Method of generating building facade data for a geospatial database for a mobile device |
US20110181589A1 (en) | 2010-01-28 | 2011-07-28 | The Hong Kong University Of Science And Technology | Image-based procedural remodeling of buildings |
WO2011091552A1 (en) | 2010-02-01 | 2011-08-04 | Intel Corporation | Extracting and mapping three dimensional features from geo-referenced images |
US20120182392A1 (en) | 2010-05-20 | 2012-07-19 | Irobot Corporation | Mobile Human Interface Robot |
US20120224770A1 (en) | 2011-03-02 | 2012-09-06 | Harman Becker Automotive Systems Gmbh | System for floor number determination in buildings |
US8339394B1 (en) | 2011-08-12 | 2012-12-25 | Google Inc. | Automatic method for photo texturing geolocated 3-D models from geolocated imagery |
US20130202157A1 (en) | 2012-02-03 | 2013-08-08 | Chris Pershing | Systems and methods for estimation of building wall area |
US20130257850A1 (en) | 2012-03-30 | 2013-10-03 | Honeywell International Inc. | Extracting data from a 3d geometric model by geometry analysis |
US20140212028A1 (en) | 2013-01-31 | 2014-07-31 | Eagle View Technologies, Inc. | Statistical point pattern matching technique |
US20150086084A1 (en) | 2013-09-25 | 2015-03-26 | Maynard C. Falconer | Systems and methods for mapping |
US9478031B2 (en) | 2014-01-31 | 2016-10-25 | Hover Inc. | Scale error correction in a multi-dimensional model |
US20160350969A1 (en) | 2015-05-29 | 2016-12-01 | Hover Inc. | Graphical overlay guide for interface |
Non-Patent Citations (19)
Title |
---|
Murillo et al., "Visual Door Detection Integrating Appearance and Shape Cues", 2008, Elsevier, vol. 56, No. 6, pp. 512-521 (Year: 2008). |
Scope Technologies, "Solutions," Mar. 4, 2014, pp. 1-2, downloaded from the internet: [http://www.myscopetech.com/solutions.php]. |
Abdul Hasanulhakeem1; A tool to measure dimensions of buildings in various scales for Google Earth Plug-ins and 3D maps; Aug. 6, 2010; pp. 1-2, downloaded from the internet: [https://groups.google.com/forum/#topic/google-earth-browser-plugin/frlvZQ-m38l]. |
Bansal, et al., "Geo-Localization of Street Views with Aerial Image Databases," Nov. 28-Dec. 1, 2011, pp. 1125-1128. |
Becker, et al., "Semiautomatic 3-D model extraction from uncalibrated 2-D camera views," MIT Media Laboratory, Apr. 1995, 15 pages. |
Chen, et al., "City-Scale Landmark Identification on Mobile Devices," Jul. 2011, pp. 737-744. |
Fairfax County Virginia, "Virtual Fairfax," http://www.fairfaxcounty.gov/gis/virtualfairfax, Feb. 24, 2014; 2 pages. |
Fruh and Zakhor, "Constructing 3D City Models by Merging Aerial and Ground Views," IEEE Computer Graphics and Applications, Nov./Dec. 2003, pp. 52-61, 10 pages. |
Huang and Wu, et al., "Towards 3D City Modeling through Combining Ground Level Panoramic and Orthogonal Aerial Imagery," 2011 Workshop on Digital Media and Digital Content Management, pp. 66-71, 6 pages. |
Jaynes, "View Alignment of Aerial and Terrestrial Imagery in Urban Environments," Springer-Verlag Berlin Heidelberg 1999, pp. 3-19, 17 pages. |
Kroepfl, et al., "Efficiently Locating Photographs in Many Panoramas," Nov. 2-5, 2010, ACM GIS '10. |
Lee, et al., "Automatic Integration of Facade Textures into 3D Building Models with a Projective Geometry Based Line Clustering," Eurographics 2002, vol. 2, No. 3, 10 pages. |
Lee, et al., "Integrating Ground and Aerial Views for Urban Site Modeling," 2002; 6 pages. |
Ou et al., "A New Method for Automatic Large Scale Map Updating Using Mobile Mapping Imagery", Sep. 2013, Wiley Online Library, vol. 28, No. 143, pp. 240-260. |
Pu et al., "Automatic Extraction of Building Features From Terrestrial Laser Scanning," 2006, International Institute for Geo-information Science and Earth Observation, 5 pages. |
SketchUp Knowledge Base, Tape Measure Tool: Scaling an entire model, http://help.sketchup.com/en/article/95006, 2013 Trimble Navigation Limited, 2 pages. |
Vosselman et al., "3D Building Model Reconstruction From Point Clouds and Ground Plans", Oct. 2001, Natural Resources Canada, vol. 34, No. 3/W4, pp. 37-44. |
Wang, et al.; Large-Scale Urban Modeling by Combining Ground Level Panoramic and Aerial Imagery; IEEE Third International Symposium on 3D Data Processing, Visualization, and Transmission; Jun. 14-16, 2006; pp. 806-813. |
Xiao, et al., "Image-based Facade Modeling," ACM Transactions on Graphics (TOG), 2008, 10 pages. |
Also Published As
Publication number | Publication date |
---|---|
US10475156B2 (en) | 2019-11-12 |
US20210256777A1 (en) | 2021-08-19 |
US20230274390A1 (en) | 2023-08-31 |
US20170116707A1 (en) | 2017-04-27 |
US20190362468A1 (en) | 2019-11-28 |
US20200020074A1 (en) | 2020-01-16 |
US10453177B2 (en) | 2019-10-22 |
US20180089797A1 (en) | 2018-03-29 |
US11017612B2 (en) | 2021-05-25 |
US20190180412A1 (en) | 2019-06-13 |
US9830681B2 (en) | 2017-11-28 |
US11676243B2 (en) | 2023-06-13 |
US20180075580A1 (en) | 2018-03-15 |
US10515434B2 (en) | 2019-12-24 |
US10297007B2 (en) | 2019-05-21 |
US20200279350A1 (en) | 2020-09-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11676243B2 (en) | Multi-dimensional model reconstruction | |
US9830742B2 (en) | Scale error correction in a multi-dimensional model | |
US10902672B2 (en) | 3D building analyzer | |
US11113877B2 (en) | Systems and methods for generating three dimensional geometry | |
US20210314484A1 (en) | Directed image capture | |
US9805451B2 (en) | Building material classifications from imagery | |
US11922570B2 (en) | Estimating dimensions of geo-referenced ground-level imagery using orthogonal imagery | |
US10133830B2 (en) | Scaling in a multi-dimensional building model | |
US11935188B2 (en) | 3D building analyzer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

AS | Assignment
Owner name: HOVER INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:UPENDRAN, MANISH;ALTMAN, ADAM J.;HALLIDAY, DEREK;REEL/FRAME:050467/0666
Effective date: 20170109

FEPP | Fee payment procedure
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP | Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED

AS | Assignment
Owner name: SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT, CALIFORNIA
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:HOVER INC.;REEL/FRAME:052229/0972
Effective date: 20200325
Owner name: SILICON VALLEY BANK, CALIFORNIA
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:HOVER INC.;REEL/FRAME:052229/0986
Effective date: 20200325

STPP | Information on status: patent application and granting procedure in general
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP | Information on status: patent application and granting procedure in general
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF | Information on status: patent grant
Free format text: PATENTED CASE

AS | Assignment
Owner name: HOVER INC., CALIFORNIA
Free format text: TERMINATION AND RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:SILICON VALLEY BANK;REEL/FRAME:056423/0179
Effective date: 20210527
Owner name: HOVER INC., CALIFORNIA
Free format text: TERMINATION AND RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:SILICON VALLEY BANK, AS AGENT;REEL/FRAME:056423/0189
Effective date: 20210527
Owner name: SILICON VALLEY BANK, CALIFORNIA
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:HOVER INC.;REEL/FRAME:056423/0199
Effective date: 20210527
Owner name: SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT, CALIFORNIA
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:HOVER INC.;REEL/FRAME:056423/0222
Effective date: 20210527

AS | Assignment
Owner name: HOVER INC., CALIFORNIA
Free format text: TERMINATION OF INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:SILICON VALLEY BANK, AS AGENT;REEL/FRAME:061622/0761
Effective date: 20220930
Owner name: HOVER INC., CALIFORNIA
Free format text: TERMINATION OF INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:SILICON VALLEY BANK;REEL/FRAME:061622/0741
Effective date: 20220930