GB2494697A - Viewing home decoration using markerless augmented reality - Google Patents
- Publication number
- GB2494697A GB2494697A GB1116101.5A GB201116101A GB2494697A GB 2494697 A GB2494697 A GB 2494697A GB 201116101 A GB201116101 A GB 201116101A GB 2494697 A GB2494697 A GB 2494697A
- Authority
- GB
- United Kingdom
- Prior art keywords
- furniture
- text
- items
- augmented reality
- room
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 230000003190 augmentative effect Effects 0.000 title claims abstract description 21
- 238000005034 decoration Methods 0.000 title description 4
- 238000000034 method Methods 0.000 claims description 19
- 239000000463 material Substances 0.000 claims description 5
- 238000010586 diagram Methods 0.000 description 5
- 230000005057 finger movement Effects 0.000 description 5
- 239000011159 matrix material Substances 0.000 description 5
- 230000009466 transformation Effects 0.000 description 3
- 239000003086 colorant Substances 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 239000004744 fabric Substances 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/04—Architectural design, interior design
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention allows anyone with a compatible portable device to visualise photorealistic 3D models of furniture and home design items in their room using markerless augmented reality. It allows consumers to view products in their own home or office space, making it easier for them to decide which product to purchase. The invention is designed to run on devices such as PDAs, smartphones and tablets, displaying in real time a video feed of the user's room or office space while superimposing three-dimensional furniture models and home design items. No markers or other special targets are needed: the user positions and rotates the furniture items with finger gestures on the display, while an accelerometer tracks the pitch angle of the portable device and adjusts the position of the furniture items accordingly. The aim of the invention is to help consumers and anyone interested in buying new furniture to make the most informed decision possible and avoid disappointment after the purchase.
Description
Title: Method for home decoration using markerless augmented reality
Background of the Invention
The invention relates to a method with which furniture and home design items are displayed on the display of a portable device.
Conventional catalogues from furniture retailers show the items either isolated against a uniform background colour or within a staged room alongside other furniture items. Regardless of which method is used, consumers need to rely on their ability to imagine how the furniture item would look in their own living room, bedroom or office space. This decision-making process is even more difficult when there are multiple furniture materials and colours to choose from. Ultimately this uncertainty leads to frustration and disappointment when consumers receive their furniture item and realise that the style, colour and material of the item do not match their room or office decoration.
Statement of Invention
To overcome this uncertainty with furniture purchases, the invention proposes a method that utilises markerless augmented reality to visualise realistic three-dimensional models of furniture and home design items on top of a real-time video feed, whereby the video feed is captured by a compatible portable device and shows the user's environment in which the furniture item is going to be placed.
Advantages: The invention allows everyone with a compatible portable device to visualise photorealistic 3D models of furniture and home design items in their room using markerless augmented reality. It allows users to view products in their own home or office space, making it easier for them to decide which product to purchase.
The invention is designed to run on devices such as Personal Digital Assistants (PDAs), smartphones and tablets, displaying in real time a video feed of the user's room or office space while superimposing three-dimensional furniture models and home design items.
The invention presented has the following distinguishing and novel characteristics in comparison to previous inventions that use augmented reality: * Most existing augmented reality methods require the user to place printed targets on the floor where the item is going to be positioned. In contrast, the proposed invention does not rely on any targets and makes use of a markerless augmented reality process that relies on the user's input to position furniture models on top of the video feed.
* In contrast to previous augmented reality attempts that use two-dimensional objects, the invention presented makes use of full three-dimensional objects that users can rotate and position within a three-dimensional object space.
* Reliance on sensor input from an accelerometer to track the pitch angle of the portable device, combined with touch gestures on the display to adjust the position and orientation of the furniture items accordingly.
* Using the extended form of the collinearity condition with additional self-calibration parameters to accurately determine the relationship between the object space and the coordinate system of the video sensor. This ensures an accurate registration of the three dimensional furniture models with the real time video feed.
Introduction to Drawings
The invention is further described through a number of drawings which illustrate the technology schematically. The drawings are given for illustrative purposes only and do not limit the presented invention.
Fig. 1 shows a diagram of overall system architecture with inputs and outputs.
Fig. 2 shows a diagram representing the real time video feed of a room, acquired from the video sensor of a portable device.
Fig. 3 shows a diagram representing the functional model for projecting a 3D furniture model from the 3D object space onto the focal plane of the video sensor model.
Fig. 4 shows a diagram representing the output of the augmented reality engine by superimposing the projected 3D furniture model on the real time video feed.
Detailed description of the Invention
The invention is designed to provide an augmented reality view whereby the user can place 3D furniture models and other home decoration items on top of a real time video feed as shown in Fig.1. The diagram in Fig.1 provides an overview of the invention by illustrating the three main components required to achieve the augmented reality view. These are the hardware, processing and data components.
The hardware components are responsible for interacting with the sensors available in compatible mobile devices. The first hardware component shown in Fig.1 is the Video Sensor VS, which is responsible for interacting with the camera installed in compatible mobile devices to retrieve the real time video feed of a room in front of the camera. The second hardware component shown in Fig.1 is the user's Finger Movement FM on the display of the portable device. The third hardware component required is the input from an Accelerometer A. The data components section in Fig.1 illustrates the necessary data required for this invention, namely the 3D Furniture Models 3DFM and the Video Sensor Model SM. The Sensor Model SM describes the functional model whereby the perspective geometry of the camera is defined. More details about the Sensor Model SM will be given in the following sections.
The processing components are responsible for managing the input from the hardware and data component sections in order to create the augmented reality view. The processing components in Fig.1 show two layers, Layer 1 (L1) and Layer 2 (L2). L1 is formed from the raw data received from the Video Sensor VS and L2 is formed using the 3D Furniture Model 3DFM and the Sensor Model SM. The Augmented Reality Engine ARE shown in the processing components is responsible for updating L2 depending on the user's Finger Movement FM and orientation changes from the Accelerometer A before superimposing L2 on L1 to form the augmented reality view. The Augmented Reality Engine ARE performs this function at a rate of 40-50 frames per second and updates the Display D at the same rate.
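For illustration, the compositing loop just described could be sketched as follows in Python. This is a minimal sketch under stated assumptions: all function names and the loop structure are hypothetical, as the patent does not specify an implementation.

```python
# Minimal sketch of the Augmented Reality Engine (ARE) loop.
# All names are illustrative assumptions, not part of the patent.
import time

def capture_video_frame():
    """Stand-in for the Video Sensor VS: returns the current camera frame (L1)."""
    return "video-frame"

def render_furniture_layer(pitch, yaw, translation):
    """Stand-in for forming L2: project the 3DFM using the Sensor Model SM."""
    return f"3DFM(pitch={pitch:.2f}, yaw={yaw:.2f}, t={translation})"

def run_engine(read_pitch, read_gestures, display, fps=45, frames=100):
    """Update L2 from the accelerometer pitch and finger gestures, then
    superimpose it on the live video layer L1 at roughly 40-50 fps."""
    yaw, translation = 0.0, (0.0, 0.0)
    for _ in range(frames):
        pitch = read_pitch()                                 # Accelerometer A
        yaw, translation = read_gestures(yaw, translation)   # Finger Movement FM
        l1 = capture_video_frame()                           # Layer 1: video feed
        l2 = render_furniture_layer(pitch, yaw, translation) # Layer 2: 3DFM
        display((l1, l2))                                    # superimpose L2 on L1
        time.sleep(1.0 / fps)                                # pace the Display D

# Example wiring with dummy sensors:
run_engine(read_pitch=lambda: 0.1,
           read_gestures=lambda yaw, t: (yaw, t),
           display=print, frames=3)
```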
Forming L1 for a room

The process of obtaining a real time feed of a Room R to create L1 is shown in Fig.2. The user initially positions the compatible mobile device inside the Room R and points the camera towards the location where the furniture item is to be placed. The projected image of the Room R is formed on the CCD array of the Video Sensor VS. The pinhole camera on the mobile device follows the functional model of perspective geometry. As such, the projected image of the Room R on the CCD array of the Video Sensor VS should be rotated by 180 degrees around the perspective centre of the Video Sensor VS and then scaled to match the dimensions of the Display D. Scaling the image of the Room R on the CCD array can be achieved using an affine transformation, since the scale factors for the X and Y axes can differ when the aspect ratio of the CCD array does not match the aspect ratio of the Display D. The affine transformation for forming L1 from the projected Room R on the CCD is given below.
x' = a·x + b·y + c    (1)
y' = d·x + e·y + f

where
a = S1·cos 180°
b = −S1·sin 180°
d = S2·sin 180°
e = S2·cos 180°

S1 is the scale factor between the CCD of the Video Sensor and the Display D for the X axis and S2 the scale factor for the Y axis. Using Eq. 1 the four corner points (x, y) of the Video Sensor VS are transformed to the four corners of L1 as shown in Fig.2. This process is repeated at a rate of 30 frames per second with L1 updated accordingly.
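As a concrete illustration, Eq. 1 applied to the four CCD corners might look like the Python sketch below; the CCD and display dimensions are made-up example values, not taken from the patent.

```python
import math

def affine_l1(x, y, s1, s2, c=0.0, f=0.0):
    """Apply Eq. 1: a 180-degree rotation about the perspective centre plus
    per-axis scaling, mapping a CCD point (x, y) to display space."""
    a = s1 * math.cos(math.pi)    # cos 180 deg = -1, so a = -s1
    b = -s1 * math.sin(math.pi)   # sin 180 deg ~  0, so b ~ 0
    d = s2 * math.sin(math.pi)    # d ~ 0
    e = s2 * math.cos(math.pi)    # e = -s2
    return a * x + b * y + c, d * x + e * y + f

# Assumed example: a 4.8 mm x 3.6 mm CCD scaled to a 960 x 720 pixel display.
s1, s2 = 960 / 4.8, 720 / 3.6
for corner in [(-2.4, -1.8), (2.4, -1.8), (2.4, 1.8), (-2.4, 1.8)]:
    print(corner, "->", affine_l1(*corner, s1, s2))
```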
Forming L2

Layer 2, L2, as mentioned previously is formed using two components: the 3D Furniture Model 3DFM and the Sensor Model SM. Fig.3 shows the process of creating L2 for a given 3DFM. The 3DFM shown in Fig.3 is modelled in the object space, whereby the object space is defined by a Cartesian coordinate system with axes X, Y, Z and units defined in the metric system, i.e. in metres. Therefore the 3DFM should be an accurate representation of the actual furniture item, not only when it comes to the material and fabric but also for its shape, size and dimensions. The Sensor Model SM is responsible for simulating the geometric characteristics of the Video Sensor VS used to capture the real time video feed of the Room R. Simulating the perspective geometry of the Video Sensor VS enables an accurate superimposition of L2 on L1. The Sensor Model SM as shown in Fig.3 relates the pixel coordinate system, as defined by the CCD array of pixels, to the image coordinate system. The image coordinate system is represented, as shown in Fig.3, by the focal plane of the Sensor Model SM, the Perspective Centre of the lens PC, the Principal Point PP and the Focal Distance FD. The PP is formed on the focal plane by the optical axis that passes through the Perspective Centre PC. The invention assumes the lens of the Video Sensor VS is represented by a single point in space, commonly referred to as the Perspective Centre PC, through which all light rays pass. The Focal Distance FD is the distance between the Perspective Centre PC and the Principal Point PP. Because of manufacturing imperfections the Principal Point PP is close to, but does not coincide with, the centre of the focal plane shown in Fig.3. The centre of the focal plane is often referred to as the Fiducial Centre FC and the offset between the Fiducial Centre FC and the Principal Point PP is defined as (xFC, yFC). When extending the co-ordinates of a point from the pixel coordinate system to the image co-ordinate system, it becomes:

(xCCD − xFC, yCCD − yFC, −FD)    (2)

where (xCCD, yCCD) are the pixel co-ordinates converted to physical dimensions (millimetres) using the manufacturer's pixel spacing and pixel count across the X and Y axes of the CCD. The parameter FD in Eq. 2 is the Focal Distance FD. The image co-ordinate system has an implicit origin at the Perspective Centre PC while the pixel coordinate system has its origin at the Fiducial Centre FC.
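A short Python sketch of Eq. 2 follows. Placing the Fiducial Centre FC at the geometric centre of the CCD and assuming square pixels are interpretive assumptions consistent with, but not stated in, the description; the numeric values are made up.

```python
def pixel_to_image(col, row, pixel_pitch_mm, n_cols, n_rows, x_fc, y_fc, fd):
    """Extend a pixel co-ordinate to the image co-ordinate system per Eq. 2.
    Assumes the Fiducial Centre FC sits at the CCD centre (an assumption)."""
    # Pixel indices -> physical millimetres relative to the Fiducial Centre FC.
    x_ccd = (col - (n_cols - 1) / 2.0) * pixel_pitch_mm
    y_ccd = (row - (n_rows - 1) / 2.0) * pixel_pitch_mm
    # Shift by the Principal Point offset and append -FD as the third axis.
    return (x_ccd - x_fc, y_ccd - y_fc, -fd)

# Assumed example: 640x480 CCD, 5 micrometre pixels, 4 mm focal distance.
print(pixel_to_image(320, 240, 0.005, 640, 480, x_fc=0.01, y_fc=-0.02, fd=4.0))
```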
The invention determines the parameters of the interior orientation (xFC, yFC and FD) using a process referred to as self-calibration through a bundle block adjustment.
In addition, the Sensor Model SM also takes into account radial lens distortions, which directly affect the accuracy of the registration between the real-time video feed and the projected 3D Furniture Model 3DFM.
Radial lens distortions are significant, especially in consumer grade imaging sensors, and introduce a radial displacement of an imaged point from its theoretically correct position. Radial distortions increase towards the edges of the CCD array. The invention models and corrects the radial distortions by expressing the distortion present at any given point as a polynomial function of odd powers of the radial distance, as shown below:

dr = k1·r³ + k2·r⁵ + k3·r⁷    (3)

where:
dr is the radial distortion of a specific pixel in the CCD array
k1, k2, k3 are the radial distortion coefficients
r is the radial distance away from the Fiducial Centre FC for a specific pixel in the CCD array

The three radial distortion coefficients are included in the Sensor Model SM and are also determined through a bundle block adjustment with self-calibration.
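Eq. 3, together with the correction of Eq. 7 given later, can be sketched as follows. Splitting dr into x and y components along the radial direction is one common photogrammetric interpretation, assumed here; the coefficient values are purely illustrative.

```python
import math

def radial_distortion(r, k1, k2, k3):
    """Evaluate Eq. 3: dr = k1*r^3 + k2*r^5 + k3*r^7 for radial distance r."""
    return k1 * r**3 + k2 * r**5 + k3 * r**7

def correct_point(x, y, k1, k2, k3):
    """Remove the radial displacement from an image point (cf. Eq. 7).
    The x/y split along the radial direction is an assumed interpretation."""
    r = math.hypot(x, y)
    if r == 0.0:
        return x, y
    dr = radial_distortion(r, k1, k2, k3)
    return x - dr * x / r, y - dr * y / r

# Illustrative coefficients only; real values come from self-calibration.
print(correct_point(2.0, 1.5, k1=1e-3, k2=-1e-5, k3=1e-8))
```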
In order to create L2, a functional model is needed to relate the object space to the image co-ordinate system. The object space is where the 3D Furniture Model 3DFM is located, and the image co-ordinate system is defined by the Sensor Model SM as explained previously. Relating the object space to the image co-ordinate system enables the accurate projection of the 3DFM onto the focal plane of the Sensor Model SM. The projection of the 3DFM onto the focal plane of the SM is performed with the use of what is referred to in the field of photogrammetry as the collinearity condition.
The collinearity condition is the functional model that relates image points (pixels on the CCD array) with the equivalent 3D object points of the 3DFM and the parameters of the Sensor Model SM. The collinearity condition and the relationship between the image co-ordinate system and the 3D Furniture Model are represented in Fig.3 and expressed as:

x − xFC = −FD · [m11(X − Xp) + m12(Y − Yp) + m13(Z − Zp)] / [m31(X − Xp) + m32(Y − Yp) + m33(Z − Zp)]    (4)
y − yFC = −FD · [m21(X − Xp) + m22(Y − Yp) + m23(Z − Zp)] / [m31(X − Xp) + m32(Y − Yp) + m33(Z − Zp)]

where:
x, y: the image co-ordinates of a 3D vertex from the object space on the CCD array
xFC, yFC: the offset of the Principal Point PP from the Fiducial Centre FC, determined by the camera calibration process and included in the SM
FD: the Focal Distance FD as determined by the camera calibration process and included in the SM
X, Y, Z: the coordinates of a 3D vertex representing a point on the 3D Furniture Model as defined in the object space
Xp, Yp, Zp: the coordinates of the Perspective Centre PC of the Sensor Model SM. These are always set to zero. As such, the origin of the object space is always defined by the position of the mobile device within the Room R.

The parameters m11, m12, ..., m33 are the nine elements of a 3x3 rotation matrix M. The rotation matrix M is defined by the three sequential rotation angles (ω, φ, κ) of the mobile device. Note that ω represents the tilt angle for roll (clockwise rotation angle around the X axis), φ represents the tilt angle for pitch (clockwise rotation angle around the Y axis), and κ represents the yaw angle (clockwise rotation angle around the Z axis). The invention necessitates the determination of only the pitch angle, since the roll angle is set to zero and the yaw angle is initially set to zero until a two finger rotation occurs on the Display D. The pitch angle is provided by the Accelerometer A of the mobile device as shown in Fig.1. The rotation matrix M is expressed as:

    | m11  m12  m13 |
M = | m21  m22  m23 |    (5)
    | m31  m32  m33 |

With ω and κ set to zero, the rotation matrix is computed as follows:

    | cos φ   0   sin φ |
M = |   0     1     0   |    (6)
    | −sin φ  0   cos φ |

By substituting all known parameters into Eq. 4, the invention computes the image co-ordinates (x, y) of any given 3D vertex of a 3D Furniture Model 3DFM from the object space to the focal plane of the Sensor Model SM. Once the image coordinates are computed, the radial distance from the Fiducial Centre FC is determined and the image co-ordinates are corrected for the radial lens distortions using Eq. 7.
x_corrected = x − dr_x
y_corrected = y − dr_y    (7)

where dr is the computed radial distortion for the given image point using Eq. 3, split into its x and y components. Once the corrected image coordinates of all the visible 3D vertices of the 3D Furniture Model are computed, the projected 3DFM is formed on the focal plane of the Sensor Model. L2 is subsequently formed using the same affine transformation (Eq. 1) to rotate and scale the projected 3D Furniture Model 3DFM. This process is performed at a rate of 40-50 FPS.
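Taken together, Eqs. 4 to 6 can be illustrated with the Python sketch below; the corrected co-ordinates of Eq. 7 would then follow using the earlier distortion sketch. The sign conventions match the matrices reconstructed above, and the vertex and focal distance are made-up values.

```python
import math

def rotation_matrix(phi, kappa=0.0):
    """Rotation matrix M (Eq. 5) with roll fixed to zero; with kappa = 0 it
    reduces to the pitch-only matrix of Eq. 6."""
    cp, sp = math.cos(phi), math.sin(phi)
    ck, sk = math.cos(kappa), math.sin(kappa)
    return [[cp * ck,  sk,   sp * ck],
            [-cp * sk, ck,  -sp * sk],
            [-sp,      0.0,  cp]]

def project_vertex(X, Y, Z, phi, fd, x_fc=0.0, y_fc=0.0, kappa=0.0):
    """Project a 3D vertex of the 3DFM onto the focal plane via the
    collinearity condition (Eq. 4), with the Perspective Centre PC at the
    origin as stated in the description (Xp = Yp = Zp = 0)."""
    m = rotation_matrix(phi, kappa)
    u = m[0][0] * X + m[0][1] * Y + m[0][2] * Z
    v = m[1][0] * X + m[1][1] * Y + m[1][2] * Z
    w = m[2][0] * X + m[2][1] * Y + m[2][2] * Z
    # x - x_fc = -FD * u / w  and  y - y_fc = -FD * v / w
    return x_fc - fd * u / w, y_fc - fd * v / w

# Assumed example: a vertex 2 m in front of the camera, device pitched 10 deg.
print(project_vertex(0.5, 0.2, 2.0, phi=math.radians(10.0), fd=4.0))
```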
Creating and Updating the Augmented Reality View

The background of L2 is transparent and only the 3DFM is rendered with an opaque material, to enable the overlay process. Once both L1 and L2 are formed, the Augmented Reality Engine ARE forms the final augmented reality view by superimposing L2 onto L1 as shown in Fig.4. The ARE is also responsible for updating L2 depending on the user's finger movement on the Display D. When the user performs a one finger movement, the invention tracks the current finger position in relation to the initial finger touch. This translation in the x, y image co-ordinates is converted to an equivalent translation of the 3D vertices in the object space along the XY plane using Eq. 4. A two finger circular movement on the display (shown in Fig.4) changes the κ angle (yaw angle) of the 3D Furniture Model 3DFM and the rotation matrix in Eq. 6 becomes:

    | cos φ·cos κ    sin κ    sin φ·cos κ  |
M = | −cos φ·sin κ   cos κ    −sin φ·sin κ |
    | −sin φ         0        cos φ        |

The Augmented Reality Engine ARE tracks finger movements so that L2 is updated after every translation and rotation of the 3D Furniture Model. Thus the invention gives the user the ability to control the position and orientation of the 3DFM without the need for any specific targets on the floor of the room.
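A minimal sketch of this gesture handling follows. The similar-triangles scaling depth / FD used to map a drag into object space is an assumption for the pinhole model at small pitch; the patent states only that Eq. 4 is used for the conversion, and all numeric values are illustrative.

```python
import math

def drag_to_object_translation(dx_img, dy_img, depth, fd):
    """Convert a one-finger drag in image co-ordinates to an object-space
    translation in the XY plane. The depth / FD scale factor is an assumed
    pinhole-model approximation; units must be consistent (here mm)."""
    scale = depth / fd
    return dx_img * scale, dy_img * scale

def two_finger_rotation(kappa, delta):
    """Accumulate the yaw angle kappa from a two-finger circular gesture."""
    return kappa + delta

# Example: a 0.8 mm drag on the focal plane with the model 2000 mm away
# (FD = 4 mm), followed by a 15-degree two-finger rotation.
print(drag_to_object_translation(0.8, -0.3, depth=2000.0, fd=4.0))
print(math.degrees(two_finger_rotation(0.0, math.radians(15.0))))
```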
Claims (4)
1. A method that utilises markerless augmented reality to visualise realistic three-dimensional models of furniture and home design items on top of a real time video feed, whereby the video feed is captured by a compatible portable device and shows the user's environment in which the furniture item is going to be placed.
2. A method according to claim 1, whereby the furniture items are represented accurately as to their size, shape and material within a three dimensional object space that allows users to rotate and position the furniture items to their desired location.
3. A method according to claim 2, that uses an accelerometer to track the pitch of the portable device and uses the user's finger gestures to adjust the position and orientation of the three dimensional furniture items accordingly.
4. A method according to claim 2, whereby the three dimensional furniture item is accurately projected on the real time video feed using the extended form of the collinearity condition with additional self-calibration parameters to determine the relationship between the object space and the coordinate system of the video sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1116101.5A GB2494697A (en) | 2011-09-17 | 2011-09-17 | Viewing home decoration using markerless augmented reality |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1116101.5A GB2494697A (en) | 2011-09-17 | 2011-09-17 | Viewing home decoration using markerless augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201116101D0 GB201116101D0 (en) | 2011-11-02 |
GB2494697A true GB2494697A (en) | 2013-03-20 |
Family
ID=44937450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1116101.5A Withdrawn GB2494697A (en) | 2011-09-17 | 2011-09-17 | Viewing home decoration using markerless augmented reality |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2494697A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109584022B (en) * | 2018-12-07 | 2022-11-15 | 深圳市易晨虚拟现实技术有限公司 | AR technology-based furniture ornament selective purchasing method and terminal |
- 2011
- 2011-09-17 GB GB1116101.5A patent/GB2494697A/en not_active Withdrawn
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080071559A1 (en) * | 2006-09-19 | 2008-03-20 | Juha Arrasvuori | Augmented reality assisted shopping |
Non-Patent Citations (3)
Title |
---|
http://smartsoftmobile.com/wordpress/2011/01/25/augmented-reality-is-here-think-of-the-possibilities-inside-mobile-commerce-applications/ * |
http://www.youtube.com/watch?feature=player_embedded&v=3A4gevc7ER0&gl=GB * |
http://www.youtube.com/watch?v=byk_cuS920M&feature=player_embedded#! * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150170260A1 (en) * | 2012-02-29 | 2015-06-18 | Google Inc. | Methods and Systems for Using a Mobile Device to Visualize a Three-Dimensional Physical Object Placed Within a Three-Dimensional Environment |
EP2816519A1 (en) * | 2013-06-17 | 2014-12-24 | Spreadtrum Communications (Shanghai) Co., Ltd. | Three-dimensional shopping platform displaying system |
EP2816530A1 (en) * | 2013-06-17 | 2014-12-24 | Spreadtrum Communications (Shanghai) Co., Ltd. | Method for updating three-dimensional shopping platform |
US9626737B2 (en) | 2013-11-15 | 2017-04-18 | Canon Information And Imaging Solutions, Inc. | Devices, systems, and methods for examining the interactions of objects in an enhanced scene |
DE102014217675A1 (en) | 2014-09-04 | 2016-03-24 | Zumtobel Lighting Gmbh | Augmented reality-based lighting system and procedure |
US10089681B2 (en) | 2015-12-04 | 2018-10-02 | Nimbus Visulization, Inc. | Augmented reality commercial platform and method |
US10310596B2 (en) | 2017-05-25 | 2019-06-04 | International Business Machines Corporation | Augmented reality to facilitate accessibility |
US10317990B2 (en) | 2017-05-25 | 2019-06-11 | International Business Machines Corporation | Augmented reality to facilitate accessibility |
US10739847B2 (en) | 2017-05-25 | 2020-08-11 | International Business Machines Corporation | Augmented reality to facilitate accessibility |
US10739848B2 (en) | 2017-05-25 | 2020-08-11 | International Business Machines Corporation | Augmented reality to facilitate accessibility |
US11513672B2 (en) | 2018-02-12 | 2022-11-29 | Wayfair Llc | Systems and methods for providing an extended reality interface |
US11557060B2 (en) | 2018-11-05 | 2023-01-17 | Wayfair Llc | Systems and methods for scanning three-dimensional objects |
WO2020097025A1 (en) * | 2018-11-06 | 2020-05-14 | Carrier Corporation | Real estate augmented reality system |
US11640697B2 (en) | 2018-11-06 | 2023-05-02 | Carrier Corporation | Real estate augmented reality system |
CN111369679A (en) * | 2020-02-10 | 2020-07-03 | 北京城市网邻信息技术有限公司 | Method, device and equipment for decorating three-dimensional house type scene and readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
GB201116101D0 (en) | 2011-11-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2494697A (en) | Viewing home decoration using markerless augmented reality | |
CN111750820B (en) | Image positioning method and system | |
EP1596329B1 (en) | Marker placement information estimating method and information processing device | |
WO2019152617A1 (en) | Calibration system and method to align a 3d virtual scene and 3d real world for a stereoscopic head-mounted display | |
US20120069180A1 (en) | Information presentation apparatus | |
Zhang et al. | A universal and flexible theodolite-camera system for making accurate measurements over large volumes | |
CN104299261A (en) | Three-dimensional imaging method and system for human body | |
JP6615545B2 (en) | Image processing apparatus, image processing method, and image processing program | |
JP2002519791A (en) | Method and apparatus for capturing a stereoscopic image using an image sensor | |
US20130070094A1 (en) | Automatic registration of multi-projector dome images | |
CN105023294B (en) | With reference to the fixed point mobile augmented reality method of sensor and Unity3D | |
WO2020084951A1 (en) | Image processing device and image processing method | |
Tehrani et al. | Automated geometric registration for multi-projector displays on arbitrary 3D shapes using uncalibrated devices | |
JP2023546739A (en) | Methods, apparatus, and systems for generating three-dimensional models of scenes | |
CN108921889A (en) | A kind of indoor 3-D positioning method based on Augmented Reality application | |
Gard et al. | Projection distortion-based object tracking in shader lamp scenarios | |
US20200134927A1 (en) | Three-dimensional display method, terminal device, and storage medium | |
Yu et al. | Calibration for camera–projector pairs using spheres | |
Kahn | Reducing the gap between Augmented Reality and 3D modeling with real-time depth imaging | |
JP6061334B2 (en) | AR system using optical see-through HMD | |
CN110599432B (en) | Image processing system and image processing method | |
JP6022423B2 (en) | Monitoring device and monitoring device control program | |
JPH04102178A (en) | Object model input device | |
JP2011075336A (en) | Three-dimensional shape measuring instrument and method | |
Chandaria et al. | The MATRIS project: real-time markerless camera tracking for augmented reality and broadcast applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |