
CN110798610A - Image display system and image display method - Google Patents


Info

Publication number
CN110798610A
CN110798610A (application CN201810872027.8A)
Authority
CN
China
Prior art keywords
image
indoor
window
house
stereoscopic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810872027.8A
Other languages
Chinese (zh)
Inventor
魏守德
陈韦志
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lite On Electronics Guangzhou Co Ltd
Lite On Technology Corp
Original Assignee
Lite On Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lite On Technology Corp
Priority to CN201810872027.8A
Publication of CN110798610A
Legal status: Pending (current)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an image display system, which comprises a control device, an image capturing device, a processor and a display. The control device obtains a control signal; the image capturing device moves to an observation point according to the control signal and captures a first image at the observation point; the processor generates an indoor image according to a stereoscopic house indoor model, an observation position in the model corresponding to the observation point, and the first image; and the display displays the indoor image. The observation point and the observation position in the stereoscopic house indoor model move synchronously. An image display method is also provided.

Description

Image display system and image display method
[ technical field ]
The present invention relates to an image display method, and more particularly, to an image display system and an image display method capable of simulating an indoor image.
[ background of the invention ]
A builder selling pre-sale houses usually sets up an exhibition center or reception center to show customers the environment, the construction method, and the interior and exterior spaces of the house. When making a pre-sale house model, the builder generally presents the surroundings of the pre-sale house as green space. In reality, however, the neighboring lot may hold another building of the same project, a gas station, a tower, or even a cemetery, which can be a concern for the purchaser.
Besides the surrounding environment, the actual height of the completed house, the view outside the window, and the lighting are also important considerations. For example, the height and the window view of the completed house depend on the distance to neighboring buildings, and if neighboring buildings are too close, the space easily feels oppressive. Good lighting also improves overall indoor comfort, whereas poor lighting leads to insufficient illumination or excessive sun exposure. Therefore, a system that can convey how a house will really feel once it is built is a common goal pursued by both buyers and sellers.
[ summary of the invention ]
In view of the above, the present invention provides an image display system and an image display method, which can simulate real indoor images.
The image display system comprises a control device, an image capturing device, a processor and a display. The control device obtains the control signal. The image capturing device is coupled to the control device, moves to the observation point according to the control signal and captures a first image at the observation point. The processor is coupled to the image capturing device and generates an indoor image according to the stereoscopic house indoor model, the observation position corresponding to the observation point in the stereoscopic house indoor model and the first image. The display is coupled to the processor and displays the indoor image. Wherein, the observation point and the observation position in the indoor model of the three-dimensional house move synchronously.
In an embodiment of the invention, the indoor image includes a window image and a non-window image, the window image corresponds to the first image, and the non-window image corresponds to the stereoscopic house indoor model.
In an embodiment of the invention, when generating the indoor image, the processor determines a window view portion and a non-window view portion of the stereoscopic indoor model according to a window position and an observation position in the stereoscopic indoor model, generates the window image according to the first image, and generates the non-window image according to the non-window view portion.
In an embodiment of the invention, the processor renders the indoor light into the indoor image according to the time information, the current position of the image capturing device, and the position of the window in the stereoscopic house indoor model.
In an embodiment of the invention, the control signal includes a position parameter corresponding to the viewpoint and an angle parameter corresponding to the viewing angle. The control device comprises a positioning device and a direction sensor, wherein the positioning device is used for obtaining a position parameter, and the direction sensor is used for obtaining an angle parameter.
In an embodiment of the invention, when the processor determines that the first image includes another image capturing device, the processor renders the virtual character in the indoor image according to a position of the another image capturing device in the first image.
The image display method of the invention comprises the following steps: obtaining a control signal; controlling an image capturing device to move to an observation point according to the control signal; capturing a first image at the observation point by using the image capturing device; generating an indoor image according to a stereoscopic house indoor model, an observation position in the stereoscopic house indoor model corresponding to the observation point, and the first image, wherein the observation point and the observation position move synchronously; and displaying the indoor image.
In an embodiment of the invention, the indoor image includes a window image and a non-window image, the window image corresponds to the first image, and the non-window image corresponds to the stereoscopic house indoor model.
In an embodiment of the invention, the step of generating the indoor image according to the stereoscopic house indoor model, the observation position of the corresponding observation point in the stereoscopic house indoor model, and the first image includes: determining a window view part and a non-window view part of the three-dimensional house indoor model according to the window position and the observation position in the three-dimensional house indoor model; generating a window image according to the first image; and generating a non-window image based on the non-window viewing portion.
In an embodiment of the invention, the image display method further includes: rendering indoor light in the indoor image according to time information, the current position of the image capturing device, and the position of the window in the stereoscopic house indoor model.
In an embodiment of the present invention, the control signal includes a position parameter corresponding to the observation point and an angle parameter corresponding to the viewing angle, wherein the step of obtaining the control signal includes: obtaining position parameters by using a positioning device; and acquiring the angle parameter by using the direction sensor.
In an embodiment of the invention, the image display method further includes: judging whether the first image comprises another image capturing device; and when the first image comprises another image capturing device, rendering the virtual character in the indoor image according to the position of the other image capturing device in the first image.
Based on the above, the image display system and the image display method provided by the embodiments of the invention utilize the control device to control the image capturing device to obtain an actual image at the observation point, and synthesize the actual image with the stereoscopic house indoor model to generate the indoor image. Thereby, the actual view seen at the observation point can be simulated.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
[ description of the drawings ]
Fig. 1 is a schematic block diagram of an image display system according to an embodiment of the invention.
FIG. 2 is a schematic view illustrating an exemplary embodiment of an image display system.
FIG. 3 is a flowchart illustrating an image display method according to an embodiment of the invention.
FIG. 4 is a schematic diagram illustrating the generation of an indoor image according to the first embodiment of the present invention.
FIG. 5 is a schematic diagram illustrating generation of an indoor image according to a second embodiment of the present invention.
FIG. 6 is a schematic diagram illustrating the generation of an indoor image according to a third embodiment of the present invention.
FIG. 7 is a schematic diagram illustrating an image capture device capturing another image capture device according to an embodiment of the invention.
[ detailed description of embodiments ]
FIG. 1 is a schematic block diagram of an image display system according to an embodiment of the present invention; FIG. 2 is a schematic view illustrating an exemplary embodiment of an image display system.
Referring to fig. 1 and 2, the image display system 100 includes a control device 110, an image capturing device 120, a processor 130 and a display 140. In some embodiments, the control device 110, the processor 130 and the display 140 are located at an exhibition center SC, and the image capturing device 120 flies above a predetermined building site BD_RSV. Through the control device 110 at the exhibition center SC, the user USR controls the viewing position and viewing angle inside a simulated pre-sale house (for example, a stereoscopic house indoor model built by three-dimensional modeling), and through cooperation of the image capturing device 120 and the processor 130, the display 140 displays the indoor image that would be seen from that viewing position and viewing angle in the completed pre-sale house. The stereoscopic house indoor model may include different interior decorations and may be obtained in different ways depending on the application, so the details are not repeated herein. However, the invention is not limited to the usage context of the above embodiment.
The control device 110 is coupled to the image capturing device 120, and receives and transmits a control signal S to control the image capturing device 120 to move to a designated observation point and capture an image there. In some embodiments, the control signal S includes a position parameter for controlling the position the user USR wants to view, and the position parameter includes, for example, a floor parameter and a two-dimensional coordinate parameter of the indoor observation position. When the user USR inputs the floor parameter through the control device 110 to select the floor to be viewed, the image capturing device 120 moves to the height above the predetermined building site BD_RSV corresponding to that floor; when the user USR inputs the coordinate parameter through the control device 110 to move the observation position within a specific floor, the image capturing device 120 correspondingly translates above the predetermined building site BD_RSV at a fixed height. In some embodiments, the control signal S further includes an angle parameter for controlling the viewing angle of the user USR. When the user USR inputs the angle parameter through the control device 110 to adjust the direction of the line of sight, the image capturing device 120 rotates correspondingly. However, the invention is not limited to the specific type of control parameter used to control the movement or rotation of the image capturing device 120. In other embodiments, the control signal S may include only a single three-dimensional coordinate parameter, such as longitude, latitude, and altitude.
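To make the relationship between these control parameters and the movement of the image capturing device 120 concrete, the following is a minimal sketch assuming a control signal that carries a floor parameter, a two-dimensional coordinate parameter and an angle parameter; the field names, the per-floor height and the site origin are illustrative assumptions rather than values taken from this disclosure.

```python
from dataclasses import dataclass

FLOOR_HEIGHT_M = 3.0          # assumed height per floor
SITE_ORIGIN = (23.5, 121.0)   # assumed (lat, lon) of the predetermined building site

@dataclass
class ControlSignal:
    floor: int                # floor parameter
    x: float                  # in-floor coordinate, metres east of the site origin
    y: float                  # in-floor coordinate, metres north of the site origin
    yaw_deg: float            # angle parameter: viewing direction

def control_signal_to_drone_pose(sig: ControlSignal) -> dict:
    """Translate the requested viewpoint into a target pose for the image capturing device."""
    altitude = sig.floor * FLOOR_HEIGHT_M        # floor parameter -> flight height
    return {
        "lat_lon_origin": SITE_ORIGIN,
        "offset_east_m": sig.x,
        "offset_north_m": sig.y,
        "altitude_m": altitude,
        "yaw_deg": sig.yaw_deg % 360.0,
    }
```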
In some embodiments, the control device 110 includes an input device, such as a keyboard, a mouse, a joystick, a microphone, or a touch screen, which can be used by the user USR to control the desired floor, viewing position, or viewing angle, and the image capturing device 120 moves or rotates accordingly.
In some embodiments, the control device 110 includes a positioning device for detecting the position and height of the user USR in the exhibition center SC, and a direction sensor for detecting the orientation of the user USR. In some embodiments, the positioning device includes a plurality of image capturing devices disposed in the exhibition center and a processor; the processor calculates the position and height of the user USR in the exhibition center SC from images including the user USR captured by the plurality of image capturing devices and from the position of each image capturing device, thereby obtaining the position parameter. For example, when the user USR moves from floor F1 to floor F2 in the exhibition center SC, the positioning device obtains the corresponding position parameter, thereby controlling the image capturing device 120 to move from an altitude corresponding to floor F1 to an altitude corresponding to floor F2. In some embodiments, the direction sensor is implemented as, for example, a gyroscope, which acquires the angle parameter when the user wears the direction sensor on the body and rotates. In some embodiments, the positioning device and the direction sensor are implemented together as, for example, a nine-axis sensor, which provides the position parameter or the angle parameter according to the movement or rotation of the user USR wearing it in the exhibition center SC. However, the present invention is not limited thereto.
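As one possible realization of such a camera-based positioning device, the following sketch intersects the bearing rays measured by two fixed cameras in the exhibition center to estimate the user's position in the floor plane; the camera poses and the step of extracting a bearing angle from each captured image are assumptions.

```python
import numpy as np

# Minimal sketch, assuming each fixed camera reports a bearing angle (radians,
# measured in the floor plane) toward the user USR. Camera positions are given
# in exhibition-center coordinates; parallel bearings would make the system singular.

def intersect_bearings(cam_a, theta_a, cam_b, theta_b):
    """Estimate the user's 2D position from two camera bearings."""
    da = np.array([np.cos(theta_a), np.sin(theta_a)])   # ray direction from camera A
    db = np.array([np.cos(theta_b), np.sin(theta_b)])   # ray direction from camera B
    A = np.column_stack([da, -db])
    t = np.linalg.solve(A, np.array(cam_b, float) - np.array(cam_a, float))
    return np.array(cam_a, float) + t[0] * da

# Example: two cameras on opposite walls both sighting the user; result is (5, 5).
user_xy = intersect_bearings((0.0, 0.0), np.pi / 4, (10.0, 0.0), 3 * np.pi / 4)
```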
The image capturing device 120 is coupled to the processor 130, and is configured to acquire an image within its field of view and transmit the acquired image to the processor 130. In some embodiments, the image capturing device 120 is implemented, for example, as a drone that is controlled by the control device 110 to move to a designated observation point OB and capture an image there. In some embodiments, the image capturing device 120 is, for example, a drone carrying a panoramic camera, so as to obtain a full-view image when the drone moves to the designated observation point OB according to the control signal. In some embodiments, the drone may instead carry a rotating platform and a non-panoramic camera, so as to move to the designated observation point OB according to the control signal and rotate to a designated angle to obtain an image covering less than the full viewing angle.
The processor 130 is coupled to the display 140. The processor 130 may be, for example, a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), other similar device, or a combination thereof, and receives the image from the image capturing device 120 and performs calculations to combine the image with the stereoscopic house indoor model, so as to simulate an indoor image closer to the real environment.
The display 140 includes a plurality of display pixels and is, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, or another suitable display, and receives signals from the processor 130 to display two-dimensional or three-dimensional indoor images; the invention is not limited thereto.
In some embodiments, the control device 110, the image capturing device 120, the processor 130, and the display 140 are each implemented as a separate device. In some embodiments, the display 140 is integrated with the control device 110 and the processor 130 as a mobile electronic device, for example an application program on a smart phone. In some embodiments, the display 140 is integrated with the control device 110 as a head-mounted display, for example with a nine-axis sensor integrated in the head-mounted display. However, the invention is not limited to the specific implementation of the image display system 100.
FIG. 3 is a flowchart illustrating an image display method according to an embodiment of the invention.
The image display method in the embodiment of fig. 3 can be performed by cooperation of various components of the image display system 100 in the embodiment of fig. 1, and therefore, various components of the image display system 100 will be referred to in the following to describe the image display method of the embodiment.
In step S110, the control device 110 receives and transmits a control signal S. The details of the control device 110 obtaining the control signal S are described in the foregoing paragraphs for illustrative embodiments, and thus are not repeated herein. In step S120, the image capturing apparatus 120 moves to the observation point OB according to the control signal S, and in step S130, the image capturing apparatus 120 captures a first image IMG1 at the observation point OB. In step S140, the processor 130 generates an indoor image combined with an outdoor real environment image according to the stereoscopic house indoor model, the viewing position OB' of the corresponding viewing point OB in the stereoscopic house indoor model, and the first image IMG 1. In step S150, the display 140 displays the indoor image so that the user USR can see a more realistic appearance.
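Read as pseudocode, steps S110 to S150 amount to the single pass sketched below; the object interfaces (get_control_signal, move_to, capture and so on) are assumed placeholders rather than an actual API of the system.

```python
def image_display_pipeline(control_device, drone, processor, display, model):
    """Hypothetical sketch of one pass through steps S110-S150."""
    signal = control_device.get_control_signal()               # S110: obtain control signal S
    drone.move_to(signal)                                       # S120: move to observation point OB
    first_image = drone.capture()                               # S130: capture first image IMG1
    observation_position = model.position_for(signal)           # OB' kept in sync with OB
    indoor_image = processor.compose_indoor_image(              # S140: combine model and IMG1
        model, observation_position, first_image, signal)
    display.show(indoor_image)                                  # S150: display the indoor image
```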
In some embodiments, the stereoscopic house indoor model is, for example, a stereoscopic model that does not include an indoor layout of furniture or furnishings. In some embodiments, the stereoscopic house indoor model is, for example, a stereoscopic model that includes furniture and furnishings. It should be noted that the invention is not limited to the details or source of the stereoscopic house indoor model, which can be chosen as needed. For example, a single stereoscopic house indoor model may be recorded in a storage device (not shown), or a plurality of stereoscopic house indoor models corresponding to different position parameters may be recorded in the storage device; for instance, a plurality of first position parameters corresponds to a first stereoscopic house indoor model, a plurality of second position parameters corresponds to a second stereoscopic house indoor model, and so on. As another example, the stereoscopic house indoor model may be downloaded by the processor 130 through a network module (not shown).
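A minimal sketch of such a model selection is given here, assuming models keyed by a site identifier and a floor; the identifiers, file names and download URL are purely illustrative.

```python
# Hypothetical lookup of a stereoscopic house indoor model by position parameters.
LOCAL_MODELS = {
    ("site_A", 2): "models/site_A_floor_2.glb",
    ("site_A", 3): "models/site_A_floor_3.glb",
}

def load_indoor_model(site_id: str, floor: int) -> str:
    key = (site_id, floor)
    if key in LOCAL_MODELS:
        return LOCAL_MODELS[key]                        # model recorded in a storage device
    # Otherwise the processor could fetch the model through a network module.
    return f"https://example.com/models/{site_id}/floor_{floor}.glb"
```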
Further, it is worth mentioning that the designated observation point OB corresponds to the observation position OB' in the stereoscopic house indoor model. For example, when the observation point OB is located where the entrance of floor F2 of the pre-sale house will be after completion, the observation position OB' is located at the entrance of floor F2 in the stereoscopic house indoor model. In addition, when the user USR moves the observation position OB' in the stereoscopic house indoor model through the control device 110, the image capturing device 120 moves to the corresponding observation point OB. Therefore, the observation position OB' in the stereoscopic house indoor model moves in synchronization with the observation point OB.
The manner in which the processor 130 generates the indoor image will be described below by way of example. For convenience of representation, the three-dimensional house indoor model will be represented in a two-dimensional cross-section in fig. 4 to 6.
FIG. 4 is a schematic diagram illustrating the generation of an indoor image according to the first embodiment of the present invention.
In some embodiments, the image capture device 120 is, for example, a drone carrying a panoramic camera. The image capturing apparatus 120 moves to the designated viewpoint OB according to the control signal S to capture the full-view image as the first image IMG 1.
As shown in fig. 4, the processor 130 divides the stereoscopic house indoor model SECT into a window view portion R1 and a non-window view portion R2 according to the observation position OB' and the window positions W1, W2, and W3 in the model. In detail, from the observation position OB', a line of sight toward any point in the window view portion R1 reaches one of the window positions W1, W2, W3 without being blocked by other obstacles, whereas a line of sight toward any point in the non-window view portion R2 cannot reach the window positions W1, W2, W3, either because there is no window in that direction or because the line of sight is blocked by furniture, furnishings, or the structure of the stereoscopic house indoor model. Subsequently, the processor 130 makes the observation point OB coincide with the observation position OB', aligns the orientation of the first image IMG1 with the orientation of the stereoscopic house indoor model SECT, and projects and superimposes the non-window view portion R2 onto the first image IMG1 to generate a second image IMG2. More specifically, the generated second image IMG2 is also a full-view image, and includes a window image composed of the original first image IMG1 and a non-window image composed of the non-window view portion of the stereoscopic house indoor model SECT. Then, the processor 130 cuts a partial image corresponding to the field of view FOV from the second image IMG2 as the indoor image according to the angle parameter in the control signal S, so as to display it on the display 140.
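A minimal sketch of this compositing step is shown below, assuming the first image IMG1 and a rendering of the model from OB' are available as equirectangular arrays of the same size, together with a boolean mask marking the window view portion R1; the renderer that produces the model panorama and the mask is assumed, not specified by this disclosure.

```python
import numpy as np

# Minimal sketch: overlay the model's non-window view portion R2 onto the
# captured panorama. window_mask is True where a sight line from OB' passes
# through a window (portion R1) and False elsewhere (portion R2).

def compose_second_image(first_image: np.ndarray,
                         model_panorama: np.ndarray,
                         window_mask: np.ndarray) -> np.ndarray:
    second_image = first_image.copy()
    second_image[~window_mask] = model_panorama[~window_mask]   # non-window image from the model
    return second_image                                          # window pixels keep IMG1
```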
Specifically, the above-mentioned field of view FOV is, for example, the field of view of the indoor image displayed by the display 140. The size of the field of view FOV is therefore related to the specification of the display 140, while the orientation of the field of view FOV is related to the angle parameter. By adjusting the angle parameter, the user can thus control the direction of the field of view FOV and see different indoor images.
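Cutting the field of view FOV out of a full-view image can be sketched as below, assuming an equirectangular panorama whose columns span 0 to 360 degrees of yaw; the horizontal FOV value stands in for the display specification and is an assumption.

```python
import numpy as np

def crop_fov(panorama: np.ndarray, yaw_deg: float, hfov_deg: float = 90.0) -> np.ndarray:
    """Cut the columns of an equirectangular image that fall inside the FOV."""
    h, w = panorama.shape[:2]
    centre = int((yaw_deg % 360.0) / 360.0 * w)               # column matching the viewing direction
    half = int(hfov_deg / 360.0 * w / 2)
    cols = [(centre + dx) % w for dx in range(-half, half)]   # wrap around the 0/360 degree seam
    return panorama[:, cols]
```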
FIG. 5 is a schematic diagram illustrating generation of an indoor image according to a second embodiment of the present invention.
In some embodiments, the image capture device 120 is, for example, a drone carrying a panoramic camera. The image capturing apparatus 120 moves to the designated viewpoint OB according to the control signal S to capture the full-view image as the first image IMG 1.
As shown in fig. 5, the processor 130 first cuts a partial image corresponding to the field of view FOV from the first image IMG1 according to the angle parameter in the control signal S. Then, after making the observation point OB coincide with the observation position OB' and aligning the orientation of the first image IMG1 with the orientation of the stereoscopic house indoor model SECT, the processor 130 obtains the window view portion R1 and the non-window view portion R2 of the model according to the observation position OB', the window positions W1, W2, W3, and the field of view FOV. Finally, the non-window view portion R2 of the stereoscopic house indoor model SECT is projected and superimposed on the partial image corresponding to the field of view FOV cut from the first image IMG1 to form the indoor image for display on the display 140. Specifically, the indoor image includes a window image formed by the partial first image IMG1 and a non-window image formed by the non-window view portion R2 of the stereoscopic house indoor model SECT.
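Reusing the crop_fov function and mask conventions from the earlier sketches, and under the same assumptions, the second embodiment differs from the first mainly in the order of operations: the FOV is cut first, and only the pixels inside it are composited, as the following sketch illustrates.

```python
def compose_indoor_image_second(first_image, model_panorama, window_mask, yaw_deg):
    """Crop-then-overlay variant: composite only inside the field of view FOV."""
    partial_image = crop_fov(first_image, yaw_deg)       # cut the FOV part of IMG1 first
    partial_model = crop_fov(model_panorama, yaw_deg)
    partial_mask = crop_fov(window_mask, yaw_deg)
    indoor_image = partial_image.copy()
    indoor_image[~partial_mask] = partial_model[~partial_mask]
    return indoor_image
```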
FIG. 6 is a schematic diagram illustrating the generation of an indoor image according to a third embodiment of the present invention.
In some embodiments, the image capturing device 120 is, for example, a drone carrying a rotating platform and a non-panoramic camera. The image capturing device 120 moves to the designated observation point OB according to the position parameter of the control signal S and, by rotating the platform to a specific angle (for example, the angle of the line of sight of the user USR) according to the angle parameter of the control signal S, captures the non-full-view first image IMG1.
The non-full-view first image IMG1 of fig. 6 corresponds, for example, to the partial image corresponding to the field of view FOV cut from the first image IMG1 in the embodiment of fig. 5; that is, the image range of the non-full-view first image IMG1 of fig. 6 is the field of view FOV. As shown in fig. 6, after the processor 130 aligns the orientation of the first image IMG1 with the orientation of the stereoscopic house indoor model SECT, the window view portion R1 and the non-window view portion R2 of the model are obtained according to the observation position OB', the window positions W1, W2, W3, and the field of view FOV. Finally, the non-window view portion R2 of the stereoscopic house indoor model SECT is projected and superimposed on the first image IMG1 to form the indoor image for display on the display 140. Specifically, the indoor image includes a window image formed by the first image IMG1 and a non-window image formed by the non-window view portion R2 of the stereoscopic house indoor model SECT.
In this way, the display 140 can simultaneously display the non-window image portion of the indoor image, which comes from the stereoscopic house indoor model, and the window image portion of the indoor image, which comes from the first image IMG1 captured by the image capturing device 120.
In some embodiments, when generating the indoor image, the processor 130 renders light into the indoor image according to time information, the current position of the image capturing device 120, and the position of the window in the stereoscopic house indoor model.
For example, the processor 130 obtains the time information according to the current time or a time input by the user USR, and obtains the current position of the image capturing device 120 according to the position parameter or a positioning system (such as, but not limited to, a global positioning system) provided in the image capturing device 120. Accordingly, the processor 130 can estimate the sunshine direction corresponding to the time information and the current position of the image capturing device 120, and then use an algorithm such as ray tracing to render light in the stereoscopic house indoor model according to the sunshine direction and the position of the window in the model, for example by changing the color temperature of the model or generating light and shadow in it. The processor 130 can then generate the indoor image from the lighted stereoscopic house indoor model according to the methods described in the embodiments of fig. 4 to 6, so as to approximate the real sunshine conditions.
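The sunshine-direction estimate can be approximated from the time information and the drone's latitude and longitude; the sketch below uses a rough solar-position formula (UTC time, solar declination and hour angle, ignoring the equation of time) and an illustrative mapping from sun elevation to color temperature. Neither the constants nor the mapping are taken from this disclosure.

```python
import math
from datetime import datetime, timezone

def approximate_sun_position(when_utc: datetime, lat_deg: float, lon_deg: float):
    """Rough sun elevation and azimuth in degrees; not a precise ephemeris."""
    day = when_utc.timetuple().tm_yday
    decl = math.radians(-23.44) * math.cos(2 * math.pi * (day + 10) / 365.0)   # solar declination
    solar_hour = (when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0) % 24
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(lat_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elevation = math.asin(sin_el)
    azimuth = math.atan2(-math.sin(hour_angle) * math.cos(decl),
                         math.cos(lat) * math.sin(decl)
                         - math.sin(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(elevation), math.degrees(azimuth) % 360.0

def colour_temperature_for(elevation_deg: float) -> int:
    """Illustrative mapping: low sun -> warm light, high sun -> daylight."""
    if elevation_deg <= 0:
        return 4000                     # assumed value for night / artificial light
    return int(4000 + 2500 * min(elevation_deg, 60.0) / 60.0)

# Example: current time plus the drone's GPS fix gives the sunshine direction
# that a ray-tracing step could use when lighting the indoor model.
el, az = approximate_sun_position(datetime.now(timezone.utc), 23.0, 121.0)
```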
In some embodiments, the processor 130 further determines whether another image capturing device appears in the first image IMG1 captured by the image capturing device 120. If so, the processor 130 determines whether to render a virtual character in the indoor image based on the position of the other image capturing device in the first image IMG1.
FIG. 7 is a schematic diagram illustrating an image capture device capturing another image capture device according to an embodiment of the invention.
Referring to fig. 7, when another image capturing device 120' appears in the first image IMG1 captured by the image capturing device 120, this indicates that another user USR' may also be viewing an indoor image of a house corresponding to the same predetermined building site BD_RSV; that is, another user USR' is viewing the same house at the same time.
Therefore, in some embodiments, the processor 130 determines whether the other image capturing device 120' enters the portion of the first image IMG1 corresponding to the field of view FOV; if so, it renders a virtual character at the position in the indoor image corresponding to the other image capturing device 120', so as to cover the image of the other image capturing device 120' and simulate the effect of other users viewing the house together.
Taking the embodiment of fig. 4 as an example, as described above, the processor 130 cuts the portion of the first image IMG1 corresponding to the field of view FOV according to the angle parameter in the control signal S. When the other image capturing device 120' appears at a position in the full-view first image IMG1 that lies within the portion corresponding to the field of view FOV (for example, at the midpoint of that portion), the processor 130 renders the virtual character at the corresponding position in the indoor image (for example, the midpoint of the indoor image) according to the position of the other image capturing device 120' in the first image IMG1.
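Under the same equirectangular assumptions as the earlier sketches, checking whether the other image capturing device 120' falls inside the FOV and mapping it to a column of the indoor image can be sketched as follows; the detection step that yields the device's column in the full-view image is assumed.

```python
def virtual_character_column(panorama_width: int, device_column: int,
                             yaw_deg: float, hfov_deg: float = 90.0):
    """Return the column where the virtual character should be drawn, or None
    if the other image capturing device lies outside the field of view FOV."""
    centre = (yaw_deg % 360.0) / 360.0 * panorama_width          # column of the viewing direction
    half = hfov_deg / 360.0 * panorama_width / 2.0
    offset = ((device_column - centre + panorama_width / 2.0) % panorama_width
              - panorama_width / 2.0)                            # signed offset, wrapped at the seam
    if abs(offset) > half:
        return None                                              # outside the FOV: nothing to render
    return int(offset + half)                                    # column inside the cropped indoor image
```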
In summary, the image display system and the image display method provided in the embodiments of the invention utilize the control device to control the image capturing device to obtain an actual image at the observation point, and then synthesize the actual image with the stereoscopic house indoor model to generate the indoor image, so that the actual scene seen at the observation point can be simulated. In some embodiments of the invention, the light outside the window is rendered into the indoor image according to the current sunshine information, so as to simulate an indoor image with more realistic lighting. In addition, in some embodiments of the invention, another captured image capturing device is rendered as a virtual character, so that the other image capturing device does not appear abruptly in the indoor image and the indoor image looks more natural. Based on these embodiments, consumers can enjoy a good experience of "remote house viewing" realized by augmented reality (AR) technology.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.

Claims (12)

1. An image display system, comprising:
a control device for receiving and transmitting a control signal;
an image capturing device, coupled to the control device, for moving to an observation point according to the control signal and capturing a first image at the observation point;
a processor, coupled to the image capturing device, for generating an indoor image according to a stereoscopic house indoor model, an observation position in the stereoscopic house indoor model corresponding to the observation point, and the first image, wherein the observation point and the observation position move synchronously; and
a display, coupled to the processor, for displaying the indoor image.
2. The image display system of claim 1, wherein the indoor image comprises a window image and a non-window image, the window image corresponding to the first image and the non-window image corresponding to the stereoscopic house indoor model.
3. The image display system of claim 2, wherein the processor determines a window view portion and a non-window view portion of the stereoscopic house indoor model based on a window position in the stereoscopic house indoor model and the viewing position when generating the indoor image, generates the window image based on the first image, and generates the non-window image based on the non-window view portion.
4. The image display system of claim 1, wherein the processor renders light into the indoor image based on time information, a current location of the image capture device, and a position of a window in the stereoscopic indoor model of the house.
5. The image display system of any one of claims 1 to 4, wherein the control signal comprises a position parameter corresponding to the observation point and an angle parameter corresponding to a viewing angle, and wherein the control device comprises:
a positioning device for obtaining the position parameter; and
a direction sensor for obtaining the angle parameter.
6. The image display system of claim 5, wherein when the processor determines that the first image includes another image capturing device, the processor renders a virtual character in the indoor image according to a position of the another image capturing device in the first image.
7. An image display method, comprising:
obtaining a control signal;
controlling an image capturing device to move to an observation point according to the control signal;
capturing a first image at the observation point by using the image capturing device;
generating an indoor image according to a stereoscopic house indoor model, an observation position in the stereoscopic house indoor model corresponding to the observation point, and the first image, wherein the observation point and the observation position move synchronously; and
displaying the indoor image.
8. The image display method of claim 7, wherein the indoor image comprises a window image and a non-window image, the window image corresponding to the first image and the non-window image corresponding to the stereoscopic house indoor model.
9. The image displaying method of claim 8, wherein the step of generating the indoor image according to the stereoscopic house indoor model, the observing position corresponding to the observing point in the stereoscopic house indoor model and the first image comprises:
determining a window view portion and a non-window view portion of the stereoscopic house indoor model according to a window position in the stereoscopic house indoor model and the observation position;
generating the window image according to the first image; and
generating the non-window image according to the non-window view portion.
10. The image display method of claim 7, further comprising:
rendering light in the indoor image according to time information, a current position of the image capturing device, and a position of a window in the stereoscopic house indoor model.
11. The image display method of any one of claims 7 to 10, wherein the control signal comprises a position parameter corresponding to the observation point and an angle parameter corresponding to a viewing angle, and wherein the step of obtaining the control signal comprises:
obtaining the position parameter by using a positioning device; and
obtaining the angle parameter by using a direction sensor.
12. The image display method of claim 11, further comprising:
determining whether the first image includes another image capturing device; and
when the first image includes the another image capturing device, rendering a virtual character in the indoor image according to a position of the another image capturing device in the first image.
CN201810872027.8A 2018-08-02 2018-08-02 Image display system and image display method Pending CN110798610A (en)

Priority Applications (1)

Application Number: CN201810872027.8A (published as CN110798610A); Priority Date: 2018-08-02; Filing Date: 2018-08-02; Title: Image display system and image display method

Applications Claiming Priority (1)

Application Number: CN201810872027.8A (published as CN110798610A); Priority Date: 2018-08-02; Filing Date: 2018-08-02; Title: Image display system and image display method

Publications (1)

Publication Number: CN110798610A; Publication Date: 2020-02-14

Family

ID=69425565

Family Applications (1)

Application Number: CN201810872027.8A (Pending); Publication: CN110798610A; Title: Image display system and image display method

Country Status (1)

Country: CN; Publication: CN110798610A

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436621A (en) * 2011-11-08 2012-05-02 莫健新 House property landscape data display system and method, and data generation system and method
CN104778756A (en) * 2015-04-10 2015-07-15 北京明兰网络科技有限公司 Intelligent home decoration design system
CN104952099A (en) * 2014-03-28 2015-09-30 苏州美谷视典软件科技有限公司 One-house-one-scene digital house seeing system
JP2016113763A (en) * 2014-12-11 2016-06-23 大和ハウス工業株式会社 Structure of building
CN207212211U (en) * 2017-09-06 2018-04-10 广州市大迈文化传播有限公司 A kind of interactive intelligent window



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200214