CN115802014B - Recording medium, setting simulation method, and setting simulation device
- Publication number: CN115802014B (application CN202211133202.4A)
- Authority: CN (China)
- Legal status: Active
Abstract
The installation status of a projector can be confirmed while taking into consideration restrictions on the place where the projector is installed. The recording medium has recorded thereon a setting simulation program that causes a computer to execute the following: acquiring an installation site image, which is an image of the installation site where a projector is to be installed, a projector installation position, which is the position within the installation site where the projector is to be installed, and a projection site image, which is an image of the site where an image is projected by the projector or of the site where an object onto which the image is projected is installed; and performing one or both of the following processes: a process of displaying an image of the projector at the acquired projector installation position in the acquired installation site image, and a process of displaying a projection area image, which is an image of the area onto which the image is projected, in the acquired projection site image.
Description
Technical Field
The present invention relates to a recording medium, a setting simulation method, and a setting simulation apparatus.
Background
A projector can project a large-screen image despite having a relatively compact body. It is therefore often used for movie viewing, video conferencing, presentations, and the like on a large screen. However, because of how a projector projects, if the relationship between the position of the projector and the position of the screen onto which the image is projected is not appropriate, the size and brightness of the image may become inappropriate or the image may be distorted. Accordingly, a web application (projection simulator) is provided with which a user can confirm in advance the relationship between the distance between the projector and the screen (projection distance) and the size of the image (projection size). Further, for example, International Publication No. 2019/012774 discloses an information processing apparatus and the like capable of reducing the burden of projector layout design.
Disclosure of Invention
However, when the user actually installs a projector, restrictions on the installation location must also be considered. For example, even if the projector is placed at a position determined using the conventional projection simulator or the invention disclosed in Patent Document 1, the projector may interfere with furniture or the like, so that it cannot actually be installed at that position.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a recording medium, a setting simulation method, and a setting simulation device with which the installation status of a projector can be confirmed while taking into consideration restrictions on the location where the projector is installed.
Means for Solving the Problems
In order to achieve the above object, a recording medium according to one embodiment of the present invention is a recording medium having a setting simulation program recorded thereon, the setting simulation program causing a computer to acquire an installation site image, which is an image of an installation site where a projector is to be installed, a projector installation position, which is a position within the installation site where the projector is to be installed, and a projection site image, which is an image of a site where an image is projected by the projector or of a site where an object onto which the image is projected is installed.
The setting simulation program further causes the computer to perform one or both of the following processes: a process of displaying an image of the projector at the acquired projector installation position in the acquired installation site image; and a process of displaying a projection area image, which is an image of the area onto which the image is projected, in the acquired projection site image.
Advantageous Effects of Invention
According to the present invention, the installation status of the projector can be confirmed while taking into consideration restrictions on the place where the projector is installed.
Drawings
Fig. 1 is a block diagram showing an example of a functional configuration of an installation simulation device according to the embodiment.
Fig. 2 is a diagram showing an example of a room in which the setting simulation is performed.
Fig. 3 is a diagram showing an example of an installation site image.
Fig. 4 is a diagram showing an example of a projection field image.
Fig. 5 is a diagram showing an example of screen display in the installation simulation device according to the embodiment.
Fig. 6 is a diagram showing another example of screen display in the installation simulation device according to the embodiment.
Fig. 7 is a diagram showing an example of screen display in which a depth grasping image is displayed in the installation simulation device according to the embodiment.
Fig. 8 is a diagram showing an example of screen display in the case where a screen is set in the setting simulation apparatus according to the embodiment.
Fig. 9 is a diagram showing an example of screen display of a depth grasping image in the case where a screen is set in the setting simulation device according to the embodiment.
Fig. 10 is a first part of an example of a flowchart of the setting simulation process of the embodiment.
Fig. 11 is a second part of an example of a flowchart of the setting simulation process of the embodiment.
Fig. 12 is a diagram showing an example of screen display in which a projector setting screen is displayed in the setting simulation apparatus according to the embodiment.
Fig. 13 is a diagram showing an example of screen display in which a projector projection setting screen is displayed in the setting simulation apparatus according to the embodiment.
Fig. 14 is a diagram showing an example of screen display in which a screen setting screen is displayed in the setting simulation apparatus according to the embodiment.
Fig. 15 is a diagram showing an example of screen display in which measurement results are displayed in the installation simulation device according to the embodiment.
Fig. 16 is a diagram showing an example of a screen display for correcting measurement performed by the installation simulation device according to the embodiment.
Fig. 17 is an example of a flowchart of projector drawing processing according to the embodiment.
Fig. 18 is an example of a flowchart of screen drawing processing according to the embodiment.
Fig. 19 is an example of a flowchart of projection area drawing processing according to the embodiment.
Fig. 20 is an example of a flowchart of projector interference determination processing according to the embodiment.
Fig. 21 is an example of a flowchart of screen interference determination processing according to the embodiment.
Fig. 22 is an example of a flowchart of projection area interference determination processing according to the embodiment.
Fig. 23 is an example of a flowchart of projector initial position calculation processing according to modification 1.
Fig. 24 is an example of a flowchart of the screen initial position calculation process of modification 3.
Fig. 25 is a diagram showing an example of screen display in the installation simulation device according to modification 4.
Detailed Description
The setting simulation program and the like of the embodiment are described with reference to the drawings. In the drawings, the same or corresponding portions are denoted by the same reference numerals.
(Embodiment)
The setting simulation program of the embodiment is a program with which, by taking a photograph of the place where the user wants to install a projector (installation site image) and a photograph of the place where the user wants to project the image output by the projector (projection site image), the user can confirm whether the projector can be installed without any problem even without having the actual projector at hand. With this program, the user can also confirm whether a projection target such as a screen, onto which the image output from the projector is projected, can be installed without any problem, even without having the actual screen.
The setting simulation program is a program executed on the setting simulation apparatus 100 shown in fig. 1. The installation simulation device 100 of the embodiment is, for example, a smart phone, and includes a processing unit 110, a storage unit 120, a sensor unit 130, a display unit 140, and an operation unit 150, as shown in fig. 1.
The processing unit 110 is configured by a processor such as a CPU (Central Processing Unit ), for example. The processing unit 110 executes setting simulation processing and the like described later by a program stored in the storage unit 120.
The storage unit 120 stores programs and necessary data executed by the processing unit 110. The storage section 120 may include, but is not limited to, RAM (Random Access Memory ), ROM (Read Only Memory), flash Memory, and the like. The storage unit 120 may be provided inside the processing unit 110.
The storage unit 120 stores, for each commercially available projector, information such as the projector name, 3D modeling data, lens position (the position of the portion from which the projected image is output, i.e., the projection center point), normal projection direction, projection direction at the time of suspension, recommended projection size, and recommended projection distance range. For each projector, the storage unit 120 stores 3D modeling data for both the case where the suspension fixture is not mounted and the case where it is mounted. Therefore, by referring to the storage unit 120, the processing unit 110 can acquire information such as the 3D modeling data, lens position (position of the projection center point), and projection direction corresponding to the projector name and the presence or absence of the suspension fixture.
The storage unit 120 may also store, for each type of projection target such as a projector screen, a roll screen, a whiteboard, or a paper-like screen (hereinafter, projection targets used in place of a screen are also collectively referred to as a "screen"), information such as the screen name, 3D modeling data, installation style (floor-standing type, wall-mounted type, etc.), size, and extension style (pull-down, pull-up, non-extendable, etc.). The 3D modeling data of a screen stored in the storage unit 120 can reflect the amount by which the screen is extended or retracted.
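As an illustration only, the per-projector and per-screen records described above might be organized as follows. This is a minimal sketch; all class names, field names, and values are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ProjectorRecord:
    """One commercially available projector model as stored in the storage unit 120."""
    name: str
    model_3d: str                   # key of 3D modeling data without the suspension fixture
    model_3d_suspended: str         # key of 3D modeling data with the suspension fixture
    lens_offset_m: tuple            # lens position (projection center point) relative to the body
    projection_dir_normal: tuple    # projection direction for normal installation
    projection_dir_suspended: tuple # projection direction when suspended
    recommended_size_inch: tuple    # (min, max) recommended projection size
    recommended_distance_m: tuple   # (min, max) recommended projection distance

@dataclass
class ScreenRecord:
    """One projection target ("screen") type as stored in the storage unit 120."""
    name: str
    model_3d: str
    install_style: str              # "floor", "wall", ...
    size_inch: float
    extension_style: str            # "pull-down", "pull-up", "none"

PROJECTOR_DB = {
    "PJ-1000": ProjectorRecord("PJ-1000", "pj1000.obj", "pj1000_mount.obj",
                               (0.05, 0.03, 0.00), (0.0, 0.1, 1.0), (0.0, -0.1, 1.0),
                               (60, 120), (1.5, 4.0)),
}
SCREEN_DB = {
    "SC-100": ScreenRecord("SC-100", "sc100.obj", "floor", 100.0, "pull-up"),
}
```

With tables of this kind, a lookup by projector name and by the presence or absence of the suspension fixture is enough to select the correct 3D modeling data and projection direction.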
The sensor unit 130 includes an image sensor 131, an acceleration sensor 132, and a gyro sensor 133. However, the sensor unit 130 may include only the image sensor 131, or may include sensors other than the image sensor 131, the acceleration sensor 132, and the gyro sensor 133.
The image sensor 131 is, for example, a CMOS (Complementary Metal Oxide Semiconductor ) image sensor. The processing unit 110 can acquire image data by taking a photograph with the image sensor 131.
The acceleration sensor 132 is a triaxial acceleration sensor that detects movement along three orthogonal axes. For example, when the user moves the setting simulation device 100, the processing unit 110 can obtain from the acceleration sensor 132 in which direction and with what acceleration the setting simulation device 100 was moved.
The gyro sensor 133 is an angular velocity sensor that detects rotation. For example, when the user changes the orientation of the setting simulation device 100, the processing unit 110 can obtain from the gyro sensor 133 in which direction and by how much the orientation was changed.
The display unit 140 includes a display device such as a liquid crystal display or an organic EL (electroluminescence) display. The display unit 140 displays an image acquired by the image sensor 131, an image representing a result of the setting simulation, an operation screen for setting the simulation, and the like.
The operation unit 150 is a user interface such as a touch panel or a button switch, and accepts an operation input from a user. The touch panel is integrated with the display unit 140, and when a User touches a UI (User Interface) such as a button displayed on the display unit 140, the operation unit 150 detects the touch and transmits the detection result to the processing unit 110. The processing unit 110 can obtain what kind of operation input the user has performed based on the detection result of the operation unit 150.
Here, as an example of the installation simulation of a projector, consider a case where the user installs a projector in the room 200 shown in fig. 2. In this example, as shown in fig. 2, a window 203 is provided on the front of the room 200, a cabinet 201 with a door and a cabinet 202 with a drawer stand on the left side as viewed from the front, a bed 204 stands on the right side, and a beam 205 runs along the junction of the right wall 206 and the ceiling. Assume that the user wishes to place the projector on the cabinet with door 201 or the cabinet with drawer 202 and to project the image output by the projector onto the wall 206.
In this case, in the setting simulation process described later, the user takes a photograph of a place where the projector is to be set (setting place image) and a photograph of a place where the image output by the projector is to be projected (projection place image), respectively. Then, the installation simulation device 100 acquires the installation site image 230 shown in fig. 3 and the projection site image 240 shown in fig. 4, respectively.
In addition, the setting simulation device 100 grasps the size and position of objects actually present in the room by using an existing AR (Augmented Reality) system. This is done by having the user take a plurality of pictures of the room in which the projector is to be installed while moving the setting simulation device 100. As the existing AR system, for example, ARKit (registered trademark) or ARCore (registered trademark) can be used. The setting simulation device 100 may also use SLAM (Simultaneous Localization and Mapping) technology instead of, or together with, the AR system to grasp the size and position of objects present in the room.
Then, as shown in fig. 5, the setting simulation device 100 displays the installation site image 230 and the projection site image 240 side by side on the display unit 140. At this time, an image representing the projector 210 is superimposed on the installation site image 230, and an image representing the projection area 220 is superimposed on the projection site image 240. In the present embodiment, the image representing the projection area is not displayed in the installation site image 230, and the image representing the projector is not displayed in the projection site image 240. Projection information 250, such as the projection size, the distance 251 and height 252 from the projector 210 to the wall, the distance 253 and height 254 from the projection area 220 to the wall, and the distance 255 from the projector 210 to the projection area 220, is also displayed on the display unit 140. The user can thus view and confirm the installation status of the projector 210 on the screen (display unit 140) of the setting simulation device 100. In this embodiment, the heights 252 and 254 are heights from the floor.
For example, in fig. 5, from the installation site image 230 on which the projector 210 is superimposed, it can be seen that the projector 210 interferes with the partition of the drawer cabinet 202. The user can therefore confirm that the projector 210 cannot be installed at that position. Likewise, from the projection site image 240 on which the projection area 220 is superimposed, it can be seen that the projection area 220 interferes with the beam 205 or the door 207. The user can therefore confirm that the image of the projector 210 would not be projected properly at that position.
Then, for example, in fig. 6, by moving the projector 210 to a portion of the cabinet with door 201 that has no partition, the projection area 220 fits within a portion of the wall 206 where there are no other objects. The user can therefore confirm that the image of the projector 210 would be projected properly at that position.
In addition, since it is difficult to grasp the installation position in the depth direction of the projector 210 in the installation site image 230, the installation simulation device 100 may display the depth grasping image 260 as shown in the left side of fig. 7 in order to display the installation position in the depth direction more easily. The user operation for displaying the depth grasp image 260 may be arbitrary, and for example, when the menu display button 256 on the upper right of fig. 5 (fig. 6 and 7) is clicked, a menu having an item of "display switch" may be displayed, and when "display switch" in the menu is clicked, the installation location image 230 may be switched to the depth grasp image 260.
In fig. 5 to 7, the image from the projector 210 is projected onto a predetermined area in the projection field image 240. In fig. 5 to 7, the area where the image is projected is the wall 206, but for example, if the user sets a screen for a projector in a room, the screen is present in the projection field image 240, and therefore the area where the image from the projector 210 is projected can be regarded as the screen. However, even if a screen is not provided in a room, it is sometimes desirable to confirm the status of projection in the case where the screen is provided. In this case, the setting simulation apparatus 100 may set a screen to be set on a screen setting screen (projection object setting screen) to be described later, so that the screen can be virtually set.
For example, when a screen setting (projection object setting) is performed so that a wall-mounted screen 208 is provided on a beam 205, as shown in fig. 8, the setting simulation apparatus 100 superimposes and displays an image representing the screen 208 (projection object image, hereinafter, the projection object image is referred to as a screen image) on a projection field image 240, and superimposes and displays an image representing a projection area 220 on the screen image and the projection field image 240. In this example, the projector 210 is suspended from the ceiling as shown by the installation site image 230.
In this case as well, projection information 250, such as the projection size, the distance 251 and height 252 from the projector 210 to the wall, the distance 253 and height 254 from the projection area 220 to the wall, and the distance 255 from the projector 210 to the projection area 220, is displayed on the display unit 140. The user can thus view and confirm the installation status of the projector 210 and the installation status of the screen 208 on the screen (display unit 140) of the setting simulation device 100. In this example, the height 252 represents the distance from the ceiling, and the height 254 represents the distance from the floor.
As shown in fig. 8, even in the case where the screen 208 is displayed virtually, in order to display the installation position in the depth direction more easily, the installation simulation apparatus 100 can display the depth grasp image 260 as shown in the left side of fig. 9. The user operation for displaying the depth grasp image 260 may be arbitrary, and for example, when the menu display button 256 on the upper right of fig. 8 (fig. 9) is clicked, a menu having an item of "display switch" may be displayed, and when "display switch" in the menu is clicked, the installation site image 230 may be switched to the depth grasp image 260.
A process for simulating the setting of the projector 210 described so far (setting simulation process) is described with reference to fig. 10 and 11. When the power is turned on, the setting simulation apparatus 100 starts executing the setting simulation process automatically or by an operation of the user (for example, the user instructs to start the setting simulation process).
When the setting simulation process starts, the processing unit 110 first performs initial setting (step S101). In step S101, not only can various variables and the like be initialized, but the projector installation settings, the projector projection settings, and the screen settings can also be configured via the projector setting screen 270 shown on the right side of fig. 12, the projector projection setting screen 280 shown on the left side of fig. 13, and the screen setting screen 290 shown on the left side of fig. 14.
These projector installation settings, projector projection settings, and screen settings need not be made at the time of initial setting; they can also be changed after confirming the result of the setting simulation, as described later. As can be seen from the fact that the installation site image 230 with the projector 210 superimposed and the projection site image 240 with the projection area 220 superimposed are displayed, figs. 12, 13, and 14 show examples of the screen display when setting values are changed while confirming the result of the setting simulation.
On the projector setting screen 270, as shown in fig. 12, for example, the projector name 271 of the projector 210 subject to the setting simulation, the left-right (X-axis) tilt angle 272 and the up-down (Y-axis) tilt angle 273 at installation, a toggle button 274 that sets whether the projector is installed suspended (i.e., whether the projector 210 is installed upside down, hanging from the ceiling), a toggle button 275 that sets whether a suspension fixture is mounted on the projector, and the height 276 of the projector can each be set. As shown in fig. 12, the projector setting screen 270 is displayed on the display unit 140 together with the installation site image 230, so the user can configure the installation of the projector 210 while checking the tilt, suspension, and other conditions of the projector 210 superimposed on the installation site image 230. In fig. 12, the tilt of the projector 210 at installation can be set about the left-right (X) axis and the up-down (Y) axis, but a tilt about the rotation (Z) axis may also be made settable.
On the projector projection setting screen 280, as shown in fig. 13, for example, a toggle button 281 that sets whether the projector performs suspended projection (projection with the projection direction facing downward and the image inverted top-to-bottom and left-to-right), a slider 282 that sets the zoom magnification, the X-direction shift ratio 283 and Y-direction shift ratio 284 of the lens shift, the horizontal angle 285 and vertical angle 286 of the trapezoidal correction, and a slider 287 that sets the brightness level can each be set. As shown in fig. 13, the projector projection setting screen 280 is displayed on the display unit 140 together with the projection site image 240, so the user can configure the projection of the projector 210 while checking the projection position, size, brightness, and trapezoidal correction state of the projection area 220 superimposed on the projection site image 240.
In addition, as shown in fig. 14, on the screen setting screen 290, for example, the screen name (projection object name) 291 of the virtually installed screen 208, the pull-up (pull-down) height 293 of the screen 208, the distance 294 from the back wall, and the left-right (X-axis) tilt angle 295 at installation can each be set. When the screen name 291 is set, projection object information 292 for the screen 208 indicated by the screen name 291 (its installation style, size, height (from the stored height to the height at maximum extension), width (the width of the device itself and the width of the area available for effective projection), and the like) is displayed, as shown in fig. 14. In fig. 14, the tilt of the screen 208 at installation is set about the left-right (X) axis, but a tilt about the up-down (Y) axis or the rotation (Z) axis may also be made settable.
As shown in fig. 14, the screen setting screen 290 is displayed on the display unit 140 together with the projection site image 240, so the user can configure the screen 208 while checking the condition of the screen 208 superimposed on the projection site image 240 and the projection position, size, brightness, trapezoidal correction state, and the like of the projection area 220. When the screen name 291 is set to "none" in the screen settings, the virtual screen 208 is not displayed, as shown in figs. 5, 6, 7, and 13.
Returning to fig. 10, the processing unit 110 continues capturing the room with the image sensor 131 while tracking the movement of the setting simulation device 100 with the acceleration sensor 132 and the gyro sensor 133 (step S102). At this time, the processing unit 110 displays a message such as "please keep moving the device for a while" on the display unit 140 and has the user move the setting simulation device 100 around.
Through the processing of step S102, the processing unit 110 can acquire a plurality of images obtained by capturing a room from various positions and directions. Then, the processing section 110 can grasp the three-dimensional position of the feature point in the acquired image by using the AR system or SLAM technique, and can measure the size of the room and the size and position of the object in the room.
In step S102, the processing unit 110 also displays an "installation place photographing button" and a "projection place photographing button" on the display unit 140 and prompts the user to photograph the installation place and the projection place of the projector 210. Then, if the operation unit 150 detects that the "installation place photographing button" has been touched, the processing unit 110 saves the image captured by the image sensor 131 at that time in the storage unit 120 as the "installation site image". Likewise, if the operation unit 150 detects that the "projection place photographing button" has been touched, the processing unit 110 saves the image captured by the image sensor 131 at that time in the storage unit 120 as the "projection site image".
In the case where the depth grasping image 260 can be displayed, the processing unit 110 displays a "depth grasping imaging button" on the display unit 140 in step S102, and prompts the user to take a photograph of the projector installation place and the projection place on the left and right sides. Then, if the operation section 150 detects touching the "depth grasp shoot button", the processing section 110 saves the image shot by the image sensor 131 at this time as a "depth grasp image" in the storage section 120.
Then, the processing section 110 determines whether or not the room can be measured and the photographing of the installation site image and the projection site image is completed (step S103). Here, "the room can be measured" means that the processing unit 110 obtains the three-dimensional position of the feature point in the image obtained in step S102 in the real space using the AR system or SLAM technique. The feature points in the image are points corresponding to pixels in the image where a change can be visually recognized, such as edges, corners, and the like. For example, patterns of floors, walls, furniture, etc., boundaries of walls and ceilings, outer edges of furniture, partitions of furniture, etc., may be characteristic points. The three-dimensional position of the feature point can be obtained using triangulation or the like based on a plurality of images (images in which the feature points are reflected at different positions or angles). Then, if the three-dimensional position of each feature point can be obtained, the distance between any two feature points can be obtained by obtaining the difference in the three-dimensional position, so the processing unit 110 can measure the room.
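Once the AR system or SLAM technique has provided three-dimensional positions for the feature points, measuring the room reduces to differencing those coordinates. The following minimal sketch shows only that final step; the coordinate values and point names are hypothetical.

```python
import math

def distance_3d(p, q):
    """Distance between two feature points whose three-dimensional positions are known."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# e.g. two feature points at the left and right wall-ceiling boundaries of the room
left_corner = (0.00, 2.40, 3.10)
right_corner = (3.55, 2.40, 3.10)
print(distance_3d(left_corner, right_corner))   # lateral width of the room: 3.55 m
```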
If the room cannot yet be measured, or the capture of the installation site image and the projection site image has not been completed (step S103: NO), the process returns to step S102 and the capture of the room continues. In the case where the depth grasping image 260 is also to be displayed, the processing unit 110 additionally determines in step S103 whether the capture of the depth grasping image has been completed, and returns to step S102 if it has not.
If the room can be measured and the imaging of the installation site image and the projection site image is completed (step S103: yes), the processing section 110 obtains the three-dimensional positions of the feature points using the plurality of images imaged in step S102, and performs the measurement of the room and the objects in the room (step S104). Then, the processing unit 110 displays the measured result on the display unit 140 (step S105). In step S105, as shown in fig. 15, the processing unit 110 displays the height 231 and the lateral width 232 of the room in the installation site image 230 and the height 241 and the lateral width 242 of the room in the projection site image 240 on the display unit 140.
Next, the processing unit 110 determines whether or not correction data is input (step S106). The correction data is data for correcting the numerical value obtained by the measurement in step S104. For example, when the user double-clicks on the numerical value indicating the height and the lateral width of the room on the screen shown in fig. 15, the numerical value can be changed, and the changed numerical value can be used as correction data. In addition, the correction data is not limited to the height and lateral width of the room. The distance between any of the feature points where the three-dimensional position was obtained in step S104 can be corrected.
For example, fig. 16 shows the following example: the user selects the arrow 233 on the installation site image 230 by double-clicking or the like, shortens it so that its length represents the distance from the left wall to the drawer cabinet 202, and enters 130.00 (cm) into the correction data input field 234. This assumes that the user has actually measured the distance from the wall to the drawer cabinet 202 as 130.00 cm and entered that measured value into the correction data input field 234. Although fig. 16 shows an example in which correction data is entered for a distance displayed on the installation site image 230, correction data can also be entered for a distance displayed on the projection site image 240.
If such correction data is input (step S106: yes), the processing section 110 corrects the measurement data using the input correction data (step S107). For example, the processing unit 110 corrects the scale of the coordinate system representing the three-dimensional position of the feature point so as to match the inputted correction data. At this time, the other displayed values are also corrected according to the correction. Then, the processing unit 110 advances the process to step S108. If no correction data is input (step S106: NO), the processing unit 110 does not perform correction, and the process proceeds to step S108.
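Step S107 corrects the scale of the coordinate system so that the measured distance matches the value the user entered. The following is a minimal sketch, assuming a single uniform scale factor applied to every feature point; the names and numbers are illustrative only.

```python
def apply_correction(feature_points, measured_value, corrected_value):
    """Uniformly rescale all feature-point coordinates so that the distance the user
    corrected (e.g. wall to drawer cabinet 202) matches the entered correction data;
    every other displayed value derived from these points is updated as a result."""
    factor = corrected_value / measured_value
    return {name: tuple(c * factor for c in xyz) for name, xyz in feature_points.items()}

# The AR measurement gave 127.3 cm, but the user entered 130.00 cm (steps S106/S107).
points = {"left_wall": (0.0, 0.0, 0.0), "cabinet_202": (1.273, 0.0, 0.0)}
points = apply_correction(points, measured_value=127.3, corrected_value=130.0)
print(points["cabinet_202"])   # approximately (1.3, 0.0, 0.0)
```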
Then, the processing section 110 determines whether or not a certain screen 208 is set (step S108). For example, if "none" is set on the screen name 291 in the screen setting screen 290 described above, it is determined that the screen 208 is not set. In addition, if the screen name 291 is specifically set, it is determined that the screen 208 is set. If the screen 208 is not set (step S108: NO), the processing section 110 does not draw the screen 208, and advances the process to step S111.
If the screen 208 is set (step S108: yes), the processing section 110 calculates the initial position of the screen 208 (step S109). For example, the processing unit 110 calculates, as an initial position, a position of the projection field image 240 where the screen 208 is closest to the inner wall surface. Then, the processing unit 110 superimposes (superimposes) the image of the screen 208 on the initial position of the screen 208 on the projection field image 240 (step S110). The details of the superimposed drawing of the screen 208 will be described later as a screen drawing process (projected body drawing process).
Then, the processing section 110 calculates an initial position of the projector 210 (step S111). For example, the processing unit 110 calculates a position of the projector 210 closest to the inner wall surface at the center of the installation site image 230 as an initial position. Then, the processing unit 110 superimposes (superimposes) the image of the projector 210 on the initial position on the installation site image 230 (step S112). The details of the superimposed drawing of the projector 210 will be described later as projector drawing processing.
Next, referring to fig. 11, the processing unit 110 determines whether the operation unit 150 has detected an input to move the setting position of the screen 208 (the projection object setting position; hereinafter referred to as the screen setting position) (step S113). For example, when the user drags the screen 208 on the projection site image 240, the operation unit 150 detects an input to move the setting position of the screen 208. However, since the projection site image 240 is two-dimensional and the setting position of the screen 208 is three-dimensional, only two of the three values of the three-dimensional coordinates of the setting position (the up-down and left-right values) can be input by dragging. Therefore, it is possible to switch between modes (a height setting mode and a depth setting mode) that determine whether an up-down drag inputs the height or the depth.
For example, in either mode a left-right drag changes the distance of the setting position from the side wall, while an up-down drag changes the height of the setting position in the height setting mode and the position in the depth direction in the depth setting mode. The user operation for this mode switching may be arbitrary; for example, when the menu display button 256 displayed at the upper right of the screen of the display unit 140 is clicked, a menu having the items "height setting mode" and "depth setting mode" may be displayed, and clicking "height setting mode" or "depth setting mode" in the menu switches to that mode.
In addition, the processing unit 110 may display the setting position information of the screen 208 in the projection information 250 on the display unit 140, for example as "screen installation position W: 87.56, H: 35.43, D: 25.53", and when the user double-clicks it, the screen setting position (e.g., distance from the wall (W), height (H), distance from the inner wall (D)) can be input directly.
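As an illustration of the drag handling described above, the following minimal sketch maps a 2D drag onto the three-dimensional setting position according to the active mode. The function and variable names, the sign conventions, and the conversion of the drag from pixels to meters are assumptions, not details taken from the patent.

```python
def apply_drag(setting_position, dx_m, dy_m, mode):
    """Map a 2D drag (already converted from pixels to meters using the image scale)
    onto the 3D setting position (W, H, D).  A left-right drag always changes the
    distance from the side wall; an up-down drag changes the height in "height" mode
    and the depth in "depth" mode."""
    w, h, d = setting_position
    w += dx_m
    if mode == "height":
        h += dy_m
    elif mode == "depth":
        d += dy_m
    return (w, h, d)

# Example: in depth setting mode, drag 12 cm to the right and 5 cm upward.
print(apply_drag((0.8756, 0.3543, 0.2553), 0.12, 0.05, "depth"))  # approx. (0.9956, 0.3543, 0.3053)
```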
If an input of the setting position of the moving screen 208 is detected (step S113: yes), the processing section 110 changes the setting position of the screen 208 based on the detected input, and moves the image of the screen 208 to the setting position on the projection field image 240 to re-superimpose the drawing (step S114), and advances the process to step S115.
If the input of the setting position of the moving screen 208 is not detected (no in step S113), the processing section 110 does not move the image of the screen 208, but advances the process to step S115.
Then, the processing section 110 determines whether or not there is an input of a screen setting (step S115). For example, the user makes an input of screen setting by calling the screen setting screen 290. The user operation for calling up the screen setting screen 290 is arbitrary, and for example, when the menu display button 256 displayed on the upper right side in the screen of the display unit 140 is clicked, a menu having an item of "screen setting" may be displayed, and when "screen setting" in the menu is clicked, the screen setting screen 290 may be called up.
If there is an input of the screen setting (step S115: yes), the processing section 110 sets the screen 208 based on the input, and resumes the superimposed drawing of the image of the screen 208 (step S116). Then, the processing unit 110 advances the process to step S117. If there is no input of the screen setting (no in step S115), the processing section 110 advances the process to step S117.
Then, the processing section 110 makes an interference determination of the screen 208 (step S117). The details of step S117 are described later as screen interference determination processing (projection object interference determination processing). The interference process may be performed each time the image of the screen 208 is superimposed and depicted in step S110, step S114, and step S116.
Next, the processing section 110 determines whether or not an input to move the setting position of the projector 210 (projector setting position) is detected by the operation section 150 (step S118). For example, when the user drags the projector 210 on the setting place image 230, the operation section 150 detects an input of movement of the setting position of the projector 210. However, since the installation site image 230 is two-dimensional and the projector installation position is three-dimensional, the mode (the height setting mode and the depth setting mode) may be switched in the same manner as the input of the movement of the screen installation position described above.
In addition, as with the screen setting position, the processing unit 110 may display the setting position information of the projector 210 in the projection information 250 on the display unit 140, for example as "projector installation position W: 165.31, H: 98.65, D: 281.09", and when the user double-clicks it, the projector setting position (e.g., distance from the wall (W), height (H), distance from the projection surface (D)) can be input directly.
If an input of movement of the installation position of the projector 210 is detected (yes in step S118), the processing unit 110 changes the installation position of the projector 210 based on the detected input, moves the image of the projector 210 to the installation position on the installation site image 230, and performs superimposition drawing again (step S119), and the processing proceeds to step S120.
If no input of movement of the setting position of the projector 210 is detected (step S118: NO), the processing section 110 does not move the image of the projector 210, but advances the process to step S120.
Next, the processing unit 110 determines whether or not there is an input of the projector setting (step S120). For example, by the user calling the projector setting screen 270, input of projector setting settings is made. The user operation for calling up the projector setting screen 270 is arbitrary, and for example, when the menu display button 256 displayed on the upper right side in the screen of the display unit 140 is clicked, a menu having an item of "projector setting" may be displayed, and when "projector setting" in the menu is clicked, the projector setting screen 270 may be called up.
If there is an input of the projector setting (yes in step S120), the processing unit 110 sets the setting of the projector 210 based on the input, and resumes the superimposition drawing of the image of the projector 210 (step S121). Then, the processing unit 110 advances the process to step S122. If there is no input of the projector setting (no at step S120), the processing section 110 advances the process to step S122.
Then, the processing unit 110 performs interference determination of the projector (step S122). The details of step S122 are described later as projector interference determination processing. The interference process may be performed when the images of the projector 210 are superimposed and drawn in step S112, step S119, and step S121.
Next, the processing unit 110 determines whether or not there is an input of the projector projection setting (step S123). For example, the user invokes the projector projection setting screen 280 to input projector projection settings. The user operation for calling up the projector projection setting screen 280 may be arbitrary, for example, when the menu display button 256 displayed on the upper right side in the screen of the display unit 140 is clicked, a menu having an item of "projector projection setting" may be displayed, and when "projector projection setting" in the menu is clicked, the projector projection setting screen 280 may be called up.
If there is an input of the projector projection setting (yes in step S123), the processing unit 110 performs the projection setting of the projector 210 based on the input (step S124). Then, the processing unit 110 advances the process to step S125. If there is no input of the projector projection setting (no in step S123), the processing section 110 advances the process to step S125.
Then, the processing unit 110 calculates the position (projection position), projection size, and shape at which the projection area 220 is to be displayed, based on the three-dimensional position of the projector 210, the three-dimensional position of the object (the wall 206, the screen 208, etc.), the setting values of the projector installation settings (including the installation posture indicating the tilt of the projector 210), the setting values of the projector projection settings (projection setting values), and, when a screen 208 is set, the setting values of the screen settings; it then displays (superimposes and draws) an image representing the projection area 220 (projection area image) at the projection position on the projection site image 240 with the calculated projection size and shape (step S125). At this time, the processing unit 110 also displays the projection information 250 such as the projection size on the display unit 140. The details of the processing of step S125 are described later as the projection area drawing process.
Then, the processing unit 110 performs interference determination of the projection region 220 (step S126). The details of step S126 will be described later as the projection area interference determination process.
Then, the processing unit 110 determines whether or not to end the processing (step S127). For example, when an instruction to end the processing is detected by the processing unit 110 via the operation unit 150, the processing unit determines that the processing is ended. If the process is not ended (step S127: NO), the processing section 110 returns the process to step S113. If the process is ended (step S127: yes), the processing section 110 ends the setting simulation process.
Next, a description will be given of projector drawing processing called when the projector 210 is drawn in superposition, among the processing of step S112, step S119, and step S121 of the setting simulation processing (fig. 10 and 11) described above, with reference to fig. 17.
First, the processing unit 110 acquires a projector name set in projector setting settings (step S201). Then, the processing unit 110 determines whether or not the projector setting is set to mount the suspension device (step S202). If it is set that the suspension is to be mounted (step S202: yes), the processing unit 110 acquires projector 3D modeling data in a state in which the suspension is mounted on the projector of the projector name acquired in step S201 from the storage unit 120 (step S203).
If it is not set to mount the suspension (step S202: NO), the processing unit 110 acquires 3D modeling data of the projector (state in which the suspension is not mounted) of the projector name acquired in step S201 from the storage unit 120 (step S204).
Then, the processing unit 110 determines whether or not the suspension setting is set in the projector setting (step S205). If the suspension setting is set (yes in step S205), the processing unit 110 inverts the projector 3D modeling data acquired in step S203 or step S204 up and down (step S206), and the processing proceeds to step S207. If the suspension setting is not set (step S205: NO), the processing unit 110 advances the process to step S207.
Next, the processing unit 110 rotates the projector 3D modeling data according to the set value of the inclination set by the projector setting (step S207). Then, the processing unit 110 calculates the scale of the installation site image 230 (step S208). The scale calculated here is a ratio between the actual length in the three-dimensional space and the size on the image output to the display section 140. Based on a plurality of images (projector vicinity images) of the vicinity of the place where the projector 210 is provided, the processing unit 110 can calculate the three-dimensional position (projector vicinity three-dimensional position) of the feature point in the place image by using the AR system or SLAM technique, and calculate the scale.
Next, the processing unit 110 scales down the projector 3D modeling data based on the scale calculated in step S208 (step S209). Then, the processing unit 110 superimposes and draws the projector 3D modeling data at the installation position of the projector 210 on the installation site image on the display unit 140 (step S210). The processing unit 110 then displays the installation position of the projector 210 on the display unit 140, for example as distances from reference points (wall, floor, etc.) to the installation position of the projector 210, such as the distance 251 from the wall and the height 252 from the floor (step S211). The projector drawing process then ends and the process returns to the setting simulation process (figs. 10 and 11).
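The following is a minimal sketch of the scale calculation and model reduction in steps S208 and S209, under the assumption that the scale is derived from one known real-world length near the installation place and its length in pixels on the displayed image; the function names and numbers are illustrative only.

```python
def image_scale(pixel_length: float, real_length_m: float) -> float:
    """Pixels per meter on the displayed installation site image (step S208).
    The real length comes from the 3D positions of two feature points near the
    projector installation place; the pixel length is measured between the same
    two points in the displayed image."""
    return pixel_length / real_length_m

def scale_model(model_size_m, pixels_per_m):
    """Scale the projector 3D modeling data (represented here only by its bounding
    size) so it can be superimposed at the correct size on the image (step S209)."""
    return tuple(s * pixels_per_m for s in model_size_m)

scale = image_scale(pixel_length=420.0, real_length_m=1.40)   # 300 px per meter
print(scale_model((0.35, 0.12, 0.30), scale))                 # projector W/H/D in pixels
```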
In step S210, the processing unit 110 superimposes and draws the projector 3D modeling data at the installation position of the projector 210 in the installation site image; this superimposed drawing may be a simple overwrite drawing or a drawing with transparency processing such as alpha blending. In the case where the depth grasping image 260 is displayed, the processing unit 110 superimposes and draws the projector 3D modeling data and the installation position of the projector 210 on the depth grasping image on the display unit 140 in steps S210 and S211. In fig. 6 and the like, the distance 251 and the height 252 indicating the installation position of the projector 210 are displayed superimposed on the installation site image 230 (or the depth grasping image 260), but the processing unit 110 may instead display the installation position of the projector 210 at a display position that does not overlap the installation site image 230 (or the depth grasping image 260).
Next, with reference to fig. 18, a screen drawing process called when the screen 208 is subjected to the superimposition drawing in the processes of step S110, step S114, and step S116 of the setting simulation process (fig. 10 and 11) described above will be described.
First, the processing unit 110 acquires a screen name set in the screen setting (step S221). Then, the processing unit 110 acquires 3D modeling data of the screen 208 from the storage unit 120, and determines screen 3D modeling data according to the set value of the height 293 of the pull-up (pull-down) height in screen setting (step S222).
Next, the processing unit 110 rotates the screen 3D modeling data in accordance with the set value of the angle 295 of the inclination in the screen setting (step S223).
Then, the processing unit 110 calculates the scale of the projection field image 240 (step S224). The scale calculated here is a ratio between the actual length in the three-dimensional space and the size on the image output to the display section 140. Based on a plurality of images (near-object images) near the object (wall 206 or screen 208) of the projected image, the processing unit 110 can calculate the three-dimensional position of the feature point in the near-object image (near-object three-dimensional position) by using the AR system or SLAM technique, and calculate the scale.
Next, the processing section 110 reduces the screen 3D modeling data based on the scale calculated in step S224 (step S225). Then, the processing section 110 superimposes the drawing screen 3D modeling data on the screen setting position on the projection field image of the display section 140 (step S226). Then, the processing unit 110 ends the screen drawing process and returns to the setting simulation process (fig. 10 and 11).
As described above, in the screen drawing process, the processing unit 110 generates an image representing the screen 208 (screen image) from the screen information (screen type, screen setting values, screen setting position, etc.) and displays it on the display unit 140. In the screen drawing process, before returning to the setting simulation process, the processing unit 110 may also display the installation position of the screen 208 on the display unit 140, for example as distances from reference points (wall, floor, etc.) to the installation position of the screen 208, such as the distance from the wall and the height from the floor.
In step S226, the processing unit 110 superimposes and draws the screen 3D modeling data at the installation position of the screen 208 in the projection site image; this superimposed drawing may be a simple overwrite drawing or a drawing with transparency processing such as alpha blending. In addition, when the depth grasping image 260 is displayed, in step S226 the processing unit 110 superimposes and draws the screen 3D modeling data on the depth grasping image 260 on the display unit 140.
Next, with reference to fig. 19, the projection area drawing process called in the process of step S125 of the setting simulation process (fig. 11) described above will be described.
First, the processing unit 110 determines whether or not suspension projection is set in projector projection setting (step S251). If the suspension projection is set (yes in step S251), the processing unit 110 sets the projection direction of the projector 210 to the direction (normally downward) at the time of suspension projection (step S252), and advances the process to step S254. If suspension projection is not set (step S251: no), the processing unit 110 sets the projection direction of the projector 210 to the direction (normal upward) at the time of normal projection (step S253), and advances the process to step S254.
Then, the processing unit 110 determines whether or not the suspension setting is set in the projector setting (step S254). If the suspension setting is set (step S254: yes), the processing unit 110 sets the position of the lens when the projector 210 is set upside down as the projection center point (step S255), and advances the process to step S257. If the suspension setting is not set (step S254: NO), the processing unit 110 sets the position of the lens when the projector 210 is set in the normal setting manner as the projection center point (step S256), and advances the process to step S257.
In step S257, the processing section 110 determines whether or not a certain screen 208 is set. If the screen 208 is not set (step S257: NO), the processing section 110 calculates the scale of the projection field image 240 and the position of the projection surface (step S258). The scale calculated here is a ratio between the actual length in the three-dimensional space and the size on the image output to the display section 140, and the processing section 110 can calculate the scale by using the AR system. Further, the position of the projection surface is a position of a plane (for example, a wall 206) that is mapped to the front surface of the projection field image 240, and the processing section 110 can calculate the position of the projection surface by using the AR system. By calculating the position of the projection surface, the processing section 110 can calculate the distance from the projection center point (set position) of the projector 210 to the projection area 220 (projection position).
If the screen 208 is set (step S257: yes), the processing unit 110 calculates the scale of the projection field image 240 and the position of the projection surface of the screen 208 (step S259). As described above, the scale of the projection field image 240 is calculated by using the AR system. The position of the projection surface of the screen 208 is the position of the surface on which the image from the projector 210 is projected on the screen 208. The processing unit 110 can calculate the position of the projection surface of the screen 208 based on the set value in the screen setting, the screen 3D modeling data, and the screen setting position. By calculating the position of the projection surface, the processing section 110 can calculate the distance from the projection center point (set position) of the projector 210 to the projection area 220 (projection position).
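As one way to obtain the distance from the projection center point to the projection area mentioned above, the projection surface can be treated as a plane and intersected with a ray along the projection direction. This is a minimal sketch under that planarity assumption; it is not taken from the patent.

```python
def throw_distance(center, direction, plane_point, plane_normal):
    """Distance along the projection direction from the projection center point
    (the lens position) to the projection surface (the wall 206 or screen 208),
    with the surface modeled as a plane."""
    denom = sum(n * d for n, d in zip(plane_normal, direction))
    if abs(denom) < 1e-9:
        raise ValueError("projection direction is parallel to the projection surface")
    t = sum(n * (p - c) for n, p, c in zip(plane_normal, plane_point, center))
    return t / denom

# Lens 2.8 m in front of the wall, projecting straight at it.
print(throw_distance((0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 2.8), (0.0, 0.0, 1.0)))  # 2.8
```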
Then, the processing unit 110 calculates the projection position, the projection size, the brightness of the image (image luminosity), and the like of the projection area 220 based on the distance from the installation position of the projector 210 to the projection position, the projection direction set in step S252 or step S253, the projection center point set in step S255 or step S256, and the setting values (zoom, lens shift, trapezoidal correction, and the like) set in the projector projection settings (step S260).
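As an illustration of how the projection size and the image luminosity depend on the distance from the installation position to the projection position, the following sketch can be used; the throw ratio, aspect ratio, and lumen values are assumptions introduced here, not figures from the patent, and lens shift and trapezoidal correction are omitted.

```python
from dataclasses import dataclass

@dataclass
class ProjectorSpec:          # illustrative values, not taken from the patent
    throw_ratio: float = 1.2  # distance / image width
    aspect: float = 16 / 9
    lumens: float = 3000.0

def projection_size(spec: ProjectorSpec, distance_m: float, zoom: float = 1.0):
    """Projected width/height in metres for a given throw distance."""
    width = distance_m / spec.throw_ratio * zoom
    return width, width / spec.aspect

def image_luminosity(spec: ProjectorSpec, distance_m: float, zoom: float = 1.0):
    """Approximate screen illuminance in lux: luminous flux spread over the projected area."""
    w, h = projection_size(spec, distance_m, zoom)
    return spec.lumens / (w * h)

spec = ProjectorSpec()
for d in (1.5, 2.5, 3.5):
    w, h = projection_size(spec, d)
    print(f"{d} m -> {w:.2f} x {h:.2f} m, about {image_luminosity(spec, d):.0f} lux")
```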
Next, the processing unit 110 displays, on the display unit 140, projection information 250 such as the projection size calculated in step S260, the projection position of the projection area 220 (distances from reference points such as a wall or the floor to the projection position, for example the distance 253 from the wall and the height 254 from the floor), and the distance 255 from the projector 210 to the projection area 220 (step S261). Then, the processing unit 110 superimposes and draws an image representing the projection area 220 (projection area image) on the projection field image 240 of the display unit 140 in the calculated projection size (step S262). In addition, when the screen 208 is set, the screen image has already been superimposed and drawn on the projection field image 240 by the screen drawing process, so in step S262 the processing section 110 superimposes and draws the projection area image on the screen image or on the projection field image 240.
At this time, the processing unit 110 changes at least one attribute value among the brightness, transparency, and gradation of the projection area image based on the calculated image luminosity. For example, as the image luminosity increases (that is, as the projected image becomes brighter), the processing section 110 increases the brightness of the projection area image, increases its transparency, or increases the degree of color change of its gradation. Then, the processing unit 110 ends the projection area drawing processing and returns to the setting simulation processing (figs. 10 and 11).
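One hypothetical way to map the calculated image luminosity onto such attribute values is sketched below; the normalisation range and the linear mapping are illustrative assumptions, not values specified by the patent.

```python
def luminosity_to_attributes(lux, lo=50.0, hi=500.0):
    """Map an estimated screen illuminance to overlay attributes (illustrative mapping).

    Brighter projections get a brighter, more transparent overlay with a finer gradation.
    lo/hi are assumed normalisation bounds in lux.
    """
    t = min(max((lux - lo) / (hi - lo), 0.0), 1.0)   # clamp to 0..1
    return {
        "brightness": 0.4 + 0.6 * t,        # 0.4 .. 1.0
        "transparency": 0.2 + 0.6 * t,      # more see-through when brighter
        "gradation_steps": int(4 + 12 * t), # more colour steps when brighter
    }

print(luminosity_to_attributes(120.0))
```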
In step S262, the processing unit 110 superimposes and draws the projection area image at the projection position of the projection area 220 in the projection field image 240; the superimposed drawing may be a simple overwrite drawing or a drawing with transparency processing such as α blending. When the depth grasping image 260 is displayed, in steps S261 and S262 the processing unit 110 superimposes and draws, on the depth grasping image of the display unit 140, the projection position of the projection area 220 (for example, the distance 253 from the wall and the height 254 from the floor, that is, the distances from reference points such as a wall or the floor to the projection position of the projection area 220) and the projection area image.
Next, with reference to fig. 20, a projector interference determination process called in the process of step S122 of the setting simulation process (fig. 11) described above will be described.
First, the processing unit 110 calculates a projector-occupied space, which is a space occupied by the projector 210, based on the installation position of the projector 210 and the projector 3D modeling data (step S301). Next, the processing unit 110 acquires the three-dimensional position of the feature point included in the installation site image 230 (step S302).
Then, the processing unit 110 determines whether or not any of the feature points acquired in step S302 (feature points that obstruct installation, for example feature points on an object that intrudes into the projector-occupied space) exists inside the projector-occupied space calculated in step S301 (step S303). If no feature point exists inside the projector-occupied space (no in step S303), no object interferes with the projector 210; therefore the projector interference determination process ends without displaying a warning, and the process returns to step S123 of the setting simulation process (fig. 11).
If a feature point exists inside the projector-occupied space (yes in step S303), an object interfering with the projector 210 exists, so the processing unit 110 displays a warning on the display unit 140 (a warning indicating that the projector 210 cannot be installed at the current installation position), for example by blinking the image of the projector 210, changing its color, or displaying a message such as "the projector interferes with another object" (step S304). Then, the projector interference determination process ends, and the process returns to step S123 of the setting simulation process (fig. 11).
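The determination of steps S301 to S303 (and the analogous screen determination described next) amounts to testing whether any AR feature point lies inside an occupied space. A minimal sketch follows, in which the occupied space is approximated by an axis-aligned box; treating it as a box, and all names and values, are simplifications assumed here rather than something the patent prescribes.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

Point = Tuple[float, float, float]

@dataclass
class Box:                      # simplified occupied space (axis-aligned)
    min_corner: Point
    max_corner: Point

    def contains(self, p: Point) -> bool:
        return all(lo <= c <= hi for c, lo, hi in zip(p, self.min_corner, self.max_corner))

def occupied_space(install_pos: Point, size: Point) -> Box:
    """Occupied space of a device centred at install_pos with extents (w, h, d)."""
    half = tuple(s / 2 for s in size)
    return Box(tuple(p - h for p, h in zip(install_pos, half)),
               tuple(p + h for p, h in zip(install_pos, half)))

def interferes(space: Box, feature_points: Iterable[Point]) -> bool:
    """True if any AR feature point lies inside the occupied space (cf. step S303 / S323)."""
    return any(space.contains(p) for p in feature_points)

space = occupied_space((1.0, 0.8, 2.0), (0.4, 0.15, 0.3))
print(interferes(space, [(1.1, 0.82, 2.05), (3.0, 0.0, 0.0)]))   # True: the first point is inside
```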
Next, the screen interference determination process called in the process of step S117 of the setting simulation process (fig. 11) described above will be described with reference to fig. 21.
First, the processing section 110 calculates a screen-occupied space (projection-target-occupied space), which is the space occupied by the screen 208, based on the installation position of the screen 208 and the screen 3D modeling data (step S321). Next, the processing unit 110 acquires the three-dimensional positions of the feature points included in the projection field image 240 (step S322).
Then, the processing unit 110 determines whether or not any of the feature points acquired in step S322 (feature points that obstruct installation, for example feature points on an object that intrudes into the screen-occupied space) exists inside the screen-occupied space calculated in step S321 (step S323). If no feature point exists inside the screen-occupied space (step S323: NO), no object interferes with the screen 208; therefore the screen interference determination process ends without displaying a warning, and the process returns to step S118 of the setting simulation process (fig. 11).
If a feature point exists inside the screen-occupied space (yes in step S323), an object interfering with the screen 208 exists, so the processing section 110 displays a warning on the display section 140 (a warning indicating that the screen 208 cannot be installed at the current screen installation position), for example by blinking the image of the screen 208, changing its color, or displaying a message such as "the screen interferes with another object" (step S324). Then, the screen interference determination process ends, and the process returns to step S118 of the setting simulation process (fig. 11).
Next, the projection area interference determination process called in the process of step S126 of the setting simulation process (fig. 11) described above will be described with reference to fig. 22.
First, the processing unit 110 calculates the position of the projection area and the projection-occupied space (the space of a rectangular pyramid having the projection area as its base and the projection center point as its apex) based on the installation position of the projector 210, the position of the projection surface, the projection direction, the projection center point, and the setting values of the projector projection settings (step S351). Since the position of the projection area is calculated in step S260 of the projection area drawing process (fig. 19), the processing unit 110 can reuse that information in step S351. Next, the processing unit 110 acquires the three-dimensional positions of the feature points included in the projection field image 240 (step S352).
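A point lies inside such a rectangular pyramid when it sits between the apex and the base and within the cross-section obtained by shrinking the base toward the apex. The following sketch assumes, purely for illustration, that the projection axis points along +z so that the base is an axis-aligned rectangle; none of the names or values come from the patent.

```python
from typing import Tuple

Point = Tuple[float, float, float]

def inside_projection_pyramid(p: Point,
                              apex: Point,
                              base_center: Point,
                              base_w: float,
                              base_h: float) -> bool:
    """Point-in-pyramid test for a projection volume.

    Simplifying assumption: the projection axis runs along +z, so the base is an
    axis-aligned rectangle (base_w x base_h) centred at base_center.
    """
    depth = base_center[2] - apex[2]
    if depth <= 0:
        return False
    dz = p[2] - apex[2]
    if not (0.0 <= dz <= depth):
        return False
    t = dz / depth                      # 0 at the apex, 1 at the base
    half_w, half_h = t * base_w / 2, t * base_h / 2
    cx = apex[0] + t * (base_center[0] - apex[0])
    cy = apex[1] + t * (base_center[1] - apex[1])
    return abs(p[0] - cx) <= half_w and abs(p[1] - cy) <= half_h

apex = (0.0, 1.0, 0.0)                  # lens position (projection centre point)
base = (0.0, 1.5, 3.0)                  # centre of the projection area on the wall
print(inside_projection_pyramid((0.1, 1.2, 1.5), apex, base, 2.0, 1.2))  # True
print(inside_projection_pyramid((1.5, 1.2, 1.5), apex, base, 2.0, 1.2))  # False
```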
Then, the processing unit 110 determines whether or not any of the feature points acquired in step S352 (feature points that obstruct projection, for example feature points on an object that intrudes into the projection-occupied space and blocks the projection light) exists inside the projection-occupied space calculated in step S351 (step S353). If no feature point exists inside the projection-occupied space (no in step S353), the flow proceeds to step S355.
If a feature point exists inside the projection-occupied space (yes in step S353), the image projected from the projector 210 interferes with the object corresponding to that feature point and proper projection cannot be performed, so the processing unit 110 displays a warning on the display unit 140 (a warning indicating that proper projection cannot be performed at the current projection position), for example by blinking the image representing the projection area 220, changing its color, or displaying a message such as "the projected image interferes with another object" (step S354).
Then, in step S355, the processing section 110 determines whether or not the screen 208 has been set. If the screen 208 is not set (step S355: NO), the processing unit 110 checks the pixel values of the pixels of the projection field image 240 that fall inside the projection area calculated in step S351 (step S356), and determines whether or not the color difference within the projection area is greater than a threshold value (step S357). For example, the processing unit 110 converts the pixel values in the projection area into values in the HSV (Hue, Saturation, Value) color space, and determines that the color difference in the projection area is larger than the threshold if the number of pixels whose saturation S is equal to or larger than a first threshold (for example, pixels with S of 0.1 or more) is equal to or larger than a second threshold (for example, 3% of the pixels in the projection area).
If the color difference in the projection area is equal to or less than the threshold value (step S357: NO), the projection area is found to be a substantially uniform white; therefore no warning about the color difference is displayed, the projection area interference determination process ends, and the process returns to step S127 of the setting simulation process (fig. 11). If the color difference in the projection area exceeds the threshold value (step S357: yes), there is a portion of a different color within the projection area, so the processing unit 110 displays a warning on the display unit 140 (a warning indicating that appropriate projection is not possible at the current projection position), for example by blinking the image representing the projection area 220, changing its color, or displaying a message such as "the projection area is not a uniform color" (step S358). Then, the projection area interference determination process ends, and the process returns to step S127 of the setting simulation process (fig. 11).
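The saturation test described above can be written compactly with the standard-library colorsys conversion; the 0.1 saturation threshold and the 3% pixel ratio follow the example figures in the text, while the function and variable names below are illustrative.

```python
import colorsys
from typing import Iterable, Tuple

RGB = Tuple[int, int, int]   # 0..255 per channel

def projection_area_is_uniform(pixels: Iterable[RGB],
                               s_threshold: float = 0.1,
                               ratio_threshold: float = 0.03) -> bool:
    """Return True if the projection area is close to a uniform white (step S357: NO)."""
    pixels = list(pixels)
    saturated = sum(
        1 for r, g, b in pixels
        if colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[1] >= s_threshold
    )
    return saturated < ratio_threshold * len(pixels)

# 3 of 100 pixels are clearly coloured, so the 3% threshold is reached and a warning applies.
white_wall = [(250, 250, 248)] * 97 + [(200, 120, 80)] * 3
print(projection_area_is_uniform(white_wall))   # False
```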
On the other hand, if the screen 208 has been set (yes in step S355), the processing unit 110 acquires the position and area of the projection surface of the set screen 208 (step S359). Then, the processing unit 110 determines whether or not the projection area calculated in step S351 deviates from the projection surface of the screen 208 acquired in step S359 (step S360). If the projection area is completely contained in the projection surface (no in step S360), the processing unit 110 does not display a warning, ends the projection area interference determination process, and returns to step S127 of the setting simulation process (fig. 11).
If a part of the projection area deviates from the projection surface (yes in step S360), the projection area is not properly projected onto the projection surface of the screen 208, so the processing unit 110 displays a warning on the display unit 140 (a warning indicating that proper projection is not possible at the current projection position and screen position), for example by blinking the image representing the projection area 220, changing its color, or displaying a message such as "the projection area deviates from the screen" (step S361). Then, the projection area interference determination process ends, and the process returns to step S127 of the setting simulation process (fig. 11).
Further, if the user looks at a screen in which the image of the projector 210 is superimposed and drawn on the installation site image 230, the user can grasp from that screen whether the projector 210 interferes with another object (for example, a partition of a cabinet); therefore, in one embodiment, the projector interference determination process of step S122 may be omitted from the setting simulation process (fig. 11).
Further, if the user looks at a screen in which the image representing the projection area 220 is superimposed and drawn on the projection field image 240, the user can grasp from that screen whether projection can be performed properly; therefore, in one embodiment, the projection area interference determination process of step S126 may be omitted from the setting simulation process (fig. 11). Likewise, if the user looks at a screen in which the image of the screen 208 is superimposed and drawn on the projection field image 240, the user can grasp from that screen whether the screen 208 interferes with another object (for example, a beam or a bed); therefore, in one embodiment, the screen interference determination process of step S117 may be omitted from the setting simulation process (fig. 11).
In the setting simulation process described above, when the screen 208 is not set, the projection surface is the plane that appears at the front of the projection field image 240, for example the wall 206 of the projection location; however, the projection surface is not limited to the wall 206. For example, when a screen is installed in front of the wall 206, that screen becomes the projection surface. In that case, the plane that appears at the front of the projection field image 240 is not the wall 206 but the installed screen, so no particular problem arises in the setting simulation process.
Through the installation simulation processing described above, the installation simulation device 100 acquires an image of the installation site where the projector 210 is installed (installation site image 230) and the position where the projector 210 is installed (installation position), and displays the image of the projector 210 at the installation position in the installation site image 230; the user can therefore confirm the conditions at the installation site of the projector 210, such as whether the projector 210 interferes with another object.
The setting simulation device 100 also acquires an image of the location where the projector 210 projects the video (projection field image 240) and the position where the video is projected (projection position), and displays an image representing the projection area 220 (projection area image) at the projection position in the projection field image 240 in a projection size calculated from the distance from the installation position of the projector 210 to the projection position; the user can therefore confirm the projection conditions of the projector 210.
Further, even when the user does not own a screen for the projector, the setting simulation device 100 can display an image representing the screen 208 (screen image) on the projection field image 240, so the screen 208 can be installed virtually. The user can therefore check how an image from the projector 210 would be projected onto the virtually installed screen 208 and confirm the projection conditions of the projector 210 and the screen 208.
Further, since the installation simulation device 100 displays the installation site image 230 and the projection site image 240 on the display unit 140 at the same time, the user can easily confirm how the projection state changes according to the installation state of the projector 210 or the screen 208.
In addition, the setting simulation device 100 calculates the brightness of the image projected by the projector 210 (image luminosity) based on the distance from the installation position of the projector 210 to the projection position, sets at least one attribute value among the brightness, transparency, and gradation of the image representing the projection area 220 (projection area image) based on the calculated image luminosity, and superimposes and displays the projection area image with the set attribute value on the projection field image 240; the user can therefore easily confirm how bright the image will be when projected by the projector 210.
In addition, the installation simulation device 100 acquires the inclination of the projector 210 at installation (installation posture), calculates the projection position and the shape of the projection area image from the installation posture, and displays the projection area image in the calculated shape, superimposed at the calculated projection position on the projection field image 240; the user can therefore easily confirm the position and shape of the projection area 220 when the projector 210 is tilted.
Further, the setup simulation apparatus 100 obtains a projection setting value for setting a state of projection by the projector 210, calculates a projection position and a shape of the projection area image based on the projection setting value, and superimposes and displays the projection area image in the calculated shape at the calculated projection position on the projection location image 240, so that the user can easily confirm the position and shape of the projection area 220 when the projection setting value of the projector 210 is changed.
Further, since the setting simulator 100 obtains the setting value (screen setting value) when the screen 208 is set, calculates the projection position and the shape of the projection area image based on the screen setting value, and displays the projection area image in the calculated shape while superimposing the calculated projection position on the screen image, the user can easily confirm the position and shape of the projection area 220 when the setting value of the screen 208 is changed.
The setting simulation device 100 calculates the three-dimensional positions of the feature points in the installation site image (projector-vicinity three-dimensional positions) and the three-dimensional positions of the feature points in the screen image (projection-target-vicinity three-dimensional positions) based on a plurality of images of the vicinity of the installation site of the projector 210 and a plurality of images of the vicinity of the projection target (the wall 206 or the screen 208; an image of the vicinity of the projection target is hereinafter referred to as a screen vicinity image), calculates the distances from reference points such as a wall or the floor to the installation position of the projector 210 or the screen 208 using these three-dimensional positions, and displays the calculated distances, so that the user can confirm the installation position of the projector 210 concretely and numerically.
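Once the AR system has produced plane estimates for such reference surfaces, each displayed distance reduces to a point-to-plane distance. A minimal sketch is shown below, assuming the planes are given as a point plus a normal vector; this representation and all values are assumptions made here for illustration.

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]

def point_plane_distance(p: Vec3, plane_point: Vec3, plane_normal: Vec3) -> float:
    """Perpendicular distance from p to the plane through plane_point with the given normal."""
    n_len = math.sqrt(sum(c * c for c in plane_normal))
    return abs(sum((pc - qc) * nc for pc, qc, nc in zip(p, plane_point, plane_normal))) / n_len

projector_pos = (1.2, 0.9, 2.4)
floor = ((0.0, 0.0, 0.0), (0.0, 1.0, 0.0))   # a point on the floor, upward normal
wall = ((0.0, 0.0, 3.0), (0.0, 0.0, -1.0))   # a point on the wall, normal facing into the room
print(f"height above floor: {point_plane_distance(projector_pos, *floor):.2f} m")
print(f"distance from wall: {point_plane_distance(projector_pos, *wall):.2f} m")
```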
Further, since the installation simulation device 100 acquires correction data corresponding to the calculated distance (calculated distance) and corrects the calculated distance based on the acquired correction data, the distance indicating the installation position of the projector 210 or the screen 208 can be displayed more accurately.
In addition, the setting simulation device 100 determines, based on the installation position of the projector 210 or the screen 208 and the three-dimensional positions of the feature points, whether the projector 210 or the screen 208 can be installed without interfering with other objects, and displays a warning that the projector 210 or the screen 208 cannot be installed at that installation position when it cannot; the user can therefore easily confirm whether the projector 210 or the screen 208 can be installed.
Modification 1
In the above-described installation simulation processing (fig. 10 and 11), in step S111, the processing unit 110 calculates, for example, a position where the projector 210 is closest to the wall surface at the center of the installation site image 230 as an initial position of the projector 210. In this case, for example, as shown in fig. 5, the projector 210 is initially disposed at a position interfering with the partition of the drawer-equipped cabinet 202. Therefore, modification 1 of searching for a position that does not interfere with another object as the initial position of projector 210 will be described.
In modification 1, the projector initial position calculation process described later is called in step S111 of the setting simulation process (figs. 10 and 11). Except for this point, modification 1 is the same as the embodiment described above, so duplicate description is omitted. Next, the projector initial position calculation process of modification 1 will be described with reference to fig. 23.
First, the processing section 110 determines whether or not the screen 208 has been set (step S401). If the screen 208 is not set (step S401: NO), the processing unit 110 calculates the projection surface of the projection field image 240 (the surface of the wall 206 or the like onto which the projector 210 projects the video) (step S402). If the screen 208 is set (step S401: yes), the processing section 110 calculates the projection surface of the screen 208 based on the modeling data of the screen 208 and the setting values set in the screen settings (screen setting values), such as the pull-up (pull-down) height, the distance from the wall, and the inclination (step S403).
Then, the processing section 110 acquires, from the storage section 120, the recommended projection size associated with the projector name set in the projector settings, and calculates the optimum distance from the projection surface calculated in step S402 or S403 based on that recommended projection size (step S404). Then, using the central portion of the room separated from the projection surface by the optimum distance (that is, w = 1/2 of the lateral width of the room, h = 1/2 of the height of the room, d = the position separated from the projection surface by the optimum distance) as a temporary position of the projector 210, the processing unit 110 performs the interference determination of the projector 210 on the assumption that the projector 210 is installed at that temporary position (step S405). The interference determination of the projector 210 can be performed by the same process as the projector interference determination process (fig. 20), except that step S304 (display of a warning) need not be performed.
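The optimum distance of step S404 and the initial temporary position of step S405 can be sketched as follows; relating the recommended projection size to a distance through a throw ratio is an assumption introduced here for illustration, as are all names and values.

```python
def optimal_distance(recommended_width_m: float, throw_ratio: float = 1.2) -> float:
    """Distance at which the projected image reaches the recommended width (illustrative model)."""
    return recommended_width_m * throw_ratio

def initial_temporary_position(room_w: float, room_h: float,
                               wall_z: float, recommended_width_m: float):
    """Room centre in width and height, at the optimum distance in front of the projection surface."""
    d = optimal_distance(recommended_width_m)
    return (room_w / 2, room_h / 2, wall_z - d)

print(initial_temporary_position(room_w=4.0, room_h=2.4, wall_z=5.0, recommended_width_m=2.0))
# -> approximately (2.0, 1.2, 2.6): the candidate position checked for interference in step S405
```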
Then, the processing unit 110 determines whether or not interference has occurred in the interference determination (including not only the projector interference determination of step S405 but also the projection area interference determination of step S412) (step S406). If there is no interference (step S406: NO), the processing unit 110 sets, as the initial position of the projector 210, the temporary position for which the interference determination yielded a result of no interference (step S407). Then, the projector initial position calculation process ends, and the process returns to step S112 of the setting simulation process (figs. 10 and 11).
If there is interference (step S406: yes), the processing section 110 searches for a position (a temporary position of the projector 210) where the projector 210 does not interfere with other objects based on the feature points in the room obtained by the AR system (step S408). Then, the processing unit 110 determines whether or not a temporary position where the interference determination of the projection area 220 has not been performed is found by the search (step S409).
If no temporary position for which the interference determination of the projection area 220 has not yet been performed can be found (step S409: no), there is no installation position at which neither object interference nor projection area interference occurs; therefore, the central portion of the room separated from the projection surface by the optimum distance (w = 1/2 of the lateral width of the room, h = 1/2 of the height of the room, d = the position separated from the projection surface by the optimum distance), that is, the initial temporary position, is set as the initial position of the projector 210 (step S410). Then, the projector initial position calculation process ends, and the process returns to step S112 of the setting simulation process (figs. 10 and 11).
If a temporary position where the interference determination of the projection area 220 has not been made is found (step S409: yes), the processing section 110 determines whether or not the distance from the found temporary position to the projection surface is within the recommended projection distance range of the projector 210 stored in the storage section 120 (step S411). If it is not within the recommended projection distance range (step S411: NO), the process returns to step S408, and the search for a temporary position of the projector 210 that does not interfere with another object is repeated.
If the projection distance is within the recommended projection distance range (step S411: yes), the processing unit 110 performs interference determination of the projection area 220 assuming that the projector 210 is provided at the provisional position (step S412). The interference determination of the projection area 220 can be performed by the same process as the projection area interference determination process (fig. 22), but the processes of step S354, step S358, and step S361 (display of warning) may not be performed. Then, the processing unit 110 returns to step S406, and determines whether or not interference has occurred in the interference determination in the projection area 220.
With the above-described projector initial position calculation processing, in modification 1, when the projector 210 can be installed in a place where there is no interference, the installation simulation is performed in a state where the projector 210 is initially installed in such a place, and thus the effort of the user to move the projector 210 to the position where there is no interference can be reduced.
Further, since the setting simulation device 100 obtains the optimum distance from the setting position of the projector 210 to the projection surface and sets the position calculated based on the projection surface and the optimum distance as the initial position of the projector 210, the user can easily confirm the situation when the projector 210 is set at the optimum distance from the projection surface.
Modification 2
In the embodiment and modification 1 described above, it is determined whether or not the projector 210 interferes with another object, and a warning is displayed when it does. However, when the projector 210 is to be placed on a table or the like, not only must the projector 210 not interfere with other objects, but a plane on which the projector 210 can be placed must also exist directly below it. Accordingly, in the projector interference determination process, a settable determination that checks whether a plane on which the projector 210 can be placed exists directly below the projector 210 may additionally be performed. Modification 2, which adds this settable determination for the projector 210, will be described.
In modification 2, in the projector interference determination process (fig. 20), the processing unit 110 determines whether or not a plane on which the projector 210 can be placed exists directly below the projector-occupied space (for example, when the determination in step S303 is NO, or in the processing following step S304), and if no such plane exists, displays a warning such as "the projector 210 cannot be placed at this location", indicating that the projector 210 cannot be installed at the currently displayed position.
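One hypothetical way to realise this settable determination is to look for a detected horizontal plane directly under the footprint of the projector-occupied space; the plane representation, the footprint format, and the tolerance below are all assumptions made for this sketch, not details given in the patent.

```python
from typing import Iterable, Tuple

Rect = Tuple[float, float, float, float]   # x_min, x_max, z_min, z_max (horizontal footprint)

def plane_supports(footprint: Rect, bottom_y: float,
                   planes: Iterable[Tuple[Rect, float]],
                   tolerance: float = 0.02) -> bool:
    """True if some detected horizontal plane lies directly below the projector footprint.

    planes: iterable of (horizontal extent, height y) as assumed output of the AR system.
    """
    fx0, fx1, fz0, fz1 = footprint
    for (px0, px1, pz0, pz1), py in planes:
        covers = px0 <= fx0 and px1 >= fx1 and pz0 <= fz0 and pz1 >= fz1
        just_below = 0.0 <= bottom_y - py <= tolerance
        if covers and just_below:
            return True
    return False

table = ((0.0, 1.5, 1.0, 2.0), 0.75)       # a detected table surface at 0.75 m height
print(plane_supports((0.4, 0.8, 1.2, 1.5), bottom_y=0.75, planes=[table]))   # True
```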
In modification 2, in the interference determination of the projector 210 in steps S405 and S408 of the projector initial position calculation process (fig. 23), the processing unit 110 likewise determines whether or not a plane on which the projector 210 can be placed exists directly below the projector-occupied space, and treats the absence of such a plane as "interference". Then, in step S408, the processing section 110 searches for a temporary position at which the projector 210 does not interfere with other objects and a plane on which it can be placed exists.
In modification 2, by performing the settable determination described above, when there is a position at which the projector 210 can be placed without interference, the setting simulation starts with the projector 210 placed at such a location, so the effort of the user to move the projector 210 to a position where it can be placed is reduced.
Modification 3
In the setting simulation process (figs. 10 and 11) described above, in step S109 the processing unit 110 calculates, for example, the position in the projection field image 240 at which the screen 208 is closest to the wall surface as the initial position of the screen 208. If the screen 208 is a wall-mounted screen, it may be possible to install it at such a position; for a floor-standing screen, however, installation at such a position may be impossible because space for placing the screen 208 must exist on the floor. Therefore, modification 3, in which a position where the screen 208 can reliably be installed is searched for and used as the initial position of the screen 208, will be described.
In modification 3, a screen initial position calculation process (projection target initial position calculation process) described later is invoked in step S109 of the setting simulation process (figs. 10 and 11). Except for this point, modification 3 is the same as the embodiment described above, so duplicate description is omitted. Next, the screen initial position calculation process of modification 3 will be described with reference to fig. 24.
First, the processing unit 110 determines whether or not the screen 208 set in the screen setting is set in a wall-mounted manner (step S451). If it is not a wall-mounted type (floor type) (step S451: NO), the processing section 110 calculates an occupied space of the screen 208 in consideration of the installation area of the screen 208 on the floor surface and the height and width of the variable portion of the screen 208, based on the modeling data of the screen 208 (step S452).
If it is a wall-mounted type (step S451: yes), the processing section 110 calculates the occupied space of the screen 208 based on the modeling data of the screen 208, taking into account the installation area of the wall or ceiling portion where the screen 208 is installed and the height and width of the variable portion of the screen 208 (step S453).
Then, the processing unit 110 uses, as a temporary position of the screen 208, the position in the projection field image 240 at which the screen 208 is closest to the inner wall surface, and performs the interference determination of the screen 208 on the assumption that the screen 208 is installed at that temporary position (step S454). The interference determination of the screen 208 can be performed by the same process as the screen interference determination process (fig. 21), except that step S324 (display of a warning) need not be performed.
Then, the processing section 110 determines whether or not interference has occurred in the interference determination (including not only the screen interference determination of step S454 but also the projection area interference determination of step S461) (step S455). If there is no interference (step S455: NO), the processing unit 110 sets, as the initial position of the screen 208, the temporary position for which the interference determination yielded a result of no interference (step S456). Then, the screen initial position calculation process ends, and the process returns to step S110 of the setting simulation process (figs. 10 and 11).
If there is interference (step S455: yes), the processing section 110 searches for a position (a temporary position of the screen 208) where the screen 208 does not interfere with other objects based on the feature points in the room obtained by the AR system (step S457). Then, the processing unit 110 determines whether or not a temporary position where the interference determination of the projection area 220 has not been performed has been found by the search (step S458).
If no temporary position for which the interference determination of the projection area 220 has not yet been performed can be found (step S458: NO), there is no installation position at which neither object interference nor projection area interference occurs; therefore, the position in the projection field image 240 at which the screen 208 is closest to the inner wall surface, that is, the initial temporary position, is set as the initial position of the screen 208 (step S459). Then, the screen initial position calculation process ends, and the process returns to step S110 of the setting simulation process (figs. 10 and 11).
If a temporary position for which the interference determination of the projection area 220 has not yet been performed is found (step S458: yes), the processing section 110 determines whether or not the screen 208 can be installed at the found temporary position (step S460). Here, "the screen 208 can be installed" means that, if the screen 208 is of the floor-standing type, a plane on which it can be placed (the floor or the like) exists directly below the temporary position, and, if the screen 208 is of the wall-mounted type, a wall or ceiling from which the screen 208 can be hung exists above the temporary position.
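Expressed as a sketch, the settable determination of step S460 is a simple case distinction on the screen type; the function and argument names below are illustrative, not taken from the patent.

```python
def screen_can_be_placed(screen_type: str,
                         has_floor_below: bool,
                         has_wall_or_ceiling_above: bool) -> bool:
    """Settable determination of step S460: a floor-standing screen needs a floor directly
    below the temporary position, a wall-mounted screen needs a wall or ceiling above it."""
    if screen_type == "floor":
        return has_floor_below
    if screen_type == "wall":
        return has_wall_or_ceiling_above
    raise ValueError(f"unknown screen type: {screen_type}")

print(screen_can_be_placed("floor", has_floor_below=True, has_wall_or_ceiling_above=False))  # True
print(screen_can_be_placed("wall", has_floor_below=True, has_wall_or_ceiling_above=False))   # False
```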
If it cannot be installed (step S460: NO), the processing section 110 returns to step S457 and repeats the search for a temporary position of the screen 208 that does not interfere with another object. If it can be installed (step S460: yes), the processing section 110 performs the interference determination of the projection area 220 on the assumption that the screen 208 is installed at the temporary position (step S461). The interference determination of the projection area 220 can be performed by the same process as the projection area interference determination process (fig. 22), except that the processes of step S354, step S358, and step S361 (display of a warning) need not be performed. Then, the processing unit 110 returns to step S455 and determines whether or not interference has occurred in the interference determination of the projection area 220.
In modification 3, by the screen initial position calculation processing described above, when the screen 208 can be set at a place where there is no interference, setting simulation is performed in a state where the screen 208 is set at such a position from the beginning, and therefore, it is possible to reduce the effort of the user to move the screen 208 to a position where there is no interference.
Modification 4
In the installation simulation device 100 described above, for example, as shown in fig. 5, the processing unit 110 displays the installation site image 230 on the left side of the display unit 140 and the projection site image 240 on the right side. As shown in fig. 7, in the case of displaying the depth grasping image 260, the depth grasping image is displayed instead of the installation site image 230. However, the display method is not limited to this method. For example, as shown in fig. 25, the installation site image 230, the projection site image 240, the depth grasping image 260, and the menu display image 299 may be displayed simultaneously. Modification 4 of the display method will be described.
In modification 4, when the user touches "projector setting" in the menu display image 299, the menu display image 299 is replaced with the projector setting screen 270, and the user can configure the projector. Further, when "projection area image setting" is touched, a menu for the kind and attribute values (color, density, gradation, etc.) of the image of the projection area 220 is displayed so that these can be set.
In modification 4, the installation site image 230, the projection site image 240, the depth grasping image 260, and the menu display image 299 are displayed simultaneously, so the effort of switching screens is reduced and the menu display image 299 does not hide the other images.
Modification 5
In the embodiment and modifications described above, a plurality of images such as the installation site image 230 and the projection site image 240 are displayed on the display unit 140, but a plurality of images do not necessarily have to be displayed. The display unit 140 may display one image at a time (for example, only the installation site image 230), and the images may be switched by a user operation. Modification 5 of the display method will be described.
In modification 5, the user operation for switching the image may be arbitrary; for example, the display may switch to the next image when the user performs a swipe operation on the currently displayed image. For example, the images may be switched as follows: when the installation site image 230 is swiped, the projection site image 240 is displayed instead of the installation site image 230; when the projection site image 240 is swiped, the depth grasping image 260 is displayed instead of the projection site image 240; and when the depth grasping image 260 is swiped, the installation site image 230 is displayed instead of the depth grasping image 260.
Further, for example, when the user clicks the menu display button 256 displayed on the display section 140, a menu having selectable items of image names (for example, "set place image", "projected place image", and "depth grasp image") may be displayed, and when the user clicks the image name within the menu, an image of the image name may be displayed on the display section 140.
In modification 5, the processing unit 110 displays one image at a time on the display unit 140, so the image can be displayed large. Therefore, when the installation site image 230 is displayed, the user can easily check details of the installation state of the projector 210, and when the projection field image 240 is displayed, the user can easily check details of how the projector 210 performs projection.
The installation simulation device 100 may be realized by a tablet PC provided with a camera, a notebook PC (Personal Computer), or the like. Specifically, in the above embodiment, a case has been described in which the program for the setting simulation processing and the like executed by the setting simulation device 100 is stored in the storage unit 120 in advance. However, the program may be stored and distributed on a non-transitory computer-readable recording medium such as a floppy disk, a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc), an MO (Magneto-Optical) disk, a memory card, or a USB memory, and the program may be read into a computer and installed to constitute a computer capable of executing the respective processes described above.
The program may also be distributed by being superimposed on a carrier wave and delivered via a communication medium such as the Internet. For example, the program may be distributed by being posted on a bulletin board system (BBS) on a communication network. Further, the above-described processes can be executed by starting this program and running it under the control of an OS (Operating System) in the same manner as other application programs.
The processing unit 110 may be configured by combining an arbitrary processor such as a single processor, a multiprocessor, or a multicore processor with a processing circuit such as an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or the like.
While the preferred embodiments of the present invention have been described above, the present invention is not limited to the specific embodiments, and the present invention includes the inventions described in the claims and their equivalents.
Claims (18)
1. A non-transitory computer-readable recording medium having recorded thereon a setup simulation program for causing a computer to acquire a setup site image, which is an image of a setup site where a projector is set up, a projector setup position, which is a position where the projector is set up in the setup site, and a projection site image, which is an image of a site where an image is projected by the projector or a site where a subject to be projected is installed,
As one or both of a process of displaying a projector-set image and a process of displaying a projection-area-set image, the setting simulation program causes a computer to execute one or both of a process of displaying the projector-set image corresponding to the projection-area-set image and a process of displaying the projection-area-set image corresponding to the projector-set image, the projector-set image having an image of the projector at the acquired projector setup position of the acquired setup site image; and the projection-area-set image being a projection area image that is an image showing the area in which the video is projected, in the acquired projection site image.
2. The recording medium of claim 1, wherein,
The setup simulation program causes the computer to execute:
a projection position which is a position of the object to be projected or the projection location where the image is projected is obtained,
The projection area image is displayed on the acquired projection position in a projection size calculated based on a distance from the projector installation position to the projection position.
3. The recording medium of claim 2, wherein,
The setup simulation program causes the computer to execute:
Calculating an image luminosity representing the brightness of the image based on a distance from the projector-set position to the projection position,
Setting at least one attribute value of brightness, transparency, and gradation of the projection area image based on the calculated image luminosity,
And displaying a projection area image based on the set attribute value.
4. The recording medium according to claim 2 or 3, wherein,
The setup simulation program causes the computer to execute:
Acquiring a setting posture which is an inclination of the projector when the projector is set, calculating a shape of the projection position and the projection area image based on the acquired setting posture,
And displaying the projection area image in the calculated shape at the calculated projection position.
5. The recording medium according to claim 2 or 3, wherein,
The setup simulation program causes the computer to execute:
a projection setting value for setting a projection state of the projector is obtained,
Calculating the projection position and the shape of the projection area image based on the obtained projection setting value,
And displaying the projection area image in the calculated shape at the calculated projection position.
6. The recording medium according to any one of claim 1 to 3, wherein,
The setup simulation program causes the computer to execute:
obtain the information of the projected object including the type of the projected object and the setting position of the projected object,
Generating a projected object image representing the projected object based on the projected object information,
Displaying the projected object image on the projection site image based on the projected object information,
And displaying the projection area image on the projected object image.
7. The recording medium according to claim 2 or 3, wherein,
The setup simulation program causes the computer to execute:
obtain the information of the projected object including the type of the projected object and the setting position of the projected object,
Generating a projected object image representing the projected object based on the projected object information,
Displaying the projected object image on the projection site image based on the projected object information,
Calculating the projection position and the shape of the projection area image based on the acquired projection subject information,
And displaying the projection region image in the calculated shape at the calculated projection position of the projected object image.
8. The recording medium according to any one of claim 1 to 3, wherein,
The setup simulation program causes the computer to execute:
A plurality of images of the vicinity of the installation site of the projector are acquired,
Calculating a three-dimensional position of a feature point in the installation site image, that is, a projector-vicinity three-dimensional position, based on the acquired plurality of images,
A distance from a reference point to the projector setting position is calculated using the calculated three-dimensional position near the projector,
Displaying the calculated distance.
9. The recording medium of claim 8, wherein,
The setup simulation program causes the computer to execute:
based on the obtained projector setting position and the calculated three-dimensional position near the projector, it is determined whether the projector can be set at the projector setting position, and if it is determined that the projector cannot be set at the projector setting position, it is displayed that the projector cannot be set at the projector setting position.
10. The recording medium of claim 9, wherein,
The setup simulation program causes the computer to execute:
Calculating a projector-occupied space, which is a space occupied by the projector, based on the obtained projector setting position and projector 3D modeling data,
Determining whether or not a feature point exists inside the calculated projector-occupied space,
When the feature point exists in the projector-occupied space, it is determined that the projector cannot be installed at the projector installation position.
11. The recording medium according to any one of claim 1 to 3, wherein,
The setup simulation program causes the computer to execute:
Acquiring a plurality of projection target vicinity images, the projection target vicinity images being images in the vicinity of the installation position of the projection target,
Calculating a three-dimensional position of a feature point in the vicinity of the object to be projected, that is, a vicinity of the object to be projected, based on the acquired plurality of vicinity images of the object to be projected,
Calculating a distance from a reference point to a set position of the projected object using the calculated three-dimensional position near the projected object,
Displaying the calculated distance.
12. The recording medium of claim 11, wherein,
The setup simulation program causes the computer to execute:
Determining whether or not the object can be set at the object setting position based on the acquired object setting position and the calculated three-dimensional position near the object,
If it is determined that the object cannot be set at the object setting position, it is displayed that the object cannot be set at the object setting position.
13. The recording medium of claim 12, wherein,
The setup simulation program causes the computer to execute:
Calculating a projected-object-occupied space, which is a space occupied by the projected object, based on the acquired projected object setting position and projected object 3D modeling data,
Determining whether or not a feature point exists inside the calculated projected-object-occupied space,
When the feature point exists in the projected-object-occupied space, it is determined that the projected object cannot be installed at the projected object setting position.
14. The recording medium of claim 8, wherein,
The setup simulation program causes the computer to execute:
Obtaining correction data corresponding to the calculated distance,
Correcting the calculated distance based on the acquired correction data.
15. The recording medium according to any one of claim 1 to 3, wherein,
The setup simulation program causes the computer to execute:
obtaining an optimal distance which is an optimal value of a distance from a projection surface which is a surface on which the image is projected by the projector to a position where the projector is provided,
And acquiring a position calculated based on the projection surface and the optimal distance as the projector setting position.
16. The recording medium according to any one of claim 1 to 3, wherein,
The setup simulation program causes the computer to execute:
and displaying the installation site image and the projection site image in an array on a display unit of the computer.
17. A setting simulation method, wherein,
Obtaining an installation site image, which is an image of an installation site where a projector is installed, a projector installation position, which is a position where the projector is installed in the installation site, and a projection site image, which is an image of a site where a video is projected by the projector or a site where a subject to which the video is projected is installed,
As either or both of a process of displaying a projector setting image and a process of displaying a projection area setting image, either or both of a process of displaying the projector setting image corresponding to the projection area setting image and a process of displaying the projection area setting image corresponding to the projector setting image are performed: the projector setting image has an image of the projector at the acquired projector installation position of the acquired installation site image; and the projection area setting image is a projection area image that is an image showing the area in which the video is projected, in the acquired projection site image.
18. A setup simulation device is provided with:
A processing section; and
A display unit for displaying the image of the object,
The processing unit acquires an installation site image that is an image of an installation site where a projector is installed, a projector installation position that is a position where the projector is installed in the installation site, and a projection site image that is an image of a site where an image is projected by the projector or a site where an object to be projected is installed,
The processing section displays, on the display section, two or one of a projector-set image corresponding to a projection-area-set image and the projection-area-set image corresponding to the projector-set image: the projector setting image depicts an image of the projector on the acquired projector setting position of the acquired setting place image; the projection area setting image depicts a projection area image, which is an image representing an area in which the video is projected, in the acquired projection field image.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2021-147181 | 2021-09-09 | |
JP2021147181 | 2021-09-09 | |
JP2022000075A JP7424397B2 (en) | 2021-09-09 | 2022-01-04 | Installation simulation program, installation simulation method, and installation simulation device
JP2022-000075 | | 2022-01-04 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115802014A CN115802014A (en) | 2023-03-14 |
CN115802014B true CN115802014B (en) | 2024-10-29 |
Family
ID=85431954
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202211133202.4A Active CN115802014B (en) | Recording medium, setting simulation method, and setting simulation device | 2021-09-09 | 2022-09-09
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115802014B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102736846A (en) * | 2011-03-31 | 2012-10-17 | 夏普株式会社 | Detection area enlarging device, display device, and detection area enlarging method |
CN103037186A (en) * | 2011-09-28 | 2013-04-10 | 卡西欧计算机株式会社 | Projector control device and projector control method |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6333826B1 (en) * | 1997-04-16 | 2001-12-25 | Jeffrey R. Charles | Omniramic optical system having central coverage means which is associated with a camera, projector, or similar article |
US9132346B2 (en) * | 2012-04-04 | 2015-09-15 | Kenneth J. Huebner | Connecting video objects and physical objects for handheld projectors |
US20190116356A1 (en) * | 2016-04-15 | 2019-04-18 | Sony Corporation | Information processing apparatus, information processing method, and program |
US11082670B2 (en) * | 2017-07-14 | 2021-08-03 | Sony Corporation | Information processing apparatus, information processing method, and program |
CN207740688U (en) * | 2018-01-23 | 2018-08-17 | 视缘(上海)智能科技有限公司 | Three-dimensional scenic simulator |
DE112019006855T5 (en) * | 2019-02-13 | 2021-11-04 | Mitsubishi Electric Corporation | SIMULATION DEVICE AND SIMULATION METHOD |
JP7160018B2 (en) * | 2019-11-07 | 2022-10-25 | カシオ計算機株式会社 | Projection device, installation state detection method, and installation state detection program |
CN111627039A (en) * | 2020-05-09 | 2020-09-04 | 北京小狗智能机器人技术有限公司 | Interaction system and interaction method based on image recognition |
Also Published As
Publication number | Publication date |
---|---|
CN115802014A (en) | 2023-03-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant |