
WO2018135141A1 - Information processing device, information processing method, and projection system - Google Patents

Information processing device, information processing method, and projection system

Info

Publication number
WO2018135141A1
WO2018135141A1 (PCT Application No. PCT/JP2017/042835)
Authority
WO
WIPO (PCT)
Prior art keywords
projector
information processing
control
processing apparatus
control unit
Prior art date
Application number
PCT/JP2017/042835
Other languages
French (fr)
Japanese (ja)
Inventor
高尾 宜之
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2018135141A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a projection system.
  • various adjustments relating to the projection are performed manually (operations by the user, etc.) in order to perform projection according to the user's expectations. For example, when the projection area of the projector is different from the projection area expected by the user, the projection area is adjusted using the lens adjustment function of the projector.
  • Non-Patent Document 1 describes that an image subjected to geometric correction processing is input to a projector based on projection destination information obtained by photographing with a camera.
  • when the adjustment is performed manually, the human burden is large.
  • in particular, when a plurality of projectors is used, the adjustment is performed for each projector, and thus the human burden may be enormous.
  • when the image input to the projector is corrected by image processing, the human burden is reduced, but the image quality of the image seen by the user may be reduced.
  • the present disclosure proposes a new and improved information processing apparatus, information processing method, and projection system capable of reducing the human burden while suppressing deterioration in image quality.
  • according to the present disclosure, there is provided an information processing apparatus including a control unit that controls a projector based on a simulation image including the projection area of the projector in a pre-design and a captured image including the projection area of the projector.
  • according to the present disclosure, there is also provided an information processing method including a processor controlling a projector based on a simulation image including the projection area of the projector in a pre-design and a captured image including the projection area of the projector.
  • according to the present disclosure, there is further provided a projection system including: a projector that projects a projection image; an imaging device that acquires a captured image including the projection area of the projector; and an information processing apparatus having a control unit that controls the projector based on a simulation image including the projection area of the projector in a pre-design and the captured image.
  • FIG. 4 is an explanatory diagram for explaining another example of projector control by the projector control unit 318 according to the embodiment.
  • FIG. 5 is a flowchart showing an operation example of the embodiment. FIG. 6 is an explanatory diagram showing an example of a screen displayed on a display unit (not shown) in the pre-design. FIG. 7 is a flowchart showing the more detailed processing of step S13. FIG. 8 is a flowchart showing the more detailed processing of step S15. FIG. 9 is an explanatory diagram showing a hardware configuration example.
  • in the present specification and drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by adding different letters after the same reference numeral.
  • when it is not necessary to particularly distinguish each of a plurality of constituent elements having substantially the same functional configuration, only the same reference numeral is given.
  • FIG. 1 is an explanatory diagram for explaining a schematic configuration of a projection system 1 according to the present embodiment.
  • the projection system 1 includes one or more projectors 10 (projection devices) that project an image (a still image or a moving image) onto a screen 50, a camera 20 (imaging device), an information processing device 30, and a communication network 80.
  • the projector 10 projects an image on the screen 50.
  • the projector 10 may include a plurality of projectors 10A to 10D.
  • the number of projectors included in projector 10 is not limited to the example shown in FIG.
  • the camera 20 is an imaging device that captures an image projected by the projector 10 and acquires a captured image.
  • the camera 20 may be arranged at a viewpoint overlooking the projections by the projectors 10A to 10D, and the camera 20 acquires captured images including actual projection areas of the projectors 10A to 10D.
  • the information processing apparatus 30 performs information processing related to the projection system 1. As shown in FIG. 1, the information processing apparatus 30 is connected to the communication network 80 and may input (provide) images for projection (hereinafter sometimes referred to as projection images) to each of the projectors 10A to 10D via the communication network 80. The information processing apparatus 30 may also receive, via the communication network 80, a captured image acquired by the camera 20. Details of the information processing apparatus 30 will be described later with reference to FIGS. 2 to 4.
  • the screen 50 is a projection destination by the projector 10.
  • the screen 50 may have a dome shape, for example.
  • in FIG. 1, a cross section obtained by cutting the dome-shaped screen 50 horizontally is shown.
  • the shape of the screen 50 is not limited to a dome shape, and may be a curved surface or a flat surface.
  • the communication network 80 is a wired or wireless transmission path for information transmitted from a device connected to the communication network 80.
  • the communication network 80 may include public networks such as the Internet, a telephone network, and a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like.
  • the communication network 80 may include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • the projection area expected by the user may be different from the projection area of the projector.
  • the projection area expected by the user may be a projection area assumed when, for example, the screen (projection destination) and the arrangement (position and orientation) of the projector are designed in advance.
  • for example, when the image is projected at a position different (shifted) from the position expected by the user, it is desirable to adjust the position of the projection area. If the projection area does not match the screen size, it is desirable to adjust the size of the projection area.
  • when the shape of the projection destination is not a plane, as with the screen 50 shown in FIG. 1, it is desirable to adjust the projection area according to the shape of the projection destination, and a more detailed adjustment than in the case of projecting onto a plane may be required. Also, as shown in FIG. 1, when images can be projected by a plurality of projectors onto overlapping or neighboring areas, a detailed adjustment that takes the projection areas of the other projectors into account is required. When a plurality of projectors is used, it is desirable not only to adjust the projection areas, but also to make adjustments so that colors, brightness, and the like are consistent among the plurality of projectors.
  • for such adjustments, manual adjustment (for example, operations by the user) has conventionally been performed. A projector can be equipped with, for example, a lens shift function that adjusts the position of the projection area by moving the lens up, down, left, or right, a zoom function that adjusts the size of the projection area by moving the lens back and forth, a color adjusting function, a brightness adjusting function, and the like.
  • with such a method, the user performs the adjustment while visually checking what is actually being projected by the projector, so the human burden is large. In particular, when a plurality of projectors is used, the labor for adjustment increases according to the number of projectors, which may make the human burden very large.
  • a method of correcting a projection image input to the projector by image processing is also conceivable.
  • for example, a projection image subjected to reduction processing by the information processing device 30, based on a captured image acquired by the camera 20, may be input to the projector 10.
  • by such image processing, it is also possible to project so that colors and brightness are similar among the plurality of projectors.
  • the projection system 1 according to an embodiment of the present disclosure has been created with the above circumstances as a focus.
  • in the projection system 1 according to the present embodiment, the information processing apparatus 30 controls the projector 10 and automatically adjusts the projection based on the captured image acquired by the camera 20, which makes it possible to reduce the human burden while suppressing deterioration in image quality.
  • Control of the projector 10 by the information processing apparatus 30 may be performed, for example, by transmitting from the information processing apparatus 30 to the projector 10 a projector control signal similar to a user operation via the above-described buttons or remote controller. That is, according to this embodiment, the same adjustment as the adjustment by a user operation using the functions of the projector 10 described above is performed automatically, so the human burden can be reduced.
  • in addition, the adjustment using the functions of the projector 10 makes it possible to suppress a decrease in image quality compared with the case where the above-described image processing is performed. For example, when the projection area is adjusted by controlling the projector 10, only the projection area changes and the number of pixels used for projection does not change. In addition, since control that changes the maximum output value and the minimum output value of the projector 10 is possible, it is possible to adjust color and brightness while suppressing the above-described deterioration in image quality.
  • FIG. 2 is a block diagram illustrating a configuration example of the information processing apparatus 30 illustrated in FIG.
  • the information processing apparatus 30 according to the present embodiment includes a control unit 310, a communication unit 320, and a storage unit 350.
  • the control unit 310 controls each component of the information processing apparatus 30. Further, as shown in FIG. 2, the control unit 310 also functions as a communication control unit 312, an imaging control unit 314, a simulation image generation unit 316, and a projector control unit 318.
  • the communication control unit 312 controls communication by the communication unit 320.
  • the communication control unit 312 controls the communication unit 320 to transmit a projection image and a projector control signal for controlling the projector 10 to the projector 10 (projection apparatus) described with reference to FIG. 1.
  • the communication control unit 312 also controls the communication unit 320 to transmit an imaging control signal for controlling imaging by the camera 20 to the camera 20 (imaging device) described with reference to FIG. 1.
  • the communication control unit 312 further controls the communication unit 320 to receive a captured image from the camera 20 (imaging device) described with reference to FIG. 1.
  • the projector control signal and the imaging control signal will be described later.
  • the imaging control unit 314 controls imaging by the camera 20 in accordance with projector control by the projector control unit 318.
  • for example, when the projector control unit 318 performs control to turn on one of the projectors 10A to 10D (for example, the projector 10A), the imaging control unit 314 generates an imaging control signal that causes the camera 20 to perform imaging, and provides it to the communication control unit 312.
  • based on the pre-design parameters stored in the storage unit 350 and the parameters related to the camera 20, the simulation image generation unit 316 generates a simulation image, that is, the image that would be acquired by the camera 20 if projection by the projector 10 were performed according to the pre-design.
  • the simulation image is an image including a projection region expected by the user in the pre-design, that is, a projection region of the projector 10 in the pre-design.
  • the parameters in the pre-design may include, for example, information on the three-dimensional shape, three-dimensional position, and three-dimensional posture of the screen 50.
  • the parameters in the pre-design may further include a three-dimensional position, a three-dimensional attitude, and lens parameters (lens position, focal length, etc.) of the projector 10 set by the pre-design.
  • information indicating the relationship between the projection distance of the projector 10 and the projection area may be included in the parameters in the prior design.
  • the parameters relating to the camera 20 may include, for example, a three-dimensional position, a three-dimensional posture, and a lens parameter of the camera 20.
  • the parameters in the pre-design are values based on the specifications and performance of the projector 10 and the camera 20 that are actually used.
  • the simulation image generation unit 316 generates a simulation image based on the parameters in the advance design as described above and the parameters related to the camera 20. With this configuration, a simulation image that corresponds to the captured image and can be compared with the captured image can be generated.
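  • as a rough illustration of how such a simulation image could be produced (the patent does not specify an implementation), the following sketch projects the 3D corners of the pre-designed projection area into the camera frame with a simple pinhole model and fills the resulting polygon; the function names, and the assumption that the designed projection area is available as 3D corner points derived from the pre-design parameters, are hypothetical.

```python
import numpy as np

def project_to_camera(points_3d, cam_rotation, cam_translation, cam_intrinsics):
    """Project Nx3 world points into pixel coordinates with a pinhole model."""
    cam_points = points_3d @ cam_rotation.T + cam_translation  # world -> camera frame
    pixels_h = cam_points @ cam_intrinsics.T                   # apply intrinsics K
    return pixels_h[:, :2] / pixels_h[:, 2:3]                  # perspective divide

def render_simulation_image(area_corners_3d, cam_rotation, cam_translation,
                            cam_intrinsics, image_size):
    """Return a binary mask marking the pre-designed projection area in the camera view.

    `area_corners_3d` is assumed to be the 3D corners of the projection area on the
    screen, computed beforehand from the pre-design parameters (screen shape and pose,
    projector pose and lens parameters). The corners must form a convex polygon.
    """
    h, w = image_size
    corners_2d = project_to_camera(area_corners_3d, cam_rotation,
                                   cam_translation, cam_intrinsics)
    ys, xs = np.mgrid[0:h, 0:w]
    inside = np.ones((h, w), dtype=bool)
    n = len(corners_2d)
    for i in range(n):
        x0, y0 = corners_2d[i]
        x1, y1 = corners_2d[(i + 1) % n]
        # Half-plane test for a convex polygon; flip the inequality if the
        # corner winding order is opposite.
        inside &= (x1 - x0) * (ys - y0) - (y1 - y0) * (xs - x0) >= 0
    mask = np.zeros((h, w), dtype=np.uint8)
    mask[inside] = 255
    return mask
```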
  • the projector control unit 318 controls the projector 10.
  • the projector control unit 318 can control the projector 10 by generating a projector control signal for controlling the projector 10 and providing it to the communication control unit 312.
  • controlling the projector 10 means “generating a projector control signal for controlling the projector 10”.
  • the projector control signal generated by the projector control unit 318 may be, for example, a control signal for turning on the projector 10 (projecting an image on the projector 10).
  • the projector control signal generated by the projector control unit 318 may be a control signal for performing control related to the lens of the projector 10.
  • the control related to the lens may include, for example, at least one of control for moving the lens position of the projector 10 up, down, left, or right (lens shift control) and control for changing the zoom magnification of the projector 10, that is, changing the focal length of the lens (zoom control).
  • the movement of the lens position may be, for example, movement in any one of the up, down, left, and right directions, and the control for moving the lens position may include moving the lens position of the projector 10 one step at a time in the direction designated by the projector control signal. Similarly, the control for changing the zoom magnification may include, for example, increasing or decreasing the zoom magnification of the projector 10 one step at a time in response to the projector control signal.
  • the projector control signal generated by the projector control unit 318 may be a control signal for performing control related to the output parameter of the projector 10.
  • the output parameters of the projector 10 may include, for example, parameters of the brightness, hue, saturation, and color (for example, the R, G, and B colors) of the projector 10.
  • the control related to the output parameter of the projector 10 may include changing at least one of the brightness, hue, saturation, and color of the projector 10 step by step.
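  • the patent does not define a concrete message format for these stepwise control signals; the sketch below shows one purely hypothetical way such commands could be represented, only to illustrate the lens shift, zoom, and output-parameter steps described above.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class ControlKind(Enum):
    POWER_ON = auto()          # light the projector (make it project the image)
    LENS_SHIFT = auto()        # move the lens one step up/down/left/right
    ZOOM = auto()              # change the zoom magnification by one step
    OUTPUT_PARAMETER = auto()  # change brightness/hue/saturation/color by one step

@dataclass
class ProjectorControlSignal:
    projector_id: str
    kind: ControlKind
    direction: Optional[str] = None   # "up"/"down"/"left"/"right" for LENS_SHIFT
    step: int = 0                     # +1 or -1 for ZOOM and OUTPUT_PARAMETER
    parameter: Optional[str] = None   # "brightness"/"hue"/"saturation"/"color"

# Example: shift the lens of projector 10A one step to the left.
shift_left = ProjectorControlSignal(projector_id="10A",
                                    kind=ControlKind.LENS_SHIFT,
                                    direction="left", step=1)

# Example: make projector 10B one step brighter.
brighter = ProjectorControlSignal(projector_id="10B",
                                  kind=ControlKind.OUTPUT_PARAMETER,
                                  parameter="brightness", step=+1)
```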
  • the projector control unit 318 controls the projector 10 based on the simulation image generated by the simulation image generation unit 316 and the captured image provided from the camera 20.
  • the simulation image is an image including the projection area of the projector 10 in the pre-design
  • the captured image is an image including the actual projection area of the projector 10 at the time of imaging.
  • the projector control unit 318 may control the projector 10 so that the difference between the simulation image and the captured image becomes small.
  • FIG. 3 is an explanatory diagram for explaining an example of projector control by the projector control unit 318.
  • the projector control unit 318 may control the projector 10 so that the difference between the simulation image D10 and the captured image F10 is small.
  • for example, the difference may be reduced by changing the projection area, that is, by controlling the projector 10 so as to move the lens position in one of the directions or to change (increase or decrease) the zoom magnification.
  • the projector control unit 318 may repeatedly control the projector 10 until it is determined that the difference between the simulation image and the captured image is sufficiently small (for example, the difference is smaller than a predetermined threshold).
  • when the difference between the simulation image and the captured image calculated immediately after performing a first control on the projector 10 is larger than the previously calculated difference (that is, the difference has increased), the projector control unit 318 may perform a second control different from the first control. For example, when the difference increases due to the control of moving the lens position, the projector control unit 318 may perform control to change the zoom magnification, or control to move the lens position in a different direction. Likewise, when the difference increases due to the control to increase (or decrease) the zoom magnification, the projector control unit 318 may perform control to decrease (or increase) the zoom magnification, or control to move the lens position.
  • alternatively, when the difference between the simulation image and the captured image calculated immediately after performing the first control on the projector 10 is larger than the previously calculated difference, the projector control unit 318 may perform control opposite to the first control. For example, after control that moves the lens position, the projector control unit 318 may perform control that moves the lens position in the reverse direction, and after control that increases (or decreases) the zoom magnification, it may perform control that decreases (or increases) the zoom magnification.
  • by repeating such control, the projection area of the projector 10 is automatically adjusted so that the difference between the actual projection area of the projector 10 and the projection area in the pre-design becomes sufficiently small.
  • the difference between the simulation image and the captured image may be, for example, the difference in area between the projection area included in the simulation image (the projection area in the pre-design) and the projection area included in the captured image (the actual projection area).
  • the difference between the simulation image and the captured image may also be a difference between the vertex positions of the projection area included in the simulation image and the vertex positions of the projection area included in the captured image, or a value obtained by summing, over the pixels, the differences between the pixel values of the simulation image and those of the captured image.
  • the difference between the simulation image and the captured image is not limited to the above example, and may be calculated by various methods.
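  • to make the loop described above concrete, here is a minimal sketch assuming a per-pixel absolute-difference metric, a simple greedy search over one-step controls, and hypothetical `projector.apply(...)` and `camera.capture()` interfaces; none of these choices is prescribed by the patent.

```python
import random
import numpy as np

def image_difference(simulation, captured):
    """One possible metric: sum of absolute per-pixel differences."""
    return float(np.abs(simulation.astype(float) - captured.astype(float)).sum())

CANDIDATE_CONTROLS = [
    ("lens_shift", "up"), ("lens_shift", "down"),
    ("lens_shift", "left"), ("lens_shift", "right"),
    ("zoom", "in"), ("zoom", "out"),
]
OPPOSITE = {"up": "down", "down": "up", "left": "right",
            "right": "left", "in": "out", "out": "in"}

def adjust_projection_area(projector, camera, simulation, threshold, max_iters=200):
    """Apply one-step controls until the simulation/captured difference is small."""
    prev_diff = image_difference(simulation, camera.capture())
    last_control = None
    for _ in range(max_iters):
        if prev_diff < threshold:            # difference is sufficiently small
            break
        control = last_control or random.choice(CANDIDATE_CONTROLS)
        projector.apply(*control)            # one-step lens shift or zoom change
        diff = image_difference(simulation, camera.capture())
        if diff > prev_diff:
            # The difference grew: undo with the reverse control and try a
            # different control on the next iteration.
            kind, direction = control
            projector.apply(kind, OPPOSITE[direction])
            last_control = None
        else:
            prev_diff = diff
            last_control = control           # keep stepping in the helpful direction
    return prev_diff
```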
  • the projector control unit 318 may control each projector based on a captured image, acquired for that projector, that includes the projection area of that projector, and may thus control the plurality of projectors included in the projector 10 based on a plurality of captured images. Note that, in acquiring such a plurality of captured images, the projection images used for projection may all be the same.
  • FIG. 4 is an explanatory diagram for explaining another example of projector control by the projector control unit 318.
  • the projector control unit 318 may control the projector 10 based on a plurality of captured images so as to reduce the variation among the plurality of projectors.
  • the variation among a plurality of projectors means a variation relating to appearance, such as brightness, hue, saturation, or color, perceived by the user.
  • the projector control unit 318 may control the projector 10 so that variation among a plurality of projectors is reduced with respect to at least one of brightness, hue, saturation, and color.
  • hereinafter, the control of the projector 10 relating to the brightness, hue, saturation, or color parameters will be collectively described as output parameter control.
  • the projector control unit 318 can control the brightness, hue, saturation, and color parameters independently.
  • the projector control unit 318 calculates the output parameters of each projector from, for example, a plurality of captured images, and controls the projector 10 so that the variation of the output parameters among the plurality of projectors becomes small.
  • the projector control unit 318 may specify a reference value based on the output parameters of a plurality of projectors, and control the output parameter of each projector to approach the reference value.
  • the projector control unit 318 may control the projector 10 having an output parameter larger (smaller) than the reference value so that the output parameter is smaller (larger).
  • the reference value may be, for example, an average value of output parameters of a plurality of projectors included in the projector 10, or may be a median value, a minimum value, or a maximum value.
  • the projector control unit 318 may repeatedly control the projector 10 until it is determined that the variation in the output parameter among the plurality of projectors is sufficiently small (for example, the variation is smaller than a predetermined threshold).
  • when the variation in the output parameter among the plurality of projectors calculated immediately after the output parameter control of the projector 10 is larger than the previously calculated variation (that is, the variation has increased), the projector control unit 318 may perform control opposite to that output parameter control. For example, when the variation increases due to control that increases (decreases) an output parameter, the projector control unit 318 may control the projector 10 so as to decrease (increase) the output parameter.
  • the output parameter of the projector 10 is automatically adjusted so that the variation of the output parameter among the plurality of projectors included in the projector 10 becomes sufficiently small.
  • the variation in output parameters among the plurality of projectors may be, for example, the variance or the standard deviation of the output parameters of the plurality of projectors. It may also be the difference between the maximum value and the minimum value of the output parameters of the plurality of projectors, or the total of the differences between the above-described reference value and the output parameter of each projector. The variation in output parameters among the plurality of projectors is not limited to the above examples, and may be calculated by various methods.
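  • as a minimal sketch of the variation-reducing control described above, the example below equalizes a single output parameter (brightness, estimated as the mean pixel value of each projector's captured image) toward the mean as the reference value; the `turn_on_alone`, `capture`, and `step_brightness` interfaces are illustrative assumptions, not part of the patent.

```python
import numpy as np

def measured_brightness(captured):
    """Estimate a projector's brightness output from a captured image."""
    return float(captured.astype(float).mean())

def reduce_brightness_variation(projectors, camera, threshold, max_rounds=50):
    for _ in range(max_rounds):
        values = []
        for p in projectors:
            p.turn_on_alone()                   # light only this projector
            values.append(measured_brightness(camera.capture()))
        values = np.array(values)
        if values.std() < threshold:            # variation is sufficiently small
            break
        reference = values.mean()               # reference value (could also be median/min/max)
        for p, v in zip(projectors, values):    # one-step control toward the reference
            if v > reference:
                p.step_brightness(-1)
            elif v < reference:
                p.step_brightness(+1)
```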
  • the communication unit 320 is a communication interface that mediates communication with other devices.
  • the communication unit 320 supports an arbitrary wireless or wired communication protocol, and establishes communication connections with other devices (for example, the projector 10 and the camera 20) via the communication network 80 described with reference to FIG. 1. Further, the communication unit 320 receives information from other devices and transmits information to other devices under the control of the communication control unit 312.
  • the storage unit 350 stores programs and parameters for the functions of the information processing apparatus 30. In addition, the storage unit 350 stores the pre-design parameters and the parameters related to the camera 20.
  • each function of the control unit 310 according to the present embodiment may be provided in another information processing apparatus connected via the communication unit 320.
  • for example, when the simulation image generation unit 316 is provided in another information processing apparatus, a simulation image generated by that information processing apparatus may be provided to the information processing apparatus 30 via the communication unit 320.
  • FIG. 5 is a flowchart showing an operation example of this embodiment.
  • the user performs advance design regarding the arrangement of the screen 50 and the projector 10 (S11).
  • the pre-design may be performed using the information processing apparatus 30, or using another apparatus (not illustrated) connected to the communication network 80, in which case the pre-design parameters may be provided to the information processing apparatus 30.
  • FIG. 6 is an explanatory diagram showing an example of a screen displayed on a display unit (not shown) in the prior design in step S11.
  • for example, the user may adjust the positions and orientations of the projectors G11 to G14 (for example, corresponding to the projectors 10A to 10D shown in FIG. 1) displayed on the screen G1 while confirming the projection areas onto the screen G50 (for example, corresponding to the screen 50 shown in FIG. 1).
  • the screen G1 is displayed in a plane, but the screen G1 may be displayed three-dimensionally, and the user may be able to change the viewpoint position.
  • the user installs the screen 50 and the projector 10 according to a pre-design (S12).
  • the camera 20 and the information processing apparatus 30 are also installed.
  • subsequently, projector control for adjusting the projection area of each projector is performed by the information processing apparatus 30 (S13). Details of step S13 will be described later with reference to FIG. 7.
  • step S13 is repeated for the projector for which the projection area adjustment has not been completed.
  • when the projection area adjustment in step S13 has been completed for all projectors included in the projector 10 (YES in S14), control for adjusting the variation among the plurality of projectors included in the projector 10 is performed by the information processing device 30 (S15). Details of step S15 will be described later with reference to FIG. 8.
  • FIG. 7 is a flowchart showing more detailed processing of step S13 shown in FIG.
  • the projector control unit 318 selects one of the projectors 10 for which the process of step S13 has not been completed, and sets it as a target projector to be processed (S131).
  • the simulation image generation unit 316 generates a simulation image (an image including the projection area of the target projector in advance design) when the target projector is turned on (S132). Subsequently, the projector control unit 318 turns on the target projector (S133).
  • the camera 20 acquires a captured image including the projection area of the projector of interest and provides it to the information processing apparatus 30 (S134).
  • the projector control unit 318 calculates a difference between the simulation image and the captured image acquired in step S134, and controls the projector of interest so that the difference becomes small (S135).
  • note that, when performing the control for the first time, the projector control unit 318 may, after calculating the difference, perform at random any one of the projector controls of moving the lens in one of the directions or changing (increasing or decreasing) the zoom magnification. Then, each time the process of step S135 is performed, the projector control unit 318 calculates the difference again.
  • when the difference has increased compared with the previously calculated difference, a control different from the previous control, or the reverse of the previous control, may be performed as described above.
  • when the difference has decreased compared with the previously calculated difference, the projector control unit 318 may perform the same control as the previous time. In this way, the projector control unit 318 can control the projector of interest so that the difference becomes small.
  • note that, if the difference between the simulation image and the captured image calculated by the projector control unit 318 in step S135 is already sufficiently small, the projector control unit 318 does not have to control the projector of interest in step S135. If the difference between the simulation image and the captured image is sufficiently small (YES in S136), the process of step S13 for the projector of interest ends. On the other hand, if the difference between the simulation image and the captured image is not sufficiently small, the projector of interest is controlled in step S135 as described above. Acquisition of the captured image (S134) and control of the projector 10 (S135) are repeated until it is determined that the difference between the simulation image and the captured image is sufficiently small.
  • the projector control for adjusting the projection area of each projector in step S13 has been described above.
  • the flowchart shown in FIG. 7 is an example, and the operation according to the present embodiment is not limited to the example.
  • the generation of the simulation image in step S132 may be performed for all projectors in advance before the target projector is set, and simulation images corresponding to all the projectors may be stored in the storage unit 350.
  • FIG. 8 is a flowchart showing more detailed processing of step S15 shown in FIG.
  • the projector control unit 318 selects one of the projectors 10 for which the processing in steps S152 to S153 has not been completed, and sets it as the target projector to be processed (S151).
  • the projector control unit 318 turns on the target projector (S152). Further, according to the control of the imaging control unit 314, the camera 20 acquires a captured image including the projection area of the projector of interest, and provides it to the information processing apparatus 30 (S153).
  • the processes in steps S151 to S153 are repeatedly performed until captured images at the time of lighting are acquired for all projectors included in the projector 10. In other words, if there is a projector for which a captured image at the time of lighting is not acquired among the projectors included in projector 10 (NO in S154), the process returns to step S151.
  • when captured images at the time of lighting have been acquired for all projectors included in the projector 10 (YES in S154), the projector control unit 318 determines, based on the acquired captured images, whether or not the variation in output parameters among the plurality of projectors is sufficiently small (S155). If it is determined that the variation in output parameters among the plurality of projectors is sufficiently small (YES in S155), the process of step S15 ends.
  • on the other hand, if it is determined that the variation is not sufficiently small (NO in S155), the projector control unit 318 controls each projector included in the projector 10 so that the variation becomes small (S156). The process then returns to step S151, and after captured images have been acquired for all the projectors controlled in step S156 (S151 to S154), it is determined again, based on those captured images, whether or not the variation is sufficiently small (S155).
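  • putting the flows of FIGS. 5, 7, and 8 together, a compact and purely illustrative orchestration might look as follows, reusing the hypothetical helpers sketched above.

```python
def run_automatic_adjustment(projectors, camera, simulation_images,
                             area_threshold, variation_threshold):
    # S13/S14: adjust the projection area of each projector in turn.
    for p in projectors:
        p.turn_on_alone()                                 # S133: light only the target projector
        adjust_projection_area(p, camera,
                               simulation_images[p.id],   # S132: pre-generated simulation image
                               threshold=area_threshold)  # S134-S136: capture, compare, control
    # S15: reduce the output-parameter variation among all projectors.
    reduce_brightness_variation(projectors, camera,
                                threshold=variation_threshold)  # S151-S156
```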
  • the projection system 1 may include a sensor other than the camera 20, and the output parameter may be acquired based on sensing data acquired by the sensor.
  • for example, when the projection system 1 includes an illuminance sensor, a brightness parameter (an example of an output parameter) may be acquired based on sensing data acquired by the illuminance sensor. With such a configuration, the output parameter can be acquired with higher accuracy.
  • in the above, an example has been described in which the imaging control unit 314 controls the imaging of the camera 20 so that a captured image is acquired every time each projector is turned on.
  • the present technology is not limited to such an example.
  • the control unit 310 may not have a function as the imaging control unit 314.
  • for example, the captured image may be provided continuously (for example, constantly) from the camera 20 to the information processing device 30, and the information processing device 30 may select a captured image to be used for projector control by referring to the time at which the captured image was captured.
  • FIG. 1 illustrates an example in which the camera 20 is a single camera
  • the present technology is not limited to such an example.
  • the camera 20 may include a plurality of cameras.
  • when the camera 20 includes a plurality of cameras, the imaging control unit 314 may generate an imaging control signal for controlling imaging by the camera, among the cameras included in the camera 20, that corresponds to the projector that is lit.
  • the camera corresponding to the projector is a camera capable of acquiring a captured image including a projection area by the projector, for example.
  • when the camera 20 includes a plurality of cameras, it is desirable that calibration of brightness, hue, saturation, color, and the like be performed between the cameras, and that the output parameters be calculated from the captured images based on the calibration results.
  • FIG. 9 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the present embodiment.
  • the information processing apparatus 900 illustrated in FIG. 9 can realize the information processing apparatus 30 illustrated in FIGS. 1 and 2, for example.
  • Information processing by the information processing apparatus 30 according to the present embodiment is realized by cooperation between software and hardware described below.
  • the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a.
  • the information processing apparatus 900 includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, a communication device 913, and a sensor 915.
  • the information processing apparatus 900 may include a processing circuit such as a DSP or an ASIC in place of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 according to various programs. Further, the CPU 901 may be a microprocessor.
  • the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
  • the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like. For example, the CPU 901 can form the control unit 310.
  • the CPU 901, ROM 902, and RAM 903 are connected to each other by a host bus 904a including a CPU bus.
  • the host bus 904a is connected to an external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 904.
  • the host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately, and these functions may be mounted on one bus.
  • the input device 906 is realized by a device by which the user inputs information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever.
  • the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA that supports the operation of the information processing device 900.
  • the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the CPU 901.
  • a user of the information processing apparatus 900 can input various data and instruct a processing operation to the information processing apparatus 900 by operating the input device 906.
  • the output device 907 is formed of a device that can notify the user of the acquired information visually or audibly. Examples of such devices include CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, display devices such as lamps, audio output devices such as speakers and headphones, printer devices, and the like.
  • the output device 907 outputs results obtained by various processes performed by the information processing device 900.
  • the display device visually displays results obtained by various processes performed by the information processing device 900 in various formats such as text, images, tables, and graphs.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
  • the storage device 908 is a data storage device formed as an example of a storage unit of the information processing device 900.
  • the storage apparatus 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • the storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the storage apparatus 908 can form the storage unit 350, for example.
  • the drive 909 is a storage medium reader / writer, and is built in or externally attached to the information processing apparatus 900.
  • the drive 909 reads information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903.
  • the drive 909 can also write information to a removable storage medium.
  • connection port 911 is an interface connected to an external device, and is a connection port with an external device capable of transmitting data by USB (Universal Serial Bus), for example.
  • the communication device 913 is a communication interface formed by a communication device or the like for connecting to the network 920, for example.
  • the communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communication, or the like.
  • the communication device 913 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet and other communication devices.
  • the sensor 915 is various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measuring sensor, and a force sensor.
  • the sensor 915 acquires information on the state of the information processing apparatus 900 itself, such as the posture and movement speed of the information processing apparatus 900, and information on the surrounding environment of the information processing apparatus 900, such as brightness and noise around the information processing apparatus 900.
  • Sensor 915 may also include a GPS sensor that receives GPS signals and measures the latitude, longitude, and altitude of the device.
  • the network 920 is a wired or wireless transmission path for information transmitted from a device connected to the network 920.
  • the network 920 may include a public line network such as the Internet, a telephone line network, and a satellite communication network, various LANs including the Ethernet (registered trademark), a wide area network (WAN), and the like.
  • the network 920 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • each of the above components may be realized using a general-purpose member, or may be realized by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
  • a computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be produced and mounted on a PC or the like.
  • a computer-readable recording medium storing such a computer program can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed via a network, for example, without using a recording medium.
  • An information processing apparatus comprising a control unit that controls a projector based on a simulation image including a projection area of the projector in a pre-design and a captured image including the projection area of the projector.
  • the simulation image is generated based on a parameter in the preliminary design and a parameter related to an imaging apparatus that acquires the captured image.
  • the control unit controls the projector so that a difference between the simulation image and the captured image becomes small.
  • the control unit performs control related to a lens of the projector.
  • the control related to the lens includes lens shift control of the projector.
  • the information processing apparatus according to any one of (3) to (6), wherein the control unit repeatedly controls the projector until the difference is determined to be sufficiently small.
  • the control unit performs a second control different from the first control when the difference calculated after performing the first control is larger than the previously calculated difference.
  • the control unit performs control opposite to the first control when the difference calculated after performing the first control is larger than the previously calculated difference.
  • the control unit controls a plurality of projectors based on a plurality of captured images.
  • the control unit performs control related to output parameters of the plurality of projectors so that the variation in output parameters among the plurality of projectors is reduced.
  • the control unit controls an output parameter of each projector included in the plurality of projectors so as to approach a reference value specified based on the output parameters of the plurality of projectors.
  • the control unit repeatedly controls the projectors until it is determined that the variation is sufficiently small.
  • the control unit performs control opposite to the control related to the output parameters of the plurality of projectors when the variation calculated after that control is larger than the previously calculated variation.
  • An information processing method including: a processor controlling a projector based on a simulation image including a projection area of a projector in a pre-design and a captured image including a projection area of the projector.
  • A projection system comprising: a projector that projects a projection image; an imaging device that acquires a captured image including a projection area of the projector; and an information processing apparatus including a control unit that controls the projector based on a simulation image including the projection area of the projector in a pre-design and the captured image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

[Problem] To provide an information processing device, an information processing method, and a projection system. [Solution] An information processing device provided with a control unit that, on the basis of a simulation image including a projection region of a projector in a design made in advance, and a captured image including a projection region of the projector, controls the projector.

Description

Information processing apparatus, information processing method, and projection system
The present disclosure relates to an information processing apparatus, an information processing method, and a projection system.
When a projector that projects an image is used, various adjustments relating to the projection are performed manually (for example, by user operations) in order to perform projection that better meets the user's expectations. For example, when the projection area of the projector differs from the projection area expected by the user, the projection area is adjusted using the lens adjustment function of the projector.
Instead of performing the adjustment described above, the image input to the projector may be corrected by image processing. For example, Non-Patent Document 1 describes inputting to a projector an image subjected to geometric correction processing, based on projection-destination information obtained by photographing with a camera.
As described above, when the adjustment is performed manually, the human burden is large; in particular, when a plurality of projectors is used, the adjustment is performed for each projector, so the human burden may become enormous. On the other hand, when the image input to the projector is corrected by image processing, the human burden is reduced, but the image quality of the image seen by the user may be degraded.
Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and projection system capable of reducing the human burden while suppressing deterioration in image quality.
According to the present disclosure, there is provided an information processing apparatus including a control unit that controls a projector based on a simulation image including the projection area of the projector in a pre-design and a captured image including the projection area of the projector.
According to the present disclosure, there is also provided an information processing method including a processor controlling a projector based on a simulation image including the projection area of the projector in a pre-design and a captured image including the projection area of the projector.
According to the present disclosure, there is further provided a projection system including: a projector that projects a projection image; an imaging device that acquires a captured image including the projection area of the projector; and an information processing apparatus having a control unit that controls the projector based on a simulation image including the projection area of the projector in a pre-design and the captured image.
As described above, according to the present disclosure, it is possible to reduce the human burden while suppressing deterioration in image quality.
Note that the above effects are not necessarily limiting; together with or instead of the above effects, any of the effects shown in this specification, or other effects that can be understood from this specification, may be exhibited.
FIG. 1 is an explanatory diagram for explaining a schematic configuration of a projection system 1 according to an embodiment of the present disclosure. FIG. 2 is a block diagram showing a configuration example of the information processing apparatus 30 according to the embodiment. FIG. 3 is an explanatory diagram for explaining an example of projector control by the projector control unit 318 according to the embodiment. FIG. 4 is an explanatory diagram for explaining another example of projector control by the projector control unit 318 according to the embodiment. FIG. 5 is a flowchart showing an operation example of the embodiment. FIG. 6 is an explanatory diagram showing an example of a screen displayed on a display unit (not shown) in the pre-design. FIG. 7 is a flowchart showing the more detailed processing of step S13. FIG. 8 is a flowchart showing the more detailed processing of step S15. FIG. 9 is an explanatory diagram showing a hardware configuration example.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, constituent elements having substantially the same functional configuration are denoted by the same reference numerals, and redundant description thereof is omitted.
In this specification and the drawings, a plurality of constituent elements having substantially the same functional configuration may also be distinguished by adding different letters after the same reference numeral. However, when it is not necessary to particularly distinguish each of such constituent elements, only the same reference numeral is given.
The description will be made in the following order.
<< 1. Overview >>
<1-1. Schematic configuration>
<1-2. Background>
<< 2. Configuration >>
<< 3. Operation >>
<< 4. Modifications >>
<4-1. Modification 1>
<4-2. Modification 2>
<4-3. Modification 3>
<< 5. Hardware configuration example >>
<< 6. Conclusion >>
<< 1. Overview >>
<1-1. Schematic configuration>
First, a schematic configuration of a projection system according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram for explaining a schematic configuration of the projection system 1 according to the present embodiment. As shown in FIG. 1, the projection system 1 includes one or more projectors 10 (projection devices) that project an image (a still image or a moving image) onto a screen 50, a camera 20 (imaging device), an information processing device 30, and a communication network 80.
The projector 10 projects an image onto the screen 50. As shown in FIG. 1, the projector 10 may include a plurality of projectors 10A to 10D. The number of projectors included in the projector 10 is not limited to the example shown in FIG. 1.
The camera 20 is an imaging device that captures the projection by the projector 10 and acquires a captured image. For example, the camera 20 may be arranged at a viewpoint overlooking the projections by the projectors 10A to 10D, and acquires captured images including the actual projection areas of the projectors 10A to 10D.
The information processing apparatus 30 performs information processing related to the projection system 1. As shown in FIG. 1, the information processing apparatus 30 is connected to the communication network 80 and may input (provide) images for projection (hereinafter sometimes referred to as projection images) to each of the projectors 10A to 10D via the communication network 80. The information processing apparatus 30 may also receive, via the communication network 80, a captured image acquired by the camera 20. Details of the information processing apparatus 30 will be described later with reference to FIGS. 2 to 4.
The screen 50 is the projection destination of the projector 10. The screen 50 may have, for example, a dome shape. FIG. 1 shows a cross section obtained by cutting the dome-shaped screen 50 horizontally. The shape of the screen 50 is not limited to a dome shape, and may be a curved surface or a flat surface.
The communication network 80 is a wired or wireless transmission path for information transmitted from devices connected to the communication network 80. For example, the communication network 80 may include public networks such as the Internet, a telephone network, and a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like. The communication network 80 may also include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).
 <1-2. Background>
 The overall configuration of the projection system 1 according to an embodiment of the present disclosure has been described above. Next, the background that led to the creation of the projection system 1 according to the present embodiment will be described.
 When an image is projected by a projector, the projection area expected by the user may differ from the actual projection area of the projector. The projection area expected by the user may be, for example, the projection area assumed when the screen (projection destination) and the arrangement (position and orientation) of the projector were designed in advance.
 For example, when the image is projected at a position different (shifted) from the position expected by the user, it is desirable to adjust the position of the projection area. When the projection area does not match the screen size, it is desirable to adjust the size of the projection area.
 When the projection destination is not flat, as with the screen 50 shown in FIG. 1, it is desirable to adjust the projection area according to the shape of the projection destination, and a more detailed adjustment may be required than when projecting onto a flat surface. Also, as shown in FIG. 1, when images may be projected by a plurality of projectors so as to overlap in the same area or onto neighboring areas, a detailed adjustment that takes the projection areas of the other projectors into account is required. When a plurality of projectors are used, it is desirable to adjust not only the projection areas but also the colors, brightness, and the like so that they are consistent among the projectors.
 Conventionally, such adjustments have been performed manually (for example, by user operations). For example, a projector may be equipped with a lens shift function that adjusts the position of the projection area by moving the lens up, down, left, or right, a zoom function that adjusts the size of the projection area by moving the lens back and forth, a color adjustment function, a brightness adjustment function, and the like. It is conceivable to perform the above adjustments by user operations via buttons provided on the projector, a remote controller, or the like. With this method, however, the user performs the adjustment while visually checking what is actually being projected by the projector, which places a large burden on the user. In particular, when a plurality of projectors are used, the adjustment effort increases with the number of projectors, and the human burden can become very large.
 On the other hand, in order to project an image onto the area desired by the user, a method of correcting the projection image input to the projector by image processing is also conceivable. For example, in the projection system shown in FIG. 1, the information processing apparatus 30 could input to the projector 10 a projection image that has undergone reduction processing based on the captured image acquired by the camera 20. With such a configuration, it is possible to realize a projection in which the image is visible only in the area desired by the user within the projection area of the projector 10. Also, by inputting to the projector 10 a projection image that has undergone color correction processing or brightness correction processing based on the captured image acquired by the camera 20, it is possible to project so that the colors and brightness are similar among a plurality of projectors.
 When a projection image that has undergone such image processing (reduction processing, color correction processing, brightness correction processing, and so on) is input to the projector, the human burden is reduced, but the image quality may deteriorate. For example, when reduction processing is applied, the number of pixels actually used decreases, so the image quality (for example, the sense of resolution) can drop significantly compared with the case where no reduction processing is applied. When color correction processing or brightness correction processing is applied, it is difficult to perform an appropriate correction on pixels whose values are near the maximum or minimum value, and image quality tends to deteriorate when such pixels are present.
<< 2. Configuration >>
 In view of the above circumstances, the projection system 1 according to an embodiment of the present disclosure has been created. In the projection system 1 according to the present embodiment, the information processing apparatus 30 controls the projector 10 based on the captured image acquired by the camera 20 and automatically performs the adjustments related to projection, which makes it possible to reduce the human burden while suppressing deterioration in image quality.
 The control of the projector 10 by the information processing apparatus 30 may be performed, for example, by transmitting from the information processing apparatus 30 to the projector 10 a projector control signal similar to a user operation via the buttons or remote controller described above. That is, according to the present embodiment, the same adjustment as a user-operated adjustment using the functions of the projector 10 described above is performed automatically, and the human burden can be reduced.
 An adjustment that uses the functions of the projector 10 can suppress the deterioration in image quality compared with applying the image processing described above. For example, when the projection area is adjusted by controlling the projector 10, only the projection area changes and the number of pixels used for projection does not change, so deterioration in image quality can be suppressed. In addition, since control that changes the maximum output value and the minimum output value of the projector 10 is also possible, color and brightness can be adjusted while suppressing the deterioration in image quality described above.
 Hereinafter, the configuration of the present embodiment that provides these effects will be described in more detail. FIG. 2 is a block diagram showing a configuration example of the information processing apparatus 30 shown in FIG. 1. As shown in FIG. 2, the information processing apparatus 30 according to the present embodiment includes a control unit 310, a communication unit 320, and a storage unit 350.
 The control unit 310 controls each component of the information processing apparatus 30. As shown in FIG. 2, the control unit 310 also functions as a communication control unit 312, an imaging control unit 314, a simulation image generation unit 316, and a projector control unit 318.
 The communication control unit 312 controls communication by the communication unit 320. For example, the communication control unit 312 controls the communication unit 320 to transmit a projection image and a projector control signal for controlling the projector 10 to the projector 10 (projection apparatus) described with reference to FIG. 1. The communication control unit 312 also controls the communication unit 320 to transmit an imaging control signal for controlling imaging by the camera 20 to the camera 20 (imaging apparatus) described with reference to FIG. 1. Furthermore, the communication control unit 312 controls the communication unit 320 to receive a captured image from the camera 20 described with reference to FIG. 1. The projector control signal and the imaging control signal will be described later.
 The imaging control unit 314 controls imaging by the camera 20 in accordance with the projector control performed by the projector control unit 318. For example, when the projector control unit 318 performs control to light one of the projectors 10A to 10D (for example, the projector 10A), the imaging control unit 314 generates an imaging control signal that causes the camera 20 to perform imaging, and provides it to the communication control unit 312.
 The simulation image generation unit 316 generates, based on the parameters of the pre-design stored in the storage unit 350 and parameters related to the camera 20, a simulation image, that is, the image that should be acquired by the camera 20 if the projection by the projector 10 were performed exactly as designed in advance. Here, the simulation image is an image including the projection area that the user expects in the pre-design, in other words the projection area of the projector 10 in the pre-design.
 The parameters of the pre-design may include, for example, information on the three-dimensional shape, three-dimensional position, and three-dimensional orientation of the screen 50. The parameters of the pre-design may further include the three-dimensional position, three-dimensional orientation, and lens parameters (lens position, focal length, and so on) of the projector 10 set by the pre-design. In addition to or instead of the lens parameters of the projector 10, information indicating the relationship between the projection distance and the projection area of the projector 10 may be included in the parameters of the pre-design. The parameters related to the camera 20 may include, for example, the three-dimensional position, three-dimensional orientation, and lens parameters of the camera 20. Note that the parameters of the pre-design are values based on the specifications, performance, and so on of the projector 10 and the camera 20 that are actually used.
 The simulation image generation unit 316 generates the simulation image based on the parameters of the pre-design and the parameters related to the camera 20 as described above. With this configuration, a simulation image that corresponds to the captured image and can be compared with the captured image can be generated.
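 As a rough illustration of this step (not part of the application itself), the sketch below projects the corner points of the designed projection region into the camera image with a simple pinhole model; the resulting polygon plays the role of the projection region in the simulation image. All function names, coordinates, and parameter values are illustrative assumptions.

```python
import numpy as np

def project_points(points_world, cam_pos, cam_rot, focal_px, principal_pt):
    """Project 3-D world points (N, 3) into camera pixel coordinates with a pinhole model."""
    pts_cam = (np.asarray(points_world, dtype=float) - cam_pos) @ cam_rot.T  # world -> camera frame
    u = focal_px * pts_cam[:, 0] / pts_cam[:, 2] + principal_pt[0]
    v = focal_px * pts_cam[:, 1] / pts_cam[:, 2] + principal_pt[1]
    return np.stack([u, v], axis=1)

# Designed corner points of one projector's projection region on the screen (world frame, metres; assumed values).
design_region_3d = np.array([[-1.0,  0.5, 3.0],
                             [ 1.0,  0.5, 3.0],
                             [ 1.0, -0.5, 3.0],
                             [-1.0, -0.5, 3.0]])
cam_pos = np.array([0.0, 0.0, 0.0])   # camera position taken from the design data (assumed)
cam_rot = np.eye(3)                   # camera orientation (identity = looking along +z, assumed)
simulated_region_px = project_points(design_region_3d, cam_pos, cam_rot,
                                     focal_px=1000.0, principal_pt=(960.0, 540.0))
```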
 The projector control unit 318 controls the projector 10. For example, the projector control unit 318 can control the projector 10 by generating a projector control signal for controlling the projector 10 and providing it to the communication control unit 312. In the following description, "controlling the projector 10" means "generating a projector control signal for controlling the projector 10".
 The projector control signal generated by the projector control unit 318 may be, for example, a control signal for lighting the projector 10 (causing the projector 10 to project an image).
 The projector control signal generated by the projector control unit 318 may also be a control signal for performing control related to the lens of the projector 10. The control related to the lens may include at least one of control that moves the lens position of the projector 10 up, down, left, or right (lens shift control) and control that changes the zoom magnification of the projector 10, that is, changes the focal length of the lens (zoom control).
 The movement of the lens position may be, for example, a movement in any one of the up, down, left, or right directions, and the control that moves the lens position may include moving the lens position of the projector 10 one step at a time in the direction designated by the projector control signal. The control that changes the zoom magnification may include, for example, increasing or decreasing the zoom magnification of the projector 10 one step at a time by means of the projector control signal.
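 One way to picture these step-wise control signals is the small sketch below, in which each signal is a one-step command and the command that undoes a step can be looked up directly. The command names are illustrative assumptions, not an actual projector protocol.

```python
# One-step projector control commands and, for each, the command that undoes it.
# The command names are illustrative assumptions, not an actual projector protocol.
OPPOSITE_COMMAND = {
    "lens_shift_up":    "lens_shift_down",
    "lens_shift_down":  "lens_shift_up",
    "lens_shift_left":  "lens_shift_right",
    "lens_shift_right": "lens_shift_left",
    "zoom_in":          "zoom_out",   # increase the zoom magnification by one step
    "zoom_out":         "zoom_in",    # decrease the zoom magnification by one step
}

def undo(command):
    """Return the one-step command that reverses `command` (used when a step
    turns out to have made the difference larger, as described below)."""
    return OPPOSITE_COMMAND[command]
```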
 The projector control signal generated by the projector control unit 318 may also be a control signal for performing control related to output parameters of the projector 10. The output parameters of the projector 10 may include, for example, parameters of the brightness, hue, saturation, and tint (for example, the tint of each of R, G, and B) of the projector 10. The control related to the output parameters of the projector 10 may include changing at least one of the brightness, hue, saturation, and tint parameters one step at a time.
 The projector control unit 318 controls the projector 10 based on the simulation image generated by the simulation image generation unit 316 and the captured image provided from the camera 20. As described above, the simulation image is an image including the projection area of the projector 10 in the pre-design, and the captured image is an image including the actual projection area of the projector 10 at the time of imaging.
 For example, the projector control unit 318 may control the projector 10 so that the difference between the simulation image and the captured image becomes small.
 FIG. 3 is an explanatory diagram for describing an example of projector control by the projector control unit 318. As shown in FIG. 3, the projector control unit 318 may control the projector 10 so that the difference between the simulation image D10 and the captured image F10 becomes small. For example, the difference may be reduced by changing the projection area, that is, by controlling the projector 10 so as to move the lens position in one of the directions or to change (increase or decrease) the zoom magnification.
 The projector control unit 318 may repeatedly control the projector 10 until it is determined that the difference between the simulation image and the captured image is sufficiently small (for example, that the difference is smaller than a predetermined threshold).
 In addition, when the difference between the simulation image and the captured image calculated immediately after a first control is performed on the projector 10 is larger than (has increased from) the previously calculated difference, the projector control unit 318 may perform a second control different from the first control. For example, when the difference has increased as a result of control that moves the lens position, the projector control unit 318 may perform control that changes the zoom magnification or control that moves the lens position in a different direction. When the difference has increased as a result of control that increases (or decreases) the zoom magnification, the projector control unit 318 may perform control that decreases (or increases) the zoom magnification, or control that moves the lens position.
 Also, when the difference between the simulation image and the captured image calculated immediately after a first control is performed on the projector 10 is larger than (has increased from) the previously calculated difference, the projector control unit 318 may perform control opposite to the first control. For example, when the difference has increased as a result of control that moves the lens position, the projector control unit 318 may reverse the movement direction and perform control that moves the lens position in the opposite direction. When the difference has increased as a result of control that increases (or decreases) the zoom magnification, the projector control unit 318 may perform control that decreases (or increases) the zoom magnification.
 By the projector control unit 318 controlling the projector 10 as described above, the projection area of the projector 10 is automatically adjusted so that the difference between the actual projection area of the projector 10 and the projection area in the pre-design becomes sufficiently small.
 The difference between the simulation image and the captured image may be, for example, the difference in area between the projection area included in the simulation image (the projection area in the pre-design) and the projection area included in the captured image (the actual projection area). The difference may also be the difference between the vertex positions of the projection area included in the simulation image and the vertex positions of the projection area included in the captured image. The difference may also be the sum of the differences between the pixel values of corresponding pixels in the simulation image and the captured image. The difference between the simulation image and the captured image is not limited to these examples, and may be calculated by various methods.
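 To make the comparison concrete, the sketch below (a hypothetical illustration, with numpy arrays standing in for the images and region masks) computes two of the difference measures mentioned above: the sum of absolute per-pixel differences and the difference in projection-region area.

```python
import numpy as np

def pixel_difference(simulated, captured):
    """Sum of absolute per-pixel differences between the simulation image and the
    captured image (both arrays of the same shape)."""
    return float(np.abs(simulated.astype(np.float64) - captured.astype(np.float64)).sum())

def area_difference(simulated_region_mask, captured_region_mask):
    """Difference between the projection-region areas, each region given as a
    boolean mask in camera-image coordinates."""
    return abs(int(simulated_region_mask.sum()) - int(captured_region_mask.sum()))
```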
 The projector control unit 318 may control the projector 10 based on captured images, each acquired for one projector and including that projector's projection area, and may control the plurality of projectors included in the projector 10 based on the plurality of captured images. In acquiring such a plurality of captured images, the projection images used for projection may all be the same.
 FIG. 4 is an explanatory diagram for describing an example of projector control by the projector control unit 318. As shown in FIG. 4, even when the projection images are the same, the brightness, hue, saturation, tint, and the like that the user perceives from the projection areas P11 to P14 of the plurality of projectors included in the projector 10 may differ. The projector control unit 318 may therefore control the projector 10 based on the plurality of captured images so as to reduce the variation among the plurality of projectors. In this specification, the variation among the plurality of projectors means the variation in appearance, such as the brightness, hue, saturation, or tint perceived by the user.
 For example, the projector control unit 318 may control the projector 10 so that the variation among the plurality of projectors becomes small for at least one of brightness, hue, saturation, and tint. In the following, the control of the projector 10 relating to the brightness, hue, saturation, and tint parameters is collectively described as control of the output parameters, but the projector control unit 318 can control the brightness, hue, saturation, and tint parameters independently of one another.
 The projector control unit 318 calculates the output parameters of each projector from, for example, the plurality of captured images, and controls the projector 10 so that the variation of the output parameters among the plurality of projectors becomes small. For example, the projector control unit 318 may specify a reference value based on the output parameters of the plurality of projectors and control the output parameter of each projector so that it approaches the reference value. For example, the projector control unit 318 may control a projector having an output parameter larger (smaller) than the reference value so as to decrease (increase) that output parameter. The reference value may be, for example, the average of the output parameters of the plurality of projectors included in the projector 10, or may be the median, the minimum, or the maximum.
 The projector control unit 318 may repeatedly control the projector 10 until it is determined that the variation of the output parameters among the plurality of projectors is sufficiently small (for example, that the variation is smaller than a predetermined threshold).
 In addition, when the variation of the output parameters among the plurality of projectors calculated immediately after controlling an output parameter of the projector 10 is larger than (has increased from) the previously calculated variation, the projector control unit 318 may perform control opposite to that output parameter control. For example, when the variation has increased as a result of control that increases (decreases) an output parameter, the projector control unit 318 may control the projector 10 so as to decrease (increase) that output parameter.
 By the projector control unit 318 controlling the projector 10 as described above, the output parameters of the projector 10 are automatically adjusted so that the variation of the output parameters among the plurality of projectors included in the projector 10 becomes sufficiently small.
 The variation of the output parameters among the plurality of projectors may be, for example, the variance or the standard deviation of the output parameters of the plurality of projectors. The variation may also be the difference between the maximum and minimum values of the output parameters of the plurality of projectors, or the sum of the differences between the reference value described above and the output parameter of each projector. The variation of the output parameters among the plurality of projectors is not limited to these examples, and may be calculated by various methods.
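 The sketch below is a hypothetical illustration of this idea: the brightness estimate, the choice of the mean as the reference value, and the max-min spread as the variation measure are all assumptions made for the example, not the only options allowed by the description above.

```python
import numpy as np

def brightness_of(captured, region_mask):
    """Brightness of one projector, estimated as the mean pixel value inside its
    projection region in the captured image (an illustrative choice)."""
    return float(captured[region_mask].mean())

def adjustment_directions(values):
    """Given one output-parameter value per projector, pick a reference value (here
    the mean) and return, per projector, +1 to raise, -1 to lower, or 0 to keep its
    output, together with one possible variation measure (the max-min spread)."""
    values = np.asarray(values, dtype=np.float64)
    reference = values.mean()                 # could equally be the median, minimum, or maximum
    spread = float(values.max() - values.min())
    return [int(np.sign(reference - v)) for v in values], spread
```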
 The communication unit 320 is a communication interface that mediates communication with other devices. The communication unit 320 supports an arbitrary wireless or wired communication protocol and establishes communication connections with other devices (for example, the projector 10 and the camera 20) via, for example, the communication network 80 described with reference to FIG. 1. The communication unit 320 also receives information from other devices and transmits information to other devices under the control of the communication control unit 312.
 The storage unit 350 stores programs and parameters that allow each component of the information processing apparatus 30 to function. The storage unit 350 also stores the parameters of the pre-design and the parameters related to the camera 20.
 A configuration example of the present embodiment has been described above. The configuration of the information processing apparatus 30 shown in FIG. 2 is an example, and the present embodiment is not limited to it. For example, each function of the control unit 310 according to the present embodiment may be provided in another information processing apparatus connected via the communication unit 320. For example, when the simulation image generation unit 316 is provided in another information processing apparatus, a simulation image generated by that other information processing apparatus may be provided to the information processing apparatus 30 via the communication unit 320.
<< 3. Operation >>
 Next, an operation example of the present embodiment will be described. First, an overall operation example is described with reference to FIGS. 5 and 6; then the projector control for adjusting the projection area of each projector is described with reference to FIG. 7; and finally the projector control for adjusting the variation among a plurality of projectors is described with reference to FIG. 8.
 (Overall operation)
 FIG. 5 is a flowchart showing an operation example of the present embodiment. First, the user performs a pre-design regarding the arrangement and the like of the screen 50 and the projector 10 (S11). The pre-design may be performed using the information processing apparatus 30, or it may be performed using another apparatus (not shown) connected to the communication network 80, with the parameters of the pre-design then being provided to the information processing apparatus 30.
 FIG. 6 is an explanatory diagram showing an example of a screen displayed on a display unit (not shown) in the pre-design of step S11. For example, in the pre-design, the user may adjust the positions, orientations, and so on of the projectors G11 to G14 (corresponding, for example, to the projectors 10A to 10D shown in FIG. 1) while checking their projection areas on a screen G50 (corresponding, for example, to the screen 50 shown in FIG. 1) displayed on a screen G1. Although the screen G1 is shown in a planar view in FIG. 6, the screen G1 may be displayed three-dimensionally, and the user may be able to change the viewpoint position.
 Returning to FIG. 5, the description continues. Next, the user installs the screen 50 and the projector 10 according to the pre-design (S12). At this time, the camera 20 and the information processing apparatus 30 are also installed.
 Subsequently, the information processing apparatus 30 performs the projector control that adjusts the projection area of each projector (S13). Details of step S13 will be described later with reference to FIG. 7.
 When there is a projector, among the projectors included in the projector 10, for which the projection area adjustment of step S13 has not been completed (NO in S14), the processing of step S13 is repeated for the projectors whose projection area adjustment has not been completed.
 On the other hand, when the projection area adjustment of step S13 has been completed for all the projectors included in the projector 10 (YES in S14), the information processing apparatus 30 performs the control that adjusts the variation among the plurality of projectors included in the projector 10 (S15). Details of step S15 will be described later with reference to FIG. 8.
 (Control to adjust the projection area of each projector)
 FIG. 7 is a flowchart showing the processing of step S13 shown in FIG. 5 in more detail. First, the projector control unit 318 selects, from the projector 10, one projector for which the processing of step S13 has not been completed, and sets it as the projector of interest to be processed (S131).
 Next, the simulation image generation unit 316 generates the simulation image for when the projector of interest is lit (an image including the projection area of the projector of interest in the pre-design) (S132). The projector control unit 318 then lights the projector of interest (S133).
 Next, under the control of the imaging control unit 314, the camera 20 acquires a captured image including the projection area of the projector of interest and provides it to the information processing apparatus 30 (S134). The projector control unit 318 then calculates the difference between the simulation image and the captured image acquired in step S134, and controls the projector of interest so that the difference becomes small (S135). When the processing of step S135 is performed on the projector of interest for the first time, the projector control unit 318 may, after calculating the difference, randomly perform one of the projector controls of moving the lens in one of the directions or changing (increasing or decreasing) the zoom magnification. Then, each time the processing of step S135 is performed, the projector control unit 318 calculates the difference again; if the difference calculated this time is larger than the previously calculated difference, it may perform a different control or the opposite control, as described above. If, on the other hand, the difference is smaller than the previous one, the projector control unit 318 may perform the same control as before. In this way, the projector control unit 318 can control the projector of interest so that the difference becomes small.
 If the difference between the simulation image and the captured image calculated by the projector control unit 318 in step S135 is already sufficiently small, the projector control unit 318 does not have to control the projector of interest in step S135. When the difference between the simulation image and the captured image is sufficiently small (YES in S136), the processing of step S13 for the projector of interest ends. If the difference is not sufficiently small, the projector of interest is controlled in step S135 as described above, and the acquisition of the captured image (S134) and the control of the projector 10 (S135) are repeated until it is determined that the difference between the simulation image and the captured image is sufficiently small.
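 As an illustration of the loop of steps S134 to S136, the following is a minimal sketch rather than the actual implementation: `send_command`, `capture`, and `difference` are hypothetical stand-ins for the control-signal, image-acquisition, and comparison steps described above, and the command names mirror the earlier lens-command sketch.

```python
import random

# One-step commands and their inverses (mirroring the earlier sketch; illustrative names).
OPPOSITE_COMMAND = {"lens_shift_up": "lens_shift_down", "lens_shift_down": "lens_shift_up",
                    "lens_shift_left": "lens_shift_right", "lens_shift_right": "lens_shift_left",
                    "zoom_in": "zoom_out", "zoom_out": "zoom_in"}

def adjust_projection_region(projector, camera, simulated,
                             send_command, capture, difference,
                             threshold, max_steps=100):
    """Step the lens shift / zoom of the projector of interest until the difference
    between the captured image and the simulation image falls below `threshold`."""
    prev_diff = difference(simulated, capture(camera))       # initial comparison
    command = random.choice(list(OPPOSITE_COMMAND))           # first control chosen at random (S135)
    while prev_diff >= threshold and max_steps > 0:
        send_command(projector, command)                      # one-step projector control signal (S135)
        diff = difference(simulated, capture(camera))         # new captured image (S134)
        if diff > prev_diff:
            command = OPPOSITE_COMMAND[command]               # the step made things worse: reverse it
        prev_diff = diff
        max_steps -= 1
    return prev_diff
```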
 The projector control for adjusting the projection area of each projector in step S13 has been described above. The flowchart shown in FIG. 7 is an example, and the operation according to the present embodiment is not limited to this example. For example, the generation of the simulation images in step S132 may be performed in advance for all the projectors, before the projector of interest is set, and the simulation images corresponding to all the projectors may be stored in the storage unit 350.
 (Control to adjust the variation among a plurality of projectors)
 FIG. 8 is a flowchart showing the processing of step S15 shown in FIG. 5 in more detail. First, the projector control unit 318 selects, from the projector 10, one projector for which the processing of steps S152 to S153 has not been completed, and sets it as the projector of interest to be processed (S151).
 Next, the projector control unit 318 lights the projector of interest (S152).
 Further, under the control of the imaging control unit 314, the camera 20 acquires a captured image including the projection area of the projector of interest and provides it to the information processing apparatus 30 (S153). The processing of steps S151 to S153 is repeated until captured images at lighting time have been acquired for all the projectors included in the projector 10. That is, when there is a projector, among the projectors included in the projector 10, for which a captured image at lighting time has not yet been acquired (NO in S154), the processing returns to step S151.
 On the other hand, when captured images at lighting time have been acquired for all the projectors included in the projector 10 (YES in S154), the projector control unit 318 determines, based on the acquired plurality of captured images, whether the variation of the output parameters among the plurality of projectors is sufficiently small (S155). When it is determined that the variation of the output parameters among the plurality of projectors is sufficiently small (YES in S155), the processing of step S15 ends.
 On the other hand, when it is determined that the variation of the output parameters among the plurality of projectors is not sufficiently small (NO in S155), the projector control unit 318 controls each projector included in the projector 10 so that the variation becomes small (S156). The processing then returns to step S151; after captured images have been acquired for all the projectors following the control of step S156 (S151 to S154), it is determined again, based on those captured images, whether the variation is sufficiently small (S155).
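 A minimal sketch of the loop of steps S151 to S156 follows. It is hypothetical: `capture_with_only`, `brightness_of`, and `step_output` are stand-ins for the capture, measurement, and output-parameter control steps, and the mean and the max-min spread are just one choice of reference value and variation measure.

```python
import numpy as np

def equalize_projectors(projectors, camera, capture_with_only, brightness_of,
                        step_output, threshold, max_rounds=20):
    """Capture each projector lit on its own, measure one output parameter (here
    brightness), and nudge each projector toward the mean until the spread across
    projectors falls below `threshold`."""
    for _ in range(max_rounds):
        values = np.array([brightness_of(capture_with_only(camera, p)) for p in projectors])
        if values.max() - values.min() < threshold:   # variation sufficiently small (S155: YES)
            break
        reference = values.mean()                     # reference value (the mean is one option)
        for projector, value in zip(projectors, values):
            if value < reference:
                step_output(projector, +1)            # raise this projector's output by one step (S156)
            elif value > reference:
                step_output(projector, -1)            # lower it by one step (S156)
```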
<< 4. Modifications >>
 An embodiment of the present disclosure has been described above. In the following, some modifications of the present embodiment are described. Each of the modifications described below may be applied to the present embodiment alone or in combination. Each modification may also be applied instead of a configuration described in the present embodiment, or in addition to a configuration described in the present embodiment.
 <4-1. Modification 1>
 In the embodiment above, an example in which the output parameters are obtained based on the captured image has been described, but the present technology is not limited to this example. For example, the projection system 1 may include a sensor other than the camera 20, and the output parameters may be obtained based on sensing data acquired by that sensor. For example, when the projection system 1 includes an illuminance sensor, the brightness parameter (an example of an output parameter) may be obtained based on sensing data acquired by the illuminance sensor. With such a configuration, the output parameters can be obtained with higher accuracy.
 <4-2. Modification 2>
 In the embodiment above, an example in which imaging by the camera 20 is controlled by the imaging control unit 314 and a captured image is acquired every time each projector is lit has been described, but the present technology is not limited to this example. For example, the control unit 310 does not have to have the function of the imaging control unit 314; in that case, captured images may be provided continuously (for example, at all times) from the camera 20 to the information processing apparatus 30, and the information processing apparatus 30 may select the captured image to be used for the projector control by referring to the imaging time of each captured image.
 <4-3. Modification 3>
 FIG. 1 shows an example in which the camera 20 is a single camera, but the present technology is not limited to this example. For example, the camera 20 may include a plurality of cameras. When the camera 20 includes a plurality of cameras, the imaging control unit 314 may generate an imaging control signal that controls imaging by the camera, among the cameras included in the camera 20, that corresponds to the projector that is lit. Here, the camera corresponding to a projector is, for example, a camera capable of acquiring a captured image including the projection area of that projector.
 When the camera 20 includes a plurality of cameras, it is desirable that calibration of brightness, hue, saturation, tint, and the like is performed between the cameras, and that the output parameters are calculated from the captured images based on the calibration results.
<< 5. Hardware configuration example >>
 The embodiment of the present disclosure has been described above. Finally, the hardware configuration of the information processing apparatus according to the present embodiment will be described with reference to FIG. 9. FIG. 9 is a block diagram showing an example of the hardware configuration of the information processing apparatus according to the present embodiment. The information processing apparatus 900 shown in FIG. 9 can realize, for example, the information processing apparatus 30 shown in FIGS. 1 and 2. The information processing by the information processing apparatus 30 according to the present embodiment is realized by cooperation between software and the hardware described below.
 As shown in FIG. 9, the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a. The information processing apparatus 900 also includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, a communication device 913, and a sensor 915. The information processing apparatus 900 may include a processing circuit such as a DSP or an ASIC instead of or in addition to the CPU 901.
 The CPU 901 functions as an arithmetic processing device and a control device, and controls the overall operation of the information processing apparatus 900 according to various programs. The CPU 901 may also be a microprocessor. The ROM 902 stores programs, operation parameters, and the like used by the CPU 901. The RAM 903 temporarily stores programs used in the execution by the CPU 901, parameters that change as appropriate during that execution, and the like. The CPU 901 can form, for example, the control unit 310.
 The CPU 901, the ROM 902, and the RAM 903 are connected to one another by the host bus 904a, which includes a CPU bus and the like. The host bus 904a is connected via the bridge 904 to the external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus. The host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately, and their functions may be implemented in a single bus.
 The input device 906 is realized by a device through which the user inputs information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers. The input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA compatible with the operation of the information processing apparatus 900. The input device 906 may further include, for example, an input control circuit that generates an input signal based on the information input by the user using the above input means and outputs it to the CPU 901. By operating the input device 906, the user of the information processing apparatus 900 can input various data to the information processing apparatus 900 and instruct it to perform processing operations.
 The output device 907 is formed by a device capable of visually or audibly notifying the user of the acquired information. Such devices include display devices such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device, and lamps, audio output devices such as speakers and headphones, printer devices, and the like. The output device 907 outputs, for example, the results obtained by the various kinds of processing performed by the information processing apparatus 900. Specifically, the display device visually displays the results obtained by the various kinds of processing performed by the information processing apparatus 900 in various formats such as text, images, tables, and graphs. The audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
 The storage device 908 is a device for data storage formed as an example of the storage unit of the information processing apparatus 900. The storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deleting device that deletes data recorded on the storage medium, and the like. The storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like. The storage device 908 can form, for example, the storage unit 350.
 The drive 909 is a reader/writer for storage media, and is built into or externally attached to the information processing apparatus 900. The drive 909 reads information recorded on a mounted removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs it to the RAM 903. The drive 909 can also write information to the removable storage medium.
 The connection port 911 is an interface connected to external devices, and is a connection port for external devices capable of data transmission via, for example, USB (Universal Serial Bus).
 The communication device 913 is a communication interface formed by, for example, a communication device for connecting to a network 920. The communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 913 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various kinds of communication, or the like. The communication device 913 can transmit and receive signals and the like to and from, for example, the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP.
 The sensor 915 includes various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measuring sensor, and a force sensor. The sensor 915 acquires information on the state of the information processing apparatus 900 itself, such as its attitude and moving speed, and information on the surrounding environment of the information processing apparatus 900, such as the brightness and noise around it. The sensor 915 may also include a GPS sensor that receives GPS signals and measures the latitude, longitude, and altitude of the apparatus.
 The network 920 is a wired or wireless transmission path for information transmitted from devices connected to the network 920. For example, the network 920 may include public networks such as the Internet, a telephone network, or a satellite communication network, as well as various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like. The network 920 may also include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).
 An example of a hardware configuration capable of realizing the functions of the information processing apparatus 900 according to the present embodiment has been shown above. Each of the above components may be realized using general-purpose members, or may be realized by hardware specialized for the function of each component. Accordingly, the hardware configuration to be used can be changed as appropriate according to the technical level at the time the present embodiment is carried out.
 A computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be produced and implemented on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed, for example, via a network without using a recording medium.
 << 6. Conclusion >>
 As described above, according to the embodiments of the present disclosure, it is possible to reduce the human burden while suppressing degradation in image quality.
 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
 The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the above effects.
 The following configurations also belong to the technical scope of the present disclosure.
(1)
 An information processing apparatus including a control unit that controls a projector based on a simulation image including a projection area of the projector in a preliminary design and a captured image including the projection area of the projector.
(2)
 The information processing apparatus according to (1), wherein the simulation image is generated based on parameters in the preliminary design and parameters relating to an imaging apparatus that acquires the captured image.
(3)
 The information processing apparatus according to (1) or (2), wherein the control unit controls the projector so that a difference between the simulation image and the captured image becomes small.
(4)
 The information processing apparatus according to (3), wherein the control unit performs control relating to a lens of the projector.
(5)
 The information processing apparatus according to (4), wherein the control relating to the lens includes lens shift control of the projector.
(6)
 The information processing apparatus according to (4) or (5), wherein the control relating to the lens includes zoom control of the projector.
(7)
 The information processing apparatus according to any one of (3) to (6), wherein the control unit repeatedly controls the projector until the difference is determined to be sufficiently small.
(8)
 The information processing apparatus according to (7), wherein the control unit performs a second control different from a first control when the difference calculated after performing the first control is larger than the previously calculated difference.
(9)
 The information processing apparatus according to (8), wherein the control unit performs control opposite to the first control when the difference calculated after performing the first control is larger than the previously calculated difference.
(10)
 The information processing apparatus according to any one of (1) to (9), wherein the control unit controls a plurality of projectors based on a plurality of captured images.
(11)
 The information processing apparatus according to (10), wherein the control unit performs control relating to output parameters of the plurality of projectors so that variation in the output parameters among the plurality of projectors becomes small.
(12)
 The information processing apparatus according to (11), wherein the control unit controls an output parameter of each projector included in the plurality of projectors so as to approach a reference value specified based on the output parameters of the plurality of projectors.
(13)
 The information processing apparatus according to (11) or (12), wherein the control unit repeatedly controls the projectors until the variation is determined to be sufficiently small.
(14)
 The information processing apparatus according to (13), wherein, when the variation calculated after performing the control relating to the output parameters of the plurality of projectors is larger than the previously calculated variation, the control unit performs control opposite to the control relating to the output parameters of the plurality of projectors.
(15)
 The information processing apparatus according to any one of (11) to (14), wherein the output parameters include at least one of brightness, hue, saturation, and color tone.
(16)
 An information processing method including controlling, by a processor, a projector based on a simulation image including a projection area of the projector in a preliminary design and a captured image including the projection area of the projector.
(17)
 A projection system including:
 a projector that projects a projection image;
 an imaging apparatus that acquires a captured image including a projection area of the projector; and
 an information processing apparatus having a control unit that controls the projector based on a simulation image including the projection area of the projector in a preliminary design and the captured image.
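 By way of illustration only, the following is a minimal sketch of the iterative adjustment described in configurations (3) and (7) to (9) above: the projector is controlled so that the difference between the simulation image and the captured image becomes small, the control is repeated until the difference is judged sufficiently small, and when a control makes the difference larger than the previously calculated value, the opposite control is applied. The helper names used here (camera.capture, projector.shift_lens, region_difference) are assumptions introduced for this sketch and are not defined by the present disclosure.

```python
# Non-normative sketch of configurations (3) and (7)-(9): iterative lens-shift control that
# reduces the difference between the simulation image and the captured image.
# camera.capture() and projector.shift_lens() are hypothetical placeholders.
import numpy as np

def region_difference(simulation_image: np.ndarray, captured_image: np.ndarray) -> float:
    """Scalar difference between the two images, here the mean absolute pixel error."""
    return float(np.mean(np.abs(simulation_image.astype(float) - captured_image.astype(float))))

def adjust_projector(projector, camera, simulation_image, step=1.0, threshold=0.5, max_iterations=50):
    previous_diff = region_difference(simulation_image, camera.capture())
    direction = +1  # current control direction (e.g. lens shift toward +x)
    for _ in range(max_iterations):
        if previous_diff <= threshold:           # difference judged sufficiently small
            break
        projector.shift_lens(direction * step)   # first control: move the projection area
        diff = region_difference(simulation_image, camera.capture())
        if diff > previous_diff:                 # difference grew: apply the opposite control
            direction = -direction
            projector.shift_lens(2 * direction * step)  # undo the shift and move the other way
            diff = region_difference(simulation_image, camera.capture())
        previous_diff = diff
    return previous_diff
```

 The same loop structure applies to the zoom control of configuration (6); only the actuated quantity changes.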
 DESCRIPTION OF SYMBOLS
 1 Projection system
 10 Projector
 20 Camera
 30 Information processing apparatus
 50 Screen
 80 Communication network
 310 Control unit
 312 Communication control unit
 314 Imaging control unit
 316 Simulation image generation unit
 318 Projector control unit
 320 Communication unit
 350 Storage unit

Claims (17)

  1.  An information processing apparatus comprising a control unit that controls a projector based on a simulation image including a projection area of the projector in a preliminary design and a captured image including the projection area of the projector.
  2.  The information processing apparatus according to claim 1, wherein the simulation image is generated based on parameters in the preliminary design and parameters relating to an imaging apparatus that acquires the captured image.
  3.  The information processing apparatus according to claim 1, wherein the control unit controls the projector so that a difference between the simulation image and the captured image becomes small.
  4.  The information processing apparatus according to claim 3, wherein the control unit performs control relating to a lens of the projector.
  5.  The information processing apparatus according to claim 4, wherein the control relating to the lens includes lens shift control of the projector.
  6.  The information processing apparatus according to claim 4, wherein the control relating to the lens includes zoom control of the projector.
  7.  The information processing apparatus according to claim 3, wherein the control unit repeatedly controls the projector until the difference is determined to be sufficiently small.
  8.  The information processing apparatus according to claim 7, wherein the control unit performs a second control different from a first control when the difference calculated after performing the first control is larger than the previously calculated difference.
  9.  The information processing apparatus according to claim 8, wherein the control unit performs control opposite to the first control when the difference calculated after performing the first control is larger than the previously calculated difference.
  10.  The information processing apparatus according to claim 1, wherein the control unit controls a plurality of projectors based on a plurality of captured images.
  11.  The information processing apparatus according to claim 10, wherein the control unit performs control relating to output parameters of the plurality of projectors so that variation in the output parameters among the plurality of projectors becomes small.
  12.  The information processing apparatus according to claim 11, wherein the control unit controls an output parameter of each projector included in the plurality of projectors so as to approach a reference value specified based on the output parameters of the plurality of projectors.
  13.  The information processing apparatus according to claim 11, wherein the control unit repeatedly controls the projectors until the variation is determined to be sufficiently small.
  14.  The information processing apparatus according to claim 13, wherein, when the variation calculated after performing the control relating to the output parameters of the plurality of projectors is larger than the previously calculated variation, the control unit performs control opposite to the control relating to the output parameters of the plurality of projectors.
  15.  The information processing apparatus according to claim 11, wherein the output parameters include at least one of brightness, hue, saturation, and color tone.
  16.  An information processing method including controlling, by a processor, a projector based on a simulation image including a projection area of the projector in a preliminary design and a captured image including the projection area of the projector.
  17.  A projection system comprising:
      a projector that projects a projection image;
      an imaging apparatus that acquires a captured image including a projection area of the projector; and
      an information processing apparatus having a control unit that controls the projector based on a simulation image including the projection area of the projector in a preliminary design and the captured image.
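 As a further non-normative illustration, the sketch below corresponds to claims 11 to 14: an output parameter of each of a plurality of projectors (brightness is used here) is nudged toward a reference value derived from the output parameters of all of the projectors, the adjustment is repeated until the variation is judged sufficiently small, and the adjustment is reverted when the variation becomes larger than the previously calculated value. The accessors get_brightness/set_brightness and the measure_brightness callback are hypothetical placeholders standing in for values obtained from the captured images.

```python
# Non-normative sketch of claims 11-14: balancing an output parameter across several projectors.
# measure_brightness(p) stands for a value derived from a captured image of projector p's area.
from statistics import mean, pstdev

def balance_brightness(projectors, measure_brightness, gain=0.5, threshold=0.01, max_iterations=20):
    values = [measure_brightness(p) for p in projectors]
    previous_spread = pstdev(values)                  # variation among the projectors
    for _ in range(max_iterations):
        if previous_spread <= threshold:              # variation judged sufficiently small
            break
        reference = mean(values)                      # reference value from all output parameters
        corrections = [gain * (reference - v) for v in values]
        for p, c in zip(projectors, corrections):
            p.set_brightness(p.get_brightness() + c)
        values = [measure_brightness(p) for p in projectors]
        spread = pstdev(values)
        if spread > previous_spread:                  # variation grew: apply the opposite control
            for p, c in zip(projectors, corrections):
                p.set_brightness(p.get_brightness() - c)
            gain *= 0.5                               # try a gentler adjustment next time
            values = [measure_brightness(p) for p in projectors]
            spread = pstdev(values)
        previous_spread = spread
    return previous_spread
```

 In practice the same structure can be applied to hue, saturation, or color tone by swapping the measured quantity and the corresponding projector setting.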
PCT/JP2017/042835 2017-01-17 2017-11-29 Information processing device, information processing method, and projection system WO2018135141A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017005675 2017-01-17
JP2017-005675 2017-01-17

Publications (1)

Publication Number Publication Date
WO2018135141A1 true WO2018135141A1 (en) 2018-07-26

Family

ID=62909298

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/042835 WO2018135141A1 (en) 2017-01-17 2017-11-29 Information processing device, information processing method, and projection system

Country Status (1)

Country Link
WO (1) WO2018135141A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10333088A (en) * 1997-05-28 1998-12-18 Canon Inc Display method of projected picture and projective picture display device
JP2005027154A (en) * 2003-07-04 2005-01-27 Hitachi Ltd Multi-camera system and its adjusting device
JP2007300540A (en) * 2006-05-02 2007-11-15 Hitachi Ltd Image display system
JP2009171012A (en) * 2008-01-11 2009-07-30 Nikon Corp Projector
JP2014204173A (en) * 2013-04-01 2014-10-27 キヤノン株式会社 Image processing apparatus and image processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17892457; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17892457; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)