
WO2023233919A1 - Image projection system - Google Patents

Image projection system

Info

Publication number
WO2023233919A1
WO2023233919A1 (PCT/JP2023/017208)
Authority
WO
WIPO (PCT)
Prior art keywords
projection
image
unit
vehicle
projection unit
Prior art date
Application number
PCT/JP2023/017208
Other languages
French (fr)
Japanese (ja)
Inventor
Ryoichi Takeuchi (竹内 凌一)
Original Assignee
Kyocera Corporation (京セラ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corporation
Publication of WO2023233919A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 Details
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/54 Accessories
    • G03B21/56 Projection screens
    • G03B21/60 Projection screens characterised by the nature of the surface
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/363 Image reproducers using image projection screens
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This application relates to an image projection system.
  • Patent Document 1 describes a projection type display device (projection device) that projects images from a plurality of projectors onto a retroreflective screen.
  • The plurality of projectors are arranged such that their distances from the retroreflective screen differ from one another.
  • The projection device described in Patent Document 1 can display stereoscopic images to viewers at various distances by using a retroreflective screen and switching the projection device depending on the viewer's position.
  • However, because each projection device projects an image toward a viewer at a preset position, it cannot cope with a case where the position of the viewer's head changes at that position. There is a need to project a more appropriate image according to the position of the viewer's head.
  • An image projection system includes a first projection unit having a plurality of projection devices arranged on a straight line, and a second projection unit having a plurality of projection devices arranged on a straight line at a different angle from the first projection unit.
  • FIG. 1 is a plan view schematically showing a vehicle equipped with an image projection system.
  • FIG. 2 is a side view schematically showing a vehicle equipped with an image projection system.
  • FIG. 3 is a schematic diagram showing an example of the arrangement of projection devices of the image projection system.
  • FIG. 4 is a block diagram showing a schematic configuration of the image projection system.
  • FIG. 5 is a plan view for explaining the operation of the image projection system.
  • FIG. 6 is a partially enlarged sectional view showing the configuration of a screen and a diffuser plate of the image projection system.
  • FIG. 7 is a flowchart illustrating an example of processing of the image projection system.
  • FIG. 8 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 9 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 10 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 11 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 12 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 13 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • FIG. 14 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • The image projection system 1 is mounted on a vehicle 2.
  • The vehicle 2 is, for example, a passenger car, and has a plurality of seats inside, in which a driver, fellow passengers, and the like ride.
  • The vehicle 2 includes a driver's seat 4, a dashboard, doors, pillars, a back seat, and the like.
  • The image projection system 1 projects an image of the surroundings of the vehicle onto a shielding part within the range visible to the driver, so that the driver can see the surroundings of the vehicle as if the shielding part were not there. Note that although this embodiment is described for the case in which an image is displayed for the driver 5, an image can be similarly projected for a passenger sitting in a seat other than the driver's seat 4.
  • The image projection system 1 of the present embodiment includes external cameras 3a and 3b, which are imaging units that capture images of the scenery surrounding the vehicle 2 and output the image data obtained by imaging; an in-vehicle camera 6, which is a viewpoint detection unit that detects the positions of the left eye EL and right eye ER, the viewpoints of the driver 5 (the observer) seated in the driver's seat 4 of the vehicle 2, for example while driving, and outputs viewpoint position information representing the detected positions of the left eye EL and right eye ER in spatial coordinates; a first image processing unit 8 that generates, based on the image data in the range corresponding to the shielding part 7 that blocks the view when looking outside the vehicle from the left eye EL and right eye ER represented by the viewpoint position information, a first image that is seen by one of the left eye EL and right eye ER; a second image processing unit 9 that generates, based on the image data in the range corresponding to the shielding part 7, a second image that is seen by the other of the left eye EL and right eye ER and has parallax with respect to the first image; a display device 10, which is an image display unit provided on the shielding part 7 in the vehicle 2 and displays a parallax image including the first image and the second image; and projection units 12 and 24 that project images onto the display device 10.
  • The external cameras 3a and 3b are imaging units that capture images of the scenery surrounding the vehicle 2 and output the image data obtained by imaging.
  • The front exterior camera 3a is installed at the front end of the vehicle 2.
  • The rear exterior camera 3b is installed at the rear of the vehicle 2.
  • Various types of digital cameras can be used as the external cameras 3a and 3b, such as a CCD (Charge Coupled Device) camera equipped with a fisheye lens or a CMOS (Complementary Metal Oxide Semiconductor) digital camera.
  • Each external camera 3a, 3b is preferably equipped with a sensor that detects the environment outside the vehicle (temperature, humidity, etc.) and with functions such as temperature control of its own optical system and CCD and defogging of its cover, so as to obtain clear images.
  • The image projection system 1 can also be provided with a plurality of external cameras 3a and 3b, respectively.
  • For example, a pair of external cameras 3a may be provided at diagonal positions at the front end of the vehicle, and a pair of external cameras 3b at diagonal positions at the rear end of the vehicle. This allows clear images of the front and rear of the vehicle to be captured.
  • The external cameras 3a and 3b only need to be able to acquire images of the outside of the vehicle 2, and are not limited to being disposed outside the vehicle.
  • The cameras 3a and 3b may instead be placed inside the vehicle.
  • For example, the exterior camera 3a may be located inside the windshield of the vehicle, or within the hood of the vehicle.
  • The external cameras 3a and 3b are provided to obtain images of both the front and rear of the vehicle 2, but it is also possible to provide only the external camera 3a and obtain images of only the front of the vehicle 2.
  • The driver's seat 4 is equipped with a seating sensor 36 that detects whether the driver 5 is seated.
  • The seating sensor 36 is constituted by a known load sensor or limit switch. When the driver 5 is seated in the driver's seat 4, the seating sensor 36 installed in the driver's seat 4 detects that the driver 5 is seated.
  • The in-vehicle camera 6 is installed in the vehicle at a position where it can image the driver 5, for example, at a position adjacent to the rearview mirror.
  • A CCD camera can be used as the in-vehicle camera 6.
  • The shielding part 7 is an object among the structures of the vehicle 2 that blocks the driver's view and prevents the driver from seeing the outside of the vehicle 2.
  • The shielding part 7 in the vehicle 2 includes the dashboard, doors, pillars, the back seat, and the like.
  • The display device 10 displays an image when the image is projected onto it from the projection unit 12.
  • The display devices 10 include a dashboard display device 10a disposed on the dashboard, a right side pillar display device 10b disposed on the right side pillar, a left side pillar display device 10c disposed on the left side pillar, and a back-seat display device 10d disposed on the seat back 23 of the rear seat 22.
  • The display device 10 is arranged along the shape of the shielding part 7, on the surface facing the interior space of the vehicle.
  • The projection unit 12 projects images toward the display devices 10a, 10b, and 10c arranged further forward of the vehicle than the driver's seat 4.
  • The projection unit 12 includes a first projection unit 100 and a second projection unit 102, as shown in FIG.
  • The first projection unit 100 is arranged at an angle of view that allows it to project an image onto the display device 10b on the right pillar, and the second projection unit 102 is arranged at an angle of view that allows it to project an image onto the display device 10c on the left pillar.
  • The first projection unit 100 and the second projection unit 102 can also project an image onto the display device 10a arranged on the dashboard, although their projectable areas differ.
  • The first projection unit 100 includes projection devices 110a, 110b, 110c, and 110d.
  • The projection devices 110a, 110b, 110c, and 110d each project images toward the display device 10.
  • The projection devices 110a, 110b, 110c, and 110d are arranged, for example, in a row on a first straight line 122 on a virtual first plane.
  • In the projection devices 110a, 110b, 110c, and 110d, the projection light source lines, which are the central axes of the projected images, are at different positions according to the arrangement interval. That is, the projection devices 110a, 110b, 110c, and 110d project images onto different areas of the display device 10.
  • The first plane is a plane parallel to the projection light source lines; in this embodiment, it is the plane when the vehicle 2 is viewed from above.
  • The second projection unit 102 includes projection devices 112a, 112b, 112c, and 112d.
  • The projection devices 112a, 112b, 112c, and 112d each project images toward the display device 10.
  • The projection devices 112a, 112b, 112c, and 112d are arranged, for example, in a row on a second straight line 124 on the virtual first plane.
  • In the projection devices 112a, 112b, 112c, and 112d, the projection light source lines, which are the central axes of the projected images, are at different positions according to the arrangement interval. That is, the projection devices 112a, 112b, 112c, and 112d project images onto different areas of the display device 10. Note that although the projection areas 130 of the projection devices 112a, 112b, 112c, and 112d are shifted according to their projection light source lines, the projection areas 130 partially overlap.
  • The second straight line 124 intersects the first straight line 122 at an intersection 126, and the intersection 126 is the center of the first projection unit 100 and the center of the second projection unit 102.
  • The second straight line 124 runs in a different direction on the first plane from the first straight line 122, and the angle between the two straight lines is θ. That is, the first projection unit 100 and the second projection unit 102 are arranged in an X shape.
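  • As an illustration, the X-shaped arrangement of the two projection units can be sketched as follows. The device count matches the embodiment, but the spacing and the angle θ used here are assumed values for illustration, not parameters from the embodiment.

```python
import math

def device_positions(num_devices, spacing, angle_rad):
    """Place num_devices projection devices evenly along a straight line
    through the origin (the intersection 126), inclined at angle_rad on the
    first plane, centered on the origin."""
    half = (num_devices - 1) / 2.0
    return [((i - half) * spacing * math.cos(angle_rad),
             (i - half) * spacing * math.sin(angle_rad))
            for i in range(num_devices)]

theta = math.radians(30)                         # assumed angle between the two lines
first_unit = device_positions(4, 0.05, 0.0)      # devices 110a-110d on the first straight line 122
second_unit = device_positions(4, 0.05, theta)   # devices 112a-112d on the second straight line 124
```

  • Because both lines pass through the intersection 126 at the origin, the two device rows cross there in an X shape, and each device has a projection light source line offset according to the arrangement interval.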
  • The first projection unit 100 is arranged such that its projection light source lines are inclined from the direction parallel to the traveling direction of the vehicle 2 toward the region that includes the display device 10b arranged on the pillar.
  • The second projection unit 102 is arranged such that its projection light source lines are inclined from the direction parallel to the traveling direction of the vehicle 2 toward the region (passenger seat side) that includes the display device 10c arranged on the pillar.
  • The projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d may be arranged at different positions in the direction perpendicular to the first plane, that is, stacked in the direction perpendicular to the first plane.
  • The second projection unit 102 may be arranged on a second plane that is perpendicular to the first plane. The configurations of the projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d and of the display device 10 will be described later.
  • The projection unit 24 projects an image toward the display device 10d on the back seat.
  • The projection unit 24 is configured in the same manner as the projection unit 12 described above, and projects, onto the display device 10d, the image data corresponding to the range shielded by the back seat among the image data captured by the rear exterior camera 3b.
  • The control device 50 controls the operation of each part of the image projection system 1.
  • The control device 50 is connected to each component of the image projection system 1 and controls each component.
  • The control device 50 is realized by a processor such as an electronic control unit (ECU) as a hardware resource, together with a computer-readable program as a software resource.
  • The control device 50 may include one or more processors.
  • The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing.
  • The dedicated processor may include an application-specific integrated circuit (ASIC).
  • The processor may include a programmable logic device (PLD).
  • The PLD may include an FPGA (Field-Programmable Gate Array).
  • The control device 50 may be either an SoC (System-on-a-Chip) or an SiP (System in a Package) in which one or more processors work together.
  • The control device 50 may include a storage unit, and may store in the storage unit various information or programs for operating each component of the image projection system 1.
  • The storage unit may be composed of, for example, a semiconductor memory.
  • The storage unit may function as a storage area that is temporarily used during data processing by the control device 50.
  • The control device 50 includes a viewpoint recognition device 31, an image data processing device 33, a vehicle exterior camera control device 35, an in-vehicle camera control device 37, and a display control device 39.
  • The vehicle exterior camera control device 35 controls the operation of each of the vehicle exterior cameras 3a and 3b.
  • The vehicle exterior camera control device 35 acquires the image data of the images captured by each of the vehicle exterior cameras 3a and 3b.
  • The vehicle exterior camera control device 35 receives analog image data from each of the vehicle exterior cameras 3a and 3b, converts it into digital data, and sends it to the image data processing device 33.
  • The in-vehicle camera control device 37 sharpens the image obtained from the in-vehicle camera 6, switches the in-vehicle camera 6 on and off to capture images of the vehicle interior based on commands from the driver 5, and receives detection data from an illuminance sensor that detects the illuminance inside the vehicle, a temperature sensor that detects the temperature inside the vehicle, and the like, also performing control to create an in-vehicle environment in which clear images of the vehicle interior can be captured.
  • The viewpoint recognition device 31 starts measuring the viewpoint position of the driver 5 based on the image acquired by the in-vehicle camera control device 37.
  • The viewpoint recognition device 31 extracts, by image recognition processing, the viewpoint position of the driver 5 and the pupil positions of the left eye EL and right eye ER from the image captured by the in-vehicle camera 6, in an X, Y, Z three-dimensional coordinate system.
  • Each extracted pupil position is output as coordinate values (x, y, z).
  • The viewpoint recognition device 31 sends the acquired information on the viewpoint position of the driver 5 to the image data processing device 33.
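  • A minimal sketch of how detected pupil pixel positions could be converted into the coordinate values (x, y, z): the pinhole back-projection used here, the camera intrinsics, and all numeric values are assumptions for illustration, not details specified in the embodiment.

```python
def pixel_to_camera_coords(u, v, depth, fx, fy, cx, cy):
    """Back-project a detected pupil pixel (u, v) with an estimated depth
    (in metres) into 3D camera coordinates using a pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Pupil pixels assumed to come from the image recognition step; the focal
# lengths (fx, fy) and principal point (cx, cy) are hypothetical intrinsics.
left_eye = pixel_to_camera_coords(580, 360, 0.75, fx=900.0, fy=900.0, cx=640.0, cy=360.0)
right_eye = pixel_to_camera_coords(700, 360, 0.75, fx=900.0, fy=900.0, cx=640.0, cy=360.0)
```

  • The resulting pair of (x, y, z) values plays the role of the viewpoint position information sent to the image data processing device 33.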
  • The image data processing device 33 creates the images to be projected by the projection units 12 and 24.
  • The image data processing device 33 includes the first image processing unit 8 and the second image processing unit 9.
  • The first image processing unit 8 generates a first image that is seen by one of the left eye EL and right eye ER, based on the range of the image data output from the vehicle exterior cameras 3a and 3b that corresponds to the shielding part 7 blocking the view when looking outside the vehicle from the left eye EL and right eye ER represented by the viewpoint position information.
  • For example, the first image may be the image in the range corresponding to the shielding part 7 that is seen by the right eye ER.
  • The second image processing unit 9 generates a second image that is seen by the other of the left eye EL and right eye ER and has parallax with respect to the first image, based on the image data in the range corresponding to the shielding part 7.
  • For example, the second image may be the image in the range corresponding to the shielding part 7 that is seen by the left eye EL. Even if the first image and the second image contain the same object, since the viewpoints from which the object is viewed differ, the position and shape of the object in the images differ according to the parallax.
  • The range corresponding to the shielding part 7 that can be seen by the left eye EL refers to the range that would be visible to the left eye EL if the shielding part 7 did not exist.
  • The range corresponding to the shielding part 7 that can be seen by the right eye ER refers to the range that would be visible to the right eye ER if the shielding part 7 did not exist.
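  • The "range corresponding to the shielding part 7" for each eye can be sketched in a top view: for a given eye position, the hidden range is bounded by the sight lines through the edges of the shielding part. The following is a simplified 2D illustration with assumed coordinates, not the embodiment's actual computation.

```python
import math

def occluded_interval(eye, edge_a, edge_b):
    """Return the horizontal angular interval (radians, top view) hidden
    behind a shielding part whose two edges are at edge_a and edge_b,
    as seen from the 2D eye position."""
    ang_a = math.atan2(edge_a[1] - eye[1], edge_a[0] - eye[0])
    ang_b = math.atan2(edge_b[1] - eye[1], edge_b[0] - eye[0])
    return (min(ang_a, ang_b), max(ang_a, ang_b))

# The left and right eyes sit at slightly different positions, so the hidden
# intervals differ slightly; that difference is the source of the parallax
# between the first and second images. All coordinates are assumed values.
left = occluded_interval((-0.03, 0.0), (0.4, 1.0), (0.6, 1.0))
right = occluded_interval((0.03, 0.0), (0.4, 1.0), (0.6, 1.0))
```

  • Cropping the exterior camera image to each eye's interval would then yield the first and second images of the ranges hidden by the shielding part.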
  • The image data processing device 33 sends the created images and the information on the left eye EL and right eye ER represented by the viewpoint position information to the display control device 39.
  • The display control device 39 controls the operation of the projection unit 12.
  • The display control device 39 determines the projection devices that are to display the images based on the positions of the left eye EL and right eye ER represented by the viewpoint position information, and causes the determined projection devices to project the images created by the image data processing device 33.
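  • One plausible selection rule is sketched below, under the assumption that the projection device whose position (and hence projection light source line) is closest to the detected eye position gives the most appropriate image for that viewpoint; the device positions and eye coordinates are hypothetical.

```python
def select_device(devices, eye_xy):
    """Choose the projection device whose position is closest to the detected
    eye position; assumes the nearest projection light source line gives the
    most appropriate image for that viewpoint."""
    return min(devices, key=lambda d: (d[0] - eye_xy[0]) ** 2 + (d[1] - eye_xy[1]) ** 2)

# Hypothetical device positions on the first plane (metres) and an eye position.
devices = [(-0.075, 0.0), (-0.025, 0.0), (0.025, 0.0), (0.075, 0.0)]
chosen = select_device(devices, (0.03, 0.10))
```

  • Running the same rule once with the right-eye position and once with the left-eye position would yield the two projection devices used as the right-eye and left-eye projection units described below.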
  • The display device 10 includes a retroreflective screen 11 provided on the shielding part 7, and a diffuser plate 16 laminated on the surface of the retroreflective screen 11 facing the viewer.
  • The projection unit 12 selects one projection device from the plurality of projection devices and projects the first image for the right eye from the selected right-eye projection device onto the retroreflective screen 11. Furthermore, the projection unit 12 selects another projection device from the plurality of projection devices and projects the second image for the left eye from the selected left-eye projection device onto the retroreflective screen 11.
  • The two projection devices selected from the projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d of the first projection unit 100 and the second projection unit 102 serve as a right-eye projection unit 12R that projects the image for the right eye and a left-eye projection unit 12L that projects the image for the left eye.
  • Various combinations of projection devices are possible: for example, both projection devices may be selected from the second projection unit 102, or one projection device may be selected from each of the first projection unit 100 and the second projection unit 102.
  • The right-eye projection unit 12R includes a liquid crystal display device 13R that displays the first image, and a first projection lens 14R that projects the image light of the first image emitted from the liquid crystal display device 13R onto the retroreflective screen 11.
  • The left-eye projection unit 12L includes a liquid crystal display device 13L that displays the second image, and a second projection lens 14L that projects the image light of the second image emitted from the liquid crystal display device 13L onto the retroreflective screen 11.
  • Each of the liquid crystal display devices 13R and 13L includes a transmissive liquid crystal display element and a backlight device that emits light toward the back surface of the liquid crystal display element.
  • Each of the projection lenses 14R and 14L is configured by a combination of a plurality of lenses that image, on the retroreflective screen 11, the first image and the second image emitted from the respective liquid crystal display elements with parallax between them.
  • The retroreflective screen 11 has retroreflectivity and reflects incident light back in the direction of incidence.
  • The image light of the first image and the image light of the second image emitted from the first projection lens 14R and the second projection lens 14L are reflected back by the retroreflective screen 11 toward the first projection lens 14R and the second projection lens 14L, respectively.
  • The image light of the first image and the image light of the second image, which overlap on the retroreflective screen 11, are therefore perceived as separated from each other at the observer's position.
  • The diffuser plate 16 is arranged on the viewer-side surface of the retroreflective screen 11.
  • The diffuser plate 16 has a diffusing power such that the retroreflected light from the retroreflective screen 11 is not returned only toward each of the projection units 12, 24, and 25, but is also directed to the observer's line of sight.
  • The diffuser plate 16 is an anisotropic diffuser plate that has a large diffusing power in the vertical direction and a smaller diffusing power in the horizontal direction than in the vertical direction.
  • The diffuser plate 16 is laminated on the surface of the retroreflective screen 11 facing the viewer.
  • The diffuser plate 16 may be a holographic optical element, and is bonded onto the reflective surface of the retroreflective screen 11.
  • The diffuser plate 16 may be configured to magnify the light from the first projection lens 14R and the second projection lens 14L.
  • The retroreflective screen 11 is made by arranging a plurality of minute glass beads 11a, approximately 20 μm to 100 μm in diameter, on a flat surface and bonding them to a reflective film 11b.
  • The image light projected onto the retroreflective screen 11 enters each glass bead 11a, is refracted at the surface of the glass bead 11a, reaches the back surface of the glass bead 11a on the reflective film 11b side, and is reflected by the reflective film 11b.
  • The light reflected by the reflective film 11b is refracted again at the back surface of the glass bead 11a, exits from the surface of the glass bead 11a, and travels along an optical path parallel to the incident light, separated from the incident path by a minute distance less than the diameter of the glass bead 11a, thus achieving retroreflection.
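  • The bead optics described above can be checked numerically with a simple 2D ray trace (refraction into the bead, reflection at the reflective film, refraction back out). The refractive index n = 2 used below is the idealized value for which a bead focuses incoming light exactly on its back surface; this is an illustrative model, not a parameter from the embodiment.

```python
import math

def _refract(d, n_hat, eta):
    """Refract unit direction d at a surface with unit normal n_hat pointing
    toward the incoming side; eta = n_in / n_out (assumes no total internal
    reflection for the rays traced here)."""
    cos_i = -(d[0] * n_hat[0] + d[1] * n_hat[1])
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    cos_t = math.sqrt(1.0 - sin2_t)
    return (eta * d[0] + (eta * cos_i - cos_t) * n_hat[0],
            eta * d[1] + (eta * cos_i - cos_t) * n_hat[1])

def _far_hit(p, d, R):
    """Far intersection of the ray p + s*d (|d| = 1) with the circle |q| = R."""
    b = p[0] * d[0] + p[1] * d[1]
    c = p[0] * p[0] + p[1] * p[1] - R * R
    s = -b + math.sqrt(max(b * b - c, 0.0))
    return (p[0] + s * d[0], p[1] + s * d[1])

def trace_bead(h, R=1.0, n=2.0):
    """Trace a horizontal ray with impact parameter h through a glass bead of
    radius R backed by a reflective film: refract in, reflect, refract out."""
    p = (-math.sqrt(R * R - h * h), h)               # entry point on the front surface
    d = _refract((1.0, 0.0), (p[0] / R, p[1] / R), 1.0 / n)  # air -> glass
    p = _far_hit(p, d, R)                            # back surface (reflective film)
    n_hat = (p[0] / R, p[1] / R)
    dot = d[0] * n_hat[0] + d[1] * n_hat[1]
    d = (d[0] - 2 * dot * n_hat[0], d[1] - 2 * dot * n_hat[1])  # mirror reflection
    p = _far_hit(p, d, R)                            # exit point on the front surface
    d = _refract(d, (-p[0] / R, -p[1] / R), n)       # glass -> air
    return p, d

exit_point, exit_dir = trace_bead(h=0.1)
```

  • For a near-axis ray the exit direction comes out almost antiparallel to the incident direction, and the exit path is offset from the incident path by less than the bead diameter, matching the retroreflection described above.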
  • The diffuser plate 16 is arranged such that the light is diffused to different degrees in the Y direction (the left-right direction of the driver 5) and the Z direction (the vertical direction of the driver 5).
  • When the image light of the first image and the image light of the second image emitted from the projection lenses 14R and 14L enter the retroreflective screen 11, the light is reflected back in the direction of incidence.
  • A conjugate relationship is thereby established at the projection lenses 14R and 14L, which have the same optical path length, and a clear image can be observed there.
  • Because the diffuser plate 16 is installed on the retroreflective screen 11, the retroreflected light is diffused, so that a conjugate relationship can be established even at locations other than the projection lenses 14R and 14L, and clear images can be obtained regardless of the observation position.
  • The liquid crystal display devices 13R and 13L each include a transmissive liquid crystal display element; the liquid crystal display element modulates the light from the backlight light source and emits the image light of the first image and the image light of the second image to be provided to the left and right eyes EL and ER of the viewer.
  • an LED light emitting display device may be used instead of the liquid crystal display device.
  • the projection lenses 14R and 14L project the image lights of the first image and the second image emitted from the liquid crystal display devices 13R and 13L toward the retroreflective screen 11 to form an image on the retroreflective screen 11.
  • the image formed on the retroreflective screen 11 is an enlarged version of the image displayed on the liquid crystal display devices 13R and 13L, and covers a wide range.
  • the left-eye projection unit 12L is located at a position where its exit pupil is at the same height as and near the observer's left eye EL, for example, on either side of the headrest at the top of the seat back; similarly, the right-eye projection unit 12R is located at a position where its exit pupil is at the same height as and near the observer's right eye ER.
  • the exit pupils of the left-eye projection unit 12L and the right-eye projection unit 12R may be arranged above the position of the observer's eyes, that is, on the ceiling of the vehicle 2; in this case, it is preferable that the anisotropy of the diffusing power of the diffuser plate 16 corresponds to the position of the exit pupils.
  • if the exit pupils are beside the observer's eyes, the diffusion anisotropy of the diffuser plate 16 is made stronger in the left-right direction; if the exit pupils are at a position higher than the observer's eyes, it is made stronger in the vertical direction.
  • the optical axes of the projection lenses 14L and 14R of the left-eye projection unit 12L and the right-eye projection unit 12R are parallel, and the right side pillar display device 10b is preferably arranged perpendicular to these optical axes.
  • the first image for the right eye ER and the second image for the left eye EL displayed on the right side pillar display device 10b are displayed in a partially overlapping state.
  • the retroreflective screen 11 has retroreflectivity and reflects almost all of the incident light in the direction of incidence.
  • the lights projected from the projection lenses 14L and 14R are reflected by the retroreflective screen 11 back toward the projection lenses 14L and 14R, respectively; the image light of the first image for the right eye ER and the image light of the second image for the left eye EL, which overlap on the retroreflective screen 11, are separated at the observer's position and enter the right eye ER and the left eye EL separately, so the driver 5, who is the observer, simultaneously perceives the mixture of the image light of the first image and the image light of the second image as a three-dimensional parallax image.
  • the vehicle length direction is the X axis, the vehicle width direction is the Y axis, and the vehicle height direction is the Z axis.
  • FIG. 7 is a flowchart showing an example of the processing of the image projection system.
  • FIGS. 8 to 12 are schematic diagrams for explaining the operation of the image projection system.
  • the image projection system 1 executes the process shown in FIG. 7 when the power of the vehicle 2 is turned on by a start button or the like and the seating sensor 36 detects that the driver 5 is seated in the driver's seat 4.
  • the image projection system 1 repeatedly executes the process shown in FIG. 7.
  • the control unit 50 controls the operation of each unit and executes the processing shown in FIG. 7.
  • in FIGS. 8 to 12, a case will be described in which an image is projected from the first projection unit 100. Note that, depending on the position of the driver's line of sight, an image may also be projected from the second projection unit 102 using the same process.
  • the control unit 50 acquires information on the projection light source line of each projection device (step S12).
  • the control unit 50 acquires information on the projection light source line based on information on the arrangement of each projection device of the projection unit 12.
  • information on the projection light source line 140a of the projection device 110a, the projection light source line 140b of the projection device 110b, the projection light source line 140c of the projection device 110c, and the projection light source line 140d of the projection device 110d is acquired.
  • the projection light source lines 140a, 140b, 140c, and 140d are arranged on the first plane. Information on the projection light source line can be acquired as preset information.
  • the control unit 50 acquires eyeball coordinates (step S14).
  • the control unit 50 processes the image acquired by the in-vehicle camera 6 with the viewpoint recognition device 31 of the in-vehicle camera control device 37, and detects the positions of the left eye EL and right eye ER of the driver 5.
  • the control unit 50 identifies the closest projection light source line from the eyeball coordinates (step S16).
  • the control unit 50 identifies the projection light source lines closest to the positions of the left eye EL and the right eye ER of the driver 5, based on the information on the projection light source line of each projection device and the positions of the left eye EL and right eye ER of the driver 5.
  • the control unit 50 selects one projection device and determines whether the selected projection device is the one corresponding to the projection light source line closest to an eyeball (step S18). When the control unit 50 determines that the selected projection device does not correspond to the projection light source line closest to an eyeball (No in step S18), the control unit 50 turns off the projection of the image of the selected projection device (step S20) and proceeds to step S28.
  • if the selected projection device corresponds to the closest projection light source line (Yes in step S18), the control unit 50 determines whether to project the right-eye image (step S22). If the selected projection device corresponds to the projection light source line closest to the right eye, the control unit 50 determines that the image is for the right eye (Yes in step S22) and projects the right-eye image (step S24). If the control unit 50 determines that the image is not for the right eye (No in step S22), it projects the left-eye image (step S26). That is, the control unit 50 causes the projection devices corresponding to the projection light source lines closest to the positions of the left eye EL and right eye ER of the driver 5 to project the respective images.
  • the projection device corresponding to the projection light source line refers to a projection device whose projection direction coincides with the projection light source line or is most similar to the projection light source line.
  • the vehicle exterior camera control device 35 operates each vehicle exterior camera 3a, 3b to start imaging the exterior of the vehicle.
  • the in-vehicle camera control device 37 operates the viewpoint recognition device 31 and the in-vehicle camera 6, and the in-vehicle camera 6 starts capturing an image.
  • the viewpoint recognition device 31 extracts the viewpoints of the driver 5, that is, the pupil positions of the left eye EL and the right eye ER, based on the image captured by the in-vehicle camera 6, and calculates the extracted pupil positions as coordinate values (x, y, z) in the X, Y, Z coordinate system.
  • the image data processing device 33 cuts out, from the images captured by the exterior cameras 3, the image to be projected onto the shielding part 7 as seen from the viewpoint positions calculated by the viewpoint recognition device 31, and generates a first image as the image for the right eye ER and a second image as the image for the left eye EL. At this time, when the observer simultaneously views the projected first and second images in a partially overlapping state, the two different images are seen as one, allowing clear three-dimensional perception.
  • the control unit 50 determines whether the determination of the projection device is completed (step S28). When the control unit 50 determines that the determination of the projection devices has not been completed (No in step S28), the control unit 50 returns to step S18 and performs determination of the undetermined projection devices. When the control unit 50 determines that the determination of the projection device has been completed (Yes in step S28), the control unit 50 ends this process.
  • the control unit 50 may process steps S18 to S28 in parallel for each projection device.
  • by selecting the projection devices to use according to the driver's line of sight in the process shown in FIG. 7, the image projection system 1 can project an image corresponding to the position of the driver's eyeballs onto the display device, making it possible to display an image that is integrated with the surrounding scenery.
  • the control unit 50 projects the left-eye image from the projection device 110b, projects the right-eye image from the projection device 110c, and does not project images from the other projection devices.
  • the control unit 50 projects the left-eye image from the projection device 110a, projects the right-eye image from the projection device 110b, and does not project images from the other projection devices. Thereby, when the driver moves his head to the left in FIG. 8, a more appropriate image can be projected by switching the images to be projected.
  • when the left eye EL is closest to the projection light source line 140b and the right eye ER is closest to the projection light source line 140d, the control unit 50 projects the left-eye image from the projection device 110b, projects the right-eye image from the projection device 110d, and does not project images from the other projection devices.
  • the control unit 50 causes the projection device 110d to project the left-eye image and does not project images from the other projection devices.
  • in FIGS. 8 to 12, a case has been described in which the first projection unit 100 projects the images, but it is also possible to use one projection device of the first projection unit 100 to project the image for one eye and one projection device of the second projection unit 102 to project the image for the other eye.
  • images are projected by the projection devices 110a to 110d corresponding to the projection light source lines 140a to 140d closest to the left eye EL and the right eye ER, but the present invention is not limited thereto.
  • the image may be projected by any of the projection devices 110a to 110d corresponding to the projection light source lines 140a to 140d within a predetermined distance from the left eye EL and the right eye ER.
  • the image projection system 1 has a plurality of projection devices arranged in a straight line in each of the first projection unit 100 and the second projection unit 102, with the first straight line 122 and the second straight line 124 set at different angles with respect to the first plane. This allows various combinations of projection devices to be used for projecting images depending on the position of the driver's eyes. Specifically, if projection devices were instead fixed in right-eye/left-eye pairs at multiple positions, an image suited to the driver's position could not be projected once the driver moves away from those paired positions.
  • the image projection system 1 arranges the projection devices on straight lines without fixing whether each projection device is for the right eye or the left eye, and combines the projection devices according to the position of the driver's head, allowing a more appropriate image to be projected. Note that it is preferable to arrange three or more projection devices in one projection unit.
  • in the projection unit 12, the projection devices of the first projection unit 100 and the second projection unit 102 are arranged symmetrically about the intersection 126, so images with little difference can be projected at each position within the covered range.
  • FIG. 13 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • the projection unit 200 shown in FIG. 13 includes a first projection unit 100a and a second projection unit 102a.
  • the angle θ between the first straight line of the first projection unit 100a and the second straight line of the second projection unit 102a is larger than in the above embodiment. In this way, even if the first projection unit 100a and the second projection unit 102a intersect at a different angle, the effects of the above embodiment can be obtained. Furthermore, by increasing the angle θ, the range that can be projected by the projection unit 200 can be widened.
  • the angle θ can be set to a preferable value depending on the angle of view of the projectors. For example, when the angle of view of the projectors is about 57 degrees, it is preferable that the angle θ is 0 degrees or more and 70 degrees or less.
  • FIG. 14 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • the projection unit 300 shown in FIG. 14 includes a first projection unit 302, a second projection unit 304, and a third projection unit 306.
  • the first projection unit 302, the second projection unit 304, and the third projection unit 306 are arranged so that the straight line on which the projection devices are arranged forms a triangle. In other words, the end portion of each projection unit is placed in contact with another projection unit.
  • three or more projection units may be arranged. In this case as well, by arranging the straight lines on which the projection devices of the projection units are placed at different angles on the first plane, each projection unit can project an image to its own area, and the projection devices can be selected depending on the position of the driver's eyes. Note that even when three or more projection units are used, the straight lines of the projection units may be arranged to intersect at one intersection.
  • each functional unit, each means, each step, etc. may be added to other embodiments so long as no logical contradiction arises, and a plurality of functional units, means, steps, etc. may be combined into one or divided. Further, each embodiment of the present disclosure described above is not limited to being implemented faithfully as described; the embodiments may be implemented by combining features or omitting a part as appropriate.
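The device-selection flow of steps S12 to S28 above can be illustrated with a short sketch. This is a minimal Python illustration, not the implementation disclosed here: the helper names and the representation of each projection light source line as an origin point plus unit direction on the first plane are assumptions made for clarity.

```python
def point_line_distance(p, origin, direction):
    """Distance from point p to the line through `origin` with unit
    vector `direction`, both given as (x, y) on the first plane."""
    dx, dy = p[0] - origin[0], p[1] - origin[1]
    # Magnitude of the component of (p - origin) perpendicular to the line.
    return abs(dx * direction[1] - dy * direction[0])

def select_projection_devices(light_source_lines, right_eye, left_eye):
    """Sketch of steps S12-S26: for each eye, pick the device whose
    projection light source line is closest to that eye; every other
    device has its projection turned off.
    `light_source_lines` maps device id -> (origin, unit_direction)."""
    def nearest(eye):
        return min(light_source_lines,
                   key=lambda dev: point_line_distance(eye, *light_source_lines[dev]))
    right_dev = nearest(right_eye)  # line closest to the right eye ER (step S16)
    left_dev = nearest(left_eye)    # line closest to the left eye EL (step S16)
    commands = {}
    for dev in light_source_lines:  # steps S18-S26, one decision per device
        if dev == right_dev:
            commands[dev] = "project right-eye image"  # step S24
        elif dev == left_dev:
            commands[dev] = "project left-eye image"   # step S26
        else:
            commands[dev] = "off"                      # projection turned off
    return commands
```

With four devices 110a to 110d whose light source lines run side by side on the first plane, eyes located between two lines select the two nearest devices, matching the behavior described for FIGS. 8 and 9; the tie-breaking when both eyes are nearest the same line is a choice made here, not specified above.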

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
  • Projection Apparatus (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This image projection system projects a more appropriate image corresponding to the position of the head of a viewer. The image projection system comprises, for example: a first projection unit that has a plurality of projection devices arranged on a straight line; a second projection unit that has a plurality of the projection devices arranged on a straight line at an angle different from that of the first projection unit; a retroreflective screen arranged in a direction in which the first projection unit and the second projection unit perform image projection; and a control device that determines at least two of the projection devices on the basis of the positions of the right eye and the left eye of an observer observing the screen and that causes the determined projection devices to respectively project an image for the right eye and an image for the left eye.

Description

Image projection system
This application relates to an image projection system.
There are projection devices that project an image onto an object. Patent Document 1 describes a projection type display device (projection device) that projects images from a plurality of projectors onto a retroreflective screen. In the device described in Patent Document 1, the plurality of projectors are arranged such that their distances from the retroreflective screen differ from each other. The projection device described in Patent Document 1 can display stereoscopic images to viewers at various distances by using a retroreflective screen and switching the projecting device depending on the viewer's position.
Japanese Patent Application Publication No. 2014-139592
By using the projection device described in Patent Document 1, an image corresponding to the viewer's position can be displayed even when the distance from the screen changes. However, since each projection device projects an image for a viewer at a set position, it cannot cope with a change in the position of the viewer's head at the same position. There is a need to project a more appropriate image according to the position of the viewer's head.
An image projection system according to one aspect includes: a first projection unit having a plurality of projection devices arranged on a straight line; a second projection unit having a plurality of projection devices arranged on a straight line at an angle different from that of the first projection unit; a retroreflective screen disposed in the direction in which the first projection unit and the second projection unit project; and a control device that determines at least two projection devices based on the positions of the right eye and left eye of an observer observing the screen, and causes the determined projection devices to project an image for the right eye and an image for the left eye, respectively.
FIG. 1 is a plan view schematically showing a vehicle equipped with an image projection system. FIG. 2 is a side view schematically showing a vehicle equipped with an image projection system. FIG. 3 is a schematic diagram showing an example of the arrangement of projection devices of the image projection system. FIG. 4 is a block diagram showing a schematic configuration of the image projection system. FIG. 5 is a plan view for explaining the operation of the image projection system. FIG. 6 is a partially enlarged sectional view showing the configuration of the screen and diffuser plate of the image projection system. FIG. 7 is a flowchart showing an example of the processing of the image projection system. FIGS. 8 to 12 are schematic diagrams for explaining the operation of the image projection system. FIG. 13 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system. FIG. 14 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
A plurality of embodiments for implementing the image projection system according to the present application will be described in detail with reference to the drawings. Note that the present application is not limited by the following description. Furthermore, the constituent elements in the following description include those that can be easily assumed by those skilled in the art, those that are substantially the same, and those within a so-called range of equivalents. In the following description, similar components may be denoted by the same reference numerals, and duplicate explanations may be omitted.
FIG. 1 is a plan view schematically showing a vehicle equipped with an image projection system. FIG. 2 is a side view schematically showing a vehicle equipped with an image projection system. FIG. 3 is a schematic diagram showing an example of the arrangement of projection devices of the image projection system. FIG. 4 is a block diagram showing a schematic configuration of the image projection system. FIG. 5 is a plan view for explaining the operation of the image projection system. FIG. 6 is a partially enlarged sectional view showing the configuration of the screen and diffuser plate of the image projection system.
In this embodiment, a case will be described in which the image projection system 1 is mounted on a vehicle 2. The vehicle 2 is, for example, a passenger car, and has a plurality of seats inside, in which a driver, fellow passengers, and the like ride. The vehicle 2 includes a driver's seat 4, a dashboard, doors, pillars, a back seat, and the like. The image projection system 1 projects an image of the vehicle's surroundings onto the shielding parts within the driver's field of view, so that the driver can see the image of the surroundings as if the shielding parts were not there. Note that although this embodiment describes the case of displaying images for the driver 5, images can similarly be projected for a passenger sitting in a seat other than the driver's seat 4.
The image projection system 1 of the present embodiment includes: external cameras 3a and 3b, which are imaging units that capture the scenery around the vehicle 2 and output the image data obtained by imaging; an in-vehicle camera 6, which is a viewpoint detection unit that detects the positions of the left eye EL and the right eye ER, which are the viewpoints of the driver 5, an observer seated in a seat of the vehicle 2, for example the driver's seat 4, and outputs viewpoint position information representing the detected positions of the left eye EL and right eye ER in spatial coordinates; a first image processing unit 8 that generates a first image to be seen by either the left eye EL or the right eye ER, based on the range of the image data output from the front exterior camera 3a and the rear exterior camera 3b that corresponds to the shielding part 7, which blocks the view outside the vehicle as seen from the left eye EL and right eye ER represented by the viewpoint position information; a second image processing unit 9 that generates, based on the range of the image data corresponding to the shielding part 7, a second image that is perceived by the other of the left eye EL and the right eye ER and has parallax with respect to the first image; a display device 10, which is an image display unit provided in the vehicle 2 that displays a parallax image including the first image and the second image on the shielding part 7; and projection units 12 and 24 that project images onto the display device 10.
The external cameras 3a and 3b are imaging units that capture the scenery around the vehicle 2 and output the image data obtained by imaging. The front exterior camera 3a is installed at the front end of the vehicle 2. The rear exterior camera 3b is installed at the rear of the vehicle 2. Various types of digital cameras can be used as the external cameras 3a and 3b, such as a CCD (Charge Coupled Devices) camera equipped with a fisheye lens or a digital camera using a CMOS (Complementary Metal Oxide Semiconductor) IC. By using fisheye lenses, the external cameras 3a and 3b can capture the entire solid-angle range of the scenery ahead of the vehicle 2 in the direction of travel, based on the mounting position of the camera on the vehicle 2. Each external camera 3a, 3b is preferably equipped with a sensor that detects the environment outside the vehicle (temperature, humidity, etc.) and has functions for managing its own optical system and CCD temperature, defogging its cover, and the like, so that clear images can be obtained. The image projection system 1 may also be provided with a plurality of external cameras 3a and 3b. For example, a pair of external cameras 3a may be provided at diagonal positions at the front end of the vehicle, and a pair of external cameras 3b may be provided at diagonal positions at the rear end of the vehicle. This allows clear images of the front and rear of the vehicle to be captured. Note that the external cameras 3a and 3b only need to be able to acquire images of the outside of the vehicle 2, and are not limited to being disposed outside the vehicle; they may be placed inside the vehicle. For example, the external camera 3a may be located inside the windshield of the vehicle, or within the hood. Further, in this embodiment, the external cameras 3a and 3b are provided to obtain images of both the front and rear of the vehicle 2, but it is also possible to provide only the external camera 3a and obtain images of only the front of the vehicle 2.
The driver's seat 4 is equipped with a seating sensor 36 that detects whether the driver 5 is seated. The seating sensor 36 is constituted by a known load sensor or limit switch. When the driver 5 sits in the driver's seat 4, the seating sensor 36 installed in the driver's seat 4 detects that the driver 5 is seated.
The in-vehicle camera 6 is installed at a position inside the vehicle where it can image the driver 5, for example, at a position adjacent to the rearview mirror. For example, a CCD camera can be used as the in-vehicle camera 6.
The shielding part 7 is an object among the structures of the vehicle 2 that blocks the driver 5's view and prevents visual recognition of the outside of the vehicle 2. The shielding parts 7 in the vehicle 2 include the dashboard, doors, pillars, back seat, and the like.
The display device 10 displays an image projected from the projection unit 12. The display device 10 includes a dashboard display device 10a disposed on the dashboard, a right side pillar display device 10b disposed on the right side pillar, a left side pillar display device 10c disposed on the left side pillar, and a backseat display device 10d disposed on the backseat 23 of the rear seat 22. The display device 10 is arranged along the shape of the shielding part 7 on a surface facing the vehicle interior space.
The projection unit 12 projects images toward the display devices 10a, 10b, and 10c arranged forward of the driver's seat 4. As shown in FIG. 3, the projection unit 12 includes a first projection unit 100 and a second projection unit 102. The first projection unit 100 is arranged with an angle of view capable of projecting an image onto the display device 10b on the right pillar, and the second projection unit 102 is arranged with an angle of view capable of projecting an image onto the display device 10c on the left pillar. The first projection unit 100 and the second projection unit 102 can also project images onto the display device 10a arranged on the dashboard. Their projectable areas differ from each other. The first projection unit 100 includes projection devices 110a, 110b, 110c, and 110d, each of which projects an image toward the display device 10. The projection devices 110a, 110b, 110c, and 110d are arranged, for example, in a row on a first straight line 122 on a virtual first plane. The projection light source line of each projection device, which is the central axis of the image it projects, is at a different position depending on the arrangement interval. That is, the projection devices 110a, 110b, 110c, and 110d project images onto different areas of the display device 10. Note that although the projection areas of the projection devices 110a, 110b, 110c, and 110d shift according to their projection light source lines, the projection areas partially overlap one another. The first plane is a plane parallel to the projection light source lines; in this embodiment, it is the plane seen when the vehicle 2 is viewed from above.
 The second projection unit 102 includes projection devices 112a, 112b, 112c, and 112d, each of which projects an image toward the display device 10. The projection devices 112a, 112b, 112c, and 112d are arranged, for example, in a row along a second straight line 124 on the virtual first plane. Because of their spacing, the projection light source lines of the projection devices 112a, 112b, 112c, and 112d, each of which is the central axis of the projected image, lie at different positions; that is, the projection devices 112a, 112b, 112c, and 112d project images onto different areas of the display device 10. Although the projection areas 130 of the projection devices 112a, 112b, 112c, and 112d are shifted according to their projection light source lines, the projection areas 130 partially overlap one another.
 In the first plane, the second straight line 124 intersects the first straight line 122 at an intersection 126, which is the center of both the first projection unit 100 and the second projection unit 102. The second straight line 124 runs in a different direction from the first straight line 122 within the first plane, the two lines forming an angle θ; that is, the first projection unit 100 and the second projection unit 102 are arranged in an X shape. The first projection unit 100 is oriented so that its projection light source lines are inclined, from the direction parallel to the traveling direction of the vehicle 2, toward the area including the display device 10b arranged on the pillar. The second projection unit 102 is oriented so that its projection light source lines are inclined, from the direction parallel to the traveling direction of the vehicle 2, toward the area including the display device 10c arranged on the pillar (the passenger seat side). The projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d may be placed at different positions in the direction orthogonal to the first plane, that is, stacked in the direction orthogonal to the first plane. The second projection unit 102 may instead be arranged on a second plane orthogonal to the first plane. The configuration of each of the projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d and of the display device 10 will be described later.
 The projection unit 24 projects an image toward the backseat display device 10d. The projection unit 24 is configured in the same manner as the projection unit 12 described above, and projects onto the display device 10d the portion of the image data captured by the rear exterior camera 3b that corresponds to the range shielded by the backseat.
 In the image projection system 1, the control device 50 controls the operation of each component. The control device 50 is connected to each component of the image projection system 1 and controls it. The control device 50 is realized by a processor, such as an electronic control unit (ECU), as a hardware resource and a computer-readable program as a software resource. The control device 50 may include one or more processors. The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an application-specific integrated circuit (ASIC). The processors may include a programmable logic device (PLD), and the PLD may include a field-programmable gate array (FPGA). The control device 50 may be an SoC (System-on-a-Chip) or an SiP (System in a Package) in which one or more processors cooperate. The control device 50 may include a storage unit that stores various information and programs for operating each component of the image projection system 1. The storage unit may be composed of, for example, a semiconductor memory, and may function as a storage area used temporarily during data processing by the control device 50.
 The control device 50 includes a viewpoint recognition device 31, an image data processing device 33, a vehicle exterior camera control device 35, a vehicle interior camera control device 37, and a display control device 39.
 The vehicle exterior camera control device 35 controls the operation of the exterior cameras 3a and 3b and acquires image data of the images captured by them. The vehicle exterior camera control device 35 receives analog image data from each of the exterior cameras 3a and 3b, converts it into digital data, and sends it to the image data processing device 33.
 The vehicle interior camera control device 37 sharpens the image obtained from the in-vehicle camera 6 and, in response to a command from the driver 5, switches on and off the imaging of the vehicle interior by the in-vehicle camera 6. Detection data from sensors such as an illuminance sensor that detects the illuminance inside the vehicle and a temperature sensor that detects the temperature inside the vehicle are also input to the vehicle interior camera control device 37, which additionally performs control to create an in-vehicle environment in which clear images of the interior can be captured.
 The viewpoint recognition device 31 starts measuring the viewpoint position of the driver 5 based on the image acquired via the vehicle interior camera control device 37. In a three-dimensional X, Y, Z coordinate system, the viewpoint recognition device 31 extracts the pupil positions of the left eye EL and right eye ER of the driver 5 from the image captured by the in-vehicle camera 6 by image recognition processing, and outputs the extracted pupil positions as coordinate values (x, y, z). The viewpoint recognition device 31 sends the acquired information on the viewpoint position of the driver 5 to the image data processing device 33.
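The embodiment does not specify how the pupil pixel positions become 3D coordinates, but the conversion can be sketched as follows. This is a minimal illustration assuming a pinhole camera model, a known interpupillary distance used to recover depth, and an arbitrary camera pose in the vehicle frame; none of these numeric values are taken from the embodiment.

```python
# Hedged sketch: converting detected pupil positions in the in-vehicle camera
# image into vehicle coordinates (x, y, z), in the spirit of the viewpoint
# recognition device 31. Focal length, interpupillary distance, and camera
# pose are illustrative assumptions.

def pupils_to_vehicle_coords(left_px, right_px, f_px=1000.0,
                             ipd_m=0.063, cam_pos=(1.0, 0.0, 1.2)):
    """Estimate 3D eye positions from the pixel coordinates of both pupils.

    left_px, right_px: (u, v) pupil centers in pixels, origin at the
                       principal point.
    f_px:    assumed focal length in pixels.
    ipd_m:   assumed interpupillary distance in meters (gives scale).
    cam_pos: assumed camera position in the vehicle frame
             (X: length, Y: width, Z: height), looking rearward along -X.
    """
    # Pixel separation of the two pupils fixes the scale: depth = f * IPD / du.
    du = abs(left_px[0] - right_px[0])
    depth = f_px * ipd_m / du  # distance from camera to the eyes

    def back_project(uv):
        # Back-project a pixel through the pinhole model at the estimated depth.
        cam_x = uv[0] / f_px * depth   # lateral offset in the camera frame
        cam_y = uv[1] / f_px * depth   # vertical offset in the camera frame
        # Map camera axes onto the vehicle's X (length), Y (width), Z (height).
        return (cam_pos[0] - depth,
                cam_pos[1] + cam_x,
                cam_pos[2] - cam_y)

    return back_project(left_px), back_project(right_px)
```

With symmetric pupil detections the sketch places both eyes at the same depth, offset laterally by half the interpupillary distance on each side, which is the form of (x, y, z) output the text describes.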
 The image data processing device 33 creates the images to be projected by the projection units 12 and 24. The image data processing device 33 includes a first image processing unit 8 and a second image processing unit 9. From the image data output by the exterior cameras 3a and 3b, the first image processing unit 8 generates a first image to be seen by either the left eye EL or the right eye ER, based on the image data of the range corresponding to the shielding part 7 that blocks the view when looking outside the vehicle from the left eye EL and right eye ER indicated by the viewpoint position information. For example, the first image may be the image of the range corresponding to the shielding part 7 as seen by the right eye ER. The second image processing unit 9 generates, based on the image data of the range corresponding to the shielding part 7, a second image that is perceived by the other of the left eye EL and the right eye ER and has a parallax with respect to the first image. For example, the second image may be the image of the range corresponding to the shielding part 7 as seen by the left eye EL. Even when the first image and the second image contain the same object, the viewpoints from which the object is seen differ, so the position and shape of the object in the two images differ according to the parallax. The range corresponding to the shielding part 7 as seen by the left eye EL refers to the range that would be visible to the left eye EL if the shielding part 7 were absent; similarly, the range corresponding to the shielding part 7 as seen by the right eye ER refers to the range that would be visible to the right eye ER if the shielding part 7 were absent. The image data processing device 33 sends the created images and the information on the left eye EL and right eye ER indicated by the viewpoint position information to the display control device 39.
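The per-eye cropping described above can be reduced to similar-triangle geometry: the edges of the shielding part 7 are projected from the eye outward onto the scene, and the window between the projected edges is what that eye would see if the shield were absent. The sketch below is a one-dimensional, flat-scene simplification with illustrative distances; the actual embodiment works from the exterior camera image data rather than from scene coordinates.

```python
# Hedged sketch: for one eye, compute the lateral range of the outside scene
# hidden by the shielding part 7 (the region that eye would see if the pillar
# were absent). Flat-scene approximation; all numbers are illustrative.

def occluded_window(eye_y, shield_y_range, scene_dist, shield_dist):
    """Return the lateral (y_min, y_max) range of the scene hidden from the eye.

    eye_y:          lateral eye position in meters (vehicle Y axis).
    shield_y_range: (y_min, y_max) lateral extent of the shielding part 7.
    scene_dist:     assumed distance from the eye to the outside scene.
    shield_dist:    distance from the eye to the shielding part 7.
    """
    scale = scene_dist / shield_dist  # similar triangles: project edges outward
    y0 = eye_y + (shield_y_range[0] - eye_y) * scale
    y1 = eye_y + (shield_y_range[1] - eye_y) * scale
    return (y0, y1)

# Because the left eye EL and right eye ER sit at different positions, the two
# windows differ; that difference is the parallax between the first image and
# the second image.
left_window = occluded_window(-0.03, (0.4, 0.6), scene_dist=10.0, shield_dist=1.0)
right_window = occluded_window(0.03, (0.4, 0.6), scene_dist=10.0, shield_dist=1.0)
```

The two windows overlap but are mutually shifted, matching the statement that the same object appears at a different position in the first and second images.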
 The display control device 39 controls the operation of the projection unit 12. Based on the left eye EL and right eye ER positions indicated by the viewpoint position information, the display control device 39 determines which projection devices are to display the images, and causes the determined projection devices to project the images created by the image data processing device 33.
 Next, image projection by the display device 10 and the projection unit 12 will be described with reference to FIGS. 5 and 6. The display device 10 includes a retroreflective screen 11 provided on the shielding part 7 and a diffuser plate 16 laminated on the surface of the retroreflective screen 11 facing the observer. The projection unit 12 selects one projection device from among the plurality of projection devices and projects the first image for the right eye from the selected right-eye projection device onto the retroreflective screen 11. The projection unit 12 likewise selects one projection device from among the plurality of projection devices and projects the second image for the left eye from the selected left-eye projection device onto the retroreflective screen 11.
 In the projection unit 12, two projection devices selected from the projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d of the first projection unit 100 and the second projection unit 102 serve as a right-eye projection unit 12R that projects the image for the right eye and a left-eye projection unit 12L that projects the image for the left eye. As will be described later, the right-eye projection unit 12R and the left-eye projection unit 12L may be formed by various combinations of projection devices: both may be selected from the first projection unit 100, both may be selected from the second projection unit 102, or one may be selected from each of the first projection unit 100 and the second projection unit 102. The right-eye projection unit 12R includes a liquid crystal display device 13R that displays the first image and a first projection lens 14R that projects the image light of the first image emitted from the liquid crystal display device 13R onto the retroreflective screen 11. The left-eye projection unit 12L includes a liquid crystal display device 13L that displays the second image and a second projection lens 14L that projects the image light of the second image emitted from the liquid crystal display device 13L onto the retroreflective screen 11. Each of the liquid crystal display devices 13R and 13L includes a transmissive liquid crystal display element and a backlight device that emits light onto the back surface of the liquid crystal display element. Each of the projection lenses 14R and 14L is configured as a combination of a plurality of lenses that causes the first image and the second image emitted from the respective liquid crystal display elements to be formed on the retroreflective screen 11 with a parallax between them.
 The retroreflective screen 11 has retroreflectivity and reflects all of the incident light back in the direction of incidence. The image light of the first image and the image light of the second image emitted from the first projection lens 14R and the second projection lens 14L is reflected by the retroreflective screen 11 back toward the first projection lens 14R and the second projection lens 14L, so the image light of the first image and the image light of the second image, which overlap on the retroreflective screen 11, are perceived separately at the observer's position. A diffuser plate 16 is arranged on the observer-side surface of the retroreflective screen 11. Rather than letting the retroreflective screen 11 return the light straight back to the projection units 12 and 24, the diffuser plate 16 is preferably given a diffusing capability so as to reflect the light toward the observer's line of sight. For example, when the projection units 12 and 24 are located above the observer, the diffuser plate 16 is preferably an anisotropic diffuser whose diffusing power is large in the vertical direction and smaller in the horizontal direction than in the vertical direction. By keeping the horizontal diffusing power small, the image for the right eye does not enter the left eye, the images are not confused, and a clear stereoscopic image can be seen.
 A diffuser plate 16 is laminated on the observer-facing surface of the retroreflective screen 11. The diffuser plate 16 may be a holographic optical element and is bonded onto the reflective surface of the retroreflective screen 11. The diffuser plate 16 may be configured to spread the light from the first projection lens 14R and the second projection lens 14L. The retroreflective screen 11 is formed by arranging a large number of minute glass beads 11a, each with a diameter of about 20 μm to 100 μm, in a plane and fixing them to a reflective film 11b. The image light projected onto the retroreflective screen 11 enters each glass bead 11a, is refracted at the surface of the glass bead 11a, reaches the back of the glass bead 11a on the reflective film 11b side, and is reflected by the reflective film 11b. The light reflected by the reflective film 11b is refracted again at the back of the glass bead 11a, reaches the surface of the glass bead 11a, and then travels along an optical path parallel to the incident light, separated from the incident path by a minute distance no larger than the diameter of the glass bead 11a; retroreflection is realized in this way.
 The diffuser plate 16 is arranged so that its light-diffusing power differs between the Y direction (the left-right direction of the driver 5) and the Z direction (the vertical direction of the driver 5). When the image light of the first image and the image light of the second image emitted from the projection lenses 14R and 14L enters the retroreflective screen 11, the light is emitted back in the direction of incidence. A conjugate relationship then holds at the projection lenses 14R and 14L, where the optical path lengths are equal, and a clear image can be observed there. When the diffuser plate 16 is placed on the retroreflective screen 11, the retroreflected light is diffused, so the conjugate relationship can also be established at locations other than the projection lenses 14R and 14L, and a clear image can be obtained at the observer's position as well.
 The liquid crystal display devices 13R and 13L include transmissive liquid crystal display elements, which deflect the light from the backlight sources and emit the image light of the first image and the image light of the second image to be provided to the observer's left and right eyes EL and ER. By giving a parallax to the two images displayed on the liquid crystal display devices 13R and 13L, a parallax image with a strong three-dimensional effect can be provided. An LED light-emitting display device may be used instead of the liquid crystal display device.
 The projection lenses 14R and 14L project the image light of the first image and the second image emitted from the liquid crystal display devices 13R and 13L toward the retroreflective screen 11 and form images on it. The image formed on the retroreflective screen 11 is an enlargement of the image displayed on the liquid crystal display devices 13R and 13L and covers a wide range. The left-eye projection unit 12L is arranged at a position where its exit pupil is at the same height as, and in the vicinity of, the observer's left eye EL, for example on either side of the headrest at the top of the backseat; similarly, the right-eye projection unit 12R is arranged at a position where its exit pupil is at the same height as, and in the vicinity of, the observer's right eye ER. The observer thus observes the image on the right side pillar display device 10b from the direction in which the light is incident on the retroreflective screen 11. The exit pupils of the left-eye projection unit 12L and the right-eye projection unit 12R may also be arranged above the position of the observer's eyes, that is, on the ceiling of the vehicle 2. In this case, the anisotropy of the diffusing power of the diffuser plate 16 should correspond to the position of the exit pupils: if the exit pupils are at the same height as the observer's eyes, the diffusion anisotropy of the diffuser plate 16 should be stronger in the left-right direction, and if the exit pupils are higher than the observer's eyes, it should be stronger in the vertical direction.
 The optical axes of the projection lenses 14L and 14R of the left-eye projection unit 12L and the right-eye projection unit 12R are parallel, and the right side pillar display device 10b is preferably arranged perpendicular to the optical axes of the projection lenses 14L and 14R. The first image for the right eye ER and the second image for the left eye EL are displayed on the right side pillar display device 10b in a partially overlapping state.
 The retroreflective screen 11 has retroreflectivity and reflects almost all of the incident light back in the direction of incidence. The light projected from the projection lenses 14L and 14R is reflected by the retroreflective screen 11 back toward the projection lenses 14L and 14R, respectively, so the image light of the first image for the right eye ER and the image light of the second image for the left eye EL, which overlap on the retroreflective screen 11, separate at the observer's position and enter the right eye ER and left eye EL individually. The driver 5, as the observer, can thus perceive the mixture of the image light of the first image and the image light of the second image as a stereoscopic parallax image.
 To make the scenery outside the vehicle actually seen through the windshield and rear window glass coincide with the images displayed at the positions of the dashboard display device 10a, the right side pillar display device 10b, the left side pillar display device 10c, and the backseat display device 10d provided on the respective shielding parts 7, in this embodiment the viewpoint position of the driver 5, that is, the positions of the left eye EL and right eye ER, is obtained as coordinates (x, y, z) on three orthogonal axes X, Y, and Z relative to the vehicle body, from the image captured by the in-vehicle camera 6 serving as the viewpoint detection unit. In this embodiment, the vehicle length direction is the X axis, the vehicle width direction is the Y axis, and the vehicle height direction is the Z axis. With this configuration, since the actual viewpoint of the driver 5 serves as the reference, the scenery actually seen through the windshield and rear window glass is made continuous with the images displayed at the positions of the display devices 10a, 10b, 10c, and 10d provided on the respective shielding parts 7, so the scenery outside the vehicle can be perceived as if the shielding parts were transparent, without any sense of incongruity. In addition, by detecting the driver's viewpoint position, the display can flexibly follow differences in the driver's build and posture. As a result, the blind spots of the driver 5 can be effectively reduced without causing discomfort, danger can be detected early, and comfortable driving can be supported.
 FIG. 7 is a flowchart showing an example of the processing of the image projection system. FIGS. 8 to 12 are schematic diagrams for explaining the operation of the image projection system. When driving the vehicle 2, the driver 5 sits in the driver's seat 4 and inputs an operation to start driving the vehicle with a start button or the like. The image projection system 1 executes the processing shown in FIG. 7 when the vehicle's power is switched on with the start button or the like and the seating sensor 36 detects that the driver 5 is seated in the driver's seat 4. The image projection system 1 repeatedly executes the processing shown in FIG. 7, with the control unit 50 controlling the operation of each part. FIGS. 8 to 12 illustrate the case where images are projected from the first projection unit 100; depending on the position of the driver's line of sight, images may also be projected from the second projection unit 102 by the same processing.
 The control unit 50 acquires information on the projection light source line of each projection device (step S12), based on information on the arrangement of the projection devices in the projection unit 12. As shown in FIG. 8, it thereby acquires information on the projection light source line 140a of the projection device 110a, the projection light source line 140b of the projection device 110b, the projection light source line 140c of the projection device 110c, and the projection light source line 140d of the projection device 110d. The projection light source lines 140a, 140b, 140c, and 140d lie on the first plane. The information on the projection light source lines can be acquired as preset information.
 The control unit 50 acquires the eyeball coordinates (step S14). The control unit 50 processes the image acquired by the in-vehicle camera 6 with the viewpoint recognition device 31 and detects the positions of the left eye EL and right eye ER of the driver 5.
 The control unit 50 identifies the projection light source line nearest to each eyeball coordinate (step S16). Based on the information on the projection light source line of each projection device and the positions of the left eye EL and right eye ER of the driver 5, the control unit 50 identifies the projection light source line nearest to each of the positions of the left eye EL and right eye ER.
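Step S16 amounts to a nearest-line search in the first plane. The sketch below models each projection light source line as a point plus a unit direction and picks the line with the smallest point-to-line distance for each eye; the line spacing and eye positions are illustrative assumptions, not values from the embodiment.

```python
# Hedged sketch of step S16: for each eye, find the nearest projection light
# source line. Lines are modeled as (point, unit_direction) pairs in the first
# plane; sample coordinates are illustrative assumptions.
import math

def point_line_distance(p, line_point, line_dir):
    """Distance from point p to the line through line_point with unit
    direction line_dir (2D coordinates in the first plane)."""
    dx, dy = p[0] - line_point[0], p[1] - line_point[1]
    # Reject the component of (p - line_point) along the line direction.
    along = dx * line_dir[0] + dy * line_dir[1]
    perp_x, perp_y = dx - along * line_dir[0], dy - along * line_dir[1]
    return math.hypot(perp_x, perp_y)

def nearest_light_source_line(eye, lines):
    """Return the index of the projection light source line nearest the eye.

    lines: list of (line_point, unit_direction) pairs, one per projection
    device (e.g. lines 140a-140d for projection devices 110a-110d).
    """
    return min(range(len(lines)),
               key=lambda i: point_line_distance(eye, lines[i][0], lines[i][1]))

# Four parallel lines spaced 0.06 m apart; the eyes sit 0.01 m from two of them.
lines = [((0.0, i * 0.06), (1.0, 0.0)) for i in range(4)]
right_idx = nearest_light_source_line((0.5, 0.07), lines)
left_idx = nearest_light_source_line((0.5, 0.13), lines)
```

Each eye thus selects a different projection device, which is the precondition for the right-eye/left-eye assignment in steps S18 to S26.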
 The control unit 50 selects one projection device and determines whether the selected projection device corresponds to the projection light source line nearest to an eyeball (step S18). If the control unit 50 determines that the selected projection device does not correspond to the projection light source line nearest to either eyeball (No in step S18), it proceeds to step S20 and turns off image projection from the selected projection device.
 If the control unit 50 determines that the selected projection device corresponds to the projection light source line nearest to an eyeball (Yes in step S18), it determines whether the right-eye image is to be projected (step S22). If the selected projection device corresponds to the projection light source line nearest to the right eye, the control unit 50 determines that the image is for the right eye (Yes in step S22) and projects the right-eye image (step S24). If the control unit 50 determines that the image is not for the right eye (No in step S22), it projects the left-eye image (step S26). That is, the control unit 50 causes the projection devices corresponding to the projection light source lines nearest to the positions of the left eye EL and right eye ER of the driver 5 to project the images. The projection device corresponding to a projection light source line is the projection device whose projection direction coincides with, or most closely approximates, that projection light source line.
 Here, the image to be projected is created by the following process. The vehicle exterior camera control device 35 operates the exterior cameras 3a and 3b to start imaging the exterior of the vehicle. Next, the vehicle interior camera control device 37 operates the viewpoint recognition device 31 and the interior camera 6, and the interior camera 6 starts capturing images. Based on the image captured by the interior camera 6, the viewpoint recognition device 31 extracts the viewpoints of the driver 5, that is, the pupil positions of the left eye EL and the right eye ER, and calculates the extracted pupil positions as coordinate values (x, y, z) in the X, Y, Z coordinate system. The image data processing device 33 cuts out, from the images captured by the exterior cameras 3, the image to be projected onto the shielding unit 7 for the viewpoint position calculated by the viewpoint recognition device 31, and generates a first image as the image for the right eye ER and a second image as the image for the left eye EL. When the observer simultaneously views the projected first and second images in a partially overlapping state, the two different images are fused into one, so that the scene can be perceived clearly and stereoscopically.
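The cut-out of the first (right-eye) and second (left-eye) images can be sketched as a viewpoint-dependent crop of the exterior camera frame. This is a deliberately simplified model: the window is merely shifted opposite to the pupil offset, whereas the actual device would project the pupil position through the shielding unit 7 onto the exterior image. The scale factor `px_per_m` and all names are illustrative assumptions.

```python
def crop_for_eye(frame, pupil_xy, crop_w, crop_h, px_per_m=500.0):
    """Cut a crop_w x crop_h window out of `frame` (a list of rows),
    shifted opposite to the pupil offset so that the window approximates
    what that eye would see through the shield (parallel-shift model)."""
    h, w = len(frame), len(frame[0])
    cx = w // 2 - int(pupil_xy[0] * px_per_m)
    cy = h // 2 - int(pupil_xy[1] * px_per_m)
    x0 = min(max(0, cx - crop_w // 2), w - crop_w)
    y0 = min(max(0, cy - crop_h // 2), h - crop_h)
    return [row[x0:x0 + crop_w] for row in frame[y0:y0 + crop_h]]

def stereo_pair(frame, left_pupil, right_pupil, crop_w, crop_h):
    """First image (right eye ER) and second image (left eye EL); the two
    crops partially overlap, which is what allows fused stereoscopic viewing."""
    first = crop_for_eye(frame, right_pupil, crop_w, crop_h)
    second = crop_for_eye(frame, left_pupil, crop_w, crop_h)
    return first, second
```

Because the two pupils are a few centimetres apart, the two crops differ by a small horizontal offset while sharing most of their content, which is the overlap condition described above.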
 After performing the processing in step S20, S24, or S26, the control unit 50 determines whether all projection devices have been evaluated (step S28). If the control unit 50 determines that evaluation of the projection devices has not been completed (No in step S28), it returns to step S18 and evaluates a projection device that has not yet been evaluated. If the control unit 50 determines that evaluation of the projection devices has been completed (Yes in step S28), it ends this process. The control unit 50 may perform the processing of steps S18 to S28 in parallel for each projection device.
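Assuming the nearest-device index for each eye is already known, the loop of steps S18–S28 reduces to a three-way assignment per device. The dictionary representation and names below are illustrative, not from the patent; note that this sketch applies the right-eye test of step S22 first, so a device matched by both eyes would receive the right-eye image.

```python
def assign_images(num_devices, left_idx, right_idx):
    """Steps S18-S28 for all devices: each device projects the right-eye
    image, the left-eye image, or nothing."""
    assignment = {}
    for i in range(num_devices):           # step S18: select each device in turn
        if i not in (left_idx, right_idx):
            assignment[i] = "off"          # step S20: not nearest to either eye
        elif i == right_idx:
            assignment[i] = "right"        # steps S22 / S24
        else:
            assignment[i] = "left"         # step S26
    return assignment
```

For the arrangement of FIG. 8 (left eye nearest to device 110b, right eye to 110c), `assign_images(4, 1, 2)` switches on exactly those two devices and turns the others off.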
 Through the process of FIG. 7, the image projection system 1 selects the projection devices that project images in accordance with the driver's line of sight. The system can thereby project, onto the display device, images that match the positions of the driver's eyeballs, and can display an image that blends into the surrounding scenery.
 As shown in FIG. 8, when the left eye EL is closest to the projection light source line 140b and the right eye ER is closest to the projection light source line 140c, the control unit 50 projects the left-eye image from the projection device 110b and the right-eye image from the projection device 110c, and does not project images from the other projection devices.
 As shown in FIG. 9, when the left eye EL is closest to the projection light source line 140a and the right eye ER is closest to the projection light source line 140b, the control unit 50 projects the left-eye image from the projection device 110a and the right-eye image from the projection device 110b, and does not project images from the other projection devices. Thus, when the driver moves his or her head to the left from the state of FIG. 8, a more appropriate image can be projected by switching the projected images.
 As shown in FIG. 10, when the driver 5 has moved closer to the display device 10 than the first projection unit 100 is, and the left eye EL is closest to the projection light source line 140b and the right eye ER is closest to the projection light source line 140d, the control unit 50 projects the left-eye image from the projection device 110b and the right-eye image from the projection device 110d, and does not project images from the other projection devices.
 As shown in FIG. 11, when the driver 5 is farther from the display device 10 than the first projection unit 100 is, and the left eye EL is closest to the projection light source line 140b and the right eye ER is closest to the projection light source line 140d, the control unit 50 projects the left-eye image from the projection device 110b and the right-eye image from the projection device 110d, and does not project images from the other projection devices.
 As shown in FIG. 12, when the driver 5 has moved his or her head toward the vehicle exterior side beyond the first projection unit 100, and both the left eye EL and the right eye ER are closest to the projection light source line 140d, the control unit 50 projects the left-eye image from the projection device 110d and does not project images from the other projection devices.
 FIGS. 8 to 12 describe cases where the first projection unit 100 projects the images, but one projection device of the first projection unit 100 may project the image for one eye while one projection device of the second projection unit 102 projects the image for the other eye. Furthermore, in FIGS. 8 to 12, the images are projected by the projection devices 110a to 110d corresponding to the projection light source lines 140a to 140d closest to the left eye EL and the right eye ER, but the present disclosure is not limited to this. For example, the images may be projected by any of the projection devices 110a to 110d corresponding to projection light source lines 140a to 140d within a predetermined distance of the left eye EL and the right eye ER.
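The predetermined-distance variant replaces the single nearest-line choice with a threshold filter. The sketch below uses the same assumed (origin, direction) line representation as before, restated here so the snippet stands alone; the threshold value and names are illustrative.

```python
import math

def dist_to_line(point, origin, direction):
    """Perpendicular distance from a 3D point to an infinite line."""
    n = math.sqrt(sum(c * c for c in direction))
    d = [c / n for c in direction]
    v = [p - o for p, o in zip(point, origin)]
    t = sum(vi * di for vi, di in zip(v, d))
    return math.sqrt(sum((vi - t * di) ** 2 for vi, di in zip(v, d)))

def candidate_devices(eye_pos, source_lines, max_dist):
    """All device indices whose projection light source line lies within
    max_dist of the eye -- any of these may be chosen to project that
    eye's image, per the predetermined-distance variant."""
    return [i for i, (o, d) in enumerate(source_lines)
            if dist_to_line(eye_pos, o, d) <= max_dist]
```

With `max_dist` set to roughly half the inter-device spacing, this degenerates to the nearest-line rule; larger values give the controller freedom to keep the current device lit while the head moves, avoiding rapid switching.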
 In the image projection system 1, a plurality of projection devices are arranged on a straight line in each of the first projection unit 100 and the second projection unit 102, and the first straight line 122 and the second straight line 124 form different angles with respect to the first plane. This allows various combinations of projection devices to be used for projection depending on the positions of the driver's eyes. Specifically, if right-eye and left-eye projection units were paired and placed at several fixed positions, an image matching the driver's position could no longer be projected once the driver moved away from those positions. In contrast, by arranging the projection devices on straight lines without dedicating any device to the right eye or the left eye, the image projection system 1 can combine projection devices according to the position of the driver's head and project a more appropriate image. Note that each projection unit preferably includes three or more projection devices.
 By arranging the first projection unit 100 and the second projection unit 102 so that they intersect at the intersection 126, as in the present embodiment, a more appropriate image can be projected around the intersection 126. Furthermore, by arranging the projection devices symmetrically about the axis of symmetry with the intersection 126 at the center of the first projection unit 100 and the second projection unit 102, as in the present embodiment, images with little variation can be projected at every position within the range covered by the projection unit 12.
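The symmetric layout about the intersection 126 can be sketched by placing each unit's devices evenly along a line through a shared midpoint, with the two lines tilted to opposite sides of the axis of symmetry. The angles, device counts, and spacing below are arbitrary illustrative values, not taken from the patent.

```python
import math

def device_positions(center, angle_deg, count, spacing):
    """Positions of `count` devices evenly spaced along a straight line
    through `center`, tilted by angle_deg in the first (horizontal) plane."""
    t = math.radians(angle_deg)
    dx, dy = math.cos(t), math.sin(t)
    return [(center[0] + k * spacing * dx, center[1] + k * spacing * dy)
            for k in [i - (count - 1) / 2.0 for i in range(count)]]

# First and second units share the intersection 126 as their midpoint,
# with their lines tilted to opposite sides of the axis of symmetry.
unit1 = device_positions((0.0, 0.0), +15.0, 4, 0.05)   # first straight line 122
unit2 = device_positions((0.0, 0.0), -15.0, 4, 0.05)   # second straight line 124
```

Each unit's devices are mirror images of the other unit's about the axis of symmetry, and each unit is itself symmetric about the intersection, matching the arrangement described above.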
 Here, the arrangement of the projection devices is not limited to the above embodiment. FIG. 13 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system. The projection unit 200 shown in FIG. 13 includes a first projection unit 100a and a second projection unit 102a. In the projection unit 200, the angle θ between the first straight line of the first projection unit 100a and the second straight line of the second projection unit 102a is larger than in the above embodiment. Even when the first projection unit 100a and the second projection unit 102a intersect at a different angle in this way, the effects of the above embodiment can be obtained. Moreover, increasing the angle widens the range that the projection unit 200 can cover. A preferable value of the angle θ can be set according to the angle of view of the projectors; for example, when the angle of view of each projector is about 57 degrees, the angle θ is preferably 0 degrees or more and 70 degrees or less.
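A rough geometric intuition for bounding θ by the projector's angle of view: model the two units as flat projection fans of width `fov` whose central axes are θ apart. Their union widens with θ, while the overlap reachable by both units shrinks and vanishes once θ exceeds the field of view. This flat-fan model is an illustrative assumption, not the patent's geometry.

```python
def coverage(theta_deg, fov_deg=57.0):
    """Union and overlap (in degrees) of two projector fans of width
    fov_deg whose central axes are theta_deg apart -- a simplified
    flat-fan sketch of the trade-off described in the text."""
    union = fov_deg + theta_deg            # total angular range covered
    overlap = max(0.0, fov_deg - theta_deg)  # zone both units can reach
    return union, overlap
```

Under this model, θ = 0 gives maximal overlap but no widening, while θ near 70° (for a 57° field of view) maximizes total coverage at the cost of the shared zone, which is consistent with the preferred 0°–70° range stated above.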
 FIG. 14 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system. The projection unit 300 shown in FIG. 14 includes a first projection unit 302, a second projection unit 304, and a third projection unit 306. The first projection unit 302, the second projection unit 304, and the third projection unit 306 are arranged so that the straight lines on which their projection devices are arranged form a triangle; that is, an end of each projection unit is in contact with another projection unit. Three or more projection units may thus be arranged, as in the projection unit 300. In this case as well, by setting the straight lines on which the projection devices of the respective projection units are arranged at different angles in the first plane, each projection unit can project images onto its own area, and a projection device can be selected according to the positions of the driver's eyes. Note that even when three or more projection units are used, the straight lines of the three projection units may be arranged to intersect at a single intersection.
 Characteristic embodiments have been described in order to disclose the technology according to the appended claims completely and clearly. However, the appended claims should not be limited to the above embodiments, and should be construed to embody all modifications and alternative configurations that a person skilled in the art can create within the scope of the basic matters presented in this specification. Those skilled in the art can make various variations and modifications based on the present disclosure, and such variations and modifications are therefore included within the scope of the present disclosure. For example, the functional units, means, steps, and the like of each embodiment may be added to another embodiment, or replaced with the functional units, means, steps, and the like of another embodiment, as long as no logical contradiction arises. In each embodiment, a plurality of functional units, means, steps, and the like may be combined into one or divided. Furthermore, the embodiments of the present disclosure described above are not limited to faithful implementation as described; they may be implemented by combining features or omitting parts as appropriate.
 1 Image projection system
 2 Vehicle
 3 Vehicle exterior camera
 4 Driver's seat
 5 Observer (driver)
 6 Vehicle interior camera
 7 Shielding unit
 8 First image processing unit
 9 Second image processing unit
 10 Display device
 10a to 10d Display device
 11 Retroreflective screen
 11a Glass beads
 11b Reflective film
 12 Projection unit
 12L Left-eye projection unit
 12R Right-eye projection unit
 13L, 13R Liquid crystal display device
 15 Display
 16 Diffusion plate
 17 Parallax barrier
 18 Lenticular lens
 33 Image data processing device
 35 Vehicle exterior camera control device
 37 Vehicle interior camera control device
 39 Display control device
 50 Control device
 100 First projection unit
 102 Second projection unit
 122 First straight line
 124 Second straight line
 126 Intersection
 EL Left eye
 ER Right eye

Claims (7)

  1.  An image projection system comprising:
     a first projection unit having a plurality of projection devices arranged on a straight line;
     a second projection unit having a plurality of projection devices arranged on a straight line at an angle different from that of the first projection unit;
     a retroreflective screen arranged in a direction in which the first projection unit and the second projection unit project; and
     a control device that determines at least two projection devices based on positions of a right eye and a left eye of an observer observing the screen, and causes the determined projection devices to project a right-eye image and a left-eye image, respectively.
  2.  The image projection system according to claim 1, wherein the control device causes the projection device whose projected-image central axis is closest to the right eye to project the right-eye image, and causes the projection device whose projected-image central axis is closest to the left eye to project the left-eye image.
  3.  The image projection system according to claim 1, wherein the projection devices of the first projection unit and the second projection unit are arranged symmetrically with respect to an axis of symmetry.
  4.  The image projection system according to claim 3, wherein the straight line on which the plurality of projection devices of the first projection unit are arranged and the straight line on which the plurality of projection devices of the second projection unit are arranged intersect at one point on the axis of symmetry.
  5.  The image projection system according to claim 1, further comprising a third projection unit having a plurality of projection devices arranged on a straight line at an angle different from that of the first projection unit and from that of the second projection unit in the first plane.
  6.  The image projection system according to claim 1, wherein
     the retroreflective screen is arranged on a right pillar, which is an A-pillar on a right side with respect to a traveling direction of a vehicle, and on a left pillar, which is an A-pillar on a left side,
     the first projection unit projects an image onto the right pillar,
     the second projection unit projects an image onto the left pillar, and
     the observer is a driver sitting in a driver's seat of the vehicle.
  7.  The image projection system according to claim 5, wherein the retroreflective screen is arranged on a dashboard of a vehicle.
PCT/JP2023/017208 2022-05-31 2023-05-02 Image projection system WO2023233919A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022089128A JP2023176698A (en) 2022-05-31 2022-05-31 image projection system
JP2022-089128 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023233919A1 true WO2023233919A1 (en) 2023-12-07

Family

ID=89026275

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/017208 WO2023233919A1 (en) 2022-05-31 2023-05-02 Image projection system

Country Status (2)

Country Link
JP (1) JP2023176698A (en)
WO (1) WO2023233919A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005199934A (en) * 2004-01-16 2005-07-28 Honda Lock Mfg Co Ltd Vehicular view supporting device
JP2013171252A (en) * 2012-02-22 2013-09-02 Keio Gijuku Information presentation device
JP2014139592A (en) * 2011-05-02 2014-07-31 Sharp Corp Projection type display device
JP2015012559A (en) * 2013-07-02 2015-01-19 株式会社デンソー Projection type display device
JP2015232634A (en) * 2014-06-10 2015-12-24 セイコーエプソン株式会社 Display device
Also Published As

Publication number Publication date
JP2023176698A (en) 2023-12-13

Similar Documents

Publication Publication Date Title
CN113022448B (en) display system
US10953799B2 (en) Display system, electronic mirror system and movable-body apparatus equipped with the same
US20160134815A1 (en) Driving assist device
WO2018061444A1 (en) Reflection plate, information display device, and movable body
JP6945150B2 (en) Display system
JP6697751B2 (en) Vehicle display system, electronic mirror system and moving body
WO2020261830A1 (en) Head-up display device
WO2022181767A1 (en) Image display device
WO2023233919A1 (en) Image projection system
JP6515796B2 (en) Head-up display device
WO2022230824A1 (en) Image display device and image display method
WO2023228770A1 (en) Image display device
WO2023228771A1 (en) Image display device, vehicle, and image display method
WO2023228752A1 (en) Image display device
JP6697747B2 (en) Display system, electronic mirror system and moving body
US20220113539A1 (en) Windshield display device
WO2021010123A1 (en) Head-up display device
WO2020031549A1 (en) Virtual image display device
WO2022255424A1 (en) Video display device
JPWO2019124323A1 (en) Virtual image display device and head-up display device
JP7002061B2 (en) Display device
JP7208378B2 (en) Display device and moving body
JP7111071B2 (en) head-up display device
JP2019120891A (en) Virtual display device and head-up display device
JP7111070B2 (en) head-up display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23815680

Country of ref document: EP

Kind code of ref document: A1