WO2023233919A1 - Image projection system - Google Patents


Info

Publication number: WO2023233919A1
Authority: WO (WIPO, PCT)
Prior art keywords: projection, image, unit, vehicle, projection unit
Application number: PCT/JP2023/017208
Other languages: English (en), Japanese (ja)
Inventor: 凌一 竹内
Original assignee: 京セラ株式会社 (Kyocera Corporation)
Application filed by 京セラ株式会社
Publication of WO2023233919A1

Classifications

    • B60R1/24: Real-time viewing arrangements for drivers or passengers using optical image capturing systems (e.g. cameras) specially adapted for use in or on vehicles, for viewing an area outside the vehicle with a predetermined field of view in front of the vehicle
    • B60R1/31: Real-time viewing arrangements using optical image capturing systems providing stereoscopic vision
    • B60R11/02: Arrangements for holding or mounting radio sets, television sets, telephones, or the like; arrangement of controls thereof
    • G03B21/00: Projectors or projection-type viewers; accessories therefor
    • G03B21/14: Projectors or projection-type viewers; details
    • G03B21/60: Projection screens characterised by the nature of the surface
    • G09F9/00: Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H04N13/363: Image reproducers using image projection screens
    • H04N13/366: Image reproducers using viewer tracking
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This application relates to an image projection system.
  • Patent Document 1 describes a projection-type display device (projection device) that projects images from a plurality of projectors onto a retroreflective screen.
  • The plurality of projectors are arranged such that their distances to the retroreflective screen differ from one another.
  • The projection device described in Patent Document 1 can display stereoscopic images to viewers at various distances by using the retroreflective screen and switching the projector according to the viewer's position.
  • However, because each projector projects an image toward a viewer at a preset position, it cannot cope with the case where the viewer's head moves while the viewer remains at the same position. There is a need to project a more appropriate image according to the position of the viewer's head.
  • An image projection system includes a first projection unit having a plurality of projection devices arranged on a straight line, and a second projection unit having a plurality of projection devices arranged on a straight line at a different angle from the first projection unit.
  • FIG. 1 is a plan view schematically showing a vehicle equipped with an image projection system.
  • FIG. 2 is a side view schematically showing a vehicle equipped with an image projection system.
  • FIG. 3 is a schematic diagram showing an example of the arrangement of projection devices of the image projection system.
  • FIG. 4 is a block diagram showing a schematic configuration of the image projection system.
  • FIG. 5 is a plan view for explaining the operation of the image projection system.
  • FIG. 6 is a partially enlarged sectional view showing the configuration of a screen and a diffuser plate of the image projection system.
  • FIG. 7 is a flowchart illustrating an example of processing of the image projection system.
  • FIG. 8 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 9 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 10 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 11 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 12 is a schematic diagram for explaining the operation of the image projection system.
  • FIG. 13 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • FIG. 14 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • the image projection system 1 is mounted on a vehicle 2.
  • the vehicle 2 is, for example, a passenger car, and has a plurality of seats inside the vehicle, in which a driver, fellow passengers, etc. ride.
  • the vehicle 2 includes a driver's seat 4, a dashboard, doors, pillars, a back seat, and the like.
  • The image projection system 1 projects an image of the surroundings of the vehicle onto a shielding part within the driver's visual range, so that the driver can see the surroundings of the vehicle as if the shielding part were not there. Note that although this embodiment is described for the driver 5, an image can be projected in the same way for a passenger sitting in a seat other than the driver's seat 4.
  • The image projection system 1 of the present embodiment includes: external cameras 3a and 3b, imaging units that capture the scenery around the vehicle 2 and output the resulting image data;
  • an in-vehicle camera 6, a viewpoint detection unit that detects the positions of the left eye EL and right eye ER, the viewpoints of the driver 5 seated on the driver's seat 4 of the vehicle 2 (for example, while driving), and outputs viewpoint position information representing the detected positions of the left eye EL and right eye ER in spatial coordinates;
  • a first image processing unit 8 that generates, from the image data in the range corresponding to the shielding part 7, which blocks the view outside the vehicle from the left eye EL and right eye ER represented by the viewpoint position information, a first image seen by one of the left eye EL and right eye ER;
  • a second image processing unit 9 that generates, from the image data in the range corresponding to the shielding part 7, a second image that is seen by the other of the left eye EL and right eye ER and has parallax with respect to the first image;
  • a display device 10, an image display unit provided on the shielding part 7 of the vehicle 2 that displays a parallax image consisting of the first image and the second image; and projection units 12 and 24 that project images onto the display device 10.
  • the external cameras 3a and 3b are imaging units that capture images of the surrounding scenery of the vehicle 2 and output image data obtained by capturing the images.
  • the front exterior camera 3a is installed at the front end of the vehicle 2.
  • the rear exterior camera 3b is installed at the rear of the vehicle 2.
  • Digital cameras such as a CCD (Charge-Coupled Device) camera equipped with a fisheye lens or a CMOS (Complementary Metal-Oxide-Semiconductor) camera can be used as the external cameras 3a and 3b.
  • It is preferable that each external camera 3a, 3b be equipped with sensors that detect the environment outside the vehicle (temperature, humidity, etc.) and have functions such as temperature control of its optical system and image sensor and defogging of its cover, so as to obtain clear images.
  • the image projection system 1 can also be provided with a plurality of external cameras 3a and 3b, respectively.
  • a pair of external cameras 3a may be provided at diagonal positions at the front end of the vehicle, and a pair of external cameras 3b may be provided at diagonal positions at the rear end of the vehicle. This allows clear images of the front and rear of the vehicle to be captured.
  • the external cameras 3a and 3b only need to be able to acquire images of the outside of the vehicle 2, and are not limited to being disposed outside the vehicle.
  • the cameras 3a and 3b outside the vehicle may be placed inside the vehicle.
  • the vehicle exterior camera 3a may be located inside the windshield of the vehicle, or may be located within the hood of the vehicle.
  • In the present embodiment, the external cameras 3a and 3b are provided to obtain images of both the front and rear of the vehicle 2, but it is also possible to provide only the external camera 3a and obtain images of only the front of the vehicle 2.
  • the driver's seat 4 is equipped with a seating sensor 36 that detects whether the driver 5 is seated.
  • the seating sensor 36 is constituted by a known load sensor or limit switch. When the driver 5 is seated in the driver's seat 4, the seating sensor 36 installed in the driver's seat 4 detects that the driver 5 is seated.
  • the in-vehicle camera 6 is installed in the car at a position where it can image the driver 5, for example, at a position adjacent to the rearview mirror.
  • a CCD camera can be used as the in-vehicle camera 6.
  • The shielding part 7 is a structure of the vehicle 2 that blocks the driver's view and prevents the driver from seeing the outside of the vehicle 2.
  • the shielding portion 7 in the vehicle 2 includes a dashboard, a door, a pillar, a back seat, and the like.
  • the display device 10 displays an image by projecting the image from the projection unit 12.
  • The display devices 10 include a dashboard display device 10a disposed on the dashboard, a right side pillar display device 10b disposed on the right side pillar, a left side pillar display device 10c disposed on the left side pillar, and a back-seat display device 10d disposed on the seat back 23 of the rear seat 22.
  • the display device 10 is arranged along the shape of the shielding portion 7 on one surface facing the interior space of the vehicle.
  • the projection unit 12 projects an image toward the display devices 10a, 10b, and 10c arranged further forward of the vehicle than the driver's seat 4.
  • the projection unit 12 includes a first projection unit 100 and a second projection unit 102, as shown in FIG.
  • The first projection unit 100 is arranged at an angle of view that allows it to project an image onto the display device 10b on the right pillar, and the second projection unit 102 is arranged at an angle of view that allows it to project an image onto the display device 10c on the left pillar.
  • Both the first projection unit 100 and the second projection unit 102 can also project an image onto the display device 10a arranged on the dashboard, although their projectable areas differ.
  • the first projection unit 100 includes projection devices 110a, 110b, 110c, and 110d.
  • The projection devices 110a, 110b, 110c, and 110d each project an image toward the display device 10.
  • the projection devices 110a, 110b, 110c, and 110d are arranged, for example, in a row on a first straight line 122 on a virtual first plane.
  • Because of the arrangement interval, each device's projection light source line, the central axis of the projected image, is at a different position. That is, the projection devices 110a, 110b, 110c, and 110d project images onto different areas of the display device 10.
  • The first plane is a plane parallel to the projection light source lines; in this embodiment, it is the plane seen when the vehicle 2 is viewed from above.
  • the second projection unit 102 includes projection devices 112a, 112b, 112c, and 112d.
  • The projection devices 112a, 112b, 112c, and 112d each project an image toward the display device 10.
  • the projection devices 112a, 112b, 112c, and 112d are arranged, for example, in a row on the second straight line 124 on the virtual first plane.
  • The projection devices 112a, 112b, 112c, and 112d have projection light source lines, the central axes of the projected images, at different positions depending on the arrangement interval. That is, the projection devices 112a, 112b, 112c, and 112d project images onto different areas of the display device 10. Note that although the projection areas 130 of the projection devices 112a, 112b, 112c, and 112d are shifted according to the projection light source lines, the projection areas 130 partially overlap one another.
  • The second straight line 124 intersects the first straight line 122 at an intersection 126, which is the center of both the first projection unit 100 and the second projection unit 102.
  • The second straight line 124 extends in a different direction on the first plane from the first straight line 122, and the two straight lines intersect at a predetermined angle. That is, the first projection unit 100 and the second projection unit 102 are arranged in an X shape.
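The X-shaped arrangement described above can be sketched as coordinates on two straight lines crossing at the intersection 126 (taken as the origin) on the first plane. The spacing and angles below are illustrative assumptions, not values from the publication.

```python
import math

def device_positions(n, spacing, angle_deg):
    """Centers of n equally spaced projection devices on a straight line
    through the origin (intersection 126), rotated by angle_deg on the
    first plane. Spacing and angle are illustrative only."""
    half = (n - 1) / 2.0
    a = math.radians(angle_deg)
    return [(round((i - half) * spacing * math.cos(a), 6),
             round((i - half) * spacing * math.sin(a), 6)) for i in range(n)]

# First projection unit 100: four devices on the first straight line 122.
unit_100 = device_positions(4, 0.05, 15.0)
# Second projection unit 102: four devices on the second straight line 124,
# inclined the other way so that the two units form an X shape.
unit_102 = device_positions(4, 0.05, -15.0)
```

Because both lines are centered on the intersection 126, the two units share a common center while their projection light source lines fan out in different directions.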
  • the first projection unit 100 is arranged such that the projection light source line is inclined from a direction parallel to the traveling direction of the vehicle 2 toward a region including the display device 10b arranged on the pillar.
  • the second projection unit 102 is arranged so that the projection light source line is inclined from a direction parallel to the traveling direction of the vehicle 2 toward an area (passenger seat side) that includes the display device 10c arranged on the pillar.
  • the projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d may be arranged at different positions in the direction perpendicular to the first plane, that is, stacked in a direction perpendicular to the first plane.
  • the second projection unit 102 may be arranged on a second plane that is perpendicular to the first plane. Further, the configurations of each of the projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d and the display device 10 will be described later.
  • the projection unit 24 projects an image toward the display device 10d on the back seat.
  • The projection unit 24 is configured in the same manner as the projection unit 12 described above, and projects onto the display device 10d the image data corresponding to the range shielded by the back seat, out of the image data captured by the rear exterior camera 3b.
  • The image projection system 1 includes a control device 50 that controls the operation of each of its parts.
  • the control device 50 is connected to each component of the image projection system 1 and controls each component.
  • the control device 50 is realized by a processor such as an electronic control unit (ECU) as a hardware resource, and a computer-readable program as a software resource.
  • Controller 50 may include one or more processors.
  • the processor may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing.
  • the dedicated processor may include an application specific integrated circuit (ASIC).
  • the processor may include a programmable logic device (PLD).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the control device 50 may be either an SoC (System-on-a-Chip) or an SiP (System in a Package) in which one or more processors work together.
  • the control device 50 may include a storage unit, and may store various information or programs for operating each component of the image projection system 1 in the storage unit.
  • the storage unit may be composed of, for example, a semiconductor memory.
  • the storage unit may function as a storage area that is temporarily used during data processing by the control device 50.
  • The control device 50 includes a viewpoint recognition device 31, an image data processing device 33, a vehicle exterior camera control device 35, an in-vehicle camera control device 37, and a display control device 39.
  • the vehicle exterior camera control device 35 controls the operation of each vehicle exterior camera 3a, 3b.
  • the vehicle exterior camera control device 35 acquires image data of images captured by each vehicle exterior camera 3a, 3b.
  • the vehicle exterior camera control device 35 receives analog image data from each vehicle exterior camera 3a, 3b, converts it into digital data, and sends it to the image data processing device 33.
  • The in-vehicle camera control device 37 sharpens the images obtained from the in-vehicle camera 6 and switches the in-vehicle camera 6 on and off to capture images of the vehicle interior based on commands from the driver 5. It also receives detection data from sensors such as an illuminance sensor that detects the illuminance inside the vehicle and a temperature sensor that detects the temperature inside the vehicle, and performs control to create an in-vehicle environment in which clear images of the interior can be captured.
  • the viewpoint recognition device 31 starts measuring the viewpoint position of the driver 5 based on the image acquired by the in-vehicle camera control device 37.
  • the viewpoint recognition device 31 extracts the position of the viewpoint of the driver 5 and the pupil positions of the left eye EL and right eye ER from the photographed image of the in-vehicle camera 6 by image recognition processing in the three-dimensional coordinate system of X, Y, and Z.
  • the extracted pupil position is output as coordinate values (x, y, z).
  • the viewpoint recognition device 31 sends information on the acquired viewpoint position of the driver 5 to the image data processing device 33 .
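The viewpoint position information sent to the image data processing device 33 is a set of (x, y, z) coordinates per eye. A minimal sketch of how such coordinates might be obtained from the in-vehicle camera 6 is a pinhole back-projection; the intrinsics and depth below are hypothetical placeholders, since the publication does not specify the camera model.

```python
def pupil_to_xyz(u, v, depth, fx=900.0, fy=900.0, cx=640.0, cy=360.0):
    """Back-project a detected pupil pixel (u, v) at an estimated depth
    (meters) into camera-frame coordinates (x, y, z) using a pinhole
    model. The intrinsics fx, fy, cx, cy are placeholder values."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical pupil detections for the left eye EL and right eye ER.
left_eye = pupil_to_xyz(600.0, 350.0, 0.7)
right_eye = pupil_to_xyz(680.0, 352.0, 0.7)
```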
  • The image data processing device 33 creates the images to be projected by the projection units 12 and 24.
  • the image data processing device 33 includes a first image processing section 8 and a second image processing section 9.
  • The first image processing unit 8 generates a first image seen by one of the left eye EL and right eye ER, based on the range, out of the image data output from the external cameras 3a and 3b, that corresponds to the shielding part 7, which blocks the view outside the vehicle from the left eye EL and right eye ER represented by the viewpoint position information.
  • the first image may be an image in the range corresponding to the shielding part 7 that is seen by the right eye ER.
  • The second image processing unit 9 generates a second image that is seen by the other of the left eye EL and right eye ER and has parallax with respect to the first image, based on the image data in the range corresponding to the shielding part 7.
  • the second image may be an image in the range corresponding to the shielding part 7 that is seen by the left eye EL. Even if the first image and the second image contain the same object, since the viewpoints from which the object is viewed are different, the position and shape of the object in the images differ depending on the parallax.
  • the range corresponding to the shielding part 7 that can be seen by the left eye EL refers to the range that would be visible by the left eye EL if the shielding part 7 did not exist.
  • The range corresponding to the shielding part 7 that can be seen by the right eye ER refers to the range that would be visible to the right eye ER if the shielding part 7 did not exist.
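As a rough sketch, the per-eye range corresponding to the shielding part 7 can be modeled as a sub-window of the exterior-camera frame whose lateral position shifts with the eye's horizontal offset; the shift between the two windows is what produces the parallax between the first and second images. The window size and mapping gain are hypothetical stand-ins for the real projective geometry.

```python
def occluded_crop(frame_w, frame_h, eye_x, crop_w=320, crop_h=240, gain=400.0):
    """Sub-window (left, top, right, bottom) of the exterior-camera frame
    corresponding to the shielding part 7 as seen from an eye at lateral
    offset eye_x (meters). The gain is a placeholder for the projective
    mapping between eye position and image coordinates."""
    cx = frame_w / 2 - eye_x * gain  # parallax shifts the window laterally
    left = int(min(max(cx - crop_w / 2, 0), frame_w - crop_w))
    top = (frame_h - crop_h) // 2
    return (left, top, left + crop_w, top + crop_h)

# Eyes roughly 64 mm apart: right eye ER at +32 mm, left eye EL at -32 mm.
first_image_box = occluded_crop(1280, 720, eye_x=0.032)    # for the right eye
second_image_box = occluded_crop(1280, 720, eye_x=-0.032)  # for the left eye
```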
  • the image data processing device 33 sends the created image and information on the left eye EL and right eye ER represented by the viewpoint position information to the display control device 39.
  • the display control device 39 controls the operation of the projection unit 12.
  • the display control device 39 determines a projection device to display the image based on the left eye EL and right eye ER represented by the viewpoint position information, and causes the determined projection device to project the image created by the image data processing device 33.
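One plausible selection rule, which the publication does not spell out, is for the display control device 39 to pick the projection device whose exit pupil lies closest to the reported eye position on the first plane. The device names and coordinates below are hypothetical.

```python
def nearest_device(eye_xy, devices):
    """Return the name of the projection device whose exit pupil is
    closest to the given eye position; a stand-in for the selection
    logic of the display control device 39."""
    return min(devices, key=lambda name: (devices[name][0] - eye_xy[0]) ** 2
                                         + (devices[name][1] - eye_xy[1]) ** 2)

# Hypothetical exit-pupil positions on the first plane (meters).
devices = {"110a": (-0.075, 0.02), "110b": (-0.025, 0.007),
           "110c": (0.025, -0.007), "110d": (0.075, -0.02)}
right_dev = nearest_device((0.03, 0.0), devices)
```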
  • the display device 10 includes a retroreflective screen 11 provided on the shielding part 7, and a diffuser plate 16 laminated on the surface of the retroreflective screen 11 facing the viewer.
  • The projection unit 12 selects one projection device from the plurality of projection devices and projects the first image for the right eye from the selected right-eye projection device onto the retroreflective screen 11. Furthermore, the projection unit 12 selects another projection device and projects the second image for the left eye from the selected left-eye projection device onto the retroreflective screen 11.
  • Two projection devices selected from the projection devices 110a, 110b, 110c, 110d, 112a, 112b, 112c, and 112d of the first projection unit 100 and the second projection unit 102 serve as a right-eye projection unit 12R that projects the image for the right eye and a left-eye projection unit 12L that projects the image for the left eye.
  • Various combinations are possible: both the right-eye projection unit 12R and the left-eye projection unit 12L may be selected from the first projection unit 100, both may be selected from the second projection unit 102, or one may be selected from each of the first projection unit 100 and the second projection unit 102.
  • the right eye projection unit 12R includes a liquid crystal display device 13R that displays a first image, and a first projection lens 14R that projects image light of the first image emitted from the liquid crystal display device 13R onto the retroreflective screen 11.
  • the left eye projection unit 12L includes a liquid crystal display device 13L that displays a second image, and a second projection lens 14L that projects the image light of the second image emitted from the liquid crystal display device 13L onto the retroreflective screen 11.
  • Each of the liquid crystal display devices 13R and 13L includes a transmissive liquid crystal display element and a backlight device that emits light to the back surface of the liquid crystal display element.
  • Each of the projection lenses 14R and 14L is configured by a combination of a plurality of lenses that form images on the retroreflective screen 11 with parallax between the first image and the second image emitted from each liquid crystal display element.
  • the retroreflective screen 11 has retroreflectivity and reflects all of the incident light in the direction of incidence.
  • The image light of the first image and the image light of the second image emitted from the first projection lens 14R and the second projection lens 14L are reflected by the retroreflective screen 11 back toward the first projection lens 14R and the second projection lens 14L, respectively.
  • Therefore, although the image light of the first image and the image light of the second image overlap on the retroreflective screen 11, they are perceived as separate images at the observer's position.
  • a diffuser plate 16 is arranged on the surface of the retroreflective screen 11 on the viewer side.
  • The diffuser plate 16 has a diffusing power such that the light retroreflected by the retroreflective screen 11 is not returned only toward the projection units 12, 24, and 25, but also reaches the observer's line of sight.
  • The diffuser plate 16 is an anisotropic diffuser plate having a large diffusing power in the vertical direction and a smaller diffusing power in the horizontal direction than in the vertical direction.
  • the diffuser plate 16 may be a holographic optical element and is bonded onto the reflective surface of the retroreflective screen 11.
  • the diffuser plate 16 may be configured to magnify the light from the first projection lens 14R and the second projection lens 14L.
  • The retroreflective screen 11 is made by arranging a plurality of minute glass beads 11a, with diameters of about 20 μm to 100 μm, in a plane and fixing them on a reflective film 11b.
  • The image light projected onto the retroreflective screen 11 enters each glass bead 11a, is refracted at the surface of the glass bead 11a, reaches the back surface of the glass bead 11a on the reflective film 11b side, and is reflected by the reflective film 11b.
  • The light reflected by the reflective film 11b is refracted again at the back surface of the glass bead 11a, reaches the surface of the glass bead 11a, and exits along an optical path parallel to the incident light, separated from the incident path by a minute distance less than the diameter of the glass bead 11a; retroreflection is thus achieved.
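The retroreflection just described can be summarized as follows: the returned ray is antiparallel to the incident ray, displaced only by a lateral offset smaller than the bead diameter. A minimal sketch of the ideal case:

```python
def retroreflect(direction):
    """An ideal retroreflector returns light antiparallel to the incident
    direction; a glass-bead screen adds only a lateral offset smaller
    than the bead diameter (20-100 micrometers here)."""
    return tuple(-c for c in direction)

incident = (0.6, -0.8, 0.0)   # unit direction of the incoming image light
returned = retroreflect(incident)
```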
  • The diffuser plate 16 is arranged such that light is diffused differently in the Y direction (the left-right direction of the driver 5) and the Z direction (the vertical direction of the driver 5).
  • When the image light of the first image and the image light of the second image emitted from the projection lenses 14R and 14L enters the retroreflective screen 11, the light is emitted back in the direction of incidence.
  • A conjugate relationship is therefore established at the projection lenses 14R and 14L, where the optical path lengths are equal, and a clear image can be observed there.
  • Because the diffuser plate 16 is installed on the retroreflective screen 11, the retroreflected light is diffused, so a conjugate relationship can be established even at locations other than the projection lenses 14R and 14L, and clear images can be observed regardless of the observer's position.
  • The liquid crystal display devices 13R and 13L each include a transmissive liquid crystal display element; the liquid crystal display element deflects the light from the backlight light source and emits the image light of the first image and the image light of the second image to be provided to the observer's left and right eyes EL and ER.
  • an LED light emitting display device may be used instead of the liquid crystal display device.
  • the projection lenses 14R and 14L project the image lights of the first image and the second image emitted from the liquid crystal display devices 13R and 13L toward the retroreflective screen 11 to form an image on the retroreflective screen 11.
  • the image formed on the retroreflective screen 11 is an enlarged version of the image displayed on the liquid crystal display devices 13R and 13L, and covers a wide range.
  • the left-eye projection unit 12L is located so that its exit pupil is at the same height as and near the observer's left eye EL; similarly, the right-eye projection unit 12R is located so that its exit pupil is at the same height as and near the observer's right eye ER. For example, the two units are placed on both sides of the headrest at the top of the seat back.
  • the exit pupils of the left-eye projection unit 12L and the right-eye projection unit 12R may instead be arranged above the position of the observer's eyes, that is, on the ceiling of the vehicle 2. In this case, it is preferable that the anisotropy of the diffusing power of the diffuser plate 16 correspond to the position of the exit pupils.
  • if the exit pupils are at the same height as the observer's eyes, the diffusion anisotropy of the diffuser plate 16 is made stronger in the left-right direction; if the exit pupils are higher than the observer's eyes, the diffusion anisotropy is made stronger in the vertical direction.
  • it is preferable that the optical axes of the projection lenses 14L and 14R of the left-eye projection unit 12L and the right-eye projection unit 12R be parallel, and that the right side pillar display device 10b be arranged perpendicular to those optical axes.
  • the first image for the right eye ER and the second image for the left eye EL displayed on the right side pillar display device 10b are displayed in a partially overlapping state.
  • the retroreflective screen 11 has retroreflectivity and reflects almost all of the incident light in the direction of incidence.
  • the lights projected from the projection lenses 14L and 14R are reflected by the retroreflective screen 11 back toward the projection lenses 14L and 14R, respectively; the image light of the first image for the right eye ER and the image light of the second image for the left eye EL, which overlap on the retroreflective screen 11, separate at the observer's position and enter the right eye ER and the left eye EL individually, so the driver 5, who is the observer, perceives the first image and the second image simultaneously as a three-dimensional parallax image.
  • the vehicle length direction is the X axis
  • the vehicle width direction is the Y axis
  • the vehicle height direction is the Z axis.
  • FIG. 7 is a flowchart showing an example of the processing of the image projection system.
  • FIGS. 8 to 12 are schematic diagrams, each explaining an operation of the image projection system.
  • the image projection system 1 executes the process shown in FIG. 7 when the power of the vehicle is turned on by a start button or the like and the seating sensor 36 detects that the driver 5 is seated in the driver's seat 4.
  • the image projection system 1 repeatedly executes the process shown in FIG. 7.
  • the control unit 50 controls the operation of each unit and executes the processing shown in FIG. 7.
  • with reference to FIGS. 8 to 12, a case will be described in which an image is projected from the first projection unit 100. Note that, depending on the position of the driver's line of sight, an image may also be projected from the second projection unit 102 by the same process.
  • the control unit 50 acquires information on the projection light source line of each projection device (step S12).
  • the control unit 50 acquires information on the projection light source line based on information on the arrangement of each projection device of the projection unit 12.
  • information on the projection light source line 140a of the projection device 110a, the projection light source line 140b of the projection device 110b, the projection light source line 140c of the projection device 110c, and the projection light source line 140d of the projection device 110d is acquired.
  • the projection light source lines 140a, 140b, 140c, and 140d are arranged on the first plane. Information on the projection light source line can be acquired as preset information.
  • the control unit 50 acquires eyeball coordinates (step S14).
  • the control unit 50 processes the image acquired by the in-vehicle camera 6 with the line-of-sight recognition device 31 of the in-vehicle camera control device 33, and detects the positions of the left eye EL and right eye ER of the driver 5.
  • the control unit 50 identifies the closest projection light source line from the eyeball coordinates (step S16).
  • the control unit 50 selects the projection light source line closest to the positions of the left eye EL and right eye ER of the driver 5, based on information on the projection light source line of each projection device and the positions of the left eye EL and right eye ER of the driver 5. Identify.
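Step S16 amounts to a nearest-line search in 3-D. The Python below is a minimal illustration, not the patented implementation; representing each projection light source line as an origin point plus a direction vector, and the string identifiers, are assumptions made for the example.

```python
import math

def point_to_line_distance(p, origin, direction):
    """Perpendicular distance from point p to the 3-D line through
    `origin` with direction vector `direction`."""
    norm = math.sqrt(sum(c * c for c in direction))
    d = [c / norm for c in direction]               # unit direction
    v = [pi - oi for pi, oi in zip(p, origin)]      # vector origin -> p
    t = sum(vi * di for vi, di in zip(v, d))        # projection length
    perp = [vi - t * di for vi, di in zip(v, d)]    # perpendicular component
    return math.sqrt(sum(c * c for c in perp))

def closest_light_source_line(eye, lines):
    """Return the id of the projection light source line closest to `eye`.
    `lines` maps an id (e.g. '140a') to an (origin, direction) pair."""
    return min(lines, key=lambda k: point_to_line_distance(eye, *lines[k]))

# Example: two lines parallel to the X axis, offset in the Y direction
lines = {'140a': ((0, 0.0, 0), (1, 0, 0)),
         '140b': ((0, 0.3, 0), (1, 0, 0))}
print(closest_light_source_line((0, 0.1, 0), lines))  # 140a
```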
  • the control unit 50 selects one projection device and determines whether the selected projection device corresponds to the projection light source line closest to an eyeball (step S18). When the control unit 50 determines that it does not (No in step S18), the control unit 50 turns off the projection of the image of the selected projection device (step S20) and proceeds to step S28.
  • when the selected projection device corresponds to the closest projection light source line (Yes in step S18), the control unit 50 determines whether to project the right-eye image (step S22). If the selected projection device corresponds to the projection light source line closest to the right eye, the control unit 50 determines that the image is for the right eye (Yes in step S22) and projects the right-eye image (step S24). If the control unit 50 determines that the image is not for the right eye (No in step S22), it projects the left-eye image (step S26). That is, the control unit 50 causes each projection device corresponding to the projection light source line closest to the position of the left eye EL or the right eye ER of the driver 5 to project the corresponding image.
  • the projection device corresponding to the projection light source line refers to a projection device whose projection direction coincides with the projection light source line or is most similar to the projection light source line.
  • the vehicle exterior camera control device 35 operates each vehicle exterior camera 3a, 3b to start imaging the exterior of the vehicle.
  • the in-vehicle camera control device 37 operates the viewpoint recognition device 31 and the in-vehicle camera 6, and the in-vehicle camera 6 starts capturing an image.
  • the viewpoint recognition device 31 extracts the viewpoints of the driver 5, that is, the pupil positions of the left eye EL and the right eye ER, from the image captured by the in-vehicle camera 6, and calculates the extracted pupil positions as coordinate values (x, y, z) in the X, Y, Z coordinate system.
  • from the image captured by the vehicle exterior camera 3, the image data processing device 33 cuts out the image to be projected onto the shielding unit 7 as seen from the viewpoint position calculated by the viewpoint recognition device 31, and generates a first image as the image for the right eye ER and a second image as the image for the left eye EL. When the observer views the projected first and second images, which partially overlap, simultaneously, the two different images are seen as one, allowing for clear three-dimensional perception.
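The viewpoint-dependent cutout described above can be sketched as a windowed crop of the exterior camera frame. This is schematic only: the function name, the pixels-per-metre factor, and the proportional horizontal shift are assumptions for illustration, not the actual perspective computation performed by the image data processing device 33.

```python
def crop_for_eye(frame_w, frame_h, eye_offset_y, crop_w, crop_h,
                 px_per_meter=400):
    """Return a (left, top, right, bottom) crop rectangle for one eye.

    The window is shifted horizontally in proportion to the eye's lateral
    offset from the camera axis, a crude stand-in for the viewpoint-based
    cutout; the shift is clamped so the window stays inside the frame.
    """
    cx = frame_w // 2 + round(eye_offset_y * px_per_meter)
    cy = frame_h // 2
    left = max(0, min(frame_w - crop_w, cx - crop_w // 2))
    top = max(0, min(frame_h - crop_h, cy - crop_h // 2))
    return (left, top, left + crop_w, top + crop_h)

IPD = 0.065  # assumed interpupillary distance in metres
# First image (right eye ER) and second image (left eye EL): two slightly
# shifted, partially overlapping cutouts form the parallax pair.
right_crop = crop_for_eye(1920, 1080, -IPD / 2, 640, 480)
left_crop = crop_for_eye(1920, 1080, +IPD / 2, 640, 480)
```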
  • the control unit 50 determines whether the determination of the projection device is completed (step S28). When the control unit 50 determines that the determination of the projection devices has not been completed (No in step S28), the control unit 50 returns to step S18 and performs determination of the undetermined projection devices. When the control unit 50 determines that the determination of the projection device has been completed (Yes in step S28), the control unit 50 ends this process.
  • the control unit 50 may process steps S18 to S28 in parallel for each projection device.
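The per-device branching of steps S18 to S28 amounts to a small dispatch loop over the projection devices. The sketch below is a minimal illustration under assumptions (the device-to-line mapping and the string command labels are invented for the example), not the patented control logic itself.

```python
def assign_images(device_line_ids, left_id, right_id):
    """Decide for every projection device whether it projects the right-eye
    image (S24), the left-eye image (S26), or is turned off (S20).

    device_line_ids maps each projection device (e.g. '110b') to the id of
    its projection light source line (e.g. '140b'); left_id / right_id are
    the line ids closest to the driver's left and right eye (step S16).
    """
    commands = {}
    for dev, line_id in device_line_ids.items():  # iterate devices (S18..S28)
        if line_id == right_id:       # closest to the right eye? (S22: Yes)
            commands[dev] = 'right'
        elif line_id == left_id:      # closest to the left eye?  (S22: No)
            commands[dev] = 'left'
        else:                         # not closest to either eye (S18: No)
            commands[dev] = 'off'
    return commands

# Example matching the case described above: left eye closest to line 140b,
# right eye closest to line 140c
devices = {'110a': '140a', '110b': '140b', '110c': '140c', '110d': '140d'}
print(assign_images(devices, left_id='140b', right_id='140c'))
```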
  • by selecting the projection devices to use according to the driver's line of sight in the process shown in FIG. 7, the image projection system 1 can project onto the display device an image that matches the position of the driver's eyeballs, and can thus display an image that is integrated with the surrounding scenery.
  • the control unit 50 projects the left-eye image from the projection device 110b, projects the right-eye image from the projection device 110c, and does not project images from the other projection devices.
  • the control unit 50 projects the left-eye image from the projection device 110a, projects the right-eye image from the projection device 110b, and does not project images from the other projection devices. Thereby, when the driver moves his head to the left in FIG. 8, a more appropriate image can be projected by switching the images to be projected.
  • when the left eye EL is closest to the projection light source line 140b and the right eye ER is closest to the projection light source line 140d, the control unit 50 projects the left-eye image from the projection device 110b, projects the right-eye image from the projection device 110d, and does not project images from the other projection devices.
  • the control unit 50 projects the left-eye image from the projection device 110d and does not project images from the other projection devices.
  • with reference to FIGS. 8 to 12, a case has been described in which the first projection unit 100 projects the images, but it is also possible for one projection device of the first projection unit 100 to project the image for one eye and for one projection device of the second projection unit 102 to project the image for the other eye.
  • images are projected by the projection devices 110a to 110d corresponding to the projection light source lines 140a to 140d closest to the left eye EL and the right eye ER, but the present invention is not limited thereto.
  • the image may be projected by any of the projection devices 110a to 110d corresponding to the projection light source lines 140a to 140d within a predetermined distance from the left eye EL and the right eye ER.
  • the image projection system 1 has a plurality of projection devices arranged in a straight line in each of the first projection unit 100 and the second projection unit 102, with the first straight line 122 and the second straight line 124 set at different angles on the first plane; this allows various combinations of projection devices to be used for projecting images depending on the position of the driver's eyes. By contrast, if projection devices were fixed in right-eye and left-eye pairs at several positions, an appropriate image could not be projected once the driver moved away from those paired positions.
  • by arranging the projection devices on straight lines without fixing whether each device is for the right eye or the left eye, the image projection system 1 can combine projection devices according to the position of the driver's head and thus project a more appropriate image. Note that it is preferable to arrange three or more projection devices in one projection unit.
  • by arranging the projection devices of the first projection unit 100 and the second projection unit 102 symmetrically about an axis of symmetry passing through the intersection 126, the projection unit 12 can project images with little variation at each position within its range.
  • FIG. 13 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • the projection unit 200 shown in FIG. 13 includes a first projection unit 100a and a second projection unit 102a.
  • the angle θ between the first straight line of the first projection unit 100a and the second straight line of the second projection unit 102a is larger than in the above embodiment. In this way, even when the first projection unit 100a and the second projection unit 102a intersect at a different angle, the effects of the above embodiment can be obtained. Furthermore, by increasing the angle, the range that can be projected by the projection unit 200 can be widened.
  • the angle θ can be set to a preferable value depending on the angle of view of the projection devices. For example, when the angle of view is about 57 degrees, it is preferable that the angle θ be 0 degrees or more and 70 degrees or less.
  • FIG. 14 is a schematic diagram showing another example of the arrangement of the projection devices of the image projection system.
  • the projection unit 300 shown in FIG. 14 includes a first projection unit 302, a second projection unit 304, and a third projection unit 306.
  • the first projection unit 302, the second projection unit 304, and the third projection unit 306 are arranged so that the straight line on which the projection devices are arranged forms a triangle. In other words, the end portion of each projection unit is placed in contact with another projection unit.
  • three or more projection units may be arranged. In this case as well, by arranging the straight lines on which the projection devices of the projection units lie at different angles on the first plane, the projection device to use can be selected from each projection unit according to the position of the driver's eyes. Note that even when three or more projection units are used, their straight lines may be arranged to intersect at a single intersection.
  • each functional unit, each means, each step, and the like may be added to other embodiments, or replaced with a functional unit, means, or step of another embodiment, so long as no logical contradiction arises, and in each embodiment a plurality of functional units, means, steps, and the like may be combined into one or divided. Further, each embodiment of the present disclosure need not be implemented faithfully as described; it may be implemented by combining features or omitting a part as appropriate.


Abstract

This image projection system projects a more appropriate image corresponding to the position of a viewer's head. The image projection system comprises, for example, a first projection unit having a plurality of projection devices arranged on a straight line; a second projection unit having a plurality of projection devices arranged on a straight line at an angle different from that of the first projection unit; a retroreflective screen arranged in a direction in which the first projection unit and the second projection unit project images; and a control device that determines at least two of the projection devices on the basis of the positions of the right eye and the left eye of an observer viewing the screen and causes the determined projection devices to project an image for the right eye and an image for the left eye, respectively.
PCT/JP2023/017208 2022-05-31 2023-05-02 Système de projection d'images WO2023233919A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-089128 2022-05-31
JP2022089128A JP2023176698A (ja) 2022-05-31 2022-05-31 画像投影システム

Publications (1)

Publication Number Publication Date
WO2023233919A1 true WO2023233919A1 (fr) 2023-12-07

Family

ID=89026275

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/017208 WO2023233919A1 (fr) 2022-05-31 2023-05-02 Système de projection d'images

Country Status (2)

Country Link
JP (1) JP2023176698A (fr)
WO (1) WO2023233919A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005199934A (ja) * 2004-01-16 2005-07-28 Honda Lock Mfg Co Ltd 車両の視界補助装置
JP2013171252A (ja) * 2012-02-22 2013-09-02 Keio Gijuku 情報提示装置
JP2014139592A (ja) * 2011-05-02 2014-07-31 Sharp Corp 投射型表示装置
JP2015012559A (ja) * 2013-07-02 2015-01-19 株式会社デンソー 投射型表示装置
JP2015232634A (ja) * 2014-06-10 2015-12-24 セイコーエプソン株式会社 表示装置


Also Published As

Publication number Publication date
JP2023176698A (ja) 2023-12-13


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23815680

Country of ref document: EP

Kind code of ref document: A1