WO2018100377A1 - Multi-dimensional display - Google Patents

Multi-dimensional display

Info

Publication number
WO2018100377A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
region
user
driver
content
Prior art date
Application number
PCT/GB2017/053610
Other languages
English (en)
Inventor
Daping Chu
Kun Li
Ali Ozgur YONTEM
Original Assignee
Cambridge Enterprise Limited
Priority date
Filing date
Publication date
Application filed by Cambridge Enterprise Limited
Publication of WO2018100377A1


Classifications

    • G PHYSICS
      • G02 OPTICS
        • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
            • G02B27/01 Head-up displays
              • G02B27/0101 … characterised by optical features
    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
              • A61B5/163 … by tracking eye movement, gaze, or pupil change
              • A61B5/18 … for vehicle drivers or machine operators
            • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
              • A61B5/7235 Details of waveform analysis
                • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
                  • A61B5/7267 … involving training the classification device
    • B PERFORMING OPERATIONS; TRANSPORTING
      • B60 VEHICLES IN GENERAL
        • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
          • B60K35/00 Arrangement of adaptations of instruments
          • B60K35/211; B60K35/23; B60K35/29; B60K35/65
          • B60K2360/182; B60K2360/334; B60K2360/741
    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                • G06F3/013 Eye tracking input arrangements
              • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
                • G06F3/0304 Detection arrangements using opto-electronic means
          • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
            • G06F2203/01 Indexing scheme relating to G06F3/01
              • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
        • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N20/00 Machine learning
          • G06N3/00 Computing arrangements based on biological models
            • G06N3/02 Neural networks
              • G06N3/08 Learning methods

Definitions

  • the present disclosure relates to a 3-D augmented reality display system.
  • the disclosure relates to an apparatus for creating and projecting 3-D augmented reality images onto the surfaces of a vehicle interior, such as the windscreen, dashboard and internal body panels.
  • aspects of the invention relate to an apparatus for generating multiple depth images across an extended area of a vehicle interior, including the windscreen.
  • Head-up displays are displays where images are projected onto a suitable surface, such as a windscreen. Such displays are well known in a number of different environments, including in vehicles.
  • aspects and embodiments of the invention provide an apparatus for the generation of multi-dimensional images on a head-up display.
  • an imaging system for generating multi-dimensional images on a screen of a head-up display, the imaging system comprising: a first picture generation unit (18) for generating a first multi-dimensional image to be rendered at a first portion (24) of the screen of the head-up display, said first portion located at a first location, said first location corresponding to the foveal vision in the line of sight of a first user (31); and a second picture generation unit (20) for generating a second, different, multi-dimensional image at a second portion (26) of the screen of the head-up display; wherein the first region of the display is configured to display a first content, and wherein the second region of the display is configured to display a second, different, content.
  • the first region displays content relating to the prevailing conditions of a vehicle. Therefore, a user of a vehicle in which the imaging system is implemented may be provided with important information at all times.
  • the second multi-dimensional image is located at a second location corresponding to the foveal region of a second user.
  • the first user is a driver of a vehicle in which the imaging system is implemented and the second user is a passenger of the vehicle.
  • the system provides information to multiple users in the vehicle.
  • the first picture generation unit is configured to display the first content in the first region at a high resolution.
  • the first picture generation unit is further configured to display a third, different, multi-dimensional image at a third portion of the head-up display, wherein the third image is located at the periphery of the first image.
  • the third region displays infotainment content.
  • the third region is presented at the periphery of the first region to display further content in a manner which minimises the distraction of the first user.
  • the first and/or second picture generation unit is configured to display a fourth, different, multi-dimensional image at a fourth portion of the head-up display.
  • the system is configured to selectively display content when it is determined that the foveal vision of the first user is at the fourth portion of the head-up display.
  • the system further comprises a sensor configured to track an eye of the first user.
  • the second content is multimedia content.
  • the system further comprises a sensor for determining a field of view of a first user. The system may be configured to selectively turn off the multimedia content in the event that the field of view of the first user corresponds to the second location.
  • the system can provide multimedia content to a second user, such as a passenger, in a safe manner which does not distract the first user (e.g. a driver of a vehicle).
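  • as an illustrative sketch only (not part of the original disclosure), the gaze-gated switching described above might look as follows; the angular region bounds and the eye-tracker interface are assumptions:

```python
# Hypothetical sketch of gaze-gated passenger content: multimedia in the
# second (passenger) region is switched off whenever the driver's tracked
# gaze falls inside that region. Bounds and sensor interface are assumptions.

def gaze_hits_region(yaw_deg, pitch_deg, region):
    """True if the gaze direction lands inside a display region, where the
    region is given as angular bounds (degrees) seen from the driver's eye."""
    return (region["yaw_min"] <= yaw_deg <= region["yaw_max"]
            and region["pitch_min"] <= pitch_deg <= region["pitch_max"])

def update_passenger_region(driver_gaze, passenger_display):
    yaw, pitch = driver_gaze                      # from an eye-tracking sensor
    passenger_region = {"yaw_min": 10.0, "yaw_max": 30.0,
                        "pitch_min": -5.0, "pitch_max": 10.0}  # illustrative
    if gaze_hits_region(yaw, pitch, passenger_region):
        passenger_display["multimedia_on"] = False  # driver looking: turn off
    else:
        passenger_display["multimedia_on"] = True

display = {"multimedia_on": True}
update_passenger_region((20.0, 2.0), display)
print(display)  # {'multimedia_on': False}: content suppressed
```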
  • An advantage of the present invention is that it provides distraction-free visual feedback to users, such as car drivers, giving convenient access to both essential and peripheral information.
  • the present invention can be configured to show essential information in the field of view of the user (such as that relating to the conditions of a vehicle) and further ancillary information at the user's periphery.
  • the invention provides an immersive experience in a confined space.
  • it can utilise the maximum possible display area/space (2D/3D) in all regions, including the windscreen, side windows, dashboard and internal body panels of a vehicle for an augmented reality (AR) display for both direct and peripheral vision.
  • 2D/3D: two-dimensional/three-dimensional (the maximum possible display area/space)
  • AR: augmented reality
  • Figure 1 is a schematic representation of a vehicle and a HUD according to an embodiment of the invention
  • Figure 2 is a schematic representation of a vehicle in accordance with an embodiment of the invention, with a driver, a centre region of a HUD and a side region of the HUD highlighted;
  • Figure 3 is a schematic representation of a vehicle in accordance with an embodiment of the invention with a passenger and a passenger region of a HUD of the vehicle highlighted;
  • Figure 4 is a schematic representation of a vehicle in accordance with an embodiment of the invention with a driver extended region highlighted;
  • Figure 5 is a schematic representation of a vehicle in accordance with an embodiment of the invention with a driver peripheral region of a HUD of the vehicle highlighted.
  • Figure 6 is a schematic representation of a vehicle in accordance with an embodiment of the invention.
  • Figure 7 is a schematic representation of a vehicle in accordance with an embodiment of the invention.
  • Figure 8 is a schematic representation of a vehicle in accordance with an embodiment of the invention.
  • Figure 9 is a schematic representation of a vehicle in accordance with an embodiment of the invention.
  • Figure 10 is a schematic representation of a vehicle in accordance with an embodiment of the invention.
  • Figure 11 is a schematic representation of a vehicle in accordance with an embodiment of the invention.
  • Figure 12 is a schematic representation of a vehicle in accordance with an embodiment of the invention.
  • the apparatus and the HUD are installed in a vehicle, such as a motor vehicle. Whilst the following description refers to a motor vehicle, the disclosure and concepts described herein are applicable to other forms of HUD, for example those installed on other forms of vehicles, such as trains, buses, aeroplanes and boats.
  • the present invention provides a driver-centric and immersive display.
  • the driver is the key person to whom information in the HUD must be displayed, and the display must react according to the driver's needs.
  • all surfaces of a vehicle interior, including the windscreen, dashboard and internal body panels are utilized to deliver information to the driver and passengers. As it is important for drivers to have their eyes always on the road, essential information is preferentially placed in the line of sight of the driver.
  • Such information may comprise, for example, information about changing road conditions, the car's condition, or navigational instructions.
  • Peripheral information, such as that related to infotainment, is also provided when it is needed, and such information is presented at other regions of the vehicle interior.
  • human vision perception is also taken into account. Physiologically, the human eye will perceive information differently according to the location of the information. For example, foveal vision through the line of sight can process 3-4 high-quality colour images per second, while peripheral vision is less accurate and is not sensitive to colour but can process image information at 90 images per second.
  • Figure 1 is a schematic representation of a vehicle and a HUD according to an embodiment of the invention.
  • Figure 1 is described from the reference point of a driver and passenger inside a vehicle 10.
  • there is shown the windscreen 12, housing 14 (said housing 14 referring to the outer shell which covers features such as the centre console, air vents etc.), steering wheel 16, first image projection means 18, second image projection means 20, third image projection means 22, driver side window 34, passenger side window 36, and sensor 37.
  • the image projecting means 18, 20, 22 are integrated into the housing 14. In further embodiments the image projecting means 18, 20, 22 are placed at other locations within the vehicle 10.
  • the image projecting means generate the image to be rendered as a virtual image reflected by the windscreen 12, the driver side window 34, or the passenger side window 36.
  • the image projecting means 18, 20, 22 are configured to generate different content for each region of the HUD depending on the vehicle situation.
  • Whilst Figure 1 shows three image projecting means 18, 20, 22, in further embodiments the number of image projecting means may vary.
  • the image projecting means 18, 20, 22 are known and commercially available.
  • the image projecting means 18, 20, 22 typically comprise a picture generation unit (to generate the image to be displayed), and a diffuser.
  • the image generated by the picture generating unit is projected onto the diffuser, which subsequently renders the image as a real image.
  • the image is subsequently projected onto the screen of the HUD as a virtual image via a set of focussing optics.
  • the sensor 37 is configured to sense external stimuli, such as ambient light. In further embodiments multiple sensors 37 are present and can sense information regarding the driver and/or passengers. Such information may be physiological information relating to driver alertness, stress etc.
  • the sensor 37 is an eye-tracking sensor configured to track the eyes of the driver.
  • the windscreen 12 is partitioned into several regions, wherein each region is configured to display different content.
  • the system may be configured to display different content on each region in dependence on who (i.e. a driver or a passenger) is viewing the particular region.
  • the described configuration provides a fully immersive HUD for one or more users of the vehicle 10.
  • the configuration allows for a greatly extended display when compared to prior art systems, which typically have a much smaller field of view.
  • the present configuration allows for a fully immersive HUD as it allows for the entire windscreen 12 to act as the HUD.
  • the configuration provides further advantages as the information displayed on the HUD can be varied based on the ability of the viewer to assimilate the information presented. It is known that foveal vision through the line of sight can process three to four high-quality colour images per second, while peripheral vision is less accurate and is not sensitive to colour but can process image information at a much higher rate, approximately 90 images per second. Accordingly, through the use of multiple image projecting means placed at various locations across the windscreen, information can be presented at different regions of the HUD in accordance with the ability of the viewer to assimilate the information.
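  • a minimal sketch (our illustration, with invented profile values chosen to echo the rates quoted above) of matching content parameters to foveal versus peripheral perception:

```python
# Foveal regions: high resolution, colour, modest update rate; peripheral
# regions: low resolution, reduced colour, high update rate. All numbers
# are assumptions for illustration, not values from the patent.

REGION_PROFILES = {
    "foveal":     {"resolution": (1024, 768), "colour": True,  "fps": 4},
    "peripheral": {"resolution": (256, 192),  "colour": False, "fps": 90},
}

def profile_for(eccentricity_deg, foveal_limit_deg=5.0):
    """Pick a render profile from the angular distance to the line of sight."""
    key = "foveal" if eccentricity_deg <= foveal_limit_deg else "peripheral"
    return REGION_PROFILES[key]

print(profile_for(2.0))   # centre region: foveal profile
print(profile_for(40.0))  # far edge of windscreen: peripheral profile
```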
  • Figure 2 is a schematic representation of a vehicle with the driver, centre region and side region of the HUD highlighted.
  • the windscreen 12 comprises a centre region 24, the centre region 24 being positioned in the foveal vision of the driver. As such the centre region 24 is located above the steering wheel 16 of the vehicle 10. Surrounding the centre region 24 is a side region 26. Whilst in Figure 2 the centre region 24 and side region 26 are quadrilateral in shape, in further embodiments they may be of any suitable shape.
  • the centre region 24 is located in front of the driver's seat above the steering wheel 16, and is therefore in the foveal vision through the line of sight of the driver when the driver is in a neutral position.
  • the centre region 24 may display any essential information to the driver and may always be operative to aid the driver.
  • the centre region 24 can vary in size, and it may have a field of view which is larger than 6°x4°.
  • the field of view of the centre region 24 may be smaller in size than the side region 26.
  • the content may be displayed in the centre region 24 in high definition, for example at an angular resolution of approximately 1 arc-min.
  • the centre region 24 of the HUD has a pixel count of 1024x768 and occupies a field of view of approximately 16.9°x12.7° when displayed 2m away from the driver.
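  • as a worked check (our own arithmetic, not text from the patent), the quoted figures are mutually consistent: 1024x768 pixels at roughly 1 arc-min per pixel subtend about 17.1°x12.8°, close to the stated ~16.9°x12.7°, and at 2m that corresponds to an image roughly 0.6m wide:

```python
import math

def fov_deg(pixels, arcmin_per_pixel=1.0):
    """Angular extent of a row/column of pixels (60 arc-minutes per degree)."""
    return pixels * arcmin_per_pixel / 60.0

def physical_size_m(fov_degrees, distance_m):
    """Physical extent subtending `fov_degrees` at the given viewing distance."""
    return 2.0 * distance_m * math.tan(math.radians(fov_degrees) / 2.0)

h_fov, v_fov = fov_deg(1024), fov_deg(768)        # 17.07 x 12.80 degrees
width_m = physical_size_m(h_fov, distance_m=2.0)  # ~0.60 m
print(f"{h_fov:.1f} x {v_fov:.1f} deg; width {width_m:.2f} m at 2 m")
```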
  • the centre region 24 may display in full colour with the information displayed at variable perceived depths.
  • the resolution of the centre region 24 may be different, for example Ultra High Definition or Standard Definition.
  • the displayed information may include information regarding prevailing conditions of the vehicle, such as the speed of the vehicle or the selected gear; road conditions, such as the speed limit or lane markings; and navigational directions.
  • it may also display information such as the driver's condition (e.g. tiredness, as measured by sensors in the vehicle).
  • the content and amount of the displayed information may change according to a speed, a road condition, etc.
  • the generation of such data occurs in a known manner.
  • the side region 26 is located around the centre region 24 in front of the driver. As such, the side region 26 defines an extended region of the HUD.
  • the side region 26 displays further information to the driver.
  • the information displayed in the side region 26 comprises non-essential information such as infotainment.
  • the side region 26 may comprise a field of view which is larger than 12°x6° in order to provide an improved immersive experience.
  • the side region 26 may be smaller than the peripheral region (see below).
  • the side region 26 displays the information at a high definition (an angular resolution of approximately 1 arc-min). This may be to accommodate the foveal resolution when an eye of the driver moves to the side region.
  • the side region 26 may also display full colour at a fixed depth equal to or further than the windscreen location; preferably, the perceived image distance is greater than 2m from an eye of the driver.
  • the information presented in the side region 26 can relate to infotainment including phone call notifications, media, etc.
  • the side region 26 may also be used as an extension display beyond the centre region 24 for road or navigational information.
  • the driver may be able to interact with the side region 26 display using known interaction techniques such as gestures, eye tracking and voice control, as well as conventional buttons and dials.
  • the side region 26 may be suitable for use by the driver only. Accordingly, it may have a limited size, or eye-box. In further embodiments, the size of the side region 26 can be enlarged to enable a passenger to view the information presented.
  • the eye-box is the maximum viewing window available to the driver or passenger to view the information presented on a display.
  • the eye-box is not a region defined on the windscreen (or indeed any other display surface); rather, it is on or close to the viewing plane where the user's eyes are positioned. If a user's line of sight corresponds to the eye-box, they can see an image on the directional display region.
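  • the eye-box test can be sketched geometrically (a minimal illustration with assumed cabin coordinates and box dimensions, not values from the patent):

```python
from dataclasses import dataclass

@dataclass
class EyeBox:
    centre: tuple   # (x, y) on the viewing plane, metres, cabin coordinates
    width: float    # lateral extent in metres
    height: float   # vertical extent in metres

    def contains(self, eye_pos):
        """True if the user's eye position lies inside the eye-box, i.e. the
        corresponding directional display region is visible to that user."""
        dx = abs(eye_pos[0] - self.centre[0])
        dy = abs(eye_pos[1] - self.centre[1])
        return dx <= self.width / 2 and dy <= self.height / 2

driver_box = EyeBox(centre=(0.40, 1.20), width=0.20, height=0.10)
print(driver_box.contains((0.45, 1.22)))  # True: image visible
print(driver_box.contains((0.90, 1.22)))  # False: outside the eye-box
```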
  • Figure 3 is a schematic representation of a vehicle with a passenger 27 and a passenger region 28 of the HUD highlighted.
  • in Figure 3 there is shown the apparatus of Figure 1, with the addition of a passenger 27, and wherein a passenger region 28 is highlighted on the windscreen 12.
  • the passenger region 28 is situated in the line-of-sight of the passenger 27 and therefore is directly in front of where the passenger 27 would be seated in the vehicle 10.
  • the eye-box of the passenger region 28 is shown as a quadrilateral shape for ease of understanding; as discussed above, the eye-box is a maximum viewing window on or close to the viewing plane where the users' eyes are positioned, and need not be quadrilateral.
  • the passenger region 28 is located on the passenger side of the windscreen 12.
  • the passenger region 28 displays information, such as infotainment or entertainment content, for the passenger only.
  • the passenger region 28 is configured such that the driver is unable to view the content displayed therein; alternatively, if it is determined that the passenger region 28 is in the foveal vision of the driver, the passenger region 28 may cease to present such content (as described with reference to Figure 4).
  • the passenger region 28 has a field of view which is larger than 6°x4°.
  • the field of view may be smaller than the driver peripheral region 32.
  • the size of the field of view of the passenger region 28 may be smaller than 6°x4°.
  • the passenger region 28 may display images as well as videos in HD at variable depths.
  • the content shown in the passenger region 28 may be chosen by the passenger 27.
  • the content may be loaded from an external device either via wired means, such as HDMI, DP and USB port connections, or wirelessly, such as screen mirroring over Wi-Fi Direct, Wi-Fi, Bluetooth, ZigBee or WLAN. Such techniques are known in the art.
  • Figure 4 is a schematic representation of a vehicle with the driver extended region of the HUD highlighted. Specifically, Figure 4 shows the apparatus of Figure 1 with a driver 31 and a driver extended region 30 highlighted on the windscreen 12.
  • the driver extended region 30 represents a region of the windscreen 12 at which the driver is likely to look for an extended period of time whilst driving.
  • a driver will typically spend most of their time focussing on the centre region 24 and side region 26.
  • the driver will typically look briefly away from the centre region 24 and side region 26, for example to look at the mirrors, and look at the driver extended region 30 when turning the vehicle towards the driver extended region 30 (in the example in Figure 4, turning the vehicle to the left).
  • the driver extended region 30 extends from the edge of the side region 26 to a far end of the windscreen 12, said far end being opposite the end of the windscreen 12 proximal to the steering wheel 16.
  • the driver extended region 30 displays essential information to the driver.
  • the content in the driver extended region 30 may be only viewable by the driver.
  • the field of view of the driver extended region 30 may vary. For example, it may be larger than 6°x4°, and/or may be smaller than the peripheral region 32.
  • the driver extended region 30 only displays information to the driver when required.
  • the driver extended region 30 will actively show information to the driver when the driver is looking towards the driver extended region 30, i.e. when the driver's foveal vision overlaps the driver extended region 30.
  • the driver extended region 30 is selectively turned on when the vehicle is making a turn towards the far end of the windscreen 12 (determined using steering wheel yaw data or eye/head tracking data).
  • the driver should look toward the driver extended region 30 because the physical road is behind it.
  • the physical movement and emotional changes of the driver, including head position and eye direction as well as body-part gestures, can be monitored to assist decision-making for visual feedback.
  • eye tracking sensors are used to determine when the driver is looking at the driver extended region 30, and when it is determined that the driver is looking at the region 30, the driver extended region 30 is selectively turned on.
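  • a hedged sketch of this activation logic (the threshold value and sensor interfaces are illustrative assumptions, not values from the patent):

```python
YAW_THRESHOLD_DEG = 15.0  # assumed steering angle that counts as "making a turn"

def extended_region_active(steering_yaw_deg, gaze_in_extended_region):
    """Turn the driver extended region on when the vehicle is turning towards
    it (negative yaw = left turn, as in the Figure 4 example) or when eye
    tracking reports the driver's foveal vision overlapping the region."""
    turning_towards_region = steering_yaw_deg <= -YAW_THRESHOLD_DEG
    return turning_towards_region or gaze_in_extended_region

print(extended_region_active(-20.0, False))  # True: turning left
print(extended_region_active(0.0, True))     # True: driver looking at region
print(extended_region_active(0.0, False))    # False: region stays off
```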
  • information from the sensors 37 indicative of tracked physical and physiological state of the driver is stored and processed using an artificial intelligence unit/system in order to train the system.
  • the artificial intelligence unit/system may be implemented using neural networks and/or machine learning/deep learning algorithms.
  • the artificial intelligence unit/system is configured to process the information in such a manner as to build up a profile of the behavioural characteristics of the driver in response to various conditions. This knowledge of the driver's behaviour can include how the driver reacts to certain situations, and can thus be used to take pre-emptive actions.
  • the system is optionally able to identify the driver using the artificial intelligence unit/system.
  • the system can identify a driver, identify if a potentially hazardous situation is likely to occur (for example using data from one or more sensors 37), predict how the driver will respond to the potentially hazardous situation, and intervene/assist the driver in responding to the potentially hazardous situation depending on the predicted driver response. This reduces the system delay and further helps the driver by intervening/assisting in hazardous situations, potentially even before they happen, as a preventive measure.
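  • purely as an illustration of the kind of learning described above (the feature layout, labels and data are invented; the patent does not prescribe a particular model), a neural-network classifier could be trained on logged sensor features to predict the driver's likely response:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Each row: [gaze_yaw_deg, heart_rate_bpm, steering_rate_deg_s, speed_kmh];
# labels describe how the driver responded to past hazards (illustrative).
X = np.array([[ 0.0,  70,  0.5, 50],
              [25.0,  95,  8.0, 80],
              [ 5.0,  72,  1.0, 60],
              [30.0, 110, 12.0, 90]])
y = np.array(["brakes_in_time", "reacts_late",
              "brakes_in_time", "reacts_late"])

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)

# Predict the identified driver's response to an emerging hazard and
# escalate assistance pre-emptively if a late reaction is predicted.
if model.predict([[28.0, 100, 9.0, 85]])[0] == "reacts_late":
    print("pre-emptive warning / assistance triggered")
```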
  • the system processes inputs from further sensors/cameras/internet/GPS of the vehicle, in order to detect the physical world outside the vehicle (and preferably detect/identify certain features in the physical outside world) and match the displayed content.
  • the system can selectively display content for the driver to, for example, warn of a particular hazard, identify places previously visited, remind the driver of a particular location/hazard, and suggest places of interest.
  • the system can provide improved safety and improved driver awareness of the location.
  • the driver extended region 30 may be configured to display in high definition (an angular resolution of approximately 1 arc-min) to accommodate the foveal resolution when the driver's eyes are turned towards this region.
  • the displayed information may be of full colour at a fixed or variable depth.
  • the image distance may be greater than 2m away from the driver's eye.
  • Figure 5 is a schematic representation of a vehicle with a driver peripheral region 32 of the HUD highlighted. Specifically, Figure 5 shows the apparatus of Figure 1 and the driver peripheral region 32 is highlighted on the windscreen 12. There is also shown the driver side window 34, and the passenger side window 36.
  • surrounding the side region 26 is the driver peripheral region 32.
  • the driver peripheral region 32 may exclude the centre region 24 and preferably the side region 26 as shown in Figure 2.
  • the driver peripheral region 32 extends across a large portion of the windscreen 12, and in further embodiments encompasses the entirety of the windscreen 12 as well as a portion of the driver side window 34 and passenger side window 36.
  • the driver peripheral region 32 may extend across the entirety of the windscreen 12 but exclude the centre region 24 and the side region 26.
  • the driver peripheral region 32 may display further content to the driver such as warning signs for the driver and passenger if there is a potential hazard.
  • the driver peripheral region 32 preferably displays content such as warnings, or similar, which is visible to both the driver and the passenger. As the information is displayed in the driver's peripheral, it can be assimilated by the driver and reacted upon.
  • as the driver peripheral region 32 is in the driver's peripheral vision, it is not necessary to display a high-quality colour image and/or image depth, due to the limitations of the peripheral regions of the eye. Therefore, the content may be displayed at a lower resolution in the driver peripheral region 32. Due to the lower resolution of the peripheral vision of the human eye, the driver peripheral region 32 may utilise simple, intuitive stimulations to notify the driver of a potential hazard. These may include changing colours, intensities, etc. (see the sketch after the next item).
  • the hazards can include the change of traffic conditions such as pedestrian crossing, etc. or a blind spot warning. Systems for the identification of such hazards are known in the art and not described further.
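  • one way to realise such a simple peripheral stimulus (the pulse rate and colours are assumptions, not values from the patent) is a low-detail pulsing cue:

```python
import math

def hazard_pulse(t_seconds, pulse_hz=2.0):
    """Return (intensity 0..1, rgb colour) for a peripheral warning at time t.
    Peripheral vision picks up the flicker easily without a detailed image."""
    intensity = 0.5 * (1.0 + math.sin(2.0 * math.pi * pulse_hz * t_seconds))
    colour = (255, 80, 0) if intensity > 0.5 else (255, 160, 0)  # amber pulse
    return intensity, colour

for t in (0.0, 0.125, 0.25):
    print(hazard_pulse(t))
```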
  • the driver peripheral region 32 extends across the windscreen 12 and is larger than the side region 26. Both the driver and passenger are able to see the displayed content. As the content is visible to both the driver and passenger, if the driver misses the warning sign and does not react, the passenger can remind them. As the driver peripheral region 32 overlaps with the passenger region 28, when information is presented in the driver peripheral region 32 in a region which overlaps the passenger region 28, the passenger region 28 is temporarily disabled, with the driver peripheral region 32 taking priority. As the HUD extends across the windscreen 12, and in some embodiments beyond, as shown in Figures 1 to 4, the apparatus comprises multiple projection apparatus, with each projection apparatus configured to cover a portion of the windscreen.
  • the picture generation units are configured such that the images they produce are blended so that, in the regions where the picture generation units overlap, there is no discontinuity apparent to the end users.
  • two picture generation units are used each producing real images, which are displayed across two diffusers and reflected on the HUD to provide a seamless virtual image.
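  • a minimal numerical sketch of such edge blending (the overlap width and image sizes are illustrative; the patent does not specify the blending function):

```python
import numpy as np

def blend_overlap(left_img, right_img, overlap_px):
    """Crossfade two HxWx3 float images that share `overlap_px` columns.
    A linear alpha ramp keeps combined luminance constant across the seam."""
    alpha = np.linspace(1.0, 0.0, overlap_px)       # weight for the left image
    blended = (left_img[:, -overlap_px:] * alpha[None, :, None]
               + right_img[:, :overlap_px] * (1.0 - alpha)[None, :, None])
    return np.hstack([left_img[:, :-overlap_px], blended,
                      right_img[:, overlap_px:]])

left = np.full((480, 640, 3), 0.8)   # image from picture generation unit 1
right = np.full((480, 640, 3), 0.8)  # image from picture generation unit 2
seamless = blend_overlap(left, right, overlap_px=64)
print(seamless.shape)  # (480, 1216, 3): one continuous image, no visible seam
```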
  • the allocation of the regions may change in accordance with the usage scenarios. Whilst the regions in Figures 1 to 4 are shown as rectangular boxes their size and shape may change according to the usage scenario.
  • Figures 1 to 4 relate to driving scenarios where the driver is sitting on the right-hand side of the vehicle 10, so the centre region 24 and the side region 26 are allocated on the right-hand side of the windscreen 12 and the passenger region 28 is allocated on the left-hand side of the windscreen.
  • in other embodiments the driver sits on the left-hand side and, accordingly, the locations of the centre/side and passenger regions 24, 26, 28 may be swapped.
  • both people sitting in the front may be drivers of the vehicle 10, such as in a learning-to-drive scenario.
  • the centre and side regions 24, 26 are positioned in front of both sides of the windscreen 12 for both an instructor and learner driver to be presented with the same information.
  • both people sitting in the front may be passengers, such as in the case of an autonomous vehicle.
  • passenger regions with infotainment may be displayed in front of both of the front passengers.
  • the content can be the same for both passengers to share, or they can be selected by each passenger according to their own interests.
  • Side and peripheral regions may be merged together and displayed as an extension of the passenger region.
  • the vehicle may comprise a single front seat only.
  • the centre region and the side/peripheral regions may be displayed for the driver. If the single front seat vehicle is autonomous, then the passenger region and side/peripheral region may be displayed for the passenger infotainment.
  • where the vehicle may be operated in both a manual and an autonomous mode, wherein a driver may switch between such modes (with a foldable steering wheel, for example), the centre region and passenger region of the HUD can be merged into one centre/passenger region, with content suitable for either the driver or the passenger depending on the current mode of operation.
  • the vehicle and system comprise an ambient sensor, and the picture generation units are configured to adjust the brightness of the images dependent on the measured ambient light.
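  • a hedged sketch of such ambient-dependent brightness control (the log-style curve and its breakpoints are our assumptions, not values given in the patent):

```python
import math

def brightness_for_lux(ambient_lux, min_level=0.15, max_level=1.0):
    """Map an ambient light reading to a display brightness fraction, on a
    simple log scale between night (~1 lux) and direct sun (~100000 lux)."""
    span = math.log10(100_000)                      # = 5 decades
    frac = math.log10(max(ambient_lux, 1.0)) / span
    frac = min(max(frac, 0.0), 1.0)
    return min_level + (max_level - min_level) * frac

print(f"{brightness_for_lux(5):.2f}")       # dim cabin at night  -> ~0.27
print(f"{brightness_for_lux(50_000):.2f}")  # bright daylight     -> ~0.95
```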
  • Figure 6 shows further embodiments of the invention. The embodiments described with reference to Figure 6 function in the same manner as described above with reference to Figures 1 to 5. The reference numerals in these figures refer to the same features as identified above.
  • Figure 6 shows an embodiment where the apparatus creates a fully immersive HUD for one or more users of the vehicle 10.
  • there is shown a dashboard display 40 comprising a first dashboard region 42, a second dashboard region 44 and a shared dashboard region 46.
  • the displays are not limited to the depicted drawings; these are representative examples, and the displays can be further extended (such as onto the middle console) or can be of any suitable shape.
  • whilst Figure 6 shows the dashboard display 40, pillar displays 48, side mirror displays 50, side door displays 52 and roof display 54, the skilled person would realise that not all of these displays need be used or present in the vehicle 10 and that any suitable combination of displays is used in different embodiments (for example a vehicle may comprise solely the dashboard display 40, or a combination of the pillar displays 48 and side mirror displays 50 etc.).
  • in further embodiments the concept is extended across the entirety of the interior of the vehicle cabin.
  • displays may be provided on any pillar (for example a central or rear pillar) and on any door and/or window (e.g. a rear passenger door).
  • the rear windscreen may have a display.
  • each of the regions within the vehicle 10 act as a surface on which information can be projected, in the above described manner.
  • the dashboard display 40 is utilised to form flexible or rigid (including curved) dashboard displays.
  • the dashboard display 40 can be divided into a first dashboard region 42 for the driver, a second dashboard region 44 for the passenger and a third dashboard region 46 in which both the driver and passenger can view the content, in the manner described above.
  • in an autonomous context, the first dashboard region 42 can be for a first passenger, the second dashboard region 44 for a second passenger and the third dashboard region 46 for both front passengers.
  • the dashboard display 40 can display objects (as augmented reality) which would otherwise be blocked by the front car body.
  • the vehicle 10 comprises, in a known manner, front and rear cameras and the images captured by these cameras are selectively displayed on the dashboard display 40.
  • the dashboard display 40 acts as a transparent dashboard.
  • the images are selectively rendered, for example upon driver request, when difficult road conditions are identified, or during a parking manoeuvre (for example identified by the selection of the reverse gear).
  • the dashboard display 40 can also display interactive content for both front passengers, either work-related or for infotainment purposes, when installed in an autonomous vehicle and when said vehicle is in self-driving mode.
  • the side door displays 52 may be flexible or rigid displays.
  • the side door displays 52 may also display objects (augmented reality) behind the doors.
  • the side door displays 52 can act like a transparent/invisible door by displaying information to the driver/passenger regarding the environment outside the door. For example, this information can be used to inform the users of the vehicle 10 whether there is sufficient space, or whether there is a potential hazard, before opening the door.
  • the vehicle roof display 54 can be a flexible or rigid display. Again, using cameras associated with the vehicle 10, it can display augmented objects that are blocked by the vehicle roof, acting as a transparent roof. It acts as an extension of the front windscreen display 12, showing things that are not displayed fully by the front windscreen display, for example building names on skyscrapers.
  • the vehicle roof display 54 can also display interactive content for passengers, either work-related or for infotainment purposes, during self-driving mode.
  • the vehicle pillar display 48 can be a flexible or rigid display. It can display augmented information that is blocked by the pillar, acting as a transparent pillar. It acts as an extension of the front windscreen display 12, showing things that are not displayed fully by the front windscreen display, for example incoming cars from a side road or a roundabout. It can also display interactive content for passengers, either work-related or for infotainment purposes, during self-driving mode.
  • the side mirrors 50 and side windows 34, 36 can act as part of the extended display for the windscreen 12.
  • an aspect of the invention is that the content rendered on the displays varies in resolution according to the user's line of sight.
  • content is displayed on a display (be it the windscreen 12, dashboard display 40, side door display 52 etc.) with a high-resolution region such that the content is seen at the user's foveal region on the retina at the higher resolution.
  • the surrounding display region(s) display low-resolution images aimed at the user's peripheral vision.
  • the pillar displays 48, side mirror displays 50, side door displays 52 and roof displays 54 always display content at a lower resolution, as it is assumed that the user is looking forward and that such displays will be in the peripheral regions of the user's vision.
  • the vehicle 10 comprises known eye tracking means configured to determine where the user is looking and to adjust the resolutions of the displays accordingly.
  • the displays may be extended and interconnected to form a seamless multi-dimensional display.
  • Such a display can provide users with a seamless and fully immersive experience.
  • a further aspect of the invention is that in autonomous vehicles the standard paradigm for the vehicle seating arrangement can change. Accordingly, in some vehicles passengers can face any direction they want and accordingly view content on different displays. As the resolution of the displays can be varied depending on where the passenger is looking (as determined by known eye tracking means), the system may determine the resolution based on the position of the eyes of the observers. This can result in power savings, as less power is required to render lower-resolution images. Thus, based on the observed direction, the resolution of the foveal region(s) can be increased while the peripheral region(s) are left at lower resolution.
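  • an illustrative allocator for this gaze-dependent resolution scheme (the display directions, interfaces and tolerance are assumptions; angle wrap-around is ignored for brevity):

```python
DISPLAYS = {
    "windscreen": 0,    # nominal viewing direction of each surface, degrees
    "roof": 90,
    "side_door": 180,
}

def allocate_resolution(gaze_directions_deg, tolerance_deg=30):
    """Give high resolution only to displays some observer is facing;
    leave the rest at a low-resolution, low-power setting."""
    settings = {}
    for name, direction in DISPLAYS.items():
        foveated = any(abs(g - direction) <= tolerance_deg
                       for g in gaze_directions_deg)
        settings[name] = "high_res" if foveated else "low_res_low_power"
    return settings

# Two passengers, one facing the windscreen and one facing the side door:
print(allocate_resolution([5, 175]))
# {'windscreen': 'high_res', 'roof': 'low_res_low_power', 'side_door': 'high_res'}
```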
  • the system can be interfaced with user control methods such as eye/head tracking, voice control, lip reading, hand gestures, touch screens and conventional button presses to allow the user to vary and control the content displayed.
  • known machine learning algorithms can be embedded in the system in order to improve the system accuracy.
  • the content displayed on the dashboard display 40 may be personalised according to a user's preference.
  • the dashboard display 40 may display a customised set of icons/lighting/images etc.
  • each user of the vehicle is identified (for example via known login means, or via a device, such as a mobile telephone or tablet computer, which is associated with the user) and their customised display is displayed on the dashboard display 40.
  • where the vehicle 10 is an autonomous vehicle, rental vehicle etc. with multiple users, each user's preferences are easily identified and displayed on the dashboard display 40.
  • the regions described above are directional display regions. They refer to the areas/displays where the corresponding images are reflected and presented to one or more users. Where a user is looking will define an eye-box, which is defined as the maximum viewing window/surface available to the user based on their direction of view.
  • Figure 7 is a schematic representation of a seating arrangement in a vehicle and the display of content.
  • the representation in Figure 7 may apply to both a user driven and autonomous driving scenario.
  • in Figure 7 there are shown seats 60, 62, 64 and 66.
  • in one scenario the driver is seated in seat 62.
  • in another scenario the vehicle is autonomous.
  • Figure 8 is a further schematic representation of a seating arrangement in a vehicle and the display of content.
  • in Figure 8 there is shown a different seating arrangement in an autonomous vehicle context.
  • there are shown seats 60, 62, 64 and 66, which are arranged in a non-canonical manner.
  • there are shown, for a user sat in seat 60, the foveal field of view 68; for a user sat in seat 62, the foveal field of view 70; for a user sat in seat 64, the foveal field of view 74; and for a user sat in seat 66, the foveal field of view 76.
  • all the passengers in the vehicle can view a common display zone for presentations and discussion, as in an office, by rearranging the seating positions accordingly. It is also possible for different displays to show different content (e.g. the roof display 54 may show different content to the windscreen 12).
  • Figures 9, 10 and 11 show further embodiments in which the seat arrangement is varied, causing changes in where each user's foveal line of sight region will occur.
  • Figures 9 and 10 show embodiments in which windows and non-transparent portions of the interior are utilised as the main focus of the users.
  • the pillar display 48 is within the foveal line of sight.
  • the side door display 52 is also within the foveal line of sight of the users of the vehicle. For ease of understanding not all foveal regions are shown.
  • in Figure 11 there is shown an embodiment where none of the users' foveal regions overlap.
  • each user may watch the same or different content.
  • One or more of the passengers can face inwards (towards the inside of the vehicle) to view content on an inner surface.
  • Figure 12 shows a further embodiment in which the display is connected to an external device such as a mobile telephone, tablet computer, laptop etc.
  • in Figure 12 there is shown a schematic representation of the interior of a vehicle as described with reference to Figure 1, where the reference numerals correspond to the same features.
  • there is also shown a mobile telephone 80 having a display showing a first content 82 and a second content 84.
  • the windscreen 12 mirrors the first content 82 and the dashboard display 40 mirrors the second content 84. Specifically, Figure 12 shows the first content 82 displayed on a first portion of the windscreen 12 and the second content 84 displayed on a first portion of the dashboard display 40.
  • the mobile device 80 communicates with the displays via a wired or wireless connection in a known manner.
  • Figure 12 is shown as a representative example and as such the mirroring of content from an external device can occur in a number of different manners.
  • the system may allow multiple portions of the windows and interior to be used for mirroring multiple devices.
  • in a driven vehicle, i.e. when the vehicle is controlled by the driver, the display of content from a mobile device 80 can only occur if the content does not distract the driver.
  • the displayed images may be determined automatically by software and displayed at relevant locations. It is also possible to make the arrangement customisable, so that users (including passengers) may determine how to arrange the content.
  • the entirety of the car interior and windows can be used as a shared media for displaying different content at the same time.
  • the connection of the mobile device 80 with the vehicle 10 allows for the identification of the user (via an association with their mobile device 80).
  • one or more user preferences (either stored on the device, or in a memory of the vehicle 10) for the dashboard display 40 (or any other part of the immersive display, side windows, pillars, roof, back windscreen etc.) are identified and displayed.
  • where the content of the mobile telephone 80 is rendered in the foveal region, it is rendered at a higher resolution.
  • the invention provides an immersive HUD which can extend across a large region such as a windscreen and passenger and driver side windows.

Abstract

An imaging system for generating multi-dimensional images on a screen (12) of a head-up display, the imaging system comprising: a first picture generation unit (18) for generating a first multi-dimensional image to be rendered at a first portion (24) of the screen of the head-up display, said first portion located at a first location, said first location corresponding to the foveal vision in the line of sight of a first user (31); and a second picture generation unit (20) for generating a second, different, multi-dimensional image at a second portion (26) of the screen of the head-up display; wherein the first region of the display is configured to display a first content, and wherein the second region of the display is configured to display a second, different, content.
PCT/GB2017/053610 2016-11-30 2017-11-30 Multi-dimensional display WO2018100377A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1620351.5 2016-11-30
GBGB1620351.5A GB201620351D0 (en) 2016-11-30 2016-11-30 Multi-dimensional display

Publications (1)

Publication Number Publication Date
WO2018100377A1 (fr) 2018-06-07

Family

ID=58073492

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2017/053610 WO2018100377A1 (fr) 2016-11-30 2017-11-30 Multi-dimensional display

Country Status (2)

Country Link
GB (1) GB201620351D0 (en)
WO (1) WO2018100377A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2428153A (en) * 2005-07-08 2007-01-17 Sharp Kk Interactive multiple view display
US20100253601A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Full-windshield hud enhancement: pixelated field of view limited architecture
EP2618204A1 (fr) * 2012-01-20 2013-07-24 Delphi Technologies, Inc. Interface homme-machine pour véhicule automobile
US20150062168A1 (en) * 2013-03-15 2015-03-05 Honda Motor Co., Ltd. System and method for providing augmented reality based directions based on verbal and gestural cues
GB2535316A (en) * 2015-01-14 2016-08-17 Jaguar Land Rover Ltd Head-up display apparatus
WO2016116741A1 (fr) * 2015-01-20 2016-07-28 The Technology Partnership Plc Technologie d'écran immersif à double affichage

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020008876A1 * 2018-07-03 2020-01-09 ソニー株式会社 Information processing device, information processing method, program, and mobile body
CN109242943B * 2018-08-21 2023-03-21 腾讯科技(深圳)有限公司 Image rendering method and apparatus, image processing device, and storage medium
EP3757944A4 * 2018-08-21 2021-09-01 Tencent Technology (Shenzhen) Company Limited Image rendering method and apparatus, image processing device and storage medium
CN109242943A * 2018-08-21 2019-01-18 腾讯科技(深圳)有限公司 Image rendering method and apparatus, image processing device, and storage medium
US11295528B2 2018-08-21 2022-04-05 Tencent Technology (Shenzhen) Company Limited Image rendering method and apparatus, image processing device, and storage medium
JP7111582B2 2018-11-01 2022-08-02 マクセル株式会社 Head-up display system
JP2020071415A * 2018-11-01 2020-05-07 マクセル株式会社 Head-up display system
CN111460865A * 2019-01-22 2020-07-28 阿里巴巴集团控股有限公司 Assisted driving method, assisted driving system, computing device, and storage medium
CN111460865B * 2019-01-22 2024-03-05 斑马智行网络(香港)有限公司 Assisted driving method, assisted driving system, computing device, and storage medium
US20220353485A1 * 2019-09-30 2022-11-03 Kyocera Corporation Camera, head-up display system, and movable object
CN114450942A * 2019-09-30 2022-05-06 京瓷株式会社 Camera, head-up display system, and movable object
EP4040788A4 * 2019-09-30 2023-11-01 Kyocera Corporation Camera, head-up display system, and movable object
JP7241005B2 2019-11-27 2023-03-16 京セラ株式会社 Head-up display system and movable object
JP2021085990A * 2019-11-27 2021-06-03 京セラ株式会社 Head-up display system and movable object
WO2021106665A1 * 2019-11-27 2021-06-03 京セラ株式会社 Head-up display system and movable object
FR3118734A1 * 2021-01-13 2022-07-15 Faurecia Interieur Industrie Motor vehicle equipped with an electronic display device
CN113895228A * 2021-10-11 2022-01-07 黑龙江天有为电子有限责任公司 Automobile combination instrument panel and automobile

Also Published As

Publication number Publication date
GB201620351D0 (en) 2017-01-11

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17809348; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17809348; Country of ref document: EP; Kind code of ref document: A1)