CN105988220A - Method and control device for operating vision field display device for vehicle - Google Patents
Method and control device for operating vision field display device for vehicle
- Publication number
- CN105988220A CN201610163253.XA
- Authority
- CN
- China
- Prior art keywords
- image
- eye
- comfort
- driver
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B27/0103—Head-up displays characterised by optical features comprising holographic elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/149—Instrument input by detecting viewing direction not otherwise provided for
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
- G02B2027/0134—Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Abstract
The invention discloses a method (1500) for operating a field-of-view display device (102) for a vehicle (100). The method (1500) comprises an adjusting step (1502), in which an offset between a right-side image (402) and a left-side image (404) of the field-of-view display device is adjusted using the convergence angle formed between the right visual axis (502) of the right eye (704) and the left visual axis (502) of the left eye (708) of an observer using the field-of-view display device (102). When the visual axes (502) intersect in the projection plane (504) of the images (402, 404), the images (402, 404) are adjusted without offset.
Description
Technical field
The present invention relates to a method for operating an autostereoscopic field-of-view display for a motor vehicle, to a corresponding control device and to a corresponding computer program.
Background technology
WO 1998/035260 A1 describes a holographic screen which can be written by a laser projector and which can be integrated into a windshield.
The publication "Exploring Design Parameters for a 3D Head-Up Display" (Broy, Höckh, Frederksen et al., Pervasive Displays 2014) describes a stereoscopic head-up display.
Content of the invention
Against this background, the approach presented here provides a method according to the independent claims for operating an autostereoscopic field-of-view display for a motor vehicle, furthermore a control device according to the independent claims that uses this method, and finally a corresponding computer program according to the independent claims. Advantageous refinements result from the respective dependent claims and from the following description.
In a field-of-view display that presents the same image to the observer's right eye and left eye, a parallax error occurs when the image is viewed together with one or more target objects located at a different distance. The observer's brain filters out the resulting double image by preferentially using one of the two eyes for viewing, which requires considerable effort.
In the approach presented here, a right-side image is provided for the observer's right eye and a left-side image is provided for the left eye in order to eliminate the parallax error. A virtual image is thereby formed which can be placed almost freely in space. The spatial position of the image can be adjusted as a function of the observer's viewing distance; in particular, the convergence angle between the visual axes of the eyes can be evaluated.
A method for operating an autostereoscopic field-of-view display for a motor vehicle is proposed, the method comprising the following step:
adjusting an offset between the right-side image and the left-side image of the field-of-view display using the convergence angle between the right visual axis of the right eye and the left visual axis of the left eye of an observer of the field-of-view display, the images being adjusted without offset when the visual axes intersect in the projection plane of the images.
An autostereoscopic field-of-view display may be understood to be a head-up display, a windshield display or a front-window display. The field-of-view display is designed to show two images in a projection plane, the observer's right eye being able to see only the right-side image and the observer's left eye being able to see only the left-side image. The right-side image is visible only in a right-side viewing area and the left-side image only in a left-side viewing area; for example, a holographic element of the field-of-view display may have directional reflection characteristics. The offset may be a distance (Wegstrecke) by which mutually corresponding image points of the right-side image and the left-side image are shifted relative to one another in the projection plane.
Mutually corresponding image points may be the respective corresponding pixels, in other words picture elements, of the left-side image and the right-side image, which are projected onto the projection surface with the offset. The offset can therefore be regarded as the distance by which the displayed right-side image is shifted relative to the left-side image. An offset-free display of the right-side image and the left-side image may be understood to be a display in which the right-side image and the left-side image overlap, in particular completely, so that a corresponding image point of the right-side image is imaged onto the associated image point of the left-side image. The observer may be the driver of the vehicle.
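The relation between convergence and the required offset can be sketched as follows. This is only an illustrative Python sketch under a simplified symmetric-geometry assumption; the function names and the relations used (fixation distance from the convergence angle, offset proportional to 1 − s/z) are reconstructed from the geometry described in connection with Fig. 7 and are not taken verbatim from the patent.

```python
import math

def fixation_distance(eye_spacing_m: float, convergence_angle_rad: float) -> float:
    """Distance of the point where both visual axes intersect (symmetric geometry)."""
    if convergence_angle_rad <= 0.0:
        return math.inf  # axes parallel: the observer is looking far into the distance
    return (eye_spacing_m / 2.0) / math.tan(convergence_angle_rad / 2.0)

def image_offset(eye_spacing_m: float, plane_distance_m: float, fixation_distance_m: float) -> float:
    """Offset between corresponding image points of the right-side and left-side image
    in the projection plane; zero when the visual axes intersect in that plane."""
    if math.isinf(fixation_distance_m):
        return eye_spacing_m
    return eye_spacing_m * (1.0 - plane_distance_m / fixation_distance_m)
```

In this sketch the offset vanishes exactly when the fixation distance equals the distance of the projection plane, which matches the offset-free case described above.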
The method may comprise a step of determining a position of the images in the projection plane. The determination may be carried out using the observer's visual axes and eye position, and the determined position may additionally be adjusted in the adjusting step. The images may be moved together in the projection plane without changing the offset, so that head movements of the observer can be compensated.
The method may comprise a step of reading in eye information from an eye detection device of the vehicle, the eye detection device being designed to detect the observer's eyes. The eye position is represented by a right eyeball position value and a left eyeball position value, and the visual axes are represented by a right gaze direction value and a left gaze direction value; the convergence angle is determined from the gaze direction values and the eye position values. The eye information can thus be processed in real time, so that the offset can be adjusted quickly.
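A minimal sketch of how the convergence angle could be derived from the gaze direction values delivered by such an eye detection device; the vector representation, function name and example values are assumptions made for illustration only.

```python
import numpy as np

def convergence_angle(right_gaze_dir: np.ndarray, left_gaze_dir: np.ndarray) -> float:
    """Angle between the two visual axes, given (not necessarily normalized)
    gaze direction vectors for the right and the left eye."""
    r = right_gaze_dir / np.linalg.norm(right_gaze_dir)
    l = left_gaze_dir / np.linalg.norm(left_gaze_dir)
    return float(np.arccos(np.clip(np.dot(r, l), -1.0, 1.0)))

# Example: both eyes converging slightly inward toward a nearby point
angle = convergence_angle(np.array([-0.02, 0.0, 1.0]), np.array([0.02, 0.0, 1.0]))
```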
The offset may be adjusted within a comfort zone that is specific to the observer. Within the comfort zone, the observer can merge the two images into a stereoscopic view without excessive strain, so that the load on the observer is reduced.
The method may include a step of adapting the comfort zone. The comfort zone may be adapted in response to a user input and may be adjusted as a function of the observer's strain level. When the observer is strained, the comfort zone may be reduced. The observer may enter and/or change a setpoint value for the comfort zone via a user interface.
The comfort zone may be adapted as a function of the number of depth planes (Tiefenebenen) to be shown. When the number of depth planes increases, the comfort zone may be reduced. Target objects can be shown at different distances, each target object being assigned a depth plane. Viewing many displayed depth planes can be more strenuous than viewing only a few, so the comfort zone within which the target objects are shown can be reduced accordingly.
The comfort zone may be changed as a function of the user. When the observer is tired, the comfort zone may be reduced, which lowers the load on the observer and allows the observer to recover.
The observer's fatigue may be detected using the eye information. In particular, the eyelid-closure frequency and, alternatively or additionally, the eyelid-closure duration may be evaluated. The eye information makes it possible to detect reliably whether the user is tired. When the user has recovered, the comfort zone may be enlarged again.
The approach presented here furthermore provides a control device which is designed to carry out, actuate or implement the steps of a variant of the method presented here in corresponding units. This embodiment variant of the invention in the form of a control device also allows the object of the invention to be achieved quickly and efficiently.
A control device may be understood here to be an electrical device which processes sensor signals and outputs control signals and/or data signals as a function thereof. The control device may have an interface which may be implemented in hardware and/or software. In a hardware implementation, the interface may, for example, be part of a so-called system ASIC which contains a wide variety of functions of the control device. However, the interface may also be a separate integrated circuit or consist at least partly of discrete components. In a software implementation, the interface may be software modules which are present, for example, on a microcontroller alongside other software modules.
A computer program or computer program product having program code which can be stored on a machine-readable carrier or storage medium such as a semiconductor memory, a hard-disk memory or an optical memory and which is used to carry out, implement and/or actuate the steps of the method according to one of the embodiments described above, in particular when the program product or program is executed on a computer or a device, is also advantageous.
Brief description of the drawings
The approach presented here is explained in more detail below by way of example with reference to the drawings, in which:
Fig. 1 shows a diagram of a vehicle with a field-of-view display;
Fig. 2 shows a diagram of a traffic scene from the driver's perspective with the focus on a vehicle driving ahead;
Fig. 3 shows a diagram of the traffic scene from the driver's perspective with the focus on the field-of-view display;
Fig. 4 shows a diagram of a field-of-view display with a control device for operating the field-of-view display according to an embodiment of the present invention;
Fig. 5 shows a diagram of an autostereoscopic field-of-view display with a combiner according to an embodiment of the present invention;
Fig. 6 shows a diagram of an autostereoscopic windshield display according to an embodiment of the present invention;
Fig. 7 shows a schematic diagram of the relation between image offset and convergence angle in an autostereoscopic field-of-view display according to an embodiment of the present invention;
Fig. 8 shows a schematic diagram of the relation between image offset and virtual image distance in an autostereoscopic field-of-view display according to an embodiment of the present invention;
Fig. 9 shows a diagram of a vehicle with a field-of-view display according to an embodiment of the present invention;
Fig. 10 shows a diagram of a traffic scene in the field-of-view display from the driver's perspective according to an embodiment of the present invention;
Fig. 11 shows a diagram of the comfort zone of a field-of-view display in the relation between virtual image distance and projection distance according to an embodiment of the present invention;
Fig. 12 shows a diagram of the reduced comfort zone of a field-of-view display in the relation between virtual image distance and projection distance according to an embodiment of the present invention;
Fig. 13 shows a flow chart of a method for operating an autostereoscopic field-of-view display according to an embodiment of the present invention; and
Fig. 14 shows a diagram of the operating principle of a method for operating an autostereoscopic field-of-view display according to an embodiment of the present invention.
Detailed description of the invention
In the following description of advantageous embodiments of the present invention, the same or similar reference signs are used for elements which are shown in the various figures and act similarly, a repeated description of these elements being omitted.
Fig. 1 shows a diagram of a vehicle 100 having a field-of-view display 102. The field-of-view display 102 is designed as a front-window display 102. Here, a real image 104 of the information to be shown is produced in the region of the windshield 106 of the vehicle 100. The image 104 is produced in the field of view of an observer 108 of the vehicle 100, here in the field of view of the driver 108.
In the case of a windshield display 102, or of a transparent display 102 which is installed separately in the vehicle and overlaps the driving scene, the image 104 shown on the display 102 appears as a real image in the region of the windshield 106 or of the separate glass or plastic pane, and not, as with a HUD, at a greater distance (more than 1.8 m) as a virtual image. Compared with a HUD, the eyes of the driver 108 therefore have to refocus (accommodation) and rotate (convergence) more strenuously from the driving scene to the windshield display 102 when reading picture content on the windshield display 102. When the driver 108 observes the driving scene, the visual axes of his eyes are almost parallel, so that nearby target objects (within 10 meters) appear double. If the nearby target object is the picture content 104 shown on the transparent display 102, it may therefore appear double and may be disturbing.
Fig. 2 shows a diagram of a traffic scene 202 from the driver's perspective with the focus on a vehicle 200 driving ahead. The traffic scene 202 is shown from the perspective of the driver of Fig. 1. The driver looks forward through the windshield 106 at the traffic scene 202, his right eye and left eye being focused on the vehicle 200 driving ahead. However, both the right eye and the left eye also capture the image 104 in the windshield 106, so that the image 104 appears double, as a right virtual image 204 and a left virtual image 206.
Fig. 3 shows a diagram of the traffic scene 202 from the driver's perspective with the focus on the field-of-view display 102. The traffic scene 202 corresponds to the traffic scene in Fig. 2. In contrast thereto, the right eye and the left eye are focused here on the image 104. However, both eyes also capture the vehicle 200 driving ahead, so that the vehicle 200 driving ahead appears double.
Fig. 1 shows the windshield display 102 with exemplary picture content 104, in this case a digital speed display 104. If the driver 108 observes the driving scene, i.e. distances in the range of about 10 m to 500 m, the picture content 104 appears double, as shown in Fig. 2. If, conversely, the driver observes the picture content 104 of the windshield display 102, the driving scene 200 behind it appears double, as shown in Fig. 3. This effect causes the eyes to switch frequently between far and near vision when the windshield display 102 is used, which is tiring in the long run.
If picture content 104 is shown on the windshield display 102, it typically appears double whenever the driver 108 observes a target object 200 of the driving scene 202, in this case the vehicle 200 driving ahead. The picture content 104 of the windshield display 102 is therefore difficult to read in the peripheral field of view. Conversely, if the driver observes the picture content shown on the windshield 106, he sees the driving scene 202 double.
With a transparent display 102 arranged near the main field of view, the small image distance creates the following problem: when the driver 108 looks, for example, at the vehicle 200 driving ahead beyond the display 102, he sees the picture content 104 of the transparent display 102 double. Due to the convergence of the eyes, the two images 104 are laterally offset from one another. This is particularly disturbing because the picture content 104 on the transparent display 102 lies close to the driving situation 202 and yet cannot readily be read in the peripheral field of view, so to speak without looking at it directly, since the picture content 104 appears double in the plane of the display.
Fig. 4 shows a diagram of the field-of-view display 102 with a control device 400 for operating the field-of-view display 102 according to an embodiment of the present invention. As in Fig. 1, the field-of-view display 102 is arranged in a vehicle 100. In contrast thereto, the field-of-view display 102 is designed here as an autostereoscopic head-up display 102. The head-up display 102 projects a right-side image 402 for the right eye of the observer 108 and a left-side image 404 for the left eye of the observer 108. The projected images 402, 404 are generated on an imaging unit 406 of the head-up display 102 and are reflected into the windshield 106 by a lens system 408. The windshield 106 deflects the images 402, 404 into a viewing area 410 in which the eyes of the observer 108 are located. The images 402, 404 are arranged by the control device 400 such that the picture content 412 appears three-dimensionally at a predetermined distance for the observer 108. For this purpose, the observer is detected by a detection device 414 and, in particular, the position and gaze direction of the eyes are determined. Based on the position and the gaze direction, in other words the visual axes, of the eyes, the offset between the right-side image 402 and the left-side image 404 is determined in order to obtain the stereoscopic effect.
An autostereoscopic head-up display 102 (HUD) is proposed here which has an adaptive display area that takes into account the driver's 108 current comfort zone with regard to depth presentation (Tiefendarstellung).
In the approach presented here, the specific usage scenario of an autostereoscopic head-up display 102 (HUD) is considered. One important aspect is that the (virtual) projection distances of an LCD screen and of an autostereoscopic HUD differ substantially from each other. The HUD 102 may be referred to as a see-through display or field-of-view display. The display 412 is adjusted according to the user's fatigue state. For this purpose, the gaze direction and the inward rotation of the user's eyes are detected by an eye tracker 414, and the temporal change of the viewing adaptation and its duration are determined and taken into account in order to assess whether the driver 108 has, or could have, problems with the symbols shown on the display unit 412.
A head-up display 102 (HUD) is used to show driving-related information 412 (such as a speed indication, navigation information, warnings and more) in the field of view 410 of the driver 108. The information shown in the virtual image 412 is superimposed on the real environment.
In principle, the HUD 102 comprises a light source, an imaging unit 406 and a lens system 408 which performs the imaging. The light leaving the system falls onto the windshield 106, or onto a combiner pane installed behind it, is reflected there and enters the eyes 410 of the driver 108, who perceives a virtual image 412 in front of him at a distance defined by the lens system 408 and with a magnification likewise determined by the lens system.
For example, LEDs or laser diodes can be used as the light source, backlighting an LCD 406 that provides the picture content. Alternatively, various projectors can be used; in the future, compact projection devices based on DMD, LCoS or laser technology may increasingly be employed. Efforts are also being made to realize contact-analog displays. A biocular embodiment of such a contact-analog display, in which both eyes view the same image, occupies a very large installation volume. A binocular variant, in which the two eyes each view slightly different images 402, 404, offers, in addition to the extra function of stereoscopic depth perception (Tiefenempfindung, the "3-D effect"), considerable advantages in terms of robustness, installation space and cost.
With an autostereoscopic HUD 102 (asHUD), driving-related information 412 can be shown in 3D without the driver 108 needing additional aids such as shutter glasses or polarizing glasses. So that the driver 108 can observe a correct image 412 at any time, even during head movements, the asHUD 102 requires a head tracking system 414 which can evaluate the head position and eye position 410 of the driver 108, together with a corresponding tracker. Fig. 4 shows a system overview of the basic structure of the asHUD system 102.
Fig. 5 shows a diagram of an autostereoscopic field-of-view display 102 with a combiner 500 according to an embodiment of the present invention. As in Figs. 1 to 4, the field-of-view display 102 is arranged in a vehicle. The imaging unit 406 is illuminated by a light source and generates the right-side image and the left-side image. The combiner 500 reflects the light from the imaging unit 406 toward the eyes of the observer 108. The combiner 500 is arranged here at the lower edge of the windshield 106, so that when the driver looks into the distance the combiner 500 lies below the visual axes 502 of the eyes of the observer 108; in other words, the combiner 500 is arranged in the peripheral field of view of the observer 108. For the observer 108, a virtual image 504 is formed in a projection plane 504 in the region of the windshield 106.
Fig. 5 shows an embodiment in which a holographic projection pane 406 with directional emission, recessed into the instrument panel, is imaged by a flat glass pane 500 (or combiner pane). This embodiment has the advantage that the system 102 can be used independently of the windshield 106 and that the image is seen as a virtual image 504, for example behind the windshield 106, which the user perceives comfortably owing to the enlarged image distance.
In another embodiment, not shown, the image is projected directly onto a separate combiner pane which then has, for example, a photopolymer layer with a holographic scattering function. The advantage of an enlarged image distance compared with a windshield display is lost here, but the system takes up less space and is less complex because it consists of fewer components.
A variant of the "reflected-in display (Einspiegelungsanzeige) 102" shown in Fig. 5 is also conceivable, in which the image is reflected not by a separate pane but directly by the windshield 106. The advantage of this embodiment is that no additional component has to be installed in the instrument panel and that the possibly disturbing edge of a separate combiner 500 is therefore not visible. In such an embodiment, an image rated as acceptable by users is still obtained as long as the windshield curvature is small.
The stereoscopic display 102 can also be implemented as a combiner display 102, as shown in the figure. In this case, too, the stereoscopic display can be realized by means of a holographic projection display 406 which is imaged by the combiner display 102. The holographic projection display 406 is then opaque and is imaged by the flat combiner 500. The image of the stereoscopic display 406 appears as a virtual image 504. The advantage over the embodiment described below is a somewhat larger image distance from the driver's eyes.
Fig. 6 shows a diagram of an autostereoscopic windshield display 102 according to an embodiment of the present invention. The windshield display 102 can also be referred to as a field-of-view display 102. In contrast to the field-of-view display in Fig. 5, the right-side image and the left-side image are produced in a projection region 600, in other words a projection plane, integrated into the front window 106. For this purpose, two projectors 602 each project one of the images into the projection region 600 from below. In the projection region 600, a holographic film 604 is arranged in or on the front window 106. The film 604 defines a right viewing area 606 for the right eye of the observer 108 and a left viewing area 608 for the left eye. The right-side image is visible in the right viewing area 606, the left-side image in the left viewing area 608. By moving the projectors 602 laterally, the viewing areas 606, 608 can be moved laterally along with the observer 108.
In other words, a stereoscopic windshield display 102 with dynamic convergence matching is proposed.
A preferred embodiment of the transparent display 102 is the windshield display 102 shown in Fig. 6. Here, a photopolymer film 604 realizing a holographic scattering function is laminated into or onto the windshield 106. By suitably combining it with two separate image projectors 602, autostereoscopic picture content can be produced as shown.
An embodiment of the autostereoscopic windshield display 102 is shown. The region 600 shown in broken lines is the usable display surface 600, which is illuminated by the two projectors 602. The holographically implemented transparent display 600 diffracts the image shown by the first projector 602 into a first eyebox 606 and the image shown by the second projector 602 into a second eyebox 608. By laterally displacing the projectors 602, the lateral positions of the eyeboxes 606, 608 can be changed in order to follow lateral head movements.
Fig. 7 shows a schematic diagram of the relation between the image offset 700 and the convergence angle 702 in an autostereoscopic field-of-view display 102 according to an embodiment of the present invention. The field-of-view display 102 is arranged in a vehicle 100, and the driver of the vehicle 100 sees the vehicle 200 driving ahead through the field-of-view display 102. Because the driver's right eye 704 and left eye 708 are separated by an eye spacing 706, the visual axes 502 of the eyes 704, 708 enclose a convergence angle 702 with each other when both eyes 704, 708 are directed at the same point 710. If the right-side image 402 and the left-side image 404 are to appear three-dimensionally at the point 710, the image offset 700 between the right-side image 402 and the left-side image 404 depends on the viewing distance 712 between the eyes 704, 708 and the images 402, 404, on the distance 714 between the eyes 704, 708 and the viewed point 710, and on the eye spacing 706.
The convergence angle 702 θ (theta) may be the angle 702 between the visual axes 502 of the left eye 708 and the right eye 704, s may be the distance 712 from the driver's eyes 704, 708 to the windshield 106, and d may be the disparity 700, i.e. the offset 700 between the images 402, 404 shown stereoscopically on the windshield 106, which is set so that the target 710 appears at the distance 714 z.
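For readability, the geometric relations implied by these definitions can be written out. The following is a reconstruction from similar triangles (a denotes the eye spacing 706) and is not quoted from the patent itself:

```latex
\theta \approx 2\arctan\!\left(\frac{a}{2z}\right), \qquad
d = a\,\frac{z-s}{z} = a\left(1-\frac{s}{z}\right).
```

Under these assumptions, d vanishes for z = s (the visual axes intersect in the projection plane) and approaches a as z tends to infinity.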
Fig. 8 shows a schematic diagram of the relation between the image offset 700 and the virtual image distance 714 in an autostereoscopic field-of-view display 102 according to an embodiment of the present invention. The field-of-view display 102 essentially corresponds to one of the field-of-view displays in Figs. 4 to 6. The images 402, 404 are arranged in the same projection plane 504. As shown in the two examples, the same offset 700 at the same viewing distance 712 leads to different virtual image distances 714 depending on the assignment of the images to the eyes.
In the first example, the right-side image 402 is arranged to the right of the left-side image. The right eye 704 views the right-side image 402 and the left eye 708 views the left-side image 404. This creates for the observer the impression that the virtual image 412 lies farther away than the real images 402, 404.
In the second example, the right-side image 402 is arranged to the left of the left-side image. Here too, the right eye 704 views the right-side image 402 and the left eye 708 views the left-side image 404. This creates for the observer the impression that the virtual image 412 lies between the observer and the images 402, 404.
In other words, Fig. 8 illustrates the principle of stereoscopic 3-D viewing and shows the magnitudes of the virtual image distance 714 (VID) and of the virtual screen distance 712 (VSD) for crossed and for uncrossed viewing.
The imaging lens system determines the virtual screen distance 712 (VSD) of the system 102. By laterally displacing the two individual images 402, 404 by the offset 700 d on the display, that is, at the virtual screen distance 712 VSD, the distance 714 (virtual image distance VID) at which the driver sees the image 412 can be selected. Fig. 8 shows the parameters virtual screen distance 712 VSD and virtual image distance 714 VID for crossed and uncrossed viewing. The virtual image distance 714 VID is determined by the formula

VID = (a · VSD) / (a ∓ d),

where a denotes the driver's interpupillary eye spacing 706. The sign in the denominator determines whether the image 412 lies in front of or behind the virtual screen distance 712 VSD. The system 102 can be matched to the eye spacing 706 so that the driver perceives the image information 412 at the desired distance 714.
Because the driver's eye spacing 706 is an essential piece of information that the system 102 needs in order to produce an image view 412 that appears at the intended distance 714, the system 102 can measure the driver's eye spacing 706 directly and adapt the views 402, 404 accordingly for each driver. The driver's size with regard to the downward viewing angle, for example, may also be taken into account. The camera used for head tracking is then also used to determine this information and to adapt the image views 402, 404 individually to the driver. In addition, personal preferences of the driver regarding brightness, color, contrast or the comfort zone can be stored.
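A small numerical sketch of the relation VID = a · VSD / (a ∓ d) given above; the function name and the example values are purely illustrative.

```python
def virtual_image_distance(eye_spacing_m: float, vsd_m: float,
                           offset_m: float, crossed: bool) -> float:
    """VID = a * VSD / (a -/+ d): crossed disparity places the virtual image in front
    of the virtual screen, uncrossed disparity places it behind the virtual screen."""
    denominator = eye_spacing_m + offset_m if crossed else eye_spacing_m - offset_m
    return eye_spacing_m * vsd_m / denominator

# Eye spacing 65 mm, virtual screen at 2.5 m, 5 mm uncrossed offset:
print(virtual_image_distance(0.065, 2.5, 0.005, crossed=False))  # ~2.71 m, behind the screen
```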
Fig. 9 shows a diagram of the vehicle 100 with the field-of-view display 102 according to an embodiment of the present invention. The field-of-view display 102 essentially corresponds to the displays shown in Figs. 5 and 6; here, it is a windshield display 102, in other words a front-window display 102. For the driver 108, the right-side image 402 for the right eye 704 and the left-side image 404 for the left eye 708 are faded in (eingeblendet) in the windshield 106, in other words the front window 106. The images 402, 404 have an offset 700 relative to one another because the driver 108 is looking at the driving scene in front of his vehicle 100, so that the visual axes of the eyes 704, 708 intersect behind the projection plane of the images 402, 404. The images 402, 404 are faded in with a perspective distortion so that an undistorted image results from the perspective of the driver 108.
The transparent display device 102 can be realized by means of a volume hologram. The vehicle 100 may have a driver observation system which can detect the gaze direction of the driver 108.
In order to avoid double images, it is proposed to use a stereoscopic windshield display 102, or a separately installed stereoscopic transparent display 102, in combination with a system for dynamic disparity matching (Disparitätsanpassung) on the basis of the convergence angle of the driver's 108 eyes 704, 708, which is determined in real time.
This requires a gaze detection system which can reliably distinguish, in real time, at least two situations: "driver 108 is looking at the driving scene" and "driver 108 is looking at the picture content 402, 404 of the transparent display 102". If the driver 108 is looking at the driving scene, the disparity 700 between the picture content 402, 404 shown on the windshield display 102 for the left eye 708 and for the right eye 704 is adjusted such that double images are minimized. This corresponds to a stereoscopic shift of the picture content 402, 404 into the driver's 108 viewing plane. If, conversely, the driver 108 turns his gaze to the picture content 402 of the windshield display 102, the picture content is not shown stereoscopically, so that the driver 108 can read the display just as he is accustomed to from any other vehicle display.
Additional eye position tracking is necessary so that the autostereoscopic effect is maintained even during lateral head movements of the driver 108.
The picture content 402, 404 shown stereoscopically on the windshield 106 can be read without double images. It can be read without the convergence adaptation of the eyes 704, 708 that is required when reading a conventional vehicle display (such as an instrument cluster). This yields a time advantage and a comfort advantage when reading the picture content 402, 404, which ultimately improves driving safety. If the size of the picture content 402, 404 of the windshield display 102 is adapted accordingly, the picture content can also be perceived and read in the peripheral field of view, so to speak without looking at it directly. The size of the displayed content 402, 404 can then be matched to the lower resolution of the eyes in the peripheral field of view.
Fig. 9 shows the autostereoscopic windshield display 102, which shows the same picture content as Fig. 1, but as the stereoscopic images 402, 404. The left eye 708 and the right eye 704 of the driver 108 each see one of the two mutually offset representations 402, 404. When the driver 108 looks at the driving scene, the autostereoscopic view produces no double images.
Fig. 10 shows a diagram of the traffic scene 202 in the field-of-view display 102 from the driver's perspective according to an embodiment of the present invention. The traffic scene 202 essentially corresponds to the traffic scene in Figs. 2 and 3. It is shown here that the driver of Fig. 9, although he views the right-side image with his right eye and captures the left-side image with his left eye, sees a single undistorted virtual image 412 which appears to float between the vehicle 200 driving ahead and his own vehicle 100.
If the picture content 402, 404 is shown stereoscopically on the windshield display 102, it does not appear double when the driver looks at a target object 200 of the driving scene 202, in this case the vehicle 200 driving ahead. Picture content 402, 404 shown in this way can therefore be read and appears without interruption in the peripheral field of view.
The solution to this problem is to show the picture content 402, 404 autostereoscopically: as long as the driver 108 is not looking at the transparent display 102, the picture content is shown such that it can at least be read in the peripheral field of view. When the driver looks directly at the picture content or the display, the autostereoscopic presentation is switched off, because owing to the small image distance it is uncomfortable for the eyes.
Figs. 11 and 12 show diagrams of the comfort zone 1200 of a field-of-view display according to an embodiment of the present invention in the relation between the virtual image distance 714 and the projection distance 712, or viewing distance 712. Fig. 11 shows the comfort zone 1200 for a single displayed virtual image, while Fig. 12 shows the comfort zone 1200 for several virtual images shown at different distances 714. Within the comfort zone 1200, 75% of observers perceive the observation of the picture content as comfortable. The comfort zone 1200 begins at a minimum distance 1202 from the observer and ends at a maximum distance 1204. This research is described in the publication "Exploring Design Parameters for a 3D Head-Up Display" cited at the outset, in which the boundaries of the comfort zone, described here only by way of example, are examined in detail.
The comfort zone 1200 grows with increasing viewing distance 712. When images are shown simultaneously at several distances, the comfort zone 1200 is smaller than for a single image. In addition, with several viewed images, the minimum distance 1202 shifts toward larger image distances 714 compared with a single image, and the maximum distance 1204 decreases.
Fig. 11 shows the comfort zone 1200 as a function of the virtual image distance 714 VID of a virtual 3-D target object and of the virtual screen distance 712 VSD. The comfort zone 1200 represents 75% of the test subjects; in the region adjacent to the comfort zone 1200, 50% of the test subjects are still within their comfort zone.
Fig. 12 shows the comfort zone 1200 as a function of the virtual image distances 714 VID of three virtual 3-D target objects (top, middle, bottom) and of the virtual screen distance 712 VSD. The middle zone 1200 represents the comfort zone of 75% of the test subjects; in the adjacent regions, 50% of the test subjects are still within their comfort zone.
In particular, the publication "Exploring Design Parameters for a 3D Head-Up Display" cited at the outset describes the comfort zone 1200 using example values of the relationship between the virtual image distance VID 714 and the virtual screen distance VSD 712 used for a HUD. The front boundary 1202 and the rear boundary 1204 of the comfort zone 1200 are also shown. Systems with several (virtual) projection distances 714 are also addressed here.
In addition to the relationship reproduced by the formula introduced with reference to Fig. 8, 3-D perception differs individually from person to person. Each person therefore also has an individual region within which the virtual image distance 714 VID can be varied, for a given virtual screen distance 712 VSD, while still allowing comfortable 3-D viewing. This region can be referred to as the comfort zone 1200. The comfort zone 1200 is decisively limited by the criterion of the accommodation-convergence conflict. If the comfort zone 1200 is exceeded, the driver may feel discomfort (for example eye pain or headache), and the intended 3-D perception may no longer be achievable.
Any kind of discomfort, just like a loss of perception of the driving-related information shown in the asHUD, can create a risk in road traffic and endanger not only the safety of the driver but also the safety of all other road users.
The approach presented here proposes an autostereoscopic HUD with an adaptive display region 1200. The display region is adapted not only to the respective driver but also to his current physical and/or mental condition. This avoids the 3-D presentation lying outside the driver's (current) comfort zone 1200, thereby ensuring a perception that is comfortable for the driver and avoiding eye pain, headache or discomfort, which increases driving safety and the fun factor.
The asHUD uses the head tracking data and other information captured by the head tracking camera in order to adaptively match the size of the display region 1200 in which picture content is shown, and the number of depth planes, to the driver's current comfort zone 1200. This reduces the negative effects of the 3-D presentation, which could otherwise cause symptoms such as headache, eye pain or discomfort, and also reduces the risk that the image information cannot be fused. A longer usage duration is thereby made possible, because if fatigue, for example, restricts the comfort zone 1200, a system that does not automatically adapt to the driver would have to be switched off, since its display would be perceived as uncomfortable.
In addition, driver fatigue can be kept lower, because the display is matched to the driver in such a way that the driver tires as little as possible. The display region 1200 can be adjusted individually and differently for crossed and uncrossed viewing, i.e. the front boundary 1202 and the rear boundary 1204 of the display region 1200 can be changed independently of one another. A driver with a particularly large comfort zone 1200 can also make full use of it. The system can be adjusted to different drivers and situations. This results in improved perception, increased safety and increased comfort.
The comfort zone 1200 depends strongly on the selected virtual screen distance 712 VSD. For a given virtual screen distance 712 VSD there is a defined region 1200 of the virtual image distance 714 VID within which information can be shown such that the driver can perceive it comfortably. This region 1200 was examined as a function of the virtual screen distance 712 VSD; the results are shown qualitatively in Figs. 11 and 12, and more precise values are given in "Exploring Design Parameters for a 3D Head-Up Display; Broy; Höckh; Frederksen et al; Pervasive Displays 2014". The comfort zone 1200 represents the region perceived as comfortable by 75% of the test subjects. Accordingly, in the asHUD too, the suitable region 1200 between the lower limit 1202 VIDmin and the upper limit 1204 VIDmax of the virtual image distance 714 VID should be selected for showing information as a function of the virtual screen distance 712 VSD used. In a system that works with several virtual screen distances 712 VSD, or even with a single variable virtual screen distance 712 VSD, the region 1200 of the virtual image distance 714 VID used for display must likewise be matched accordingly whenever the virtual screen distance 712 VSD is switched or changed.
The comfort zone 1200 naturally depends not only on the virtual screen distance 712 VSD but also on the number of virtual image planes with different virtual image distances 714 VID that are used for the display. The asHUD proposed here therefore also includes matching the comfort zone 1200 as a function of the picture content, in particular of the number of virtual image planes used.
Figs. 11 and 12 merely illustrate the order of magnitude of the comfort zone and its general profile as a function of the VSD. The specific values depend on the specific conditions, and details of the underlying research can be found in the publication mentioned above.
Moreover, the empirically determined values shown in Figs. 11 and 12 are not equally valid for every driver; considerable individual differences can occur. In addition, a driver's comfort zone 1200 can change over time, with factors such as fatigue, contrast, usage duration and visual defects playing a significant role. In a further embodiment of the system, therefore, not only the virtual screen distance 712 VSD and the number of virtual planes are taken into account as described above, but also the individual, time-varying state of the driver.
In a simple embodiment, the driver can give the system feedback whenever his comfort zone 1200 is impaired, in other words whenever he perceives the currently displayed content as uncomfortable. The input device may, for example, be a rotary knob, an operating element, a steering-wheel button, a touch screen or also a voice or gesture control device. The input device may be a simple one, with which the driver can only reduce or enlarge the size of the comfort zone 1200 being used, or a more complex one, which allows him to directly adjust the front boundary 1202 and the rear boundary 1204 of the virtual image distance 714 VID and/or the maximum number of depth planes.
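A sketch of how such an input device could act on the comfort-zone parameters; the class, the command names and the 10% step size are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class ComfortZoneSettings:
    vid_min_m: float        # front boundary 1202 (VIDmin)
    vid_max_m: float        # rear boundary 1204 (VIDmax)
    max_depth_planes: int   # maximum number of depth planes shown at once

def on_driver_input(settings: ComfortZoneSettings, command: str) -> ComfortZoneSettings:
    """Maps a driver input (steering-wheel button, voice command, ...) to a change
    of the comfort zone being used."""
    if command == "smaller":            # simple variant: shrink the whole zone
        settings.vid_min_m *= 1.10
        settings.vid_max_m *= 0.90
    elif command == "larger":           # simple variant: enlarge the whole zone
        settings.vid_min_m *= 0.90
        settings.vid_max_m *= 1.10
    elif command == "fewer_planes":     # complex variant: adjust one parameter directly
        settings.max_depth_planes = max(1, settings.max_depth_planes - 1)
    return settings
```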
In another embodiment, the system is more complex: it automatically detects when the driver's comfort zone 1200 is exceeded and automatically adjusts the comfort zone.
Observing the driver makes it possible to match the display region 714 to the driver's current comfort zone 1200. For this purpose, information is obtained from the camera images of the head tracking system.
General fatigue can be detected from changed steering behavior or from camera images of the driver's face. Criteria considered here include, for example, the gaze, the eyelid-closure frequency, the eyelid-closure duration, the percentage of eye closure or yawning. The driver's fatigue state can thus be detected from this information from the steering-angle sensor and the head tracking camera. If fatigue is present, it is assumed in this case that a simplified graphic is shown with a reduced display region 714 and fewer depth planes.
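A coarse sketch of how the fatigue cues named above could be combined into a single flag; all thresholds are illustrative placeholders and not taken from the patent.

```python
def is_fatigued(blink_rate_hz: float, mean_blink_duration_s: float,
                eye_closure_fraction: float, yawns_per_min: float,
                steering_reversal_rate: float) -> bool:
    """Combines eyelid, yawning and steering cues into a coarse fatigue flag."""
    score = 0
    score += blink_rate_hz > 0.5              # unusually frequent eyelid closures
    score += mean_blink_duration_s > 0.4      # long eyelid closures
    score += eye_closure_fraction > 0.15      # high percentage of eye closure
    score += yawns_per_min > 1.0              # yawning detected by the face camera
    score += steering_reversal_rate > 10.0    # changed steering behavior
    return score >= 2
```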
The display region can be reduced, for example, in steps (Schritte) of 20% with regard to the front boundary 1202 (VIDmin is increased by 20%) and in steps of 10% with regard to the rear boundary 1204 (VIDmax is reduced by 10%). The number of depth planes used simultaneously can be reduced by one in each step. After each step, the driving behavior is re-evaluated and, if necessary, a further step is carried out. Once a state is reached in which the driver can again handle the HUD display without any problems, that state is initially maintained. If continued observation of the driver leads to the conclusion that reading the display is becoming ever easier for him, the system can enlarge its display region 1200 again in the same steps in the opposite directions.
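The stepwise adaptation described above can be sketched as follows; the 20% and 10% step sizes come from the paragraph, everything else (function names, example values) is illustrative.

```python
def shrink_comfort_zone(vid_min_m: float, vid_max_m: float, depth_planes: int):
    """One reduction step: VIDmin + 20 %, VIDmax - 10 %, one depth plane fewer.
    Returns the new (VIDmin, VIDmax, depth_planes)."""
    return vid_min_m * 1.20, vid_max_m * 0.90, max(1, depth_planes - 1)

def enlarge_comfort_zone(vid_min_m: float, vid_max_m: float, depth_planes: int):
    """Opposite step, applied when reading the display becomes easier again."""
    return vid_min_m / 1.20, vid_max_m / 0.90, depth_planes + 1

# Example: zone from 2.5 m to 20 m with 3 depth planes, driver shows signs of fatigue
print(shrink_comfort_zone(2.5, 20.0, 3))   # (3.0, 18.0, 2)
```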
The proposed system can not only determine the driver's gaze direction by means of the eye tracker, but can also infer the convergence plane of the driver's visual axes from the rotation angles of the two eyes. It can therefore recognize how long an image content is observed and how this changes over time. For an image content to be detected, not only the gaze direction but also the convergence plane of the visual axes must be directed at the displayed target object. If it takes progressively longer until the driver reaches this stable observation state, this indicates increasing difficulty in fusing the images. The system stores how long the driver needs until gaze direction and convergence plane match, and how long the driver looks at the displayed target object. If these times become atypically long for the individual driver, exceed a previously defined absolute value, or increase significantly with the duration of use, the system reacts as described above by reducing the comfort zone 1200 and the number of depth planes used.
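Estimating the convergence plane from the two eye rotation angles follows standard vergence geometry; the sketch below is only an illustration with assumed parameter names and a nominal eye spacing, not a formula disclosed in the patent.

```python
import math

def convergence_distance(rot_right_deg: float, rot_left_deg: float,
                         eye_spacing_m: float = 0.065) -> float:
    """Distance of the convergence plane in metres, from the inward rotation angles
    of the right and left eye (measured from straight ahead, positive towards the
    nose). Returns infinity when the visual axes are parallel."""
    theta = math.radians(rot_right_deg + rot_left_deg)  # total convergence angle
    if theta <= 0.0:
        return math.inf
    return (eye_spacing_m / 2.0) / math.tan(theta / 2.0)
```

Comparing this distance with the virtual image distance of the target object indicates whether the convergence plane is directed at the displayed content.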
If the driver cannot fuse the images at all, the 3-D mode is switched off. The system then operates as a 2-D HUD and can use perspective rendering. The 3-D mode is switched on again after the next interruption of the journey or at the driver's request; for this the driver can use the input unit described above.
In addition, the system correlates the observation data obtained from the driver with the image content shown at the respective time. This comparison reveals which information was of most interest to the driver at that moment. That information can then be placed, for example, in the centre of the field of view or, as long as it is not shown in a contact-analogue manner, in the centre of the display region; for example, observations over one minute at a time can be evaluated, and the system can then react accordingly.
In summary, the system proposed here can adapt the depth position (Tiefenposition) 714 of the image content, restrict the range of depth planes used and reduce their number whenever all of the above information indicates that the image content currently lies outside the region 1200 that is comfortable for the driver. The sole criterion here is the driver's personalized, momentary comfort zone 1200. If, for example, the driver has been on the road for a long time and, additionally or alternatively, it is getting dark and the driver is becoming fatigued, his comfort zone 1200 shrinks. The driver may then find it increasingly difficult to recognize information shown at small virtual image distances 714 VID and needs more time to read it. The system reacts to this and increases the minimum virtual image distance 1202 VID used, until it can determine that the driver no longer has any difficulty using the asHUD with its adaptive display region.
Figure 13 shows a flow chart of a method 1500 for operating an autostereoscopically viewed field-of-view display according to an embodiment of the present invention. The method 1500 has a step 1502 of adjusting the disparity between the right image and the left image of the field-of-view display. The disparity is adjusted using the convergence angle between the right visual axis of the right eye of an observer of the field-of-view display and the left visual axis of the left eye. When the visual axes intersect in the projection plane of the images, the images are adjusted to zero disparity in step 1502.
In one embodiment, the method 1500 has a determination step 1504 which comprises several sub-steps. In a first sub-step the gaze direction of the observer is determined, in a second sub-step the convergence angle, and in a third sub-step the eye position.
When determining the gaze direction, a distinction is made between the observer looking into the distance and the observer looking at the field-of-view display itself. When determining the convergence angle, the visual axes of the right and left eye are evaluated, taking the spacing of the eyes into account. When determining the eye position, the eye spacing and the position of the eyes relative to the field-of-view display are determined. If the observer is looking into the distance, the disparity is calculated in a calculation step 1506 using the convergence angle, and the images are adjusted stereoscopically in the adjustment step 1502. If the observer is looking at the field-of-view display, the image is shown without disparity, i.e. monoscopically (monoskopisch), in the adjustment step 1502.
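The relation between convergence angle, eye spacing and the disparity needed on the virtual screen plane can be illustrated with standard stereo geometry; the function below is only a sketch of what calculation step 1506 might compute, and the names and the nominal 65 mm eye spacing are assumptions, not values from the patent.

```python
import math

def screen_disparity(convergence_angle_deg: float, screen_distance_m: float,
                     eye_spacing_m: float = 0.065) -> float:
    """Horizontal disparity in metres on a virtual screen at `screen_distance_m`
    so that the fused image appears at the distance the eyes converge on.
    A result of 0 corresponds to the monoscopic case (axes intersect in the
    projection plane); the result approaches the eye spacing for distant points."""
    theta = math.radians(convergence_angle_deg)
    if theta <= 0.0:
        return eye_spacing_m                              # parallel axes: point at infinity
    fixation_distance = (eye_spacing_m / 2.0) / math.tan(theta / 2.0)
    return eye_spacing_m * (1.0 - screen_distance_m / fixation_distance)
```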
In a further sub-step of the calculation 1506, the position of the image on the field-of-view display is additionally calculated in order to compensate for head movements of the observer.
Furthermore, in the adjustment step 1502, the image content is arranged for the field-of-view display.
The flow chart illustrates exemplary features of the display system according to the invention. A driver-observation system, not described in detail here, detects the driver's gaze direction, the convergence angle θ described above and the driver's eye position. If the driver directs his gaze at the transparent display itself, its content is shown monoscopically. If the driver looks at the driving scene, the content of the transparent display is shifted into the driving scene by matching the disparity according to the given formula, so that the double images that would otherwise occur are removed. In addition, since the driver's eye position is known, image content placed stereoscopically in the driving scene can be dynamically adjusted in position when the driver's head moves sideways, in accordance with the depth cue of motion parallax; a sketch of this compensation follows below.
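The motion-parallax compensation mentioned above can be illustrated with simple pinhole geometry: to keep content anchored at a given scene depth when the head moves sideways, the image point on the fixed virtual screen must shift as sketched below. This is an illustrative assumption, not the formula referred to in the text.

```python
def parallax_shift(head_dx_m: float, screen_distance_m: float,
                   content_distance_m: float) -> float:
    """Lateral shift in metres of an image point on the virtual screen so that the
    content appears anchored at `content_distance_m` in the driving scene when the
    driver's head moves sideways by `head_dx_m`."""
    # Content behind the screen shifts with the head, content in front shifts against it.
    return head_dx_m * (1.0 - screen_distance_m / content_distance_m)
```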
Figure 14 shows a diagram of the operating principle of a method for operating an autostereoscopically viewed field-of-view display 102 according to an embodiment of the present invention. The field-of-view display provides a right image 402 for the right eye and a left image 404 for the left eye of an observer 108, i.e. the driver 108. The observer 108 is recorded by a camera 414 of a driver-monitoring system. The camera 414 provides a camera image 1600 or video information 1600, which is evaluated in the control device 400. The camera image 1600 is evaluated in a first evaluation device 1602 in order to adapt the image content to the field-of-view display 102; in particular, the eye position 1604 and the visual axes 1606 of the eyes are evaluated and made available for further processing.
The camera image 1600 is also evaluated in a second evaluation device 1608 in order to identify the comfort zone 1610 of the observer 108. Here the load state of the observer 108 is recognized from his physical reactions; the comfort zone 1610 depends on this load state.
Using the eye position 1604 and the visual axes 1606, the disparity 700 between the right image 402 and the left image 404 is determined in a determination device 1612. The disparity 700 between the images is adjusted by an adjustment device 1614 by shifting the displayed image content by the disparity to the defined position. Right image information 1616 and left image information 1618 are provided as a result.
In a matching device 1620, the right and left image information 1616, 1618 is matched using the comfort zone 1610. If the disparity 700 lies outside the comfort zone, it is limited. The device 1620 provides the matched right image information 1622 and the matched left image information 1624 to the field-of-view display 102; the sketch below illustrates this limiting step.
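The limiting behaviour of the matching device 1620 can be sketched as a clamp of the disparity to the range that corresponds to the comfort zone, expressed here with the same illustrative geometry as above; names and geometry are assumptions, not taken from the patent.

```python
def limit_disparity(disparity_m: float, zone_near_m: float, zone_far_m: float,
                    screen_distance_m: float, eye_spacing_m: float = 0.065) -> float:
    """Clamp the disparity 700 so that the fused image stays inside the comfort
    zone 1610, given as a range of virtual image distances [zone_near_m, zone_far_m]."""
    def disparity_at(distance_m: float) -> float:
        # Disparity on the screen plane for content perceived at `distance_m`.
        return eye_spacing_m * (1.0 - screen_distance_m / distance_m)

    lower = disparity_at(zone_near_m)   # nearest comfortable depth (smaller disparity)
    upper = disparity_at(zone_far_m)    # farthest comfortable depth (larger disparity)
    return max(lower, min(upper, disparity_m))
```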
This results in a control loop (Regelschleife) in which the reaction of the observer 108 plays a decisive role in adjusting the comfort zone 1610.
In other words, Figure 14 shows the operating principle of the autostereoscopically viewed head-up display 102 according to the approach proposed here.
The comparatively simple embodiments can be combined here into more complex embodiments. The system 400 operates automatically, but the driver can not only switch it off, he can also adapt it manually to his wishes, as in the simpler embodiments.
In another embodiment, the automated system is designed as a learning system, so that it learns and stores each driver together with his typical settings and fatigue behaviour. The preferred adaptation can therefore be carried out ever more quickly and more reliably; a sketch of such a per-driver store follows below.
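A per-driver profile store for such a learning system could look as follows; this is a minimal sketch under assumed names (`DriverProfile`, `profiles`) and reuses the `ComfortZone` sketch from above, since the patent does not specify how the learned settings are persisted.

```python
from dataclasses import dataclass, field

@dataclass
class DriverProfile:
    driver_id: str
    preferred_zone: ComfortZone                           # typical settings learned so far
    fatigue_history: list = field(default_factory=list)  # e.g. timestamps of fatigue events

profiles: dict[str, DriverProfile] = {}

def load_or_create_profile(driver_id: str, default_zone: ComfortZone) -> DriverProfile:
    """Return the stored profile for a recognized driver, or start a new one."""
    if driver_id not in profiles:
        profiles[driver_id] = DriverProfile(driver_id, default_zone)
    return profiles[driver_id]
```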
Figure 14 thus shows a schematic diagram of the operating principle of the asHUD 102 with comfort-zone matching.
The embodiments described and shown in the figures have been chosen only by way of example. Different embodiments can be combined with one another completely or with respect to individual features. One embodiment can also be supplemented by features of another embodiment.
Furthermore, the method steps proposed here can be repeated and carried out in an order different from the one described.
If an embodiment comprises an "and/or" conjunction between a first feature and a second feature, this is to be understood as meaning that the embodiment has both the first feature and the second feature according to one embodiment, and has either only the first feature or only the second feature according to a further embodiment.
Claims (11)
1. A method (1500) for operating an autostereoscopic field-of-view display (102) of a vehicle (100), wherein the method (1500) comprises the following step:
adjusting (1502) a disparity (700) between a right image (402) and a left image (404) of the field-of-view display (102) using a convergence angle (702) between a right visual axis (502) of a right eye (704) of an observer (108) of the field-of-view display (102) and a left visual axis (502) of a left eye (708), wherein the images (402, 404) are adjusted to zero disparity when the visual axes (502) intersect in the projection plane (504) of the images (402, 404).
2. The method (1500) according to claim 1, comprising the following step (506): determining a position of the images (402, 404) in the projection plane (504) using the visual axes (502) and the eye position of the observer (108), wherein in the adjusting step (1502) the images are additionally adjusted to the determined position.
3. The method (1500) according to any one of the preceding claims, comprising the following step: reading in eye information from an eye detection device (414) of the vehicle (100), the eye detection device being designed to detect the eyes (704, 708) of the observer (108), wherein the eye position is represented by a right eye position value and a left eye position value (1604) of the eye information, and the visual axes (502) are represented by a right gaze direction value and a left gaze direction value (1606) of the eye information, and wherein in the adjusting step (1502) the convergence angle (702) is determined from the gaze direction values (1606) and the eye position values (1604).
4. The method (1500) according to any one of the preceding claims, wherein in the adjusting step (1502) the disparity (700) is adjusted within a comfort zone (1200) related to the observer.
5. The method (1500) according to claim 4, comprising a step of matching the comfort zone (1200), wherein the comfort zone (1200) is matched in response to an input of a user (108).
6. The method (1500) according to any one of claims 4 to 5, wherein in the matching step the comfort zone (1200) is matched as a function of the number of depth planes to be displayed, the comfort zone (1200) being reduced when the number of depth planes increases.
7. The method (1500) according to any one of claims 4 to 6, wherein in the matching step the comfort zone (1200) is varied as a function of the user, the comfort zone (1200) being reduced when the observer (108) is fatigued.
8. The method (1500) according to claim 7, wherein in the matching step the fatigue of the observer (108) is identified using the eye information, in particular by evaluating an eyelid-closure frequency and/or an eyelid-closure duration.
9. A control device (400) designed to carry out all steps of the method (1500) according to any one of the preceding claims.
10. A computer program arranged to carry out all steps of the method according to any one of the preceding claims.
11. A machine-readable storage medium on which the computer program according to claim 10 is stored.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015205167.6A DE102015205167A1 (en) | 2015-03-23 | 2015-03-23 | Method and control device for operating an autostereoscopic field of view display device for a vehicle |
DE102015205167.6 | 2015-03-23 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105988220A true CN105988220A (en) | 2016-10-05 |
CN105988220B CN105988220B (en) | 2020-09-15 |
Family
ID=56889647
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610163253.XA Expired - Fee Related CN105988220B (en) | 2015-03-23 | 2016-03-22 | Method and control device for operating an autostereoscopic field display for a vehicle |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN105988220B (en) |
DE (1) | DE102015205167A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110001400A (en) * | 2017-12-06 | 2019-07-12 | 矢崎总业株式会社 | Display apparatus |
CN110024381A (en) * | 2016-12-01 | 2019-07-16 | 夏普株式会社 | Display device and electronics rearview mirror |
CN110262049A (en) * | 2019-06-26 | 2019-09-20 | 爱驰汽车有限公司 | Naked eye stereo-picture display component, windshield and automobile using it |
CN110579879A (en) * | 2019-09-17 | 2019-12-17 | 中国第一汽车股份有限公司 | vehicle-mounted head-up display system and control method thereof |
CN112313558A (en) * | 2018-08-08 | 2021-02-02 | 宝马股份公司 | Method for operating a field-of-view display device for a motor vehicle |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017213654A1 (en) * | 2017-08-07 | 2019-02-07 | Bayerische Motoren Werke Aktiengesellschaft | User interface, door operating module, roof control module and means of transport for displaying user-facing control element labels |
DE102018215266B4 (en) | 2018-09-07 | 2024-04-18 | Audi Ag | Motor vehicle with a display device for providing a three-dimensional display |
DE102019206358B4 (en) * | 2019-05-03 | 2022-04-21 | Audi Ag | Camera device and method for generating an image of an environment |
WO2023208907A1 (en) | 2022-04-27 | 2023-11-02 | Saint-Gobain Glass France | Composite pane with a first reflective layer and a second reflective layer |
WO2023208962A1 (en) | 2022-04-27 | 2023-11-02 | Saint-Gobain Glass France | Composite pane with a reflective layer and a hologram element |
CN115314699A (en) * | 2022-06-27 | 2022-11-08 | 中国第一汽车股份有限公司 | Control method and control device of display screen and vehicle |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102314315A (en) * | 2010-07-09 | 2012-01-11 | 株式会社东芝 | Display device, image data generating device, image data generation program and display packing |
DE102014001710A1 (en) * | 2014-02-08 | 2014-08-14 | Daimler Ag | Device for augmented representation of virtual image object in real environment of vehicle, e.g. head-up-display, reflects partial images by disc, so that virtual image object is output as virtual depth image in real environment |
CN104253990A (en) * | 2013-06-28 | 2014-12-31 | 罗伯特·博世有限公司 | Method and device for displaying three-dimensional image of imager of vehicle view display device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19704740B4 (en) | 1997-02-13 | 2006-07-13 | Eads Deutschland Gmbh | Holographic screen and manufacturing process |
- 2015-03-23: DE application DE102015205167.6A filed — patent/DE102015205167A1/en, not_active, Withdrawn
- 2016-03-22: CN application CN201610163253.XA filed — patent/CN105988220B/en, not_active, Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102314315A (en) * | 2010-07-09 | 2012-01-11 | 株式会社东芝 | Display device, image data generating device, image data generation program and display packing |
CN104253990A (en) * | 2013-06-28 | 2014-12-31 | 罗伯特·博世有限公司 | Method and device for displaying three-dimensional image of imager of vehicle view display device |
DE102014001710A1 (en) * | 2014-02-08 | 2014-08-14 | Daimler Ag | Device for augmented representation of virtual image object in real environment of vehicle, e.g. head-up-display, reflects partial images by disc, so that virtual image object is output as virtual depth image in real environment |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110024381A (en) * | 2016-12-01 | 2019-07-16 | 夏普株式会社 | Display device and electronics rearview mirror |
CN110001400A (en) * | 2017-12-06 | 2019-07-12 | 矢崎总业株式会社 | Display apparatus |
CN110001400B (en) * | 2017-12-06 | 2022-03-29 | 矢崎总业株式会社 | Display device for vehicle |
CN112313558A (en) * | 2018-08-08 | 2021-02-02 | 宝马股份公司 | Method for operating a field-of-view display device for a motor vehicle |
US11945306B2 (en) | 2018-08-08 | 2024-04-02 | Bayerische Motoren Werke Aktiengesellschaft | Method for operating a visual field display device for a motor vehicle |
CN110262049A (en) * | 2019-06-26 | 2019-09-20 | 爱驰汽车有限公司 | Naked eye stereo-picture display component, windshield and automobile using it |
CN110579879A (en) * | 2019-09-17 | 2019-12-17 | 中国第一汽车股份有限公司 | vehicle-mounted head-up display system and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
DE102015205167A1 (en) | 2016-09-29 |
CN105988220B (en) | 2020-09-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105988220A (en) | Method and control device for operating vision field display device for vehicle | |
JP7370340B2 (en) | Enhanced augmented reality experience on heads-up display | |
US10650605B2 (en) | Apparatuses, methods and systems coupling visual accommodation and visual convergence to the same plane at any depth of an object of interest | |
EP2914002B1 (en) | Virtual see-through instrument cluster with live video | |
US7952808B2 (en) | Display system for vehicle and display method | |
US20170161950A1 (en) | Augmented reality system and image processing of obscured objects | |
JP6378781B2 (en) | Head-mounted display device and video display system | |
WO2016113951A1 (en) | Head-mounted display device and video display system | |
JP2015015708A (en) | Method and device for displaying three-dimensional image using video unit of view field display device for vehicle | |
JP2010070066A (en) | Head-up display | |
WO2018100239A1 (en) | Imaging system and method of producing images for display apparatus | |
US10274726B2 (en) | Dynamic eyebox correction for automotive head-up display | |
WO2019058492A1 (en) | Display system and display method | |
US9684166B2 (en) | Motor vehicle and display of a three-dimensional graphical object | |
US20230043244A1 (en) | Image display device including moveable display element and image display method | |
EP3966670B1 (en) | Display apparatus and method of correcting image distortion therefor | |
JP2009278234A (en) | Display system | |
JP2019133116A (en) | Display system, movable body, and design method | |
CN111902859B (en) | Information processing device, information processing method, and program | |
JP4929768B2 (en) | Visual information presentation device and visual information presentation method | |
JP2019083385A (en) | Head-up display unit | |
WO2018101170A1 (en) | Display device and electronic mirror | |
JP2019066562A (en) | Display, display control method, and program | |
JP2019066564A (en) | Display, display control method, and program | |
WO2017208148A1 (en) | Wearable visor for augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | |
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20200915 |