CN104512336A - 3-dimensional (3-D) navigation - Google Patents

3-dimensional (3-D) navigation

Info

Publication number
CN104512336A
Authority
CN
China
Prior art keywords
vehicle
graphical element
focal plane
hud
road conditions
Prior art date
Legal status
Granted
Application number
CN201410515899.0A
Other languages
Chinese (zh)
Other versions
CN104512336B (en)
Inventor
Kikuo Fujimura
V. Ng-Thow-Hing
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Priority claimed from US 14/041,614 (published as US 2016/0054563 A9)
Application filed by Honda Motor Co Ltd
Publication of CN104512336A
Application granted
Publication of CN104512336B
Status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/36 Input/output arrangements for on-board computers
    • G01C 21/3626 Details of the output of route guidance instructions
    • G01C 21/3635 Guidance using 3D or perspective road maps
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/36 Input/output arrangements for on-board computers
    • G01C 21/3626 Details of the output of route guidance instructions
    • G01C 21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0127 Head-up displays characterised by optical features comprising devices increasing the depth of field
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features

Abstract

The invention relates to 3-dimensional (3-D) navigation. One or more embodiments of techniques or systems for 3-dimensional (3-D) navigation are provided herein. A head-up display (HUD) component can project graphic elements on focal planes in the environment surrounding a vehicle. The HUD component can make the graphic elements appear volumetric or 3-dimensional by moving or adjusting the distance between the focal planes and the vehicle. Additionally, target positions for the graphic elements can be adjusted so that the HUD component projects the graphic elements as moving virtual images. In other words, the focal-plane distance and the target positions can be adjusted such that graphic elements are projected in three dimensions, along the x, y, and z axes. Furthermore, a moving virtual image can be animated by sequentially projecting it on different focal planes, so that occupants can perceive whether the virtual image is moving toward or away from the vehicle.

Description

3-dimensional (3-D) navigation
Cross-Reference to Related Applications
This application is a continuation-in-part (CIP) of U.S. Non-Provisional Patent Application Serial No. 13/832,918 (Attorney Docket No. HRA-36332.01), entitled "VOLUMETRIC HEADS-UP DISPLAY WITH DYNAMIC FOCAL PLANE", filed on March 15, 2013. The above-referenced application is incorporated herein by reference in its entirety.
Background
To promote driver convenience, a vehicle can be provided with a head-up display (HUD) that displays information to the driver. Information displayed by the HUD can be projected onto the windshield of the vehicle such that the information is presented within the driver's view while the driver drives. By displaying information within the driver's view, the driver can view the presented information while driving without looking away from the windshield (e.g., toward an instrument display on the center console).
The HUD can present vehicle information typically displayed on the vehicle's central instrument panel, such as information related to the vehicle's speed, fuel level, engine temperature, etc. In addition, the HUD can present map information and communication events (e.g., navigation instructions, turn instructions, warnings, alerts, etc.) to the driver. A vehicle HUD can present information to the driver in a manner similar to a vehicle instrument panel, such as by presenting gauges and text boxes that appear as graphic elements on the windshield. In addition, a vehicle HUD can present augmented-reality graphic elements, which use real-time information to augment the physical environment surrounding the vehicle.
However, existing HUD devices used in vehicles may not be able to present augmented-reality graphic elements with consistent depth cues. As a result, augmented-reality graphic elements presented by existing vehicle HUDs may appear to be painted onto the surface of the windshield.
Summary
This brief description introduces, in simplified form, a selection of concepts that are further described below in the detailed description. This brief description is not intended to be an extensive overview of the claimed subject matter, to identify key or essential elements of the claimed subject matter, or to limit the scope of the claimed subject matter.
According to one aspect, a vehicle head-up display device for displaying graphic elements in the view of a vehicle driver includes a first projector and a first actuator. The first projector can be configured to project a first graphic element on a first focal plane in the driver's view. The first focal plane can be oriented substantially perpendicular to the driver's line of sight and positioned at a distance from the vehicle. The first projector can be mounted on the first actuator, which can be configured to linearly move the first projector. Linearly moving the first projector moves the first focal plane of the first graphic element along the direction of the driver's line of sight.
According to another aspect, a vehicle head-up display system includes a vehicle head-up display device and a controller. The vehicle head-up display device displays graphic elements in the view of the vehicle driver and includes a first projector and a second projector. The first projector can be configured to project a first graphic element on a first focal plane in the driver's view. The first focal plane may be oriented substantially perpendicular to the driver's line of sight. The first projector can be configured to move the first focal plane along the direction of the driver's line of sight. The second projector can be configured to project a second graphic element on a second focal plane in the driver's view. The second focal plane can be static and oriented substantially parallel to the ground surface. The controller can be configured to communicate with one or more associated vehicle control systems, and to control the vehicle head-up display device to display the first and second graphic elements based on the communication with the one or more associated vehicle control systems.
According to another aspect, a method for presenting augmented-reality graphic elements on a vehicle head-up display includes projecting a first graphic element on a first focal plane in the driver's view, and projecting a second graphic element on a second focal plane in the driver's view. The first focal plane may be oriented substantially perpendicular to the driver's line of sight, and the second focal plane can be static and oriented substantially parallel to the ground surface. The method can include moving or adjusting the first focal plane along the direction of the driver's line of sight.
One or more embodiments of techniques or systems for 3-dimensional (3-D) navigation are provided herein. For example, a system for 3-D navigation can project graphic elements or virtual images that appear to move within the field of view of a vehicle occupant. In one or more embodiments, a head-up display (HUD) component can be configured to project the graphic elements or virtual images on one or more focal planes in the environment surrounding the vehicle. In other words, the HUD component can project graphic elements or virtual images at adjustable distances or on adjustable focal planes, providing a vehicle occupant with the perception that the virtual image or graphic element is moving, flying, animated, etc.
As an example, the HUD component can be configured to make a virtual image appear to move, or to "animate" it, by sequentially projecting the virtual image on one or more different focal planes. Projection on these focal planes can be achieved, for example, with actuators that move the projectors of the HUD component. As a result, depth cues such as accommodation and vergence associated with a graphic element or virtual image are generally preserved. When a route from a first location to a second location is generated, the HUD component can generate one or more graphic elements for the driver or an occupant of the vehicle to "follow". Because the HUD component can project on multiple focal planes, or move a projected graphic element from one focal plane to another, the graphic element or projected image can appear more "real", much like an image seen in a mirror.
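As a rough sketch of this sequential-projection idea (the `hud.project` interface, frame timing, and distances below are illustrative assumptions, not the patent's implementation):

```python
import time

def animate_away(hud, image, start_m=5.0, end_m=40.0, steps=20, dt=0.05):
    """Make a virtual image appear to recede by stepping it through focal planes.

    hud.project(image, d) is a hypothetical call that renders `image` on a
    focal plane d meters ahead of the vehicle (i.e., an actuator re-places
    the projector so the focal plane lands at that distance).
    """
    step = (end_m - start_m) / steps
    for i in range(steps + 1):
        hud.project(image, start_m + i * step)  # next focal plane, slightly farther out
        time.sleep(dt)                          # perceived as motion away from the vehicle
```

Reversing the sequence would make the virtual image appear to approach the vehicle instead.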
A graphic element such as a virtual image can be provided when a vehicle occupant requests navigation directions. For example, the virtual image can appear to move, glide, or fly ahead of the vehicle, similar to what an occupant or driver would see when following a friend's vehicle. Further, the virtual image can appear to navigate around obstructions, obstacles, pedestrians, debris, potholes, etc., in the same way a real vehicle would. In one or more embodiments, the virtual image can "drive", move, or appear to move according to real-time traffic. For example, if the route takes the driver or vehicle across railroad tracks, the virtual image can stop before the tracks while a train passes. As another example, the virtual image can change lanes in a manner such that it does not appear to "collide" with another vehicle or otherwise disrupt traffic.
The following description and annexed drawings set forth certain illustrative aspects and implementations. These are indicative of but a few of the various ways in which one or more aspects can be employed. Other aspects, advantages, or novel features of the disclosure will become apparent from the following detailed description when considered in conjunction with the annexed drawings.
Brief description of the drawings
Aspects of the disclosure are understood from the following detailed description when read with the accompanying drawings. Elements, structures, etc. of the drawings are not necessarily drawn to scale. Accordingly, their dimensions may be arbitrarily increased or reduced, for example, for clarity of discussion.
Fig. 1 is an illustration of an example schematic diagram of a vehicle head-up display system, according to one or more embodiments.
Fig. 2 is an illustration of an example schematic diagram of a vehicle in which a vehicle head-up display system is provided, according to one or more embodiments.
Fig. 3 is an illustration of an example side view of a vehicle and four focal planes on which a vehicle head-up display system projects graphic elements, according to one or more embodiments.
Fig. 4 is an illustration of an example view seen by a driver through a vehicle windshield while driving, and of example graphic elements projected by a vehicle head-up display system, according to one or more embodiments.
Fig. 5 is an illustration of an example component diagram of a system for 3-D navigation, according to one or more embodiments.
Fig. 6 is an illustration of an example flow diagram of a method for 3-D navigation, according to one or more embodiments.
Fig. 7A is an illustration of an example virtual image for 3-D navigation, according to one or more embodiments.
Fig. 7B is an illustration of an example virtual image for 3-D navigation, according to one or more embodiments.
Fig. 8A is an illustration of an example virtual image for 3-D navigation, according to one or more embodiments.
Fig. 8B is an illustration of an example virtual image for 3-D navigation, according to one or more embodiments.
Fig. 9A is an illustration of an example virtual image for 3-D navigation, according to one or more embodiments.
Fig. 9B is an illustration of an example virtual image for 3-D navigation, according to one or more embodiments.
Fig. 10 is an illustration of an example computer-readable medium or computer-readable device comprising processor-executable instructions configured to embody one or more of the provisions set forth herein, according to one or more embodiments.
Fig. 11 is an illustration of an example computing environment in which one or more of the provisions set forth herein are implemented, according to one or more embodiments.
Detailed description
Embodiments or examples illustrated in the drawings are disclosed below using specific language. It will nevertheless be understood that the embodiments or examples are not intended to be limiting. Any alterations and modifications to the disclosed embodiments, and any further applications of the principles disclosed herein, are contemplated as would normally occur to one of ordinary skill in the art.
For one or more of the figures herein, one or more boundaries, such as boundary 116 of Fig. 2, are drawn with different heights, widths, perimeters, aspect ratios, shapes, etc. relative to one another merely for illustrative purposes and are not necessarily drawn to scale. For example, because dashed or dotted lines are used to represent different boundaries, if the dashed and dotted lines were drawn on top of one another they would not be distinguishable in the figures; therefore, they may be drawn with different dimensions or slightly apart from one another in one or more of the figures so that they are distinguishable from one another. As another example, where a boundary is associated with an irregular shape, the boundary, such as a box drawn with a dashed line, dotted line, etc., does not necessarily encompass an entire component in one or more instances. Conversely, a drawn box does not necessarily encompass only its associated component; in one or more instances, it can encompass a portion of one or more other components as well.
Graphic elements that are visually placed on environmental elements within the driver's direct view by a vehicle HUD device are often referred to as contact-analog or conformal augmented-reality graphic elements. Successfully presenting contact-analog augmented-reality graphic elements to a vehicle driver can depend on the ability of the vehicle HUD device to correctly reproduce depth cues. These depth cues can include accommodation and vergence. Accommodation is a depth cue in which the eye muscles actively change the optical power of the eye to shift focus between objects at different distances. Vergence is the simultaneous inward rotation of the eyes toward each other to maintain a single binocular image when viewing an object.
Although the examples described herein may refer to the driver of a vehicle, graphic elements can be projected, provided, or rendered within the view of one or more other occupants of the vehicle, such as passengers. Accordingly, these examples are not intended to be limiting and are merely disclosed to illustrate one or more exemplary aspects of this application.
When a HUD device displays graphic elements on the windshield of the vehicle, accommodation can cause the human eye to shift between environmental elements and the information displayed by the HUD device. Vergence, meanwhile, causes the eyes to converge on points in the environment beyond the windshield, which can cause HUD graphic elements displayed on the windshield to appear as ghost or double images. Accordingly, to present contact-analog augmented-reality graphic elements with correctly reproduced depth cues, the graphic elements should be presented in the same space as the real environment (e.g., rendered on the corresponding focal planes) rather than on the windshield of the vehicle.
A vehicle head-up display device is provided for displaying graphic elements in the driver's view while the driver views the environment through the windshield. The head-up display device can include one or more projectors that project a graphic element on a frontal focal plane in the driver's view while the driver views the environment through the windshield, and one or more projectors that project a graphic element on a ground-parallel focal plane in the driver's view while the driver views the environment through the windshield. The projector that projects the graphic element on the frontal focal plane can be mounted on an actuator that can linearly move the projector such that the frontal focal plane moves along the direction of the driver's line of sight. The projector that projects on the ground-parallel focal plane can be fixedly arranged such that the ground-parallel focal plane is static.
Referring to Fig. 1, a vehicle volumetric head-up display system 100 (HUD system 100 or HUD component 100) is illustrated, which can present volumetric contact-analog augmented-reality graphic elements (e.g., 3-dimensional or "3-D" graphic elements rendered into the same space as the real environment) with correctly reproduced depth cues. The HUD system 100 includes a vehicle head-up display device 102 ("HUD device 102") and a controller 104 (or "controller component 104"). Referring to Fig. 2, the HUD system 100 can be provided in a vehicle 106, which includes a driver's seat 108, a dashboard enclosure 110, and a windshield 112.
The configuration of the vehicle 106 with respect to the relative positioning of the driver's seat 108, the dashboard enclosure 110, and the windshield 112, for example, can be conventional. To accommodate the HUD system 100 described herein, the dashboard enclosure 110 defines a housing space in which the HUD system 100 is housed. Further, the dashboard enclosure 110 has a HUD exit aperture 114 defined through an upper surface thereof. The HUD system 100 housed in the dashboard enclosure 110 projects graphic elements, such as contact-analog augmented-reality graphic elements, through the HUD exit aperture 114 onto the windshield 112, which can serve as a display screen for the HUD system 100. As described in further detail below, the augmented-reality graphic elements can be presented to the user as existing in the same space as the real environment.
The driver drives the vehicle 106 while seated in the driver's seat 108. Accordingly, the driver's position can be constrained to a seated position in the driver's seat 108 of the vehicle 106. In view of this position constraint, the HUD system 100 can be designed using the assumption that the driver's view originates from an eye box 116 within the vehicle. The eye box 116 can be considered to include a region of the vehicle 106 in which the driver's eyes are located while the driver is seated in the driver's seat 108.
The eye box 116 can be sized to encompass all possible head positions of the driver regardless of the position and posture of the driver's seat 108, or the HUD system 100 can be configured to detect the position and posture of the driver's seat 108 and to adjust the position and size of the eye box 116 accordingly. In one or more embodiments, the HUD system 100 can be designed assuming that the eye box 116 has a fixed size and is in a fixed position. For example, the eye box can have the following dimensions: 20 cm x 10 cm x 10 cm. In any event, the HUD system 100 can be configured to present contact-analog augmented-reality graphic elements to the driver while the driver's eyes are within the eye box 116 and the driver is facing/looking in a forward direction through the windshield 112 of the vehicle 106. Although the eye box 116 of Fig. 2 is illustrated for the driver of the vehicle 106, an eye box 116 can be arranged to include one or more other occupants of the vehicle. In one or more embodiments, one or more additional eye boxes or HUD devices can be provided for passengers or other occupants, for example.
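As a rough illustration of the fixed-eye-box assumption (a minimal sketch; the 20 cm x 10 cm x 10 cm dimensions come from the example above, while the coordinate frame and the seat-tracking variant are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EyeBox:
    """Axis-aligned box, in meters, assumed to contain the driver's eyes."""
    center: tuple[float, float, float]                     # (x, y, z) in a vehicle frame
    size: tuple[float, float, float] = (0.20, 0.10, 0.10)  # the 20 x 10 x 10 cm example

    def contains(self, eye: tuple[float, float, float]) -> bool:
        return all(abs(e - c) <= s / 2 for e, c, s in zip(eye, self.center, self.size))

    def recentered(self, seat_offset: tuple[float, float, float]) -> "EyeBox":
        # Hypothetical seat-tracking variant: shift the box with the detected seat position.
        return EyeBox(tuple(c + o for c, o in zip(self.center, seat_offset)), self.size)
```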
The HUD device 102 displays one or more graphic elements in the driver's view while the driver of the vehicle 106 views the environment through the windshield 112 of the vehicle 106. Any graphic or environmental element viewed by the driver through the windshield 112, while the driver's eyes are within the eye box 116 and the driver is facing/looking in a forward direction, can be considered to be within the driver's view. As used herein, the view of the driver of the vehicle 106 while viewing the environment through the windshield 112 is intended to include the area viewed through the windshield 112, excluding any instrument displays within the vehicle 106. In other words, the HUD device 102 presents graphic elements such that the driver can view them without taking his or her eyes off the road.
Returning to Fig. 1, the HUD device 102 of the HUD system 100 includes a first projector 118, a second projector 120, a third projector 122, and a fourth projector 124. The first projector 118 and the third projector 122 share a first beam splitter 126 and a first objective lens 128, while the second projector 120 and the fourth projector 124 share a second beam splitter 130 and a second objective lens 132. Accordingly, the outputs of the first projector 118 and the third projector 122 can be received at the first beam splitter 126 and combined into a single output directed to (and through) the first objective lens 128. Similarly, the outputs of the second projector 120 and the fourth projector 124 can be received at the second beam splitter 130 and combined into a single output directed to (and through) the second objective lens 132.
The HUD device 102 further includes a third beam splitter 134 arranged downstream of the first and second objective lenses 128, 132 and configured to receive the outputs from the first and second objective lenses 128, 132. The outputs from the first and second objective lenses 128, 132 can be combined at the third beam splitter 134 into a single output, which can be a combination of the outputs from all of the first, second, third, and fourth projectors 118, 120, 122, 124, and directed to (and through) a third objective lens 136 and an eyepiece 138 before being directed out of the HUD exit aperture 114 to the windshield 112, which can serve as a display screen for the HUD system 100.
Each of the first projector 118, the second projector 120, the third projector 122, and the fourth projector 124 includes a projector unit 140, 142, 144, 146 and a diffuser screen 148, 150, 152, 154 rigidly fixed at a set distance from, and arranged relative to, the associated projector unit 140, 142, 144, 146, such that light emitted from the projector unit 140, 142, 144, 146 passes through the diffuser screen 148, 150, 152, 154. The projector units 140, 142, 144, 146 can be light-emitting units that project an image or graphic element onto the associated diffuser screen 148, 150, 152, 154. The diffuser screens 148, 150, 152, 154 serve as luminous image sources (or objects) for the rest of the optical system of the HUD device 102 and ensure that most of the light leaving the diffuser screens 148, 150, 152, 154 enters the downstream optics (e.g., the first beam splitter 126, the first objective lens 128, the second beam splitter 130, the second objective lens 132, the third beam splitter 134, the third objective lens 136, and the eyepiece 138) while spreading out so that it eventually fills the eye box 116, thereby keeping the luminance of the image or graphic element(s) constant while the driver's head moves within the eye box 116. Accordingly, the use of the diffuser screens 148, 150, 152, 154 substantially prevents different parts of the image or graphic element(s) from being visible only from different points within the eye box 116, and therefore substantially prevents different visual behavior from occurring with slight head movement.
The projector units 140, 142, 144, 146 can take the form of any light-emitting unit capable of projecting an image or graphic element and suitable for the use(s) described herein. Similarly, the diffuser screens 148, 150, 152, 154 can take the form of any light-diffusing screen suitable for the use(s) described herein.
The first projector 118 can be mounted on a first actuator 156 within the HUD device 102. The first actuator 156 can be a linear actuator capable of moving the first projector 118 in a linear direction toward and away from the first beam splitter 126. Likewise, the third projector 122 can be mounted on a second actuator 158 within the HUD device 102. The second actuator 158 can be a linear actuator capable of moving the third projector 122 in a linear direction toward and away from the first beam splitter 126. The first and second actuators 156, 158 can take the form of any linear actuator suitable for the use described herein. The ability of the first projector 118 and the third projector 122 to move linearly allows them to project graphic elements on dynamic or movable focal planes. In contrast to the first and third projectors 118, 122, the second and fourth projectors 120, 124 can be fixedly arranged in the HUD device 102 and therefore project graphic elements on static focal planes.
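The leverage that millimeters of projector travel have on focal-plane distance can be pictured with a thin-lens model (a minimal sketch under an assumed single-ideal-lens simplification of the optical train; the 10 cm focal length is illustrative, not a parameter from the patent):

```python
def virtual_image_distance(f_m: float, object_dist_m: float) -> float:
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image distance d_i.

    With the diffuser screen placed inside the focal length (d_o < f), d_i comes
    out negative, i.e. a virtual image forms beyond the lens -- and it recedes
    rapidly as d_o approaches f, which is why small actuator movements can sweep
    the focal plane across many meters in front of the vehicle.
    """
    return 1.0 / (1.0 / f_m - 1.0 / object_dist_m)

f = 0.10  # assumed 10 cm effective focal length
for d_o in (0.090, 0.095, 0.099):  # actuator moves the diffuser screen toward f
    d_i = virtual_image_distance(f, d_o)
    print(f"diffuser at {d_o * 100:.1f} cm -> virtual image at {abs(d_i):.1f} m")
```

Running this prints virtual-image distances of roughly 0.9 m, 1.9 m, and 9.9 m for less than a centimeter of travel.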
Using the first, second, third, and fourth projectors 118, 120, 122, 124, the HUD device 102 can render graphic elements (contact-analog augmented-reality graphic elements or otherwise) on four different focal planes in the environment viewed by the driver through the windshield 112. Specifically, the first projector 118 can be configured to project a first graphic element 160 on a first focal plane 162, the second projector 120 can be configured to project a second graphic element 164 on a second focal plane 166, the third projector 122 can be configured to project a third graphic element 168 on a third focal plane 170, and the fourth projector 124 can be configured to project a fourth graphic element 172 on a fourth focal plane 174 (described with reference to Figs. 3 and 4). All of the first, second, third, and fourth graphic elements 160, 164, 168, 172, and their associated first, second, third, and fourth focal planes 162, 166, 170, 174, can be rendered simultaneously in the driver's view, within the environment viewed through the windshield 112, while the driver drives the vehicle 106 with his or her eyes in the eye box 116, looking in the forward direction.
Referring to Figs. 3 and 4, the projection of the first, second, third, and fourth graphic elements 160, 164, 168, 172 on the first, second, third, and fourth focal planes 162, 166, 170, 174 will be described with reference to a ground surface 176 and a line of sight 178 of the driver. The ground surface 176 is the surface of the road ahead of the vehicle 106. For the purposes of this description, the ground surface 176 will be assumed to be a substantially flat surface. The driver's line of sight 178 is a straight line extending from the eye box 116 in the forward direction, substantially parallel to the ground surface 176. As used herein, the direction of the line of sight 178 is along the line of sight 178, toward and away from the driver and the vehicle 106.
The first focal plane 162 is a frontal focal plane that may be oriented substantially perpendicular to the driver's line of sight 178. The third focal plane 170 is also a frontal focal plane that may be oriented substantially perpendicular to the driver's line of sight 178. The first and third focal planes 162, 170 can be dynamic focal planes, movable along the direction of the line of sight 178 in a forward direction (away from the vehicle 106) and a rearward direction (toward the vehicle 106). The second focal plane 166 is a ground-parallel focal plane that may be oriented substantially parallel to the ground surface 176 and can be arranged on the ground surface 176, such that the second focal plane 166 is a ground focal plane. The fourth focal plane 174 is also a ground-parallel focal plane that may be oriented substantially parallel to the ground surface 176 and arranged above the ground surface 176. The fourth focal plane can be arranged above the ground surface 176 and the driver's line of sight 178 so as to be a sky or ceiling focal plane. Accordingly, the second and fourth focal planes 166, 174 can be static focal planes.
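The four-plane arrangement just described can be summarized in a small structure (a sketch; the field names are illustrative, and the per-plane uses noted in the comments are taken from the examples later in this description):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FocalPlane:
    name: str
    orientation: str   # "frontal" (perpendicular to the line of sight) or "ground-parallel"
    dynamic: bool      # True if an actuator can move it along the line of sight

PLANES = [
    FocalPlane("first (162)",  "frontal",         dynamic=True),   # e.g. obstacle warnings
    FocalPlane("second (166)", "ground-parallel", dynamic=False),  # ground plane: route overlays
    FocalPlane("third (170)",  "frontal",         dynamic=True),   # e.g. navigation instructions
    FocalPlane("fourth (174)", "ground-parallel", dynamic=False),  # sky plane: surroundings indicator
]
```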
Referring to Fig. 4, the first, second, third, and fourth graphic elements 160, 164, 168, 172 can be used to present different information to the driver. The exact type of information presented by the first, second, third, and fourth graphic elements 160, 164, 168, 172 can vary. For exemplary purposes, the first graphic element 160 and the third graphic element 168 can present warnings to the driver, such as an instruction to yield to a hazard or obstacle, or can provide navigation instructions or driving instructions associated with the road (e.g., a stop sign, a yield sign, etc.). The second graphic element 164 and the fourth graphic element 172 can present navigation instructions to the driver as graphic overlays rendered on the ground surface 176, or can present a vehicle-surroundings indicator to the driver. The first, second, third, and fourth graphic elements 160, 164, 168, 172 can present information or graphic elements other than those described herein, and a subset of the first, second, third, and fourth graphic elements 160, 164, 168, 172 can be presented.
Returning to Fig. 1, the controller 104 can include one or more computers, (e.g., arithmetic) processors, or any other device capable of communicating with one or more vehicle control systems 180 and controlling the HUD device 102. The one or more vehicle control systems 180 ("vehicle control systems 180" or "vehicle control components 180" herein) can take the form of any vehicle control system 180 that actively or passively facilitates control of the vehicle 106. The vehicle control systems 180 can include, or communicate with, one or more sensors (not shown) that detect driving and environmental conditions related to the operation of the vehicle 106.
With general reference to the operation of the HUD system 100, the controller 104 communicates with the vehicle control systems 180 and determines the type and position of the graphic elements to be presented to the driver of the vehicle 106 based on the communication with the vehicle control systems 180. The controller 104 determines the types of graphic elements to be rendered as the first, second, third, and fourth graphic elements 160, 164, 168, 172 by the first, second, third, and fourth projectors 118, 120, 122, 124, and controls the first, second, third, and fourth projectors 118, 120, 122, 124 to project the first, second, third, and fourth graphic elements 160, 164, 168, 172 as the determined graphic elements. The controller 104 can determine a target first graphic element position and a target third graphic element position as the target positions in the environment at which the first and third graphic elements 160, 168 are to be presented to the driver. The controller 104 then controls the first and second actuators 156, 158 to linearly move the first and third projectors 118, 122 such that the first and third focal planes 162, 170 move to the target first and third graphic element positions, respectively.
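One way to picture the controller's role is as a loop of roughly this shape (a minimal sketch under assumed interfaces; `poll_events`, `choose_graphic`, `travel_for_distance`, and the projector API are hypothetical stand-ins, not the patent's implementation):

```python
def hud_update(controller, vehicle_systems, projectors, actuators):
    """One controller iteration: choose graphics, then place the focal planes."""
    for event in vehicle_systems.poll_events():      # obstacles, routes, road conditions
        element = controller.choose_graphic(event)   # e.g. a yield sign for a pedestrian
        projector = projectors[element.plane]        # which of the four projectors draws it
        projector.render(element.image)
        if element.plane in actuators:               # only the frontal planes (162, 170) move
            # Convert the desired focal-plane distance into projector travel and go there.
            travel = actuators[element.plane].travel_for_distance(event.distance_m)
            actuators[element.plane].move_to(travel)
```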
Accordingly, the first graphic element 160 is projected by the first projector 118 on the first focal plane 162, which may be oriented substantially perpendicular to the driver's line of sight and can be moved toward and away from the vehicle 106, along the direction of the driver's line of sight 178, by the linear movement of the first projector 118 by the first actuator 156. The second graphic element 164 is projected by the second projector 120 on the second focal plane 166, which is static, oriented parallel to the ground surface 176, and arranged on the ground surface 176. The third graphic element 168 is projected by the third projector 122 on the third focal plane 170, which is oriented substantially perpendicular to the driver's line of sight and can be moved toward and away from the vehicle 106, along the direction of the line of sight 178, by the linear movement of the third projector 122 by the second actuator 158. The fourth graphic element 172 is projected by the fourth projector 124 on the fourth focal plane 174, which is static, oriented parallel to the ground surface 176, and can be arranged above the driver's line of sight 178. The controller 104 controls the first and second actuators 156, 158 to move the first and third projectors 118, 122 so as to move the first and third focal planes 162, 170.
By having the first and third projectors 118, 122 project the first and third graphic elements 160, 168 on the movable first and third focal planes 162, 170, oriented substantially perpendicular to the driver's line of sight 178, focus can be adjusted for objects at different distances from the vehicle 106. This can facilitate providing the driver with correct depth cues for the first and third graphic elements 160, 168, particularly since the HUD system 100 can be used in a vehicle application in which the vehicle 106 serves as a moving platform.
Although the second and fourth projectors 120, 124 project the second and fourth graphic elements 164, 172 on the static second and fourth focal planes 166, 174, the second and fourth focal planes 166, 174 can be continuous. To make the second and fourth focal planes 166, 174 parallel to the ground surface 176, the diffuser screens 150, 154 of the second and fourth projectors 120, 124 can be tilted. Because the distortion of the optical system of the HUD device 102 is very low and nearly telecentric for images on the ground-parallel focal planes, the light rays are close to parallel to the optical axis, which allows the projected second and fourth graphic elements 164, 172 to be projected or rendered without distortion or changes in magnification while the second and fourth focal planes 166, 174 are tilted. The resulting second and fourth graphic elements 164, 172 therefore appear on continuous focal planes parallel to the ground surface 176 (the second and fourth focal planes 166, 174). Accordingly, the second and fourth graphic elements 164, 172 can be rendered as actual 3-dimensional (3-D) volumetric shapes, rather than line segments, to add monocular cues and enhance depth perception.
The continuous, static second and fourth focal planes 166, 174 facilitate the driver's depth perception of the second and fourth graphic elements 164, 172. The continuous, static second and fourth focal planes 166, 174 allow real images or graphic elements to be correctly generated through the fore-aft direction of 3-D space (e.g., the direction of the driver's line of sight 178), which allows proper motion-parallax cues to be generated. Accordingly, as the driver's head moves side to side or up and down, the second and fourth graphic elements 164, 172 appear to the driver to be fixed in position in the environment rather than moving around. As a result, the HUD system 100 does not need a head-tracking function to compensate for movements of the driver's head.
With regard to the exemplary information listed above that can be presented to the driver, the vehicle control systems 180 can include processing and sensors capable of performing the following functions: hazard or obstacle detection; navigation; navigation instruction; and vehicle-surroundings (e.g., blind-spot) monitoring. The vehicle control systems 180 can include processing and sensors capable of performing other vehicle control functions (e.g., highway merge assistance), which can alternatively or additionally be associated with information presented to the driver using the HUD system 100. Regardless of the functions performed by the vehicle control systems 180, the vehicle control systems 180 are operable to perform those functions, including the associated sensing and processing, and the exact manner in which they do so is not necessarily relevant to the operation of the HUD system 100.
The controller 104 communicates with the vehicle control systems 180 and receives inputs therefrom related to the operation of the vehicle 106 and associated with the functions listed above (or others). The controller 104 then controls the HUD device 102 based on the inputs received from the vehicle control systems 180. In this regard, one or both of the controller 104 and the vehicle control systems 180 can determine: the types of graphic elements to be displayed as the first, second, third, and fourth graphic elements 160, 164, 168, 172; the positions of the first, second, third, and fourth graphic elements 160, 164, 168, 172; and which of the first, second, third, and fourth graphic elements 160, 164, 168, 172 to display. These determinations can be based on one or more vehicle functions employed by the driver, such as whether the driver is using the navigation function.
Regardless of whether the controller 104 or the vehicle control systems 180 make such determinations, the controller 104 controls the HUD device 102 to display the appropriate graphic elements at the appropriate positions. This can include controlling the first, second, third, and fourth projectors 118, 120, 122, 124 to project the appropriate first, second, third, and fourth graphic elements 160, 164, 168, 172. This can also include controlling the first and second actuators 156, 158 to linearly move the first and third projectors 118, 122 to move the first and third focal planes 162, 170 to the appropriate (e.g., target) positions. For example, one or more actuators, such as 156, 158, can be configured to move one or more of the focal planes, such as 162, 170. For example, with regard to the third focal plane 170, the distance between the third focal plane 170 and the windshield (e.g., 302) of the vehicle 106 can be adjusted by an adjustable distance 170'. Similarly, the distance 162' can be adjusted to change the target position of the focal plane 162.
In view of the exemplary information listed above that is associated with the first, second, third, and fourth graphic elements 160, 164, 168, 172, the operation of the HUD system will be described with reference to a vehicle 106 having vehicle control systems 180 that support the following functions: a hazard or obstacle detection and warning function; a navigation function; a navigation instruction function; and a vehicle-surroundings (e.g., blind-spot) monitoring function. Again, the vehicle 106 can have a subset of these functions or other functions, and the HUD system 100 can be used with reference to that subset or those other functions. The reference to these functions in the description of the HUD system 100 is exemplary only and is used to facilitate the description of the HUD system 100. Although one or both of the controller 104 and the vehicle control systems 180 can be associated with making determinations relevant to the operation of the HUD system 100, in the following description the controller 104 is described as being configured to make the determinations based on inputs received from the vehicle control systems 180.
Information related to the obstacle detection and warning function can be presented to the driver as a contact-analog augmented-reality graphic element projected by the first projector 118 of the HUD device 102. In this regard, the vehicle control systems 180 can detect various obstacles on the road on which the vehicle 106 is traveling. For example, obstacles can include pedestrians crossing the road, other vehicles, animals, debris, potholes, etc. in the road. These obstacles can be detected by processing information about the environment sensed by sensors (not shown) provided on the vehicle 106. Furthermore, obstacle detection can be performed in any manner.
When an obstacle is detected, the vehicle control systems 180 communicate obstacle information to the controller 104. The controller 104 receives the obstacle information from the vehicle control systems 180 and, based on the received obstacle information, determines the type of graphic element to be presented as the first graphic element 160 and the target first graphic element position. Although various types of graphic elements can be used, such as a flashing icon or other marker, the example described here references a "yield" sign that appears when an obstacle is detected.
Referring to Fig. 4, the obstacle detected by the vehicle control systems 180 may be a pedestrian 182 crossing the road on which the vehicle 106 is traveling. In the exemplary view of Fig. 4, the vehicle 106 is traveling on a road that the pedestrian 182 is crossing. Accordingly, the vehicle control systems 180 can send obstacle information related to the pedestrian 182 to the controller 104. Based on the obstacle information, the controller 104 can determine the type of graphic element to be displayed as the first graphic element 160; in this case, for example, the graphic element can be a "yield" sign, although other signs can be used. The controller 104 can determine the target first graphic element position such that the first graphic element 160 is projected and rendered so as to be perceived by the driver to be at the same depth (e.g., focal plane) as the pedestrian 182. Furthermore, the controller 104 can be configured, for example, to adjust the target first graphic element position such that the first graphic element 160 "tracks" or "follows" the pedestrian 182 as the pedestrian 182 walks.
The controller 104 then controls the first projector 118 to project the "yield" sign as the first graphic element 160, and controls the first actuator 156 to linearly move the first projector 118 such that the first graphic element 160 is projected and rendered so as to be perceived by the driver (e.g., while the driver's eyes are in the eye box 116 and the driver is looking in a forward direction through the windshield 112) to be at the same depth as the pedestrian 182. The first actuator 156 can be controlled such that the first graphic element 160 is projected on the first focal plane 162, which can be positioned at the target first graphic element position and may be oriented substantially perpendicular to the line of sight 178.
As the vehicle 106 and the pedestrian 182 move on the road, the relative distance between them will change. This change in distance can be communicated to the controller 104 by the vehicle control systems 180, the target first graphic element position can change accordingly, and the first actuator 156 can be controlled by the controller 104 to move the first focal plane 162 so that it remains at the (e.g., changed/changing) target first graphic element position. Accordingly, with the first graphic element 160 projected on the first focal plane 162, which can move along the direction of the driver's line of sight 178, the depth cues associated with the first graphic element 160 can be correctly reproduced such that the driver can accurately judge the position of the first graphic element 160 (e.g., of the detected obstacle).
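Sketched as a per-frame update (again under assumed interfaces; `range_to` is a hypothetical sensor query, and `travel_for_distance` could be derived from the thin-lens relation sketched earlier):

```python
def track_obstacle(sensors, first_actuator, obstacle_id):
    """Keep the first focal plane (162) pinned to a tracked obstacle's depth."""
    distance_m = sensors.range_to(obstacle_id)   # current distance to the pedestrian
    if distance_m is not None:                   # None once the obstacle leaves sensor range
        first_actuator.move_to(first_actuator.travel_for_distance(distance_m))
```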
In addition, information related to the navigation function can be presented to the driver as a contact-analog augmented-reality graphic element projected by the second projector 120 of the HUD device 102. In this regard, upon receiving a navigation request from the driver (e.g., input of a desired destination), the vehicle control systems 180 generate a navigation route to the desired destination for the driver to follow. The navigation route includes a set of driving directions for the driver to follow, including the streets along the route to the desired destination. The navigation function can be performed in any manner. When the navigation function is engaged, the vehicle control systems 180 can communicate the driving directions associated with the navigation function to the controller 104.
The controller 104 can receive the driving directions from the vehicle control systems 180 and determine the type of graphic element to be rendered as the second graphic element 164. The types of graphic elements associated with the navigation function can include a graphic element instructing the driver to continue on the current road (e.g., a straight line or arrow), to turn left or right at an upcoming intersection (e.g., a left/right arrow or line turning in the appropriate direction), or to enter, merge onto, or exit a highway (e.g., a line or arrow indicating the appropriate path), etc. The controller 104 selects the appropriate graphic element to be rendered as the second graphic element 164 based on the driving directions communicated from the vehicle control systems 180.
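The selection of the second graphic element then reduces to a lookup along these lines (a sketch; the maneuver names and glyph identifiers are illustrative, not a format specified by the patent):

```python
from enum import Enum, auto

class Maneuver(Enum):
    CONTINUE = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    HIGHWAY_MERGE = auto()

# Second graphic element (164): a ground-overlay glyph chosen per driving direction.
GROUND_GLYPHS = {
    Maneuver.CONTINUE: "straight_arrow",
    Maneuver.TURN_LEFT: "left_turn_arrow",
    Maneuver.TURN_RIGHT: "right_turn_arrow",
    Maneuver.HIGHWAY_MERGE: "merge_path",
}

def select_second_graphic(maneuver: Maneuver) -> str:
    return GROUND_GLYPHS[maneuver]
```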
Referring to the exemplary view of Fig. 4, the driving directions for the route determined by the navigation function of the vehicle control systems 180 include a left turn onto an upcoming street. Accordingly, the controller 104 controls the second projector 120 to generate a left-turn graphic element and project it as the second graphic element 164 on the second focal plane 166. As shown in Fig. 4, the second focal plane 166 may be oriented parallel to the ground surface 176 and arranged on the ground surface 176. As noted above, the second projector 120 can be fixedly arranged in the HUD device 102 such that the second focal plane 166 is static. As noted above, the second focal plane 166 can be continuous, such that the second graphic element 164 can be presented to the driver as a 3-D image with appropriate depth cues.
Similarly, information related to the navigation instruction function can be presented to the driver as a contact-analog augmented-reality graphic element projected by the third projector 122 of the HUD device 102. In this regard, the vehicle control systems 180 can use sensors, or information stored in a database and associated with a map, to monitor the road on which the vehicle 106 is traveling and to determine navigation instructions associated with upcoming travel on the road. For example, the vehicle control systems 180 can detect an upcoming required stop, yield, or other condition (referred to herein as a "road condition") on the road on which the vehicle 106 is traveling. The vehicle control systems 180 can determine the navigation instruction associated with the detected road condition (e.g., a stop instruction associated with a stop road condition). The navigation instruction function can be performed in any manner, and its details are not necessarily relevant to the operation of the HUD system 100. Furthermore, among other things, road conditions can include traffic on a road segment, obstructions, obstacles, weather conditions, road-surface conditions, a speed limit associated with a path segment or road segment, etc. In other words, road conditions can generally include reasons to accelerate, decelerate, detour, stop, or exercise caution while driving.
The vehicle control systems 180 communicate the road condition, or the navigation instruction associated with the road condition, along with information related to the position of the road condition, to the controller 104. The controller 104 can control the third projector 122 to project the third graphic element 168 to communicate, to the driver, information related to the road condition or associated with the corresponding navigation instruction. The controller 104 can receive the road condition or navigation instruction information and the position information from the vehicle control systems 180, and determine the type of graphic element to be rendered as the third graphic element 168 and the target third graphic element position.
Various types of graphic elements can be used by the navigation instruction function, such as: a stop sign, a yield sign, a one-way sign, a no-right-turn-on-red sign, etc. The type of graphic element can be selected to communicate the navigation instruction associated with the road condition. Whichever type of graphic element the controller 104 determines should be used as the third graphic element 168, the graphic element can be projected so as to appear at the position of the driving condition. In this regard, the target third graphic element position can be determined, relative to the position of the vehicle 106, as the position in the driver's view at which the third graphic element 168 should be rendered based on the detected road condition.
The controller 104 can be configured to control the third projector 122 to project the appropriate graphic element as the third graphic element 168. The controller can control the second actuator 158 to linearly move the third projector 122 such that the third graphic element 168 is projected and rendered so as to be perceived by the driver (e.g., while the driver's eyes are in the eye box 116 and the driver is looking in a forward direction through the windshield 112) to be at the same depth as the road condition (e.g., on the same focal plane). The second actuator 158 can be controlled such that the third graphic element 168 is projected on the third focal plane 170, which can be positioned at the target third graphic element position and oriented substantially perpendicular to the line of sight 178. The controller 104 can control the second actuator 158 to continuously and linearly move the third projector 122 such that the third focal plane 170 moves as the distance between the vehicle 106 and the detected road condition (e.g., the target third graphic element position) changes (as detected by the vehicle control systems 180 and communicated to the controller 104), for example as a result of the vehicle 106 traveling toward the detected road condition.
In the exemplary view from the driver's perspective in Fig. 4, the vehicle 106 is approaching a four-way intersection at which the vehicle 106 should stop. Accordingly, the vehicle control systems 180 detect a stop road condition at the entrance to the intersection, and determine that the navigation instruction associated with the stop road condition is a stop instruction. The stop road condition or instruction, along with the position of the stop road condition, can be communicated to the controller 104, and the controller 104 determines that a stop sign should be rendered as the third graphic element 168. The controller 104 can determine that the third graphic element 168 (e.g., the stop sign) should appear at the entrance to the four-way intersection. The entrance to the intersection can therefore be determined to be the target third graphic element position.
The controller 104 can control the third projector 122 to project a "STOP" sign as the third graphic element 168, and control the second actuator 158 to move the third projector 122 such that the third graphic element 168 is projected and rendered so as to be perceived by the driver (e.g., while the driver's eyes are in the eye box 116 and the driver is looking in a forward direction through the windshield 112) to be at the same depth as the entrance to the four-way intersection. The second actuator 158 can be controlled such that the third graphic element 168 is projected on the third focal plane 170, which is positioned at the target third graphic element position and oriented substantially perpendicular to the line of sight 178. As the vehicle 106 travels on the road, the distance between the vehicle 106 and the entrance to the four-way intersection will change. This change in distance can be communicated to the controller 104 by the vehicle control systems 180, the target third graphic element position can change accordingly, and the second actuator 158 can be controlled by the controller 104 to move the third focal plane 170 so that it remains at the (e.g., changed/changing) target third graphic element position. Accordingly, with the third graphic element 168 projected on the third focal plane 170, which can move along the direction of the driver's line of sight 178, the depth cues associated with the third graphic element 168 can be correctly reproduced such that the driver can accurately judge the position of the third graphic element 168 (e.g., of the detected road condition).
Information related to the vehicle-surroundings (e.g., blind-spot) monitoring function can be presented to the driver by the fourth projector 124 of the HUD device 102. In this regard, the vehicle control systems 180 can detect the presence of other vehicles in the area immediately surrounding the vehicle 106. The detection of other vehicles in the immediate vicinity of the vehicle 106 can be performed by processing information related to the surroundings of the vehicle 106 sensed by sensors (not shown) provided on the vehicle 106. The vehicle-surroundings determination can be performed in any manner.
The vehicle-surroundings information can be determined by the vehicle control systems 180 and communicated to the controller 104. The controller 104 receives the vehicle-surroundings information from the vehicle control systems 180 and determines whether, and how, to modify the fourth graphic element 172 projected on the fourth focal plane 174. In this regard, the graphic element that facilitates the vehicle-surroundings (e.g., blind-spot) monitoring function can be the vehicle-surroundings indicator shown in Fig. 4 as the fourth graphic element 172.
The vehicle-surroundings indicator includes a center marker representing the vehicle 106 and eight peripheral markers representing the positions immediately surrounding the vehicle 106. The vehicle control systems 180 communicate information related to the positions of vehicles in the immediate vicinity of the vehicle 106, and the controller 104 controls the fourth projector 124 to change the fourth graphic element 172 such that one or more of the eight associated peripheral markers are highlighted. The highlighting of the eight peripheral markers indicates to the driver the positions of other vehicles in the immediate vicinity of the vehicle 106.
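One way to drive the highlighting is to quantize each detection's bearing into one of eight 45-degree sectors around the vehicle (a sketch; the sector-numbering convention and detection format are assumptions):

```python
import math

def highlighted_markers(detections):
    """Return the peripheral-marker indices (0 = directly ahead, clockwise) to light up.

    `detections` is an iterable of (dx, dy) offsets, in meters, of nearby
    vehicles relative to our vehicle, with x forward and y to the right.
    """
    markers = set()
    for dx, dy in detections:
        bearing = math.degrees(math.atan2(dy, dx)) % 360   # 0 degrees = straight ahead
        markers.add(int(((bearing + 22.5) % 360) // 45))   # snap to one of 8 sectors
    return markers

print(highlighted_markers([(10.0, 0.5), (-5.0, -3.0)]))    # e.g. {0, 5}: ahead and rear-left
```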
In FIG. 4, the fourth graphical element 172 can be projected on the fourth focal plane 174, which can be oriented parallel to the ground surface 176 and arranged above the ground surface 176 and the line of sight 178. As noted above, the fourth projector 124 can be fixedly arranged in the HUD device 102 so that the fourth focal plane 174 is static. Also as noted above, the fourth focal plane 174 can be continuous, so that the fourth graphical element 172 can be presented to the driver as a 3-D image with appropriate depth cues.
The fourth graphical element 172 can take a form other than the vehicle-surroundings indicator of FIG. 4. In any case, the fourth graphical element 172 can be projected on the fourth focal plane 174, which may be oriented parallel to the ground surface 176 and arranged above the ground surface 176 and the driver's line of sight 178. The fourth graphical element 172 can thus be presented on a sky focal plane, which is appropriate because the information conveyed by the fourth graphical element 172 does not need to interact with the environment.
The HUD system 100 described above can project graphical elements with a continuously variable focal distance, and on ground-parallel focal planes whose focus varies continuously in the fore-and-aft direction of the driver's line of sight 178, with some graphical elements projected as contact-analog augmented reality graphical elements. Depth perception cues are thereby improved, and focusing is facilitated synchronously or nearly simultaneously, improving the driver's attention to the environment while information is presented via graphical elements within the environment the driver is observing. In this regard, the inventors have determined through experimentation that the sense of space is significantly affected by focus cues, and that the ability of the HUD system 100 described herein to adjust focal planes, and to display graphical elements on a continuous, static, ground-parallel focal plane, improves spatial perception. Specifically, a larger improvement in spatial perception was observed when focus cues were adjusted as described herein than when only the size of the graphical element was adjusted.
The configuration of the HUD device 102, including the use of the beam splitters 126, 130, 134 and the lenses 128, 132, 136, 138, allows the HUD device 102 to have a relatively compact size. In addition, the lenses 128, 132, 136, 138 allow the depth range of the projected space to extend from several meters in front of the vehicle 106 out to infinity, while most of the optical spread remains within the HUD device 102. Further, the beam splitters 126, 130, 134 can serve as optical combiners that merge the separate ray sets projected from the first, second, third, and fourth projectors 118, 120, 122, 124 through the lenses 128, 132, 136, 138, so that the separate images from the first, second, third, and fourth projectors 118, 120, 122, 124 are combined into a unified image (e.g., of graphical elements) projected toward the driver.
In one or more embodiments, several of the above-disclosed features and functions, or alternatives or variations thereof, may be desirably combined into many other different systems or applications. Also, various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may be subsequently made by those skilled in the art, and these are also intended to be encompassed by the following claims.
For example, fewer or more projectors can be used in the HUD system 100 to project fewer or more graphical elements. Further, although the HUD system 100 is described as having two projectors that project graphical elements on frontal focal planes and two projectors that project graphical elements on ground-parallel focal planes, the ratio of frontal to ground-parallel focal planes can be varied. The vehicle functions described above in connection with the HUD system 100 are exemplary and can be varied or modified.
Still further, the mechanism by which the frontal focal planes are moved can be modified from that described above. For example, rather than moving an entire projector (e.g., using the first and second actuators 156, 158 to move the first and third projectors 118, 122), only the diffuser screens (e.g., the first and third diffuser screens 148, 152 of the projectors 118, 122) may be moved relative to the corresponding projector units (e.g., the projector units 140, 144).
Additionally, although the HUD system 100 has been described with reference to the vehicle 106, which may be a four-wheeled outdoor automobile, the HUD system 100 can be used in different types of vehicles. For example, the HUD system can be provided in a marine vehicle (e.g., a boat), an aerial vehicle (e.g., an airplane or jet), or an indoor vehicle (e.g., a transport cart, or a material-handling vehicle such as a forklift).
FIG. 5 is an illustration of an example component diagram of a system 500 for 3-D navigation according to one or more embodiments. The system 500 includes a HUD component 100, a vehicle control component 180, a controller component 104, a navigation component 540, a depth map component 550, a depth buffer component 560, one or more sensor components 570, and one or more controller area networks (CANs) 580. The HUD component 100 can be a volumetric vehicle HUD system, such as the HUD system 100 of FIG. 1, and can include the components described above. In one or more embodiments, the HUD component 100 can be, among other things, a 3-D HUD, a variable-distance HUD, an augmented reality HUD (AR-HUD), or the like.
The navigation component 540 can be configured to receive or identify an origin location (e.g., point A) and one or more destination locations (e.g., point B). The navigation component 540 can be configured to calculate or determine one or more routes, for example from point A to point B. Typically, the navigation component 540 is associated with a vehicle. For example, the navigation component 540 can be mounted on the vehicle, integrated with one or more systems or components of the vehicle, housed within the vehicle, linked or communicatively coupled with one or more components of the vehicle, or otherwise located within the vehicle. In any case, the navigation component 540 can identify or receive the origin location and the destination location. In one or more embodiments, the navigation component 540 can include a telematics component (not shown) configured to determine a current location or current position of the vehicle.
Further, the navigation component 540 can be configured to generate one or more routes from the origin location to one or more of the destination locations. In one or more embodiments, the navigation component 540 can be configured to generate one or more routes from the current location of the vehicle, or from a location to which the vehicle is currently navigating, to a destination location. A route among the one or more routes can include one or more positions or one or more route portions. As an example, one or more portions of the route can include one or more navigation instructions or maneuvers associated with one or more road segments or one or more intersections of road segments. In other words, one or more portions of the route can include one or more turns, navigation maneuvers, road segments, intersections, landmarks, or other elements along the route. The navigation component 540 can be configured to identify one or more of these turns, navigation maneuvers, landmarks, etc., and to issue one or more navigation commands or navigation instructions to the vehicle driver accordingly.
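As an illustration only, a route of this kind might be represented as an ordered list of portions, each carrying the maneuver to announce at its end. The classes and field names below are assumptions for the sketch, not structures recited in the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RoutePortion:
    """One portion of a route: a road segment plus the maneuver at its end."""
    road_segment: str
    maneuver: Optional[str] = None      # e.g. "turn-left"; None means continue
    intersection: Optional[str] = None  # where the maneuver applies

@dataclass
class Route:
    origin: str
    destination: str
    portions: list = field(default_factory=list)

    def next_instruction(self, portion_index):
        p = self.portions[portion_index]
        if p.maneuver and p.intersection:
            return f"{p.maneuver} at {p.intersection}"
        return "continue"

route = Route("point A", "point B", [
    RoutePortion("1st Street"),
    RoutePortion("1st Street", "turn-left", "Main Street"),
    RoutePortion("Main Street"),
])
print(route.next_instruction(1))  # turn-left at Main Street
```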
The navigation component 540 can issue one or more navigation commands or navigation instructions via audio cues, visual cues, tactile cues, etc. For example, the navigation component 540 can interact with one or more external components (not shown) by transmitting one or more prompts across the one or more controller area networks (CANs) 580. The navigation component 540 can play an audible instruction such as "turn left on Main Street," flash a light on the left portion of a display, vibrate the steering wheel, or the like, to indicate to the driver a driving action to be taken. The navigation component 540 can interact with one or more other components to facilitate transmission or delivery of one or more of these driving instructions.
For example, the HUD component 100 can be configured to project one or more navigation instructions or navigation maneuvers as one or more graphical elements or virtual images within the field of view of an occupant or driver of the vehicle. These navigation instructions can be received (e.g., directly or indirectly) from the navigation component 540. The HUD component 100 can be configured to project a virtual image on successive focal planes so that the virtual image appears to move toward the occupant, such as a driver whose view is from the eye box 116 of FIG. 2. In this way, the HUD component 100 can cause the driver to perceive volumetric images within the driver's field of view, and these volumetric images can serve as a "virtual" guide vehicle for the vehicle driver to follow. In other words, to the driver of the vehicle it can appear, for example, as if he or she is following a guide vehicle to the destination location. Additionally, as described herein, one or more other navigation commands or navigation instructions can be projected as volumetric placeholders, markers, or flag poles.
The HUD component 100 can be configured to project one or more graphical elements, which can be contact-analog augmented reality graphical elements, conformal augmented reality graphical elements, virtual images, icons, etc. These graphical elements can be projected by the HUD component 100 in three dimensions. Accordingly, one or more visual cues or one or more depth cues associated with a graphical element can be substantially preserved. One or more of these visual cues or depth cues can be preserved by projecting or rendering the graphical element on a dynamic or movable focal plane. That is, the HUD component 100 can be configured to project or render one or more graphical elements on movable or adjustable focal planes. As discussed with reference to FIG. 1 and FIG. 3, a dynamic or movable focal plane can be moved or adjusted along a path or line, such as the line of sight of an occupant of the vehicle. In other words, a dynamic focal plane can be moved toward or away from the vehicle or the vehicle windshield.
In one or more embodiments, a focal plane can be made dynamic, for example, by movement of a projector or screen of the HUD component 100 by an actuator. That is, one or more projectors of the HUD component 100 can be configured to move in a linear fashion, thereby enabling the corresponding projector to project one or more graphical elements on a dynamic, movable, or adjustable focal plane that moves as the projector moves. In other embodiments, one or more other adjustment devices or alternative arrangements can be employed.
Stated another way, when a graphical element is projected on a dynamic, movable, or adjustable focal plane, the graphical element can be projected on a focal plane whose distance from the vehicle (e.g., the distance 162' or the distance 170' of FIG. 3) is adjusted. Because the projectors of the HUD component 100 can project or render graphical elements on movable focal planes, the focus of a projected graphical element can be adjusted for any distance from the vehicle. As mentioned, one or more of the focal planes may be oriented perpendicular to, or substantially parallel to, the line of sight of the vehicle occupant. In other words, a focal plane can be parallel to the ground or perpendicular to the ground. Further, one or more focal planes can be movable or static with respect to the occupant's line of sight or the ground. This allows depth cues to be correctly presented to a vehicle occupant, such as the driver, while the vehicle is moving or traveling (e.g., and thereby serving as a moving platform).
The HUD component 100 of FIG. 5 can be configured to project or render volumetric contact-analog augmented reality graphical elements. This means that these graphical elements can be projected so as to appear at various distances. In other words, the HUD component 100 can project graphical elements on multiple focal planes or in an adjustable manner. Stated yet another way, the focal plane of a graphical element projected by the HUD component 100 can be adjusted to a distance extending well beyond the windshield, such as to a pedestrian on the sidewalk, thereby allowing the occupant to remain focused on the operating or driving environment rather than switching the focus of his or her eyes between the windshield or the vehicle instrument panel and the driving environment. In this way, safety can be promoted by the system 500 for 3-D navigation.
Accordingly, graphical elements can be projected (e.g., by the HUD component 100) or visually placed within the environment in the occupant's direct view. This means that the graphical elements can appear in the same space as the real environment, rather than on the windshield, which allows the depth cues associated with the graphical elements to be reproduced accurately or correctly. Graphical elements can therefore be projected on the same focal planes as real-world objects (e.g., the road), so that an occupant of the vehicle can view a graphical element without, for example, taking his or her eyes off the road.
These multiple or adjustable focal planes are made possible because, when a projector of the HUD component 100 moves, the light rays can be reshaped or redirected so that the projected graphical element or visual object appears farther away than the windshield, or has a focal plane that is not on the windshield. That is, a projected graphical element or visual object can have focus properties similar to those of a real object (e.g., a pedestrian, vehicle, sign, etc.) in the distance (e.g., ten meters away). When light rays are reflected off the glass of the windshield, the outgoing rays diverge, thereby forming a "reflected" image, or real image, that can be projected as the graphical element.
Because the light rays are reflected by the windshield rather than being emitted from or appearing at the windshield (e.g., as with a special coating), the graphical elements need not be re-rendered when the occupant moves his or her head. For example, in FIG. 3, the continuous, static focal plane allows an optically "correct" or real image to be generated in the fore-and-aft direction of 3-D space (e.g., the direction of the occupant's line of sight), which in turn allows correct motion parallax cues to be generated. Accordingly, when the occupant's head moves, the graphical elements associated with these focal planes appear to remain fixed in position in the environment rather than shifting about. As mentioned, this means the HUD component 100 does not require a head-tracking function to compensate for movement of the occupant's head.
The HUD component 100 can be raster-based rather than vector-based. This means that a graphical element projected by the HUD component 100 can be a bitmap, having a dot-matrix structure, or a rectangular grid of pixels. In addition, the HUD component 100 can be configured to project one or more portions of one or more graphical elements with different shading, transparency levels, colors, brightness, etc.
In this way, the HUD component 100 can be configured to render or project graphical elements or virtual images with various degrees of freedom. That is, accommodation can be preserved so that the occupant's eyes can actively change optical power to focus on a graphical element projected on a focal plane. Similarly, vergence can be preserved so that the occupant's eyes rotate inward in concert as a graphical element is projected as moving "closer" (e.g., by projecting it on successively closer focal planes).
In one or more embodiments, a graphical element can be projected by the HUD component 100 as a virtual image for the vehicle driver or occupant to follow, serving as a navigation instruction or maneuver, an instruction to follow, or a moving virtual image. For example, the HUD component 100 can be configured to project or render one or more of the graphical elements as a moving virtual image, placeholder, identifier, flag pole, marker, etc. These graphical elements can be projected on one or more focal planes of the environment surrounding the vehicle, within the field of view of a vehicle occupant. A virtual image or graphical element projected by the HUD component 100 can guide the vehicle driver through one or more portions of a route, and, by being projected so as to move around obstacles, can guide the driver to detour, navigate, move, or proceed in a manner that avoids conflicts with obstacles, obstructions, or road conditions. For example, the sensor component 570 can be configured to sense one or more obstacles or road conditions, and the controller component 104 can direct the HUD component 100 to project a graphical element that travels around or bypasses the road condition, such as by changing lanes to avoid a traffic obstruction.
In one or more embodiments, the sensor component 570 can be configured to sense, identify, or detect one or more road conditions in the environment around or surrounding the vehicle. The sensor component 570 can detect or identify road segments, sidewalks, objects, pedestrians, other vehicles, obstructions, obstacles, debris, potholes, road surface conditions (e.g., ice, rain, sand, gravel, etc.), traffic conditions, and traffic signals (e.g., red lights, speed limit signs, stop signs, railroad crossings, trains, etc.). These road conditions can be transmitted to the controller component 104 or the vehicle control component 180. For example, one or more of the CANs 580 can be used to facilitate communication between the sensor component 570 and the controller component 104 or the vehicle control component 180. In one or more embodiments, the sensor component 570 can include one or more image capture devices, microphones, blind spot monitors, parking sensors, proximity sensors, presence sensors, infrared sensors, motion sensors, etc.
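A rough sketch of how a detected road condition might be packaged and handed to the controller component over a CAN-style bus follows. The message layout and the bus API are assumptions made for illustration; they do not reflect an actual CAN protocol implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RoadCondition:
    """Hypothetical message from the sensor component to the controller."""
    kind: str                    # e.g. "pothole", "pedestrian", "speed-limit"
    distance_m: float            # distance ahead of the vehicle
    lane: int                    # lane index, 0 = current lane
    value: Optional[float] = None  # e.g. posted limit for "speed-limit"

class Bus:
    """Minimal stand-in for a CAN 580: queued delivery of messages."""
    def __init__(self):
        self.queue = []
    def publish(self, msg):
        self.queue.append(msg)
    def drain(self):
        msgs, self.queue = self.queue, []
        return msgs

bus = Bus()
bus.publish(RoadCondition("speed-limit", 120.0, 0, value=65.0))
bus.publish(RoadCondition("obstruction", 60.0, 0))
for msg in bus.drain():  # controller component side
    print(f"controller received: {msg.kind} at {msg.distance_m} m")
```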
The vehicle control component 180 can be configured to receive data associated with one or more of the road conditions, or data relating to the environment around the vehicle (e.g., the operating environment, driving environment, surrounding environment, etc.). In one or more embodiments, the vehicle control component 180 can receive one or more road conditions from the sensor component 570. Additionally, the vehicle control component 180 can receive one or more road conditions from one or more other sources, such as a server (not shown) or a database (not shown). The vehicle control component 180 can be communicatively coupled with a server, third party, database, or other entity via a telematics channel initiated by a telematics component (not shown). In this way, the vehicle control component 180 can gather information associated with one or more portions of the route from the origin location to the destination location.
For example, the vehicle control component 180 can receive road condition information including traffic information for a road segment (e.g., whether traffic is congested, whether there has been an accident on the road, etc.). Additionally, the vehicle control component 180 can receive speed limit information associated with one or more road segments of the route. This information can be used to determine how to project one or more graphical elements to the vehicle driver or occupant. That is, if a road segment is associated with a speed limit of 65 mph and the current speed of the vehicle (e.g., as detected by the sensor component 570) is 25 mph, the vehicle control component 180 can order the HUD component 100 to project a virtual image that appears to accelerate when turning onto that road segment.
As another example, if the sensor component 570 detects a traffic obstruction in the lane in which the vehicle is currently traveling, the vehicle control component 180 can receive this information and make the determination that the HUD component 100 should project a lane-change navigation instruction. This order can be transmitted to the HUD component 100 over one or more of the CANs 580, and the HUD component 100 can project, render, or animate a virtual image or graphical element that changes lanes or changes position in response to the detected obstruction. In other words, the HUD component 100 can project a virtual image or icon that appears to detour around or navigate past the traffic obstruction, which is located ahead of the vehicle in the operating environment surrounding the vehicle. Likewise, the vehicle control component 180 can cause the HUD component 100 to project a turn signal on the virtual image when changing lanes, just as a real vehicle might signal. Additionally, the vehicle control component 180 can adjust the perceived speed of the virtual image as it approaches the traffic obstruction. This can be achieved by projecting the virtual image or graphical element on successively closer focal planes, or by adjusting the dynamic focal plane of the graphical element so that the distance between the dynamic focal plane and the vehicle or the vehicle windshield decreases. (Conversely, when the virtual image is intended to appear to accelerate, the dynamic focal plane can be adjusted so that the distance between the dynamic focal plane and the vehicle or its windshield increases.)
In other words, the vehicle control component 180 can be configured to receive one or more road conditions, where a road condition among the one or more road conditions includes traffic information for one or more road segments or speed limit information associated with one or more road segments. Further, the vehicle control component 180 can be configured to drive the HUD component 100 to project one or more graphical elements based on one or more road conditions, such as the speed limit of a road segment and the current speed of the vehicle. In this way, the vehicle control component 180 can determine one or more appropriate actions (e.g., stopping, accelerating, changing lanes, decelerating, etc.) or navigation instructions to be projected by the HUD component 100.
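The decision logic of the two preceding examples might be distilled as follows. This is a minimal sketch; the thresholds, instruction names, and priority ordering are invented for illustration and are not recited in the disclosure.

```python
def choose_instruction(speed_limit_mph, current_speed_mph, obstruction_in_lane):
    """Pick the navigation instruction the HUD should animate.

    Hypothetical rules distilled from the examples above: a blocked lane
    outranks pacing, and a large speed deficit prompts an 'accelerate' cue.
    """
    if obstruction_in_lane:
        return "change-lanes"
    if current_speed_mph < speed_limit_mph - 10:  # assumed margin
        return "accelerate"
    if current_speed_mph > speed_limit_mph:
        return "decelerate"
    return "follow"

# 65 mph segment, vehicle at 25 mph, lane clear -> virtual image pulls ahead.
print(choose_instruction(65, 25, False))  # accelerate
# Same segment with a detected obstruction -> virtual image changes lanes.
print(choose_instruction(65, 25, True))   # change-lanes
```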
In one or more embodiments, the system 500 can include a view management component (not shown) that manages one or more aspects of one or more of the graphical elements projected by the HUD component 100. In one or more embodiments, the controller component 104 can be configured to manage one or more of these aspects or functions associated with the vehicle control component 180. For example, the controller component 104 can be configured to receive one or more of the road conditions.
The controller component 104 can be configured to determine the type of graphical element to be displayed, projected, animated, or rendered by the HUD component 100. As an example, when the vehicle is traveling along one or more route portions that include relatively straight road segments, the controller component 104 can project the graphical element as a virtual image. This virtual image can be displayed or projected as a vehicle or guide vehicle. When the vehicle is traveling along one or more route portions that include one or more turns or other navigation maneuvers, the controller component 104 can order the HUD component 100 to project the graphical element as a marker at the location associated with the turn. For example, if the route includes a right turn from a first street onto a second street, the controller component 104 can order the HUD component 100 to project a marker or identifier that is positioned at, points to, or surrounds the intersection of the first street and the second street. In this way, the controller component 104 can be configured to determine one or more types of graphical elements to be displayed (e.g., markers, identifiers, flag poles, guide virtual images, etc.), as sketched below.
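One way to express that type selection, purely as an assumed sketch (the `maneuver` attribute and the returned type names are illustrative):

```python
def element_type_for(route_portion):
    """Select the kind of graphical element for the current route portion.

    `route_portion` is assumed to expose a `maneuver` attribute, as in the
    Route sketch earlier; names here are illustrative only.
    """
    if route_portion.maneuver in ("turn-left", "turn-right"):
        return "marker"            # anchored at the upcoming intersection
    return "guide-virtual-image"   # straight segments: a vehicle to follow

class Portion:  # tiny stand-in for demonstration
    def __init__(self, maneuver=None):
        self.maneuver = maneuver

print(element_type_for(Portion()))              # guide-virtual-image
print(element_type_for(Portion("turn-right")))  # marker
```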
Additionally, the controller component 104 can be configured to determine one or more positions at which graphical elements are to be projected. In other words, the controller component 104 can determine when and where a graphical element is to be projected, and how the graphical element is to be displayed. For example, the position of a graphical element can include a focal plane, the distance of the focal plane from the vehicle or its windshield, an x-coordinate, a y-coordinate, a z-coordinate along an x-, y-, or z-axis, etc. This position can be referred to as the target position of the one or more graphical elements. In one or more embodiments, the controller component 104 can be configured to adjust the distance between one or more focal planes of one or more graphical elements and the vehicle (e.g., or the windshield of the vehicle) based on one or more road conditions associated with one or more portions of the route, the current position of the vehicle, the current speed of the vehicle, etc.
That is, if a road segment (e.g., the route portion at which the vehicle is currently located) is associated with a speed limit of 65 mph (e.g., a road condition), and the current speed of the vehicle (e.g., as detected by the sensor component 570) is 25 mph (e.g., the current speed of the vehicle), then the controller component 104 can be configured to order the HUD component 100 to project a virtual image or graphical element that appears to travel at about 65 mph. In one or more embodiments, the virtual image can be projected so that it appears to accelerate gradually from 25 mph to 65 mph. This means the distance between the focal plane of the virtual image and the vehicle can be adjusted accordingly. For example, when the vehicle accelerates at about the same rate, the distance between the focal plane and the vehicle can remain roughly the same. If the vehicle accelerates more slowly than the virtual image, the controller component 104 can adjust the distance between the focal plane and the vehicle so that it increases. In any case, this adjustment can be based on the current position and current speed of the vehicle and the road conditions of the route associated therewith.
Additionally, the controller component 104 can be configured to cause the HUD component 100 to adjust or determine the size of a graphical element according to, or based on, the distance between the focal plane of the graphical element and the vehicle. This means the controller component 104 can adjust the height, size, width, depth, etc. of a graphical element, guide icon, or virtual image based on the desired perception. For example, to make a virtual image appear to accelerate, the controller component 104 can adjust the size of the virtual image to shrink or diminish while projecting the virtual image on successively farther focal planes, or while adjusting the dynamic focal plane to be farther and farther from the vehicle.
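A compact sketch of the pacing and size adjustment described in the last two paragraphs, under assumed units and an assumed inverse-distance scaling law (both are illustration choices, not recited behavior):

```python
MPH_TO_MPS = 0.44704

def update_virtual_image(gap_m, image_speed_mph, vehicle_speed_mph, dt_s,
                         base_size=1.0, ref_gap_m=10.0):
    """Advance the guide image's focal-plane distance and apparent size.

    The gap grows when the image 'travels' faster than the vehicle, and the
    image is drawn smaller in proportion (assumed 1/distance scaling).
    """
    gap_m += (image_speed_mph - vehicle_speed_mph) * MPH_TO_MPS * dt_s
    gap_m = max(gap_m, ref_gap_m)           # keep the image ahead of the hood
    size = base_size * ref_gap_m / gap_m    # farther -> smaller
    return gap_m, size

gap = 10.0
for t in range(4):  # image pacing 65 mph while the vehicle holds 25 mph
    gap, size = update_virtual_image(gap, 65, 25, dt_s=1.0)
    print(f"t={t+1}s  focal plane {gap:5.1f} m ahead, size x{size:.2f}")
```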
In one or more embodiments, the size of a graphical element can serve as an indication of the importance of a navigation instruction or message. In other words, the more important a message or navigation instruction is, the larger the virtual image, icon, or graphical element can be projected.
The controller component 104 can be configured to determine one or more actions for one or more of the graphical elements to be projected by the HUD component 100. For example, the controller component 104 can order the HUD component 100 to project the virtual image as accelerating, decelerating, stopping, or changing lanes; to activate a turn signal before a lane change; to blink or flash; to change the orientation or angle of the virtual image; to change the color of the virtual image; and so on. Additionally, the controller component 104 can adjust the target position of one or more graphical elements based on road conditions, the current position of the vehicle, the current speed of the vehicle, or other attributes, characteristics, or measurements. In one or more embodiments, the controller component 104 can interface or communicate with the navigation component 540 over one or more of the CANs 580.
The controller component 104 can be configured to mitigate obstacles that could obstruct, interfere with, or otherwise distract the vehicle driver or occupant. In one or more embodiments, the controller component 104 can be configured, for example, to receive a horizon position from the sensor component 570 and project graphical elements above the horizon, in the sky plane, etc. The controller component can be configured to determine or adjust the color, transparency, or shading of one or more graphical elements based on the time of day, the traffic level associated with the route, the driver's familiarity with the route, and so on.
The depth map component 550 can be configured to build or receive a depth map of the environment around the vehicle, such as the operating environment. The HUD component 100 can use this depth map to project one or more graphical elements accordingly. This means that if a virtual image is to turn a street corner and pass "behind" a building (e.g., a building between the vehicle occupant's line of sight and the perceived or target position of the graphical element or virtual image), the HUD component 100 can enable or disable projection of one or more portions of the virtual image or graphical element according to what should be visible.
The depth map component 550 can be configured to receive the depth map from a server or third-party server. For example, the depth map component 550 can download the depth map from a server via a telematics channel initiated by a telematics component (not shown). In other embodiments, the sensor component 570 can be configured to detect depth information, which the depth map component 550 can use to build the depth map. That is, the depth map component 550 can interface or communicate with one or more sensors to build the depth map, or can receive a pre-built depth map from a database. In any case, the depth map component 550 can build the depth map from depth information or receive it. The depth map can indicate one or more surfaces, objects, obstructions, geometries, etc. in the environment or area around the vehicle.
The depth map can be transferred or transmitted to the controller component 104, which can order the HUD component 100 to render one or more of the graphical elements accordingly. For example, the HUD component 100 can project or render graphical elements based on the height of the eye box associated with the vehicle occupant, the vehicle position, and an area depth map that is actively sensed or received from a database. The HUD component 100 is thus able to project one or more of the graphical elements based on this depth map so as to account for the viewing perspective of one or more occupants of the vehicle.
The depth buffer component 560 can be configured to use the depth map generated or received by the depth map component 550 to help manage the viewing perspective for one or more occupants of the vehicle. That is, the depth buffer component can be configured to facilitate rendering of graphical elements so that they appear visually "correct" to the occupant. For example, if a graphical element is to be projected behind a real-world object, the depth buffer component 560 can "hide" that portion of the graphical element from the occupant by not projecting or rendering it. In other words, the depth buffer component 560 can manage which portions (e.g., pixels) of a graphical element are drawn, projected, or rendered and which portions are not. To that end, the depth buffer component 560 can be configured to enable or disable rendering of one or more portions of one or more graphical elements based on the depth map.
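The per-pixel test performed by such a depth buffer can be sketched as a standard z-buffer comparison; the nested-list array layout and sample values below are assumptions for illustration.

```python
# A minimal z-buffer sketch: a pixel of the graphical element is rendered only
# if the element is nearer than the real-world surface along that pixel's ray.
def visible_mask(element_depth_m, scene_depth_m):
    """element_depth_m, scene_depth_m: 2-D nested lists of distances (meters).

    Returns a same-shaped mask of booleans: True where the element pixel
    should be drawn, False where the real world occludes it.
    """
    return [[e < s for e, s in zip(erow, srow)]
            for erow, srow in zip(element_depth_m, scene_depth_m)]

# A marker 60 m away; the right half of the view is blocked by a building
# 40 m away, so those pixels are suppressed.
element = [[60.0, 60.0], [60.0, 60.0]]
scene   = [[80.0, 40.0], [80.0, 40.0]]
print(visible_mask(element, scene))  # [[True, False], [True, False]]
```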
Additionally, the depth buffer component 560 can be configured to mask real-world objects so that they cannot be seen by the vehicle occupant. For example, the depth buffer component 560 can order the HUD component 100 to project a white graphical element so that the graphical element covers a real-world object, such as a billboard (e.g., detected by the sensor component 570). As a result, the occupant may not see the billboard, or may see only an obscured view of it. In this way, the depth buffer component can be configured to mitigate distractions for the vehicle driver or occupant by providing graphical elements that facilitate diminished reality.
Examples of navigation instructions that can be projected by the HUD component 100 include the following: follow the guide vehicle; accelerate (e.g., change the dynamic focal plane so that the distance from the focal plane to the vehicle increases, thereby adjusting the driver's or occupant's perception of how far away the graphical element is); decelerate (e.g., adjust the distance between the focal plane and the vehicle so that it decreases); change lanes (e.g., adjust the target position of the graphical element); navigate around an obstruction; turn; arrive; mark a location; etc. As an example, the controller component 104 can order the HUD component 100 to project the virtual image as "decelerating" when a pedestrian steps onto the road segment, road, crosswalk, etc. As another example, the controller 104 can order the HUD component 100 to project a deceleration based on the steering angle, the speed limit associated with the road segment, a road condition such as ice, and so on. That is, if the road surface is icy, the controller 104 can order the HUD component 100 to project the virtual image as moving more slowly than it would if there were no ice on the road.
In one or more embodiments, the controller component 104 can use markers, flag posts, flag poles, identifiers, etc. to mark or identify an upcoming turn or intersection. For example, the HUD component 100 can render or project a placeholder or marker according to the viewing perspective of the vehicle occupant. The depth map component 550 can be configured to provide the depth map so that real-world objects, such as buildings, streets, etc., appear to obstruct the line of view to one or more portions of the placeholder. As an example, if a placeholder has a perceived height of 100 feet and a 50-foot-tall building stands in front of it, the depth buffer component 560 can compensate for the view obstruction by disabling rendering or projection of the lower portion of the placeholder graphical element, thereby rendering the placeholder according to the perspective of the driver or occupant.
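Under simplifying assumptions (the placeholder and building reduced to heights at known distances, and a straight sight line from an assumed eye height), the height of the suppressed base section might be computed by similar triangles as below; all numbers are hypothetical illustration values.

```python
def hidden_height_ft(eye_ft, bldg_ft, bldg_dist_ft, pole_dist_ft, pole_ft):
    """Height of the placeholder's base section to suppress, by similar
    triangles along the sight line grazing the building's roof line.
    """
    if pole_dist_ft <= bldg_dist_ft or bldg_ft <= eye_ft:
        return 0.0  # building behind the pole, or below eye level: no clipping
    graze = eye_ft + (bldg_ft - eye_ft) * pole_dist_ft / bldg_dist_ft
    return min(max(graze, 0.0), pole_ft)

# 100 ft placeholder 600 ft away, 50 ft building 300 ft away, eye ~4 ft up:
# everything below ~96 ft of the placeholder is suppressed as occluded.
print(f"suppress lower {hidden_height_ft(4, 50, 300, 600, 100):.0f} ft")
```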
In one or more embodiments, one or more of the graphical elements can be projected within the field of view of the vehicle occupant based on the route (e.g., a follow-the-guide-vehicle mode). In one or more embodiments, a graphical element can be projected as a virtual image or other guide icon. The virtual image can appear to fly, and can be displayed against the real world around the vehicle. The virtual image can move, travel, or "fly" in 3-D space or three dimensions. Accordingly, the virtual image or graphical element can appear to move in 3-D, thereby giving the occupant or driver following the virtual image a more intuitive feel or a sense of safety. As an example, the virtual image, graphical element, or guide icon can be projected so that its height or size appears to change based on the perceived distance from the vehicle occupant. The moving virtual image can be animated by projecting it sequentially on one or more different focal planes. Additionally, the virtual image can appear to navigate around real-world obstructions, obstacles, pedestrians, debris, potholes, etc. In one or more embodiments, the virtual image can "drive," move, or appear to move according to real-time traffic. The virtual image can change lanes in a manner such that it does not appear to "collide" with another vehicle or otherwise disturb traffic. As another example, if the route takes the driver or vehicle across railroad tracks, the virtual image can stop in front of the tracks while a train passes. In other embodiments, the HUD component 100 can be configured to project the virtual image or graphical element as stopping at stop signs or red lights, or otherwise obeying traffic laws and regulations. For example, upon arriving at the destination location, the HUD component 100 can be configured to render or project the virtual image in a resting pose.
In this way, the system 500 for 3-D navigation can generate intuitive messages, instructions, or commands for a vehicle occupant such as the driver. These instructions can be based on one or more aspects of perspective provided by the HUD component's ability to project or render volumetric 3-D graphical elements along one or more adjustable focal planes. For example, 3-D effects can be determined based on distance, viewing angle, perceived distance, road conditions, etc.
FIG. 6 is an illustration of an example flow diagram of a method 600 for 3-D navigation according to one or more embodiments. At 602, a route from an origin location to a destination location can be generated. In one or more embodiments, the origin location or the destination location can be received, for example, from a global positioning system (GPS) unit via a telematics channel. At 604, one or more graphical elements can be projected on one or more focal planes within the field of view of a vehicle occupant. Here, a graphical element can appear as a virtual image, an image, an icon, an identifier, a marker, etc. Additionally, these graphical elements can be based on one or more portions of the route. This means the graphical elements can be projected at various distances according to the route portion at which the vehicle may be located (e.g., the current position of the vehicle).
At 606, the distance between a focal plane and the vehicle can be adjusted based on road conditions associated with one or more portions of the route. Additionally, this distance can be adjusted based on the current speed of the vehicle. For example, if the vehicle is traveling along a route portion associated with a speed limit of 65 miles per hour (mph) and the current speed of the vehicle is 25 mph, the distance between the vehicle and the focal plane of the projected graphical element or virtual image can be increased (e.g., indicating to the driver or occupant that he or she should accelerate). In other words, the graphical element can be projected as if it were traveling at about 65 mph, thereby prompting the occupant or driver to accelerate and "catch up" to the virtual image (e.g., resembling or simulating following a guide vehicle).
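The three steps of method 600 might be strung together as below; this is an assumed end-to-end sketch with stub objects standing in for the components described above, and the base distance and interfaces are invented for illustration.

```python
class StubHUD:
    """Stand-in for the HUD component, printing instead of projecting."""
    def project(self, elements):
        print("project:", elements)
    def set_focal_distance(self, meters):
        print(f"focal plane -> {meters:.0f} m")

def method_600(route_portions, position_idx, speed_mph, hud):
    """Sketch of method 600: 602 generate (the route is passed in here,
    already generated), 604 project, 606 adjust the focal-plane distance."""
    portion = route_portions[position_idx]                 # 604
    hud.project([portion["element"]])
    base_m, limit = 20.0, portion["speed_limit_mph"]       # 606
    # Widen the gap when the vehicle is under the limit (cue to accelerate).
    hud.set_focal_distance(base_m + max(limit - speed_mph, 0.0))

route = [{"element": "guide-virtual-image", "speed_limit_mph": 65}]
method_600(route, 0, speed_mph=25, hud=StubHUD())
```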
FIG. 7A is an illustration of an example virtual image 700 for 3-D navigation according to one or more embodiments. The virtual image of FIG. 7A can appear in front of the vehicle and can fly, glide, move, or maneuver around elements, obstructions, traffic, road conditions, etc. FIG. 7B is an illustration of example virtual images 710 for 3-D navigation according to one or more embodiments. The virtual images 710 of FIG. 7B are viewed from an elevated vantage point, such as a bird's-eye view trailing slightly behind the virtual images 710. As can be seen, one or more of the virtual images 710 are projected on different focal planes or at different target positions, thereby giving the driver or occupant the sensation of following a real vehicle.
FIG. 8A is an illustration of an example virtual image 800 for 3-D navigation according to one or more embodiments. The virtual image 800 of FIG. 8A rotates counterclockwise to indicate a left turn. FIG. 8B is an illustration of an example virtual image 810 for 3-D navigation according to one or more embodiments. In one or more embodiments, the virtual image 810 of FIG. 8B can indicate a left turn by blinking, flashing, changing color, etc. For example, the left wing of the paper-airplane virtual image 810 can glow or change intensity to indicate an upcoming left turn. In one or more embodiments, the virtual image can be projected on focal planes closer to the vehicle so that the virtual image appears to "slow down" before the turn.
FIG. 9A is an illustration of an example virtual image 900 for 3-D navigation according to one or more embodiments. FIG. 9B is an illustration of an example virtual image 910 for 3-D navigation according to one or more embodiments. The virtual image 900 of FIG. 9A can be projected, for example, as a navigation instruction for the vehicle driver to decelerate. In FIG. 9B, the virtual image 910 is projected above the horizon, in the sky plane, so that the virtual image 910 does not block any portion of the driver's or occupant's view of the environment around the vehicle.
Still another embodiment involves a computer-readable medium that includes processor-executable instructions configured to implement one or more embodiments of the techniques presented herein. An embodiment of a computer-readable medium or computer-readable device devised in these ways is illustrated in FIG. 10, wherein an implementation 1000 includes a computer-readable medium 1008, such as a CD-R, a DVD-R, a flash drive, or a platter of a hard disk drive, on which is encoded computer-readable data 1006. This computer-readable data 1006, such as binary data including a plurality of zeros and ones as shown in 1006, in turn includes a set of computer instructions 1004 configured to operate according to one or more of the principles set forth herein. In one such embodiment 1000, the processor-executable computer instructions 1004 are configured to perform a method 1002, such as the method 600 of FIG. 6. In another embodiment, the processor-executable instructions 1004 are configured to implement a system, such as the system 500 of FIG. 5. Many such computer-readable media can be devised by those of ordinary skill in the art that are configured to operate in accordance with the techniques presented herein.
As used in this application, the terms "component," "module," "system," "interface," and the like are generally intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, or a computer. By way of illustration, both an application running on a controller and the controller can be a component. One or more components residing within a process or thread of execution, and a component, may be localized on one computer or distributed between two or more computers.
Further, the claimed subject matter can be implemented as a method, apparatus, or article of manufacture using standard programming or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term "article of manufacture" as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media. Of course, many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
FIG. 11 and the following discussion provide a description of a suitable computing environment for implementing embodiments of one or more of the aspects set forth herein. The operating environment of FIG. 11 is merely one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment. Example computing devices include, but are not limited to, personal computers, server computers, hand-held or laptop devices, mobile devices such as mobile phones, personal digital assistants (PDAs), media players, and the like, multiprocessor systems, consumer electronics, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, etc.
Generally, embodiments are described in the general context of "computer-readable instructions" being executed by one or more computing devices. Computer-readable instructions can be distributed via computer-readable media, as discussed below. Computer-readable instructions can be implemented as program modules, such as functions, objects, application programming interfaces (APIs), data structures, and the like, that perform one or more tasks or implement one or more abstract data types. Typically, the functionality of the computer-readable instructions is combined or distributed as desired in various embodiments.
FIG. 11 illustrates a system 1100 including a computing device 1112 configured to implement one or more embodiments provided herein. In one configuration, the computing device 1112 includes one or more processing units 1116 and memory 1118. Depending on the exact configuration and type of computing device, the memory 1118 may be volatile, such as RAM; non-volatile, such as ROM, flash memory, etc.; or a combination of the two. This configuration is illustrated in FIG. 11 by the dashed line 1114.
In other embodiments, the device 1112 includes additional features or functionality. For example, the device 1112 can include additional storage, such as removable storage or non-removable storage, including, but not limited to, magnetic storage, optical storage, etc. Such additional storage is illustrated in FIG. 11 by the storage 1120. In one or more embodiments, computer-readable instructions to implement one or more embodiments provided herein reside in the storage 1120. The storage 1120 can store other computer-readable instructions to implement an operating system, an application program, etc. Computer-readable instructions can be loaded into the memory 1118 for execution by the processing unit 1116, for example.
" computer-readable media " comprises computer storage media as the term is used herein.Computer storage media comprises the volatibility and non-volatile, removable and non-removable media that any means or technology for the information storing such as computer-readable instruction or other data and so on implement.Memory device 1118 and memory storage 1120 are examples of computer storage media.Computer storage media includes but are not limited to: RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical storage, magnetic cartridge, tape, disk storage device or other magnetic storage apparatus, or can be used to store desired information and any other medium can accessed by equipment 1112.Computer storage media so is arbitrarily all a part for equipment 1112.
Term " computer-readable media " comprises communication medium.Communication medium embodies computer-readable instruction or other data usually in " modulated data signal " of such as carrier wave or other transmission mechanism, and can comprise random information delivery media.Term " modulated data signal " comprises and makes one or more feature with the signal being carried out in the mode of signal arranging or changing by information coding.
The device 1112 includes input device(s) 1124, such as a keyboard, mouse, pen, voice input device, touch input device, infrared camera, video input device, or any other input device. Output device(s) 1122, such as one or more displays, speakers, printers, or any other output device, can be included with the device 1112. The input device(s) 1124 and the output device(s) 1122 can be connected to the device 1112 via a wired connection, a wireless connection, or any combination thereof. In one or more embodiments, an input device or an output device from another computing device can be used as the input device(s) 1124 or the output device(s) 1122 for the computing device 1112. The device 1112 can include communication connection(s) 1126 to facilitate communications with one or more other devices.
According to one or more aspects, a system for 3-dimensional (3-D) navigation is provided, including a navigation component configured to receive an origin location and a destination location. The navigation component can be associated with a vehicle and configured to generate a route from the origin location to the destination location. One or more portions of the route can include one or more navigation instructions associated with one or more road segments or one or more intersections of road segments. The system can include a heads-up display (HUD) component configured to project one or more graphical elements on one or more focal planes of an environment surrounding the vehicle. The HUD component can be configured to project one or more of the graphical elements within a field of view of an occupant of the vehicle based on the route. The system can include a controller component configured to adjust a distance between one or more of the focal planes of one or more of the graphical elements and the vehicle based on one or more road conditions associated with one or more portions of the route and a current position of the vehicle.
In one or more embodiments, the controller component can be configured to adjust a target position of one or more of the graphical elements based on one or more of the road conditions and the current position of the vehicle. The system can include a vehicle control component configured to receive one or more of the road conditions. Further, the system can include a sensor component configured to detect one or more of the road conditions. A road condition among the one or more road conditions can include traffic information for one or more road segments or speed limit information associated with one or more road segments. Additionally, a road condition can include, for example, an obstruction, an obstacle, a pedestrian, debris, or a pothole.
The system can include a depth map component configured to build a depth map of the environment surrounding the vehicle. The HUD component can be configured to project one or more of the graphical elements based on the depth map of the environment. The depth map component can be configured to build the depth map based on depth information. In one or more embodiments, the system can include a sensor component configured to detect the depth information from the environment surrounding the vehicle. The depth map component can be configured to receive the depth map via a telematics channel. The system can include a depth buffer component configured to enable or disable rendering of one or more portions of one or more of the graphical elements based on the depth map.
The HUD component can be configured to project one or more of the graphical elements as a moving virtual image or as a placeholder, such as a flag pole, a marker, an identifier, etc.
According to one or more aspects, a method for 3-dimensional (3-D) navigation is provided, including generating a route from an origin location to a destination location for a vehicle. One or more portions of the route can include one or more navigation instructions associated with one or more road segments or one or more intersections of road segments. The method can include projecting one or more graphical elements on one or more focal planes of the environment surrounding the vehicle. One or more of the graphical elements can be projected within a field of view of an occupant of the vehicle based on the route. The method can include adjusting a distance between one or more of the focal planes of one or more of the graphical elements and the vehicle based on one or more road conditions associated with one or more portions of the route and a current position of the vehicle. One or more portions of the method can be implemented via a processing unit.
The method can include adjusting a target position of one or more of the graphical elements based on one or more of the road conditions and the current position of the vehicle. The method can include receiving or detecting one or more of the road conditions. A road condition among the one or more road conditions can include traffic information for one or more road segments, speed limit information associated with one or more road segments, an obstruction, an obstacle, a pedestrian, debris, or a pothole.
Among other things, the method can include building a depth map of the environment surrounding the vehicle, projecting one or more of the graphical elements based on the depth map of the environment, detecting depth information from the environment surrounding the vehicle, building the depth map based on the detected depth information, or enabling or disabling rendering of one or more portions of one or more of the graphical elements based on the depth map.
According to one or more aspects, a computer-readable storage medium includes computer-executable instructions that, when executed via a processing unit on a computer, perform actions including: generating a route from an origin location to a destination location for a vehicle, wherein one or more portions of the route include one or more navigation instructions associated with one or more road segments or one or more intersections of road segments; projecting one or more graphical elements on one or more focal planes of an environment surrounding the vehicle, wherein one or more of the graphical elements are projected within a field of view of an occupant of the vehicle based on the route; or adjusting a distance between one or more of the focal planes of one or more of the graphical elements and the vehicle based on one or more road conditions associated with one or more portions of the route and a current position of the vehicle.
In one or more embodiments, projecting one or more of the graphical elements utilizes raster-based graphics. Additionally, one or more embodiments can include providing one or more navigation instructions by projecting one or more of the graphical elements as a moving virtual image, or animating the moving virtual image by sequentially projecting it on one or more different focal planes.
Although the subject matter has been described in language specific to structural features or methodological acts, it is to be understood that the subject matter of the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example embodiments.
Various operations of embodiments are provided herein. The order in which one or more or all of the operations are described should not be construed to imply that these operations are necessarily order-dependent. Alternative orderings will be appreciated based on this description. Further, not all operations may necessarily be present in each embodiment provided herein.
As used in this application, "or" is intended to represent the "or" of inclusive and the "or" of nonexclusion.In addition, unless otherwise noted or from context clearly for representing singulative, otherwise " one " and " one " is generally understood to represent " one or more " as used in this application.In addition, in A and B at least one etc. at least mean both A or B or A and B.In addition, with regard to " comprising ", " having ", " having ", " with " or its version to specifically describe or in claim with regard to the scope that uses, such term is intended to be similar to mode that term " comprises " but inclusive.
Further, unless specified otherwise, "first," "second," and the like are not intended to imply a temporal aspect, a spatial aspect, an ordering, etc. Rather, such terms are merely used as identifiers or names for features, elements, items, etc. For example, a first channel and a second channel generally correspond to channel A and channel B, or to two different channels, or to two identical channels, or to the same channel.
Although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur based on a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the following claims.

Claims (20)

1. A system for 3-dimensional (3-D) navigation, comprising:
a navigation component configured to:
receive an origin location and a destination location, the navigation component being associated with a vehicle; and
generate a route from the origin location to the destination location, wherein one or more portions of the route comprise one or more navigation instructions associated with one or more road segments or one or more intersections of the road segments;
a heads-up display (HUD) component that projects one or more graphical elements on one or more focal planes of an environment surrounding the vehicle, wherein one or more of the graphical elements are projected within a field of view of an occupant of the vehicle based on the route; and
a controller component that adjusts a distance between one or more of the focal planes of one or more of the graphical elements and the vehicle based on one or more road conditions associated with one or more portions of the route and a current position of the vehicle.
2. The system of claim 1, wherein the controller component adjusts a target position of one or more of the graphical elements based on one or more of the road conditions and the current position of the vehicle.
3. The system of claim 1, comprising a vehicle control component that receives one or more of the road conditions, wherein one of the road conditions comprises traffic information for one or more of the road segments or speed limit information associated with one or more of the road segments.
4. The system of claim 1, comprising a sensor component that detects one or more of the road conditions, wherein one of the road conditions comprises an obstruction, an obstacle, a pedestrian, debris, or a pothole.
5. The system of claim 1, comprising a depth map component that builds a depth map of the environment surrounding the vehicle, wherein the HUD component projects one or more of the graphical elements based on the depth map of the environment.
6. The system of claim 5, comprising a sensor component that detects depth information from the environment surrounding the vehicle, wherein the depth map component builds the depth map based on the depth information.
7. The system of claim 5, wherein the depth map component receives the depth map via a vehicle information channel.
8. The system of claim 5, comprising a depth buffering component that enables or disables rendering of one or more portions of one or more of the graphical elements based on the depth map.
9. The system of claim 1, wherein the HUD component projects one or more of the graphical elements as a moving virtual image.
10. The system of claim 1, wherein the HUD component projects one or more of the graphical elements as a placeholder.
11. A method for 3-dimensional (3-D) navigation, comprising:
generating a route from an origin location to a destination location for a vehicle, wherein one or more portions of the route comprise one or more navigation instructions associated with one or more road segments or one or more intersections of the road segments;
projecting one or more graphical elements on one or more focal planes of an environment surrounding the vehicle, wherein one or more of the graphical elements are projected within a view of an occupant of the vehicle based on the route; and
adjusting a distance between one or more of the focal planes of one or more of the graphical elements and the vehicle based on one or more road conditions associated with one or more portions of the route and a current position of the vehicle, wherein the generating or the adjusting is implemented via a processing unit.
12. The method of claim 11, comprising adjusting a target position of one or more of the graphical elements based on one or more of the road conditions and the current position of the vehicle.
13. The method of claim 11, comprising receiving or detecting one or more of the road conditions, wherein one of the road conditions comprises traffic information for one or more of the road segments, speed limit information associated with one or more of the road segments, an obstruction, an obstacle, a pedestrian, debris, or a pothole.
14. The method of claim 11, comprising:
building a depth map of the environment surrounding the vehicle; and
projecting one or more of the graphical elements based on the depth map of the environment.
15. The method of claim 14, comprising:
detecting depth information from the environment surrounding the vehicle; and
building the depth map based on the detected depth information.
16. The method of claim 14, comprising enabling or disabling rendering of one or more portions of one or more of the graphical elements based on the depth map.
17. A computer-readable storage medium comprising computer-executable instructions, which when executed via a processing unit on a computer perform acts, the acts comprising:
generating a route from an origin location to a destination location for a vehicle, wherein one or more portions of the route comprise one or more navigation instructions associated with one or more road segments or one or more intersections of the road segments;
projecting one or more graphical elements on one or more focal planes of an environment surrounding the vehicle, wherein one or more of the graphical elements are projected within a view of an occupant of the vehicle based on the route; and
adjusting a distance between one or more of the focal planes of one or more of the graphical elements and the vehicle based on one or more road conditions associated with one or more portions of the route and a current position of the vehicle.
18. The computer-readable storage medium of claim 17, wherein projecting one or more of the graphical elements utilizes raster-based graphics.
19. The computer-readable storage medium of claim 17, comprising providing one or more of the navigation instructions by projecting one or more of the graphical elements as a moving virtual image.
20. The computer-readable storage medium of claim 19, comprising animating the moving virtual image by sequentially projecting the moving virtual image on one or more different focal planes.
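By way of example and not limitation, the controller behavior recited in claim 1 above may be sketched as follows; this is a hedged illustration in Python, and the hazard names, distances, and thresholds are assumptions of the sketch rather than values disclosed in this document:

    # Illustrative sketch: pull a graphical element's focal plane closer to
    # the vehicle when a hazardous road condition lies on the upcoming
    # portion of the route. All names and thresholds are assumptions.
    DEFAULT_PLANE_M = 20.0   # nominal focal-plane distance, in meters
    NEAR_PLANE_M = 8.0       # nearest plane used to emphasize a hazard
    HAZARDS = {"obstruction", "obstacle", "pedestrian", "debris", "pothole"}

    def adjust_focal_plane(road_conditions, vehicle_position_m, hazard_position_m):
        """Return a focal-plane distance (meters) for a graphical element,
        given road-condition strings and positions along the route."""
        distance_to_hazard = hazard_position_m - vehicle_position_m
        if HAZARDS & set(road_conditions) and 0.0 < distance_to_hazard < DEFAULT_PLANE_M:
            # Anchor the cue at (or near) the hazard as the vehicle approaches.
            return max(NEAR_PLANE_M, distance_to_hazard)
        return DEFAULT_PLANE_M

    print(adjust_focal_plane({"pothole"}, 100.0, 112.0))  # -> 12.0
    print(adjust_focal_plane(set(), 100.0, 112.0))        # -> 20.0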
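The depth buffering of claim 8 above can be pictured with the following hedged sketch, in which portions of a graphical element whose focal depth lies behind a real-world surface (per the depth map) are disabled so the element appears correctly occluded; the grid depth map and pixel representation are assumptions made for illustration:

    def cull_occluded_pixels(element_depth_m, element_pixels, depth_map):
        """Keep only element pixels whose focal depth is nearer than the
        real-world depth recorded for that pixel in the depth map."""
        visible = []
        for r, c in element_pixels:
            if element_depth_m <= depth_map[r][c]:
                visible.append((r, c))  # real surface farther away: render
            # else: real surface is nearer: this portion is disabled (occluded)
        return visible

    # 3x3 depth map (meters); a nearby object at 4 m fills the right column.
    depth_map = [[30.0, 30.0, 4.0],
                 [30.0, 30.0, 4.0],
                 [30.0, 30.0, 4.0]]
    pixels = [(0, 0), (0, 2), (1, 1), (1, 2)]
    print(cull_occluded_pixels(10.0, pixels, depth_map))  # -> [(0, 0), (1, 1)]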
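Claims 14 and 15 above, taken together, suggest a pipeline of roughly the following shape; this is an illustrative sketch, and the depth-sample format and grid resolution are assumptions rather than details disclosed in this document:

    def build_depth_map(depth_samples, rows, cols, default_m=100.0):
        """Fuse (row, col, distance_m) depth samples detected from the
        environment into a grid depth map, keeping the nearest surface
        reported for each cell."""
        depth_map = [[default_m] * cols for _ in range(rows)]
        for r, c, d in depth_samples:
            if d < depth_map[r][c]:
                depth_map[r][c] = d
        return depth_map

    samples = [(0, 1, 12.5), (0, 1, 9.0), (2, 3, 40.0)]  # e.g. stereo/LIDAR hits
    dm = build_depth_map(samples, rows=3, cols=4)
    print(dm[0][1], dm[2][3])  # -> 9.0 40.0

The resulting grid is the input both to projection (claim 14) and to the occlusion culling sketched above for claim 8.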
CN201410515899.0A 2013-09-30 2014-09-29 3-dimensional (3-D) navigation Active CN104512336B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/041,614 2013-09-30
US14/041,614 US20160054563A9 (en) 2013-03-14 2013-09-30 3-dimensional (3-d) navigation

Publications (2)

Publication Number Publication Date
CN104512336A true CN104512336A (en) 2015-04-15
CN104512336B CN104512336B (en) 2019-03-12

Family

ID=52673399

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410515899.0A Active CN104512336B (en) 3-dimensional (3-D) navigation

Country Status (3)

Country Link
JP (1) JP2015069656A (en)
CN (1) CN104512336B (en)
DE (1) DE102014219567A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107710751B (en) 2015-04-17 2020-01-10 三菱电机株式会社 Display control device, display system, and display control method
CN106296813B (en) * 2015-05-14 2018-11-13 上海市测绘院 Three-dimensional static map producing method
JP6500734B2 (en) * 2015-10-08 2019-04-17 株式会社デンソー Driving support device
JP7000822B2 (en) * 2017-12-06 2022-01-19 株式会社アイシン Peripheral monitoring device
DE102018202200B4 (en) 2018-02-13 2022-11-10 Bayerische Motoren Werke Aktiengesellschaft Sensor device for detecting the surroundings of a vehicle
DE102018202201B4 (en) 2018-02-13 2020-06-10 Bayerische Motoren Werke Aktiengesellschaft Sensor device and method for environment detection of a vehicle
DE102018206910A1 (en) * 2018-05-04 2019-11-07 Audi Ag Method for determining a user authorization of at least one roadway by a motor vehicle and control device and display screen for a motor vehicle
KR102457275B1 (en) * 2018-05-23 2022-10-19 주식회사 프라젠 Transparent pillar device
DE102020200047A1 (en) 2020-01-06 2021-07-08 Volkswagen Aktiengesellschaft Method and device for displaying virtual navigation elements
DE102020211211A1 (en) * 2020-09-07 2022-03-10 Volkswagen Aktiengesellschaft Method for preparing a navigation maneuver of a vehicle

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6222583B1 (en) * 1997-03-27 2001-04-24 Nippon Telegraph And Telephone Corporation Device and system for labeling sight images
US20110093190A1 (en) * 2008-12-18 2011-04-21 Woong-Cherl Yoon Head-up display navigation device, system and method for implementing services
US20110106428A1 (en) * 2009-10-30 2011-05-05 Seungwook Park Information displaying apparatus and method thereof
US20120072105A1 (en) * 2010-09-20 2012-03-22 Honeywell International Inc. Ground navigational display, system and method displaying buildings in three-dimensions
CN103144588A (en) * 2011-12-06 2013-06-12 通用汽车环球科技运作有限责任公司 Vehicle ghosting on full windshield display

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108025674A (en) * 2015-09-10 2018-05-11 罗伯特·博世有限公司 Method and apparatus for the vehicle environmental for showing vehicle
US11640812B2 (en) 2015-09-25 2023-05-02 Apple Inc. Visual content overlay system
CN108028016A (en) * 2015-09-25 2018-05-11 苹果公司 Augmented reality display system
US11004426B2 (en) 2015-09-25 2021-05-11 Apple Inc. Zone identification and indication system
CN108141514B (en) * 2015-09-30 2020-06-30 富士胶片株式会社 Image capturing apparatus and control method thereof
CN108141514A (en) * 2015-09-30 2018-06-08 富士胶片株式会社 Camera and its control method
CN111190290A (en) * 2015-10-09 2020-05-22 麦克赛尔株式会社 Head-up display device
WO2017133140A1 (en) * 2016-02-04 2017-08-10 京东方科技集团股份有限公司 Driving assistance device and driving assistance method
CN107449440A (en) * 2016-06-01 2017-12-08 北京三星通信技术研究有限公司 The display methods and display device for prompt message of driving a vehicle
CN107978176A (en) * 2016-10-25 2018-05-01 福特全球技术公司 Vehicle traffic circle management
CN107978176B (en) * 2016-10-25 2022-01-11 福特全球技术公司 Vehicle roundabout management
US10927501B2 (en) 2016-11-23 2021-02-23 Ibs Of America Monitoring system, control system, and actuation assembly of a paper machine, and a method of controlling
US11746471B2 (en) 2016-11-23 2023-09-05 Ibs Of America Monitoring system, control system, and actuation assembly of a paper machine, and a method of controlling
TWI756627B (en) * 2016-11-23 2022-03-01 美商美國Ibs公司 Monitoring system, control system, actuating element and control method of paper machine
CN107554425B (en) * 2017-08-23 2019-06-21 江苏泽景汽车电子股份有限公司 A kind of vehicle-mounted head-up display AR-HUD of augmented reality
CN107554425A (en) * 2017-08-23 2018-01-09 江苏泽景汽车电子股份有限公司 A kind of vehicle-mounted head-up display AR HUD of augmented reality
CN109990797A (en) * 2017-12-29 2019-07-09 周秦娜 A kind of control method of the augmented reality navigation display for HUD
CN108225734A (en) * 2018-01-05 2018-06-29 宁波均胜科技有限公司 A kind of error calibration system and its error calibrating method based on HUD systems
CN108896066A (en) * 2018-03-23 2018-11-27 江苏泽景汽车电子股份有限公司 A kind of augmented reality head up display and its navigation implementation method
CN108583437A (en) * 2018-05-07 2018-09-28 温州中佣科技有限公司 A kind of vehicle driving system
CN111357039A (en) * 2018-09-24 2020-06-30 大陆汽车系统公司 Augmented reality DSRC data visualization
CN112534334A (en) * 2018-10-10 2021-03-19 纳宝实验室株式会社 Three-dimensional augmented reality head-up display for realizing augmented reality for driver's viewpoint by locating image on ground
WO2020239036A1 (en) * 2019-05-28 2020-12-03 浙江吉利控股集团有限公司 Vehicle navigation method and system, and computer readable storage medium
CN110132301B (en) * 2019-05-28 2023-08-25 浙江吉利控股集团有限公司 Leading type vehicle navigation method and system
CN110132301A (en) * 2019-05-28 2019-08-16 浙江吉利控股集团有限公司 One kind leading formula automobile navigation method and system
CN113155117A (en) * 2020-01-23 2021-07-23 阿里巴巴集团控股有限公司 Navigation system, method and device
US11920299B2 (en) 2020-03-06 2024-03-05 Ibs Of America Formation detection system and a process of controlling

Also Published As

Publication number Publication date
DE102014219567A1 (en) 2015-04-02
CN104512336B (en) 2019-03-12
JP2015069656A (en) 2015-04-13

Similar Documents

Publication Publication Date Title
CN104512336A (en) 3-dimensional (3-D) navigation
CN104515531B (en) 3- dimension (3-D) navigation system and method for enhancing
US9400385B2 (en) Volumetric heads-up display with dynamic focal plane
US10215583B2 (en) Multi-level navigation monitoring and control
US9393870B2 (en) Volumetric heads-up display with dynamic focal plane
US20160054563A9 (en) 3-dimensional (3-d) navigation
JP7303691B2 (en) Apparatus and method for visualizing content
US10546560B2 (en) Systems and methods for presenting virtual content in a vehicle
US20140362195A1 (en) Enhanced 3-dimensional (3-d) navigation
US9482540B2 (en) Navigation display method and system
JP2012035745A (en) Display device, image data generating device, and image data generating program
US20210239972A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience
WO2019131296A1 (en) Head-up display device
CN113119862B (en) Head-up display device for driving assistance
US20200371532A1 (en) Information processing device, autonomous vehicle, information processing method and program
JP2018157319A (en) Content viewing device, content providing method, and mobile object
US20230206780A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience in a racing environment
US20240053609A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience
Husár et al. Comparison of the Principles of Head-Up Display Technologies for the Implementation of Augmented Reality in the Automotive Industry: A Study
CN116684565A (en) Display method, device, vehicle and storage medium
CN115904156A (en) Display method, electronic device and vehicle
Sharma et al. A Review on Emerging Technologies in Augmented Reality Navigation Systems

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant