KR101921969B1 - augmented reality head-up display apparatus and method for vehicles - Google Patents



Publication number
KR101921969B1
Authority
KR
South Korea
Prior art keywords
driver
position
matching
unit
display unit
Prior art date
Application number
KR1020120101339A
Other languages
Korean (ko)
Other versions
KR20130089139A (en)
Inventor
김경호
박혜선
윤대섭
최종우
황윤숙
박종현
Original Assignee
한국전자통신연구원 (Electronics and Telecommunications Research Institute)
Priority date
Filing date
Publication date
Priority to KR20120010238 priority Critical
Priority to KR1020120010238 priority
Application filed by 한국전자통신연구원 (Electronics and Telecommunications Research Institute)
Priority claimed from US13/753,590 external-priority patent/US8994558B2/en
Publication of KR20130089139A publication Critical patent/KR20130089139A/en
Application granted granted Critical
Publication of KR101921969B1 publication Critical patent/KR101921969B1/en


Abstract

The present invention provides a vehicular augmented reality head-up display apparatus and method in which the display can adaptively change its position and orientation around the driver's view, so as to reduce errors in aligning virtual object information with real-world information. The proposed apparatus includes a viewing angle calculation unit that estimates the driver's gaze direction from the face direction and pupil centers detected in the driver's face image and calculates the driver's viewing angle; a matching unit that, based on the gaze direction and viewing angle, matches the position of the real-world information in front of the driver's seat with the position of the corresponding virtual object information; and a display unit that displays the matching result and is capable of rotation and positional change.

Description

[0001] The present invention relates to an augmented reality head-up display apparatus and method for vehicles, and more particularly, to a vehicular augmented reality head-up display apparatus and method that provides augmented reality by aligning virtual object information with real-world information around the driver's view.

Conventionally developed vehicle information providing apparatuses can be roughly classified into three types.

First, there is the 'car terminal', which is mounted at the lower right of the driver's seat, at the center of the dashboard, and provides various information to the driver. Typically, when information is provided from the terminal while the driver is driving and looking ahead, the driver turns his or her head toward the terminal to look at the provided information. The driver's attention is thereby dispersed, creating a risk of accidents.

Second, there is the 'HUD (Head-Up Display) device', which is mainly installed in high-end vehicles. The HUD device displays simple vehicle information, such as arrow graphics guiding a route change and text representing the speed, above the dashboard. The driver can therefore view the provided information while continuing to watch the road ahead, without turning his or her gaze or head, which reduces the danger caused by dispersed attention. However, because the provided information is not positioned to coincide with the real-world information ahead (e.g., roads, buildings, vehicles, signs), the driver must mentally map the displayed information onto the real world, and safety and convenience may suffer. Furthermore, without positional matching with the real world, it is difficult to provide richer information (for example, lanes, or obstacles such as pedestrians or a forward vehicle) through the HUD.

Third, there is the 'augmented reality HUD device', currently under research and development for integration into vehicles. An augmented reality HUD device provides safe driving information by aligning the real-world information in front of the driver with the information displayed in the HUD, centered on the driver's field of view. Because the safe driving information is presented at the location of the corresponding real-world information, it is easy for the driver to recognize. However, it is not easy to present augmented reality information accurately at the position of the real-world information: the position and posture of the vehicle, as well as the position and posture of the driver's head and line of sight, must be detected and recognized precisely. Due to technical limitations, there are errors in position and posture accuracy, and these errors lower the accuracy of the positional matching between real-world information and safe driving information.

An example of such an augmented reality HUD device is disclosed in Korean Patent Laid-Open Publication No. 10-2009-0076242 (head-up display device for a vehicle and its operation control method), which adjusts the image projection angle of the HUD unit by moving the HUD unit up, down, left, or right according to the movement of the driver's eyes. However, because this device only adjusts the projection angle according to eye movement, it can merely change the display position of the virtual safe driving information; it cannot positionally match that information with the real world.

SUMMARY OF THE INVENTION The present invention has been proposed to solve the above-described problems of the prior art, and an object of the present invention is to provide a vehicular augmented reality head-up display apparatus and method in which the display can adaptively change its position and orientation around the driver's view.

According to an aspect of the present invention, there is provided a vehicular augmented reality head-up display apparatus including: a viewing angle calculation unit for estimating the gaze direction of a driver based on the face direction and pupil center positions detected from a face image of the driver, and for calculating the driver's viewing angle; a matching unit for matching the position of the real-world information in front of the driver's seat with the position of the virtual object information corresponding to that real-world information, based on the gaze direction and viewing angle from the viewing angle calculation unit; and a display unit for displaying the matching result from the matching unit, the display unit being capable of rotation and positional change.

The rotation of the display unit includes movement in the pitch, roll, and yaw directions, and the positional change of the display unit includes movement in the left, right, up, down, forward, and backward directions.

The display unit is rotated or shifted by the driver's manual operation or by automatic adjustment.

Preferably, the apparatus further includes a matching error calculation unit for calculating the matching error in the matching unit, and an adjusting unit for rotating or displacing the display unit based on the matching error calculated by the matching error calculation unit. In this case, the matching unit re-matches the position of the real-world information and the position of the virtual object information according to the adjustment by the adjusting unit. The adjusting unit rotates or displaces the display unit until the calculated matching error falls within the error range.

Meanwhile, the display unit is operated from its initial position as a scroll type or a folder type to enter a displayable state. In this case, the display unit is scrolled upward or downward from the initial position, or unfolded from its folded state, by the driver's manual operation. Alternatively, the display unit is automatically scrolled upward or downward from the initial position, or automatically unfolded, in response to the driver's command.

The virtual object information includes at least one of a static object in front of the driver's seat obtained through map information and a moving object in front of the driver's seat obtained by image recognition.

According to another aspect of the present invention, a vehicular augmented reality head-up display method includes: estimating the gaze direction of a driver based on the face direction and pupil center positions detected from a face image of the driver, and calculating the driver's viewing angle; matching the position of the real-world information in front of the driver's seat with the position of the corresponding virtual object information, based on the estimated gaze direction and calculated viewing angle; displaying the result of the matching step on a display unit; and performing, by an adjusting unit, at least one of rotation and positional change of the display unit according to the result of the matching step.

In the step of performing at least one of rotation and positional change, the rotation includes movement in the pitch, roll, and yaw directions, and the positional change includes movement in the left, right, up, down, forward, and backward directions.

Preferably, performing at least one of rotation and positional change includes calculating the matching error of the matching step, and rotating or displacing the display unit based on the calculated matching error. In this case, the matching step re-matches the position of the real-world information and the position of the virtual object information according to the adjustment.

The step of performing at least one of rotation and positional change rotates or displaces the display unit until the calculated matching error falls within the error range.

The displaying step is performed after the display unit has been operated as a scroll type or a folder type from its initial position into a displayable state.

According to the present invention thus configured, the driver can deploy the augmented reality display whenever desired, according to the need for guidance information concerning safety and convenience, and thereby receive driver-centered information.

In addition, the driver can directly adjust the angle and position of the display so that the necessary guidance information matches the driver's field of vision more accurately. To this end, the present invention supports rotation about each axis as well as horizontal, vertical, and forward/backward positional shifts.

According to the present invention, the positional error between the virtual object information provided on the head-up display and the real-world information can be reduced. Accordingly, the driver's perceptual load for the provided information is reduced, as is the risk of accidents due to misjudgment.

The present invention can also improve the convenience and safety of the driver in all driving environments, including nighttime and bad weather, in which visibility is difficult to secure.

FIG. 1 is a view showing an example of using a display part employed in a vehicular augmented reality head-up display device according to an embodiment of the present invention.
FIG. 2 is a view showing an example of mounting a display part employed in a vehicular augmented reality head-up display device according to an embodiment of the present invention.
FIG. 3 is a view showing an example of direction adjustment of a display part employed in a vehicular augmented reality head-up display device according to an embodiment of the present invention.
FIG. 4 is a view illustrating an example of positional adjustment of a display part employed in a vehicular augmented reality head-up display device according to an embodiment of the present invention.
FIG. 5 is a block diagram of a vehicular augmented reality head-up display apparatus according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a vehicular augmented reality head-up display method according to an embodiment of the present invention.
FIGS. 7 to 9 are views for explaining a vehicular augmented reality head-up display method according to an embodiment of the present invention.

To solve the problems of the conventional augmented reality HUD device, a method of reducing the matching error between real-world information and safe driving information is needed. One cause of this error is that the display panel of the augmented reality head-up display device is of a fixed type: a fixed augmented reality HUD device introduces errors in the process of matching the real-world information with the safe driving information provided in the HUD.

Accordingly, if the driver can adjust the orientation or position of the HUD according to his or her posture or seating position, real-world information and safe driving information can be aligned more accurately with the driver's view.

To this end, the present invention adaptively changes the angle and position of the display according to the driver's view, thereby reducing the error in positional matching with real-world information and providing information that better fits the driver's view.

Hereinafter, a vehicular augmented reality head-up display apparatus and method according to the present invention will be described with reference to the accompanying drawings. Prior to the detailed description, the terms and words used in this specification and the claims should not be construed as limited to their ordinary or dictionary meanings. The embodiments described in this specification and the configurations shown in the drawings are merely preferred embodiments of the present invention and do not represent all of its technical ideas; it should therefore be understood that various equivalents and modifications are possible.

FIG. 1 is a view showing an example of using a display part employed in a vehicular augmented reality head-up display device according to an embodiment of the present invention.

In order to display the route guidance information (travel route information, POI information, guide board information, etc.) and safe driving information (lane information, vehicle information, pedestrian information, etc.) provided through the augmented reality head-up display device according to the embodiment of the present invention, the display unit 10 must first be operated from its initial position, as a scroll type or a folder type, into a displayable position.

As shown in FIG. 1, the scroll type refers to a form in which the display rolls down from the top or up from the bottom, and the folder type refers to a form in which a folded panel, like the sun visor mounted above the driver's seat, is unfolded.

The scroll-type or folder-type operation is performed automatically or manually. Automatic operation means that the display unit 10 is moved to a displayable position by a built-in motor (not shown) or the like. Manual operation means that the driver directly holds the display unit 10 by hand and scrolls it upward or downward, or unfolds the folded panel by hand.

FIG. 2 is a view showing an example of mounting a display part employed in a vehicular augmented reality head-up display device according to an embodiment of the present invention; FIG. 3 is a view showing an example of direction adjustment of the display part; and FIG. 4 is a view illustrating an example of positional adjustment of the display part.

The display unit 10 employed in the augmented reality head-up display apparatus according to the embodiment of the present invention may be installed on the dashboard or the windshield in front of the driver's seat. The display unit 10 is preferably made of a transparent material. In FIG. 2, it is assumed that the display unit 10 is mounted near the sun visor or the windshield 12 in front of the driver's seat.

The display unit 10 is capable of both rotation and positional change. As shown in FIG. 3, the display unit 10 is rotatable in the pitch (x-axis), roll (y-axis), and yaw (z-axis) directions. As shown in FIG. 4, the display unit 10 can move left, right, up, down, forward, and backward. Here, positional movement to the left or right is referred to as pan movement, movement up or down as tilt movement, and movement forward or backward as dolly movement.
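The six degrees of freedom described above can be summarized in a small data structure. The following Python sketch is illustrative only; the field names and units are assumptions of this note, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DisplayPose:
    """Hypothetical 6-DOF pose of the display unit 10.

    Rotations follow the axes of FIG. 3: pitch about x, roll about y,
    yaw about z.  Translations follow FIG. 4: pan is left/right,
    tilt is up/down, dolly is forward/backward.
    """
    pitch: float = 0.0  # rotation about the x-axis (degrees)
    roll: float = 0.0   # rotation about the y-axis (degrees)
    yaw: float = 0.0    # rotation about the z-axis (degrees)
    pan: float = 0.0    # left/right offset (mm)
    tilt: float = 0.0   # up/down offset (mm)
    dolly: float = 0.0  # forward/backward offset (mm)

    def adjust(self, **deltas: float) -> "DisplayPose":
        """Return a new pose with the given per-axis deltas applied."""
        fields = ("pitch", "roll", "yaw", "pan", "tilt", "dolly")
        return DisplayPose(**{f: getattr(self, f) + deltas.get(f, 0.0)
                              for f in fields})

# Example: a small yaw rotation combined with lowering the panel.
pose = DisplayPose().adjust(yaw=2.5, tilt=-10.0)
```

Keeping rotation and translation in one record mirrors the fact that, in the text above, both kinds of adjustment are applied to the same display unit.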

In this way, the display unit 10 is adjusted by positional shifts in the up, down, left, right, forward, and backward directions and/or by rotations about the x, y, and z axes in accordance with the driver's line of sight. Adjusting the position and orientation of the display unit 10 according to the driver's field of view allows the position of the virtual object information to be precisely aligned with the position of the real-world information. Because more accurate augmented reality information can thus be provided, the driver receives safe driving information more accurately and easily.

As described above, the position and the direction of the display unit 10 can be manually adjusted by the driver or automatically adjusted according to the driver's line of sight.

5 is a block diagram of a vehicular augmented reality head-up display apparatus according to an embodiment of the present invention.

The vehicular augmented reality head-up display apparatus according to the present invention includes a display unit 10, a photographing unit 20, a face direction detection unit 22, a pupil center position detection unit 24, a viewing angle calculation unit 26, a virtual object generation unit 28, a navigation unit 30, a matching unit 32, a matching error calculation unit 34, and an adjusting unit 36.

The photographing unit 20 is installed on the dashboard in front of the driver's seat so as to face the driver, and includes at least one camera. The photographing unit 20 photographs the driver's face and outputs a plurality of captured face images. For example, the photographing unit 20 first captures a frontal face image of the driver, and then captures images of the face and eyes at various angles, covering up-and-down and left-and-right face directions as well as the corresponding gaze directions. The angle values for the face and gaze directions are experimental thresholds obtained by running experiments on various subjects.

The face direction detection unit 22 detects the driver's face direction from the face images captured by the photographing unit 20. For example, the face direction detection unit 22 locates facial feature points such as the eyebrows, eyes, nose, and mouth in the captured face images and models the driver's face through these feature points. The face direction detection unit 22 then detects the three-dimensional face direction (position and orientation) of the driver using the generated face model.

When the face direction is determined by the face direction detection unit 22, the pupil center position detection unit 24 detects the driver's pupil center positions using the eye positions estimated from the eye feature points and the face direction. In other words, the pupil center position detection unit 24 performs pupil center detection based on the information from the face direction detection unit 22 once the face direction has been determined. Alternatively, the pupil center position detection unit 24 may perform pupil center detection based on the information from the photographing unit 20 as soon as the face direction detection unit 22 starts its face direction detection operation.

Although the face direction detection unit 22 and the pupil center position detection unit 24 are shown as separate components in FIG. 5, they may be combined into a single block that detects both the face direction and the pupil center positions.

The viewing angle calculation unit 26 estimates the gaze direction of the driver based on the face direction from the face direction detection unit 22 and the pupil center positions from the pupil center position detection unit 24, and then calculates the viewing angle. Preferably, the viewing angle calculation unit 26 estimates the coordinates of the head center and the eyeball centers from the input face direction and pupil center positions, based on the shape and structure of the human face. The viewing angle calculation unit 26 then estimates the gaze direction using these estimated coordinates (i.e., the head center and eyeball centers) together with the pupil center positions, and calculates the driver's viewing angle in consideration of the face direction and the gaze direction.
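As a minimal numeric sketch of this step (with an assumed common coordinate frame and deliberately simplified geometry, not the patented algorithm), the gaze ray can be taken through the estimated head-center and eye-center coordinates, and an angle can then be measured against the face direction:

```python
import math

def gaze_direction(head_center, eye_center):
    """Unit vector from the head center through the eye center.

    Both points are assumed to be (x, y, z) coordinates in one common
    frame; the patent estimates them from the face model and the pupil
    positions.
    """
    dx, dy, dz = (e - h for e, h in zip(eye_center, head_center))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)

def angle_between(gaze, face_normal):
    """Angle in degrees between the gaze ray and the face direction."""
    dot = sum(g * f for g, f in zip(gaze, face_normal))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

g = gaze_direction((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
theta = angle_between(g, (0.0, 0.0, 1.0))  # gaze aligned with the face
```

A real system would derive these coordinates from camera-calibrated face tracking; the point here is only the geometric relationship between head center, eye center, and gaze ray.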

The virtual object generation unit 28 creates virtual objects through image recognition using a camera sensor and through guide board information recognition using three-dimensional map information. More specifically, the virtual object generation unit 28 recognizes moving objects such as pedestrians or preceding vehicles in front of the driver's seat using the camera sensor, and recognizes static objects such as signboards and road signs using the three-dimensional map information. The virtual object generation unit 28 thus generates virtual object information comprising the static objects ahead of the driver's seat obtained from the three-dimensional map information and the moving objects ahead of the driver's seat obtained by image recognition. The virtual object information includes a position coordinate value for each virtual object.
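The two classes of objects and their per-object coordinate values can be pictured with a record like the following; the field names are illustrative assumptions of this note, not terms from the disclosure.

```python
from dataclasses import dataclass
from typing import Literal, Tuple

@dataclass
class VirtualObject:
    """Illustrative record for the output of the virtual object
    generation unit 28: 'static' objects (signboards, road signs) come
    from 3-D map information, while 'moving' objects (pedestrians,
    preceding vehicles) come from camera-based image recognition."""
    kind: Literal["static", "moving"]
    label: str
    position: Tuple[float, float, float]  # (x, y, z) coordinate value

objects = [
    VirtualObject("static", "road_sign", (12.0, 3.5, 40.0)),
    VirtualObject("moving", "pedestrian", (-2.0, 0.0, 15.0)),
]
moving = [o for o in objects if o.kind == "moving"]
```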

The navigation unit 30 calculates and outputs the position value of the vehicle in real time.

Based on the gaze direction and the viewing angle from the viewing angle calculation unit 26, the matching unit 32 matches the position of the real-world information in front of the driver's seat with the position of the corresponding virtual object information from the virtual object generation unit 28. That is, the matching unit 32 receives the virtual object information from the virtual object generation unit 28 and matches the coordinate values of the real-world information and the virtual object information according to the gaze direction and viewing angle from the viewing angle calculation unit 26. Here, the real-world information refers to what is actually seen through the windshield in front of the driver's seat.

More specifically, the matching unit 32 maps the driver's head direction and gaze position values to real-world coordinate values, taking into account the relative positional relationship between the vehicle and the driver based on the vehicle position value from the navigation unit 30. The matching unit 32 can also calculate initial values for the direction and position of the display unit 10 by automatically mapping them to real-world coordinate values based on the vehicle position value. The matching unit 32 then maps the virtual object coordinate values onto the previously calculated real-world mapping coordinates within the driver's viewing angle range, through camera calibration and registration methods.
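A full implementation would use calibrated camera intrinsics and extrinsics; the following deliberately simplified pinhole projection (the coordinate frame and focal length are assumptions of this note) only illustrates the core idea of mapping a world coordinate into the driver's view plane:

```python
def project_to_view(world_point, eye_point, focal=1.0):
    """Project a world point onto a view plane at distance `focal`
    in front of the driver's eye (simplified pinhole model).

    `world_point` and `eye_point` are (x, y, z) tuples with z pointing
    forward from the driver; this frame is an assumption made for the
    sketch, not taken from the patent.
    """
    x = world_point[0] - eye_point[0]
    y = world_point[1] - eye_point[1]
    z = world_point[2] - eye_point[2]
    if z <= 0:
        raise ValueError("point is not in front of the driver")
    return (focal * x / z, focal * y / z)

# A sign 40 m ahead, 2 m right of and 0.5 m above the eye point:
u, v = project_to_view((2.0, 0.5, 40.0), (0.0, 0.0, 0.0))
```

Matching then amounts to placing the virtual object's graphic at the same view-plane coordinates as the projection of the corresponding real-world object.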

The matching error calculation unit 34 calculates the matching error generated in the matching process of the matching unit 32.

The adjusting unit 36 rotates or changes the position of the display unit 10 based on the matching error calculated by the matching error calculation unit 34, and continues to do so until the calculated matching error falls within a predetermined error range. Whenever an adjustment is made by the adjusting unit 36, the matching unit 32 re-matches the position of the real-world information and the position of the virtual object information. In this way, the coordinate values of the virtual objects are finely recalculated against the coordinate values of the real-world information.
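The adjust-and-rematch behavior described above can be sketched as a loop. The greedy per-axis search and the numeric values below are placeholders for whatever strategy and thresholds a real system would use; only the loop structure (adjust, recompute the matching error, stop once it is within range) mirrors the text.

```python
def auto_adjust(pose, compute_error, step=0.5, tolerance=0.25, max_iters=100):
    """Rotate/displace the display pose until the matching error is
    within tolerance, recomputing the error after each adjustment.

    `pose` maps an axis name to its value; `compute_error` stands in
    for the matching error calculation unit 34.
    """
    axes = ("pitch", "roll", "yaw", "pan", "tilt", "dolly")
    error = compute_error(pose)
    for _ in range(max_iters):
        if error < tolerance:          # within the error range: stop
            break
        best_pose, best_error = pose, error
        for axis in axes:              # try a small move on each axis
            for delta in (step, -step):
                candidate = dict(pose)
                candidate[axis] += delta
                e = compute_error(candidate)
                if e < best_error:
                    best_pose, best_error = candidate, e
        if best_error >= error:        # no move improves: give up
            break
        pose, error = best_pose, best_error
    return pose, error

# Synthetic example: the error is the distance to a known target pose.
target = {"pitch": 0.0, "roll": 0.0, "yaw": 3.0,
          "pan": 0.0, "tilt": -2.0, "dolly": 0.0}
start = {axis: 0.0 for axis in target}
final, err = auto_adjust(start, lambda p: sum(abs(p[a] - target[a])
                                              for a in target))
```

In the apparatus, `compute_error` would come from re-running the matching of unit 32 after each physical adjustment of the display, rather than from a synthetic target.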

Hereinafter, a vehicular augmented reality head-up display method according to an embodiment of the present invention will be described with reference to the flowchart of FIG. 6; FIGS. 7 to 9 are referenced to explain the method.

First, the photographing unit 20 photographs the driver's face and outputs a plurality of captured face images.

Next, the face direction detection unit 22 detects the driver's face direction from the captured face images provided by the photographing unit 20 (S10).

The pupil center position detection unit 24 then detects the center positions of the driver's pupils (S20), using the eye positions estimated from the eye feature points and the face direction, based on the captured face images from the photographing unit 20 or the information from the face direction detection unit 22.

Information corresponding to the detected face direction and pupil center positions is transmitted to the viewing angle calculation unit 26.

Accordingly, the viewing angle calculation unit 26 calculates the viewing angle of the corresponding driver based on the detected face direction and the pupil center position (S30).

Thereafter, the virtual object generation unit 28 creates virtual objects through image recognition using a camera sensor and guide board information recognition using three-dimensional map information (S40).

Then, based on the gaze direction and the viewing angle from the viewing angle calculation unit 26, the matching unit 32 matches the position of the real-world information in front of the driver's seat with the position of the corresponding virtual object information from the virtual object generation unit 28 (S50).

When the matching in the matching unit 32 is completed in this manner, the matching error calculation unit 34 calculates the matching error of the matching unit 32, and the adjusting unit 36 determines, according to the calculated matching error value, whether to rotate or shift the position of the display unit 10.

That is, if the calculated matching error value is smaller than the predetermined error range ("Yes" in S60), the adjusting unit 36 determines that rotation and positional shifting of the display unit 10 are unnecessary. As a result, a screen 14 as illustrated in FIG. 7 is displayed on the display unit 10 (S70). The screen 14 of FIG. 7 shows the position of the real-world information and the position of the virtual object information well matched within the driver's visual range.

On the other hand, when the calculated matching error value is outside the predetermined error range ("No" in S60), the position of the real-world information and the position of the virtual object information are not properly matched, as illustrated in FIG. 8. Accordingly, the adjusting unit 36 rotates or displaces the display unit 10, as shown in FIG. 9, until the matching error value falls within the preset error range, and the matching unit 32 re-matches the position of the real-world information and the position of the virtual object information according to the adjustment by the adjusting unit 36 (S80).

Here, the method of adjusting the direction and the position of the display unit 10 will be described once again.

First, the manual adjustment methods are described. In the first manual adjustment method, the driver directly adjusts the direction and position of the display unit 10 by hand in accordance with his or her viewing angle and line of sight. In the second manual adjustment method, the driver adjusts the direction and position of the display unit 10 through a motor (not shown) and buttons (not shown) capable of moving it along and about each of the x, y, and z axes. The motor and buttons may be provided in the display unit 10; although they are not shown separately, those skilled in the art will readily understand them.

Next, the automatic adjustment method is described. In the automatic adjustment method, the driver's viewing angle and gaze direction are extracted automatically, without the driver's intervention, from the face direction based on the positions of the driver's eyes and the face model, and the position and direction of the display unit 10 are adjusted automatically accordingly.

The present invention can also be embodied as computer-readable code on a computer-readable recording medium. A computer-readable recording medium includes any kind of recording apparatus in which data readable by a computer system is stored. Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner.

While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

10: display unit 20: photographing unit
22: face direction detection unit 24: pupil center position detection unit
26: viewing angle calculation unit 28: virtual object generation unit
30: navigation unit 32: matching unit
34: matching error calculation unit 36: adjusting unit

Claims (20)

  1. A viewing angle calculating unit for estimating a viewing direction of the driver based on the detected face direction and the center of the pupil based on the face image of the driver and calculating a viewing angle;
    A matching unit for matching the position of the real world information on the front of the driver's seat and the position of the virtual object information corresponding to the front of the driver's seat based on the viewing direction and the viewing angle from the viewing angle calculating unit;
    A display unit for displaying the matched result in the matching unit, the display unit being capable of rotating and shifting the position;
    A matching error calculator for calculating a matching error in the matching unit; And
    And an adjusting unit for rotating or displacing the display unit based on the matching error calculated by the matching error calculating unit,
    Wherein the matching unit re-matches the position of the real world information and the position of the virtual object information according to the adjustment in the adjustment unit.
  2. The apparatus according to claim 1,
    Wherein the rotation of the display unit includes a movement in a pitch, roll, and yaw directions.
  3. The apparatus according to claim 1,
    Wherein the positional change of the display unit includes a left, right, up, down, front, and back movement.
  4. The apparatus according to claim 1,
    Wherein the display unit is rotated or displaced by manual manipulation by the driver.
  5. The apparatus according to claim 1,
    Wherein the display unit is rotated or displaced by automatic adjustment.
  6. delete
  7. The apparatus according to claim 1,
    Wherein the adjusting unit rotates or displaces the display unit until the calculated matching error becomes smaller than a predetermined error range.
  8. The apparatus according to claim 1,
    Wherein the display unit operates in a scroll type or a folder type at an initial position and becomes a displayable state.
  9. The apparatus of claim 8,
    Wherein the display unit is scrolled upward or downward from the initial position, or unfolded from a folded state, by a manual operation of the driver.
  10. The apparatus of claim 8,
    Wherein the display unit is automatically scrolled upward or downward from the initial position, or unfolded from a folded state, according to a command from the driver.
  11. The apparatus according to claim 1,
    Wherein the virtual object information includes at least one of a static object in front of the driver's seat through map information and a moving object in front of the driver's seat by image recognition.
  12. An augmented reality head-up display method for a vehicle, comprising: estimating a viewing direction of the driver based on a face direction and pupil centers detected from a face image of the driver, and calculating a viewing angle;
    Matching the position of the real world information in front of the driver's seat and the position of the virtual object information corresponding to the front of the driver's seat based on the viewing direction and the viewing angle in the step of estimating the viewing direction and calculating the viewing angle;
    Displaying a matched result in the matching step; And
    And performing, by an adjusting unit, at least one of rotation and positional change of the display unit according to the matching result in the matching step,
    Wherein the step of performing at least one of the rotation and the positional change comprises:
    Calculating a matching error of the matching step; And
    Rotating or displacing the display unit based on the calculated matching error,
    Wherein the matching step re-matches the position of the real world information and the position of the virtual object information according to the adjustment in the step of performing at least one of the rotation and the positional change.
  13. The method of claim 12,
    Wherein, in the step of performing at least one of the rotation and the positional shift, the rotation includes movement in the pitch, roll, and yaw directions.
  14. The method of claim 12,
    Wherein the positional change includes a leftward, rightward, upward, downward, forward, and backward movement in the step of performing at least one of the rotation and the positional change.
  15. delete
  16. The method of claim 12,
    Wherein the step of performing at least one of the rotation and the positional change rotates or displaces the display unit until the calculated matching error becomes smaller than a predetermined error range.
  17. The method of claim 12,
    Wherein the displaying step is performed after the display unit is operated in a scroll type or a folder type at an initial position to be in a displayable state.
  18. The method of claim 17,
    Wherein the display unit is scrolled upward or downward from the initial position, or unfolded from a folded state, by a manual operation of the driver.
  19. The method of claim 17,
    Wherein the display unit is automatically scrolled upward or downward from the initial position, or unfolded from a folded state, according to a command from the driver.
  20. The method of claim 12,
    Wherein the virtual object information includes at least one of a static object in front of the driver's seat through map information and a moving object in front of the driver's seat by image recognition.
KR1020120101339A 2012-02-01 2012-09-13 augmented reality head-up display apparatus and method for vehicles KR101921969B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR20120010238 2012-02-01
KR1020120010238 2012-02-01

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/753,590 US8994558B2 (en) 2012-02-01 2013-01-30 Automotive augmented reality head-up display apparatus and method

Publications (2)

Publication Number Publication Date
KR20130089139A KR20130089139A (en) 2013-08-09
KR101921969B1 true KR101921969B1 (en) 2018-11-28

Family

ID=49215203

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120101339A KR101921969B1 (en) 2012-02-01 2012-09-13 augmented reality head-up display apparatus and method for vehicles

Country Status (1)

Country Link
KR (1) KR101921969B1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101519353B1 (en) * 2013-11-12 2015-05-13 현대오트론 주식회사 Method and apparatus for displaying information of smart curse device using head-up display
KR101583950B1 (en) * 2014-06-30 2016-01-08 현대자동차주식회사 Apparatus and method for displaying vehicle information
US9690104B2 (en) 2014-12-08 2017-06-27 Hyundai Motor Company Augmented reality HUD display method and device for vehicle
KR101900475B1 (en) * 2017-06-29 2018-11-08 주식회사 맥스트 Calibration method for matching of augmented reality object and head mounted display for executing the method
WO2019074114A1 (en) * 2017-10-13 2019-04-18 Ricoh Company, Ltd. Display device, program, image processing method, display system, and moving body

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002052953A (en) * 2000-08-10 2002-02-19 Yazaki Corp Display device for vehicle and display position adjustment method therefor
US20070124071A1 (en) 2005-11-30 2007-05-31 In-Hak Joo System for providing 3-dimensional vehicle information with predetermined viewpoint, and method thereof
US20100253542A1 (en) 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Point of interest location marking on full windshield head-up display
JP2010256878A (en) * 2009-03-30 2010-11-11 Aisin Aw Co Ltd Information display device


Also Published As

Publication number Publication date
KR20130089139A (en) 2013-08-09

Similar Documents

Publication Publication Date Title
US9075563B2 (en) Augmented reality display system and method for vehicle
US6272431B1 (en) Method for displaying a map in a vehicle en-route guidance system
US10295826B2 (en) Shape recognition device, shape recognition program, and shape recognition method
DE102004064249B3 (en) Vehicle information display system
JP4134785B2 (en) Display device
US8212662B2 (en) Automotive display system and display method
JP2005138755A (en) Device and program for displaying virtual images
US20150331236A1 (en) A system for a vehicle
JP4672190B2 (en) Video navigation device
JP2006284458A (en) System for displaying drive support information
JP4899340B2 (en) Driving sense adjustment device and driving sense adjustment method
US7423553B2 (en) Image display apparatus, image display method, measurement apparatus, measurement method, information processing method, information processing apparatus, and identification method
DE102012216623B4 (en) Method for dynamic information display on a head-up display
US20080195315A1 (en) Movable-Body Navigation Information Display Method and Movable-Body Navigation Information Display Unit
JP4476719B2 (en) Navigation system
JP2010256878A (en) Information display device
EP1415128A2 (en) Method and device for displaying driving instructions, especially in car navigation systems
US20100054580A1 (en) Image generation device, image generation method, and image generation program
CN101194143A (en) Navigation device with camera information
WO2011108091A1 (en) In-vehicle display device and display method
JP2007292713A (en) Navigation device
KR101416378B1 (en) A display apparatus capable of moving image and the method thereof
EP2766879A2 (en) Method for integrating virtual objects into vehicle displays
JP4246195B2 (en) Car navigation system
EP2133728B1 (en) Method and system for operating a display device

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant