JPH10176928A - Viewpoint position measuring method and device, head-up display, and mirror adjustment device - Google Patents

Viewpoint position measuring method and device, head-up display, and mirror adjustment device

Info

Publication number
JPH10176928A
JPH10176928A (application JP8338067A / JP33806796A)
Authority
JP
Japan
Prior art keywords
viewpoint position
means
position
vehicle
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP8338067A
Other languages
Japanese (ja)
Inventor
Yoshinori Endo
Kozo Nakamura
Mariko Okude
Original Assignee
Hitachi Ltd
Zanavy Informatics KK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd and Zanavy Informatics KK
Priority to JP8338067A
Publication of JPH10176928A
Legal status: Pending

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Abstract

(57) [Summary] [Problem] To measure a driver's viewpoint position accurately and inexpensively. [Solution] The viewpoint position is obtained from the display positions of reference points (5, 6), which are shown on a screen (windshield 20) and moved according to the user's instructions, and from the positions of predetermined targets (for example, vehicle width confirmation markers 1 and 2): the viewpoint is the position from which each reference point and its target appear to overlap. Alternatively, the viewpoint position of the vehicle's driver is obtained from the angle of the vehicle's rearview mirror and/or side mirrors.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a method for measuring the position of a user's viewpoint; to a viewpoint position measuring device using that method; to a head-up display (hereinafter abbreviated as HUD) device that, based on the eye position measured by that method, determines the display position of information on the windshield of an automobile or the like; and to a mirror adjustment device that, based on the measured eye position, changes the angle of a vehicle's rearview mirror and/or side mirrors.

[0002]

2. Description of the Related Art As a method of conveying information to the driver of an automobile or the like, displaying image information on a display such as a CRT (cathode ray tube) or an LCD (liquid crystal display) is generally used. Using images as the information transmission medium is excellent in that complicated information can be conveyed accurately. However, with this method, the driver must divert the line of sight from the road ahead in order to read the information on the display, which lowers safety.

[0003] Therefore, as a method of conveying information without constraining the driver's line of sight, outputting information by voice has been used. With this method, the driver can obtain information safely without looking away from the road ahead. However, the amount of information that can be conveyed per unit time is limited, it is difficult to convey emergency information promptly and accurately, and it is also difficult to convey complicated information accurately through this auditory channel.

[0004] Therefore, a head-up display (HUD) device has been proposed as a means of solving these problems. The HUD device projects the image to be presented onto a transparent screen (for example, a windshield). With a HUD device, the viewer sees the projected image (herein, the image projected and formed on the screen is called the virtual image) and the outside scene visible through the windshield (herein called the real image) superimposed on one another, and can recognize both; various information can thus be obtained without diverting the line of sight from the road ahead. The HUD device can therefore be called an excellent information transmission means that achieves both safety and operability.

[0005] Moreover, if this HUD device is used then, as described in Japanese Unexamined Patent Publication No. 00, superimposing a virtual image on a real image to be noticed makes it possible to emphasize an object ahead in the traveling direction or to indicate the road on which to proceed.

[0006] To use the HUD device for such a purpose, the real image beyond the screen (the windshield, when applied to an automobile) and the virtual image projected onto it must be superimposed as seen by the user (the viewer; when applied to an automobile, usually the driver), so it is essential to detect the viewer's viewpoint position (that is, the eye position) accurately. To this end, Japanese Unexamined Patent Publication No. 6-247184 proposes a method in which the driver's eyes are photographed with a stereo camera and the images are analyzed to determine the viewpoint position.

[0007]

SUMMARY OF THE INVENTION In the technique described in the above-mentioned Japanese Unexamined Patent Publication No. 6-247184, the face of the person to whom the HUD device presents information, that is, the viewer (often the driver), is photographed with a stereo camera, the eye positions are found by image processing, and the viewpoint position is continuously calculated from the difference between the eye positions captured by the two cameras. By using high-resolution cameras, this method can always obtain accurate viewpoint position information in real time even as the viewer's eyes move.

[0008] However, this method requires an expensive camera and an image processing device, which raises the cost. Furthermore, although the resolution is high in the direction parallel to the straight line connecting the two cameras of the stereo camera, it is difficult to measure accurately in the direction perpendicular to it.

[0009] Accordingly, a first object of the present invention is to provide a low-cost, accurate viewpoint position measuring method, and a viewpoint position measuring device and head-up display device using that method. In addition, if the viewpoint position can be detected accurately at low cost, the mirror angles can be adjusted based on the detected position; a second object of the present invention is therefore to provide such a vehicle mirror adjustment device.

[0010]

According to the present invention, as a first viewpoint position measuring method, there is provided a method in which a reference point (hereinafter called a cursor) that moves on a screen (HUD screen) in accordance with instructions is displayed, and the position of the viewpoint from which the cursor and a predetermined target appear to overlap is obtained from the position of the target and the position of the cursor.

[0011] According to the present invention, as a second viewpoint position measuring method, there is provided a method of obtaining the viewpoint position from the adjustment angle of a side mirror and/or the room mirror.

[0012] Further, the present invention provides a viewpoint position measuring device, a head-up display device, and a mirror adjustment device using at least one of these viewpoint position measuring methods.

[0013]

As described above, in the first method of the present invention, the position of the viewpoint from which they appear to overlap is obtained from the display position of a reference point, shown on a screen and moved according to instructions, and from the position of a predetermined target observed through the screen.

[0014] This viewpoint position measuring method can be applied to vehicle HUD devices. In a vehicle HUD device the windshield serves as the screen onto which images are projected, and the reference point is likewise displayed on the windshield, so the first viewpoint position measuring method of the present invention can be applied without providing a separate screen. In this case, a vehicle width confirmation marker (fender marker), a fender mirror, or the like may be used as the target, or a mark for viewpoint position confirmation may be attached in advance to the hood of the vehicle.

[0015] According to the present invention, as a viewpoint position measuring device using the first method, there is provided a device (hereinafter, the first viewpoint position measuring device) comprising input means for receiving position information for a reference point, reference point display means for displaying the reference point on a screen based on the position information received via the input means, and viewpoint position calculating means for obtaining the viewpoint position based on the display position of the reference point. Here, the viewpoint position calculating means computes, as the viewpoint position, the position from which the reference point and a predetermined target observed through the screen appear to overlap.

[0016] The viewpoint position calculating means may operate continuously while the reference point display means is operating, so that the viewpoint position is measured at all times. Alternatively, it may be started at any time while the reference point is displayed and terminated once the viewpoint position has been obtained. In the latter case, the measurement may be performed after a fixed time has elapsed from the start of the display, or means for receiving a measurement instruction via the input means may be provided so that the position is measured when instructed.

[0017] If constant measurement of the viewpoint position is unnecessary, the reference point need not remain displayed after the viewpoint position has been measured, and it is desirable to erase it. That is, when the viewpoint position measuring device is activated, it is desirable to display the reference point for a predetermined time, or until the viewpoint position has been measured, and then erase it. The viewpoint position is obtained using the reference point display position at the time of erasure, or at any time during the display.

[0018] The initial display position of the reference point may be a predetermined position. Preferably, however, the device further comprises means for estimating the viewpoint position in advance from the position of the viewer's seat (usually the driver's seat), the angle of the room mirror (the mirror for rearward confirmation), the angles of the side mirrors (mirrors for side confirmation, including fender mirrors), and the like, and for displaying the reference point at the position on the screen where, seen from the estimated viewpoint position, it would appear to overlap the target. In this way, the distance the reference point must be moved to overlap the target can be reduced.

[0019] When the first viewpoint position measuring device is mounted on a vehicle or the like, means for detecting the starting of the vehicle may be provided, so that the reference point display means is activated, or the display of the reference point is started, when starting is detected. When applied to a vehicle, it is usually desirable to measure the viewpoint position at the start of driving (that is, when the vehicle is started), so measuring at that time makes the device easy to use.

[0020] If the position of the viewer's seat (usually the driver's seat), the angle of the room mirror, or the angle of a side mirror is changed, it can be presumed that the viewpoint position has changed, so it is desirable to measure the viewpoint position again whenever such a change is detected. That is, the first viewpoint position measuring device desirably further comprises change estimating means for presuming a change in the viewpoint position, and starting means for starting the reference point display means when the change estimating means presumes that the viewpoint position has changed. Here, the change estimating means desirably comprises at least one of means for detecting movement of the viewer's seat, means for detecting a change in the angle of the room mirror, and means for detecting a change in the angle of a side mirror.

[0021] Usability is further improved if the display position of the reference point is changed in accordance with these changes. That is, the first viewpoint position measuring device desirably further comprises change amount estimating means for estimating the amount of change in the viewpoint position, and the reference point display means desirably further comprises means for changing the display position of the reference point based on the estimated amount of change. Here, the change amount estimating means comprises at least one of means for detecting the amount of movement of the viewer's seat, means for detecting the amount of change in the angle of the room mirror, and means for detecting the amount of change in the angle of a side mirror.

[0022] Alternatively, when these are changed, the already measured viewpoint position may simply be corrected according to the amount of change, without displaying the reference point and measuring the viewpoint position again. That is, the first viewpoint position measuring device desirably further comprises the above-described change amount estimating means, and the viewpoint position calculating means desirably further comprises means for correcting the previously obtained viewpoint position based on the estimated amount of change.

[0023] In the first viewpoint position measuring method and device there may be only one reference point, but providing a plurality of reference points improves accuracy and is therefore desirable. That is, the reference point display means desirably comprises means for displaying a plurality of reference points on the screen, means for receiving position information for whichever of the plurality of reference points is currently the input target and changing the display position of that reference point based on the received position information, and means for changing which reference point is the input target.

[0024] When a plurality of reference points are provided, it is desirable to provide means for notifying the user which reference point's movement instruction (that is, position information) is currently being accepted. That is, the reference point display means desirably further comprises means for highlighting the reference point that is the input target (for example, changing its display color or blinking it), or means for announcing by voice which of the plurality of reference points is the input target.

[0025] According to the present invention, as a second viewpoint position measuring method, there is provided a method of obtaining the viewpoint position of the driver of a vehicle based on the angle of the vehicle's rearview mirror and/or side mirrors. As viewpoint position measuring devices using this second method, there are provided a device (hereinafter, the second viewpoint position measuring device) comprising mirror angle measuring means for measuring the angle of the vehicle's rearview mirror and means for obtaining the driver's viewpoint position based on the measured angle, and a device (hereinafter, the third viewpoint position measuring device) comprising mirror angle measuring means for measuring the angle of a side mirror of the vehicle and means for obtaining the driver's viewpoint position based on the measured angle.
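The patent does not spell out a formula for the second method, but its geometry can be sketched. One possible reading, stated here purely as an assumption: if the driver is taken to have aimed the rearview mirror so that a known rear reference point (for example, the center of the rear window) is seen at the mirror center, the eye must lie on the reflection of the ray from that reference point, and the point along that ray can be fixed using the center plane of the driver's seat, as in the single-cursor case of the first method. All function and parameter names below are illustrative.

```python
import numpy as np

def viewpoint_from_mirror(mirror_pos, mirror_normal, rear_ref, seat_plane_y):
    """Sketch (an assumption, not the patent's own formula): estimate the
    eye position from a mirror's measured orientation, assuming the driver
    aimed the mirror so that rear_ref is seen at the mirror center."""
    n = mirror_normal / np.linalg.norm(mirror_normal)
    d = mirror_pos - rear_ref                 # ray arriving at the mirror
    d = d / np.linalg.norm(d)
    r = d - 2.0 * np.dot(d, n) * n            # law of reflection
    # Fix the point along the reflected ray using the center plane of the
    # driver's seat (y = seat_plane_y).
    t = (seat_plane_y - mirror_pos[1]) / r[1]
    return mirror_pos + t * r
```

For example, with the mirror at the origin, a normal turned toward the driver's side, and the rear reference point 10 m behind the mirror, the function returns the point where the reflected sight line crosses the seat plane.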

[0026] The viewpoint position measured by the first or second method can be used in any apparatus that requires an accurate viewpoint position. One such apparatus is a device for adjusting the mirrors of a vehicle. Mirror angles are usually adjusted by the driver while viewing the image reflected in the mirror, but if the viewpoint position measuring method of the present invention is used, the mirrors can be adjusted to the optimum angle for the driver's viewpoint position.

[0027] Accordingly, the present invention provides a side mirror adjustment device comprising the first or second viewpoint position measuring device and means for obtaining a target angle for a side mirror based on the viewpoint position obtained by that measuring device and adjusting the angle of the side mirror to the target angle. It likewise provides a rearview mirror adjustment device comprising the first or second viewpoint position measuring device and means for obtaining a target angle for the room mirror based on the obtained viewpoint position and adjusting the angle of the room mirror to the target angle.
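The patent states only that a target mirror angle is obtained from the viewpoint position; as a hedged sketch of how that could be done (the bisector construction and all names here are assumptions, not taken from the text), the desired mirror normal bisects the directions from the mirror to the eye and from the mirror to the point the driver should see:

```python
import numpy as np

def target_mirror_normal(viewpoint, mirror_pos, rear_target):
    """Unit normal the mirror should have so that, from the measured
    viewpoint, the driver sees rear_target at the mirror center.
    By the law of reflection this normal is the bisector of the
    mirror-to-eye and mirror-to-target unit vectors."""
    to_eye = viewpoint - mirror_pos
    to_eye = to_eye / np.linalg.norm(to_eye)
    to_target = rear_target - mirror_pos
    to_target = to_target / np.linalg.norm(to_target)
    n = to_eye + to_target
    return n / np.linalg.norm(n)
```

The mirror actuator would then rotate the mirror until its measured normal matches the returned vector.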

[0028] The viewpoint position measuring method of the present invention is particularly suitable for a head-up display device, because a head-up display device requires an accurate viewpoint position. In view of this, the present invention provides a head-up display device comprising the first, second, or third viewpoint position measuring device and an image display device that projects an image onto a screen, wherein the image display device determines the display position of the image based on the viewpoint position obtained by the viewpoint position measuring device. The present invention further provides a head-up display device in which a reference point for viewpoint position measurement is displayed on the screen at startup.

[0029] As described above, the HUD device of the present invention can be used as the display device of an in-vehicle navigation system, and also as the display device of various other systems such as games and simulators.

[0030] Further, the present invention provides a storage medium storing a program that uses the viewpoint position measuring method of the present invention.

[0031]

[Embodiments]

<Embodiment 1> An embodiment in which the first viewpoint position measuring method of the present invention is applied to a HUD device of a navigation system will be described below.

[0032] In a navigation system, it is important to guide the car to its destination, so the user is informed of the direction of the road to be taken by arrows or the like. If a HUD device is used as the display device here, the virtual image of the route guidance arrow is displayed superimposed on the real image of the actual road, and the driver can grasp the guidance information intuitively. For example, in the case shown in FIG. 1A, the left-turn guidance arrow 3 is displayed so as to overlap the real image of the actual road.

[0033] This is realized as follows: the HUD device calculates, from the relationship between the current position of the vehicle and the viewpoint position, the position of the real image of the road on the windshield (the position at which the road is seen through the windshield), and displays (projects) the virtual image of the arrow 3 at that position. The position of the vehicle is obtained by mathematically processing the signals output from the distance sensor, gyro, and GPS (Global Positioning System) receiver mounted on the navigation system. Further, the HUD device of this embodiment includes the first viewpoint position measuring device of the present invention, with which the viewpoint position can be obtained easily and inexpensively.
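The projection step described above amounts to intersecting the sight line from the viewpoint to a road point with the windshield. A minimal sketch, approximating the windshield as a flat plane (the plane parameters and all names are illustrative assumptions, not given in the patent):

```python
import numpy as np

def windshield_display_point(viewpoint, road_point, plane_point, plane_normal):
    """Point on the windshield plane at which a virtual image must be
    drawn so that, seen from viewpoint, it overlaps road_point."""
    d = road_point - viewpoint                      # sight-line direction
    t = np.dot(plane_point - viewpoint, plane_normal) / np.dot(d, plane_normal)
    return viewpoint + t * d                        # ray/plane intersection
```

The guidance arrow would be drawn at the returned physical point after converting it back to the projector's screen coordinates.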

[0034] As shown in FIG. 1B, this viewpoint position measuring device displays virtual images 5 and 6 of the cursors on the windshield and calculates the position of the viewpoint from which the real images 1 and 2 of the targets appear superimposed on the cursor virtual images 5 and 6. The viewpoint position measuring device includes a cursor control switch 4 and moves the cursors in accordance with cursor movement instructions input via the switch 4. The user (a viewer of the navigation information output by the HUD device) can therefore use the cursor control switch 4 to move the cursors 5 and 6 until they are superimposed on the predetermined targets 1 and 2 attached to the vehicle ahead of the windshield. The cursors are cross-shaped in this embodiment, but may have another shape, such as a circle.

[0035] The viewpoint position measuring device of this embodiment obtains the viewpoint position by calculation from the display position information of the cursors 5 and 6 and the physical position information of the targets 1 and 2. With the method of this embodiment, the viewpoint position can be measured at low cost simply by incorporating into the HUD device a function for displaying a cursor and means for moving its display position. A HUD device that projects images output from a navigation system or the like, such as guidance arrows and caution symbols, so that they accurately overlap the real image in front of the vehicle can therefore be realized at low cost. The viewpoint position measuring device and HUD device of this embodiment can thus provide information without diverting the eyes of the viewer, in many cases the driver, and thereby contribute to improved safety.

[0036] Next, the operation principle of the HUD device will be described with reference to a schematic view of the vehicle shown in the drawing. The projector 21, comprising a CRT (cathode ray tube) or an LCD (liquid crystal display), is a device that projects the image to be displayed onto the windshield 20. The generated image is reflected by the mirror 22 and projected onto the windshield 20. The windshield 20 acts as a half mirror, reflecting light from inside the vehicle and transmitting light from outside. The image reflected by the mirror 22 is reflected again by the windshield 20, so the viewer sees the image generated by the projector 21 as though displayed on the windshield 20. Since light from outside is transmitted, the scene outside the vehicle is also visible through the windshield. The user therefore sees the real image of the actual road scene and the virtual image, such as a guidance arrow, projected by the HUD device superimposed on one another.

[0037] Next, the principle of measuring the viewpoint position will be described with reference to the drawings; the drawing referred to here is a schematic view of the car seen from above.

[0038] The viewpoint position measuring device of this embodiment is started when the seat position of the driver's seat is moved and when the engine is started. The cursors 5 and 6 serving as reference points are then displayed on the HUD screen, and the user is prompted to superimpose the cursors 5 and 6 on the targets 1 and 2. The viewpoint position measuring device may also be activated when the angle of the room mirror and/or a side mirror changes. In this embodiment the left and right vehicle width confirmation markers 25 and 26 are used as targets, but a mark provided in advance on the bonnet surface, for example, may be used instead. Computationally it is desirable to have two or more targets, and the farther apart they are, the higher the measurement accuracy; when the present invention is applied to an automobile having vehicle width confirmation markers at the front left and right of the hood or fenders, it is therefore desirable to use those markers.

[0039] The principle of obtaining the viewpoint position from the cursor display position information and the target physical position information is as follows. Here, the front-rear direction of the vehicle is taken as the x-axis, the lateral direction as the y-axis, and the vertical direction as the z-axis.

[0040] First, the physical coordinates of the display position of the cursor on the windshield 20 are obtained from the relationship between the cursor display position coordinates and the position coordinates of the windshield 20. Let the coordinates of the left vehicle width confirmation marker 25 be (X1, Y1, Z1), and let the physical coordinates of the left cursor 5 on the windshield 20 (when the cursor and the target are correctly superimposed, these coincide with the coordinates of the real image 23 of the marker 25 on the windshield 20) be (X2, Y2, Z2). Then the straight line 31 on the xy plane connecting these two coordinate points is expressed by the following equation (Equation 1).

[0041]

(Equation 1)    (y - Y1) / (x - X1) = (Y2 - Y1) / (X2 - X1)

[0042] Similarly, let the coordinates of the right vehicle width confirmation marker 26 be (X3, Y3, Z3), and let the physical coordinates of the right cursor 6 on the windshield 20 (when the cursor and the target are correctly superimposed, these coincide with the coordinates of the real image 24 of the marker 26 on the windshield 20) be (X4, Y4, Z4). Then the straight line 32 on the xy plane connecting these two coordinate points is expressed by the following equation (Equation 2).

[0043]

(Equation 2)    (y - Y3) / (x - X3) = (Y4 - Y3) / (X4 - X3)

[0044] The xy coordinates of the viewpoint position 27 are the xy coordinates of the point at which the straight line 31 expressed by (Equation 1) intersects the straight line 32 expressed by (Equation 2).

[0045] The case of measuring with two pieces of cursor position information has been described here, but the viewpoint position coordinates (X, Y) can also be obtained from a single piece of cursor position information. In that case, for example, the coordinates of the point at which the straight line expressed by (Equation 1) or (Equation 2) intersects the center plane 29 of the seat in which the user sits are taken as the xy coordinates of the viewpoint position.
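The intersection of the two sight lines can be sketched in a few lines of code. This implements (Equation 1) and (Equation 2) directly, with each point given as an (x, y) pair; the function name is illustrative, and the degenerate case of parallel or vertical sight lines is not handled in this sketch.

```python
def viewpoint_xy(marker_l, cursor_l, marker_r, cursor_r):
    """xy coordinates of the viewpoint: the intersection of line 31
    (left marker -> left cursor) and line 32 (right marker -> right
    cursor) in the xy plane."""
    (x1, y1), (x2, y2) = marker_l, cursor_l
    (x3, y3), (x4, y4) = marker_r, cursor_r
    m1 = (y2 - y1) / (x2 - x1)        # slope of line 31 (Equation 1)
    m2 = (y4 - y3) / (x4 - x3)        # slope of line 32 (Equation 2)
    # Solve y1 + m1*(x - x1) = y3 + m2*(x - x3) for x.
    x = (y3 - y1 + m1 * x1 - m2 * x3) / (m1 - m2)
    y = y1 + m1 * (x - x1)
    return x, y
```

For example, with markers 3 m ahead of the eye at y = ±0.9 and cursors on the windshield 1 m ahead at y = ±0.3, both sight lines pass through the origin, which is returned as the viewpoint.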

[0046] The z coordinate value of the viewpoint position, on the other hand, is calculated by the following method. Here the z coordinate is obtained using the left marker 25 of the left and right vehicle width confirmation markers 25 and 26, but it can be obtained in the same way using the right marker 26.

[0047] Let the position of the left vehicle width confirmation marker 25 be (X1, Y1, Z1), and let the physical coordinates of the left cursor 5 on the windshield 20 (when the cursor and the target are correctly superimposed, the same as the coordinates of the real image 23 of the marker 25 on the windshield 20) be (X2, Y2, Z2). Then the straight line on the xz plane connecting these two coordinate points is expressed by the following equation (Equation 3). Substituting into (Equation 3) the x coordinate value calculated from (Equation 1) and (Equation 2) gives the z coordinate value of the viewpoint position 27.

[0048]

(Equation 3)    (z - Z1) / (x - X1) = (Z2 - Z1) / (X2 - X1)

[0049] The accuracy of the z coordinate value can be further improved by calculating a z coordinate from the position information of each of the left and right vehicle width confirmation markers 25 and 26 and taking the average of the two values as the z coordinate of the viewpoint position.
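The z computation and the left/right averaging described above can be sketched as follows (points are (x, z) pairs; the function names are illustrative):

```python
def viewpoint_z(marker_xz, cursor_xz, x_view):
    """Equation 3: z on the marker -> cursor sight line in the xz
    plane, evaluated at the x coordinate of the viewpoint found
    from Equations 1 and 2."""
    (x1, z1), (x2, z2) = marker_xz, cursor_xz
    return z1 + (z2 - z1) / (x2 - x1) * (x_view - x1)

def viewpoint_z_averaged(left_marker, left_cursor,
                         right_marker, right_cursor, x_view):
    """Average of the z values obtained from the left and right
    markers, as suggested above for improved accuracy."""
    z_left = viewpoint_z(left_marker, left_cursor, x_view)
    z_right = viewpoint_z(right_marker, right_cursor, x_view)
    return (z_left + z_right) / 2.0
```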

[0050] Using the viewpoint position information (X, Y, Z) obtained in this way, the HUD device of this embodiment can display a virtual image, such as a guidance arrow, superimposed on the corresponding real image. Although the case of two targets has been described here, the viewpoint position 27 can be calculated by the same method when there are three or more targets.

[0051] Next, the system configuration of the mobile navigation system of this embodiment will be described.

[0052] As shown in FIG. 3, the mobile navigation system of this embodiment comprises an arithmetic processing unit 101 and, connected to it, a display 102 (via signal S1), a HUD output device 103 (via signal S2), a map storage device 104 (via signal S3), an audio input/output device 105 (via signal S4), an input/output device 106 (via signal S5), a wheel speed sensor 107 (via signal S6), a geomagnetic sensor 108 (via signal S7), a gyro 109 (via signal S8), a GPS (Global Positioning System) receiver 120 (via signal S9), a traffic information receiving device 121 (via signal S10), and an in-vehicle LAN (Local Area Network) device 122 (via signal S11). The signal lines carrying signals S1 to S11 may be wired or wireless as long as they can transmit the signals; in this embodiment, wired lines are used.

[0053] The arithmetic processing unit 101 is the central unit of the system: it detects the current position based on the information output from the various sensors 107 to 109 and 120, reads the necessary map information from the map storage device 104 based on the obtained current position information, expands the map data into graphics, superimposes on them the optimal road connecting the destination designated by the user with the current location, and informs the user using voice or graphic displays.

[0054] The display 102 is a unit, such as a CRT or a liquid crystal display, that displays the graphics information generated by the arithmetic processing unit 101. As in a typical system, the arithmetic processing unit 101 and the display 102 are connected by an RGB (Red Green Blue) signal or an NTSC (National Television System Committee) signal S1.

The HUD output device 103 is a unit that displays the graphics information generated by the arithmetic processing unit 101 on the front windshield 20 by means of the projector 21. Like S1 described above, the arithmetic processing unit 101 and the HUD output device 103 are connected by an RGB signal or an NTSC signal S2.

The map storage device 104 is composed of a large-capacity storage medium such as a CD-ROM (Compact Disk - Read Only Memory) or an IC (Integrated Circuit) card. In response to requests from the arithmetic processing unit 101, it reads out data held on the large-capacity storage medium and notifies the arithmetic processing unit 101 of the data, and it performs write processing for storing data notified from the arithmetic processing unit 101 onto the large-capacity storage medium.

The voice input/output device 105 converts messages to the user generated by the arithmetic processing unit 101 into voice and outputs them, and it also accepts voice input, recognizes its content, and transfers the result to the arithmetic processing unit 101.

As sensors used to detect the position in mobile navigation, the navigation system of the present embodiment is provided with: a wheel speed sensor 107, which measures the traveled distance from the product of the circumference of a wheel and the measured number of wheel rotations, and further measures the angle through which the moving body has turned from the difference in the numbers of rotations of paired wheels; a geomagnetic sensor 108, which detects the magnetic field of the earth and measures the direction in which the moving body is heading; a gyro 109, composed of an optical fiber gyro, a vibrating gyro, or the like, which measures the angle through which the moving body has rotated; and a GPS receiver 120, which receives signals from GPS satellites and measures the current position, traveling direction, and traveling speed of the moving body by determining the distance to each of three or more satellites and its rate of change. In the present invention, the sensors are not limited to these. For example, in a navigation system mounted on a ship, the speed may be detected using a Doppler sonar.
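For illustration, the distance and turning-angle measurement of the wheel speed sensor 107 described above can be sketched as follows (a minimal two-wheel model; the function name and all numeric values are assumptions, not part of the embodiment):

```python
def odometry_step(wheel_circumference, left_rotations, right_rotations, track_width):
    """Distance and heading change from wheel rotation counts.

    Distance is the product of the wheel circumference and the mean
    rotation count; the turning angle follows from the difference between
    the two wheels' travel divided by the track width (the distance
    between the paired wheels).
    """
    left_dist = wheel_circumference * left_rotations
    right_dist = wheel_circumference * right_rotations
    distance = (left_dist + right_dist) / 2.0
    delta_angle = (right_dist - left_dist) / track_width  # radians
    return distance, delta_angle

# Example: wheels of 2.0 m circumference; the right wheel turns slightly more.
d, a = odometry_step(2.0, 10.0, 10.5, 1.5)
```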

The traffic information receiving device 121 is a means for receiving signals from beacon transmitters, which emit traffic information such as congestion, construction and road-closure information, and parking lot information, and from FM (Frequency Modulation) broadcasting. The in-vehicle LAN device 122 is a means for receiving various information about the vehicle, for example, the open/closed state of the doors, the types and states of the lit lamps, the seat positions, the set angles of the room mirror and the side mirrors, the state of the engine, and the results of failure diagnosis.

The input/output device 106 is a unit for receiving instruction input from outside. In addition to keys provided in a normal navigation system, such as a scroll key and a scale change key, it has a HUD display key, a viewpoint position setting key, and a cursor control switch. Although the switches are provided as hard switches in this embodiment, the configuration of the input/output device 106 is not limited to this; other input means such as a touch panel, a joystick, a keyboard, a mouse, or a pen input device may be used.

As shown in FIG. 5, the input/output device 106 of the navigation system of this embodiment includes a navigation input device 106a and a HUD/side mirror input device 106b. The navigation input device 106a includes a scroll key 51, a scale change key 52, and the like, and forms a navigation console 164 together with the display 102. The HUD/side mirror input device 106b is an input unit for obtaining the information necessary for HUD output, and includes a side mirror selection switch 162, a cursor selection switch 163, and a direction input switch 161.

The side mirror selection switch 162 is a switch for receiving the selection of the side mirror whose angle is to be adjusted. The cursor selection switch 163 is a switch for receiving the selection of the cursor to be moved, out of the cursors 5 and 6 displayed by the HUD output device 103. The direction input switch 161 is a switch for receiving the direction and amount of change by which the selected side mirror is to be rotated or the selected cursor is to be moved. In this embodiment, since the single switch 161 is used both for side mirror adjustment and for cursor position setting, advantages such as a reduction in the installation area of the input device 106b and simplification of the user interface are obtained.

For example, when the right side of the side mirror selection switch 162 is pressed, the right side mirror is selected, and when the direction input switch 161 is then pressed, the angle of the right side mirror rotates up, down, left, or right accordingly. Similarly, pressing the left side of the cursor selection switch 163 selects the left cursor 5, and further pressing the direction input switch 161 moves the display position of the left cursor 5 up, down, left, or right.

Next, FIG. 4 shows the hardware configuration of the arithmetic processing unit 101 described above. The arithmetic processing unit 101 comprises: a CPU (Central Processing Unit) 131, which executes numerical operations and controls each of the devices 132 to 141; a RAM (Random Access Memory) 132 for holding maps and operation data; a ROM (Read Only Memory) 133 for holding programs; a DMA (Direct Memory Access) controller 134, which executes data transfer between memories and between memory and each device at high speed; a drawing controller 135, which executes graphics drawing, such as developing vector data into an image, at high speed and controls the display; a VRAM (Video Random Access Memory) 136, which stores graphics image data; a color palette 137 for converting image data into RGB signals; an A/D (Analog/Digital) converter 138 for converting analog signals into digital signals; an SCI (Serial Communication Interface) 139 for converting serial signals into parallel signals synchronized with the bus; a PIO (Parallel Input Output) device 140, which synchronizes parallel signals and puts them on the bus; and a counter 141 for integrating pulse signals. These devices 131 to 141 are interconnected by a bus. In addition, power is constantly supplied to the RAM 132 so that the information held in it is not lost.

Next, the functional configuration of the arithmetic processing unit 101 will be described with reference to FIG. The arithmetic processing unit 101 comprises user operation analyzing means 181, route calculating means 182, route guiding means 184, map/menu display means 185, current position calculating means 186, map match processing means 187, data reading processing means 188, viewpoint position measuring means 189, HUD display means 190, and graphics processing means 191. In the present embodiment, each of these means 181 to 191 is realized by the CPU 131 executing instructions stored in the ROM 133. However, the present invention is not limited to this, and each means may be realized by hardware such as a dedicated circuit.

The current position calculating means 186 integrates, on the time axis, the distance data obtained by integrating the distance pulse data S6 measured by the wheel speed sensor 107 and the angle data obtained by integrating the angular acceleration data S8 measured by the gyro 109, and thereby performs a calculation for obtaining the position (X', Y') of the moving body after travel from its initial position (X, Y). Here, in order to match the rotation angle of the moving body with its traveling azimuth, the current position calculating means 186 maps the azimuth data S7 obtained from the geomagnetic sensor 108 one-to-one against the angle data obtained by integrating the output of the gyro 109, and thereby corrects the absolute azimuth of the traveling direction of the moving body.
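The dead-reckoning computation performed by the current position calculating means 186 can be sketched as follows (an illustration only; the function name is hypothetical, and the integration of the sensor streams S6 and S8 is reduced to a list of precomputed per-step values):

```python
import math

def dead_reckon(x, y, heading, steps):
    """Advance (x, y) by integrating per-step distance and angular change.

    `steps` is a sequence of (distance, delta_heading) pairs, as would be
    derived from the wheel speed pulses (S6) and the gyro output (S8).
    """
    for distance, delta_heading in steps:
        heading += delta_heading
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    return x, y, heading

# Travel 10 m straight ahead, then 10 m after a 90-degree left turn.
x1, y1, h1 = dead_reckon(0.0, 0.0, 0.0, [(10.0, 0.0), (10.0, math.pi / 2)])
```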

When the data obtained from the sensors is integrated as described above, the errors of the sensors accumulate. Therefore, at a predetermined time period (every second in this embodiment), the current position calculating means 186 performs a correction process that cancels the accumulated error based on the position data S9 obtained from the GPS receiver 120, and outputs the corrected data as current position information.

The current position information obtained in this way still contains small errors due to the sensors. Therefore, in order to further improve the position accuracy, in the navigation system of the present embodiment the map match processing means 187 performs map match processing. This is processing that compares the road data contained in the map around the current position, read by the data reading processing means 188, with the travel trajectory obtained from the current position calculating means 186, and adjusts the current position onto the road whose shape correlation is highest. By this map match processing the current position comes to coincide with the traveling road in most cases, so that the current position information can be output with high accuracy.
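The shape-correlation comparison performed by the map match processing means 187 can be illustrated with a simplified sketch (the correlation measure here is a plain mean squared point distance, and the names and coordinates are hypothetical; a real implementation would resample and align the polylines):

```python
def match_to_road(trajectory, candidate_roads):
    """Pick the candidate road polyline whose shape best matches the
    recent travel trajectory (smallest mean squared point distance).

    Both arguments hold (x, y) points; `candidate_roads` is a list of
    polylines read from the map around the current position.
    """
    def mean_sq_dist(a, b):
        n = min(len(a), len(b))
        return sum((a[i][0] - b[i][0]) ** 2 + (a[i][1] - b[i][1]) ** 2
                   for i in range(n)) / n

    return min(candidate_roads, key=lambda road: mean_sq_dist(trajectory, road))

traj = [(0, 0), (1, 0.1), (2, 0.0)]
roads = [[(0, 5), (1, 5), (2, 5)],   # a parallel road 5 m away
         [(0, 0), (1, 0), (2, 0)]]   # the road actually being traveled
best = match_to_road(traj, roads)
```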

On the other hand, the user operation analyzing means 181 analyzes the content of an operation instruction received via the input device 106, and controls the units 182 to 190 so that the corresponding processing is executed. For example, when route guidance to a destination is requested, it requests the map/menu display means 185 to display a map for setting the destination, requests the route calculating means 182 to calculate a route from the current location to the destination, and requests the route guiding means 184 to provide route guidance information to the user.

The route calculating means 182 searches the map data, using the Dijkstra method or the like, for the sequence of nodes connecting two designated points (for example, the current location and the destination), determines the route with the highest priority, and stores it in the route storage means 183. The route calculating means 182 supports a plurality of priorities and uses the priority specified by the operation instruction to determine the route. In the present embodiment, the route with the shortest distance between the two points, the route reachable in the shortest time, the route with the lowest cost, and so on can be determined according to the operation instruction.
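For illustration, the priority-based search of the route calculating means 182 can be sketched with a plain Dijkstra implementation (the node names and edge costs below are hypothetical; the edge cost could be distance, time, or toll, depending on the selected priority):

```python
import heapq

def dijkstra(graph, start, goal):
    """Least-cost path by Dijkstra's method.

    `graph` maps a node to a list of (neighbor, cost) pairs. Returns the
    total cost and the node sequence of the best route found.
    """
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

g = {"cur": [("a", 1), ("b", 4)], "a": [("b", 1)], "b": [("dest", 1)]}
cost, path = dijkstra(g, "cur", "dest")
```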

The route guiding means 184 compares the link information of the guidance route obtained by the route calculating means 182 with the current position information obtained by the current position calculating means 186 and the map match processing means 187, and outputs direction instruction information telling the user, before passing through an intersection or the like, whether to go straight or to turn left or right. That is, the route guiding means 184 outputs voice through the voice input/output device 105, draws signs indicating the direction to proceed on the map displayed on the screen of the display 102, or, using the HUD display means 190, projects a guidance arrow at the position that, as seen from the user, overlaps the real image of the road to be traveled.

The data reading processing means 188 reads the map data of the requested area from the map storage device 104.

The map/menu display means 185 receives from the data reading processing means 188 the map data around the point designated for display, and transfers to the graphics processing means 191 commands for drawing the designated objects at the designated scale with the designated drawing method. Further, in response to commands output from the user operation analyzing means 181, the map/menu display means 185 transfers to the graphics processing means 191 commands for drawing the various menus and marks that are to be displayed overlaid on the map.

The graphics processing means 191 receives the drawing commands generated by the map/menu display means 185 and develops the image in the VRAM 136.

Next, the viewpoint position measuring means 189 will be described in detail. As shown in FIG. 13, the viewpoint position measuring means 189 of this embodiment comprises starting means 71, reference point display means 72 for displaying the cursors, viewpoint position calculating means 73 for obtaining the viewpoint position from the cursor display positions, viewpoint position storage means 74 for holding the viewpoint position information, and a coordinate conversion table 201 that stores, for each coordinate of the projector 21, the coordinates of the cursor on the windshield 20. The starting means 71 is a means that activates the reference point display means 72 when the power supply system of the automobile is turned on, when the cursor selection switch 163 is operated, and when the side mirror, the room mirror, or the seat position is adjusted. Each of the means 71 to 73 is realized by the CPU 131 executing instructions held in the ROM 133, but may instead be realized by hardware such as a dedicated circuit. The viewpoint position storage means 74 and the coordinate conversion table 201 are storage areas provided in the RAM 132. The viewpoint position measuring means 189 of the present embodiment, together with the HUD/side mirror input device 106b and the in-vehicle LAN device 122, constitutes one aspect of the viewpoint position measuring device of the present invention.

As shown in FIG. 7, when control is transferred to the viewpoint position measuring means 189, the starting means 71 first determines whether this routine was started because the power supply system of the automobile was turned on (that is, at power-on) (step 1001). If it is determined that this is power-on, the user's viewpoint position information calculated previously is read from the storage medium (step 1006), and the process proceeds to step 1010. As a result, the cursors for inputting the viewpoint position are displayed each time the engine of the vehicle is started, so that the user can always be prompted to set the correct viewpoint position. Further, by confirming that the cursors have been displayed, the user can recognize that the HUD device, particularly its display system, is operating normally.

If it is determined in step 1001 that this is not power-on, the starting means 71 determines whether the cursor selection switch 163 of the input device 106 has been operated (step 1002). If it is determined that the switch 163 has been operated, the user's viewpoint position information calculated previously is read from the storage medium (step 1006), and the process proceeds to step 1010. That is, operation of the cursor selection switch 163 is treated as a request by the user to reset the viewpoint position, and the cursors are displayed on the HUD screen at that time.

If it is determined in step 1002 that the cursor selection switch 163 has not been operated, the starting means 71 next determines whether the room mirror has been adjusted (that is, whether the room mirror angle has been changed) (step 1003). If it is determined that the room mirror has been adjusted, the viewpoint position is calculated from the adjustment angle of the room mirror (step 1007), and the process proceeds to step 1010. In this embodiment, whether the room mirror has been adjusted and its angle information are obtained from the in-vehicle LAN 122 connected to the room mirror.

If it is determined in step 1003 that the room mirror has not been adjusted, the starting means 71 next determines whether a side mirror has been adjusted (that is, whether the angle of a side mirror has been changed) (step 1004). If it is determined that a side mirror has been adjusted, the viewpoint position is calculated from the adjustment angle of the side mirror (step 1008), and the process proceeds to step 1010. In this embodiment, whether a side mirror has been adjusted and its angle information are obtained from the in-vehicle LAN 122 connected to the side mirrors, as in the case of the room mirror.

If it is determined in step 1004 that the side mirrors have not been adjusted, the starting means 71 next determines whether the seat position has been adjusted (that is, whether the seat position has been changed) (step 1005). If it is determined that the seat position has been adjusted, the viewpoint position is calculated from the seat position (step 1009), and the process proceeds to step 1010. In this embodiment, whether the seat position has been adjusted and the position information are obtained from the in-vehicle LAN 122 connected to the seat. The arithmetic processing in step 1009 obtains new viewpoint position coordinate values by adding the amount of movement of the seat to the previously measured viewpoint position coordinate values.

In step 1010, the starting means 71 activates the reference point display means 72. The reference point display means 72 calculates the positions where the cursors are to be displayed using the viewpoint position information obtained in steps 1006 to 1009 above, and displays the cursors at those positions. In this way, each cursor is displayed near the target with which it is to be aligned, so the user can complete the cursor setting with a minimum of operation. The purpose of the processing of steps 1007 to 1009 is to bring the virtual image position of each cursor close to the position of the corresponding target real image; if this is unnecessary, step 1006 may be executed instead of steps 1007 to 1009.

Next, the method of calculating the cursor position in step 1008 will be described with reference to FIG.

In this embodiment, input of the angles of the side mirrors 61 and 63 is accepted through the side mirror selection switch 162 and the direction input switch 161 of the input device 106. This information is transmitted via the in-vehicle LAN device 122 to a driving mechanism (not shown) of the side mirrors 61 and 63, and the angles of the side mirrors 61 and 63 are adjusted in the up/down and left/right directions according to the information. The adjusted angles are measured by up/down and left/right angle sensors (not shown) provided in the side mirrors 61 and 63, and transmitted to the arithmetic processing unit 101 via the in-vehicle LAN device 122.

Here, let the front-rear direction of the vehicle be the x-axis, the lateral direction the y-axis, and the vertical direction the z-axis; let the angles between the z-axis and the left and right side mirrors 63 and 61 be φ1 and φ2, respectively; and let the angle between the y-axis and the left side mirror 63 be θ1, and the angle between the y-axis and the right side mirror 61 be θ2. At this time, the straight line connecting the left side mirror 63 and the driver's viewpoint is expressed by the following equation (Equation 4), where (X1, Y1) is the position of the left side mirror 63.

(Equation 4)

    y = tan(2θ1)·(x - X1) + Y1
On the other hand, the straight line connecting the right side mirror 61 and the driver's viewpoint is expressed by the following equation (Equation 5), where (X2, Y2) is the position of the right side mirror 61.

(Equation 5)

    y = tan(2θ2)·(x - X2) + Y2

Thus, the (X, Y) coordinates of the viewpoint position are obtained as the coordinates of the position where the straight lines of (Equation 4) and (Equation 5) intersect.

Here, the case where the measurement is performed using both the left and right side mirrors 61 and 63 has been described, but the viewpoint position (X, Y) can also be obtained from the angle information of only one of the side mirrors. In this case, the (X, Y) coordinates of the viewpoint position are obtained as the coordinates of the point where the straight line expressed by (Equation 4) or (Equation 5) intersects the center plane 65 of the seat in which the driver sits.

On the other hand, the z coordinate value of the viewpoint position is calculated by the following method. Here, the z coordinate is obtained using the angle φ2 of the right side mirror 61, but it can be obtained in the same way using the angle φ1 of the left side mirror 63.

With the position of the right side mirror 61 denoted (X2, Y2, Z2), the straight line connecting the right side mirror 61 and the driver's viewpoint position is expressed by the following equation (Equation 6). By substituting into this (Equation 6) the x coordinate value calculated from (Equation 4) and (Equation 5) above, the z coordinate value of the viewpoint position can be obtained. Furthermore, if the average of the z coordinate values obtained using the angle information of both the left and right side mirrors 63 and 61 is used as the z coordinate value of the viewpoint position, the accuracy of the z coordinate value can be improved further.

(Equation 6)

    z = tan(2φ2)·(x - X2) + Z2
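A minimal sketch of this viewpoint computation follows, assuming ordinary reflection geometry in which the sight line leaves a mirror at twice the mirror's set angle (this assumption, the function name, and all numeric values are illustrative only, not taken from the patent):

```python
import math

def viewpoint_from_side_mirrors(left, right):
    """Viewpoint (X, Y, Z) from the two side mirrors.

    Each mirror is given as (x, y, z, theta, phi): its position, its
    horizontal angle to the y-axis, and its vertical angle to the z-axis.
    Each mirror defines a sight line of slope tan(2*theta) in the x-y
    plane; the viewpoint is the intersection of the two lines, and Z is
    the average of the two vertical-angle estimates.
    """
    x1, y1, z1, th1, ph1 = left
    x2, y2, z2, th2, ph2 = right
    m1, m2 = math.tan(2 * th1), math.tan(2 * th2)
    # Intersect y = y1 + m1*(x - x1) with y = y2 + m2*(x - x2)
    x = (y2 - y1 + m1 * x1 - m2 * x2) / (m1 - m2)
    y = y1 + m1 * (x - x1)
    z = ((z1 + math.tan(2 * ph1) * (x - x1)) +
         (z2 + math.tan(2 * ph2) * (x - x2))) / 2.0
    return x, y, z

# Mirrors at (0, -1, 1.0) and (0, 1, 1.0); angles chosen so the
# sight lines meet at the viewpoint (2, 0, 1.2).
left = (0.0, -1.0, 1.0, math.atan(0.5) / 2, math.atan(0.1) / 2)
right = (0.0, 1.0, 1.0, math.atan(-0.5) / 2, math.atan(0.1) / 2)
vx, vy, vz = viewpoint_from_side_mirrors(left, right)
```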

Based on the coordinate values (X, Y, Z) of the viewpoint position obtained as described above, the position on the windshield where the real image of the target can be seen is obtained, and this position is used as the display position of the cursor. This allows the cursor to be displayed very close to the target real image from the beginning of the display.

Next, the method of calculating the cursor position in step 1007 will be described with reference to FIG.

In this embodiment, the angle of the room mirror 41 is adjusted manually by the driver. The adjusted angle of the room mirror 41 is measured by up/down and left/right angle sensors (not shown) provided in the room mirror 41, and transmitted to the arithmetic processing unit 101 via the in-vehicle LAN device 122.

Here, as before, the front-rear direction of the vehicle is the x-axis, the lateral direction the y-axis, and the vertical direction the z-axis; the angle between the z-axis and the room mirror 41 is φ3, and the angle between the y-axis and the room mirror 41 is θ3. At this time, the straight line in the x-y plane connecting the room mirror and the driver's viewpoint position is expressed by the following equation (Equation 7), where (X3, Y3) is the position of the room mirror. The coordinates of the point where the straight line represented by (Equation 7) intersects the center plane 43 of the seat in which the driver sits are then obtained as the (X, Y) coordinates of the viewpoint position.

(Equation 7)

    y = tan(2θ3)·(x - X3) + Y3

On the other hand, with the position of the room mirror 41 denoted (X3, Y3, Z3), the straight line in the x-z plane connecting the room mirror 41 and the driver's viewpoint is expressed by the following equation (Equation 8). By substituting into this (Equation 8) the x coordinate value obtained using (Equation 7), the z coordinate value of the viewpoint position is obtained.

(Equation 8)

    z = tan(2φ3)·(x - X3) + Z3
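The single-mirror case can be sketched similarly, again under the illustrative reflection-geometry assumption (twice the set angle) and with hypothetical names and values; since one mirror gives only one sight line, the Y coordinate is fixed by intersecting the line with the seat's center plane:

```python
import math

def viewpoint_from_room_mirror(mirror, seat_center_y):
    """Viewpoint (X, Y, Z) from a single mirror (e.g. the room mirror).

    `mirror` is (x, y, z, theta, phi): position, horizontal angle to the
    y-axis, vertical angle to the z-axis. The driver is assumed to sit on
    the seat center plane y = seat_center_y.
    """
    x3, y3, z3, th3, ph3 = mirror
    m = math.tan(2 * th3)
    # Intersect y = y3 + m*(x - x3) with the plane y = seat_center_y
    x = x3 + (seat_center_y - y3) / m
    z = z3 + math.tan(2 * ph3) * (x - x3)
    return x, seat_center_y, z

# Mirror at (0, 0.5, 1.3), angles chosen so the line reaches (2, 0, 1.2).
m41 = (0.0, 0.5, 1.3, math.atan(-0.25) / 2, math.atan(-0.05) / 2)
px, py, pz = viewpoint_from_room_mirror(m41, 0.0)
```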

Based on the coordinate values (X, Y, Z) of the viewpoint position obtained as described above, the position on the windshield where the real image of the target can be seen is obtained, and this position is used as the display position of the cursor. This allows the cursor to be displayed very close to the target real image from the beginning of the display.

Next, the processing procedure of the reference point display means 72 in step 1010 will be specifically described with reference to the flowchart of FIG.

The reference point display means 72 first causes the HUD display means 190 to display the cursors 5 and 6 using the viewpoint position set in steps 1006 to 1009 above (step 1000). It then determines whether either of the cursor selection switches 163 of the input device 106 has been pressed (that is, whether a cursor to operate has been selected) (step 1021). If no cursor has been selected, the reference point display means 72 causes one of the cursors to blink via the HUD display means 190 (step 1027), further changes the display color of that cursor (step 1028), and provides guidance by voice via the voice input/output device 105 (step 1029).

This urges the viewer to perform an operation of moving the blinking cursor 5 or 6 so that it overlaps the target 1 or 2. Through these processes, the viewer can easily recognize where the cursor to be operated is displayed. Any one of the blinking of the cursor (step 1027), the change of the display color (step 1028), and the voice output (step 1029) may be omitted. In step 1029, a message such as "Please move the right cursor onto the right fender marker" may be output.

After thus prompting a cursor movement instruction, the reference point display means 72 waits until an instruction to end the setting is input (step 1030), and then returns the display of that cursor to its original state. Then, for the other cursor, as in steps 1027 to 1029, it blinks the display (step 1031), changes the display color (step 1032), and provides guidance by voice output (step 1033), and the process proceeds to step 1024.

In this embodiment, when it is detected that the central portion 165 (shown in FIG. 5) of the direction input switch 161 has been pressed, the end of setting is regarded as having been instructed. However, a switch dedicated to the end-of-setting instruction may be provided separately, and the end of setting may be regarded as instructed when an input from a switch other than the direction input switch 161 is detected. Alternatively, regardless of the presence or absence of a setting operation, step 1030 may simply wait for a predetermined time.

When it is determined in step 1021 that the viewer has operated the cursor selection switch 163, the reference point display means 72 causes the cursor selected via the cursor selection switch 163 to blink via the HUD display means 190 (step 1022), further changes the display color of that cursor (step 1023), and the process proceeds to step 1024. Either the blinking of the cursor (step 1022) or the change of the display color (step 1023) may be omitted. Through these processes, the viewer can easily recognize where the cursor being operated is displayed.

In step 1024, the reference point display means 72 waits until the setting is completed, as in step 1030, and then erases the cursor display. This is because the cursors are displayed in order to measure the viewpoint position of the viewer, and once the cursor positions have been determined there is no need to keep displaying them. If the cursors are erased automatically in this way, the viewer needs no special erasing operation, which improves usability. However, the cursors may instead be displayed at all times; in that way, the cursors can serve as a reference by which the driver checks his or her own posture. Alternatively, a key for erasing the cursors may be provided, and the cursors may remain displayed until erasure is designated by that key.

Next, the reference point display means 72 determines whether the cursor selection switch 163 has been operated (step 1025). If it has been operated, the process returns to step 1022, and the cursor position setting processing (steps 1022 to 1024) is repeated.

If it is determined in step 1025 that no operation has been performed, the reference point display means 72 has the viewpoint position calculating means 73 calculate the viewpoint position, stores the result in the viewpoint position storage means 74 (step 1026), and ends the processing.

The processing of the viewpoint position calculating means 73 in step 1026 is for obtaining the viewpoint position based on the cursors 5 and 6 using the above-mentioned equations (Equation 1) to (Equation 3). This calculation method has already been described with reference to FIG.

Since the shape of the windshield 20 is not flat but curved, it is complicated to obtain, from the set cursor positions, the coordinate values of the positions at which the cursors 5 and 6 are actually displayed. Therefore, in this embodiment, the relationship between the coordinate values (Xn, Yn) at which the center of a cursor is displayed in the display coordinate system of the projector 21 and the coordinate values (X, Y, Z) of the display position at which the cursor is actually projected on the windshield is stored in advance in the coordinate conversion table 201, thereby simplifying the calculation.

FIG. 10 shows an example of the coordinate conversion table 201. According to this example, when the display coordinates of the projector 21 are (X2, Y2), for instance, the coordinates of the display position on the windshield 20 are (X22, Y22, Z22). The viewpoint position calculating means 73 therefore calculates the viewpoint position using the coordinates (X22, Y22, Z22). According to this method, since the shape of the curved surface of the windshield is taken into account in advance, the position at which the cursor is projected on the windshield can be obtained accurately.
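The coordinate conversion table 201 can be pictured as a simple lookup from projector display coordinates to pre-measured windshield coordinates; in the sketch below, all coordinate values are hypothetical placeholders:

```python
# Hypothetical contents of the coordinate conversion table 201:
# projector display coordinates (Xn, Yn) -> measured windshield point (X, Y, Z).
coordinate_conversion_table = {
    (0, 0): (0.10, -0.40, 1.05),
    (0, 1): (0.10, -0.40, 1.10),
    (1, 0): (0.12, -0.30, 1.05),
    (1, 1): (0.12, -0.30, 1.11),
}

def windshield_coords(projector_xy):
    """Return the pre-measured windshield point for a projector coordinate.

    Because the table is measured in advance, the curved shape of the
    glass needs no run-time computation.
    """
    return coordinate_conversion_table[projector_xy]

pt = windshield_coords((1, 1))
```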

In step 1026, the obtained viewpoint position information is stored in the RAM 132, to which power is constantly supplied, so that it can be used for the next viewpoint position calculation and the like.

Next, the processing of the HUD display means 190 will be specifically described with reference to the flowchart of FIG.

The HUD display means 190 is activated when a cursor display request from the viewpoint position measuring means 189, a guidance arrow display request from the route guiding means 184, or another request to display an image on the HUD screen is generated. When activated, the HUD display means 190 first determines whether the display request is a cursor display request (step 1041). The cursor display request is generated in step 1000 (shown in FIG. 7) of the processing of the viewpoint position measuring means 189.

Here, if it is determined that the request is a cursor display request, the HUD display means 190 calculates the positions at which the cursors 5 and 6 are to be displayed on the projector 21 (step 1042), generates the images of the cursors 5 and 6, displays them on the display of the projector 21 (step 1043), and ends the processing.

The coordinate values at which the cursors 5 and 6 are projected on the windshield 20 are the coordinate values of the points where the straight lines connecting the coordinates of the viewpoint position set by the viewpoint position measuring means 189 (that is, the coordinate values held in the viewpoint position storage means 74) with the coordinates of the targets 1 and 2 intersect the surface of the windshield 20. The display positions of the cursors on the projector 21 are obtained by converting the coordinate values of these intersections into the display coordinate system of the projector 21 using the coordinate conversion table 201.
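The intersection computation can be sketched as follows, approximating the windshield locally as the plane x = const purely for illustration (in the embodiment the curved-surface correction is handled by the coordinate conversion table 201; all names and values are hypothetical):

```python
def cursor_point_on_windshield(viewpoint, target, windshield_x):
    """Point where the line from the viewpoint to the target crosses the
    windshield, approximated here as the plane x = windshield_x.
    """
    vx, vy, vz = viewpoint
    tx, ty, tz = target
    t = (windshield_x - vx) / (tx - vx)  # line parameter at the plane
    return (windshield_x, vy + t * (ty - vy), vz + t * (tz - vz))

# Viewpoint at (2, 0, 1.2), target marker at (10, 2, 0.8), glass at x = 3.
p = cursor_point_on_windshield((2.0, 0.0, 1.2), (10.0, 2.0, 0.8), 3.0)
```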

The displayed cursor may be a cross cursor (with the center of the cross as the target position), a point cursor, or a more complicated shape. If the cursor image is the same as the image of the target as seen by the user, the viewpoint position can be measured more accurately. Further, by determining the shape of the cursor using the coordinate conversion table 201 or the like so that it approximates the shape of the target image, a higher-quality display can be achieved.

When it is determined in step 1041 that the request is not a cursor display request, the HUD display means 190 determines whether the request is a cursor emphasis request (step 1044). A cursor emphasis request is generated in steps 1027 to 1028 or steps 1031 to 1032 of the processing of the viewpoint position measuring means 189. If it is determined that the request is a cursor emphasis request, the HUD display means 190 blinks or changes the color of the designated cursor image according to the designation (step 1045).

When it is determined in step 1044 that the request is not a cursor emphasis request, the HUD display means 190 determines whether the request is a cursor movement request (step 1046). This cursor movement request is input via the direction switch 161 of the input device 106. In the present embodiment, the moving direction is indicated by which of the direction instruction keys 166 (shown in FIG. 5) of the switch 161 is pressed, and the amount of movement is indicated by how long the key 166 is held down.
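A minimal sketch of this direction-plus-hold-time input scheme; the key names and the gain constant converting press time into movement distance are assumptions for illustration:

```python
# Hypothetical mapping of the direction instruction keys 166 to unit vectors,
# and an assumed gain converting key-press time into movement distance.
DIRECTIONS = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}
GAIN_PX_PER_SEC = 40.0

def move_cursor(pos, key, press_seconds):
    """Move the cursor: direction from the pressed key, amount from hold time."""
    dx, dy = DIRECTIONS[key]
    return (pos[0] + dx * GAIN_PX_PER_SEC * press_seconds,
            pos[1] + dy * GAIN_PX_PER_SEC * press_seconds)

print(move_cursor((100.0, 80.0), "right", 0.5))  # → (120.0, 80.0)
```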

If it is determined that the request is a cursor movement request, the HUD display means 190 obtains the coordinates of the cursor after the movement according to the designated movement amount (step 1047) and moves the image of the highlighted cursor to those coordinates (step 1048).

When it is determined in step 1046 that the request is not a cursor movement request, the HUD display means 190 determines whether the request is a cursor emphasis release request (step 1049). This cursor emphasis release request is generated in step 1030 in the processing of the viewpoint position measuring means 189. If it is determined that the request is a cursor emphasis release request, the HUD display means 190 restores the highlighted cursor to the original image (the image generated in step 1043) without changing its display position (step 1050).

If it is determined in step 1049 that the request is not a cursor emphasis release request, the HUD display means 190 determines whether it is a cursor non-display request (step 1051). This cursor non-display request is generated in step 1024 in the processing of the viewpoint position measuring means 189. If it is determined that the request is a cursor non-display request, the HUD display means 190 deletes all the displayed cursors (step 1052). The cursors are cleared by making their display color the same as the background color.

If it is determined in step 1051 that the request is not a cursor non-display request, the HUD display means 190 determines whether it is a HUD information display request (step 1053). Note that a HUD information display request is generated by the route guidance means 184 or the like. If it is determined that HUD information display is required, the HUD display means 190 determines from its internal setting whether to display the requested display information (such as a guidance arrow) simultaneously with the cursor. In the present embodiment, whether or not to display the cursor and the guidance arrow or the like at the same time is determined in advance as an internal setting; however, the cursor may instead be displayed according to a user's instruction.

If an image such as a guidance arrow and the cursor are displayed at the same time, the cursor serves as a reference for the correct viewpoint position. Therefore, even if the viewpoint position of the viewer at the time of display differs slightly from that at the time of measurement, the user can shift the viewpoint so that the cursors 5 and 6 overlap the targets 1 and 2, thereby correcting the HUD display.

If it is determined in step 1054 that the requested display information and the cursor are to be displayed simultaneously, the HUD display means 190 calculates the display position of the cursor by the same operation as in step 1042 (step 1055) and proceeds to step 1056. If it is determined in step 1054 that the cursor is not to be displayed at the same time, the HUD display means 190 immediately proceeds to step 1056.

In step 1056, the HUD display means 190 calculates the position at which the requested display information (such as a guidance arrow) is to be displayed on the projector 21. The coordinate value on the windshield 20 of the information to be displayed is the point at which a straight line connecting the position coordinates of the object (a road or the like) on which the display is to be superimposed and the viewpoint position coordinates of the viewer intersects the plane formed by the windshield 20. The HUD display means 190 calculates the display position of the display information on the projector 21 by converting the coordinates of this intersection into the display coordinate system of the projector 21 using the coordinate conversion table 201.

Next, the HUD display means 190 develops the image of the display information (a guidance arrow or the like) as graphics at the coordinate position obtained in step 1056 (step 1057). If it is determined in step 1054 that the cursor is to be displayed at the same time, the cursor image is developed together with the display information in step 1057 in the same manner as in step 1043.

If it is determined in step 1053 that the request is not a HUD information display request, the HUD display means 190 determines whether it is a HUD information non-display request (step 1058). The HUD information non-display request is generated by the route guidance means 184 or the like. If it is determined that the request is a HUD information non-display request, the HUD display means 190 deletes the image displayed in step 1057 (step 1059).

According to the present embodiment, the hardware can be realized simply by adding input means for moving a cursor to an existing system, instead of an expensive camera or the like, and an accurate viewpoint position can be measured. Further, even if the viewpoint position of the viewer shifts, displaying the cursor simultaneously with display information such as a guidance arrow provides a reference point for correcting the viewpoint position. Therefore, the user of the present invention can accurately recognize the HUD display by correcting his or her posture so that the displayed cursor overlaps the target.

<Embodiment 2> In the first embodiment, the cursor is displayed and the viewpoint position is obtained based on the position of the cursor. In this embodiment, instead of using the position coordinates of the cursor, the viewpoint position is obtained using the angle of the room mirror. The configuration of the navigation system according to the present embodiment is almost the same as that of the first embodiment, except for the following points.

First, the viewpoint position measuring means 189 of the present embodiment has a different configuration from that of the first embodiment. Second, the HUD display means 190 according to the present embodiment does not execute the processing related to the cursor (steps 1041 to 1052 and 1054 to 1055); the other processing is the same as in the first embodiment. Third, in this embodiment, the coordinate conversion table is provided not in the viewpoint position measuring means 189 but in the HUD display means 190.

Next, the viewpoint position measuring means 189 will be described. As shown in FIG. 14, the viewpoint position measuring means 189 of this embodiment does not include the reference point display means 72, the viewpoint position calculation means 73, or the coordinate conversion table 201, and comprises only the activation means 75 and the viewpoint position storage means 74.

FIG. 15 shows the flow of the processing of the activation means 75 of this embodiment. When control is transferred to the viewpoint position measuring means 189, the activation means 75 of this embodiment determines whether the room mirror has been adjusted (that is, whether the angle of the room mirror has been changed) (step 151). If no adjustment has been made, the processing ends. If it is determined in step 151 that the room mirror has been adjusted, the activation means 75 calculates the viewpoint position from the adjustment angle of the room mirror in the same manner as step 1007 in the first embodiment (step 152), stores the obtained viewpoint position information in the viewpoint position storage means 74 (step 153), and ends the processing. In this embodiment, whether the room mirror has been adjusted and its angle information are obtained, as in the first embodiment, from the in-vehicle LAN 122 connected to the room mirror.
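The geometry behind step 152 can be sketched in a top view. Assuming the driver has aimed the room mirror so that the reflected line of sight points straight rearward, the eye lies on the reflected ray leaving the mirror, intersected with the driver's seat center plane; the axes, mirror position, and angle below are illustrative assumptions:

```python
import math

def viewpoint_from_room_mirror(mirror_pos, mirror_yaw_deg, seat_center_y):
    """Top view, x forward and y left; angles in degrees.
    A ray arriving from straight behind travels in +x; reflecting it about a
    mirror whose normal is yawed by theta gives direction (-cos 2t, -sin 2t).
    The eye is where that reflected ray crosses the seat center plane."""
    th = math.radians(mirror_yaw_deg)
    r = (-math.cos(2 * th), -math.sin(2 * th))   # reflected ray direction
    t = (seat_center_y - mirror_pos[1]) / r[1]   # reach plane y = seat_center_y
    return (mirror_pos[0] + t * r[0], seat_center_y)

# Illustrative numbers: mirror near the windshield header on the centerline,
# driver's seat center plane 0.4 m to the right of it
eye = viewpoint_from_room_mirror((0.5, 0.0), 15.0, -0.4)
```

Eye height would be recovered analogously from the mirror's pitch angle; in the patent the mirror angle itself comes from the in-vehicle LAN 122.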

In this embodiment, as described above, the viewpoint position is obtained based on the adjustment angle of the room mirror, and the HUD display means 190 determines the display position of display information such as the guidance arrow using this viewpoint position. Therefore, according to the present embodiment, the input means for receiving a cursor movement instruction, which is required in the first embodiment, is unnecessary, and the viewpoint position can be measured at lower cost.

<Embodiment 3> In the navigation system of this embodiment, the viewpoint position is measured in the same manner as in the second embodiment, except that the angle of the side mirror is used instead of the angle of the room mirror. That is, the navigation system of the present embodiment has the same configuration as that of the second embodiment except that the processing content of the activation means 75 differs.

FIG. 16 shows the processing contents of the activation means 75 of this embodiment. When control is transferred to the viewpoint position measuring means 189, the activation means 75 of the present embodiment determines whether the side mirror has been adjusted (that is, whether the angle of the side mirror has been changed) (step 1601). If no adjustment has been made, the processing ends. If it is determined in step 1601 that the side mirror has been adjusted, the activation means 75 calculates the viewpoint position from the adjustment angle of the side mirror in the same manner as step 1008 in the first embodiment (step 1602), stores the obtained viewpoint position information in the viewpoint position storage means 74 (step 1603), and ends the processing. Whether the side mirror has been adjusted and its angle information are obtained, as in the first embodiment, from the in-vehicle LAN 122 connected to the side mirror.

In this embodiment, as in the second embodiment, no input means for receiving a cursor movement instruction is required, so the viewpoint position can be measured at a lower cost than in the first embodiment. Further, by using the angles of the two side mirrors, a more accurate viewpoint position can be obtained than in the second embodiment.
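Using the two side mirrors amounts to intersecting two sight rays in a top view, which is why it can localize the viewpoint more tightly than a single mirror. A minimal sketch; the mirror positions and ray directions here are illustrative, and in practice the directions would be derived from each mirror's measured angle via the reflection law:

```python
def intersect_sight_rays(a, u, b, v):
    """Intersect the 2D rays a + s*u and b + t*v (top view)."""
    det = u[0] * v[1] - u[1] * v[0]
    if abs(det) < 1e-12:
        raise ValueError("sight rays are parallel")
    s = ((b[0] - a[0]) * v[1] - (b[1] - a[1]) * v[0]) / det
    return (a[0] + s * u[0], a[1] + s * u[1])

# Illustrative mirror positions (x forward, y left) and eye-ward directions
left_mirror, right_mirror = (1.0, 0.9), (1.0, -0.9)
eye = intersect_sight_rays(left_mirror, (-1.0, -1.3),
                           right_mirror, (-1.0, 0.5))  # ≈ (0.0, -0.4)
```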

<Embodiment 4> The viewpoint position measuring method of the present invention is not limited to a navigation system but can also be applied to other systems. Here, an embodiment in which the invention is applied to a room mirror and side mirror adjustment device will be described.

As shown in FIG. 17, the mirror adjustment device of the present embodiment has substantially the same configuration as the navigation system of the first embodiment. However, instead of the mechanisms related only to navigation (specifically, the display 102, the map storage device 104, the various sensors 107 to 109, the GPS receiving device 120, and the traffic information receiving device 121), it includes a room mirror drive mechanism 171 that changes the angle of the room mirror and a side mirror drive mechanism 172 that changes the angle of the left and right side mirrors.

The arithmetic processing unit 170 of the present embodiment has the same hardware configuration as the arithmetic processing unit 101 of the first embodiment, but differs in functional configuration and processing content. Therefore, the functional configuration and processing contents of the arithmetic processing unit 170 will be described here.

As shown in FIG. 18, the arithmetic processing unit 170 of this embodiment includes viewpoint position measuring means 175, HUD display means 176, mirror angle determining means 177, a mirror angle table 178, and graphics processing means 191 identical to that of the first embodiment. The HUD display means 176 of this embodiment is the same as the HUD display means 190 of the first embodiment, except that the processing of steps 1053 to 1059 is not performed. The mirror angle table 178 is a storage area secured in the RAM 132, and holds in advance the desired angles of the left and right side mirrors and the room mirror for each coordinate of the viewpoint position. The means 175 to 177 and 191, other than the storage area 178, are realized by the CPU 131 executing the instructions held in the ROM 133, but may instead be realized by hardware such as dedicated circuits.

The viewpoint position measuring means 175 of the present embodiment performs processing in the same procedure as the viewpoint position measuring means 189 of the first embodiment. However, the viewpoint position measuring means 175 of this embodiment activates the mirror angle determining means 177 at the end of its processing and then terminates.

As shown in FIG. 19, the mirror angle determining means 177 first reads the viewpoint position from the viewpoint position storage means 74 (step 1901), searches the mirror angle table using the viewpoint position to obtain the optimum value of the room mirror angle (step 1902), notifies the room mirror drive mechanism 171 of this optimum value as a signal S173, and adjusts the angle of the room mirror to the optimum value (step 1903). Next, the mirror angle determining means 177 searches the mirror angle table using the viewpoint position read in step 1901 to obtain the optimum value of the side mirror angle (step 1904), notifies the side mirror drive mechanism 172 of this optimum value as a signal S174, adjusts the angle of the side mirror to the optimum value (step 1905), and ends the processing.
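One plausible realization of the table search in steps 1902 and 1904 is a nearest-neighbour lookup over the viewpoint coordinates; the table contents, coordinate units, and angle values below are hypothetical:

```python
# Hypothetical mirror angle table: viewpoint coordinates (m) ->
# (room mirror angle, side mirror angle) in degrees.
MIRROR_ANGLE_TABLE = {
    (0.00, -0.40, 1.15): (14.0, 22.0),
    (0.05, -0.40, 1.20): (15.0, 23.0),
    (0.10, -0.45, 1.25): (16.5, 24.5),
}

def lookup_angles(viewpoint):
    """Return the stored optimum angles for the table entry nearest the
    measured viewpoint (squared Euclidean distance)."""
    key = min(MIRROR_ANGLE_TABLE,
              key=lambda k: sum((a - b) ** 2 for a, b in zip(k, viewpoint)))
    return MIRROR_ANGLE_TABLE[key]

print(lookup_angles((0.04, -0.41, 1.19)))  # → (15.0, 23.0)
```

The returned angles correspond to the values sent to the drive mechanisms as the signals S173 and S174.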

As described above, the viewpoint position measuring method using a cursor, similar to that of the first embodiment, can also be applied to a mirror adjustment device. According to the present embodiment, the mirrors can be automatically adjusted to appropriate angles easily and at low cost, which not only improves usability for the driver but also contributes to driving safety. In this embodiment, both the side mirrors and the room mirror are adjusted by the mirror adjustment device, but only one of them may be adjusted. In addition, it is preferable that the angles can also be adjusted manually after the adjustment by the mirror adjustment device.

<Embodiment 5> This embodiment applies the viewpoint position measuring method using the room mirror angle of the second embodiment to a mirror adjustment device similar to that of the fourth embodiment. The mirror adjustment device of the present embodiment has substantially the same configuration as that of the fourth embodiment, but differs in the following points.

First, the mirror adjustment device of this embodiment does not include the HUD output device 103, the audio input/output device 105, the input device 106, the room mirror drive mechanism 171, or the coordinate conversion table 201. Second, the arithmetic processing unit 170 of the present embodiment does not include the HUD display means 176 or the graphics processing means 191. Third, the viewpoint position measuring means 189 of this embodiment has the same configuration as that of the second embodiment; however, the viewpoint position measuring means 189 of the present embodiment activates the mirror angle determining means 177 at the end of its processing and then terminates. Fourth, the mirror angle table 178 of this embodiment does not hold information on the angle of the room mirror. Fifth, the mirror angle determining means 177 of the present embodiment does not execute the processing related to changing the room mirror angle (steps 1902 to 1903). The other configurations are the same as in the fourth embodiment.

According to this embodiment, as in the second embodiment, the viewpoint position is obtained based on the adjustment angle of the room mirror, and each mirror is adjusted using this viewpoint position. Therefore, according to the present embodiment, the input means for receiving a cursor movement instruction, which is required in the fourth embodiment, is unnecessary, and the viewpoint position can be measured at lower cost.

<Embodiment 6> This embodiment is the same as Embodiment 5 except that a side mirror is used instead of a room mirror. The mirror adjustment device of the present embodiment has substantially the same configuration as that of the fourth embodiment, but differs in the following points.

First, the mirror adjustment device of this embodiment does not include the HUD output device 103, the audio input/output device 105, the input device 106, the side mirror drive mechanism 172, or the coordinate conversion table 201. Second, the arithmetic processing unit 170 of the present embodiment does not include the HUD display means 176 or the graphics processing means 191. Third, the viewpoint position measuring means 189 of the present embodiment has the same configuration as that of the third embodiment; however, the viewpoint position measuring means 189 of the present embodiment activates the mirror angle determining means 177 at the end of its processing and then terminates. Fourth, the mirror angle table 178 of this embodiment does not hold information on the angle of the side mirrors. Fifth, the mirror angle determining means 177 of the present embodiment does not execute the processing related to changing the side mirror angle (steps 1904 to 1905). The other configurations are the same as in the fourth embodiment.

In this embodiment, as in the fifth embodiment, no input means is required for receiving a cursor movement instruction. Therefore, according to the present embodiment, the viewpoint position can be measured at a lower cost than in the fourth embodiment. Further, by using the angles of the two side mirrors, a more accurate viewpoint position can be obtained than in the fifth embodiment.


According to the present invention, the viewpoint position can be accurately measured at low cost. Therefore, according to the present invention, a viewpoint position measuring device, a head-up display device, and a vehicle mirror adjusting device can be provided at low cost.

[Brief description of the drawings]

FIG. 1 is an explanatory diagram illustrating a display example according to a first embodiment.

FIG. 2 is an explanatory diagram showing the principle of a viewpoint position measuring method using a cursor.

FIG. 3 is a configuration diagram of a navigation system according to the first embodiment.

FIG. 4 is a hardware configuration diagram of an arithmetic processing unit according to the first embodiment.

FIG. 5 is a schematic diagram illustrating an appearance of an input panel according to the first embodiment.

FIG. 6 is a functional configuration diagram of an arithmetic processing unit according to the first embodiment.

FIG. 7 is a flowchart of a process performed by a viewpoint position measuring unit according to the first embodiment.

FIG. 8 is a flowchart of a process performed by a reference point display unit according to the first embodiment.

FIG. 9 is a flowchart of a process performed by a HUD display unit according to the first embodiment.

FIG. 10 is an explanatory diagram illustrating a data configuration example of a coordinate conversion table according to the first embodiment.

FIG. 11 is an explanatory diagram showing a principle of a viewpoint position measuring method using an angle of a room mirror.

FIG. 12 is an explanatory diagram showing a principle of a viewpoint position measuring method using an angle of a side mirror.

FIG. 13 is a functional configuration diagram of a viewpoint position measuring unit according to the first embodiment.

FIG. 14 is a functional configuration diagram of a viewpoint position measuring unit according to the second embodiment.

FIG. 15 is a flowchart of a process performed by a starting unit according to the second embodiment.

FIG. 16 is a flowchart of a process performed by a starting unit according to the third embodiment.

FIG. 17 is a configuration diagram of a mirror adjustment device according to a fourth embodiment.

FIG. 18 is a functional configuration diagram of an arithmetic processing unit according to a fourth embodiment.

FIG. 19 is a flowchart of a process performed by a mirror angle determining unit according to the fourth embodiment.

[Explanation of symbols]

1, 2 ... target, 3 ... left-turn guidance arrow generated by the HUD, 4 ... cursor position control switch, 5 ... left cross cursor, 6 ... right cross cursor, 20 ... windshield, 21 ... projector, 22 ... mirror, 23 ... position where the line connecting the viewpoint and the left pseudo target crosses the windshield, 24 ... position where the line connecting the viewpoint and the right pseudo target crosses the windshield, 25 ... left vehicle-width confirmation marker, 26 ... right vehicle-width confirmation marker, 27 ... viewpoint, 29 ... driver's seat center plane, 31 ... line connecting the viewpoint and the left target, 32 ... line connecting the viewpoint and the right target, 33 ... line connecting the viewpoint and the target, 41 ... room mirror, 42 ... viewpoint, 43 ... driver's seat center plane, 44, 45 ... lines connecting the viewpoint and the room mirror, 61 ... right side mirror, 62 ... line connecting the viewpoint and the side mirror, 63 ... left side mirror, 64 ... viewpoint, 65 ... driver's seat center plane, 66 ... line connecting the viewpoint and the left side mirror, 67 ... line connecting the viewpoint and the right side mirror, 71 ... activation means, 72 ... reference point display means, 73 ... viewpoint position calculation means, 74 ... viewpoint position storage means, 75 ... activation means, 101 ... arithmetic processing unit, 102 ... display, 103 ... HUD output device, 104 ... map storage device, 105 ... voice input/output device, 106 ... input device, 106a ... input panel for navigation, 106b ... input panel for viewpoint position measurement, 107 ... wheel speed sensor, 108 ... geomagnetic sensor, 109 ... gyro, 120 ... GPS receiver, 121 ... traffic information receiving device, 122 ... in-vehicle LAN device, 131 ... CPU, 132 ... RAM, 133 ... ROM, 134 ... DMA, 135 ... drawing controller, 136 ... VRAM, 137 ... color palette, 138 ... A/D converter, 139 ... SCI, 140 ... PIO, 141 ... counter, 161 ... direction switch, 162 ... side mirror selection switch, 163 ... cursor selection switch, 164 ... navigation system panel, 165 ... cursor position setting key, 166 ... direction instruction key, 170 ... arithmetic processing unit, 171 ... room mirror drive mechanism, 172 ... side mirror drive mechanism, 175 ... viewpoint position measuring means, 176 ... HUD display means, 177 ... mirror angle determining means, 178 ... mirror angle table, 181 ... user operation analysis means, 182 ... route calculation means, 183 ... route storage means, 184 ... route guidance means, 185 ... map/menu display means, 186 ... current position calculation means, 187 ... map matching means, 188 ... data reading means, 189 ... viewpoint position measuring means, 190 ... HUD display means, 191 ... graphics processing means, 201 ... coordinate conversion table.

Continued from the front page: (72) Inventor Mariko Okude, 7-1-1 Omika-cho, Hitachi City, Ibaraki Prefecture, within Hitachi Research Laboratory, Hitachi, Ltd.

Claims (25)

[Claims]
1. A viewpoint position measuring method characterized in that a viewpoint position is obtained based on a display position of a reference point displayed on a screen and a position of a predetermined target observed through said screen.
2. The method according to claim 1, wherein the screen is a windshield of a vehicle.
3. The method according to claim 2, wherein the target is a vehicle width confirmation marker.
4. The method according to claim 2, wherein the target is a mark provided on a hood of the vehicle in advance.
5. A viewpoint position measuring method for measuring a viewpoint position of a driver of a vehicle, characterized in that the viewpoint position of the driver is obtained based on an angle of a room mirror and/or a side mirror of the vehicle.
6. A viewpoint position measuring device comprising: input means for receiving an input of position information of a reference point; reference point display means for displaying the reference point on a screen based on the position information received via the input means; and viewpoint position calculating means for calculating a viewpoint position based on the display position of the reference point, wherein the viewpoint position calculating means obtains by calculation, as the viewpoint position, the position of a viewpoint from which the reference point and a predetermined target observed through the screen appear to overlap.
7. A viewpoint position measuring apparatus according to claim 6, wherein said reference point display means erases the display when a predetermined time elapses after the display of said reference point is started. .
8. The viewpoint position measuring device according to claim 6, further comprising end instruction means for receiving notification of the end of input of the position information of the reference point, wherein the reference point display means erases the display of the reference point upon receiving the notification of the end of input via the end instruction means.
9. The viewpoint position measuring device according to claim 6, wherein the reference point display means comprises: means for displaying a plurality of the reference points on the screen; means for receiving, via the input means, the input of the position information of whichever of the plurality of reference points is the position information input target, and changing the display position of that reference point based on the input position information; and means for changing which reference point is the input target.
10. The viewpoint position measuring device according to claim 9, wherein the reference point display means further comprises means for highlighting the reference point to be input.
11. The viewpoint position measuring device according to claim 9, wherein said reference point display means further comprises means for outputting by voice which of said plurality of reference points is the input target.
12. The viewpoint position measuring device according to claim 6, wherein the screen is a windshield of a vehicle.
13. The viewpoint position measuring device according to claim 12, further comprising an activation unit that activates the reference point display unit when the start of the vehicle is detected.
14. The viewpoint position measuring device according to claim 13, further comprising means for detecting, when activated, at least one of a predetermined seat position of the vehicle, an angle of a room mirror of the vehicle, and an angle of a side mirror of the vehicle, and estimating a viewpoint position based on the detected value, wherein the reference point display means comprises means for displaying the reference point at a position on the screen at which, as seen from the estimated viewpoint position, the reference point and the target overlap.
15. The viewpoint position measuring device according to claim 12, further comprising: change estimating means for estimating a change in the viewpoint position; and activation means for activating the reference point display means when the change estimating means estimates that the viewpoint position has changed.
16. The viewpoint position measuring device according to claim 15, wherein said change estimating means comprises at least one of: means for detecting movement of a predetermined seat position of said vehicle; means for detecting a change in the angle of a room mirror of said vehicle; and means for detecting a change in the angle of a side mirror of said vehicle.
17. The viewpoint position measuring device according to claim 12, further comprising change amount estimating means for estimating an amount of change in the viewpoint position, wherein the reference point display means further comprises means for changing the display position of the reference point based on the estimated amount of change.
18. The viewpoint position measuring device according to claim 12, further comprising change amount estimating means for estimating an amount of change in the viewpoint position, wherein the viewpoint position calculating means further comprises means for correcting, based on the estimated amount of change, a viewpoint position previously obtained by the viewpoint position calculating means.
19. The viewpoint position measuring device according to claim 17, wherein said change amount estimating means comprises at least one of: means for detecting an amount of movement of a predetermined seat position of said vehicle; means for detecting an amount of change in the angle of a room mirror of said vehicle; and means for detecting an amount of change in the angle of a side mirror of said vehicle.
20. A viewpoint position measuring device comprising: mirror angle measuring means for measuring an angle of a room mirror of a vehicle; and means for obtaining a viewpoint position of a driver of the vehicle based on the measured angle.
21. A viewpoint position measuring device comprising: mirror angle measuring means for measuring an angle of a side mirror of a vehicle; and means for obtaining a viewpoint position of a driver of the vehicle based on the measured angle.
22. A mirror adjustment device comprising: the viewpoint position measuring device according to claim 6 or 20; and means for determining a target angle of a side mirror based on the viewpoint position obtained by the viewpoint position measuring device and adjusting the angle of the side mirror to the target angle.
23. A mirror adjustment device comprising: the viewpoint position measuring device according to claim 6 or 21; and means for determining a target angle of a room mirror based on the viewpoint position obtained by the viewpoint position measuring device and adjusting the angle of the room mirror to the target angle.
24. A head-up display device comprising: the viewpoint position measuring device according to claim 6, 20, or 21; and an image display device for projecting an image onto the screen, wherein the image display device determines the display position of the image based on the viewpoint position obtained by the viewpoint position measuring device.
25. A head-up display device for projecting an image on a screen, wherein a reference point for viewpoint position measurement is displayed on the screen at the time of startup.
JP8338067A 1996-12-18 1996-12-18 Viewpoint position measuring method and device, head-up display, and mirror adjustment device Pending JPH10176928A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP8338067A JPH10176928A (en) 1996-12-18 1996-12-18 Viewpoint position measuring method and device, head-up display, and mirror adjustment device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP8338067A JPH10176928A (en) 1996-12-18 1996-12-18 Viewpoint position measuring method and device, head-up display, and mirror adjustment device

Publications (1)

Publication Number Publication Date
JPH10176928A true JPH10176928A (en) 1998-06-30

Family

ID=18314606

Family Applications (1)

Application Number Title Priority Date Filing Date
JP8338067A Pending JPH10176928A (en) 1996-12-18 1996-12-18 Viewpoint position measuring method and device, head-up display, and mirror adjustment device

Country Status (1)

Country Link
JP (1) JPH10176928A (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7656313B2 (en) 2003-11-30 2010-02-02 Volvo Technology Corp. Method and system for supporting path control
US8497880B2 (en) 2003-12-01 2013-07-30 Volvo Technology Corporation Method and system for presenting information
JP2007512636A (en) * 2003-12-01 2007-05-17 ボルボ テクノロジー コーポレイション Method and system for supporting route control
US7561966B2 (en) 2003-12-17 2009-07-14 Denso Corporation Vehicle information display system
JP2007272350A (en) * 2006-03-30 2007-10-18 Honda Motor Co Ltd Driving support device for vehicle
KR100939761B1 (en) 2008-10-02 2010-01-29 삼성전기주식회사 Camera unit for a vehicle, method for display outside a vehicle, and system for generating driving corridor markers
JP2010096874A (en) * 2008-10-15 2010-04-30 Nippon Seiki Co Ltd Display device for vehicle
US8896685B2 (en) 2010-03-14 2014-11-25 Ns Solutions Corporation Method and system for determining information relating to vacant spaces of a parking lot
US9395195B2 (en) 2010-03-30 2016-07-19 Ns Solutions Corporation System, method and program for managing and displaying product information
JP2011227874A (en) * 2010-03-30 2011-11-10 Ns Solutions Corp Information processing device, system, empty space guide method and program
US9689688B2 (en) 2010-03-30 2017-06-27 Ns Solutions Corporation Image display system, image display method and program
WO2011122654A1 (en) * 2010-03-30 2011-10-06 新日鉄ソリューションズ株式会社 Information processing device, information processing method and program, information processing device, vacant space guidance system, vacant space guidance method and program, image displaying system, and image displaying method and program
JP2012040902A (en) * 2010-08-16 2012-03-01 Toyota Motor Corp Viewpoint location computation device
WO2012023480A1 (en) * 2010-08-16 2012-02-23 トヨタ自動車株式会社 Viewpoint location computation device
US8594974B2 (en) 2010-08-16 2013-11-26 Toyota Jidosha Kabushiki Kaisha Viewpoint location computation device
JP2012068481A (en) * 2010-09-24 2012-04-05 Asia Air Survey Co Ltd Augmented reality expression system and method
JP2015063256A (en) * 2013-09-26 2015-04-09 日産自動車株式会社 Driver eyeball position estimation device
CN105313780A (en) * 2014-07-21 2016-02-10 罗伯特·博世有限公司 Driver information system in a vehicle
JP2016055756A (en) * 2014-09-09 2016-04-21 カルソニックカンセイ株式会社 Head-up display device for vehicle
WO2016085149A1 (en) * 2014-11-24 2016-06-02 현대엠엔소프트 주식회사 Hud system based on vehicle traveling direction
US9463707B2 (en) 2014-12-05 2016-10-11 Hyundai America Technical Center, Inc. Method and system for aligning a vehicle with a wireless charging assembly
KR20160068611A (en) * 2014-12-05 2016-06-15 현대 아메리카 테크니컬 센타, 아이엔씨 Method and system for aligning a vehicle with a wireless charging assembly
JP2016210212A (en) * 2015-04-30 2016-12-15 株式会社リコー Information providing device, information providing method and control program for information provision
WO2018042473A1 (en) * 2016-08-29 2018-03-08 三菱電機株式会社 Display control apparatus and display control method
JPWO2018042473A1 (en) * 2016-08-29 2018-11-22 三菱電機株式会社 Display control apparatus and display control method
FR3060774A1 * 2016-12-16 2018-06-22 Peugeot Citroen Automobiles Sa Method for adjusting an augmented-reality head-up display device

Similar Documents

Publication Publication Date Title
US10029700B2 (en) Infotainment system with head-up display for symbol projection
JP5106540B2 (en) In-vehicle information provider
DE69736766T2 (en) Method and device for displaying a navigation map
US6919866B2 (en) Vehicular navigation system
EP0738874B1 Navigation display with bird's-eye perspective
US8036823B2 (en) Navigation system
US8315796B2 (en) Navigation device
JP4935145B2 (en) Car navigation system
JP4497133B2 (en) Driving support method and driving support device
EP1974998B1 (en) Driving support method and driving support apparatus
EP2724896B1 (en) Parking assistance device
US8352180B2 (en) Device with camera-info
KR100930159B1 (en) Display method of vehicle mounted display device and vehicle mounted display device
US7423553B2 (en) Image display apparatus, image display method, measurement apparatus, measurement method, information processing method, information processing apparatus, and identification method
US7688221B2 (en) Driving support apparatus
JP2008309529A (en) Navigation system, navigation method and program for navigation
JP4646923B2 (en) Navigation system and portable terminal device
JP2007099261A (en) Parking assistance method and parking assistance device
US8423292B2 (en) Navigation device with camera-info
US20120235805A1 (en) Information display apparatus and information display method
JP2005207999A (en) Navigation system, and intersection guide method
US8010283B2 (en) Driving evaluation system and server
KR19980034003A (en) Navigation device that informs the surroundings of the car and its control method
US7215254B2 (en) Driving assistance system
US20030045973A1 (en) Motor vehicle parking support unit and method thereof