JP4574157B2 - Information display device and information display method - Google Patents

Information display device and information display method

Info

Publication number
JP4574157B2
Authority
JP
Japan
Prior art keywords
information
object
display
display device
recognized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2003357201A
Other languages
Japanese (ja)
Other versions
JP2005121495A (en)
Inventor
勉 丹沢 (Tsutomu Tanzawa)
英明 土屋 (Hideaki Tsuchiya)
Original Assignee
富士重工業株式会社 (Fuji Heavy Industries Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士重工業株式会社 (Fuji Heavy Industries Ltd.)
Priority to JP2003357201A
Priority claimed from US10/965,126 (US7356408B2)
Publication of JP2005121495A
Application granted
Publication of JP4574157B2
Application status: Active
Anticipated expiration



Description

  The present invention relates to an information display device and an information display method, and more particularly, to a technique for displaying a traveling situation ahead of a host vehicle and navigation information in an overlapping manner.

2. Description of the Related Art In recent years, attention has focused on information display devices that present the traveling situation ahead, together with navigation information, on a display provided in the vehicle. For example, Patent Document 1 discloses a vehicle display device that superimposes on a map image, on the display screen, the partial image corresponding to the region where the host vehicle travels, extracted from an infrared image captured by an infrared camera. According to Patent Document 1, superimposing on the map image an infrared image from which the less necessary portions have been cut makes it easy to recognize the type and size of an obstacle, thereby improving object recognition. Patent Document 2 discloses an information display device that, in a cruise control device, displays the position information of surrounding vehicles and of the following vehicle on a road shape generated from map information. According to Patent Document 2, a mark indicating the position of the host vehicle, a mark indicating the position of the following vehicle, and marks indicating the positions of surrounding vehicles other than the following vehicle are each displayed on the road in different colors and patterns.
Patent Document 1: JP-A-11-250396
Patent Document 2: JP-A-2002-46504

  However, in Patent Document 1 described above, the infrared image is merely displayed as it is, and the user must pick out obstacles from a dynamically changing image. In Patent Document 2, although the host vehicle, the following vehicle, and the surrounding vehicles are displayed in different forms, no further information can be obtained from the display. An information display device of this kind is a device for enjoying safe and comfortable driving, and ease of use adds value for the user and motivates purchase. This type of device is therefore required to be easy to use and to offer distinctive functions.

  The present invention has been made in view of such circumstances, and an object of the present invention is to improve user convenience in an information display device that displays navigation information and the traveling situation in an overlapping manner.

In order to solve these problems, a first invention provides an information display device that, as the vehicle travels, displays navigation information by a driver's-eye display method superimposed with information obtained from a preview sensor that detects the traveling situation ahead of the host vehicle. The device comprises: a recognition unit that recognizes objects ahead of the host vehicle based on the detection result of the preview sensor, classifies each recognized object into the type to which it belongs, and calculates a road model that defines the road shape; a control unit that determines the information to be displayed based on the objects recognized by the recognition unit and the navigation information; and a display device that is controlled by the control unit and displays the determined information. Here, the control unit associates the road position of the navigation information with the position of each object based on the road model calculated by the recognition unit, and controls the display device so that symbols indicating the recognized objects and the navigation information are displayed in an overlapping manner, and so that the symbols are displayed using a plurality of different display colors corresponding to the types to which the objects belong.

  Here, in the first invention, it is preferable that the recognition unit classifies each recognized object as at least one of an automobile, a two-wheeled vehicle, a pedestrian, and an obstacle.

A second invention provides an information display method that, as the vehicle travels, displays navigation information by a driver's-eye display method superimposed with information obtained from a preview sensor that detects the traveling situation ahead of the host vehicle. The method has a first step in which a computer recognizes objects ahead of the host vehicle based on the detection result of the preview sensor, classifies each recognized object into the type to which it belongs, and calculates a road model that defines the road shape; and a second step in which the computer determines the information to be displayed based on the recognized objects and the navigation information, and controls a display device to display the determined information. Here, the second step associates the road position of the navigation information with the position of each object based on the road model, controls the display device so that symbols indicating the recognized objects and the navigation information are displayed in an overlapping manner, and controls the display device to display the symbols using a plurality of different display colors corresponding to the types to which the objects belong.

  Here, in the second invention, it is preferable that the first step classifies each recognized object as at least one of an automobile, a two-wheeled vehicle, a pedestrian, and an obstacle.

A third invention provides an information display device that, as the vehicle travels, displays navigation information by a driver's-eye display method superimposed with information obtained from a preview sensor that detects the traveling situation ahead of the host vehicle. The device has: a recognition unit that recognizes objects ahead of the host vehicle based on the detection result of the preview sensor, calculates a risk degree indicating how dangerous each recognized object is to the host vehicle, and calculates a road model that defines the road shape; a control unit that determines the information to be displayed based on the objects recognized by the recognition unit and the navigation information; and a display device that is controlled by the control unit and displays the determined information. Here, the control unit associates the road position of the navigation information with the position of each object based on the road model calculated by the recognition unit, and controls the display device so that symbols indicating the recognized objects and the navigation information are displayed in an overlapping manner, and so that the symbols are displayed using a plurality of different display colors corresponding to the risk degree.

Further, a fourth invention provides an information display method that, as the vehicle travels, displays navigation information by a driver's-eye display method superimposed with information obtained from a preview sensor that detects the traveling situation ahead of the host vehicle. The method has a first step in which a computer recognizes objects ahead of the host vehicle based on the information obtained from the preview sensor and calculates a risk degree indicating how dangerous each recognized object is to the host vehicle; and a second step in which the computer determines the information to be displayed based on the recognized objects and the navigation information, and controls a display device to display the determined information. Here, the second step associates the road position of the navigation information with the position of each object based on the road model, controls the display device so that symbols indicating the recognized objects and the navigation information are displayed in an overlapping manner, and controls the display device to display the symbols using a plurality of different display colors, each corresponding to the degree of danger.

  Here, in the third invention or the fourth invention, the display color is preferably set to three or more colors according to the degree of danger.

According to the present invention, objects existing ahead of the host vehicle are recognized based on the detection result of the preview sensor, and symbols indicating the objects are displayed overlapping the navigation information. The display device is controlled so that the displayed symbols take different display colors depending on the recognized object, so differences between objects can be distinguished by color and the user's visibility is improved. As a result, user convenience is improved. Furthermore, in a display method using the driver's eye, associating the road position of the navigation information with the position of each three-dimensional object based on the road model allows the symbols to be displayed at more accurate positions.

  FIG. 1 is a block diagram showing the overall configuration of the information display apparatus 1 according to the present embodiment. As the preview sensor 2 for detecting the traveling situation ahead of the host vehicle, a known stereo image processing device configured by a stereo camera and an image processing system can be used.

  A stereo camera that captures the scenery ahead of the vehicle is attached, for example, in the vicinity of the rearview mirror. This stereo camera consists of a pair of cameras 20 and 21, each incorporating an image sensor (for example, a CCD or CMOS sensor). The main camera 20 captures the reference image required for stereo image processing, and the sub camera 21 captures the comparison image. While the two cameras operate in synchronization, the analog images output from the cameras 20 and 21 are converted by the A/D converters 22 and 23 into digital images with a predetermined luminance gradation (for example, 256-level gray scale).

  The pair of digitized image data undergoes luminance correction, geometric image conversion, and the like in the image correction unit 24. Usually there is some error in the mounting positions of the pair of cameras 20 and 21, which produces a shift between the left and right images. To correct this shift, geometric transformations such as image rotation and translation are performed using an affine transformation or the like.

  Through such image processing, reference image data is obtained from the main camera 20, and comparison image data is obtained from the sub camera 21. These image data are a set of luminance values (0 to 255) of each pixel. Here, the image plane defined by the image data is expressed in the ij coordinate system, with the lower left corner of the image as the origin, the horizontal direction as the i coordinate axis, and the vertical direction as the j coordinate axis. Stereo image data corresponding to one frame is output to the subsequent stereo image processing unit 25 and stored in the image data memory 26.

  The stereo image processing unit 25 calculates distance data for the captured image corresponding to one frame based on the reference image data and the comparison image data. Here, the "distance data" is a set of parallaxes calculated for small areas on the image plane defined by the image data, each parallax being associated with a position (i, j) on the image plane. Each parallax is calculated for a pixel block of a predetermined area (for example, 4 × 4 pixels) constituting a part of the reference image.

  When calculating the parallax for a certain pixel block (the correlation source), a region having a correlation with the luminance characteristics of this pixel block (the correlation destination) is identified in the comparison image. The distance from the cameras 20 and 21 to an object appears as a horizontal shift between the reference image and the comparison image, so when searching for the correlation destination in the comparison image it suffices to search along the horizontal line (the epipolar line) with the same j coordinate as the correlation source pixel block. The stereo image processing unit 25 sequentially evaluates the correlation between the correlation source and candidate correlation destinations (stereo matching) while shifting one pixel at a time along the epipolar line, within a predetermined search range set with reference to the i coordinate of the correlation source. In principle, the horizontal shift amount of the correlation destination judged to have the highest correlation among the candidates is taken as the parallax of that pixel block. The hardware configuration of the stereo image processing unit 25 is disclosed in Japanese Patent Laid-Open No. 5-114099, which may be referred to as necessary. The distance data calculated through this processing, that is, the set of parallaxes associated with positions (i, j) on the image, is stored in the distance data memory 27.
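The block-matching search described above can be sketched as follows. This is a minimal sum-of-absolute-differences (SAD) version in Python; the actual device implements stereo matching in dedicated hardware (see JP-H5-114099), and the block size, search range, and correlation measure here are illustrative assumptions.

```python
import numpy as np

def block_disparity(ref, cmp_img, i, j, block=4, search=32):
    """Estimate the disparity of one pixel block by searching along the
    epipolar line (same row j) in the comparison image, one pixel at a time."""
    src = ref[j:j + block, i:i + block].astype(np.int32)
    best_d, best_cost = 0, None
    for d in range(search):
        if i + d + block > cmp_img.shape[1]:
            break
        cand = cmp_img[j:j + block, i + d:i + d + block].astype(np.int32)
        cost = np.abs(src - cand).sum()  # sum of absolute differences
        if best_cost is None or cost < best_cost:
            best_cost, best_d = cost, d
    return best_d  # horizontal shift with the highest correlation
```

In the device the same search is repeated for every 4 × 4 block of the reference image to produce the per-block parallax set stored in the distance data memory 27.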

  The microcomputer 3 includes a CPU, a ROM, a RAM, an input/output interface, and the like. Functionally, the microcomputer 3 comprises a recognition unit 4 and a control unit 5. The recognition unit 4 recognizes objects ahead of the host vehicle based on the detection result of the preview sensor 2 and classifies each recognized object into the type to which it belongs. The objects to be recognized by the recognition unit 4 are typically three-dimensional objects; in this embodiment there are four types: automobiles, two-wheeled vehicles, pedestrians, and obstacles (for example, fallen objects on the road, pylons for road works, roadside trees, and the like). The control unit 5 determines the information to be displayed on the display device 6 based on the objects recognized by the recognition unit 4 and the navigation information, and controls the display device 6 so that the symbols indicating the recognized objects and the navigation information are displayed in an overlapping manner. For this purpose, the ROM of the microcomputer 3 stores the symbols indicating the objects (in this embodiment, automobiles, two-wheeled vehicles, pedestrians, and obstacles) as data in a predetermined format (for example, images or wire frame models). The symbols are displayed using a plurality of different display colors corresponding to the type to which each object belongs. Furthermore, when the recognition unit 4 determines from the recognition result that a warning to the driver is necessary, it can operate the display device 6 and the speaker 7 to call the driver's attention. The recognition unit 4 may also control the control device 8 as needed to perform vehicle control such as downshifting and brake control.

  Here, the navigation information is the information required to display the current position of the host vehicle and its planned route together with map information, and can be acquired from the known navigation system 9. Although not shown in FIG. 1, the navigation system 9 is mainly composed of a vehicle speed sensor, a gyroscope, a GPS receiver, a map data input unit, and a navigation control unit. The vehicle speed sensor detects the speed of the vehicle, and the gyroscope detects the amount of change in the vehicle's azimuth angle based on the angular velocity of the rotational motion applied to the vehicle. The GPS receiver receives radio waves transmitted from GPS satellites via an antenna and detects positioning information such as the position and heading (traveling direction) of the vehicle. The map data input unit is a device for inputting map information data (hereinafter "map data") stored on a recording medium such as a CD-ROM or DVD into the system 9. The navigation control unit calculates the current position of the vehicle based on the positioning information obtained from the GPS receiver, or on the travel distance derived from the vehicle speed and the amount of heading change from the gyroscope. The calculated current position and the map data corresponding to it are output to the control unit 5 as navigation information.
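The dead-reckoning part of the navigation control unit's position calculation can be sketched as a simple Euler update from the vehicle-speed sensor and the gyroscope's heading change; the function and its state layout are illustrative assumptions, not the system's actual algorithm.

```python
import math

def dead_reckon(x, y, heading_deg, speed_mps, yaw_rate_dps, dt):
    """Advance the estimated vehicle position between GPS fixes:
    the gyroscope gives the change in azimuth, the speed sensor the
    distance travelled during the interval dt (seconds)."""
    heading_deg += yaw_rate_dps * dt       # integrate azimuth change
    d = speed_mps * dt                     # odometry: distance travelled
    x += d * math.cos(math.radians(heading_deg))
    y += d * math.sin(math.radians(heading_deg))
    return x, y, heading_deg
```

In practice such an estimate would be periodically corrected by the GPS positioning information, as the text describes.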

  FIG. 2 is a flowchart showing a procedure of information display processing according to the present embodiment. The routine shown in this flowchart is called at predetermined intervals and executed by the microcomputer 3. In step 1, the detection result in the preview sensor 2, that is, information necessary for recognizing the traveling situation ahead of the host vehicle is acquired. In the stereo image processing apparatus functioning as the preview sensor 2, the distance data stored in the distance data memory 27 is read in step 1. Note that image data stored in the image data memory 26 is also read as necessary.

  In step 2, three-dimensional objects existing ahead of the host vehicle are recognized. In this recognition, noise contained in the distance data, that is, parallaxes judged to have low reliability, is first removed by group filter processing. A parallax caused by mismatching under the influence of noise or the like tends to differ in value from the surrounding parallaxes, and the area of any group of cells sharing such a value is relatively small. Therefore, the parallaxes calculated for the pixel blocks are grouped so that the change from the parallax of vertically and horizontally adjacent pixel blocks is within a predetermined threshold. The area of each group is then measured, and a group with an area larger than a predetermined size (for example, a two-pixel block) is judged to be valid. Distance data belonging to a group with an area at or below the predetermined size (isolated distance data) is judged to have low reliability and is removed from the distance data.
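The group filter step can be illustrated as follows — a minimal sketch that groups 4-neighbour disparities whose change is within a threshold and invalidates groups at or below the example area of two pixel blocks. The function name and exact thresholds are assumptions, not the patented implementation.

```python
from collections import deque

def group_filter(disp, diff_thresh=1, min_area=2):
    """Remove isolated disparities: group 4-neighbour cells whose disparity
    change is within diff_thresh, then invalidate (set to 0) any group
    whose area is min_area cells or fewer."""
    h, w = len(disp), len(disp[0])
    seen = [[False] * w for _ in range(h)]
    out = [row[:] for row in disp]
    for y in range(h):
        for x in range(w):
            if seen[y][x] or disp[y][x] == 0:
                continue
            group, q = [], deque([(y, x)])  # BFS collects one group
            seen[y][x] = True
            while q:
                cy, cx = q.popleft()
                group.append((cy, cx))
                for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and disp[ny][nx] != 0
                            and abs(disp[ny][nx] - disp[cy][cx]) <= diff_thresh):
                        seen[ny][nx] = True
                        q.append((ny, nx))
            if len(group) <= min_area:       # isolated -> low reliability
                for gy, gx in group:
                    out[gy][gx] = 0
    return out
```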

  Next, based on the parallaxes retained by the group filter processing and their coordinate positions on the image plane, positions in real space are calculated using known coordinate conversion formulas. By comparing each calculated real-space position with the position of the road surface, the parallaxes located above the road surface, that is, the parallaxes corresponding to three-dimensional objects (hereinafter "three-dimensional object parallaxes"), are extracted. The position of the road surface can be specified by calculating a road model that defines the road shape. The road model is expressed by linear equations in the horizontal and vertical directions of the real-space coordinate system, and is calculated by setting the parameters of these equations to values matching the actual road shape. Based on the knowledge that a white line drawn on the road surface has higher luminance than the road surface, the image data is referred to, and the positions of the left and right white lines on the image plane are specified by evaluating the luminance change across the width of the road. The positions of the white lines in real space are then detected from their positions on the image plane using the distance data. The road model is calculated by dividing the white lines on the road into a plurality of sections in the distance direction, approximating the left and right white lines in each section by three-dimensional straight lines, and connecting them in a polygonal-line shape.
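A piecewise-linear lane-line fit of the kind described can be sketched like this: the detected white-line points are split into distance sections and a least-squares straight line is fitted in each. The section length, section count, and the 2-D simplification (lateral position x versus distance z, omitting height) are illustrative assumptions.

```python
def road_model(points, section_len=10.0, n_sections=4):
    """Approximate one lane line by connected straight segments.
    points: list of (x, z) white-line positions in real space,
    x lateral (m), z distance ahead (m). Returns per-section (a, b)
    for the fitted line x = a*z + b, or None if the section is empty."""
    model = []
    for s in range(n_sections):
        z0, z1 = s * section_len, (s + 1) * section_len
        sec = [(x, z) for x, z in points if z0 <= z < z1]
        if len(sec) < 2:
            model.append(None)
            continue
        # ordinary least squares for x = a*z + b within this section
        n = len(sec)
        sx = sum(x for x, _ in sec)
        sz = sum(z for _, z in sec)
        sxz = sum(x * z for x, z in sec)
        szz = sum(z * z for _, z in sec)
        a = (n * sxz - sx * sz) / (n * szz - sz * sz)
        b = (sx - a * sz) / n
        model.append((a, b))
    return model
```

Connecting the fitted segments end to end gives the polygonal-line road model the text describes.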

  Next, the distance data is divided into grid-like sections, and for each section a histogram of the three-dimensional object parallaxes belonging to it is created. This histogram shows the frequency distribution of the three-dimensional object parallaxes contained in a unit section, and the frequency of a parallax indicating an actual three-dimensional object is high. Therefore, a three-dimensional object parallax whose frequency in the histogram is equal to or higher than a judgment value is detected as a candidate for a three-dimensional object existing ahead of the host vehicle, and the distance to the candidate is calculated at the same time. Candidates in adjacent sections whose calculated distances are close to each other are then grouped, and each group is recognized as one three-dimensional object. Parameters such as the left and right end positions, the center position, and the distance are associated with each recognized three-dimensional object. Specific processing procedures for the group filter and for three-dimensional object recognition are disclosed in Japanese Patent Laid-Open No. 10-285582, which may be referred to as necessary.
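The histogram-based detection can be sketched as follows: each grid section votes for its dominant solid-object disparity, and adjacent sections with similar peaks are merged into one object. The judgment value and the merging tolerance are placeholder values, not the device's calibrated ones.

```python
def detect_objects(disp_columns, freq_thresh=3):
    """disp_columns: per vertical grid section, the list of solid-object
    disparities falling in it. Returns detected objects, each as the list
    of section indices it spans."""
    # representative (peak) disparity per section, or None if no clear peak
    peaks = []
    for col in disp_columns:
        hist = {}
        for d in col:
            hist[d] = hist.get(d, 0) + 1
        best = max(hist.items(), key=lambda kv: kv[1], default=(None, 0))
        peaks.append(best[0] if best[1] >= freq_thresh else None)
    # merge neighbouring sections whose peak disparities are close
    objects, current = [], []
    for i, p in enumerate(peaks):
        if p is not None and (not current or abs(p - peaks[current[-1]]) <= 1):
            current.append(i)
        else:
            if current:
                objects.append(current)
            current = [i] if p is not None else []
    if current:
        objects.append(current)
    return objects
```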

In step 3, each recognized three-dimensional object is classified into the type to which it belongs, for example based on the conditions shown in (1) to (3) below.
(1) Whether the horizontal width of the three-dimensional object exceeds a judgment value. An automobile has a larger horizontal width than the other three-dimensional objects (two-wheeled vehicles, pedestrians, and obstacles), so automobiles can be separated from the others based on width. Accordingly, using an appropriately set judgment value (for example, 1 m), a three-dimensional object whose width is larger than the judgment value is classified as an automobile.
(2) Whether, among the three-dimensional objects excluding automobiles, the speed V of the object exceeds a judgment value. A two-wheeled vehicle has a higher speed V than the remaining three-dimensional objects (pedestrians and obstacles), so two-wheeled vehicles can be separated from them based on the speed V. Accordingly, using an appropriately set judgment value (for example, 10 km/h), a three-dimensional object whose speed V is greater than the judgment value is classified as a two-wheeled vehicle. The speed V of a three-dimensional object can be calculated from the relative speed Vr, obtained from the object's current position and its position a predetermined time earlier, and the current speed V0 of the host vehicle.
(3) Whether, among the three-dimensional objects excluding automobiles and two-wheeled vehicles, the speed V is 0. Since the speed V of an obstacle is 0, pedestrians and obstacles can be separated using the speed V of the three-dimensional object as a criterion. Accordingly, a three-dimensional object whose speed is 0 is classified as an obstacle.
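Rules (1) to (3) amount to a simple decision cascade, which might look like this in Python. The judgment values are the examples given in the text (1 m, 10 km/h); the function itself is an illustrative sketch, not the patented logic.

```python
def classify(width_m, speed_kmh):
    """Classify a recognized three-dimensional object by rules (1)-(3):
    (1) width above the judgment value -> automobile;
    (2) otherwise, speed above the judgment value -> two-wheeled vehicle;
    (3) otherwise, moving -> pedestrian; speed 0 -> obstacle."""
    if width_m > 1.0:
        return "automobile"
    if speed_kmh > 10.0:
        return "two-wheeled vehicle"
    if speed_kmh > 0.0:
        return "pedestrian"
    return "obstacle"
```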

  In addition to these conditions, pedestrians and automobiles may be separated by comparing the heights of the three-dimensional objects. A three-dimensional object whose real-space position lies outside the white-line position (road model) may also be classified as a pedestrian, and a three-dimensional object moving in the lateral direction may be classified as a pedestrian crossing the road.

  In step 4, display processing is performed based on the navigation information and the recognized three-dimensional objects. First, the control unit 5 determines, based on the type to which each three-dimensional object belongs, the symbol with which to display it on the display device 6. FIG. 3 is a schematic diagram illustrating examples of symbols: each type has a symbol whose design imitates that type. (a) is the symbol for displaying a three-dimensional object classified as an "automobile", (b) is the symbol for a three-dimensional object classified as a "two-wheeled vehicle", (c) is the symbol for a three-dimensional object classified as a "pedestrian", and (d) is the symbol for a three-dimensional object classified as an "obstacle".

  For example, when the type of a three-dimensional object is classified as "two-wheeled vehicle", the control unit 5 controls the display device 6 to display the symbol illustrated in FIG. 3(b) as the symbol indicating that object. When two or more three-dimensional objects classified into the same type are recognized, or when three-dimensional objects classified into different types are recognized, the display device 6 is controlled so that the corresponding symbol is displayed for each recognized object.

The control unit 5 then controls the display device 6 so as to produce the display forms described in (1) and (2) below.
(1) Displaying the symbols and the navigation information in an overlapping manner. In three-dimensional object recognition using the preview sensor 2, the position of a three-dimensional object is expressed in a coordinate system whose origin is the host vehicle (a three-dimensional coordinate system in this embodiment). The control unit 5 therefore superimposes the symbol corresponding to each three-dimensional object on the map data, positioning the object relative to the current position of the host vehicle obtained from the navigation system 9. In doing so, by referring to the road model and associating the road position on the map data with the position of the three-dimensional object, the symbol can be displayed at a more accurate position.
(2) Displaying the symbols in predetermined display colors. Each symbol displayed on the map data has a display color set in advance corresponding to the type to which the three-dimensional object belongs. In this embodiment, from the viewpoint of protecting vulnerable road users, the symbol indicating a pedestrian, who requires the most caution, is set to red, a conspicuous color, and the symbol indicating a two-wheeled vehicle, requiring the next-highest caution, is set to yellow. A blue display color is assigned to the symbol indicating an automobile, and green to the symbol indicating an obstacle. When displaying a symbol, the control unit 5 therefore controls the display device 6 so that the symbol appears in the display color corresponding to the type to which the three-dimensional object belongs.
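The fixed type-to-colour assignment in (2) can be expressed as a small lookup; the colour names match the text, while the function name is an assumption.

```python
def symbol_color(obj_type):
    """Return the display colour preset for the type to which the object
    belongs (first embodiment): red for pedestrians (most caution),
    yellow for two-wheeled vehicles, blue for automobiles, green for
    obstacles."""
    colors = {
        "pedestrian": "red",
        "two-wheeled vehicle": "yellow",
        "automobile": "blue",
        "obstacle": "green",
    }
    return colors[obj_type]
```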

  FIG. 4 is an explanatory diagram showing a display state of the display device 6. In this figure, map data is displayed using the so-called driver's-eye method in a case where two automobiles, one two-wheeled vehicle, and one pedestrian are recognized, and the symbol indicating each three-dimensional object is superimposed on it. As described above, the display colors of the symbols shown on the display device 6 are set according to type, and only symbols indicating three-dimensional objects classified into the same type are displayed in the same display color.

  As shown in the figure, the control unit 5 may control the display device 6 so that, beyond conditions (1) and (2) above, the symbols also convey a sense of perspective. In this case, the displayed symbol becomes smaller the farther the three-dimensional object is from the host vehicle. In addition, when a symbol displayed at a distant position overlaps a symbol displayed at a nearer position, the control unit 5 may control the display device 6 so that the former symbol is displayed beneath the latter. Distant symbols are thus obscured by nearby symbols, which improves the visibility of the symbols and expresses their front-to-back positional relationship.
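The perspective and stacking behaviour can be sketched as follows: symbols are sorted far-to-near so nearer ones are painted last (on top), and each is scaled down with distance. The scaling law and reference distance are illustrative assumptions.

```python
def draw_order_and_scale(objects, base_size=32.0, ref_dist=10.0):
    """objects: list of dicts with a 'dist' key (metres to host vehicle).
    Returns them in painting order (farthest first, so nearer symbols end
    up on top) with a 'size' key added that shrinks with distance."""
    ordered = sorted(objects, key=lambda o: -o["dist"])
    for o in ordered:
        # full size up to ref_dist, then inverse-proportional shrink
        o["size"] = base_size * min(1.0, ref_dist / o["dist"])
    return ordered
```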

  Thus, according to the present embodiment, objects existing ahead of the host vehicle (three-dimensional objects in this embodiment) are recognized based on the detection result of the preview sensor, each recognized object is classified into the type to which it belongs, and the symbols indicating the recognized objects are displayed overlapping the navigation information. The display device is controlled so that each displayed symbol takes the display color corresponding to its classified type. Since differences in object type can then be recognized by color, visibility for the user (typically the driver) is improved. Moreover, by grading the display colors according to the degree of caution required, the driver can intuitively grasp from the colors which three-dimensional objects to attend to first. Convenience is thus improved by a function not found in conventional devices, which enhances the appeal of the product from the viewpoint of user friendliness.

  Note that displaying symbols for all recognized three-dimensional objects has the advantage of showing the traveling situation in detail, but it increases the amount of information on the screen; information not directly related to driving, such as a preceding vehicle far from the host vehicle, may be displayed. From the viewpoint of eliminating unnecessary information, therefore, a number of three-dimensional objects close to the host vehicle may be selected and only the symbols corresponding to them displayed. The selection method may be determined arbitrarily, for example giving priority to pedestrians, who most need protection. Also, although three-dimensional objects are classified into four types in this embodiment, they may be classified into finer types within the range recognizable by the preview sensor 2.

(Second Embodiment)
The information display processing according to the second embodiment differs from that of the first embodiment in that the symbol display color is set according to the degree of danger (specifically, the possibility of collision) that the recognized three-dimensional object poses to the host vehicle. To this end, in the second embodiment the recognition unit 4 further calculates, for each recognized three-dimensional object, a risk level T indicating its degree of danger to the host vehicle. The symbols indicating the recognized three-dimensional objects are then displayed using a plurality of different display colors, each corresponding to the risk level T of the object.

  Specifically, first, as in Steps 1 to 3 of FIG. 3, a three-dimensional object existing in front of the host vehicle is recognized based on the detection result of the preview sensor 2 and classified into the type to which it belongs. In the present embodiment, after Step 3, the risk level T is calculated for each recognized object. This risk level T can be uniquely calculated using, for example, Formula 1 below.
(Formula 1)
T = K1·D + K2·Vr + K3·Ar

  Here, D is the distance (m) to the object, Vr is the relative speed between the host vehicle and the object, and Ar is the relative acceleration between the host vehicle and the object. The parameters K1 to K3 are coefficients applied to the variables D, Vr, and Ar, and are set to appropriate values in advance through experiments and simulations. With these coefficients set, Formula 1 yields a risk level T that represents the time margin until the host vehicle reaches the three-dimensional object. An object with a higher risk level T therefore poses a lower degree of danger (a smaller possibility of collision), and an object with a lower risk level T poses a higher degree of danger (a greater possibility of collision).

  Then, as in Step 4 of FIG. 3, display processing is performed based on the navigation information and the three-dimensional objects recognized by the recognition unit 4. Specifically, a symbol to be displayed is first determined based on the type to which the recognized three-dimensional object belongs, and the display device 6 is controlled so that this symbol and the navigation information are displayed in an overlapping manner. The display color of each displayed symbol is set in advance according to the risk level T calculated for the corresponding three-dimensional object. Specifically, for an object whose risk level T is equal to or less than a first determination value (risk level T ≤ first determination value), that is, a three-dimensional object with a high degree of danger, the symbol is set to red, a conspicuous color. For an object whose risk level T is greater than the first determination value and equal to or less than a second determination value that is greater than the first (first determination value < risk level T ≤ second determination value), that is, a three-dimensional object with a relatively high degree of danger, the symbol is set to yellow. For an object whose risk level T is greater than the second determination value (second determination value < risk level T), that is, a three-dimensional object with a low degree of danger, the symbol is set to blue.
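The risk calculation of Formula 1 and the three-band color mapping can be sketched together as follows. The coefficients K1 to K3 and the two determination values are placeholder assumptions; the patent states they are tuned in advance through experiments and simulations.

```python
# Sketch of the second embodiment's risk level (Formula 1) and the
# three-band display-color mapping. All numeric constants are
# illustrative placeholders, not values from the patent.
K1, K2, K3 = 1.0, 1.0, 0.5          # assumed coefficients for D, Vr, Ar
FIRST_DETERMINATION = 2.0            # T <= this           -> red (high danger)
SECOND_DETERMINATION = 5.0           # first < T <= second -> yellow

def risk_level(distance_m, rel_speed, rel_accel):
    """Formula 1: T = K1*D + K2*Vr + K3*Ar.
    A time-margin-like measure: lower T means higher danger."""
    return K1 * distance_m + K2 * rel_speed + K3 * rel_accel

def symbol_color(t):
    """Map a risk level T to the symbol display color."""
    if t <= FIRST_DETERMINATION:
        return "red"
    if t <= SECOND_DETERMINATION:
        return "yellow"
    return "blue"
```

A closing object (negative relative speed) lowers T and pushes its symbol toward red, matching the sudden-braking scenario of FIG. 5.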

  FIG. 5 is an explanatory diagram showing a display state of the display device 6, illustrating the case where the vehicle ahead brakes suddenly. As shown in FIG. 5, by using different display colors corresponding to the risk level T, the symbol indicating the vehicle ahead, which poses a high degree of danger to the host vehicle (a high possibility of collision), is displayed in red. Symbols indicating three-dimensional objects that pose a lower degree of danger to the host vehicle (a lower possibility of collision) are displayed in yellow or blue according to their level of danger.

  As described above, according to the present embodiment, the symbol indicating a recognized object is displayed superimposed on the navigation information, and the display device is controlled so that the symbol has a display color corresponding to the object's degree of danger to the host vehicle. Because differences in the degree of danger that objects pose to the host vehicle can be recognized by color, visibility for the driver is improved. Furthermore, by assigning display colors according to the required degree of alertness, the driver can grasp from the colors, based on experience, which three-dimensional objects demand the most attention. As a result, convenience is improved by a function not found in conventional devices, which enhances the appeal of the product from the viewpoint of user-friendliness.

  In the present embodiment, three display colors are used according to the risk level T, but a larger number of display colors may be used. In that case, the driver can be made aware of the degree of danger in finer gradations.

  In the first and second embodiments, a stereo image processing apparatus is used as the preview sensor 2. Alternatively, a well-known distance detection sensor such as a monocular camera, a laser radar, or a millimeter-wave radar may be used alone, or these sensors may be used in combination. Even with such configurations, the same effects as in the above-described embodiments can be obtained.

  In the first and second embodiments, symbols whose designs are determined in advance according to the type of the three-dimensional object are used. However, a single type of symbol may be displayed regardless of the object type. Alternatively, an image corresponding to the recognized three-dimensional object may be displayed based on image data captured by the stereo camera. Even in these cases, the type of the three-dimensional object (or its degree of danger) can be recognized from the colors by varying the display colors.

Block diagram showing the overall configuration of the information display apparatus according to the present embodiment
Flowchart showing the procedure of the information display processing according to the first embodiment
Schematic diagram showing examples of display symbols
Explanatory drawing showing a display state of the display device
Explanatory drawing showing a display state of the display device

Explanation of symbols

DESCRIPTION OF SYMBOLS
1 Information display apparatus
2 Preview sensor
3 Microcomputer
4 Recognition unit
5 Control unit
6 Display device
7 Speaker
8 Control apparatus
9 Navigation system

Claims (8)

  1. An information display apparatus that displays navigation information, presented in a display mode corresponding to the driver's viewpoint as the vehicle travels, overlapped with information obtained from a preview sensor that detects the traveling condition ahead of the host vehicle, comprising:
    a recognition unit that recognizes an object ahead of the host vehicle based on a detection result of the preview sensor, classifies the recognized object into a type to which the object belongs, and calculates a road model that defines a road shape;
    a control unit that determines information to be displayed based on the object recognized by the recognition unit and the navigation information; and
    a display device that is controlled by the control unit and displays the determined information,
    wherein the control unit associates the road position of the navigation information with the position of the object based on the road model calculated by the recognition unit, controls the display device to display a symbol indicating the recognized object and the navigation information in an overlapping manner, and controls the display device to display the symbol using a plurality of different display colors each corresponding to the type to which the object belongs.
  2.   The information display device according to claim 1, wherein the recognition unit classifies the recognized object as at least one of an automobile, a motorcycle, a pedestrian, and an obstacle.
  3. An information display apparatus that displays navigation information, presented in a display mode corresponding to the driver's viewpoint as the vehicle travels, overlapped with information obtained from a preview sensor that detects the traveling condition ahead of the host vehicle, comprising:
    a recognition unit that recognizes an object ahead of the host vehicle based on a detection result of the preview sensor, calculates a risk level indicating the degree of danger the recognized object poses to the host vehicle, and calculates a road model that defines a road shape;
    a control unit that determines information to be displayed based on the object recognized by the recognition unit and the navigation information; and
    a display device that is controlled by the control unit and displays the determined information,
    wherein the control unit associates the road position of the navigation information with the position of the object based on the road model calculated by the recognition unit, controls the display device to display a symbol indicating the recognized object and the navigation information in an overlapping manner, and controls the display device to display the symbol using a plurality of different display colors each corresponding to the risk level.
  4.   The information display device according to claim 3, wherein three or more display colors are set according to the risk level.
  5. An information display method for displaying navigation information, presented in a display mode corresponding to the driver's viewpoint as the vehicle travels, overlapped with information obtained from a preview sensor that detects the traveling condition ahead of the host vehicle, comprising:
    a first step in which a computer recognizes an object ahead of the host vehicle based on a detection result of the preview sensor, classifies the recognized object into a type to which the object belongs, and calculates a road model that defines a road shape; and
    a second step in which the computer determines information to be displayed based on the recognized object and the navigation information, and displays the determined information by controlling a display device,
    wherein in the second step, the road position of the navigation information is associated with the position of the object based on the road model, the display device is controlled to display a symbol indicating the recognized object and the navigation information in an overlapping manner, and the display device is controlled to display the symbol using a plurality of different display colors each corresponding to the type to which the object belongs.
  6.   The information display method according to claim 5, wherein the first step classifies the recognized object as at least one of an automobile, a two-wheeled vehicle, a pedestrian, and an obstacle.
  7. An information display method for displaying navigation information, presented in a display mode corresponding to the driver's viewpoint as the vehicle travels, overlapped with information obtained from a preview sensor that detects the traveling condition ahead of the host vehicle, comprising:
    a first step in which a computer recognizes an object ahead of the host vehicle based on information obtained from the preview sensor, and calculates a risk level indicating the degree of danger the recognized object poses to the host vehicle; and
    a second step in which the computer determines information to be displayed based on the recognized object and the navigation information, and displays the determined information by controlling a display device,
    wherein in the second step, the road position of the navigation information is associated with the position of the object based on the road model, the display device is controlled to display a symbol indicating the recognized object and the navigation information in an overlapping manner, and the display device is controlled to display the symbol using a plurality of different display colors each corresponding to the risk level.
  8.   The information display method according to claim 7, wherein three or more display colors are set according to the risk level.
JP2003357201A 2003-10-17 2003-10-17 Information display device and information display method Active JP4574157B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003357201A JP4574157B2 (en) 2003-10-17 2003-10-17 Information display device and information display method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2003357201A JP4574157B2 (en) 2003-10-17 2003-10-17 Information display device and information display method
US10/965,126 US7356408B2 (en) 2003-10-17 2004-10-14 Information display apparatus and information display method
DE602004011164T DE602004011164T2 (en) 2003-10-17 2004-10-15 Device and method for displaying information
EP04024625A EP1524638B9 (en) 2003-10-17 2004-10-15 Information display apparatus and method

Publications (2)

Publication Number Publication Date
JP2005121495A JP2005121495A (en) 2005-05-12
JP4574157B2 true JP4574157B2 (en) 2010-11-04

Family

ID=34614158

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003357201A Active JP4574157B2 (en) 2003-10-17 2003-10-17 Information display device and information display method

Country Status (1)

Country Link
JP (1) JP4574157B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4952429B2 (en) * 2007-08-02 2012-06-13 株式会社デンソー Driving support system
JP5054555B2 (en) * 2008-02-07 2012-10-24 クラリオン株式会社 Vehicle peripheral image display device
CN102713989A (en) 2010-03-17 2012-10-03 本田技研工业株式会社 Vehicle surroundings monitoring device
KR101823029B1 (en) * 2016-01-27 2018-01-31 대우조선해양 주식회사 System for managing obstacle of ship and method for managing obstacle
JP6372556B2 (en) * 2016-12-27 2018-08-15 エイディシーテクノロジー株式会社 In-vehicle image display device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05265547A (en) * 1992-03-23 1993-10-15 Fuji Heavy Ind Ltd On-vehicle outside monitoring device
JPH06266828A (en) * 1993-03-12 1994-09-22 Fuji Heavy Ind Ltd Outside monitoring device for vehicle
JPH07223488A (en) * 1994-02-14 1995-08-22 Mitsubishi Motors Corp Situation information display device for vehicle
JPH09166452A (en) * 1995-12-14 1997-06-24 Pioneer Electron Corp Drive support apparatus
JPH1069598A (en) * 1996-08-29 1998-03-10 Fuji Heavy Ind Ltd Collision preventing device for vehicle
JP2000211452A (en) * 1999-01-20 2000-08-02 Toyota Motor Corp Travel path shape display device and map data base recording medium
JP2002046506A (en) * 2000-04-24 2002-02-12 Matsushita Electric Ind Co Ltd Navigation device
JP2002049998A (en) * 2000-04-24 2002-02-15 Matsushita Electric Ind Co Ltd Drive support device
JP2002156233A (en) * 2000-11-16 2002-05-31 Fuji Heavy Ind Ltd On-board information display
JP2002342899A (en) * 2001-05-14 2002-11-29 Denso Corp Device and program for supporting driving

Also Published As

Publication number Publication date
JP2005121495A (en) 2005-05-12

Similar Documents

Publication Publication Date Title
US8244027B2 (en) Vehicle environment recognition system
EP2431917B1 (en) Barrier and guardrail detection using a single camera
DE102009005505B4 (en) Method and device for generating an image of the surroundings of a motor vehicle
US10328932B2 (en) Parking assist system with annotated map generation
KR100481248B1 (en) Picture synthesizing apparatus for presenting circumferencial images to driver, and display apparatus, warning apparatus and position recognition apparatus using it
JP3357749B2 (en) Roadway image processing apparatus of the vehicle
KR20020033817A (en) Device for assisting automobile driver
US20050143887A1 (en) Vehicle driving assist system
EP2103500B1 (en) Vehicle and steering control device for vehicle
US7031496B2 (en) Method and apparatus for object recognition using a plurality of cameras and databases
WO2011040119A1 (en) Vehicle controller
JP4093208B2 (en) Vehicle runway determination device
JP2007235642A (en) Obstruction detecting system
US20100153000A1 (en) Navigation system
WO2010032523A1 (en) Device for detecting/judging road boundary
EP2546602A1 (en) Stereo camera device
JP4861574B2 (en) Driving assistance device
KR20090014124A (en) Method and apparatus for evaluating an image
JP5068779B2 (en) Vehicle surroundings overhead image display apparatus and method
US6360170B1 (en) Rear monitoring system
JP4883977B2 (en) Image display device for vehicle
JP5441549B2 (en) Road shape recognition device
US6744380B2 (en) Apparatus for monitoring area adjacent to vehicle
JP4433887B2 (en) Vehicle external recognition device
US20130300872A1 (en) Apparatus and method for displaying a blind spot

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20061005

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20090630

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090714

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20090914

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100202

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100817

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100818

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130827

Year of fee payment: 3

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250