WO2011070783A1 - Information Display Device and Information Display Method - Google Patents
- Publication number
- WO2011070783A1 (PCT/JP2010/007156)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- optical flow
- image
- vehicle
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to an information display device and an information display method for displaying information for supporting driving by a driver.
- car navigation system
- information displayed on the center console, for example vehicle speed, engine speed, radio station number, and music track number.
- information for supporting safe driving, such as displayed map information, route guidance information, or warnings of danger around the vehicle.
- a driver often acquires information displayed mainly on an instrument panel or a center console panel through vision.
- When the driver's line-of-sight direction differs significantly from the direction in which the information is displayed, or when the distance between the driver and the position at which the information is displayed is long, the driver has had to move his or her line of sight or refocus in order to confirm the contents.
- HUD (Head-Up Display)
- the HUD is a display that displays predetermined information in a specific range from the upper part of the dashboard of the vehicle to the windshield.
- HUDs come in two types: one in which the driver directly views a display device such as a liquid crystal display or an OELD (Organic ElectroLuminescent Display), and one in which light output from a display device, such as a liquid crystal display, an OELD, or a laser light source, is projected onto the windshield.
- Compared with viewing the conventional instrument panel or center console, the line-of-sight movement time required to view the display while driving and looking forward can be shortened.
- the type projected on the windshield is characterized in that the display information is superimposed on the foreground of the vehicle, and the optical distance between the driver and the display device can be made longer. As a result, the driver can view the display information with little time for focus adjustment.
- Such a HUD makes it possible to display information at a position close to the center of the driver's visual field, and is expected to greatly improve the driver's awareness of the display information. In addition, it is considered to be particularly effective for elderly people with poor visual function.
- Non-Patent Document 1 reports that, as such an attention-attracting effect, adding a motion effect of vertical or horizontal vibration to display information makes the information easier for the operator to notice while being less likely to be felt as annoying. As an example of applying this vibration-based attracting effect while a vehicle is being driven, Patent Document 1 discloses a method of improving the driver's awareness by horizontal vibration.
- Patent Document 1 discloses a vehicle display device and method for generating, in the driver's field of view, a flow of light having a moving speed linked to the traveling of the vehicle, and providing in that flow a non-display area indicating the presence of another vehicle. Further, in Patent Document 1, in order to prevent the driver from becoming accustomed to the presence of the non-display area, the non-display area is vibrated horizontally when it continues for a certain period.
- When a predetermined image is displayed on the windshield in a vibrated state using a HUD, the image is superimposed on the foreground of the vehicle, which changes over time. Therefore, the display method disclosed in Patent Document 1 can be applied, and the display is easily noticed and less likely to be felt as annoying.
- In this way, the presence of an attention object is indicated by adding a horizontal-vibration attracting effect to an image informing of the presence of a pedestrian or the like (attention object) on the course of the vehicle.
- As a result, the presence of the attention object can be reliably communicated to the driver in a short time.
- However, the optical flow of the foreground of the vehicle is mainly in the horizontal direction, so the optical flow and the vibration direction of the display image are substantially parallel. Therefore, there has been a problem that a sufficient attracting effect cannot be obtained for the driver.
- The present invention has been made in view of the above conventional problems, and an object of the present invention is to provide an information display device and an information display method that are not affected by the traveling state of the vehicle, are easily noticed by the driver, and are unlikely to be felt as annoying.
- An information display device of the present invention is an information display device that is mounted on a vehicle including a camera that captures foreground images and that displays predetermined information while moving it. The device includes an optical flow calculation unit that acquires a plurality of foreground images captured by the camera and calculates vector information of the optical flow of the foreground based on the acquired foreground images, and a motion effect determination unit that acquires the optical flow vector information from the optical flow calculation unit and determines the motion effect of the predetermined information based on the acquired vector information.
- According to the information display device and the information display method of the present invention, it is possible to realize an information display that is not affected by the traveling state of the vehicle, is easily noticed by the driver, and is unlikely to be felt as annoying.
- A system configuration diagram showing the configuration of an information display system including an information display device according to the first embodiment
- A diagram showing an example of the motion effect determination table, and a flowchart explaining the operation
- A diagram showing how the optical flow is calculated, and a diagram showing the vector components of the optical flow in a camera image
- Diagrams showing the relative movement of the foreground: (a) the relative movement with respect to a virtual camera, (b) the decomposition of that relative movement
- A diagram showing an example of a route guidance image
- A diagram showing an example of display magnification condition information, and diagrams showing examples of patterns of vertical or horizontal enlargement or reduction of a symbol image: (a) a sine wave, (b) repetition of the positive part of the sine wave value, (c) a rectangular wave, (d) a triangular wave, (e) a sawtooth wave
- A diagram showing an example of target in-vehicle information, and a diagram showing an example of motion effect condition information
- Diagrams showing specific examples of motion effects: (a) a sine wave, (b) a rectangular wave, (c) a triangular wave, (d) a sawtooth waveform
- A system configuration diagram showing the configuration of an information display system including an in-vehicle information display device according to the fourth embodiment
- A block diagram showing the internal configuration of the in-vehicle information display device according to the fourth embodiment
- Diagrams showing the positional relationship of the superimposed region: (a) viewed from above, (b) viewed from the side
- FIG. 1 is a diagram illustrating a state in which a vehicle on which an information display system including an information display device according to the first embodiment is mounted is traveling.
- FIG. 2 is a diagram illustrating a state in the vicinity of the driver's seat inside the vehicle on which the information display system including the information display device according to the first embodiment is mounted.
- the information display system including the information display device will be described as being mounted on a vehicle such as an automobile, and the vehicle will be referred to as “own vehicle 5” and the driver of the own vehicle 5 will be referred to as “driver”.
- the own vehicle 5 is equipped with a sensor unit including one or a plurality of sensors 6a to 6d.
- the number of sensors mounted is not limited to four as shown in FIG.
- the sensors 6a to 6d detect a person or an object existing in a predetermined detection range 7 around the host vehicle 5. Based on the detection information output from the sensors 6a to 6d, the display information is displayed on the display area 21 of the HUD installed in the host vehicle 5.
- a symbol image indicating the pedestrian 1 is selected based on detection information indicating that “pedestrian” has been detected.
- a display image in which a movement effect is added to the symbol image is generated and displayed in the display area 21.
- a camera 8 (first camera) is mounted on the host vehicle 5.
- the camera 8 photographs the external environment of the host vehicle 5.
- In this embodiment, a case where the camera 8 photographs the foreground of the host vehicle 5 will be described, but this is not a limitation.
- the camera 8 is installed, for example, on the back side of the room mirror 23 shown in FIG.
- the camera 8 captures a predetermined shooting range 9 determined based on the angle of view of the camera 8 in the foreground of the host vehicle 5 at a predetermined frame rate, and acquires a foreground image.
- the foreground image is used to calculate an optical flow.
- FIG. 3 is a system configuration diagram showing the configuration of the information display system 10a including the information display device 30a according to the first embodiment.
- the information display system 10a of the first embodiment includes sensors 6a to 6d, a camera 8, a display unit 90, and an information display device 30a.
- the information display device 30a is connected to the sensors 6a to 6d and the camera 8.
- The information display device 30a includes a storage unit 31, an attention object detection unit 32, a symbol image selection unit 33, an optical flow calculation unit 34a, a motion effect determination unit 35, a display image generation unit 36, and a display control unit 37.
- FIG. 7 shows a state immediately before the host vehicle 5 turns right
- FIG. 8 shows a state in which the host vehicle 5 is turning right.
- the pedestrian 1 enters from outside the predetermined detection range 7 by the sensors 6a to 6d.
- the sensors 6a to 6d detect the presence of a person or an object in a predetermined detection range 7 around the host vehicle 5.
- The sensors 6a to 6d are specifically imaging devices, radar devices, wireless tag readers, road-to-vehicle communication devices, and the like; they may be sensors that detect specific types of targets such as people and vehicles, or sensors that simply detect the presence of an object.
- When the presence of a detection target such as a person or an object is detected in the predetermined detection range 7, the sensors 6a to 6d output detection information including information on the detected person or object to the attention object detection unit 32.
- the camera 8 shoots a predetermined shooting range 9 in the foreground of the host vehicle 5.
- the shooting range 9 is determined by the performance of the camera 8, such as the angle of view.
- the camera 8 outputs the captured foreground image to the optical flow calculation unit 34a.
- the display unit 90 is, for example, a partial display area 21 of the windshield 22 of the host vehicle 5.
- An HUD is used for the display unit 90 of the information display system 10a of the first embodiment.
- The display unit 90 displays an image generated by the information display device 30a, in which a motion effect is added to information that needs to be notified to the driver.
- Among the images used for display on the display unit, the image before the motion effect is added is defined as a "symbol image", and the image obtained by adding the motion effect to the symbol image is defined as a "display image". The symbol image may be the predetermined information.
- The storage unit 31 stores a symbol image selection table 40, which associates detection target types that need to be notified to the driver with symbol images, and a motion effect determination table 50, which indicates conditions for adding a predetermined motion effect to a symbol image.
- FIG. 4 is a diagram illustrating an example of the symbol image selection table 40.
- FIG. 5 is a diagram illustrating an example of the motion effect determination table 50.
- The symbol image selection table 40 will be described with reference to FIG. 4, and the motion effect determination table 50 with reference to FIG. 5.
- the symbol image selection table 40 is a table in which a detection information type and a symbol image are associated with each other. For example, “pedestrian image” is assigned as a symbol image to the detection information type “pedestrian”. In addition, a “bicycle image” is assigned as a symbol image to the detection information type “bicycle”. Also, “vehicle image” is assigned as a symbol image to the “vehicle” of the detection information type. The contents of the symbol image selection table 40 are not limited to these.
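As a concrete illustration, the symbol image selection table 40 described above can be sketched as a simple lookup. This is a minimal sketch only; the type strings and image names are illustrative placeholders, not taken from the patent.

```python
# Minimal sketch of the symbol image selection table 40.
# The detection-information types and image names are illustrative
# placeholders, not from an actual implementation.
SYMBOL_IMAGE_TABLE = {
    "pedestrian": "pedestrian_image.png",
    "bicycle": "bicycle_image.png",
    "vehicle": "vehicle_image.png",
}

def select_symbol_image(detection_type):
    """Return the symbol image for a detection type, or None when the
    type is not listed in the table (corresponding to the check in S401)."""
    return SYMBOL_IMAGE_TABLE.get(detection_type)
```

As in the table, detection types outside the list simply produce no symbol image, so no display image is generated for them.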
- The motion effect determination table 50 associates an ID identifying an optical flow condition, the optical flow condition calculated by the optical flow calculation unit 34a described later, and the motion effect to be added.
- ID "001" of the motion effect determination table 50 indicates that no motion effect is added when the optical flow condition is "the magnitude of the optical flow vector is smaller than α".
- The parameter α is a preset value. In this case, the host vehicle 5 is hardly moving and there is no danger, so there is no need to add a motion effect.
- ID "002" of the motion effect determination table 50 indicates that a motion effect of vertical vibration is added when the optical flow condition is "the magnitude of the optical flow vector is α or more, and the magnitude of the horizontal component of the optical flow vector is greater than the magnitude of the vertical component".
- ID "003" of the motion effect determination table 50 indicates that a motion effect of horizontal vibration is added when the optical flow condition is "the magnitude of the optical flow vector is α or more, and the magnitude of the vertical component of the optical flow is greater than or equal to the magnitude of the horizontal component and less than or equal to the magnitude of the horizontal component plus the parameter β".
- ID "004" of the motion effect determination table 50 indicates that motion effects of horizontal vibration and enlargement are added when the optical flow condition is "the magnitude of the horizontal component of the optical flow plus the parameter β is less than the magnitude of the vertical component".
- The contents of the motion effect determination table 50 are not limited to these.
- A default direction and amplitude may be set for the motion effect, and the motion may be performed with the default direction and amplitude when the magnitude of the optical flow vector is a predetermined value or less.
- The motion effect may be a vibration method, in which case the motion effect determination unit 35 determines the vibration method of the predetermined information based on the vector information of the optical flow.
- The motion effect may also be an enlargement/reduction method, in which case the motion effect determination unit 35 determines the enlargement/reduction method of the predetermined information based on the optical flow vector information.
- FIG. 6 is a flowchart for explaining the operation of the information display device 30a according to the first embodiment.
- FIG. 7 is a diagram showing a positional relationship between the host vehicle 5 and the pedestrian 1 before the pedestrian 1 is detected by the sensors 6a to 6d of the host vehicle 5.
- FIG. 8 is a diagram showing a positional relationship between the host vehicle 5 and the pedestrian 1 after the pedestrian 1 is detected by the sensors 6a to 6d of the host vehicle 5.
- the pedestrian 1 is detected by the sensors 6a to 6d of the host vehicle 5.
- The attention object detection unit 32 acquires the detection information output by the sensors 6a to 6d.
- the attention object detection unit 32 determines whether or not the acquired detection information is included in the detection information type of the symbol image selection table 40 illustrated in FIG. 4 (S401).
- Specifically, the attention object detection unit 32 compares the detection information acquired from the sensors 6a to 6d against the symbol image selection table 40 stored in the storage unit 31, and determines whether the acquired detection information is included in the detection information types of the symbol image selection table 40 shown in FIG. 4.
- If it is, the attention object detection unit 32 notifies the symbol image selection unit 33 that the detection information type of the detection information is included in the detection information types of the symbol image selection table 40.
- the symbol image selection unit 33 acquires information indicating that detection information is included in the detection information type of the symbol image selection table 40 from the attention object detection unit 32, and selects a corresponding symbol image (S402). The symbol image selection unit 33 outputs the selected symbol image to the display image generation unit 36.
- FIG. 9 is a diagram showing how the optical flow is calculated.
- FIG. 10 is a diagram showing the optical flow vector and its decomposition into x and y components.
- FIG. 11 is a diagram regarding the optical flow when the vehicle is traveling straight: (a) shows a camera image taken by the camera 8 installed in the vehicle and an example of the optical flow, and (b) shows the decomposition of the optical flow vector into x and y components.
- FIG. 12 is a diagram regarding the optical flow when the vehicle turns right: (a) shows a camera image taken by the camera 8 installed in the vehicle and an example of the optical flow, and (b) shows the decomposition of the optical flow vector into x and y components.
- the camera 8 always shoots a predetermined shooting range 9 in the foreground of the host vehicle 5, and outputs the shot foreground image to the optical flow calculation unit 34a.
- the optical flow calculation unit 34a calculates an optical flow in the image based on temporal changes of the plurality of foreground images output by the camera 8.
- the optical flow calculation unit 34a compares the first foreground image output from the camera 8 with the second foreground image of the next frame after the first foreground image, and estimates the corresponding coordinate points.
- The coordinate points corresponding to each other are a first coordinate point included in the first foreground image and a second coordinate point that indicates, in the second foreground image, the same background information as the first coordinate point.
- the optical flow calculation unit 34a calculates a difference (movement vector) between the first coordinate point in the first foreground image and the second coordinate point in the second foreground image as an optical flow vector (S403).
- For calculating the optical flow, methods such as the gradient method and the block matching method are used.
- The gradient method takes the x-axis in the horizontal direction of the foreground image and the y-axis in the vertical direction, and obtains the optical flow under the assumption that the position and luminance of the same point in the image change smoothly with respect to time.
- Let E(x, y, t) be the luminance at time t of the point (x, y) in the foreground image, let u be the x component (horizontal component) of the optical flow vector in the image, and let v be the y component (vertical component).
- Under this assumption, Equation (1), the brightness-constancy relation E(x + u, y + v, t + 1) = E(x, y, t), holds.
- From Equation (1), the constraint equation (2), Ex·u + Ey·v + Et = 0 (where Ex, Ey, and Et are the partial derivatives of E with respect to x, y, and t), is derived, and the least-squares solution of the constraint equations in the neighborhood of the pixel (x, y) is calculated as the optical flow vector.
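A minimal sketch of this gradient method (a Lucas-Kanade style least-squares solve of the constraint equation over a small window), assuming grayscale frames as NumPy arrays; window size and function name are illustrative:

```python
import numpy as np

# Sketch of the gradient method: solve Ex*u + Ey*v + Et = 0 in the
# least-squares sense over a small neighborhood of the pixel of interest.
def lucas_kanade_flow(frame0, frame1, x, y, win=5):
    """Estimate the optical flow vector (u, v) at pixel (x, y) from two
    consecutive grayscale frames (2-D float arrays)."""
    Ey, Ex = np.gradient(frame0.astype(float))        # spatial derivatives
    Et = frame1.astype(float) - frame0.astype(float)  # temporal derivative
    r = win // 2
    ex = Ex[y - r:y + r + 1, x - r:x + r + 1].ravel()
    ey = Ey[y - r:y + r + 1, x - r:x + r + 1].ravel()
    et = Et[y - r:y + r + 1, x - r:x + r + 1].ravel()
    A = np.stack([ex, ey], axis=1)  # one constraint equation per pixel
    # Least-squares solution of A @ [u, v] = -et
    (u, v), *_ = np.linalg.lstsq(A, -et, rcond=None)
    return u, v
```

For a scene pattern shifted one pixel to the right between frames, the estimate comes out close to (1, 0), up to discretization error in the derivatives.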
- As shown in FIG. 9, an orthogonal coordinate system is used in which the center 131 of the lens of the camera 8 is the origin O, the X-axis is taken in the horizontal rightward direction of the host vehicle 5, the Y-axis in the vertical upward direction, and the Z-axis in the direction opposite to the traveling direction.
- It is assumed that the road surface is a plane and that the height of the camera 8 from the road surface is h.
- Suppose that a point P(X0, Y0, Z0) on the road surface within the field of view of the camera 8 moves to a point P'(X1, Y1, Z1) in the next frame, and that both points are projected onto the image plane.
- When the optical flow vector 132 shown in FIG. 10 is represented by (u, v), Equations (5) and (6) hold, and the actual movement vector 133 projected onto the image plane becomes the optical flow vector 132.
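The projection relationship can be illustrated with a simple pinhole model. This is a hedged sketch, not the patent's exact Equations (5) and (6): the unit focal length and sign conventions are assumptions chosen to match the coordinate system described above (Z taken opposite to the traveling direction, so points ahead of the camera have Z < 0).

```python
# Sketch: projecting the motion of a road-surface point into image
# coordinates with a pinhole camera model. f is an assumed focal length;
# origin at the lens center, X right, Y up, Z opposite to travel.
def project(point, f=1.0):
    """Pinhole projection of a 3-D point (X, Y, Z) with Z < 0 in front
    of the camera; returns image-plane coordinates (x, y)."""
    X, Y, Z = point
    return (-f * X / Z, -f * Y / Z)

def optical_flow_vector(p0, p1, f=1.0):
    """Optical flow (u, v): the image-plane displacement between the
    projections of P(X0, Y0, Z0) and P'(X1, Y1, Z1)."""
    x0, y0 = project(p0, f)
    x1, y1 = project(p1, f)
    return (x1 - x0, y1 - y0)
```

For example, a road point directly ahead and below the camera (Y = -h) that halves its distance between frames flows downward in the image (v < 0), which is the familiar downward streaming of the road surface during forward travel.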
- the optical flow calculation method is not limited to the method described above.
- The optical flow calculation unit 34a outputs the optical flow calculated by the above-described method to the motion effect determination unit 35.
- The motion effect determination unit 35 acquires the optical flow output by the optical flow calculation unit 34a.
- The motion effect determination unit 35 refers to the motion effect determination table 50 stored in the storage unit 31 and determines a motion effect according to the acquired optical flow (S404 to S408, S410 to S412).
- As the optical flow output by the optical flow calculation unit 34a, a representative value of the optical flow of the foreground image captured by the camera 8 may be adopted.
- Strictly speaking, the optical flow is vector distribution information, since the optical flow vector differs at each point of the foreground.
- Of the foreground, the optical flow is particularly important in the region on which the display image seen by the driver is superimposed (hereinafter referred to as the "superimposed region").
- the optical flow calculation unit 34a may calculate the optical flow based only on the superimposed region in the foreground image.
- As the representative value of the optical flow, the average of several optical flow vectors in the superimposed region may be used, or the average value of the optical flow limited to the vicinity where the display image is actually displayed within the superimposed region may be used.
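Taking such a representative value might look like the following sketch, where the superimposed region is simplified to an axis-aligned rectangle purely for illustration:

```python
import numpy as np

# Sketch: the representative optical flow value as the mean of the flow
# vectors whose sample points fall inside the (here rectangular) region.
def representative_flow(points, vectors, region):
    """points: (N, 2) array of sample coordinates; vectors: (N, 2) array
    of flow vectors; region: (xmin, ymin, xmax, ymax) rectangle."""
    points = np.asarray(points, dtype=float)
    vectors = np.asarray(vectors, dtype=float)
    xmin, ymin, xmax, ymax = region
    inside = ((points[:, 0] >= xmin) & (points[:, 0] <= xmax) &
              (points[:, 1] >= ymin) & (points[:, 1] <= ymax))
    if not inside.any():
        return None  # no flow samples inside the superimposed region
    return vectors[inside].mean(axis=0)
```

Vectors outside the region are simply ignored, matching the idea that only the flow near the displayed image matters for the attracting effect.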
- FIG. 24 is a diagram illustrating the configuration of an information display system 10e including an information display device 30e that calculates the optical flow vector in the vicinity of the superimposed region on the windshield 22.
- The information display system 10e illustrated in FIG. 24 includes a camera 2401 (second camera) that captures the driver's face image; an eyeball position calculation unit 2402 that detects the driver's eye position from the image acquired from the camera 2401 and calculates three-dimensional position information of the eyeball based on installation information indicating the installation position of the camera 2401; and a superimposed region calculation unit 2403 that calculates the superimposed region based on the calculated eyeball position and the display position of the symbol image 20 on the windshield 22.
- installation information indicating the installation position of the camera 2401 is stored in a memory (not shown) included in the information display system 10e.
- FIG. 25 is a diagram for explaining the positional relationship between the eyeball position of the driver 80 and the superimposed region.
- FIG. 25A is a view of the symbol image displayed on the HUD as viewed from the viewpoint of the driver 80, and the symbol image 20 is displayed in the lower region of the windshield 22. In the drawing, a region near the symbol image 20 is a superimposed region 86.
- FIG. 25 (b) is a cross-sectional view in the lateral direction of the vehicle corresponding to FIG. 25 (a).
- the emitted light 82 of the HUD unit 81 reaches the windshield 22, the reflected light is directed toward the driver 80, and enters the field of view of the driver 80, so that the display of the symbol image 20 appears in the eyes of the driver 80.
- the line of sight 83 of the driver 80 passes through the symbol image 20.
- the line of sight 83 of the driver 80 passes through the symbol image 20 and further reaches the road surface 85.
- A partial area of the road surface 85, cut out by a predetermined neighborhood region centered on the symbol image 20, is the superimposed region 86.
- That is, the superimposed region 86 is the region of the foreground image captured by the camera 8 (first camera) that overlaps the foreground cut out by straight lines 84 connecting the eyeball position of the driver 80 and a predetermined region including the symbol image 20 on the windshield 22.
- An optical flow calculation method based on the foreground image of the superimposed region 86 will be described with reference to FIGS. 24 and 25.
- the camera 2401 takes a face image of the driver 80 at predetermined time intervals. Image data captured by the camera 2401 is input to the eyeball position calculation unit 2402.
- the camera 2401 is, for example, a stereo camera, and inputs simultaneously captured images to the eyeball position calculation unit 2402.
- the eyeball position calculation unit 2402 detects the positions of the two eyes of the driver 80 from the input image.
- The distance to the object (here, the eyeball) is calculated using the parallax between images taken from slightly shifted positions. Since the basic distance measurement technique using a stereo camera is well known, it is not described here. In practice, since a person has two eyes, the midpoint between the detected positions of both eyes of the driver 80 may, for example, be regarded as the eyeball position.
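The stereo distance measurement mentioned above reduces to the standard similar-triangles relation Z = f·B/d, where B is the camera baseline and d the pixel disparity. A minimal sketch, with illustrative parameter names:

```python
# Sketch of stereo depth from disparity: Z = f * B / d.
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth (in meters) of a point seen at pixel column x_left in the
    left image and x_right in the right image of a rectified stereo pair."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity

def eyeball_position_x(left_eye_x, right_eye_x):
    """Midpoint of the two detected eye positions, used as the single
    'eyeball position' as described in the text."""
    return (left_eye_x + right_eye_x) / 2.0
```

For instance, with a 500 px focal length, a 10 cm baseline, and a 10 px disparity, the eyeball would be estimated at 5 m; in a real cabin the distance is of course much smaller and the disparity correspondingly larger.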
- FIG. 34 is a diagram for explaining the positional relationship between the eyeball position and the superimposed region, schematically showing the situation of FIG. 25.
- FIG. 34(a) shows the positional relationship viewed from above, and FIG. 34(b) shows the positional relationship viewed from the side.
- the direction of each coordinate axis is the same as in FIG. 9, but the display position of the symbol image 20 is the origin.
- From these positional relationships, the position of the superimposed region 86 can be calculated.
- The display position of the symbol image 20 can easily be converted into position coordinates in the coordinate system based on the camera 8 by storing their geometric relationship in advance.
- The motion effect determination unit 35 may determine the motion effect based on the optical flow vector information included in the superimposed region 86 described above.
- The predetermined value set in advance is the parameter α in the motion effect determination table 50 shown in FIG. 5.
- The motion effect determination unit 35 compares the magnitudes of the horizontal and vertical components of the optical flow, and determines the motion effect according to the result.
- In this case, the motion effect determination unit 35 determines, in accordance with the motion effect determination table 50 shown in FIG. 5, to add the motion effect of "vertical vibration" to the symbol image (S408).
- As a result, the display is easily noticed by the driver and is less likely to be felt as annoying.
- In this case, the motion effect determination unit 35 determines, in accordance with the motion effect determination table 50 shown in FIG. 5, to add the effect of enlarged display in addition to the "horizontal vibration" motion effect to the symbol image 20 (S411). The magnification at that time may be fixed in advance or may be calculated separately by some method.
- As a result, the display is easily noticed by the driver and is less likely to be felt as annoying.
- The motion effect determination unit 35 may also apply only the motion effect of horizontal vibration to the symbol image (S412).
- The display image generation unit 36 generates a display image in which the motion effect determined by the motion effect determination unit 35 is added to the symbol image selected by the symbol image selection unit 33 (S409). Then, the display control unit 37 acquires the display image to which the motion effect has been added from the display image generation unit 36 and displays it on the display unit 90 (S406).
- The types of motion effects can be, for example, vertical one-dimensional vibration, horizontal one-dimensional vibration, enlargement, or reduction.
- For attributes such as the amplitude and period of the vibration, predetermined preset values are used, but they can be edited arbitrarily. Any kind of motion effect may be used as long as it is a periodic motion.
- The position change pattern of the vibration may be a sine wave, a rectangular wave, or a triangular wave, or may even be a discontinuous sawtooth waveform in which the display position jumps.
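The periodic position-change patterns listed above can be sketched as follows; the amplitude and period defaults and the function name are arbitrary illustration values, not from the patent:

```python
import math

# Sketch of periodic position-offset patterns for the vibration motion
# effect: sine, rectangular (square), triangular, and sawtooth waves.
def vibration_offset(t, waveform="sine", amplitude=5.0, period=1.0):
    """Displacement (e.g. in pixels) of the display image at time t,
    applied to the x or y coordinate of the symbol image."""
    phase = (t / period) % 1.0  # position within one cycle, in [0, 1)
    if waveform == "sine":
        return amplitude * math.sin(2 * math.pi * phase)
    if waveform == "rectangular":
        return amplitude if phase < 0.5 else -amplitude
    if waveform == "triangular":
        # smooth zig-zag between -amplitude and +amplitude
        return (2 * amplitude / math.pi) * math.asin(math.sin(2 * math.pi * phase))
    if waveform == "sawtooth":
        # ramps from -amplitude to +amplitude, then jumps back
        return amplitude * (2 * phase - 1)
    raise ValueError("unknown waveform: " + waveform)
```

The sawtooth case shows the "discontinuous" behavior mentioned in the text: the offset jumps from +amplitude back to -amplitude at each cycle boundary.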
- FIG. 13 is a diagram illustrating specific examples of the motion effect: (a) shows a sine wave, (b) a rectangular wave, (c) a triangular wave, and (d) and (e) sawtooth waveforms.
- As with the vibration, the change in width in the enlargement or reduction direction may follow a sine wave, the repeated positive part of a sine wave value, a rectangular wave, a triangular wave, or a sawtooth waveform.
- FIG. 21 is a diagram illustrating an example of a movement pattern for enlarging or reducing the symbol image 20 vertically or horizontally.
- FIG. 21(a) is a diagram showing an example of a motion pattern that enlarges or reduces the image following a sine wave.
- FIG. 21(b) is a diagram showing an example of a motion pattern that repeats enlargement or reduction following the positive part of the sine wave.
- FIG. 21(c) is a diagram showing an example of a motion pattern that enlarges or reduces the image following a rectangular wave.
- FIG. 21(d) is a diagram showing an example of a motion pattern that enlarges or reduces the image following a triangular wave.
- FIG. 21(e) is a diagram showing an example of a motion pattern that enlarges or reduces the image following a sawtooth waveform.
- FIG. 21(f) is a diagram showing an example of a motion pattern that enlarges or reduces the image following another sawtooth waveform.
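The periodic patterns described above (sine, rectangular, triangular, sawtooth) can all be expressed as one displacement function of time. The following Python sketch is our illustration only, not the patent's implementation; the function name and default attribute values are assumptions.

```python
import math

def displacement(t, waveform, amplitude=1.0, period=1.0):
    """Offset of the symbol image at time t for a periodic motion effect.

    Waveform names follow the patterns of FIG. 13. Amplitude and period
    stand in for the preset attributes that the text says can be edited.
    """
    phase = (t % period) / period  # normalized phase in [0, 1)
    if waveform == "sine":
        return amplitude * math.sin(2.0 * math.pi * phase)
    if waveform == "rectangular":
        return amplitude if phase < 0.5 else -amplitude
    if waveform == "triangular":
        # linear ramp -A -> +A -> -A over one period
        return amplitude * (1.0 - 4.0 * abs(phase - 0.5))
    if waveform == "sawtooth":
        # discontinuous jump back once per period
        return amplitude * (2.0 * phase - 1.0)
    raise ValueError("unknown waveform: %s" % waveform)
```

The same function serves both the position-vibration effects (offset added to the display position) and the enlargement/reduction effects (offset added to the drawn width or height).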
- Alternatively, a display method may be used in which the image appears to the driver to rotate vertically or horizontally. Specifically, for example, if the image changes as shown in FIGS. 23(a) to 23(m), the bicycle symbol image appears to rotate horizontally about its vertical center line.
- FIG. 23 is a diagram illustrating an example of an image change in which a symbol image appears to rotate.
- FIGS. 23(a) to 23(m) respectively show a series of changing frame images.
- FIG. 22 is a diagram illustrating an example of a vertical or horizontal motion pattern of an image that appears to rotate the symbol image.
- FIGS. 22(a) to 22(e) are graphs in which the horizontal axis represents time and the vertical axis represents the width of the display image in the rotation direction; they show examples in which the width is changed according to a sine wave, a rectangular wave, a triangular wave, or a sawtooth waveform.
- FIG. 22(a) is a diagram showing an example of a pattern in which the image is rotated following a sine wave.
- FIG. 22(b) is a diagram showing an example of a pattern in which the image is rotated following a rectangular wave.
- FIG. 22(c) is a diagram showing an example of a pattern in which the image is rotated following a triangular wave.
- FIG. 22(d) is a diagram showing an example of a pattern in which the image is rotated following a sawtooth waveform.
- FIG. 22(e) is a diagram showing an example of a pattern in which the image is rotated following another sawtooth waveform. Note that a minus sign on the vertical axis of the graph indicates that the display image is drawn inverted with respect to the original symbol image.
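One way to realize the apparent rotation of FIGS. 22 and 23 is to modulate the drawn width of the flat image in the rotation direction; a cosine modulation yields the sine-wave pattern of FIG. 22(a). This is a sketch under that assumption, not the patent's stated implementation.

```python
import math

def apparent_width(t, full_width, period=1.0):
    """Drawn width of the symbol image in the rotation direction at time t.

    A flat image whose width follows a cosine appears to spin about its
    vertical center line. A negative return value corresponds to the
    minus sign on the graph's vertical axis: the image is drawn mirrored.
    """
    return full_width * math.cos(2.0 * math.pi * t / period)
```

At t = period/4 the width passes through zero (the image is seen edge-on), and for period/4 < t < 3·period/4 the mirrored back face is shown.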
- the presence of the pedestrian 1 or the like that is the driver's attention target is detected in the predetermined detection range 7 around the host vehicle 5.
- the motion effect on the symbol image is determined by calculating the optical flow of the host vehicle 5.
- the optical flow of the host vehicle 5 is calculated based on the foreground image of the host vehicle 5 taken by the camera 8.
- The determined motion effect is added to the symbol image, and the result is displayed as a display image on the display unit 90 of the host vehicle 5.
- As described above, the information display device 30a of the first embodiment is an information display device, mounted on a vehicle including the camera 8 that captures the foreground, that displays predetermined information to which a motion effect is added. It comprises a display unit that displays the predetermined information, an optical flow calculation unit that acquires a plurality of foreground images captured by the camera 8 and calculates vector information of the foreground optical flow based on the acquired foreground images, and a motion effect determination unit that acquires the optical flow vector information from the optical flow calculation unit and determines the motion effect of the predetermined information based on the acquired vector information. This makes it possible to realize an information display that is not affected by the traveling state of the vehicle, is easily noticed by the driver, and is less troublesome.
- The sensors 6a to 6d may simply have a function of detecting that something is present, without distinguishing whether it is a person or a specific object. In that case, a predetermined image may be displayed without performing the selection procedure by the symbol image selection unit.
- the information display device 30a of the first embodiment can calculate the optical flow of the foreground of the vehicle based on the foreground image of the vehicle that has been taken continuously in time.
- the driver can easily notice the display image.
- Since the added motion effect is periodic, the trajectory of the display image is easy to predict, and the time from when the display image is noticed until the content of the symbol image is understood can be shortened.
- In the present embodiment, the motion effect to be added to the symbol image is determined based on the difference between the magnitudes of the vertical and horizontal components of the optical flow; however, it may instead be determined based on the ratio of those magnitudes.
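The ratio-based alternative just described might look like the following sketch; the threshold value, epsilon guard, and function names are our assumptions.

```python
def vibration_direction_by_ratio(u, v, ratio_threshold=1.0, eps=1e-9):
    """Choose the vibration axis from the ratio of the optical flow's
    vertical magnitude |v| to its horizontal magnitude |u|.

    A ratio above the threshold means the flow is predominantly
    vertical, so the symbol image is vibrated horizontally
    (perpendicular to the flow), and vice versa.
    """
    ratio = abs(v) / (abs(u) + eps)  # eps avoids division by zero
    return "horizontal" if ratio > ratio_threshold else "vertical"
```

With ratio_threshold = 1.0 this reduces to the same decision as comparing the two magnitudes directly, but the threshold can be biased to favor one axis.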
- FIG. 14 is a system configuration diagram showing the configuration of an information display system 10b including an information display device 30b according to the second embodiment. In FIG. 14, the same reference numerals are used for the same components as in the first embodiment.
- In the second embodiment, constituent elements different from those in the first embodiment and their operations will be described; description of the same contents is omitted.
- The differences from the information display system 10a in the first embodiment are that the speed sensor 14 and the yaw rate sensor 15, instead of the camera 8, are connected to the optical flow calculation unit 34b, and the optical flow calculation process of the optical flow calculation unit 34b.
- the speed sensor 14 always detects the speed of the host vehicle 5 and outputs speed information related to the detected speed to the optical flow calculation unit 34b.
- the yaw rate sensor 15 always detects the rotational speed of the host vehicle 5 and outputs rotational speed information related to the detected rotational speed to the optical flow calculation unit 34b.
- FIG. 15 is a diagram for explaining an optical flow calculation process calculated based on the speed information and the rotation speed information output by the speed sensor 14 and the yaw rate sensor 15, respectively. Although it is the own vehicle 5 that actually moves, the foreground appears to move when viewed from the driver. Here, the coordinate system is based on the driver.
- FIG. 15(b) is a diagram illustrating a state in which the relative movement of the foreground with respect to the virtual camera is decomposed into components of translational movement and rotational movement.
- FIG. 15(c) is a diagram showing the optical flow vector on the projection plane.
- To the driver, an arbitrary area 156 in the foreground appears to have moved from point P(X0, Y0, Z0) to point Q(X1, Y1, Z1).
- the movement from point P to point Q at that time is represented by a movement vector 152.
- a vector 153 is a linear component due to a linear motion when viewed from the viewpoint of the virtual camera 151, and a vector 154 is a rotational component due to a rotational motion.
- the vector 155 is a projection of the movement vector 152 on the image, that is, an optical flow vector.
- A case will be described in which an arbitrary area 156 in the foreground located at the coordinates of point P(X0, Y0, Z0) moves to point Q(X1, Y1, Z1) after a predetermined time Δt.
- the optical flow is calculated based on the image projected on the imaging surface of the virtual camera 151.
- Let the speed of the host vehicle 5 at the image capturing time of the virtual camera 151 be a, and the rotational speed, which is an angular velocity, be ω.
- By substituting Equations (9) to (11) into Equations (5) and (6), the optical flow vector 155 can be calculated.
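Since Equations (5), (6), and (9) to (11) are not reproduced in this excerpt, the following sketch uses the standard pinhole ego-motion model to show how a flow vector can be obtained from the speed a and yaw rate ω alone; the depth of the foreground point and the focal length f are assumed inputs, not values from the patent.

```python
def ego_motion_flow(x, y, depth, speed, yaw_rate, f=1.0):
    """Optical flow (u, v) at image point (x, y) for a static foreground
    point at distance `depth`, induced by the vehicle's forward speed
    and yaw rate.

    Derived from the pinhole projection of a scene point that, relative
    to the (virtual) camera, translates at -speed along the optical axis
    and rotates at -yaw_rate about the vertical axis.
    """
    u = -f * yaw_rate + x * speed / depth - (x * x / f) * yaw_rate
    v = y * speed / depth - (x * y / f) * yaw_rate
    return u, v
```

With yaw_rate = 0 the flow radiates from the image center (straight travel: large vertical components off-axis), and with speed = 0 the flow on the horizon line y = 0 is purely horizontal, matching the turning case discussed in the text.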
- the presence of the pedestrian 1 or the like that is a driver's attention target is detected in the predetermined detection range 7 around the host vehicle 5.
- the motion effect on the symbol image 20 is determined by calculating the optical flow of the host vehicle 5.
- the optical flow of the host vehicle 5 is calculated based on the values output by the speed sensor 14 and the yaw rate sensor 15.
- The determined motion effect is added to the symbol image 20, and the result is displayed as a display image on the display unit 90 of the host vehicle 5.
- Thereby, the information display device 30b of the second embodiment can realize information display that is not affected by the traveling state of the vehicle, is easily noticed by the driver, and is less annoying.
- Since the optical flow of the foreground of the vehicle is calculated based on the speed information and the rotational speed information output by the speed sensor 14 and the yaw rate sensor 15 provided in the vehicle, the same effect can be obtained without using the camera 8.
- the driver can easily predict the trajectory of the display image. As a result, the time from noticing the display of the image to understanding the contents of the symbol image is shortened.
- the own vehicle 5 is equipped with one or a plurality of sensors 6a to 6d.
- the number of mounted sensors 6a to 6d is not limited to four as shown in FIG.
- The sensors 6a to 6d detect the presence of a person or an object in the predetermined detection range 7 around the host vehicle 5 and generate detection information based on this detection.
- The detection information includes, for example, information indicating the presence of the pedestrian 1 as shown in FIG. 1.
- Each of the sensors 6a to 6d outputs the generated detection information to the in-vehicle information display device.
- a camera 8 is mounted on the host vehicle 5. As shown in FIG. 1, the camera 8 captures a foreground in a predetermined shooting range 9 in front of the traveling direction of the host vehicle 5. The camera 8 outputs the taken foreground image to the in-vehicle information display device.
- The display area 21 serving as the display unit 90 is disposed in the host vehicle 5 to the lower right of the room mirror 23 as viewed from the driver and below the windshield 22.
- The in-vehicle information display device displays the symbol image 20 of the detected pedestrian 1 in the display area 21 as in-vehicle information that needs to be notified to the driver.
- FIG. 26 is a system configuration diagram showing a configuration of an in-vehicle information display system 10c including an in-vehicle information display device 30c that is an information display device according to the third embodiment.
- the in-vehicle information display system 10c according to the third embodiment includes sensors 6a to 6d, a camera 8, a display unit 90, and an in-vehicle information display device 30c.
- the in-vehicle information display device 30c is connected to the sensors 6a to 6d and the camera 8, respectively.
- FIG. 27 is a block diagram showing the internal configuration of the in-vehicle information display device 30c, which is the information display device according to the third embodiment.
- As shown in FIG. 27, the in-vehicle information display device 30c includes a storage unit 31c, a display target detection unit 2601, a display image selection unit 2602, an optical flow calculation unit 34c, a motion effect determination unit 35, an image generation unit 2603, and a display control unit 37.
- The display target detection unit 2601 corresponds to the attention object detection unit 32 in the first embodiment,
- the display image selection unit 2602 corresponds to the symbol image selection unit 33 in the first embodiment, and
- the image generation unit 2603 corresponds to the display image generation unit 36 in the first embodiment.
- Sensors 6a to 6d detect the presence of a person or an object in a predetermined detection range 7 around the host vehicle 5.
- The sensors 6a to 6d output detection information including information on the detected person or object to the display target detection unit 2601.
- the camera 8 captures a foreground in a predetermined capturing range 9 in front of the traveling direction of the host vehicle 5.
- the camera 8 outputs the captured foreground image to the optical flow calculation unit 34c.
- the display unit 90 is, for example, a partial display area 21 of the windshield 22 of the host vehicle 5.
- An HUD is used for the display unit 90 of the in-vehicle information display system 10c of the third embodiment.
- an image of information that is generated by the in-vehicle information display device 30c and needs to be notified to the driver of the host vehicle 5 is displayed.
- information that needs to be notified to the driver of the host vehicle 5 is defined as “in-vehicle information”
- an image indicating the in-vehicle information is defined as “in-vehicle information image”.
- The storage unit 31c stores at least target in-vehicle information 2800, which includes the types and images of in-vehicle information, and motion effect condition information 2900, which indicates conditions for giving a predetermined motion effect to the in-vehicle information image.
- The target in-vehicle information 2800 will be described with reference to FIG. 28, and the motion effect condition information 2900 will be described with reference to FIG. 29.
- FIG. 28 is a diagram illustrating an example of the target in-vehicle information 2800.
- FIG. 29 is a diagram showing an example of the motion effect condition information 2900.
- the information stored in the storage unit 31c is an example, and may be information stored in the storage unit 31 in the first embodiment.
- The target in-vehicle information 2800 in the present embodiment corresponds to the symbol image selection table 40 in the first embodiment, and the motion effect condition information 2900 in the present embodiment corresponds to the motion effect determination table 50 in the first embodiment.
- The target in-vehicle information 2800 associates an ID for identifying each entry, the target type, and an in-vehicle information image.
- the in-vehicle information type is “pedestrian” and “pedestrian image” is assigned as the in-vehicle information image.
- the in-vehicle information type is “bicycle” and “bicycle image” is assigned as the in-vehicle information image.
- the in-vehicle information type is “vehicle” and “vehicle image” is assigned as the in-vehicle information image. Note that the content of the target in-vehicle information 2800 is not limited to these.
- The motion effect condition information 2900 associates an ID for identifying each entry, an optical flow condition calculated by the optical flow calculation unit 34c described later, and the motion effect to be applied.
- The motion effect condition information 2900 of ID “001” indicates that no motion effect is given when the optical flow condition is “the magnitude of the optical flow vector < α”.
- The parameter α is a threshold value that determines whether a motion effect should be applied.
- In the motion effect condition information 2900 of ID “002”, when the optical flow condition is “the magnitude of the horizontal component of the optical flow vector is greater than the magnitude of the vertical component of the optical flow vector”, the motion effect of longitudinal vibration is given.
- In the motion effect condition information 2900 of ID “003”, when the optical flow condition is “the magnitude of the horizontal component of the optical flow is equal to or less than the magnitude of the vertical component of the optical flow”, the motion effect of lateral vibration is given.
- The contents of the motion effect condition information 2900 are not limited to these.
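As a sketch, the three conditions of FIG. 29 reduce to a small decision function; the threshold name alpha and the return labels are our naming, not the patent's.

```python
import math

def choose_motion_effect(u, v, alpha=1.0):
    """Apply the three rules of the motion effect condition information
    2900 to an optical flow vector (u, v).

    ID 001: |flow| < alpha         -> no effect (stopped / very slow)
    ID 002: |u| > |v| (turning)    -> longitudinal (vertical) vibration
    ID 003: |u| <= |v| (straight)  -> lateral (horizontal) vibration
    """
    if math.hypot(u, v) < alpha:
        return "none"
    if abs(u) > abs(v):
        return "longitudinal_vibration"
    return "lateral_vibration"
```

In each case the chosen vibration axis is roughly perpendicular to the dominant flow component, which is what makes the display easy to notice.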
- FIG. 30 is a flowchart for explaining the operation of the in-vehicle information display device 30c which is the information display device according to the third embodiment.
- FIG. 7 is a diagram showing the positional relationship between the host vehicle 5 and the pedestrian 1 before the pedestrian 1 is detected by the sensors 6a to 6d of the host vehicle 5 as described in the first embodiment.
- FIG. 8 is a diagram showing the positional relationship between the host vehicle 5 and the pedestrian 1 after the pedestrian 1 is detected by the sensors 6a to 6d of the host vehicle 5, as described in the first embodiment.
- FIG. 8 shows a state in which the driver is alerted to the presence of the pedestrian 1 approaching when the traveling vehicle 5 turns right at the intersection. The pedestrian 1 is detected by the sensors 6a to 6d of the host vehicle 5.
- The display target detection unit 2601 acquires the detection information output by the sensors 6a to 6d.
- The display target detection unit 2601 determines whether the target in-vehicle information 2800 illustrated in FIG. 28 is included in the acquired detection information (S3001). Specifically, the display target detection unit 2601 compares the target in-vehicle information 2800 stored in the storage unit 31c with the detection information acquired from the sensors 6a to 6d, and determines whether the target type of the target in-vehicle information 2800 shown in FIG. 28 is included in the detection information.
- When the target type of the target in-vehicle information 2800 is included in the detection information (YES in S3001), the display target detection unit 2601 outputs information indicating that the target type of the target in-vehicle information 2800 is included in the detection information to the display image selection unit 2602.
- The display image selection unit 2602 acquires, from the display target detection unit 2601, the information indicating that the target type of the target in-vehicle information 2800 is included in the detection information, and selects the in-vehicle information image corresponding to that target type (S3002). The display image selection unit 2602 outputs the fact that the in-vehicle information image has been selected to the image generation unit 2603.
- FIG. 9 is a diagram showing how the optical flow is calculated as described in the first embodiment.
- FIG. 9(b) is a drawing showing the decomposition of a vector into x and y components.
- FIG. 11 relates to the optical flow when the vehicle is traveling straight, as described in the first embodiment; FIG. 11(a) is a drawing showing an image taken by the camera 8 placed in the vehicle.
- FIGS. 12(a) and 12(b) relate to the optical flow when the vehicle turns to the right.
- FIG. 12(a) is a diagram showing an image taken by the camera 8 placed in the vehicle, and
- FIG. 12(b) is a drawing showing the decomposition of the optical flow vector into x and y components.
- The camera 8 continuously captures the foreground in the predetermined shooting range 9 ahead in the traveling direction of the host vehicle 5, and outputs the captured foreground images to the optical flow calculation unit 34c.
- the optical flow calculation unit 34 c calculates the optical flow in the image based on the temporal change of the foreground image output by the camera 8.
- the optical flow of the foreground image captured by the camera 8 is used as a representative value of the optical flow.
- Since the optical flow vector differs at each point in the image taken by the camera 8, the optical flow in the display area 21 of the in-vehicle information image that is visible to the driver is particularly important. Therefore, as a representative value of the optical flow, the average of several optical flows in the display area 21 may be used, or the average of local optical flows in the vicinity where the image is actually displayed in the display area 21 may be used.
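The representative-value choices just described (averaging over the display area, or over a local neighborhood of the drawn image) can be sketched with NumPy; the array layout is an assumption.

```python
import numpy as np

def representative_flow(flow_field, region):
    """Average the optical-flow vectors inside a rectangular region.

    flow_field: (H, W, 2) array holding one (u, v) vector per pixel.
    region: (y0, y1, x0, x1) bounds of the display area 21, or of the
    neighborhood where the in-vehicle information image is drawn.
    Returns the mean (u, v) used as the representative optical flow.
    """
    y0, y1, x0, x1 = region
    return flow_field[y0:y1, x0:x1].reshape(-1, 2).mean(axis=0)
```

Passing a smaller region around the drawn image gives the local-average variant; passing the whole display area gives the area-average variant.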
- the optical flow calculation unit 34c estimates the same point in the image frame of the two foreground images output by the camera 8, and derives the movement vector of the point as an optical flow vector (S3003).
- a method for estimating the same point between image frames methods such as a gradient method and a block matching method are used.
- Taking the x-axis in the horizontal direction of the foreground image and the y-axis in the vertical direction, and assuming that the position and brightness of the same point in the image change smoothly with respect to time, the optical flow is obtained as follows.
- Let E(x, y, t) be the luminance at time t at point (x, y) in the foreground image, let u be the x component (horizontal component) of the optical flow vector in the image, and let v be the y component (vertical component); then Equation (1) shown in the first embodiment holds.
- The constraint equation of Equation (2) shown in the first embodiment is derived from Equation (1), and the least-squares solution of the constraint equations in the neighborhood of pixel (x, y) in the image is calculated as the optical flow vector.
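A minimal gradient-method sketch of this least-squares step, with finite-difference gradients standing in for Equations (1) and (2); this is illustrative code, not the patent's implementation.

```python
import numpy as np

def gradient_method_flow(frame0, frame1, y, x, win=2):
    """Least-squares optical flow (u, v) at pixel (x, y).

    Accumulates the brightness-constancy constraint Ex*u + Ey*v = -Et
    over a (2*win+1)^2 window around the pixel and solves it in the
    least-squares sense, as the text describes. frame0 and frame1 are
    consecutive grayscale frames as float arrays.
    """
    Ex = (np.roll(frame0, -1, axis=1) - np.roll(frame0, 1, axis=1)) / 2.0
    Ey = (np.roll(frame0, -1, axis=0) - np.roll(frame0, 1, axis=0)) / 2.0
    Et = frame1 - frame0
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    A = np.stack([Ex[sl].ravel(), Ey[sl].ravel()], axis=1)
    b = -Et[sl].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

The block matching method mentioned in the text would instead search for the window position in frame1 that best matches the window in frame0; the gradient method above is accurate only for small displacements.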
- Assuming that the road surface is a plane and the height of the camera 8 from the road surface is h, and letting the projections of points P′ and Q′ on the road surface onto the image be P(x0, y0) and Q(x1, y1) respectively, the relationships of Equations (3) and (4) shown in the first embodiment hold.
- The optical flow calculation method is not limited to the method described above.
- The optical flow calculation unit 34c outputs the optical flow derived by the above-described method to the motion effect determination unit 35.
- The motion effect determination unit 35 acquires the optical flow output by the optical flow calculation unit 34c.
- The motion effect determination unit 35 refers to the motion effect condition information 2900 stored in the storage unit 31c and sets a motion effect according to the acquired optical flow (S3004 to S3008). For example, when the magnitude of the optical flow vector is less than the predetermined value α, such as when the host vehicle 5 is stopped or traveling at a low speed (NO in S3004), the motion effect determination unit 35 considers that there is no optical flow and determines not to give a motion effect to the in-vehicle information image selected by the display image selection unit 2602 (S3005).
- This predetermined value is the parameter α in the motion effect condition information 2900 shown in FIG. 29.
- When the magnitude of the horizontal component of the optical flow is equal to or less than the magnitude of the vertical component, the motion effect determination unit 35 determines, in accordance with the motion effect condition information 2900 shown in FIG. 29, to give the “lateral vibration” motion effect to the in-vehicle information image (S3007).
- the motion effect of this “lateral vibration” is a motion effect mainly composed of horizontal components.
- Since the “lateral vibration” motion effect moves the in-vehicle information image in a direction substantially perpendicular to the optical flow, which has a large vertical component, adding it to the image makes the in-vehicle information easy to notice without making the driver feel annoyed.
- When the host vehicle 5 is traveling in a motion close to turning, the magnitude of the horizontal component of the optical flow exceeds that of the vertical component, and the motion effect determination unit 35 determines, in accordance with the motion effect condition information 2900 shown in FIG. 29, to give the “longitudinal vibration” motion effect to the in-vehicle information image (S3008).
- This “longitudinal vibration” motion effect is a motion effect mainly composed of vertical components.
- Since the in-vehicle information image vibrates in a direction substantially perpendicular to the optical flow, which has a large horizontal component, the in-vehicle information can be displayed so that it is easy to notice without annoying the driver.
- The image generation unit 2603 generates an in-vehicle information image in which the motion effect determined by the motion effect determination unit 35 is added to the in-vehicle information image selected by the display image selection unit 2602 (S3009).
- the image generation unit 2603 outputs the generated in-vehicle information image to the display control unit 37.
- the display control unit 37 acquires the in-vehicle information image output by the image generation unit 2603, and controls the display unit 90 to display the acquired in-vehicle information image (S3010).
- the types of motion effects can be, for example, vertical one-dimensional vibration, horizontal one-dimensional vibration, enlargement, or reduction.
- For attributes such as the amplitude and period of vibration, preset values are used, but they can be edited arbitrarily. Any kind of motion effect may be used as long as it is periodic.
- the pattern of change in position or shape may be a sine wave, a rectangular wave, a triangular wave, or a sawtooth waveform in which the movement of the display position is discontinuous.
- FIG. 31 is a diagram illustrating specific examples of the motion effect.
- FIG. 31(a) shows a sine wave.
- FIG. 31(b) shows a rectangular wave.
- FIG. 31(c) shows a triangular wave.
- FIG. 31(d) shows a sawtooth waveform.
- Alternatively, the waveforms shown in FIGS. 13(a) to 13(e) in the first embodiment may be used.
- As described above, in the third embodiment, the presence of the pedestrian 1 or the like, which is a target of the in-vehicle information image, is detected in the predetermined detection range 7 around the host vehicle 5.
- When the detection information generated by this detection includes a target of the in-vehicle information image, the optical flow of the host vehicle 5 is calculated to determine the motion effect on the in-vehicle information image.
- The optical flow of the host vehicle 5 is calculated based on the foreground image in the traveling direction of the host vehicle 5 taken by the camera 8.
- The in-vehicle information image to which the determined motion effect has been added is displayed on the display unit 90 of the host vehicle 5.
- Thereby, the in-vehicle information display device 30c, which is the information display device according to the third embodiment, can display the in-vehicle information without being affected by the traveling state of the vehicle and without annoying the driver. Further, the in-vehicle information display device 30c of the third embodiment can determine with high accuracy, based on the preset target in-vehicle information 2800, whether a target of in-vehicle information that needs to be notified to the driver is present around the vehicle. Further, the in-vehicle information display device 30c of the third embodiment can determine whether the vehicle is traveling straight or rotating, such as turning, based on the magnitudes of the horizontal and vertical components of the derived optical flow.
- the in-vehicle information display device 30c of the third embodiment can calculate the optical flow in the traveling direction of the vehicle based on continuously captured images in the traveling direction of the vehicle.
- Since the motion effect is a periodic motion and the in-vehicle information image periodically moves in a direction intersecting the main component of the derived optical flow vector, the driver can more easily predict the trajectory of the in-vehicle information image and can easily understand the display contents after noticing the display.
- In addition, since the in-vehicle information image periodically moves in a direction intersecting the main component of the derived optical flow vector, the amount of line-of-sight movement required after the driver notices the display of the in-vehicle information image is reduced, and the display contents can be easily understood.
- FIG. 32 is a system configuration diagram showing the configuration of an in-vehicle information display system 10d including an in-vehicle information display device 30d, which is the information display device according to the fourth embodiment. In FIG. 32, the same reference numerals are used for the same components as in the third embodiment.
- In the fourth embodiment, constituent elements different from those in the third embodiment and their operations will be described; description of the same contents is omitted.
- FIG. 33 is a block diagram showing an internal configuration of an in-vehicle information display device 30d which is an information display device according to the fourth embodiment.
- the speed sensor 14 and the yaw rate sensor 15 are the same as the speed sensor 14 and the yaw rate sensor 15 described in the second embodiment.
- the speed sensor 14 always detects the speed of the host vehicle 5 and outputs speed information related to the detected speed to the optical flow calculation unit 34d.
- the yaw rate sensor 15 always detects the rotational speed of the host vehicle 5 and outputs rotational speed information related to the detected rotational speed to the optical flow calculation unit 34d.
- FIG. 15 is a diagram for explaining an optical flow calculation process derived based on speed information and rotational speed information output by the speed sensor 14 and the yaw rate sensor 15, respectively.
- reference numeral 151 denotes a virtual camera
- reference numeral 152 denotes the movement vector when an arbitrary area 156 of the foreground moves from point P(X0, Y0, Z0) to point Q(X1, Y1, Z1).
- reference numeral 155 denotes an optical flow vector of an image in which the movement of an arbitrary area 156 in the foreground is projected by the virtual camera 151.
- A case will be described in which an arbitrary area 156 in the foreground located at the coordinates of point P(X0, Y0, Z0) moves to point Q(X1, Y1, Z1) after a predetermined time Δt.
- Let the speed of the host vehicle 5 at the image capturing time of the virtual camera 151 be a, and the rotational speed, which is an angular velocity, be ω.
- an arbitrary region 156 located at an arbitrary point P (X 0 , Y 0 , Z 0 ) in the front field of view moves to an arbitrary point Q (X 1 , Y 1 , Z 1 ) after time ⁇ t.
- the movement vector 152 from the point P to the point Q can be regarded as the sum of the linear component 153 in the velocity direction and the rotation component 154 centered on the virtual camera 151. For this reason, between the point P and the point Q, Formula (9), Formula (10), and Formula (11) shown in Embodiment 2 hold.
- By substituting Equations (9) to (11) into Equations (5) and (6), the optical flow vector 155 can be calculated. The motion effect on the in-vehicle information image can then be determined from the optical flow vector 155 calculated in this manner, as in the third embodiment.
- As described above, in the fourth embodiment, the presence of the pedestrian 1 or the like, which is a target of the in-vehicle information image, is detected in the predetermined detection range 7 around the host vehicle 5.
- When the detection information generated by this detection includes a target of the in-vehicle information image, the optical flow of the host vehicle 5 is calculated to determine the motion effect on the in-vehicle information image.
- The optical flow of the host vehicle 5 is derived based on the values output by the speed sensor 14 and the yaw rate sensor 15.
- The in-vehicle information image to which the determined motion effect has been added is displayed on the display unit 90 of the host vehicle 5.
- Thereby, the in-vehicle information display device 30d, which is the information display device according to the fourth embodiment, can display the in-vehicle information without being affected by the traveling state of the vehicle and without annoying the driver. Further, the in-vehicle information display device 30d of the fourth embodiment can determine with high accuracy, based on the preset target in-vehicle information 2800, whether a target of in-vehicle information that needs to be notified to the driver is present around the vehicle. In addition, it is possible to determine with high accuracy whether the vehicle is traveling straight or rotating based on the absolute values of the derived horizontal and vertical components of the optical flow.
- the optical flow in the traveling direction of the vehicle can be calculated based on the speed information and the rotational speed information output by the speed sensor 14 and the yaw rate sensor 15 provided in the vehicle.
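As a rough sketch of this calculation: under a pinhole-camera model, the flow at an image point of a road-surface point is the sum of a straight-ahead (expansion) term driven by the speed and a rotation term driven by the yaw rate. The focal length `f`, the camera height `h`, and the ground-plane recovery of depth are illustrative assumptions, not the patent's Formulas (9)–(11) themselves.

```python
import math

def optical_flow_on_road(x, y, v, omega, f=800.0):
    """Optical flow (pixels/s) at image point (x, y) of a road-surface
    point, for forward speed v [m/s] and yaw rate omega [rad/s].

    Assumes a pinhole camera with focal length f [px] whose optical
    axis is parallel to the road surface, as in the embodiments; the
    depth Z of the road point follows from the assumed camera height
    via y = f * h / Z.
    """
    h = 1.2          # assumed camera height above the road [m]
    if y <= 0:
        raise ValueError("road points project below the horizon (y > 0)")
    Z = f * h / y    # depth recovered from the ground-plane constraint
    # Straight-ahead (translational) component: expansion away from the
    # focus of expansion at the image center
    u_t = x * v / Z
    v_t = y * v / Z
    # Rotation component induced by the yaw rate (rotation about the
    # vertical axis of the virtual camera)
    u_r = -omega * (f + x * x / f)
    v_r = -omega * x * y / f
    return (u_t + u_r, v_t + v_r)
```

During straight travel (`omega = 0`) the flow at the image center column is purely vertical, while a nonzero yaw rate adds a horizontal component, which matches the straight/turning distinction drawn from the component magnitudes above.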
- the driver can easily predict the trajectory on which the image of the in-vehicle information is displayed, and can easily understand the display content even after noticing the display of the image.
- since the in-vehicle information image periodically moves in a direction intersecting the main component of the derived optical flow vector, the driver can more easily predict the trajectory along which the in-vehicle information image is displayed, and can easily understand the display contents even after noticing the display.
- since the in-vehicle information image periodically moves in a direction intersecting the main component of the derived optical flow vector, the driver's line-of-sight movement after noticing the in-vehicle information image is reduced, and the display contents can be easily understood.
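The periodic motion in a direction intersecting the optical flow can be sketched as a sinusoidal offset along the unit vector perpendicular to the flow vector. The amplitude and period values below are illustrative assumptions; the text leaves the concrete values to the implementation.

```python
import math

def vibration_offset(flow, t, amplitude=5.0, period=1.0):
    """Offset (dx, dy) [px] of the symbol image at time t [s],
    vibrating perpendicular to the optical flow vector flow = (fx, fy).
    """
    fx, fy = flow
    norm = math.hypot(fx, fy)
    if norm == 0.0:
        return (0.0, 0.0)          # no flow: leave the image static
    # Unit vector perpendicular to the flow (90-degree rotation)
    px, py = -fy / norm, fx / norm
    s = amplitude * math.sin(2.0 * math.pi * t / period)
    return (px * s, py * s)
```

Because the offset is always orthogonal to the flow, the vibration stays distinguishable from the background motion regardless of whether the vehicle is traveling straight or turning.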
- an optical flow vector can be calculated without an image captured by the camera 8. Therefore, a table can be prepared that associates each combination of the output value of the speed sensor 14 and the output value of the yaw rate sensor 15 with the x component (horizontal component) and the y component (vertical component) of the optical flow vector and the angle they form, and the optimal movement direction of the motion effect on the symbol image can be determined based on this table.
- alternatively, the optimum movement direction of the motion effect on the symbol image may be determined based on the table of FIG. 16B, which includes the position coordinates as parameters.
- FIG. 16 is a diagram illustrating an example of a table for obtaining an angle formed by a horizontal component and a vertical component of an optical flow vector when a virtual camera is used.
- FIG. 16A shows a table in which the vehicle speed and angular velocity are associated with the angle formed by the horizontal and vertical components of the optical flow vector.
- FIG. 16B shows a table in which the position coordinates of the road surface overlapped by the HUD superimposition region, together with the vehicle speed and angular velocity, are associated with the angle formed by the horizontal and vertical components of the optical flow vector.
- FIG. 24 is a diagram illustrating a system configuration of an information display system that calculates the eyeball position of the driver and calculates an optical flow according to the viewpoint of the driver.
- the driver's eyeball position can be calculated using a known positioning technique with a driver-facing in-cabin camera. Techniques for converting the viewpoint of an image are also known. Therefore, it is possible to convert the image captured by the camera 8 into an image from the driver's viewpoint, and to calculate the optical flow of the portion of the converted image that overlaps the display area of the HUD.
- the position and angle of the driver's seat may be taken into consideration.
- the background (superimposition region) of the HUD display position as viewed from the driver also changes. Therefore, if the driver's eyeball position for each seat position and angle is recorded in advance as a table, the eyeball position can be determined in conjunction with the seat position and angle, and the optical flow vector can then be calculated at the position on the road surface beyond the HUD display area as viewed from that eyeball position.
- FIG. 17 is a diagram illustrating an example of a table for setting the position coordinates of the road surface where the HUD overlap region overlaps.
- the optical flow vector is small when the position on the road surface behind the display position is farther away, and its magnitude increases as that position becomes closer.
- when the vehicle is turning, the direction of the optical flow vector differs from that during straight travel, but the magnitude of the vector is the same.
- the optical axis of the camera 8, including the virtual camera 151, has been described as being parallel to the Z axis. However, the camera 8 may have a depression angle; in that case, the optical flow can be calculated in the same manner from the geometric relationship.
- the sensors 6a to 6d and the camera 8 are always operating while the host vehicle 5 is traveling.
- the operations of the sensors 6a to 6d and the camera 8 may be performed only when the speed is higher than a predetermined speed.
- the predetermined speed is, for example, 5 km / h or more.
- the sensors 6a to 6d and the camera 8 may start operating simultaneously with the start of the engine of the host vehicle 5, or may be set to operate only when the speed satisfies an operation condition preset by the driver.
- the detection information type may be selected only when the distance between the host vehicle 5 and a specific type of object satisfies a predetermined condition.
- the predetermined condition is, for example, “the distance between the host vehicle 5 and a specific type of object is 10 [m] or less” or the like.
- right/left turns of the host vehicle 5 can be detected, and the sensors 6a to 6d can then be restricted, for example, to detecting only approaching objects on the right side of the host vehicle 5 when the host vehicle 5 is turning right.
- FIG. 18 is a diagram illustrating an example of a route guidance image.
- the symbol image is vibrated so that the driver can quickly notice the approach of a right/left turn point without finding the display bothersome.
- driving information: instrument information, vehicle abnormality warning information, etc.
- driving support information: traveling environment information, sign information, vision-reinforcing information, blind spot information, obstacle warning information, lane departure warnings, traffic violation warnings, drowsiness warnings, recommended speed information, etc.
- in-car environment information: clock, air conditioner, multimedia information, etc.
- navigation information: route guidance information, economical driving support information, etc.
- the method of determining the motion effect added to the symbol image by the optical flow in either the horizontal direction or the vertical direction has been described.
- the options for the direction of the motion effect to be added may be further subdivided, for example by adding the motion effect to the symbol image in whichever of the four directions (top to bottom, left to right, top right to bottom left, and top left to bottom right, as viewed from the driver) is closest to the optical flow direction.
- when the angle formed by the optical flow vector 132 and the direction of the x component described above is θ, a motion effect may be added in the direction closest to the optical flow vector based on the value of θ.
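Selecting the candidate direction closest to the flow can be sketched as a dot-product comparison. The mapping of the four named directions to unit vectors (image coordinates, x right and y down as seen by the driver) is an illustrative assumption, not the patent's table:

```python
import math

def nearest_motion_direction(fx, fy):
    """Pick, from the four candidate motion directions in the text,
    the one closest to the optical flow vector (fx, fy)."""
    candidates = {
        "top-to-bottom":            (0.0, 1.0),
        "left-to-right":            (1.0, 0.0),
        "top-right-to-bottom-left": (-math.sqrt(0.5), math.sqrt(0.5)),
        "top-left-to-bottom-right": (math.sqrt(0.5), math.sqrt(0.5)),
    }
    norm = math.hypot(fx, fy)
    if norm == 0.0:
        return "top-to-bottom"     # arbitrary default when there is no flow
    # Compare by |cos| so a direction and its reverse count as the same axis
    best = max(candidates,
               key=lambda k: abs((candidates[k][0] * fx +
                                  candidates[k][1] * fy) / norm))
    return best
```

Finer subdivisions, as the text suggests, would simply add more entries to `candidates`.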
- the symbol image has been described as a single image 70 as shown in FIG. 19A. However, as shown in FIG. 19B, the symbol image may be composed of, for example, a first image 71 that conveys the content to be notified to the driver and a second image 72 that enhances the driver's attention and visibility.
- FIG. 19 is a diagram illustrating an example of a display image displayed on the display unit.
- FIG. 19A shows an image consisting of a symbol image only.
- FIG. 19B shows an image composed of a symbol image and a background image.
- if the first image 71 and the second image 72 undergo the same motion, the visibility of the first image 71 is reduced even after the driver notices that it is moving. Therefore, the visibility of the symbol image 70 can be enhanced by having the display control unit 37 add motion only to the second image 72 and no motion effect to the first image 71.
- the optical flow may be estimated based on GPS (Global Positioning System) information, the steering, the turn signals, and the traveling state of the host vehicle 5.
- the visual characteristics of the driver are not taken into consideration, but there are age differences and individual differences in the visual characteristics.
- visual characteristics that vary with age and between individuals include, for example, visual acuity, visual field range, spectral sensitivity, color contrast sensitivity, and luminance contrast sensitivity.
- if the driver's individual visual characteristics can be grasped in advance, displaying an image that matches the driver's condition based on that individual visual characteristic data yields a display better suited to each driver: easier to notice and less bothersome.
- the motion effect added to the symbol image may be changed over time.
- the amplitude and period of the motion are uniform regardless of the speed of the host vehicle 5. However, the size of the symbol image or the motion period may be changed according to the output value of the speed sensor 14; for example, as the speed of the host vehicle 5 increases, the symbol image may be enlarged or the period of the motion effect added to the symbol image may be shortened.
- the vibration of the vehicle body of the host vehicle 5 is not taken into account. However, by detecting the vertical vibration of the vehicle body of the host vehicle 5, it is also possible to increase the vibration amplitude of the symbol image when the vertical vibration of the host vehicle 5 is large.
- the sensors 6a to 6d and the camera 8 have been described as not included in the information display device 30a; however, they may be included in it.
- the speed sensor 14 and the yaw rate sensor 15 have been described as not included in the information display device 30b; however, they may be included in it.
- the display unit 90 has been described as not included in the information display device 30; however, the display unit 90 may be included in each information display device 30.
- the format of the detection information type is text data and the format of the symbol image is image data, but the formats are not limited to these.
- information for accessing another storage unit that stores the detection information type and the symbol image may be stored in the symbol image selection table 40.
- it is also possible to change the size of the display image according to the distance and speed, for example by increasing the display size as the relative distance decreases or as the relative speed increases.
- the information display device 30 can determine the display magnification of the symbol image according to the relative distance between the host vehicle 5 and the pedestrian 1 or the like, and display it accordingly.
- in FIG. 20, when the relative distance between the host vehicle 5 and the pedestrian 1 or the like is less than A [m], the display target image is displayed at 3.0 times its preset size; from A [m] to B [m], at 2.0 times; and from B [m] to C [m], at 1.0 times (normal size).
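The threshold logic of FIG. 20 can be sketched as follows. The thresholds A, B, and C [m] are left unspecified in the text, so they are taken as caller-supplied parameters, and the behavior beyond C is an assumption:

```python
def display_magnification(distance, a, b, c):
    """Display magnification of the symbol image as a function of the
    relative distance [m] to the detection target, following FIG. 20.

    The thresholds a < b < c stand in for the unspecified A, B, C [m].
    """
    if distance < a:
        return 3.0          # very close: triple the preset size
    if distance < b:
        return 2.0          # medium range: double
    if distance < c:
        return 1.0          # normal size
    return 1.0              # beyond C: assumed to stay at normal size
```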
- as the relative distance between the host vehicle 5 and the detection target becomes shorter, a symbol image with a different hue (for example green, then yellow, then red), a symbol image with higher saturation, or a symbol image with higher luminance may be assigned.
- different symbol images may be assigned depending on the value of the TTC (Time To Collision) calculated from the relative distance and relative speed between the host vehicle 5 and the detection target.
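A minimal sketch of the TTC computation, with an illustrative mapping from TTC to a symbol category; the threshold values and the category names are assumptions, not values from the patent:

```python
def time_to_collision(relative_distance, relative_speed):
    """TTC (Time To Collision) [s] from the relative distance [m] and
    closing speed [m/s] to the detection target.

    Returns infinity when the target is not closing, so callers can
    treat "no collision expected" uniformly.
    """
    if relative_speed <= 0.0:
        return float("inf")
    return relative_distance / relative_speed

def symbol_for_ttc(ttc, urgent=2.0, caution=5.0):
    """Illustrative mapping from TTC [s] to a symbol image category;
    the thresholds are assumptions for the sketch."""
    if ttc < urgent:
        return "red"
    if ttc < caution:
        return "yellow"
    return "green"
```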
- the symbol image is selected based on the detected attention object.
- the attention-object portion may be cropped from the acquired camera image, and the cropped attention-object image may be displayed on the HUD instead of the symbol image.
- the display image is superimposed and displayed on the background through the windshield by HUD, but the display image may be displayed on a small non-transparent display installed in front of the driver.
- the information display device and the information display method according to the present invention are useful as a device or method forming part of a safe-driving support system mounted on a vehicle. They can also be used in route guidance applications, for example by being incorporated into a car navigation device and using a route guidance image as the symbol image.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
FIG. 1 is a diagram showing a traveling vehicle equipped with an information display system including the information display device according to the first embodiment. FIG. 2 is a diagram showing the vicinity of the driver's seat inside a vehicle equipped with the information display system including the information display device according to the first embodiment. In the following, the information display system including the information display device is described as mounted on a vehicle such as an automobile; the vehicle is referred to as the "host vehicle 5" and its driver as the "driver."
Next, an information display system including the information display device of the second embodiment will be described with reference to FIG. 14. FIG. 14 is a system configuration diagram showing the configuration of an information display system 10b including the information display device 30b according to the second embodiment. In FIG. 14, the same reference numerals are used for the same components as in FIG. 3.
An information display system including the information display device according to the third embodiment will now be described. The traveling state of a vehicle equipped with an in-vehicle information display system including the in-vehicle information display device as the information display device according to the third embodiment is the same as in FIG. 1 of the first embodiment. Likewise, the interior of the host vehicle 5 equipped with the in-vehicle information display system is the same as in FIG. 2 of the first embodiment. Therefore, the information display system according to the third embodiment will be described with reference to FIGS. 1 and 2.
Next, an in-vehicle information display system including the in-vehicle information display device of the fourth embodiment will be described with reference to FIG. 32. FIG. 32 is a system configuration diagram showing the configuration of an in-vehicle information display system 10d including an in-vehicle information display device 30d, which is the information display device according to the fourth embodiment. In FIG. 32, the same reference numerals are used for the same components as in FIG. 26. In the following, only the components of the fourth embodiment that differ from the third embodiment, and their operation, are described; descriptions of identical content are omitted. The differences from the in-vehicle information display system 10c of the third embodiment are that the speed sensor 14 and the yaw rate sensor 15, instead of the camera 8, are connected to the optical flow calculation unit 34d, and the optical flow calculation process of the optical flow calculation unit 34d. FIG. 33 is a block diagram showing the internal configuration of the in-vehicle information display device 30d, which is the information display device according to the fourth embodiment. The speed sensor 14 and the yaw rate sensor 15 are the same as those described in the second embodiment.
5 Host vehicle
6a–6d Sensors
7 Detection range
8, 2401 Camera
9 Imaging range
10a–10e Information display system
14 Speed sensor
15 Yaw rate sensor
20, 70 Symbol image
21 Display area
22 Windshield
23 Rear-view mirror
30a–30e Information display device
31 Storage unit
32 Attention object detection unit
33 Symbol image selection unit
34a–34e Optical flow calculation unit
35 Motion effect determination unit
36 Display image generation unit
37 Display control unit
40 Symbol image selection table
50 Motion effect determination table
60 Display magnification condition information
71 First image
72 Second image
80 Driver
81 HUD unit
82 Light emitted by the HUD
83 Line of sight
84 Straight line
85 Road surface
86 Superimposition region
87 Eyeball position
90 Display unit
111, 121, 132, 155 Optical flow vector
131 Lens center
133 Movement vector
151 Virtual camera
152 Movement vector
153 Straight-ahead component
154 Rotation component
156 Arbitrary foreground region
2402 Eyeball position calculation unit
2403 Superimposition region calculation unit
2601 Display target detection unit
2602 Display image selection unit
2603 Image generation unit
2800 Target in-vehicle information
2900 Motion effect condition information
Claims (9)
- An information display device that displays predetermined information, mounted on a vehicle provided with a first camera that captures foreground images, the device comprising:
a display unit that displays the predetermined information with motion;
an optical flow calculation unit that acquires a plurality of foreground images captured by the first camera and calculates vector information of the optical flow of the foreground based on the acquired plurality of foreground images; and
a motion effect determination unit that acquires the vector information of the optical flow from the optical flow calculation unit and determines a motion effect for the predetermined information based on the acquired vector information of the optical flow.
- The information display device according to claim 1, wherein the motion effect is a vibration method, and the motion effect determination unit determines the vibration method of the predetermined information based on the vector information of the optical flow.
- The information display device according to claim 1, wherein the motion effect is an expansion/contraction method, and the motion effect determination unit determines the expansion/contraction method of the predetermined information based on the vector information of the optical flow.
- The information display device according to claim 1, further comprising a display image generation unit that acquires information on the motion effect determined by the motion effect determination unit and generates a display image to be displayed on the display unit by adding the information on the motion effect to the predetermined information.
- The information display device according to claim 1, wherein, when the vehicle is provided with a sensor that detects detection targets present in its surroundings, the device further comprises:
a storage unit that stores detection targets that are attention objects;
an attention object detection unit that compares information on a detection target acquired from the sensor with information on detection targets acquired from the storage unit, and determines whether the detection target acquired from the sensor is an attention object; and
a symbol image selection unit that selects a symbol image corresponding to the acquired detection target when the attention object detection unit determines that the detection target acquired from the sensor is an attention object,
wherein the display unit displays the symbol image with motion as the predetermined information.
- The information display device according to claim 1, further comprising a line-of-sight detection unit that, using a second camera that captures an image of the driver of the vehicle, detects the line-of-sight direction of the driver based on the captured image of the driver,
wherein the motion effect determination unit determines the motion effect of the predetermined information based on, among the vector information of the optical flow, the vector information contained in the region where a predetermined region including the line-of-sight direction detected by the line-of-sight detection unit is superimposed on the foreground.
- The information display device according to claim 1, wherein the motion effect determination unit determines the motion effect of the predetermined information based on a comparison between the magnitude of the vertical component and the magnitude of the horizontal component of the acquired vector information of the optical flow.
- An information display device that displays predetermined information, mounted on a vehicle provided with a sensor that detects speed information and rotational speed information, the device comprising:
a display unit that displays the predetermined information with motion;
an optical flow calculation unit that acquires the speed information and the rotational speed information from the sensor and calculates vector information of the optical flow of the foreground based on the acquired speed information and rotational speed information; and
a motion effect determination unit that acquires the vector information of the optical flow from the optical flow calculation unit and determines a motion effect for the predetermined information based on the acquired vector information of the optical flow.
- An information display method for displaying predetermined information in a vehicle provided with a camera that captures foreground images, wherein:
a display unit displays the predetermined information with motion;
an optical flow calculation unit acquires a plurality of foreground images captured by the camera and calculates vector information of the optical flow of the foreground based on the acquired plurality of foreground images; and
a motion effect determination unit acquires the vector information of the optical flow from the optical flow calculation unit and determines a motion effect for the predetermined information based on the acquired vector information of the optical flow.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/255,296 US8536995B2 (en) | 2009-12-10 | 2010-12-08 | Information display apparatus and information display method |
EP10835706.2A EP2511893A4 (en) | 2009-12-10 | 2010-12-08 | Information display apparatus and information display method |
JP2011545089A JP5590684B2 (ja) | 2009-12-10 | 2010-12-08 | 情報表示装置及び情報表示方法 |
CN201080014454.4A CN102378998B (zh) | 2009-12-10 | 2010-12-08 | 信息显示设备和信息显示方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009280559 | 2009-12-10 | ||
JP2009-280559 | 2009-12-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011070783A1 true WO2011070783A1 (ja) | 2011-06-16 |
Family
ID=44145342
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/007156 WO2011070783A1 (ja) | 2009-12-10 | 2010-12-08 | 情報表示装置及び情報表示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8536995B2 (ja) |
EP (1) | EP2511893A4 (ja) |
JP (1) | JP5590684B2 (ja) |
CN (1) | CN102378998B (ja) |
WO (1) | WO2011070783A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015092346A (ja) * | 2014-11-18 | 2015-05-14 | 株式会社デンソー | 表示装置 |
US9063328B2 (en) | 2013-01-28 | 2015-06-23 | Electronics And Telecommunications Research Institute | Head-up display apparatus and method for vehicle |
US9475420B2 (en) | 2012-10-05 | 2016-10-25 | Denso Corporation | Display apparatus |
WO2016199442A1 (ja) * | 2015-06-10 | 2016-12-15 | 株式会社Jvcケンウッド | レーザレーダ装置および検知方法 |
US9576489B2 (en) | 2014-01-23 | 2017-02-21 | Electronics And Telecommunications Research Institute | Apparatus and method for providing safe driving information |
WO2017134876A1 (ja) * | 2016-02-05 | 2017-08-10 | 日立マクセル株式会社 | ヘッドアップディスプレイ装置及びその表示制御方法 |
JP2019109656A (ja) * | 2017-12-18 | 2019-07-04 | 東芝情報システム株式会社 | 軽車両・軽二輪車の情報管理システム |
JP2020013274A (ja) * | 2018-07-17 | 2020-01-23 | 富士通株式会社 | 表示プログラム、表示装置及び表示方法 |
JP2022175877A (ja) * | 2021-05-14 | 2022-11-25 | トヨタ自動車株式会社 | 車両用表示装置、表示方法及びプログラム |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10018703B2 (en) * | 2012-09-13 | 2018-07-10 | Conduent Business Services, Llc | Method for stop sign law enforcement using motion vectors in video streams |
US9230501B1 (en) * | 2012-01-06 | 2016-01-05 | Google Inc. | Device control utilizing optical flow |
JP5774770B2 (ja) * | 2012-03-12 | 2015-09-09 | 本田技研工業株式会社 | 車両周辺監視装置 |
US9975483B1 (en) * | 2013-02-08 | 2018-05-22 | Amazon Technologies, Inc. | Driver assist using smart mobile devices |
US9514650B2 (en) * | 2013-03-13 | 2016-12-06 | Honda Motor Co., Ltd. | System and method for warning a driver of pedestrians and other obstacles when turning |
US9064420B2 (en) * | 2013-03-14 | 2015-06-23 | Honda Motor Co., Ltd. | Augmented reality heads up display (HUD) for yield to pedestrian safety cues |
CN104321682B (zh) * | 2013-03-28 | 2017-09-22 | 松下知识产权经营株式会社 | 图像显示装置 |
TW201500735A (zh) * | 2013-06-18 | 2015-01-01 | Nat Applied Res Laboratories | 行動式影像流速辨識之方法及其裝置 |
JP6032195B2 (ja) * | 2013-12-26 | 2016-11-24 | トヨタ自動車株式会社 | センサ異常検出装置 |
JP6233599B2 (ja) | 2014-02-25 | 2017-11-22 | マツダ株式会社 | 車両用表示制御装置 |
WO2015128985A1 (ja) * | 2014-02-27 | 2015-09-03 | パイオニア株式会社 | 表示装置、制御方法、プログラム、及び記憶媒体 |
JP6273976B2 (ja) * | 2014-03-31 | 2018-02-07 | 株式会社デンソー | 車両用表示制御装置 |
US9495814B2 (en) * | 2014-06-19 | 2016-11-15 | Atieva, Inc. | Vehicle fault early warning system |
US9626811B2 (en) | 2014-06-19 | 2017-04-18 | Atieva, Inc. | Vehicle fault early warning system |
US10309797B2 (en) * | 2014-12-19 | 2019-06-04 | Here Global B.V. | User interface for displaying navigation information in a small display |
US9718405B1 (en) | 2015-03-23 | 2017-08-01 | Rosco, Inc. | Collision avoidance and/or pedestrian detection system |
DE112016006199B4 (de) * | 2016-02-12 | 2024-02-01 | Mitsubishi Electric Corporation | Informationsanzeigeeinrichtung und informationsanzeigeverfahren |
JP6784058B2 (ja) * | 2016-05-23 | 2020-11-11 | 株式会社リコー | 情報表示装置 |
FR3056490B1 (fr) * | 2016-09-29 | 2018-10-12 | Valeo Vision | Procede de projection d'une image par un systeme de projection d'un vehicule automobile, et systeme de projection associe |
FR3056775B1 (fr) * | 2016-09-29 | 2021-08-20 | Valeo Vision | Procede de projection d'images par un systeme de projection d'un vehicule automobile, et systeme de projection associe |
KR102581779B1 (ko) * | 2016-10-11 | 2023-09-25 | 주식회사 에이치엘클레무브 | 교차로충돌방지시스템 및 교차로충돌방지방법 |
KR101899396B1 (ko) | 2016-11-24 | 2018-09-18 | 현대자동차주식회사 | 차량 및 그 제어방법 |
JP2018151940A (ja) * | 2017-03-14 | 2018-09-27 | 株式会社デンソーテン | 障害物検出装置および障害物検出方法 |
JP7077616B2 (ja) * | 2017-12-28 | 2022-05-31 | トヨタ自動車株式会社 | 表示制御装置および表示制御方法 |
CN112424788B (zh) * | 2019-06-17 | 2024-08-16 | 谷歌有限责任公司 | 使用三维眼睛注视矢量的车辆乘员参与 |
US11580833B2 (en) * | 2020-03-24 | 2023-02-14 | Object Video Labs, LLC | Camera detection of human activity with co-occurrence |
KR102347423B1 (ko) * | 2020-03-25 | 2022-01-12 | 주식회사 만도모빌리티솔루션즈 | 차량 후측방 경보 장치 및 그 방법 |
US11292398B2 (en) * | 2020-04-20 | 2022-04-05 | Hyundai Mobis Co., Ltd. | Apparatus for displaying forward blind spot situation |
CN111767844B (zh) | 2020-06-29 | 2023-12-29 | 阿波罗智能技术(北京)有限公司 | 用于三维建模的方法和装置 |
CN112925241A (zh) * | 2021-01-25 | 2021-06-08 | 常州机电职业技术学院 | 一种汽车开关门自动响应系统及控制方法 |
US12115916B2 (en) | 2021-02-01 | 2024-10-15 | Rosco, Inc. | Downlighting signal and illumination mirror head for vehicle |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008257378A (ja) * | 2007-04-03 | 2008-10-23 | Honda Motor Co Ltd | 物体検出装置 |
JP2009017572A (ja) * | 2008-08-04 | 2009-01-22 | Mitsubishi Motors Corp | ノーズビューモニタ装置 |
JP2009277021A (ja) * | 2008-05-15 | 2009-11-26 | Hitachi Ltd | 接近物検出装置および接近物検出方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6727807B2 (en) * | 2001-12-14 | 2004-04-27 | Koninklijke Philips Electronics N.V. | Driver's aid using image processing |
US7266220B2 (en) * | 2002-05-09 | 2007-09-04 | Matsushita Electric Industrial Co., Ltd. | Monitoring device, monitoring method and program for monitoring |
US7190282B2 (en) * | 2004-03-26 | 2007-03-13 | Mitsubishi Jidosha Kogyo Kabushiki Kaisha | Nose-view monitoring apparatus |
JP4639921B2 (ja) * | 2004-06-11 | 2011-02-23 | 日産自動車株式会社 | 運転支援装置 |
JP4899340B2 (ja) * | 2004-08-02 | 2012-03-21 | 日産自動車株式会社 | 運転感覚調整装置及び運転感覚調整方法 |
JP4701773B2 (ja) | 2005-03-24 | 2011-06-15 | 日産自動車株式会社 | 車両用表示装置及び方法 |
JP2009009446A (ja) * | 2007-06-29 | 2009-01-15 | Denso Corp | 車両用情報表示装置 |
-
2010
- 2010-12-08 JP JP2011545089A patent/JP5590684B2/ja not_active Expired - Fee Related
- 2010-12-08 WO PCT/JP2010/007156 patent/WO2011070783A1/ja active Application Filing
- 2010-12-08 EP EP10835706.2A patent/EP2511893A4/en not_active Withdrawn
- 2010-12-08 CN CN201080014454.4A patent/CN102378998B/zh not_active Expired - Fee Related
- 2010-12-08 US US13/255,296 patent/US8536995B2/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008257378A (ja) * | 2007-04-03 | 2008-10-23 | Honda Motor Co Ltd | 物体検出装置 |
JP2009277021A (ja) * | 2008-05-15 | 2009-11-26 | Hitachi Ltd | 接近物検出装置および接近物検出方法 |
JP2009017572A (ja) * | 2008-08-04 | 2009-01-22 | Mitsubishi Motors Corp | ノーズビューモニタ装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2511893A4 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9475420B2 (en) | 2012-10-05 | 2016-10-25 | Denso Corporation | Display apparatus |
US9771022B2 (en) | 2012-10-05 | 2017-09-26 | Denso Corporation | Display apparatus |
US9063328B2 (en) | 2013-01-28 | 2015-06-23 | Electronics And Telecommunications Research Institute | Head-up display apparatus and method for vehicle |
US9576489B2 (en) | 2014-01-23 | 2017-02-21 | Electronics And Telecommunications Research Institute | Apparatus and method for providing safe driving information |
JP2015092346A (ja) * | 2014-11-18 | 2015-05-14 | 株式会社デンソー | 表示装置 |
WO2016199442A1 (ja) * | 2015-06-10 | 2016-12-15 | 株式会社Jvcケンウッド | レーザレーダ装置および検知方法 |
WO2017134876A1 (ja) * | 2016-02-05 | 2017-08-10 | 日立マクセル株式会社 | ヘッドアップディスプレイ装置及びその表示制御方法 |
JP2019109656A (ja) * | 2017-12-18 | 2019-07-04 | 東芝情報システム株式会社 | 軽車両・軽二輪車の情報管理システム |
JP2020013274A (ja) * | 2018-07-17 | 2020-01-23 | 富士通株式会社 | 表示プログラム、表示装置及び表示方法 |
JP2022175877A (ja) * | 2021-05-14 | 2022-11-25 | トヨタ自動車株式会社 | 車両用表示装置、表示方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
EP2511893A1 (en) | 2012-10-17 |
EP2511893A4 (en) | 2018-01-31 |
JPWO2011070783A1 (ja) | 2013-04-22 |
US8536995B2 (en) | 2013-09-17 |
JP5590684B2 (ja) | 2014-09-17 |
CN102378998B (zh) | 2014-12-10 |
CN102378998A (zh) | 2012-03-14 |
US20120235805A1 (en) | 2012-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5590684B2 (ja) | 情報表示装置及び情報表示方法 | |
US8692739B2 (en) | Dynamic information presentation on full windshield head-up display | |
JP6485732B2 (ja) | 情報提供装置、情報提供方法及び情報提供用制御プログラム | |
US10293745B2 (en) | Projection of a pre-definable light pattern | |
JP6379779B2 (ja) | 車両用表示装置 | |
JP2019011017A (ja) | 表示システム、情報提示システム、表示システムの制御方法、プログラム、及び移動体 | |
WO2011108091A1 (ja) | 車載用表示装置及び表示方法 | |
JP6883759B2 (ja) | 表示システム、表示システムの制御方法、プログラム、及び移動体 | |
JP2009196630A (ja) | 表示装置 | |
US20200249044A1 (en) | Superimposed-image display device and computer program | |
JP5488303B2 (ja) | 車両用表示装置 | |
JP2014036268A (ja) | 移動体の周辺画像表示装置 | |
JP6796806B2 (ja) | 表示システム、情報提示システム、表示システムの制御方法、プログラム、及び移動体 | |
JP6876277B2 (ja) | 制御装置、表示装置、表示方法及びプログラム | |
JP7310560B2 (ja) | 表示制御装置及び表示制御プログラム | |
JP2013174667A (ja) | 車両用表示装置 | |
JP2016109645A (ja) | 情報提供装置、情報提供方法及び情報提供用制御プログラム | |
JP2019202589A (ja) | 表示装置 | |
JP2019116229A (ja) | 表示システム | |
JP2016107947A (ja) | 情報提供装置、情報提供方法及び情報提供用制御プログラム | |
JP2016021116A (ja) | 車両用表示装置 | |
JP6186905B2 (ja) | 車載表示装置およびプログラム | |
JP2019206262A (ja) | 表示装置 | |
JP7318431B2 (ja) | 表示制御装置及び表示制御プログラム | |
JP2018019155A (ja) | 車両用表示制御装置、車両用表示システム、車両用表示制御方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080014454.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10835706 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2011545089 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13255296 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010835706 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |