JP4259368B2 - Nose view monitor device - Google Patents

Nose view monitor device

Info

Publication number
JP4259368B2
JP4259368B2 (application JP2004091666A)
Authority
JP
Japan
Prior art keywords
vehicle
image
optical flow
approaching object
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2004091666A
Other languages
Japanese (ja)
Other versions
JP2005276056A (en)
Inventor
ドゥルカン エムルラー
高広 前村
将弘 池山
Original Assignee
三菱自動車工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱自動車工業株式会社
Priority to JP2004091666A
Priority claimed from US11/087,873 (US7190282B2)
Publication of JP2005276056A
Application granted
Publication of JP4259368B2
Legal status: Active (current)
Anticipated expiration

Description

  The present invention relates to a nose view monitor device that displays a captured image of a nose view camera and detects an optical flow vector of the captured image.

  2. Description of the Related Art. Conventionally, a technique has been developed in which an imaging camera for capturing the left and right sides is provided at the nose (front end) portion of a vehicle, the captured image (a so-called nose view image) is displayed on a monitor device or the like, and the occupant's visual check is thereby assisted. In such a technique, the monitor device that displays the captured image is generally also used as an in-vehicle monitor that displays a television image, a car navigation image (navigation image), and the like; the television or navigation image is displayed during normal traveling, and when the vehicle approaches an intersection or a T-junction and stops temporarily, the display is automatically switched to the nose view image.

  Patent Document 1 describes a vehicle camera device that is attached to a front portion of a vehicle, captures left and right side images, and displays them on a monitor device (display) in the vehicle compartment, the device being turned on automatically, according to the vehicle speed and the magnitude of deceleration, before the vehicle stops or before it travels at a predetermined low speed. With such a configuration, for example, when the vehicle approaches an intersection and the driver tries to check the left and right, the device is turned on before the temporary stop and the left and right images captured from the nose portion are displayed, so that the driver can check the left and right quickly, smoothly, and with time to spare. On the other hand, the device is not turned on when the vehicle is merely crawling along in a traffic jam, so that the television image, navigation image, or the like is not disturbed.

  On the other hand, a technique has been developed for detecting a moving object in the captured image of such a monitor device by using an optical flow. The optical flow is a two-dimensional velocity vector field on an image, that is, the apparent velocity field of moving objects in a moving image. In such a technique, for example, a point that can be recognized as the same object between two consecutive images captured at a predetermined period is set as a feature point (detected by arithmetic processing), and the movement of that feature point (its moving direction and moving distance) is calculated as a vector (this vector is an optical flow vector, also simply referred to as a flow vector; the vector itself may also be referred to as an optical flow). By calculating feature points and flow vectors over the entire area of the captured image, information such as the position and moving direction of a moving object in the image can be recognized.
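
As an illustration of the flow-vector calculation described above, the following is a minimal sketch in Python, assuming OpenCV's feature detector and pyramidal Lucas-Kanade tracker as the matching method (the patent itself does not prescribe a particular algorithm); it returns, for each tracked feature point, its displacement between two consecutive frames as the flow vector.

# A minimal sketch (not the patent's implementation) of obtaining flow vectors
# between two consecutive grayscale frames with OpenCV's Shi-Tomasi detector
# and pyramidal Lucas-Kanade tracker.
import cv2
import numpy as np

def flow_vectors(prev_gray, curr_gray):
    """Return (start_point, vector) pairs for feature points tracked between frames."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return []
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    vectors = []
    for p0, p1, ok in zip(pts.reshape(-1, 2), nxt.reshape(-1, 2), status.ravel()):
        if ok:  # feature point successfully tracked into the current frame
            vectors.append((p0, p1 - p0))  # flow vector = moving direction and distance
    return vectors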

Further, for example, Patent Document 2 describes a configuration in which, in the calculation process for obtaining the optical flow of an image captured in the traveling direction of the vehicle (the forward direction), the calculation is performed while omitting the area of the image corresponding to the landscape outside the road. Specifically, the optical flow is obtained only for an area consisting of a lower part bounded by straight lines drawn from the point at infinity on the image to the lower corners of the screen, and a part around the point at infinity. As a result, the amount of calculation can be reduced compared with obtaining the optical flow over the entire image, so that the processing time can be shortened and the processing speed increased.
Patent Document 1: Japanese Patent No. 3287817
Patent Document 2: Japanese Patent No. 3398934
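
By way of illustration of the area-limiting idea of Patent Document 2, the following minimal sketch (same assumed OpenCV toolchain as above; the vanishing-point coordinates and the 20-pixel margin are illustrative assumptions) builds a mask covering the lower area bounded by lines from the point at infinity to the lower screen corners plus a small area around that point; such a mask can then restrict where feature points, and hence flow vectors, are computed.

# A minimal sketch of the prior-art region restriction: flow is computed only
# inside a mask made of (i) the lower area bounded by straight lines from the
# point at infinity to the lower screen corners and (ii) a small area around
# that point. The default vanishing point and the margin size are assumptions.
import cv2
import numpy as np

def roi_mask(height, width, vanishing_point=None):
    vx, vy = vanishing_point if vanishing_point else (width // 2, height // 2)
    mask = np.zeros((height, width), dtype=np.uint8)
    lower_area = np.array([[vx, vy], [0, height - 1], [width - 1, height - 1]],
                          dtype=np.int32)
    cv2.fillPoly(mask, [lower_area], 255)  # lower area down to the screen corners
    cv2.rectangle(mask, (vx - 20, vy - 20), (vx + 20, vy + 20), 255, -1)  # around the point
    return mask  # usable as the mask argument of cv2.goodFeaturesToTrack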

  Now, consider recognizing a moving object, such as a vehicle approaching the host vehicle, by using the optical flow of a nose view image that captures the left and right sides of the vehicle. If the host vehicle is stopped, no flow vector is generated for an object whose relative position with respect to the host vehicle does not change (for example, the background portion of the image, such as buildings and roadside trees), and a flow vector is generated only for an object that has actually moved. That is, a moving object exists wherever a flow vector is generated, and the moving object can therefore be detected accurately based simply on the presence or absence of a flow vector.

  However, the situation in which a nose view image is generally required is when entering an intersection or a T-junction with poor visibility, and in such a situation the occupant often inches the vehicle slowly forward while looking at the nose view image to check the left and right sides. When the host vehicle is moving in this way, the angle of view (composition) of the nose view image itself moves, so flow vectors are also generated for objects that do not actually move, such as the buildings and street trees that form the background of the image, and the background is detected as a moving object. Moreover, it is difficult to extract and detect only a moving object approaching the host vehicle from an image whose angle of view is itself moving.

  The present invention has been devised in view of such problems, and an object of the present invention is to provide a nose view monitor device that can accurately detect, with simple logic, an object approaching the side of the nose portion of a vehicle and can notify the occupant of the approaching-object information.

To achieve the above object, the nose view monitor device of the present invention (claim 1) comprises: imaging means, provided at the nose front of the vehicle, for capturing images of the left and right side areas of the vehicle; optical flow vector calculation means for calculating optical flow vectors based on the images; approaching object detection means for detecting an approaching object based on optical flow vectors having a rightward vector component among the optical flow vectors in the left side area calculated by the optical flow vector calculation means, and on optical flow vectors having a leftward vector component among the optical flow vectors in the right side area; notification means for displaying the images and notifying of the detection of the approaching object; and vehicle speed detection means for detecting the vehicle speed of the vehicle, wherein the approaching object detection means stops detecting the approaching object when the vehicle speed is larger than a predetermined value.

Preferably, the notification means switches among a plurality of notification modes according to the magnitude of the optical flow vectors having a vector component toward the traveling direction of the vehicle, and displays the image and notifies of the detection of the approaching object (claim 2). Alternatively, it is preferable that the notification means switches among a plurality of notification modes according to the number of optical flow vectors having a vector component toward the traveling direction of the vehicle, and displays the image and notifies of the detection of the approaching object (claim 3).

The device may further include steering angle detection means for detecting the steering angle of the vehicle, and the approaching object detection means preferably stops detecting the approaching object when the steering angle is larger than a predetermined value set in advance (claim 4). Furthermore, the notification means preferably notifies that the approaching object detection means has stopped detecting the approaching object (claim 5).

According to the nose view monitor device of the present invention (claim 1), an object approaching the host vehicle in the left side area and the right side area of the vehicle can be recognized easily and accurately with simple control logic. Further, the amount of optical flow calculation related to the recognition of an object approaching the host vehicle can be reduced. Furthermore, even when the vehicle moves at low speed, the movement of the background image in the captured image can be prevented from being mistakenly recognized as an object approaching the host vehicle, and the detection accuracy can be improved by accurately detecting the moving object. Accordingly, erroneous notification of an object approaching the host vehicle can be reduced.
In addition, erroneous recognition and erroneous notification of an approaching object caused by movement at a high vehicle speed can be reduced. As a result, the accuracy of approaching-object detection by the optical flow can be improved.

Further, according to the nose view monitor device of the present invention (claims 2 and 3), the degree of danger posed by an object approaching the host vehicle can be judged by switching among a plurality of notification modes according to the magnitude and number of the optical flow vectors, so that effective notification according to the degree of danger can be performed and safety can be improved.

Also, according to the nose view monitor device of the present invention (claim 4), erroneous recognition and erroneous notification of an approaching object caused by movement with a large steering angle can be reduced. As a result, the accuracy of approaching-object detection by the optical flow can be improved.

Further, according to the nose view monitor device of the present invention (claim 5), the occupant can be notified that detection of an approaching object has been stopped in order to prevent erroneous recognition, and the occupant can be prompted to pay attention to approaching objects, so that safety can be further improved.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIGS. 1 to 5 show a nose view monitor device as an embodiment of the present invention: FIG. 1 is a schematic configuration diagram showing a vehicle equipped with the device, FIG. 2 shows an example of a monitor display by the device, FIG. 3 is a flowchart for explaining the control in the device, FIG. 4 is a schematic diagram for explaining the arithmetic processing in the approaching object detection means of the device, and FIG. 5 is a schematic top view showing the imaging area of the nose view camera of the device.

  Referring to FIG. 1, a vehicle 2 equipped with a nose view monitor device 1 of the present invention is shown. The vehicle 2 includes a nose view camera (imaging means) 3 that captures images of the left and right sides of the vehicle 2, a vehicle speed sensor (vehicle speed detection means) 4 that detects a traveling speed signal of the vehicle 2, a nose camera switch 6 serving as an operation switch for the nose view camera 3, a steering angle sensor (steering angle detection means) 7 that detects a steering angle signal of the steered wheels (or of the steering wheel) operated by the occupant, an electronic control unit (ECU) 10, and a monitor (notification means) 5 for displaying the images captured by the nose view camera 3.

A pair of nose view cameras 3 are provided at the left and right end portions of the nose (front end) portion of the vehicle 2 so that the left side and the right side of the vehicle can be imaged simultaneously.
The monitor 5 displays the left and right side images captured by the nose view camera 3. In the present embodiment, as shown in FIG. 2, the image of the right side of the vehicle 2 is displayed in the right half area of the monitor screen, and at the same time the image of the left side of the vehicle 2 is displayed in the left half area of the monitor screen. As a result, the occupant can check the left and right sides of the vehicle simultaneously.

  In this embodiment, as shown in FIG. 5, the areas imaged by the left and right nose view cameras 3 are on the left and right sides of the vehicle 2 in the vehicle width direction and are directed slightly forward of the direction exactly perpendicular to the traveling direction of the vehicle 2. As a result, in the image of the left side of the vehicle 2, the traveling direction of the vehicle is the rightward direction on the image, while in the image of the right side, the traveling direction of the vehicle is the leftward direction on the image. Further, as shown in FIG. 2, a vehicle 21 approaching the vehicle 2 on the road ahead in the left side image is displayed so as to grow larger while moving rightward on the image, whereas a vehicle 22 approaching the vehicle 2 on the road ahead in the right side image is displayed so as to grow larger while moving leftward on the image.

  The vehicle speed sensor 4 detects wheel rotation speed information and inputs it to the ECU 10. Similarly, the steering angle sensor 7 detects the steering angle information of the steering wheel operated by the occupant and inputs it to the ECU 10. The ECU 10 calculates the traveling speed V of the vehicle 2 based on the input wheel rotation speed information, and calculates the steering angle θ of the steered wheels based on the steering angle information.
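
As a rough illustration only, the conversions below show one way the ECU could derive V and θ from these raw sensor signals; the wheel radius and steering ratio are assumed values that are not given in the patent.

# A rough illustration of deriving the traveling speed V and steered-wheel
# angle theta from the raw sensor signals; the tire radius and steering ratio
# below are assumed values, not values from the patent.
import math

WHEEL_RADIUS_M = 0.3   # assumed rolling radius of the tire
STEERING_RATIO = 16.0  # assumed steering-wheel-to-road-wheel ratio

def traveling_speed_kmh(wheel_rpm):
    """V [km/h] from the wheel rotation speed [rpm]."""
    return wheel_rpm * 2.0 * math.pi * WHEEL_RADIUS_M * 60.0 / 1000.0

def steered_wheel_angle_deg(steering_wheel_angle_deg):
    """Theta [deg] of the steered wheels from the steering-wheel angle [deg]."""
    return steering_wheel_angle_deg / STEERING_RATIO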

  The nose camera switch 6 is a switch for turning the operation of the nose view camera 3 on and off. When the nose camera switch 6 is off, the nose view camera 3 does not operate. When the nose camera switch 6 is on, the nose view camera 3 is activated once a predetermined condition (the nose view camera operating condition) is satisfied.

The nose view camera operating condition is, for example, that the traveling speed V of the vehicle 2 calculated by the ECU 10 is smaller than a predetermined speed V0 (a speed corresponding to a very low speed state, for example 5 km/h) and, in addition, that the steering angle θ of the steered wheels is smaller than a predetermined angle θ0 set in advance (that is, that the steering state of the steering wheel is close to neutral).
The monitor 5 displays the captured images while the nose view camera 3 is operating, but displays other images, such as a television image or a car navigation image, while the nose view camera 3 is not operating; that is, it also functions as a general in-vehicle monitor.

That is, when the nose camera switch 6 is on and a television image or car navigation image is being displayed continuously during normal driving, if the vehicle 2 decelerates below the predetermined speed V0 while the steering angle θ is smaller than the predetermined angle θ0, for example when about to enter an intersection or a T-junction, the nose view camera 3 is automatically activated and the left and right side images are displayed on the monitor 5. In other words, the display is automatically switched from the television or car navigation image to the left and right side images without the occupant having to be conscious of it. Conversely, when the steering angle θ is equal to or greater than the predetermined angle θ0, or when the traveling speed V is equal to or greater than the predetermined speed V0, the nose view camera 3 does not operate and the television or car navigation image continues to be displayed as on a general vehicle monitor.

Note that the magnitudes of the predetermined speed V0 and the predetermined angle θ0 are set in consideration of the magnitude of the flow vector of the background portion in the image captured by the nose view camera 3, as will be described later.
Further, if the nose camera switch 6 is turned off, the nose view camera 3 can be prevented from operating even when the above-described nose view camera operating condition is satisfied.

  The ECU 10 can detect a moving object approaching the host vehicle by calculating the optical flow of each of the left and right images captured by the nose view camera 3. The ECU 10 includes an optical flow calculation unit (optical flow vector calculation means) 11 that calculates the optical flow of the captured images, an approaching object detection unit (approaching object detection means) 12 that detects an object approaching the host vehicle based on the optical flow vectors calculated by the optical flow calculation unit 11, and an output unit 13 that outputs these calculation and detection results. In the following description, each individual optical flow vector is simply referred to as a flow vector, and a set of these flow vectors is referred to as an optical flow.

  The optical flow calculation unit 11 calculates the optical flow of each of the left and right side images captured by the nose view camera 3 individually: the optical flow of the left side image (that is, the image of the left half area in FIG. 2) is calculated by the left side region optical flow calculation unit 11A, and the optical flow of the right side image (that is, the image of the right half area in FIG. 2) is calculated by the right side region optical flow calculation unit 11B. For the calculation of the optical flow, a method is used in which a point corresponding to the same object is obtained (detected by arithmetic processing) between two consecutive images among the images captured by the nose view camera 3, and the moving direction and moving distance of this feature point are calculated as a flow vector. Flow vectors are calculated over the entire area of the captured image, so that information such as the position and moving direction of a moving object in the image can be recognized.

  The approaching object detection unit 12 detects an object approaching the vehicle 2 based on the flow vectors calculated by the optical flow calculation unit 11. Specifically, an object approaching the host vehicle 2 is detected based on the flow vectors that have a gradient toward the traveling direction side of the vehicle 2 in the left and right side images. For example, in the left side image, flow vectors having a rightward vector component on the image are extracted, while in the right side image, flow vectors having a leftward vector component on the image are extracted. The extracted flow vectors are then judged to be flow vectors caused by an object approaching the vehicle 2 (that is, by an object approaching the host vehicle 2 among the moving objects that have flow vectors), and the approaching object is recognized accordingly.
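
The direction test described in this paragraph reduces to a sign check on the horizontal component of each flow vector; the following minimal sketch assumes the usual image convention that x increases to the right.

# A minimal sketch of the direction test: keep only flow vectors whose
# horizontal component points toward the host vehicle's traveling direction
# (rightward in the left-side image, leftward in the right-side image).
# Image x-coordinates are assumed to increase to the right.
def approaching_vectors(flow, side):
    """flow is a list of (start_point, vector) pairs; side is 'left' or 'right'."""
    if side == "left":
        return [(p, v) for p, v in flow if v[0] > 0]   # rightward component
    if side == "right":
        return [(p, v) for p, v in flow if v[0] < 0]   # leftward component
    raise ValueError("side must be 'left' or 'right'")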

  That is, simply recognizing moving objects by means of the optical flow does not make it possible to determine whether or not a moving object is approaching the host vehicle 2. In the present embodiment, however, the approaching object detection unit 12 extracts and selects, from among the moving objects recognized by the optical flow calculation unit 11, the flow vectors of objects approaching the host vehicle 2, based on the region in which they appear and their direction. In this way, moving objects that are approaching the host vehicle 2 and may be dangerous to the host vehicle 2 are recognized.

  When the approaching object detection unit 12 detects an object approaching the host vehicle 2, the output unit 13 indicates on the monitor 5 that an approaching object has been detected and notifies the occupant by voice or the like. Here, the output unit 13 switches among a plurality of notification modes based on the magnitude and number of the flow vectors, detected by the approaching object detection unit 12, that have a gradient toward the traveling direction side of the vehicle 2, and displays the image and notifies of the detection of the approaching object.

  In other words, focusing on the magnitude of a flow vector having a gradient toward the traveling direction side of the vehicle 2: if the flow vector is large, the approaching object generating it is likely to be dangerous to the host vehicle 2 even if it is still far away, because it is approaching the host vehicle 2 at high speed. Conversely, even if a moving object is not approaching the host vehicle 2 at high speed, it is still likely to be dangerous if it is at a short distance, and in this case as well the flow vector having a gradient toward the traveling direction side of the vehicle 2 becomes large.

Therefore, the larger the magnitude of a flow vector having a gradient toward the traveling direction side of the vehicle 2, the higher the danger level of the approaching object with respect to the vehicle 2, and the notification modes are switched according to this danger level to carry out the screen display and notification.
Similarly, when there are many objects approaching the host vehicle 2, or when the distance between the host vehicle 2 and an approaching object is short, the number of flow vectors having a gradient toward the traveling direction side of the vehicle 2 increases. For this reason, the larger the number of such flow vectors, the higher the danger level of the approaching object with respect to the vehicle 2, and the notification modes corresponding to the danger level are switched to carry out the screen display and notification.

In the present embodiment, the output unit 13 has a plurality of notification modes, such as a "low danger level notification mode" and a "high danger level notification mode", and switches among these notification modes based on the magnitude and number of the flow vectors of the approaching object detected by the approaching object detection unit 12.
That is, when flow vectors having a gradient toward the traveling direction side of the vehicle 2 are detected whose magnitude exceeds a preset predetermined magnitude, or whose number exceeds a preset predetermined number, the output unit 13 is set to the "high danger level notification mode"; in all other cases it is set to the "low danger level notification mode".
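
A minimal sketch of this mode-selection rule is given below; the magnitude and count thresholds are illustrative assumptions, since the patent only states that they are preset values.

# A minimal sketch of the notification-mode selection; the two thresholds are
# illustrative assumptions (the patent only calls them preset values).
import math

MAGNITUDE_THRESHOLD = 8.0  # assumed flow-vector magnitude threshold (pixels per frame)
COUNT_THRESHOLD = 5        # assumed threshold on the number of approaching-object vectors

def notification_mode(approaching_flow):
    """Return 'high' or 'low' danger level for a list of (start_point, vector) pairs."""
    magnitudes = [math.hypot(v[0], v[1]) for _p, v in approaching_flow]
    if any(m > MAGNITUDE_THRESHOLD for m in magnitudes) or len(magnitudes) > COUNT_THRESHOLD:
        return "high"
    return "low"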

In the "low danger level notification mode", the output unit 13 displays one of the flow vectors of the approaching object with an arrow and notifies of the detection of the approaching object by voice (for example, "Please pay attention to the surroundings").
In the "high danger level notification mode", the output unit 13 highlights the area on the screen corresponding to the approaching object detected by the approaching object detection unit 12 (for example, by changing its luminance or color tone), displays all the flow vectors caused by the approaching object on the screen with arrows, and informs the occupant by voice that the danger level is high (for example, "Please note that there are vehicles approaching").
Note that the output unit 13 also notifies the occupant when the nose view camera operating condition is no longer satisfied (V ≥ V0 or θ ≥ θ0).

  The nose view monitor device 1 according to the present embodiment is configured as described above and is controlled as follows, according to the flowchart shown in FIG. 3. This flow is executed in the ECU 10 at every predetermined cycle (for example, a cycle synchronized with the imaging cycle of the nose view camera 3).

Steps A10 to A30 form the flow for determining whether or not the nose view camera 3 is operating, as a precondition for calculating the optical flow.
First, in step A10, it is determined whether or not the nose camera switch 6 is on. If it is on, the process proceeds to step A20; if it is off, this flow is terminated. Next, in step A20, it is determined whether or not the traveling speed V of the vehicle is smaller than the predetermined speed V0; if V < V0 the process proceeds to step A30, and if V ≥ V0 this flow is terminated. Subsequently, in step A30, it is determined whether or not the steering angle θ of the steered wheels is smaller than the predetermined angle θ0; if θ < θ0 the process proceeds to step A40, and if θ ≥ θ0 this flow is terminated.

That is, when the nose view camera 3 is not operating, the flow is terminated without proceeding to step A40 and beyond; the process proceeds to step A40 and beyond only while the nose view camera 3 is operating.
In step A40, the optical flow calculation unit 11 individually calculates the feature points in the left and right side images captured by the nose view camera 3, and in step A50 it calculates the flow vectors for all the feature points. In this way, the moving objects in each of the left and right side images are recognized.

  Subsequently, in step A60, the approaching object detection unit 12 determines, for all the flow vectors calculated in step A50, whether they have a vector component toward the traveling direction side of the vehicle 2. For a flow vector in the left side image of the vehicle 2, it is determined whether it has a rightward vector component on the image; for a flow vector in the right side image of the vehicle 2, it is determined whether it has a leftward vector component on the image. That is, among the moving objects recognized in step A50, those approaching the vehicle 2 (approaching objects) are here distinguished from the others.

If no flow vector having a vector component toward the traveling direction side of the vehicle 2 is detected, there is no approaching object and this flow ends; if such a flow vector is detected, the process proceeds to step A70.
In step A70, the output unit 13 indicates on the monitor 5 that an approaching object has been recognized and notifies the occupant by voice. Here, the notification to the occupant by the output unit 13 is switched according to the magnitude and number of the flow vectors, detected in step A60, that have a vector component toward the traveling direction of the vehicle 2.

  When, among the approaching-object flow vectors detected in step A60, one whose magnitude is larger than the preset predetermined magnitude is detected, or when a number of approaching-object flow vectors equal to or greater than the preset number is detected, the output unit 13 is set to the "high danger level notification mode": the area on the screen corresponding to the approaching object is highlighted, all flow vectors caused by the approaching object are displayed with arrows, and the occupant is further informed by voice that the danger level is high.

  When, among the approaching-object flow vectors detected in step A60, none whose magnitude is larger than the preset predetermined magnitude is detected, and the number of approaching-object flow vectors detected is smaller than the preset number, the output unit 13 is set to the "low danger level notification mode": one of the flow vectors caused by the approaching object is displayed on the screen with an arrow, and the occupant is notified of the detection of the approaching object by voice.
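
Putting the steps together, the following sketch mirrors the flow of steps A10 to A70 using the helper functions sketched earlier (flow_vectors, approaching_vectors, notification_mode); V0 follows the 5 km/h example given in the description, while the θ0 value and the monitor.notify interface are assumptions introduced here only for illustration.

# A sketch of one control cycle following steps A10 to A70, built on the
# helpers sketched earlier; theta0 and the monitor interface are assumptions.
V0_KMH = 5.0
THETA0_DEG = 30.0

def nose_view_cycle(switch_on, speed_kmh, steering_deg,
                    prev_left, curr_left, prev_right, curr_right, monitor):
    # Steps A10-A30: nose view camera operating condition.
    if not switch_on or speed_kmh >= V0_KMH or abs(steering_deg) >= THETA0_DEG:
        return
    # Steps A40-A50: feature points and flow vectors for each side image.
    left_flow = flow_vectors(prev_left, curr_left)
    right_flow = flow_vectors(prev_right, curr_right)
    # Step A60: keep only vectors with a component toward the traveling direction.
    approaching = (approaching_vectors(left_flow, "left")
                   + approaching_vectors(right_flow, "right"))
    if not approaching:
        return  # no approaching object detected
    # Step A70: display and notify according to the danger level.
    monitor.notify(notification_mode(approaching), approaching)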

Specifically, the control described above provides the following operations and effects.
On a T-junction such as that shown in FIG. 5, when the vehicle 2 attempts to enter the main road while its occupant checks left and right, if the nose camera switch 6 of the vehicle 2 is on and the traveling speed is reduced below the predetermined speed V0 in front of the main road, the nose view camera 3 is automatically activated and the left and right side images are displayed on the monitor 5. The left and right side images can thus be displayed on the monitor 5 automatically, without the occupant having to be conscious of switching the image on the monitor 5. When the nose view camera 3 is activated, the ECU 10 starts calculating the optical flow of the captured images.

  Here, when the vehicle 2 is stopped at the position shown in FIG. 5, the position of the nose view camera 3 is fixed, so no flow vector is generated for the background portion of the image (the portion forming the background of the vehicles 21 and 22 on the image, here meaning non-moving objects such as the road, buildings, guardrails, and the sky), and only the flow vectors caused by the vehicles 21 and 22 as objects approaching the vehicle 2 (the black arrows in FIG. 3A) are generated. At this time, the flow vector caused by the vehicle 21 has a rightward vector component in the left side image, that is, a gradient toward the traveling direction side of the vehicle 2 on the image, while the flow vector caused by the vehicle 22 has a leftward vector component in the right side image and likewise has a gradient toward the traveling direction side of the vehicle 2 on the image.

Accordingly, the approaching object detection unit 12 can detect the vehicles 21 and 22 as approaching objects based on the flow vectors having a gradient toward the traveling direction side of the vehicle in the left and right side images.
Further, when the vehicle 2 is traveling at low speed (traveling speed V < V0) at the position shown in FIG. 5, that is, when the occupant slowly advances the vehicle 2 while checking left and right, the movement of the imaging position of the nose view camera 3 causes flow vectors (the white arrows in FIG. 3B) to be generated in the background portion of the captured image, as shown in FIG. 3B. In this case, the flow vector of each of the vehicles 21 and 22 is generated as the sum (the black arrows in FIG. 3B) of the flow vector that would be generated if the vehicle 2 were stopped (that is, the black-arrow flow vector shown in FIG. 3A) and the flow vector of the background portion (the white-arrow flow vector) generated by the positional movement of the nose view camera 3. The flow vector of the background portion, when the vehicle 2 moves forward, has a gradient in the direction opposite to the forward direction of the vehicle 2 on the image.
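
The superposition described above can be illustrated numerically; the vector values below are arbitrary assumptions chosen only to show that the travel-direction component and the vertical expansion component of an approaching vehicle's flow vector survive the added background flow.

# A numerical illustration (arbitrary assumed values) of the superposition of
# flow vectors in the left-side image while the host vehicle creeps forward.
import numpy as np

object_flow_when_stopped = np.array([6.0, 2.0])  # rightward + vertical (object expanding)
background_flow = np.array([-2.0, 0.0])          # opposite to the traveling direction

observed_flow = object_flow_when_stopped + background_flow
# observed_flow is [4.0, 2.0]: the rightward (traveling-direction) component is
# reduced but not cancelled, and the vertical expansion component is unchanged,
# so the direction test still classifies the object as approaching.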

  Therefore, when the vehicle 2 moves slowly forward, the magnitude and direction of the flow vectors caused by the vehicles 21 and 22 change. When the vehicle 2 is merely creeping forward at low speed, however, the change is not large, and the flow vectors can still have a gradient toward the traveling direction side of the vehicle 2 on the image. Moreover, even if the travel-direction component of the flow vectors caused by the vehicles 21 and 22 and the flow vector of the background portion become equal in magnitude, the flow vectors caused by the vehicles 21 and 22 are not canceled, because they also have a component caused by the vehicles growing larger on the image as they approach, that is, a vertical component on the image.

  At this time, the direction of the flow vectors caused by the vehicles 21 and 22 is a direction having a gradient toward the traveling direction side of the vehicle 2 on the image, that is, a direction having a rightward vector component in the left side region and a direction having a leftward vector component in the right side region. In other words, in the left region the flow vector has a direction within the 180-degree range clockwise from the vertical direction, and in the right region the flow vector has a direction within the 180-degree range counterclockwise from the vertical direction.

The approaching object detection unit 12 can therefore detect the vehicles 21 and 22 as approaching objects based on the flow vectors having a gradient toward the traveling direction side of the vehicle in the left and right side images, and the output unit 13 of the ECU 10 outputs to the monitor 5 that an approaching object has been detected, so the occupant can be alerted.
Moreover, since the output unit 13 switches the notification mode according to the danger level of the approaching object with respect to the host vehicle 2, the occupant can be notified effectively and safety can be improved.

Note that when the vehicle 2 moves forward quickly, the background-portion flow vectors generated by the positional movement of the nose view camera 3 become large, and the flow vectors caused by the vehicles 21 and 22 may then no longer have a gradient toward the traveling direction of the vehicle 2 on the image. In the present embodiment, however, the operation of the nose view camera 3 is stopped when the traveling speed V of the vehicle 2 is equal to or greater than the predetermined speed V0, so erroneous recognition of an approaching object can be prevented. Similarly, since the operation of the nose view camera 3 is stopped when the steering angle θ of the vehicle 2 is equal to or greater than the predetermined angle θ0, an approaching object is not erroneously recognized during a turning movement. As a result, the accuracy of approaching-object detection by the optical flow can be improved.

Even when the operation of the nose view camera 3 is stopped, the output unit 13 notifies the occupant to that effect, so the occupant can be alerted and safety can be improved.
The setting of the magnitudes of the predetermined speed V0 and the predetermined angle θ0 is arbitrary and depends on the embodiment; however, when the vehicle 2 moves, the flow vector of the background portion becomes larger as the traveling speed V and the steering angle θ increase. Therefore, in order to detect an object with a fast approach speed, there is no problem in increasing the set values of the predetermined speed V0 and the predetermined angle θ0, whereas to detect an object with a slow approach speed it is desirable that the vehicle 2 be stopped. Even when the vehicle 2 is moving, however, the vehicles 21 and 22 as approaching objects traveling on an ordinary public road can be distinguished from the background portion by using the difference in direction between their flow vectors and the flow vectors generated in the background portion (that is, by the logic described above).

As described above, according to the nose view monitor device of the present invention, whether the vehicle 2 is stopped or is traveling at low speed (V < V0) with a steering angle smaller than the predetermined angle (θ < θ0), an object approaching the side of the vehicle can be recognized easily and reliably, the movement of the background image can be prevented from being erroneously recognized as an approaching object, and the accuracy of approaching-object detection can be improved. Further, the configuration for recognizing, among the moving objects recognized from the optical flow, an approaching object that is dangerous to the host vehicle 2 is simple, so the amount of calculation performed by the ECU 10 for recognizing approaching objects can be reduced. Moreover, safety can be improved by notification according to the degree of danger.

Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments, and various modifications can be made without departing from the spirit of the present invention.
For example, in the above-described embodiment a nose view camera 3 as the imaging means is provided at each of the left and right end portions of the nose portion of the vehicle 2, but the device may be configured with only one of them; alternatively, a wide-angle camera having a wide imaging area may be used to image the left and right sides of the vehicle simultaneously.

In addition, the imaging area of the nose view camera 3 may be configured to be adjustable in the horizontal direction according to the angle formed between the traveling direction of the vehicle 2 and the direction of the main road the vehicle 2 is about to enter, or to be adjustable in the vertical direction according to the gradient of the road.
The predetermined speed V0 and the predetermined angle θ0 in the above-described embodiment are set arbitrarily according to the magnitude of the flow vector of the moving object to be recognized. For example, even if the traveling speed V of the vehicle 2 is higher than the predetermined speed V0, a moving object that moves with a flow vector larger than the background-portion flow vector generated by the positional movement of the vehicle 2 (and hence of the nose view camera 3) can be recognized by the configuration described above.

The notification method used by the output unit 13 in the above-described embodiment is also arbitrary. For example, when an approaching object is highlighted in the image on the monitor 5, the approaching object may be enlarged on the image, or the occupant may be alerted by operating the braking device of the vehicle 2.
Moreover, the plurality of notification modes according to the danger level may be set in multiple stages. For example, by further increasing the number of notification modes switched based on the magnitude and number of the flow vectors of the approaching object detected by the approaching object detection unit 12, the likelihood that the approaching object is dangerous to the host vehicle 2 can be judged in multiple stages; as a result, more detailed notification can be performed, safety can be improved, and the reliability of the device itself with respect to danger judgment can be improved.

FIG. 1 is a schematic configuration diagram showing a vehicle equipped with a nose view monitor device as one embodiment of the present invention.
FIG. 2 is a schematic diagram of display screen content as an example of a monitor display by the nose view monitor device as one embodiment of the present invention.
FIG. 3 is a flowchart for explaining the control in the nose view monitor device as one embodiment of the present invention.
FIG. 4 is a schematic diagram for explaining the arithmetic processing in the approaching object detection means of the nose view monitor device as one embodiment of the present invention.
FIG. 5 is a schematic top view showing the imaging area of the nose view camera in the nose view monitor device as one embodiment of the present invention.

Explanation of symbols

1 Nose view monitor device
2 Vehicle (host vehicle)
3 Nose view camera (imaging means)
4 Vehicle speed sensor (vehicle speed detection means)
5 Monitor (notification means)
6 Nose camera switch
7 Steering angle sensor (steering angle detection means)
10 Electronic control unit (ECU)
11 Optical flow calculation unit (optical flow vector calculation means)
11A Left side region optical flow calculation unit
11B Right side region optical flow calculation unit
12 Approaching object detection unit (approaching object detection means)
13 Output unit

Claims (5)

  1. A nose view monitor device comprising:
    imaging means, provided at the nose front of a vehicle, for capturing images of the left and right side areas of the vehicle;
    optical flow vector calculation means for calculating optical flow vectors based on the images;
    approaching object detection means for detecting an approaching object based on optical flow vectors having a rightward vector component among the optical flow vectors in the left side area calculated by the optical flow vector calculation means, and on optical flow vectors having a leftward vector component among the optical flow vectors in the right side area;
    notification means for displaying the images and notifying of the detection of the approaching object; and
    vehicle speed detection means for detecting the vehicle speed of the vehicle,
    wherein the approaching object detection means stops detecting the approaching object when the vehicle speed is larger than a predetermined value set in advance.
  2. The nose view monitor device according to claim 1, wherein the notification means switches among a plurality of notification modes according to the magnitude of an optical flow vector having a vector component toward the traveling direction of the vehicle, and displays the image and notifies of the detection of the approaching object.
  3. The nose view monitor device according to claim 1 or 2, wherein the notification means switches among a plurality of notification modes according to the number of optical flow vectors having a vector component toward the traveling direction of the vehicle, and displays the image and notifies of the detection of the approaching object.
  4. The nose view monitor device according to any one of claims 1 to 3, further comprising steering angle detection means for detecting the steering angle of the vehicle, wherein the approaching object detection means stops detecting the approaching object when the steering angle is larger than a predetermined value set in advance.
  5. The nose view monitor device according to any one of claims 1 to 4, wherein the notification means notifies that the approaching object detection means has stopped detecting the approaching object.
JP2004091666A 2004-03-26 2004-03-26 Nose view monitor device Active JP4259368B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004091666A JP4259368B2 (en) 2004-03-26 2004-03-26 Nose view monitor device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2004091666A JP4259368B2 (en) 2004-03-26 2004-03-26 Nose view monitor device
US11/087,873 US7190282B2 (en) 2004-03-26 2005-03-24 Nose-view monitoring apparatus
DE200510013920 DE102005013920B4 (en) 2004-03-26 2005-03-24 Front view monitoring apparatus
CNB200510063703XA CN100471263C (en) 2004-03-26 2005-03-25 Nose-view monitoring apparatus

Publications (2)

Publication Number Publication Date
JP2005276056A JP2005276056A (en) 2005-10-06
JP4259368B2 true JP4259368B2 (en) 2009-04-30

Family

ID=35175635

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004091666A Active JP4259368B2 (en) 2004-03-26 2004-03-26 Nose view monitor device

Country Status (1)

Country Link
JP (1) JP4259368B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013205924A (en) * 2012-03-27 2013-10-07 Fuji Heavy Ind Ltd Vehicle exterior environment recognition device, and vehicle exterior environment recognition method

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4657765B2 (en) 2005-03-09 2011-03-23 三菱自動車工業株式会社 Nose view system
JP2007147458A (en) * 2005-11-28 2007-06-14 Fujitsu Ltd Location detector, location detection method, location detection program, and recording medium
JP4760562B2 (en) * 2006-06-19 2011-08-31 日産自動車株式会社 Vehicle periphery information presentation device and vehicle periphery information presentation method
JP5271511B2 (en) * 2007-06-14 2013-08-21 富士通テン株式会社 Driving support device and image display device
JP4986070B2 (en) * 2008-03-19 2012-07-25 マツダ株式会社 Ambient monitoring device for vehicles
JP4986069B2 (en) * 2008-03-19 2012-07-25 マツダ株式会社 Ambient monitoring device for vehicles
JP5054612B2 (en) * 2008-05-15 2012-10-24 クラリオン株式会社 Approaching object detection device and approaching object detection method
JP2010070127A (en) * 2008-09-19 2010-04-02 Mitsubishi Motors Corp Vehicle periphery monitoring device
JP5413516B2 (en) 2010-08-19 2014-02-12 日産自動車株式会社 Three-dimensional object detection apparatus and three-dimensional object detection method
KR101223484B1 (en) 2010-10-05 2013-01-17 한국과학기술연구원 HUMAN SERUM ALBUMIN-siRNA NANO-SIZED CARRIER SYSTEM
JP5664787B2 (en) * 2011-08-02 2015-02-04 日産自動車株式会社 Moving body detection apparatus and moving body detection method
US9349057B2 (en) * 2011-09-12 2016-05-24 Nissan Motor Co., Ltd. Three-dimensional object detection device
JP5935435B2 (en) 2012-03-26 2016-06-15 富士通株式会社 Image processing apparatus and image processing method
JP6031819B2 (en) 2012-05-17 2016-11-24 富士通株式会社 Image processing apparatus and image processing method
JP6084048B2 (en) * 2013-01-28 2017-02-22 富士通テン株式会社 Object detection apparatus, object detection system, and object detection method
JP6178580B2 (en) 2013-01-28 2017-08-09 富士通テン株式会社 Object detection apparatus, object detection system, and object detection method
JP5990285B2 (en) * 2014-09-01 2016-09-07 株式会社小松製作所 Transport vehicle

Also Published As

Publication number Publication date
JP2005276056A (en) 2005-10-06


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20060324

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20080529

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080708

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080904

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20090120

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20090202

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120220

Year of fee payment: 3

R151 Written notification of patent or utility model registration

Ref document number: 4259368

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130220

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140220

Year of fee payment: 5