CN112534473A - Image processing apparatus and image processing method


Info

Publication number: CN112534473A
Application number: CN201980051719.9A
Authority: CN (China)
Prior art keywords: image, template, vehicle, light spot, unit
Legal status: Withdrawn
Other languages: Chinese (zh)
Inventors: 茂泉拓纪, 土井宏治
Current Assignee: Hitachi Astemo Ltd
Original Assignee: Hitachi Automotive Systems Ltd
Application filed by Hitachi Automotive Systems Ltd
Publication of CN112534473A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/223Analysis of motion using block-matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image processing device that can appropriately switch templates in accordance with changes in the light source state of a preceding object at night, improving the tracking accuracy for the preceding object. In the present invention, if the light spot information of the tracked vehicle has not changed (step S204), no template replacement is performed and the existing template continues to be used (step S210). When the light spot information has changed, it is checked whether a template image corresponding to the current light spot information exists in the template storage unit (step S205); if it exists, the template image corresponding to the light spots is selected (step S206) and the template is replaced (step S207). If no template corresponding to the light spots exists in the template storage unit in step S205, a new template image corresponding to the light spots is generated (step S208), and the new template image is associated with the light spot information and registered in the template storage unit (step S209).

Description

Image processing apparatus and image processing method
Technical Field
The invention relates to an image processing apparatus and an image processing method.
Background
In recent years, advanced driving assistance systems and automatic driving systems using external recognition sensors such as vehicle-mounted cameras and radars have been receiving attention.
For example, one driving assistance function that improves convenience and comfort is ACC (Adaptive Cruise Control), which measures the distance and relative speed to a vehicle traveling ahead of the host vehicle and automatically performs follow-up running control, and there is an increasing demand for vehicle detection techniques that improve the performance of such functions.
In the vehicle detection technique using an in-vehicle camera described in Patent Document 1, template matching is performed in order to search for and track a preceding vehicle in the captured image.
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open Publication No. 2013-161241
Disclosure of Invention
Problems to be solved by the invention
When template matching is performed, the positional relationship between the host vehicle and the preceding vehicle changes over time due to changes in the inter-vehicle distance to the preceding vehicle, lane changes, or the curve radius of the road, so the appearance of the preceding vehicle gradually changes. This change in appearance over time can reduce the ease of matching against the template, and the template must be updated to suppress this.
In the template update process during vehicle tracking at night, when part of the body of the preceding vehicle is brightly illuminated by the lighting of its brake lamps or by the headlights of an oncoming vehicle, the brightness of the surrounding area saturates and the appearance changes greatly. This lowers the similarity to the template, makes updating the template difficult, and may cause template matching to fail.
The technique described in Patent Document 1 does not recognize the phenomenon in which template matching fails due to, for example, the lighting of the brake lamps of a preceding vehicle, and describes no countermeasure against it.
Thus, it is difficult in the related art to further improve the vehicle tracking accuracy.
The invention aims to realize an image processing device and an image processing method which can improve the tracking precision of a preceding object by appropriately performing template switching processing according to the light source state change of the preceding object at night.
Means for solving the problems
In order to achieve the above object, the present invention is configured as follows.
An image processing apparatus that detects a tracked vehicle by template matching, comprising: an image pickup unit that picks up an image of an object; a light spot detection unit that detects a light spot from an image captured by the imaging unit; a template storage unit that stores templates of a plurality of vehicle images and light spot information; and a recognition processing unit that, when the light spot of the image captured by the imaging unit has changed, associates information of the changed light spot with the template of the vehicle image and stores the information in the template storage unit.
In addition, in an image processing method for detecting a vehicle to be tracked by template matching, templates of a plurality of vehicle images and light spot information are stored in a template storage unit, a light spot is detected from an image captured by an imaging unit, and when the light spot of the image captured by the imaging unit has changed, information of the changed light spot is stored in the template storage unit in association with the template of the vehicle image.
ADVANTAGEOUS EFFECTS OF INVENTION
According to the present invention, it is possible to realize an image processing apparatus and an image processing method that can improve the accuracy of tracking a preceding object by appropriately performing template switching processing in accordance with a change in the light source state of the preceding object at night.
Drawings
Fig. 1 is a configuration diagram of an image processing apparatus in embodiment 1 of the present invention.
Fig. 2 is a diagram showing a flow of the template replacement process in embodiment 1.
Fig. 3 is a diagram for specifically explaining the template replacement process in embodiment 1.
Fig. 4 is a diagram for specifically explaining the template replacement process in embodiment 2.
Detailed Description
Hereinafter, a specific embodiment of the present invention will be described with reference to the drawings.
Examples
(example 1)
Fig. 1 is a configuration diagram of an image processing apparatus 100 according to embodiment 1 of the present invention.
Although embodiment 1 describes three-dimensional object detection using a stereo camera as the short-range vehicle detection means, the detection may instead be performed with, for example, a monocular camera alone, or by three-dimensional object detection that combines a monocular camera and a radar.
In fig. 1, a camera 101 (camera 1) and a camera 102 (camera 2) as imaging devices (imaging units for imaging an object) are provided so as to obtain a field of view in front of the host vehicle.
In example 1, a stereoscopic image necessary for calculating parallax information used for stereoscopic object detection processing and a light spot detection image necessary for calculating light spot information used for light spot detection processing are captured.
The stereoscopic image is captured by the camera 101 and the camera 102, and the exposure time is set by automatic exposure adjustment in order to detect a non-self-luminous object by the camera 101 and the camera 102.
The image for spot detection is captured by the camera 101 or the camera 102 in which the exposure time recorded in the exposure time storage unit 117 of the storage area 115 is set. The image for detecting the light spot may be an image of either the camera 101 or the camera 102, or may be an image of both the camera 101 and the camera 102.
The stereoscopic image and the light spot detection image are captured at different times. For example, an image for light spot detection is taken after a stereoscopic image is taken.
The images captured by the cameras 101 and 102 are input to the image signal processing unit 103.
The image signal processing unit 103 calculates parallax information using the stereo images input from the cameras 101 and 102. The image signal processing unit 103 includes: an exposure time control unit 104 that controls the exposure time of the cameras 101 and 102; an image capturing unit 105 that captures an image with the camera 101 and the camera 102; a parallax calculation unit 106 that calculates parallax using the image captured by the image capture unit 105; and a light spot detection unit 107 that detects a light spot using the image captured by the image capturing unit 105.
The image signal processing unit 103 is connected to the storage area 115.
The images captured by the cameras 101 and 102 and the parallax information calculated by the parallax calculation unit 106 are sent to the storage area 115. The captured image is stored in the image information storage unit 116 of the storage area 115, and the parallax information is stored in the parallax information storage unit 118 of the storage area 115.
The parallax information calculated by the parallax calculation unit 106 is sent to the vehicle search unit 109 of the recognition processing unit 108.
The light spot detection unit 107 performs light spot detection using the light spot detection image, and the detected light spot is sent to the light spot information calculation unit 110 of the recognition processing unit 108. The spot information calculating unit 110 calculates spot information such as the type and position of the spot based on the spot information sent from the spot detecting unit 107.
As an example of the light spot detection method in the light spot detection unit 107, high-luminance pixels are extracted by binarizing the light spot detection image with a luminance threshold. For a headlight spot, white high-luminance pixels are extracted; for a taillight spot, red high-luminance pixels are extracted. The extracted high-luminance pixels are grouped, and each set of adjacent high-luminance pixels is treated as one light spot. The candidates are then narrowed down by filtering on the shape, aspect ratio, and area of each light spot.
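As a rough illustration of this detection step, the following is a minimal sketch assuming an OpenCV/NumPy pipeline; the luminance threshold, color criteria, and area/aspect-ratio limits are illustrative assumptions, since the document does not specify concrete values.

```python
import cv2
import numpy as np

# Illustrative thresholds; the document does not specify concrete values.
LUMA_THRESH = 200            # binarization threshold for high-luminance pixels
MIN_AREA, MAX_AREA = 4, 2000
MAX_ASPECT = 4.0             # width/height (or inverse) limit for a plausible lamp

def detect_spots(bgr_image, spot_type="taillight"):
    """Return a list of (cx, cy, area) for detected light spots."""
    b, g, r = cv2.split(bgr_image)
    luma = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)

    # High-luminance pixels only (binarization by luminance threshold).
    _, bright = cv2.threshold(luma, LUMA_THRESH, 255, cv2.THRESH_BINARY)

    # Crude color criterion: red-dominant for taillights, near-white for headlights.
    if spot_type == "taillight":
        color_mask = (r.astype(np.int16) - np.maximum(g, b) > 30)
    else:  # headlight
        color_mask = (np.abs(r.astype(np.int16) - g) < 40) & (np.abs(g.astype(np.int16) - b) < 40)
    mask = cv2.bitwise_and(bright, color_mask.astype(np.uint8) * 255)

    # Group adjacent high-luminance pixels into spots (connected components).
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask, connectivity=8)
    spots = []
    for i in range(1, n):  # label 0 is the background
        x, y, w, h, area = stats[i]
        aspect = max(w, h) / max(1, min(w, h))
        # Filter by area and aspect ratio to discard implausible spots.
        if MIN_AREA <= area <= MAX_AREA and aspect <= MAX_ASPECT:
            spots.append((centroids[i][0], centroids[i][1], int(area)))
    return spots
```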
The vehicle search unit 109 of the recognition processing unit 108 performs vehicle search based on the parallax information to acquire vehicle candidate information, and sends the vehicle candidate information to the template selection unit 111. The light spot information calculation unit 110 sends the calculated light spot information to the template selection unit 111.
The template selection unit 111 selects a template based on the vehicle candidate information sent from the vehicle search unit 109 and the spot information sent from the spot information calculation unit 110. The newly detected vehicle is newly registered as a tracking target (preceding object), and a template is generated in the template generation unit 112. At this time, the light spot information present on the image area serving as the template is associated with the template image of the vehicle. The template image associated with the light spot information is stored in the template image storage unit 122 of the template storage unit 121 in the storage area 115.
When there is a vehicle that has been registered as a tracking target in the past, template matching search is performed using a template image of the vehicle.
The template image used in the template matching search is selected by the template selection unit 111: the template selection unit 111 acquires the spot information around the search area from the spot information storage unit 123 of the template storage unit 121, compares the acquired spot information with the spot information associated with each template image, and selects the most plausible template image.
When the light spot information has changed and no plausible template image is found, the template generation unit 112 newly generates a template image and stores it in the template image storage unit 122. The template image storage unit 122 stores templates of a plurality of vehicle images.
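To make the selection rule concrete, the following is a minimal sketch of how a template could be selected or newly generated from the stored pairs of template image and spot information (the step numbers refer to the flow of Fig. 2 described below); the matching criterion on spot information, here simply the multiset of spot types, is an assumption, since the document only states that the stored spot information is compared with the current one.

```python
from dataclasses import dataclass
from typing import List, Tuple
import numpy as np

@dataclass
class SpotInfo:
    # Type of each detected spot, e.g. ("tail", "tail") or ("tail", "brake", "brake", "tail").
    kinds: Tuple[str, ...]

@dataclass
class TemplateEntry:
    image: np.ndarray     # template image of the vehicle
    spots: SpotInfo       # light spot information associated with this template

def select_or_create_template(store: List[TemplateEntry],
                              current_spots: SpotInfo,
                              current_patch: np.ndarray) -> TemplateEntry:
    """Return the template matching the current spot state, creating one if none exists.

    `store` plays the role of the template storage unit 121; `current_patch` is the
    image area surrounding the tracked vehicle in the current frame.
    """
    # Step S205: look for a stored template whose spot information matches the current state.
    for entry in store:
        if sorted(entry.spots.kinds) == sorted(current_spots.kinds):
            return entry                      # steps S206/S207: select this template and switch
    # Steps S208/S209: no matching template -> generate a new one and register it.
    new_entry = TemplateEntry(image=current_patch, spots=current_spots)
    store.append(new_entry)
    return new_entry
```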
The vehicle tracking processing unit 113 performs tracking processing on vehicle information such as the template image generated by the template generating unit 112. That is, the tracked vehicle is detected by template matching, and tracking processing is performed.
In order to pass the vehicle information obtained by the tracking processing performed by the vehicle tracking processing unit 113 to the control processing, the vehicle information is sent to the control signal output unit 114. The control signal output unit 114 outputs a control signal for controlling the vehicle based on the vehicle information from the vehicle tracking processing unit 113.
The storage area 115 includes a three-dimensional object information storage unit 119 and a vehicle characteristic information storage unit 120.
Fig. 2 is a diagram showing the flow of the template replacement process in embodiment 1. The processing shown in fig. 2 is performed by the recognition processing unit 108.
In fig. 2, the vehicle search unit 109 searches for vehicle candidates based on the parallax information (image information) from the image capturing unit 105, and acquires vehicle candidate information (step S201).
The spot information calculation unit 110 calculates the spot information from the light spots detected by the light spot detection unit 107 (step S202).
Next, vehicle tracking processing is performed based on the vehicle candidate information from the vehicle search unit 109 and the spot information from the spot information calculation unit 110 (steps S203 to S210).
First, in step S203, it is checked whether there is a vehicle under tracking; if the vehicle is not yet being tracked (before tracking of the vehicle has started), tracking is started with the vehicle indicated by the vehicle candidate information from the vehicle search unit 109 as a new tracking target. At this time, a template image of the vehicle is generated with the light spot information present in the image area serving as the template associated with it (step S208), and the template image is registered (stored) in the template storage unit 121 (step S209).
If there is a vehicle under tracking in step S203, the process proceeds to step S204, where it is determined whether the light spots of the vehicle under tracking have changed. If the spot information of the vehicle under tracking has not changed in step S204, the template replacement process is not performed and processing continues with the existing template (step S210).
When the spot information has changed in step S204, the process proceeds to step S205, and it is checked whether or not a template image corresponding to the current spot information exists in the template storage unit 121.
When the corresponding template exists in step S205, the template selection unit 111 selects a template image corresponding to the changed light spot (step S206), and performs replacement of the template image (template replacement) (step S207). Then, the tracking action is continued.
The vehicle tracking processing unit 113 predicts the position of the tracked vehicle (tracking range of the tracked vehicle) based on the vehicle information obtained by the search by the vehicle search unit 109. As described above, when the template image corresponding to the light spot is selected, the template image is replaced, and the vehicle tracking processing unit 113 calculates the similarity between the image of the replaced (selected) template and the predicted position (predicted range) of the tracked vehicle.
Then, the vehicle tracking processing unit 113 corrects the predicted position of the tracked vehicle (tracking range of the tracked vehicle) based on the calculated similarity. By correcting the predicted position of the tracked vehicle (the tracking prediction range of the tracked vehicle), it is possible to suppress a deviation in the tracking position of the tracked vehicle, and it is possible to improve the matching accuracy.
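The similarity calculation and the correction of the predicted range could look roughly like the sketch below; normalized cross-correlation as the similarity measure and the acceptance threshold are assumptions, since the document does not name a specific measure.

```python
import cv2
import numpy as np

def correct_predicted_position(frame: np.ndarray,
                               template: np.ndarray,
                               predicted_xy: tuple,
                               search_margin: int = 32):
    """Search around the predicted position and shift it to the best template match.

    `predicted_xy` is the top-left corner of the predicted tracking range; the
    similarity measure (normalized cross-correlation) is an assumption.
    """
    px, py = predicted_xy
    th, tw = template.shape[:2]

    # Restrict the search to a window around the predicted tracking range.
    x0 = max(0, px - search_margin)
    y0 = max(0, py - search_margin)
    x1 = min(frame.shape[1], px + tw + search_margin)
    y1 = min(frame.shape[0], py + th + search_margin)
    window = frame[y0:y1, x0:x1]
    if window.shape[0] < th or window.shape[1] < tw:
        return predicted_xy, 0.0              # prediction too close to the image border

    # Similarity between the selected template and the predicted range.
    result = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)

    # Correct the predicted position only when the match is convincing enough.
    if max_val > 0.6:                         # illustrative acceptance threshold
        corrected = (x0 + max_loc[0], y0 + max_loc[1])
    else:
        corrected = predicted_xy              # keep the motion-based prediction
    return corrected, max_val
```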
In step S205, when there is no template corresponding to the light spot in the template storage unit 121, a template image corresponding to the light spot is newly generated (step S208), and the new template of the vehicle image is associated with the changed light spot information and registered (stored) in the template storage unit 121 (step S209). Then, the tracking action is continued.
Fig. 3 is a diagram for specifically explaining the template replacement process in embodiment 1.
In fig. 3, a preceding vehicle 302 is searched for on the captured image 301 of the time (t-n) frame. If the preceding vehicle 302 is not registered as a vehicle under tracking at this point in time (t-n) (No in step S205), it is registered as a new tracking target, and the image area surrounding the entire vehicle is stored as the template image 314 in the template image storage unit 122 of the template storage unit 121 (steps S208 and S209).
At this time, the tail lamps 303 and 304 of the preceding vehicle are acquired as spot information. When the template image 314 is registered in the template image storage unit 122, the spot information of the tail lamps 303 and 304 is stored in the spot information storage unit 123 in association with the template image.
Next, the preceding vehicle 306 identical to the preceding vehicle 302 is tracked on the captured image 305 at the time (t). The predicted tracking position of the vehicle 306 in the present frame (the frame at time t) is calculated from the moving speed of the preceding vehicle 302 in the previous frame (the frame at time (t-n)), and the vicinity of the predicted tracking position is searched for by template matching.
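One simple way to obtain this prediction, assuming a constant-velocity motion model in image coordinates (the document does not specify the motion model), is sketched below.

```python
def predict_tracking_position(prev_xy, prev_prev_xy, frames_elapsed=1):
    """Linearly extrapolate the tracked vehicle's position from its recent motion.

    `prev_xy` and `prev_prev_xy` are the positions in the two most recent frames;
    the constant-velocity assumption is illustrative only.
    """
    vx = prev_xy[0] - prev_prev_xy[0]
    vy = prev_xy[1] - prev_prev_xy[1]
    return (prev_xy[0] + vx * frames_elapsed,
            prev_xy[1] + vy * frames_elapsed)
```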
At this time, spot information of the tail lamps 307 and 308 of the preceding vehicle 306 existing in the vicinity of the predicted tracking position is acquired, and it is determined whether the state of the spot information has changed from that at time (t-n) (step S204). When the states of the tail lamps 307 and 308 have not changed, the change in appearance due to a light source change is considered small, so template matching is performed using the template image 314 generated in the (t-n) frame without replacing it.
The preceding vehicle 310 identical to the preceding vehicles 302 and 306 is tracked on the captured image 309 at time (t + n) (frame image (t + n)).
At this time, the preceding vehicle 310 applies the brakes, turning on the brake lamps 311, 312, and 313. When it is determined whether the spot information of the preceding vehicle 310 has changed from the state of the spot information at time (t) (step S204), the spot information of the tail lamps 307 and 308 was registered at time (t), whereas the brake lamps 311, 312, and 313 are lit at time (t + n), so the number and size of the spots have changed.
Therefore, in this case, since the change in appearance due to the light source change is considered large, the template is switched to one corresponding to the current light spots: the image area surrounding the vehicle at time (t + n) is stored as the template image 318 in the template image storage unit 122. At this time, the spot information of the brake lamps 311, 312, and 313 is stored in association with the template image.
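The spot-change test of step S204 used in this example, comparing the number, type, and size of the current spots with those stored for the active template, could be sketched as follows; the spot representation and the size tolerance are illustrative assumptions, since the document only states that the number and size of the spots change.

```python
def spots_changed(stored_spots, current_spots, size_tolerance=0.5):
    """Return True when the light spot state differs from the stored one (step S204).

    Each spot is a dict like {"kind": "tail", "area": 120}; the representation and
    the 50% area tolerance are illustrative assumptions.
    """
    # A change in the number of spots (e.g. brake lamps turning on) counts as a change.
    if len(stored_spots) != len(current_spots):
        return True
    # A change in the set of spot types (tail / brake / turn) counts as a change.
    if sorted(s["kind"] for s in stored_spots) != sorted(s["kind"] for s in current_spots):
        return True
    # A large change in total spot size also counts as a change.
    stored_area = sum(s["area"] for s in stored_spots)
    current_area = sum(s["area"] for s in current_spots)
    return abs(current_area - stored_area) > size_tolerance * max(stored_area, 1)
```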
As described above, embodiment 1 of the present invention is configured to acquire the spot information of the preceding vehicle and, when the spot information has changed, continue the tracking operation after switching to the template corresponding to the changed spot information; when no template corresponding to the changed spot information exists, a new template associated with the changed spot information is generated and registered, and the tracking operation is continued.
Therefore, it is possible to realize an image processing apparatus and an image processing method that appropriately perform template switching processing according to a change in the light source state of a preceding object at night, suppress a decrease in template matching accuracy corresponding to a rapid change in appearance caused by a change in the light source, and improve the accuracy of tracking the preceding object.
(example 2)
Next, example 2 of the present invention will be explained.
While embodiment 1 described above is an example in which the template is replaced every time the spot information of the preceding vehicle changes, embodiment 2 is an example in which a plurality of templates corresponding to the change in the spot are stored and an appropriate template is selected based on the spot information.
The image processing apparatus in embodiment 2 is the same as the example shown in fig. 1, and therefore, illustration and detailed description thereof are omitted. Note that the processing shown in fig. 2 can also be applied to embodiment 2, and thus, illustration and detailed description are omitted.
Fig. 4 is a diagram for specifically explaining the template replacement process in embodiment 2.
In fig. 4, a preceding vehicle 402 is being tracked on a captured image 401 of a frame at time (t-n). The image of the vehicle 402 on the captured image 401 is stored as a template image 415 in the template image storage unit 122.
At this time, the spot information of the tail lamps 403 and 404 of the preceding vehicle 402 is stored in association with the template image. In addition to the template image 415, for example, a template image 416 when the stop lamp is turned on and a template image 417 when the left turn lamp is turned on are stored in the template image storage unit 122 of the template storage unit 121.
On the captured image 405 of the time (t) frame, the preceding vehicle 406, which is the same as the preceding vehicle 402, turns on the turn signal lamp 408. At the time (t-n) frame, spot information of the tail lamps 403 and 404 is registered, and at the time (t) frame, the tail lamps 407 and 409 and the turn lamp 408 of the preceding vehicle 406 are turned on to change the number and the type of the spots.
Therefore, the templates registered in the template image storage unit 122 are searched for one associated with the same spot information as the tail lamps 407 and 409 and the turn lamp 408 (step S205); since the matching template image 417 exists, it is selected and used for template matching.
At this time, if, for example, pitching or the like occurs, the predicted tracking position of the preceding vehicle 406 is shifted; acquiring spot information around the shifted predicted tracking position may then indicate a spurious change in the spot state, causing the wrong template to be selected.
In order to avoid erroneous selection when selecting a template based on spot information, an allowable range is set for the amount of change in the predicted tracking position due to pitching or the like, and template selection is not performed when the predicted tracking position changes by more than the allowable range.
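This guard can be pictured as a check performed before spot-based template selection; the sketch below assumes the change is measured as the pixel displacement of the predicted tracking position between frames, and the allowable range of 20 pixels is an illustrative value.

```python
def allow_template_selection(prev_predicted_xy, predicted_xy, allowed_shift_px=20):
    """Skip spot-based template selection when the predicted position jumps too far.

    A large jump (e.g. caused by pitching) can make the acquired spot information
    unreliable; the 20-pixel allowable range is an illustrative assumption.
    """
    dx = predicted_xy[0] - prev_predicted_xy[0]
    dy = predicted_xy[1] - prev_predicted_xy[1]
    shift = (dx * dx + dy * dy) ** 0.5
    return shift <= allowed_shift_px   # only select a new template within the allowance
```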
When no template associated with the same spot information exists in the template image storage unit 122, a new template image is generated and registered in the template image storage unit 122 (steps S208 and S209).
Similarly, the presence or absence of a change in the light spot information is determined for the preceding vehicle 411 on the captured image 410 of the time (t + n) frame. On the captured image 410, the preceding vehicle 411 applies the brakes, turning on the brake lamps 412 and 413. When it is determined whether the spot information of the preceding vehicle 411 has changed from the state of the spot information at time (t), the spot information of the tail lamps 407 and 409 and the turn lamp 408 was registered at time (t), whereas the brake lamps 412 and 413 are lit at time (t + n), so the number and size of the spots have changed.
Therefore, in this case, since it is considered that the change in appearance due to the light source change is large, the template 416 corresponding to the light spot is selected and template matching is performed.
According to embodiment 2 of the present invention, in addition to the same effects as those of embodiment 1, since a plurality of templates corresponding to the change of the light spot are stored and an appropriate template is selected based on the light spot information, it is possible to quickly use an appropriate template based on the change of the state of the light spot.
Here, an application of the embodiment of the present invention to the automatic driving technique will be described.
In the present invention, since templates are stored in association with spot information, behavior of the preceding vehicle such as braking and an intention to change lanes indicated by turn signal lighting can be grasped.
This information can be used for control decisions such as decelerating the host vehicle upon detecting a cut-in caused by a lane change of a preceding vehicle, or accelerating the host vehicle upon detecting the absence of a preceding vehicle.
Description of the symbols
100 … image processing device, 101, 102 … camera, 103 … image signal processing section, 104 … exposure time control section, 105 … image shooting section, 106 … parallax operation section, 107 … light spot detection section, 108 … recognition processing section, 109 … vehicle search section, 110 … light spot information calculation section, 111 … template selection section, 112 … template generation section, 113 … vehicle tracking processing section, 114 … control signal output section, 115 … storage area, 116 … image information storage section, 117 … exposure time storage section, 118 … parallax information storage section, 119 … three-dimensional object information storage section, 120 … vehicle characteristic information storage section, 121 … template storage section, 122 … template image storage section, 123 … light spot information storage section.

Claims (12)

1. An image processing device that detects a tracked vehicle by template matching, the image processing device comprising:
an image pickup unit that picks up an image of an object;
a light spot detection unit that detects a light spot from the image captured by the imaging unit;
a template storage unit that stores templates of a plurality of vehicle images and light spot information; and
a recognition processing unit that stores information of the changed light spot in the template storage unit in association with a template of the vehicle image when the light spot of the image captured by the imaging unit has changed.
2. The image processing apparatus according to claim 1,
the recognition processing unit includes a template selection unit that selects a template corresponding to a change in the light spot from among the templates of the plurality of vehicle images in the template storage unit when the light spot of the image captured by the imaging unit has changed.
3. The image processing apparatus according to claim 1,
when the light spot of the image captured by the imaging unit has changed and the template storage unit does not have a template corresponding to the change in the light spot, the recognition processing unit stores the information on the changed light spot of the image and the image as a template in the template storage unit.
4. The image processing apparatus according to claim 2,
the recognition processing unit includes a vehicle tracking processing unit that predicts a tracking range of a tracked vehicle, and the vehicle tracking processing unit calculates a similarity between the image of the template selected by the template selection unit and the predicted tracking range of the tracked vehicle.
5. The image processing apparatus according to claim 4,
the vehicle tracking processing unit corrects the predicted tracking range of the tracked vehicle based on the calculated similarity.
6. The image processing apparatus according to claim 1,
the recognition processing unit includes a vehicle search unit that searches for a vehicle candidate based on the image information from the imaging unit, and generates a template image of a vehicle image by associating spot information present in the image information with the vehicle image information from the vehicle search unit before starting vehicle tracking, and stores the template image in the template storage unit.
7. An image processing method of detecting a tracked vehicle by template matching, the image processing method being characterized in that,
the template storage unit stores templates of a plurality of vehicle images and spot information,
a light spot is detected from an image captured by an imaging unit,
when the light spot of the image captured by the imaging unit changes, the information of the changed light spot is associated with the template of the vehicle image and stored in the template storage unit.
8. The image processing method according to claim 7,
when the light spot of the image captured by the imaging unit has changed, a template corresponding to the change in the light spot is selected from the templates of the plurality of vehicle images in the template storage unit.
9. The image processing method according to claim 7,
when a light spot of an image captured by the imaging unit has changed and a template corresponding to the change of the light spot is not present in the template storage unit, information on the changed light spot of the image and the image are stored in the template storage unit as templates.
10. The image processing method according to claim 8,
predicting a tracking range of a tracked vehicle, and calculating a similarity between the image of the selected template and the predicted tracking range of the tracked vehicle.
11. The image processing method according to claim 10,
and correcting the predicted tracking range of the tracked vehicle according to the calculated similarity.
12. The image processing method according to claim 7,
before starting the vehicle tracking, a vehicle candidate search is performed based on the image information from the image pickup unit, and a template image of a vehicle image is generated by associating spot information present in the image information based on the image information of the searched vehicle, and is stored in the template storage unit.
CN201980051719.9A 2018-08-22 2019-07-25 Image processing apparatus and image processing method Withdrawn CN112534473A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018155352 2018-08-22
JP2018-155352 2018-08-22
PCT/JP2019/029151 WO2020039838A1 (en) 2018-08-22 2019-07-25 Image processing device and image processing method

Publications (1)

Publication Number Publication Date
CN112534473A true CN112534473A (en) 2021-03-19

Family

ID=69593004

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980051719.9A Withdrawn CN112534473A (en) 2018-08-22 2019-07-25 Image processing apparatus and image processing method

Country Status (3)

Country Link
JP (1) JP7139431B2 (en)
CN (1) CN112534473A (en)
WO (1) WO2020039838A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115571127B (en) * 2022-11-24 2023-04-14 山东欣立得光电科技有限公司 Vehicle cruise system applying lamplight characteristics


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102881186A (en) * 2011-07-11 2013-01-16 歌乐株式会社 Environment recognizing device for a vehicle and vehicle control system using the same
JP2013020417A (en) * 2011-07-11 2013-01-31 Clarion Co Ltd External environment recognizing device for vehicle and vehicle control system using the same
CN103136935A (en) * 2013-01-11 2013-06-05 东南大学 Method for tracking sheltered vehicles
US20160005180A1 (en) * 2013-02-27 2016-01-07 Hitachi Automotive Systems, Ltd. Imaging Apparatus and Vehicle Controller
CN103914853A (en) * 2014-03-19 2014-07-09 华南理工大学 Method for processing target adhesion and splitting conditions in multi-vehicle tracking process
CN107730533A (en) * 2016-08-10 2018-02-23 富士通株式会社 The medium of image processing method, image processing equipment and storage image processing routine
US9934440B1 (en) * 2017-10-04 2018-04-03 StradVision, Inc. Method for monitoring blind spot of monitoring vehicle and blind spot monitor using the same
US9947228B1 (en) * 2017-10-05 2018-04-17 StradVision, Inc. Method for monitoring blind spot of vehicle and blind spot monitor using the same

Also Published As

Publication number Publication date
WO2020039838A1 (en) 2020-02-27
JP7139431B2 (en) 2022-09-20
JPWO2020039838A1 (en) 2021-08-10

Similar Documents

Publication Publication Date Title
US10286834B2 (en) Vehicle exterior environment recognition apparatus
US10442343B2 (en) Vehicle exterior environment recognition apparatus
US8848980B2 (en) Front vehicle detecting method and front vehicle detecting apparatus
US20150073705A1 (en) Vehicle environment recognition apparatus
US20190193739A1 (en) Vehicle control apparatus and vehicle control method
US20170024622A1 (en) Surrounding environment recognition device
JP6085522B2 (en) Image processing device
WO2017134982A1 (en) Imaging device
JP2008207677A (en) Image processing apparatus, image processing method and image processing system
JP2016045903A (en) Object recognition device and vehicle control system
WO2017212992A1 (en) Object distance detection device
CN115151955A (en) System for monitoring the environment of a motor vehicle
CN112534473A (en) Image processing apparatus and image processing method
JP7356319B2 (en) Vehicle exterior environment recognition device
JP2018151999A (en) Object distance detection apparatus
US10351048B2 (en) Headlight control device
WO2019058755A1 (en) Object distance detection device
JP2004130969A (en) Preceding vehicle brake operation judging device and vehicle-to-vehicle distance control device
CN114730520B (en) Semaphore recognition method and semaphore recognition device
JP6335065B2 (en) Outside environment recognition device
CN114746915B (en) Signal machine identification method and signal machine identification device
JP6879881B2 (en) White line recognition device for vehicles
US20180278862A1 (en) Image generating apparatus, image generating method, and recording medium having the program stored thereon
WO2019013253A1 (en) Detection device
JP2012240523A (en) Light distribution control device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: Ibaraki
Applicant after: Hitachi astemo Co.,Ltd.
Address before: Ibaraki
Applicant before: HITACHI AUTOMOTIVE SYSTEMS, Ltd.
WW01 Invention patent application withdrawn after publication
Application publication date: 20210319