CN115668959A - Image processing apparatus and method - Google Patents

Image processing apparatus and method

Info

Publication number
CN115668959A
Authority
CN
China
Prior art keywords
parallax
image
feature point
calculation unit
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180035429.2A
Other languages
Chinese (zh)
Inventor
堤大
三苫宽人
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Astemo Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo Ltd filed Critical Hitachi Astemo Ltd
Publication of CN115668959A publication Critical patent/CN115668959A/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/97 Determining parameters from multiple pictures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an image processing apparatus capable of highly accurate image degradation determination that does not depend on the road being traveled or on the weather. The image processing apparatus includes: a parallax calculation unit (103) that obtains a parallax image; a feature point detection unit (105) that detects feature points (edges and the like) from the image; a parallax defect number calculation unit (107) that refers to the region of the parallax image corresponding to the feature point region obtained by the feature point detection unit (105) and counts the number of predetermined unit regions in which the parallax is defective; and an image degradation determination unit (109) that determines image degradation on the basis of the count result of the parallax defect number calculation unit (107).

Description

Image processing apparatus
Technical Field
The present invention relates to an image processing apparatus that improves the accuracy of image degradation determination for an in-vehicle camera.
Background
In recent years, as a technique related to a function of temporarily stopping processing when the input image of a camera sensor degrades and to the calculation of correction values at the time of such degradation, a technique of detecting the presence or absence of dirt (image degradation) on the windshield of the vehicle in which the camera sensor is mounted has been known. For example, since the driver may be notified when the camera sensor is temporarily stopped, the detection of image degradation must be accurate.
However, depending on weather, road surface conditions, and the like, the outside world captured by the camera sensor may exhibit characteristics similar to those that appear when image degradation occurs, even though no image degradation has actually occurred. Therefore, to detect image degradation with high accuracy, the choice of the feature amount used for the image degradation determination is important.
Conventionally, various techniques and image processing apparatuses for detecting image degradation have been proposed. For example, patent document 1 describes an image processing apparatus that, focusing on the reduction of contrast that accompanies image degradation, sets a plurality of processing regions and determines image degradation when the contrast of each processing region is reduced.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open No. Hei 11-139262.
Disclosure of Invention
Problems to be solved by the invention
However, the conventional technique described in patent document 1 erroneously detects image degradation on snow-covered roads and in severe weather such as snowstorms and fog, where the contrast is reduced even though no image degradation has occurred.
An object of the present invention is to provide an image processing apparatus capable of highly accurate image degradation determination that does not depend on the road being traveled or on the weather.
Means for solving the problems
In order to solve the above problem, an image processing apparatus according to the present invention includes: a parallax calculation unit that obtains a parallax image from a 1st image captured by a 1st camera sensor and a 2nd image captured by a 2nd camera sensor; a feature point detection unit that detects a feature point from either the 1st image or the 2nd image; a parallax defect number calculation unit that refers to the region of the parallax image corresponding to the feature point region obtained by the feature point detection unit and counts the number of predetermined unit regions in which the parallax is defective; and an image degradation determination unit that determines image degradation based on the count result of the parallax defect number calculation unit.
Advantageous Effects of Invention
According to the present invention, a road or weather condition that, despite causing no image degradation, exhibits characteristics similar to those of image degradation can be distinguished from actual image degradation; highly accurate image degradation detection that does not depend on the road being traveled or on the weather can therefore be realized.
Problems, configurations, and effects other than those described above will become apparent from the following description of the embodiments.
Drawings
Fig. 1 is a functional block configuration diagram showing an embodiment of an image processing apparatus according to the present invention.
Fig. 2 is a schematic diagram of image degradation of an input image.
Fig. 3 is a schematic diagram of a parallax image.
Fig. 4 is a schematic diagram of processing performed by the feature point detection unit and the parallax defect number calculation unit when there is a parallax corresponding to the processing window.
Fig. 5 is a schematic diagram of processing performed by the feature point detection unit and the parallax defect number calculation unit when there is no parallax corresponding to the processing window.
Fig. 6 is a schematic diagram of processing performed by the feature point detection unit, the parallax defect number calculation unit, and the image degradation determination unit when there is no parallax corresponding to the processing window.
Fig. 7 is a flowchart showing an example of a processing flow of the feature point detection unit when the feature point is determined on a pixel-by-pixel basis with the feature point being an edge.
Fig. 8 is a flowchart showing an example of the processing flow of the parallax defect number calculation unit when the feature point is determined on a pixel-by-pixel basis.
Fig. 9 is a flowchart showing another example of the processing flow of the feature point detection unit when determining in units of the parallax matching window region with the feature point as an edge.
Fig. 10 is a flowchart showing another example of the processing flow of the parallax defect number calculation unit when the determination is made in units of the parallax matching window region with the feature point as an edge.
Fig. 11 is a flowchart showing an example of the processing flow of the image degradation determination unit.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings.
Fig. 1 is a functional block configuration diagram showing an embodiment of an image processing apparatus according to the present invention. The image processing apparatus according to the present embodiment is mounted on a vehicle, for example, and determines the presence or absence of image degradation such as dirt on the front windshield. As shown in fig. 1, the image processing apparatus 100 of the present embodiment includes a pair of CMOS elements 101 and 102 serving as camera sensors, a parallax calculation unit 103, a feature point detection unit 105, a parallax defect number calculation unit 107, and an image degradation determination unit 109.
The CMOS elements 101 and 102 in fig. 1 each capture an input image. For example, the CMOS elements 101 and 102 are provided as a laterally aligned pair in the vehicle interior, inside the front windshield, facing the front of the vehicle, and capture the outside world ahead of the vehicle from two different imaging points. By using the images captured by the CMOS elements 101 and 102, it is also possible to detect external objects (other vehicles, pedestrians, obstacles, and the like) and to perform vehicle control based on the detection information.
The parallax calculation unit 103 generates a parallax image using the input images captured by the CMOS elements 101 and 102. Specifically, the parallax calculation unit 103 performs stereo matching using the 1st image captured by one of the CMOS elements 101 and 102 (the 1st camera sensor) as the reference image and the 2nd image captured by the other (the 2nd camera sensor) as the search image, thereby generating the parallax image. Since the parallax image can be generated by a general method, a detailed description thereof is omitted.
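The patent leaves the stereo matching method open ("a general method"); purely as an illustration, the following minimal Python/OpenCV sketch shows one common way such a parallax image could be produced. The use of cv2.StereoBM, the parameter values, and the file names are assumptions of this illustration, not part of the patent.

```python
import cv2
import numpy as np

# Minimal sketch (assumption): block-matching stereo as one "general method"
# for producing the parallax image of the parallax calculation unit 103.
left = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)   # 1st image (reference image)
right = cv2.imread("search.png", cv2.IMREAD_GRAYSCALE)     # 2nd image (search image)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
# compute() returns fixed-point disparities scaled by 16; rescale to pixels.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Pixels where matching failed carry a non-positive value; the counting
# stages below treat such pixels as regions with no parallax (defects).
valid = disparity > 0
```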
The feature point detection unit 105 detects feature points in the reference image used for the stereo matching. As the feature point, for example, a brightness-related feature on the image, such as an edge, can be used. When an edge is used as the feature point, the feature point detection unit 105 acquires the difference between the pixel of interest and an adjacent pixel and determines that an edge is present when the difference exceeds a preset threshold value. That is, an edge is a region where the luminance difference (luminance change) between adjacent pixels is large. This processing is performed in units of pixels, and each such pixel is set as a region having an edge (feature point). Alternatively, the determination may be performed in units of the matching window region used for obtaining the parallax (the parallax matching window region): when the number of detected edge pixels is equal to or greater than a predetermined threshold value, that window region is set as a region having an edge (feature point). In the following description, the feature point detection unit 105 may be referred to as the edge detection unit when an edge is used as the feature point.
The parallax defect number calculation unit 107 refers to the region of the parallax image calculated by the parallax calculation unit 103 that corresponds to the feature point region of the reference image detected by the feature point detection unit 105, checks the presence or absence of parallax, and increments the count when no parallax is present (a defect).
The image degradation determination unit 109 determines whether or not image degradation has occurred using the parallax defect number, which is the count result obtained by the parallax defect number calculation unit 107. For example, the image degradation determination unit 109 compares the parallax defect number with a preset threshold value and determines that image degradation has occurred when the parallax defect number exceeds the threshold value.
The details of the processing performed by the parallax calculation unit 103, the feature point detection unit 105, the parallax defect number calculation unit 107, and the image degradation determination unit 109 will be described with reference to figs. 2 to 6. In the description here, an edge is used as the feature point, and the feature point detection unit 105 is therefore referred to as the edge detection unit 105.
Fig. 2 is a diagram defining the state of the input images used in the explanation of the embodiment of the present invention. In the embodiment described here, it is assumed that a pedestrian, a preceding vehicle, left and right white lines, and road surface texture are present on the road surface. The 1st image captured by one of the CMOS elements 101 and 102 (the 1st camera sensor) serves as the reference image for stereo matching, and the 2nd image captured by the other (the 2nd camera sensor) serves as the search image for stereo matching. As shown in fig. 2, it is assumed that image degradation such as blur or a stain has occurred in the 2nd image.
Fig. 3 is a schematic diagram of the parallax image generated using fig. 2 as the input. The matching window block size is indicated by dotted lines, and the parallax is indicated by hatching whose density represents the distance information (dense hatching represents near; light hatching represents far). As shown in fig. 3, parallax carrying near distance information appears in the upper half of the pedestrian, and parallax carrying far distance information appears on the preceding vehicle, but no parallax appears (a defect occurs) in the lower half of the pedestrian because of the image degradation. In reality, parallax also arises from the road surface, the white lines, and other ground features, but it is omitted here for simplicity of explanation.
Fig. 4 shows an outline of the processing performed by the edge detection unit 105 and the parallax defect number calculation unit 107 according to the present embodiment. The edge detection unit 105 detects a region having a large luminance change and sets the region as a processing window. The parallax defect number calculation unit 107 refers to the parallax image block corresponding to the block detected by the edge detection unit 105, checks the presence or absence of parallax, and ends the processing when there is parallax. Fig. 4 shows a case where there is a parallax corresponding to the processing window.
Fig. 5 shows an outline of the processing performed by the edge detection unit 105 and the parallax defect number calculation unit 107 according to the present embodiment. The edge detection unit 105 detects a region having a large luminance change and sets the region as a processing window. The parallax defect number calculation unit 107 refers to the parallax image blocks corresponding to the blocks detected by the edge detection unit 105, checks the presence or absence of parallax, counts up the number of parallax defects when there is no parallax, and ends the processing. Fig. 5 shows a case where there is no disparity corresponding to the processing window.
Fig. 6 is a schematic diagram of the overall processing of the edge detection unit 105, the parallax defect number calculation unit 107, and the image degradation determination unit 109 according to the present embodiment. The processing shown in figs. 4 and 5 is performed on all blocks of the target image, and the number of parallax-defective blocks in the entire image is counted. The image degradation determination unit 109 then compares the counted value (the parallax defect number) with a preset threshold value, and determines that image degradation has occurred if the counted value exceeds the threshold value.
The flow of processing by the feature point detection unit 105, the parallax defect number calculation unit 107, and the image degradation determination unit 109 will be described with reference to figs. 7 to 11. Here, an example in which the determination is performed in units of pixels (every single pixel) with the feature point as an edge (figs. 7, 8, and 11) and an example in which the determination is performed in units of the parallax matching window region with the feature point as an edge (figs. 9, 10, and 11) are described separately. As described above, the parallax matching window region is a predetermined unit region, including a plurality of pixels, used for the parallax calculation.
(Example of determination in units of pixels with the feature point as an edge)
Fig. 7 is a flowchart showing an example of the processing flow of the feature point detection unit 105. As shown in fig. 7, the difference between the pixel of interest and an adjacent pixel of the reference image is acquired (S711), and the luminance difference is compared with a preset threshold value (S713). If the difference exceeds the threshold, the pixel is determined to be an edge (feature point) (S715); if it does not, the pixel is determined not to be an edge (S717). It is then determined whether the pixel of interest has reached the end of the image (S719); if so, the processing ends (S721), and if not, the pixel of interest is shifted and the processing continues (S723).
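Purely for illustration, the pixel-unit flow of fig. 7 might be sketched as follows. The function name, the choice of the horizontal neighbor as the "adjacent pixel", and the vectorization of the pixel-of-interest scan are assumptions of this sketch, not part of the patent.

```python
import numpy as np

def detect_edge_pixels(reference: np.ndarray, threshold: int) -> np.ndarray:
    """Pixel-unit edge (feature point) detection along the lines of fig. 7.

    A pixel is judged to be an edge when the luminance difference from its
    adjacent pixel exceeds a preset threshold (S711-S717); the scan over
    pixels of interest (S719, S723) is expressed here as a vectorized pass.
    """
    img = reference.astype(np.int32)
    edge = np.zeros(img.shape, dtype=bool)
    # Difference between the pixel of interest and its right-hand neighbor
    # (the "adjacent pixel" is assumed horizontal here).
    diff = np.abs(img[:, 1:] - img[:, :-1])
    edge[:, :-1] = diff > threshold
    return edge
```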
Fig. 8 is a flowchart showing an example of the processing flow of the parallax defect number calculation unit 107. As shown in fig. 8, the region of the parallax image corresponding to a pixel determined to be a feature point by the feature point detection unit 105 is referred to (S831). It is determined whether or not parallax is present in the region (S833); if no parallax is present (a defect), the parallax defect number is counted up, and the processing ends (S835).
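A matching sketch of the fig. 8 counting flow, under the same assumptions (in particular, that invalid parallax is encoded as a non-positive value in the parallax image):

```python
def count_parallax_defects_per_pixel(edge: np.ndarray,
                                     disparity: np.ndarray) -> int:
    """Pixel-unit parallax defect counting along the lines of fig. 8.

    For each pixel judged to be a feature point, the corresponding region
    of the parallax image is referred to (S831); pixels without valid
    parallax are counted as defects (S833, S835).
    """
    no_parallax = disparity <= 0   # assumed encoding of "no parallax"
    return int(np.count_nonzero(edge & no_parallax))
```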
By the processing shown in figs. 7 and 8, edges (feature points) can be detected in units of pixels, and the number of pixels with defective parallax can be counted in the same units.
(Example of determination in units of the parallax matching window region with the feature point as an edge)
Fig. 9 is a flowchart showing another example of the processing flow of the feature point detection unit 105. Within a matching window region for which parallax is obtained, the difference between the pixel of interest and an adjacent pixel of the reference image is acquired (S911), and the luminance difference is compared with a preset threshold value (S913). If the difference exceeds the threshold, the pixel is determined to be an edge (feature point) (S915); if it does not, the pixel is determined not to be an edge (S917). It is then determined whether the pixel of interest has reached the end of the matching window region (S919); if not, the pixel of interest is shifted (S923). If the pixel of interest has reached the end of the matching window region, it is determined whether the number of pixels determined to be edges in the matching window region of interest exceeds a preset threshold value (S925); if it does, the region is determined to be a feature point region (S927), and if not, the region is not determined to be a feature point region (S929). Finally, it is determined whether the pixel of interest has reached the end of the image (S931); if so, the processing ends (S933), and if not, the matching window region of interest is shifted and the processing continues (S935).
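For illustration, the window-unit flow of fig. 9 might be sketched as follows, reusing detect_edge_pixels from the earlier sketch. The non-overlapping square windows and all names are assumptions of this sketch, not part of the patent.

```python
def detect_feature_windows(reference: np.ndarray, window: int,
                           pixel_threshold: int,
                           count_threshold: int) -> np.ndarray:
    """Window-unit feature point detection along the lines of fig. 9.

    Edge pixels are detected inside each parallax matching window region
    (S911-S923), and a window is judged to be a feature point region when
    its edge-pixel count exceeds a preset threshold (S925-S929).
    Non-overlapping square windows are an illustrative simplification.
    """
    edge = detect_edge_pixels(reference, pixel_threshold)
    rows, cols = edge.shape[0] // window, edge.shape[1] // window
    feature = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            block = edge[r * window:(r + 1) * window,
                         c * window:(c + 1) * window]
            feature[r, c] = np.count_nonzero(block) > count_threshold
    return feature
```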
Fig. 10 is a flowchart showing another example of the processing flow of the parallax defect number calculation unit 107. As shown in fig. 10, the region of the parallax image corresponding to the region determined to be a feature point by the feature point detection unit 105 (a window region here, in contrast to the pixels of fig. 8) is referred to (S1031). It is determined whether or not parallax is present in the region (S1033); if no parallax is present (a defect), the parallax defect number is counted up, and the processing ends (S1035).
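A matching sketch of the fig. 10 counting flow; reading "no parallax is present in the area" as "the window region holds no valid parallax at all" is an assumption of this sketch.

```python
def count_parallax_defects_per_window(feature: np.ndarray,
                                      disparity: np.ndarray,
                                      window: int) -> int:
    """Window-unit parallax defect counting along the lines of fig. 10.

    For each window judged to be a feature point region, the corresponding
    region of the parallax image is referred to (S1031); windows holding
    no valid parallax are counted as defects (S1033, S1035).
    """
    defects = 0
    rows, cols = feature.shape
    for r in range(rows):
        for c in range(cols):
            if not feature[r, c]:
                continue
            block = disparity[r * window:(r + 1) * window,
                              c * window:(c + 1) * window]
            if not np.any(block > 0):   # no valid parallax in the region
                defects += 1
    return defects
```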
By the processing shown in figs. 9 and 10, edges (feature points) can be detected in units of the parallax matching window region, which is a predetermined unit region including a plurality of pixels, and the number of regions in which the parallax is defective can be counted in the same units.
Fig. 11 is a flowchart showing an example of the processing flow of the image degradation determination unit 109. The processing flow shown in fig. 11 is executed after the parallax defect number has been counted in units of pixels as shown in figs. 7 and 8, or in units of the parallax matching window region as shown in figs. 9 and 10. The parallax defect number is acquired from the parallax defect number calculation unit 107 (S1151), and it is determined whether or not it exceeds a preset threshold value (S1153). If the parallax defect number exceeds the threshold value, it is determined that image degradation has occurred (S1155); otherwise, it is determined that no image degradation has occurred (S1157).
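Finally, the fig. 11 determination and an end-to-end use mirroring fig. 6 might look as follows; the threshold values are placeholders, not values from the patent, and `left` and `disparity` come from the parallax sketch earlier.

```python
def determine_image_degradation(defect_count: int, threshold: int) -> bool:
    """Image degradation determination along the lines of fig. 11.

    The parallax defect number (S1151) is compared with a preset threshold
    (S1153); degradation is reported only when it exceeds the threshold
    (S1155, S1157).
    """
    return defect_count > threshold

# Illustrative end-to-end flow mirroring fig. 6 (pixel-unit variant);
# all names and thresholds here are assumptions of this sketch.
edge = detect_edge_pixels(left, threshold=20)
defect_count = count_parallax_defects_per_pixel(edge, disparity)
degraded = determine_image_degradation(defect_count, threshold=5000)
```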
As described above, the image processing apparatus 100 of the present embodiment includes: the parallax calculation unit 103, which obtains a parallax image from the 1st image captured by the 1st camera sensor and the 2nd image captured by the 2nd camera sensor; the feature point detection unit 105, which detects feature points (edges and the like) from either the 1st image or the 2nd image; the parallax defect number calculation unit 107, which refers to the region of the parallax image corresponding to the feature point region obtained by the feature point detection unit 105 and counts the number of predetermined unit regions in which the parallax is defective; and the image degradation determination unit 109, which determines image degradation based on the count result of the parallax defect number calculation unit 107.
In other words, the image processing apparatus 100 of the present embodiment refers to the region of the parallax image corresponding to the feature point region on the reference image used for stereo matching, counts the number of predetermined unit regions in which the parallax is defective, and determines from that count whether or not image degradation has occurred.
According to the present embodiment, a road or weather condition that, despite causing no image degradation, exhibits characteristics similar to those of image degradation can be distinguished from actual image degradation; highly accurate image degradation detection that does not depend on the road being traveled or on the weather can therefore be realized.
The present invention is not limited to the above embodiment and includes various modifications. For example, the above embodiment has been described in detail to explain the present invention in an easily understandable manner, and the present invention is not necessarily limited to embodiments having all the configurations described.
Each of the above-described configurations, functions, processing units, processing means, and the like may be realized in hardware by designing a part or all of them as, for example, an integrated circuit. Each of the above-described configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that realizes the respective function. Information such as programs, tables, and files for realizing the respective functions may be stored in a storage device such as a memory, a hard disk, or an SSD (Solid State Drive), or in a storage medium such as an IC card, an SD card, or a DVD.
The control lines and information lines shown are those considered necessary for the description; not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be connected to one another.
Description of the symbols
100 image processing apparatus
101, 102 CMOS elements (camera sensors)
103 parallax calculating unit
105 feature point detection unit (edge detection unit)
107 parallax defect number calculating unit
109 image degradation determination unit

Claims (5)

1. An image processing apparatus is characterized by comprising:
a parallax calculation unit that obtains a parallax image from the 1 st image captured by the 1 st camera sensor and the 2 nd image captured by the 2 nd camera sensor;
a feature point detection unit that detects a feature point from either the 1 st image or the 2 nd image;
a parallax defect number calculation unit that refers to the region of the parallax image corresponding to the feature point region obtained by the feature point detection unit and counts the number of predetermined unit regions in which the parallax is defective; and
an image degradation determination unit that determines image degradation based on the count result of the parallax defect number calculation unit.
2. The image processing apparatus according to claim 1,
the feature point is an edge of one of the 1 st image and the 2 nd image.
3. The image processing apparatus according to claim 1,
the feature point detection unit detects feature points in units of pixels, and
the parallax defect number calculation unit counts the number of predetermined unit regions in which the parallax is defective in units of the pixels.
4. The image processing apparatus according to claim 1,
the feature point detection unit detects feature points in units of a parallax matching window region, and
the parallax defect number calculation unit counts the number of predetermined unit regions in which the parallax is defective in units of the parallax matching window region.
5. The image processing apparatus according to claim 1,
the image degradation determination unit compares the count result of the parallax defect number calculation unit with a preset threshold value to determine image degradation.
CN202180035429.2A 2020-05-25 2021-02-05 Image processing apparatus and method Pending CN115668959A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020-090475 2020-05-25
JP2020090475 2020-05-25
PCT/JP2021/004266 WO2021240896A1 (en) 2020-05-25 2021-02-05 Image processing device

Publications (1)

Publication Number Publication Date
CN115668959A 2023-01-31

Family

ID=78744237

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180035429.2A Pending CN115668959A (en) 2020-05-25 2021-02-05 Image processing apparatus and method

Country Status (4)

Country Link
JP (1) JP7354443B2 (en)
CN (1) CN115668959A (en)
DE (1) DE112021001875T5 (en)
WO (1) WO2021240896A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11139262A (en) 1997-11-12 1999-05-25 Fuji Heavy Ind Ltd Device for detecting poor visibility of vehicle window
JPH11157418A (en) * 1997-11-26 1999-06-15 Asmo Co Ltd Wiper and its controlling method
JP3444192B2 (en) * 1998-05-21 2003-09-08 日産自動車株式会社 Imaging environment estimation device
JP7247772B2 (en) * 2019-06-13 2023-03-29 株式会社デンソー Information processing device and driving support system

Also Published As

Publication number Publication date
WO2021240896A1 (en) 2021-12-02
JPWO2021240896A1 (en) 2021-12-02
JP7354443B2 (en) 2023-10-02
DE112021001875T5 (en) 2023-01-12

Similar Documents

Publication Publication Date Title
US10045002B2 (en) Object recognizing apparatus and stain detecting method
US10520309B2 (en) Object recognition device, object recognition method, equipment control system, and distance image generation device
US9846823B2 (en) Traffic lane boundary line extraction apparatus and method of extracting traffic lane boundary line
US9794543B2 (en) Information processing apparatus, image capturing apparatus, control system applicable to moveable apparatus, information processing method, and storage medium of program of method
JP5943084B2 (en) Image processing apparatus, image processing method, and image processing program
US20180129883A1 (en) Detection method and apparatus of a status of a parking lot and electronic equipment
US20150063647A1 (en) Apparatus and method for detecting obstacle
CN111868784B (en) Vehicle-mounted stereo camera
WO2016076449A1 (en) Method and system for detecting an approaching obstacle based on image recognition
US20180012068A1 (en) Moving object detection device, image processing device, moving object detection method, and integrated circuit
KR101268282B1 (en) Lane departure warning system in navigation for vehicle and method thereof
WO2015162910A1 (en) Vehicle-mounted display device, method for controlling vehicle-mounted display device, and program
CN115668959A (en) Image processing apparatus and method
JP2010286995A (en) Image processing system for vehicle
JP6174960B2 (en) Outside environment recognition device
CN111762100B (en) Vehicle camera system and object detection method
CN113614809B (en) Vehicle control arithmetic device, vehicle control device, and vehicle control arithmetic method
JP7134780B2 (en) stereo camera device
US11842552B2 (en) Vehicle exterior environment recognition apparatus
JP7142131B1 (en) Lane detection device, lane detection method, and lane detection program
JP7277666B2 (en) processing equipment
KR102063454B1 (en) Method for determining distance between vehiceles and electrinoc device performing the method
US20220406070A1 (en) Method and device for recognizing an object for a vehicle including a mono camera, and camera system
JP2011191884A (en) White line detection device
JP6334773B2 (en) Stereo camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination