CN112544066A - Image processing apparatus - Google Patents

Image processing apparatus

Info

Publication number
CN112544066A
Authority
CN
China
Prior art keywords
image
unit
processing
image processing
monocular
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980052664.3A
Other languages
Chinese (zh)
Inventor
武藤善之
的野春树
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Automotive Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems Ltd
Publication of CN112544066A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides an image processing device that executes parallax image calculation processing based on difference information between images obtained from a plurality of imaging units, and that sets an image calculation region suited to each of the plurality of imaging units, thereby suppressing the calculation processing load of the entire image control system and achieving high-speed image processing. An image processing apparatus of the present invention includes: an image acquisition unit that obtains a first image captured by a first imaging unit and a second image captured by a second imaging unit; a first calculation region acquisition unit that obtains a first calculation region from the first image; and a second calculation region acquisition unit that obtains a second calculation region, different from the first calculation region, from the second image.

Description

Image processing apparatus
Technical Field
The present invention relates to an image processing apparatus.
Background
For example, an image processing device mounted on a vehicle is known that, in a low-illuminance environment where the scenery ahead cannot be illuminated, displays an infrared image of that scenery on a display unit to assist the driver.
Since such an image processing apparatus displays the entire infrared image, buildings and other objects that do not require attention are also displayed, and detection of the object that should be detected may be delayed.
Patent document 1 discloses a technique of performing image processing on acquired infrared image data and visible image data only for a control target area determined based on vehicle speed information and the like.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open publication No. 2013-042404
Disclosure of Invention
Problems to be solved by the invention
An image processing apparatus using a stereo camera, which employs two cameras as imaging units, is known.
A stereo camera has a plurality of imaging units, and a different image calculation region is set for each imaging unit.
If the technique described in Patent document 1 is applied to an image processing apparatus using a stereo camera, image processing must be performed only on the control target area within each of these different image calculation regions. An image processing device mounted on a vehicle is required to process images at high speed, and because the obtained images may be used to control the operation of the vehicle, the computational processing load on the entire vehicle control system must also be kept low.
The technique described in Patent document 1 does not assume an image processing apparatus that captures a plurality of images, such as a stereo camera, in which a different image calculation region is set for each imaging unit, and therefore it is difficult to suppress the calculation processing load of the entire vehicle control system while increasing the image processing speed.
An object of the present invention is to provide an image processing apparatus that executes parallax image calculation processing based on difference information between images obtained from a plurality of imaging units, and that sets an image calculation region suited to each of the plurality of imaging units, thereby suppressing the calculation processing load of the entire image control system and achieving high-speed image processing.
Means for solving the problems
In order to achieve the above object, the present invention is configured as follows.
An image processing apparatus includes: an image acquisition unit that obtains a first image captured by a first imaging unit and a second image captured by a second imaging unit; a first calculation region acquisition unit that obtains a first calculation region from the first image; and a second calculation region acquisition unit that obtains a second calculation region, different from the first calculation region, from the second image.
Further, a stereo camera device includes: a first imaging unit; a second imaging unit; a first calculation region acquisition unit that acquires a first calculation region from the image acquired by the first imaging unit; and a second calculation region acquisition unit that acquires a second calculation region, different from the first calculation region, from the image acquired by the second imaging unit.
Advantageous effects of the invention
According to the present invention, it is possible to realize an image processing apparatus (stereo camera apparatus) that executes parallax image calculation processing based on difference information between images obtained from a plurality of imaging units, and that sets an image calculation region suited to each of the plurality of imaging units, thereby suppressing the calculation processing load of the entire image control system and increasing the speed of image processing.
Drawings
Fig. 1 is a block configuration diagram of a stereo camera device according to an embodiment of the present invention.
Fig. 2 is a diagram showing an example of a synchronization signal corresponding to the image acquisition timing of each of the first and second imaging units.
Fig. 3 is a flowchart illustrating the process of switching the settings of the first and second imaging units from frame to frame.
Fig. 4 is an explanatory diagram of execution of the three-dimensional object recognition processing using a stereo image.
Fig. 5 is an explanatory diagram of execution of the monocular recognition processing using a monocular image.
Fig. 6 is an explanatory diagram of an example in which the stereoscopic processing and monocular processing are integrated.
Fig. 7 is a diagram showing an example of an image calculation region in the monocular recognition processing of the first imaging unit and the second imaging unit.
Fig. 8 is a diagram showing an example of a flow before the recognition processing is executed when a dedicated image calculation region is set for the recognition target object.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, an example in which the present invention is applied to an in-vehicle stereoscopic camera device as an image processing device will be described.
Examples
Examples of safe-driving support systems for automobiles include inter-vehicle distance warning systems, adaptive cruise control systems, and pre-crash brake systems.
To build such a system, a sensing technique is required that reliably recognizes the environment in front of the vehicle, such as a preceding vehicle, the traveling lane, and obstacles around the host vehicle. One such forward environment recognition sensor is a vehicle-mounted camera device.
Among vehicle-mounted camera devices, a stereo camera device is superior at detecting three-dimensional objects of arbitrary shape, such as pedestrians and curbs, because it can recognize them from the parallax information of the left and right cameras. In a stereo camera device, parallax is calculated from the difference in luminance information between images acquired at the same timing by the imaging elements of the cameras mounted on the left and right with respect to the vehicle traveling direction, so three-dimensional objects can be recognized with high accuracy.
Here, by variably controlling the frame rates of the left and right cameras, images for monocular recognition processing using a single camera can be captured in addition to the images for parallax calculation, so a plurality of image recognition processes, including three-dimensional object recognition processing that uses parallax information, can be executed on the same platform.
However, with such a configuration, a plurality of image recognition processes run concurrently, which increases the arithmetic processing load.
It is therefore important to reduce the calculation processing load of the entire in-vehicle camera device system by optimizing the image calculation region for each recognition process.
An image processing apparatus according to an embodiment of the present invention includes a control unit that switches, at appropriate timings, between stereo image processing that performs parallax calculation and monocular recognition processing that does not, and that independently controls the appropriate exposure control and image calculation region of each imaging unit mounted in the in-vehicle camera apparatus according to the image processing timing.
A representative image processing apparatus of an embodiment of the present invention is a stereo camera apparatus having two imaging elements.
Fig. 1 is a block configuration diagram of a stereo camera apparatus 101 according to an embodiment of the present invention.
In fig. 1, a stereo camera apparatus 101 includes: a first imaging unit 102 and a second imaging unit 103 that are installed on the left and right sides with respect to the vehicle traveling direction and capture a first image and a second image; a first calculation region acquisition unit 104 for setting an image calculation region of a first image captured by the first imaging unit 102; a second calculation region acquisition unit 105 for setting an image calculation region of the second image captured by the second imaging unit 103; a first shutter control unit (first exposure control unit) 106 for setting an exposure control value for a shutter or the like of the first imaging unit 102; and a second shutter control unit (second exposure control unit) 107 for setting an exposure control value of a shutter or the like for the second imaging unit 103.
Further, the stereo camera apparatus 101 includes: a first image acquisition control unit 108 that calculates the image calculation region and exposure control value to be set in the first imaging unit 102; a second image acquisition control unit 109 that calculates the image calculation region and exposure control value to be set in the second imaging unit 103; a first monocular image processing unit 110 that performs monocular image processing using the image acquired by the first imaging unit 102; a second monocular image processing unit 111 that performs monocular image processing using the image acquired by the second imaging unit 103; a stereo image processing unit 112 that performs stereo image processing using the images of both the first imaging unit 102 and the second imaging unit 103; and a recognition processing frame switching unit 113 that performs switching control of the recognition processing on a frame-by-frame basis.
The recognition processing frame switching unit 113 switches the recognition processing frame (image frame) according to which of the stereo image processing unit 112, the first monocular image processing unit 110, and the second monocular image processing unit 111 is to operate and according to the frame timing.
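Purely as an illustration of how the units of Fig. 1 relate to one another, the following Python sketch models the main blocks as plain classes. All class, method, and field names are assumptions introduced for the example; the patent does not prescribe any particular implementation.

```python
# Minimal sketch of the Fig. 1 block structure; names are illustrative only.
from dataclasses import dataclass

@dataclass
class CaptureRequest:
    """Request value sent by a processing unit: image calculation region + exposure."""
    region: tuple            # (x, y, width, height) of the image calculation region
    exposure_value: float    # shutter/exposure control value

class ImagingUnit:
    """Stands in for the first/second imaging units (102, 103)."""
    def capture(self, region, exposure_value):
        # A real device would configure the sensor and read out pixels here.
        return {"region": region, "exposure": exposure_value}

class ImageAcquisitionControlUnit:
    """Stands in for units 108/109: applies the selected request to one imaging unit."""
    def __init__(self, imaging_unit: ImagingUnit):
        self.imaging_unit = imaging_unit

    def acquire(self, request: CaptureRequest):
        return self.imaging_unit.capture(request.region, request.exposure_value)

class RecognitionFrameSwitchingUnit:
    """Stands in for unit 113: decides whether a frame is a stereo or monocular frame."""
    def __init__(self, schedule=("stereo", "monocular")):
        self.schedule = schedule

    def mode_for(self, frame_index: int) -> str:
        return self.schedule[frame_index % len(self.schedule)]
```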
Fig. 2 is a diagram showing an example of the synchronization signal corresponding to the image acquisition timing of each of the first image capturing unit 102 and the second image capturing unit 103.
Fig. 2 shows an example in which a stereoscopic processing image and a monocular processing image are alternately acquired.
At the time of performing the stereo image processing (t0 to t1, the imaging time for stereo processing) in the synchronization signal (synchronization signal of the first imaging unit) 201 of the first imaging unit 102 and the synchronization signal (synchronization signal of the second imaging unit) 202 of the second imaging unit 103, a stereo image 2000a is generated, on which parallax calculation is performed based on the images of both the first imaging unit 102 and the second imaging unit 103.
At the time of performing the monocular image processing (t2 to t3, the imaging time for monocular processing), the image acquired by the first imaging unit 102 is used as the first image 1001 for monocular processing, and the image acquired by the second imaging unit 103 is used as the second image 1002 for monocular processing.
Then, at the next time of performing the stereo image processing (t4 to t5, the imaging time for stereo processing), a stereo image 2000b is generated, on which parallax calculation is performed based on the images of both the first imaging unit 102 and the second imaging unit 103.
Thereafter, similarly, a stereoscopic image, a first monocular image, and a second monocular image are sequentially generated.
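A toy sketch of this alternation is shown below; the frame period and the boundary times standing in for t0 to t5 are invented for the example and are not values from the patent.

```python
# Toy timeline matching Fig. 2: stereo-processing and monocular-processing frames alternate.
def frame_schedule(n_frames, frame_period=0.05):
    """Yield (start_time, end_time, mode) for each frame, alternating modes."""
    for i in range(n_frames):
        start = i * frame_period
        mode = "stereo" if i % 2 == 0 else "monocular"
        yield start, start + frame_period, mode

for start, end, mode in frame_schedule(4):
    print(f"{start:.2f}s-{end:.2f}s: acquire {mode} image")
```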
The recognition processing frame switching unit 113 instructs the first image acquisition control unit 108 and the second image acquisition control unit 109, for each image frame, whether the three-dimensional object recognition processing by the stereo image processing unit 112 or the monocular image processing by the first monocular image processing unit 110 and the second monocular image processing unit 111 is to be executed, and thereby switches the recognition processing frame (the image frame on which recognition processing is performed).
In the stereo image processing and the monocular image processing, the image calculation region and the exposure control value requested of the imaging units (102, 103) differ depending on the recognition target object. The first image acquisition control unit 108 and the second image acquisition control unit 109 therefore use the information from the recognition processing frame switching unit 113 to decide whether to select the request values (image calculation region and exposure control value) of the stereo image processing unit 112 or those of the first monocular image processing unit 110 and the second monocular image processing unit 111.
The first image acquisition control unit 108 and the second image acquisition control unit 109 control the first calculation region acquisition unit 104, the second calculation region acquisition unit 105, the first shutter control unit 106, or the second shutter control unit 107 based on the selected request value of the stereo image processing unit 112, the first monocular image processing unit 110, or the second monocular image processing unit 111.
Then, the first image acquisition control unit 108 transmits the image acquired by the first calculation region acquisition unit 104 to the first monocular image processing unit 110 or the stereo image processing unit 112.
Similarly, the second image acquisition control unit 109 transmits the image acquired by the second calculation region acquisition unit 105 to the second monocular image processing unit 111 or the stereo image processing unit 112.
Fig. 3 is a flowchart explaining how, based on the above, the settings of the first imaging unit 102 and the second imaging unit 103 are switched from frame to frame.
In Fig. 3, the recognition processing frame switching unit 113 first checks which image processing is to be performed in the image frame (step S301). When the image frame is one in which the stereo image processing unit 112 performs processing (yes in step S302), the control values to be set in both the first imaging unit 102 and the second imaging unit 103 for the stereo image processing are acquired (step S303).
When the image frame is one in which the monocular image processing is performed (no in step S302), the control value to be set in the first imaging unit 102 for the first monocular image processing unit 110 is acquired (step S304), and the control value to be set in the second imaging unit 103 for the second monocular image processing unit 111 is acquired (step S305).
After the control values to be set in the respective imaging units (the first imaging unit 102 and the second imaging unit 103) have been acquired, the shutter control value (step S306) and the image calculation region (step S307) are set in the first imaging unit 102. A shutter control value (step S308) and an image calculation region (step S309) are likewise set in the second imaging unit 103.
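The flow of steps S301 to S309 can be pictured with the sketch below. The dictionary keys and the set_shutter/set_region method names are assumptions made for the example; in the device itself the settings are applied through the shutter control units and calculation region acquisition units.

```python
# Sketch of the per-frame setting flow of Fig. 3 (steps S301-S309); names are illustrative.
class ImagingUnitStub:
    def __init__(self, name):
        self.name = name
    def set_shutter(self, value):
        print(f"{self.name}: shutter control value = {value}")      # S306 / S308
    def set_region(self, region):
        print(f"{self.name}: image calculation region = {region}")  # S307 / S309

def configure_frame(frame_is_stereo, stereo_request, mono_request_1, mono_request_2,
                    first_unit, second_unit):
    if frame_is_stereo:                            # S302: stereo-processing frame?
        request_1 = request_2 = stereo_request     # S303: same values for both units
    else:
        request_1 = mono_request_1                 # S304: value for the first imaging unit
        request_2 = mono_request_2                 # S305: value for the second imaging unit
    first_unit.set_shutter(request_1["exposure"])  # S306
    first_unit.set_region(request_1["region"])     # S307
    second_unit.set_shutter(request_2["exposure"]) # S308
    second_unit.set_region(request_2["region"])    # S309

# Example: a monocular frame with different settings for each imaging unit.
configure_frame(False,
                {"exposure": 0.5, "region": (0, 0, 1280, 960)},
                {"exposure": 0.7, "region": (40, 400, 300, 250)},
                {"exposure": 0.3, "region": (500, 80, 400, 220)},
                ImagingUnitStub("first imaging unit"),
                ImagingUnitStub("second imaging unit"))
```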
Here, an example of the flow up to execution of the three-dimensional object recognition processing using a stereo image will be described with reference to the synchronization signal shown in Fig. 2.
Fig. 4 is an explanatory diagram of execution of the three-dimensional object recognition processing using a stereo image.
In Fig. 4, in the image acquisition synchronization signal 401 of the first imaging unit 102 and the second imaging unit 103, image acquisition processing 403 of the image output from an imaging element such as a CMOS sensor is performed in the image frame 402 at the stereo-processing image acquisition time, and image correction processing 404 such as shading correction and gamma correction is performed as appropriate for the acquired image region.
For the regions in which the image correction processing 404 has been completed in both the first imaging unit 102 and the second imaging unit 103, the parallax operation processing 405 is performed based on the difference information between the two images, and the three-dimensional object recognition processing 406 is executed once the parallax operation processing 405 has been completed for the necessary region.
Thereafter, the stereo image processing (synonymous with stereo processing) is executed in the same way.
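As a rough, generic illustration of what a parallax operation such as 405 computes, the sketch below derives a block-wise disparity by searching, for each block of the first image, for the horizontal shift in the second image with the smallest sum of absolute differences. This is a simplified stand-in under the assumption of rectified images, not the algorithm actually used in the device.

```python
import numpy as np

def block_disparity(left, right, block=8, max_disp=32):
    """Block-matching disparity: left is the reference image, right is the shifted view."""
    h, w = left.shape
    disp = np.zeros((h // block, w // block), dtype=np.int32)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block].astype(np.int32)
            best_d, best_cost = 0, None
            for d in range(min(max_disp, x) + 1):              # candidate horizontal shifts
                cand = right[y:y + block, x - d:x - d + block].astype(np.int32)
                cost = int(np.abs(ref - cand).sum())           # sum of absolute differences
                if best_cost is None or cost < best_cost:
                    best_d, best_cost = d, cost
            disp[by, bx] = best_d
    return disp
```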
Next, an example of the flow up to execution of the monocular recognition processing using a monocular image will be described. Fig. 5 is an explanatory diagram of execution of the monocular recognition processing using a monocular image.
In Fig. 5, in the image acquisition synchronization signal 501 of the first imaging unit 102, in the image frame 502 at the first monocular-processing image acquisition time, the image acquisition processing 503 of the output from the imaging element is performed and, as at the stereo-processing image acquisition time of Fig. 4, the image correction processing 504 is performed as appropriate for the acquired image region.
In the monocular recognition processing, the parallax operation with the image acquired by the second imaging unit 103 is not required, so the first monocular image processing 505 is executed once the image correction processing 504 has been completed as appropriate.
Likewise, in the image acquisition synchronization signal 506 of the second imaging unit 103, in the image frame 507 at the second monocular-processing image acquisition time, the image acquisition processing 508 of the output from the imaging element is performed and the image correction processing 509 is performed as appropriate for the acquired image region.
Once the image correction processing 509 has been completed as appropriate, the second monocular image processing 510 is executed.
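A minimal sketch of this per-camera flow is given below, with toy stand-ins for the correction and recognition steps; the actual correction (504, 509) and monocular processing (505, 510) are device-specific and are only assumed here for illustration.

```python
# Sketch of the per-camera monocular flow of Fig. 5: acquisition, correction and
# monocular processing run independently for each imaging unit, with no parallax step.
import numpy as np

def correct_image(raw):
    """Toy stand-in for correction such as shading/gamma correction (504, 509)."""
    img = raw.astype(np.float32)
    img /= max(float(img.max()), 1.0)          # normalize to [0, 1]
    return img ** (1.0 / 2.2)                  # simple gamma curve

def monocular_process(img):
    """Toy stand-in for monocular recognition (505, 510): a crude edge measure."""
    return {"edge_strength": float(np.abs(np.diff(img, axis=1)).mean())}

def run_monocular_frame(raw_first, raw_second):
    # Each imaging unit's image goes through its own pipeline; no cross-camera
    # difference information is computed.
    return (monocular_process(correct_image(raw_first)),
            monocular_process(correct_image(raw_second)))
```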
Next, the stereo processing and monocular processing shown in Figs. 4 and 5 as examples of the flow up to execution of the recognition processing are considered together. Fig. 6 is an explanatory diagram of an example in which the stereo processing and monocular processing are integrated.
When the stereo image processing, the first monocular image processing, and the second monocular image processing are executed on the same platform, as in the example shown in Fig. 6, and the image processing is performed at the timings of the image synchronization signal 601 of the first imaging unit 102 and the image synchronization signal 602 of the second imaging unit 103, there are portions within the image frames 603 and 604 at the stereo-processing image acquisition time where the stereo image processing (three-dimensional object recognition processing) for the preceding stereo-processing image frame overlaps in time with the first and second monocular image processing performed on the images acquired in the preceding monocular-processing image frame.
In this case, if, for example, all of these processes are executed in the same RAM, the amount of bus traffic between the CPU and the RAM increases, and delays arise in the arithmetic processing time of the entire system.
To address this problem, the image calculation region is controlled independently for the first imaging unit 102 and the second imaging unit 103.
Fig. 7 is a diagram showing an example of an image calculation region in the monocular recognition processing of the first imaging unit 102 and the second imaging unit 103.
The example shown in Fig. 7 consists of a left camera image 701 acquired by the imaging unit (the first imaging unit 102 or the second imaging unit 103) mounted on the left side with respect to the vehicle traveling direction, and a right camera image 702 acquired by the imaging unit (the second imaging unit 103 or the first imaging unit 102) mounted on the right side.
In the present embodiment, separate recognition processing is performed by the left and right cameras (the first imaging unit 102 and the second imaging unit 103). For example, in the left camera image 701, the signboard 703 on the road shoulder is assumed to be the first specific recognition target, and in the right camera image 702, the electronic bulletin board 704 above the road is assumed to be the second specific recognition target.
Here, when the recognition processing for the signboard 703 on the road shoulder, the recognition target of the left camera, is executed, it is not necessary to perform image calculation processing on the entire imaging region 701 of the left camera; the processing may be narrowed down to a certain region (first specific region) 705 that contains the recognition target 703.
Similarly, for the electronic bulletin board 704 above the road, the recognition target of the right camera, it is not necessary to perform image calculation processing on the entire imaging region 702 of the right camera; the processing may be narrowed down to a certain region (second specific region) 706 that contains the recognition target 704.
In the stereo image processing as well, an image calculation region dedicated to a specific recognition target may similarly be set and the parallax calculation performed on it.
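The sketch below illustrates this narrowing; all region coordinates and image sizes are hypothetical values chosen only for the example.

```python
# Sketch of restricting computation to the regions of Fig. 7: only the sub-image that
# contains the recognition target (region 705 in the left image, region 706 in the
# right image) is handed to the later processing steps.
import numpy as np

def crop_region(image, region):
    x, y, w, h = region
    return image[y:y + h, x:x + w]

left_image = np.zeros((960, 1280), dtype=np.uint8)   # full left camera image (701)
right_image = np.zeros((960, 1280), dtype=np.uint8)  # full right camera image (702)

left_roi = crop_region(left_image, (40, 400, 300, 250))    # hypothetical region 705
right_roi = crop_region(right_image, (500, 80, 400, 220))  # hypothetical region 706

# Correction, (optional) parallax calculation and recognition then operate on far
# fewer pixels than the full frames.
print(left_roi.shape, right_roi.shape)
```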
Fig. 8 is a diagram showing an example of a flow before the recognition processing is executed when the image calculation region dedicated to the recognition target object is set.
Because the image calculation region is narrowed for each imaging unit (the first imaging unit 102 and the second imaging unit 103), the calculation time for image correction, parallax calculation, recognition processing (three-dimensional object recognition processing, first monocular image processing, and second monocular image processing), and the like can be reduced. The processing load on the entire system of the stereo camera device of the present embodiment can therefore be reduced, and the scalability for executing a plurality of recognition applications in parallel on the same platform can be expanded.
As described above, according to an embodiment of the present invention, in a stereo camera device (image processing device) using a stereo camera, three-dimensional object recognition processing is performed based on the image information from the two imaging units (the first imaging unit 102 and the second imaging unit 103) at the stereo-processing image acquisition time, and monocular recognition processing that acquires other specific information is performed based on image information of other image regions (specific image regions) from the two imaging units at a monocular-processing image acquisition time different from the stereo-processing image acquisition time.
Therefore, it is possible to realize a stereo camera device, serving as an image processing device, that executes parallax image calculation processing based on difference information between images obtained from a plurality of imaging units, and that sets an image calculation region suited to each of the plurality of imaging units, thereby suppressing the calculation processing load of the entire image control system and achieving high-speed image processing.
The image frame 502 at the first monocular-processing image acquisition time and the image frame 507 at the second monocular-processing image acquisition time may be at the same time or at different times.
The above example applies the present invention to an image processing apparatus for a vehicle, but the invention is not limited to vehicles; it can also be applied to other moving objects, to monitoring apparatuses that process images from a plurality of imaging units, and the like.
The present invention requires at least two imaging units, configured so that the image calculation regions of the two imaging units differ from each other. That is, the present invention is applicable to any image processing apparatus that includes an image acquisition unit (including the shutter control units 106 and 107) for acquiring a first image captured by the first imaging unit 102 and a second image captured by the second imaging unit 103, a first calculation region acquisition unit 104 for obtaining a first calculation region from the first image acquired by the image acquisition unit, and a second calculation region acquisition unit 105 for obtaining a second calculation region, different from the first calculation region, from the second image acquired by the image acquisition unit.
The image processing apparatus can be applied to a stereoscopic camera apparatus, and can also be applied to an image processing apparatus having a plurality of imaging units other than the stereoscopic camera apparatus.
The present invention is not limited to the embodiments described above, and other embodiments that can be considered within the scope of the technical idea of the present invention are also included in the scope of the present invention.
Description of the symbols
101: a stereo camera device (image processing device); 102: a first imaging unit; 103: a second photographing section; 104: a first calculation region acquisition unit; 105: a second calculation region acquisition unit; 106: a first shutter control section; 107: a second shutter control section; 108: a first image acquisition control unit; 109: a second image acquisition control unit; 110: a first monocular image processing unit; 111: a second monocular image processing unit; 112: a stereo image processing unit; 113: a recognition processing frame switching unit; 201: a synchronization signal of the first imaging section; 202: a synchronization signal of the second photographing section; 401: an image acquisition synchronization signal; 402: a frame of a stereoscopic image acquisition time; 403: image acquisition processing; 404: image correction processing; 405: performing parallax operation processing; 406: three-dimensional object recognition processing; 501: an image acquisition synchronization signal; 502: a frame of a first monocular-processing image acquisition time; 503: image acquisition processing; 504: image correction processing; 505: processing a first monocular image; 506: an image acquisition synchronization signal; 507: a frame of a second monocular-processing image acquisition time; 508: image acquisition processing; 509: image correction processing; 510: processing the second monocular image; 601: an image synchronization signal; 602: an image synchronization signal; 603: a frame at the time of image acquisition for stereoscopic processing; 604: a frame at the time of image acquisition for stereoscopic processing; 701: a left camera image; 702: a right camera image; 703: signboards for road shoulders; 704: an electronic billboard above the road; 705. 706: a certain area.

Claims (7)

1. An image processing apparatus is characterized by comprising:
an image acquisition unit that obtains a first image captured by the first imaging unit and a second image captured by the second imaging unit;
a first calculation region acquisition unit that obtains a first calculation region from the first image; and
a second calculation region acquisition unit that obtains a second calculation region, different from the first calculation region, from the second image.
2. A stereoscopic camera device is characterized by comprising:
a first imaging unit;
a second imaging unit;
a first calculation region acquisition unit that acquires a first calculation region from the image acquired by the first imaging unit; and
a second calculation region acquisition unit that acquires a second calculation region, different from the first calculation region, from the image acquired by the second imaging unit.
3. The stereoscopic camera apparatus according to claim 2, comprising:
a stereoscopic image processing unit that performs stereoscopic object recognition processing using images of both the first imaging unit and the second imaging unit;
a first monocular image processing unit that performs monocular image processing using an image of the first imaging unit; and
a second monocular image processing unit that performs monocular image processing using the image of the second imaging unit.
4. The stereoscopic camera apparatus according to claim 3, comprising:
a recognition processing frame switching unit that instructs whether the three-dimensional object recognition processing or the monocular image processing is to be performed, and switches between a recognition processing frame in which the three-dimensional object recognition processing is performed and a recognition processing frame in which the monocular image processing is performed;
a first image acquisition control unit that, in accordance with an instruction from the recognition processing frame switching unit, acquires the image obtained by the first calculation region acquisition unit from the first imaging unit in accordance with a request value from the first monocular image processing unit or the stereoscopic image processing unit; and
a second image acquisition control unit that, in accordance with the instruction from the recognition processing frame switching unit, acquires the image obtained by the second calculation region acquisition unit from the second imaging unit in accordance with a request value from the second monocular image processing unit or the stereoscopic image processing unit.
5. The stereoscopic camera apparatus according to claim 4, comprising:
a first exposure control unit that calculates an exposure control value, and sets the exposure control value of the first imaging unit based on the calculated exposure control value; and
a second exposure control unit that calculates an exposure control value, and sets the exposure control value of the second imaging unit based on the calculated exposure control value.
6. The stereoscopic camera apparatus according to claim 5,
the first calculation region is a first specific region including a first specific recognition object, and the second calculation region is a second specific region including a second specific recognition object.
7. The stereo camera apparatus according to any one of claims 2 to 6,
the stereo camera device is a vehicle-mounted camera device mounted on a vehicle.
CN201980052664.3A 2018-08-22 2019-07-25 Image processing apparatus Pending CN112544066A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018155301 2018-08-22
JP2018-155301 2018-08-22
PCT/JP2019/029150 WO2020039837A1 (en) 2018-08-22 2019-07-25 Image processing device

Publications (1)

Publication Number Publication Date
CN112544066A (en) 2021-03-23

Family

ID=69592964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980052664.3A Pending CN112544066A (en) 2018-08-22 2019-07-25 Image processing apparatus

Country Status (3)

Country Link
JP (1) JP7427594B2 (en)
CN (1) CN112544066A (en)
WO (1) WO2020039837A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2023074067A1 (en) * 2021-10-29 2023-05-04

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3797949B2 (en) * 2002-03-28 2006-07-19 株式会社東芝 Image processing apparatus and method
JP2012221103A (en) * 2011-04-06 2012-11-12 Denso Corp Image processing device for vehicle
JP6085522B2 (en) * 2013-05-29 2017-02-22 富士重工業株式会社 Image processing device
JP6762090B2 (en) * 2015-11-11 2020-09-30 日立オートモティブシステムズ株式会社 Object detector

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090153664A1 (en) * 2007-12-14 2009-06-18 Hitachi, Ltd. Stereo Camera Device
CN103403779A (en) * 2011-03-04 2013-11-20 日立汽车系统株式会社 Vehicle-mounted camera and vehicle-mounted camera system
CN106170828A (en) * 2014-04-24 2016-11-30 日立汽车系统株式会社 Extraneous identification device
JP2017121856A (en) * 2016-01-06 2017-07-13 サクサ株式会社 Image processing apparatus

Also Published As

Publication number Publication date
JP7427594B2 (en) 2024-02-05
JPWO2020039837A1 (en) 2021-08-10
WO2020039837A1 (en) 2020-02-27

Similar Documents

Publication Publication Date Title
EP2642429B1 (en) Multi-lens camera system and range-finding method executed by the multi-lens camera system
EP2919197B1 (en) Object detection device and object detection method
JP6660751B2 (en) Imaging device
JP2007172035A (en) Onboard image recognition device, onboard imaging device, onboard imaging controller, warning processor, image recognition method, imaging method and imaging control method
EP3203725B1 (en) Vehicle-mounted image recognition device
JP6723079B2 (en) Object distance detection device
US20200051435A1 (en) Information processing apparatus, information processing method, program, and movable object
JP2009085651A (en) Image processing system
JP2016045903A (en) Object recognition device and vehicle control system
US20170318279A1 (en) Stereo camera apparatus and vehicle comprising the same
JP6139493B2 (en) License plate detection device and license plate detection method
US9967438B2 (en) Image processing apparatus
JP2012073927A (en) Driving support apparatus
JP7427594B2 (en) Image processing device
JP6899673B2 (en) Object distance detector
JP6253175B2 (en) Vehicle external environment recognition device
KR20210023859A (en) Image processing device, mobile device and method, and program
CN108259819B (en) Dynamic image feature enhancement method and system
JP2015011665A (en) Apparatus and method for detecting marking on road surface
JP6891082B2 (en) Object distance detector
JP2013161187A (en) Object recognition device
JP4598011B2 (en) Vehicle display device
WO2021029167A1 (en) Information processing device, information processing method, and information processing program
JP5017921B2 (en) Image processing apparatus for vehicle
JP2022163882A (en) Signal processing device and method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Ibaraki

Applicant after: Hitachi astemo Co.,Ltd.

Address before: Ibaraki

Applicant before: HITACHI AUTOMOTIVE SYSTEMS, Ltd.
