WO2016117213A1 - Imaging apparatus and method, and program and recording medium - Google Patents
Imaging apparatus and method, and program and recording medium
- Publication number
- WO2016117213A1 (PCT/JP2015/081838)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- information indicating
- attention
- captured image
- attention area
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G03B7/091—Digital circuits
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/03—Combinations of cameras with lighting apparatus; Flash units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30236—Traffic on road, railway or crossing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to an imaging apparatus and method.
- the present invention also relates to a program for causing a computer to execute processing in an imaging apparatus or method, and a recording medium on which the program is recorded.
- the conventional imaging apparatus described above determines whether or not a light source or the like is included in the peripheral portion of the image and corrects the backlight; therefore, when the subject of interest is not at the center of the screen, it performs exposure control without being able to detect the backlight state of the subject of interest. For this reason, there is a problem that the subject of interest is crushed to black or blown out to white and cannot be seen. Further, when recognition using the captured image is performed, there is a problem that the recognition rate for the subject of interest is low.
- the present invention has been made to solve the above-described problems, and it is an object of the present invention to suppress the occurrence of blackout or whiteout of a subject of interest so that the subject can be clearly seen.
- the imaging apparatus of the present invention comprises: imaging means for imaging a subject located within an imaging angle-of-view range and generating a captured image; detection means for detecting an object existing in a detection range at least partially overlapping the imaging angle-of-view range and outputting information indicating the direction of the object and information indicating the distance to the object; object-of-interest selection means for selecting one or more objects based on the direction information and the distance information output from the detection means and outputting information indicating the position of each selected object in the captured image and information indicating the distance to the selected object; attention area specifying means for specifying, from the position information and the distance information output from the object-of-interest selection means, an area of the captured image including the selected object as an attention area; brightness calculation means for calculating the brightness of the captured image in the attention area specified by the attention area specifying means; and exposure control means for controlling exposure of the imaging means based on the brightness calculated by the brightness calculation means.
- the image portion corresponding to the selected object is less likely to be blacked out or blown out, and thus the selected object can be clearly seen.
- FIG. 5 is a diagram showing the positions of the objects detected by the radar within the detection range of FIG. 4. FIG. 6 is a diagram showing an example of the attention area specified for one object selected, based on distance, from among the objects shown in FIG. 5. FIG. 7 is a diagram showing an example of the image obtained as a result of performing exposure control using the attention area of FIG. 6 as a detection frame.
- FIG. 9 is a diagram showing the positions of the objects detected by the radar within the detection range for the scene of FIG. 8. FIG. 10 is a diagram showing an example of the attention area specified for one object selected, based on distance, from among the objects shown in FIG. 9. FIG. 11 is a diagram showing an example of the image obtained as a result of performing exposure control using the attention area of FIG. 10 as a detection frame.
- FIG. 16 is a diagram showing an example of the image obtained as a result of performing exposure control using the attention area of FIG. 15 as a detection frame. FIG. 17 is a block diagram showing the configuration of the imaging apparatus according to Embodiment 3 of the present invention. FIG. 18 is a diagram showing an example of the attention area specified for the object selected in Embodiment 3.
- FIG. 26 is a diagram illustrating an example of an image obtained as a result of performing exposure control using the attention area of FIG. 25 as a detection frame.
- FIG. 11 is a block diagram illustrating a computer system that constitutes the imaging apparatus according to the first, third, fifth, or sixth embodiment.
- FIG. 10 is a block diagram illustrating a computer system that constitutes the imaging apparatus according to the second or fourth embodiment.
- FIG. 1 is a block diagram showing the configuration of the imaging apparatus according to Embodiment 1 of the present invention.
- the illustrated imaging apparatus is mounted on a vehicle, and includes a lens 1, an imaging element 2, a camera signal processing circuit 3, a radar 4, an attention object selection circuit 5, an attention area specifying circuit 6, a brightness calculation circuit 7, an exposure control circuit 8, and an output terminal 9.
- the lens 1 guides light from a subject located in the imaging field angle range onto the imaging surface of the imaging element 2 and forms a subject image on the imaging surface.
- the image sensor 2 photoelectrically converts a subject image formed on the imaging surface to generate an imaging signal representing the captured image.
- the imaging signal D2 generated by the imaging element 2 is supplied to the camera signal processing circuit 3.
- the imaging element 2 is provided so as to image the front of the vehicle body of the vehicle (host vehicle) on which the imaging device is mounted, and the direction of the optical axis of the lens 1 is assumed to coincide with the front direction of the vehicle body of the host vehicle.
- the camera signal processing circuit 3 performs color synchronization processing, signal amplification processing, gradation correction processing, noise reduction processing, contour correction processing, white balance adjustment processing, and color correction processing on the captured image of each frame output from the image sensor 2, and outputs the time series of images obtained as a result of these processings (the signal-processed captured images) as a video signal D3.
- the video signal D3 is output from the output terminal 9.
- the output video signal D3 is used for object recognition in, for example, driving support processing.
- the camera signal processing circuit 3 also outputs a luminance signal Y3 representing the luminance of each pixel of the captured image of each frame represented by the video signal D3.
- the luminance signal Y3 is supplied to the brightness calculation circuit 7.
- In the color synchronization (demosaicing) processing, the missing color components are interpolated at each pixel position.
- In the image sensor, only one of the R, G, and B components exists at each pixel position, so the missing components at each position are interpolated using the pixel data of the same color from surrounding pixels. Accordingly, a video signal in which all of the R, G, and B pixel data are present at every pixel position can be generated.
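As an illustration of this interpolation, here is a minimal sketch (not from the patent) of bilinear demosaicing for an RGGB Bayer mosaic, assuming SciPy is available; each missing component is averaged from the nearest same-color samples:

```python
# Hedged sketch: bilinear demosaicing of an RGGB Bayer mosaic.
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """raw: HxW mosaic with RGGB pattern. Returns an HxWx3 RGB image."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # R samples
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # B samples
    g_mask = 1.0 - r_mask - b_mask                      # G samples (quincunx)

    # Kernels that average the available same-color neighbors.
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4.0
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4.0

    r = convolve(raw * r_mask, k_rb, mode="mirror")
    g = convolve(raw * g_mask, k_g,  mode="mirror")
    b = convolve(raw * b_mask, k_rb, mode="mirror")
    return np.dstack([r, g, b])
```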
- In the signal amplification processing, the signal is amplified with a gain set by the exposure control circuit 8 (described later) based on the detected brightness of the image.
- In the gradation correction processing, for example, gradation correction is performed with reference to a table so as to obtain a gamma gradation characteristic.
- In the noise reduction processing, noise is reduced by performing at least one of spatial smoothing and temporal smoothing.
- In the contour correction processing, contours dulled by the smoothing processing or the like are restored by emphasis with, for example, a high-pass filter.
- In the white balance adjustment processing, the R and B signals are detected, for example their average values are obtained and compared, and the R and B signals (or the R-Y and B-Y signals) are balanced against each other.
- In the color correction processing, the hue and saturation are corrected by performing a matrix operation on the R, G, and B signals.
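For the amplification and gradation steps, a minimal sketch follows; the gain value would come from the exposure control described later, and the 8-bit lookup table is an assumption for illustration, not a detail from the patent:

```python
# Hedged sketch: signal amplification followed by table-based gamma correction.
import numpy as np

def amplify_and_gamma(img: np.ndarray, gain: float, gamma: float = 1 / 2.2) -> np.ndarray:
    """img: float image in [0, 1]; gain: from the exposure control loop."""
    amplified = np.clip(img * gain, 0.0, 1.0)
    # Lookup table realizing the gamma characteristic (built per call for clarity).
    lut = (np.linspace(0.0, 1.0, 256) ** gamma * 255).astype(np.uint8)
    idx = (amplified * 255).astype(np.uint8)
    return lut[idx]
```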
- the radar 4 as detection means detects the position of one or more objects existing within the detection range.
- the detection range of the radar 4 is the same as the imaging field angle range of the imaging device 2.
- the radar 4 includes a transmission unit 41 that generates a transmission signal, a highly directional antenna 42 that transmits a radio wave corresponding to the transmission signal in a specific direction and receives the reflected wave from each object, and a receiving unit 43 that extracts the reflected waves from the radio waves received by the antenna 42; the direction of and distance to each object within the detection range are measured from the transmitted and reflected waves.
- the radar 4 outputs information (direction information) D4a indicating the measured direction and information (distance information) D4b indicating the distance as position information D4 of each object.
- the direction indicated by the direction information D4a is associated with the position in the captured image of the image sensor 2.
- the position in the captured image is represented by, for example, a horizontal position and a vertical position in the captured image.
- the position information D4 is supplied to the target object selection circuit 5.
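The patent states only that the radar's direction information is associated with a position in the captured image; it does not specify how the association is computed. A sketch under the assumption of a pinhole camera model, with a hypothetical focal length in pixels, might look like this:

```python
# Hedged sketch: mapping a radar direction to a pixel position, assuming a
# pinhole model. f_px, width, and height are hypothetical placeholders.
import math

def radar_dir_to_pixel(azimuth_rad: float, elevation_rad: float,
                       f_px: float = 1000.0, width: int = 1920, height: int = 1080):
    """Angles are measured from the optical axis (= vehicle front direction)."""
    u = width / 2 + f_px * math.tan(azimuth_rad)      # horizontal position
    v = height / 2 - f_px * math.tan(elevation_rad)   # vertical position
    return u, v
```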
- the target object selection circuit 5 selects the target object based on the position information D4 of each object from the radar 4. In selecting a target object, the degree of attention of each object is evaluated, and an object is selected based on the evaluation result. For example, an object with the highest degree of attention is identified and the identified object is selected.
- the target object selection circuit 5 supplies information D5 indicating the position of the selected object to the target area specifying circuit 6.
- the information D5 indicating the position of the selected object includes information D5a indicating the position of the selected object in the captured image and information D5b indicating the distance from the host vehicle to the object.
- the direction indicated by the direction information D4a included in the position information D4 output from the radar 4 is associated with the position in the captured image. Therefore, the corresponding position in the captured image can be specified from the direction information D4a about each object output from the radar 4.
- the target object selection circuit 5 further extracts information D5b indicating the distance to the selected object from the distance information D4b output from the radar 4 for each object.
- the attention area specifying circuit 6 specifies the attention area in the captured image based on the position of the object indicated by the information D5 output from the attention object selection circuit 5, that is, the position of the object selected by the attention object selection circuit 5. Then, information (attention area information) D6 indicating the identified area is output.
- the attention area information D6 is supplied to the brightness calculation circuit 7.
- based on the information D6 indicating the attention area output from the attention area specifying circuit 6, the brightness calculation circuit 7 detects the luminance signal Y3 output from the camera signal processing circuit 3 using the attention area as a detection frame. For example, the brightness calculation circuit 7 calculates the average value (luminance average value) Yav of the luminance signal Y3 of the pixels included in the detection frame and supplies it to the exposure control circuit 8 as information indicating the brightness. The luminance average value thus obtained is used as a brightness index value.
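A minimal sketch of this brightness index calculation, assuming the attention area is an axis-aligned rectangle in the luminance plane:

```python
# Hedged sketch: luminance average value Yav over the detection frame.
import numpy as np

def brightness_index(y: np.ndarray, roi: tuple[int, int, int, int]) -> float:
    """y: HxW luminance plane (signal Y3); roi: (left, top, right, bottom)."""
    left, top, right, bottom = roi
    return float(np.mean(y[top:bottom, left:right]))
```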
- the exposure control circuit 8 compares the brightness average value Yav calculated by the brightness calculation circuit 7 with the brightness target value Yrf, and performs exposure control based on the comparison result.
- the exposure control is performed by controlling exposure parameters, that is, by controlling the exposure time Te of the image sensor 2 and the signal amplification gain Gs of the camera signal processing circuit 3.
- the exposure control circuit 8 supplies a control signal Ct for controlling the exposure time Te to the image sensor 2 and supplies a control signal Cg for controlling the signal amplification gain Gs to the camera signal processing circuit 3.
- the exposure control circuit 8 includes a memory 81.
- the memory 81 stores the value (parameter) of the exposure time Te used for imaging with the imaging device 2 and the value (parameter) of the signal amplification gain Gs used for signal amplification in the camera signal processing circuit 3.
- the exposure control circuit 8 adjusts the exposure time Te and the signal amplification gain Gs.
- if the luminance average value Yav is larger than the luminance target value Yrf, the exposure time is shortened or the signal amplification gain is reduced. Conversely, if the luminance average value Yav is smaller than the luminance target value Yrf, the exposure time is lengthened or the signal amplification gain is increased.
- whether to change the exposure time or the signal amplification gain is determined so as to minimize noise. For example, as the subject gradually darkens, the exposure time is lengthened first, and only once the exposure time reaches its maximum is the signal amplification gain increased.
- the maximum value of the exposure time is one frame period, or the time that is the allowable limit of motion blur; this limit is shorter the more vigorously the subject moves.
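A sketch of this adjustment policy follows; the step size, limits, and the choice to undo the gain before shortening the exposure when the scene brightens are assumptions, not values from the patent:

```python
# Hedged sketch: one step of the exposure feedback loop described above.
def update_exposure(yav: float, yrf: float, te: float, gs: float,
                    te_max: float, te_min: float = 1e-4,
                    gs_max: float = 16.0, step: float = 1.05):
    if yav > yrf:            # too bright: lower gain first, then shorten exposure
        if gs > 1.0:
            gs = max(1.0, gs / step)
        else:
            te = max(te_min, te / step)
    elif yav < yrf:          # too dark: lengthen exposure first, then raise gain
        if te < te_max:
            te = min(te_max, te * step)
        else:
            gs = min(gs_max, gs * step)
    return te, gs            # written back (memory 81) for the next frame
```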
- the exposure control may be performed by controlling the aperture of the lens 1.
- when the exposure time Te or the signal amplification gain Gs is adjusted, the value of the exposure time or the signal amplification gain stored in the memory 81 is rewritten (updated) with the adjusted value.
- the adjusted exposure time Te or the value of the signal amplification gain Gs is supplied to the image pickup device 2 and the camera signal processing circuit 3 and used for exposure control at the time of imaging after the next frame period.
- by the exposure control circuit 8 controlling the exposure time of the image sensor 2 and the signal amplification gain of the camera signal processing circuit 3, the image represented by the video signal output from the camera signal processing circuit 3 is controlled so that the object in the attention area can be seen with appropriate brightness.
- the imaging apparatus of the present embodiment is applied to capture and recognize an object in front of a vehicle.
- the imaging device is mounted on the vehicle and images the front of the vehicle.
- FIG. 3 shows an example of a scene in front of the host vehicle according to the present embodiment.
- this scene is the subject to be imaged by the image sensor 2 and also contains the objects to be detected by the radar 4.
- the host vehicle (not shown) travels in the overtaking lane TFL of the two-lane road and approaches the tunnel TN.
- the small vehicle A is traveling in the overtaking lane TFL in the tunnel TN
- the large vehicle B is traveling in the traveling lane TSL in front of the tunnel TN
- the two-wheeled vehicle D is traveling following the large vehicle B.
- a small vehicle C that has come out of the tunnel TN is traveling in a traveling lane RSL in the opposite direction. No vehicle is driving in the overtaking lane RFL of the oncoming lane.
- Examples of large vehicles include large trucks and large buses.
- An example of a small vehicle is a passenger car.
- the vehicles A to D may be referred to as objects. People are also treated as a kind of object.
- a lane line TLM is drawn between the traveling lane TSL in the traveling direction and the overtaking lane TFL.
- a lane division line RLM is drawn between the traveling lane RSL and the overtaking lane RFL in the opposite direction.
- the time zone is daytime, the weather is clear, the outside of the tunnel TN is bright, and the inside of the tunnel TN is dark.
- for the detection range of FIG. 4, the radar 4 supplies the position information D4 of the detected objects to the target object selection circuit 5.
- the positions of the objects detected by the radar 4 are shown by black circles in FIG. 5.
- in FIG. 5, the background scene (tunnel TN, median strip MD, and lane markings TLM and RLM) is shown overlapped in order to make the positional relationships of the detected objects easy to understand.
- the radar 4 detects an object A, an object B, an object C, and an object D.
- as the information (position information) D4 indicating the position of each detected object, the radar 4 outputs, for example, information (direction information) D4a indicating the direction of the object as viewed from the host vehicle and information (distance information) D4b indicating the distance from the host vehicle to the object.
- Information indicating the position of the center of the object is output as the position information D4.
- the direction of the center in the direction range in which the reflected wave from the same object is received or the direction in which the intensity of the reflected wave is the strongest from the same object is regarded as the direction of the center of the object.
- the distance calculated based on the reflected wave from the direction of the center of the object is treated as the distance to the center of the object.
- the radar 4 is assumed to have relatively low performance: the size and shape of an object cannot be accurately determined from its output, and therefore the type of the object cannot be specified.
- the target object selection circuit 5 evaluates the attention degree of each object based on the position information D4 of each object from the radar 4, selects the object with the highest degree of attention, and outputs information D5 indicating the position of the selected object.
- the information D5 indicating the position of the object includes information D5a indicating the position in the captured image of the object and information D5b indicating the distance to the object.
- the information D5a indicating the position of the object in the captured image is information indicating the position of the center of the object in the captured image, for example.
- the information D5b indicating the distance to the object is information indicating the distance to the center of the object, for example.
- the output information D5 is supplied to the attention area specifying circuit 6.
- the direction indicated by the direction information D4a output from the radar 4 is associated with a position in the captured image, so the position in the captured image corresponding to the center of each object detected by the radar 4 can be specified. Therefore, the information D5a indicating the position of an object in the captured image can be generated from the information D4a indicating the direction of the same object output from the radar 4.
- as the information D5b indicating the distance to each object, the information D4b indicating the distance to the same object output from the radar 4 can be used as it is.
- in this example, where the degree of attention is higher for shorter distances, the target object selection circuit 5 selects the object D, which is closest to the host vehicle.
- the attention area identification circuit 6 identifies the attention area in the captured image for the object selected by the attention object selection circuit 5. For example, an area having a size corresponding to the distance to the object with the center of the object selected by the object-of-interest selection circuit 5 as the center is specified as the area of interest.
- the attention area is a rectangular area having a pair of sides extending in the horizontal direction and a pair of sides extending in the vertical direction.
- the size of the attention area is set to the size that the object is estimated to occupy in the captured image when the selected object is assumed to be the largest of the plural assumed object types, that is, a large vehicle; in other words, the attention area is set so as to include the image portion corresponding to the object. In estimating this size, the distance to the selected object is taken into account, because even the same large vehicle appears at different sizes in the captured image depending on its distance.
- the selected object is assumed to be the largest of the plural possible object types, that is, a large vehicle, so that the portion of the captured image corresponding to the object (the object as it appears in the captured image) is included in the attention area regardless of the actual type of the object.
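A sketch of this sizing rule, assuming a pinhole projection and hypothetical physical dimensions for the largest object type (a large vehicle):

```python
# Hedged sketch: attention area sized as the image footprint of a large
# vehicle at the measured distance. Dimensions and f_px are placeholders.
LARGE_VEHICLE_W_M = 2.5   # assumed width of a large vehicle, meters
LARGE_VEHICLE_H_M = 3.5   # assumed height, meters

def attention_area(cx: float, cy: float, dist_m: float, f_px: float = 1000.0):
    """Rectangle centered on (cx, cy); size shrinks with distance."""
    w = f_px * LARGE_VEHICLE_W_M / dist_m
    h = f_px * LARGE_VEHICLE_H_M / dist_m
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)  # l, t, r, b
```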
- FIG. 6 shows a case where a rectangular area including the selected object D is specified as the attention area Rd.
- in FIG. 6, the objects A, B, and C are shown together with the background scene (tunnel TN, median strip MD, lane markings) to make the positional relationship of the attention area Rd easy to understand.
- the attention area specifying circuit 6 generates information for specifying the attention area Rd shown in FIG. 6 and supplies the information to the brightness calculation circuit 7.
- the brightness calculation circuit 7 detects the luminance signal Y3 output from the camera signal processing circuit 3 using the attention area Rd as a detection frame, calculates the average luminance value Yav, and supplies the calculation result to the exposure control circuit 8.
- the exposure control circuit 8 compares the brightness average value Yav calculated by the brightness calculation circuit 7 with the brightness target value Yrf, and performs exposure control based on the comparison result.
- since the entire attention area Rd is outside the tunnel TN, the luminance average value Yav of the portion of the captured image within the attention area Rd is high.
- when the luminance average value Yav is high, adjustment for shortening the exposure time or reducing the signal amplification gain is performed.
- the brightness of the image represented by the video signal D3 output from the camera signal processing circuit 3 changes, and an image whose brightness is optimally controlled in the attention area Rd is obtained. That is, an image with high visibility of the object D in the attention area Rd can be obtained without causing the image portion in the attention area Rd to be overexposed.
- in the resulting image, although the inside of the tunnel TN is crushed to black, the brightness is appropriately controlled outside the tunnel TN, particularly in the attention area Rd, and an image with high visibility of the object D located inside the attention area Rd is obtained.
- FIG. 8 shows an example of a scene in front of the host vehicle near the exit of the tunnel.
- FIG. 8 shows that the host vehicle is traveling in the overtaking lane TFL as in FIG. 3, with the same objects A to D as in FIG. 3 at the same relative positions with respect to the host vehicle; the objects B, C, and D are inside the tunnel and have low luminance, while the object A on the far side is outside the tunnel and has high luminance.
- the imaging angle-of-view range of the image sensor 2 and the detection range of the radar 4 are also unchanged.
- FIG. 9 shows the positions of the objects detected by the radar 4 in the forward scene of FIG. 8. As described above, since the relative positions of the objects A to D with respect to the host vehicle are the same as in FIG. 3, the detection result of the radar 4 is the same as in FIG. 5.
- the attention object selection circuit 5 selects the object with the highest degree of attention, and the attention area specifying circuit 6 specifies the attention area in the captured image. This operation is the same as in the case of FIG. 5, that is, when the scene is near the entrance of the tunnel.
- in this case, the attention area Rd is located inside the tunnel TN and its luminance average value is low, so, for example, adjustment for increasing the exposure time or increasing the signal amplification gain is performed. As a result, as shown in FIG. 11, although the object A located outside the tunnel is whited out, an image with high visibility of the object D is obtained.
- as described above, in the first embodiment, an image in which the brightness of the object to be noted is appropriately controlled is obtained by combining a radar and an image sensor.
- with the radar alone, the type of an obstacle cannot be specified; for example, it may be impossible to determine whether the object is a vehicle.
- even so, by combining the radar and the image sensor, exposure control is performed so that an object with a high degree of attention among the objects detected by the radar is optimally exposed, and such an object can be visually recognized without being crushed to black or blown out to white. This is therefore effective in assisting driving and preventing accidents.
- any radar can be used as long as it provides information indicating the direction of and distance to an object, so a relatively low-performance, low-cost radar suffices, and the imaging apparatus can be realized at low cost.
- the object closest to the host vehicle is selected from the objects detected by the radar, and the area including the object is set as the attention area, so that the visibility to the closest object is improved. It is effective in avoiding the danger of an impending collision.
- Embodiment 2. FIG. 12 is a block diagram illustrating the configuration of the imaging apparatus according to the second embodiment of the present invention.
- the imaging apparatus in FIG. 12 is generally the same as the imaging apparatus in FIG. 1, but differs in that an attention object selection circuit 5 b is provided instead of the attention object selection circuit 5 and a traveling direction detection circuit 10 is added.
- the operations of the constituent elements other than those described above are the same as those described with reference to FIG.
- the traveling direction detection circuit 10 detects the traveling direction of the host vehicle from the steering direction of the steering wheel of the host vehicle, and supplies information D10 indicating the traveling direction to the target object selection circuit 5b.
- the target object selection circuit 5b in FIG. 12 is generally the same as the target object selection circuit 5 in FIG. 1, but differs in the following points.
- the target object selection circuit 5b evaluates the degree of attention of each object based on the information D10 indicating the traveling direction of the host vehicle supplied from the traveling direction detection circuit 10 and the position information D4 (direction information D4a and distance information D4b) of each object supplied from the radar 4, selects an object having a relatively high degree of attention, for example the highest, based on the evaluation result, and supplies information D5 indicating the position of the selected object to the attention area specifying circuit 6.
- for example, the direction evaluation value is made larger as the direction of the object is closer to the traveling direction of the host vehicle, the distance evaluation value is made larger as the distance to the object is shorter, and the degree of attention is determined by combining the direction evaluation value and the distance evaluation value. In this case, the higher the direction evaluation value and the higher the distance evaluation value, the higher the degree of attention. The object with the highest degree of attention is then selected.
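A minimal sketch of such a combined evaluation; the particular score functions and weights are assumptions, not values from the patent:

```python
# Hedged sketch: degree of attention combining direction and distance scores.
def attention_degree(obj_dir_rad: float, travel_dir_rad: float,
                     dist_m: float, w_dir: float = 1.0, w_dist: float = 1.0) -> float:
    dir_score = 1.0 / (1.0 + abs(obj_dir_rad - travel_dir_rad))  # closer to travel dir -> higher
    dist_score = 1.0 / (1.0 + dist_m)                            # closer object -> higher
    return w_dir * dir_score + w_dist * dist_score

# The object with the highest degree of attention is then selected, e.g.:
# selected = max(objects, key=lambda o: attention_degree(o.direction, travel, o.dist))
```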
- assume that the radar 4 detects the objects A to D shown in FIG. 5 and that the target object selection circuit 5b evaluates the degree of attention of the detected objects A to D with emphasis on the traveling direction of the host vehicle; the degree of attention of the object A, which is in the same lane as the host vehicle, is then determined to be the highest.
- the attention object selecting circuit 5b notifies the attention area specifying circuit 6 of the position of the object A determined to have the highest degree of attention.
- the attention area specifying circuit 6 specifies the attention area for the position of the object A having a high degree of attention. Also in this case, as described in the first embodiment, a region Ra is specified that is centered on the center of the object A selected by the target object selection circuit 5b and whose size is determined according to the distance on the assumption that the object A is a large vehicle. For example, the region Ra shown in FIG. 13 is specified and notified to the brightness calculation circuit 7 as the attention area.
- the brightness calculation circuit 7 detects the luminance signal Y3 output from the camera signal processing circuit 3 using the attention area Ra as a detection frame, calculates the average luminance value Yav, and supplies the calculation result to the exposure control circuit 8.
- the exposure control circuit 8 performs exposure control based on the average brightness value Yav.
- the object A is in the tunnel TN, and the region Ra is in the portion of the tunnel TN in the captured image. Since the region Ra in the tunnel TN has a relatively low average luminance value, for example, adjustment for increasing the exposure time or adjustment for increasing the signal amplification gain is performed. As a result, a captured image whose brightness is optimally controlled in the attention area Ra is obtained. That is, an image with high visibility of the object A in the attention area Ra can be obtained without the image portion in the attention area Ra being blacked out.
- as shown in FIG. 14, although the outside of the tunnel TN is whited out, an image in which the brightness inside the tunnel TN is appropriately controlled is obtained.
- in FIG. 14, the vehicles B, C, and D are not drawn, to show that they are whited out.
- the median strip MD and the lane markings LM may also be invisible in the image due to overexposure, but they are drawn in FIG. 14 to show the positional relationships.
- in this way, the brightness of the attention area inside the tunnel TN, and therefore of the object A located inside it, is appropriately controlled, and an image with high visibility of the object A is obtained.
- in the scene near the tunnel exit as well, the attention object selection circuit 5b evaluates the degree of attention with emphasis on the traveling direction of the host vehicle and, as a result, determines that the object A located in the same lane as the host vehicle has the highest degree of attention. The attention area specifying circuit 6 then specifies the attention area Ra corresponding to the object A, as shown in FIG. 15.
- in this case, since the attention area Ra is outside the tunnel and bright, the exposure control circuit 8 performs, for example, adjustment for shortening the exposure time or reducing the signal amplification gain.
- as a result, although the objects B, C, and D located in the tunnel are crushed to black, an image with high visibility of the object A is obtained, as shown in FIG. 16.
- in the above example, the imaging direction of the image sensor 2 matches the front direction of the vehicle body of the host vehicle, but the imaging direction of the image sensor 2 need not match the front direction of the vehicle body.
- in that case, the traveling direction of the host vehicle may be calculated based on the angle formed by the imaging direction of the image sensor 2 with respect to the front direction of the vehicle body of the host vehicle and the steering direction of the steering wheel.
- as described above, in the second embodiment, exposure control is performed so that, among the objects detected by the radar, an object closer to the traveling direction of the host vehicle and closer to the host vehicle is optimally exposed, which is effective in preventing accidents.
- FIG. 17 is a block diagram illustrating a configuration of the imaging apparatus according to the third embodiment of the present invention.
- the imaging apparatus of FIG. 17 is generally the same as the imaging apparatus of FIG. 1, but an attention object discrimination circuit 11 is added, and an attention area identification circuit 6 c is provided instead of the attention area identification circuit 6.
- the target object discrimination circuit 11 receives the video signal D3 from the camera signal processing circuit 3, and receives information D5a indicating the position of the target object in the captured image from the target object selection circuit 5.
- the object-of-interest determination circuit 11 analyzes an image of a region (analysis region) having a predetermined size at the position indicated by the information D5a in the video signal D3, and determines the type of the object.
- in determining the type of the object, it is determined whether the object corresponding to the image portion included in the analysis area is a large vehicle such as a truck or a bus, a small vehicle such as a passenger car, a two-wheeled vehicle, or a pedestrian.
- Information D11 indicating the determination result is supplied to the attention area specifying circuit 6c.
- the size of the analysis region is determined so as to match the size, in the image, of the image portion corresponding to the selected object on the assumption that the object selected by the target object selection circuit 5 is a large vehicle. Again, the distance to the object is taken into account.
- the attention area specifying circuit 6c receives the information D5 indicating the position of the selected object from the attention object selection circuit 5, and receives the information D11 indicating the type of the selected object from the attention object determination circuit 11, and receives these information. Based on the above, the attention area corresponding to the selected object is specified.
- when the selected object is determined to be a large vehicle, the attention area specified by the attention area specifying circuit 6c is the same as the attention area in the first embodiment.
- when the selected object is determined to be of a smaller type, the attention area specified by the attention area specifying circuit 6c is smaller than the attention area in the first embodiment.
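A sketch of type-dependent sizing under the same pinhole assumption as before; the per-type physical dimensions are hypothetical placeholders:

```python
# Hedged sketch: attention area sized by the determined object type.
ASSUMED_SIZE_M = {            # (width, height) in meters, assumed values
    "large_vehicle": (2.5, 3.5),
    "small_vehicle": (1.8, 1.6),
    "two_wheeled":   (0.8, 1.6),
    "pedestrian":    (0.6, 1.7),
}

def typed_attention_area(cx, cy, dist_m, obj_type, f_px=1000.0):
    w_m, h_m = ASSUMED_SIZE_M[obj_type]
    w, h = f_px * w_m / dist_m, f_px * h_m / dist_m
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)  # l, t, r, b
```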
- the attention area specifying circuit 6 c supplies information D 6 indicating the attention area to the brightness calculation circuit 7.
- the attention object selection circuit 5 selects the object having the highest degree of attention based on distance from among the objects detected by the radar 4, as in the first embodiment. For example, it is assumed that the positions of the objects are as shown in FIG. 5 and that the object-of-interest selection circuit 5 determines that the object D is the object with the highest degree of attention.
- the object-of-interest determination circuit 11 receives the information D5a indicating the position of the object D in the captured image from the object-of-interest selection circuit 5, analyzes the image portion in the video signal D3 corresponding to the position indicated by the information D5a, determines that the object D is a two-wheeled vehicle, and supplies information D11 indicating the determination result to the attention area specifying circuit 6c.
- the attention area specifying circuit 6c recognizes that the object D is a two-wheeled vehicle based on the information D11, and outputs as the attention area a region Rmd (FIG. 18) that includes the object D and corresponds to the portion estimated to be occupied by the object D in the captured image when the object D is a two-wheeled vehicle, that is, the image portion corresponding to the object D. Again, the distance is taken into account.
- FIG. 18 also shows a region of interest Rd according to the first embodiment for comparison.
- the attention area specifying circuit 6 c supplies information indicating the attention area Rmd to the brightness calculation circuit 7.
- based on the information supplied from the attention area specifying circuit 6c, the brightness calculation circuit 7 detects the luminance signal Y3 output from the camera signal processing circuit 3 using the attention area Rmd as a detection frame, calculates the luminance average value Yav, and supplies the calculation result to the exposure control circuit 8.
- the exposure control circuit 8 performs exposure control based on the luminance average value Yav in the attention area Rmd calculated by the brightness calculation circuit 7.
- since the entire attention area Rmd is outside the tunnel TN, the luminance average value Yav of the attention area is high. When the luminance average value Yav is high, adjustment for shortening the exposure time or reducing the signal amplification gain is performed.
- the brightness of the image represented by the video signal D3 output from the camera signal processing circuit 3 changes, and a captured image in which the brightness is optimally controlled is obtained in the attention area Rmd. That is, an image with high visibility of the object D in the attention area Rmd can be obtained without causing the image portion in the attention area Rmd to be overexposed.
- as a result, although the inside of the tunnel TN is crushed to black, the brightness is appropriately controlled outside the tunnel TN, particularly in the attention area Rmd, and an image with high visibility of the object D located inside that area is obtained.
- moreover, since the attention area Rmd is smaller than the attention area Rd of the first embodiment, the portion of the detection frame other than the object D (other than the image portion corresponding to the object D) is reduced, and more appropriate exposure control is performed with respect to the brightness of the object D. As a result, the visibility of the object D is further improved.
- the imaging apparatus of FIG. 17 is a modification of the imaging apparatus of FIG. 1, but the same modification can also be applied to the imaging apparatus of FIG. 12.
- FIG. 20 is a block diagram illustrating a configuration of the imaging apparatus according to the fourth embodiment of the present invention.
- the image pickup apparatus in FIG. 20 is generally the same as the image pickup apparatus in FIG. 12, but an attention object discrimination circuit 11 is added and an attention area specifying circuit 6 c is provided instead of the attention area specifying circuit 6.
- the target object discriminating circuit 11 and the target area specifying circuit 6c are the same as those described in the third embodiment.
- the object-of-interest selection circuit 5b identifies the object having the highest degree of attention based on distance and direction from among the objects detected by the radar 4, as in the second embodiment. For example, it is assumed that the positions of the objects are as shown in FIG. 3 and that the object-of-interest selection circuit 5b determines that the object A is the object with the highest degree of attention.
- the object-of-interest determination circuit 11 receives the information D5a indicating the position of the object A in the captured image from the object-of-interest selection circuit 5b, analyzes the image portion in the video signal D3 corresponding to the position indicated by the information D5a, determines that the object A is a small vehicle, and supplies information D11 indicating the determination result to the attention area specifying circuit 6c.
- the attention area specifying circuit 6c recognizes that the object A is a small vehicle based on the information D11, and outputs as the attention area a region Rma (FIG. 21) that includes the object A and corresponds to the portion estimated to be occupied by the object A in the captured image when the object A is a small vehicle, that is, the image portion corresponding to the object A. Again, the distance is taken into account.
- FIG. 21 also shows a region of interest Ra according to the second embodiment for comparison.
- the attention area specifying circuit 6 c supplies information indicating the attention area Rma to the brightness calculation circuit 7.
- based on the information supplied from the attention area specifying circuit 6c, the brightness calculation circuit 7 detects the luminance signal Y3 output from the camera signal processing circuit 3 using the attention area Rma as a detection frame, calculates the luminance average value Yav, and supplies the calculation result to the exposure control circuit 8.
- the exposure control circuit 8 performs exposure control based on the average brightness value Yav in the attention area Rma calculated by the brightness calculation circuit 7.
- the entire attention area Rma is in the tunnel TN. Therefore, the average luminance value Yav of the attention area is low. Thus, when the luminance average value Yav is low, adjustment for increasing the exposure time or adjustment for increasing the signal amplification gain is performed.
- the brightness of the image represented by the video signal D3 output from the camera signal processing circuit 3 changes, and a captured image whose brightness is optimally controlled is obtained in the attention area Rma. That is, an image with high visibility of the object A in the attention area Rma can be obtained without causing the image portion in the attention area Rma to be blackened.
- as a result, the brightness is appropriately controlled inside the tunnel TN, particularly in the attention area Rma, and an image with high visibility of the object A located inside that area is obtained.
- moreover, since the region Rma in the captured image is smaller than the region Ra of the second embodiment, the portion of the detection frame other than the object A (other than the image portion corresponding to the object A) is reduced, and more appropriate exposure control is performed with respect to the brightness of the object A. As a result, the visibility of the object A is further improved.
- as described above, in the third and fourth embodiments, determining the type of the object allows the size of the detection frame to be made closer to the size of the object, so an image in which the brightness of the object is more appropriately controlled is obtained, and the visibility of the object in the captured image is therefore improved.
- Embodiment 5. FIG. 23 is a block diagram showing the configuration of the imaging apparatus according to the fifth embodiment of the present invention.
- the imaging apparatus in FIG. 23 is generally the same as the imaging apparatus in FIG. 1, but an attention object selection circuit 5 c is provided instead of the attention object selection circuit 5, and a lane detection circuit 12 is newly added.
- the lane detection circuit 12 receives a captured image, or a video signal obtained by processing the captured image, for example the video signal D3 output from the camera signal processing circuit 3, detects the positions of the lane division lines and the median strip in the captured image represented by the video signal D3, and outputs information indicating the detected positions of the lane division lines and the median strip.
- the target object selection circuit 5c is generally the same as the target object selection circuit 5 of FIG. 1, but differs in the following points.
- the target object selection circuit 5c evaluates the degree of attention of each object based on the information D12 indicating the positions of the lane division lines and the median strip supplied from the lane detection circuit 12 and the position information D4 (direction information D4a and distance information D4b) of each object supplied from the radar 4, selects an object having a relatively high degree of attention, for example the highest, based on the evaluation result, and supplies information D5 indicating the position of the selected object to the attention area specifying circuit 6.
- for example, an object located on the opposite side of the median strip is excluded from the evaluation targets.
- among the objects on the same side of the median strip, the lane evaluation value is made larger for an object in the same lane as the lane in which the host vehicle is traveling.
- the distance evaluation value is made larger as the distance to the object is shorter.
- the degree of attention is determined by combining the lane evaluation value and the distance evaluation value. In this case, the higher the lane evaluation value and the higher the distance evaluation value, the higher the degree of attention. The object with the highest degree of attention is then selected.
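A minimal sketch of this lane-based evaluation; the exclusion rule follows the description above, while the score values are assumptions:

```python
# Hedged sketch: degree of attention for Embodiment 5's lane-based selection.
def lane_attention_degree(same_side_of_median: bool, same_lane: bool,
                          dist_m: float) -> float | None:
    if not same_side_of_median:
        return None                      # excluded from evaluation
    lane_score = 1.0 if same_lane else 0.3   # assumed lane evaluation values
    dist_score = 1.0 / (1.0 + dist_m)        # closer object -> higher
    return lane_score + dist_score
```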
- the attention area specifying circuit 6, the brightness calculation circuit 7, and the exposure control circuit 8 operate in the same manner as described in the first embodiment.
- Embodiment 6. In the example described with reference to FIGS. 6 and 7 for the first embodiment, only the object with the highest degree of attention is selected, the region including the selected object is set as the attention area, and exposure control is performed based on the luminance of the attention area.
- the number of objects to be selected is not limited to one. That is, a plurality of objects may be selected, a plurality of areas each including the selected object may be set as the attention area, and exposure control may be performed based on the luminance of the plurality of attention areas.
- FIG. 24 shows a configuration of an imaging apparatus that performs such exposure control.
- the imaging apparatus of FIG. 24 is generally the same as the imaging apparatus of FIG. 1. However, an attention object selection circuit 5d, an attention area identification circuit 6d, and a brightness calculation circuit 7b are provided instead of the attention object selection circuit 5, the attention area identification circuit 6, and the brightness calculation circuit 7.
- the target object selection circuit 5d selects a plurality of objects and outputs information indicating the positions of the selected objects. For example, the object with the highest degree of attention and the object with the second highest degree of attention are selected. When the degree of attention is higher for shorter distances, the object with the shortest distance and the object with the second shortest distance are selected. In the example of FIG. 5, the object D and the object B are selected.
- the attention area identification circuit 6d identifies as an attention area, for each of the plurality of objects selected by the attention object selection circuit 5d, an area centered on the center of that object and having a size corresponding to the distance to it. For example, as shown in FIG. 25, a region including the object D and a region including the object B are specified.
- the size of the region Rb including the object B is likewise set to the size of the image portion that would correspond to a large vehicle at the distance of the object B.
- the region Rb including the object B is smaller than the region Rd including the object D. This is because the distance to the object B is larger than the distance to the object D.
- the brightness calculation circuit 7b detects the luminance signal Y3 output from the camera signal processing circuit 3 using the region obtained by combining the attention areas Rb and Rd as one detection frame, calculates the luminance average value Yav, and supplies the calculation result to the exposure control circuit 8. That is, one luminance average value Yav is obtained as a single brightness index value for the attention areas Rb and Rd together.
- the exposure control circuit 8 performs exposure control based on the average brightness value Yav. As a result, an image whose brightness is optimally controlled in the attention regions Rb and Rd is obtained. That is, an image with high visibility of the objects B and D in the attention areas Rb and Rd can be obtained without whiteout of the image portions in the attention areas Rb and Rd.
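A sketch of computing one brightness index value over the combined detection frame; using a boolean mask so that overlapping attention areas are counted once is an implementation choice, not a requirement stated in the patent:

```python
# Hedged sketch: single luminance average over the union of attention areas.
import numpy as np

def combined_brightness(y: np.ndarray, rois) -> float:
    """y: HxW luminance plane; rois: iterable of (l, t, r, b) rectangles."""
    mask = np.zeros(y.shape, dtype=bool)
    for l, t, r, b in rois:
        mask[t:b, l:r] = True            # overlapping pixels counted once
    return float(y[mask].mean())
```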
- as shown in FIG. 26, although the inside of the tunnel TN is crushed to black, the brightness of the attention areas Rb and Rd, and in particular of the objects B and D located inside them, is appropriately controlled, and an image with high visibility of the objects B and D is obtained.
- the sixth embodiment has been described as a modification of the first embodiment. Similar modifications can be made to the second to fifth embodiments.
- Embodiment 7. In the sixth embodiment, when a plurality of objects are selected and a plurality of regions corresponding to the selected objects are set as attention areas, the plurality of attention areas are combined into one detection frame, the luminance average value is obtained for that detection frame, and exposure control is performed based on the obtained luminance average value.
- alternatively, different attention areas may be designated in different frame periods of imaging by the image sensor 2; a luminance average value is obtained using the designated attention area as a detection frame, exposure conditions are determined based on the obtained luminance average value, and the exposure conditions determined in this way are applied to imaging in a later frame period in which the same attention area is designated.
- the number of objects to be selected is M (M is an integer of 2 or more), and therefore the number of regions of interest is M.
- the imaging apparatus operates with M frame periods as one operation cycle.
- FIG. 27 shows an imaging apparatus used in this case.
- the imaging apparatus of FIG. 27 is generally the same as the imaging apparatus of FIG. 24, but instead of the brightness calculation circuit 7b and the exposure control circuit 8, a brightness calculation circuit 7d and an exposure control circuit 8d are provided, and the counter 14 is It has been added.
- the counter 14 has the number M of frame periods constituting one operation cycle as its maximum value, and its count value m is incremented by 1 every frame period. When the count value m reaches the maximum value M, the counter returns to the initial value 1 and repeats counting. The count value m of the counter 14 is supplied to the brightness calculation circuit 7d and the exposure control circuit 8d.
- the target object selection circuit 5d selects M objects and outputs information indicating their positions. For example, the objects from the highest degree of attention to the M-th highest degree of attention are selected. When M is 2 and the degree of attention is higher for shorter distances, the object with the shortest distance and the object with the second shortest distance are selected. In the example of FIG. 5, the object D and the object B are selected.
- the attention object selection circuit 5d supplies information D5 indicating the position of each of the selected M objects to the attention area specifying circuit 6d.
- the attention area specifying circuit 6d specifies the attention area for each of the M objects selected by the attention object selecting circuit 5d. For example, when M is 2 and the objects D and B shown in FIG. 5 are selected, the area including the object D and the area including the object B are specified as shown in FIG.
- the attention area identification circuit 6d supplies attention area information D6 to the brightness calculation circuit 7d for each of the identified M attention areas.
- The brightness calculation circuit 7d designates one of the M attention areas in each frame period, detects the luminance signal Y3 output from the camera signal processing circuit 3 using the designated attention area as the detection frame, and calculates the luminance average value Yav. Specifically, for the captured image obtained in the m-th frame period of one operation cycle, the luminance average value Yavm is calculated using the m-th attention area as the detection frame.
- That the captured image being processed was obtained in the m-th frame period of one operation cycle can be confirmed from the count value m of the counter 14.
- For example, when M is 2 and the attention areas Rb and Rd have been specified, the luminance average values Yav1 and Yav2 of the attention areas Rb and Rd are calculated alternately, that is, each is calculated every other frame period.
- the brightness calculation circuit 7d supplies the calculation result Yavm to the exposure control circuit 8d.
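Concretely, the per-frame computation amounts to averaging the luminance signal over the pixels of the designated detection frame; a minimal sketch, assuming the luminance plane is available as a 2-D array and the detection frame is an axis-aligned rectangle (neither of which the patent mandates):

```python
import numpy as np

def luminance_average(y_plane, region):
    """y_plane: 2-D array holding the luminance signal Y3 of one frame.
    region: (left, top, right, bottom) detection frame in pixel coordinates.
    Returns the luminance average value Yav over the detection frame."""
    left, top, right, bottom = region
    return float(np.mean(y_plane[top:bottom, left:right]))

# In the m-th frame period of a cycle, only the m-th attention area is used:
# yav_m = luminance_average(current_frame_y, attention_areas[m - 1])
```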
- The memory 81 of the exposure control circuit 8d stores exposure condition parameters corresponding to the M attention areas R1 to RM, that is, exposure time Te values (parameters) Te1 to TeM and signal amplification gain Gs values (parameters) Gs1 to GsM.
- The exposure control circuit 8d performs exposure control based on the luminance average value calculated in each frame period using one of the M attention areas as the detection frame. Specifically, for imaging in the m-th frame period of one operation cycle, exposure control of the image sensor 2 and the camera signal processing circuit 3 is performed using the exposure condition parameters Tem and Gsm stored in the memory 81.
- In each frame period, the brightness calculation circuit 7d calculates the luminance average value Yavm using the m-th attention area of the captured image as the detection frame.
- The exposure control circuit 8d calculates new parameters Tem and Gsm (updates the parameters) based on the calculated luminance average value Yavm and the exposure condition parameters Tem and Gsm stored in the memory 81, and stores the new (updated) parameters Tem and Gsm in the memory 81.
- The updated parameters Tem and Gsm stored in the memory 81 are used for exposure control during imaging in the m-th frame period of the next operation cycle. That the captured image being processed was obtained in the m-th frame period of one operation cycle can be confirmed from the count value m of the counter 14.
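Putting the counter, the per-region parameter memory, and the update rule together, one operation cycle could be sketched as follows; the proportional update rule and its limits are assumptions for illustration, since the patent only states that the new Tem and Gsm are derived from Yavm and the stored values:

```python
# Hypothetical round-robin exposure control over one operation cycle.

Y_TARGET = 110.0  # luminance target value Yrf (assumed 8-bit scale)

def update_parameters(te, gs, yav):
    """Assumed update rule: scale the exposure time toward the target."""
    ratio = Y_TARGET / max(yav, 1e-6)
    ratio = min(max(ratio, 0.5), 2.0)  # limit the correction per cycle
    return te * ratio, gs

def run_cycle(yav_per_area, memory81):
    """yav_per_area: luminance averages Yav1..YavM, one measured per frame
    period of the cycle (the m-th attention area in the m-th frame period).
    memory81: list of [Te, Gs] pairs, one per attention area, playing the
    role of memory 81. A pair updated here is applied again in the
    corresponding frame period of the next operation cycle."""
    for m, yav in enumerate(yav_per_area):   # counter 14: m = 1 .. M
        te, gs = memory81[m]                 # parameters used for this frame
        memory81[m] = list(update_parameters(te, gs, yav))

# Example with M = 2 (attention areas Rb and Rd):
memory81 = [[1 / 60, 1.0], [1 / 120, 1.0]]
run_cycle([90.0, 150.0], memory81)
print(memory81)  # Te of the first area raised, Te of the second lowered
```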
- In this way, the plurality of attention areas are designated one after another, one per frame period; the luminance average value is obtained using the designated attention area as the detection frame; new exposure condition parameters corresponding to that attention area are calculated based on the obtained value; and exposure control using the calculated parameters is performed in the frame period of the next operation cycle in which the same attention area is designated. As a result, in the captured image of each frame, high visibility is obtained for the attention area designated for that frame.
- In the above example, the number of selected objects and the number of frame periods constituting one operation cycle are both M, i.e., equal to each other, and exposure control based on the luminance average value of the attention area corresponding to each object is performed at the same frequency for every object.
- The present invention is not limited to this; exposure control based on the luminance average value of the attention area corresponding to each object may be performed at different frequencies for different objects.
- For example, the number of frame periods (Mf) constituting one operation cycle may be made larger than the number of selected objects (Mo), and for some of the selected objects, exposure control based on the luminance average value of the corresponding attention area may be performed in two or more frame periods within one operation cycle.
- In any case, after the exposure condition parameters are updated based on the luminance average value of a certain attention area, the updated parameters are used for imaging in the next frame period in which the luminance average value of the same attention area is obtained.
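One simple way to realize such unequal frequencies is to let the per-frame schedule repeat some area indices; the schedule below is an invented example, not from the patent:

```python
# Hypothetical schedule for Mo = 3 attention areas over Mf = 4 frame periods:
# area 0 (say, the object in the host vehicle's lane) is refreshed twice per
# operation cycle, areas 1 and 2 once each.
SCHEDULE = [0, 1, 0, 2]

def area_for_frame(frame_index):
    """Maps an absolute frame index to the attention area whose detection
    frame (and exposure parameters) are used in that frame period."""
    return SCHEDULE[frame_index % len(SCHEDULE)]
```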
- the seventh embodiment has been described as a modification to the first embodiment. Similar modifications can be applied to the second to fifth embodiments.
- the operation near the entrance of the tunnel has been described as an example, but the same effect can be obtained near the exit of the tunnel as described in the first and second embodiments.
- In the above description, the detection range of the radar 4 is assumed to be the same as the imaging field-angle range of the image sensor 2, but this is not essential; it suffices that the two ranges overlap at least partially.
- In the above description, the detection by the radar 4 is performed in synchronization with the imaging by the image sensor 2, but this is not essential either.
- When the detection by the radar 4 and the imaging by the image sensor 2 are not synchronized, it suffices to interpolate, in the time direction, either the position information obtained by the detection of the radar 4 or the images captured by the image sensor 2, so as to generate data at matching timings.
- That is, the position information from the radar 4 may be interpolated to generate position information at the same timing as the imaging by the image sensor 2, or the images captured by the image sensor 2 may be interpolated to generate an image at the same timing as the detection by the radar 4.
- The interpolation mentioned here includes using, as the interpolated position information or image, the position information or captured image obtained at the closest timing as it is.
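For the position information, nearest-neighbor interpolation in the time direction can be as short as the following sketch (the timestamped sample format is an assumption for illustration):

```python
def interpolate_positions(radar_samples, frame_time):
    """radar_samples: list of (timestamp, position_info) pairs produced by
    the radar 4, sorted by timestamp. Returns position information re-timed
    to the imaging timestamp frame_time by nearest-neighbor interpolation,
    i.e. the sample obtained at the closest timing is used as it is."""
    return min(radar_samples, key=lambda s: abs(s[0] - frame_time))[1]
```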
- the processing circuit may be dedicated hardware or a CPU that executes a program stored in a memory.
- For example, the functions of the parts other than the lens 1, the image sensor 2, the radar 4, and the traveling direction detection circuit 10 in FIG. 12 or FIG. 20 may each be realized by a separate processing circuit, or the functions of a plurality of parts may be realized together by a single processing circuit.
- When each part of the imaging apparatus is realized by software, firmware, or a combination of software and firmware, the software or firmware is described in the form of a program and stored in a memory.
- The processing circuit reads out and executes the program stored in the memory, thereby realizing the function of each part. That is, the imaging apparatus includes a memory for storing programs that, when executed by the processing circuit, result in the functions of the parts other than the lens 1, the image sensor 2, the radar 4, and the traveling direction detection circuit 10 shown in FIG. 1, 12, 17, 20, 23, 24, or 27 being performed. These programs can also be said to cause a computer to execute the processing methods or procedures of the imaging method implemented by the imaging apparatus.
- Alternatively, some of the parts of the imaging apparatus may be realized by dedicated hardware and others by software or firmware.
- the processing circuit can realize the functions described above by hardware, software, firmware, or a combination thereof.
- FIG. 28 shows an example configuration in which all the functions of the imaging apparatus of FIG. 1, FIG. 17, FIG. 23, FIG. 24, or FIG. 27 other than the lens 1, the image sensor 2, and the radar 4 are realized by a computer, together with the lens 1, the image sensor 2, and the radar 4.
- the computer 50, the lens 1, the imaging device 2, and the radar 4 constitute an imaging device.
- The computer 50 shown in FIG. 28 includes a CPU 51, a memory 52, a first input interface 53A, a second input interface 53B, a first output interface 54A, and a second output interface 54B, which are connected to one another by a bus 55.
- the imaging signal D2 from the imaging device 2 is input to the first input interface 53A.
- The CPU 51 operates according to a program stored in the memory 52. Specifically, the CPU 51 performs the same processing as the camera signal processing circuit 3 on the imaging signal D2 input via the first input interface 53A, and outputs the video signal D3 obtained as a result of the processing from the second output interface 54B. The CPU 51 further performs, on the position information D4 input via the second input interface 53B, the same processing as the attention object selection circuit 5, the attention area specifying circuit 6, the brightness calculation circuit 7 or 7b, and the exposure control circuit 8 or 8b, and supplies a control signal Ct for controlling the exposure time, calculated as a result of the processing, from the first output interface 54A to the image sensor 2.
- FIG. 29 shows an example configuration in which all the functions of the imaging apparatus of FIG. 12 or FIG. 20 other than the lens 1, the image sensor 2, the radar 4, and the traveling direction detection circuit 10 are realized by a computer, together with the lens 1, the image sensor 2, the radar 4, and the traveling direction detection circuit 10.
- the computer 50, the lens 1, the image sensor 2, the radar 4, and the traveling direction detection circuit 10 constitute an imaging device.
- As shown in FIG. 29, the computer 50 includes, in addition to the configuration of FIG. 28, a third input interface 53C, via which the information D10 indicating the traveling direction is input from the traveling direction detection circuit 10.
- In this case, the CPU 51 performs, using not only the imaging signal D2 input via the first input interface 53A and the position information D4 from the radar 4 input via the second input interface 53B but also the information D10 indicating the traveling direction input via the third input interface 53C, the same processing as the attention object selection circuit 5b, the attention area specifying circuit 6, the brightness calculation circuit 7 or 7b, and the exposure control circuit 8 or 8b, and supplies a control signal Ct for controlling the exposure time, calculated as a result of the processing, from the first output interface 54A to the image sensor 2.
- When the traveling direction detection circuit 10 detects the traveling direction of the host vehicle from the steering direction of its steering wheel and outputs the information D10 indicating the traveling direction, a part of the traveling direction detection circuit 10 may be constituted by a processing circuit. Further, when the traveling direction detection circuit 10 detects the traveling direction based on information generated inside the CPU 51, the whole of the traveling direction detection circuit 10 may be constituted by a processing circuit.
- The same effects as described for the imaging apparatus are also obtained by the imaging method implemented by the imaging apparatus, and by a program that causes a computer to execute the processing of each part of the imaging apparatus or each process of the imaging method.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
The imaging device according to the present invention is characterized by comprising:
imaging means for imaging a subject located within an imaging field-angle range to generate a captured image;
detection means for detecting an object present within a detection range at least partially overlapping the imaging field-angle range, and outputting information indicating the direction of the object and information indicating the distance to the object;
attention object selection means for selecting one or more objects based on the information indicating the direction of the object and the information indicating the distance output from the detection means, and outputting information indicating the position of the selected object in the captured image and information indicating the distance to the selected object;
attention area specifying means for specifying, as an attention area, the region of the captured image that contains the selected object, from the information indicating the position of the selected object and the information indicating the distance to the selected object output from the attention object selection means;
brightness calculation means for calculating the brightness of the captured image in the attention area specified by the attention area specifying means; and
exposure control means for performing exposure control of the imaging means based on the brightness calculated by the brightness calculation means.
FIG. 1 is a block diagram showing the configuration of an imaging apparatus according to Embodiment 1 of the present invention.
The illustrated imaging apparatus is mounted on a vehicle, and includes a lens 1, an image sensor 2, a camera signal processing circuit 3, a radar 4, an attention object selection circuit 5, an attention area specifying circuit 6, a brightness calculation circuit 7, and an exposure control circuit 8.
The image sensor 2 photoelectrically converts the subject image formed on its imaging surface to generate an imaging signal representing the captured image. The imaging signal D2 generated by the image sensor 2 is supplied to the camera signal processing circuit 3.
The following description assumes that the image sensor 2 performs moving-image capture and outputs an imaging signal every frame period.
The image sensor 2 is arranged to image the area ahead of the vehicle (host vehicle) on which the imaging apparatus is mounted, and the direction of the optical axis of the lens 1 is assumed to coincide with the forward direction of the host vehicle's body.
The output video signal D3 is used, for example, for object recognition in driving assistance processing.
The camera signal processing circuit 3 also outputs a luminance signal Y3 representing the luminance of each pixel of the captured image of each frame represented by the video signal D3. The luminance signal Y3 is supplied to the brightness calculation circuit 7.
In the gradation correction processing mentioned above, gradation correction is performed with reference to a table so as to obtain, for example, a gamma gradation characteristic.
In the contour correction processing mentioned above, contours blunted by, for example, smoothing processing are corrected by being emphasized with, for example, a high-pass filter.
In the color correction processing mentioned above, the hue and saturation are corrected by performing a matrix operation on the R, G, and B signals.
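As a rough illustration, the three corrections could be sketched as follows; the table values, filter size, and matrix coefficients are illustrative assumptions, not values from the patent:

```python
import numpy as np

def gamma_correct(channel, gamma=2.2):
    # Table-based gradation correction approximating a gamma characteristic.
    table = np.array([int(255 * (i / 255.0) ** (1.0 / gamma))
                      for i in range(256)], dtype=np.uint8)
    return table[channel]

def contour_correct(y_plane, strength=0.5):
    # Emphasize contours blunted by smoothing: add back a high-pass
    # component (the original minus a 3x3 box-filtered copy of it).
    pad = np.pad(y_plane.astype(np.float32), 1, mode="edge")
    h, w = y_plane.shape
    low = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return np.clip(y_plane + strength * (y_plane - low), 0, 255).astype(np.uint8)

def color_correct(rgb, matrix):
    # Hue/saturation correction as a 3x3 matrix operation on R, G, B.
    return np.clip(rgb.astype(np.float32) @ matrix.T, 0, 255).astype(np.uint8)
```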
As shown in FIG. 2, the radar 4 includes a transmitter 41 that generates a transmission signal, a highly directional antenna 42 that emits a transmission radio wave corresponding to the transmission signal in a specific direction and receives the radio waves reflected from each object, and a receiver 43 that extracts the reflected waves from the radio waves received by the antenna 42; from the transmitted and reflected waves, the radar measures the direction of and the distance to each object within the detection range.
The radar 4 outputs information indicating the measured direction (direction information) D4a and information indicating the measured distance (distance information) D4b as the position information D4 of each object.
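Although the embodiments do not fix the ranging principle, a pulse radar, for instance, obtains the distance from the round-trip delay of the reflected wave: with propagation speed c and measured delay Δt, the distance is d = c·Δt/2, the factor of 2 accounting for the out-and-back path.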
The direction indicated by the direction information D4a is associated with a position in the captured image of the image sensor 2. A position in the captured image is expressed, for example, by a horizontal position and a vertical position in the captured image.
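One common way to realize this association (an assumption for illustration; the text only states that the correspondence exists) is a pinhole-camera model whose optical axis coincides with the radar's zero direction:

```python
import math

def direction_to_pixel(azimuth_deg, elevation_deg,
                       focal_px=800.0, cx=640.0, cy=360.0):
    """Maps a radar direction (in degrees, relative to the optical axis of
    the lens 1) to a pixel position, assuming a pinhole camera aligned with
    the radar's zero direction. focal_px, cx and cy are illustrative
    intrinsics, not values from the patent."""
    x = cx + focal_px * math.tan(math.radians(azimuth_deg))
    y = cy - focal_px * math.tan(math.radians(elevation_deg))
    return x, y
```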
In the following, it is assumed that the detection of objects by the radar 4 is performed in synchronization with the imaging by the image sensor 2, that is, at the same period and at the same timing.
The position information D4 is supplied to the attention object selection circuit 5.
The information D5 indicating the position of the selected object includes information D5a indicating the position of the selected object in the captured image and information D5b indicating the distance from the host vehicle to the object.
As described above, the direction indicated by the direction information D4a included in the position information D4 output from the radar 4 is associated with a position in the captured image. Accordingly, the corresponding position in the captured image can be specified from the direction information D4a for each object output from the radar 4.
The attention object selection circuit 5 further extracts, from the distance information D4b for each object output from the radar 4, the information D5b indicating the distance to the selected object.
Exposure control is performed by controlling the parameters of the exposure conditions, that is, the exposure time Te of the image sensor 2 and the signal amplification gain Gs of the camera signal processing circuit 3. The exposure control circuit 8 supplies a control signal Ct for controlling the exposure time Te to the image sensor 2, and a control signal Cg for controlling the signal amplification gain Gs to the camera signal processing circuit 3.
When the luminance average value Yav is input from the brightness calculation circuit 7, the exposure control circuit 8 adjusts the exposure time Te and the signal amplification gain Gs.
Conversely, if the luminance average value Yav is smaller than the luminance target value Yrf, an adjustment is made to lengthen the exposure time or to increase the signal amplification gain.
The exposure control may also be performed by controlling the aperture of the lens 1.
The adjusted values of the exposure time Te and the signal amplification gain Gs are supplied to the image sensor 2 and the camera signal processing circuit 3 for imaging in the next and subsequent frame periods, and are used for exposure control.
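A minimal sketch of this adjustment (the step size, the limits, and the policy of adjusting Te before Gs are assumptions for illustration; the text leaves them open):

```python
def adjust_exposure(te, gs, yav, yrf=110.0, step=1.1,
                    te_min=1e-5, te_max=1 / 30):
    """One adjustment of the exposure time Te and the signal amplification
    gain Gs by the exposure control circuit 8. If Yav exceeds the target
    Yrf the exposure is reduced, otherwise increased; Gs is only touched
    once Te has reached its limit (an assumed policy)."""
    if yav > yrf:
        if te / step >= te_min:
            te /= step   # shorten the exposure time
        else:
            gs /= step   # or reduce the signal amplification gain
    elif yav < yrf:
        if te * step <= te_max:
            te *= step   # lengthen the exposure time
        else:
            gs *= step   # or increase the signal amplification gain
    return te, gs
```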
In the illustrated example, it is assumed that the host vehicle (not shown) is traveling in the passing lane TFL of a road with two lanes in each direction and is approaching a tunnel TN.
The radar 4 outputs, as the information indicating the position of a detected object (position information) D4, for example, information indicating the direction of the object as seen from the host vehicle (direction information) D4a and information indicating the distance from the host vehicle to the object (distance information) D4b.
In that case, the center direction of the range of directions in which reflected waves from the same object are received, or the direction in which the reflected waves from the same object are strongest, is regarded as the direction of the center of the object.
Also, the distance calculated based on the reflected wave from the direction of the center of the object is treated as the distance to the center of the object.
The radar 4 is assumed to be of relatively low performance; the size and shape of an object cannot be known accurately from the output of the radar 4, and therefore the type of the object cannot be identified.
As the information D5b indicating the distance to each object, the information D4b indicating the distance to the same object output from the radar 4 can be used as it is.
In the example of FIG. 5, the object D is at the shortest distance from the host vehicle and is therefore determined to have the highest degree of attention. In this case, the attention object selection circuit 5 selects the object D.
In FIG. 6, in addition to the object D included in the specified attention area Rd, the objects A, B, and C are shown together with the background scene (the tunnel TN, the median strip MD, and the lane markings LM). This is to make the positional relationship of the attention area Rd easy to understand.
In the example shown in FIG. 6, the attention area Rd lies entirely outside the tunnel TN. In the daytime under a clear sky, the outside of the tunnel TN is bright. Accordingly, the luminance average value Yav of the portion of the captured image within the attention area Rd is high. When the luminance average value Yav is high in this way, an adjustment is made to shorten the exposure time or to reduce the signal amplification gain.
As a result, the brightness of the image represented by the video signal D3 output from the camera signal processing circuit 3 changes, and an image whose brightness is optimally controlled in the attention area Rd is obtained. That is, an image with high visibility of the object D in the attention area Rd is obtained, without blown-out highlights in the image portion within the attention area Rd.
As described above, since the relative positions of the objects A to D with respect to the host vehicle are the same as in FIG. 3, the detection result of the radar 4 is the same as in FIG. 5.
As shown in FIG. 10, the attention area Rd is located inside the tunnel TN, and since the luminance average value of the attention area Rd is low, an adjustment to lengthen the exposure time or to increase the signal amplification gain, for example, is performed.
As a result, as shown in FIG. 11, an image with high visibility of the object D is obtained, although the object A located outside the tunnel is blown out.
In a system that detects obstacles only with a radar, without using an image sensor, the type of an obstacle cannot be identified; for example, it may not even be possible to determine whether the obstacle is a vehicle.
In contrast, in the above embodiment, by combining a radar with an image sensor, exposure control is performed so that, among the objects detected by the radar, an object with a high degree of attention is optimally exposed. This yields the effect that an object with a high degree of attention can be visually recognized without blocked-up shadows or blown-out highlights, which is useful for driving assistance and accident prevention.
FIG. 12 is a block diagram showing the configuration of an imaging apparatus according to Embodiment 2 of the present invention.
The imaging apparatus of FIG. 12 is generally the same as the imaging apparatus of FIG. 1, but differs in that an attention object selection circuit 5b is provided instead of the attention object selection circuit 5, and a traveling direction detection circuit 10 is added.
The operation of the other components is the same as described for FIG. 1 in Embodiment 1, and since the same processing is performed, its description is omitted.
The attention object selection circuit 5b of FIG. 12 is generally the same as the attention object selection circuit 5 of FIG. 1, but differs in the following respects.
For example, the closer the direction of an object is to the traveling direction of the host vehicle, the higher its direction evaluation value is made, and the shorter the distance of an object, the higher its distance evaluation value is made; the degree of attention is then determined by combining the direction evaluation value and the distance evaluation value. In this case, the higher the direction evaluation value, the higher the degree of attention, and likewise the higher the distance evaluation value, the higher the degree of attention. The object with the highest degree of attention is then selected.
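A sketch of one way to combine the two evaluation values (the weights and functional forms below are assumptions; the embodiment only requires that a direction closer to the traveling direction and a shorter distance both raise the degree of attention):

```python
def attention_degree(direction_deg, distance_m, travel_dir_deg,
                     w_dir=0.5, w_dist=0.5):
    """Degree of attention combining a direction evaluation value (higher
    the closer the object's direction is to the traveling direction) and a
    distance evaluation value (higher the shorter the distance). The
    weights w_dir and w_dist are illustrative."""
    dir_eval = 1.0 / (1.0 + abs(direction_deg - travel_dir_deg))
    dist_eval = 1.0 / (1.0 + distance_m)
    return w_dir * dir_eval + w_dist * dist_eval

# The object maximizing attention_degree(...) is then selected.
```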
In this case as well, as described in Embodiment 1, a region Ra is specified that is centered on the center of the object A selected by the attention object selection circuit 5b and whose size is determined according to the distance to the object A on the assumption that the object A is a large vehicle. For example, the region Ra shown in FIG. 13 is specified and notified to the brightness calculation circuit 7 as the attention area.
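Because the projected size of an object of fixed physical size falls off roughly as the inverse of its distance under a pinhole model, the attention area can be sized accordingly; a sketch with illustrative large-vehicle dimensions (not values from the patent):

```python
def attention_area(center_xy, distance_m, focal_px=800.0,
                   assumed_w_m=2.5, assumed_h_m=3.5):
    """Axis-aligned attention area centered on the object's center, sized
    as if the object were the largest assumed type (a large vehicle);
    assumed_w_m and assumed_h_m are illustrative dimensions. The projected
    size in pixels scales as 1 / distance."""
    cx, cy = center_xy
    half_w = 0.5 * focal_px * assumed_w_m / distance_m
    half_h = 0.5 * focal_px * assumed_h_m / distance_m
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)
```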
The object A is inside the tunnel TN, and the region Ra is within the portion of the captured image occupied by the tunnel TN.
Since the region Ra inside the tunnel TN has a relatively low luminance average value, an adjustment to lengthen the exposure time or to increase the signal amplification gain, for example, is performed. As a result, a captured image whose brightness is optimally controlled in the attention area Ra is obtained. That is, an image with high visibility of the object A in the attention area Ra is obtained, without blocked-up shadows in the image portion within the attention area Ra.
In this case, the positions of the objects detected by the radar 4 are as shown in FIG. 9. The attention object selection circuit 5b evaluates the degree of attention with emphasis on the traveling direction of the host vehicle, and as a result determines that the object A, located in the same lane as the host vehicle, has the highest degree of attention. The attention area specifying circuit 6 then specifies the attention area Ra corresponding to the object A, as shown in FIG. 15.
FIG. 17 is a block diagram showing the configuration of an imaging apparatus according to Embodiment 3 of the present invention.
The imaging apparatus of FIG. 17 is generally the same as the imaging apparatus of FIG. 1, but an attention object discrimination circuit 11 is added, and an attention area specifying circuit 6c is provided instead of the attention area specifying circuit 6.
The attention object discrimination circuit 11 analyzes, in the video signal D3, the image of a region of a predetermined size (analysis region) at the position indicated by the information D5a, and discriminates the type of the object.
Information D11 indicating the discrimination result is supplied to the attention area specifying circuit 6c.
When the discrimination result indicates that the object is not a large vehicle, the attention area specified by the attention area specifying circuit 6c is smaller than the attention area in Embodiment 1.
The attention area specifying circuit 6c supplies information D6 indicating the attention area to the brightness calculation circuit 7.
The attention area specifying circuit 6c supplies information indicating the attention area Rmd to the brightness calculation circuit 7.
As a result, the brightness of the image represented by the video signal D3 output from the camera signal processing circuit 3 changes, and a captured image whose brightness is optimally controlled in the attention area Rmd is obtained. That is, an image with high visibility of the object D in the attention area Rmd is obtained, without blown-out highlights in the image portion within the attention area Rmd.
As a result, the visibility of the object D is further improved.
The imaging apparatus of FIG. 17 is a modification of the imaging apparatus of FIG. 1, but a similar modification can also be applied to the imaging apparatus of FIG. 12.
FIG. 20 is a block diagram showing the configuration of an imaging apparatus according to Embodiment 4 of the present invention.
The imaging apparatus of FIG. 20 is generally the same as the imaging apparatus of FIG. 12, but an attention object discrimination circuit 11 is added, and an attention area specifying circuit 6c is provided instead of the attention area specifying circuit 6.
The attention object discrimination circuit 11 and the attention area specifying circuit 6c are the same as those described for Embodiment 3.
In this case, the attention object discrimination circuit 11 receives, from the attention object selection circuit 5b, the information D5a indicating the position of the object A in the captured image, analyzes the image portion of the video D3 corresponding to the position indicated by the information D5a, determines that the object A is a small car, and supplies information D11 indicating the determination result to the attention area specifying circuit 6c.
The attention area specifying circuit 6c supplies information indicating the attention area Rma to the brightness calculation circuit 7.
As a result, the visibility of the object A is further improved.
FIG. 23 is a block diagram showing the configuration of an imaging apparatus according to Embodiment 5 of the present invention. The imaging apparatus of FIG. 23 is generally the same as the imaging apparatus of FIG. 1, but an attention object selection circuit 5c is provided instead of the attention object selection circuit 5, and a lane detection circuit 12 is newly added.
In the example described for Embodiment 1 with reference to FIGS. 6 and 7, only the object with the highest degree of attention is selected, the region containing the selected object is set as the attention area, and exposure control is performed based on the luminance of that attention area. However, the number of objects to be selected is not limited to one. That is, a plurality of objects may be selected, a plurality of regions each containing one of the selected objects may be set as attention areas, and exposure control may be performed based on the luminances of the plurality of attention areas.
The imaging apparatus of FIG. 24 is generally the same as the imaging apparatus of FIG. 1, except that an attention object selection circuit 5d, an attention area specifying circuit 6d, and a brightness calculation circuit 7b are provided instead of the attention object selection circuit 5, the attention area specifying circuit 6, and the brightness calculation circuit 7.
As a result, an image whose brightness is optimally controlled in the attention areas Rb and Rd is obtained. That is, an image with high visibility of the objects B and D in the attention areas Rb and Rd is obtained, without blown-out highlights in the image portions within the attention areas Rb and Rd.
Similar modifications can also be applied to Embodiments 2 to 5.
In Embodiment 6, when a plurality of objects are selected and a plurality of regions corresponding to the selected objects are set as attention areas, the plurality of attention areas are combined into one detection frame, a luminance average value is obtained for that detection frame, and exposure control is performed based on the obtained luminance average value. Instead of this, different attention areas may be designated in mutually different frame periods of imaging by the image sensor 2, the luminance average value may be obtained using the designated attention area as the detection frame, and the exposure conditions determined based on the obtained luminance average value may be applied to imaging in a later frame period in which the same attention area is designated. The following description assumes that the number of selected objects is M (M is an integer of 2 or more), and therefore the number of attention areas is M. In this case, the imaging apparatus operates with M frame periods as one operation cycle.
The count value m of the counter 14 is supplied to the brightness calculation circuit 7d and the exposure control circuit 8d.
Specifically, for the captured image obtained in the m-th frame period of one operation cycle, the luminance average value Yavm is calculated using the m-th attention area as the detection frame.
For example, when M is 2 and the attention areas Rb and Rd have been specified, the luminance average values Yav1 and Yav2 of the attention areas Rb and Rd are calculated alternately, that is, every other frame period.
The brightness calculation circuit 7d supplies the calculation result Yavm to the exposure control circuit 8d.
Specifically, for imaging in the m-th frame period of one operation cycle, exposure control of the image sensor 2 and the camera signal processing circuit 3 is performed using the exposure condition parameters Tem and Gsm stored in the memory 81.
That the captured image being processed was obtained in the m-th frame period of one operation cycle can be confirmed from the count value m of the counter 14.
For example, the functions of the parts other than the lens 1, the image sensor 2, and the radar 4 in FIG. 1, FIG. 17, FIG. 23, FIG. 24, or FIG. 27 may each be realized by a separate processing circuit, or the functions of a plurality of parts may be realized together by a single processing circuit. Similarly, the functions of the parts other than the lens 1, the image sensor 2, the radar 4, and the traveling direction detection circuit 10 in FIG. 12 or FIG. 20 may each be realized by a separate processing circuit, or the functions of a plurality of parts may be realized together by a single processing circuit.
In this way, the processing circuit can realize each of the functions described above by hardware, software, firmware, or a combination thereof.
The imaging signal D2 from the image sensor 2 is input to the first input interface 53A.
Claims (15)
- An imaging device comprising: imaging means for imaging a subject located within an imaging field-angle range to generate a captured image; detection means for detecting an object present within a detection range at least partially overlapping the imaging field-angle range, and outputting information indicating the direction of the object and information indicating the distance to the object; attention object selection means for selecting one or more objects based on the information indicating the direction of the object and the information indicating the distance output from the detection means, and outputting information indicating the position of the selected object in the captured image and information indicating the distance to the selected object; attention area specifying means for specifying, as an attention area, the region of the captured image that contains the selected object, from the information indicating the position of the selected object and the information indicating the distance to the selected object output from the attention object selection means; brightness calculation means for calculating the brightness of the captured image in the attention area specified by the attention area specifying means; and exposure control means for performing exposure control of the imaging means based on the brightness calculated by the brightness calculation means.
- The imaging device according to claim 1, wherein the attention object selection means evaluates the degree of attention of each object more highly the shorter the distance to the object represented by the information indicating the distance output from the detection means, and selects an object based on the degree of attention.
- The imaging device according to claim 1, wherein the imaging device is mounted on a vehicle and further comprises traveling direction detection means for detecting the traveling direction of the vehicle from the steering direction of the steering wheel and outputting information indicating the traveling direction, and the attention object selection means selects an object based on the information indicating the traveling direction output from the traveling direction detection means and the information indicating the direction of the object and the information indicating the distance of the object output from the detection means.
- The imaging device according to claim 3, wherein the attention object selection means evaluates the degree of attention of each object more highly the closer the direction of the object represented by the information indicating the direction output from the detection means is to the traveling direction, and selects an object based on the degree of attention.
- The imaging device according to claim 1, wherein the imaging device is mounted on a vehicle and further comprises lane detection means for detecting, from the captured image, the positions of the lane markings and the median strip in the captured image and outputting information indicating the detected positions of the lane markings and the median strip, and the attention object selection means selects an object based on the information indicating the positions of the lane markings and the median strip output from the lane detection means and the information indicating the direction of the object and the information indicating the distance output from the detection means.
- The imaging device according to any one of claims 1 to 5, wherein the attention area specifying means specifies, as the attention area, a region including the portion of the captured image estimated to be occupied by the object selected by the attention object selection means when the object is assumed to be of the largest of a plurality of assumed object types.
- The imaging device according to any one of claims 1 to 6, wherein the attention object selection means selects a plurality of objects based on the information indicating the direction of the object and the information indicating the distance, and the attention area specifying means specifies an attention area for each of the plurality of objects selected by the attention object selection means.
- The imaging device according to claim 7, wherein the brightness calculation means calculates the brightness using the plurality of attention areas specified for the plurality of objects as a single detection frame.
- The imaging device according to claim 7, wherein the imaging means performs moving-image capture of the subject and outputs the captured image every frame period, and the brightness calculation means calculates the brightness in each frame period using one of the plurality of attention areas specified for the plurality of objects as the detection frame.
- The imaging device according to claim 9, wherein the exposure control means adjusts the exposure conditions based on the brightness calculated in a certain frame period using one of the plurality of attention areas as the detection frame, and uses the adjusted exposure conditions for imaging in a later frame period in which the brightness is calculated using the same attention area as the detection frame.
- The imaging device according to any one of claims 1 to 10, further comprising attention object discrimination means for discriminating, from the captured image output from the imaging means, the type of the object selected by the attention object selection means, wherein the attention area specifying means determines the attention area based on the type of the object discriminated by the attention object discrimination means.
- The imaging device according to claim 11, wherein the attention object discrimination means discriminates the type of the object selected by the attention object selection means by analyzing the portion of the captured image estimated to be occupied by the object when the object is assumed to be of the largest of a plurality of assumed object types.
- An imaging method in an imaging device comprising imaging means for imaging a subject located within an imaging field-angle range to generate a captured image, and detection means for detecting an object present within a detection range at least partially overlapping the imaging field-angle range and outputting information indicating the direction of the object and information indicating the distance to the object, the imaging method comprising: an attention object selection step of selecting one or more objects based on the information indicating the direction of the object and the information indicating the distance output from the detection means, and generating information indicating the position of the selected object in the captured image and information indicating the distance to the selected object; an attention area specifying step of specifying, as an attention area, the region of the captured image that contains the selected object, from the information indicating the position of the selected object and the information indicating the distance to the selected object generated in the attention object selection step; a brightness calculation step of calculating the brightness of the captured image in the attention area specified in the attention area specifying step; and an exposure control step of performing exposure control of the imaging means based on the brightness calculated in the brightness calculation step.
- A program for causing a computer to execute the processing in the imaging method according to claim 13.
- A computer-readable record medium on which the program according to claim 14 is recorded.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580073865.3A CN107211096B (zh) | 2015-01-22 | 2015-11-12 | 摄像装置和方法以及记录介质 |
US15/539,233 US10453214B2 (en) | 2015-01-22 | 2015-11-12 | Image capturing device and method, program, and record medium to perform exposure control based on the brightness in an attention area corresponding to a detected object |
DE112015006032.4T DE112015006032B4 (de) | 2015-01-22 | 2015-11-12 | Vorrichtung und Verfahren zur Bilderfassung eines Objekts, das sich in einem Abbildungsfeld-Winkelbereich befindet, Programm und Aufzeichnungsmedium |
JP2016570499A JP6289681B2 (ja) | 2015-01-22 | 2015-11-12 | 撮像装置及び方法、並びにプログラム及び記録媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015010132 | 2015-01-22 | ||
JP2015-010132 | 2015-01-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016117213A1 true WO2016117213A1 (ja) | 2016-07-28 |
Family
ID=56416774
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/081838 WO2016117213A1 (ja) | 2015-01-22 | 2015-11-12 | 撮像装置及び方法、並びにプログラム及び記録媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10453214B2 (ja) |
JP (1) | JP6289681B2 (ja) |
CN (1) | CN107211096B (ja) |
DE (1) | DE112015006032B4 (ja) |
WO (1) | WO2016117213A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021187338A1 (ja) * | 2020-03-18 | 2021-09-23 | ソニーセミコンダクタソリューションズ株式会社 | 撮像システム、撮像制御方法および撮影制御プログラム |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6861375B2 (ja) * | 2017-06-30 | 2021-04-21 | パナソニックIpマネジメント株式会社 | 表示システム、情報提示システム、表示システムの制御方法、プログラム、及び移動体 |
KR20200130810A (ko) * | 2018-03-14 | 2020-11-20 | 소니 주식회사 | 정보 처리 장치, 정보 처리 방법, 및 기록 매체 |
US10346693B1 (en) * | 2019-01-22 | 2019-07-09 | StradVision, Inc. | Method and device for attention-based lane detection without post-processing by using lane mask and testing method and testing device using the same |
US10984534B2 (en) * | 2019-03-28 | 2021-04-20 | GM Global Technology Operations LLC | Identification of attention region for enhancement of sensor-based detection in a vehicle |
CN111654643B (zh) * | 2020-07-22 | 2021-08-31 | 苏州臻迪智能科技有限公司 | 曝光参数确定方法、装置、无人机和计算机可读存储介质 |
RU2746614C1 (ru) * | 2020-09-14 | 2021-04-19 | Общество с ограниченной ответственностью "2И" (ООО "2И") | Способ подавления встречной засветки при формировании изображений дорожного окружения перед транспортным средством и устройство для осуществления способа |
JP2022086311A (ja) * | 2020-11-30 | 2022-06-09 | キヤノン株式会社 | 撮像装置、撮像装置の制御方法、およびプログラム |
US11386650B2 (en) * | 2020-12-08 | 2022-07-12 | Here Global B.V. | Method, apparatus, and system for detecting and map coding a tunnel based on probes and image data |
KR20230123226A (ko) * | 2022-02-16 | 2023-08-23 | 한화비전 주식회사 | Ai 기반 객체인식을 통한 감시 카메라 영상의 노이즈 제거 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014135611A (ja) * | 2013-01-09 | 2014-07-24 | Denso Corp | カメラ露出設定装置及びカメラ露出設定プログラム |
JP2014135612A (ja) * | 2013-01-09 | 2014-07-24 | Denso Corp | カメラ露出設定装置及びカメラ露出設定プログラム |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07118786B2 (ja) | 1985-11-08 | 1995-12-18 | 松下電器産業株式会社 | 撮像装置 |
JP3264060B2 (ja) | 1993-11-04 | 2002-03-11 | 三菱自動車工業株式会社 | 自動車の走行制御装置の先行車検出機構 |
JP2001134769A (ja) | 1999-11-04 | 2001-05-18 | Honda Motor Co Ltd | 対象物認識装置 |
JP3685970B2 (ja) | 1999-12-27 | 2005-08-24 | 本田技研工業株式会社 | 物体検知装置 |
JP2005339176A (ja) | 2004-05-26 | 2005-12-08 | Alpine Electronics Inc | 車両認識装置、ナビゲーション装置および車両認識方法 |
US20070242944A1 (en) * | 2004-09-10 | 2007-10-18 | Kazufumi Mizusawa | Camera and Camera System |
JP4779355B2 (ja) | 2004-12-21 | 2011-09-28 | 日産自動車株式会社 | 車両用運転操作補助装置用の表示装置および表示方法 |
US20090051794A1 (en) * | 2005-03-15 | 2009-02-26 | Omron Corporation | Image processing apparatus, image processing method, image processing system, program and recording medium |
US7624925B2 (en) * | 2005-07-11 | 2009-12-01 | Get Solo, Llc | Membership cards |
JP4218670B2 (ja) * | 2005-09-27 | 2009-02-04 | オムロン株式会社 | 前方撮影装置 |
JP4214160B2 (ja) * | 2006-08-31 | 2009-01-28 | フジノン株式会社 | 監視カメラシステム |
JP2008070999A (ja) | 2006-09-13 | 2008-03-27 | Hitachi Ltd | 車両の障害物検出装置及びそれを搭載した自動車 |
JP4987573B2 (ja) * | 2007-06-01 | 2012-07-25 | 富士重工業株式会社 | 車外監視装置 |
JP4900211B2 (ja) | 2007-11-29 | 2012-03-21 | トヨタ自動車株式会社 | 走行支援装置、車間距離設定方法 |
JP2010127717A (ja) | 2008-11-26 | 2010-06-10 | Sumitomo Electric Ind Ltd | 対象物検出装置及び対象物検出システム |
KR102267575B1 (ko) * | 2009-01-29 | 2021-06-22 | 트랙맨 에이/에스 | 레이더 및 촬상 요소를 포함하는 조립체 |
JP2014006188A (ja) * | 2012-06-26 | 2014-01-16 | Toshiba Denpa Products Kk | レーダ監視システム、映像取得方法及び映像取得プログラム |
JP6471528B2 (ja) * | 2014-02-24 | 2019-02-20 | 株式会社リコー | 物体認識装置、物体認識方法 |
US10721384B2 (en) * | 2014-03-27 | 2020-07-21 | Sony Corporation | Camera with radar system |
JP6648411B2 (ja) * | 2014-05-19 | 2020-02-14 | 株式会社リコー | 処理装置、処理システム、処理プログラム及び処理方法 |
-
2015
- 2015-11-12 WO PCT/JP2015/081838 patent/WO2016117213A1/ja active Application Filing
- 2015-11-12 US US15/539,233 patent/US10453214B2/en active Active
- 2015-11-12 CN CN201580073865.3A patent/CN107211096B/zh active Active
- 2015-11-12 DE DE112015006032.4T patent/DE112015006032B4/de active Active
- 2015-11-12 JP JP2016570499A patent/JP6289681B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JPWO2016117213A1 (ja) | 2017-06-29 |
DE112015006032T5 (de) | 2017-10-05 |
US10453214B2 (en) | 2019-10-22 |
CN107211096A (zh) | 2017-09-26 |
US20180012374A1 (en) | 2018-01-11 |
JP6289681B2 (ja) | 2018-03-07 |
CN107211096B (zh) | 2020-03-03 |
DE112015006032B4 (de) | 2021-04-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6289681B2 (ja) | 撮像装置及び方法、並びにプログラム及び記録媒体 | |
US9424462B2 (en) | Object detection device and object detection method | |
JP4248558B2 (ja) | 道路区画線検出装置 | |
EP2471691B1 (en) | Obstacle detection device, obstacle detection system provided therewith, and obstacle detection method | |
JP4389999B2 (ja) | 露出制御装置及び露出制御プログラム | |
KR101367637B1 (ko) | 감시장치 | |
CN108860045B (zh) | 驾驶辅助方法、驾驶辅助装置及存储介质 | |
US10148938B2 (en) | Vehicle-mounted image recognition device to set a stereoscopic-vision and monocular-vision image areas | |
JP5071198B2 (ja) | 信号機認識装置,信号機認識方法および信号機認識プログラム | |
JP6750519B2 (ja) | 撮像装置、撮像表示方法および撮像表示プログラム | |
JP4191759B2 (ja) | 車両用オートライトシステム | |
WO2017134982A1 (ja) | 撮像装置 | |
JP2009157085A (ja) | 露出制御装置及び露出制御プログラム | |
US11172173B2 (en) | Image processing device, image processing method, program, and imaging device | |
JP2016196233A (ja) | 車両用道路標識認識装置 | |
JP2021114775A (ja) | 移動体用映像表示装置およびその方法 | |
US20220191449A1 (en) | Image processing device, image capturing device, mobile body, and image processing method | |
CN111898532A (zh) | 一种图像处理方法、装置、电子设备及监控系统 | |
US20190124292A1 (en) | Image processing device | |
EP3671629B1 (en) | Semiconductor device, image processing system, image processing method and computer readable storage medium | |
JP2019204988A (ja) | 画像処理装置及び画像処理方法 | |
WO2017203794A1 (ja) | 撮像装置、撮像表示方法および撮像表示プログラム | |
US10614556B2 (en) | Image processor and method for image processing | |
JP5129094B2 (ja) | 車両周辺監視装置 | |
WO2022249562A1 (ja) | 信号処理装置および方法、並びにプログラム |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15878902; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2016570499; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 15539233; Country of ref document: US
| WWE | Wipo information: entry into national phase | Ref document number: 112015006032; Country of ref document: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15878902; Country of ref document: EP; Kind code of ref document: A1