WO2021235001A1 - Image processing device - Google Patents

Image processing device

Info

Publication number
WO2021235001A1
WO2021235001A1 (PCT/JP2021/002142)
Authority
WO
WIPO (PCT)
Prior art keywords
pitch angle
search range
image processing
reliability
unit
Prior art date
Application number
PCT/JP2021/002142
Other languages
French (fr)
Japanese (ja)
Inventor
Keita Nakazawa
Tetsuya Yamada
Koji Doi
Original Assignee
Hitachi Astemo, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo, Ltd.
Priority to JP2022524877A priority Critical patent/JP7350168B2/en
Publication of WO2021235001A1 publication Critical patent/WO2021235001A1/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • The present invention relates to an in-vehicle control device installed in a vehicle such as a motorcycle that acquires the state of the external environment from a camera, recognizes that environment, and generates a control signal, and in particular to an image processing device.
  • ACC Adaptive Cruise Control
  • Compared with a four-wheeled vehicle, a two-wheeled vehicle exhibits a much larger amount of front-rear dive of the vehicle body caused by braking, driving, road surface conditions, and the like.
  • When ACC is realized using a camera for image recognition, this behavior can outpace the camera's tracking process, and tracking is highly likely to fail.
  • Whenever a camera is used for image recognition, not only for ACC, there is the problem of coping with the vehicle behavior that occurs in the vehicle, in particular the inclination of the pitch angle.
  • Patent Document 1 discloses a technique in which, when the image processing unit generates the image used for recognition from the captured image, the cropping range is changed in consideration of the vehicle behavior.
  • Specifically, in a camera developed for other vehicle types such as four-wheeled vehicles, when the image processing unit crops the image to be used by the recognition unit from the captured image, the pitch angle occurring in the vehicle is obtained from sensor information and the cropping range is set according to that pitch angle, so that the influence of the pitch angle is removed at the stage of generating the image passed to the recognition unit.
  • If the technique of Patent Document 1 is applied to tracking processing as it is, then when the cropping-position correction direction based on the pitch angle differs from the moving direction of the detection target predicted by the tracking process, the image in the direction in which the detection target moves may be cut off at the image processing stage, and the detection target may not be captured within the search range. Further, since the prior art estimates the pitch angle of the own vehicle from sensor information and changes the cropping range accordingly, the cropping range may be wrong when the sensor outputs an invalid value. The conventional cropping method therefore cannot solve the problem of tracking accuracy when a pitch angle occurs.
  • An object of the present invention is to provide an image processing device capable of improving the accuracy of tracking processing.
  • The image processing device of the present invention is characterized by having a tracking processing unit that sets a search range for a target object in each of a plurality of frame images captured by an in-vehicle camera and tracks the target object in chronological order, and a search range calculation unit that changes the search range using pitch angle information of the vehicle.
  • According to the present invention, an image processing device capable of improving the accuracy of tracking processing can be provided. Further features of the present invention will be apparent from the description herein and the accompanying drawings; problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
  • Brief description of the drawings: the schematic block diagram of the image processing device in Example 1 of the present invention; an example of the tracking process to which the present invention applies; an example showing the difference from the prior art in Example 1; an example in which the search range is set according to the reliability in Example 1.
  • The flowchart of image processing in Example 1; an example of application to the roll angle in Example 1.
  • The schematic block diagram of the image processing device in Example 2; the flowchart of image processing in Example 2.
  • The schematic block diagram of the image processing device in Example 3; the flowchart of image processing in Example 3.
  • The case where the image processing device of the present invention is applied to a motorcycle will be described as an example, but the present technology is not limited to motorcycles.
  • It can be applied to any vehicle in which pitching can occur, such as a three-wheeled vehicle like a motor tricycle or a four-wheeled vehicle like a truck.
  • The detection targets are detected in the past frame (frame n-1), and an identifier is added to each of them.
  • Here, the identifier A is added to a motorcycle and the identifier B to the preceding vehicle.
  • The positions and moving directions of the detection targets A and B are predicted with a Kalman filter from the tracking results in the past frames.
  • The search ranges of the detection targets A and B in the current frame (frame n) are set from the prediction results.
  • The search ranges are set by predicting that motorcycle A moves in a direction approaching the own vehicle, i.e., toward the lower left in the image, and that preceding vehicle B moves in a direction away from the own vehicle, i.e., upward in the image.
  • The search ranges set in the current frame are then searched. After associating the detection results in the current frame with the prediction results, it is determined whether each target detected in the current frame is the same object as detection target A or B in the past frame. If it is determined to be the same object, the same identifier is added; if it is determined to be a different object, a new identifier is added.
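  • The association procedure above can be sketched roughly as follows. This is an illustrative reconstruction, not the patent's implementation: the names (`Track`, `SEARCH_MARGIN`, `associate`) are hypothetical, and a constant-velocity predictor stands in for the Kalman filter.

```python
# Illustrative sketch of the per-frame tracking association described
# above.  All names are hypothetical; a constant-velocity predictor
# stands in for the Kalman filter of the actual tracking process.

SEARCH_MARGIN = 20  # pixels added around the predicted box (assumed)

class Track:
    def __init__(self, ident, box):
        self.ident = ident          # identifier such as "A" or "B"
        self.box = box              # (x, y, w, h) in the past frame
        self.velocity = (0.0, 0.0)  # pixels/frame, updated on each match

    def predict(self):
        """Predicted box in the current frame (constant velocity)."""
        x, y, w, h = self.box
        vx, vy = self.velocity
        return (x + vx, y + vy, w, h)

    def search_range(self):
        """Search range: the predicted box grown by a fixed margin."""
        x, y, w, h = self.predict()
        m = SEARCH_MARGIN
        return (x - m, y - m, w + 2 * m, h + 2 * m)

def inside(box, rng):
    """True if the box centre falls inside the search range."""
    x, y, w, h = box
    rx, ry, rw, rh = rng
    cx, cy = x + w / 2, y + h / 2
    return rx <= cx <= rx + rw and ry <= cy <= ry + rh

def associate(tracks, detections):
    """Give each detection the identifier of the track whose search
    range contains it; unmatched detections get a new identifier."""
    result = {}
    used = set()
    for det in detections:
        match = next((t for t in tracks
                      if t.ident not in used and inside(det, t.search_range())),
                     None)
        if match:
            result[match.ident] = det
            used.add(match.ident)
        else:
            result["new%d" % len(result)] = det
    return result
```

A detection that falls inside track A's search range keeps identifier A; a detection outside every range is treated as a newly discovered target.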
  • The tracking process described above recognizes the past frame and the current frame in association with each other, on the premise that the camera position is constant.
  • When a pitch angle occurs, however, the camera itself tilts and the position of the detection target in the frame moves. If the position of the detection target changes with the pitch angle, the detection target cannot be associated between the past frame and the current frame, and tracking fails. The tracking process therefore needs to cope with the vehicle behavior occurring in the vehicle, in particular the inclination of the pitch angle.
  • Motorcycles and other two-wheeled vehicles are characterized by a large amount of front-rear dive of the vehicle body caused by braking, driving, road surface conditions, and the like. If, for example, an image processing device designed for a passenger car is mounted on a motorcycle as it is and recognition is performed on the captured images, the detection target may no longer be captured within the search range, and tracking may fail under the influence of the pitch angle generated in the motorcycle body.
  • In the present invention, the image processing unit does not crop the image in consideration of the vehicle behavior as in the prior art; instead, the recognition unit of the recognition camera executes processing that uses information on the pitch-angle inclination of the vehicle.
  • Two problems are addressed: (1) how to set the search range of the tracking process when the pitch angle changes, and (2) how to set the search range of the tracking process when the pitch angle information of the vehicle cannot be acquired accurately.
  • For problem (1), the search range set from the moving direction of the detection target predicted from the previous frame is updated using the pitch angle information generated in the vehicle. For problem (2), a reliability for determining whether the acquired pitch angle information is accurate is set, and the search range of the tracking process can be changed according to the pitch angle information and the reliability.
  • The means using a pitch angle sensor is described in Example 1, the means using the output of a vehicle control unit instead of the pitch angle sensor in Example 2, and the means using the control driving force calculated inside the image processing device instead of the pitch angle sensor in Example 3.
  • For the pitch angle of the vehicle, which changes from moment to moment, the pitch angle information is acquired from the pitch angle sensor 2, the change in the position of the detection target on the recognition image when the pitch angle occurs is output as the deviation information 51, and whether the deviation information 51 is accurate is output as the reliability 52. A method is then provided for setting the search range of the detection target on the recognition image from the deviation information 51, the reliability 52, and the prediction result of the detection target calculated by the tracking processing unit 10.
  • The conventional technology estimates the pitch angle generated in the own vehicle from sensor information and copes with the resulting change in the position of the detection target in the captured image by changing the cropping position in the image processing unit (see Fig. 3(a)).
  • Its focus is on improving recognition accuracy mainly for the matching process that detects an object; the tracking process is not specified.
  • The tracking process tracks a detected target by predicting its position and moving direction, setting a search range, and associating the detected target across a plurality of frames. The search range therefore has to be set in consideration of the moving direction of the detection target in addition to the pitch angle information generated in the own vehicle.
  • In the prior art, the cropping position in the image is moved upward to cope with the pitch-angle change, whereas the normal tracking process sets its search ranges in consideration of the moving directions of the detection targets A and B. If the moving direction of detection target A is downward, the image processing unit cuts off the image in that direction, a sufficient search range for identifying A cannot be set, the association of identifier A may fail, and detection target A may not be identified.
  • In the prior art, the problem of pitch-angle change is addressed by correcting the cropping position when the image processing unit generates the recognition image.
  • In the present invention, by contrast, the image processing unit does not perform cropping in consideration of the pitch angle information when generating the recognition image.
  • Instead, the recognition unit 6 uses the pitch angle information to variably set the search range of the detection target in the tracking process according to that information (see Fig. 3(c)).
  • Since the image processing unit does not crop in consideration of the pitch angle information, the prior-art problem of the detection target's moving direction being cut off in the tracking process does not occur. Because the search range is set based on the pitch angle information, the detection target can be captured within the search range even when the vehicle pitch angle changes, as long as the target exists within the captured image. The detection target can therefore be associated across a plurality of frames even if the pitch angle changes.
  • the configuration diagram of this embodiment is shown in FIG.
  • the vehicle is provided with a camera 1 and a pitch angle sensor 2, which are connected using a CAN bus 3.
  • The pitch angle sensor 2 is not particularly limited; for example, an inertial measurement unit (IMU) can be used.
  • From the IMU, a three-dimensional attitude including the pitch angle, angular velocities, and accelerations are obtained, and these data are transmitted over CAN, Ethernet, or the like.
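  • As an illustration of receiving such sensor data, the snippet below decodes a pitch angle from a hypothetical 8-byte CAN payload. Real IMU message layouts are vendor-specific (typically defined in a DBC file), so the byte layout and scaling here are pure assumptions.

```python
import struct

def decode_pitch_frame(data: bytes) -> float:
    """Decode a hypothetical 8-byte CAN payload in which bytes 0-1 hold
    the pitch angle as a little-endian signed 16-bit value in units of
    0.01 degree.  The layout is an assumption for illustration only;
    an actual sensor's DBC file defines the real encoding."""
    raw, = struct.unpack_from("<h", data, 0)
    return raw * 0.01
```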
  • FIG. 1 shows the internal configuration of the camera 1 which is an image processing device.
  • the camera 1 includes an image pickup unit 4, an image processing unit 5, a recognition unit 6, a control generation unit 7, and a communication interface unit 8.
  • the image pickup unit 4 captures an image with an image pickup element such as a CMOS sensor.
  • the image processing unit 5 uses the image captured by the image pickup unit 4 to generate an image to be used in the recognition process.
  • the recognition unit 6 detects vehicles and lanes, and tracks the time series of detection targets.
  • the control generation unit 7 generates a control signal to an actuator such as a vehicle brake or an accelerator based on the recognition result of the recognition unit 6.
  • The communication interface unit 8 connects the camera to an external ECU over a bus such as CAN.
  • the recognition unit 6 includes a storage unit 9, a tracking processing unit 10, and a change amount calculation unit 20.
  • The storage unit 9 stores past recognition results such as detections of other vehicles including the preceding vehicle, lane markings, road surface recognition, and tracking processing results.
  • The tracking processing unit 10 sets a search range from prediction information obtained by predicting, with a Kalman filter, the movement of each detection target to which an identifier has been added, and performs time-series tracking of the detection target.
  • the change amount calculation unit 20 outputs the vertical change amount of the target object in the frame image as the deviation information 51 from the pitch angle information.
  • The change amount calculation unit 20 outputs, as the deviation information 51, the change in the position of the detection target on the recognition image when a pitch angle occurs, calculated from the pitch angle information 50 acquired from the pitch angle sensor 2, and also outputs, as the reliability 52, whether that deviation information is accurate.
  • the pitch angle information 50 includes a pitch angle generated in the own vehicle, a pitch angle change amount per unit time, and a pitch angle change direction.
  • The tracking method using the Kalman filter is described here as an example, but the tracking processing method is not limited to this example.
  • The tracking processing unit 10 has a search range calculation unit 11 that changes the search range, initially set from the prediction information of the detection target, based on the output of the change amount calculation unit 20 and that prediction information.
  • the tracking processing unit 10 updates the search range to the search range changed by the search range calculation unit 11.
  • The tracking processing unit 10 determines whether the detection target exists in the updated search range by performing a full search within it. If a target that has been tracked and given an identifier in the past is found, the same identifier is added and tracking continues; when a new detection target is discovered, a new identifier is added.
  • the deviation information 51 may include a change amount in the vertical direction and a change amount in the left-right direction, and may output the change amount of the detection target on the recognition image as a pixel amount.
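  • One plausible way to express the deviation as a pixel amount is a pinhole-camera approximation, in which a camera pitch of θ shifts the image vertically by roughly f·tan θ pixels for a focal length of f pixels. The formula and function name below are illustrative assumptions, not taken from the patent.

```python
import math

def pitch_to_pixel_shift(pitch_deg, focal_px):
    """Vertical image shift (in pixels) produced by a camera pitch of
    pitch_deg, under a pinhole-camera model with focal length focal_px
    in pixels.  The sign convention (positive pitch moves the scene
    up or down) depends on the camera mounting and is assumed here."""
    return focal_px * math.tan(math.radians(pitch_deg))
```

For small angles this is close to the linear approximation f·θ (θ in radians), so a 1-degree pitch with a 1000-pixel focal length moves the scene by roughly 17 pixels.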
  • The reliability 52 is calculated from the amount of change in the pitch angle per unit time: when that amount is larger than a threshold value, low reliability is output, and when it is smaller than the threshold value, high reliability is output.
  • the reliability 52 is treated as a numerical value, but a level, an electric signal, or the like may be used as the reliability.
  • The reliability 52 need not be calculated only from the amount of change in the pitch angle per unit time; it can also be set according to the road surface condition during traveling, based on the road surface recognition information stored in the storage unit 9. For example, when the road surface is recognized as uneven or steeply sloped, low reliability is output; conversely, when the road surface is determined to be flat with few irregularities, high reliability is output.
  • the reliability 52 is not limited to high reliability and low reliability, and may be used in multiple stages.
  • The search range calculation unit 11 changes the search range according to the reliability, as shown in Fig. 4.
  • When the reliability 52 is higher than the threshold value, the search range in the tracking process is reduced, which makes it possible to cut unnecessary CPU load.
  • When the reliability 52 is lower than the threshold value, the search range in the tracking process is expanded, which provides a large margin and copes with momentary changes in the pitch angle.
  • FIG. 5 shows a flowchart for setting the search range of the recognition image in the tracking process using the pitch angle information acquired from the pitch angle sensor 2.
  • First, the detection target is detected from the recognition image and an identifier is added to it (S1). The past tracking processing result is then acquired from the storage unit 9 (S2), the moving direction of the detection target is predicted from the past tracking information, and the search range within the frame is set (S3).
  • the pitch angle information 50 of the pitch angle sensor 2 is acquired through the communication interface 8 (S4).
  • The change amount calculation unit 20 calculates the deviation information 51 of the detection target in the frame and its reliability 52 from the pitch angle information 50 (S5). It is then determined whether the reliability 52 is high or low (S6): larger than the threshold value is judged high, smaller is judged low.
  • If the reliability is low, the search range set in the frame is expanded vertically beyond the preset reference search range (S7).
  • If the reliability is high, the search range is reduced vertically from the preset reference search range (S8).
  • The position of the search range is then updated according to the deviation information 51 (S9).
  • The tracking processing unit 10 searches within the updated search range, more specifically the entire range, for the detection target carrying the corresponding identifier. When the detection target exists within the updated search range, the same identifier is added and confirmed (S10). The result of the tracking process performed by the above flow is then stored in the storage unit 9 (S11).
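  • Steps S5 through S9 of the flow above might be sketched as follows. The threshold, scale factor, and function names are hypothetical, and only the vertical resizing described in S7/S8 is modelled.

```python
# Illustrative sketch of flow steps S6-S9; all constants are assumed.

PITCH_RATE_THRESHOLD = 5.0  # deg/s; hypothetical reliability threshold

def reliability(pitch_rate_deg_s):
    """S6: judge reliability low when the pitch angle changes quickly."""
    return "high" if abs(pitch_rate_deg_s) < PITCH_RATE_THRESHOLD else "low"

def resize_search_range(rng, rel, scale=1.5):
    """S7/S8: grow the range vertically when reliability is low,
    shrink it when reliability is high, keeping the centre fixed."""
    x, y, w, h = rng
    f = scale if rel == "low" else 1 / scale
    nh = h * f
    return (x, y + (h - nh) / 2, w, nh)

def shift_search_range(rng, deviation_px):
    """S9: move the range vertically by the deviation information 51."""
    x, y, w, h = rng
    return (x, y + deviation_px, w, h)
```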
  • As described above, the camera 1 of the present embodiment has the tracking processing unit 10, which sets a search range for a target object in each of a plurality of frame images captured by the camera 1 and tracks the target object in chronological order, and the search range calculation unit 11, which changes the search range of the tracking processing unit 10 using the pitch angle information of the vehicle.
  • The camera 1 includes the image pickup unit 4 that captures images, the image processing unit 5 that generates images, and the recognition unit 6 that performs image recognition from the pitch angles output by the pitch angle sensor 2 and the images generated by the image processing unit 5. The recognition unit 6 has the tracking processing unit 10 that tracks an object in the frame in time series, the storage unit 9 that stores past recognition results, and the change amount calculation unit 20 that calculates the deviation information 51 of the object in the frame from the pitch angle.
  • The tracking processing unit 10 includes the search range calculation unit 11, which sets the search range in the tracking process from the input from the change amount calculation unit 20 and the past recognition results. In addition to the deviation information 51, the change amount calculation unit 20 determines whether that deviation information is accurate and outputs this as the reliability 52.
  • the search range calculation unit 11 sets the search range in the recognized image by using the deviation information 51, the reliability 52, and the past tracking processing result.
  • The search range is set from the moving direction of the detection target predicted from the previous frame by the tracking process, and the search range is then changed using the pitch angle generated in the motorcycle. A reliability 52 for determining whether the acquired pitch angle information is accurate is set, and the search range of the tracking process is made variable according to the reliability 52.
  • the search range calculation unit 11 makes the search range variable according to the reliability 52. For example, if it is determined that the reliability 52 is low, the search range is expanded, and if it is determined that the reliability 52 is high, the search range is reduced.
  • The change amount calculation unit 20 calculates the deviation information (vertical change amount) 51 of the object in the recognition image from the pitch angle information 50 acquired from the pitch angle sensor 2 as a pixel amount, which improves the calculation accuracy of the search range calculation unit 11.
  • The description so far has focused on the pitch angle, the vehicle behavior occurring in the front-rear direction of the vehicle, but even when a roll angle occurs in the left-right direction as shown in Fig. 6, the search range can be expanded or contracted as shown in Fig. 6(b) by the same method based on the deviation information 51 and the reliability 52, and the present invention is applicable even when a pitch angle and a roll angle occur at the same time.
  • Example 2: In the second embodiment, instead of acquiring the pitch angle information 50 from the pitch angle sensor 2, a method of acquiring it from the output of the control devices that control the braking force and driving force of the vehicle is described. The pitch angle generated in the vehicle can be calculated in real time from the output of these control devices, so the present proposal can be applied to vehicles that do not have a pitch angle sensor. The method is added to the first embodiment.
  • The configuration diagram of this embodiment is shown in FIG. The differences from Example 1 are as follows: instead of the pitch angle sensor 2, a driving force control device 40 and a braking force control device 41 are used, and the recognition unit 6 acquires their control driving force control amount 53 via the communication interface 8. In addition, a pitch angle information calculation unit 30 that calculates the pitch angle information 50 generated in the vehicle is added to the configuration of the first embodiment.
  • the detection target is detected from the recognition image, and an identifier is added to the detection target (S1).
  • the past tracking processing result is acquired from the storage unit 9 (S2). Then, the moving direction of the detection target is predicted from the acquired past tracking information, and the search range within the frame is set (S3).
  • the control driving force control amount 53 is acquired from the driving force control device 40 and the braking force control device 41, and the pitch angle information calculation unit 30 calculates the pitch angle information 50 (S20).
  • The pitch angle information 50 is calculated by a known method.
  • the change amount calculation unit 20 calculates the deviation information 51 of the detection target in the frame and its reliability 52 from the pitch angle information 50 (S5). Subsequently, it is determined whether the reliability 52 is high or low (S6), and if it is determined that the reliability 52 is low, the search range is expanded in the vertical direction (S7). If it is determined that the reliability 52 is high, the vertical direction of the search range is reduced (S8). After that, the position of the search range is updated according to the deviation information 51 (S9).
  • the tracking processing unit 10 searches within the updated search range, and searches the entire search range to see if the detection target to which the identifier is added exists within the search range. Then, when the detection target exists, the same identifier is added and confirmed (S10). Then, the result of the tracking process performed by the above flow is stored in the storage unit 9 (S11). As described above, the process of S20 is added instead of the process of S4 of the first embodiment.
  • The image processing device of the second embodiment has the communication interface 8 for communicating with the control devices of the vehicle, and the recognition unit 6 acquires the control driving force control amount 53 from the driving force control device 40 and the braking force control device 41.
  • Since the pitch angle information calculation unit 30, which calculates the pitch angle information 50 from the control driving force control amount 53, is added to the configuration of the first embodiment, the search range can be set in consideration of the vehicle behavior even for a vehicle without the pitch angle sensor 2, and the tracking accuracy can be improved.
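  • As a hedged illustration of how a pitch angle might be derived from the control driving force (the patent only says a known method is used), the sketch below balances the moment of the longitudinal inertial force about the centre of gravity against an effective pitch stiffness. Every parameter value and the function name are made-up placeholders.

```python
import math

def pitch_from_forces(drive_force_n, brake_force_n,
                      mass_kg=200.0, cg_height_m=0.6,
                      pitch_stiffness_nm_per_rad=4000.0):
    """Rough static estimate of body pitch (degrees) from the net
    longitudinal force: the moment of the inertial force about the
    centre of gravity is balanced by an effective pitch stiffness.
    All parameter values are illustrative, not from the patent."""
    net = drive_force_n - brake_force_n        # + accelerating, - braking
    accel = net / mass_kg                      # m/s^2
    moment = mass_kg * accel * cg_height_m     # N*m about the pitch axis
    return math.degrees(moment / pitch_stiffness_nm_per_rad)
```

Under these placeholder parameters, a 1000 N braking force yields a nose-down pitch of roughly 8.6 degrees, illustrating why two-wheeled vehicles need the search-range compensation described above.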
  • Example 3: In the third embodiment, instead of acquiring the pitch angle information 50 from the pitch angle sensor 2 of the first embodiment, the pitch angle information 50 is acquired from the target control driving force control amount 54 output by the control generation unit 7, which generates control signals to actuators such as the brake and accelerator based on the recognition result.
  • The third embodiment is completed within the camera 1 and requires neither the pitch angle sensor of the first embodiment nor the external control driving force control amount 53 of the second embodiment, so the work of adapting to each vehicle type can be expected to be minimized. The method is added to the first embodiment.
  • The block diagram of Example 3 is shown in FIG. The difference from Example 1 is that, instead of the pitch angle sensor 2, a pitch angle information calculation unit 60 that calculates the pitch angle information 50 from the target control driving force control amount 54 output by the control generation unit 7 is added.
  • Since the target driving force control amount 54 is a control amount calculated inside the camera, it does not include changes in the control amount caused by an override, that is, the driver operating the brake or accelerator.
  • In that case, a difference occurs between the pitch angle information 50 calculated from the target driving force control amount 54 and the pitch angle actually generated in the vehicle.
  • Therefore, a driver operation detection unit 70 is provided that acquires, via the communication interface 8, driver operation information 55 indicating whether the driver has operated the brake or accelerator, and outputs override information 56.
  • the detection target is detected from the recognition image, and an identifier is added to the detection target (S1).
  • the past tracking processing result is acquired from the storage unit 9 (S2).
  • the movement direction of the detection target is predicted from the past tracking information, and the search range within the frame is set (S3).
  • the pitch angle information calculation unit 60 calculates the pitch angle information 50 from the target driving force control amount 54 calculated by the control generation unit 7 in the camera from the recognition result of the camera (S30). Next, the change amount calculation unit 20 calculates the deviation information 51 of the detection target in the frame and its reliability 52 from the pitch angle information 50 (S5).
  • The driver operation information 55 is acquired via the communication interface unit 8 (S31). With the driver operation information 55 as input, the driver operation detection unit 70 determines whether an override is detected (S32). When the driver operation detection unit 70 detects a driver override, the search range in the tracking process is expanded to the maximum (S33).
  • the search range is expanded in the vertical direction (S7). If it is determined that the reliability is high, the vertical direction of the search range is reduced (S8). After that, the position of the search range is updated according to the deviation information 51 (S9).
  • the tracking processing unit 10 searches within the updated search range, and searches the entire search range to see if the detection target to which the identifier is added exists within the search range. Then, when the detection target exists, the same identifier is added and confirmed (S10). Then, the result of the tracking process performed by the above flow is stored in the storage unit 9 (S11).
  • Compared with Example 1, the processing of S30 replaces S4, and the processing of S31, S32, and S33 is added.
  • Although the method of calculating the pitch angle information 50 from the target driving force control amount 54 differs, the method of setting the search range of the tracking process using the deviation information 51 of the detection target and its reliability 52 is the same as in the first embodiment.
  • During an override, the pitch angle information 50 calculated from the target driving force control amount 54 deviates from the pitch angle actually generated in the vehicle, so the search range is no longer accurate. Therefore, when an override is detected, the reliability is judged to be the lowest and the search range is expanded to its maximum.
  • The camera 1 has a control generation unit 7 that, from the recognition result of the recognition unit 6, internally calculates a control amount for controlling the vehicle. The pitch angle information calculation unit 60 calculates the pitch angle information 50 from the output of the control generation unit 7, and the change amount calculation unit 20 calculates the deviation information 51 of the object within the frame from the pitch angle information 50 output by the pitch angle information calculation unit 60.
  • The recognition unit 6 has a driver operation detection unit 70 that detects the driver's operation; when an override is detected, the change amount calculation unit 20 outputs a low reliability.
  • The technology introduced in all the examples can be applied to both stereo cameras and monocular cameras. This proposal is also applicable to LiDAR, which outputs point cloud information. Furthermore, although the embodiments obtain the pitch angle information by three different methods, any means of calculating the pitch angle information in real time may be used.
  • By adding a unit that handles the pitch angle of a two-wheeled vehicle to a recognition camera developed for four-wheeled vehicles, even when a pitch angle occurs in the vehicle equipped with the recognition camera, setting the search range using the pitch angle information during the tracking process makes it possible to track the detection target in time series as long as the detection target exists in the captured image.
  • The search range is expanded when sensor information such as the pitch angle sensor outputs an invalid value; by increasing the margin, the problem of tracking failing because the detection target falls outside the search range can be solved. When the reliability is high, the search range can be reduced, eliminating unnecessary computational load. Conventionally, when the computational load was high, processing such as thinning out the search range was performed; this invention makes such processing unnecessary, which contributes to improved recognition accuracy. In addition, effects such as suppressing the temperature rise of the hardware can be expected.
  • The search range can be set in consideration of vehicle behavior even for a vehicle not equipped with a pitch angle sensor, and according to the third embodiment, the present invention can be realized inside the camera, so the work of adapting it to each vehicle type can be expected to be minimized. Although the first to third embodiments describe the technique focusing on the pitch angle generated in the vehicle, the technique is applicable not only to the pitch angle alone but also to cases where a pitch angle and a roll angle occur simultaneously.
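The override handling of Example 3 (S32/S33) combined with the reliability branch (S7/S8) can be sketched as follows. This is an illustrative sketch in Python, not the patent's implementation; the function name, the 0.5 reliability threshold, and the 1.5×/0.5× scale factors are assumptions for illustration.

```python
def select_search_range_height(base_height, max_height,
                               reliability, override_detected):
    """Choose the vertical extent of the tracking search range.

    When a driver override is detected, the pitch angle derived from the
    camera-internal target driving force amount no longer matches the real
    vehicle pitch, so reliability is treated as lowest and the search range
    is expanded to its maximum (S33).  Otherwise the range is expanded when
    reliability is low (S7) and reduced when it is high (S8).
    """
    if override_detected:
        return max_height                          # S33: reliability lowest
    if reliability < 0.5:
        return min(base_height * 1.5, max_height)  # S7: expand vertically
    return base_height * 0.5                       # S8: shrink vertically
```

Keeping the override check first mirrors the flowchart: S32 gates S33 before the normal reliability comparison is ever reached.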


Abstract

Provided is an image processing device capable of improving the accuracy of tracking processing. The image processing device (1) according to the present invention is characterized by having: a tracking processing unit (10) that sets a search range for a target object in each of a plurality of frame images captured by a vehicle-mounted camera and tracks the target object in time series; and a search range calculation unit (11) that changes the search range using information on the pitch angle of the vehicle.

Description

Image processing device
The present invention relates to an in-vehicle control device that is installed in a vehicle such as a motorcycle, acquires the outside-world situation from a camera, recognizes that situation, and generates control signals; it particularly relates to an image processing device.
To realize a safe and comfortable car society, the introduction of driving support systems is progressing. For example, ACC (Adaptive Cruise Control) follows the preceding vehicle while maintaining the inter-vehicle distance, within a set vehicle speed.
A distinguishing characteristic of two-wheeled vehicles compared with four-wheeled vehicles (so-called automobiles) is the large amount of front-rear body dive caused by braking, driving, road surface conditions, and the like. In particular, when ACC is realized using an image recognition camera on a motorcycle, the camera's tracking process may fail to keep up when the body dives forward or backward, making tracking failure more likely. Not only for ACC but whenever an image recognition camera is used, there is the problem of coping with the vehicle behavior, particularly the pitch-angle tilt, that occurs in the vehicle.
Patent Document 1 discloses a technique in which, when the image processing unit generates the image used for recognition from the captured image, the cropping range is changed in consideration of the vehicle behavior. In a camera developed for other vehicles such as four-wheeled ones, when the image processing unit crops the image used by the recognition unit from the captured image, the pitch angle generated in the vehicle is obtained from sensor information, and the cropping range of the captured image is set according to that pitch angle; this cropping method removes the influence of the pitch angle at the stage of generating the image passed to the recognition unit.
Japanese Unexamined Patent Publication No. 2014-143547
If the technique of Patent Document 1 is applied to the tracking process as it is, then, when the cropping-position correction direction based on the pitch angle differs from the movement direction of the detection target predicted by the tracking process, the image in the direction in which the detection target moves may be cropped away at the image processing stage, and the detection target may no longer be captured within the search range. Furthermore, because the prior art estimates the own vehicle's pitch angle from sensor information and changes the cropping range accordingly, the cropping range may be wrong when the sensor outputs an invalid value. Therefore, the conventional cropping method cannot solve the problem of tracking-process accuracy when a pitch angle occurs.
An object of the present invention is to provide an image processing device capable of improving the accuracy of tracking processing.
The image processing device of the present invention is characterized by having: a tracking processing unit that sets a search range for a target object in each of a plurality of frame images captured by an in-vehicle camera and tracks the target object in time series; and a search range calculation unit that changes the search range using pitch angle information of the vehicle.
According to the present invention, an image processing device capable of improving the accuracy of tracking processing is provided. Further features of the present invention will become apparent from the description of this specification and the accompanying drawings. Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
Schematic block diagram of the image processing device in Example 1 of the present invention.
An example of the tracking process addressed by the present invention.
An example showing a difference from the prior art in Example 1 of the present invention.
An example in which the search range is set according to the reliability in Example 1 of the present invention.
Flowchart of image processing in Example 1 of the present invention.
An example of applying Example 1 of the present invention to the roll angle.
Schematic block diagram of the image processing device in Example 2 of the present invention.
Flowchart of image processing in Example 2 of the present invention.
Schematic block diagram of the image processing device in Example 3 of the present invention.
Flowchart of image processing in Example 3 of the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
In each of the following examples, the case where the image processing device of the present invention is applied to a motorcycle is described as an example, but the present technology is not limited to motorcycles. It is applicable to any vehicle in which pitching can occur, for example a three-wheeled vehicle such as a trike, or a four-wheeled vehicle such as a truck.
Conventionally, there is a tracking process that detects an object by recognizing an image captured by a camera and tracks the detection target in time series. An example of the tracking process is described with reference to FIG. 2. First, as shown in FIG. 2(a), a detection target is detected in the past frame (frame n-1), and an identifier is assigned to it. In the example shown in FIG. 2(a), identifier A is assigned to the motorcycle and identifier B is assigned to the preceding vehicle.
Next, the positions and movement directions of detection targets A and B are predicted from the tracking results of the past frames using a Kalman filter, and the search ranges for detection targets A and B in the current frame (frame n) are set from the prediction results. In the example shown in FIG. 2(b), the search range for motorcycle A is set by predicting that it moves toward the own vehicle, i.e., toward the lower left of the image, and the search range for preceding vehicle B is set by predicting that it moves away from the own vehicle, i.e., upward in the image.
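The prediction-then-search-range step above can be sketched as follows. As a simplification, a constant-velocity extrapolation stands in for the Kalman filter mentioned in the text; the function names and the axis-aligned box representation are illustrative assumptions, not the patent's implementation.

```python
def predict_search_center(track_history, dt=1.0):
    """Predict where a tracked object will appear in the next frame.

    A minimal constant-velocity predictor: the last two observed centers
    give a velocity estimate, which is extrapolated one frame ahead.
    A Kalman filter would additionally smooth the estimate over time.
    """
    (x0, y0), (x1, y1) = track_history[-2], track_history[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * dt, y1 + vy * dt)


def search_range(center, half_width, half_height):
    """Axis-aligned search box (left, top, right, bottom) around a center."""
    cx, cy = center
    return (cx - half_width, cy - half_height,
            cx + half_width, cy + half_height)
```

A target observed at (100, 200) and then (90, 210) is thus predicted at (80, 220), and the search box is placed around that predicted center.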
Next, the search ranges set in the current frame are searched. By associating the detection results in the current frame with the prediction results, it is determined whether each detection target found in the current frame is the same object as detection target A or B in the past frame. If the detection target in the current frame is judged to be the same object as detection target A or B in the past frame, the same identifier is assigned; if it is judged to be a different object, a new identifier is assigned.
The above tracking process performs recognition by associating the past frame with the current frame, on the premise that the camera position is constant. However, when a pitch angle occurs in the own vehicle, the camera itself tilts and the position of the detection target within the frame moves. If the position of the detection target in the frame changes due to the pitch angle, the detection target cannot be associated between the past frame and the current frame, and tracking fails. Therefore, the tracking process must cope with the vehicle behavior, particularly the pitch-angle tilt, that occurs in the vehicle.
Compared with four-wheeled passenger cars, motorcycles and other two-wheeled vehicles are characterized by a large amount of front-rear body dive caused by braking, driving, road surface conditions, and the like. Therefore, if, for example, an image processing device for passenger cars were mounted on a motorcycle as-is and recognition were performed on images captured by the recognition camera, the detection target might no longer be captured within the search range, and tracking could fail due to the pitch angle generated on the motorcycle body.
In each embodiment, instead of cropping the image in the image processing unit in consideration of the vehicle behavior as in the prior art, the tilt information for an arbitrary vehicle pitch angle is used in the recognition unit of the recognition camera. The present invention aims to solve two problems: (1) how to set the search range of the tracking process when the pitch angle changes, and (2) how to set the search range of the tracking process when the pitch angle information of the vehicle cannot be acquired accurately.
In each embodiment, problem (1) is solved by updating, using the pitch angle information generated in the vehicle, the search range that the tracking process sets from the movement direction of the detection target predicted from the previous frame. Problem (2) is solved by setting a reliability that indicates whether the acquired pitch angle information is accurate, and making the search range of the tracking process variable according to the pitch angle information and the reliability.
For problems (1) and (2), a means using a pitch angle sensor is described in Example 1, a means using the output of the vehicle's control unit instead of a pitch angle sensor is described in Example 2, and a means using the braking/driving force calculated inside the image processing device instead of a pitch angle sensor is described in Example 3.
[Example 1]
In the first embodiment, for the pitch angle of the vehicle that changes from moment to moment, pitch angle information is acquired from the pitch angle sensor 2; the change in the position of the detection target on the recognition image when a pitch angle occurs is output as deviation information 51, and whether that deviation information 51 is accurate is output as reliability 52. A method is then provided for setting the search range of the detection target on the recognition image from the deviation information 51, the reliability 52, and the prediction result of the detection target calculated by the tracking processing unit 10.
First, using FIG. 3, the problem that arises when the prior art is applied to the tracking process, and the solution of the present proposal, are described.
The prior art estimates the pitch angle generated in the own vehicle from sensor information and copes with the resulting change in the position of the detection target in the captured image by adopting a cropping method that shifts the cropping position in the image processing unit (see FIG. 3(a)).
However, the prior art focuses on improving recognition accuracy mainly for the matching process that performs object detection, and does not address the tracking process. The tracking process tracks a detected target by predicting its position and movement direction, setting a search range, and associating the target across multiple frames. In the tracking process, the search range must be set in consideration of the movement direction of the detection target in addition to the pitch angle information generated in the own vehicle.
When the prior art is applied to such a tracking process, the cropping position in the image is moved upward to cope with the pitch angle change, as shown for example in FIG. 3(b). In normal tracking, the search range is set in consideration of the movement directions of the detection targets with identifiers A and B; if the movement direction of the detection target with identifier A is downward, the image in that direction is cropped away by the image processing unit, and a sufficient search range for identifying A can no longer be set. As a result, the association of identifier A fails, and the detection target with identifier A may not be identified.
The prior art tries to cope with pitch angle changes by correcting the cropping position when the image processing unit generates the recognition image. In contrast, in this embodiment, no pitch-angle-aware cropping is performed when the image processing unit generates the recognition image. Instead, the recognition unit 6 uses the pitch angle information, and the search range of the detection target in the tracking process is set variably according to that information (see FIG. 3(c)).
According to this embodiment, since the image processing unit performs no pitch-angle-aware cropping, the prior-art problem of the detection target's movement direction being cropped away during tracking does not occur. Moreover, because the search range is set from the pitch angle information, the detection target can be captured within the search range even when the vehicle's pitch angle changes, as long as the detection target exists within the captured image. Therefore, even if a pitch angle change occurs, the detection target can be associated across multiple frames.
The difference between the prior art and this embodiment has been described using FIG. 3. Next, the principle of this embodiment, in which the search range is set using the pitch angle information in the tracking process when a pitch angle change occurs in the own vehicle, is described in detail with reference to FIG. 1.
The configuration of this embodiment is shown in FIG. 1. The vehicle is equipped with a camera 1 and a pitch angle sensor 2, which are connected via a CAN bus 3. The pitch angle sensor 2 is not particularly limited; for example, an inertial measurement unit (IMU) can be used. The IMU provides three-dimensional angles including the pitch angle, as well as angular velocities and accelerations. These data are transmitted as CAN or Ethernet messages.
FIG. 1 shows the internal configuration of the camera 1, which is the image processing device. The camera 1 includes an imaging unit 4, an image processing unit 5, a recognition unit 6, a control generation unit 7, and a communication interface unit 8. The imaging unit 4 captures an image with an image sensor such as a CMOS sensor. The image processing unit 5 uses the image captured by the imaging unit 4 to generate the image used in the recognition process. The recognition unit 6 detects vehicles and lanes and tracks detection targets in time series. The control generation unit 7 generates control signals for actuators such as the vehicle's brake and accelerator based on the recognition result of the recognition unit 6. The communication interface unit 8 handles communication between the camera and external ECUs over CAN or the like.
The recognition unit 6 includes a storage unit 9, a tracking processing unit 10, and a change amount calculation unit 20. The storage unit 9 stores past recognition results such as detection of other vehicles (e.g., the preceding vehicle) and lanes, road surface recognition, and tracking results. The tracking processing unit 10 sets, for each detection target carrying an identifier, a search range from prediction information that predicts the target's motion using a Kalman filter, and tracks the target in time series. The change amount calculation unit 20 outputs, as deviation information 51, the vertical change of the target object in the frame image computed from the pitch angle information. Specifically, from the pitch angle information 50 acquired from the pitch angle sensor 2, the change amount calculation unit 20 outputs the change in the position of the detection target on the recognition image when a pitch angle occurs as deviation information 51, and outputs whether that deviation information is accurate as reliability 52. The pitch angle information 50 includes the pitch angle generated in the own vehicle, the amount of pitch angle change per unit time, and the direction of pitch angle change.
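One common way to convert a pitch angle into an on-image vertical shift — not stated in the patent, but a standard pinhole-camera relation that a change amount calculation unit of this kind could use — is the following sketch; the function name and parameters are assumptions for illustration.

```python
import math


def pitch_to_pixel_shift(pitch_rad, focal_length_px):
    """Vertical image shift (pixels) caused by a camera pitch rotation.

    Under a pinhole camera model, rotating the camera by a pitch angle
    theta moves a distant point vertically by roughly f * tan(theta),
    where f is the focal length expressed in pixels.  The sign follows
    the sign of the pitch angle.
    """
    return focal_length_px * math.tan(pitch_rad)
```

With a focal length of, say, 1200 px, a 1-degree pitch change corresponds to a shift of roughly 21 px, which gives a concrete scale for the deviation information 51 expressed as a pixel amount.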
Although a tracking method using a Kalman filter has been described as an example of the processing method of the tracking processing unit 10, the tracking method is not limited to this example.
The tracking processing unit 10 has a search range calculation unit 11 that changes the search range set from the prediction information of the detection target, based on the output of the change amount calculation unit 20 and that prediction information; the tracking processing unit 10 then updates the search range to the one changed by the search range calculation unit 11.
The tracking processing unit 10 determines whether the detection target exists within the updated search range by performing a full search within it. If the search finds a detection target that was tracked in the past and assigned an identifier, the same identifier is assigned and the target continues to be tracked. If a new detection target is found, a new identifier is assigned. The deviation information 51 includes the amount of change in the vertical direction and in the horizontal direction, and the change of the detection target on the recognition image may be output as a pixel amount.
The reliability 52 is calculated from the amount of pitch angle change per unit time: when the change per unit time is larger than a threshold, a low reliability is output, and when it is smaller than the threshold, a high reliability is output. In this embodiment, the reliability 52 is assumed to be a numerical value, but a level, an electric signal, or the like may also be used.
The reliability 52 need not be calculated only from the pitch angle change per unit time; it can also be set according to the road surface condition during traveling, based on the road surface recognition information stored in the storage unit 9. For example, a low reliability is output when the road surface is recognized as rough or steep, and a high reliability is output when it is judged to be flat with few irregularities. The reliability 52 is not limited to two values, high and low, and may be used in multiple levels.
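The two reliability sources described above — pitch angle change rate and recognized road surface condition — can be combined as in the following sketch. The numeric encoding (1.0 high / 0.0 low) and the string labels are assumptions for illustration; the text allows levels or signals instead.

```python
def compute_reliability(pitch_rate, rate_threshold, road_surface="flat"):
    """Reliability 52 of the deviation information.

    Low reliability when the pitch angle change per unit time exceeds the
    threshold, or when the recognized road surface is rough or steep;
    high reliability otherwise.
    """
    if abs(pitch_rate) > rate_threshold:
        return 0.0  # rapid pitch change: sensor-derived shift is suspect
    if road_surface in ("rough", "steep"):
        return 0.0  # road condition suggests unpredictable pitching
    return 1.0      # stable pitch on a flat road: shift is trustworthy
```

A multi-level variant would simply return intermediate values instead of the two extremes.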
The search range calculation unit 11 changes the search range according to the reliability, as shown in FIG. 4. For example, as shown in FIG. 4(a), when the reliability 52 is higher than the threshold, the search range in the tracking process is made smaller than when the reliability 52 equals the threshold, reducing wasted CPU load. Conversely, as shown in FIG. 4(b), when the reliability 52 is lower than the threshold, the search range is made larger than when the reliability 52 equals the threshold, providing a larger margin to cope with momentary pitch angle changes.
 Based on the above configuration, FIG. 5 shows a flowchart for setting the search range in the recognition image for the tracking process, using the pitch angle information acquired from the pitch angle sensor 2.
 First, a detection target is detected in the recognition image and an identifier is assigned to it (S1). The past tracking processing results are then acquired from the storage unit 9 (S2). The movement direction of the detection target is predicted from the past tracking information, and the search range within the frame is set (S3).
 The pitch angle information 50 of the pitch angle sensor 2 is acquired through the communication interface 8 (S4). Next, the change amount calculation unit 20 calculates, from the pitch angle information 50, the deviation information 51 of the detection target within the frame and its reliability 52 (S5). It is then determined whether the reliability 52 is high or low (S6): when the reliability 52 is larger than the threshold value it is judged to be high, and when it is smaller than the threshold value it is judged to be low.
 When the reliability is judged to be low, the search range set within the frame is expanded in the vertical direction beyond the preset reference search range (S7). Conversely, when the reliability is judged to be high, the search range is reduced in the vertical direction below the preset reference search range (S8).
 After that, the position of the search range is updated according to the deviation information 51 (S9). The tracking processing unit 10 then performs an exhaustive search within the updated search range to determine whether the detection target carrying the assigned identifier exists there. When the detection target is found within the updated search range, the same identifier is assigned and confirmed (S10). Finally, the result of the tracking process performed by the above flow is stored in the storage unit (S11).
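Steps S1 to S11 can be summarized in one compact sketch. The data shapes and helper logic here are illustrative assumptions standing in for the camera's detection and tracking primitives, not the patent's actual interfaces:

```python
def tracking_step(detections, history, pitch_info,
                  base_range=(0, 0, 64, 64), rel_threshold=0.5):
    """One tracking cycle following S1-S11 of the flowchart.

    detections : list of (id, x, y) targets found this frame (S1)
    history    : dict id -> last (x, y), standing in for storage unit 9 (S2)
    pitch_info : (dy_pixels, reliability) from the change calculator (S5)
    All shapes and constants are illustrative assumptions.
    """
    dy, reliability = pitch_info
    x, y, w, h = base_range                      # S3: predicted search range
    if reliability > rel_threshold:              # S6 -> S8: reliable, shrink
        w, h = w * 0.7, h * 0.7
    else:                                        # S6 -> S7: unreliable, expand
        w, h = w * 1.5, h * 1.5
    y += dy                                      # S9: shift by pitch offset
    confirmed = {}
    for obj_id, ox, oy in detections:            # S10: full search in range
        if x <= ox <= x + w and y <= oy <= y + h:
            confirmed[obj_id] = (ox, oy)
    history.update(confirmed)                    # S11: store the result
    return confirmed
```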
 The camera 1 of this embodiment has a tracking processing unit 10 that sets a search range for a target object in each of a plurality of frame images captured by the camera 1 and tracks the target object in time series, and a search range calculation unit 11 that changes the search range of the tracking processing unit 10 using the pitch angle information of the vehicle.
 More specifically, the camera 1 includes an image pickup unit 4 that captures images, an image processing unit 5 that generates an image, and a recognition unit 6 that performs image recognition using the plurality of pitch angles output from the pitch angle sensor 2 and the image generated by the image processing unit 5. The recognition unit 6 includes the tracking processing unit 10, which performs tracking processing to follow an object in the frame in time series, a storage unit 9 that stores past recognition results, and a change amount calculation unit 20 that calculates the deviation information 51 of the object in the frame from the pitch angle.
 The tracking processing unit 10 includes the search range calculation unit 11, which sets the search range of the tracking process from the input of the change amount calculation unit 20 and the past recognition results. In addition to the deviation information 51, the change amount calculation unit 20 evaluates whether the deviation information 51 is accurate and outputs the result as the reliability 52. The search range calculation unit 11 sets the search range in the recognition image using the deviation information 51, the reliability 52, and the past tracking processing results.
 According to the camera 1 of this embodiment, the search range is first set from the movement direction of the detection target predicted from the previous frame by the tracking process, and that search range is then modified using the pitch angle that arises in the motorcycle. A reliability 52 indicating whether the acquired pitch angle information is accurate is also set, and the search range of the tracking process is made variable according to the reliability 52.
 The search range calculation unit 11 varies the search range according to the reliability 52: for example, when the reliability 52 is judged to be low the search range is expanded, and when it is judged to be high the search range is reduced.
 The change amount calculation unit 20 calculates the deviation information (vertical change amount) 51 of the object in the recognition image as a pixel amount from the pitch angle information 50 acquired from the pitch angle sensor 2, which improves the calculation accuracy of the search range calculation unit 11.
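Converting a pitch angle into a vertical pixel amount, as the change amount calculation unit 20 does, can be illustrated under a pinhole-camera assumption; the focal length value below is a hypothetical example for a typical automotive camera, not a value from the embodiment:

```python
import math

def pitch_to_pixel_shift(pitch_deg, focal_length_px=1400.0):
    """Convert a camera pitch change into a vertical image shift (pixels).

    Under a pinhole-camera model, a small pitch rotation moves image
    content vertically by roughly f * tan(pitch), with f expressed in
    pixels.  The focal length here is an illustrative assumption.
    """
    return focal_length_px * math.tan(math.radians(pitch_deg))
```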
 Therefore, even in a vehicle such as a motorcycle, whose pitch angle changes far more than that of an automobile, the detection target can be captured within the search range, and the accuracy of the tracking process when a pitch angle occurs can be improved.
 Example 1 has focused on the pitch angle, the vehicle behavior occurring in the front-rear direction of the vehicle. However, even when a roll angle occurs in the left-right direction of the vehicle as shown in FIG. 6(a), the search range can be expanded or reduced as shown in FIG. 6(b) by the same method using the deviation information 51 and the reliability 52. The present invention is therefore also applicable when a pitch angle and a roll angle occur at the same time.
[Example 2]
 In Example 2, instead of acquiring the pitch angle information 50 from the pitch angle sensor 2, a method of acquiring the pitch angle information 50 from the output of a control device that controls the braking force and driving force of the vehicle will be described. The pitch angle occurring in the vehicle can be calculated in real time from the output of the control device that controls the braking/driving force, so this proposal can also be applied to vehicles not equipped with a pitch angle sensor. This method is added to Example 1.
 The configuration diagram of this embodiment is shown in FIG. 7. The differences from Example 1 are as follows. Instead of the pitch angle sensor 2, a driving force control device 40 and a braking force control device 41 are used, and the recognition unit 6 acquires the braking/driving force control amount 53 of the driving force control device 40 and the braking force control device 41 via the communication interface 8. In this embodiment, a pitch angle information calculation unit 30 that calculates the pitch angle information 50 occurring in the vehicle is added to the configuration of Example 1.
 A flowchart for calculating the pitch angle information 50 from the braking/driving force control amount 53 will be described with reference to FIG. 8.
 A detection target is detected in the recognition image and an identifier is assigned to it (S1). The past tracking processing results are acquired from the storage unit 9 (S2). The movement direction of the detection target is then predicted from the acquired past tracking information, and the search range within the frame is set (S3).
 The braking/driving force control amount 53 is acquired from the driving force control device 40 and the braking force control device 41, and the pitch angle information calculation unit 30 calculates the pitch angle information 50 (S20). The pitch angle information 50 is calculated using a known method.
 Next, the change amount calculation unit 20 calculates, from the pitch angle information 50, the deviation information 51 of the detection target within the frame and its reliability 52 (S5). It is then determined whether the reliability 52 is high or low (S6): when the reliability 52 is judged to be low, the search range is expanded in the vertical direction (S7), and when it is judged to be high, the search range is reduced in the vertical direction (S8). After that, the position of the search range is updated according to the deviation information 51 (S9).
 The tracking processing unit 10 performs an exhaustive search within the updated search range to determine whether the detection target carrying the assigned identifier exists there. When the detection target is found, the same identifier is assigned and confirmed (S10), and the result of the tracking process performed by the above flow is stored in the storage unit 9 (S11). As described above, the process of S20 is added in place of the process of S4 of Example 1.
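The embodiment only states that the pitch angle information 50 is calculated from the braking/driving force control amount 53 "using a known method". One common simplification, shown here purely as an illustration and not as the patent's method, treats the sprung mass as rotating against a linear pitch stiffness, so the quasi-static pitch angle is proportional to the longitudinal force times the center-of-gravity height. Every constant below is an assumption:

```python
def pitch_from_braking_force(force_n, cg_height_m=0.6,
                             pitch_stiffness_nm_per_rad=20000.0):
    """Estimate the quasi-static pitch angle (rad) caused by a
    longitudinal braking/driving force.

    pitch ~= (F * h_cg) / K_pitch -- a linear pitch-stiffness
    simplification standing in for the embodiment's "known method";
    the CG height and stiffness values are illustrative assumptions.
    """
    return (force_n * cg_height_m) / pitch_stiffness_nm_per_rad
```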
 The image processing device of Example 2 has a communication interface 8 for communicating with the control devices of the vehicle, and the recognition unit 6 has a pitch angle information calculation unit 30 that calculates the pitch angle information 50 occurring in the host vehicle using the braking/driving force control amount 53 from the driving force control device 40 and the braking force control device 41. The change amount calculation unit 20 then calculates the deviation information 51 of the detection target within the frame using the pitch angle information 50 output from the pitch angle information calculation unit 30.
 Thus, in Example 2, by adding the pitch angle information calculation unit 30, which calculates the pitch angle information 50 from the braking/driving force control amount 53, to the configuration of Example 1, the search range can be set in consideration of the vehicle behavior even for vehicles not equipped with the pitch angle sensor 2, improving the tracking accuracy.
[Example 3]
 Example 3 is characterized in that, instead of acquiring the pitch angle information 50 from the pitch angle sensor 2 of Example 1, the pitch angle information 50 is obtained from the target braking/driving force control amount 54 output by the control generation unit 7, which generates control signals for actuators such as the brake and accelerator based on the recognition results. Example 3 is configured to be completed within the camera 1 and requires neither the pitch angle sensor of Example 1 nor the external braking/driving force control amount 53 of Example 2. It can therefore be expected to minimize the adaptation work required for each vehicle type. This method is added to Example 1.
 The configuration diagram of Example 3 is shown in FIG. 9. The differences from Example 1 are as follows.
 Instead of the pitch angle sensor 2 of Example 1, a pitch angle information calculation unit 60 is added, which calculates the pitch angle information 50 from the target braking/driving force control amount 54 output by the control generation unit 7.
 Because the target braking/driving force control amount 54 is a control amount calculated inside the camera, it does not include control amount changes caused by an override when the driver operates the brake or accelerator. When a driver override occurs, a difference therefore arises between the pitch angle information 50 calculated from the target braking/driving force control amount 54 and the pitch angle actually occurring in the vehicle.
 In this embodiment, the method of setting the search range must therefore be adapted when a driver operation is detected. Specifically, a driver operation detection unit 70 is added, which acquires, via the communication interface 8, driver operation information 55 indicating whether the driver has operated the brake or accelerator, and outputs override information 56.
 A flowchart for calculating the pitch angle information 50 from the target braking/driving force control amount 54 will be described with reference to FIG. 10.
 First, a detection target is detected in the recognition image and an identifier is assigned to it (S1). The past tracking processing results are acquired from the storage unit 9 (S2). The movement direction of the detection target is predicted from the past tracking information, and the search range within the frame is set (S3).
 The pitch angle information calculation unit 60 calculates the pitch angle information 50 from the target braking/driving force control amount 54, which the control generation unit 7 inside the camera computes from the camera's recognition results (S30). Next, the change amount calculation unit 20 calculates, from the pitch angle information 50, the deviation information 51 of the detection target within the frame and its reliability 52 (S5).
 The driver operation information 55 is acquired via the communication interface unit 8 (S31). Taking the driver operation information 55 as input, the driver operation detection unit 70 determines whether an override has occurred (S32). When the driver operation detection unit 70 detects a driver override, the search range of the tracking process is expanded to its maximum (S33).
 When no override is detected, it is determined whether the reliability 52 is high or low (S6). When the reliability is judged to be low, the search range is expanded in the vertical direction (S7); when it is judged to be high, the search range is reduced in the vertical direction (S8). After that, the position of the search range is updated according to the deviation information 51 (S9).
 The tracking processing unit 10 performs an exhaustive search within the updated search range to determine whether the detection target carrying the assigned identifier exists there. When the detection target is found, the same identifier is assigned and confirmed (S10), and the result of the tracking process performed by the above flow is stored in the storage unit 9 (S11).
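The branching of S32/S33 and S6 to S8 amounts to choosing one scale factor per frame, which can be sketched as follows; the concrete factors and threshold are illustrative assumptions:

```python
def select_search_scale(override_detected, reliability,
                        rel_threshold=0.5, max_scale=2.0):
    """Pick the search-range scale factor for one frame (S32/S33, S6-S8).

    A driver override invalidates the camera-internal pitch estimate,
    so the range is widened to its maximum regardless of the current
    reliability value.  All scale factors are illustrative assumptions.
    """
    if override_detected:            # S32 -> S33: maximum expansion
        return max_scale
    if reliability > rel_threshold:  # S6 -> S8: reliable, shrink
        return 0.7
    return 1.5                       # S6 -> S7: unreliable, expand
```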
 As described above, in this embodiment the process of S30 replaces S4 of Example 1, and the processes of S31, S32, and S33 are further added.
 Example 3 differs in that the pitch angle information 50 is calculated from the target braking/driving force control amount 54, but the method of setting the search range of the tracking process using the deviation information 51 of the detection target and its reliability 52 is common to Example 1.
 However, when the driver operation detection unit 70 detects an override, the pitch angle information 50 calculated from the target braking/driving force control amount 54 deviates from the pitch angle actually occurring in the vehicle, so the search range would no longer be accurate. Therefore, when an override is detected, the reliability is judged to be the lowest and the search range is expanded to its maximum.
 According to Example 3, the camera 1 has a control generation unit 7 that calculates, from the recognition results of the recognition unit 6, the control amounts computed internally for controlling the vehicle. The pitch angle information calculation unit 60 calculates the pitch angle information 50 from the output of the control generation unit 7, and the change amount calculation unit 20 calculates the deviation information 51 of the object in the frame from the pitch angle information 50 output by the pitch angle information calculation unit 60.
 The recognition unit 6 has a driver operation detection unit 70 that detects the driver's operations; when an override is detected, the change amount calculation unit 20 outputs a low reliability.
 The techniques introduced in all the examples can be applied to both stereo cameras and monocular cameras. This proposal is also applicable to LiDAR, which outputs point cloud information. Although the pitch angle information is obtained by three different methods across the examples, any means of calculating the pitch angle information in real time may be used.
 According to the cameras 1 of Examples 1 to 3, by adding a device that handles the pitch angle of a two-wheeled vehicle to a recognition camera developed for four-wheeled vehicles, even when a pitch angle occurs in the vehicle carrying the recognition camera, setting the search range using the pitch angle information during the tracking process makes it possible to follow the detection target in time series as long as it remains within the captured image.
 Furthermore, by newly introducing the reliability, the search range is expanded when sensor information such as that of the pitch angle sensor outputs an invalid value; the larger margin solves the problem of tracking failing because the detection target falls outside the search range. When the reliability is high, the search range can be reduced, eliminating unnecessary computational load. Conventionally, when the computational load was high, processing such as thinning out the search range was performed; the present invention makes that unnecessary and thus contributes to improved recognition accuracy. Effects such as suppressing the temperature rise of the hardware can also be expected.
 As a result, stable tracking processing can be performed on a two-wheeled vehicle even when a pitch angle occurs.
 According to Example 2, the search range can be set in consideration of the vehicle behavior even for vehicles not equipped with a pitch angle sensor, and according to Example 3, the invention can be realized entirely inside the camera, so the adaptation work required for each vehicle type can be expected to be minimal.
 Examples 1 to 3 have described the techniques with a focus on the pitch angle occurring in the vehicle, but they are applicable not only to the pitch angle alone but also to the case where a pitch angle and a roll angle occur at the same time.
 Although the embodiments of the present invention have been described in detail above, the present invention is not limited to these embodiments, and various design changes can be made without departing from the spirit of the present invention described in the claims. For example, the embodiments above have been explained in detail in order to describe the present invention in an easily understandable manner, and the invention is not necessarily limited to configurations having all the described features. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, the configuration of another embodiment can be added to the configuration of one embodiment, and part of the configuration of each embodiment can have other configurations added, deleted, or substituted.
1 ... Camera (image processing device)
2 ... Pitch angle sensor (IMU)
3 ... CAN bus
4 ... Imaging unit
5 ... Image processing unit
6 ... Recognition unit
7 ... Control generation unit
8 ... Communication interface unit
9 ... Storage unit
10 ... Tracking processing unit
11 ... Search range calculation unit
20 ... Change amount calculation unit
30 ... Pitch angle information calculation unit
40 ... Driving force control device
41 ... Braking force control device
50 ... Pitch angle information
51 ... Deviation information
52 ... Reliability
53 ... Braking/driving force control amount
54 ... Target braking/driving force control amount
55 ... Driver operation information
56 ... Override information
60 ... Pitch angle information calculation unit
70 ... Driver operation detection unit

Claims (7)

  1.  An image processing device comprising: a tracking processing unit that sets a search range for a target object in each of a plurality of frame images captured by an in-vehicle camera and tracks the target object in time series; and a search range calculation unit that changes the search range using pitch angle information of the vehicle.
  2.  The image processing device according to claim 1, further comprising a change amount calculation unit that outputs, from the pitch angle information, a vertical change amount of the target object in the frame image, wherein the search range calculation unit updates the search range according to the vertical change amount.
  3.  The image processing device according to claim 1, wherein the change amount calculation unit calculates a reliability indicating whether the output of the vertical change amount is accurate, and the search range calculation unit reduces the search range when the reliability is judged to be higher than a threshold value and expands the search range when the reliability is judged to be lower than the threshold value.
  4.  The image processing device according to claim 3, wherein the reliability is calculated from the amount of change in the pitch angle per unit time.
  5.  The image processing device according to claim 3, further comprising a storage unit that stores past recognition results, wherein the reliability is set based on road surface recognition information stored in the storage unit.
  6.  The image processing device according to claim 1, further comprising a pitch angle information calculation unit that calculates pitch angle information from a target braking/driving force computed by the image processing device.
  7.  The image processing device according to claim 3, further comprising a driver operation detection unit that detects an accelerator operation or a brake operation by the driver, wherein the change amount calculation unit judges the reliability to be low when the driver operation detection unit detects a driver operation.
PCT/JP2021/002142 2020-05-22 2021-01-22 Image processing device WO2021235001A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022524877A JP7350168B2 (en) 2020-05-22 2021-01-22 Image processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-090109 2020-05-22
JP2020090109 2020-05-22

Publications (1)

Publication Number Publication Date
WO2021235001A1 true WO2021235001A1 (en) 2021-11-25

Family

ID=78707916

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/002142 WO2021235001A1 (en) 2020-05-22 2021-01-22 Image processing device

Country Status (2)

Country Link
JP (1) JP7350168B2 (en)
WO (1) WO2021235001A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002117392A (en) * 2000-10-06 2002-04-19 Nissan Motor Co Ltd Inter-vehicle distance estimation device
WO2011040119A1 (en) * 2009-09-30 2011-04-07 日立オートモティブシステムズ株式会社 Vehicle controller
JP2015210538A (en) * 2014-04-23 2015-11-24 本田技研工業株式会社 Light source detector, headlight control system, and light source detection method
WO2015177985A1 (en) * 2014-05-22 2015-11-26 ヤマハ発動機株式会社 Pitch angle control system, pitch angle control method, and vehicle
WO2017138286A1 (en) * 2016-02-12 2017-08-17 日立オートモティブシステムズ株式会社 Surrounding environment recognition device for moving body
WO2020090320A1 (en) * 2018-10-31 2020-05-07 ソニーセミコンダクタソリューションズ株式会社 Information processing device, information processing method, and information processing program


Also Published As

Publication number Publication date
JPWO2021235001A1 (en) 2021-11-25
JP7350168B2 (en) 2023-09-25

Similar Documents

Publication Publication Date Title
US11186279B2 (en) Control device for vehicle travelling
JP6996353B2 (en) Object recognition device and vehicle travel control system
JP6323473B2 (en) Travel control device
CN106537180B (en) Method for mitigating radar sensor limitations with camera input for active braking of pedestrians
JP4021344B2 (en) Vehicle driving support device
US10755421B2 (en) Tracking device
CN109426261B (en) Automatic driving device
US20100332050A1 (en) Vehicle travel support device, vehicle, and vehicle travel support program
CN106428209A (en) Steering assistant
JP6936098B2 (en) Object estimation device
US20220135030A1 (en) Simulator for evaluating vehicular lane centering system
KR20140133332A (en) System and method for estimating the curvature radius of autonomous vehicles using sensor fusion
JP2006298254A (en) Traveling support device
CN112714718B (en) Vehicle control method and vehicle control device
JP6481627B2 (en) Vehicle travel control device
WO2021235001A1 (en) Image processing device
JP2019067115A (en) Road surface detecting device
JP7236849B2 (en) External recognition device
US20230294674A1 (en) Driving assistance device, driving assistance method, and storage medium
JP2007076472A (en) Operation support device for vehicle
JP7181956B2 (en) Mobile body control device, control method, and vehicle
JP6082293B2 (en) Vehicle white line recognition device
CN113204234B (en) Vehicle control method and vehicle control system
US20240096143A1 (en) Information processing device, vehicle, and information processing method
US20230294676A1 (en) Driving assistance device, driving assistance method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 21809460; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
Ref document number: 2022524877; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 21809460; Country of ref document: EP; Kind code of ref document: A1