WO2020022021A1 - Distance calculation device - Google Patents

Distance calculation device

Info

Publication number
WO2020022021A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
information
distance information
monocular
calculation device
Prior art date
Application number
PCT/JP2019/026647
Other languages
French (fr)
Japanese (ja)
Inventor
稲田 圭介
裕介 内田
野中 進一
Original Assignee
日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority to CN201980049441.1A (CN112513571B)
Publication of WO2020022021A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 - Details
    • G01C3/06 - Use of electric means to obtain final indication
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 - Combinations of systems using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G06T7/593 - Depth or shape recovery from multiple images from stereo images
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes

Definitions

  • the present invention relates to a distance calculation device.
  • There is known a technique of calculating the distance between a camera and an observation target using a stereo camera and performing recognition processing on the observation target. There is also known a method of realizing high-precision, wide-field monitoring around a vehicle by shifting the two cameras constituting a stereo camera so as to generate a compound-eye region in front and monocular regions on the left and right sides.
  • Patent Literature 1 discloses an object detection device that detects an object present on a reference plane and that includes: a parallax calculation unit that calculates parallax between images captured by a plurality of imaging devices; a region calculation unit that, based on the parallax, calculates a common region of the plurality of images and a non-common region other than the common region; a first generation unit that generates, as a candidate rectangle for detecting the object, a rectangle in the common region for which the uniformity of the parallax of the points included in the rectangle is larger than a predetermined threshold; a second generation unit that generates a rectangle in the non-common region as a candidate rectangle; and a determination unit that determines whether the generated candidate rectangle includes the object to be detected.
  • A distance calculation device according to the present invention includes: a high-accuracy distance information generation unit that generates high-accuracy distance information using a sensor output; a low-accuracy distance information generation unit that generates, using a sensor output, low-accuracy distance information whose accuracy is lower than that of the high-accuracy distance information; and a distance correction unit that corrects the low-accuracy distance information using the high-accuracy distance information.
  • FIG. 3A shows the first captured image 100, FIG. 3B shows the second captured image 101, and FIG. 3C shows the groups.
  • Configuration diagram of the correction unit 20; configuration diagram of the group information generation unit 30.
  • Flowchart showing the operation of the distance calculation device 6; configuration diagram of the correction unit 20 in Modification 3.
  • FIG. 10 illustrates an example of a process of correcting the monocular distance information 102 using the correction coefficient α illustrated in FIG. 9; FIG. 10A shows a state before correction and FIG. 10B shows a state after correction.
  • FIG. 12A shows a state before correction and FIG. 12B shows a state after correction.
  • FIG. 15A shows a state before correction and FIG. 15B shows a state after correction.
  • Configuration diagram of the group information generation unit 30 in Modification 21; diagram showing the correspondence between subject types and correction methods.
  • Schematic diagram of the distance calculation system S according to the second embodiment; configuration diagram of the distance calculation device 6 according to the second embodiment; schematic diagram of the distance calculation system S according to the third embodiment; configuration diagram of the distance calculation device 6 according to the third embodiment.
  • FIG. 1 is a schematic diagram of a distance calculation system S including a distance calculation device 6.
  • the distance calculation system S is mounted on the vehicle 1 and includes a first imaging unit 2 and a second imaging unit 3.
  • the first imaging unit 2 and the second imaging unit 3 are arranged side by side in the forward direction of the vehicle 1.
  • the viewing angles of the first imaging unit 2 and the second imaging unit 3 are known, and their fields of view partly overlap, as shown in FIG. 1; the hatched area in FIG. 1 indicates the overlapping fields of view.
  • FIG. 2 is a configuration diagram of the vehicle control system S according to the first embodiment.
  • the vehicle control system S includes a first imaging unit 2, a second imaging unit 3, a distance calculation device 6, a recognition processing unit 7, and a vehicle control unit 8.
  • the distance calculation device 6 includes a monocular distance information generation unit 10, a compound eye distance information generation unit 11, and a monocular distance information correction unit 12.
  • the monocular distance information correction unit 12 includes a correction unit 20 and a group information generation unit 30.
  • the first imaging unit 2 outputs a first captured image 100, which is an image obtained by capturing, to the distance calculation device 6.
  • the second imaging unit 3 outputs a second captured image 101, which is an image obtained by capturing, to the distance calculation device 6.
  • the distance calculation device 6 may be a single ECU (Electronic Control Unit), or may be a calculation device further provided with other functions.
  • the distance calculation device 6 may be configured integrally with the first imaging unit 2 and the second imaging unit 3.
  • the distance calculation device 6, the recognition processing unit 7, and the vehicle control unit 8 may be realized by an integrated device.
  • the recognition processing unit 7 and the vehicle control unit 8 may be realized by an integrated electronic control device, or may be realized by different electronic control devices.
  • the monocular distance information generation unit 10 generates the monocular distance information 102 using each of the first captured image 100 and the second captured image 101.
  • the monocular distance information generation unit 10 calculates the distance from the monocular image by a known method. That is, the monocular distance information generation unit 10 does not use the first captured image 100 and the second captured image 101 in combination in generating the monocular distance information 102.
  • the monocular distance information generation unit 10 combines the distance information calculated using the first captured image 100 and the distance information calculated using the second captured image 101 into monocular distance information 102.
  • the monocular distance information generation unit 10 assumes, for example, that the scene is a level plane captured by a horizontally oriented camera, and calculates larger distances for positions closer to the upper part of the captured image. The monocular distance information generation unit 10 calculates a distance for each distance information block composed of one or a plurality of pixels.
  • the distance information block may be, for example, a 16-pixel block of 4 vertical pixels × 4 horizontal pixels, a 2-pixel block of 2 vertical pixels × 1 horizontal pixel, or a single pixel (1 vertical pixel × 1 horizontal pixel).
  • the monocular distance information generation unit 10 outputs the generated monocular distance information 102 to the correction unit 20 of the monocular distance information correction unit 12.
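  • As an illustration of the flat-road assumption described above, the following sketch (not taken from the publication; the camera height, focal length, and horizon row are hypothetical parameters) shows how a per-block distance could be derived from the image row alone.

    # Minimal sketch of monocular distance under a flat-ground assumption.
    # focal_length_px, camera_height_m and horizon_row are hypothetical parameters,
    # not values taken from the publication.
    def monocular_block_distance(block_row, horizon_row, focal_length_px, camera_height_m):
        """Approximate distance of a distance information block from its image row.

        For a horizontally mounted camera above a level road, a block centred
        'block_row - horizon_row' pixels below the horizon maps to a ground point
        roughly focal_length_px * camera_height_m / (row offset) metres away,
        so blocks nearer the top of the image receive larger distances.
        """
        row_offset = block_row - horizon_row
        if row_offset <= 0:
            return float("inf")  # at or above the horizon: treat as very far
        return focal_length_px * camera_height_m / row_offset

    # Example: block centred on row 520, horizon at row 400, focal length 1200 px,
    # camera mounted 1.2 m above the road -> about 12 m.
    print(monocular_block_distance(520, 400, 1200.0, 1.2))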
  • the compound-eye distance information generation unit 11 combines the first captured image 100 and the second captured image 101; specifically, it uses the region where the fields of view of the two images overlap to generate the compound-eye distance information 103. Since the positional relationship of the first imaging unit 2 and the second imaging unit 3, including the baseline length, is known, the compound-eye distance information generation unit 11 calculates the distance from the parallax of pixels common to the first captured image 100 and the second captured image 101. Note that the compound-eye distance information 103 calculated by the compound-eye distance information generation unit 11 is also calculated for each distance information block.
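  • Since the baseline length and the camera parameters are known, the distance in the overlapping region follows directly from the parallax (disparity). A minimal sketch is shown below; the parameter names are illustrative and not taken from the publication.

    # Minimal sketch of compound-eye (stereo) distance from parallax.
    # focal_length_px and baseline_m are assumed known from calibration.
    def compound_eye_block_distance(disparity_px, focal_length_px, baseline_m):
        """Distance of a block from the parallax between the two captured images."""
        if disparity_px <= 0:
            return float("inf")  # no measurable parallax: treat as very far
        return focal_length_px * baseline_m / disparity_px

    # Example: focal length 1200 px, baseline 0.3 m, disparity 12 px -> 30 m.
    print(compound_eye_block_distance(12.0, 1200.0, 0.3))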
  • compared with the monocular distance information 102, the compound-eye distance information 103 has higher accuracy. Therefore, the monocular distance information 102 can be referred to as "low-accuracy distance information" and the compound-eye distance information 103 as "high-accuracy distance information". Similarly, the monocular distance information generation unit 10 can be called a "low-accuracy distance information generation unit" and the compound-eye distance information generation unit 11 a "high-accuracy distance information generation unit".
  • the compound-eye distance information generation unit 11 outputs the generated compound-eye distance information 103 to the correction unit 20 of the monocular distance information correction unit 12 and to the group information generation unit 30.
  • the group information generation unit 30 detects subjects and performs a grouping process on a plurality of subjects using the first captured image 100, the second captured image 101, and the compound-eye distance information 103, and outputs the result to the correction unit 20 as group information 106. Details of the operation of the group information generation unit 30 will be described later.
  • the group information 106 generated by the group information generation unit 30 includes the type of the group and the position of the subject included in the group on the image.
  • the group information 106 may include information on the type of the subject.
  • the type of the subject is, for example, a person, an animal, a vehicle, a road surface, a curb, a side road, a white line, a guardrail, a sign, a structure, a signal, a falling object on the road, a hole on the road, a puddle, and the like.
  • the correction unit 20 corrects the monocular distance information 102 using the compound eye distance information 103 and the group information 106, and outputs corrected distance information 104. Details of the operation of the correction unit 20 will be described later.
  • the recognition processing unit 7 performs a recognition process using the corrected distance information 104 generated by the distance calculation device 6.
  • the correction unit 20 corrects the monocular distance information 102 using the compound-eye distance information 103 and the group information 106, and outputs the corrected monocular distance information 102 together with the compound-eye distance information 103 as the corrected distance information 104.
  • the detailed configuration and operation of the correction unit 20 will be described later.
  • the vehicle control unit 8 controls the vehicle 1 using the recognition information 140 generated by the recognition processing unit 7.
  • the vehicle control unit 8 outputs the input information 105 to the distance calculation device 6.
  • the input information 105 includes at least one of travel information of the vehicle 1, recognition information 140, traffic infrastructure information, map information, and vehicle control information.
  • the vehicle control unit 8 can acquire travel information of the vehicle 1 from a sensor (not shown) mounted on the vehicle 1.
  • the vehicle control unit 8 can acquire traffic infrastructure information from outside the vehicle 1 using a communication device (not shown).
  • the vehicle control unit 8 can obtain map information by referring to a map database using a communication device (not shown) or by referring to a map database built in the vehicle 1.
  • Examples of the traveling information include traveling speed information, acceleration information, position information, traveling direction, traveling mode, and the like of the vehicle 1.
  • Examples of the traveling mode include an automatic driving mode, a traveling mode based on a traveling road, a traveling mode based on a traveling situation, a traveling mode based on a surrounding natural environment, and an energy saving mode in which the vehicle travels with low power consumption or low fuel consumption.
  • Examples of the driving mode include a manual driving mode in which vehicle control is performed by human judgment, a driving support mode in which part of the vehicle control is assisted by system judgment, and an automatic driving mode in which vehicle control is executed by system judgment.
  • Examples of the travel mode based on the travel path include a city area travel mode, a highway travel mode, and legal speed information.
  • Examples of the traveling mode based on the traveling situation include a traffic congestion traveling mode, a parking lot mode, and a traveling mode according to the position and movement of surrounding vehicles.
  • Examples of the driving mode based on the surrounding natural environment include a night driving mode and a backlight driving mode.
  • Examples of the map information include position information of the vehicle 1 on the map, road shape information, road surface feature information, road width information, lane information, and road gradient information.
  • Examples of the road shape information include a T-junction and an intersection.
  • Examples of the road surface feature information include a signal, a roadway section, a sidewalk section, a level crossing, a bicycle parking lot, a car parking lot, and a crosswalk.
  • Examples of the recognition information 140 include position information, type information, operation information, and danger information of the subject.
  • An example of the position information includes a direction and a distance from the vehicle 1.
  • Examples of the type information include pedestrians, adults, children, the elderly, animals, falling rocks, bicycles, surrounding vehicles, surrounding structures, and curbs.
  • Examples of the motion information include a pedestrian or bicycle swaying, jumping out, traversing, moving direction, moving speed, and moving trajectory.
  • Examples of the danger information include a pedestrian jumping out, falling rocks, and abnormal movements of surrounding vehicles such as sudden stops, sudden deceleration, and sudden turns.
  • Examples of the traffic infrastructure information include traffic congestion information, traffic regulation information such as speed limit and traffic prohibition, travel route guidance information for guiding to another travel route, accident information, alert information, peripheral vehicle information, map information, and the like.
  • Examples of the vehicle control information include brake control, steering wheel control, accelerator control, in-vehicle lamp control, alarm sound generation, in-vehicle camera control, and output of information on observation objects around the imaging devices to surrounding vehicles and to a remote center device connected via a network. Further, the vehicle control unit 8 may perform subject detection processing based on information processed by the distance calculation device 6 or the recognition processing unit 7, may display on a display device connected to the vehicle control unit 8 an image obtained through the first imaging unit 2 or the second imaging unit 3, or a rendering that allows the viewer to recognize such an image, or may output the information to an information device that processes traffic information such as map information or congestion information.
  • FIG. 3 is a diagram showing an area of a captured image.
  • FIG. 3A is a diagram illustrating a first captured image 100 acquired by the first imaging unit 2
  • FIG. 3B is a diagram illustrating a second captured image 101 acquired by the second imaging unit 3
  • FIG. 3C shows a group.
  • the first imaging unit 2 and the second imaging unit 3 are arranged in the horizontal direction, and FIGS. 3A and 3B are aligned horizontally according to the shooting position. That is, the region Q of the first captured image 100 and the region R of the second captured image 101 show the same photographed region.
  • in this overlapping region, the compound-eye distance information generation unit 11 can calculate the distance with high accuracy.
  • the information included in the region Q and the region R is collectively referred to as "information of the region T", in the sense that the two sets of information from the region Q and the region R are used together.
  • hereinafter, the region T is called the "compound-eye region".
  • in the regions P and S, information is obtained from only one of the first imaging unit 2 and the second imaging unit 3, so the monocular distance information generation unit 10 can calculate the distance only with low accuracy.
  • each of the regions P and S is referred to as a "monocular region".
  • the subjects shown in FIG. 3 are as follows.
  • a subject 410, a subject 420, and a subject 430 are photographed in the first captured image 100.
  • the subject 430 extends over the region P and the region Q.
  • the subject 430 in the region P is called a partial subject 401
  • the subject 430 in the region Q is called a partial subject 400.
  • the subject 410 exists at the right end of the region Q
  • the subject 420 exists near the center of the region P.
  • in the second captured image 101, the partial subject 400 is captured at the left end of the region R
  • the subject 410 is captured near the right end of the region R.
  • in the region S, the subject 411, the subject 412, and the subject 413 are photographed in order from the left end.
  • the shapes of the subject 410, the subject 411, the subject 412, and the subject 413 are substantially the same.
  • a set of the partial subjects 400 and 401, in other words the subject 430 itself, is a group G1.
  • a set of the subjects 420 and 430 is a group G2.
  • a set of the subjects 410, 411, 412, and 413 is a group G3.
  • the groups G1, G2, and G3 are also referred to as the first group, the second group, and the third group, respectively.
  • the type of group formed by the portions, in each region, of a subject that extends over the compound-eye region and a monocular region, that is, the type of the group G1, is referred to as a "cross-border group".
  • the group G1, consisting of the partial subject 400 and the partial subject 401, which are the portions of the subject 430 existing over the compound-eye region Q and the monocular region P, is classified as a cross-border group.
  • hereinafter, the group G1 is referred to as the "cross-border group G1".
  • a group consisting of a subject that exists in a monocular region near a subject belonging to a cross-border group, together with the subjects forming that cross-border group, that is, the type of the group G2, is referred to as an "enlarged cross-border group".
  • the group G2, consisting of the subject 420, which exists in the monocular region P near the subject 430 belonging to the cross-border group G1, and the subject 430 constituting the cross-border group G1, is classified as an enlarged cross-border group.
  • hereinafter, the group G2 is referred to as the "enlarged cross-border group G2".
  • a subject such as the subject 420, which is included in the enlarged cross-border group G2 but not in the cross-border group G1, is referred to as an "additional subject".
  • the type of group composed of a plurality of subjects existing adjacent to one another across the compound-eye region and a monocular region, that is, the type of the group G3, is referred to as a "simple group".
  • the subjects belonging to a simple group are grouped on condition that at least one of the type, the shape, the color, and the vertical position of the subjects is the same or similar.
  • the group G3, consisting of the subject 410 in the compound-eye region and the subject 411, the subject 412, and the subject 413 existing adjacently in the monocular region S, is classified as a simple group.
  • hereinafter, the group G3 is referred to as the "simple group G3".
  • FIG. 4 is a configuration diagram of the correction unit 20.
  • the correction unit 20 includes a distance difference information generation unit 22, a correction determination unit 23, and a correction execution unit 21.
  • the distance difference information generation unit 22 generates distance difference information 120 based on the monocular distance information 102, the compound eye distance information 103, and the group information 106. The distance difference information 120 will be described later.
  • the correction determination unit 23 determines a correction method for the monocular distance information 102 based on the distance difference information 120, the compound eye distance information 103, and the input information 105, and outputs correction determination information 121.
  • the correction executing unit 21 performs correction processing on the monocular distance information 102 based on the distance difference information 120, the group information 106, and the correction determination information 121, and generates the corrected distance information 104, which includes the compound-eye distance information 103 and the corrected monocular distance information.
  • the correction determination information 121 will be described later.
  • the distance difference information generation unit 22 calculates an average value LAm for the partial subject 400 included in the compound eye region T based on the compound eye distance information 103. Next, the distance difference information generation unit 22 calculates the average value LAs based on the monocular distance information 102 of the partial subject 400 calculated using only the monocular image. Then, the distance difference information generation unit 22 calculates the distance difference information 120 using the average value LAm and the average value LAs.
  • the calculation of the average value LAm and the average value LAs may be, for example, a simple average of the entire region of the partial subject 400 or a weighted average in which the weight is increased as the distance from the boundary increases.
  • a calculation example in the case of performing weighted averaging is as follows.
  • the compound eye distance information 103 of the n distance information blocks k constituting the partial subject 400 to be calculated is Lm (k)
  • the monocular distance information 102 is Ls (k)
  • the weighting coefficient for the compound-eye distance information 103 is α(k), and the weighting coefficient for the monocular distance information 102 is β(k).
  • k is an integer of 0 to n-1.
  • the average values LAm and LAs are expressed as Expressions 1 and 2.
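  • Expressions 1 and 2 are not reproduced in this text; assuming a standard normalized weighted average consistent with the definitions above, they would take the following form (a reconstruction, not a verbatim quotation of the publication):

    LAm = ( Σ_{k=0..n-1} α(k) × Lm(k) ) / ( Σ_{k=0..n-1} α(k) )   ... plausible form of Expression 1
    LAs = ( Σ_{k=0..n-1} β(k) × Ls(k) ) / ( Σ_{k=0..n-1} β(k) )   ... plausible form of Expression 2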
  • the distance difference information 120 may also be calculated as follows: a difference between the compound-eye distance information 103 of the partial subject 400 in the compound-eye region T and the monocular distance information 102 of the partial subject 400 calculated from the region Q is computed for each distance information block, and the distance difference information 120 is generated based on these per-block differences.
  • the correction determination unit 23 determines whether or not the correction of the distance information of the monocular region is necessary using the input information 105 acquired from the vehicle control unit 8 and generates the correction determination information 121.
  • the correction determination information 121 may include parameters used for correction.
  • the correction determination unit 23 determines not to correct the distance information of the monocular region, for example, when the vehicle speed exceeds a predetermined speed, when the driving mode is a predetermined mode, when the current position of the vehicle 1 is on a highway, or when the steering angle is equal to or greater than a predetermined angle; these conditions may also be used in combination. This determination is based on the importance of monitoring short distances when traveling at low speed. At short distances, the same object is photographed larger and the parallax is larger, so the accuracy of the distance improves.
  • the risk of contact between a person and a car increases in an environment where people tend to be distracted by the car, such as when driving in a city or turning right or left at an intersection in a city. Conversely, on highways and on national roads where vehicles are traveling at high speed, the risk of human contact with vehicles is greatly reduced.
  • the correction determination unit 23 determines the parameters used for weighting, for example, as follows. Note that the larger the value of a parameter, the more strongly the distance information of the compound-eye region is reflected in the correction.
  • the correction determination unit 23 sets the parameter to be larger as the vehicle speed is faster, and sets the parameter to the maximum when the vehicle speed is higher than the first predetermined value, and sets the parameter to the minimum when the vehicle speed is lower than the second predetermined value.
  • the correction determination unit 23 also takes into account the location where the vehicle 1 is traveling: for example, if the vehicle 1 is traveling in a location with heavy pedestrian and vehicle traffic, the parameter is set to a large value, and on highways the parameter is set to the minimum.
  • the correction determination unit 23 sets a larger parameter as the steering angle of the vehicle 1 is larger, and sets a smaller parameter as the steering angle is smaller. Note that these conditions may be combined.
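  • A minimal sketch of how such a correction parameter could be combined from the travel information is given below; the thresholds, the clamping to [0, 1], and the way the conditions are combined are illustrative assumptions, not values from the publication.

    # Sketch of deriving the correction parameter from travel information.
    # Larger values mean the compound-eye distance is reflected more strongly.
    # speed_min, speed_max and the steering scaling are hypothetical values.
    def correction_parameter(speed_kmh, steering_angle_deg, on_highway, in_busy_area,
                             speed_min=10.0, speed_max=60.0):
        if on_highway:
            return 0.0  # minimum parameter on highways
        if in_busy_area:
            return 1.0  # large parameter where pedestrian and vehicle traffic is heavy
        # Larger parameter at higher speed, clamped between the two predetermined values.
        p = (speed_kmh - speed_min) / (speed_max - speed_min)
        p = min(max(p, 0.0), 1.0)
        # A larger steering angle (e.g. turning at an intersection) raises the parameter.
        p = min(1.0, p + 0.5 * min(abs(steering_angle_deg) / 90.0, 1.0))
        return p

    # Example: 40 km/h with a 30-degree steering angle on an ordinary street.
    print(correction_parameter(40.0, 30.0, on_highway=False, in_busy_area=False))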
  • by providing the correction determination unit 23, it is possible to accurately measure the distance to objects around the vehicle, such as people, cars, and parked vehicles, during low-speed driving or when turning right or left at an intersection, where the risk of contact between a person and the vehicle increases; thus, the risk of an accident can be greatly reduced.
  • more accurate automatic driving judgments can be realized by improving the accuracy of the distance information referred to by the automatic driving system, and automatic driving errors caused by erroneous detections arising from erroneous distance information can be prevented. By limiting the correction to low-speed traveling, the processing load during high-speed traveling can be reduced.
  • the correction executing unit 21 corrects the monocular distance information 102 based on the group information 106 by a different method for each type of group.
  • the distance difference information 120 output by the distance difference information generation unit 22 is mainly used when processing the cross-border group G1.
  • the correction determination information 121 is used for determining whether or not correction is required for the cross-border group G1, and for determining correction parameters. Hereinafter, the correction process will be described for each type of group.
  • for the cross-border group G1, the correction executing unit 21 corrects the monocular distance information 102 of the partial subject 401 existing in the monocular region P. This correction is, for example, to replace the monocular distance information 102 of all or some of the distance information blocks of the partial subject 401 with the average value LAm of the partial subject 400 in the compound-eye region T. In the case of replacement, predetermined pixels may be weighted; one example of the weighting is to reduce the weight as the distance from the boundary increases.
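  • A minimal sketch of the replacement just described is shown below; the block layout, the blending toward LAm, and the weighting function are illustrative assumptions.

    # Sketch of correcting the monocular blocks of the partial subject 401.
    # mono_blocks: list of (blocks_from_boundary, monocular_distance Ls) pairs.
    # lam: average compound-eye distance LAm of the partial subject 400 in region T.
    def correct_cross_border_blocks(mono_blocks, lam, decay=0.2):
        """Blend each monocular block toward LAm, weighting less far from the boundary."""
        corrected = []
        for blocks_from_boundary, ls in mono_blocks:
            weight = max(0.0, 1.0 - decay * blocks_from_boundary)  # smaller weight farther away
            corrected.append(weight * lam + (1.0 - weight) * ls)
        return corrected

    # Example: three blocks 0, 1 and 2 block-widths from the region boundary.
    print(correct_cross_border_blocks([(0, 18.0), (1, 19.0), (2, 21.0)], lam=15.0))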
  • the correction executing unit 21 corrects the distance information as follows when the processing is performed on the extended cross-border group. That is, the correction executing unit 21 corrects the distance information of the partial subject in the monocular area of the enlarged cross-border group in the same manner as the distance information of the monocular area in the cross-border group.
  • further, the correction executing unit 21 corrects the distance information of the additional subject in the monocular region of the enlarged cross-border group using the distance information of the partial subject in the compound-eye region. Specifically, in the example shown in FIG. 3, the distance information of the partial subject 401 in the monocular region P of the enlarged cross-border group G2 is corrected in the same manner as in the cross-border group G1, and
  • the distance information of the subject 420 in the monocular region P is corrected using the distance information of the partial subject 400 in the compound eye region T.
  • the correction of the distance information of a subject in the monocular region using the distance information of the partial subject in the compound-eye region is performed, for example, as follows: the distance information of each distance information block of the subject in the monocular region is set based on the average value of the distances of the entire partial subject in the compound-eye region and the average value of the distance information of the distance information blocks of the subject in the monocular region.
  • for the simple group, the correction executing unit 21 corrects the monocular distance information 102 of a subject photographed only in a monocular region using the compound-eye distance information 103 of a subject belonging to the same simple group.
  • the monocular distance information 102 of the subject 411, the subject 412, and the subject 413 is corrected using the compound-eye distance information 103 of the subject 410 calculated from the compound-eye region T and the monocular distance information 102 of the subject 410 calculated from the second captured image 101 (the image containing the region R).
  • although the subject 410 is also photographed in the region Q, that region belongs to the first captured image 100, which was captured by a camera different from the one that captured the monocular region S to be corrected, so the monocular distance information 102 obtained from the region Q is not used for this correction.
  • the correction executing unit 21 calculates a difference between the compound-eye distance information 103 and the single-lens distance information 102 of the subject 410, and adds the difference to the single-eye distance information 102 of the subjects 411, 412, and 413.
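  • A minimal sketch of this offset-style correction for the simple group is shown below; the function and variable names are illustrative.

    # Sketch of the simple-group correction: the offset observed on a reference
    # subject (here, subject 410) that is visible in both the compound-eye data
    # and the monocular data is applied to subjects seen only in the monocular region.
    def correct_simple_group(ref_compound_dist, ref_mono_dist, mono_dists):
        offset = ref_compound_dist - ref_mono_dist  # error of the monocular estimate
        return [d + offset for d in mono_dists]

    # Example: subject 410 measured at 22 m (compound eye) but 27 m (monocular);
    # subjects 411-413 receive the same -5 m offset.
    print(correct_simple_group(22.0, 27.0, [30.0, 33.0, 36.0]))  # [25.0, 28.0, 31.0]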
  • FIG. 5 is a configuration diagram of the group information generation unit 30.
  • the group information generation unit 30 includes a subject detection unit 32 and a group determination unit 31.
  • the subject detection unit 32 detects subjects using the first captured image 100 output from the first imaging unit 2 and the second captured image 101 output from the second imaging unit 3, and outputs subject detection information 130.
  • for the detection, either the first captured image 100 or the second captured image 101 may be used, or both may be used together. Further, first captured images 100 or second captured images 101 taken at different times may be used.
  • since the viewing angles and the positional relationship of the first imaging unit 2 and the second imaging unit 3 are known, the approximate area where the first captured image 100 and the second captured image 101 overlap can be calculated in advance. Using this precomputed overlap, the subject detection unit 32 identifies the monocular regions and the compound-eye region, which makes the boundary between the compound-eye region and the monocular regions clear.
  • the condition of the subject detected by the subject detection unit 32 may be input as the input information 105.
  • the condition of the subject is the type, size, position information and the like of the subject.
  • the subject detection information 130 includes the type of the subject, the size of the subject, whether the subject straddles the boundary between the compound-eye region and a monocular region, whether the subject lies in a monocular region, the position information of the subject, and information on the distance information blocks forming the subject.
  • the group determination unit 31 calculates the group information 106 based on the subject detection information 130, the compound eye distance information 103, and the input information 105.
  • the group determination unit 31 groups a subject straddling the boundary between the compound-eye region and a monocular region into one group and classifies that group as a cross-border group.
  • when another subject exists in a monocular region near a subject belonging to a cross-border group, the group determination unit 31 forms an extended group that also includes that subject and classifies it as an enlarged cross-border group.
  • the group determination unit 31 also groups a plurality of similar subjects existing adjacent to one another across the compound-eye region and a monocular region into one group and classifies that group as a simple group.
  • FIG. 6 is a flowchart showing the operation of the distance calculation device 6.
  • the distance calculation device 6 acquires the first captured image 100 and the second captured image 101 from the first imaging unit 2 and the second imaging unit 3 (S01).
  • the distance calculation device 6 executes S02, S03, and S04 in parallel.
  • S02, S03, and S04 may be executed in any order.
  • S02 is executed by the monocular distance information generation unit 10
  • S03 is executed by the compound eye distance information generation unit 11
  • S04 is executed by the group information generation unit 30.
  • in S02, the distance calculation device 6 generates monocular distance information corresponding to the first captured image 100 using the first captured image 100, and monocular distance information corresponding to the second captured image 101 using the second captured image 101; that is, in S02 the distance information of the compound-eye region is calculated twice, once from each image. In S03, the distance calculation device 6 generates compound-eye distance information of the overlapping area using the first captured image 100 and the second captured image 101. In S04, the distance calculation device 6 detects subjects, generates groups (S05), and determines whether or not a first group exists (S06).
  • when the execution of S02 and S03 is completed and the determination in S06 is affirmative, the distance calculation device 6 generates the distance difference information 120 of the first group (S07). If no first group exists (S06: NO), the distance calculation device 6 ends the processing.
  • the distance calculation device 6 then determines whether the generated distance difference information 120 is larger than a predetermined value (S08). If it is larger, the correction execution unit 21 corrects the monocular distance information 102 as described above and outputs the corrected distance information 104 (S09). If it is equal to or smaller than the predetermined value, the distance calculation device 6 ends the processing. The distance calculation device 6 repeats these processes, for example, for each frame.
  • the distance calculation device 6 includes a compound-eye distance information generation unit 11 that uses the sensor outputs to generate the compound-eye distance information 103, which is high-accuracy distance information, a monocular distance information generation unit 10 that uses a sensor output to generate the monocular distance information 102, whose accuracy is lower than that of the compound-eye distance information 103, and a monocular distance information correction unit 12 that corrects the monocular distance information 102 using the compound-eye distance information 103. Therefore, distance information of a region calculated only with low accuracy can be corrected.
  • the compound-eye distance information generation unit 11 generates the compound-eye distance information 103 of the compound-eye region, where the fields of view overlap, using the outputs of a plurality of cameras.
  • the monocular distance information generation unit 10 generates monocular distance information 102 of a monocular region using an output of a single camera.
  • the distance calculation device 6 includes a group information generation unit 30 that associates one or more subjects included in each of the compound eye region and the monocular region as a first group based on a predetermined condition.
  • the monocular distance information correction unit 12 corrects the monocular distance information 102 using the compound eye distance information 103 of the subject belonging to the first group generated by the compound eye distance information generating unit 11. Therefore, the monocular distance information 102 can be corrected using the compound eye distance information 103.
  • the subjects belonging to the first group G1 are the same subjects existing over the single-eye region and the compound-eye region.
  • the group information generation unit 30 associates a subject included in a monocular region that satisfies a predetermined condition, as an additional subject, with the first group to form a second group.
  • in the example of FIG. 3, the subject 420, which is an additional subject, and the first group G1 are associated with each other as the second group G2.
  • the monocular distance information correction unit 12 corrects the monocular distance information of the additional subject by using the distance information of the first group.
  • the monocular distance information correction unit 12 calculates the difference between the compound-eye distance information 103 and the monocular distance information 102 for the same area as distance difference information, and decides whether correction of the monocular distance information 102 is necessary based on the distance difference information. Therefore, when the effect of the correction is expected to be small, the correction can be omitted to reduce the processing load.
  • the monocular distance information correction unit 12 corrects the monocular distance information of the additional subject using the compound-eye distance information of the first group generated by the compound-eye distance information generation unit 11 or the corrected monocular distance information generated by the monocular distance information correction unit 12.
  • the monocular distance information correction unit 12 may replace the monocular distance information 102 of the partial subject 401 with the compound eye distance information 103 of the partial subject 400 in the correction of the monocular distance of the cross-border group G1.
  • in the correction of the monocular distance of the simple group G3, the monocular distance information correction unit 12 may replace the monocular distance information 102 of the subjects 411, 412, and 413 existing in the monocular region with the compound-eye distance information 103 of the subject 410 existing in the compound-eye region.
  • FIG. 7 is a configuration diagram of the correction unit 20 according to the third modification.
  • the correction determining unit 23 in the present modified example does not receive the input information 105 from the vehicle control unit 8.
  • the correction determination unit 23 according to the present modification generates correction determination information 121 that determines a correction method for the monocular distance information 102 based on the distance difference information 120. Specifically, the correction determination unit 23 determines that the monocular distance information 102 is to be corrected when the distance difference information 120 is equal to or greater than a predetermined threshold, and that it is not to be corrected when the distance difference information 120 is less than the predetermined threshold.
  • FIG. 8 is a diagram illustrating a calculation method of the distance difference information 120 and a process of correcting the monocular distance information 102 for the cross-border group G1 according to the fourth modification.
  • Grids shown in the partial subjects 400 and 401 in FIG. 8 are distance information blocks.
  • a row of distance information blocks arranged in the horizontal direction in the drawing in each of the partial subject 400 and the partial subject 401 will be referred to as a “distance block line”.
  • distance difference information 120 is generated for each distance block line.
  • the monocular distance information 102 in the monocular area is replaced with distance difference information 120 generated based on the compound-eye distance information 103 in the compound-eye area T located at the boundary between the compound-eye area T and the monocular area.
  • the monocular distance information 102 in the monocular region may also be replaced using a difference value between the monocular distance information 102 and the compound-eye distance information 103, generated for each distance information block based on the distance information block line 700.
  • the distance information block line 700 shown in FIG. 8 is a distance information block line in the partial subject 400 in the compound eye region T.
  • a distance information block line 701 in the figure is a distance information block line in the partial subject 401 in the monocular region.
  • the distance difference information generation unit 22 generates an average value LAm for each distance information block line based on the compound-eye distance information 103 of the partial subject in the compound-eye region T. It then generates an average value LAs for each distance information block line based on the monocular distance information 102 of the partial subject in the monocular region P, and generates the distance difference information 120 for each distance information block line based on the average value LAm and the average value LAs.
  • the correction executing unit 21 may perform correction for adding the distance difference information 120 to the monocular distance information 102 before correction.
  • a predetermined pixel may be weighted.
  • when the symbol of the monocular distance information 102 is Ls, the symbol of the distance difference information 120 is LD, and the weighting coefficient is α, the corrected monocular distance information LCs is calculated by the following Expression 6 or Expression 7.
  • LCs = Ls + α × LD (Equation 6)
  • LCs = (1 - α) × Ls + α × LD (Equation 7)
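  • Read in this way, the corrected value either adds the weighted distance difference to the uncorrected monocular distance or blends the two; a small numerical illustration of the reconstructed expressions is shown below (an illustration only, since the original expressions are partly garbled in this text).

    # Illustration of the correction of Equations 6 and 7 as reconstructed above.
    # ls: monocular distance Ls, ld: distance difference information LD,
    # alpha: correction coefficient between 0.0 and 1.0.
    def corrected_distance_eq6(ls, ld, alpha):
        return ls + alpha * ld

    def corrected_distance_eq7(ls, ld, alpha):
        return (1.0 - alpha) * ls + alpha * ld

    # Example with alpha = 0.5: Ls = 20.0 and LD = -4.0 give 18.0 with Equation 6;
    # if LD instead holds a target distance of 16.0, Equation 7 also gives 18.0.
    print(corrected_distance_eq6(20.0, -4.0, 0.5), corrected_distance_eq7(20.0, 16.0, 0.5))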
  • FIG. 9 is a diagram showing a method of determining the weighting coefficient α used in Expression 6 or Expression 7.
  • hereinafter, this weighting coefficient is referred to as the "correction coefficient".
  • the correction executing unit 21 determines the correction coefficient α according to the correction function 1100, which takes the compound-eye distance information 103 as its variable.
  • the distance indicated by the compound-eye distance information 103 is referred to as the short distance 1101 when it is less than L0, the medium distance 1102 when it is at least L0 and less than L1, and the long distance 1103 when it is L1 or more.
  • when the compound-eye distance information 103 indicates the short distance 1101, the correction coefficient α is set to 0.0, and when it indicates the long distance 1103, the correction coefficient α is set to 1.0.
  • in the medium distance 1102, the correction coefficient α is varied according to the compound-eye distance information 103.
  • L0 and L1 are predetermined thresholds.
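  • A minimal sketch of a correction function consistent with the ranges described above is shown below; assigning 0.0 to the short-distance range, the linear transition in the medium range, and the numeric thresholds are assumptions.

    # Sketch of the correction coefficient alpha as a function of the
    # compound-eye distance; L0, L1 and the linear ramp are assumptions.
    def correction_coefficient(distance_m, l0=10.0, l1=40.0):
        if distance_m < l0:
            return 0.0               # below L0 (short distance 1101)
        if distance_m >= l1:
            return 1.0               # at or beyond L1 (long distance 1103)
        # medium distance 1102: vary alpha with the compound-eye distance
        return (distance_m - l0) / (l1 - l0)

    # Example: 25 m with L0 = 10 m and L1 = 40 m -> alpha = 0.5.
    print(correction_coefficient(25.0))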
  • FIG. 10 is a diagram showing an example of a process of correcting the monocular distance information 102 using the correction coefficient α shown in FIG. 9. FIG. 10A shows a state before correction, and FIG. 10B shows a state after correction.
  • the regions 90A and 90B show the partial subject, within the compound-eye region T, of a subject that lies at the medium distance 1102 and straddles the compound-eye region T and the monocular region P. The partial subjects 90A and 90B are the same.
  • the regions 91A and 91B show the partial subject, within the monocular region P, of the same subject as the regions 90A and 90B. The partial subjects 91A and 91B are the same.
  • the regions 92A and 92B show the partial subject, within the compound-eye region T, of a subject that lies at the long distance 1103 and straddles the compound-eye region T and the monocular region P. The partial subjects 92A and 92B are the same.
  • the regions 93A and 93B show the partial subject, within the monocular region P, of the same subject as the regions 92A and 92B. The symbols A, B, m, M, and n written in each region schematically indicate the distance information of that region; regions with the same symbol have the same distance information.
  • based on the compound-eye distance information 103 (90A) of the partial subject at the medium distance 1102 within the compound-eye region T, correction processing of the monocular distance information 102 is performed on the partial subject within the monocular region P constituting the same subject. On the other hand, regardless of the compound-eye distance information 103 (92A) of the partial subject at the long distance 1103 within the compound-eye region T, correction processing of the monocular distance information 102 is not performed on the same subject in the monocular region P.
  • the monocular distance information correction unit 12 determines whether or not to correct the monocular distance information 102 based on the compound eye distance information 103 and determines a parameter used for the correction.
  • the distance difference information 120 may be calculated as follows. For each distance information block unit in the distance information block line, a difference value between the compound eye distance information 103 in the partial subject in the compound eye region T and the monocular distance information 102 in the partial subject in the non-compound eye region is generated. The difference value may be used to calculate an average difference value of the partial subjects in the compound eye region T, and the average difference value may be used as the distance difference information 120.
  • when generating the distance difference information 120, the compound-eye distance information 103 in the distance information block lines above and below may also be used, for example by applying an N-tap FIR (Finite Impulse Response) filter, where N is an integer.
  • the distance difference information 120 may be the average value LAm in the distance information block line.
  • in this case, the correction executing unit 21 may replace the monocular distance information 102 in all or some of the distance information blocks of a distance information block line in the partial subject 401 with the average value LAm of the corresponding distance information block line. In the case of replacement, predetermined pixels may be weighted; for example, the weighting is reduced according to the distance from the boundary between the compound-eye region T and the monocular region.
  • predetermined pixels may be weighted. For example, the weighting is made smaller as the distance from the boundary between the compound eye region T and the monocular region becomes longer.
  • FIG. 11 is a diagram illustrating a correction process of the distance difference information 120 and the monocular distance information 102 for the cross-border group G1 according to the modification 11.
  • the subject 430 straddles the compound eye region T and the monocular region P.
  • the subject 430 includes a partial subject 401 in the monocular region P and a partial subject 400 in the compound eye region T.
  • in this modification, the average value of the compound-eye distance information 103 in the distance information block sequence 700, which is located in the compound-eye region T at the boundary between the compound-eye region T and the monocular region P, is calculated, and the monocular distance information 102 in the partial subject 401 of the monocular region P is replaced with this average value.
  • the distance information block sequence 700 is located at a position closest to the monocular region P of the partial subject 400 existing in the compound eye region T, that is, at a boundary with the monocular region P.
  • the distance difference information generation unit 22 generates an average value LAm of the left-end distance information block sequence of the partial subject 400 based on the compound-eye distance information 103 of the partial subject within the compound-eye region T. Further, the distance difference information generation unit 22 generates an average value LAs based on the monocular distance information 102 of the partial subject 401 in the monocular region P, and generates the distance difference information 120 based on the average value LAm and the average value LAs.
  • the average value LAm and the average value LAs can be, for example, the average value of all or a part of the distance information in the distance information block at the left end 700 of the partial subject 400 in the compound eye region T. In calculating the average, predetermined pixels may be weighted.
  • the distance difference information 120 can be, for example, a difference value LD between the average value LAm and the average value LAs. When calculating the difference value, the average value LAm or the average value LAs may be weighted.
  • the distance difference information 120 may be an average value LA described below.
  • the average value LA is a difference value between the compound eye distance information 103 in the partial subject 400 in the compound eye region T and the monocular distance information 102 in the partial subject 401 in the single eye region. It is generated based on a difference value for each distance information block in the subject 400.
  • the distance difference information 120 may be the average value LAm in the distance information block at the left end 700 of the partial subject 400 in the compound eye region T.
  • in Modification 14, in the correction processing by the correction execution unit 21 for the cross-border group G1, the monocular distance information 102 in all or some of the distance information blocks of the partial subject 401 may be replaced with the average value LAm of the partial subject 400 in the compound-eye region T. In the case of replacement, predetermined pixels may be weighted; for example, the weighting can be reduced according to the distance from the boundary between the compound-eye region T and the monocular region P.
  • the distance difference information 120 of the same distance information block may be added to the monocular distance information 102.
  • predetermined pixels may be weighted. For example, the weighting can be reduced according to the distance from the boundary between the compound eye region T and the monocular region P.
  • FIG. 12 is a diagram illustrating a correction process performed by the correction execution unit 21 on the enlarged cross-border group G2 according to the twelfth modification.
  • FIGS. 12A and 12B show two subjects arranged across the compound-eye region T and the monocular region P, together with their distance information; FIG. 12A shows the state before correction and FIG. 12B shows the state after correction.
  • the regions 83A and 83B show the partial subject, within the compound-eye region T, of the subject that straddles the compound-eye region T and the monocular region P. Note that the partial subjects 83A and 83B are the same.
  • the regions 84A and 84B show partial subjects in the monocular region P of the subject that straddles the compound eye region T and the monocular region P. Note that the partial subjects 84A and 84B are the same.
  • the regions 85A and 85B show a subject existing only in the monocular region P.
  • the symbols A, n, m, M, and N1 described in the respective regions schematically indicate distance information in the respective regions, and indicate that the distance information in the regions of the same symbol is the same.
  • for the enlarged cross-border group, the correction determination unit 23 first corrects the monocular distance information 102 of the partial subject in the monocular region P constituting the same subject from m to M, based on the compound-eye distance information 103 (A) of the partial subject in the compound-eye region T. Next, based on the corrected monocular distance information M generated for the partial subject in the monocular region P, it corrects the monocular distance information of the other subject in the monocular region P included in the enlarged cross-border group G2 (in FIG. 12, from n to N1).
  • FIG. 13 is a diagram illustrating a method of determining the weighting coefficient of Expression 6 or 7, that is, the correction coefficient α.
  • in this modification, the correction coefficient α is determined using the traveling speed of the own vehicle included in the input information 105: when the traveling speed is equal to or less than the predetermined value F0, the correction coefficient α is set to 1.0, and when the traveling speed exceeds a predetermined value, the correction coefficient α is set to 0.0.
  • the threshold L1 shown in FIG. 9 may be variable.
  • FIG. 14 is a diagram illustrating the threshold value L1.
  • the threshold value L1 is determined according to the traveling speed of the vehicle 1 included in the input information 105. When the traveling speed is equal to or less than the predetermined value F0, the threshold L1 is the minimum D0. When the traveling speed exceeds the predetermined value F1, the threshold L1 is the maximum D1. When the traveling speed is between F0 and F1, the value of the threshold L1 increases as the traveling speed increases.
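  • A minimal sketch of this speed-dependent threshold is shown below; D0, D1, F0 and F1 are placeholders for the predetermined values, and the linear increase between F0 and F1 is an assumption.

    # Sketch of the speed-dependent threshold L1 (cf. FIG. 14).
    def threshold_l1(speed_kmh, f0=20.0, f1=80.0, d0=20.0, d1=60.0):
        if speed_kmh <= f0:
            return d0                                  # minimum D0 at low speed
        if speed_kmh > f1:
            return d1                                  # maximum D1 at high speed
        return d0 + (d1 - d0) * (speed_kmh - f0) / (f1 - f0)

    # Example: at 50 km/h the threshold lies halfway between D0 and D1 -> 40.
    print(threshold_l1(50.0))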
  • FIG. 15 is a diagram for explaining the correction for the enlarged cross-border group in Modification 19; FIG. 15(a) shows the state before correction, and FIG. 15(b) shows the state after correction.
  • The correction determination unit 23 determines whether or not to correct the monocular distance information of an additional subject in the enlarged cross-border group as follows: correction is determined to be performed when the distance between the additional subject and the compound eye region is shorter than a predetermined threshold, and not to be performed when the distance is longer than the predetermined threshold.
  • In FIG. 15, both the subject 88A and the subject 89A belong to the enlarged cross-border group.
  • The distance from the subject 88A to the compound eye region T, indicated by reference numeral 1500, is shorter than the predetermined value DB0; therefore, the monocular distance information of the subject 88A is corrected from n to N.
  • The distance from the subject 89A to the compound eye region T, indicated by reference numeral 1501, is longer than the predetermined value DB0; therefore, the monocular distance information of the subject 89B is not corrected and remains k.
  • In this way, the monocular distance information correction unit 12 determines whether to correct the distance information, and the correction parameters, based on the distance between the other subject and the boundary between the monocular image and the compound eye image.
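A sketch of the decision described for FIG. 15, assuming each additional subject carries a precomputed distance (in blocks or pixels) to the nearest edge of the compound eye region T; the data layout and the numbers are illustrative assumptions.

```python
def should_correct_additional_subject(distance_to_compound_region, db0):
    """Correct the additional subject only when it lies closer to the compound eye region than DB0."""
    return distance_to_compound_region < db0

# Subject 88A: close to the compound eye region, so it is corrected (n -> N)
print(should_correct_additional_subject(distance_to_compound_region=12, db0=40))  # True
# Subject 89A: far from the compound eye region, so it is left unchanged (k)
print(should_correct_additional_subject(distance_to_compound_region=75, db0=40))  # False
```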
  • FIG. 16 is a diagram illustrating the configuration of the correction unit 20 according to the sixteenth modification.
  • The correction unit 20 in the present modification further includes a monocular region stay period holding unit 24 in addition to the configuration of the first embodiment.
  • Based on the group information 106, the monocular region stay period holding unit 24 generates monocular region stay period information 122 indicating the length of time, for example the number of frames, during which the subject has been present in the monocular region P, and outputs it to the correction determination unit 23.
  • Examples of the monocular region stay period information 122 include frame count information and time information for the period during which the subject is continuously present in the monocular region P.
  • The monocular region stay period information 122 may be initialized to zero.
  • Alternatively, the monocular region stay period information 122 may be initialized to a maximum value or to a predetermined value.
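A minimal sketch of the stay-period bookkeeping described above, counting frames per tracked subject while it remains in the monocular region P and resetting to zero when it leaves. The per-subject ID scheme and the reset policy are illustrative assumptions.

```python
class MonocularStayPeriodHolder:
    """Tracks, per subject ID, how many consecutive frames the subject has stayed in the monocular region."""

    def __init__(self):
        self.frames_in_monocular = {}

    def update(self, subject_id, in_monocular_region):
        if in_monocular_region:
            self.frames_in_monocular[subject_id] = self.frames_in_monocular.get(subject_id, 0) + 1
        else:
            # One possible policy: initialize to zero once the subject leaves the monocular region.
            self.frames_in_monocular[subject_id] = 0
        return self.frames_in_monocular[subject_id]

holder = MonocularStayPeriodHolder()
for frame, in_p in enumerate([True, True, True, False, True]):
    print(frame, holder.update(subject_id=7, in_monocular_region=in_p))
```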
  • FIG. 17 is a diagram illustrating a method of determining the correction coefficient ⁇ by the correction determination unit 23.
  • The correction determination unit 23 determines the correction coefficient η used in Expressions 6 and 7 based on the monocular region stay period information 122.
  • The correction determination unit 23 sets the correction coefficient η to 1.0 when the monocular region stay period information 122 is equal to or smaller than the predetermined value Ta.
  • When the monocular region stay period information 122 exceeds a larger predetermined value, the correction determination unit 23 sets the correction coefficient η to 0.0.
  • In between, the correction determination unit 23 decreases the correction coefficient η as the monocular region stay period information 122 increases.
  • As described above, the distance calculation device 6 includes the monocular region stay period holding unit 24, which generates monocular region stay period information 122 indicating the period during which the group has existed outside the compound eye region.
  • The monocular distance information correction unit 12 determines the distance information correction method based on the monocular region stay period information 122.
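Read as a mapping from the stay period (in frames) to η, FIG. 17 could look like the sketch below. The upper threshold Tb and the linear decay between Ta and Tb are assumptions; only the Ta plateau of 1.0, the eventual 0.0, and the decreasing trend are stated in the text.

```python
def eta_from_stay_period(stay_frames, ta=10, tb=60):
    """eta = 1.0 up to Ta frames, 0.0 beyond an assumed upper threshold Tb, decreasing in between."""
    if stay_frames <= ta:
        return 1.0
    if stay_frames >= tb:
        return 0.0
    return 1.0 - (stay_frames - ta) / float(tb - ta)

print(eta_from_stay_period(5))    # 1.0
print(eta_from_stay_period(35))   # 0.5
print(eta_from_stay_period(100))  # 0.0
```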
  • FIG. 18 is a configuration diagram of the group information generation unit 30 in Modification 21.
  • The group information generation unit 30 further includes a monocular region stay period holding unit 33 in addition to the configuration in the first embodiment.
  • The monocular region stay period holding unit 33 generates, based on the subject detection information 130, monocular region stay period information 131 indicating the length of time, for example the number of frames, during which the subject has been present in the monocular region P, and outputs it to the group determination unit 31.
  • Examples of the monocular region stay period information 131 include frame count information and time information for the period during which the subject is continuously present in the monocular region P.
  • The monocular region stay period information 131 may be initialized to zero.
  • Alternatively, the monocular region stay period information 131 may be initialized to a maximum value or to a predetermined value.
  • The correction execution unit 21 may determine the method of correcting the monocular distance information based on the type of the subject and the speed of the vehicle 1. The correction method here covers, for example, whether correction is necessary and the correction parameters.
  • FIG. 19 is a diagram showing the correspondence between the type of subject and the correction method. FIG. 19 also shows the relationship between the speed of the vehicle 1 and the correction method; the figure is divided into two stages for convenience of drawing. In FIG. 19, "correction ON" indicates that the monocular distance information is corrected, and "correction OFF" indicates that it is not corrected.
  • A percentage in FIG. 19 indicates that the correction is performed and gives the value of the parameter used for the correction; that is, a correction method with a percentage uses parameters as in Expressions 6 and 7. In the case of "0%", however, the correction need not be performed.
  • When the correction execution unit 21 uses correction method 1, the monocular distance information is corrected when the type of the subject is a person, a vehicle, or a road surface, but is not corrected when the type of the subject is a structure.
  • When the correction execution unit 21 uses correction method 3, the cases are classified according to the speed of the vehicle 1: during low-speed running, the correction is performed regardless of the type of the subject.
  • During high-speed running, persons and vehicles are not corrected.
  • During medium-speed running, persons and vehicles are corrected with a parameter that depends on the speed.
  • Since a road surface or a structure is a stationary object with continuity over a large area, it can be corrected easily.
  • Since the area of a person or a vehicle is relatively small and lacks such continuity, the correction is performed only during low-speed running, where the necessity is high.
  • In this way, the distance accuracy for persons is increased during low-speed running, and the distance accuracy for vehicles is increased during high-speed running.
  • Cost reduction is achieved by reducing the overall correction processing load, taking advantage of the fact that the environment around the vehicle, and in particular what needs to be observed, changes with the speed. Specifically, since the computational capability required of the distance calculation device 6 is reduced, the distance calculation device 6 can be built with a computing device of lower processing capability. A sketch of such a type- and speed-dependent selection follows below.
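One way to encode a FIG. 19-style correspondence is a lookup from (subject type, speed band) to a correction parameter, where 0 % means "correction OFF". The concrete percentages and speed bands below are placeholders chosen for illustration, not values taken from the figure.

```python
# Placeholder table in the spirit of FIG. 19: value is the correction parameter in percent, 0 = correction OFF.
CORRECTION_TABLE = {
    ("person",    "low"): 100, ("person",    "mid"): 50,  ("person",    "high"): 0,
    ("vehicle",   "low"): 100, ("vehicle",   "mid"): 50,  ("vehicle",   "high"): 0,
    ("road",      "low"): 100, ("road",      "mid"): 100, ("road",      "high"): 100,
    ("structure", "low"): 100, ("structure", "mid"): 100, ("structure", "high"): 100,
}

def speed_band(speed_kmh, low_max=20.0, high_min=60.0):
    """Illustrative speed banding used to index the table."""
    if speed_kmh <= low_max:
        return "low"
    return "high" if speed_kmh >= high_min else "mid"

def correction_parameter(subject_type, speed_kmh):
    return CORRECTION_TABLE.get((subject_type, speed_band(speed_kmh)), 0)

print(correction_parameter("person", 15.0))  # corrected at full strength at low speed
print(correction_parameter("person", 80.0))  # correction OFF at high speed
print(correction_parameter("road", 80.0))    # road surface still corrected
```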
  • The correction may be performed only when the distance indicated by the monocular distance information before correction is shorter than a predetermined distance.
  • The correction target may also be limited to persons who are within a predetermined distance range from the road surface. Since the road surface has continuity over a large area, its correction accuracy is high; on the other hand, more attention needs to be paid to people near the traveling road.
  • As described above, the monocular distance information correction unit 12 determines the method of correcting the monocular distance information based on at least one of the traveling state of the vehicle 1 and the type of the subject. Therefore, the distance calculation device 6 can perform the correction in accordance with the running state of the vehicle and the type of the subject.
  • FIG. 20 is a schematic diagram of a distance calculation system S according to the second embodiment.
  • The vehicle 1 includes a first imaging unit 2 and a distance sensor 2000 as sensors for collecting surrounding information.
  • The distance sensor 2000 is a sensor that can acquire distance information and is, for example, a LiDAR (Light Detection and Ranging), a millimeter wave radar, a TOF (Time of Flight) type distance sensor, or a stereo camera.
  • The distance sensor 2000 acquires distance information for the range indicated by the solid line, that is, the area indicated by hatching in the center of the drawing.
  • The imaging range of the first imaging unit 2 is the range indicated by the broken line, that is, a wide range that includes the range in which the distance sensor 2000 acquires distance information.
  • The region indicated by hatching corresponds to the compound eye region T in the first embodiment. Regions of the imaging range of the first imaging unit 2 other than the hatched region correspond to the monocular region P and the monocular region S in the first embodiment.
  • FIG. 21 is a configuration diagram of the distance calculation device 6 according to the second embodiment.
  • In the second embodiment, the compound-eye distance information generation unit 11 is omitted from the configuration of the first embodiment.
  • The distance sensor 2000 outputs distance information 2010 to the monocular distance information correction unit 12.
  • The monocular distance information correction unit 12 treats the distance information 2010 as the compound eye distance information 103 of the first embodiment. In other words, although it does not physically exist, the compound-eye distance information generation unit 11 can be regarded as virtually present, with the output of the input sensor passed through directly as the high-precision compound eye distance information 103.
  • Other configurations are the same as those of the first embodiment.
  • According to the second embodiment, the compound-eye distance information generation unit 11 generates the compound eye distance information 103 using the output of at least one of a TOF distance sensor, a millimeter-wave radar, and a LiDAR. Therefore, the distance information of the monocular region can be corrected using the higher-accuracy information of the distance sensor.
  • A third embodiment of the distance calculation device will be described with reference to FIGS. 22 and 23.
  • In the third embodiment, the same components as those in the first embodiment are denoted by the same reference numerals, and the description focuses on the differences.
  • Points that are not specifically described are the same as in the first embodiment.
  • The third embodiment differs from the first embodiment mainly in the sensors mounted on the vehicle 1.
  • FIG. 22 is a schematic diagram of a distance calculation system S according to the third embodiment.
  • The vehicle 1 includes a first imaging unit 2, a second imaging unit 3, a first distance sensor 2200, and a second distance sensor 2201 as sensors for collecting surrounding information.
  • The first distance sensor 2200 and the second distance sensor 2201 are sensors capable of acquiring distance information, such as a LiDAR, a millimeter wave radar, a TOF distance sensor, or a stereo camera.
  • The fields of view of the first imaging unit 2 and the second imaging unit 3 are the same, and they image the hatched area in the center of the figure.
  • The field of view of the first distance sensor 2200 covers roughly the front left of the vehicle 1, and the field of view of the second distance sensor 2201 covers roughly the front right of the vehicle 1.
  • The viewing angles, installation positions, and postures of all sensors are known.
  • The area indicated by hatching corresponds to the compound eye region T in the first embodiment.
  • The areas above and below the hatched area correspond to the monocular region P and the monocular region S in the first embodiment.
  • FIG. 23 is a configuration diagram of the distance calculation device 6 according to the third embodiment.
  • In the third embodiment, the monocular distance information generation unit 10 is omitted.
  • The first distance sensor 2200 and the second distance sensor 2201 output the distance information 2210 and the distance information 2211 to the monocular distance information correction unit 12.
  • The monocular distance information correction unit 12 extracts the information of the necessary areas from the input distance information 2210 and distance information 2211 and treats the extracted information as the monocular distance information 102 of the first embodiment. In other words, although it does not physically exist, the monocular distance information generation unit 10 can be regarded as virtually present, with the output of the input sensors passed through directly as the monocular distance information 102.
  • Other configurations are the same as those of the first embodiment.
  • According to the third embodiment, the monocular distance information generation unit 10 generates the monocular distance information 102 using the output of at least one of a TOF distance sensor, a millimeter wave radar, and a LiDAR.
  • A fourth embodiment of the distance calculation device will be described with reference to FIGS. 24 and 25.
  • In the fourth embodiment, the same components as those in the first embodiment are denoted by the same reference numerals, and the description focuses on the differences.
  • Points that are not specifically described are the same as in the first embodiment.
  • The fourth embodiment differs from the first embodiment mainly in the viewing angle and the attitude of the cameras mounted on the vehicle 1.
  • FIG. 24 is a schematic diagram of a distance calculation system S according to the fourth embodiment.
  • The vehicle 1 includes a first imaging unit 2 and a second imaging unit 3 as sensors for collecting surrounding information, as in the first embodiment.
  • In the fourth embodiment, the viewing angles of the first imaging unit 2 and the second imaging unit 3 are wide, and the fields of view of both are the same.
  • The first imaging unit 2 and the second imaging unit 3 each have, for example, a fisheye lens and can image a wide range, but the amount of distortion at the periphery of the lens is large and the resolution there is reduced.
  • The hatched area corresponds to the compound eye region T in the first embodiment.
  • Regions other than the center of the field of view of the first imaging unit 2 and the second imaging unit 3 have large distortion and low spatial resolution, and are therefore regarded as corresponding to the monocular region P and the monocular region S in the first embodiment. In other words, an area with relatively poor imaging conditions is treated as a monocular region, and an area with relatively good imaging conditions is treated as a compound eye region.
  • FIG. 25 is a configuration diagram of the distance calculation device 6 according to the fourth embodiment.
  • In the fourth embodiment, the monocular distance information generation unit 10 is omitted from the configuration of the first embodiment.
  • The monocular distance information correction unit 12 treats only the vicinity of the center of the compound eye distance information 103 output by the compound eye distance information generation unit 11 as the compound eye distance information 103 of the first embodiment, and treats the peripheral part as the monocular distance information 102 of the first embodiment. Other configurations are the same as those of the first embodiment.
  • According to the fourth embodiment, the compound-eye distance information generation unit 11 generates the compound eye distance information 103 using the information of an area, output by the two cameras, in which the fields of view overlap and the imaging conditions are relatively good, and the monocular distance information generation unit 10 generates the monocular distance information 102 using the information of an area, output by the two cameras, in which the fields of view overlap but the imaging conditions are inferior to those of the area used by the compound-eye distance information generation unit 11. Therefore, the present invention can also be applied to a system having no monocular region.
  • The imaging conditions may include not only the distortion of the lens but also the amount of light and any dirt attached to the lens.
  • For example, distance information of a dark area where the amount of light is small may be treated as the monocular distance information 102, and distance information of a brightly and clearly photographed area may be treated as the compound eye distance information 103.
  • FIG. 26 is a configuration diagram of a distance calculation system S according to the fifth embodiment.
  • In the fifth embodiment, the monocular distance information correction unit 12 is realized by hardware different from that of the monocular distance information generation unit 10 and the compound eye distance information generation unit 11.
  • The camera processing device 15 includes the monocular distance information generation unit 10, the compound eye distance information generation unit 11, and a feature information generation unit 13.
  • The feature information generation unit 13 creates a feature amount of the input image, for example an edge image or a histogram, and outputs it to the image processing unit 60 as feature information 107.
  • The ECU 14 includes the monocular distance information correction unit 12, the recognition processing unit 7, and the vehicle control unit 8.
  • As described above, the present invention can be implemented by various hardware configurations.

Abstract

This distance calculation device is provided with: a high precision distance information generation unit which generates, by using an output from a sensor, high precision distance information that is information on a highly precise distance; a low precision distance information generation unit which generates, by using an output from the sensor, low precision distance information, the degree of precision of which is lower than that of the high precision distance information; and a distance correction unit which corrects the low precision distance information by using the high precision distance information.

Description

Distance calculation device
The present invention relates to a distance calculation device.
There is known a technique of calculating the distance between a camera and an observation target using a stereo camera and performing recognition processing of the observation target. There is also known a method of realizing high-precision, wide-field monitoring around a vehicle by shifting the two cameras constituting a stereo camera so as to generate a compound eye region in front and monocular regions on the left and right sides. Patent Literature 1 discloses an object detection device that detects an object present on a reference plane, comprising: a parallax calculation unit that calculates parallax between images captured by a plurality of imaging devices; a region calculation unit that calculates, based on the parallax, a common region of the plurality of images and a non-common region other than the common region; a first generation unit that generates, as a candidate rectangle for detecting the object, a rectangle in the common region for which the uniformity of the parallax of the points included in the rectangle is larger than a predetermined threshold; a second generation unit that generates a rectangle in the non-common region as the candidate rectangle; and a determination unit that determines whether the generated candidate rectangle includes an object to be detected.
Japanese Patent Application Laid-Open No. 2010-79582
There is a need to improve distance information in regions where distance information cannot be calculated with high accuracy.
A distance calculation device according to a first aspect of the present invention includes: a high-precision distance information generation unit that generates, using an output of a sensor, high-precision distance information, which is highly accurate distance information; a low-precision distance information generation unit that generates, using an output of the sensor, low-precision distance information with lower accuracy than the high-precision distance information; and a distance correction unit that corrects the low-precision distance information using the high-precision distance information.
According to the present invention, it is possible to correct distance information of a region with low accuracy.
FIG. 1 is a schematic diagram of a distance calculation system S including the distance calculation device 6.
FIG. 2 is a configuration diagram of a vehicle control system S according to the first embodiment.
FIG. 3(a) is a diagram showing the first captured image 100, FIG. 3(b) is a diagram showing the second captured image 101, and FIG. 3(c) is a diagram showing the groups.
FIG. 4 is a configuration diagram of the correction unit 20.
FIG. 5 is a configuration diagram of the group information generation unit 30.
FIG. 6 is a flowchart showing the operation of the distance calculation device 6.
FIG. 7 is a configuration diagram of the correction unit 20 in Modification 3.
FIG. 8 is a diagram showing the method of calculating the distance difference information 120 in Modification 4 and the correction processing of the monocular distance information 102 for the cross-border group G1.
FIG. 9 is a diagram showing the method of determining the weighting coefficient η shown in Expression 6 or Expression 7.
FIG. 10 is a diagram showing an example of the correction processing of the monocular distance information 102 using the correction coefficient η shown in FIG. 9; FIG. 10(a) shows the state before correction and FIG. 10(b) shows the state after correction.
FIG. 11 is a diagram showing the correction processing of the distance difference information 120 and the monocular distance information 102 for the cross-border group G1 in Modification 11.
FIG. 12 is a diagram showing the correction processing by the correction execution unit 21 for the enlarged cross-border group G2 in Modification 12; FIG. 12(a) shows the state before correction and FIG. 12(b) shows the state after correction.
FIG. 13 is a diagram showing the method of determining the correction coefficient η shown in Expression 6 or Expression 7.
FIG. 14 is a diagram showing the threshold L1 shown in FIG. 9.
FIG. 15 is a diagram explaining the correction for the enlarged cross-border group in Modification 19; FIG. 15(a) shows the state before correction and FIG. 15(b) shows the state after correction.
FIG. 16 is a diagram showing the configuration of the correction unit 20 in Modification 16.
FIG. 17 is a diagram showing the method of determining the correction coefficient η by the correction determination unit 23.
FIG. 18 is a configuration diagram of the group information generation unit 30 in Modification 21.
FIG. 19 is a diagram showing the correspondence between the types of subjects and the correction methods.
FIG. 20 is a schematic diagram of the distance calculation system S according to the second embodiment.
FIG. 21 is a configuration diagram of the distance calculation device 6 according to the second embodiment.
FIG. 22 is a schematic diagram of the distance calculation system S according to the third embodiment.
FIG. 23 is a configuration diagram of the distance calculation device 6 according to the third embodiment.
FIG. 24 is a schematic diagram of the distance calculation system S according to the fourth embodiment.
FIG. 25 is a configuration diagram of the distance calculation device 6 according to the fourth embodiment.
FIG. 26 is a configuration diagram of the distance calculation system S according to the fifth embodiment.
-First Embodiment-
Hereinafter, a first embodiment of the distance calculation device will be described with reference to FIGS. 1 to 6.
(Configuration)
FIG. 1 is a schematic diagram of a distance calculation system S including a distance calculation device 6. The distance calculation system S is mounted on the vehicle 1 and includes a first imaging unit 2 and a second imaging unit 3. The first imaging unit 2 and the second imaging unit 3 are installed side by side facing forward in the traveling direction of the vehicle 1. The viewing angles of the first imaging unit 2 and the second imaging unit 3 are known, and parts of their fields of view overlap as shown in FIG. 1. In FIG. 1, the area indicated by hatching is where the fields of view overlap.
FIG. 2 is a configuration diagram of the vehicle control system S according to the first embodiment. The vehicle control system S includes a first imaging unit 2, a second imaging unit 3, a distance calculation device 6, a recognition processing unit 7, and a vehicle control unit 8. The distance calculation device 6 includes a monocular distance information generation unit 10, a compound eye distance information generation unit 11, and a monocular distance information correction unit 12. The monocular distance information correction unit 12 includes a correction unit 20 and a group information generation unit 30. The first imaging unit 2 outputs a first captured image 100, which is an image obtained by capturing, to the distance calculation device 6. The second imaging unit 3 outputs a second captured image 101, which is an image obtained by capturing, to the distance calculation device 6.
The distance calculation device 6 may be a single ECU (Electronic Control Unit) or a computing device that further has other functions. The distance calculation device 6 may be configured integrally with the first imaging unit 2 and the second imaging unit 3. Further, the distance calculation device 6, the recognition processing unit 7, and the vehicle control unit 8 may be realized by an integrated device. The recognition processing unit 7 and the vehicle control unit 8 may be realized by an integrated electronic control device, or each may be realized by a different electronic control device.
The monocular distance information generation unit 10 generates monocular distance information 102 using each of the first captured image 100 and the second captured image 101. The monocular distance information generation unit 10 calculates the distance from a monocular image by a known method. That is, the monocular distance information generation unit 10 does not use the first captured image 100 and the second captured image 101 in combination when generating the monocular distance information 102. The monocular distance information generation unit 10 combines the distance information calculated using the first captured image 100 and the distance information calculated using the second captured image 101 into the monocular distance information 102.
For example, the monocular distance information generation unit 10 assumes that the image was captured horizontally over a flat plane and assigns larger distances to positions higher in the captured image. The monocular distance information generation unit 10 calculates the distance for each distance information block composed of one or more pixels. A distance information block may be, for example, 16 pixels (4 vertical by 4 horizontal pixels), 2 pixels (2 vertical by 1 horizontal pixel), or 1 pixel (1 vertical by 1 horizontal pixel). The monocular distance information generation unit 10 outputs the generated monocular distance information 102 to the correction unit 20 of the monocular distance information correction unit 12.
The compound eye distance information generation unit 11 generates compound eye distance information 103 by combining the first captured image 100 and the second captured image 101, specifically by using the region where the fields of view of the first captured image 100 and the second captured image 101 overlap. Since the positional relationship between the first imaging unit 2 and the second imaging unit 3, including the baseline length, is known, the compound eye distance information generation unit 11 calculates the distance from the parallax of the pixels common to the first captured image 100 and the second captured image 101. The compound eye distance information 103 calculated by the compound eye distance information generation unit 11 is also calculated per distance information block.
Comparing the monocular distance information 102 with the compound eye distance information 103, the compound eye distance information 103 has higher accuracy. Therefore, the monocular distance information 102 can be referred to as "low-accuracy distance information", and the compound eye distance information 103 can be referred to as "high-accuracy distance information". Similarly, the monocular distance information generation unit 10 can be called a "low-accuracy distance information generation unit", and the compound eye distance information generation unit 11 can be called a "high-accuracy distance information generation unit".
Note that the size of the distance information block used by the monocular distance information generation unit 10 and the size of the distance information block used by the correction unit 20 may be different, but in the present embodiment both distance information blocks are assumed to be the same in order to simplify the description. The compound eye distance information generation unit 11 outputs the generated compound eye distance information 103 to the correction unit 20 of the monocular distance information correction unit 12 and to the group information generation unit 30.
The group information generation unit 30 detects subjects and performs grouping processing on a plurality of subjects using the first captured image 100, the second captured image 101, and the compound eye distance information 103, and outputs the result to the correction unit 20 as group information 106. Details of the operation of the group information generation unit 30 will be described later.
The group information 106 generated by the group information generation unit 30 includes the type of the group and the positions on the image of the subjects constituting the group. The group information 106 may also include information on the types of the subjects. The types of subjects are, for example, a person, an animal, a vehicle, a road surface, a curb, a side road, a white line, a guardrail, a sign, a structure, a traffic signal, an object fallen on the road, a hole in the road surface, a puddle, and the like. The correction unit 20 corrects the monocular distance information 102 using the compound eye distance information 103 and the group information 106, and outputs corrected distance information 104. Details of the operation of the correction unit 20 will be described later. The recognition processing unit 7 performs recognition processing using the corrected distance information 104 generated by the distance calculation device 6.
The correction unit 20 corrects the monocular distance information 102 using the compound eye distance information 103 and the group information 106, and outputs the corrected monocular distance information 102 and the compound eye distance information 103 as the corrected distance information 104. The detailed configuration and operation of the correction unit 20 will be described later.
The vehicle control unit 8 controls the vehicle 1 using the recognition information 140 generated by the recognition processing unit 7. The vehicle control unit 8 also outputs input information 105 to the distance calculation device 6. The input information 105 includes at least one of travel information of the vehicle 1, the recognition information 140, traffic infrastructure information, map information, and vehicle control information. The vehicle control unit 8 can acquire the travel information of the vehicle 1 from a sensor (not shown) mounted on the vehicle 1. The vehicle control unit 8 can acquire the traffic infrastructure information from outside the vehicle 1 using a communication device (not shown). The vehicle control unit 8 can acquire the map information by referring to a map database using a communication device (not shown) or by referring to a map database built into the vehicle 1.
Examples of the travel information include traveling speed information, acceleration information, position information, traveling direction, and traveling mode of the vehicle 1. Examples of the traveling mode include an automated driving mode, a traveling mode based on the traveling road, a traveling mode based on the traveling situation, a traveling mode based on the surrounding natural environment, and an energy saving mode in which the vehicle travels with low power consumption or low fuel consumption. Examples of the automated driving mode include a manual driving mode in which vehicle control is performed by human judgment, a driving support mode in which a part of vehicle control is assisted by system judgment, and an automated driving mode in which vehicle control is executed by system judgment. Examples of the traveling mode based on the traveling road include a city area travel mode, a highway travel mode, and legal speed information. Examples of the traveling mode based on the traveling situation include a traffic congestion traveling mode, a parking lot mode, and a traveling mode according to the positions and movements of surrounding vehicles. Examples of the traveling mode based on the surrounding natural environment include a night driving mode and a backlight driving mode. Examples of the map information include position information of the vehicle 1 on the map, road shape information, road surface feature information, road width information, lane information, and road gradient information. Examples of the road shape information include a T-junction and an intersection. Examples of the road surface feature information include a traffic signal, a roadway section, a sidewalk section, a railroad crossing, a bicycle parking lot, a car parking lot, and a crosswalk.
Examples of the recognition information 140 include position information, type information, motion information, and danger information of the subject. An example of the position information is the direction and distance from the vehicle 1. Examples of the type information include a pedestrian, an adult, a child, an elderly person, an animal, a falling rock, a bicycle, a surrounding vehicle, a surrounding structure, and a curb. Examples of the motion information include swaying, jumping out, crossing, moving direction, moving speed, and movement trajectory of a pedestrian or a bicycle. Examples of the danger information include a pedestrian jumping out, a falling rock, and abnormal movements of surrounding vehicles such as a sudden stop, sudden deceleration, or sudden steering. Examples of the traffic infrastructure information include congestion information, traffic regulation information such as speed limits and road closures, travel route guidance information for guiding the vehicle to another travel route, accident information, alert information, surrounding vehicle information, and map information. Examples of the vehicle control information include brake control, steering control, accelerator control, in-vehicle lamp control, warning sound generation, in-vehicle camera control, and output of information on observation targets around the imaging devices to surrounding vehicles or remote center equipment connected via a network. The vehicle control unit 8 may also perform subject detection processing based on the information processed by the distance calculation device 6 or the recognition processing unit 7, may cause a display device connected to the vehicle control unit 8 to display images obtained through the first imaging unit 2 or the second imaging unit 3 or displays for the viewer to recognize, and may supply information on the observation targets detected by the distance calculation device 6 or the recognition processing unit 7 to information devices that process traffic information such as map information and congestion information.
(Captured images and groups)
FIG. 3 is a diagram showing the regions of the captured images. FIG. 3(a) shows the first captured image 100 acquired by the first imaging unit 2, FIG. 3(b) shows the second captured image 101 acquired by the second imaging unit 3, and FIG. 3(c) shows the groups. The first imaging unit 2 and the second imaging unit 3 are arranged in the horizontal direction, and the horizontal directions of FIGS. 3(a) and 3(b) are aligned with the imaging positions. That is, the region Q of the first captured image 100 and the region R of the second captured image 101 capture the same area.
Therefore, by using the region Q of the first captured image 100 and the region R of the second captured image 101, the compound eye distance information generation unit 11 can calculate the distance with high accuracy. In the following, the information contained in the region Q and the region R is collectively referred to as "information of the region T", with the implication that the two pieces of information of the regions Q and R are used together, and the region T is called the "compound eye region". On the other hand, for the region P of the first captured image 100 and the region S of the second captured image 101, information is obtained from only one of the first imaging unit 2 and the second imaging unit 3, so the monocular distance information generation unit 10 can calculate the distance only with low accuracy. In the following, each of the regions P and S is referred to as a "monocular region".
The subjects shown in FIG. 3 are as follows. A subject 410, a subject 420, and a subject 430 are captured in the first captured image 100. The subject 430 extends over the region P and the region Q; for convenience, the part of the subject 430 in the region P is called a partial subject 401, and the part of the subject 430 in the region Q is called a partial subject 400. The subject 410 exists at the right end of the region Q, and the subject 420 exists near the center of the region P. In the second captured image 101, the partial subject 401 is captured at the left end of the region R, and the subject 410 is captured near the right end of the region R. In the region S of the second captured image 101, the subject 411, the subject 412, and the subject 413 are captured from the left end. The shapes of the subject 410, the subject 411, the subject 412, and the subject 413 are substantially the same.
In the present embodiment, the set of the partial subject 400 and the partial subject 401, in other words the subject 430 itself, is a group G1; the set of the subject 420 and the subject 430 is a group G2; and the set of the subject 410, the subject 411, the subject 412, and the subject 413 is a group G3. The groups G1, G2, and G3 are also referred to as the first group, the second group, and the third group, respectively.
In the present embodiment, the type of group formed by the parts, in each region, of a subject that exists across the compound eye region and the monocular region, that is, the group G1, is called a "cross-border group". Applied to the example shown in FIG. 3, the parts of the subject 430 existing across the compound eye region Q and the monocular region P, that is, the group G1 of the partial subject 400 and the partial subject 401, are classified as a cross-border group. Hereinafter, the group G1 is referred to as the "cross-border group G1".
A subject that exists in the monocular region in the vicinity of a subject belonging to a cross-border group, together with the subjects constituting that cross-border group, forms the group type called an "enlarged cross-border group", that is, the group G2. Applied to the example shown in FIG. 3, the subject 420, which exists in the monocular region P in the vicinity of the subject 430 belonging to the cross-border group G1, and the subject 430 constituting the cross-border group G1 form the group G2, which is classified as an enlarged cross-border group. Hereinafter, the group G2 is referred to as the "enlarged cross-border group G2". In the following, a subject such as the subject 420 that is included in the enlarged cross-border group G2 but not in the cross-border group G1 is referred to as an "additional subject".
The type of group composed of a plurality of subjects existing adjacently in the compound eye region and the monocular region, that is, the group G3, is called a "single group". However, the subjects belonging to a single group must be the same or similar in at least one of the type, the shape, the color, and the vertical position of the subject. Applied to the example shown in FIG. 3, the group G3 of the subject 411, the subject 412, and the subject 413 existing adjacently in the monocular region S is classified as a single group. Hereinafter, the group G3 is referred to as the "single group G3".
(Correction unit 20)
FIG. 4 is a configuration diagram of the correction unit 20. The correction unit 20 includes a distance difference information generation unit 22, a correction determination unit 23, and a correction execution unit 21. The distance difference information generation unit 22 generates distance difference information 120 based on the monocular distance information 102, the compound eye distance information 103, and the group information 106. The distance difference information 120 will be described later.
The correction determination unit 23 determines the correction method for the monocular distance information 102 based on the distance difference information 120, the compound eye distance information 103, and the input information 105, and outputs correction determination information 121. The correction execution unit 21 performs correction processing on the monocular distance information 102 based on the distance difference information 120, the group information 106, and the correction determination information 121, and generates corrected distance information 104 including the compound eye distance information 103 and the corrected monocular distance information. The correction determination information 121 will be described later.
(Distance difference information generation unit 22)
The operation of the distance difference information generation unit 22 will be specifically described with reference again to FIG. 3. The distance difference information generation unit 22 calculates an average value LAm for the partial subject 400 included in the compound eye region T based on the compound eye distance information 103. Next, the distance difference information generation unit 22 calculates an average value LAs based on the monocular distance information 102 of the partial subject 400, which is calculated using only the monocular image. The distance difference information generation unit 22 then calculates the distance difference information 120 using the average value LAm and the average value LAs. The average value LAm and the average value LAs may each be calculated, for example, as a simple average over the entire region of the partial subject 400 or as a weighted average in which blocks closer to the boundary are weighted more heavily.
A calculation example for the weighted average is as follows. For the n distance information blocks k constituting the partial subject 400 to be calculated, let the compound eye distance information 103 be Lm(k), the monocular distance information 102 be Ls(k), the weighting coefficient for the compound eye distance information 103 be α(k), and the weighting coefficient for the monocular distance information 102 be β(k), where k is an integer from 0 to n-1. The average values LAm and LAs are then expressed as in Expressions 1 and 2.
   LAm = (1/n) Σ_{k=0}^{n-1} ( α(k) × Lm(k) )   ... (Expression 1)
   LAs = (1/n) Σ_{k=0}^{n-1} ( β(k) × Ls(k) )   ... (Expression 2)
The distance difference information 120 is calculated, for example, as the difference between the average value LAm and the average value LAs. When calculating the difference, the average value LAm or the average value LAs may be weighted. With γ as the weighting coefficient for LAm, which represents the compound eye distance information 103, and δ as the weighting coefficient for LAs, which represents the monocular distance information 102, the distance difference information 120, denoted LD, can be calculated as in Expression 3.
   LD = γ × LAm - δ × LAs   ... (Expression 3)
The distance difference information 120 may also be calculated as follows: for each distance information block, the difference between the compound eye distance information 103 of the partial subject 400 in the compound eye region T and the monocular distance information 102 of the partial subject 400 in the region Q is calculated, and the distance difference information 120 is generated based on these per-block differences within the compound eye region T.
Furthermore, the distance difference information 120 may be calculated as follows: the average value LAm with weighting applied may itself be used as the distance difference information 120. That is, with a weighting coefficient ε, the distance difference information 120, denoted LD, can be calculated as in Expression 4.
   LD = ε × LAm   ... (Expression 4)
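A direct transcription of Expressions 1 to 4 into code, assuming the per-block distances and weights for the partial subject 400 are available as equal-length arrays; the array representation and the example numbers are assumptions made for illustration.

```python
import numpy as np

def weighted_average(distances, weights):
    """Expressions 1 and 2: LAm or LAs as (1/n) times the sum of weighted per-block distances."""
    distances = np.asarray(distances, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return float(np.sum(weights * distances) / distances.size)

def distance_difference(lam, las, gamma=1.0, delta=1.0):
    """Expression 3: LD = gamma * LAm - delta * LAs."""
    return gamma * lam - delta * las

def distance_difference_from_lam(lam, epsilon=1.0):
    """Expression 4: LD = epsilon * LAm."""
    return epsilon * lam

# Per-block values for the partial subject 400 in the compound eye region T (illustrative numbers).
lm = [10.0, 10.2, 9.8, 10.1]   # compound eye distance per block
ls = [12.0, 12.4, 11.9, 12.2]  # monocular distance per block
alpha = beta = [1.0, 1.0, 1.0, 1.0]

lam = weighted_average(lm, alpha)   # LAm
las = weighted_average(ls, beta)    # LAs
print(lam, las, distance_difference(lam, las))
```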
(Correction determination unit 23)
The correction determination unit 23 determines whether correction of the distance information of the monocular region is necessary using the input information 105 acquired from the vehicle control unit 8, and generates correction determination information 121. The correction determination information 121 may also include parameters used for the correction.
The correction determination unit 23 determines not to correct the distance information of the monocular region, for example, in the following cases: when the vehicle speed exceeds a predetermined speed, when the driving mode is a predetermined mode, when the current position of the vehicle 1 is on a highway, or when the steering angle is equal to or larger than a predetermined angle. These conditions may be used in combination. This determination reflects the fact that monitoring at short range is important during low-speed traveling. At short range, the same object appears large in the image and the parallax becomes large, so the distance accuracy is high. The risk of contact between a person and the vehicle increases in environments where people tend to be inattentive to cars, such as when driving through a city or turning right or left at an urban intersection. Conversely, on highways and on national roads where vehicles travel at high speed, the risk of a person coming into contact with the vehicle drops considerably.
The correction determination unit 23 determines the parameter used for weighting, for example, as follows. The larger the value of this parameter, the more strongly the distance information of the compound eye region is reflected in the correction. The correction determination unit 23 sets the parameter larger as the vehicle speed increases, setting it to the maximum when the vehicle speed exceeds a first predetermined value and to the minimum when the vehicle speed is below a second predetermined value. The correction determination unit 23 also judges where the vehicle 1 is traveling: for example, it sets the parameter large when traveling through an area with heavy pedestrian traffic, to a medium value on other ordinary roads, and to the minimum on expressways. The correction determination unit 23 further sets the parameter larger as the steering angle of the vehicle 1 increases and smaller as the steering angle decreases. These conditions may be combined, as in the sketch below.
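The following sketch illustrates one way such a weighting parameter could be derived from the vehicle speed, the type of road, and the steering angle. All thresholds, the 0..1 scale, and the rule for combining the three terms are assumptions; the patent only states the qualitative behaviour and that the conditions may be combined.

```python
def speed_term(speed_kmh, v_low=20.0, v_high=60.0):
    """Larger value at higher speed, clamped between the two predetermined
    values (v_low and v_high are assumed thresholds)."""
    if speed_kmh <= v_low:
        return 0.0
    if speed_kmh >= v_high:
        return 1.0
    return (speed_kmh - v_low) / (v_high - v_low)


def location_term(road_type):
    """Busy pedestrian areas get the largest value, ordinary roads a medium
    value, expressways the minimum; the 0..1 mapping is an assumption."""
    return {"pedestrian_area": 1.0, "general_road": 0.5, "expressway": 0.0}[road_type]


def steering_term(steering_deg, max_deg=45.0):
    """Larger steering angle gives a larger value (max_deg is assumed)."""
    return min(abs(steering_deg) / max_deg, 1.0)


def correction_parameter(speed_kmh, road_type, steering_deg):
    """Combine the three conditions; averaging them is an assumption, since
    the text only states that the conditions may be combined."""
    return (speed_term(speed_kmh) + location_term(road_type)
            + steering_term(steering_deg)) / 3.0
```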
By providing the correction determination unit 23, the distance to objects around the vehicle, that is, to people, cars, parked vehicles, and the like, is measured accurately during low-speed travel and when turning right or left at an intersection, where the risk of contact between a person and the vehicle is high, so the risk of an accident can be greatly reduced. In automated driving, improving the accuracy of the distance information referenced by the automated driving system enables more correct driving decisions and prevents driving errors caused by false detections arising from erroneous distance information. Restricting the correction to low-speed travel also reduces the processing load during high-speed travel.
(Correction execution unit 21)
The correction execution unit 21 corrects the monocular distance information 102 using a different method for each type of group, based on the group information 106. The distance difference information 120 output by the distance difference information generation unit 22 is mainly used when the cross-border group G1 is processed. The correction determination information 121 is used to decide whether correction of the cross-border group G1 is necessary and to determine the correction parameters. The correction process is described below for each type of group.
(Correction execution unit 21: correction for the cross-border group)
The correction execution unit 21 corrects the monocular distance information 102 of the partial subject 401 existing in the monocular region P. This correction replaces, for example, the monocular distance information 102 of all or some of the distance information blocks of the partial subject 401 with the average value LAm of the partial subject in the compound eye region T. When replacing, predetermined pixels may be weighted; one example is to make the weight smaller as the distance from the boundary increases. The corrected monocular distance information 102, represented by the symbol LCs, is calculated as in the following Expression 5, where ζ is the weighting coefficient.
   LCs = ζ × LD   (Expression 5)
The correction execution unit 21 then outputs the corrected monocular distance information 102 as part of the corrected distance information 104.
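A short sketch of this per-block correction is given below. Letting the coefficient ζ fall off linearly with the block's distance from the compound-eye/monocular boundary is only one reading of the weighting example above, and the falloff rate is an assumption.

```python
def correct_crossborder_blocks(n_blocks, ld, zeta0=1.0, falloff=0.1):
    """Expression 5, LCs = zeta * LD, evaluated per distance information block
    of the partial subject 401.  Letting zeta shrink linearly with the block's
    distance from the compound-eye/monocular boundary illustrates the weighting
    mentioned above; the falloff rate is an assumption."""
    corrected = []
    for i in range(n_blocks):            # i: block columns away from the boundary
        zeta = max(0.0, zeta0 - falloff * i)
        corrected.append(zeta * ld)
    return corrected
```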
(Correction execution unit 21: correction for the enlarged cross-border group)
When processing the enlarged cross-border group, the correction execution unit 21 corrects the distance information as follows. The correction execution unit 21 corrects the distance information of the partial subject in the monocular region of the enlarged cross-border group in the same manner as the distance information of the monocular region of the cross-border group, and corrects the distance information of the other subject in the monocular region of the enlarged cross-border group using the distance information of the partial subject in the compound eye region. In the example shown in FIG. 3, the distance information of the partial subject 401 in the monocular region P of the enlarged cross-border group G2 is corrected in the same way as the partial subject 401 of the cross-border group G1, and the distance information of the subject 420 in the monocular region P is corrected using the distance information of the partial subject 400 in the compound eye region T.
The correction of the distance information of the subject in the monocular region using the distance information of the partial subject in the compound eye region is performed, for example, as follows. That is, the distance information of each distance information block of the subject in the monocular region is set based on the average of the distances over the entire partial subject in the compound eye region and the average of the distance information of the respective distance information blocks of the subject in the monocular region.
(Correction execution unit 21: correction for the single group)
When processing a single group, the correction execution unit 21 takes as the correction target the monocular distance information 102 of the subjects captured only in the monocular region, and corrects it using the compound eye distance information 103 of the subject belonging to the single group. In the example shown in FIG. 3, the monocular distance information 102 of the subjects 411, 412, and 413 is corrected using the compound eye distance information 103 of the subject 410 calculated from the compound eye image T and the monocular distance information 102 of the subject 410 calculated from the monocular image R. Although the subject 410 is also captured in the monocular image Q, that image is taken by a camera different from the monocular image S to be corrected, so the monocular distance information 102 obtained from the monocular image Q is not used for the correction in this case.
For example, the correction execution unit 21 calculates the difference between the compound eye distance information 103 and the monocular distance information 102 of the subject 410, and adds that difference to the monocular distance information 102 of the subjects 411, 412, and 413, as in the sketch below.
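A minimal sketch of this offset correction, assuming the distances are passed in as plain numbers and a list, is:

```python
def correct_single_group(compound_dist_410, monocular_dist_410, monocular_dists):
    """Offset correction described above: the difference between the compound
    eye and monocular distances of the subject 410 is added to the monocular
    distances of the subjects 411, 412, and 413 (the signature is assumed)."""
    offset = compound_dist_410 - monocular_dist_410
    return [d + offset for d in monocular_dists]
```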
(Group information generation unit 30)
FIG. 5 is a configuration diagram of the group information generation unit 30. The group information generation unit 30 includes a subject detection unit 32 and a group determination unit 31.
(Subject detection unit 32)
The subject detection unit 32 detects subjects using the first captured image 100 output by the first imaging unit 2 and the second captured image 101 output by the second imaging unit 3, and outputs subject detection information 130. For subject detection, either the first captured image 100 or the second captured image 101 may be used, or both may be used. Captured images taken at different times may also be used.
Since the positional relationship between the first imaging unit 2 and the second imaging unit 3, their attitudes, and their viewing angles are known, the approximate region in which the first captured image 100 and the second captured image 101 overlap can be calculated in advance. Using this prior calculation, the subject detection unit 32 determines the overlapping region of the first captured image 100 and the second captured image 101 and identifies the monocular regions and the compound eye region. The boundary between the compound eye region and the monocular regions thus becomes clear.
For subject detection, for example, pattern matching or a neural network can be used. The conditions for the subjects detected by the subject detection unit 32 may be supplied as the input information 105; these conditions include the type, size, and position of the subject. The subject detection information 130 includes the type of the subject, the size of the subject, whether the subject straddles the boundary between the compound eye region and a monocular region, whether the subject lies within a monocular region, the position of the subject, and the distance information blocks constituting the subject.
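One possible data layout for the subject detection information 130 is sketched below. The field names and types are assumptions for illustration; the patent only lists the kinds of information included.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class SubjectDetectionInfo:
    """One possible layout of the subject detection information 130; the field
    names and types are assumptions made for illustration only."""
    subject_type: str                       # e.g. "person", "vehicle"
    size: Tuple[int, int]                   # width and height in pixels
    straddles_boundary: bool                # crosses the compound-eye/monocular boundary
    in_monocular_only: bool                 # lies entirely within a monocular region
    position: Tuple[int, int]               # image coordinates of the subject
    distance_blocks: List[Tuple[int, int]] = field(default_factory=list)
```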
(Group determination unit 31)
The group determination unit 31 calculates the group information 106 based on the subject detection information 130, the compound eye distance information 103, and the input information 105. The group determination unit 31 treats a subject that straddles the boundary between the compound eye region and a monocular region as one group and classifies it as a cross-border group. When another subject exists near the cross-border group in the monocular region, the group determination unit 31 forms another group that also includes that subject and classifies it as an enlarged cross-border group. The group determination unit 31 also treats a plurality of subjects existing adjacently in the monocular region as one group and classifies it as a single group.
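A rough sketch of this grouping, using the SubjectDetectionInfo layout assumed above, is given below. The "near the cross-border group" test and its pixel threshold are assumptions, since the patent leaves that criterion open.

```python
def classify_groups(subjects, neighbor_threshold_px=50):
    """Sketch of the grouping rules above, using SubjectDetectionInfo objects.
    The nearness test and its pixel threshold are assumptions."""
    crossborder = [s for s in subjects if s.straddles_boundary]
    groups = {"G1_crossborder": crossborder, "G2_enlarged": [], "G3_single": []}
    for s in subjects:
        if not s.in_monocular_only:
            continue
        near = any(abs(s.position[0] - c.position[0]) < neighbor_threshold_px
                   for c in crossborder)
        if near:
            groups["G2_enlarged"].append(s)   # additional subject grouped with G1
        else:
            groups["G3_single"].append(s)     # subjects handled as a single group
    return groups
```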
(Flowchart)
FIG. 6 is a flowchart showing the operation of the distance calculation device 6. First, the distance calculation device 6 acquires the first captured image 100 and the second captured image 101 from the first imaging unit 2 and the second imaging unit 3 (S01). Next, the distance calculation device 6 executes S02, S03, and S04 in parallel. This is only a conceptual representation, and S02, S03, and S04 may be executed in any order. S02 is executed by the monocular distance information generation unit 10, S03 by the compound eye distance information generation unit 11, and S04 by the group information generation unit 30.
In S02, the distance calculation device 6 generates monocular distance information corresponding to the first captured image 100 using the first captured image 100, and monocular distance information corresponding to the second captured image 101 using the second captured image 101. That is, in S02 the distance information of the compound eye region is calculated twice, once from each image. In S03, the distance calculation device 6 generates the compound eye distance information of the overlapping region using the first captured image 100 and the second captured image 101. In S04, the distance calculation device 6 detects subjects, generates groups (S05), and determines whether a first group exists (S06).
When the execution of S02 and S03 is complete and the determination in S06 is affirmative, the distance calculation device 6 generates the distance difference information 120 of the first group (S07). If the first group does not exist (S06: NO), the distance calculation device 6 ends the processing.
Finally, the distance calculation device 6 determines whether the generated distance difference information 120 is larger than a predetermined value (S08). If it is larger, the correction execution unit 21 corrects the monocular distance information 102 as described above and outputs the corrected distance information 104 (S09); if it is smaller than the predetermined value, the distance calculation device 6 ends the processing. The distance calculation device 6 repeats this processing, for example, every frame.
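The per-frame flow of FIG. 6 could be sketched as below. The method names on the `calc` object are placeholders standing in for the processing units described above, not an API defined by the patent.

```python
def process_frame(calc, img1, img2, threshold):
    """Sketch of the flow of FIG. 6 (S01 to S09); img1 and img2 are the two
    captured images acquired in S01."""
    mono = calc.generate_monocular_distance(img1, img2)         # S02
    compound = calc.generate_compound_distance(img1, img2)      # S03
    groups = calc.generate_groups(img1, img2)                   # S04, S05
    first_group = groups.get("G1_crossborder")                  # S06
    if not first_group:
        return mono                      # no first group: output as-is
    ld = calc.distance_difference(first_group, mono, compound)  # S07
    if abs(ld) > threshold:                                     # S08
        mono = calc.correct(first_group, mono, ld)              # S09
    return mono                          # corrected distance information 104
```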
According to the first embodiment described above, the following operational effects are obtained.
(1) The distance calculation device 6 includes the compound eye distance information generation unit 11, which uses the output of a sensor to generate the compound eye distance information 103 as highly accurate distance information, the monocular distance information generation unit 10, which uses the output of a sensor to generate the monocular distance information 102 with lower accuracy than the compound eye distance information 103, and the monocular distance information correction unit 12, which corrects the monocular distance information 102 using the compound eye distance information 103. Therefore, the distance information of a region with low accuracy can be corrected.
(2) The compound eye distance information generation unit 11 generates the compound eye distance information 103 using the outputs of a plurality of cameras for the compound eye region where their fields of view overlap. The monocular distance information generation unit 10 generates the monocular distance information 102 of a monocular region using the output of a single camera. The distance calculation device 6 includes the group information generation unit 30, which associates one or more subjects included in the compound eye region and in a monocular region as a first group based on a predetermined condition. The monocular distance information correction unit 12 corrects the monocular distance information 102 using the compound eye distance information 103, generated by the compound eye distance information generation unit 11, of the subject belonging to the first group. Therefore, the monocular distance information 102 can be corrected using the compound eye distance information 103.
(3) The subject belonging to the first group G1 is a single subject existing across the monocular region and the compound eye region.
(4) The group information generation unit 30 associates a subject included in the monocular region that satisfies a predetermined condition, as an additional subject, with the first group to form a second group. In the example shown in FIG. 3, the subject 420, which is the additional subject, and the first group G1 are associated as the second group G2. The monocular distance information correction unit 12 corrects the monocular distance information of the additional subject using the distance information of the first group.
(5) The monocular distance information correction unit 12 calculates the difference between the compound eye distance information 103 and the monocular distance information 102 for the same region as distance difference information, and determines whether correction of the monocular distance information 102 is necessary based on the distance difference information. Therefore, when the effect of the correction is expected to be small, the correction can be omitted to reduce the processing load.
(6) The monocular distance information correction unit 12 corrects the monocular distance information of the additional subject using the compound eye distance information of the first group generated by the compound eye distance information generation unit 11 or the corrected monocular distance information generated by the monocular distance information correction unit 12.
(Modification 1)
In correcting the monocular distance of the cross-border group G1, the monocular distance information correction unit 12 may replace the monocular distance information 102 of the partial subject 401 with the compound eye distance information 103 of the partial subject 400.
(Modification 2)
In correcting the monocular distance of the separate group G3, the monocular distance information correction unit 12 may replace the monocular distance information 102 of the subjects 411, 412, and 413 existing in the monocular region with the compound eye distance information 103 of the subject 410 existing in the compound eye region.
(Modification 3)
FIG. 7 is a configuration diagram of the correction unit 20 in Modification 3. The correction determination unit 23 in this modification does not receive the input information 105 from the vehicle control unit 8. Instead, it generates the correction determination information 121, which determines the correction method for the monocular distance information 102, based on the distance difference information 120. Specifically, the correction determination unit 23 determines that the monocular distance information 102 is to be corrected when the distance difference information 120 is equal to or greater than a predetermined threshold, and that it is not to be corrected when the distance difference information 120 is less than the predetermined threshold. According to this modification, since there is no control based on the input information 105, the processing load is small.
(Modification 4)
FIG. 8 shows the calculation method of the distance difference information 120 and the correction processing of the monocular distance information 102 for the cross-border group G1 in Modification 4. The grids drawn on the partial subject 400 and the partial subject 401 in FIG. 8 are distance information blocks. In the following, a row of distance information blocks arranged in the horizontal direction of the figure within each of the partial subject 400 and the partial subject 401 is called a "distance block line". In this modification, the distance difference information 120 is generated for each distance block line.
For each distance information block line, the monocular distance information 102 in the monocular region is replaced with the distance difference information 120 generated based on the compound eye distance information 103 of the part of the compound eye region T located at the boundary between the compound eye region T and the monocular region. Alternatively, for each distance information block line, the monocular distance information 102 in the monocular region may be replaced with the difference between the compound eye distance information 103, generated for each distance information block based on the block line 700, and the monocular distance information 102.
The distance information block line 700 shown in FIG. 8 is a distance information block line within the partial subject 400 in the compound eye region T. The distance information block line 701 in the figure is a distance information block line within the partial subject 401 in the monocular region.
The distance difference information generation unit 22 generates the average value LAm for each distance information block line based on the compound eye distance information 103 within the partial subject in the compound eye region T. It also generates the average value LAs for each distance information block line based on the monocular distance information 102 within the partial subject in the monocular region P, and generates the distance difference information 120 for each distance information block line based on the average values LAm and LAs.
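A minimal sketch of this per-line calculation is shown below. The list-of-lists data layout is an assumption made for illustration.

```python
def blockline_differences(compound_lines, monocular_lines):
    """Modification 4: one distance difference per distance block line.  Each
    argument is a list of block lines, each line a list of block distances
    (the data layout is an assumption)."""
    diffs = []
    for comp_line, mono_line in zip(compound_lines, monocular_lines):
        la_m = sum(comp_line) / len(comp_line)    # per-line average, compound eye side
        la_s = sum(mono_line) / len(mono_line)    # per-line average, monocular side
        diffs.append(la_m - la_s)                 # distance difference information 120
    return diffs
```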
(Modification 5)
In the correction for the cross-border group G1, the correction execution unit 21 may perform the correction by adding the distance difference information 120 to the monocular distance information 102 before correction. In this case, predetermined pixels may be weighted. With weighting coefficients η and θ, and with Ls denoting the monocular distance information 102 and LD denoting the distance difference information 120, the corrected monocular distance information represented by the symbol LCs is calculated as in the following Expression 6 or Expression 7.
   LCs = θ × Ls + η × LD   (Expression 6)
   LCs = (1 − η) × Ls + η × LD   (Expression 7)
FIG. 9 shows a method of determining the weighting coefficient η of Expression 6 or Expression 7; here the weighting coefficient is called a "correction coefficient". The correction execution unit 21 determines the correction coefficient η according to a correction function 1100 that takes the compound eye distance information 103 as its variable. The distance indicated by the compound eye distance information 103 is classified as a short distance 1101 when it is less than L0, a medium distance 1102 when it is at least L0 and less than L1, and a long distance 1103 when it is L1 or more. When the compound eye distance information 103 is in the short distance 1101, the correction coefficient η is set to 0.0; when it is in the long distance 1103, the correction coefficient η is set to 1.0; and when it is in the medium distance 1102, the correction coefficient η is varied according to the compound eye distance information 103. L0 and L1 are predetermined thresholds.
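A minimal sketch of this coefficient determination and of the blending of Expression 7 is shown below. FIG. 9 does not specify the shape of the curve in the medium range; a linear interpolation is assumed here.

```python
def eta_from_distance(distance, l0, l1):
    """Correction coefficient eta of FIG. 9: 0.0 below L0, 1.0 at L1 or more,
    and (here assumed to be linear) interpolation in the medium range."""
    if distance < l0:
        return 0.0
    if distance >= l1:
        return 1.0
    return (distance - l0) / (l1 - l0)


def corrected_monocular(ls, ld, eta):
    """Expression 7: LCs = (1 - eta) * Ls + eta * LD."""
    return (1.0 - eta) * ls + eta * ld
```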
FIG. 10 shows an example of the correction processing of the monocular distance information 102 using the correction coefficient η shown in FIG. 9. FIG. 10(a) shows the state before correction and FIG. 10(b) the state after correction. The regions 90A and 90B indicate the partial subject, within the compound eye region T, of a subject that lies at the medium distance 1102 and straddles the compound eye region T and the monocular region P; the partial subjects 90A and 90B are the same. The regions 91A and 91B indicate the partial subject, within the monocular region P, of the same subject as the regions 90A and 90B; the partial subjects 91A and 91B are the same.
The regions 92A and 92B indicate the partial subject, within the compound eye region T, of a subject that lies at the long distance 1103 and straddles the compound eye region T and the monocular region P; the partial subjects 92A and 92B are the same. The regions 93A and 93B indicate the partial subject, within the monocular region P, of the same subject as the regions 92A and 92B. The symbols A, B, m, M, and n written in each region schematically represent the distance information of that region; regions with the same symbol have the same distance information.
Based on the compound eye distance information 103 (90A) of the partial subject within the medium distance 1102 and within the compound eye region T, the monocular distance information 102 of the partial subject in the monocular region P constituting the same subject is corrected. In contrast, the monocular distance information 102 of the same subject in the monocular region P is not corrected, regardless of the compound eye distance information 103 (92A) of the partial subject within the long distance 1103 and within the compound eye region T.
According to this modification, the following operational effect is obtained.
(7) The monocular distance information correction unit 12 determines, based on the compound eye distance information 103, whether to correct the monocular distance information 102 and which parameters to use for the correction.
(Modification 6)
The distance difference information 120 may be calculated as follows. For each distance information block unit within a distance information block line, the difference between the compound eye distance information 103 within the partial subject in the compound eye region T and the monocular distance information 102 within the partial subject in the non-compound-eye region is generated. Using these difference values, the average difference value over the partial subject in the compound eye region T is calculated, and this average difference value may be used as the distance difference information 120.
(Modification 7)
When generating the distance difference information 120, the compound eye distance information 103 of the distance information block lines above and below may also be used. One example is a vertical N-tap filter (N an integer), such as an N-tap FIR (Finite Impulse Response) filter using N pixels in the vertical direction; a sketch is given below.
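The following sketch shows one way such a vertical N-tap FIR filter could be applied to a column of distance values. The data layout and the clamped edge handling are assumptions.

```python
def vertical_fir(column, taps):
    """Modification 7: smooth one column of compound eye distance values with
    an N-tap FIR filter over vertically adjacent block lines.  `column` is the
    list of distances down the column and `taps` the N filter coefficients."""
    n, half = len(taps), len(taps) // 2
    out = []
    for i in range(len(column)):
        acc = 0.0
        for k in range(n):
            j = min(max(i + k - half, 0), len(column) - 1)   # clamp at the edges
            acc += taps[k] * column[j]
        out.append(acc)
    return out
```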
(Modification 8)
The distance difference information 120 may be the average value LAm within the distance information block line.
(Modification 9)
In the correction for the cross-border group G1, the correction execution unit 21 may replace the monocular distance information 102 of all or some of the distance information blocks within a distance information block line of the partial subject 401 with the average value LAm of the corresponding distance information block line of the partial subject in the compound eye region T. When replacing, predetermined pixels may be weighted; for example, the weight is made smaller according to the distance from the boundary between the compound eye region T and the monocular region.
(Modification 10)
In the correction for the cross-border group G1, the correction execution unit 21 may add the distance difference information 120 of a distance information block line to the monocular distance information 102 of the same distance information block line, that is, corrected monocular distance information = monocular distance information 102 + distance difference information 120. When adding, predetermined pixels may be weighted; for example, the weight is made smaller as the distance from the boundary between the compound eye region T and the monocular region increases.
(Modification 11)
FIG. 11 shows the distance difference information 120 and the correction processing of the monocular distance information 102 for the cross-border group G1 in Modification 11. In this modification, a subject 430 straddles the compound eye region T and the monocular region P; the subject 430 consists of the partial subject 401 in the monocular region P and the partial subject 400 in the compound eye region T. The average value of the compound eye distance information 103 is calculated from the distance information block column 700 of the compound eye region T located at the boundary between the compound eye region T and the monocular region P, and the monocular distance information 102 within the partial subject 401 in the monocular region P is replaced with this average value. The distance information block column 700 is located at the position of the partial subject 400 in the compound eye region T closest to the monocular region P, that is, at the boundary with the monocular region P.
The distance difference information generation unit 22 generates the average value LAm of the leftmost distance information blocks of the partial subject 400 in the compound eye region T, based on the compound eye distance information 103 within that partial subject. It also generates, based on the monocular distance information 102 within the partial subject 401 of the monocular region P, the corresponding average value LAs, and generates the distance difference information 120 for each distance information block line based on the average values LAm and LAs.
The average value LAm and the average value LAs can be, for example, the average of all or some of the distance information within the distance information blocks at the left end 700 of the partial subject 400 in the compound eye region T. In calculating the average, predetermined pixels may be weighted. The distance difference information 120 can be, for example, the difference value LD between the average value LAm and the average value LAs; when calculating this difference, the average value LAm or the average value LAs may be weighted.
(Modification 12)
The distance difference information 120 may be the average value LA described below. The average value LA is obtained by generating, for each distance information block unit, the difference between the compound eye distance information 103 within the partial subject 400 in the compound eye region T and the monocular distance information 102 within the partial subject 401 in the monocular region, and is generated based on these per-block differences within the partial subject 400 in the compound eye region T.
(Modification 13)
The distance difference information 120 may be the average value LAm within the distance information blocks at the left end 700 of the partial subject 400 in the compound eye region T.
(Modification 14)
In the correction processing for the cross-border group G1, the correction execution unit 21 may replace the monocular distance information 102 of all or some of the distance information blocks within the partial subject 401 with the average value LAm of the partial subject 400 in the compound eye region T. When replacing, predetermined pixels may be weighted; for example, the weight can be made smaller according to the distance from the boundary between the compound eye region T and the monocular region P.
(Modification 15)
In the correction processing for the cross-border group G1, the correction execution unit 21 may add the distance difference information 120 of the same distance information block to the monocular distance information 102. When adding, predetermined pixels may be weighted; for example, the weight can be made smaller according to the distance from the boundary between the compound eye region T and the monocular region P.
(Modification 16)
FIG. 12 shows the correction processing performed by the correction execution unit 21 for the enlarged cross-border group G2 in Modification 16. FIGS. 12(a) and 12(b) show two subjects arranged in the compound eye region T and the monocular region P together with their distance information; FIG. 12(a) shows the state before correction and FIG. 12(b) the state after correction.
The regions 83A and 83B indicate the partial subject, within the compound eye region T, of a subject that straddles the compound eye region T and the monocular region P; the partial subjects 83A and 83B are the same. The regions 84A and 84B indicate the partial subject, within the monocular region P, of the same straddling subject; the partial subjects 84A and 84B are the same. The regions 85A and 85B indicate a subject existing only in the monocular region P. The symbols A, n, m, M, and N1 written in each region schematically represent the distance information of that region; regions with the same symbol have the same distance information.
The correction determination unit 23 first corrects the monocular distance information 102 of the partial subject in the monocular region P constituting the same subject from m to M, based on the compound eye distance information 103 (A) of the partial subject in the compound eye region T. Next, based on the corrected monocular distance information M generated for that partial subject, it corrects the monocular distance information 102 of the other subject in the monocular region P included in the enlarged cross-border group G2 from n to N1. For example, the value of N1 is determined so as to satisfy the relationship n − N1 = m − M.
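A minimal sketch of this second correction step, assuming the distances are plain numbers, is:

```python
def correct_additional_subject(n, m, M):
    """Modification 16: after the straddling partial subject's monocular
    distance m has been corrected to M, shift the additional subject's
    distance n by the same offset, so that n - N1 = m - M holds."""
    return n - (m - M)    # N1
```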
(Modification 17)
FIG. 13 shows a method of determining the weighting coefficient of Expression 6 or Expression 7, that is, the correction coefficient η. In this modification, the correction coefficient η is determined using the traveling speed of the own vehicle included in the input information 105. When the traveling speed is equal to or less than a predetermined value F0, the correction coefficient η is set to 1.0, and when the traveling speed exceeds the predetermined value F0, the correction coefficient η is set to 0.0.
(Modification 18)
The threshold L1 shown in FIG. 9 may be variable. FIG. 14 shows the threshold L1, which is determined according to the traveling speed of the vehicle 1 included in the input information 105. When the traveling speed is equal to or less than a predetermined value F0, the threshold L1 takes its minimum value D0; when the traveling speed exceeds a predetermined value F1, the threshold L1 takes its maximum value D1; and when the traveling speed is between F0 and F1, the value of the threshold L1 increases as the traveling speed increases.
(Modification 19)
FIG. 15 illustrates the correction for the enlarged cross-border group in Modification 19; FIG. 15(a) shows the state before correction and FIG. 15(b) the state after correction. The correction determination unit 23 determines whether to correct the monocular distance information of an additional subject of the enlarged cross-border group as follows: it determines that correction is performed if the distance between the additional subject and the compound eye region is shorter than a predetermined threshold, and that correction is not performed if it is farther than the predetermined threshold.
In the example shown in FIG. 15, the subjects 88A and 89A both belong to the enlarged cross-border group. The distance from the subject 88A to the compound eye region T is the distance indicated by reference numeral 1500 and is shorter than a predetermined value DB0, so the monocular distance information of the subject 88A is corrected from n to N. The distance from the subject 89A to the compound eye region T is the distance indicated by reference numeral 1501 and is farther than the predetermined value DB0, so the monocular distance information of the subject 89B is not corrected and remains k.
According to this modification, the following operational effect is obtained.
(8) The monocular distance information correction unit 12 determines whether to correct the distance information and which correction parameters to use, based on the distance between the other subject and the boundary between the monocular image and the compound eye image.
(Modification 20)
FIG. 16 shows the configuration of the correction unit 20 in Modification 20. The correction unit 20 in this modification further includes a monocular region stay period holding unit 24 in addition to the configuration of the first embodiment. Based on the group information 106, the monocular region stay period holding unit 24 generates monocular region stay period information 122 indicating the length of time, for example the number of frames, that a subject has been present in the monocular region P, and outputs it to the correction determination unit 23.
Examples of the monocular region stay period information 122 include the number of frames or the length of time during which the subject has been continuously present in the monocular region P. When all or part of the subject exists in the compound eye region T, the monocular region stay period information 122 may be initialized to 0. When the subject first appears in a monocular region, the monocular region stay period information 122 may be initialized to its maximum possible value or to a predetermined value.
FIG. 17 shows how the correction determination unit 23 determines the correction coefficient η. In this modification, the correction coefficient η used in Expressions 6 and 7 is determined based on the monocular region stay period information 122 and reflected in the correction determination information 121. When the monocular region stay period information 122 is equal to or less than a predetermined value Ta, the correction coefficient η is set to 1.0; when it exceeds a predetermined value Tb, the correction coefficient η is set to 0.0; and when it is larger than the predetermined value Ta, the correction coefficient η is decreased as the monocular region stay period information 122 increases.
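A minimal sketch of the coefficient of FIG. 17 is shown below. The linear decrease between Ta and Tb is an assumption; the figure only shows a monotonically falling curve.

```python
def eta_from_stay_period(frames_in_monocular, ta, tb):
    """Correction coefficient of FIG. 17: 1.0 up to Ta, 0.0 beyond Tb, and a
    value that (here assumed to fall linearly) decreases between Ta and Tb."""
    if frames_in_monocular <= ta:
        return 1.0
    if frames_in_monocular > tb:
        return 0.0
    return 1.0 - (frames_in_monocular - ta) / (tb - ta)
```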
According to this modification, the following operational effect is obtained.
(9) The distance calculation device 6 includes the monocular region stay period holding unit 24, which generates the monocular region stay period information 122 indicating how long the group has existed outside the compound eye region. The monocular distance information correction unit 12 determines the distance information correction method based on the monocular region stay period information 122.
(Modification 21)
FIG. 18 is a configuration diagram of the group information generation unit 30 in Modification 21. The group information generation unit 30 further includes a monocular region stay period holding unit 33 in addition to the configuration shown in FIG. 5. Based on the subject detection information 130, the monocular region stay period holding unit 33 generates monocular region stay period information 131 indicating the length of time, for example the number of frames, that a subject has been present in the monocular region P, and outputs it to the group determination unit 31.
Examples of the monocular region stay period information 131 include the number of frames or the length of time during which the subject has been continuously present in the monocular region P. When all or part of the subject exists in the compound eye region T, the monocular region stay period information 131 may be initialized to 0. When the subject first appears in a monocular region, the monocular region stay period information 131 may be initialized to its maximum possible value or to a predetermined value.
(Modification 22)
The correction execution unit 21 may determine the method of correcting the monocular distance information according to the type of the subject and the speed of the vehicle 1. Here, the correction method refers, for example, to whether correction is performed and to the correction parameters.
FIG. 19 shows the correspondence between the type of subject and the correction method, together with the relationship between the speed of the vehicle 1 and the correction method; the figure is split into two tiers for drawing convenience. In FIG. 19, "correction ON" means that the monocular distance information is corrected and "correction OFF" means that it is not corrected. The percentages in FIG. 19 indicate that correction is performed and give the value of the parameter used for the correction; that is, a correction method with a percentage is one that uses a parameter as in Expression 6 or Expression 7. In the case of "0%", however, the correction may be omitted.
When the correction execution unit 21 uses correction method 1, it corrects the monocular distance information when the type of the subject is a person, a vehicle, or a road surface, but does not correct it when the type of the subject is a structure. When the correction execution unit 21 uses correction method 3, the behavior depends on the speed of the vehicle 1: during low-speed travel, correction is performed regardless of the type of subject; during high-speed travel, persons and vehicles are not corrected; and during medium-speed travel, persons and vehicles are corrected according to the speed.
In correction method 3, road surfaces and structures are corrected regardless of speed because they are large, continuous, stationary objects and are therefore easy to correct. Persons and vehicles, however, occupy relatively small regions and lack continuity, so they are corrected only during low-speed travel, where the need is greatest.
In correction method 4, the distance accuracy for persons is raised during low-speed travel and the distance accuracy for vehicles is raised during high-speed travel. By exploiting the fact that the environment around the vehicle, and in particular what needs to be observed, changes with speed, the overall correction processing load is reduced and the cost is lowered. Specifically, reducing the computing power required of the distance calculation device 6 makes it possible to build the distance calculation device 6 with a lower-performance processor.
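The following sketch illustrates how a speed- and subject-type-dependent selection such as correction method 3 might be expressed. The speed thresholds and the linear ramp in the medium-speed range are assumptions; FIG. 19 gives only the qualitative behaviour.

```python
def correction_weight_method3(subject_type, speed_kmh, low_kmh=20.0, high_kmh=60.0):
    """Sketch of correction method 3 of FIG. 19: road surfaces and structures
    are always corrected, persons and vehicles only at low speed, with a
    speed-dependent weight in the medium-speed range."""
    if subject_type in ("road_surface", "structure"):
        return 1.0                      # always corrected
    if speed_kmh <= low_kmh:
        return 1.0                      # low speed: persons and vehicles corrected too
    if speed_kmh >= high_kmh:
        return 0.0                      # high speed: no correction for persons/vehicles
    return 1.0 - (speed_kmh - low_kmh) / (high_kmh - low_kmh)
```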
In this modification, the correction may further be performed only when the distance indicated by the monocular distance information before correction is shorter than a predetermined distance. When the type of the subject is a person, the correction target may also be limited to persons existing within a predetermined distance range from the road surface. The road surface covers a large, continuous region, so its correction accuracy is high, and more attention needs to be paid to persons near the traveling path.
(10) The monocular distance information correction unit 12 determines the method of correcting the monocular distance information based on at least one of the traveling state of the vehicle 1 and the type of the subject. The distance calculation device 6 can therefore perform correction based on the traveling state of the vehicle and the type of subject.
-Second embodiment-
A second embodiment of the distance calculation device will be described with reference to FIGS. 20 and 21. In the following description, the same components as in the first embodiment are given the same reference numerals, and the differences are mainly described; points not specifically described are the same as in the first embodiment. In this embodiment, mainly the sensors mounted on the vehicle 1 differ from the first embodiment.
FIG. 20 is a schematic diagram of the distance calculation system S in the second embodiment. The vehicle 1 includes a first imaging unit 2 and a distance sensor 2000 as sensors for collecting surrounding information. The distance sensor 2000 is a sensor capable of acquiring distance information, for example a LiDAR (Light Detection and Ranging), a millimeter-wave radar, a TOF (Time of Flight) distance sensor, or a stereo camera. The distance sensor 2000 acquires distance information for the range shown by the solid lines, that is, the hatched region in the center of the figure. The first imaging unit 2 has as its imaging range the wide range shown by the broken lines, which includes the range in which the distance sensor 2000 acquires distance information.
The hatched region corresponds to the compound eye region T in the first embodiment. The portion of the imaging range of the first imaging unit 2 excluding the hatched region corresponds to the monocular region P and the monocular region S in the first embodiment.
FIG. 21 is a configuration diagram of the distance calculation device 6 in the second embodiment; compared with the configuration of the first embodiment, the compound eye distance information generation unit 11 is removed. The distance sensor 2000 outputs distance information 2010 to the monocular distance information correction unit 12, which treats the distance information 2010 as the compound eye distance information 103 of the first embodiment. Although the unit does not physically exist, the compound eye distance information generation unit 11 can therefore be regarded as virtually present and as outputting the input sensor output as-is as the high-accuracy distance information 103. The remaining configuration is the same as in the first embodiment.
According to the second embodiment described above, the following operation and effect are obtained.
(11) The compound-eye distance information generation unit 11 generates the compound-eye distance information 103 using the output of at least one of a TOF distance sensor, a millimeter-wave radar, and a LiDAR. The distance information of the monocular region can therefore be corrected using the information of a distance sensor with higher accuracy.
-Third embodiment-
A third embodiment of the distance calculation device will be described with reference to FIGS. 22 and 23. In the following description, the same components as in the first embodiment are denoted by the same reference numerals, and the differences are mainly described. Points not specifically described are the same as in the first embodiment. The present embodiment differs from the first embodiment mainly in the sensors mounted on the vehicle 1.
FIG. 22 is a schematic diagram of the distance calculation system S according to the third embodiment. The vehicle 1 includes a first imaging unit 2, a second imaging unit 3, a first distance sensor 2200, and a second distance sensor 2201 as sensors for collecting surrounding information. The first distance sensor 2200 and the second distance sensor 2201 are sensors capable of acquiring distance information, for example a LiDAR, a millimeter-wave radar, a TOF distance sensor, or a stereo camera.
The fields of view of the first imaging unit 2 and the second imaging unit 3 are the same, and both capture the hatched area in the center of the figure. The field of view of the first distance sensor 2200 is roughly the front left of the vehicle 1, and the field of view of the second distance sensor 2201 is roughly the front right of the vehicle 1. The viewing angles, installation positions, and orientations of all the sensors are known. The hatched area corresponds to the compound-eye region T in the first embodiment. The areas above and below the hatched area correspond to the monocular region P and the monocular region S in the first embodiment.
FIG. 23 is a configuration diagram of the distance calculation device 6 according to the third embodiment. Compared with the configuration of the first embodiment, the monocular distance information generation unit 10 is omitted. The first distance sensor 2200 and the second distance sensor 2201 output distance information 2210 and distance information 2211 to the monocular distance information correction unit 12. The monocular distance information correction unit 12 extracts the information of the necessary areas from the input distance information 2210 and distance information 2211 and treats it as the monocular distance information 102 of the first embodiment. In other words, although it does not actually exist, the monocular distance information generation unit 10 can be regarded as virtually present and as outputting the input sensor output as it is as the monocular distance information 102. The other configurations are the same as in the first embodiment.
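The extraction performed here can be sketched as below, assuming each side distance sensor delivers a list of (azimuth, distance) measurements and that the angular boundaries of the compound-eye region are known. The function name, the data layout, and the thresholds are assumptions made only for illustration.

    def extract_monocular_distance(left_points, right_points,
                                   left_boundary_deg=-20.0,
                                   right_boundary_deg=20.0):
        """Illustrative sketch: keep only the side-sensor measurements that
        fall outside the compound-eye region and treat them as the monocular
        distance information 102.

        left_points / right_points: iterables of (azimuth_deg, distance_m),
        with azimuth measured from the vehicle's forward axis (left negative).
        """
        monocular = []
        for azimuth, distance in left_points:
            if azimuth < left_boundary_deg:     # left of the stereo overlap
                monocular.append((azimuth, distance))
        for azimuth, distance in right_points:
            if azimuth > right_boundary_deg:    # right of the stereo overlap
                monocular.append((azimuth, distance))
        return monocular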
According to the third embodiment described above, the following operation and effect are obtained.
(12) The monocular distance information generation unit 10 generates the monocular distance information 102 using the output of at least one of a TOF distance sensor, a millimeter-wave radar, and a LiDAR.
-Fourth embodiment-
A fourth embodiment of the distance calculation device will be described with reference to FIGS. 24 and 25. In the following description, the same components as in the first embodiment are denoted by the same reference numerals, and the differences are mainly described. Points not specifically described are the same as in the first embodiment. The present embodiment differs from the first embodiment mainly in the viewing angles and the orientations of the cameras mounted on the vehicle 1.
FIG. 24 is a schematic diagram of the distance calculation system S according to the fourth embodiment. The vehicle 1 includes a first imaging unit 2 and a second imaging unit 3 as sensors for collecting surrounding information, as in the first embodiment. In the present embodiment, however, the viewing angles of the first imaging unit 2 and the second imaging unit 3 are widened, and their fields of view coincide. The first imaging unit 2 and the second imaging unit 3 have, for example, fisheye lenses and can image a wide range, but the amount of distortion is large in the peripheral part of the lens and the resolution is reduced there.
The area near the center of the fields of view of the first imaging unit 2 and the second imaging unit 3 is the hatched area, which corresponds to the compound-eye region T in the first embodiment. The areas of the fields of view other than the center have large distortion and low spatial resolution, and are therefore regarded as corresponding to the monocular region P and the monocular region S in the first embodiment. In other words, an area with relatively poor imaging conditions is regarded as a monocular region, and an area with relatively good imaging conditions is regarded as a compound-eye region.
FIG. 25 is a configuration diagram of the distance calculation device 6 according to the fourth embodiment. Compared with the configuration of the first embodiment, the monocular distance information generation unit 10 is omitted. The monocular distance information correction unit 12 treats only the area near the center of the compound-eye distance information 103 output by the compound-eye distance information generation unit 11 as the compound-eye distance information 103 of the first embodiment, and treats the peripheral part as the monocular distance information 102 of the first embodiment. The other configurations are the same as in the first embodiment.
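A minimal sketch of this split is given below, assuming the stereo pipeline produces a single dense depth map over the full fisheye field of view. The radial threshold and the array names are illustrative assumptions, not values given in the embodiment.

    import numpy as np

    def split_by_image_region(depth_map, center_ratio=0.5):
        """Illustrative sketch: divide a dense stereo depth map from the
        fisheye pair into a central part, treated as compound-eye distance
        information 103, and a peripheral part, treated as monocular
        distance information 102.

        depth_map    : HxW depth map covering the full field of view
        center_ratio : fraction of the maximum radius regarded as the
                       low-distortion central area
        """
        h, w = depth_map.shape
        yy, xx = np.mgrid[0:h, 0:w]
        # Radial distance of every pixel from the image center.
        radius = np.hypot(yy - (h - 1) / 2.0, xx - (w - 1) / 2.0)
        central_mask = radius <= center_ratio * radius.max()
        compound_eye_info = np.where(central_mask, depth_map, np.nan)
        monocular_info = np.where(central_mask, np.nan, depth_map)
        return compound_eye_info, monocular_info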
According to the fourth embodiment described above, the following operation and effect are obtained.
(13) The compound-eye distance information generation unit 11 generates the compound-eye distance information 103 using information of an area that the two cameras output, where the fields of view overlap and the imaging conditions are relatively good, and the monocular distance information generation unit 10 generates the monocular distance information 102 using information of an area that the two cameras output, where the fields of view overlap and the imaging conditions are worse than in the area used by the compound-eye distance information generation unit 11. The present invention can therefore also be applied to a system in which no monocular region exists.
(Modification of Fourth Embodiment)
The imaging conditions may include not only lens distortion but also the amount of light and dirt adhering to the lens. For example, when a lens with small distortion over its entire field is used, the distance information of a dark area where the amount of light is small may be treated as the monocular distance information 102, and the distance information of a brightly and clearly captured area may be treated as the compound-eye distance information 103.
-Fifth embodiment-
A fifth embodiment of the distance calculation device will be described with reference to FIG. 26. In the following description, the same components as in the first embodiment are denoted by the same reference numerals, and the differences are mainly described. Points not specifically described are the same as in the first embodiment. The present embodiment differs from the first embodiment mainly in the correspondence between functions and hardware.
FIG. 26 is a configuration diagram of the distance calculation system S according to the fifth embodiment. In the present embodiment, the monocular distance information correction unit 12 is realized by hardware different from that of the monocular distance information generation unit 10 and the compound-eye distance information generation unit 11. The camera processing device 15 includes the monocular distance information generation unit 10, the compound-eye distance information generation unit 11, and a feature information generation unit 13. The feature information generation unit 13 creates feature quantities of the input images, for example an edge image or a histogram, and outputs them as feature information 107 to the image processing unit 60. The ECU 14 includes the monocular distance information correction unit 12, the recognition processing unit 7, and the vehicle control unit 8.
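A minimal sketch of the kind of feature quantities the feature information generation unit 13 could output is given below, assuming a grayscale input image in floating point. The finite-difference edge image and the 256-bin histogram are illustrative assumptions; the embodiment does not prescribe a particular feature definition.

    import numpy as np

    def generate_feature_information(gray_image):
        """Illustrative sketch: compute simple per-image feature quantities
        (feature information 107), here an edge-magnitude image and an
        intensity histogram.

        gray_image: HxW grayscale image as a float array with values 0-255
        """
        # Horizontal and vertical gradients via simple finite differences.
        gx = np.zeros_like(gray_image)
        gy = np.zeros_like(gray_image)
        gx[:, 1:-1] = gray_image[:, 2:] - gray_image[:, :-2]
        gy[1:-1, :] = gray_image[2:, :] - gray_image[:-2, :]
        edge_image = np.hypot(gx, gy)
        # 256-bin intensity histogram of the input image.
        histogram, _ = np.histogram(gray_image, bins=256, range=(0, 256))
        return edge_image, histogram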
According to the fifth embodiment described above, the present invention can be implemented with various hardware configurations.
The embodiments and modifications described above may be combined with each other. Although various embodiments and modifications have been described above, the present invention is not limited to their contents. Other aspects conceivable within the scope of the technical idea of the present invention are also included in the scope of the present invention.
The disclosure of the following priority application is incorporated herein by reference.
Japanese Patent Application No. 2018-141891 (filed July 27, 2018)
DESCRIPTION OF SYMBOLS
1... Vehicle
2... First imaging unit
3... Second imaging unit
6... Distance calculation device
7... Recognition processing unit
8... Vehicle control unit
10... Monocular distance information generation unit
11... Compound-eye distance information generation unit
12... Monocular distance information correction unit
15... Camera processing device
20... Correction unit
21... Correction execution unit
22... Distance difference information generation unit
23... Correction determination unit
24... Monocular region stay period holding unit
30... Group information generation unit
31... Group determination unit
32... Subject detection unit
33... Monocular region stay period holding unit
100... First captured image
101... Second captured image
102... Monocular distance information
103... Compound-eye distance information
104... Corrected distance information
105... Input information
106... Group information
120... Distance difference information
121... Correction determination information
122... Monocular region stay period information
130... Subject detection information
131... Monocular region stay period information
140... Recognition information

Claims (15)

1. A distance calculation device comprising:
   a high-precision distance information generation unit that generates, using an output of a sensor, high-precision distance information that is distance information of high precision;
   a low-precision distance information generation unit that generates, using an output of a sensor, low-precision distance information of lower precision than the high-precision distance information; and
   a distance correction unit that corrects the low-precision distance information using the high-precision distance information.
2. The distance calculation device according to claim 1, wherein
   the high-precision distance information generation unit generates the high-precision distance information using output of a compound-eye region, which is output of a plurality of cameras whose fields of view overlap,
   the low-precision distance information generation unit generates the low-precision distance information of a monocular region using an output of a single camera,
   the distance calculation device further comprises a group information generation unit that associates one or more subjects included in each of the compound-eye region and the monocular region as a first group based on a predetermined condition, and
   the distance correction unit corrects the low-precision distance information using the high-precision distance information, generated by the high-precision distance information generation unit, of a subject belonging to the first group.
3. The distance calculation device according to claim 2, wherein
   the subjects belonging to the first group are the same subject existing across the monocular region and the compound-eye region.
4. The distance calculation device according to claim 3, wherein
   the group information generation unit associates a subject that is included in the monocular region and satisfies a predetermined condition, as an additional subject, with the first group to form a second group, and
   the distance correction unit corrects the low-precision distance information of the additional subject using the distance information of the first group.
5. The distance calculation device according to claim 2, wherein
   the distance correction unit calculates a difference between the high-precision distance information and the low-precision distance information for the same area as distance difference information, and determines, based on the distance difference information, whether the low-precision distance information needs to be corrected.
6. The distance calculation device according to claim 4, wherein
   the distance correction unit corrects the low-precision distance information of the additional subject using the compound-eye distance information of the first group generated by the high-precision distance information generation unit or the corrected monocular distance information generated by the distance correction unit.
7. The distance calculation device according to claim 2, wherein
   the distance correction unit determines a method of correcting the low-precision distance information based on the high-precision distance information.
8. The distance calculation device according to claim 2, wherein
   the distance correction unit determines a method of correcting the low-precision distance information based on a running state of the vehicle.
9. The distance calculation device according to claim 2, wherein
   the distance correction unit determines a method of correcting the low-precision distance information based on a type of a subject.
10. The distance calculation device according to claim 2, wherein
   the distance correction unit determines a method of distance information correction based on a distance between a subject whose low-precision distance information is to be corrected and a boundary between the monocular region and the compound-eye region.
11. The distance calculation device according to claim 2, further comprising
   an outside-compound-eye-region stay period generation unit that generates period information indicating a period during which a group exists outside the compound-eye region, wherein
   the distance correction unit determines a method of distance information correction based on the period information.
12. The distance calculation device according to claim 2, wherein
   the distance correction unit determines grouping based on at least one condition among the compound-eye distance of the first group, travel information, a type of the group, a distance from a boundary, and a period.
13. The distance calculation device according to claim 1, wherein
   the high-precision distance information generation unit generates the high-precision distance information using an output of at least one of a TOF distance sensor, a millimeter-wave radar, and a LiDAR.
14. The distance calculation device according to claim 1, wherein
   the low-precision distance information generation unit generates the low-precision distance information using an output of at least one of a TOF distance sensor, a millimeter-wave radar, and a LiDAR.
15. The distance calculation device according to claim 1, wherein
   the high-precision distance information generation unit generates the high-precision distance information using information of an area that two cameras output, where the fields of view overlap and the imaging conditions are relatively good, and
   the low-precision distance information generation unit generates the low-precision distance information using information of an area that the two cameras output, where the fields of view overlap and the imaging conditions are worse than in the area used by the high-precision distance information generation unit.
PCT/JP2019/026647 2018-07-27 2019-07-04 Distance calculation device WO2020022021A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980049441.1A CN112513571B (en) 2018-07-27 2019-07-04 Distance calculating device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-141891 2018-07-27
JP2018141891A JP7042185B2 (en) 2018-07-27 2018-07-27 Distance calculation device

Publications (1)

Publication Number Publication Date
WO2020022021A1 true WO2020022021A1 (en) 2020-01-30

Family

ID=69180414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026647 WO2020022021A1 (en) 2018-07-27 2019-07-04 Distance calculation device

Country Status (3)

Country Link
JP (1) JP7042185B2 (en)
CN (1) CN112513571B (en)
WO (1) WO2020022021A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7341079B2 (en) * 2020-02-10 2023-09-08 住友重機械工業株式会社 Distance image estimation device and control device
JP2022041219A (en) * 2020-08-31 2022-03-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Control device, distance measurement sensor, imaging device, control method, and program
JP7499140B2 (en) * 2020-10-14 2024-06-13 日立Astemo株式会社 Object Recognition Device
US20220268899A1 (en) * 2021-02-22 2022-08-25 Shenzhen Camsense Technologies Co., Ltd Ranging apparatus, lidar, and mobile robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000121319A (en) * 1998-10-15 2000-04-28 Sony Corp Image processor, image processing method and supply medium
WO2006121087A1 (en) * 2005-05-10 2006-11-16 Olympus Corporation Image processing device, image processing method, and image processing program
WO2006123615A1 (en) * 2005-05-19 2006-11-23 Olympus Corporation Distance measuring apparatus, distance measuring method and distance measuring program
JP2007263669A (en) * 2006-03-28 2007-10-11 Denso It Laboratory Inc Three-dimensional coordinates acquisition system
JP2008304202A (en) * 2007-06-05 2008-12-18 Konica Minolta Holdings Inc Method and apparatus for distance image generation and program
JP2017139631A (en) * 2016-02-04 2017-08-10 日立オートモティブシステムズ株式会社 Imaging apparatus
JP2019128153A (en) * 2018-01-19 2019-08-01 本田技研工業株式会社 Distance calculation device and vehicle control device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005043891A1 (en) * 2003-10-31 2005-05-12 Mitsubishi Denki Kabushiki Kaisha Image correcting method and imaging apparatus
EP3193134B1 (en) * 2014-09-11 2022-10-12 Hitachi Astemo, Ltd. Image processing device

Also Published As

Publication number Publication date
CN112513571A (en) 2021-03-16
JP7042185B2 (en) 2022-03-25
JP2020016628A (en) 2020-01-30
CN112513571B (en) 2023-08-25

Similar Documents

Publication Publication Date Title
WO2020022021A1 (en) Distance calculation device
JP6648411B2 (en) Processing device, processing system, processing program and processing method
EP3358302B1 (en) Travel control method and travel control device
US10956757B2 (en) Image processing device, outside recognition device
US10964217B2 (en) Travel control method and travel control apparatus
US10322674B2 (en) Display control method and display control device
US10798319B2 (en) Camera device and method for capturing a surrounding region of a vehicle in a situation-adapted manner
JP6254084B2 (en) Image processing device
US8560220B2 (en) Method and apparatus for determining a plausible lane for guiding a vehicle and an automobile
CN113998034A (en) Rider assistance system and method
US20200059613A1 (en) Camera Device and Method for Detecting a Surrounding Area of a Driver's Own Vehicle
JP2016001170A (en) Processing unit, processing program and processing method
JP2019525568A5 (en)
KR102397156B1 (en) A method of providing a camera system and driver assistance functions for photographing the surrounding area of one's vehicle
JP6699344B2 (en) Reverse vehicle detection device, reverse vehicle detection method
CN110053625B (en) Distance calculation device and vehicle control device
US20200118280A1 (en) Image Processing Device
CN103381825B (en) Use the full speed lane sensing of multiple photographic camera
JP2022152922A (en) Electronic apparatus, movable body, imaging apparatus, and control method for electronic apparatus, program, and storage medium
JP6253175B2 (en) Vehicle external environment recognition device
US20220327819A1 (en) Image processing apparatus, image processing method, and program
CN109643447B (en) Image processing apparatus and image pickup apparatus
EP3865815A1 (en) Vehicle-mounted system
JP7185571B2 (en) Viewing direction estimation device, viewing direction estimation method, and program
JPH1137752A (en) Distance detecting device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19839831

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19839831

Country of ref document: EP

Kind code of ref document: A1