WO2024009756A1 - Object identification device and object identification method


Info

Publication number
WO2024009756A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
identification
threshold value
threshold
score
Prior art date
Application number
PCT/JP2023/022771
Other languages
French (fr)
Japanese (ja)
Inventor
都 堀田
英彰 城戸
郭介 牛場
Original Assignee
Hitachi Astemo, Ltd. (日立Astemo株式会社)
Priority date
Filing date
Publication date
Application filed by Hitachi Astemo, Ltd.
Publication of WO2024009756A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems

Description

  • The present invention relates to an object identification device and an object identification method that identify the type of an object in an image.
  • As a method for identifying the type of a target object, a classifier trained on a large number of images of the specific object is generally used.
  • The classifier learns shape and texture features of the object obtained from the edges of training images and, during inference, outputs a score for an input image that indicates the likelihood that it shows the object to be identified.
  • By setting a threshold for the output score in advance and judging the object as "identified" only when the score exceeds the threshold, misidentification can be suppressed and more reliable identification can be made. This is commonly done.
  • Patent Document 1 states that "in areas frequently used by the user (specific areas described later), such as a home garage, the frequency of false detections is reduced without lowering the detection rate by learning the environment around the garage (scenery, installed objects, etc.) in advance."
  • It also states that "the person detection unit may detect a person appearing in the surrounding image using a known image analysis method; in the first embodiment, the person detection unit detects an object as a person when an index value for the object appearing in the surrounding image is equal to or greater than a first threshold value." In other words, the pedestrian detection device disclosed in Patent Document 1 detects a target object in a limited area, such as a home parking lot, when an index value (score) for the object exceeds a preset threshold.
  • However, when identifying objects from a moving vehicle, the identification target is not confined to a limited area; objects must be identified from the immediate vicinity out to far distances within the braking distance of the own vehicle. The farther away an object is, the smaller it appears, and therefore the lower the resolution of the object itself in the image. As the resolution decreases, the identification score tends to decrease, because the accuracy of extracting the object region from the image deteriorates, the outline of the object becomes unclear, and so on.
  • The present invention has been made in view of the above situation, and aims to suppress misidentification and improve object identification performance regardless of whether the distance from the moving body to the object is near or far.
  • To solve the above problem, an object identification device according to the present invention includes: a distance calculation unit that receives external world information, including an object outside the vehicle, acquired by an external world information acquisition unit mounted on the vehicle, and calculates the distance from the vehicle to the object; an identification score calculation unit that uses the external world information to obtain an identification score indicating the confidence that the object included in the external world information is of a predetermined type; and an object identification unit that identifies the object as being of the type associated with the identification score when the identification score exceeds a threshold.
  • The threshold differs depending on the distance.
  • FIG. 1 is a block diagram showing a configuration example of an object identification device according to a first embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of distance-based thresholds for object identification in the first embodiment of the present invention.
  • FIG. 3 is a flowchart showing an example of the procedure of the object identification process performed by the object identification device according to the first embodiment of the present invention.
  • FIG. 4 is a diagram showing an example of an image input to the object identification device.
  • FIG. 5 is a diagram showing an example of detecting object regions from an image input to the object identification device according to the first embodiment of the present invention.
  • FIG. 6 is a diagram showing an example of the distance, score, threshold, and identification result at the time when the object identification process according to the first embodiment of the present invention is completed.
  • FIG. 7 is a diagram showing an example of distance-based thresholds, scores, and identification results when the object identification process according to the first embodiment of the present invention is applied to a plurality of objects.
  • FIG. 8 is a diagram showing an example of scores and identification results when a conventional threshold that is fixed regardless of distance is used.
  • FIG. 9 is a block diagram showing a configuration example of an object identification device according to a second embodiment of the present invention.
  • FIG. 10 is a diagram illustrating a configuration example of a database of identification targets acquired in advance in the second embodiment of the present invention.
  • FIG. 11 is a diagram showing an example of plotting the relationship between distance and score for each element of the identification target database in the object identification device according to the second embodiment of the present invention.
  • FIG. 12 is a diagram showing, for each element of the identification target database in the second embodiment of the present invention, the threshold obtained from the distance, the identification result based on that threshold, and a marking of whether the identification result is correct or incorrect.
  • FIG. 13 is a block diagram showing a configuration example of an object identification device according to a third embodiment of the present invention.
  • FIG. 14 is a diagram showing an example of distance-based thresholds, scores, and identification results for each braking distance when the object identification process according to the third embodiment of the present invention is applied to a plurality of objects.
  • FIG. 15 is a diagram showing an example of the relationship between the distance to the object and the threshold in a modification of the third embodiment of the present invention.
  • FIG. 16 is a block diagram showing a configuration example of an object identification device according to a fourth embodiment of the present invention.
  • FIG. 17 is a block diagram showing an example of the hardware configuration of the object identification devices according to the first to fourth embodiments of the present invention.
  • FIG. 1 is a block diagram showing a configuration example of an object identification device according to a first embodiment.
  • As shown in FIG. 1, the object identification device 2 includes an object area detection unit 21, an object distance calculation unit 22, an identification score calculation unit 23, a distance-based identification method storage unit 24, a distance-based identification method selection unit 25, and an object identification unit 26.
  • The object area detection unit 21 detects, from the input image, an area in which an object appears (hereinafter, "object area"). This is a process of extracting an object area consisting of a group of pixels from the image, and various means can be used. For example, when a stereo camera is used to acquire the images, the distance of each pixel can be determined from the parallax between pixels with similar features in the left and right images, and an object area can be obtained by grouping adjacent pixels (regions) whose distances are close to each other on the image.
  • the object distance calculation unit 22 calculates the distance from the stereo camera 1 to the object on the image detected by the object area detection unit 21 (hereinafter referred to as "object distance").
  • Strictly speaking, the calculated value is the distance from the stereo camera 1 to the object; the distance from the moving body, such as a vehicle, to the object may also be determined by reflecting the mounting position of the stereo camera 1 on the moving body in the calculated value.
  • the distance to an object within the camera's field of view is measured using the principle of triangulation.
  • the principle of triangulation is to calculate the distance from the camera to the object using the positional shift (parallax) between the images of the same object photographed by the left and right cameras.
  • the parallax is derived by specifying where the image of the object on one image exists on the other image.
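  • As a point of reference, the standard relationship for a rectified stereo pair (not spelled out in this text) gives the distance Z from the parallax d, the focal length f, and the baseline B between the left and right cameras:

```latex
Z = \frac{f \cdot B}{d}
```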
  • The identification score calculation unit 23 includes a classifier 231. The classifier 231 performs identification processing on the object area detected by the object area detection unit 21 and calculates a score (also referred to as an "identification score") indicating how closely the object area resembles the identification target. This score can be regarded as an index of the certainty that the object region is the identification target.
  • The classifier 231 is a machine learning model that has learned shape and texture features of the object obtained from the edges of training images; during inference, it takes the object region of the input image and outputs a score representing the certainty that it is the object to be identified.
  • the distance-based identification method storage unit 24 is a storage unit that stores score thresholds for determining "identification" for each distance to the object.
  • the distance-specific thresholds stored in the distance-specific identification method storage unit 24 will be explained using FIG. 2.
  • FIG. 2 is a diagram illustrating an example of threshold values for each distance for object identification in this embodiment.
  • In FIG. 2, the horizontal axis indicates the distance (m) to the object, and the vertical axis indicates the threshold.
  • The shorter the distance from the stereo camera 1 to the object, the higher the threshold; the farther the distance to the object, the lower the threshold.
  • the identification target is not limited to this example.
  • the distance-based identification method selection unit 25 acquires the threshold value corresponding to the distance to the object calculated by the object distance calculation unit 22 from the distance-based identification method storage unit 24 .
  • The object identification unit 26 compares the score calculated by the identification score calculation unit 23 with the distance-based threshold acquired by the distance-based identification method selection unit 25, and identifies the object of interest as the identification target if the score exceeds the threshold.
  • the identification result Ob of the object identification unit 26 is output to the vehicle control device 3 that controls the operation of a moving object such as a vehicle on which the object identification device 2 is mounted.
  • the vehicle control device 3 is an electronic control unit (ECU) that controls automatic braking of a moving object, an electronic control unit (ECU) that controls steering of a moving object, or the like.
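  • The following is a minimal sketch of how this flow can be expressed, assuming a linear distance-based threshold fitted to the two values quoted later in the worked example (0.89 at 20 m and 0.83 at 40 m); the slope and intercept are therefore assumptions, not values from the disclosure.

```python
# Minimal sketch of the first-embodiment identification flow (illustrative only).
# The linear threshold is fitted to the two anchor values given in the text
# (0.89 at 20 m, 0.83 at 40 m); its slope and intercept are assumptions.

A, B = -0.003, 0.95  # threshold = A * distance + B (cf. equation (1) in the second embodiment)

def threshold_for_distance(distance_m: float) -> float:
    """Distance-based identification method selection unit 25: threshold for a given distance."""
    return A * distance_m + B

def identify_as_pedestrian(score: float, distance_m: float) -> bool:
    """Object identification unit 26: identified when the score exceeds the distance-based threshold."""
    return score > threshold_for_distance(distance_m)

# Worked example from the text: object No. 0 (reference numeral 41) at 20 m with score 0.98
print(round(threshold_for_distance(20.0), 2))   # 0.89
print(identify_as_pedestrian(0.98, 20.0))       # True -> identified as a pedestrian
```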
  • FIG. 3 is a flowchart showing an example of a procedure for object identification processing by the object identification device 2.
  • In this example, a pedestrian is used as the identification target.
  • A stereo camera is used as the means for acquiring images of the surroundings of the own vehicle, and a distance corresponding to the position of an object on the acquired stereo image Im can be obtained.
  • The object identification device 2 starts the object identification process when the execution timing of the identification process (the current time) arrives.
  • In step S21, the object area detection unit 21 receives an image acquired by an image input unit (not shown) of the object identification device 2.
  • FIG. 4 shows an example of the image 31 input to the object identification device 2.
  • This image 31 can be considered to be one of the left and right images output by the stereo camera 1 and input to the object identification device 2.
  • The image 31 shows a pedestrian and a tree on the right side of the road, a bus stop on the left side of the road, and a crosswalk in front of the vehicle, with two more pedestrians farther down the road.
  • In step S22, the object area detection unit 21 detects object areas from the image 31 acquired in step S21.
  • FIG. 5 is a diagram showing an example of detecting an object area from the image 31 input to the object identification device 2.
  • In the example of FIG. 5, the object area detection unit 21 detects the pedestrian-like object indicated by reference numeral 41 as object No. 0, the bus-stop-like object indicated by reference numeral 42 as object No. 1, the tree-like object indicated by reference numeral 43 as object No. 2, the pedestrian-like object indicated by reference numeral 44 as object No. 3, and the pedestrian-like object indicated by reference numeral 45 as object No. 4.
  • the object (object area) of reference counter i is expressed as "object i.”
  • In step S24, the object distance calculation unit 22 acquires the distance information of object i.
  • Specifically, the object distance calculation unit 22 obtains the distance of the object i being processed from the parallax of the stereo image Im obtained by the stereo camera 1.
  • Here, it is assumed that object No. 0 (reference numeral 41 in FIG. 5) is located at a distance of "20 m" from the stereo camera 1 (own vehicle).
  • In step S25, the identification score calculation unit 23 calculates the identification score of the object i being processed.
  • The score is an index from 0 to 1 representing the certainty of being a pedestrian; the closer it is to 1, the more likely the object is a pedestrian.
  • Here, it is assumed that the score of object No. 0 (reference numeral 41) is "0.98".
  • In step S26, the distance-based identification method selection unit 25 obtains, from the distance-based thresholds (FIG. 2) stored in the distance-based identification method storage unit 24, the threshold corresponding to the distance of object i acquired in step S24.
  • The threshold corresponding to the distance "20 m" of object No. 0 is "0.89".
  • In step S27, the object identification unit 26 determines whether the score of object i exceeds the threshold. In the case of object No. 0, the score "0.98" calculated in step S25 exceeds the threshold "0.89". If the score of object i exceeds the threshold (YES in step S27), the process proceeds to step S28; if it does not (NO in step S27), the process proceeds to step S29.
  • In step S28, the object identification unit 26 determines the type of object i to be a pedestrian.
  • In step S29, the object identification unit 26 determines the type of object i to be other than a pedestrian. For example, if the score of object No. 2 (reference numeral 43) is "0.80" and its distance is "40 m", the threshold corresponding to the distance "40 m" is "0.83" (FIG. 2), so the type of object i is determined to be other than a pedestrian.
  • In step S31, the object area detection unit 21 determines whether the reference counter i has become equal to the number of object areas detected in step S22. If it has not (NO in step S31), the object area detection unit 21 determines that the identification process has not been completed for all objects and returns to step S24.
  • If it has (YES in step S31), the identification process for all objects at the current time is complete, and the process moves on to the next time.
  • the score and distance of the object detected in the image acquired from the stereo camera 1 are calculated, and the object identification process is performed.
  • FIG. 6 is a diagram showing an example of the distance, score, threshold value, and identification result at the time when the object identification process according to the present embodiment is completed.
  • The table shown in FIG. 6 includes an "object number" field 51, an "object code" field 52, an "object type" field 53, a "distance" field 54, a "score" field 55, a "threshold" field 56, an "identification result" field 57, and an "identification result correct/incorrect" field 58.
  • “Object No.” is, for example, information that uniquely identifies a record in this table.
  • The "object code" is information such as the reference numeral attached to each object on the image in order to identify the object (object area) in FIG. 5. The object code is included to aid the explanation of the objects in the image of FIG. 5 and is not strictly necessary.
  • “Object type” is information representing the actual type of the object on the image.
  • “Distance” is information representing the distance to the object acquired in step S24.
  • “Score” is information representing the score at the time of object identification calculated in step S25.
  • “Threshold value” is information representing a threshold value according to the distance to the object acquired in step S26.
  • "Identification result" is information representing the identification result determined in step S27. "Correctness of identification result" is information indicating whether the identification result is correct or incorrect, obtained by comparing the actual "object type" in field 53 with the "identification result" in field 57; it is included for explanation purposes.
  • The table of FIG. 6 lists object No. 0 (reference numeral 41 in FIG. 5) through object No. 4 (reference numeral 45).
  • For object No. 0 (reference numeral 41) shown in the first row, the actual object type is "pedestrian"; since it is located at a distance of "20 m", the outline of the object area is clear and a high identification score of "0.98" is obtained.
  • When the threshold corresponding to "20 m" is obtained from the distance-based thresholds in FIG. 2, it is "0.89", and since the score exceeds the threshold, the identification result is "pedestrian". Since an object whose correct answer is a pedestrian is identified as a "pedestrian", the identification result is "correct" (○ in FIG. 6). This is an example in which the object is close, the object area can be accurately acquired, and its outline is clear, so a high score is obtained and the threshold-based identification judgment is also made correctly.
  • For the object indicated by reference numeral 42 (object No. 1), the actual object type is "other than pedestrian" (a bus stop). Since it is at a distance of "30 m", the outline of the object area is clear, but because the shape of the bus stop is quite similar to that of a pedestrian, the score is a high value of "0.88".
  • When the threshold for a distance of "30 m" is obtained from the distance-based thresholds in FIG. 2, the score exceeds it, so the identification result is "pedestrian". Since an object whose correct answer is other than a pedestrian is identified as a "pedestrian", the identification result is "incorrect" (× in FIG. 6).
  • For the object indicated by reference numeral 43 (object No. 2), the actual object type is "other than pedestrian" (a tree); it is located at a distance of "40 m", so the outline of the object area is fairly clear, and since its shape is somewhat similar to that of a pedestrian, the score is a rather high value of "0.80".
  • When the threshold for a distance of "40 m" is obtained from the distance-based thresholds in FIG. 2, it is "0.83", and since the score does not exceed the threshold in step S27, the identification result is "other than pedestrian".
  • Since an object whose correct answer is other than a pedestrian is identified as "other than pedestrian", the identification result is "correct" (○ in FIG. 6). Because the object is relatively close, the object area is captured accurately and its outline is clear; although its shape is similar to that of a pedestrian, the threshold for short distances is set high, so the confidence of it being a pedestrian remained below the threshold and erroneous identification as a pedestrian was avoided.
  • The object indicated by reference numeral 44 (object No. 3) is an example in which the actual object type is "pedestrian" and the object is located at a distance of "70 m", so the outline of the object area is less clear than for a nearby object. Furthermore, in the detected object region indicated by reference numeral 44 in FIG. 5, not all of the region corresponding to the object has been extracted, and the head of the pedestrian lies outside the object region. When detecting a distant object, it can be difficult to accurately detect the entire object area, as in this example, because of the low resolution of the object region.
  • The classifier 231 included in the identification score calculation unit 23 has learned the outline of a pedestrian's head as a feature, so when the object area does not include part of the pedestrian's body, the identification score decreases. For these reasons, the score is a low value of "0.78".
  • When the threshold for a distance of "70 m" is obtained from the distance-based thresholds in FIG. 2, the score exceeds it, so the identification result is "pedestrian". Since an object whose correct answer is a pedestrian is identified as a "pedestrian", the identification result is "correct" (○ in FIG. 6). This is an example in which the object is far away and the object area cannot be acquired accurately, resulting in a lower score, but because the distance-based threshold is set low for distant objects, the object can still be correctly identified as a pedestrian.
  • The object indicated by reference numeral 45 (object No. 4) is an example in which the actual object type is "pedestrian" and the object is located at a distance of "80 m", so the outline of the object area is unclear compared with nearby objects. Since the resolution of the object region is low and its outline is unclear, the score is calculated as a low value of "0.75". When the threshold for a distance of "80 m" is obtained from the distance-based thresholds in FIG. 2, the score exceeds it, so the identification result is "pedestrian".
  • Since an object whose correct answer is a pedestrian is identified as a "pedestrian", the identification result is "correct" (○ in FIG. 6). This is an example in which the object is far away and the outline of the object area is unclear, resulting in a lower score, but because the distance-based threshold is set low for far-away objects, the object can still be correctly identified as a pedestrian.
  • FIG. 7 is a diagram showing an example of distance-based thresholds, scores, and identification results when the object identification process by the object identification device 2 is applied to a plurality of objects.
  • the horizontal axis indicates the distance (m) to the object, and the vertical axis indicates the threshold value.
  • numerals 41, 42, 43, 44, and 45 indicate the distance and score from the object area indicated by the same numeral on the image 31 of FIG. 5.
  • The distance-based threshold 71 is a function (an example of a distance-based threshold function) indicating the threshold with respect to the distance to an object, and its content is the same as the distance-based thresholds shown in FIG. 2.
  • In FIG. 7, object regions whose correct value is "pedestrian" and object regions whose correct value is "other than pedestrian" are plotted with different markers at their respective distances and scores.
  • A state in which the scores of reference numerals 41, 44, and 45, whose correct value is "pedestrian", exceed the distance-based threshold 71 while the scores of reference numerals 42 and 43, whose correct value is "other than pedestrian", do not exceed the threshold 71 would be a 100% correct state.
  • In FIG. 7, however, one object whose correct value is "other than pedestrian", indicated by reference numeral 42, is incorrectly identified as a pedestrian.
  • Even so, by using variable distance-based thresholds, that is, by raising the threshold at short distances and lowering it at long distances, it is possible to correctly identify 80% of the object regions.
  • FIG. 8 is a diagram showing an example of scores and identification results when a conventional constant (fixed) threshold value is used regardless of distance.
  • the horizontal axis indicates the distance (m) to the object, and the vertical axis indicates the threshold value.
  • numerals 41, 42, 43, 44, and 45 indicate the distance and score from the object area indicated by the same numeral on the image 31 of FIG. 5.
  • In FIG. 8, as in FIG. 7, object regions whose correct value is "pedestrian" and those whose correct value is "other than pedestrian" are indicated by different markers.
  • The threshold 81 is set to a constant value of "0.9", the threshold 82 to a constant value of "0.79", and the threshold 83 to a constant value of "0.74".
  • If the threshold is set to a high value of "0.9", as with threshold 81, the object of reference numeral 41, whose correct answer is pedestrian, is identified as a "pedestrian" because its score exceeds the threshold, which is correct. Reference numerals 42 and 43, which are not pedestrians, are identified as "other than pedestrian" because their scores do not exceed the threshold, so these two cases are also correct. On the other hand, reference numerals 44 and 45, which are pedestrians located far away, are identified as "other than pedestrian" because their scores do not exceed the threshold, and are therefore incorrect. In this way, setting a fixed threshold to a high value works well for distinguishing "pedestrians" from "non-pedestrians" near the vehicle, but fails for distant pedestrians.
  • As a result, the accuracy rate for the five objects is 60%, which is lower than in the present embodiment.
  • If the threshold is set to an intermediate value of "0.79", as with threshold 82, the score of reference numeral 41, whose correct answer is "pedestrian", exceeds the threshold and the object is identified as a "pedestrian", which is correct.
  • However, reference numerals 42 and 43, which are not pedestrians, are also determined to be "pedestrians" because their scores exceed the threshold, so these two cases are misidentified.
  • Furthermore, reference numerals 44 and 45, which are pedestrians at long distances, are identified as "other than pedestrian" because their scores do not exceed the threshold, which is also erroneous. In this way, if a fixed threshold is set to an intermediate value, identification works poorly for objects at both short and long distances. As a result, the accuracy rate for the five objects is 20%.
  • If the threshold is set to a low value of "0.74", as with threshold 83, the scores of all of reference numerals 41, 44, and 45, whose correct answer is "pedestrian", exceed the threshold, and they are identified as "pedestrian".
  • However, reference numerals 42 and 43, whose correct answer is other than pedestrian, are also incorrectly identified as "pedestrian" because their scores exceed the threshold.
  • In this way, setting a fixed threshold to a low value correctly identifies distant pedestrian object regions with low scores, but has the problem that nearby non-pedestrian object regions with high scores are also identified as pedestrians. As a result, the accuracy rate for the five objects is 60%.
  • Therefore, as shown in FIG. 7, it is effective to make the identification decision using a threshold that is set to a high value at short distances and a low value at long distances.
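  • The accuracy figures quoted above can be reproduced with the following sketch, using the five example objects of FIG. 5 and FIG. 6; the distance-based threshold reused here is the fitted linear assumption introduced earlier, not a function taken from the disclosure.

```python
# Five example objects (reference numeral, distance in m, score, correct label) from the text.
OBJECTS = [
    (41, 20, 0.98, "pedestrian"),
    (42, 30, 0.88, "other"),
    (43, 40, 0.80, "other"),
    (44, 70, 0.78, "pedestrian"),
    (45, 80, 0.75, "pedestrian"),
]

def accuracy(threshold_fn):
    """Fraction of the example objects identified correctly for a given threshold function."""
    correct = 0
    for _, distance, score, label in OBJECTS:
        predicted = "pedestrian" if score > threshold_fn(distance) else "other"
        correct += (predicted == label)
    return correct / len(OBJECTS)

# Fixed thresholds 81 / 82 / 83 of FIG. 8 -> 60%, 20%, 60%, as described above
for fixed in (0.90, 0.79, 0.74):
    print(f"fixed threshold {fixed}: accuracy {accuracy(lambda d: fixed):.0%}")

# The assumed linear distance-based threshold reaches 80% on the same objects (cf. FIG. 7)
print(f"distance-based threshold: accuracy {accuracy(lambda d: -0.003 * d + 0.95):.0%}")
```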
  • The above describes the configuration and operation of the object identification device 2 according to the first embodiment of the present invention.
  • As described above, the object identification device (object identification device 2) according to the first embodiment includes: a distance calculation unit (object distance calculation unit 22) that receives external world information (for example, the stereo image Im) including an object (object region on the image) outside the vehicle acquired by the external world information acquisition unit (for example, the stereo camera 1) mounted on the vehicle, and calculates the distance from the vehicle to the object; an identification score calculation unit (identification score calculation unit 23) that uses the external world information to calculate an identification score indicating the degree of confidence that the object is of a predetermined type; and an object identification unit (object identification unit 26) that identifies the object as being of the type associated with the identification score when the identification score exceeds a threshold.
  • The threshold differs depending on the distance from the vehicle to the object (FIG. 2).
  • With this configuration, an appropriate threshold for object identification that matches the actual situation at each distance is set.
  • FIG. 9 is a block diagram showing a configuration example of an object identification device according to the second embodiment.
  • The object identification device 2A shown in FIG. 9 has a configuration in which a distance-based identification method creation unit 91 and an identification target database (DB) 92 are added to the object identification device 2 according to the first embodiment shown in FIG. 1.
  • the distance-based identification method creation unit 91 which is an added component, will be further explained.
  • the distance-based identification method creation unit 91 is a means for calculating a distance-based threshold function stored in the distance-based identification method storage unit 24 .
  • the calculated distance-based threshold function is stored in the distance-based identification method storage unit 24 as a distance-based identification method.
  • The distance-based identification method selection unit 25 acquires the threshold obtained by inputting the distance to the object on the image, calculated by the object distance calculation unit 22, into the distance-based threshold function in the distance-based identification method storage unit 24. The object identification unit 26 then compares the score calculated by the identification score calculation unit 23 with the threshold acquired by the distance-based identification method selection unit 25 to identify the object on the image.
  • The distance-based threshold function is designed as follows. When the object is close to the vehicle, the outline of the object area in the input image is detected clearly, so the score of an object whose correct answer is pedestrian becomes high; however, objects other than pedestrians whose shapes resemble a pedestrian also receive high scores. The threshold is therefore set high to suppress misidentification.
  • When the object is far away, the score tends to be low because the outline of the object region becomes unclear or the detected region fails to extract the object properly. The threshold is therefore set low.
  • In other words, the distance-based threshold function is a downward-sloping function in which the threshold decreases as the distance increases, and any function may be used as long as it appropriately achieves this purpose.
  • Here, a description will be given assuming that the relationship between distance and threshold is expressed by a linear function, using an example in which pedestrians and non-pedestrians are identified.
  • To determine the distance-based threshold function, data in the identification target DB 92 acquired in advance is used.
  • FIG. 10 is a diagram showing a configuration example of the identification target DB 92 acquired in advance.
  • FIG. 10 shows an example of fields (items) of the identification target database 92.
  • the identification target DB 92 has an "ID" field 101, an "object type” field 102, a “distance” field 103, and a “score” field 104.
  • "ID" is an identifier for identifying the image (object) of an object region to be identified. If the correct answer is a pedestrian, the identifier is a serial number prefixed with "p"; if the correct answer is other than a pedestrian, it is a serial number prefixed with "n". In total, it is assumed that there are Np identification targets whose correct answer is pedestrian and Nn identification targets whose correct answer is other than pedestrian.
  • Object type is information representing the type of object to be identified.
  • Distance is information representing the distance to the object to be identified.
  • Score is information representing the score of the object to be identified.
  • the identification target DB 92 may be located outside the object identification device 2A and within the own vehicle, or the data corresponding to the identification target DB 92 may be acquired from outside the vehicle via a wide area network (not shown).
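  • One possible in-memory representation of such a record set is sketched below; the field names follow FIG. 10, while the sample rows are placeholders rather than data from the disclosure.

```python
# Hypothetical in-memory form of the identification target DB of FIG. 10.
from dataclasses import dataclass

@dataclass
class IdentificationSample:
    sample_id: str     # serial number prefixed with "p" (pedestrian) or "n" (other)
    object_type: str   # correct answer: "pedestrian" or "other"
    distance_m: float  # distance to the identification target
    score: float       # classifier score of the identification target

IDENTIFICATION_DB = [
    IdentificationSample("p0001", "pedestrian", 22.5, 0.97),   # placeholder row
    IdentificationSample("n0001", "other", 35.0, 0.84),        # placeholder row
    # ... Np pedestrian samples and Nn non-pedestrian samples in total
]
```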
  • FIG. 11 is a diagram showing an example of plotting the relationship between distance and score for each element of the identification target DB 92 as shown in FIG. 10 in the object identification device 2A.
  • the relationship between distance and score is shown as a score distribution every 10 m.
  • score distributions 111, 113, 115, 117, 11a, 11c, 11e, and 11g shown by solid lines are score distributions of objects whose object type is "pedestrian.”
  • Score distributions 112, 114, 116, 118, 11b, 11d, 11f, and 11h indicated by broken lines are score distributions of objects whose object type is "other than pedestrian.”
  • A variable threshold corresponding to the distance is set as the threshold 110 for the identification targets having the score distributions 111 to 11h.
  • Specifically, for each element of the identification target DB 92 shown in FIG. 10, the threshold corresponding to the "distance" in field 103 is determined.
  • The variable threshold, assuming linearity, can be defined by the following equation (1). Note that the coefficient a is a negative value in this embodiment.
  • Threshold = distance × a + b   (1)
  • FIG. 12 is a diagram showing the threshold value obtained from the distance, the result of identification using the threshold value, and the correctness of the identification result for each element of the identification target DB 92 shown in FIG. 10.
  • the table shown in FIG. 12 includes, in addition to fields 101 to 104, a field 121 for "threshold", a field 122 for "identification result", and a field 123 for "correctness of identification result”.
  • “Threshold” is information representing a threshold corresponding to the distance to the object.
  • “Identification result” is information representing the result of identification from “score” and “threshold value”.
  • “Correctness of identification result” is information indicating whether the “identification result” is correct or incorrect by comparing the actual “object type” in field 102 and the “identification result” in field 122.
  • The correct identification rate and misidentification rate obtained when the threshold 110 shown in FIG. 11 is applied to this data set can be determined using the following equations (2) and (3).
  • the correct identification rate is the rate at which an object type whose correct answer is pedestrian is identified as a pedestrian.
  • the misidentification rate is the rate at which an object type other than a pedestrian is identified as a pedestrian.
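  • Based on the verbal definitions above, equations (2) and (3) presumably take the following form (a reconstruction, with Np and Nn as defined for FIG. 10):

```latex
% Presumed form of equations (2) and (3), reconstructed from the verbal definitions above.
\begin{align}
\text{correct identification rate} &= \frac{\#\{\text{pedestrian samples identified as ``pedestrian''}\}}{N_p} \tag{2}\\
\text{misidentification rate} &= \frac{\#\{\text{non-pedestrian samples identified as ``pedestrian''}\}}{N_n} \tag{3}
\end{align}
```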
  • Using the identification target DB 92 of FIG. 10 in this way, it is possible to determine a threshold function that minimizes misidentification while ensuring the correct identification rate. By storing as much information about possible identification targets as possible in the identification target DB 92 shown in FIG. 10, distance-based thresholds that match the actual usage of the vehicle can be calculated.
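  • A brute-force way to carry out such a search is sketched below; this is an assumed approach reusing the IdentificationSample records sketched earlier, not the method specified in the disclosure.

```python
def rates(samples, a, b):
    """Correct identification rate and misidentification rate for threshold = a * distance + b."""
    pedestrians = [s for s in samples if s.object_type == "pedestrian"]
    others = [s for s in samples if s.object_type != "pedestrian"]
    correct = sum(s.score > a * s.distance_m + b for s in pedestrians) / max(len(pedestrians), 1)
    misid = sum(s.score > a * s.distance_m + b for s in others) / max(len(others), 1)
    return correct, misid

def fit_distance_threshold(samples, min_correct_rate=0.95):
    """Choose (a, b) of equation (1) that minimizes the misidentification rate
    while keeping the correct identification rate at or above min_correct_rate."""
    best = None
    for a in (x / 1000 for x in range(-10, 1)):        # negative slopes, per the text
        for b in (x / 100 for x in range(50, 100)):    # assumed intercept range 0.50 .. 0.99
            correct, misid = rates(samples, a, b)
            if correct >= min_correct_rate and (best is None or misid < best[0]):
                best = (misid, a, b)
    return best  # (misidentification rate, a, b), or None if the constraint cannot be met

# Example: best = fit_distance_threshold(IDENTIFICATION_DB)
```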
  • The above describes the configuration and operation of the object identification device 2A according to the second embodiment of the present invention.
  • As described above, in the object identification device (object identification device 2A) according to the present embodiment, information on objects whose correct values are known is stored, per distance, in an image database (identification target DB 92), and a threshold setting function (for example, threshold 110) with which the correct identification rate is at least a certain value and misidentification is minimized is used as the distance-based threshold; the object identification unit (object identification unit 26) obtains the threshold by inputting the distance to the object into this function.
  • In the above description, the relationship between the distance to the target and the threshold is expressed by a linear function, but the relationship is not necessarily limited to this.
  • Other relationships between distance and threshold can also be used.
  • For example, the relationship between the distance to the object and the threshold may be expressed by a nonlinear function that satisfies the threshold conditions of the above embodiments.
  • Alternatively, the relationship between the distance to the target and the threshold may be defined in the form of a table or map data.
  • In a third embodiment, the object identification device 2 (FIG. 1) according to the first embodiment is provided with a function of determining the threshold in consideration of the braking distance.
  • FIG. 13 is a block diagram showing a configuration example of an object identification device according to the third embodiment.
  • The object identification device 2B shown in FIG. 13 adds a speed detection unit 131 and a road surface condition detection unit 132 to the object identification device 2 according to the first embodiment shown in FIG. 1, replaces the distance-based identification method storage unit 24 with a braking distance/distance-based identification method storage unit 133, and replaces the distance-based identification method selection unit 25 with a braking distance/distance-based identification method selection unit 134. That is, the difference from the first embodiment is that a function of the distance to the object and the threshold is provided for each braking distance.
  • the main form of use of the present invention is to install it on a moving object such as a vehicle, identify an object in front that is at risk of collision, and automatically apply the brakes. That is, the present invention is suitable for application to vehicle control for advanced driver assistance systems (ADAS) and autonomous driving (AD).
  • Braking distance varies depending on the speed of the vehicle and road surface conditions. What is important in operating automatic braking is to reliably identify objects within a braking distance range that can be calculated based on the vehicle's current speed and road surface conditions. Therefore, the braking distance is calculated based on the speed of the own vehicle and the road surface condition acquired by the speed detection section 131 and the road surface condition detection section 132. Then, a distance-specific threshold value that minimizes the false identification rate while ensuring a correct identification rate within the braking distance range is calculated and stored in the braking distance/distance-based identification method storage unit 133.
  • the speed detection unit 131 can be configured using, for example, a vehicle speed sensor mounted on the vehicle.
  • the speed detection unit 131 outputs information on the detected speed of the own vehicle to the braking distance/distance-based identification method selection unit 134.
  • the road surface condition detection unit 132 determines the road surface condition from an image in front of the vehicle inputted from, for example, the stereo camera 1. For example, information on road surface conditions includes road surface irregularities, bumps, wet conditions, frozen conditions, and the like.
  • the road surface condition detection section 132 outputs information on the detected road surface condition to the braking distance/distance-specific identification method selection section 134.
  • the braking distance/distance-based identification method storage unit 133 stores distance-based threshold values for each braking distance as shown in FIG. The contents of FIG. 14 will be described later.
  • the braking distance/distance-specific identification method selection unit 134 calculates the braking distance of the own vehicle based on the speed of the own vehicle detected by the speed detection unit 131 and the road surface condition detected by the road surface condition detection unit 132.
  • the braking distance is influenced by the weight of the own vehicle and the performance and specifications of the automatic brake, but such information can be prepared in advance and stored in the object identification device.
  • the braking distance can be calculated using a machine learning model (not shown) that is trained to output the braking distance by inputting the speed of the own vehicle and the road surface condition.
  • Other information such as the weight of the vehicle and the performance and specifications of automatic braking are reflected in the parameters of the machine learning model in advance.
  • the braking distance/distance-based identification method selection unit 134 uses this machine learning model to infer the braking distance from the speed of the own vehicle and the road surface condition. Then, the braking distance/distance-based identification method selection unit 134 acquires a threshold value corresponding to the inferred value of the braking distance from the braking distance/distance-based identification method storage unit 133 and outputs it to the object identification unit 26 .
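  • The sketch below illustrates this selection step; the disclosure describes inferring the braking distance with a trained machine learning model, so the simple physics approximation and the two threshold functions used here are stand-in assumptions only.

```python
# Illustrative braking distance / distance-based selection (third embodiment).
G = 9.81  # gravitational acceleration, m/s^2

# Assumed friction coefficients per detected road surface condition
FRICTION = {"dry": 0.7, "wet": 0.4, "frozen": 0.15}

def estimate_braking_distance(speed_kmh: float, road_condition: str) -> float:
    """Stand-in for the braking distance model: v^2 / (2 * mu * g)."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v * v / (2 * FRICTION[road_condition] * G)

def threshold_function_for(braking_distance_m: float):
    """Pick a distance-based threshold function per braking distance (values assumed).
    A short braking distance allows a shallower negative slope, as described for threshold 141."""
    if braking_distance_m <= 50:
        return lambda d: -0.001 * d + 0.92   # cf. threshold 141 in FIG. 14 (assumed values)
    return lambda d: -0.003 * d + 0.95       # cf. threshold 142 in FIG. 14 (assumed values)

braking = estimate_braking_distance(50.0, "wet")
threshold = threshold_function_for(braking)
print(f"braking distance ~{braking:.0f} m, threshold at 30 m: {threshold(30.0):.2f}")
```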
  • FIG. 14 is a diagram showing an example of distance-based thresholds, scores, and identification results for each braking distance when the object identification process according to the present embodiment is applied to a plurality of objects.
  • In FIG. 14, the horizontal axis indicates the distance (m) to the object, and the vertical axis indicates the threshold.
  • a threshold value 141 indicated by a solid straight line (linear function) is a threshold value for each distance to the object when the braking distance is "50 m”.
  • the threshold value 142 indicated by a broken straight line (linear function) is a threshold value for each distance to the object when the braking distance is "90 m”.
  • Reference numerals 41, 42, 43, 44, and 45 indicate distances and scores from the object regions indicated by the same reference numerals on the image 31 in FIG.
  • The threshold at long distances is made lower in order to identify pedestrians who are far away.
  • With the threshold 141 for the case where the speed of the own vehicle is low and the braking distance is "50 m", priority can be given to correctly identifying pedestrians closer than 50 m and to suppressing erroneous identification of non-pedestrian objects, rather than to correctly identifying pedestrians farther than 50 m, so the negative slope of the threshold can be made smaller.
  • When the threshold 141 for the braking distance of "50 m" is adopted, the pedestrian object indicated by reference numeral 41, located closer than 50 m, is correctly identified.
  • In addition, since the score of the bus stop (a non-pedestrian) indicated by reference numeral 42 at a distance of 30 m is below the threshold, it is correctly identified as other than a pedestrian, and the tree (a non-pedestrian) indicated by reference numeral 43 at a distance of 40 m is also correctly identified as other than a pedestrian because its score does not exceed the threshold.
  • In this way, in the present embodiment, the distance-based thresholds are set according to both the distance to the object and the braking distance.
  • As a result, the following effect is obtained: according to the present embodiment, by changing the distance-based threshold for each braking distance and narrowing the range to which automatic braking applies, object identification can be performed with higher accuracy within the necessary range.
  • a table or map data may be prepared in which the relationship between the distance to the object and the threshold value is registered for each braking distance.
  • the threshold value for object identification is set based on the distance to the object and the braking distance, but the invention is not limited to this example. For example, since the braking distance is greatly affected by the speed of the moving object, the threshold value may be set depending on the distance and speed.
  • In the first to third embodiments, the distance-based threshold function (threshold setting function) is a linear function, but the distance-based threshold function may also be defined by a combination of a fixed threshold and a linearly variable threshold.
  • An example in which the distance-based threshold function is defined by a combination of a fixed threshold and a linearly variable threshold will be described with reference to FIG. 15.
  • FIG. 15 is a diagram showing an example of the relationship between the distance to the object and the threshold value in a modification of the third embodiment.
  • the threshold value 151 shown in the upper part of FIG. 15 is an example of a method in which a fixed threshold value is used when the distance to the object is short, and a linear variable threshold value is used when the distance to the object is long. This may be used when it is desired to strengthen the suppression of misidentification when the distance to the object is short. It is also effective for object identification when the braking distance is short.
  • the threshold value 152 shown in the middle part of FIG. 15 is an example of a method in which a linear variable threshold value is used when the distance to the object is short, and a fixed threshold value is used when the distance to the object is long. This may be used when it is desired to strengthen the suppression of misidentification even if the distance to the object is long. It is also effective for object identification when the braking distance is long.
  • Note that different distance-based threshold functions may share fixed values; for example, the fixed value of the threshold 152 at long distance in the middle row of FIG. 15 can be made the same as the fixed value of the threshold 153 at long distance in the lower row of FIG. 15.
  • The threshold 153 shown in the lower part of FIG. 15 is an example of a method using a high fixed threshold at short distances, a linearly variable threshold at intermediate distances, and a low fixed threshold at long distances. This is a combination of the threshold 141 and the threshold 142, and is effective when the braking distance is medium. Note that the fixed value of the threshold 151 at short distance in the upper row of FIG. 15 can likewise be made the same as the fixed value of the threshold 153 at short distance in the lower row of FIG. 15.
  • Here, it is assumed that the distance is "short" when the distance to the object is shorter than a first distance, "long" when the distance to the object is longer than a second distance (first distance < second distance), and "medium" when the distance to the object is between the first distance and the second distance.
  • In each case, the distance-based threshold is set so that its value when the distance to the object is shorter than the first distance (short distance) is larger than its value when the distance to the object is longer than the second distance (long distance).
  • In other words, the distance-based threshold given by the distance-based threshold function and used for object identification may be a linearly variable value (as in the first to third embodiments) or a combination of fixed values and a linearly variable value (as in this modification of the third embodiment).
  • Alternatively, the distance-based thresholds may be set to values that vary stepwise from long distances to short distances.
  • For example, the distance-based threshold can be set to a small value for long distances, a medium value for medium distances, and a large value for short distances.
  • Of course, the distance may be divided into four or more stages instead of three. In this case, compared with the above-described embodiments, the load required to determine the distance-based thresholds (stepwise different thresholds) to be stored in the distance-based identification method storage unit 24 can be reduced.
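  • The piecewise and stepwise shapes described above can be written, for example, as follows; the break distances and threshold values are assumptions for illustration, not values from FIG. 15.

```python
# Illustrative piecewise threshold shapes for the modification of the third embodiment.
FIRST_DISTANCE = 30.0   # m; below this the distance is "short" (assumed value)
SECOND_DISTANCE = 60.0  # m; above this the distance is "long" (assumed value)

def threshold_fixed_linear_fixed(distance_m: float) -> float:
    """High fixed value at short range, linearly variable in between, low fixed value at long
    range (the shape of threshold 153 in the lower row of FIG. 15)."""
    if distance_m <= FIRST_DISTANCE:
        return 0.89
    if distance_m >= SECOND_DISTANCE:
        return 0.74
    t = (distance_m - FIRST_DISTANCE) / (SECOND_DISTANCE - FIRST_DISTANCE)
    return 0.89 + t * (0.74 - 0.89)  # linear interpolation over the medium-distance band

def threshold_stepwise(distance_m: float) -> float:
    """Stepwise variant: large / medium / small threshold for short / medium / long distance."""
    if distance_m <= FIRST_DISTANCE:
        return 0.89
    if distance_m <= SECOND_DISTANCE:
        return 0.82
    return 0.74
```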
  • In a fourth embodiment, the object identification device 2A (FIG. 9) according to the second embodiment is provided with a function of updating the distance-based thresholds based on the object identification results. An example will be explained below.
  • FIG. 16 is a block diagram showing a configuration example of an object identification device according to the fourth embodiment.
  • The object identification device 2C shown in FIG. 16 has a configuration in which an identification condition/result output unit 161 and a distance-based identification result storage unit 162 are added to the object identification device 2A according to the second embodiment shown in FIG. 9. Note that in the object identification device 2C, the identification target DB 92 of the object identification device 2A in FIG. 9 has been removed.
  • the identification condition/result output unit 161 acquires the identification results and identification conditions from the object identification unit 26 and outputs them to the distance-based identification result storage unit 162.
  • Specifically, the identification condition/result output unit 161 outputs the image of the identification target (object region) identified by the object identification unit 26, the distance to the identification target, the score, and the threshold used for the identification.
  • the distance-based identification result storage unit 162 stores information such as the image of the identification target outputted by the identification condition/result output unit 161, the distance to the identification target, the score, and the threshold value used for identification.
  • The various information regarding the identification targets stored in the distance-based identification result storage unit 162 can be used to check whether the object identification process is working well. Furthermore, by assigning an object type to each piece of stored information about the identification targets, the distance-based identification method creation unit 91 can calculate an appropriate distance-based threshold from the information stored in the distance-based identification result storage unit 162 (the image of the identification target, the distance to the identification target, and the score). The distance-based identification method creation unit 91 outputs the calculated distance-based threshold (for example, a distance-based threshold function) to the distance-based identification method storage unit 24, thereby updating the distance-based threshold.
  • the purpose of assigning the object type is to assign the correct type to the identification target identified by the object identifying unit 26. Therefore, examples of methods for assigning object types include the following methods.
  • One method is to use a classifier (object type assigning unit) whose identification performance is equal to or higher than that of a human.
  • For example, ResNet, described in Non-Patent Document 1 below, is a convolutional neural network that can handle high-dimensional features by using a network as deep as 152 layers.
  • Non-Patent Document 1: Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun, "Deep Residual Learning for Image Recognition," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016.
  • The assignment of the object type can be carried out in an offline environment separate from the vehicle by transferring the information on the identification targets stored in the distance-based identification result storage unit 162 to an external storage medium or the like. Alternatively, if a classifier configured with a high-performance network as described in Non-Patent Document 1 can be installed in the vehicle (for example, in the object identification device or another electronic control unit), the assignment can be achieved by reading the images of the identification targets stored in the distance-based identification result storage unit 162 and performing identification on them.
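  • A possible shape of this update cycle is sketched below; strong_classifier is a placeholder for the high-performance classifier that assigns the correct type, and the helpers reuse the IdentificationSample and fit_distance_threshold sketches from the second embodiment.

```python
# Sketch of the fourth-embodiment update cycle (illustrative only).
def relabel_and_refit(stored_results, strong_classifier, min_correct_rate=0.95):
    """Assign object types to stored identification results, then refit the distance-based threshold."""
    labeled = [
        IdentificationSample(
            sample_id=record["id"],
            object_type=strong_classifier(record["image"]),  # "pedestrian" or "other"
            distance_m=record["distance_m"],
            score=record["score"],
        )
        # records as stored by the distance-based identification result storage unit 162
        for record in stored_results
    ]
    # Reuse the fitting sketch from the second embodiment to update the threshold function
    return fit_distance_threshold(labeled, min_correct_rate)
```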
  • In this way, a data set in which the correct object type is assigned to the various information on identification targets output by the identification condition/result output unit 161 and stored in the distance-based identification result storage unit 162 can be created, and the distance-based identification method creation unit 91 described in the second embodiment can set a distance-based threshold for this data.
  • As described above, the object identification device according to the present embodiment includes an identification result storage unit (distance-based identification result storage unit 162) that stores information such as the image of the object (object region) identified by the object identification unit (object identification unit 26), the distance to the object, and the identification score, and a function creation unit (distance-based identification method creation unit 91) that creates a threshold setting function (distance-based threshold function) using the information stored in the identification result storage unit.
  • the types of objects to be identified and objects other than pedestrians that are likely to be misidentified vary depending on the region in which they are used, environmental conditions, frequently used speed zones, etc.
  • With the object identification device according to the present embodiment, it becomes possible to update the variable threshold to suit the driving environment of each vehicle at any time. For example, in the initial stage of operation of a system including the object identification device according to the present embodiment, a general-purpose distance-based threshold is created and used.
  • However, the actual usage of the moving body differs from user to user, that is, from vehicle to vehicle.
  • Therefore, the distance-based threshold is subsequently updated according to the actual usage situation. The update may be performed only once, or multiple times, either periodically or when certain conditions are met.
  • the above are examples of embodiments of the object identification device according to the present invention.
  • In the above embodiments, an example has been described in which a pedestrian is distinguished from non-pedestrians as the identification target, but the identification target may be a plurality of object types including pedestrians.
  • The external world information acquisition unit is not limited to a stereo camera.
  • For example, a monocular camera, a millimeter-wave radar, or the like may be used as the external world information acquisition unit.
  • FIG. 17 is a block diagram showing an example of the hardware configuration of the object identification devices 2 to 2C according to the first to fourth embodiments.
  • a computer 170 shown in FIG. 17 is hardware used as a so-called computer.
  • the computer 170 includes a CPU (Central Processing Unit) 171, a ROM (Read Only Memory) 172, a RAM (Random Access Memory) 173, a nonvolatile storage 176, and a network interface 177, each connected to a bus B.
  • The CPU 171 reads the software program codes that implement the functions of the object identification devices 2 to 2C according to the present embodiments from the ROM 172, loads them into the RAM 173, and executes them. Alternatively, the CPU 171 may read the program codes directly from the ROM 172 and execute them as they are. Note that the computer 170 may include a processing device such as an MPU (Micro-Processing Unit) instead of the CPU 171. Variables, parameters, and the like generated during arithmetic processing by the CPU 171 are temporarily written into the RAM 173. The functions of each block of the object identification devices 2 to 2C are realized by the CPU 171.
  • As the nonvolatile storage 176, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), a flexible disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a nonvolatile memory card, or the like can be used.
  • In this nonvolatile storage 176, in addition to the OS (Operating System) and various parameters, programs for operating the computer 170 and the like are recorded.
  • The functions of the distance-based identification method storage unit 24, the identification target DB 92, the braking distance/distance-based identification method storage unit 133, and the braking distance/distance-based identification method selection unit 134 of the object identification devices 2 to 2C are realized by the nonvolatile storage 176.
  • the program may be stored in the ROM 172.
  • the program is stored in the form of a computer-readable program code, and the CPU 171 sequentially executes operations according to the program code. That is, the ROM 172 or the nonvolatile storage 176 is used as an example of a computer-readable non-transitory recording medium that stores a program executed by a computer.
  • the network interface 177 is composed of a communication device or the like that controls communication with other devices.
  • the communication function of the object identification devices 2 to 2C is realized by the network interface 177.
  • each of the above-mentioned configurations, functions, processing units, etc. may be partially or entirely realized by hardware, for example, by designing an integrated circuit.
  • a broadly defined processor device such as an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit) may be used.
  • each component of the object identification device may be implemented in any hardware as long as the respective hardware can send and receive information to and from each other via a network.
  • the processing performed by a certain processing unit may be realized by one piece of hardware, or may be realized by distributed processing by a plurality of pieces of hardware.
  • Processing steps that describe time-series processing include not only processes performed chronologically in the described order, but also processes that are not necessarily performed chronologically and may instead be executed in parallel or individually (for example, object-based processing). Furthermore, the order of processing steps that describe time-series processing may be changed as long as the processing results are not affected.
  • Control lines and information lines indicated by arrows and solid lines are those considered necessary for the explanation, and not all control lines and information lines in a product are necessarily shown. In reality, almost all the components may be considered to be interconnected.
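The following is a minimal sketch of the offline object-type assignment mentioned earlier in this list, not the patented implementation. It assumes the data stored by the distance-based identification result storage unit 162 has been exported as cropped object images plus a CSV file with columns "id,image_path,distance,score", and it uses torchvision's ImageNet-pretrained ResNet-152 purely as an assumed stand-in for the high-performance classifier of Non-Patent Document 1.

```python
import csv

import torch
from PIL import Image
from torchvision import models

# An ImageNet-pretrained ResNet-152 stands in for the high-performance classifier of
# Non-Patent Document 1; in practice a model trained on the relevant object types would be used.
weights = models.ResNet152_Weights.IMAGENET1K_V2
model = models.resnet152(weights=weights).eval()
preprocess = weights.transforms()

def assign_types(csv_path: str) -> list[dict]:
    """Read each exported object-region image and attach the classifier's top-1 class index
    as the assigned object type (CSV columns "id,image_path,distance,score" are assumed)."""
    results = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            image = Image.open(row["image_path"]).convert("RGB")
            with torch.no_grad():
                logits = model(preprocess(image).unsqueeze(0))
            row["assigned_type"] = int(logits.argmax(dim=1))
            results.append(row)
    return results
```

The assigned type can then be paired with the stored distance and score to build the labeled data set used by the distance-based identification method creation unit 91.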

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The present invention comprises: a distance calculation unit which inputs external information acquired by an external information acquisition unit mounted to a vehicle and including an object outside the vehicle and which obtains the distance from the vehicle to the object; an identification score calculation unit that uses the external information to obtain an identification score indicating a degree of reliability that the object included in the external information is of a predetermined type; and an object identification unit that identifies that the object is of the type associated with the identification score when the identification score exceeds a threshold value defined in advance. The threshold value differs depending on the distance.

Description

Object identification device and object identification method
Patent Document 1: JP 2021-144478 A
When setting a threshold for object identification, misidentification of objects near the own vehicle in particular can lead to erroneous braking. For this reason, it is desirable to set a strict (high) threshold for object identification, but doing so creates the problem that the identification rate for distant identification targets decreases.
The present invention has been made in view of the above circumstances, and an object thereof is to suppress misidentification and improve object identification performance regardless of whether the distance from the moving body to the object is near or far.
In order to solve the above problem, an object identification device according to one aspect of the present invention includes: a distance calculation unit that receives external world information, acquired by an external world information acquisition unit mounted on a vehicle and including an object outside the vehicle, and obtains the distance from the vehicle to the object; an identification score calculation unit that uses the external world information to obtain an identification score indicating the degree of confidence that the object included in the external world information is of a predetermined type; and an object identification unit that identifies the object as being of the type associated with the identification score when the identification score exceeds a predetermined threshold value. The threshold value differs depending on the distance.
According to at least one aspect of the present invention, by setting appropriate object identification thresholds that match the actual conditions at each distance, misidentification can be suppressed and object identification performance can be improved regardless of whether the distance from a moving body such as a vehicle to the object is near or far.
Problems, configurations, and effects other than those described above will be made clear by the following description of the embodiments.
FIG. 1 is a block diagram showing a configuration example of an object identification device according to a first embodiment of the present invention.
FIG. 2 is a diagram showing an example of distance-based thresholds for object identification in the first embodiment of the present invention.
FIG. 3 is a flowchart showing an example of the procedure of object identification processing by the object identification device according to the first embodiment of the present invention.
FIG. 4 is a diagram showing an example of an image input to the object identification device.
FIG. 5 is a diagram showing an example of detecting object areas from an image input to the object identification device according to the first embodiment of the present invention.
FIG. 6 is a diagram showing an example of distances, scores, thresholds, and identification results at the time the object identification processing in the first embodiment of the present invention is completed.
FIG. 7 is a diagram showing an example of distance-based thresholds, scores, and identification results when the object identification processing according to the first embodiment of the present invention is applied to a plurality of objects.
FIG. 8 is a diagram showing an example of scores and identification results when a conventional constant threshold independent of distance is used.
FIG. 9 is a block diagram showing a configuration example of an object identification device according to a second embodiment of the present invention.
FIG. 10 is a diagram showing a configuration example of a database of identification targets acquired in advance in the second embodiment of the present invention.
FIG. 11 is a diagram showing an example in which each element of the identification target database is plotted by the relationship between distance and score in the object identification device according to the second embodiment of the present invention.
FIG. 12 is a diagram showing, for each element of the identification target database in the second embodiment of the present invention, the threshold obtained from the distance, the result of identification using that threshold, and whether the identification result is correct or incorrect.
FIG. 13 is a block diagram showing a configuration example of an object identification device according to a third embodiment of the present invention.
FIG. 14 is a diagram showing an example of distance-based thresholds for each braking distance, scores, and identification results when the object identification processing according to the third embodiment of the present invention is applied to a plurality of objects.
FIG. 15 is a diagram showing an example of the relationship between the distance to the target and the threshold in a modification of the third embodiment of the present invention.
FIG. 16 is a block diagram showing a configuration example of an object identification device according to a fourth embodiment of the present invention.
FIG. 17 is a block diagram showing an example of the hardware configuration of the object identification devices according to the first to fourth embodiments of the present invention.
Hereinafter, examples of modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described with reference to the accompanying drawings. In this specification and the accompanying drawings, components that are identical or have substantially the same functions are denoted by the same reference numerals, and redundant explanations are omitted.
<First embodiment>
First, the configuration of an object identification device according to a first embodiment of the present invention will be described with reference to FIG. 1.
[Configuration of object identification device]
FIG. 1 is a block diagram showing a configuration example of the object identification device according to the first embodiment. As shown in FIG. 1, the object identification device 2 includes an object area detection unit 21, an object distance calculation unit 22, an identification score calculation unit 23, a distance-based identification method storage unit 24, a distance-based identification method selection unit 25, and an object identification unit 26.
The object area detection unit 21 detects areas in which objects appear (object areas) from the stereo images Im (left and right images) acquired from the stereo camera 1 (an example of the external world information acquisition unit), which has a pair of cameras and is mounted on a moving body such as a vehicle. This is a process of detecting object areas, each composed of a cluster of pixels, from an image, and various means can be used. For example, when a stereo camera is used as the camera for acquiring images, the distance of pixels with similar features in the left and right images can be determined from the parallax. An object area can then be obtained by grouping pixels (regions) that are adjacent on the image and close to each other in distance.
The object distance calculation unit 22 (an example of the distance calculation unit) calculates the distance from the stereo camera 1 to an object in the image detected by the object area detection unit 21 (hereinafter, the "object distance"). When a stereo camera is used, the object distance can be determined from the parallax value of the same object area in the two images detected by the object area detection unit 21. In this embodiment, the calculated distance from the stereo camera 1 to the object is used as the distance from the moving body, such as a vehicle, to the object, but the mounting position of the stereo camera 1 on the moving body may be reflected in the calculated value to obtain the distance from the moving body to the object.
In general, based on the stereo images captured by a stereo camera, the distance to an object within the camera's field of view is measured by the principle of triangulation. The principle of triangulation here means calculating the distance from the camera to an object using the positional shift (parallax) between the images of the same object captured by the left and right cameras. The parallax is derived by identifying where the image of the object in one image is located in the other image.
Various methods have been proposed for deriving the parallax. For example, a classical method is block matching, in which, for an image region consisting of a plurality of pixels in one image, the image region with the lowest dissimilarity is searched for in the other image.
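As an illustration of the block-matching and triangulation steps described above, here is a minimal sketch, not the device's actual implementation. It assumes rectified grayscale left and right images as NumPy arrays, uses the sum of absolute differences as the dissimilarity measure, and converts the resulting disparity d to a distance with the standard stereo relation Z = f·B/d, where the focal length f (in pixels) and baseline B are assumed calibration values.

```python
import numpy as np

def block_match_disparity(left, right, y, x, block=7, max_disp=64):
    """Disparity (in pixels) of the block centred at (y, x) in the rectified left image,
    found by searching the same row of the right image for the block with the lowest
    sum of absolute differences (boundary handling is omitted in this sketch)."""
    h = block // 2
    template = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.float32)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp + 1):
        if x - d - h < 0:                 # candidate block would leave the image
            break
        candidate = right[y - h:y + h + 1, x - d - h:x - d + h + 1].astype(np.float32)
        cost = float(np.abs(template - candidate).sum())
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def distance_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.35):
    """Triangulation: Z = f * B / d (focal length and baseline are assumed values)."""
    return float("inf") if disparity_px == 0 else focal_px * baseline_m / disparity_px
```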
The identification score calculation unit 23 (an example of the identification score calculation unit) has a classifier 231. The classifier 231 performs identification processing on each object area detected by the object area detection unit 21 and calculates a score (sometimes referred to as an "identification score") indicating how much the object area resembles the identification target. This score can be regarded as an index of the likelihood that the object area is the identification target. For example, the classifier 231 is a machine learning model that has learned the shape and texture features of objects obtained from the edges of training images, and at inference time it outputs, for the object area of the input image, a score representing the likelihood that it is the identification target.
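The following minimal sketch illustrates how such a score in the range 0 to 1 might be produced. It is an assumption-laden stand-in for the classifier 231, not the patented model: a small untrained convolutional network whose sigmoid output is interpreted as the pedestrian-likeness score of a cropped object region.

```python
import torch
from torch import nn

class PedestrianScorer(nn.Module):
    """Toy stand-in for classifier 231: maps a cropped object region to a score in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        f = self.features(x).flatten(1)
        return torch.sigmoid(self.head(f)).squeeze(1)  # one identification score per region

# Usage: crop the object area, resize it to a fixed size, and score it.
scorer = PedestrianScorer().eval()
region = torch.rand(1, 3, 64, 32)    # a resized object-region crop (dummy data)
score = scorer(region).item()        # interpreted as pedestrian-likeness; arbitrary here, since untrained
```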
The distance-based identification method storage unit 24 is a storage means that stores, for each distance to the object, the score threshold used to determine "identified". The distance-based thresholds stored in the distance-based identification method storage unit 24 will be explained with reference to FIG. 2.
[Distance-based thresholds]
FIG. 2 is a diagram showing an example of distance-based thresholds for object identification in this embodiment. In FIG. 2, the horizontal axis indicates the distance (m) to the object, and the vertical axis indicates the threshold value. As is clear from FIG. 2, the shorter the distance from the stereo camera 1 to the object, the higher the threshold, and the farther the distance to the object, the lower the threshold. Setting object identification thresholds in this way has the effect of suppressing misidentification in which non-pedestrian objects near the own vehicle are identified as pedestrians, while still identifying distant pedestrians with low scores as pedestrians. The effect of adopting these distance-based thresholds will be made clear in the embodiments below. Although FIG. 2 shows an example of distance-based thresholds for pedestrian identification, the identification target is not limited to this example. For example, various objects such as obstacles (structures and the like) and small animals can be set as identification targets, and a distance-based threshold may be set for each identification target.
The distance-based identification method selection unit 25 acquires, from the distance-based identification method storage unit 24, the threshold corresponding to the distance to the object calculated by the object distance calculation unit 22.
The object identification unit 26 (an example of the object identification unit) compares the score calculated by the identification score calculation unit 23 with the distance-based threshold acquired by the distance-based identification method selection unit 25, and if the score exceeds the threshold, identifies the object of interest as the identification target. The identification result Ob of the object identification unit 26 is output to the vehicle control device 3, which controls the operation of the moving body, such as a vehicle, on which the object identification device 2 is mounted. For example, the vehicle control device 3 is an electronic control unit (ECU) that controls automatic braking of the moving body, an electronic control unit (ECU) that controls steering of the moving body, or the like.
[Procedure of object identification processing]
Next, the flow of object identification processing by the object identification device 2 will be explained with reference to FIG. 3.
FIG. 3 is a flowchart showing an example of the procedure of object identification processing by the object identification device 2. In this embodiment, an example is shown in which a pedestrian is the identification target. It is assumed that a stereo camera is used as the means for acquiring images of the surroundings of the own vehicle, and that the distance corresponding to the position of an object on the acquired stereo image Im can be obtained.
The object identification device 2 starts the object recognition processing when the recognition processing execution timing (current time) arrives. First, in step S21, the object area detection unit 21 receives an image acquired by an image input unit (not shown) of the object identification device 2.
FIG. 4 shows an example of an image 31 input to the object identification device 2. This image 31 can be regarded as one of the left and right images output by the stereo camera 1 and input to the object identification device 2. The image 31 shows a pedestrian and a tree on the right side of the road, a bus stop on the left side of the road, and a crosswalk and two pedestrians ahead of the own vehicle, farther down the road.
Next, in step S22, the object area detection unit 21 detects object areas from the image 31 acquired in step S21.
FIG. 5 is a diagram showing an example of detecting object areas from the image 31 input to the object identification device 2. Here, the object area detection unit 21 detects the pedestrian-like object indicated by reference numeral 41 as object No. 0, the bus-stop-like object indicated by reference numeral 42 as object No. 1, the tree-like object indicated by reference numeral 43 as object No. 2, the pedestrian-like object indicated by reference numeral 44 as object No. 3, and the pedestrian-like object indicated by reference numeral 45 as object No. 4.
Next, in step S23, the object area detection unit 21 initializes a reference counter i to 0 (i = 0) in order to identify the detected objects (object areas) one by one. The object (object area) indicated by the reference counter i is denoted "object i".
Next, in step S24, the object distance calculation unit 22 acquires the distance information of object i. The object distance calculation unit 22 obtains the distance of object i to be processed from the parallax of the stereo images Im obtained by the stereo camera 1. Assume that object No. 0, indicated by reference numeral 41 in FIG. 5, is at a distance of "20 m" from the stereo camera 1 (own vehicle).
Next, in step S25, the identification score calculation unit 23 calculates the identification score of object i to be processed. In this example, the degree of likelihood of being a pedestrian is expressed as an index from 0 to 1, with values closer to 1 indicating a greater likelihood of being a pedestrian. Here, for example, assume that the score of object No. 0, indicated by reference numeral 41, is "0.98".
Next, in step S26, the distance-based identification method selection unit 25 acquires, from the distance-based thresholds (FIG. 2) stored in the distance-based identification method storage unit 24, the threshold corresponding to the distance of object i acquired in step S24. Here, the threshold corresponding to the distance "20 m" of object No. 0 is "0.89".
Next, in step S27, the object identification unit 26 determines whether the score of object i exceeds the threshold. In the case of object No. 0, the score calculated in step S25 is "0.98", which exceeds the threshold "0.89". If the score of object i exceeds the threshold (YES in step S27), the process proceeds to step S28; if the score of object i does not exceed the threshold (NO in step S27), the process proceeds to step S29.
In step S28, the object identification unit 26 determines the type of object i to be a pedestrian.
On the other hand, in step S29, the object identification unit 26 determines the type of object i to be other than a pedestrian. For example, if the score of object No. 2, indicated by reference numeral 43, is "0.80" and its distance is "40 m", the threshold corresponding to the distance "40 m" is "0.83" (FIG. 2), so the type of object i is determined to be other than a pedestrian.
After the processing of step S28 or S29, in step S30, the object area detection unit 21 increments the object reference counter i by 1 (i = i + 1) and moves on to the identification processing for the next object i.
Next, in step S31, the object area detection unit 21 determines whether the reference counter i has become equal to the number of object areas detected in step S22. If the reference counter i is not equal to the number of object areas detected in step S22 (NO in step S31), the object area detection unit 21 concludes that the identification processing has not been completed for all objects and returns to the processing of step S24.
On the other hand, if the reference counter i has become equal to the number of object areas detected in step S22 (YES in step S31), the object area detection unit 21 concludes that the identification processing has been completed for all objects at the current time and moves on to the processing for the next time.
Through the flow described above, the scores and distances of the objects detected in the images acquired from the stereo camera 1 are calculated, and the object identification processing is performed.
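Putting steps S21 to S31 together, the following is a minimal sketch of one recognition cycle; it is illustrative only. The callables detect_object_regions and region_distance are assumed stand-ins for the object area detection unit 21 and object distance calculation unit 22, score_region for the identification score calculation unit 23, and threshold_for_distance for the distance-based selection described above (the earlier interpolation sketch could be passed in here).

```python
def identify_objects(stereo_image, detect_object_regions, region_distance,
                     score_region, threshold_for_distance):
    """One recognition cycle: detect object areas (S22), then for each area obtain the
    distance (S24) and score (S25), look up the distance-based threshold (S26), and
    decide pedestrian / non-pedestrian (S27 to S29)."""
    results = []
    regions = detect_object_regions(stereo_image)            # S22
    for i, region in enumerate(regions):                     # S23, S30, S31
        distance_m = region_distance(stereo_image, region)   # S24
        score = score_region(stereo_image, region)           # S25
        threshold = threshold_for_distance(distance_m)       # S26
        identified = score > threshold                       # S27
        results.append({                                     # S28 / S29
            "object_no": i,
            "distance_m": distance_m,
            "score": score,
            "threshold": threshold,
            "identified_as": "pedestrian" if identified else "non-pedestrian",
        })
    return results
```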
Here, FIG. 6 shows an example of the distance, score, threshold, and identification result for the objects indicated by reference numerals 41, 42, 43, 44, and 45 in FIG. 5 at the time the object identification processing of FIG. 3 is completed.
FIG. 6 is a diagram showing an example of the distance, score, threshold, and identification result at the time the object identification processing according to this embodiment is completed. The table shown in FIG. 6 has an "Object No." field 51, an "Object reference numeral" field 52, an "Object type" field 53, a "Distance" field 54, a "Score" field 55, a "Threshold" field 56, an "Identification result" field 57, and a "Correctness of identification result" field 58.
"Object No." is, for example, information that uniquely identifies a record in this table.
"Object reference numeral" is information, linked to the object No., such as the reference numeral assigned to each object in the image in order to identify the object (object area) in the image. These reference numerals are given here to explain the objects in the image of FIG. 5 and are not necessarily required.
"Object type" is information representing the actual type of the object in the image.
"Distance" is information representing the distance to the object acquired in step S24.
"Score" is information representing the score calculated at the time of object identification in step S25.
"Threshold" is information representing the threshold corresponding to the distance to the object acquired in step S26.
"Identification result" is information representing the identification result determined in step S27.
"Correctness of identification result" is information, given for explanation, indicating whether the identification result was correct or incorrect, obtained by comparing the actual "Object type" in field 53 with the "Identification result" in field 57.
Here, each of the five objects shown in FIG. 6, from object No. 0 (reference numeral 41) to object No. 4 (reference numeral 45), will be explained.
Object No. 0 (reference numeral 41), shown in the first row, has the actual object type "pedestrian". Because it is at a distance of "20 m", the outline of the object area is clear, and a high identification score of "0.98" is obtained. The threshold corresponding to "20 m", taken from the distance-based thresholds in FIG. 2, is "0.89"; since the score exceeds the threshold in the processing of step S27, the identification result is "pedestrian". Because an object whose correct answer is a pedestrian is identified as a "pedestrian", the identification result is "correct" (a circle in FIG. 6). This is an example in which, because the object is at a short distance, the object area is acquired accurately and its outline is clear, a high score is obtained as the identification result, and the threshold-based identification decision is also made correctly.
Object No. 1 (reference numeral 42), shown in the second row, has the actual object type "other than pedestrian" (a bus stop). Because it is at a distance of "30 m", the outline of the object area is clear, and because the bus stop is quite similar in shape to a pedestrian, the score is a high "0.88". The threshold for the distance "30 m", taken from the distance-based thresholds in FIG. 2, is "0.86"; since the score exceeds the threshold in the processing of step S27, the identification result is "pedestrian". Because an object whose correct answer is other than a pedestrian is identified as a "pedestrian", the identification result is "incorrect" (a cross in FIG. 6). This is an example in which the object is at a short distance, the object area is acquired accurately and its outline is clear, but because its shape closely resembles a pedestrian, a high score is obtained as the identification result and the threshold-based decision misidentifies it as a pedestrian.
Object No. 2 (reference numeral 43), shown in the third row, has the actual object type "other than pedestrian" (a tree). Because it is at a distance of "40 m", the outline of the object area is reasonably clear. Since its shape is somewhat similar to a pedestrian, the score is a somewhat high "0.80". The threshold for the distance "40 m", taken from the distance-based thresholds in FIG. 2, is "0.83"; since the score does not exceed the threshold in the processing of step S27, the identification result is "other than pedestrian". Because an object whose correct answer is other than a pedestrian is identified as "other than pedestrian", the identification result is "correct". This is an example in which the object is at a relatively short distance, the object area is acquired accurately and its outline is clear, and although the object's shape is similar to a pedestrian, the threshold for short distances is set high, so its confidence as a pedestrian falls below the threshold and misidentification as a pedestrian is avoided.
Object No. 3 (reference numeral 44), shown in the fourth row, has the actual object type "pedestrian" and is at a long distance of "70 m", so the outline of the object area is less clear than in the nearby cases. In addition, in this example the detected object area, indicated by reference numeral 44 in FIG. 5, does not capture the entire region corresponding to the object, and the head of the pedestrian falls outside the object area. When detecting a distant object, the low resolution of the object area can make it difficult to detect the entire object region accurately, as in this example. Because the classifier 231 in the identification score calculation unit 23 has also learned features such as the outline of a pedestrian's head, the pedestrian identification score drops when part of the pedestrian's body is not included in the object area. For these reasons, the score is a rather low "0.78". The threshold for the distance "70 m", taken from the distance-based thresholds in FIG. 2, is "0.74"; since the score exceeds the threshold in the processing of step S27, the identification result is "pedestrian". Because an object whose correct answer is a pedestrian is identified as a "pedestrian", the identification result is "correct". This is an example in which the object is at a long distance and the object area cannot be acquired accurately, so the score drops, but because the distance-based threshold is set low for long distances, the object can still be identified as a pedestrian.
Object No. 4 (reference numeral 45), shown in the fifth row, has the actual object type "pedestrian" and is at a long distance of "80 m", so the outline of the object area is unclear compared with the nearby cases. In this example, because the resolution of the object area is low and its outline is unclear, a rather low score of "0.75" is calculated. The threshold for the distance "80 m", taken from the distance-based thresholds in FIG. 2, is "0.71"; since the score exceeds the threshold in the processing of step S27, the identification result is "pedestrian". Because an object whose correct answer is a pedestrian is identified as a "pedestrian", the identification result is "correct". This is an example in which the object is at a long distance and the outline of the object area is unclear, so the score drops, but because the distance-based threshold is set low for long distances, the object can be correctly identified as a pedestrian.
[Example of scores and identification results according to this embodiment]
An example of the scores and identification results obtained by applying the object recognition processing of the object identification device 2 according to this embodiment to the above five objects will be explained with reference to FIG. 7.
FIG. 7 is a diagram showing an example of the distance-based threshold, the scores, and the identification results when the object identification processing by the object identification device 2 is applied to a plurality of objects. In FIG. 7, the horizontal axis indicates the distance (m) to the object, and the vertical axis indicates the threshold value. In the figure, reference numerals 41, 42, 43, 44, and 45 indicate the distances and scores of the object areas indicated by the same reference numerals in the image 31 of FIG. 5. In FIG. 7, the distance-based threshold 71 is a function indicating the threshold with respect to the distance to the object (an example of the distance-based threshold function), and has the same content as the distance-based thresholds shown in FIG. 2. The markers indicated by "▲" show the scores of object areas whose correct value is "pedestrian", and the markers indicated by "●" show the scores of object areas whose correct value is "other than pedestrian".
The 100% correct state would be one in which, for all of reference numerals 41, 44, and 45, whose correct value is "pedestrian", the score exceeds the distance-based threshold 71, and for reference numerals 42 and 43, whose correct value is "other than pedestrian", the score does not exceed the distance-based threshold 71. In the example of FIG. 7, however, the one marker whose correct value is "other than pedestrian", indicated by reference numeral 42, is misidentified as a pedestrian. Even so, by adopting a threshold that varies with distance, being high at short distances and low at long distances, 80% of the object areas can be identified correctly.
[Example of scores and identification results when the conventional technique is applied]
As a comparison with this embodiment, an example of the scores and identification results when a constant threshold independent of distance is used will be explained with reference to FIG. 8.
FIG. 8 is a diagram showing an example of scores and identification results when a conventional constant (fixed) threshold independent of distance is used. In FIG. 8, the horizontal axis indicates the distance (m) to the object, and the vertical axis indicates the threshold value. In the figure, reference numerals 41, 42, 43, 44, and 45 indicate the distances and scores of the object areas indicated by the same reference numerals in the image 31 of FIG. 5. In FIG. 8, as in FIG. 7, the markers indicated by "▲" show the scores of object areas whose correct value is "pedestrian", and the markers indicated by "●" show the scores of object areas whose correct value is "other than pedestrian". Here, three cases are taken as examples: a case where the threshold 81 is set to a constant value of "0.9", a case where the threshold 82 is set to a constant value of "0.79", and a case where the threshold 83 is set to a constant value of "0.74".
If the threshold is set to a relatively high value of "0.9", as with threshold 81, the score of reference numeral 41, whose correct answer is pedestrian, exceeds the threshold and the object is identified as a "pedestrian", which is correct. Reference numerals 42 and 43, which are not pedestrians, are identified as "other than pedestrian" because their scores do not exceed the threshold, and these two cases are also correct. On the other hand, reference numerals 44 and 45, which are pedestrians at long distances, are identified as "other than pedestrian" because their scores do not exceed the threshold, which is incorrect. Thus, setting a constant threshold to a high value works well for distinguishing between "pedestrian" and "other than pedestrian" near the own vehicle. However, for distant pedestrians, whose scores tend to be low even though they are pedestrians, a high threshold causes them to be judged as other than pedestrians and they cannot be identified, so it does not work well. As a result, with this conventional approach the accuracy rate for the above five objects is 60%, which is lower than in this embodiment.
If the threshold is set to a medium value of "0.78", as with threshold 82, the score of reference numeral 41, whose correct answer is pedestrian, exceeds the threshold and the object is identified as a "pedestrian", which is correct. However, reference numerals 42 and 43, which are not pedestrians, are also judged to be "pedestrian" because their scores are at or above the threshold, and these two cases are misidentifications. Furthermore, reference numerals 44 and 45, which are pedestrians at long distances, are identified as "other than pedestrian" because their scores do not exceed the threshold, which is also a misidentification. Thus, setting a constant threshold to an intermediate value yields identification results that do not work well for objects at either short or long distances. As a result, the accuracy rate for the five objects is 20%.
Furthermore, if the threshold is set to a relatively low value of "0.74", as with threshold 83, the scores of reference numerals 41, 44, and 45, whose correct answer is pedestrian, all exceed the threshold and the objects are identified as "pedestrian". On the other hand, reference numerals 42 and 43, whose correct answer is other than pedestrian, are also misidentified as "pedestrian" because their scores exceed the threshold. Thus, setting a constant threshold to a low value has the effect of correctly identifying distant object areas corresponding to pedestrians with low scores, but it has the problem of identifying nearby non-pedestrian object areas with high scores as pedestrians. As a result, the accuracy rate for the five objects is 60%.
In order to minimize misidentification while also correctly identifying pedestrian object areas whose scores become low at long distances, it is effective to make the identification decision using a threshold that is set high at short distances and low at long distances, as shown in FIG. 7. The above covers the configuration and operation of the object identification device 2 according to the first embodiment of the present invention.
As described above, the object identification device (object identification device 2) according to the first embodiment includes: a distance calculation unit (object distance calculation unit 22) that receives external world information (for example, the stereo images Im) including an object outside the vehicle (an object area in the image) acquired by the external world information acquisition unit (for example, the stereo camera 1) mounted on the vehicle, and obtains the distance from the vehicle to the object; an identification score calculation unit (identification score calculation unit 23) that uses the external world information to obtain an identification score indicating the degree of confidence that the object included in the external world information is of a predetermined type; and an object identification unit (object identification unit 26) that identifies the object as being of the type associated with the identification score when the identification score exceeds a predetermined threshold. Here, the threshold differs depending on the distance from the vehicle to the object (FIG. 2).
According to this embodiment with the above configuration, appropriate object identification thresholds are set that match the actual conditions at each distance. As a result, regardless of whether the distance from a moving body such as a vehicle to the object is near or far, misidentification can be suppressed and the performance (accuracy) of object identification can be improved.
<Second embodiment>
Next, as a second embodiment of the present invention, an example will be described in which the object identification device 2 (FIG. 1) according to the first embodiment is provided with a function of creating the distance-based threshold function.
[Configuration of object identification device]
FIG. 9 is a block diagram showing a configuration example of the object identification device according to the second embodiment. The object identification device 2A shown in FIG. 9 has a configuration in which a distance-based identification method creation unit 91 and an identification target database (DB) 92 are added to the object identification device 2 according to the first embodiment shown in FIG. 1.
The distance-based identification method creation unit 91, which is the added component, will be further explained. The distance-based identification method creation unit 91 is a means for calculating the distance-based threshold function stored in the distance-based identification method storage unit 24. The calculated distance-based threshold function is stored in the distance-based identification method storage unit 24 as a distance-based identification method.
The distance-based identification method selection unit 25 obtains a threshold by inputting the distance to the object in the image, calculated by the object distance calculation unit 22, into the distance-based threshold function in the distance-based identification method storage unit 24. The object identification unit 26 then compares the score calculated by the identification score calculation unit 23 with the threshold obtained by the distance-based identification method selection unit 25 to identify the object in the image.
Regarding the distance-based threshold function: as described in the first embodiment, when the object is close to the own vehicle, the outline of the object area in the input image is clearly detected, so the score of a target whose correct answer is pedestrian is high, and non-pedestrian objects similar in shape to pedestrians also receive high scores. The threshold is therefore set high in order to suppress misidentification. At long distances, on the other hand, the score tends to be low because the outline of the object area becomes unclear, the detection region does not extract the object area appropriately, and so on. The threshold is therefore set low.
The purpose of setting distance-based thresholds is to suppress misidentification while securing the correct identification rate for pedestrians at both short and long distances. Therefore, the distance-based threshold function is a downward-sloping function whose threshold decreases as the distance increases, and any function that appropriately achieves this purpose may be used. Here, the relationship between distance and threshold is described as being set by a linear function, using an example in which pedestrians are distinguished from everything else as the identification target. To set appropriate thresholds for each distance, data from the identification target DB 92 acquired in advance is used.
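The publication does not specify how the distance-based identification method creation unit 91 fits the linear function, so the following is only a minimal sketch under stated assumptions: it searches a small grid of slope/intercept pairs for the linear threshold Th(Z) = aZ + b (with a negative slope a) that maximizes accuracy on labeled (distance, score, pedestrian/non-pedestrian) records such as those in the identification target DB 92.

```python
import numpy as np

def fit_linear_threshold(distances_m, scores, is_pedestrian):
    """Grid-search a linear distance-based threshold Th(Z) = a*Z + b (a < 0) that
    maximizes accuracy on labeled records such as those of the identification target
    DB 92. The accuracy criterion and search ranges are assumptions of this sketch."""
    d = np.asarray(distances_m, dtype=float)
    s = np.asarray(scores, dtype=float)
    y = np.asarray(is_pedestrian, dtype=bool)
    best_a, best_b, best_acc = 0.0, 1.0, -1.0
    for a in np.linspace(-0.01, -0.0005, 20):        # candidate negative slopes
        for b in np.linspace(0.5, 1.2, 71):          # candidate intercepts at Z = 0
            predicted = s > (a * d + b)              # "identified as pedestrian"
            acc = float(np.mean(predicted == y))
            if acc > best_acc:
                best_a, best_b, best_acc = float(a), float(b), acc
    return best_a, best_b, best_acc

# e.g. a, b, acc = fit_linear_threshold(db_distances, db_scores, db_labels)
```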
 図10は、予め取得した識別対象DB92の構成例を示す図である。図10には、識別対象データベース92のフィールド(項目)の例が示されている。識別対象DB92は、「ID」のフィールド101、「物体種別」のフィールド102、「距離」のフィールド103、及び「スコア」のフィールド104を有する。 FIG. 10 is a diagram showing a configuration example of the identification target DB 92 acquired in advance. FIG. 10 shows an example of fields (items) of the identification target database 92. The identification target DB 92 has an "ID" field 101, an "object type" field 102, a "distance" field 103, and a "score" field 104.
"ID" is an identifier for the image (object) of an object region to be identified. When the correct answer is a pedestrian, the identifier is a serial number prefixed with "p"; when the correct answer is not a pedestrian, it is a serial number prefixed with "n". In total, there are assumed to be Np identification targets whose correct answer is a pedestrian and Nn identification targets that are not pedestrians.
"Object type" is information representing the type of the object to be identified.
"Distance" is information representing the distance to the object to be identified.
"Score" is information representing the score of the object to be identified.
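A minimal sketch of how records of the identification target DB 92 could be represented is given below; the class name, field names, and sample values are illustrative assumptions that simply mirror fields 101 to 104.

from dataclasses import dataclass

@dataclass
class IdentificationRecord:
    record_id: str     # "ID" (field 101), e.g. "p0001" or "n0001"
    object_type: str   # "object type" (field 102), e.g. "pedestrian"
    distance_m: float  # "distance" (field 103), distance to the object
    score: float       # "score" (field 104), identification score

# Hypothetical example entries: p-prefixed IDs are pedestrians,
# n-prefixed IDs are non-pedestrians.
db = [
    IdentificationRecord("p0001", "pedestrian", 12.0, 0.91),
    IdentificationRecord("n0001", "other", 30.0, 0.44),
]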
These data were obtained by actually performing the object identification processing according to the flowchart shown in FIG. 3 and identifying pedestrians and non-pedestrian targets at various distances, as illustrated in FIG. 4. These data may be fixed values, or they may be updated by performing the object identification processing as needed. The identification target DB 92 may also be located outside the object identification device 2A but within the own vehicle, or data corresponding to the identification target DB 92 may be acquired from outside the vehicle via a wide area network (not shown).
FIG. 11 is a diagram showing an example in which, in the object identification device 2A, each element of the identification target DB 92 of FIG. 10 is plotted in terms of the relationship between distance and score. For simplicity, the relationship is shown as score distributions in 10 m bins. In the figure, the score distributions 111, 113, 115, 117, 11a, 11c, 11e, and 11g drawn with solid lines belong to objects whose object type is "pedestrian", and the score distributions 112, 114, 116, 118, 11b, 11d, 11f, and 11h drawn with broken lines belong to objects whose object type is "other than pedestrian".
When the score distributions of the two object types are compared within the same distance band, pedestrians are expected to have higher scores and non-pedestrians lower scores. In addition, as described in the first embodiment, scores tend to decrease relatively as the target becomes more distant, because the outline of the object region in the image becomes unclear, the detection region does not extract the target region (object region) properly, and so on.
When a variable threshold that depends on distance, such as the threshold 110, is set for the identification targets having the score distributions 111 to 11h, a threshold corresponding to the "distance" in field 103 is obtained for each element of the identification target DB 92 shown in FIG. 10. Assuming linearity, the variable threshold can be defined by the following equation (1), where the coefficient a is a negative value in this embodiment.
Threshold = distance × a + b   ... (1)
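Equation (1) can be written directly as a small function; the example coefficients below are arbitrary assumptions for illustration, with a negative as stated above.

def linear_threshold(distance_m, a=-0.005, b=0.9):
    # Equation (1): threshold = distance * a + b, with a < 0 so that
    # the threshold decreases as the distance to the object grows.
    return a * distance_m + b

With these assumed coefficients, linear_threshold(10.0) gives 0.85 for a nearby object and linear_threshold(100.0) gives 0.40 for a distant one, reproducing the intended downward slope.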
FIG. 12 is a diagram showing, for each element of the identification target DB 92 of FIG. 10, the threshold obtained from the distance, the identification result obtained with that threshold, and whether the identification result is correct. In addition to fields 101 to 104, the table shown in FIG. 12 has a "threshold" field 121, an "identification result" field 122, and a "correctness of identification result" field 123.
"Threshold" is information representing the threshold corresponding to the distance to the object.
"Identification result" is information representing the result of identification based on the "score" and the "threshold".
"Correctness of identification result" is information indicating whether the "identification result" is correct, obtained by comparing the actual "object type" in field 102 with the "identification result" in field 122.
Once these values have been obtained, the correct identification rate and the misidentification rate obtained when the threshold 110 shown in FIG. 11 is applied to this data set can be calculated using equations (2) and (3) below. Here, the correct identification rate is the rate at which object types whose correct answer is pedestrian are identified as pedestrians, and the misidentification rate is the rate at which object types other than pedestrian are identified as pedestrians.
Correct identification rate = (number of pedestrian-type objects whose identification result is correct) ÷ Np   ... (2)
Misidentification rate = (number of non-pedestrian-type objects whose identification result is incorrect) ÷ Nn   ... (3)
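Assuming the IdentificationRecord structure sketched earlier, equations (2) and (3) could be evaluated as follows; the function name and fields are illustrative assumptions rather than part of the patent text.

def identification_rates(db, threshold_fn):
    pedestrians = [r for r in db if r.object_type == "pedestrian"]
    others = [r for r in db if r.object_type != "pedestrian"]
    # Equation (2): pedestrians whose score exceeds the distance-based
    # threshold are correctly identified.
    correct = sum(r.score > threshold_fn(r.distance_m) for r in pedestrians)
    # Equation (3): non-pedestrians whose score exceeds the threshold
    # are misidentified as pedestrians.
    wrong = sum(r.score > threshold_fn(r.distance_m) for r in others)
    correct_rate = correct / len(pedestrians) if pedestrians else 0.0
    misident_rate = wrong / len(others) if others else 0.0
    return correct_rate, misident_rate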
By finding the coefficients a and b of equation (1) that minimize the misidentification rate while keeping the correct identification rate at or above a given value, a threshold that minimizes misidentification while ensuring the correct identification rate can be determined using the identification target DB 92 of FIG. 10. By registering as much information as possible about plausible identification targets in the identification target DB 92 of FIG. 10, distance-based thresholds that match the actual usage of the vehicle can be calculated. The above describes the configuration and operation of the object identification device 2A according to the second embodiment of the present invention.
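One simple way to obtain such coefficients is a grid search over candidate values of a and b, keeping only candidates whose correct identification rate stays at or above a required level; the sketch below reuses identification_rates from the previous sketch, and the search ranges and the minimum rate of 0.95 are assumptions made only for illustration.

import itertools

def fit_coefficients(db, min_correct_rate=0.95):
    best = None
    # Coarse, illustrative candidate grids for a (negative slope) and b.
    for a, b in itertools.product(
            [x / 1000.0 for x in range(-20, 0)],   # a in [-0.020, -0.001]
            [x / 100.0 for x in range(50, 100)]):  # b in [0.50, 0.99]
        fn = lambda d, a=a, b=b: a * d + b
        correct_rate, misident_rate = identification_rates(db, fn)
        if correct_rate >= min_correct_rate:
            if best is None or misident_rate < best[0]:
                best = (misident_rate, a, b)
    return best  # (misidentification rate, a, b), or None if no candidate qualifies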
As described above, in the object identification device according to this embodiment (object identification device 2A), the distance-based threshold is given by a threshold setting function (for example, the threshold 110) that achieves a correct identification rate of at least a certain value and minimizes misidentification with respect to an image database (identification target DB 92) in which information on objects with known correct values is stored by distance, and the object identification unit (object identification unit 26) calculates the threshold by inputting the distance to the object into the threshold setting function.
In this embodiment, an example in which the relationship between the distance to the target and the threshold is expressed by a linear function has been described, but the invention is not necessarily limited to this, and other relationships between distance and threshold may be used. For example, instead of a linear function, the relationship between the distance to the target and the threshold may be expressed by a nonlinear function that satisfies the threshold conditions of the above embodiment. The relationship between the distance to the target and the threshold may also be defined in the form of a table or map data.
<Third embodiment>
Next, as a third embodiment of the present invention, an example is described in which the object identification device 2 (FIG. 1) according to the first embodiment is provided with a function of determining the threshold by taking into account the braking distance in addition to the distance from a moving body such as a vehicle to the object.
[Configuration of the object identification device]
FIG. 13 is a block diagram showing a configuration example of the object identification device according to the third embodiment. The object identification device 2B shown in FIG. 13 is obtained by adding a speed detection unit 131 and a road surface condition detection unit 132 to the object identification device 2 according to the first embodiment shown in FIG. 1, replacing the distance-based identification method storage unit 24 with a braking distance/distance-based identification method storage unit 133, and replacing the distance-based identification method selection unit 25 with a braking distance/distance-based identification method selection unit 134. That is, the difference between this embodiment and the first embodiment is that a function relating the distance to the target and the threshold is held for each braking distance.
The main intended use of the present invention is to mount it on a moving body such as a vehicle, identify targets ahead that pose a risk of collision, and apply automatic braking. The present invention is therefore well suited to vehicle control for advanced driver assistance systems (ADAS) and autonomous driving (AD). The braking distance varies with the speed of the own vehicle and the road surface conditions. What matters in operating automatic braking is to reliably identify targets within the braking distance, which can be calculated from the current speed and road surface conditions. For this purpose, the braking distance is calculated based on the speed of the own vehicle and the road surface conditions acquired by the speed detection unit 131 and the road surface condition detection unit 132. A distance-based threshold that minimizes the misidentification rate while ensuring the correct identification rate within that braking distance range is then calculated and stored in the braking distance/distance-based identification method storage unit 133.
The speed detection unit 131 can be configured using, for example, a vehicle speed sensor mounted on the vehicle. The speed detection unit 131 outputs information on the detected speed of the own vehicle to the braking distance/distance-based identification method selection unit 134.
The road surface condition detection unit 132 determines the road surface condition from, for example, an image of the area ahead of the own vehicle input from the stereo camera 1. Examples of road surface condition information include road surface unevenness, bumps, wet conditions, and frozen conditions. The road surface condition detection unit 132 outputs information on the detected road surface condition to the braking distance/distance-based identification method selection unit 134.
The braking distance/distance-based identification method storage unit 133 stores distance-based thresholds for each braking distance, as shown in FIG. 14. The contents of FIG. 14 will be described later.
The braking distance/distance-based identification method selection unit 134 calculates the braking distance of the own vehicle based on the speed of the own vehicle detected by the speed detection unit 131 and the road surface condition detected by the road surface condition detection unit 132.
The braking distance is also affected by the weight of the own vehicle and by the performance and specifications of the automatic brake, but such information can be prepared in advance and stored in the object identification device. For example, the braking distance can be calculated by a machine learning model (not shown) trained to output the braking distance from the speed of the own vehicle and the road surface condition. Additional information such as the weight of the own vehicle and the performance and specifications of the automatic brake is reflected in the parameters of the machine learning model in advance. The braking distance/distance-based identification method selection unit 134 uses this machine learning model to infer the braking distance from the speed of the own vehicle and the road surface condition. The braking distance/distance-based identification method selection unit 134 then acquires a threshold corresponding to the inferred braking distance from the braking distance/distance-based identification method storage unit 133 and outputs it to the object identification unit 26.
Here, the identification method for each braking distance and for each distance to the target is described with reference to FIG. 14.
FIG. 14 is a diagram showing an example of the distance-based thresholds, scores, and identification results for each braking distance when the object identification processing according to this embodiment is applied to a plurality of objects. In FIG. 14, the horizontal axis indicates the distance (m) to the object and the vertical axis indicates the threshold. In the figure, the threshold 141 indicated by a solid straight line (linear function) is the distance-based threshold for a braking distance of 50 m, and the threshold 142 indicated by a broken straight line (linear function) is the distance-based threshold for a braking distance of 90 m. Reference numerals 41, 42, 43, 44, and 45 indicate the distances and scores of the object regions indicated by the same reference numerals in the image 31 of FIG. 5.
When the speed of the own vehicle is higher than a certain speed and the braking distance is long, as with the 90 m threshold 142, a threshold with a larger negative slope is set so that the threshold at long distances becomes lower, in order to identify pedestrians that are far away.
In contrast, when the speed of the own vehicle is low and the braking distance is 50 m, as with the threshold 141, priority can be given to identifying pedestrians closer than 50 m and suppressing misidentification of non-pedestrians, rather than to correctly identifying pedestrians farther than 50 m, so the negative slope of the threshold can be made smaller.
As a result, when the threshold 141 for a braking distance of 50 m is adopted, the pedestrian object indicated by reference numeral 41, located closer than 50 m, is correctly identified. The score of the non-pedestrian bus stop indicated by reference numeral 42 at 30 m is below the threshold, so it is correctly identified as a non-pedestrian, and the non-pedestrian tree indicated by reference numeral 43 at a distance of 40 m is also correctly identified as a non-pedestrian because its score does not exceed the threshold.
As described above, in this embodiment the distance-based thresholds (thresholds 141 and 142 in FIG. 14) are set according to the distance to the object and the braking distance. According to this embodiment, in addition to effects similar to those of the first embodiment, the following effect is obtained: by changing the distance-based threshold for each braking distance and narrowing the application range of automatic braking, object identification can be performed with higher accuracy within the necessary range.
Note that, as data representing the relationship between the distance to the target and the threshold for each braking distance, a table or map data in which this relationship is registered for each braking distance may be prepared.
Furthermore, in this embodiment the object identification threshold is set based on the distance to the object and the braking distance, but the invention is not limited to this example. For example, since the braking distance is strongly affected by the speed of the moving body, the threshold may be set according to the distance and the speed.
[Modification of the third embodiment]
In the embodiments described above, the distance-based threshold function (threshold setting function) is a linear function, but the distance-based threshold function may also be defined by a combination of a fixed threshold and a linear variable threshold. An example in which the distance-based threshold function is defined by such a combination is described below with reference to FIG. 15.
FIG. 15 is a diagram showing examples of the relationship between the distance to the target and the threshold in the modification of the third embodiment. The threshold 151 shown in the upper part of FIG. 15 is an example of a method that uses a fixed threshold when the distance to the object is short and a linear variable threshold when the distance is long. This is useful when stronger suppression of misidentification at short distances is desired, and it is also effective for object identification when the braking distance is short.
The threshold 152 shown in the middle part of FIG. 15 is an example of a method that uses a linear variable threshold when the distance to the object is short and a fixed threshold when the distance is long. This is useful when stronger suppression of misidentification is desired even at long distances, and it is also effective for object identification when the braking distance is long. Note that the long-distance fixed values of different distance-based threshold functions, for example the long-distance fixed value of the threshold 152 in the middle part of FIG. 15 and the long-distance fixed value of the threshold 153 in the lower part of FIG. 15, may be the same.
The threshold 153 shown in the lower part of FIG. 15 is an example of a method that uses a high fixed threshold at short distances, a linear variable threshold at intermediate distances, and a low fixed threshold at long distances. This is a combination of the threshold 141 and the threshold 142 and is effective when the braking distance is moderate. Note that the short-distance fixed values of different distance-based threshold functions, that is, the short-distance fixed value of the threshold 151 in the upper part of FIG. 15 and the short-distance fixed value of the threshold 153 in the lower part of FIG. 15, may be the same.
Here, when the distance to the object is shorter than a first distance, the distance is regarded as short; when the distance to the object is longer than a second distance (first distance < second distance), the distance is regarded as long; and when the distance to the object is between the first distance and the second distance, the distance is regarded as intermediate.
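The combination of fixed and linear segments, such as the threshold 153 in the lower part of FIG. 15, can be expressed as a clamped linear function; the breakpoint distances and threshold levels below are placeholders assumed only for this sketch.

def piecewise_threshold(distance_m, d1=20.0, d2=70.0, high=0.90, low=0.55):
    # High fixed threshold for short distances (closer than d1),
    # low fixed threshold for long distances (farther than d2),
    # and a linear variable threshold in between (d1 < distance < d2).
    if distance_m <= d1:
        return high
    if distance_m >= d2:
        return low
    ratio = (distance_m - d1) / (d2 - d1)
    return high + ratio * (low - high)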
As described above, in the object identification devices according to the first to third embodiments (object identification devices 2 to 2B), the distance-based threshold is set to a larger value when the distance to the object is closer than the first distance, which is shorter than the second distance (short distance), than when the distance to the object is farther than the second distance (long distance). The distance-based threshold used for object identification, given by the distance-based threshold function, is then, for example, a linear variable value (first to third embodiments) or a combination of fixed values and a linear variable value (modification of the third embodiment). The distance-based threshold may also be set to values that differ stepwise from long to short distances. For example, the distance from the moving body to the object may be divided into three levels of long, intermediate, and short distance, and the distance-based threshold set to a small value at long distance, an intermediate value at intermediate distance, and a large value at short distance. The distance may also be divided into four or more levels instead of three. In this case, compared with the embodiments described above, the load involved in determining the distance-based thresholds (stepwise different thresholds) stored in the distance-based identification method storage unit 24 can be reduced.
<Fourth embodiment>
Next, as a fourth embodiment of the present invention, an example is described in which the object identification device 2A (FIG. 9) according to the second embodiment is provided with a function of updating the distance-based thresholds based on the object identification results.
FIG. 16 is a block diagram showing a configuration example of the object identification device according to the fourth embodiment. The object identification device 2C shown in FIG. 16 is obtained by adding an identification condition/result output unit 161 and a distance-based identification result storage unit 162 to the object identification device 2A according to the second embodiment shown in FIG. 9. Note that the identification target DB 92 of the object identification device 2A in FIG. 9 is removed in the object identification device 2C.
The identification condition/result output unit 161 acquires the identification results and identification conditions from the object identification unit 26 and outputs them to the distance-based identification result storage unit 162. Here, the identification condition/result output unit 161 outputs the image of the identification target (object region) identified by the object identification unit 26, the distance to the identification target, the score, and the threshold used for identification.
The distance-based identification result storage unit 162 stores the image of the identification target, the distance to the identification target, the score, and the threshold used for identification, as output by the identification condition/result output unit 161.
The various information on identification targets stored in the distance-based identification result storage unit 162 can be used to confirm whether the object identification processing is working well. Furthermore, by assigning an object type to each stored piece of identification target information, the distance-based identification method creation unit 91 can calculate an appropriate distance-based threshold from the information on identification targets stored in the distance-based identification result storage unit 162 (the image of the identification target, the distance to the identification target, and the score). The distance-based identification method creation unit 91 outputs the calculated distance-based threshold (for example, a distance-based threshold function) to the distance-based identification method storage unit 24 and thereby updates the distance-based threshold stored in the distance-based identification method storage unit 24.
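As a rough sketch of this update loop, each identification could be logged together with its conditions and later, once correct object types have been assigned, fed back into a threshold-fitting step such as fit_coefficients and IdentificationRecord from the earlier sketches; the list-based storage format and all names here are assumptions made for illustration.

identification_log = []  # stands in for the distance-based identification result storage unit 162

def log_identification(image, distance_m, score, threshold):
    identification_log.append({
        "image": image, "distance_m": distance_m,
        "score": score, "threshold": threshold,
        "object_type": None,  # filled in later by a person or a stronger classifier
    })

def update_threshold_function(min_correct_rate=0.95):
    labeled = [r for r in identification_log if r["object_type"] is not None]
    db = [IdentificationRecord("x", r["object_type"], r["distance_m"], r["score"])
          for r in labeled]
    # Returns new coefficients (a, b) of equation (1) for the updated threshold.
    return fit_coefficients(db, min_correct_rate)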
Here, the purpose of assigning the object type is to assign the correct type to the identification target identified by the object identification unit 26. Examples of methods for assigning the object type include the following.
(1) A person visually inspects the image output as an identification target (corresponding to the image of an object region), judges the type, and assigns it.
(2) A classifier (object type assigning unit) identifies the image output as an identification target (corresponding to the image of an object region) and assigns the type. In this case, it is desirable to use a classifier whose identification performance is equal to or higher than that of a human. For example, ResNet, described in Non-Patent Document 1 below, can handle high-dimensional features by using a network as deep as 152 layers among convolutional neural networks.
(Non-Patent Document 1)
Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun, "Deep residual learning for image recognition," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016.
The assignment of the object type can be performed in an offline environment separate from the vehicle by storing the identification target information held in the distance-based identification result storage unit 162 on an external storage medium or the like. Alternatively, if a classifier built on a high-performance network such as that described in Non-Patent Document 1 can be mounted on the vehicle (for example, in the object identification device or another electronic control unit), this can be realized by reading the identification target images stored in the distance-based identification result storage unit 162 and performing identification on them.
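A minimal sketch of this relabeling step is given below, assuming a hypothetical relabel_classifier object with a predict_type(image) method (for example, a ResNet-based model fine-tuned to distinguish pedestrians from non-pedestrians); no concrete framework API is implied, and identification_log is the illustrative store from the earlier sketch.

def assign_object_types(identification_log, relabel_classifier):
    # Assign a correct type to every logged identification target using a
    # classifier assumed to be at or above human-level accuracy.
    for record in identification_log:
        record["object_type"] = relabel_classifier.predict_type(record["image"])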
As described above, in this embodiment a data set is created for each individual vehicle by assigning the correct object type to the identification target information output by the identification condition/result output unit 161 and stored in the distance-based identification result storage unit 162, and distance-based thresholds can be set for this data by the distance-based identification method creation unit 91 described in the second embodiment.
That is, the object identification device according to this embodiment includes an identification result storage unit (distance-based identification result storage unit 162) that stores the image of the object (object region) identified by the object identification unit (object identification unit 26), the distance to the object, and the identification score, and a function creation unit (distance-based identification method creation unit 91) that creates the threshold setting function (distance-based threshold function) from the information stored in the identification result storage unit.
For moving bodies such as vehicles, the types of identification targets and the non-pedestrian objects that are easily misidentified differ depending on the region and environmental conditions in which each vehicle is used, the frequently used speed range, and so on. By using the object identification device according to this embodiment, the variable thresholds can be updated at any time to suit the driving environment of each individual vehicle. For example, in the initial operation of a system including the object identification device according to this embodiment, general-purpose distance-based thresholds are created and used; the actual usage of the moving body, however, differs from user to user, that is, from vehicle to vehicle. According to the object identification device of this embodiment, the distance-based thresholds are updated to match this actual usage. The update may be performed once, or multiple times, either periodically or when certain conditions are met.
The above are examples of embodiments of the object identification device according to the present invention. In the embodiments above, a pedestrian is distinguished from non-pedestrians as the identification target, but the identification target may be a plurality of targets including pedestrians.
In addition, although an example has been described in which external world information obtained from a stereo camera is used as the means for calculating the distance to the object on the image, the external world information acquisition unit is not limited to a stereo camera. For example, a monocular camera and a millimeter-wave radar may be used as the external world information acquisition unit.
[Hardware configuration of the object identification device]
Next, the control system configuration (hardware configuration) of the object identification devices 2 to 2C according to the first to fourth embodiments described above is described with reference to FIG. 17.
FIG. 17 is a block diagram showing an example of the hardware configuration of the object identification devices 2 to 2C according to the first to fourth embodiments. The computer 170 shown in FIG. 17 is hardware used as a so-called computer.
The computer 170 includes a CPU (Central Processing Unit) 171, a ROM (Read Only Memory) 172, a RAM (Random Access Memory) 173, a nonvolatile storage 176, and a network interface 177, each connected to a bus B.
The CPU 171 reads the program code of the software that implements each function of the object identification devices 2 to 2C according to the above embodiments from the ROM 172, loads it into the RAM 173, and executes it. Alternatively, the CPU 171 reads the program code directly from the ROM 172 and executes it as is. Note that the computer 170 may include a processing device such as an MPU (Micro-Processing Unit) instead of the CPU 171. Variables, parameters, and the like generated during arithmetic processing by the CPU 171 are temporarily written into the RAM 173. The functions of the blocks of the object identification devices 2 to 2C are realized by the CPU 171.
Examples of the nonvolatile storage 176 include an HDD (Hard Disk Drive), an SSD (Solid State Drive), a flexible disk, an optical disc, a magneto-optical disc, a CD-ROM, a CD-R, and a nonvolatile memory card. In addition to the OS (Operating System) and various parameters, programs for operating the computer 170 and the like are recorded in the nonvolatile storage 176. The functions of the distance-based identification method storage unit 24, the identification target DB 92, the braking distance/distance-based identification method storage unit 133, and the braking distance/distance-based identification method selection unit 134 of the object identification devices 2 to 2C are realized by the nonvolatile storage 176.
Note that the program may be stored in the ROM 172. The program is stored in the form of computer-readable program code, and the CPU 171 sequentially executes operations according to the program code. That is, the ROM 172 or the nonvolatile storage 176 is used as an example of a computer-readable non-transitory recording medium storing a program to be executed by a computer.
The network interface 177 is configured by a communication device or the like that controls communication with other devices. The communication functions of the object identification devices 2 to 2C are realized by the network interface 177.
The embodiments and various modifications described above are merely examples, and the present invention is not limited to them as long as the features of the invention are not impaired. The various embodiments and modifications described above explain their configurations in detail and concretely in order to describe the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described components. Other aspects conceivable within the scope of the technical idea of the present invention are also included within the scope of the present invention.
Each of the above configurations, functions, processing units, and the like may be partially or entirely realized by hardware, for example by designing them as integrated circuits. As the hardware, processor devices in a broad sense, such as FPGAs (Field Programmable Gate Arrays) and ASICs (Application Specific Integrated Circuits), may be used.
Each component of the object identification device according to the embodiments described above may be implemented on any hardware, as long as the pieces of hardware can transmit and receive information to and from each other via a network. The processing performed by a given processing unit may be realized by a single piece of hardware or by distributed processing across a plurality of pieces of hardware.
In this specification, processing steps describing time-series processing include not only processing performed chronologically in the described order but also processing that is executed in parallel or individually (for example, processing by objects) without necessarily being performed chronologically. The order of processing steps describing time-series processing may also be changed as long as the processing results are not affected.
In the embodiments described above, the control lines and information lines indicated by arrows and solid lines are those considered necessary for explanation, and not all control lines and information lines of a product are necessarily shown. In practice, almost all components may be considered to be interconnected.
1 ... stereo camera, 2 to 2C ... object identification device, 22 ... object distance calculation unit, 23 ... identification score calculation unit, 26 ... object identification unit

Claims (7)

  1.  An object identification device comprising:
     a distance calculation unit that receives external world information acquired by an external world information acquisition unit mounted on a vehicle and including an object outside the vehicle, and obtains a distance from the vehicle to the object;
     an identification score calculation unit that uses the external world information to obtain an identification score indicating a degree of confidence that the object included in the external world information is of a predetermined type; and
     an object identification unit that identifies the object as being of the type associated with the identification score when the identification score exceeds a predetermined threshold,
     wherein the threshold differs depending on the distance.
  2.  The object identification device according to claim 1, wherein the threshold is set to a larger value when the distance to the object is closer than a first distance, which is shorter than a second distance, than when the distance to the object is farther than the second distance.
  3.  The object identification device according to claim 2, wherein, as the threshold, a threshold setting function is used that achieves a correct identification rate of at least a certain value and minimizes misidentification with respect to an image database in which information on objects with known correct values is stored by distance, and
     the object identification unit calculates the threshold by inputting the distance to the object into the threshold setting function.
  4.  The object identification device according to claim 1, wherein the threshold is set according to the distance to the object and a braking distance.
  5.  The object identification device according to claim 3, wherein the threshold setting function is defined by a combination of a fixed threshold and a linear variable threshold.
  6.  The object identification device according to claim 3, further comprising:
     an identification result storage unit that stores an image of the object identified by the object identification unit, the distance to the object, and the identification score; and
     a function creation unit that creates the threshold setting function from the information stored in the identification result storage unit.
  7.  An object identification method performed by an object identification device that receives external world information acquired by an external world information acquisition unit mounted on a vehicle and including an object outside the vehicle, the method comprising:
     a process of obtaining a distance from the vehicle to the object from the external world information;
     a process of using the external world information to obtain an identification score indicating a degree of confidence that the object included in the external world information is of a predetermined type; and
     a process of identifying the object as being of the type associated with the identification score when the identification score exceeds a predetermined threshold,
     wherein the threshold differs depending on the distance.
PCT/JP2023/022771 2022-07-08 2023-06-20 Object identification device and object identification method WO2024009756A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-110453 2022-07-08
JP2022110453A JP2024008516A (en) 2022-07-08 2022-07-08 Object identification device and object identification method

Publications (1)

Publication Number Publication Date
WO2024009756A1 true WO2024009756A1 (en) 2024-01-11

Family

ID=89453246

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/022771 WO2024009756A1 (en) 2022-07-08 2023-06-20 Object identification device and object identification method

Country Status (2)

Country Link
JP (1) JP2024008516A (en)
WO (1) WO2024009756A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009026223A (en) * 2007-07-23 2009-02-05 Toyota Central R&D Labs Inc Object detector and program
JP2016080639A (en) * 2014-10-22 2016-05-16 株式会社デンソー Object detection device and vehicle controller
JP2019069734A (en) * 2017-10-11 2019-05-09 トヨタ自動車株式会社 Vehicle control device
WO2022070250A1 (en) * 2020-09-29 2022-04-07 日本電気株式会社 Information processing device, information processing method, and program
JP2022055779A (en) * 2020-09-29 2022-04-08 セイコーエプソン株式会社 Method of setting threshold value used for quality determination of object recognition result, and object recognition apparatus

Also Published As

Publication number Publication date
JP2024008516A (en) 2024-01-19

Similar Documents

Publication Publication Date Title
KR102545105B1 (en) Apparatus and method for distinquishing false target in vehicle and vehicle including the same
CN110286389A (en) A kind of grid management method for obstacle recognition
CN110543807A (en) method for verifying obstacle candidate
US11151395B2 (en) Roadside object detection device, roadside object detection method, and roadside object detection system
CN112660128B (en) Apparatus for determining lane change path of autonomous vehicle and method thereof
KR102304851B1 (en) Ecu, autonomous vehicle including the ecu, and method of recognizing near vehicle for the same
JP4937844B2 (en) Pedestrian detection device
GB2544627A (en) Self-recognition of autonomous vehicles in mirrored or reflective surfaces
CN111284501A (en) Apparatus and method for managing driving model based on object recognition, and vehicle driving control apparatus using the same
JP2020154983A (en) Object recognition device and vehicle control system
CN111723724A (en) Method and related device for identifying road surface obstacle
US11069049B2 (en) Division line detection device and division line detection method
CN113008296A (en) Method and vehicle control unit for detecting a vehicle environment by fusing sensor data on a point cloud plane
US20220171975A1 (en) Method for Determining a Semantic Free Space
US20200174488A1 (en) False target removal device and method for vehicles and vehicle including the device
CN115147587A (en) Obstacle detection method and device and electronic equipment
CN115366885A (en) Method for assisting a driving maneuver of a motor vehicle, assistance device and motor vehicle
WO2024009756A1 (en) Object identification device and object identification method
CN112163521A (en) Vehicle driving behavior identification method, device and equipment
CN114435401B (en) Vacancy recognition method, vehicle, and readable storage medium
US11820427B2 (en) Lane keeping assist system of vehicle and lane keeping method using the same
WO2021135566A1 (en) Vehicle control method and apparatus, controller, and smart vehicle
KR102283053B1 (en) Real-Time Multi-Class Multi-Object Tracking Method Using Image Based Object Detection Information
CN114076928A (en) System for extracting contour of static object and method thereof
KR20220168061A (en) Apparatus for controlling a vehicle, system having the same and method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23835290

Country of ref document: EP

Kind code of ref document: A1