WO2015098344A1 - Mining work machine - Google Patents

Mining work machine

Info

Publication number
WO2015098344A1
WO2015098344A1 (PCT/JP2014/080106; JP2014080106W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
obstacle
information
unit
Prior art date
Application number
PCT/JP2014/080106
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
佑介 日永田
松尾 茂
川股 幸博
石本 英史
幹雄 板東
Original Assignee
Hitachi Construction Machinery Co., Ltd. (日立建機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Construction Machinery Co., Ltd.
Publication of WO2015098344A1

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261 Obstacle

Definitions

  • The present invention relates to a mining work machine such as an off-road dump truck.
  • In a mine, construction machines such as hydraulic excavators and dump trucks are used to excavate and transport earth and sand.
  • Construction machinery used in mines is required to be unmanned from the viewpoint of safety and cost reduction.
  • For dump trucks, the amount of earth and sand transported per unit time is directly linked to the progress of mining, so efficient operation is required. Therefore, in order to transport large amounts of earth and sand out of the mining site efficiently, a mining system using autonomously traveling dump trucks capable of continuous operation is required.
  • Such an autonomously traveling dump truck uses an obstacle detection device such as a millimeter-wave radar, a laser sensor, or a stereo camera.
  • Millimeter-wave radars and laser sensors have a narrow detection range and cannot distinguish a vehicle from the road surface or shoulder obstacles, so the scenes they can recognize are limited. A millimeter-wave radar or laser sensor alone is therefore not easily applied as an obstacle detection device for an autonomously traveling dump truck.
  • Because a stereo camera can measure three-dimensional shape, the traveling road surface and obstacles can easily be distinguished.
  • Stereo cameras have already been put into practical use as monitoring devices and obstacle detection devices for automobiles, and are therefore suitable as obstacle detection devices for autonomously traveling dump trucks.
  • A configuration using the disparity information and appearance information of a stereo camera is disclosed in, for example, Patent Document 1.
  • In Patent Document 1, the search range is limited using the edge information of an object, and the distance measurement accuracy of the stereo camera is improved by performing highly accurate distance detection.
  • Compared with millimeter-wave radars and laser sensors, a stereo camera can clearly distinguish a preceding vehicle from the traveling road surface and obstacles, and is therefore suitable as an obstacle detection device for an autonomously traveling dump truck.
  • In a mine, however, the traveling path of the dump truck is off-road. When detecting the preceding vehicle with a stereo camera, all or part of the preceding vehicle to be detected may therefore be hidden from the field of view of the stereo camera by dust raised during travel by the host vehicle or another vehicle.
  • In a system that simply detects the forward vehicle using a stereo camera, when a portion of the forward vehicle is hidden by dust, the information necessary to detect it as the forward vehicle is insufficient and the forward vehicle may not be detected.
  • A stereo camera can obtain a parallax image in which the distance from the camera to each object is calculated from the difference (disparity) between the appearance images captured by the left and right cameras.
  • In the vehicle detection system described in Patent Document 1, which uses such a parallax image and appearance image, these images are only used to improve distance measurement accuracy; the method cannot be applied to, for example, detecting a vehicle when a part of it is hidden by dust or the like.
  • The present invention was made in view of the above prior art. Its object is to provide a mining work machine that can detect a predetermined object even when a part of it is hidden, and that can measure the distance to that object.
  • To achieve this object, a mining work machine according to the present invention includes: an image detection unit capable of detecting three-dimensional image information; an image local feature detection unit that detects local features of an object detected in the image information; an obstacle determination unit that compares the local feature information detected by the image local feature detection unit with predetermined feature information of a predetermined obstacle and determines whether the object detected in the image information is the predetermined obstacle; and a distance information acquisition unit that, when the obstacle determination unit determines that the object is the predetermined obstacle, acquires distance information to the detected object based on the image information.
  • The machine may further include an obstacle feature storage unit that, when the obstacle determination unit determines that an object detected in the image information is the predetermined obstacle, stores feature information of the region determined as the predetermined obstacle. The obstacle determination unit then determines whether an object detected in the image information is the predetermined obstacle based on the feature information stored in the obstacle feature storage unit, the distance information acquired by the distance information acquisition unit, and the local feature information detected by the image local feature detection unit.
  • With this configuration, the obstacle determination unit determines whether an object detected in the image information is the predetermined obstacle while taking into account the feature information of objects previously determined to be the predetermined obstacle and stored in the obstacle feature storage unit. The feature information of the predetermined obstacle is thus continuously updated, and since the obstacle determination unit makes its determination based on the updated feature information, the determination can be made with higher accuracy.
  • In the present invention, local features of an object are detected by the image local feature detection unit from the three-dimensional image information detected by the image detection unit, and the detected local feature information is compared with the predetermined feature information of a predetermined obstacle by the obstacle determination unit to determine whether the object detected in the image information is the predetermined obstacle. When the obstacle determination unit determines that the object is the predetermined obstacle, the distance information acquisition unit acquires the distance to the object based on the image information detected by the image detection unit.
  • As a result, even when the acquired image information contains only part of a predetermined obstacle (for example, because the obstacle is partly hidden by dust, dirt, a wall, or a change in cargo, or because it is too close), the predetermined obstacle can be detected accurately. Since the predetermined obstacle can be detected accurately, the distance to it can also be measured accurately. The present invention can therefore reliably obtain the presence of, and distance to, a predetermined obstacle even in a special environment such as a mine. Problems, configurations, and effects other than those described above will become clear from the following description of the embodiments.
  • FIG. 1 is a schematic diagram showing an off-road dump truck (the mine vehicle 1) as a mining work machine according to a first embodiment of the present invention.
  • FIG. 2 is a schematic diagram showing a mine system in which the vehicle 1 is used.
  • FIG. 3 is a schematic diagram showing a forward vehicle detection system 10 mounted on the vehicle 1.
  • The vehicle 1 is an unmanned vehicle capable of traveling autonomously on a road surface R, a traveling path provided in advance in the mine.
  • In the mine, an information center 3 that transmits and receives predetermined information to and from the vehicle 1 is installed, and a hydraulic excavator 4 is used to load cargo such as earth and sand onto the vehicle 1.
  • The vehicle 1 includes a vehicle body 1a, a driver's seat 1b provided at the upper front of the vehicle body 1a, and a vessel, a loading platform serving as a working unit, provided on the vehicle body 1a so that it can be raised and lowered.
  • On the front deck 1f of the vehicle body 1a, a stereo camera device 11 is attached to detect the environment around the vehicle body 1a, particularly ahead in the traveling direction. The vehicle 1 travels while detecting and avoiding obstacles on the road surface recognized by the stereo camera device 11, in particular another vehicle 1 traveling ahead.
  • the stereo camera device 11 includes a pair of cameras 11a and 11b, and acquires three-dimensional image information of the outside world using these two cameras 11a and 11b.
  • The three-dimensional image information consists of parallax image information calculated from the difference between the two-dimensional image (appearance image) information detected by the two cameras, and distance information to each object in the appearance images.
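The conversion from disparity to distance used by such a stereo device follows the standard pinhole-stereo relation Z = f * B / d (focal length f in pixels, baseline B, disparity d). The patent does not spell this formula out, so the following is a minimal sketch with illustrative parameter names:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Distance (m) to a point from its stereo disparity (px).

    Standard relation Z = f * B / d for a rectified stereo pair;
    parameter names are illustrative, not from the patent.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Note that distance resolution degrades quadratically with range, which is one reason representative (e.g. median) disparities are taken over whole groups rather than single pixels.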
  • The stereo camera device 11 is attached so that the midpoint between the left and right cameras 11a and 11b lies at the lateral center of the front side of the vehicle body 1a. Internal parameters such as the focal length and lens distortion of the cameras 11a and 11b, and external parameters indicating their mutual positional relationship and installation position on the vehicle body, are calibrated, and the two cameras are synchronized with each other.
  • These cameras 11a and 11b face the front of the vehicle 1 with their optical axes parallel to each other, and are installed so that their measurement areas 11c and 11d partially overlap.
  • The forward vehicle detection system 10 is a stereo camera system serving as an image analysis device, and includes the stereo camera device 11, an external environment detection unit 12, a dust detection unit 13, a parallax grouping unit 14, an image local feature detection unit 15, an obstacle feature storage unit 16, an obstacle determination unit 17, and a distance information acquisition unit 18.
  • The external environment detection unit 12 is an image detection unit that acquires, as the image information detected by the cameras 11a and 11b, a parallax image having distance information and an appearance image having color or luminance information of the objects in the parallax image.
  • When a switch (not shown) for turning the power to the forward vehicle detection system 10 on and off is turned on, the external environment detection unit 12 performs initialization so that images can be obtained from the cameras 11a and 11b, and the calibration data of each camera 11a, 11b is read.
  • The dust detection unit 13 detects the occurrence of dust D in front of the cameras 11a and 11b based on the appearance image acquired by the external environment detection unit 12 from the image information detected by each camera 11a, 11b of the stereo camera device 11.
  • The parallax grouping unit 14 groups parallax values at similar distances as the parallax of the same object, based on the parallax image acquired by the external environment detection unit 12. Specifically, the parallax grouping unit 14 divides the parallax image into a grid of predetermined cell size; when cells are adjacent, or their distance is at or below a set threshold, and their representative parallax values are close, those cells are regarded as indicating the same object, determined to correspond to at least a part of a vehicle 1, and classified into the same parallax group (detection target group). The parallax grouping unit 14 then calculates the region of the appearance image corresponding to the position of the parallax group and sets it as the target region group C, the range in which image local features are detected.
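The grouping step just described can be sketched as a flood fill over per-cell representative parallax values. This is a simplified illustration only: the cell size, the 4-neighbour adjacency rule, and the merge threshold are assumptions, not values from the patent.

```python
from collections import deque

def group_parallax_grid(rep, thresh):
    """Group adjacent grid cells whose representative parallax values
    differ by at most `thresh` (flood fill over the cell grid).

    `rep` is a 2-D list of per-cell representative parallax values
    (None for cells with no valid parallax).  Returns a 2-D list of
    group ids (-1 for cells without valid parallax).
    """
    h, w = len(rep), len(rep[0])
    gid = [[-1] * w for _ in range(h)]
    next_id = 0
    for y in range(h):
        for x in range(w):
            if rep[y][x] is None or gid[y][x] != -1:
                continue
            gid[y][x] = next_id
            q = deque([(y, x)])
            while q:
                cy, cx = q.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = cy + dy, cx + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and gid[ny][nx] == -1
                            and rep[ny][nx] is not None
                            and abs(rep[ny][nx] - rep[cy][cx]) <= thresh):
                        gid[ny][nx] = next_id
                        q.append((ny, nx))
            next_id += 1
    return gid
```

The bounding box of each resulting group, mapped back into the appearance image, would play the role of the target region group C.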
  • The image local feature detection unit 15 detects image local features, i.e., specific features unique to a predetermined object such as the vehicle 1 traveling ahead, based on the appearance image acquired by the external environment detection unit 12. An image local feature here is a partial image showing a feature of the vehicle 1, a part of an edge image, or a multidimensional feature amount that can be computed from a part of the image, for example:
  • SIFT (Scale-Invariant Feature Transform) features
  • SURF (Speeded-Up Robust Features)
  • HOG (Histograms of Oriented Gradients) features
  • Haar-like features
  • Edgelet features, etc.
  • the image local feature detection unit 15 detects image local features from the target region group C specified by the parallax grouping unit 14.
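As a toy illustration of a multidimensional local feature of the kind listed above, a tiny HOG-style orientation histogram can be computed from a grayscale patch. Real systems would use full SIFT/SURF/HOG implementations (e.g. from an image-processing library); this sketch only shows the idea of a feature vector computed from part of an image.

```python
import math

def hog_like_descriptor(gray, bins=4):
    """Minimal HOG-style descriptor: an L1-normalised histogram of
    unsigned gradient orientations over the whole patch.

    `gray` is a 2-D list of luminance values.  The bin count and the
    single-cell layout are simplifications for illustration.
    """
    h, w = len(gray), len(gray[0])
    hist = [0.0] * bins
    for y in range(h - 1):
        for x in range(w - 1):
            gx = gray[y][x + 1] - gray[y][x]      # horizontal gradient
            gy = gray[y + 1][x] - gray[y][x]      # vertical gradient
            mag = math.hypot(gx, gy)
            if mag == 0:
                continue
            ang = math.atan2(gy, gx) % math.pi    # unsigned orientation
            hist[min(int(ang / math.pi * bins), bins - 1)] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]
```

A descriptor like this would be computed for each cell of the target region group C and matched against stored vehicle features.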
  • The obstacle feature storage unit 16 is a memory that stores image local feature information (dump truck data) on the image local features unique to the vehicle 1 that can be acquired when the vehicle 1 is imaged by the stereo camera device 11.
  • The obstacle determination unit 17 compares the image local feature information accumulated in the obstacle feature storage unit 16 with the image local features detected by the image local feature detection unit 15, and checks whether the local features match. That is, the obstacle determination unit 17 matches the image local features detected by the image local feature detection unit 15 against those stored in the obstacle feature storage unit 16, and when they match, the group is determined to be the vehicle 1. The same applies to the detection of a hydraulic excavator or other construction machine.
  • The image local features may also carry information on their mutual geometric structure, e.g., that a certain image local feature A exists at the top of the vehicle 1 and a certain image local feature B exists immediately beside feature A; a candidate satisfying such relations between features A and B is determined to be the vehicle 1.
  • The distance information acquisition unit 18 acquires the relative distance to the group determined to be the vehicle 1 by the obstacle determination unit 17. That is, based on the parallax image of the groups determined by the obstacle determination unit 17, the distance information acquisition unit 18 calculates the representative parallax values of these groups and computes the inter-vehicle distance to the vehicle 1 ahead.
  • FIG. 4 is an explanatory diagram illustrating a situation where the vehicle 1 measures the other vehicle 1 ahead.
  • FIG. 5 is an explanatory diagram illustrating a situation in which the other vehicle 1 ahead is measured when dust is generated around the vehicle 1.
  • FIG. 6 is an explanatory diagram showing the operation of the obstacle determination unit 17 of the forward vehicle detection system 10 of the vehicle 1.
  • FIG. 7 is a flowchart showing a vehicle detection process by the forward vehicle detection system 10 of the vehicle 1.
  • FIG. 8 is an explanatory diagram showing an image in which the other vehicle 1 in front is detected by the external environment detection unit 12 of the vehicle 1 when dust is generated.
  • First, the external environment detection unit 12 performs an initialization process (step S1; hereinafter simply "S1", etc.).
  • In this initialization, appearance images are made acquirable by the cameras 11a and 11b of the stereo camera device 11, and adjustment data such as the calibration data of the cameras 11a and 11b is read, so that the parallax image shown in FIG. 6 can be calculated from the camera images detected by the cameras.
  • a parallax image is calculated from the camera images detected by the cameras 11a and 11b by the external environment detection unit 12 (S2).
  • the parallax images calculated in S2 are grouped by the parallax grouping unit 14 (S3).
  • The grouping process in S3 first divides the parallax image into a grid of predetermined cell size. Then, when cells are adjacent or their distance is at or below a set threshold, and their representative parallax values are close, the cells are classified into the same parallax group as indicating the same object.
  • Methods for selecting the representative parallax value include taking the mean of the valid parallax values and taking the median of the valid parallax values.
  • the dust detection unit 13 detects the dust D from the camera image (S4).
  • In the appearance image, the luminance value or color of an area where dust D is captured is substantially constant. Likewise, the parallax value of an area where dust D is captured tends to be substantially constant. Therefore, in the dust detection of S4, when a region of at least a predetermined arbitrary size is found in which the luminance value or color in the appearance image, or the parallax value in the parallax image, is almost constant, the dust detection unit 13 judges that dust has been detected; otherwise, dust D is judged not to have been detected (S5). In the state shown in FIG. 4, when dust D is not detected in S5 (No), the process proceeds to S11, described later.
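The dust criterion described here (near-constant luminance, color, or parallax over at least a given area) can be sketched as follows. The minimum region size and the allowed spread are illustrative assumptions, not values from the patent.

```python
def looks_like_dust(patch, min_size, max_spread):
    """Flag a patch as possible dust: its values (luminance or
    parallax) are nearly constant, i.e. max - min is at most
    `max_spread`, over a region of at least `min_size` samples.

    `patch` is a 2-D list of luminance or parallax values.
    """
    pixels = [v for row in patch for v in row]
    if len(pixels) < min_size:
        return False               # region too small to count as dust
    return max(pixels) - min(pixels) <= max_spread
```

The same predicate can be run on luminance patches of the appearance image and on parallax patches; either firing over a large enough region would trigger the dust branch of the flowchart.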
  • When dust D is detected in S5 (Yes), the image local feature detection unit 15 calculates and detects the image local features for the target region group C in the appearance image determined in S6 (S7). That is, the region in which image local features are calculated in S7 is the target region group C determined in S6. As the image local features detected in S7, as shown in FIG. 6, in addition to an edge image E partitioned into a grid of arbitrary cell size, an image segmented into such a grid, or multidimensional feature amounts computable from a part of the image, such as SIFT, SURF, HOG, Haar-like, or Edgelet features, may be used.
  • Next, the image local features detected in S7 are compared with the image local features in the image local feature information stored in the obstacle feature storage unit 16, and the number of matching points between these image local features is calculated by the obstacle determination unit 17 (S8). The obstacle determination unit 17 then determines whether the number of matches calculated in S8 exceeds a predetermined threshold (S9). If it is determined in S9 that the number of matched features exceeds the threshold (Yes), the group having those matched features is determined to be the vehicle 1 traveling ahead (S10). On the other hand, if it is determined in S9 that the number of matched features does not exceed the threshold (No), this process is terminated.
  • On the other hand, when dust D is not detected in S5, a parallax group grouped in S3 that has at least an arbitrary predetermined size is determined to be the vehicle 1 traveling ahead (S11). Then, based on the parallax image of the group determined to be the vehicle 1 in S9 and S10 (or S11), the distance information acquisition unit 18 calculates the representative parallax value of the group and computes the inter-vehicle distance to the determined vehicle 1 (S12).
  • Methods for calculating the representative parallax value of a group include taking the mean of the group, taking the median of the group, and taking the nearest distance within the group.
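The three representative-parallax options just named (mean, median, nearest point) might be sketched as one helper; the function and its default are illustrative, not from the patent.

```python
def representative_parallax(values, method="median"):
    """Representative disparity for a grouped region.

    `values` are the valid disparities of the group; larger disparity
    means closer, so "nearest" returns the maximum disparity.
    """
    vals = sorted(values)
    n = len(vals)
    if method == "mean":
        return sum(vals) / n
    if method == "median":
        return vals[n // 2] if n % 2 else (vals[n // 2 - 1] + vals[n // 2]) / 2
    if method == "nearest":
        return vals[-1]            # largest disparity = closest point
    raise ValueError("unknown method: " + method)
```

The median is robust to stray disparities inside the group, while "nearest" is the conservative choice for collision avoidance; the patent leaves the choice open.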
  • As described above, in the first embodiment, a parallax image and an appearance image are calculated by the external environment detection unit 12 from the camera images detected by the left and right cameras 11a and 11b of the stereo camera device 11. Image local features unique to the vehicle 1 are detected from the appearance image by the image local feature detection unit 15, and based on these detected features, an object in the appearance image can be determined to be the vehicle 1. The distance information acquisition unit 18 then obtains the inter-vehicle distance to the vehicle 1 detected in the appearance image based on the parallax image detected by the external environment detection unit 12.
  • In addition, groups in the parallax image that are at least a predetermined threshold size are calculated, the region of the appearance image corresponding to each group is then calculated, and the calculated region is set as the target region group C, the range in which image local features are detected. For this target region group C, the image local feature detection unit 15 detects image local features, and the obstacle determination unit 17 can reliably determine whether the object detected in the appearance image is another vehicle 1. The detection range in the appearance image handled by the image local feature detection unit 15 can thus be specified and narrowed; since the amount of detection target information is reduced, the obstacle determination unit 17 can make the vehicle determination more efficiently and accurately.
  • Moreover, when the obstacle determination unit 17 determines whether an object in the appearance image is the vehicle 1, the image local features detected by the image local feature detection unit 15 are compared with the image local feature information stored in the obstacle feature storage unit 16, the number of matching points of these image local features is calculated, and if the calculated number of matches exceeds a predetermined threshold, the group with those matching points is determined to be a vehicle 1 ahead. The obstacle determination unit 17 can therefore make the vehicle determination more accurately. As a result, a highly reliable autonomously traveling vehicle 1 can be realized that detects the preceding vehicle 1 and obstacles on the road surface R at an early stage, and that can follow the preceding vehicle and avoid obstacles.
  • FIG. 9 is a schematic diagram showing a forward vehicle detection system 10A mounted on the vehicle 1 according to the second embodiment of the present invention.
  • The second embodiment differs from the first embodiment described above in that it further includes a traveling road surface estimation unit 21, a forward vehicle feature temporary storage unit 22, and a vehicle existence possible region calculation unit 23.
  • the same or corresponding parts as those in the first embodiment are denoted by the same reference numerals.
  • the traveling road surface estimation unit 21 is a traveling region detection unit for determining a road surface region (traveling road surface region) where the host vehicle 1 and the other vehicle 1 can travel.
  • the traveling road surface estimation unit 21 calculates the traveling road surface region in the image information in the parallax image and the appearance image based on the parallax image and the appearance image detected by the external environment detection unit 12.
  • Specifically, on the assumption that the road surface R is a plane, the traveling road surface estimation unit 21 applies, for example, RANSAC (random sample consensus) or the Hough transform to the parallax groups grouped by the parallax grouping unit 14, calculates the parallax group corresponding to the road surface R (road surface parallax group) as a plane region in the parallax image, and estimates this plane region as the road surface region.
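The RANSAC plane fit mentioned above can be sketched on raw 3-D road points: sample three points, hypothesize a plane, score inliers by point-to-plane distance, and keep the best hypothesis. This is a minimal version (no least-squares refinement on the inliers); iteration count, tolerance, and seeding are illustrative.

```python
import random

def ransac_plane(points, iters=200, tol=0.05, seed=0):
    """Fit a plane a*x + b*y + c*z = d to 3-D points by RANSAC.

    Returns ([a, b, c, d] with unit normal, inlier_count).
    """
    rng = random.Random(seed)      # seeded for reproducibility

    def plane_from(p, q, r):
        u = [q[i] - p[i] for i in range(3)]
        v = [r[i] - p[i] for i in range(3)]
        n = [u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0]]
        norm = sum(c * c for c in n) ** 0.5
        if norm == 0:
            return None            # degenerate (collinear) sample
        n = [c / norm for c in n]
        return n + [sum(n[i] * p[i] for i in range(3))]

    best, best_count = None, -1
    for _ in range(iters):
        pl = plane_from(*rng.sample(points, 3))
        if pl is None:
            continue
        a, b, c, d = pl
        count = sum(1 for p in points
                    if abs(a * p[0] + b * p[1] + c * p[2] - d) <= tol)
        if count > best_count:
            best, best_count = pl, count
    return best, best_count
```

Parallax groups whose points fall on (or within tolerance of) the winning plane would form the road surface parallax group; the rest are obstacle candidates.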
  • The forward vehicle feature temporary storage unit 22 is an obstacle feature storage unit for temporarily storing the image local features detected by the image local feature detection unit 15 when the obstacle determination unit 17 detects another vehicle 1 traveling ahead.
  • The image local feature temporary information stored in the forward vehicle feature temporary storage unit 22 is used, together with the image local feature information stored in the obstacle feature storage unit 16 and the image local features detected by the image local feature detection unit 15, in the comparison performed by the obstacle determination unit 17.
  • The vehicle existence possible region calculation unit 23 calculates the region where a vehicle 1 may exist (vehicle existence possible region) from the road surface region estimated by the traveling road surface estimation unit 21 and the parallax group information grouped by the parallax grouping unit 14. More specifically, the vehicle existence possible region calculation unit 23 narrows the region (determination region, search range) in which the obstacle determination unit 17 determines the vehicle 1.
  • FIG. 10 is a flowchart showing the vehicle detection process by the forward vehicle detection system 10A of the vehicle 1.
  • FIG. 11 is a schematic view showing a state in which the vehicle 1 is loaded.
  • The processing by the forward vehicle detection system 10A is the same as in the forward vehicle detection system 10 of the first embodiment for S1 to S5, S7, and S9 to S12.
  • When dust D is detected in S5 (Yes), groups that exist in the region of the parallax image calculated according to an arbitrary predetermined criterion and that are at least a preset threshold size are calculated from the parallax groups grouped in S3. The traveling road surface estimation unit 21 then obtains the road surface parallax group, calculates the plane region of the appearance image where this road surface parallax group is located, and estimates this plane region as the road surface R region. Next, the vehicle existence possible region calculation unit 23 calculates the vehicle existence possible region where a vehicle 1 is likely to exist, and this region is set as the target region group C, the range in which image local features are detected (S21).
  • The image local feature temporary information stored in the forward vehicle feature temporary storage unit 22 in S23 and the image local feature information stored in the obstacle feature storage unit 16 are compared with the image local features detected in S7, and the number of matching points of these image local features is calculated (S24). It is then determined in S9 whether the number of matches calculated in S24 exceeds a predetermined threshold.
  • As described above, in the second embodiment, after the road surface region where the vehicle 1 can travel is estimated by the traveling road surface estimation unit 21, the vehicle existence possible region where a vehicle 1 is highly likely to exist is calculated from the estimated road surface region and the parallax group information grouped by the parallax grouping unit 14, and this region is set as the target region group C, the range in which image local features are detected. Since the image local features of a vehicle 1 can only exist within this target region group C, it suffices for the image local feature detection unit 15 to detect the image local features of the vehicle 1 only within the target region group C.
  • The detection range in the image information handled by the image local feature detection unit 15 can thus be specified and narrowed to where a vehicle 1 could exist, further reducing the detection target information, so the obstacle determination unit 17 can determine whether an object is the vehicle 1 more efficiently and accurately. In other words, the determination region (search range) in which the obstacle determination unit 17 determines the vehicle 1 can be specified and limited more narrowly; the detection accuracy of the obstacle determination unit 17 can therefore be further improved, and since the number of detection targets is reduced, detection errors can also be reduced. Determining the vehicle 1, as a predetermined obstacle, from the image information can thus be carried out efficiently and accurately.
  • Since the main work of the vehicle 1 is transporting earth and sand, when a large amount of earth and sand is loaded as a load S on the vessel 1c of the vehicle 1 as shown in FIG. 11, the image local features stored in the obstacle feature storage unit 16 may not reflect the individual differences between vehicles 1. Therefore, in the second embodiment, when the obstacle determination unit 17 detects a vehicle 1 in a situation where the dust detection unit 13 does not detect dust D, the image local features of that vehicle 1 are stored in the forward vehicle feature temporary storage unit 22. By temporarily accumulating these forward vehicle features, the vehicle 1 can still be detected by the obstacle determination unit 17.
  • That is, the obstacle determination unit 17 determines the vehicle 1 while taking into account the image local feature information of the vehicle 1 that was previously determined to be the vehicle 1 and stored in the forward vehicle feature temporary storage unit 22. The obstacle determination unit 17 can therefore make its determination based on the continuously updated image local feature information of the other vehicle 1. For example, information on the other vehicle 1 from immediately before dust D was generated can be accumulated in the forward vehicle feature temporary storage unit 22, and by using these stored image local features in the vehicle determination, the obstacle determination unit 17 can determine the vehicle 1 more accurately. Furthermore, since vehicles 1 with different image local features can be detected with their individual differences taken into account, it is possible to identify which vehicle 1 is traveling ahead.
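The role of the forward vehicle feature temporary storage unit 22 (refresh the stored features only on dust-free frames, then match later observations against them) can be sketched as follows. Features are simplified to hashable tokens, and `match_count` is a naive set intersection standing in for real descriptor matching; both are assumptions for illustration.

```python
class ForwardVehicleFeatureCache:
    """Temporary store for the image local features most recently
    confirmed as the forward vehicle while no dust was detected."""

    def __init__(self):
        self._features = set()

    def update(self, features, dust_detected):
        # Only refresh the cache from frames where the view is clear;
        # dusty frames would corrupt the stored vehicle appearance.
        if not dust_detected:
            self._features = set(features)

    def match_count(self, observed):
        """Number of observed features matching the cached ones."""
        return len(self._features & set(observed))
```

During a dust event, the obstacle determination unit would compare the partial set of visible features against this cache (and the permanent storage unit 16) and declare a vehicle when the match count clears the threshold.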
  • FIG. 12 is a schematic diagram showing a forward vehicle detection system 10B mounted on the vehicle 1 according to the third embodiment of the present invention.
  • FIG. 13 is an explanatory diagram illustrating an image in which the forward vehicle 1 is detected from the side by the external environment detection unit 12 of the vehicle 1 while dust D is generated.
  • The third embodiment differs from the first embodiment described above in that its obstacle feature storage unit 16 takes the body posture of the vehicle 1 into account.
  • Parts that are the same as, or correspond to, those of the first embodiment are denoted by the same reference numerals.
  • As shown in FIG. 12, the obstacle feature storage unit 16 stores and accumulates image local features for each posture of the vehicle 1, such as its rear, side (sideways), and front, together with image local feature information on the geometric position of each image local feature relative to the center of gravity of the vehicle 1 (the center of gravity of the vehicle 1 itself, excluding the load).
  • In addition to determining the other vehicle 1 traveling ahead, the obstacle determination unit 17 determines the posture of that vehicle 1 by applying the posture-specific image local features in the image local feature information stored in the obstacle feature storage unit 16.
  • The distance information acquisition unit 18 calculates the inter-vehicle distance for each image local feature of the vehicle 1 determined by the obstacle determination unit 17, based on the image local features accumulated in the obstacle feature storage unit 16 and the position of each feature relative to the center-of-gravity point. In addition, when the obstacle determination unit 17 matches at least two of the image local features accumulated in the obstacle feature storage unit 16, the distance information acquisition unit 18 calculates the center of gravity of the vehicle 1 from the geometric positional relationship of the matched image local features and takes the distance to that calculated center of gravity as the inter-vehicle distance.
  • In the third embodiment, image local feature information on the image local features for each posture of the vehicle 1 is stored in the obstacle feature storage unit 16, and the obstacle determination unit 17 determines the posture of the other vehicle 1 from these posture-specific image local features.
  • As a result, as shown in FIG. 13, even in a situation where the road surface R ahead curves, the other vehicle 1 traveling ahead turns, and only part of its side is captured by the stereo camera device 11, the obstacle determination unit 17 can still determine the vehicle 1 and its posture from the posture-specific image local features accumulated in the obstacle feature storage unit 16.
  • Traveling control of the host vehicle 1 can then be performed based on the detected inter-vehicle distance to the other vehicle 1, while accounting for the variation in the distance to its center of gravity caused by the detected change in its posture. Since traveling control matched to the detected posture of the other vehicle 1 is possible, the host vehicle 1 can be controlled more appropriately and accurately.
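The centroid-based ranging described above can be illustrated with a small sketch. The feature names, the posture table, the offset values, and the averaging of per-feature estimates are assumptions for illustration; the patent only states that the center of gravity is derived from the geometric positional relationship of the matched features.

```python
import math

# Posture-specific table: offset of each image local feature from the
# vehicle's center of gravity, in vehicle coordinates (m). Hypothetical
# values standing in for the obstacle feature storage unit 16.
FEATURE_OFFSETS = {
    ("side", "front_wheel"): (-2.0, 0.0),
    ("side", "rear_wheel"): (2.0, 0.0),
}

def estimate_centroid(matches, posture):
    """Each match pairs a feature name with its measured (x, z) position
    in the host vehicle's frame (e.g. from the stereo camera device 11).
    Subtracting each feature's stored offset gives one centroid estimate
    per feature; the estimates are averaged."""
    xs, zs = [], []
    for name, (mx, mz) in matches:
        ox, oz = FEATURE_OFFSETS[(posture, name)]
        xs.append(mx - ox)
        zs.append(mz - oz)
    return (sum(xs) / len(xs), sum(zs) / len(zs))

def inter_vehicle_distance(matches, posture):
    # Distance (m) from the host vehicle's origin to the estimated
    # center of gravity of the forward vehicle.
    cx, cz = estimate_centroid(matches, posture)
    return math.hypot(cx, cz)

matches = [("front_wheel", (28.0, 4.0)), ("rear_wheel", (32.0, 4.0))]
d = inter_vehicle_distance(matches, "side")
```

Ranging to the centroid rather than to whichever feature happens to be visible is what keeps the inter-vehicle distance stable when the forward vehicle's posture changes.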
  • In each of the above embodiments, the vehicle 1 has been described by taking a mining dump truck as an example.
  • However, the forward vehicle detection systems 10, 10A, and 10B according to the present invention may also be mounted on another mining work machine, such as the grader 5.
  • They may be configured as a system that detects a vehicle 1 traveling in the mine, such as the hydraulic excavator 4, the grader 5, or a patrol car.
  • They may also be configured as a system that detects obstacles other than the vehicle 1, such as a fallen load, collapsed earth and sand, or falling rocks.
  • In each of the above embodiments, the configuration using the stereo camera device 11 as the external environment detection unit 12 has been described.
  • However, any configuration that can acquire a three-dimensional stereoscopic image may be used.
  • For example, a laser range finder, or a combination of a 2D laser range finder and a monocular camera, may be used.
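As background to the stereo camera option, depth from a rectified stereo pair follows the standard pinhole relation Z = f·B/d (focal length in pixels, baseline in meters, disparity in pixels). The patent does not give its ranging math; this sketch only illustrates how a stereo pair yields the 3D information the embodiments rely on, with example numbers chosen arbitrarily.

```python
# Standard pinhole-stereo depth relation, not taken from the patent.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth (m) of a point observed with the given pixel disparity
    in a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 1200 px focal length, 1.0 m baseline, 40 px disparity.
z = depth_from_disparity(1200.0, 1.0, 40.0)  # 30.0 m
```

The relation also shows why a wide baseline suits mining equipment: doubling B doubles the disparity at a given range, improving depth resolution for distant vehicles.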

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
PCT/JP2014/080106 2013-12-27 2014-11-13 鉱山用作業機械 (Mining work machine) WO2015098344A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-272270 2013-12-27
JP2013272270A JP2015125760A (ja) 2013-12-27 2013-12-27 鉱山用作業機械 (Mining work machine)

Publications (1)

Publication Number Publication Date
WO2015098344A1 true WO2015098344A1 (ja) 2015-07-02

Family

Family ID: 53478219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/080106 WO2015098344A1 (ja) 2013-12-27 2014-11-13 鉱山用作業機械 (Mining work machine)

Country Status (2)

Country Link
JP (1) JP2015125760A
WO (1) WO2015098344A1

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6666180B2 (ja) 2016-03-23 2020-03-13 株式会社小松製作所 モータグレーダの制御方法およびモータグレーダ (Motor grader control method and motor grader)
WO2018101247A1 (ja) * 2016-11-29 2018-06-07 マクセル株式会社 画像認識撮像装置 (Image recognition imaging device)
JP2018156408A (ja) * 2017-03-17 2018-10-04 マクセル株式会社 画像認識撮像装置 (Image recognition imaging device)
JP6901982B2 (ja) * 2018-03-05 2021-07-14 日立建機株式会社 路面状態検出装置 (Road surface condition detection device)
JP2019159380A (ja) 2018-03-07 2019-09-19 株式会社デンソー 物体検知装置、物体検知方法およびプログラム (Object detection device, object detection method, and program)
JP7074400B2 (ja) * 2018-07-09 2022-05-24 株式会社ミツトヨ 光学測定装置 (Optical measuring device)
JP7059956B2 (ja) * 2019-02-08 2022-04-26 コベルコ建機株式会社 建設機械の障害物検出装置 (Obstacle detection device for construction machinery)
JP7319420B2 (ja) * 2019-10-08 2023-08-01 楽天グループ株式会社 処理システム、無人で飛行可能な航空機、及び粉塵状態推定方法 (Processing system, aircraft capable of unmanned flight, and dust condition estimation method)
US11928973B2 (en) 2019-10-08 2024-03-12 Rakuten Group, Inc. Processing system, aerial vehicle capable of flying unmanned, and dust condition estimation method
JP7100413B1 (ja) * 2021-09-15 2022-07-13 日立建機株式会社 自律走行鉱山車両 (Autonomously traveling mining vehicle)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10143659A (ja) * 1996-11-06 1998-05-29 Komatsu Ltd 物体検出装置 (Object detection device)
WO2010134539A1 (ja) * 2009-05-19 2010-11-25 国立大学法人東京大学 特徴量生成装置、特徴量生成方法および特徴量生成プログラム、ならびにクラス判別装置、クラス判別方法およびクラス判別プログラム (Feature value generation device, method, and program, and class discrimination device, method, and program)
JP2011100174A (ja) * 2009-11-03 2011-05-19 Tokyo Institute Of Technology 車線内車両検出装置及び車線内車両検出方法 (In-lane vehicle detection device and in-lane vehicle detection method)


Also Published As

Publication number Publication date
JP2015125760A (ja) 2015-07-06

Similar Documents

Publication Publication Date Title
WO2015098344A1 (ja) 鉱山用作業機械 (Mining work machine)
US12352862B2 (en) Lidar-based trailer tracking
EP3787911B1 (en) Trailer detection and autonomous hitching
JP6091977B2 (ja) 建設機械 (Construction machine)
US20200057897A1 (en) Obstacle sensing device
JP6533619B2 (ja) センサキャリブレーションシステム (Sensor calibration system)
JP6385745B2 (ja) 鉱山用作業車両 (Mining work vehicle)
CA2987373C (en) Position estimation device and position estimation method
US9151626B1 (en) Vehicle position estimation system
JP6523050B2 (ja) 鉱山用作業機械 (Mining work machine)
CN108139755B (zh) 自己位置推定装置的异常检测装置以及车辆 (Abnormality detection device for a self-position estimation device, and vehicle)
JP2021519242A (ja) 車両とトレーラとの間の距離を検出する装置および方法 (Device and method for detecting the distance between a vehicle and a trailer)
US20180114436A1 (en) Lidar and vision vehicle sensing
WO2018061084A1 (ja) 自己位置推定方法及び自己位置推定装置 (Self-position estimation method and self-position estimation device)
WO2015163028A1 (ja) 鉱山用作業車両用の路面推定装置および路面推定システム (Road surface estimation device and road surface estimation system for a mining work vehicle)
CN112771591B (zh) 用于评价运输工具的环境中的对象对运输工具的行驶机动动作的影响的方法 (Method for evaluating the influence of an object in a vehicle's surroundings on a driving maneuver of the vehicle)
AU2021301647B2 (en) Obstacle detector and obstacle detection method
US12179751B2 (en) Remote target vehicle tracking in a region surrounding a motor vehicle
US20220314967A1 (en) Vehicular control system
US10970870B2 (en) Object detection apparatus
CN105480227A (zh) 主动驾驶技术中基于红外雷达与视频图像信息融合的方法 (Method based on fusion of infrared radar and video image information in active driving technology)
JP6204782B2 (ja) オフロードダンプトラック (Off-road dump truck)
JP7135579B2 (ja) 物体検知装置 (Object detection device)
WO2018179965A1 (ja) 障害物検知装置 (Obstacle detection device)
JP7318377B2 (ja) 物体検知装置 (Object detection device)

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14873583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14873583

Country of ref document: EP

Kind code of ref document: A1