WO2016092684A1 - Volume estimation device and work machine using the same - Google Patents
- Publication number
- WO2016092684A1 (PCT/JP2014/082911)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mesh
- blind spot
- spot area
- volume
- parallax data
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
- E02F9/262—Surveying the work-site to be treated with follow-up actions to control the work tool, e.g. controller
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/36—Component parts
- E02F3/42—Drives for dippers, buckets, dipper-arms or bucket-arms
- E02F3/43—Control of dipper or bucket position; Control of sequence of drive operations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present invention relates to a volume estimation device and a work machine using the same.
- To improve excavation efficiency in a mine, an excavator must fill a dump truck with the specified number of excavation cycles. Therefore, if the amount excavated in each cycle can be determined, the operator can adjust the amount to excavate in the next cycle.
- Patent Document 1 describes a method of calculating the load capacity in the bucket by providing a plurality of cameras on the left and right sides of the boom or on the left and right sides of the arm and photographing with a camera located almost directly above the bucket.
- In Patent Document 1, the image is taken after the bucket has moved to a position where no blind spot occurs, and the load in the bucket at the time of excavation is then calculated; estimating a blind spot area is therefore not contemplated.
- Patent Document 2 generates complementary image data for complementing a blind spot portion based on blind spot information estimated from the visible image portion of another vehicle included in a vehicle-surroundings image obtained by a camera.
- In Patent Document 3, a slit image formed when a measurement object is irradiated with slit light is photographed by a camera, the volume corresponding to each slit image is calculated, and these volumes are integrated to obtain the total volume; gaps in the obtained data can be corrected with straight lines.
- JP 2008-241300 A
- JP 2013-25528 A
- JP-A-4-301707
- In Patent Document 2, the shape of the blind spot portion of the image of the other vehicle is complemented in order to display the image of the other vehicle more accurately.
- Patent Document 2 does not measure the volume of the blind spot area.
- In Patent Document 3, it is difficult to accurately measure the volume of the entire measurement object, including the blind spot area, because the data are corrected with straight lines between the points before and after a gap.
- An object of the present invention is to improve the accuracy of estimating the volume of an object in a container from a captured image when a blind spot area exists in the captured image of the object.
- The volume estimation device comprises a blind spot estimation unit 310 that estimates a blind spot area of an object in the bucket 13, a blind spot area shape estimation unit 320 that estimates the shape of the object in the blind spot area, and a volume estimation unit 330 that estimates the volume of the object in the blind spot area.
- The blind spot estimation unit 310 estimates the blind spot area based on mesh parallax data obtained from images of the object in the bucket 13 captured by a plurality of cameras; the blind spot area shape estimation unit 320 estimates the shape of the object in the blind spot area using the mesh parallax data; and the volume estimation unit 330 estimates the volume of the object in the blind spot area based on the shape estimated by the blind spot area shape estimation unit 320 and the shape of the bottom of the bucket 13.
- FIG. 1: Configuration diagram of the volume estimation device mounted on a hydraulic excavator
- FIG. 2: Method of creating parallax data with the stereo camera device
- FIG. 3: Image taken by the stereo camera device when the excavated material in the bucket is flat
- FIG. 4: Overview of the method for estimating the volume of the excavated material
- FIG. 5: Flowchart for estimating the volume of the excavated material
- FIG. 6: Example in which a blind spot occurs in the excavated material in the bucket
- FIG. 7: Example of mesh parallax data when a blind spot area occurs in the excavated material in the bucket
- FIG. 8: Estimated range when the volume of the excavated material is estimated without considering the blind spot area
- FIG. 9: Flowchart of the volume estimation method considering the blind spot area
- FIG. 10: Process overview of the blind spot area estimation process
- FIG. 11: Flowchart of the blind spot area determination process and the blind spot area estimation process
- FIG. 12: Flowchart for displaying the volume reliability
- FIG. 13: Flowchart for displaying the volume estimation result with the highest reliability
- FIG. 14: Configuration diagram of a volume estimation device including a protrusion determination unit, mounted on a hydraulic excavator
- FIG. 15: Flowchart of the volume estimation method considering the blind spot area and the protrusion area
- FIG. 16: Example in which a protrusion area occurs on the excavated material in the bucket
- FIG. 17: Flowchart of the protrusion area determination process
- FIG. 1 shows the configuration of a hydraulic excavator 10 equipped with a volume estimation device 30.
- the excavator 10 has a boom 11, an arm 12, a bucket 13, and an upper swing body 15 for excavation.
- the upper swing body 15 has a cab 14.
- a stereo camera device 210 is installed to recognize the outside world.
- the stereo camera device 210 has two cameras, a right camera 212 and a left camera 211, and can measure the distance to the subject using these parallaxes.
- The stereo camera device 210 only needs to include a plurality of cameras, i.e., two or more; for example, the number of cameras may be three or four.
- the volume estimation device 30 is a device that estimates the volume of the excavated material in the bucket 13 taken by the stereo camera device 210.
- The volume estimation device 30 comprises a blind spot estimation unit 310 that estimates a blind spot area of the excavated material in the photographed container, a blind spot area shape estimation unit 320 that estimates the shape of the excavated material in the blind spot area, a volume estimation unit 330 that estimates the volume of the excavated material in the blind spot area, and a reliability calculation unit 360 that calculates the reliability of the estimated volume.
- the volume estimation result and the reliability are displayed on the display unit 40.
- the blind spot estimation unit 310 estimates the blind spot area based on the mesh parallax data obtained from the captured image of the excavated object in the bucket 13 captured by the stereo camera device 210.
- parallax data obtained from a captured image taken by the stereo camera device 210 is input, and the bucket area setting unit 3100 extracts the area of the bucket 13 from the parallax data.
- the bucket region is divided into meshes, and mesh parallax data of each mesh is obtained from the parallax data included in each mesh.
- the parallax data analysis unit 3110 determines the blind spot area.
- the blind spot area shape estimation unit 320 estimates the shape of the blind spot area.
- Referring to FIG. 2, an outline of the operation by which the stereo camera device 210 generates parallax data will be described.
- In a right image 340 obtained by photographing the bucket 13 with the right camera 212 and a left image 341 obtained with the left camera 211, a part 344 of the bucket 13 appears at the position of point 342 in the right image 340 and at the position of point 343 in the left image 341. As a result, a parallax d arises between the points 342 and 343.
- the parallax d has a large value when the excavated object in the bucket 13 is close to the stereo camera device 210, and has a small value when the object is far.
- the parallax obtained in this way is obtained for the entire image.
- the distance from the excavated material in the bucket 13 to the stereo camera device 210 can be measured by the principle of triangulation.
- The distance Z1 is obtained from the parallax d by the following equation.
- Z1 = (f × P) / d
- f is the focal length of the right and left cameras
- P is the distance between the right camera 212 and the left camera 211.
- To convert the parallax data into three dimensions, the three-dimensional position (X1, Y1) of the point for which Z1 was obtained is expressed by the following equations.
- X1 = (Z1 × xr) / f
- Y1 = (Z1 × yr) / f
- xr is the x coordinate on the right image 340
- yr is the y coordinate on the right image 340.
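The triangulation above can be sketched in Python as follows. This is a minimal illustration; the function name and the plain-float interface are not from the patent.

```python
def parallax_to_3d(d, xr, yr, f, P):
    """Convert a parallax value d (pixels) measured at image point (xr, yr)
    into a 3D position (X1, Y1, Z1) relative to the stereo camera.

    f : focal length of the right and left cameras (pixels)
    P : baseline distance between the right camera and the left camera
    """
    if d <= 0:
        raise ValueError("parallax must be positive for a finite distance")
    Z1 = (f * P) / d      # distance from the camera, by triangulation
    X1 = (Z1 * xr) / f    # lateral position
    Y1 = (Z1 * yr) / f    # vertical position
    return X1, Y1, Z1
```

For example, with a focal length of 1000 px and a baseline of 0.3 m, a parallax of 10 px corresponds to a distance of 30 m, matching the rule that closer subjects produce larger parallax.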
- FIG. 3 shows an image captured by the stereo camera device 210 when the surface of the excavated material in the bucket 13 is flat, as an example of a case in which no blind spot of the excavated material occurs even though the line of sight 220 is oblique.
- FIG. 3A is a view as seen from the front of the camera
- FIG. 3B is a cross-sectional view of the bucket 13 parallel to the side surface of the arm 12. Since the stereo camera device 210 is installed in the cab 14, the bucket 13 is photographed from diagonally above. In FIG. 3, no blind spot of the excavated material occurs even though the line of sight 220 is oblique.
- FIG. 4 shows an outline of the method for calculating the volume of excavated material.
- FIG. 4A is a diagram showing a state in which the bucket 13 obtained from the photographed image is divided into two-dimensional mesh groups 230. This figure is an image obtained by photographing the bucket 13 from obliquely above the bucket 13 with the stereo camera device 210, and is not an image obtained by photographing the bucket 13 from directly above the bucket 13.
- FIG. 4B is a cross-sectional view of the bucket 13 parallel to the side surface of the arm 12. As shown in FIG. 4A, the bucket opening surface is the xy plane, the right direction of the image taken by the camera is the positive x-axis direction, and the upward direction is the positive y-axis direction.
- the mesh parallax data of each mesh of the mesh group 230 is obtained from the parallax data included in each mesh.
- The method for obtaining the mesh parallax data is not limited to one; for example, the average or median of the plurality of parallax data in a mesh may be used, or the average or median may be taken after thinning out the parallax data. Furthermore, by setting the mesh finely, a mesh containing only one parallax datum can be created; in that case the mesh parallax data has the same value as the parallax data.
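One of the options above (the per-mesh median) might be sketched as follows; the function name and the plain list-of-rows representation of the parallax map are illustrative, not from the patent.

```python
from statistics import median

def mesh_parallax(parallax_rows, mesh_h, mesh_w):
    """Divide a 2D grid of per-pixel parallax values (a list of rows) into
    meshes of mesh_h x mesh_w pixels and return the median parallax of
    each mesh. The median is one of the options the text mentions; the
    average is another.
    """
    rows, cols = len(parallax_rows), len(parallax_rows[0])
    result = []
    for top in range(0, rows, mesh_h):
        out_row = []
        for left in range(0, cols, mesh_w):
            # collect every parallax value falling inside this mesh
            cell = [parallax_rows[r][c]
                    for r in range(top, min(top + mesh_h, rows))
                    for c in range(left, min(left + mesh_w, cols))]
            out_row.append(median(cell))
        result.append(out_row)
    return result
```

With a 1×1 mesh this degenerates to the per-pixel parallax data, as the text notes for finely set meshes.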
- the shape of the bucket 13 needs to be learned in advance.
- For example, the empty bucket 13 is photographed by the stereo camera device 210, the photographed image is divided into meshes, and the length from the bottom of the bucket 13 to the bucket opening surface is calculated for each mesh. Alternatively, the shape of the bucket may be input as CAD data. The length from the bucket opening surface of the bucket 13 to the surface of the excavated material is then obtained for each mesh with the excavated material loaded, together with the length from the bottom of the bucket 13 to the bucket opening surface when the bucket 13 is empty.
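The per-mesh volume computation implied above — material height as the difference between the learned empty-bucket depth and the measured loaded depth, summed over the mesh area — might be sketched as follows. The names are hypothetical, and clipping negative differences to zero is an added guard, not stated in the patent.

```python
def bucket_volume(empty_depth, loaded_depth, mesh_area):
    """Estimate the volume of material in the bucket.

    empty_depth[i][j]  : length from the bucket opening surface down to
                         the bucket bottom in each mesh (learned in
                         advance from an empty-bucket image or CAD data)
    loaded_depth[i][j] : length from the bucket opening surface down to
                         the material surface in the same mesh, loaded
    mesh_area          : area of one mesh on the opening surface

    The material height in each mesh is the difference of the two
    lengths; the volume is the sum of height * mesh_area over all meshes.
    """
    volume = 0.0
    for e_row, l_row in zip(empty_depth, loaded_depth):
        for e, l in zip(e_row, l_row):
            volume += max(e - l, 0.0) * mesh_area  # clip negatives (guard)
    return volume
```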
- The blind spot area shape estimation unit 320 estimates the shape of the excavated material in the blind spot area based on the mesh parallax data, and the volume estimation unit 330 estimates the volume of the excavated material in the blind spot area based on the shape estimated by the blind spot area shape estimation unit 320 and the shape of the bottom of the bucket 13.
- Here, the mesh one above a given mesh is the mesh adjacent to it in the positive y-axis direction, and the lowermost mesh is the mesh at the bottom of the captured image.
- FIG. 5 shows a flowchart for estimating the volume of the excavated material when there is no blind spot area described in FIG.
- the bucket 13 is photographed by the stereo camera device 210, and parallax data is created from the photographed image.
- the parallax data is created by calculating a coordinate shift between the left image 341 and the right image 340 of the subject. By obtaining this shift in all meshes, parallax data of an image photographed by the stereo camera device 210 can be obtained.
- a bucket area is extracted by the bucket area setting unit 3100.
- The objects appearing in the image can be considered to be the bucket, the ground, and earth and sand.
- To extract the bucket area, the fact that the bucket area is located far closer to the stereo camera device 210 than the ground or the earth and sand is used. That is, since the parallax data become extremely large only in the bucket area, the bucket area can be extracted using the parallax data.
- the parallax data analysis unit 3110 performs three-dimensional conversion to match the extracted parallax data of the bucket region to the actual size.
- the parallax data analysis unit 3110 divides the three-dimensionally converted bucket region into a two-dimensional mesh. The smaller the mesh size, the better the accuracy of volume estimation.
- FIG. 6 shows an example when the excavated object in the bucket 13 has a blind spot area.
- 6A is a view of the bucket 13 viewed from the front of the camera
- FIG. 6B is a cross-sectional view of the bucket 13 parallel to the side surface of the arm 12.
- The excavated material forms a mountain, and the far side of the mountain as viewed from the stereo camera device 210 is a blind spot area 221.
- FIG. 7 shows an example of mesh parallax data when there is a blind spot area of FIG.
- From the mesh 243 to the mesh 241, the mesh parallax data change by a difference of only about 1 or 2.
- From the mesh 241 to the mesh 240, however, the mesh parallax data decrease by 9. This is because the distance 220b in FIG. 6 is suddenly larger than the distance 220a. In this way, it is determined that a blind spot area exists between meshes whose mesh parallax data decrease suddenly.
- Fig. 8 shows the estimated range when the volume of the excavated material is estimated without considering the blind spot area.
- If the blind spot area 221 formed between the mesh 241 and the mesh 240 in FIG. 7 is ignored, a volume smaller than the actual volume is estimated. Therefore, a volume estimation method that takes the blind spot area into account is necessary.
- The distance 220a to the uppermost region of the mountain and the distance 220c to the near side of the region visible from the stereo camera device 210 differ little, whereas the distance 220a to the uppermost region of the mountain and the distance 220b to the region beyond the blind spot differ greatly. That is, the degree of change in the mesh parallax data differs greatly between the two.
- FIG. 9 shows a flowchart of the volume estimation method considering the blind spot area.
- a mesh division S190, a blind spot area determination process S1000, a blind spot area estimation process S1100, a volume reliability display S1200, and a reliability calculation process S1400 are added to the flowchart of FIG.
- the mesh division S190 divides the bucket region extracted in S120 into meshes. Since mesh division S140 divides the three-dimensionally converted bucket region into meshes, the size of the mesh divided by mesh division S190 and the size of the mesh divided by mesh division S140 may be different.
- the parallax data analysis unit 3110 performs the process shown in FIG. 7, and the blind spot area is determined using the mesh parallax data.
- It is determined that a blind spot area exists between the meshes 241 and 240, whose mesh parallax data differ greatly between adjacent meshes in the y-axis direction.
- the blind spot area shape estimation unit 320 performs the process.
- the reliability calculation process S1400 is performed by the reliability calculation unit 360.
- FIG. 10 shows a process outline of the blind spot area estimation process S1100, and is a cross-sectional view of the bucket 13 parallel to the side surface of the arm 12.
- In FIG. 10, the mesh parallax data of the mesh bar 1243 is the value of the mesh 243, that of the mesh bar 1242 is the value of the mesh 242, that of the mesh bar 1241 is the value of the mesh 241, that of the mesh bar 1240 is the value of the mesh 240, that of the mesh bar 1260 is the value of the mesh 260, and that of the mesh bar 1261 is the value of the mesh 261.
- Since the mesh bars 1260 and 1261 are in the blind spot, the mesh parallax data of the corresponding meshes cannot be obtained, and the volume cannot be estimated as it is.
- Therefore, it is assumed that the shape of the mountain is line symmetric: the shape on the side visible from the stereo camera device 210 and the shape on the far side of the mesh bar containing the peak of the mountain are regarded as the same. Accordingly, the mesh bar 1260 in FIG. 10 is estimated to have the same height above the bucket opening surface as the mesh bar 1242, and the mesh bar 1261 the same height as the mesh bar 1243, and the volume is estimated on this basis.
- In this way, the blind spot area shape estimation unit 320 regards the shape of the excavated material as line symmetric, in accordance with the shape of the excavated material photographed by the plurality of cameras, and estimates the shape of the excavated material in the blind spot area. Since the excavated material often forms a mountain shape, estimating the shape on the assumption of line symmetry gives higher accuracy than approximating the blind spot with straight lines or curves.
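A minimal sketch of this line-symmetry assumption: visible mesh heights along one column are mirrored about the peak to fill the blind meshes. The names are illustrative, and the fallback for columns shorter than the blind run is an added assumption not stated in the patent.

```python
def fill_blind_heights(heights, peak, n_blind):
    """Estimate heights (above the bucket opening surface) of blind
    meshes behind a mountain-shaped pile, assuming the pile is line
    symmetric about the mesh containing the peak.

    heights : visible mesh heights along one column, ordered from the
              near side toward the far side, ending at the peak
    peak    : index of the peak mesh within `heights`
    n_blind : number of blind meshes behind the peak

    The k-th blind mesh behind the peak is given the height of the mesh
    k steps before the peak on the visible side.
    """
    filled = []
    for k in range(1, n_blind + 1):
        mirror = peak - k
        # if the mirror index runs off the column, reuse the nearest
        # visible mesh (an added guard, not from the patent)
        filled.append(heights[max(mirror, 0)])
    return filled
```

With visible heights rising to a peak, the blind side is estimated to descend symmetrically, matching how mesh bar 1260 takes the height of 1242 and mesh bar 1261 that of 1243.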
- FIG. 11 shows a flowchart of the blind spot area determination process S1000 and the blind spot area estimation process S1100.
- Let L be the mesh parallax data of the lowermost mesh in one column of the mesh group 230 parallel to the y-axis, and let M be the mesh parallax data of the mesh one above, in the positive y-axis direction, the mesh whose mesh parallax data is L.
- a mesh whose mesh parallax data is L is defined as a blind spot area determination mesh.
- The blind spot estimation unit 310 estimates the size of the blind spot area based on the magnitude of N, the difference between the two mesh parallax data (N = L - M).
- the width of the blind spot area is the width between the mesh bar 1241 and the mesh bar 1240.
- the mesh parallax data of the lowermost mesh in one column parallel to the y-axis in the mesh group 230 is input as L to obtain a blind spot area determination mesh.
- mesh parallax data of the mesh 243 is input as L.
- the mesh parallax data of the mesh one level above the blind spot area determination mesh whose mesh parallax data is L is input as M.
- mesh parallax data of the mesh 242 is input as M.
- Next, a blind spot area determination value G1 for determining the presence or absence of a blind spot area is calculated.
- For example, the blind spot area determination value G1 is set to 10% of the mesh parallax data L of the blind spot area determination mesh.
- If N is equal to or greater than the blind spot area determination value G1, it is determined that a blind spot area exists between the blind spot area determination mesh, whose mesh parallax data is L, and the mesh one above it, whose mesh parallax data is M. The value of N for the column is stored.
- In this way, using the mesh parallax data, the blind spot estimation unit 310 determines that a blind spot area exists between the blind spot area determination mesh, which is a mesh included in the mesh group 230, and the mesh one above it when the difference between their mesh parallax data is equal to or greater than the blind spot area determination value G1. The blind spot area determination value G1 is determined from the mesh parallax data of the blind spot area determination mesh.
- By calculating and setting the blind spot area determination value G1 for each mesh, a G1 more appropriate to the shape of the excavated material can be used than when G1 is held constant for all meshes, and the blind spot area can be determined more correctly.
- By determining the blind spot area from the magnitude of the difference between the mesh parallax data relative to the blind spot area determination value G1 in this way, the influence of minute irregularities of the earth and sand and of noise can be eliminated.
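The per-column test of FIG. 11 — compare the decrease N = L - M against G1 = 10% of L as the column is scanned upward — can be sketched as follows (names and the list representation are illustrative):

```python
def find_blind_gaps(column, ratio=0.10):
    """Scan one column of mesh parallax data from the lowermost mesh
    upward and report where a blind spot area is judged to exist.

    column : mesh parallax data ordered from the lowermost mesh upward
    ratio  : the blind spot area determination value G1 is this fraction
             of the lower mesh's parallax data L (10% in the example)

    Returns (index, N) pairs: a blind spot lies between mesh `index`
    and the mesh one above it, where N = L - M is the sudden decrease.
    """
    gaps = []
    for i in range(len(column) - 1):
        L, M = column[i], column[i + 1]
        N = L - M
        G1 = ratio * L          # per-mesh threshold, recomputed each step
        if N >= G1 and N > 0:   # N > 0 guards the degenerate L == 0 case
            gaps.append((i, N))
    return gaps
```

For the FIG. 7 example, differences of 1 or 2 between meshes 243 and 241 stay below the threshold, while the drop of 9 between meshes 241 and 240 exceeds 10% of the lower value and is flagged.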
- FIG. 12 is a flowchart for displaying the reliability of the volume of the excavated object on the display unit 40.
- The N having the largest value is selected from each vertical column calculated in FIG. 11, and then the N having the largest value among the columns is selected.
- This N is converted into actual distance data, which is denoted D.
- the reliability calculation unit 360 determines the reliability of the volume of the excavated object according to the size of the blind spot area.
- For example, threshold 1 is set to 20% and threshold 2 to 50%.
- these threshold values are not limited to this, and can be changed depending on the application situation. It is also possible to increase or decrease the threshold level.
- the reliability is not limited to the display using symbols such as A and B, and can be displayed with numbers.
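A sketch of the reliability grading with the example thresholds. The patent mentions symbols "such as A and B"; the three-grade A/B/C mapping below is an assumption for illustration.

```python
def reliability_grade(blind_ratio, threshold1=0.20, threshold2=0.50):
    """Map the relative size of the blind spot area to a reliability
    grade. blind_ratio is the blind spot size (e.g. the distance D)
    normalized by the bucket size; the 20% / 50% thresholds follow the
    example in the text and can be changed for the application.
    """
    if blind_ratio < threshold1:
        return "A"   # small blind spot: high reliability
    if blind_ratio < threshold2:
        return "B"   # moderate blind spot
    return "C"       # large blind spot: low reliability
```

As the text notes, the number of threshold levels can be increased or decreased, and a numeric score could be returned instead of a symbol.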
- the blind spot area can be estimated even if a blind spot area is generated, so that the volume can be estimated with high accuracy.
- The object whose blind spot area volume is estimated when photographed by the cameras is not limited to the excavated material in a bucket; it may be any object in a container.
- Here the volume of the excavated material in the excavator bucket is targeted, but the volume of a load carried by, for example, a dump truck may also be targeted.
- FIG. 13 shows a flowchart for displaying the volume estimation result with the highest reliability.
- A plurality of photographed images can be obtained by photographing a plurality of times with the stereo camera device 210, or by shooting the excavated material as a moving image with the stereo camera device 210 and extracting a plurality of frames. The photographed image with the highest reliability is then selected from the plurality of images.
- The reliability E may be obtained by the blind spot area shape estimation unit 320, the volume estimation unit 330, the reliability calculation unit 360, and the like.
- the image of the bucket 13 is extracted as one frame of the moving image scene.
- the estimated volume value of the one frame is stored as the estimated value F.
- In S240, it is determined whether or not photographing has been completed; if not, the process is repeated from the beginning.
- To determine whether photographing has been completed, a signal indicating that the upper swing body 15 has started to rotate may be input from the control device of the excavator 10, or the operator may provide an input by a switch operation.
- In this way, the captured image with the smallest blind spot area is selected from the plurality of images captured by the plurality of cameras, and the volume of the excavated material in the bucket is estimated based on the selected image. The volume of the excavated material can therefore be estimated with high accuracy, and the estimation result with the highest reliability can be displayed on the display unit 40.
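Selecting the most reliable estimate among several shots reduces, in this sketch, to keeping the frame whose blind spot area is smallest. The pair representation (estimate, blind spot size) is hypothetical, not a structure defined by the patent.

```python
def best_frame(frames):
    """From (volume_estimate, blind_spot_size) pairs collected over
    several shots or video frames, return the volume estimate whose
    blind spot area is smallest, i.e. the most reliable one.
    """
    if not frames:
        raise ValueError("no frames captured")
    volume, _ = min(frames, key=lambda f: f[1])
    return volume
```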
- FIG. 14 shows the configuration of a hydraulic excavator equipped with a protrusion determining unit 350 that determines the protrusion area of the excavated object in the bucket 13.
- the configuration other than the protrusion determining unit 350 is the same as that in FIG.
- FIG. 15 shows a flowchart of a volume estimation method that takes into account the protrusion area and the blind spot area.
- the protrusion area determination process S1500 and the protrusion area estimation process S1300 are added to the flowchart of FIG.
- FIG. 16A is a view of the excavated material in the bucket 13 as seen from the front of the camera.
- There is a protrusion 50, such as a stone, in the excavated material.
- FIG. 16B is a cross-sectional view of the bucket 13 parallel to the side surface of the arm 12, and the protrusion 50 is present. If the blind spot area is estimated using the method of FIG. 10 in this situation, the estimation assumes that the protrusion 50 is also present in the blind spot area. If such a protrusion 50 is rare, it can instead be considered that the protrusion 50 does not exist in the blind spot area.
- At the position where the protrusion 50 exists, the distance from the stereo camera device 210 to the surface of the excavated material is shorter by the length of the protrusion 50, so the parallax data become larger. Therefore, the mesh parallax data of the mesh where the protrusion 50 exists are larger than the mesh parallax data of the two meshes before and after it in the y-axis direction.
- The two meshes before and after, in the y-axis direction, the mesh in which the protrusion 50 exists are the mesh adjacent to it in the positive y-axis direction and the mesh adjacent to it in the negative y-axis direction.
- FIG. 16C shows the projection area estimation process S1300 when the projection 50 is present.
- the projection area estimation process S1300 is performed by the blind spot area shape estimation unit 320.
- the mesh bar 271 and the mesh bar 272 are the blind spot area.
- In the method of FIG. 10, the mesh bar 271 would be given the same height above the bucket opening surface as the mesh bar 273, and the mesh bar 272 the same height as the mesh bar 274, so a volume larger than the actual volume would be estimated. Therefore, the mesh bar 272 is instead given the same height above the bucket opening surface as the mesh bar 271. This prevents the volume estimation error caused by the protrusion 50.
- the volume can be accurately estimated by changing the method for estimating the shape of the blind spot region in accordance with the shape of the excavated object in the region visible from the stereo camera device 210.
- FIG. 17 shows a flowchart of the protrusion area determination process S1500.
- the projection area determination processing S1500 is performed by the projection determination unit 350.
- Let I be the mesh parallax data of the lowermost mesh in one column of the mesh group 230 parallel to the y-axis, J be the mesh parallax data of the mesh one above the mesh whose mesh parallax data is I, and K be the mesh parallax data of the mesh one above the mesh whose mesh parallax data is J.
- The mesh whose mesh parallax data is J is defined as the protrusion determination mesh.
- Let H1 be the result of subtracting the mesh parallax data I from the mesh parallax data J, and H2 the result of subtracting the mesh parallax data K from the mesh parallax data J.
- J is input as mesh parallax data of the mesh one mesh above the mesh whose parallax data is I.
- a mesh whose mesh parallax data is J is defined as a protrusion determination mesh.
- K is input as the mesh parallax data of the mesh one above the protrusion determination mesh whose mesh parallax data is J.
- Next, a protrusion area determination value G2 for determining the presence or absence of a protrusion area is calculated.
- For example, the protrusion area determination value G2 is set to 3% of the mesh parallax data J of the protrusion determination mesh.
- The protrusion area determination value G2 is thus determined from the mesh parallax data of the protrusion determination mesh.
- H2 is greater than projections region determination value G 2 is, it determines that the projection region is present, H2 is the case of less than projection area determination value G 2 is, projection region is not present.
- the protrusion determination mesh whose mesh parallax data is J is determined as the protrusion area.
- In this way, the protrusion determination unit 350 uses the mesh parallax data and, when the difference between the mesh parallax data of the protrusion determination mesh (a mesh included in the mesh group 230) and the mesh parallax data of the meshes before and after it is equal to or greater than the protrusion area determination value, determines the protrusion determination mesh to be a protrusion area.
- Compared with using a protrusion area determination value G2 that is constant for all meshes, a G2 appropriate to the shape of the protrusion 50 can be used, and the protrusion area can be determined more accurately.
Abstract
Description
Here, f is the focal length of the right and left cameras, and P is the distance between the right camera 212 and the left camera 211. To convert the parallax data into three dimensions, the three-dimensional positions X1 and Y1 of the point for which Z1 was obtained are expressed by the following equations:
X1 = (Z1 × xr) / f
Y1 = (Z1 × yr) / f
Here, xr is the x coordinate on the right image 340, and yr is the y coordinate on the right image 340. In this way, the position (X1, Y1, Z1) of the subject in three-dimensional space can be obtained as a distance from the stereo camera device 210 using the images captured by the stereo camera device 210.
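The triangulation above can be written as a short function; the function name and the sample values are illustrative, not from the patent:

```python
def triangulate(d, xr, yr, f, P):
    """Convert a disparity d (pixels) at right-image coordinates
    (xr, yr) into a 3D point (X1, Y1, Z1).
    f: focal length of the cameras (pixels), P: camera baseline."""
    Z1 = f * P / d      # distance from the stereo camera device
    X1 = Z1 * xr / f    # horizontal position
    Y1 = Z1 * yr / f    # vertical position
    return X1, Y1, Z1
```

For example, with f = 1000 px, baseline P = 0.3 m, and disparity d = 50 px, a pixel at (100, 200) maps to (0.6, 1.2, 6.0) m.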
First, the bucket 13 is photographed by the stereo camera device 210, and parallax data is created from the captured images. As shown in FIG. 2, the parallax data is created by finding the shift between the coordinates of the subject in the left image 341 and the right image 340. By finding this shift for every mesh, the parallax data of the images captured by the stereo camera device 210 is obtained.
Next, the bucket area setting unit 3100 extracts the bucket area. The subjects photographed by the stereo camera device 210 during excavation are expected to be the bucket, the ground, and earth and sand. To extract the bucket area from among these subjects, the fact that the bucket area lies far closer to the stereo camera device 210 than the ground or the earth and sand is exploited: only the bucket area has extremely large parallax data, so the bucket area can be extracted using the parallax data.
Next, the parallax data analysis unit 3110 three-dimensionally converts the parallax data of the extracted bucket area to match it to the actual size.
Next, the parallax data analysis unit 3110 divides the three-dimensionally converted bucket area into a two-dimensional mesh. The smaller the mesh size, the better the accuracy of the volume estimation.
Next, for each two-dimensional mesh, the length from the bottom of the bucket 13 to the surface of the excavated material is obtained, and the volume is calculated.
Next, the volumes of all meshes are summed to calculate the volume of the excavated material in the bucket 13.
The volume calculated in this way is displayed on the display unit 40.
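The summation in the last steps reduces to a few lines; the mesh heights and cell size below are illustrative values, not from the patent:

```python
def bucket_volume(heights, cell_area):
    """Sum the per-mesh volumes: each 2D mesh cell contributes the
    length from the bucket bottom to the material surface times the
    cell's area."""
    return sum(h * cell_area for row in heights for h in row)

# 2 x 2 mesh with 0.1 m x 0.1 m cells
heights = [[0.5, 0.4],
           [0.3, 0.2]]
volume = bucket_volume(heights, cell_area=0.01)  # 0.014 m^3
```

A finer mesh (smaller `cell_area`, more cells) improves the accuracy of the estimate, as noted above.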
First, the mesh parallax data of the lowermost mesh in one column parallel to the y axis in the mesh group 230 is input as L, and that mesh is taken as the blind spot area determination mesh. In the example of FIG. 7, the mesh parallax data of the mesh 243 is input as L.
Next, the mesh parallax data of the mesh one above the blind spot area determination mesh whose mesh parallax data is L is input as M. In the example of FIG. 7, the mesh parallax data of the mesh 242 is input as M.
Next, the difference between the mesh parallax data L and the mesh parallax data M is obtained, and the value of the difference is set to N.
Next, the blind spot area determination value G1 for determining the presence or absence of a blind spot area is calculated. As one example, the blind spot area determination value G1 is set to 10% of the mesh parallax data L of the blind spot area determination mesh.
Next, to determine the presence or absence of a blind spot area, it is determined whether N is equal to or greater than the blind spot area determination value G1.
If N is equal to or greater than the blind spot area determination value G1, the space between the blind spot determination mesh whose mesh parallax data is L and the mesh one above it whose mesh parallax data is M is determined to be a blind spot area, and its shape is estimated as in FIG. 10.
Next, N for that column is stored so that the size of the blind spot area can be determined later.
If the column is finished, the process moves to the next column.
If the column is not finished, the mesh whose mesh parallax data is currently M is held as the blind spot area determination mesh whose mesh parallax data is L. The process then returns to S1012, performs the same processing, and stores the new N for that column in S1024. In this way, every N equal to or greater than the blind spot area determination value G1 is stored.
If all columns are finished, the process ends; if columns remain, the adjacent column is processed.
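The per-column scan above (S1012 through S1024) might look like this sketch, assuming N is taken as the drop L − M and G1 as 10% of L:

```python
def blind_spot_gaps(column):
    """Scan one column of mesh parallax data from the bottom mesh up.
    Each pair (L, M) of adjacent meshes whose drop N = L - M reaches
    the blind spot area determination value G1 = 0.10 * L marks a
    blind spot between the two meshes; every such N is stored."""
    gaps = []
    for L, M in zip(column, column[1:]):
        N = L - M
        if N >= 0.10 * L:   # blind spot area determination value G1
            gaps.append(N)
    return gaps
```

For a column of parallax values `[100, 80, 79, 50]` the stored drops are `[20, 29]`.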
First, the largest N is selected from within each single vertical column calculated as in FIG. 11. Then, the largest N is selected from among all the columns.
Next, N is converted into actual distance data, which is set to D.
Next, the ratio of D to the length Q of the bucket 13 in the Y-axis direction is obtained and set to E. E is the proportion of the estimated blind spot area relative to the size of the bucket 13.
Next, it is determined whether E is equal to or less than threshold 1.
If E is equal to or less than threshold 1, the blind spot area is small, which means the estimated blind spot region is small, so the reliability can be judged to be high. Therefore, if E is equal to or less than threshold 1, "A", meaning that the reliability of the volume of the excavated material is high, is displayed on the display unit 40. In this way, the reliability calculation unit 360 determines the reliability of the volume of the excavated material according to the size of the blind spot area.
If S1208 is No, it is determined whether E is equal to or less than threshold 2.
If S1212 is Yes, "B" is displayed on the display unit 40 as an intermediate reliability.
If S1212 is No, it is determined that the volume could not be estimated accurately, and "C" is displayed on the display unit 40 as the reliability.
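The three-way grading can be sketched as follows; the threshold values are placeholders, since the description does not fix threshold 1 or threshold 2:

```python
def reliability_grade(E, threshold1=0.10, threshold2=0.30):
    """Map the blind-spot ratio E (= D / Q) to the displayed grade:
    A = high reliability, B = intermediate reliability,
    C = the volume could not be estimated accurately."""
    if E <= threshold1:
        return "A"
    if E <= threshold2:
        return "B"
    return "C"
```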
First, an image of the bucket 13 is extracted as one frame of the video scene.
Next, the volume and E are estimated by the method of FIG. 12.
Next, it is determined whether this E is the smallest value among the scenes captured so far.
If it is the smallest value, the volume estimate of that frame is saved as the estimate F.
Next, it is determined whether shooting has ended; if it has not, the process is repeated from the beginning. Ways to determine whether shooting has ended include inputting, from the control device of the excavator 10, a signal indicating that the upper swing structure 15 has started to rotate, or having the operator indicate it by operating a switch.
When shooting has ended, the saved estimate with the highest reliability, together with that reliability, is displayed on the display unit 40.
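Selecting the most reliable frame of the video scene reduces to tracking the minimum E; a minimal sketch, with illustrative data:

```python
def best_estimate(frames):
    """frames: iterable of (volume, E) pairs, one per video frame.
    Keep the volume estimate F from the frame whose blind-spot ratio
    E is smallest, i.e. the frame with the highest reliability."""
    F, best_E = None, None
    for volume, E in frames:
        if best_E is None or E < best_E:
            F, best_E = volume, E
    return F, best_E

# three frames of one digging scene
F, E = best_estimate([(1.25, 0.30), (1.10, 0.08), (1.32, 0.20)])
# F is taken from the frame with the smallest E
```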
First, the mesh parallax data of the lowermost mesh in one column parallel to the y axis in the mesh group 230 is input as I.
Next, the mesh parallax data of the mesh one above the mesh whose mesh parallax data is I is input as J. The mesh whose mesh parallax data is J is taken as the protrusion determination mesh.
Next, the mesh parallax data of the mesh one above the protrusion determination mesh whose mesh parallax data is J is input as K.
Next, the result of subtracting the mesh parallax data I from the mesh parallax data J is set to H1, and the result of subtracting the mesh parallax data K from the mesh parallax data J is set to H2.
Next, the protrusion area determination value G2 for determining the presence or absence of a protrusion area is calculated. As one example, the protrusion area determination value G2 is set to 3% of the mesh parallax data J of the protrusion determination mesh. In this way, the protrusion area determination value G2 is determined by the mesh parallax data of the protrusion determination mesh.
If H1 is equal to or greater than the protrusion area determination value G2, a protrusion area may exist; if H1 is less than G2, it is determined that no protrusion area exists.
Furthermore, if H2 is equal to or greater than the protrusion area determination value G2, a protrusion area exists; if H2 is less than G2, it is determined that no protrusion area exists.
Then, the protrusion determination mesh whose mesh parallax data is J is determined to be a protrusion area.
If the column is finished, the process moves to the next column.
If the column is not finished, the protrusion determination mesh whose mesh parallax data is currently J is held as the mesh whose mesh parallax data is I.
If all columns are finished, the process ends; if columns remain, the remaining columns are processed.
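The column scan for protrusions above can be sketched as one loop over consecutive triples (I, J, K); the sketch assumes each column is ordered bottom to top:

```python
def protrusion_meshes(column):
    """Return the indices of meshes judged to be protrusion areas.
    The mesh with parallax J is a protrusion when both
    H1 = J - I and H2 = J - K reach the protrusion area
    determination value G2 = 0.03 * J."""
    hits = []
    for idx in range(1, len(column) - 1):
        I, J, K = column[idx - 1], column[idx], column[idx + 1]
        G2 = 0.03 * J   # 3% of the protrusion determination mesh's parallax
        if (J - I) >= G2 and (J - K) >= G2:
            hits.append(idx)
    return hits
```

A column `[100, 110, 100]` flags index 1 (H1 = H2 = 10 against G2 = 3.3), while `[100, 101, 100]` flags nothing.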
Claims (8)
- A volume estimation device comprising: a blind spot estimation unit that estimates a blind spot area of an object in a container;
a blind spot area shape estimation unit that estimates the shape of the object in the blind spot area; and
a volume estimation unit that estimates the volume of the object in the blind spot area, wherein
the blind spot estimation unit estimates the blind spot area from mesh parallax data obtained from captured images, taken by a plurality of cameras, of the object in the container,
the blind spot area shape estimation unit estimates the shape of the object in the blind spot area from the mesh parallax data, and
the volume estimation unit estimates the volume of the object in the blind spot area based on the shape of the object in the blind spot area estimated by the blind spot area shape estimation unit and the shape of the bottom of the container.
- The volume estimation device according to claim 1, wherein
a blind spot area determination mesh is included in a mesh group divided two-dimensionally within the container,
the blind spot estimation unit, using the mesh parallax data, determines the space between the blind spot area determination mesh and the mesh one above it to be the blind spot area when the difference between the mesh parallax data of the blind spot area determination mesh and the mesh parallax data of the mesh one above it is equal to or greater than a blind spot area determination value, and
the blind spot area determination value is determined by the mesh parallax data of the blind spot area determination mesh.
- The volume estimation device according to claim 2, wherein
the blind spot estimation unit estimates the size of the blind spot area from the magnitude of the difference in the mesh parallax data.
- The volume estimation device according to any one of claims 1 to 3, further comprising
a reliability calculation unit that determines the reliability of the volume of the object according to the size of the blind spot area, wherein
the reliability of the volume of the object is displayed on a display unit.
- The volume estimation device according to any one of claims 1 to 4, wherein
the blind spot area shape estimation unit estimates the shape of the object in the blind spot area captured by the plurality of cameras based on, among a plurality of images captured by the plurality of cameras, the captured image for which the size of the blind spot area is smallest.
- The volume estimation device according to any one of claims 2 to 5, further comprising
a protrusion determination unit that determines a protrusion area of the object, wherein
a protrusion determination mesh is included in the mesh group, and
the protrusion determination unit determines the protrusion determination mesh to be a protrusion area when the differences between the mesh parallax data of the protrusion determination mesh and the mesh parallax data of the meshes before and after the protrusion determination mesh are each equal to or greater than a protrusion area determination value.
- The volume estimation device according to claim 6, wherein
the protrusion area determination value is determined by the mesh parallax data of the protrusion determination mesh.
- A work machine using the volume estimation device according to any one of claims 1 to 7.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/082911 WO2016092684A1 (ja) | 2014-12-12 | 2014-12-12 | 体積推定装置およびそれを用いた作業機械 |
US15/527,027 US10208459B2 (en) | 2014-12-12 | 2014-12-12 | Volume estimation device and work machine using same |
JP2016563367A JP6318267B2 (ja) | 2014-12-12 | 2014-12-12 | 体積推定装置およびそれを用いた作業機械 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/082911 WO2016092684A1 (ja) | 2014-12-12 | 2014-12-12 | 体積推定装置およびそれを用いた作業機械 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016092684A1 true WO2016092684A1 (ja) | 2016-06-16 |
Family
ID=56106931
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/082911 WO2016092684A1 (ja) | 2014-12-12 | 2014-12-12 | 体積推定装置およびそれを用いた作業機械 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10208459B2 (ja) |
JP (1) | JP6318267B2 (ja) |
WO (1) | WO2016092684A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11150921B2 (en) * | 2015-09-01 | 2021-10-19 | International Business Machines Corporation | Data visualizations selection |
EP3426852B1 (en) * | 2016-03-09 | 2020-04-29 | Leica Geosystems Technology A/S | Measuring equipment for determining the result of earthmoving work |
JP7365122B2 (ja) | 2019-02-01 | 2023-10-19 | 株式会社小松製作所 | 画像処理システムおよび画像処理方法 |
JP7306867B2 (ja) * | 2019-04-26 | 2023-07-11 | 株式会社キーエンス | 光学式変位計 |
JP7283332B2 (ja) * | 2019-09-26 | 2023-05-30 | コベルコ建機株式会社 | 容器計測システム |
DE102022202397A1 (de) | 2022-03-10 | 2023-09-14 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Ermitteln eines Füllgrades einer Schaufel einer Arbeitsmaschine |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02195202A (ja) * | 1989-01-24 | 1990-08-01 | Oki Electric Ind Co Ltd | 土量計測方法 |
JPH04301708A (ja) * | 1991-03-29 | 1992-10-26 | Aisin Seiki Co Ltd | 非接触容積測定装置 |
JP2003247805A (ja) * | 2002-02-22 | 2003-09-05 | Tech Res & Dev Inst Of Japan Def Agency | 体積計測方法及び体積計測プログラム |
JP2008241300A (ja) * | 2007-03-26 | 2008-10-09 | Komatsu Ltd | 油圧ショベルの作業量計測方法および作業量計測装置 |
JP2013015394A (ja) * | 2011-07-04 | 2013-01-24 | Daifuku Co Ltd | 農産物の体積測定装置 |
JP2014089104A (ja) * | 2012-10-30 | 2014-05-15 | Mitsubishi Electric Corp | 体積推定装置、体積推定システム、体積推定方法および体積推定プログラム |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04301707A (ja) | 1991-03-29 | 1992-10-26 | Aisin Seiki Co Ltd | 非接触容積測定装置 |
AU2005227398B1 (en) * | 2005-10-28 | 2006-04-27 | Leica Geosystems Ag | Method and apparatus for determining the loading of a bucket |
KR100916638B1 (ko) * | 2007-08-02 | 2009-09-08 | 인하대학교 산학협력단 | 구조광을 이용한 토공량 산출 장치 및 방법 |
HUE031382T2 (en) * | 2008-05-27 | 2017-07-28 | Grieshaber Vega Kg | Echoel shape evaluation for charge level sensors |
US20100101317A1 (en) * | 2008-10-23 | 2010-04-29 | Whirlpool Corporation | Lid based amount sensor |
US8930091B2 (en) * | 2010-10-26 | 2015-01-06 | Cmte Development Limited | Measurement of bulk density of the payload in a dragline bucket |
JP5799631B2 (ja) | 2011-07-20 | 2015-10-28 | 日産自動車株式会社 | 車両用画像生成装置及び車両用画像生成方法 |
US9829364B2 (en) * | 2014-08-28 | 2017-11-28 | Raven Industries, Inc. | Method of sensing volume of loose material |
US20160061643A1 (en) * | 2014-08-28 | 2016-03-03 | Raven Industries, Inc. | Method of sensing volume of loose material |
US20180120098A1 (en) * | 2015-04-24 | 2018-05-03 | Hitachi, Ltd. | Volume Estimation Apparatus, Working Machine Including the Same, and Volume Estimation System |
- 2014-12-12 US US15/527,027 patent/US10208459B2/en not_active Expired - Fee Related
- 2014-12-12 WO PCT/JP2014/082911 patent/WO2016092684A1/ja active Application Filing
- 2014-12-12 JP JP2016563367A patent/JP6318267B2/ja not_active Expired - Fee Related
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017033769A1 (ja) * | 2015-08-24 | 2017-03-02 | 株式会社小松製作所 | ホイールローダの制御システム、その制御方法およびホイールローダの制御方法 |
US10435868B2 (en) | 2015-08-24 | 2019-10-08 | Komatsu Ltd. | Control system for wheel loader, control method thereof, and method of controlling wheel loader |
EP3495570A4 (en) * | 2017-03-31 | 2020-03-25 | Hitachi Construction Machinery Co., Ltd. | DEVICE FOR MONITORING THE AREA AROUND A WORKING MACHINE |
WO2019039606A1 (ja) * | 2017-08-24 | 2019-02-28 | 日立建機株式会社 | 建設機械 |
JP2019039207A (ja) * | 2017-08-24 | 2019-03-14 | 日立建機株式会社 | 建設機械の荷重計測システム |
US11169018B2 (en) | 2017-08-24 | 2021-11-09 | Hitachi Construction Machinery Co., Ltd. | Construction machine including a display device displaying the weight of a work object |
JP2020027501A (ja) * | 2018-08-14 | 2020-02-20 | 東芝テック株式会社 | 画像処理装置及び画像処理方法 |
JP2020034527A (ja) * | 2018-08-31 | 2020-03-05 | 株式会社小松製作所 | 作業機械の運搬物特定装置、作業機械、作業機械の運搬物特定方法、補完モデルの生産方法、および学習用データセット |
WO2020044848A1 (ja) * | 2018-08-31 | 2020-03-05 | 株式会社小松製作所 | 作業機械の運搬物特定装置、作業機械、作業機械の運搬物特定方法、補完モデルの生産方法、および学習用データセット |
JP7311250B2 (ja) | 2018-08-31 | 2023-07-19 | 株式会社小松製作所 | 作業機械の運搬物特定装置、作業機械、作業機械の運搬物特定方法、補完モデルの生産方法、および学習用データセット |
JP2020165253A (ja) * | 2019-03-29 | 2020-10-08 | 住友重機械工業株式会社 | ショベル |
JP7289701B2 (ja) | 2019-03-29 | 2023-06-12 | 住友重機械工業株式会社 | ショベル |
WO2021106410A1 (ja) | 2019-11-26 | 2021-06-03 | コベルコ建機株式会社 | 計測装置、及び建設機械 |
WO2021106411A1 (ja) | 2019-11-26 | 2021-06-03 | コベルコ建機株式会社 | 計測装置、操作支援システム、及び建設機械 |
EP4047139A1 (en) | 2019-11-26 | 2022-08-24 | Kobelco Construction Machinery Co., Ltd. | Measurement device, operation support system, and construction machinery |
JP7246294B2 (ja) | 2019-11-26 | 2023-03-27 | コベルコ建機株式会社 | 計測装置、及び建設機械 |
JP2021085178A (ja) * | 2019-11-26 | 2021-06-03 | コベルコ建機株式会社 | 計測装置、及び建設機械 |
CN111411657B (zh) * | 2020-03-31 | 2022-03-22 | 陕西理工大学 | 一种适用于建筑工地的挖掘机铲斗斗形结构的优化方法 |
CN111411657A (zh) * | 2020-03-31 | 2020-07-14 | 陕西理工大学 | 一种适用于建筑工地的挖掘机铲斗斗形结构的优化方法 |
Also Published As
Publication number | Publication date |
---|---|
JP6318267B2 (ja) | 2018-04-25 |
US20170328032A1 (en) | 2017-11-16 |
US10208459B2 (en) | 2019-02-19 |
JPWO2016092684A1 (ja) | 2017-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6318267B2 (ja) | 体積推定装置およびそれを用いた作業機械 | |
JP6393412B2 (ja) | 体積推定装置及びそれを備えた油圧ショベル | |
US11417008B2 (en) | Estimating a volume of contents in a container of a work vehicle | |
JP6232494B2 (ja) | 掘削装置 | |
AU2022209235B2 (en) | Display control device and display control method | |
AU2018299288B2 (en) | Display control device, display control method, program, and display system | |
US20220101552A1 (en) | Image processing system, image processing method, learned model generation method, and data set for learning | |
EP3284870B1 (en) | Construction machine | |
US10527413B2 (en) | Outside recognition device | |
JP6289731B2 (ja) | 作業機械の制御システム、作業機械の制御方法、及びナビゲーションコントローラ | |
AU2019292458B2 (en) | Display control system, display control device, and display control method | |
JP2016065422A (ja) | 外界認識装置および外界認識装置を用いた掘削機械 | |
WO2020044848A1 (ja) | 作業機械の運搬物特定装置、作業機械、作業機械の運搬物特定方法、補完モデルの生産方法、および学習用データセット | |
JP2016106192A (ja) | ショベル | |
JP7233150B2 (ja) | 奥行推定装置およびそのプログラム | |
JP2014228941A (ja) | 地表面3次元サーフェス形状計測装置、走行可能領域検出装置およびそれを搭載した建設機械並びに走行可能領域検出方法 | |
JP2019207607A (ja) | 移動体追跡装置 | |
CN116188567A (zh) | 到动态对象的最小距离的有效计算方法 | |
JP2017110492A (ja) | ショベル | |
JP6936557B2 (ja) | 探索処理装置、及びステレオカメラ装置 | |
JP2019031908A (ja) | ショベル | |
JP6567736B2 (ja) | ショベル、ショベルの表示方法及びショベルの表示装置 | |
JP2008106431A (ja) | 掘削状況表示制御装置付き掘削機械 | |
JP2023087021A (ja) | ショベル、ショベルの表示方法及びショベルの表示装置 | |
JP2014232487A (ja) | 交通量推定装置および交通量推定方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14907868 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2016563367 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 15527027 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14907868 Country of ref document: EP Kind code of ref document: A1 |