CN112482460B - Information processing device, information processing system, information processing method, computer program, and construction machine

Info

Publication number
CN112482460B
Authority
CN
China
Prior art keywords: unit, data, information, image, posture
Prior art date
Legal status
Active
Application number
CN202010939634.9A
Other languages
Chinese (zh)
Other versions
CN112482460A
Inventor
西田裕平
志垣富雄
大前谦
Current Assignee
Nabtesco Corp
Original Assignee
Nabtesco Corp
Application filed by Nabtesco Corp
Publication of CN112482460A
Application granted
Publication of CN112482460B

Classifications

    • E - FIXED CONSTRUCTIONS
    • E02 - HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F - DREDGING; SOIL-SHIFTING
    • E02F3/00 - Dredgers; Soil-shifting machines
    • E02F3/04 - Dredgers; Soil-shifting machines mechanically-driven
    • E02F3/28 - Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
    • E02F3/30 - Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom
    • E02F3/32 - Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom working downwardly and towards the machine, e.g. with backhoes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01P - MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00 - Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/36 - Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
    • G01P3/38 - Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light using photographic means
    • E - FIXED CONSTRUCTIONS
    • E02 - HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F - DREDGING; SOIL-SHIFTING
    • E02F3/00 - Dredgers; Soil-shifting machines
    • E02F3/04 - Dredgers; Soil-shifting machines mechanically-driven
    • E02F3/28 - Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
    • E02F3/36 - Component parts
    • E02F3/42 - Drives for dippers, buckets, dipper-arms or bucket-arms
    • E02F3/43 - Control of dipper or bucket position; Control of sequence of drive operations
    • E02F3/435 - Control of dipper or bucket position; Control of sequence of drive operations for dipper-arms, backhoes or the like
    • E02F3/437 - Control of dipper or bucket position; Control of sequence of drive operations for dipper-arms, backhoes or the like providing automatic sequences of movements, e.g. linear excavation, keeping dipper angle constant
    • E - FIXED CONSTRUCTIONS
    • E02 - HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F - DREDGING; SOIL-SHIFTING
    • E02F9/00 - Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26 - Indicating devices
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854 - Grading and classifying of flaws
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Abstract

An object of the invention is to provide an information processing device, an information processing system, an information processing method, a computer program, and a construction machine with which the movement speed of a working unit of a work machine can be estimated with a simple configuration. The information processing device (10) includes a storage unit (30), an image information acquisition unit (12), and a speed estimation unit (20e). The storage unit (30) stores correspondence information generated by associating data (Gs) of a reference image of a working unit (40) of the work machine (100) with data (Ks) of the posture of the working unit (40). The image information acquisition unit (12) acquires data (Gj) of an image of the working unit (40) and compares it with the data (Gs) of the reference image. The speed estimation unit (20e) generates speed information (Ve) relating to the speed of the working unit (40) based on the correspondence information and the data (Gj) of the image acquired by the image information acquisition unit (12).

Description

Information processing device, information processing system, information processing method, computer program, and construction machine
Technical Field
The present invention relates to an information processing apparatus, an information processing system, an information processing method, a computer program, and a construction machine.
Background
A construction machine provided with a camera for capturing images of its working device is known. For example, Patent Document 1 describes a construction machine provided with a camera mounted on the revolving unit to photograph the working device, an angle detection unit for detecting a relative angle, and a posture determination unit for determining a posture. This construction machine detects the relative angle between links from the link edges extracted from the camera image, and determines the posture of the working device with respect to the revolving unit based on that relative angle.
Prior art literature
Patent literature
Patent document 1: japanese patent laid-open publication No. 2017-053627
Disclosure of Invention
Problems to be Solved by the Invention
The present inventors have arrived at the following insight regarding construction machines that include a boom, an arm, and an attachment driven by power such as hydraulic pressure.
Such a construction machine drives an arm mechanism including a boom, an arm, and the like with power, and performs a given construction task by operating an attachment such as a bucket mounted on the arm mechanism. For example, when shaping the ground, it is desirable to carry out accurate construction efficiently, in a short time, according to the design drawing. However, when the attachment is moved, too high a movement speed increases overshoot and lowers positioning accuracy. Conversely, too low a movement speed lengthens the movement time and lowers work efficiency. Therefore, from the viewpoint of improving the accuracy of the attachment's movement, its movement speed must be detected so that precise body control can be performed.
One way to detect the movement speed would be to provide position sensors on each part of the arm mechanism and the attachment and estimate the speed from the change in the detection results per unit time. However, the sensors and their wiring complicate the structure. They also add sensor and wiring costs, which is disadvantageous in terms of cost. From the viewpoint of detecting the movement speed, the construction machine described in Patent Document 1 cannot be said to solve the problem sufficiently. Such problems are not limited to the construction machine described above and can also arise in other types of working machines.
The present invention has been made in view of the above problems, and an object thereof is to provide an information processing device for a working machine that can estimate a moving speed of a working unit with a simple configuration.
Solution to the Problem
In order to solve the above problems, an information processing apparatus according to an aspect of the present invention includes: a storage unit that stores correspondence information generated by associating data of a reference image of a working unit of the work machine with data of a posture of the working unit of the work machine; an acquisition unit that acquires data of an image of a working unit of a working machine, and compares the data with data of a reference image; and a speed estimating unit that generates speed information on the speed of the work unit of the work machine based on the correspondence information and the data of the image acquired by the acquiring unit.
Any combination of the above, and any conversion of the expressions of the present invention among a method, an apparatus, a computer program, a transitory or non-transitory storage medium storing the program, a system, and the like, are also valid as aspects of the present invention.
Advantageous Effects of Invention
According to the present invention, an information processing device for a work machine that can estimate the moving speed of a work unit with a simple configuration can be provided.
Drawings
Fig. 1 is a side view schematically showing a work machine including an information processing apparatus according to a first embodiment.
Fig. 2 is a block diagram schematically showing the information processing apparatus of fig. 1.
Fig. 3 is an explanatory diagram for explaining the posture of the working unit of the working machine of fig. 1.
Fig. 4 is a diagram showing learning data of the information processing apparatus of fig. 1.
Fig. 5 is an explanatory diagram for explaining the posture estimation process of the information processing apparatus of fig. 1.
Fig. 6 is an explanatory diagram for explaining the speed estimation process and the acceleration estimation process of the information processing apparatus of fig. 1.
Fig. 7 is a diagram showing an example of posture information, speed information, and acceleration information of the information processing apparatus of fig. 1.
Fig. 8 is a block diagram schematically showing an abnormality detection unit of the information processing apparatus of fig. 1.
Fig. 9 is a block diagram schematically showing an appearance detecting section of the information processing apparatus of fig. 1.
Fig. 10 is a flowchart showing an operation of the information processing apparatus of fig. 1.
Fig. 11 is a flowchart showing an operation of the information processing apparatus of fig. 1.
Fig. 12 is a flowchart showing an operation of the information processing apparatus of fig. 1.
Fig. 13 is a flowchart showing an operation of the information processing apparatus of fig. 1.
Fig. 14 is a block diagram schematically showing a work machine using the information processing system according to the fifth embodiment.
Description of the reference numerals
10: an information processing device; 12: an image information acquisition unit; 14: an environment information acquisition unit; 16: a posture information acquisition unit; 20: a control unit; 20e: a speed estimation unit; 20f: an acceleration estimation unit; 22: a model generation unit; 24c: an abnormality determination unit; 26c: an appearance determination unit; 30: a storage unit; 32: a model storage unit; 40: a working unit; 42: a boom; 44: an arm; 46: a bucket; 62: a work machine control unit; 100: a work machine; 1000: a construction machine.
Detailed Description
The present invention will now be described with reference to the drawings based on preferred embodiments. In the embodiments and modifications, identical or equivalent structural elements and members are denoted by the same reference numerals, and duplicate description is omitted as appropriate. The dimensions of members in the drawings are enlarged or reduced as appropriate for ease of understanding. In the drawings, some members that are not important for describing the embodiments are omitted.
Terms including ordinal numbers such as first and second are used to describe various components, but these terms serve only to distinguish one component from another and do not limit the components.
First embodiment
The configuration of the information processing apparatus 10 of the work machine according to the first embodiment of the present invention will be described with reference to the drawings. Fig. 1 is a side view schematically showing a work machine 100 including an information processing apparatus 10 according to a first embodiment. Fig. 2 is a block diagram schematically showing the information processing apparatus 10.
The information processing apparatus 10 includes an image information acquiring unit 12, an environment information acquiring unit 14, a posture information acquiring unit 16, a control unit 20, and a storage unit 30. The information processing apparatus 10 operates in two modes: machine learning, and normal operation outside machine learning (hereinafter referred to as "non-learning operation"). During machine learning, the information processing apparatus 10 can generate a posture estimation model based on learning image information and learning posture information for each part of the working unit 40 of the work machine 100. The posture estimation model embodies the correspondence information. During non-learning operation, the information processing apparatus 10 can estimate the posture of the working unit 40 based on real-time image information and the posture estimation model. Work machine 100 can control the operation of the working unit 40 based on the posture estimated by the information processing device 10.
The image information acquiring unit 12 acquires data of an image of the working unit 40. The environment information acquiring unit 14 acquires information about the environment of the work machine 100. The posture information acquisition unit 16 acquires data (hereinafter referred to as "posture data") related to the posture of the working unit 40. The control unit 20 performs various data processing related to the generation of the posture estimation model and the posture estimation of the working unit 40. The storage unit 30 stores data referred to or updated by the control unit 20. First, the structure of work machine 100 will be described, and other structures will be described later.
The work machine 100 of the present embodiment is a construction machine that performs work by moving the bucket 46 and functions as a so-called power shovel. Work machine 100 includes a lower traveling unit 36, an upper body 34, an arm mechanism 48, and a bucket 46. In the present embodiment, the arm mechanism 48 and the bucket 46 constitute the working unit 40. The lower traveling unit 36 is configured to travel in a given direction on crawler tracks or the like. The upper body 34 is mounted on the lower traveling unit 36. The upper body 34 and the working unit 40 are configured to pivot about a pivot axis La with respect to the lower traveling unit 36 by means of a swing driving unit 60, which is constituted by, for example, a swing motor (not shown) and a swing gear (not shown). A cab 38 is provided in the upper body 34.
The cab 38 is provided with an operation unit 54 for operating the working unit 40. When an operation is input from the operation unit 54, the plurality of hydraulic valves 58 open and close in accordance with the operation. Working oil supplied from a hydraulic pump (not shown) is sent to the plurality of hydraulic cylinders 56 in response to the opening and closing of the hydraulic valves 58. The hydraulic cylinders 56 include hydraulic cylinders 56a, 56b, and 56c, arranged in order from the base end side to the distal end side of the arm mechanism 48, which extend and retract in accordance with the amount of hydraulic oil delivered. The cab 38 is also provided with a display unit 38d, which displays information from the abnormality detection unit, appearance detection unit, and other units described later.
The work machine 100 of the present embodiment includes a work machine control unit 62, and the work machine control unit 62 controls the operation of the work unit 40 based on a command from a higher-level control system. Work machine control unit 62 will be described later.
As an example, the base end portion of the arm mechanism 48 is provided on the right side of the cab 38 in the upper body 34. The arm mechanism 48 includes, for example, a boom 42 and an arm 44 extending forward from the upper body 34. A bucket 46 is attached to the front end side of the arm mechanism 48. In this way, the work machine 100 can drive the bucket 46 to perform the intended work by changing the posture of the work unit 40 in accordance with the manipulation by the operator. Further, the work machine 100 can rotate the upper body 34 and the work unit 40, thereby moving the bucket 46 three-dimensionally.
Fig. 3 is an explanatory diagram illustrating the posture of the working unit 40. The hydraulic cylinders 56a, 56b, and 56c change their extension/contraction lengths L1, L2, and L3 according to the hydraulic pressure. The boom 42 pivots its tip portion up and down about its base end portion on the upper body 34 side through extension and contraction of the hydraulic cylinder 56a. The arm 44 pivots its tip portion back and forth about its base end portion on the boom 42 side through extension and contraction of the hydraulic cylinder 56b. The bucket 46 pivots its tip portion back and forth or up and down about its base end portion on the arm 44 side through extension and contraction of the hydraulic cylinder 56c.
In the working unit 40, the bending angles θ1, θ2, and θ3 of the joints connecting the boom 42, the arm 44, and the bucket 46 can be changed by changing the extension/contraction lengths L1, L2, and L3 of the hydraulic cylinders 56a, 56b, and 56c. As an example, the angle θ1 is the angle of the boom 42 with respect to the horizontal plane, the angle θ2 is the bending angle of the joint connecting the boom 42 and the arm 44, and the angle θ3 is the bending angle of the joint connecting the arm 44 and the bucket 46.
The posture of the working unit 40 can be defined by the positions and relative angles of the boom 42, the arm 44, and the bucket 46. Since the shapes of the boom 42, the arm 44, and the bucket 46 are fixed, the posture of the working unit 40 can be determined by geometric calculation from the dimensions of their respective parts and either the extension/contraction lengths L1, L2, L3 of the hydraulic cylinders 56a, 56b, 56c or the bending angles θ1, θ2, θ3.
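To make this geometric calculation concrete, the following Python sketch computes the bucket-tip position in the vertical plane from the bending angles θ1, θ2, θ3. The link lengths and the angle conventions are illustrative assumptions, not values taken from this disclosure.

```python
import math

# Hypothetical link lengths in metres (boom, arm, bucket); these stand in for
# the fixed shape data that the text says is stored in advance.
BOOM_LEN, ARM_LEN, BUCKET_LEN = 5.7, 2.9, 1.5

def bucket_tip_position(theta1, theta2, theta3):
    """Planar forward kinematics for the boom-arm-bucket chain.

    theta1: boom angle relative to the horizontal plane (rad)
    theta2: bending angle of the boom-arm joint (rad), taken as the interior angle
    theta3: bending angle of the arm-bucket joint (rad), same convention
    Returns (x, z) of the bucket tip relative to the boom base pivot.
    """
    a1 = theta1                    # direction of the boom
    a2 = a1 + theta2 - math.pi     # direction of the arm
    a3 = a2 + theta3 - math.pi     # direction of the bucket
    x = BOOM_LEN * math.cos(a1) + ARM_LEN * math.cos(a2) + BUCKET_LEN * math.cos(a3)
    z = BOOM_LEN * math.sin(a1) + ARM_LEN * math.sin(a2) + BUCKET_LEN * math.sin(a3)
    return x, z
```

The position estimating unit described later performs essentially this kind of calculation from the estimated angles.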
Returning to fig. 2, the blocks shown there can be implemented in hardware by elements such as a computer's processor (CPU) and memory, by electronic circuits, or by mechanical devices, and in software by a computer program or the like; what is depicted here are functional blocks realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be implemented in various forms by combining hardware and software.
The image information acquiring unit 12 will be described. The image information acquiring unit 12 of the present embodiment includes an image sensor for capturing images of the working unit 40. During machine learning, described later, the image information acquisition unit 12 supplies the result of imaging the working unit 40 to the control unit 20 as image data; the image data acquired at this time is hereinafter referred to as "data Gs of the reference image". During non-learning operation, the image information acquiring unit 12 likewise supplies the result of imaging the working unit 40 to the control unit 20 as image data; the image data acquired at this time is hereinafter referred to simply as "data Gj of the image". The data Gj of the image may be real-time image data.
The image information acquisition unit 12 is configured to rotate integrally with the working unit 40. Specifically, it is disposed on the ceiling of the cab 38 so that it can capture the working unit 40. When the working unit 40 rotates, the image information acquiring unit 12 moves integrally with it, so the relative positional relationship between the two does not change even during rotation.
The posture information acquisition unit 16 will be described. The posture information acquiring unit 16 of the present embodiment includes stroke sensors 16a, 16b, and 16c for measuring the extension/contraction lengths L1, L2, and L3 of the hydraulic cylinders 56a, 56b, and 56c. The posture information acquiring unit 16 is attached during machine learning and detached during non-learning operation. During machine learning, the posture information acquiring unit 16 supplies the data of the extension/contraction lengths L1, L2, and L3 to the control unit 20.
The environmental information acquisition unit 14 will be described. When image data is acquired, differences in the surrounding environment, such as the weather, change the brightness and color temperature of the image and thereby increase posture estimation error. In the present embodiment, therefore, the environmental information acquisition unit 14 acquires information on the surrounding environment, and the image data is corrected based on the acquired result. The environmental information acquisition unit 14 of the present embodiment includes an illuminance sensor for measuring ambient brightness and a color temperature sensor for measuring color temperature, and supplies the acquired results to the control unit 20 as environmental information Mp. In this example, the environmental information acquisition unit 14 is disposed on the ceiling of the cab 38.
The storage unit 30 will be described. The storage unit 30 includes a model storage unit 32. The model storage unit 32 stores a posture estimation model for estimating the posture of the working unit 40, generated by known machine learning based on the data Gs of the reference image and the data of the posture. The posture estimation model can also be regarded as a function whose input and output data formats are predetermined. In the embodiment, the data Gj of the image is input to the posture estimation model, which outputs information on the estimated posture corresponding to that image data. The method for generating the posture estimation model will be described later.
The control unit 20 will be described. The control unit 20 includes a model generating unit 22, an image information receiving unit 20a, a posture information receiving unit 20b, a posture estimating unit 20c, a position estimating unit 20d, a speed estimating unit 20e, an acceleration estimating unit 20f, an individual information holding unit 20h, an environment information receiving unit 20g, an image information correcting unit 20j, a background information removing unit 20k, and a color information removing unit 20m. The control unit 20 further includes an abnormality detection unit 24 and an appearance detection unit 26. An application program containing modules corresponding to these functional blocks may be installed in a storage device (for example, the storage unit 30) of the information processing apparatus 10. A processor (e.g., a CPU) of the information processing apparatus 10 functions as each functional block by reading the application program into main memory and executing it.
First, the model generating unit 22, the image information receiving unit 20a, the posture information receiving unit 20b, and the posture estimating unit 20c will be described. The image information receiving unit 20a receives an input of an imaging result of the working unit 40 from the image information acquiring unit 12. In particular, during machine learning, the image information receiving unit 20a receives the data Gs of the reference image for learning from the image information acquiring unit 12. In addition, during the non-learning operation, the image information receiving unit 20a receives the data Gj of the image from the image information acquiring unit 12.
The posture information receiving unit 20b receives input of posture information of the working unit 40 from the posture information acquiring unit 16. Specifically, during machine learning, the posture information receiving unit 20b receives the data of the extension/contraction lengths L1, L2, and L3 from the stroke sensors 16a, 16b, and 16c. These data are hereinafter collectively referred to as posture data Ks.
Fig. 4 is a diagram showing learning data of the information processing apparatus 10. In this figure, for ease of understanding, the data Gs of the reference image from the image information acquisition unit 12 is shown schematically together with the data Ks of the posture. During machine learning, the control unit 20 stores the received data Gs of the reference image and data Ks of the posture in the storage unit 30 in association with each other. The control unit 20 varies the posture of the working unit 40 widely within its movable range and, each time the posture changes, stores the associated data Gs and data Ks in the storage unit 30. The data Gs of the reference image and the data Ks of the posture stored in the storage unit 30 are referred to as learning data Sd. The learning data Sd should cover the postures that the working unit 40 can take. Thus, as shown in fig. 4, the learning data Sd includes a large number of mutually associated pairs of reference image data Gs and posture data Ks.
The model generating unit 22 generates a posture estimation model by machine learning (supervised learning), using the mutually associated data Gs of the reference image and data Ks of the posture in the learning data Sd as teacher data. The model generation unit 22 may use a known machine learning method such as a support vector machine, a neural network (including deep learning), or a random forest, and stores the generated posture estimation model in the model storage unit 32.
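As a minimal sketch of such supervised learning, the following Python code trains one of the model types named above (a random forest) to map preprocessed reference images Gs to posture data Ks, and then estimates a posture for a new image Gj. The array shapes and random placeholder data are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Placeholders for learning data Sd: 500 grayscale reference images Gs,
# flattened to pixel vectors, each paired with posture data Ks = (θ1, θ2, θ3).
rng = np.random.default_rng(0)
Gs = rng.random((500, 64 * 64))   # preprocessed reference image data
Ks = rng.random((500, 3))         # matching bending angles (teacher data)

# Random forest regression from image pixels to posture; a support vector
# machine or a neural network could be substituted, as the text notes.
pose_model = RandomForestRegressor(n_estimators=100, random_state=0)
pose_model.fit(Gs, Ks)

# Non-learning operation: estimated posture information Ke for new image data Gj.
Gj = rng.random((1, 64 * 64))
Ke = pose_model.predict(Gj)       # -> array with estimated (θ1, θ2, θ3)
```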
During non-learning operation, the posture estimating unit 20c estimates the posture of the working unit 40 based on the image data Gj and the information stored in the storage unit 30. As one approach, the posture estimating unit 20c could compare the image data Gj with the reference image data Gs in the learning data Sd and adopt the posture data Ks associated with the most similar reference image as the estimation result; however, because a large amount of reference image data Gs must be consulted, obtaining a result this way can take time. The posture estimation unit 20c of the present embodiment therefore derives the estimated posture from the data Gj of the image using the posture estimation model generated by the model generation unit 22.
Fig. 5 is an explanatory diagram for explaining the posture estimation process of the posture estimation model generated by the model generating unit 22. When the image data Gj is input, the pose estimation model outputs estimated pose information Ke corresponding to the image data. The estimated posture information Ke of the present embodiment includes the bending angles θ1, θ2, and θ3 of the joints of the boom 42, the arm 44, and the bucket 46 of the working unit 40. The posture estimation unit 20c transmits the estimated posture information Ke to the work machine control unit 62 of the work machine 100.
Next, the position estimating unit 20d, the velocity estimating unit 20e, and the acceleration estimating unit 20f will be described. Fig. 6 is an explanatory diagram illustrating the speed estimation process by the speed estimating unit 20e. In fig. 6, the posture of the working unit 40 at a certain time T(n) is shown by a solid line, and its posture at an earlier time T(n-1) is shown by a broken line. Fig. 7 shows an example of the position information Pe, the velocity information Ve, and the acceleration information Ae at each time T; in the figure, the position information Pe at time T(n) is denoted Pe(n), the velocity information Ve(n), and the acceleration information Ae(n). The position information Pe, the velocity information Ve, and the acceleration information Ae are described below.
The position estimating unit 20d estimates the position of a predetermined part of the working unit 40 based on the estimated posture information Ke. Although the predetermined portion is not limited, in this example, the predetermined portion is the bucket 46. The position of the bucket 46 can be determined by calculation based on the shape information of the boom 42, the arm 44, and the bucket 46 and the estimated posture information Ke (bending angles θ1, θ2, θ3). Further, shape information of the boom 42, the arm 44, and the bucket 46 may be stored in the storage unit 30 in advance.
The information about the position estimated by the position estimating unit 20d is referred to as position information Pe. The position estimating unit 20d transmits the position information Pe to the work machine control unit 62, and stores the position information Pe in the storage unit 30 in time series. As an example, the position information Pe may include data of the extension/contraction lengths L1, L2, and L3. The position information Pe may be two-dimensional information or three-dimensional information.
The speed estimating unit 20e generates the speed information Ve based on data Gj of a plurality of images acquired at times separated by a time difference, and on that time difference. Specifically, the speed estimating unit 20e estimates the speed of a predetermined portion (for example, the bucket 46) of the working unit 40 based on the position information Pe and generates speed information Ve relating to that speed. The speed of the bucket 46 can be obtained by differentiating its change in position with respect to time. For example, for the position information Pe stored in time series in the storage unit 30, the speed of the bucket 46 can be estimated by dividing the difference between pieces of position information Pe generated and stored at different times by the time difference dT between them. The time difference dT may be the duration of one frame of the still images on which the image data Gj is based.
As an example, the time T(n) may be the time of the current frame, and the time T(n-1) the time one or more frames earlier. In that case, the time difference dT is the reciprocal of the frame rate (the number of frames per second, fps). The information on the velocity estimated by the velocity estimating unit 20e is referred to as velocity information Ve. In this example, the speed information Ve(n) at time T(n) is obtained by expression (1).
Ve(n)=(Pe(n)-Pe(n-1))/dT…(1)
That is, the speed information Ve is obtained by dividing the difference between the two pieces of position information Pe for two frames by the time difference between those frames.
The speed estimating unit 20e transmits the speed information Ve to the work machine control unit 62, and stores the speed information Ve in the storage unit 30 in time series. The speed information Ve may be two-dimensional information or three-dimensional information.
The acceleration estimating unit 20f generates acceleration information Ae relating to the acceleration of the working unit 40 based on a plurality of pieces of velocity information Ve and the time difference. Specifically, the acceleration estimating unit 20f estimates the acceleration of a predetermined portion (for example, the bucket 46) of the working unit 40 based on the speed information Ve and generates acceleration information Ae relating to that acceleration. The acceleration of the bucket 46 can be obtained by differentiating its change in speed with respect to time. For example, for the speed information Ve stored in time series in the storage unit 30, the acceleration of the bucket 46 can be estimated by dividing the difference between pieces of speed information Ve generated and stored at different times by the time difference dT between them. The time difference dT may be the time difference between frames of the image data Gj. The information on the acceleration estimated by the acceleration estimating unit 20f is referred to as acceleration information Ae. In this example, Ae(n) at time T(n) is obtained by expression (2).
Ae(n)=(Ve(n)-Ve(n-1))/dT…(2)
That is, the acceleration information Ae is obtained by dividing the difference between the two pieces of velocity information Ve for two frames by the time difference between those frames.
Acceleration estimation unit 20f transmits acceleration information Ae to work machine control unit 62, and stores the acceleration information Ae in storage unit 30 in time series. The acceleration information Ae may be two-dimensional information or three-dimensional information. Hereinafter, the position information Pe, the velocity information Ve, and the acceleration information Ae are collectively referred to as feedback information.
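The following is a compact sketch of expressions (1) and (2), assuming a fixed camera frame rate and scalar positions for simplicity (the actual Pe, Ve, and Ae may be two- or three-dimensional):

```python
def backward_difference(samples, dT):
    """Divide the change between consecutive frame samples by the frame time difference."""
    return [(b - a) / dT for a, b in zip(samples, samples[1:])]

fps = 30.0          # assumed frame rate of the image sensor
dT = 1.0 / fps      # time difference of one frame

Pe = [1.00, 1.02, 1.05, 1.09]        # bucket positions at T(n-3)..T(n), illustrative
Ve = backward_difference(Pe, dT)     # Ve(n) = (Pe(n) - Pe(n-1)) / dT ... (1)
Ae = backward_difference(Ve, dT)     # Ae(n) = (Ve(n) - Ve(n-1)) / dT ... (2)
```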
Referring back to fig. 2, work machine control unit 62 will be described. Work machine control unit 62 controls the operation of work machine 100 based on instruction information from a higher-level control system. The instruction information of the present embodiment includes target position information Ps relating to the position of the predetermined portion (the bucket 46), target speed information Vs relating to its speed, and target acceleration information As relating to its acceleration. Hereinafter, the target position information Ps, the target velocity information Vs, and the target acceleration information As are collectively referred to as target information.
As shown in fig. 2, work machine control unit 62 includes an operation control unit 62a. Based on the target information and the feedback information, the operation control unit 62a supplies the control signal Sj to the hydraulic valves 58 and the control signal Sk to the swing drive unit 60 according to a predetermined control algorithm. By opening and closing the hydraulic valves 58 with the control signal Sj, the operation control unit 62a extends and retracts the hydraulic cylinders 56a, 56b, and 56c and thereby controls the operations of the boom 42, the arm 44, and the bucket 46. By feeding back the position information Pe, the operation control unit 62a can bring the predetermined portion (the bucket 46) to a desired position. The operation control unit 62a can also control the swing drive unit 60 with the control signal Sk.
By feeding back the speed information Ve, the operation control unit 62a can control the speed of the predetermined portion (the bucket 46) to a desired value; feeding back this differential element of the position information Pe also suppresses overshoot and hunting in the position control. Likewise, by feeding back the acceleration information Ae, the operation control unit 62a can control the acceleration of the predetermined portion (the bucket 46) to a desired value, and feeding back this differential element of the speed information Ve suppresses overshoot and hunting in the speed control. Thus, according to the present embodiment, highly responsive and accurate control can be realized.
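The disclosure leaves the control algorithm unspecified; as one hedged illustration, the sketch below shows a single-axis proportional-derivative style loop in which the estimated speed Ve supplies the damping term that suppresses overshoot and hunting. The gain values are placeholders.

```python
def valve_command(Ps, Pe, Ve, kp=1.2, kd=0.4):
    """One axis of a position feedback loop.

    Ps: target position, Pe: estimated position, Ve: estimated speed.
    The proportional term drives the bucket toward the target; subtracting
    the speed term damps the motion so the position control does not overshoot.
    The returned value would be scaled into a hydraulic valve command Sj.
    """
    return kp * (Ps - Pe) - kd * Ve
```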
Next, the abnormality detection unit 24 will be described. The abnormality detection unit 24 detects a defective state or a failure state of the working unit 40 based on the estimated posture information Ke and outputs the detection result to the higher-level control system. Fig. 8 is a block diagram schematically showing the abnormality detection unit 24. As shown in the figure, the abnormality detection unit 24 of the present embodiment includes an expected posture calculation unit 24b, an abnormality determination unit 24c, and a state output unit 24d. The expected posture calculation unit 24b calculates information on the posture that the working unit 40 is expected to take as a result of being controlled (hereinafter referred to as "expected posture information Ka"). That is, the expected posture calculation unit 24b determines the posture that the working unit 40 should have as a result of the control.
In the present embodiment, the expected posture calculation unit 24b simulates the opening and closing of the hydraulic valves 58 and the extension/contraction lengths L1, L2, L3 of the hydraulic cylinders 56a, 56b, 56c based on the control signal Sj, and determines the expected posture information Ka from those lengths. As another example, the hydraulic pressure supplied to the hydraulic cylinders 56a, 56b, 56c may be detected by a hydraulic sensor, and the expected posture calculation unit 24b may simulate the extension/contraction lengths L1, L2, L3 from the detection result of the hydraulic sensor, or from both the detection result and the control signal Sj.
The abnormality determination unit 24c determines abnormality in the operation of the working unit 40 based on the control signal Sj and the estimated posture information Ke derived from the image data Gj. Specifically, the abnormality determination unit 24c determines a defective state or a failure state of the working unit 40 based on the difference dK between the estimated posture information Ke and the expected posture information Ka. The abnormality determination unit 24c may grade the difference dK against a plurality of thresholds and output the grade as the determination result J24. For example, the abnormality determination unit 24c uses a first threshold and a second threshold larger than the first threshold. It may determine a normal state when the difference dK is smaller than the first threshold, a defective state when the difference dK is equal to or larger than the first threshold and smaller than the second threshold, and a failure state when the difference dK is equal to or larger than the second threshold. In this specification, a state in which operation should be stopped immediately because failure is highly likely is defined as a failure state, and a state in which failure is liable to occur and which should be inspected as early as possible is defined as a defective state.
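This two-threshold grading can be summarized in a few lines. A sketch under the naming used above, with the threshold values left as parameters since the disclosure does not fix them:

```python
def grade_abnormality(dK, first_threshold, second_threshold):
    """Grade the posture deviation dK between Ke and Ka into the three states."""
    if dK < first_threshold:
        return "normal"          # working unit behaves as expected
    if dK < second_threshold:
        return "defective"       # failure likely; inspect as early as possible
    return "failure"             # stop operation immediately
```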
The state output unit 24d presents the determination result J24 of the abnormality determination unit 24c to a higher-level control system or a predetermined device. In this example, the state output unit 24d displays the determination result J24 of the abnormality determination unit 24c on a display unit 38d provided in the control room 38. The state output unit 24d may display the determination result of the abnormality determination unit 24c on a portable terminal (not shown) held by the operator or the manager via a communication means such as a network.
Next, the appearance detection unit 26 will be described. When a structural member that should appear in the image, for example any of the hydraulic cylinders 56a, 56b, and 56c, is not visible in the data Gj of the image acquired by the image information acquisition unit 12, the appearance detection unit 26 determines that the exterior of the working unit 40 is damaged. The appearance detection unit 26 detects whether there is an abnormality in the appearance of the working unit 40 based on the estimated posture information Ke and outputs the detection result to the higher-level control system. Fig. 9 is a block diagram schematically showing the appearance detection unit 26. As shown in the figure, the appearance detection unit 26 of the present embodiment includes a reference image generation unit 26b, an appearance determination unit 26c, and a state output unit 26d.
The reference image generating unit 26b generates, based on the estimated posture information Ke, data relating to an image of the working unit 40 in that posture in a state considered normal (hereinafter referred to as "data Gh of the reference image"). The data Gh of the reference image may be data relating to an image of the working unit 40 in its normal initial state. Image data of each part of the working unit 40 is stored in the storage unit 30 in advance, and the reference image generating unit 26b can synthesize the stored image data of the parts based on the estimated posture information Ke to generate the data Gh of the reference image.
The appearance determination unit 26c determines whether the appearance of the working unit 40 is abnormal based on the image data Gj and the reference image data Gh. Specifically, the appearance determination unit 26c compares the image data Gj with the reference image data Gh generated by the reference image generation unit 26b and judges abnormality of the appearance of the working unit 40 based on the degree of difference between them (hereinafter referred to as "difference Sh"). In this example, the difference Sh is low when the similarity of the two images is high and high when their similarity is low.
The appearance determination unit 26c may grade the difference Sh against a plurality of thresholds and output the grade as the determination result J26. For example, the appearance determination unit 26c may use a first threshold and a second threshold larger than the first threshold, and determine that there is no appearance abnormality when the difference Sh is smaller than the first threshold, that there is a partial appearance abnormality when the difference Sh is equal to or larger than the first threshold and smaller than the second threshold, and that there is an appearance abnormality when the difference Sh is equal to or larger than the second threshold.
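The disclosure does not fix how the degree of difference Sh is computed; one simple assumed metric is the mean absolute pixel difference between the camera image Gj and the generated reference image Gh, graded with the same two-threshold pattern:

```python
import numpy as np

def appearance_difference(Gj, Gh):
    """Degree of difference Sh: low when the two images are similar."""
    return float(np.mean(np.abs(Gj.astype(float) - Gh.astype(float))))

def judge_appearance(Sh, first_threshold, second_threshold):
    if Sh < first_threshold:
        return "no appearance abnormality"
    if Sh < second_threshold:
        return "partial appearance abnormality"
    return "appearance abnormality"
```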
The state output unit 26d presents the determination result J26 of the appearance determination unit 26c to a higher-level control system or a predetermined device. In this example, the state output unit 26d displays the determination result of the appearance determination unit 26c on a display unit 38d provided in the cab 38. The state output unit 26d may display the determination result of the appearance determination unit 26c on a portable terminal (not shown) held by the operator or the manager via a communication means such as a network.
Referring back to fig. 2, the image information correction unit 20j, the environment information reception unit 20g, the individual information holding unit 20h, the background information removal unit 20k, and the color information removal unit 20m will be described.
The image information correction unit 20j of the present embodiment corrects the image data Gj based on information about the surrounding environment at the time the data was acquired, the age of the work machine, and individual differences between work machines. The environment information receiving unit 20g and the individual information holding unit 20h supply correction information to the image information correcting unit 20j.
The environmental information receiving unit 20g receives the acquisition result from the environmental information acquisition unit 14, specifically the environment information Mp. Based on the environment information Mp, the image information correction unit 20j corrects the brightness and color temperature of the image data Gj so that they match the brightness and color temperature of the reference image data Gs. This configuration reduces estimation errors caused by the surrounding environment.
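As a hedged sketch of one such correction, the code below applies a simple gain so that the mean brightness of the image data Gj matches that of the reference images; the disclosure states only that brightness and color temperature are matched, not the method.

```python
import numpy as np

def match_brightness(Gj, reference_mean):
    """Scale image Gj so its mean brightness equals that of the reference images.

    Gj: uint8 image array; reference_mean: mean brightness of the data Gs.
    A single multiplicative gain is an assumption; color temperature would
    need an analogous per-channel correction.
    """
    gain = reference_mean / max(float(Gj.mean()), 1e-6)
    return np.clip(Gj.astype(float) * gain, 0, 255).astype(np.uint8)
```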
The appearance of the working unit 40 varies from one work machine 100 to another, and these individual differences can cause posture estimation errors. The individual information holding unit 20h of the present embodiment therefore holds individual information Me for each machine. The individual information Me includes the age of the work machine 100 and individual differences in appearance due to damage to the working unit 40, attached matter, deformation, and the like. The image information correction unit 20j corrects the image data Gj based on the individual information Me. This configuration reduces estimation errors caused by individual differences.
The image data Gj includes a background image that differs for each site where the work machine 100 operates, and this background can cause posture estimation errors. The background information removing unit 20k of the present embodiment therefore removes information on the background image from the image data Gj. This configuration reduces estimation errors caused by the background image.
If the reference image data Gs and the image data Gj were stored and processed as full-color image data, the amount of data would be large, which is disadvantageous in terms of processing speed and storage capacity. The color information removing unit 20m of the present embodiment therefore removes color information from the data Gs of the reference image and the data Gj of the image, converting them to grayscale image data. This configuration reduces the amount of data to be stored and processed, which is advantageous for processing speed and storage capacity.
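A minimal sketch of the color-information removal, using the common ITU-R BT.601 luminance weights (the disclosure does not specify the conversion):

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 RGB image to grayscale, discarding color information.

    Single-channel frames roughly cut the stored and processed data to one
    third, which is the advantage the text describes.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return (0.299 * r + 0.587 * g + 0.114 * b).astype(np.uint8)
```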
The operation of the information processing apparatus 10 configured as described above will now be described. Fig. 10 is a flowchart showing the operation of the information processing apparatus 10. The figure shows operation S70, in which a posture estimation model is generated by machine learning. The posture information acquiring unit 16 is mounted on work machine 100 in advance, and operation S70 starts when the administrator inputs an instruction to create the model.
When it is time for model generation (yes in step S71), the control unit 20 of the information processing apparatus 10 receives the data Gs of the reference image and the data Ks of the posture from the image information acquisition unit 12 and the posture information acquisition unit 16 (step S72). In this step, the control unit 20 varies the posture of the working unit 40 widely within its movable range and, each time the posture changes, receives the data Gs of the reference image and the data Ks of the posture and stores them in the storage unit 30.
The background information removing unit 20k removes information on the background image from the data Gs of the reference image (step S73). The color information removing unit 20m removes color information from the data Gs of the reference image (step S74). The removal of the background image and of the color information may be performed each time data Gs of the reference image is received, or may be performed on the data Gs of the reference image already stored in the storage unit 30.
The model generation unit 22 generates a posture estimation model by machine learning based on the data Gs of the reference image, from which the background image and color information have been removed, and the data Ks of the posture, and stores the model in the model storage unit 32 (step S75). Once the posture estimation model is saved, operation S70 ends. After operation S70 is completed, the posture information acquiring unit 16 may be detached from work machine 100.
If the model generation time is not reached (no in step S71), S72 to S75 are skipped. The operation S70 is merely an example, and the order of the steps may be changed, or some steps may be added, deleted, or changed.
Fig. 11 is a flowchart showing the operation of the information processing apparatus 10. The figure shows an operation S80 of estimating the posture of the working unit 40 from the data Gj of the image using the posture estimation model. In the case of the non-learning operation, operation S80 is started at the time when the administrator inputs the instruction for posture estimation.
When it is time for posture estimation (yes in step S81), the control unit 20 of the information processing apparatus 10 receives the image data Gj from the image information acquisition unit 12 (step S82). In this step, the received image data Gj is stored in the storage unit 30.
The background information removing unit 20k removes information on the background image from the data Gj of the image (step S83). The color information removing unit 20m removes color information from the data Gj of the image (step S84).
The image information correction unit 20j corrects the data Gj of the image based on the individual information Me held in the individual information holding unit 20h (step S85). The removal of the background image, the removal of the color information, and the correction of the image are performed with respect to the data Gj of the image stored in the storage section 30.
The posture estimating unit 20c estimates the posture of the working unit 40 by applying the posture estimation model to the image data Gj after background removal, color information removal, and image correction (step S86). In this step, the estimated posture information Ke is output from the posture estimation model.
The posture estimating unit 20c transmits the estimated posture information Ke output from the posture estimation model to the outside of the information processing device 10 (step S87), for example to work machine control unit 62. Once the estimated posture information Ke has been transmitted, operation S80 ends. Operation S80 is repeated until the instruction for posture estimation is no longer given.
If the time of posture estimation is not reached (no in step S81), S82 to S87 are skipped. The operation S80 is merely an example, and the order of the steps may be changed, or some of the steps may be added, deleted, or changed.
Fig. 12 is a flowchart showing the operation of the information processing apparatus 10. The figure shows operation S90, in which the position, speed, and acceleration of the working unit 40 are determined based on the estimated posture information Ke and the results are transmitted to work machine control unit 62. During non-learning operation, operation S90 starts when the administrator inputs an instruction to detect the speed and acceleration.
When the time of detecting the speed and the acceleration is reached (yes in step S91), the position estimating unit 20d generates position information Pe of a predetermined portion (bucket 46) of the working unit 40 based on the estimated posture information Ke, and sends the position information Pe to the working machine control unit 62 and stores the position information Pe in the storage unit 30 (step S92).
If the position information Pe is stored, the speed estimating unit 20e generates speed information Ve of the working unit 40 based on the position information Pe, sends the speed information Ve to the working machine control unit 62, and stores the speed information Ve in the storage unit 30 (step S93).
If the speed information Ve is stored, the acceleration estimating unit 20f generates acceleration information Ae of the work unit 40 based on the speed information Ve, transmits the acceleration information Ae to the work machine control unit 62, and stores the acceleration information Ae in the storage unit 30 (step S94). If the acceleration information Ae is stored, operation S90 ends. Operation S90 is repeatedly performed until the instruction to detect the speed and the acceleration is no longer given.
If it is not time to detect the speed and acceleration (no in step S91), S92 to S94 are skipped. The operation S90 is merely an example, and the order of the steps may be changed, or some steps may be added, deleted, or changed.
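The chain of steps S92 to S94 (position, then speed, then acceleration) amounts to successive finite differences over the image acquisition times. A minimal sketch under that reading, with hypothetical function names and example values:

# Hypothetical sketch of steps S93 and S94: deriving the speed
# information Ve and the acceleration information Ae by finite
# differences over positions Pe estimated at times separated by a
# known time difference dt.
import numpy as np

def estimate_velocity(p_prev: np.ndarray, p_curr: np.ndarray,
                      dt: float) -> np.ndarray:
    # Ve from two position estimates of the bucket 46 taken dt apart.
    return (p_curr - p_prev) / dt

def estimate_acceleration(v_prev: np.ndarray, v_curr: np.ndarray,
                          dt: float) -> np.ndarray:
    # Ae from two successive velocity estimates.
    return (v_curr - v_prev) / dt

# Example: bucket tip positions (x, y) in meters, images 0.1 s apart.
p0, p1, p2 = np.array([2.0, 1.0]), np.array([2.1, 1.1]), np.array([2.3, 1.2])
v01 = estimate_velocity(p0, p1, 0.1)       # approx. [1.0, 1.0] m/s
v12 = estimate_velocity(p1, p2, 0.1)       # approx. [2.0, 1.0] m/s
ae = estimate_acceleration(v01, v12, 0.1)  # approx. [10.0, 0.0] m/s^2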
Fig. 13 is a flowchart showing the operation of the information processing apparatus 10. The figure shows an operation S100 in which a defective state or a failure state of the working unit 40, or the presence or absence of an abnormality in the appearance of the working unit 40, is detected based on the estimated posture information Ke, and the result of the detection is output to a display unit 38d provided in the cab 38. In the non-learning operation, operation S100 is started at the time when the administrator inputs the instruction for state detection.
When the state detection timing is reached (yes in step S101), the predicted posture calculation unit 24b acquires the control signal Sj from the operation control unit 62a of the work machine control unit 62 and determines the predicted posture information Ka (step S102).
If the predicted posture information Ka is determined, the abnormality determination unit 24c determines a defective state or a failure state of the working unit 40 based on the difference dK between the estimated posture information Ke and the predicted posture information Ka (step S103).
If the state of the working unit 40 is determined, the state output unit 24d causes the display unit 38d to display the determination result of the abnormality determination unit 24c (step S104).
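Steps S102 to S104 compare the image-derived estimate Ke with the prediction Ka computed from the control signal Sj, and treat a large difference dK as a sign of a defective state or a failure state. A minimal sketch of the determination in step S103, in which the joint-angle representation and both thresholds are illustrative assumptions:

# Hypothetical sketch of step S103: classifying the state of the
# working unit 40 from the difference dK between the estimated
# posture Ke (from images) and the predicted posture Ka (from the
# control signal Sj). Thresholds are illustrative assumptions.
import numpy as np

DEFECT_THRESHOLD = 5.0    # degrees; response deviates noticeably
FAILURE_THRESHOLD = 20.0  # degrees; working unit barely responds

def judge_state(ke_deg: np.ndarray, ka_deg: np.ndarray) -> str:
    dk = float(np.max(np.abs(ke_deg - ka_deg)))  # worst joint deviation
    if dk >= FAILURE_THRESHOLD:
        return "failure"
    if dk >= DEFECT_THRESHOLD:
        return "defective"
    return "normal"

# Example: boom, arm, and bucket angles in degrees.
print(judge_state(np.array([30.0, 45.0, 10.0]),
                  np.array([31.0, 52.0, 10.0])))  # prints "defective"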
If the above determination result is displayed, the reference image generation unit 26b generates the data Gh of the reference image corresponding to the posture based on the data Ks of the posture (step S105).
If the data Gh of the reference image is generated, the appearance determination unit 26c determines whether or not there is an abnormality in the appearance of the working unit 40 based on the degree of difference Sh between the data Gh of the reference image and the data Gj of the image (step S106).
If the appearance state of the working unit 40 is determined, the state output unit 26d causes the display unit 38d to display the determination result of the appearance determination unit 26c (step S107). If the above determination result is displayed, operation S100 ends. Operation S100 is repeatedly performed until the instruction for state detection is no longer given.
If the time of the state detection is not reached (no in step S101), S102 to S107 are skipped. The operation S100 is merely an example, and the order of the steps may be changed, or some steps may be added, deleted, or changed.
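Steps S105 to S107 reduce to computing the degree of difference Sh between the generated reference image Gh and the actual image Gj and checking it against a criterion. A minimal sketch, in which the mean-absolute-difference metric and the threshold are illustrative assumptions:

# Hypothetical sketch of step S106: judging an appearance abnormality
# of the working unit 40 from the degree of difference Sh between the
# data Gh of the reference image and the data Gj of the image. The
# metric and the threshold are illustrative assumptions.
import numpy as np

def degree_of_difference(gh: np.ndarray, gj: np.ndarray) -> float:
    # Sh as the mean absolute pixel difference of two grayscale images.
    return float(np.mean(np.abs(gh.astype(np.int16) - gj.astype(np.int16))))

def appearance_abnormal(gh: np.ndarray, gj: np.ndarray,
                        threshold: float = 12.0) -> bool:
    return degree_of_difference(gh, gj) > threshold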
The features of the information processing apparatus 10 according to the present embodiment configured as described above will now be described. The information processing device 10 includes: a storage unit 30 that stores correspondence information generated by associating the data Gs of the reference image of the working unit 40 with the data Ks of the posture of the working unit 40; an image information acquisition unit 12 that acquires the data Gj of an image of the working unit 40 for comparison with the data Gs of the reference image; and a speed estimating unit 20e that generates speed information Ve relating to the speed of the working unit 40 based on the data Gj of the image and the correspondence information. According to this configuration, the speed information Ve relating to the speed of the working unit 40 can be generated from the data Gj of the image acquired by the image information acquiring unit 12, the data Gs of the reference image, and the data Ks of the posture. The correspondence information may be the posture estimation model described above.
The image information acquisition unit 12 may be configured to rotate integrally with the working unit 40. In this case, since the positional relationship between the image information acquiring unit 12 and the working unit 40 is fixed, the speed information Ve of the working unit 40 can be easily generated.
The information processing apparatus 10 may further include a posture estimating unit 20c, and the posture estimating unit 20c may estimate the posture of the working unit 40 based on the data Gj of the image and the correspondence information. In this case, the posture of the working unit 40 can be easily estimated from the data Gj of the image.
The storage unit 30 may store a posture estimation model generated by machine learning based on the data Gs of the reference image and the data Ks of the posture. In this case, a posture estimation model generated by machine learning can be utilized.
The speed estimating unit 20e may generate the speed information Ve based on the data Gj of a plurality of images acquired by the image information acquiring unit 12 at a plurality of times having a time difference, and on the time difference. In this case, the speed information Ve can be generated by simple arithmetic processing.
The information processing apparatus 10 may further include an acceleration estimating unit 20f, and the acceleration estimating unit 20f may generate acceleration information Ae relating to the acceleration of the working unit 40 based on the plurality of pieces of speed information Ve generated by the speed estimating unit 20e and the time difference. In this case, the acceleration information Ae can be generated by simple arithmetic processing.
The information processing apparatus 10 may further include an abnormality determination unit 24c, and the abnormality determination unit 24c may determine an abnormality in the operation of the working unit 40 based on the data Gj of the image acquired by the image information acquisition unit 12 and a control signal for controlling the operation of the working unit 40. In this case, an abnormal operation can be detected from the data Gj of the image and the control signal, even without using a special sensor.
The information processing apparatus 10 may further include an appearance determination unit 26c, and the appearance determination unit 26c may determine an abnormality in the appearance of the working unit 40 based on the data Gj of the image acquired by the image information acquisition unit 12 and the data Gh of the reference image of the working unit 40 in a state considered to be normal. In this case, an appearance abnormality can be detected from the data Gj of the image and the data Gh of the reference image in the normal state, even without using a special sensor.
The information processing apparatus 10 may further include an image information correction unit 20j, and the image information correction unit 20j may correct the data Gj of the image based on information about the surrounding environment at the time the data Gj of the image was acquired, the age of the work machine 100, and individual differences of the work machine 100. In this case, the data Gj of the image can be corrected so as to improve the estimation accuracy of the posture.
The information processing apparatus 10 may further include a color information removing unit 20m, and the color information removing unit 20m may compress or remove the color information of the data Gs of the reference image. This is advantageous in terms of the storage capacity of the storage unit 30 and the processing speed.
The information processing apparatus 10 may further include a background information removing unit 20k, and the background information removing unit 20k may remove the background image from the data Gs of the reference image. In this case, a decrease in estimation accuracy due to the background image can be suppressed.
Next, the second to fifth embodiments of the present invention will be described. In the drawings and the description of the second to fifth embodiments, the same or equivalent components and members as those of the first embodiment are denoted by the same reference numerals. Descriptions overlapping with the first embodiment are omitted as appropriate, and structures different from the first embodiment are mainly described.
Second embodiment
A second embodiment of the present invention is an information processing method for a work machine. The information processing method includes the steps of: storing, in the storage unit 30, correspondence information generated by associating the data Gs of the reference image of the working unit 40 of the working machine 100 with the data Ks of the posture of the working unit 40 (S72 to S75); acquiring the data Gj of an image of the working unit 40 for comparison with the data Gs of the reference image (S82); and generating the speed information Ve based on the data Gj of a plurality of images acquired at a plurality of times having a time difference and on the correspondence information (S86).
The step of generating the velocity information Ve may include a step of referencing a posture estimation model generated by machine learning based on the data Gs of the reference image and the data Ks of the posture. That is, the correspondence information may be the posture estimation model. According to the configuration of the second embodiment, the same operational effects as those of the first embodiment are achieved.
Third embodiment
A third embodiment of the present invention is a construction machine 1000. The construction machine 1000 includes: a working unit 40; a storage unit 30 that stores correspondence information generated by associating data Gs of the reference image of the working unit 40 with data Ks of the posture of the working unit 40; an image information acquisition unit 12 that acquires data Gj of an image of the working unit 40 to compare with data Gs of a reference image; and a speed estimating unit 20e that generates speed information Ve relating to the speed of the working unit 40 based on the image data Gj and the correspondence information. The correspondence information may be the above-described pose estimation model.
The construction machine 1000 may be, for example, a machine that performs a construction operation by moving a bucket 46 attached to an arm mechanism 48. Instead of the bucket, various accessory members such as a fork, a hammer, and a breaker may be attached to the arm mechanism 48 of the construction machine 1000. According to the structure of the third embodiment, the same operational effects as those of the first embodiment are achieved.
Fourth embodiment
A fourth embodiment of the invention is a computer program P100. The computer program P100 is for causing a computer to realize the following functions: storing, in the storage unit 30, correspondence information generated by associating data Gs of the reference image of the working unit 40 with data Ks of the posture of the working unit 40; acquiring data Gj of an image of the working unit 40 to compare with data Gs of a reference image; and generating speed information Ve of the working unit 40 based on the data Gj of the plurality of images acquired at the plurality of times having the time difference, and the correspondence information. The correspondence information may be the above-described pose estimation model.
In the computer program P100, these functions may be implemented as an application program including a plurality of modules corresponding to the functional blocks of the control unit 20, and the program may be installed in a storage device (for example, the storage unit 30) of the information processing apparatus 10. The computer program P100 can be read into the main memory of a processor (e.g., a CPU) of the information processing apparatus 10 and executed. According to the structure of the fourth embodiment, the same operational effects as those of the first embodiment are achieved.
Fifth embodiment
A fifth embodiment of the present invention is an information processing system 1. Fig. 14 is a block diagram schematically showing a work machine 100 using the information processing system 1 of the present embodiment, and corresponds to Fig. 2. The information processing system 1 is the same as the first embodiment in that speed information Ve relating to the speed of the work unit 40 is estimated based on the data Gj of the image of the work unit 40 of the work machine 100. The information processing system 1 differs from the information processing apparatus 10 of the first embodiment in that a part of its structure is provided in the web server 120. In this example, the web server 120 is provided with an estimating unit 120j.
The information processing system 1 of the present embodiment includes an information processing apparatus 110 and a web server 120. The information processing apparatus 110 and the web server 120 communicate with each other via a communication network NW that utilizes wired communication or wireless communication. As the communication network NW, a general-purpose network such as the internet or a dedicated network can be used. As the web server 120, a cloud server also called a cloud computing system can be employed.
The information processing apparatus 110 includes an acquisition unit 12 and a control unit 20. The control unit 20 includes an image information receiving unit 20a, an image information transmitting unit 20p, and an estimation result receiving unit 20q. The acquisition unit 12 acquires data Gj of an image of the working unit 40 of the working machine 100. The image information transmitting unit 20p transmits the data Gj of the image acquired by the acquiring unit 12 to the web server 120 via the communication network NW.
The web server 120 includes a storage unit 120m, an estimation unit 120j, a reception unit 120q, and a transmission unit 120p. The estimating unit 120j includes a posture estimating unit 120c, a position estimating unit 120d, a speed estimating unit 120e, and an acceleration estimating unit 120f. The storage unit 120m stores correspondence information Ci generated by associating the data Gs of the reference image of the working unit 40 of the working machine 100 with the data Ks of the posture of the working unit 40. The correspondence information Ci of this example is a posture estimation model generated by machine learning from the data Gs of the reference image and the data Ks of the posture. The posture estimation model is generated in advance from data of a reference work machine and stored in the model storage unit 120n.
The receiving unit 120q receives the data Gj of the image. The posture estimation unit 120c operates in the same manner as the posture estimation unit 20c to generate estimated posture information Ke. The position estimating unit 120d operates in the same manner as the position estimating unit 20d to generate the position information Pe. The speed estimating unit 120e operates in the same manner as the speed estimating unit 20e to generate speed information Ve. In this example, the speed estimating unit 120e generates speed information Ve on the speed of the working unit 40 based on the correspondence information Ci (posture estimating model) and the received image data Gj. The acceleration estimating unit 120f operates in the same manner as the acceleration estimating unit 20f to generate acceleration information Ae. The transmitting unit 120p transmits the estimated posture information Ke, the position information Pe, the velocity information Ve, and the acceleration information Ae to the information processing device 110 via the communication network NW.
The estimation result receiving unit 20q of the information processing apparatus 110 receives the estimated posture information Ke, the position information Pe, the velocity information Ve, and the acceleration information Ae from the web server 120. The estimation result receiving unit 20q transmits the estimation posture information Ke, the position information Pe, the velocity information Ve, and the acceleration information Ae to the work machine control unit 62. The work machine control unit 62 operates in the same manner as the work machine control unit of the first embodiment to control the operation of the work machine 100.
The information processing system 1 configured as described above operates similarly to the information processing apparatus 10 of the first embodiment, and functions and effects similar to those described above are achieved. In addition, since the estimating unit 120j is provided in the web server 120, the estimation accuracy can be improved using a more advanced algorithm. In addition, a plurality of work machines can be supported by one web server.
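From the information processing apparatus 110 side, this division of labor reduces to shipping the data Gj of the image over the communication network NW and reading the estimates back. A minimal client-side sketch, in which the endpoint URL, the payload format, and the JSON field names are all illustrative assumptions:

# Hypothetical sketch of the fifth embodiment's client side: the image
# information transmitting unit 20p sends Gj to the web server 120 and
# the estimation result receiving unit 20q reads back Ke, Pe, Ve, and
# Ae. The URL and field names are illustrative assumptions.
import requests

def request_estimates(image_png: bytes,
                      url: str = "https://example.com/estimate") -> dict:
    response = requests.post(url, files={"gj": ("gj.png", image_png)},
                             timeout=5.0)
    response.raise_for_status()
    # e.g. {"Ke": [...], "Pe": [...], "Ve": [...], "Ae": [...]}
    return response.json()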
Examples of embodiments of the present invention have been described above in detail. The above embodiments are merely concrete examples for carrying out the present invention. The content of the embodiments does not limit the technical scope of the present invention, and various design changes such as modification, addition, and deletion of constituent elements can be made without departing from the spirit of the invention defined in the claims. In the above-described embodiments, content open to such design changes has been described with expressions such as "of the embodiment" and "in the embodiment", but this does not mean that design changes are impermissible for content lacking such expressions.
Modification example
Next, modifications will be described. In the drawings and description of the modifications, the same or equivalent components and members as those of the embodiments are denoted by the same reference numerals. Descriptions overlapping with the embodiments are omitted as appropriate, and the focus is on structures different from the first embodiment.
In the first embodiment, the speed information Ve and the acceleration information Ae are shown as examples of information on the speed and the acceleration of the bucket 46, but the present invention is not limited to this. The speed information Ve and the acceleration information Ae may be information on the speed and the acceleration of portions of the working unit 40 other than the bucket 46, such as the arm 44 and the boom 42. The velocity information Ve and the acceleration information Ae may also include information on the velocity and the acceleration of such parts or portions at two or more points.
In the first embodiment, an example is shown in which the speed information Ve and the acceleration information Ae are fed back to control the speed and the acceleration, but the present invention is not limited to this. For example, the information processing apparatus 10 may control the avoidance operation for avoiding contact between the working unit 40 and the surrounding person or object based on at least one of the position information Pe, the velocity information Ve, and the acceleration information Ae.
In the first embodiment, an example in which the posture estimation model is generated by each work machine 100 and stored in the model storage unit 32 of that work machine 100 is shown, but the present invention is not limited to this. The posture estimation model may be generated by a reference work machine and stored in advance in the model storage unit 32 of each work machine 100. In addition, the posture estimation model may be updated at an appropriate timing.
In the description of the first embodiment, an example in which the image information acquisition unit 12 is constituted by one image sensor is shown, but the present invention is not limited to this. The image information acquiring unit 12 may be constituted by a plurality of image sensors. For example, the image information acquisition unit 12 may include a so-called stereo camera.
In the description of the first embodiment, an example is shown in which the image information acquiring unit 12 is provided on the ceiling of the cab 38, but the present invention is not limited to this. For example, the image information acquisition unit 12 may be disposed on a side surface of the cab 38 or on a cover of the upper vehicle body 34. The image information acquiring unit 12 may also be disposed on the working unit 40.
In the description of the first embodiment, the working machine 100 is shown as an example of a construction machine that performs a construction operation by moving the bucket 46, but the present invention is not limited to this, and can be applied to working machines other than construction machines.
In the first embodiment, the example in which the color information removing unit 20m completely removes the color information from the data Gs of the reference image and the data Gj of the image is shown, but the present invention is not limited to this. The color information removing unit 20m may compress the color information of the data Gs of the reference image and the data Gj of the image by color reduction or the like.
In the description of the first embodiment, an example in which the arm mechanism 48 is provided on the right side of the cab 38 is shown, but the present invention is not limited to this. For example, the arm mechanism may be provided on the left side of the cab or in front of the cab.
In the description of the first embodiment, an example in which the operator manipulates the work machine 100 from the cab 38 is shown, but the present invention is not limited thereto. For example, the work machine may be a machine that is automatically or remotely operated.
In the description of the fifth embodiment, the control unit 20 is shown as not including the individual information holding unit 20h, the environmental information receiving unit 20g, the image information correcting unit 20j, the background information removing unit 20k, the color information removing unit 20m, the abnormality detecting unit 24, and the appearance detecting unit 26, but the present invention is not limited thereto. For example, the control unit 20 according to the fifth embodiment may include one or more of the individual information holding unit 20h, the environment information receiving unit 20g, the image information correcting unit 20j, the background information removing unit 20k, the color information removing unit 20m, the abnormality detecting unit 24, and the appearance detecting unit 26.
The modified example described above has the same operation and effects as those of the first embodiment.
Any combination of the above-described embodiments and modifications is also useful as an embodiment of the present invention. A new embodiment produced by such a combination has the effects of each of the embodiments and modifications combined.

Claims (13)

1. An information processing device is provided with:
a storage unit that stores correspondence information indicating a correspondence between data of a reference image of a working unit of a work machine and data of a posture of the working unit of the work machine;
an acquisition unit that acquires data of an image of a working unit of the working machine, and compares the data with the data of the reference image; and
a speed estimating unit that generates speed information on a speed of a work unit of the work machine based on the correspondence information and the data of the image,
wherein the speed estimating section generates the speed information based on the data of the plurality of images acquired by the acquiring section at a plurality of times having a time difference and the time difference.
2. The information processing apparatus according to claim 1, wherein,
the acquisition unit is configured to rotate integrally with a work unit of the work machine.
3. The information processing apparatus according to claim 1 or 2, wherein,
the device further includes an estimating unit that estimates a posture of a work unit of the work machine based on the data of the image and the correspondence information.
4. The information processing apparatus according to claim 1, wherein,
the storage unit stores a posture estimation model generated by machine learning based on the data of the reference image and the data of the posture.
5. The information processing apparatus according to claim 1, wherein,
the device further includes an acceleration estimation unit that generates acceleration information on acceleration of a work unit of the work machine based on the plurality of pieces of speed information generated by the speed estimation unit and the time difference.
6. The information processing apparatus according to claim 1, wherein,
and an abnormality determination unit configured to determine an abnormality in operation of a work unit of the work machine based on the data of the image acquired by the acquisition unit and a control signal for controlling the operation of the work unit of the work machine.
7. The information processing apparatus according to claim 1, wherein,
the apparatus further includes an appearance determination unit that determines that an appearance of a work unit of the work machine is abnormal based on the data of the image acquired by the acquisition unit and the data of the image of the work unit of the work machine in a normal state.
8. An information processing device is provided with:
a storage unit that stores a posture estimation model that is generated by machine learning from data of a reference image of a working unit of a working machine and data of a posture of the working unit of the working machine;
an acquisition unit configured to rotate integrally with a working unit of the working machine, and to acquire data of an image of the working unit of the working machine for comparison with data of the reference image;
a posture estimating unit that estimates a posture of a working unit of the working machine with reference to the posture estimating model based on the data of the image acquired by the acquiring unit;
a position estimating unit that estimates a position of a predetermined portion of a work unit of the work machine based on estimated posture information on a posture of the work unit of the work machine estimated by the posture estimating unit; and
a speed estimating unit that generates speed information on a speed of a working unit of the working machine based on position information on the position of the predetermined part estimated by the position estimating unit,
wherein the speed estimating section generates the speed information based on the data of the plurality of images acquired by the acquiring section at a plurality of times having a time difference and the time difference.
9. An information processing system is provided with:
an information processing device that acquires and transmits data of an image of a work unit of a work machine; and
a web server that estimates speed information on a speed of a work unit of the work machine based on the data of the image received from the information processing device and correspondence information indicating a correspondence between data of a reference image of the work unit of the work machine and data of a posture of the work unit of the work machine, and transmits the speed information to the information processing device,
wherein the web server estimates the speed information based on data of a plurality of images acquired and transmitted by the information processing apparatus at a plurality of times having a time difference and the time difference.
10. An information processing method comprising the steps of:
storing correspondence information indicating a correspondence between data of a reference image of a working unit of a work machine and data of a posture of the working unit of the work machine;
acquiring data of an image of a work portion of the work machine for comparison with the data of the reference image; and
generating speed information based on data of a plurality of images acquired at a plurality of times having a time difference, the time difference, and the correspondence information.
11. The information processing method according to claim 10, wherein,
the step of generating the speed information includes a step of referencing a pose estimation model generated by machine learning from the data of the reference image and the data of the pose.
12. A computer program for causing a computer to perform the following functions:
storing correspondence information indicating a correspondence between data of a reference image of a working unit of a work machine and data of a posture of the working unit of the work machine;
acquiring data of an image of a work portion of the work machine for comparison with the data of the reference image; and
generating speed information of a work unit of the work machine based on data of a plurality of images acquired at a plurality of times having a time difference, the time difference, and the correspondence information.
13. A construction machine is provided with:
a working unit of a working machine;
a storage unit that stores correspondence information indicating a correspondence between data of a reference image of a working unit of the work machine and data of a posture of the working unit of the work machine;
an acquisition unit that acquires data of an image of a working unit of the working machine, and compares the data with the data of the reference image; and
a speed estimating unit that generates speed information on a speed of a work unit of the work machine based on the data of the image acquired by the acquiring unit and the correspondence information,
wherein the speed estimating section generates the speed information based on the data of the plurality of images acquired by the acquiring section at a plurality of times having a time difference and the time difference.
CN202010939634.9A 2019-09-10 2020-09-09 Information processing device, information processing system, information processing method, computer program, and construction machine Active CN112482460B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-164810 2019-09-10
JP2019164810A JP7282002B2 (en) 2019-09-10 2019-09-10 Information processing device, information processing system, information processing method, computer program, construction machine

Publications (2)

Publication Number Publication Date
CN112482460A CN112482460A (en) 2021-03-12
CN112482460B true CN112482460B (en) 2023-06-27

Family

ID=74863806

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010939634.9A Active CN112482460B (en) 2019-09-10 2020-09-09 Information processing device, information processing system, information processing method, computer program, and construction machine

Country Status (3)

Country Link
JP (1) JP7282002B2 (en)
KR (1) KR20210030875A (en)
CN (1) CN112482460B (en)

Also Published As

Publication number Publication date
CN112482460A (en) 2021-03-12
JP7282002B2 (en) 2023-05-26
KR20210030875A (en) 2021-03-18
JP2021042569A (en) 2021-03-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant