CN115457096A - Auxiliary control method, device and system for working machine and working machine - Google Patents

Auxiliary control method, device and system for working machine and working machine

Info

Publication number
CN115457096A
CN115457096A (application number CN202210989203.2A)
Authority
CN
China
Prior art keywords
working
scene
key part
working machine
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210989203.2A
Other languages
Chinese (zh)
Inventor
李大鼎
王威
储海军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sany Heavy Machinery Ltd
Original Assignee
Sany Heavy Machinery Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sany Heavy Machinery Ltd filed Critical Sany Heavy Machinery Ltd
Priority to CN202210989203.2A
Publication of CN115457096A
Legal status: Pending

Classifications

    • G06T7/50 Depth or shape recovery
    • E02F9/2025 Particular purposes of control systems not otherwise provided for
    • E02F9/2037 Coordinating the movements of the implement and of the frame
    • E02F9/264 Sensors and their calibration for indicating the position of the work tool
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T3/4046 Scaling of whole images or parts thereof using neural networks
    • G06T7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06V10/23 Image preprocessing by selection of a specific region, based on positionally close patterns or neighbourhood relationships
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06T2200/32 Indexing scheme for image data processing or generation involving image mosaicing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mining & Mineral Resources (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Operation Control Of Excavators (AREA)

Abstract

The invention relates to the field of working machines, and provides an auxiliary control method, device and system for a working machine, and a working machine. The method includes: acquiring image data of the working environment where the working machine is located and attitude data of the working machine; establishing a working scene based on the image data, and determining pose information of a key part of the working machine based on the attitude data; fusing the pose information of the key part with the working scene to obtain position information of the key part in the working scene; and generating auxiliary control data based on the position information of the key part and the working surface in the working scene, so as to perform auxiliary control on the working machine. Because the auxiliary control data is generated by fusing the pose information of the key part with the working scene and combining it with the working surface in the working scene, the efficiency and reliability of the auxiliary control process are improved, solving the problem that existing auxiliary control processes can hardly assist an operator in controlling the working machine efficiently and reliably.

Description

Auxiliary control method, device and system for working machine and working machine
Technical Field
The invention relates to the technical field of working machines, and in particular to an auxiliary control method, device and system for a working machine, and a working machine.
Background
The working efficiency of a working machine is determined to a great extent by the operator's control accuracy and control efficiency. To improve both, environmental information and attitude information of the working machine can be acquired, and such field data can be used to assist the operator in controlling the working machine more conveniently.
However, because this field data is scattered and unintuitive for the operator, it is difficult to assist the operator in controlling the working machine efficiently and reliably.
Disclosure of Invention
The invention provides an auxiliary control method, device and system for a working machine, and a working machine, to overcome the defect in the prior art that an auxiliary control process realized directly from field data can hardly assist an operator in controlling the working machine efficiently and reliably.
In a first aspect, the present invention provides a method of assisting control of a work machine, the method including:
acquiring image data of a working environment where a working machine is located and attitude data of the working machine;
establishing a working scene of the working machine based on the image data, and determining pose information of key parts of the working machine based on the attitude data of the working machine;
fusing the pose information of the key part with the working scene to obtain position information of the key part in the working scene;
and generating auxiliary control data based on the position information of the key part and the working surface in the working scene so as to perform auxiliary control on the working machine.
According to the auxiliary control method for a working machine of the present invention, acquiring the image data of the working environment where the working machine is located includes:
acquiring a two-dimensional image and depth information of at least one target scene in the working environment;
determining the image data based on the two-dimensional image and the depth information of the at least one target scene.
According to the assist control method for a working machine according to the present invention, the creating of the working scene of the working machine based on the image data includes:
respectively splicing the two-dimensional image of each target scene with the depth information to obtain a three-dimensional image of each target scene;
and splicing the three-dimensional images of the target scenes to obtain the operation scene.
According to the method for assisting control of a work machine of the present invention, generating assist control data based on the position information of the key part and the work surface in the work scene includes:
determining a working area corresponding to the key part and distance information between the key part and the working area in the working scene based on the position information of the key part and the working surface in the working scene;
and generating auxiliary control data based on the position information of the key part, the working area and the distance information.
According to the auxiliary control method for a working machine of the present invention, determining, in the working scene, the working area corresponding to the key part and the distance information between the key part and the working area based on the position information of the key part and the working surface in the working scene includes:
determining the operation direction of the key part based on the position information of the key part;
determining the position of a falling point of the key part based on the intersection point of the operation direction and the operation surface;
determining the working area based on the position of the drop point and the working surface;
and determining distance information between the key part and the working area based on the position information of the key part and the position of the drop point.
According to the method for assisting in controlling a working machine of the present invention, after the work area corresponding to the key part and the information on the distance between the key part and the work area are determined in the work scene, the method further includes:
determining a work guidance parameter corresponding to the distance information based on a preset corresponding relation;
and the preset corresponding relation is the corresponding relation between the distance information and the operation guidance parameter.
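The preset correspondence above can be realized as a simple lookup table mapping distance bands to guidance parameters. The sketch below is a minimal illustration; the distance bands and parameter values are invented for the example and are not taken from the patent.

```python
# Hypothetical correspondence table: distance band (metres) -> guidance
# parameter. Both the band boundaries and the parameter strings are
# illustrative assumptions, not values from the patent.
PRESET_CORRESPONDENCE = [
    (0.5, "slow: fine positioning"),
    (2.0, "medium: approach"),
    (float("inf"), "fast: free travel"),
]

def guidance_parameter(distance):
    """Look up the work guidance parameter for a given key-part-to-working-
    area distance, using the first band whose upper bound covers it."""
    for upper_bound, parameter in PRESET_CORRESPONDENCE:
        if distance <= upper_bound:
            return parameter
    raise ValueError("unreachable: the table covers all distances")

print(guidance_parameter(0.3))   # slow: fine positioning
print(guidance_parameter(5.0))   # fast: free travel
```

A table of this shape keeps the distance-to-guidance mapping configurable without changing control code.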
According to the auxiliary control method for a working machine according to the present invention, the fusing the pose information of the key part with the working scene to obtain the position information of the key part in the working scene includes:
and converting the pose information of the key part into a coordinate system corresponding to the operation scene to obtain the position information of the key part in the operation scene.
In a second aspect, the present invention also provides an auxiliary control device for a working machine, the device including:
the system comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring image data of a working environment where a working machine is located and attitude data of the working machine;
the first processing module is used for establishing a working scene of the working machine based on the image data and determining the position and attitude information of the key part of the working machine based on the attitude data of the working machine;
the second processing module is used for fusing the pose information of the key part with the working scene to obtain the position information of the key part in the working scene;
and the third processing module is used for generating auxiliary control data based on the position information of the key part and the working surface in the working scene so as to perform auxiliary control on the working machine.
In a third aspect, the present invention further provides an auxiliary control system for a working machine, the system including an environment data acquisition device, an attitude data acquisition device and a controller, wherein the environment data acquisition device and the attitude data acquisition device are connected with the controller, and the controller is further connected with the working machine;
the environment data acquisition equipment is used for acquiring image data of a working environment where the working machine is located;
the attitude data acquisition equipment is used for acquiring attitude data of the working machine;
the controller is used for acquiring the image data and the attitude data; establishing a working scene of the working machine based on the image data, and determining pose information of a key part of the working machine based on the attitude data; fusing the pose information of the key part with the working scene to obtain position information of the key part in the working scene; and generating auxiliary control data based on the position information of the key part and the working surface in the working scene, so as to perform auxiliary control on the working machine.
In a fourth aspect, the present invention further provides a working machine that executes any one of the above-described auxiliary control methods for a working machine, or that includes the above-described auxiliary control device.
In a fifth aspect, the present invention further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method for assisting control of a work machine according to any one of the above aspects when executing the program.
According to the auxiliary control method, device and system for a working machine and the working machine provided by the invention, the working scene of the working machine is established based on image data of the working environment where the working machine is located, the pose information of the key part of the working machine is determined based on the attitude data of the working machine, and the pose information of the key part is fused with the working scene to obtain the position information of the key part in the working scene; auxiliary control data is then generated based on the position information of the key part and the working surface in the working scene, so as to perform auxiliary control on the working machine. Because the auxiliary control data is generated by fusing the pose information of the key part with the working scene and combining it with the working surface in the working scene, the pose information can be displayed to the operator in a more concentrated and intuitive manner, improving the efficiency and reliability of the auxiliary control process.
Drawings
To illustrate the technical solutions of the present invention or of the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a flowchart illustrating an assist control method of a work machine according to the present invention;
FIG. 2 is a schematic diagram of the positions of sensors on the excavator according to the embodiment of the invention;
FIG. 3 is a partial structural view of a D-H model of an excavator according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the principle of coordinate transformation of the center point of the bucket tooth tip;
FIG. 5 is a schematic diagram illustrating the principle of obtaining the working area and distance information of the excavator;
fig. 6 is a schematic configuration diagram of an auxiliary control device of a working machine according to the present invention;
FIG. 7 is a schematic diagram of an auxiliary control system for a working machine according to the present invention;
fig. 8 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
An auxiliary control method, an auxiliary control device, an auxiliary control system, and a work machine according to embodiments of the present invention are described below with reference to fig. 1 to 8.
Fig. 1 illustrates an assist control method for a work machine according to an embodiment of the present invention, including:
step 101: acquiring image data of a working environment where a working machine is located and attitude data of the working machine;
step 102: establishing an operation scene of the operation machine based on the image data, and determining pose information of key parts of the operation machine based on the attitude data of the operation machine;
step 103: fusing the pose information of the key part with the operation scene to obtain the position information of the key part in the operation scene;
step 104: the assist control data is generated based on the position information of the key portion and the work surface in the work scene to assist control of the work machine.
In this embodiment, the image data of the working environment where the working machine is located may be acquired by one or more image data acquisition devices, for example by one or a combination of a laser radar, a binocular camera and a camera with a TOF (Time of Flight) function; specifically, the camera may be an Intel RealSense depth camera, a ZED binocular camera or a Kinect depth sensor.
In this embodiment, the position information of the key portion may be understood as position coordinate data of the key portion in a coordinate system corresponding to the operation scene.
In this embodiment, the auxiliary control data may be status data of the working machine that can be visually presented to the operator, such as image data marking the positions of the relevant key parts, or other data that can assist the operator in controlling the working machine, such as the working area corresponding to a key part and the distance to that working area, which can be presented in the working scene and displayed to the operator intuitively.
In an exemplary embodiment, acquiring the image data of the working environment where the working machine is located may specifically include:
acquiring a two-dimensional image and depth information of at least one target scene in a working environment;
image data is determined based on the two-dimensional image and the depth information of the at least one target scene.
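Combining a two-dimensional image with per-pixel depth amounts to deprojecting each pixel into a 3D point using the camera's intrinsic parameters. The following is a minimal pinhole-model sketch; the intrinsic values in the example are illustrative assumptions, not calibration data from the patent.

```python
def deproject(u, v, depth, fx, fy, cx, cy):
    """Map a pixel (u, v) with depth (metres) to a 3D point in the camera
    frame using the pinhole model: x = (u - cx) * depth / fx, and
    similarly for y; z is the depth itself."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Illustrative intrinsics (fx, fy, cx, cy are assumptions for a 640x480
# sensor, not values from the patent):
point = deproject(u=320, v=240, depth=2.0,
                  fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(point)  # (0.0, 0.0, 2.0): the principal-point pixel maps onto the optical axis
```

Applying this to every pixel of a depth frame yields the point cloud that, together with the RGB image, forms the image data of one target scene.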
In this embodiment, the image data specifically includes the two-dimensional image and depth information of at least one target scene in the working environment. In practical application, if the two-dimensional images and depth information of a plurality of target scenes need to be acquired and data acquisition is performed by a plurality of image data acquisition devices, the data acquired by these devices can be aligned to a unified coordinate system through an alignment algorithm. The alignment can be realized from the intrinsic parameter matrix of each camera and the coordinate transformation relations among the cameras, and may use alignment optimization algorithms such as BA (Bundle Adjustment) or dense-residual optimization.
In an exemplary embodiment, establishing a work scene of the work machine based on the image data may specifically include:
respectively splicing the two-dimensional image and the depth information of each target scene to obtain a three-dimensional image of each target scene;
and splicing the three-dimensional images of the target scenes to obtain a working scene.
The process of establishing the working scene of the working machine based on the image data in this embodiment may be understood as reconstructing a three-dimensional working scene from the image data, which may be implemented by a SLAM (Simultaneous Localization and Mapping) algorithm. The two-dimensional image is an RGB image and can be acquired by a camera, and the depth information is point cloud data and can be acquired by a laser radar. In practical application, the two-dimensional image and depth information of each target scene can be spliced using the multi-channel information and pose information of the image data acquisition devices together with a corresponding pose estimation algorithm, to obtain the three-dimensional image of each target scene.
After the three-dimensional image of each target scene is obtained, the three-dimensional images are registered and completed over the global range of the working machine's working area, so they can be preprocessed before splicing: when some data is found to be missing, it can be recorded and completed, and this global completion can also serve as a strategy for updating the scene data at fixed points. The completed three-dimensional images of the target scenes are then spliced to obtain the working scene.
In this embodiment, the data accuracy of the reconstructed working scene may be improved by a reconstruction optimization algorithm, such as an ICP (Iterative Closest Point) algorithm, a dense-residual optimization algorithm or a feature-point alignment algorithm. Through such optimization, point cloud data of the three-dimensional reconstruction of the working surface in the working scene can be obtained, realizing the three-dimensional construction of the working scene.
In this embodiment, the attitude data of the working machine may be acquired by a plurality of sets of attitude sensors mounted on the body and the working devices of the working machine. As shown in fig. 2, taking an excavator 200 as an example of the working machine: a bucket attitude sensor 201 and a bucket tilt sensor 202 are disposed on the bucket, an arm attitude sensor 203 and an arm tilt sensor 204 on the arm, a boom attitude sensor 205 and a boom tilt sensor 206 on the boom, a body tilt sensor 208 on the body, and a swing sensor 207 on the swing mechanism of the excavator 200. Through the sensors mounted on the excavator 200, the attitude data of the body and of each working device can be collected, thereby acquiring the attitude data of the working machine.
When the attitude data of the working machine is acquired through a Global Navigation Satellite System (GNSS), the data is easily lost in remote or sheltered areas, which affects the accuracy of the auxiliary control. Considering this, the present embodiment collects the attitude data with the rotation, tilt and attitude sensors mounted on the body and working devices of the working machine, which effectively avoids the problem of base-station signal loss in remote and sheltered areas and further improves the accuracy of the auxiliary control.
Further, the image data acquisition device 209 may be installed at the top of the front end of the body of the excavator 200, so that image data of the working environment can be collected while the excavator 200 travels.
After the attitude data of the working machine is obtained, preprocessing operations such as time-series statistics and noise filtering can be performed on the collected attitude data to determine the attitude data of the body and of each working device, so that the pose information of the key part can be conveniently extracted.
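The noise-filtering step mentioned above can take many forms; a sliding-window mean is one minimal option. The sketch below is illustrative only (the window size is an assumption, and the patent does not specify a particular filter).

```python
from collections import deque

def moving_average_filter(samples, window=5):
    """Smooth a time series of attitude samples (e.g. joint angles in
    degrees) with a sliding-window mean; a minimal stand-in for the
    noise-filtering preprocessing described above."""
    buf = deque(maxlen=window)
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out

noisy = [10.0, 10.4, 9.6, 10.2, 9.8, 10.0]
print(moving_average_filter(noisy, window=3))
```

In practice, a filter matched to the sensor noise characteristics (e.g. a Kalman filter) would typically replace this simple mean.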
In this embodiment, the key part of the working machine mainly refers to the part that contacts the working surface in the working environment, that is, the actuator of the working operation. Taking an excavator as an example, the key part is the bucket, and the pose information of the key part can be understood as position coordinate data of the key part in the coordinate system corresponding to the working machine.
In practical application, the attitude data of the working machine includes the attitude data of the body and of each working device, each in a different coordinate system. For convenience of analysis, a corresponding D-H (Denavit-Hartenberg) model can be established according to the physical parameters of the body and working devices, and the attitude data of each part can be unified into the base coordinate system of the working machine.
Still taking the excavator as the working machine in this embodiment, fig. 3 shows part of the D-H model of the excavator. The X-Y coordinate system is the base coordinate system of the excavator, and the m-n coordinate system is the coordinate system corresponding to the working scene (i.e., the three-dimensional reconstruction coordinate system). The physical parameters of the body and working devices include the relevant length and inclination data: L1 is the straight-line distance between the boom swing point and the arm swing point, L2 the distance between the arm swing point and the bucket swing point, and L3 the distance between the bucket swing point and the bucket tooth tip; θ1 is the angle between the line connecting the boom swing point and the arm swing point and the X axis of the X-Y coordinate system, θ2 the angle between the line connecting the arm swing point and the bucket swing point and the extension of the line connecting the boom swing point and the arm swing point, and θ3 the angle between the line connecting the bucket swing point and the bucket tooth tip and the extension of the line connecting the arm swing point and the bucket swing point.
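Given the lengths L1-L3 and angles θ1-θ3 defined above, the tooth-tip position in the base coordinate system follows from planar forward kinematics over the boom/arm/bucket chain. The sketch below assumes the angles accumulate with a counter-clockwise-positive sign convention; the actual sign convention and the numeric link lengths are assumptions, not values from the patent.

```python
import math

def tooth_tip_position(L1, L2, L3, theta1, theta2, theta3):
    """Planar forward kinematics for the boom/arm/bucket chain of fig. 3.
    theta1 is measured from the X axis; theta2 and theta3 are measured
    from the extension of the previous link (the signs here are an
    assumed convention, not taken from the patent)."""
    a1 = theta1
    a2 = a1 + theta2   # arm direction relative to the base X axis
    a3 = a2 + theta3   # bucket direction relative to the base X axis
    x = L1 * math.cos(a1) + L2 * math.cos(a2) + L3 * math.cos(a3)
    y = L1 * math.sin(a1) + L2 * math.sin(a2) + L3 * math.sin(a3)
    return (x, y)

# Fully stretched chain along X: tip lies at (L1 + L2 + L3, 0),
# up to floating-point rounding (link lengths are illustrative).
print(tooth_tip_position(5.7, 2.9, 1.5, 0.0, 0.0, 0.0))
```

This gives the coordinate P of the tooth-tip center in the excavator base frame; the subsequent sections convert it into the working-scene frame.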
In an exemplary embodiment, fusing pose information of a key part with a job scene to obtain position information of the key part in the job scene, including:
and converting the pose information of the key position into a coordinate system corresponding to the operation scene to obtain the position information of the key position in the operation scene.
In this embodiment, the process of fusing the pose information of the key part with the working scene can be understood as performing a coordinate transformation on the key part.
Based on the physical parameters of the body and working devices of the working machine, such as the distance and angle data shown in fig. 3, the pose information of the key part can be converted into the coordinate system corresponding to the working scene through forward and inverse kinematics calculation, or through a pre-trained neural network model performing the coordinate transformation. Taking the bucket tooth tip of the excavator as the key part, referring to fig. 4, after the coordinate P of the tooth-tip center point in the base coordinate system of the excavator is obtained, it can be converted into the coordinate system corresponding to the working scene through rotation and translation, wherein the three-dimensional coordinate system X_W-Y_W-Z_W is the base coordinate system of the excavator, the three-dimensional coordinate system X_C-Y_C-Z_C is the coordinate system corresponding to the working scene, namely the camera coordinate system, and T is the translation vector.
The coordinate P' is obtained after the coordinate conversion is performed in this manner; the converted coordinate P' is then matched with the point cloud data of the bucket in the working scene to determine the final position of the bucket tooth tip, that is, the position information of the bucket tooth tip.
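The rotation-and-translation conversion of P into P' described above is the rigid transform P' = R·P + T. A minimal sketch follows; the rotation and translation values in the example are illustrative assumptions, not calibration data from the patent.

```python
import math

def transform_point(R, T, p):
    """Apply the rigid transform P' = R @ P + T, mapping a point from the
    excavator base coordinate system X_W-Y_W-Z_W into the camera (working
    scene) coordinate system X_C-Y_C-Z_C. R is a 3x3 rotation matrix as
    nested lists; T and p are 3-vectors."""
    return tuple(sum(R[i][k] * p[k] for k in range(3)) + T[i]
                 for i in range(3))

# Illustrative extrinsics (an assumption, not calibration data from the
# patent): rotate 90 degrees about Z, then shift along X_C.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
T = [1.0, 0.0, 0.0]
p_prime = transform_point(R, T, (1.0, 0.0, 0.0))
print(p_prime)  # approximately (1.0, 1.0, 0.0)
```

In a real system, R and T come from extrinsic calibration between the machine body and the camera rig.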
Specifically, the fusion of the position information of the key part with the working scene may be implemented by an existing matching algorithm; the matching algorithm adopted in this embodiment may be the ICP (Iterative Closest Point) algorithm. In the fusion process, the position information of the key part can be adjusted and optimized using the residual between the target point cloud and the current point cloud of the key part, improving the positioning accuracy of the key part in the coordinate system corresponding to the working scene.
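To make the ICP principle concrete, the following is a minimal point-to-point ICP in 2D: match each source point to its nearest target point, then solve the least-squares rotation and translation in closed form and iterate. This is an illustration of the algorithm only; the patent applies ICP to 3D point clouds, and production systems use k-d trees and robust rejection rather than brute-force matching.

```python
import math

def icp_2d(source, target, iterations=20):
    """Minimal point-to-point ICP in 2D (illustrative sketch)."""
    src = list(source)
    for _ in range(iterations):
        # 1. nearest-neighbour correspondences (brute force)
        matched = [min(target,
                       key=lambda q: (q[0]-p[0])**2 + (q[1]-p[1])**2)
                   for p in src]
        # 2. centroids of the two matched sets
        cx_s = sum(p[0] for p in src) / len(src)
        cy_s = sum(p[1] for p in src) / len(src)
        cx_t = sum(q[0] for q in matched) / len(matched)
        cy_t = sum(q[1] for q in matched) / len(matched)
        # 3. closed-form 2D rotation minimising the squared residual
        sxx = sum((p[0]-cx_s)*(q[0]-cx_t) + (p[1]-cy_s)*(q[1]-cy_t)
                  for p, q in zip(src, matched))
        sxy = sum((p[0]-cx_s)*(q[1]-cy_t) - (p[1]-cy_s)*(q[0]-cx_t)
                  for p, q in zip(src, matched))
        a = math.atan2(sxy, sxx)
        c, s = math.cos(a), math.sin(a)
        tx = cx_t - (c*cx_s - s*cy_s)
        ty = cy_t - (s*cx_s + c*cy_s)
        # 4. apply the incremental transform to the source points
        src = [(c*p[0] - s*p[1] + tx, s*p[0] + c*p[1] + ty) for p in src]
    return src

# Recover a small rotation + shift of an L-shaped template
target = [(0, 0), (1, 0), (2, 0), (2, 1)]
a = 0.1
rotated = [(math.cos(a)*x - math.sin(a)*y + 0.2,
            math.sin(a)*x + math.cos(a)*y - 0.1) for x, y in target]
aligned = icp_2d(rotated, target)
print(aligned[0])  # close to (0, 0)
```

The per-iteration residual this loop minimises is exactly the "residual between the target point cloud and the current point cloud" used above to refine the key-part position.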
In an exemplary embodiment, the generating of the auxiliary control data based on the position information of the key part and the working surface in the working scene may specifically include:
determining a working area corresponding to the key part and distance information between the key part and the working area in the working scene based on the position information of the key part and the working surface in the working scene;
the assist control data is generated based on the position information, the work area, and the distance information of the key portion.
The auxiliary control data may be a working-scene image annotated with information on the relevant key parts, working areas, and distances. The image can be displayed to the operator through an AR (Augmented Reality) display device, further helping the operator control the working machine accurately and efficiently. The AR display device in this embodiment may be a display screen, a HUD projection device, AR glasses, or another display device.
Further, determining a working area corresponding to the key part and distance information between the key part and the working area in the working scene based on the position information of the key part and the working surface in the working scene may specifically include:
determining the operation direction of the key part based on the position information of the key part;
determining the position of a falling point of a key part based on the intersection point of the operation direction and the operation surface;
determining a working area based on the position of the drop point and the working surface;
and determining distance information between the key part and the working area based on the position information of the key part and the position of the drop point.
In this embodiment, the working direction of the key part may be determined from the direction in which the key part points. Referring to Fig. 5, and still taking the key part as the bucket tooth tip of an excavator, the direction in which the tooth tip points is the excavation direction, indicated by arrow 501 in Fig. 5, and the intersection of the excavation direction with the excavation surface 504 is the drop-point position. Alternatively, the drop-point position may be determined from the current excavation region: in this embodiment, the region directly below the center point of the tooth tip is taken as the current excavation region 502, and the intersection of the current excavation region 502 with the excavation surface 504 in the fused three-dimensional working scene is the drop-point position. The drop distance 503 in the current excavation direction is then determined from the tooth tip and the drop-point position, and the location of the current excavation region 502 and the drop distance 503 are marked on the AR display image as annotation information to obtain the auxiliary control data.
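The intersection of the excavation direction with the excavation surface is an ordinary ray–plane intersection. A minimal sketch, under the assumption that the surface can be locally approximated by a plane given by a point p0 and a normal n:

```python
import numpy as np

def drop_point(tip: np.ndarray, direction: np.ndarray,
               p0: np.ndarray, n: np.ndarray):
    """Intersect the ray from the tooth tip along the working direction
    with the (locally planar) working surface; return the drop point and
    the drop distance, or None if the ray never meets the plane."""
    d = direction / np.linalg.norm(direction)
    denom = d @ n
    if abs(denom) < 1e-9:          # ray parallel to the surface
        return None
    t = ((p0 - tip) @ n) / denom
    if t < 0:                      # surface lies behind the tip
        return None
    return tip + t * d, t          # drop point, drop distance (503 in Fig. 5)

# Tooth tip 2 m above a horizontal surface, digging straight down.
point, dist = drop_point(np.array([1.0, 0.0, 2.0]),
                         np.array([0.0, 0.0, -1.0]),
                         np.array([0.0, 0.0, 0.0]),   # point on surface
                         np.array([0.0, 0.0, 1.0]))   # surface normal
# point -> [1, 0, 0], dist -> 2.0
```

A real excavation surface is a reconstructed point cloud rather than a plane, so an implementation would intersect against a local surface fit rather than a single global plane.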
In a scene requiring fixed-point excavation, the current excavation region changes dynamically while the tooth tip executes the working action. By taking the target excavation region of the working action as target region 505 and determining the distance 506 between the drop-point position and target region 505 in real time, target region 505 and distance 506 can be displayed on the AR image in real time, making the auxiliary control data more complete.
In an exemplary embodiment, after determining the working area corresponding to the key part and the distance information between the key part and the working area in the working scene, the method may further include:
determining a work guidance parameter corresponding to the distance information based on a preset corresponding relation;
the preset corresponding relation is the corresponding relation between the distance information and the operation guidance parameter.
The work guidance parameter in this embodiment refers to a parameter that guides the operator in accurately controlling the working machine, such as key parameters like excavation depth and excavation force.
In practical application, the correspondence between the distance information and the work guidance parameters — that is, the work guidance parameter corresponding to each distance value — can be preset using the manufacturer's data on the working torque of the working machine, such as the excavating-force distribution data of an excavator. With the distance information known, the corresponding work guidance parameter can then be determined directly from this correspondence. Including the work guidance parameter as part of the auxiliary control data makes the data more informative and intuitive, guides the operator to make maximum use of the machine's performance, and improves the efficiency and precision of the auxiliary control.
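A preset correspondence of this kind can be as simple as a lookup table interpolated over the drop distance. The table values below are placeholders for illustration only; in practice they would be derived from the manufacturer's excavating-force distribution data:

```python
import numpy as np

# Hypothetical preset correspondence: drop distance (m) -> guidance values.
DIST  = np.array([0.5, 1.0, 2.0, 3.0])        # distance to working area (m)
DEPTH = np.array([0.2, 0.4, 0.6, 0.8])        # suggested dig depth (m)
FORCE = np.array([60.0, 90.0, 120.0, 150.0])  # suggested dig force (kN)

def guidance(distance: float) -> dict:
    """Interpolate the work guidance parameters for a given drop distance."""
    return {
        "dig_depth_m": float(np.interp(distance, DIST, DEPTH)),
        "dig_force_kN": float(np.interp(distance, DIST, FORCE)),
    }

result = guidance(1.5)  # midway between the 1.0 m and 2.0 m table rows
```

Once the drop distance is known, the lookup is a constant-time operation, so the guidance values can be refreshed on the AR display at the same rate as the distance itself.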
The present invention also provides an assist control device for a working machine; the device described below and the assist control method described above may be referred to in correspondence with each other.
Fig. 6 shows an assist control device for a work machine according to an embodiment of the present invention, including:
an obtaining module 601, configured to obtain image data of a working environment where a working machine is located and attitude data of the working machine;
a first processing module 602, configured to establish a working scene of the working machine based on the image data, and determine pose information of a key part of the working machine based on pose data of the working machine;
the second processing module 603 is configured to fuse the pose information of the key part with the operation scene to obtain position information of the key part in the operation scene;
a third processing module 604 is configured to generate auxiliary control data based on the position information of the key part and the working plane in the working scene, so as to perform auxiliary control on the working machine.
In an exemplary embodiment, the obtaining module 601 obtains the image data of the work environment where the work machine is located specifically by the following processes:
acquiring a two-dimensional image and depth information of at least one target scene in a working environment;
image data is determined based on the two-dimensional image and the depth information of the at least one target scene.
In an exemplary embodiment, the first processing module 602 implements establishing a work scene of the work machine based on the image data by specifically:
respectively splicing the two-dimensional image and the depth information of each target scene to obtain a three-dimensional image of each target scene;
and splicing the three-dimensional images of the target scenes to obtain a working scene.
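Combining a two-dimensional image with its per-pixel depth into a three-dimensional image is commonly done by back-projecting each pixel through the camera intrinsics. A minimal sketch with an assumed pinhole intrinsic matrix K (the actual camera model and parameters are not specified in this document):

```python
import numpy as np

# Assumed pinhole intrinsics: focal lengths fx, fy and principal point cx, cy.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def backproject(depth: np.ndarray, rgb: np.ndarray) -> np.ndarray:
    """Turn an HxW depth map and an HxWx3 colour image into an Nx6 coloured
    point cloud (X, Y, Z, R, G, B), dropping pixels with no depth reading."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]
    z = depth.ravel()
    valid = z > 0
    x = (u.ravel() - K[0, 2]) * z / K[0, 0]
    y = (v.ravel() - K[1, 2]) * z / K[1, 1]
    pts = np.column_stack([x, y, z, rgb.reshape(-1, 3)])
    return pts[valid]

# Toy frame: a flat surface 2 m away, all pixels black.
cloud = backproject(np.full((4, 4), 2.0), np.zeros((4, 4, 3)))
```

The per-scene clouds produced this way are what the subsequent stitching step merges into the overall working scene.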
In an exemplary embodiment, the third processing module 604 may be specifically configured to:
determining a working area corresponding to the key part and distance information between the key part and the working area in the working scene based on the position information of the key part and the working surface in the working scene;
the assist control data is generated based on the position information, the work area, and the distance information of the key portion.
Further, the third processing module 604 may specifically determine, based on the position information of the key part and the working plane in the working scene, a working area corresponding to the key part and distance information between the key part and the working area in the working scene by the following processes:
determining the operation direction of the key part based on the position information of the key part;
determining the position of a falling point of a key part based on the intersection point of the operation direction and the operation surface;
determining a working area based on the position of the drop point and the working surface;
and determining distance information between the key part and the working area based on the position information of the key part and the position of the drop point.
Further, the third processing module 604 may be further configured to:
determining operation guidance parameters corresponding to the distance information based on a preset corresponding relation;
wherein, the preset corresponding relation is the corresponding relation between the distance information and the operation guidance parameter.
In an exemplary embodiment, the second processing module 603 may specifically be configured to:
and converting the pose information of the key position into a coordinate system corresponding to the operation scene to obtain the position information of the key position in the operation scene.
In summary, in the auxiliary control apparatus for a working machine provided by the embodiments of the present invention, the first processing module establishes a working scene of the working machine based on image data of the working environment and determines the pose information of a key part of the working machine based on the machine's attitude data; the second processing module fuses the pose information of the key part with the working scene to obtain the position information of the key part in the working scene; and the third processing module generates auxiliary control data based on the position information of the key part and the working surface in the working scene, so as to perform auxiliary control on the working machine.
Fig. 7 shows an assist control system for a work machine according to an embodiment of the present invention, including: the system comprises an environment data acquisition device 701, an attitude data acquisition device 702 and a controller 703, wherein the environment data acquisition device 701 and the attitude data acquisition device 702 are both connected with the controller 703, and the controller 703 is also connected with a working machine 704;
the environment data acquisition device 701 is used for acquiring image data of a working environment where the working machine 704 is located;
attitude data collection device 702 is used to collect attitude data for work machine 704;
the controller 703 is configured to obtain image data and pose data; establishing a work scene for work machine 704 based on the image data, and determining pose information for key portions of work machine 704 based on the pose data; fusing the pose information of the key part with the operation scene to obtain the position information of the key part in the operation scene; assist control data is generated based on the position information of the key portion and the work surface in the work scene to assist control of work machine 704.
The controller 703 may be the vehicle controller of the working machine 704, or a controller added to implement the auxiliary control function; the environment data acquisition device 701 and the attitude data acquisition device 702 may be mounted on the working machine 704.
In this embodiment, the environment data collecting device 701 may be a camera or a video camera capable of collecting two-dimensional images and depth information, and the posture data collecting device 702 may be various sensors capable of collecting tilt angles, poses, and rotation angles.
In an exemplary embodiment, the controller 703 may specifically obtain image data of a work environment where the work machine is located by the following processes:
acquiring a two-dimensional image and depth information of at least one target scene in a working environment;
image data is determined based on the two-dimensional image and the depth information of the at least one target scene.
In an exemplary embodiment, the controller 703 may specifically realize the establishment of the work scene of the work machine based on the image data by the following processes:
respectively splicing the two-dimensional image and the depth information of each target scene to obtain a three-dimensional image of each target scene;
and splicing the three-dimensional images of the target scenes to obtain a working scene.
In an exemplary embodiment, the controller 703 may specifically generate the auxiliary control data based on the position information of the key part and the working plane in the working scene through the following processes:
determining a working area corresponding to the key part and distance information between the key part and the working area in the working scene based on the position information of the key part and the working surface in the working scene;
the auxiliary control data is generated based on the position information, the work area, and the distance information of the key portion.
Further, the controller 703 may specifically determine a working area corresponding to the key part and distance information between the key part and the working area in the working scene based on the position information of the key part and the working plane in the working scene by the following processes:
determining the operation direction of the key part based on the position information of the key part;
determining the position of a falling point of a key part based on the intersection point of the operation direction and the operation surface;
determining a working area based on the position of the drop point and the working surface;
and determining distance information between the key part and the working area based on the position information of the key part and the position of the drop point.
In an exemplary embodiment, the controller 703 may be further configured to:
determining a work guidance parameter corresponding to the distance information based on a preset corresponding relation;
wherein, the preset corresponding relation is the corresponding relation between the distance information and the operation guidance parameter.
In an exemplary embodiment, the controller 703 may specifically fuse the pose information of the key part with the job scene through the following processes to obtain the position information of the key part in the job scene:
and converting the pose information of the key position into a coordinate system corresponding to the operation scene to obtain the position information of the key position in the operation scene.
In summary, the auxiliary control system for a working machine according to the embodiments of the present invention generates, through the cooperation of the environment data acquisition device, the attitude data acquisition device, and the controller, auxiliary control data for assisting control of the working machine. Because the auxiliary control data is generated by fusing the pose information of the key part with the working scene and combining it with the working surface in that scene, the information can be displayed to the operator in a more concentrated and intuitive way, improving the efficiency and reliability of the auxiliary control process.
Furthermore, an embodiment of the present invention provides a working machine that executes the method for assisting control of the working machine, or includes an assist control device of the working machine, or includes an assist control system of the working machine.
In this embodiment, the working machine may be an excavator, a crane, or another engineering vehicle. By executing the auxiliary control method for a working machine, or by including the auxiliary control device or the auxiliary control system for a working machine, the auxiliary control data can be displayed to the operator in a more concentrated and intuitive way, improving the efficiency and reliability of the auxiliary control process and making the machine's working process safer and more reliable.
Fig. 8 illustrates a physical structure diagram of an electronic device. As shown in fig. 8, the electronic device may include: a processor (processor) 801, a communication Interface (Communications Interface) 802, a memory (memory) 803 and a communication bus 804, wherein the processor 801, the communication Interface 802 and the memory 803 communicate with one another via the communication bus 804. Processor 801 may invoke logic instructions in memory 803 to perform a method of assisting control of a work machine, the method comprising: acquiring image data of a working environment where a working machine is located and attitude data of the working machine; establishing an operation scene of the operation machine based on the image data, and determining pose information of key parts of the operation machine based on the attitude data of the operation machine; fusing the pose information of the key part with the operation scene to obtain the position information of the key part in the operation scene; the assist control data is generated based on the position information of the key portion and the work surface in the work scene to assist control of the work machine.
In addition, the logic instructions in the memory 803 may be implemented in the form of software functional units and, when sold or used as an independent product, stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program codes.
In another aspect, the present disclosure also provides a computer program product including a computer program stored on a non-transitory computer-readable storage medium, the computer program including program instructions, when the program instructions are executed by a computer, the computer being capable of executing the auxiliary control method for a working machine provided by the above embodiments, the method including: acquiring image data of a working environment where a working machine is located and attitude data of the working machine; establishing an operation scene of the operation machine based on the image data, and determining pose information of a key part of the operation machine based on the pose data of the operation machine; fusing the pose information of the key part with the operation scene to obtain the position information of the key part in the operation scene; auxiliary control data is generated based on the position information of the key portion and the work surface in the work scene to perform auxiliary control on the work machine.
In yet another aspect, the present invention also provides a non-transitory computer-readable storage medium, on which a computer program is stored, the computer program being executed by a processor to implement the auxiliary control method for a working machine provided in the above embodiments, the method including: acquiring image data of a working environment where a working machine is located and attitude data of the working machine; establishing an operation scene of the operation machine based on the image data, and determining pose information of key parts of the operation machine based on the attitude data of the operation machine; fusing the pose information of the key part with the operation scene to obtain the position information of the key part in the operation scene; auxiliary control data is generated based on the position information of the key portion and the work surface in the work scene to perform auxiliary control on the work machine.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (11)

1. An assist control method for a working machine, comprising:
acquiring image data of a working environment where a working machine is located and attitude data of the working machine;
establishing a working scene of the working machine based on the image data, and determining pose information of key parts of the working machine based on the attitude data of the working machine;
fusing the pose information of the key position with the operation scene to obtain the position information of the key position in the operation scene;
and generating auxiliary control data based on the position information of the key part and the working surface in the working scene so as to perform auxiliary control on the working machine.
2. The method of assisting in controlling a work machine according to claim 1, wherein the acquiring image data of a work environment in which the work machine is located includes:
acquiring a two-dimensional image and depth information of at least one target scene in the working environment;
determining the image data based on the two-dimensional image and depth information of the at least one target scene.
3. The method of assisting in controlling a work machine according to claim 2, wherein the creating of the work scene of the work machine based on the image data includes:
respectively splicing the two-dimensional image of each target scene with the depth information to obtain a three-dimensional image of each target scene;
and splicing the three-dimensional images of the target scenes to obtain the operation scene.
4. The method according to claim 1, wherein the generating assistance control data based on the position information of the key part and the work surface in the work scene includes:
determining a working area corresponding to the key part and distance information between the key part and the working area in the working scene based on the position information of the key part and the working surface in the working scene;
and generating auxiliary control data based on the position information of the key part, the working area and the distance information.
5. The method according to claim 4, wherein the determining a working area corresponding to the key part and information on a distance between the key part and the working area in the working scene based on the position information of the key part and the working surface in the working scene comprises:
determining the operation direction of the key part based on the position information of the key part;
determining the position of a falling point of the key part based on the intersection point of the operation direction and the operation surface;
determining the working area based on the position of the drop point and the working surface;
and determining distance information between the key part and the working area based on the position information of the key part and the position of the drop point.
6. The assist control method for a working machine according to claim 4, further comprising, after the determination of the working area corresponding to the key part and the information on the distance between the key part and the working area in the working scene:
determining a work guidance parameter corresponding to the distance information based on a preset corresponding relation;
the preset corresponding relation is the corresponding relation between the distance information and the operation guidance parameters.
7. The method according to claim 1, wherein the fusing the pose information of the key part with the work scene to obtain the position information of the key part in the work scene includes:
and converting the pose information of the key part into a coordinate system corresponding to the operation scene to obtain the position information of the key part in the operation scene.
8. An assist control device for a working machine, comprising:
the system comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring image data of a working environment where a working machine is located and attitude data of the working machine;
the first processing module is used for establishing a working scene of the working machine based on the image data and determining pose information of a key part of the working machine based on the attitude data of the working machine;
the second processing module is used for fusing the pose information of the key position with the operation scene to obtain the position information of the key position in the operation scene;
and the third processing module is used for generating auxiliary control data based on the position information of the key part and the working surface in the working scene so as to perform auxiliary control on the working machine.
9. An auxiliary control system for a work machine, comprising: the environment data acquisition equipment and the attitude data acquisition equipment are connected with the controller, and the controller is also connected with the operating machinery;
the environment data acquisition equipment is used for acquiring image data of a working environment where the working machine is located;
the attitude data acquisition equipment is used for acquiring attitude data of the working machine;
the controller is used for acquiring the image data and the attitude data; establishing a working scene of the working machine based on the image data, and determining pose information of key parts of the working machine based on the attitude data; fusing the pose information of the key part with the operation scene to obtain the position information of the key part in the operation scene; and generating auxiliary control data based on the position information of the key part and the working surface in the working scene so as to perform auxiliary control on the working machine.
10. A working machine characterized by performing the method of auxiliary control of a working machine according to any one of claims 1 to 7 or comprising the auxiliary control apparatus of a working machine according to claim 8.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of auxiliary control of a work machine according to any of claims 1 to 7 when executing the program.
CN202210989203.2A 2022-08-17 2022-08-17 Auxiliary control method, device and system for working machine and working machine Pending CN115457096A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210989203.2A CN115457096A (en) 2022-08-17 2022-08-17 Auxiliary control method, device and system for working machine and working machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210989203.2A CN115457096A (en) 2022-08-17 2022-08-17 Auxiliary control method, device and system for working machine and working machine

Publications (1)

Publication Number Publication Date
CN115457096A true CN115457096A (en) 2022-12-09

Family

ID=84298415

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210989203.2A Pending CN115457096A (en) 2022-08-17 2022-08-17 Auxiliary control method, device and system for working machine and working machine

Country Status (1)

Country Link
CN (1) CN115457096A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116290203A (en) * 2023-01-12 2023-06-23 中港疏浚有限公司 Dredging construction parameter optimization model method based on neural network
CN116290203B (en) * 2023-01-12 2023-10-03 中港疏浚有限公司 Dredging construction parameter optimization model method based on neural network

Similar Documents

Publication Publication Date Title
CN109270534B (en) Intelligent vehicle laser sensor and camera online calibration method
JP7137548B2 (en) 3D reconstruction method, apparatus, electronic device and computer readable medium for material stack
US10496762B2 (en) Model generating device, position and orientation calculating device, and handling robot device
KR101049266B1 (en) Mobile robot
US20160312432A1 (en) Computer Vision Assisted Work Tool Recognition and Installation
KR102315535B1 (en) Method and apparatus for determining rotation angle of engineering mechanical device
JP2020140696A (en) Method and apparatus for determining position attitude of bucket of drilling machine
WO2017042873A1 (en) Remote operation system and operation assistance system
JP7071203B2 (en) Work machine
KR20210151964A (en) construction machinery
US20220412048A1 (en) Work assist server, work assist method, and work assist system
CN115457096A (en) Auxiliary control method, device and system for working machine and working machine
EP4348379A1 (en) Topology processing for waypoint-based navigation maps
WO2023025262A1 (en) Excavator operation mode switching control method and apparatus and excavator
CN117576094B (en) 3D point cloud intelligent sensing weld joint pose extraction method, system and equipment
CN112937444B (en) Auxiliary image generation method and device for working machine and working machine
CN114327076A (en) Virtual interaction method, device and system for working machine and working environment
Pachidis et al. Vision-based path generation method for a robot-based arc welding system
CN112884710A (en) Auxiliary image generation method, remote control method and device for operation machine
US20200347579A1 (en) Tip attachment discrimination device
CN112059983A (en) Method, device and computer readable medium for assembling workpiece
US20160150189A1 (en) Image processing system and method
CN116016613A (en) Method, system and device for remotely controlling excavator and electronic equipment
KR102011386B1 (en) an excavator working radius representation method
CN113140031A (en) Three-dimensional image modeling system and method and oral cavity scanning equipment applying same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination