CN116630411B - Mining electric shovel material surface identification method, device and system based on fusion perception - Google Patents

Mining electric shovel material surface identification method, device and system based on fusion perception

Info

Publication number
CN116630411B
CN116630411B (Application CN202310924852.9A)
Authority
CN
China
Prior art keywords
coordinate
material surface
point cloud
shovel
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310924852.9A
Other languages
Chinese (zh)
Other versions
CN116630411A (en)
Inventor
李康军
张嘉莉
龚权华
何世超
周良
张寒乐
李艳斌
庞敏丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Retoo Intelligent Technology Co ltd
Original Assignee
Hunan Retoo Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Retoo Intelligent Technology Co ltd filed Critical Hunan Retoo Intelligent Technology Co ltd
Priority to CN202310924852.9A priority Critical patent/CN116630411B/en
Publication of CN116630411A publication Critical patent/CN116630411A/en
Application granted granted Critical
Publication of CN116630411B publication Critical patent/CN116630411B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Operation Control Of Excavators (AREA)

Abstract

The application discloses a mining electric shovel material surface identification method, device and system based on fusion perception, relating to the technical field of mining electric shovel material surface identification. According to the application, point cloud data and image information of the material surface are acquired, and Euler angle information describing the mine shovel state is received from the mine shovel main control system. Coordinate points in the point cloud data of the material surface are converted into coordinate positions in the forklift coordinate system according to the installation position of the laser radar on the mine shovel and the Euler angle information. The effective material point cloud of the material surface is then identified and segmented according to those coordinate positions and the image information, and is sent to the mine shovel main control system to provide a data basis for subsequent path planning and material excavation. In this way the three-dimensional material surface can be identified and segmented while specific, accurate position coordinates are obtained, meeting the positioning requirements of an actual forklift on the material.

Description

Mining electric shovel material surface identification method, device and system based on fusion perception
Technical Field
The application relates to the technical field of mining electric shovel material surface identification, in particular to a mining electric shovel material surface identification method, device and system based on fusion perception.
Background
The actual working environment of a mine is harsh: high temperatures and strong sunlight in summer, low temperatures and strong winds in winter, often accompanied by sandstorms. When the shovel operator controls the electric shovel to excavate material, he is also constrained by the external environment and by his own operating experience, so working efficiency is low and there are certain safety hazards. To ensure the safety of mine-site work and, on that basis, improve the efficiency of mineral excavation, it is necessary to advance unmanned mining truck technology. The key to unmanned control is correctly obtaining the specific coordinate position of the mineral material, and carrying out remote or unmanned trajectory planning for the forklift on the basis of that accurately located position so that the material can be excavated correctly.
Existing vision-only schemes struggle to obtain specific, accurate position coordinates while simultaneously identifying and segmenting the three-dimensional material surface, and therefore have difficulty meeting the positioning requirements of an actual forklift on the material. It is thus necessary to provide a mining electric shovel material surface identification method and device based on fusion perception to solve the above problems.
Disclosure of Invention
In view of the shortcomings of the prior art, the application aims to provide a mining electric shovel material surface identification method, device and system based on fusion perception, so as to solve the problem that the prior art can hardly obtain specific, accurate position coordinates while identifying and segmenting the three-dimensional material surface, and thus struggles to meet the positioning requirements of an actual forklift on the material.
In a first aspect, the application provides a mining electric shovel material surface identification method based on fusion perception, which comprises the following steps:
acquiring point cloud data and image information of a material surface; the point cloud data are collected by a laser radar installed on the mine shovel, and the image information is collected by a camera installed on the mine shovel;
receiving Euler angle information of the state of the ore shovel sent by an ore shovel main control system;
according to the installation position of the laser radar on the mining shovel and the Euler angle information, converting coordinate points in the point cloud data of the material surface into coordinate positions under a forklift coordinate system;
identifying and dividing an effective material point cloud of the material surface according to the coordinate position of a coordinate point in the point cloud data of the material surface under a forklift coordinate system and the image information;
and sending the effective material point cloud to an ore shovel main control system.
Further, according to the installation position of the laser radar on the mining shovel and the euler angle information, converting coordinate points in the point cloud data of the material surface into coordinate positions under a forklift coordinate system, including:
according to the Euler angle information, calculating a rotation matrix of the laser radar coordinate system relative to the forklift coordinate system;
according to the installation position of the laser radar on the mining shovel, calculating a translation matrix of a laser radar coordinate system relative to a forklift coordinate system;
according to the rotation matrix and the translation matrix, obtaining a conversion relation of point coordinates under a laser radar coordinate system relative to point coordinates under a forklift coordinate system;
and according to the conversion relation, converting each coordinate point in the point cloud data of the material surface into a coordinate position under a forklift coordinate system.
Further, according to the euler angle information, calculating a rotation matrix of a laser radar coordinate system relative to a forklift coordinate system, including:
assuming that the rotation matrix of the laser radar coordinate system relative to the forklift coordinate system is R and the rotation angle is $\theta$, point P having coordinates $(x_c, y_c)$ in the forklift coordinate system $X_cOY_c$ and coordinates $(x_l, y_l)$ in the laser radar coordinate system $X_lOY_l$; writing P in polar form, with $r$ the distance from the common origin $O$ to $P$ and $\alpha$ the angle of $OP$ in the laser radar coordinate system, gives the coordinate-system conversion relation of formula (1):

$$x_l = r\cos\alpha,\quad y_l = r\sin\alpha,\quad x_c = r\cos(\alpha+\theta),\quad y_c = r\sin(\alpha+\theta) \tag{1}$$

expanding formula (1), $x_c$ and $y_c$ are obtained as:

$$x_c = r\cos\alpha\cos\theta - r\sin\alpha\sin\theta = x_l\cos\theta - y_l\sin\theta \tag{2}$$

$$y_c = r\sin\alpha\cos\theta + r\cos\alpha\sin\theta = x_l\sin\theta + y_l\cos\theta \tag{3}$$

from formulas (2) and (3), the following coordinate transformation matrix equation can be derived:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_l \\ y_l \end{bmatrix} \tag{4}$$

according to formula (4), the rotation matrix R is:

$$R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \tag{5}$$

the coordinate rotation in two-dimensional space is generalized to a rotation about the z axis in three-dimensional space, with the rotation angle equal to the heading angle yaw of the Euler angles; since the z coordinate is unchanged before and after the rotation, formula (4) can be written as:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} \cos(yaw) & -\sin(yaw) & 0 \\ \sin(yaw) & \cos(yaw) & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_l \\ y_l \\ z_l \end{bmatrix} \tag{6}$$

from this, the heading angle rotation matrix $R_z$ is obtained as:

$$R_z = \begin{bmatrix} \cos(yaw) & -\sin(yaw) & 0 \\ \sin(yaw) & \cos(yaw) & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{7}$$

from formula (6), the formula for a rotation about the y axis by the pitch angle pitch can be deduced:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} \cos(pitch) & 0 & \sin(pitch) \\ 0 & 1 & 0 \\ -\sin(pitch) & 0 & \cos(pitch) \end{bmatrix} \begin{bmatrix} x_l \\ y_l \\ z_l \end{bmatrix} \tag{8}$$

from this, the pitch angle rotation matrix $R_y$ is obtained as:

$$R_y = \begin{bmatrix} \cos(pitch) & 0 & \sin(pitch) \\ 0 & 1 & 0 \\ -\sin(pitch) & 0 & \cos(pitch) \end{bmatrix} \tag{9}$$

the formula for a rotation about the x axis by the roll angle roll can likewise be deduced:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(roll) & -\sin(roll) \\ 0 & \sin(roll) & \cos(roll) \end{bmatrix} \begin{bmatrix} x_l \\ y_l \\ z_l \end{bmatrix} \tag{10}$$

thereby obtaining the roll angle rotation matrix $R_x$ as:

$$R_x = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(roll) & -\sin(roll) \\ 0 & \sin(roll) & \cos(roll) \end{bmatrix} \tag{11}$$

since no roll angle is assumed here, the roll angle rotation matrix $R_x$ is set to the identity matrix:

$$R_x = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{12}$$

rotating about the coordinate axes in different orders yields different rotation matrices R, R having six possible forms:

$$R = R_xR_yR_z,\; R_xR_zR_y,\; R_yR_xR_z,\; R_yR_zR_x,\; R_zR_xR_y,\; R_zR_yR_x \tag{13}$$

According to the Euler angle measurement convention, each rotation is an extrinsic rotation about a coordinate axis of the forklift chassis coordinate system, the extrinsic rotation matrices are left-multiplied, and the rotations are applied in the order X-Y-Z, so the resulting rotation matrix is $R = R_zR_yR_x$. The corresponding rotation matrix is calculated from the Euler angle values sent by the main control system.
Further, according to the installation position of the laser radar on the mining shovel, calculating a translation matrix of the laser radar coordinate system relative to the forklift coordinate system, including:
the translation matrix between the laser radar and forklift coordinate systems is obtained by measuring the three-dimensional coordinates x, y and z from the origin of the laser radar at its installation position to the origin of the forklift coordinate system, and is:

$$T = \begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{14}$$
further, according to the rotation matrix and the translation matrix, a conversion relation of the point coordinates in the laser radar coordinate system relative to the point coordinates in the forklift coordinate system is obtained, including:
according to the rotation matrix and the translation matrix, the conversion relation between the coordinates $(X_c, Y_c, Z_c)$ of a point in the forklift coordinate system and the coordinates $(X_l, Y_l, Z_l)$ of the same point in the laser radar coordinate system is obtained as:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R \begin{bmatrix} X_l \\ Y_l \\ Z_l \end{bmatrix} + T \tag{15}$$
Further, identifying and segmenting the effective material point cloud of the material surface according to the coordinate positions, in the forklift coordinate system, of the coordinate points in the point cloud data of the material surface and the image information includes:
uniformly sampling and filtering coordinate points in point cloud data of a material surface;
performing outlier removal on the uniformly sampled and filtered point cloud data;
fitting out the maximum plane in the point cloud data, and removing the ground point cloud;
inputting image information of a material surface into a segmentation model to obtain a result mask image, wherein the pixel values of the material surface and the background are different; obtaining image position information of the material surface by using threshold segmentation and contour detection;
and (3) mapping coordinate positions of coordinate points in the point cloud data of the material surface on an image of the material surface through registration iteration by utilizing a point cloud image fusion principle, and obtaining an effective material point cloud by further dividing the point cloud in a material surface area of image segmentation.
Further, mapping the coordinate positions, in the forklift coordinate system, of the coordinate points in the point cloud data of the material surface onto the image of the material surface through registration iteration using the point cloud-image fusion principle, and obtaining the effective material point cloud by further segmenting the point cloud within the material surface area of the image segmentation, comprises the following steps:
randomly selecting seed points from the point cloud, searching the radius r range of the seed points by using a k-d tree, and classifying the points and the seed points into the same cluster if the points exist in the range;
selecting new seed points from the cluster, and continuing to execute the searching process in the radius in the last step until the number of points in the cluster is not increased any more, and ending the cluster clustering;
setting a point range of the clusters through input parameters of an algorithm, wherein the point range is the maximum and minimum points limited by each cluster;
if the clustering cluster number is within the threshold value range, reserving the clustering result, otherwise, removing;
selecting new seed points again in the rest point clouds, and continuing to execute the steps until all points in the point clouds are traversed; and storing a plurality of clustering clusters which accord with the point cloud point number threshold values through repeated iteration, and selecting the cluster with the most points from the clustering clusters to serve as the required final effective material point cloud.
Further, before receiving euler angle information of the mine shovel state sent by the mine shovel main control system, the method further comprises:
setting a communication protocol and a specific message format of the processor and the mine shovel main control system, wherein the communication protocol comprises a heartbeat packet format, a notification message requirement sent by the mine shovel main control system to the processor, and a response and a scanning result message requirement sent by the processor to the mine shovel main control system;
connecting to the mine shovel main control system according to the ip address of the main control system and the port number it has opened for listening, and, after the connection is established, the processor starting to send heartbeat packets to the main control system to confirm the connection state between the processor and the main control system;
when the direction of the mine shovel laser radar faces the material surface, the mine shovel main control system sends scanning notification and Euler angle information of the mine shovel state to the processor, the Euler angle information comprises a rotation angle, a roll angle and a pitch angle, and the processor sends a scanning starting signal to the mine shovel main control system after receiving the notification.
In a second aspect, the application provides a mining electric shovel material surface recognition device based on fusion perception, which comprises:
the acquisition unit is used for acquiring point cloud data and image information of the material surface; the point cloud data are collected by a laser radar installed on the mine shovel, and the image information is collected by a camera installed on the mine shovel;
the receiving unit is used for receiving Euler angle information of the state of the mine shovel sent by the mine shovel main control system;
the conversion unit is used for converting coordinate points in the point cloud data of the material surface into coordinate positions in the forklift coordinate system according to the installation position of the laser radar on the mining shovel and the Euler angle information;
the identification unit is used for identifying and dividing the effective material point cloud of the material surface according to the coordinate position of the coordinate point in the point cloud data of the material surface under the forklift coordinate system and the image information;
and the sending unit is used for sending the effective material point cloud to the mine shovel main control system.
In a third aspect, the application provides a mining electric shovel material surface identification system based on fusion perception, which comprises: the mining shovel comprises a laser radar and image fusion sensing device, a mining shovel main control system and a processor; the laser radar and image fusion sensing device is arranged on the mining shovel; the laser radar and the camera are in communication connection with the processor, and the processor is in communication connection with the mine shovel main control system;
the laser radar and image fusion sensing device is used for collecting point cloud data and image information of the material surface;
the processor is used for receiving Euler angle information of the state of the mine shovel sent by the mine shovel main control system; according to the installation position of the laser radar on the mining shovel and the Euler angle information, converting coordinate points in the point cloud data of the material surface into coordinate positions under a forklift coordinate system; identifying and dividing an effective material point cloud of the material surface according to the coordinate position of a coordinate point in the point cloud data of the material surface under a forklift coordinate system and the image information; the effective material point cloud is sent to an ore shovel main control system;
the mine shovel main control system is used for sending scanning instructions and Euler angle information of the mine shovel state to the processor and receiving the effective material point cloud sent by the processor.
The application has the following beneficial effects. In the mining electric shovel material surface identification method, device and system based on fusion perception, point cloud data and image information of the material surface are acquired, the point cloud data being collected by a laser radar installed on the mine shovel and the image information by a camera installed on the mine shovel; Euler angle information of the mine shovel state sent by the mine shovel main control system is received; coordinate points in the point cloud data of the material surface are converted into coordinate positions in the forklift coordinate system according to the installation position of the laser radar on the mine shovel and the Euler angle information; the effective material point cloud of the material surface is identified and segmented according to those coordinate positions and the image information; and the effective material point cloud is sent to the mine shovel main control system to provide a data basis for subsequent path planning and material excavation. The three-dimensional material surface can thus be identified and segmented while specific, accurate position coordinates are obtained, meeting the positioning requirements of an actual forklift on the material.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a mining electric shovel material surface identification method based on fusion perception;
FIG. 2 is a two-dimensional coordinate system conversion schematic;
FIG. 3 is a heading angle conversion schematic;
FIG. 4 is a pitch angle conversion schematic;
FIG. 5 is a roll angle transition schematic;
FIG. 6 is a three-dimensional coordinate system conversion schematic;
FIG. 7 is a schematic diagram of a mining shovel material surface identification device based on fusion perception;
fig. 8 is a schematic diagram of a mining shovel material surface identification system based on fusion perception according to the present application.
Detailed Description
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments. It should be noted that the following detailed description is illustrative and is intended to provide further explanation of the application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
Referring to fig. 1, the application provides a mining electric shovel material surface identification method based on fusion perception, wherein an execution main body of the method is a processor, and the method comprises the following steps:
s101, acquiring point cloud data and image information of a material surface; the point cloud data are collected by a laser radar installed on the mine shovel, and the image information is collected by a camera installed on the mine shovel.
A laser radar and image fusion sensing device is installed on the mine shovel. The device has a sensor shock-absorbing structure, a combined laser radar and 2D camera sensor structure, a sealed sensing window, a sealed disassembly and maintenance opening, and an electrically controlled wiper, so that it can better adapt to the relatively harsh environmental requirements of a mine. The camera can monitor and acquire image data of the material surface in real time, which facilitates identifying and segmenting the material surface in combination with the point cloud data. A suitable area-array laser radar is selected according to the on-site height of the material surface and the field-of-view requirements; the laser radar must also meet waterproof and shockproof specifications so that it can stably acquire point cloud data of the material surface. A tcp/ip link is established between the laser radar and the processor to acquire real-time point cloud data: the three-dimensional coordinate point information of the material surface acquired by the laser radar and the two-dimensional image information at the same moment are sent to the processor over the local area network via tcp/ip communication.
S102, receiving Euler angle information of the state of the mine shovel sent by the mine shovel main control system.
The processor establishes a tcp/ip connection with the mine shovel main control system to receive the scanning instruction and the Euler angle information of the mine shovel at that moment. A communication protocol and specific message formats are defined between the processor and the mine shovel main control system, including the heartbeat packet format, the requirements for the notification messages sent by the main control system to the processor, and the requirements for the response and scanning-result messages sent by the processor to the main control system. The processor connects to the main control system according to the ip address of the main control system and the port number it has opened for listening; after the connection is established, the processor starts sending heartbeat packets to the main control system to confirm the connection state between the two sides. When the mine shovel's laser radar is facing the material surface, the main control system sends a scanning notification and the Euler angle information of the mine shovel state to the processor; the Euler angle information comprises a rotation angle, a roll angle and a pitch angle. After receiving the notification, the processor sends a start-scanning signal back to the main control system.
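By way of non-limiting illustration only, the following processor-side sketch shows the kind of link described above, assuming a newline-delimited text protocol. The host address, port, heartbeat payload and the `SCAN,yaw,pitch,roll` notification format are hypothetical placeholders, since the patent does not disclose the concrete message fields.

```python
import socket
import threading
import time

# Hypothetical sketch of the processor-side TCP link; all payloads are placeholders.
HOST, PORT = "192.168.1.100", 9000      # assumed address of the shovel main control system
HEARTBEAT = b"HEARTBEAT\n"              # assumed heartbeat payload
SCAN_START_ACK = b"SCAN_START\n"        # assumed "start scanning" response

def heartbeat_loop(sock: socket.socket, interval_s: float = 1.0) -> None:
    """Periodically send a heartbeat so both sides can verify the link is alive."""
    while True:
        sock.sendall(HEARTBEAT)
        time.sleep(interval_s)

def run_processor_client() -> None:
    with socket.create_connection((HOST, PORT)) as sock:
        threading.Thread(target=heartbeat_loop, args=(sock,), daemon=True).start()
        buf = b""
        while True:
            data = sock.recv(4096)
            if not data:
                break
            buf += data
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                # Assumed notification format: "SCAN,yaw,pitch,roll"
                if line.startswith(b"SCAN"):
                    _, yaw, pitch, roll = line.decode().split(",")
                    sock.sendall(SCAN_START_ACK)   # acknowledge, then trigger lidar/camera capture
                    print("Euler angles:", float(yaw), float(pitch), float(roll))

if __name__ == "__main__":
    run_processor_client()
```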
Specifically, the euler angle information includes a rotation angle, a roll angle, and a pitch angle.
And S103, converting coordinate points in the point cloud data of the material surface into coordinate positions under a forklift coordinate system according to the installation position of the laser radar on the mining shovel and the Euler angle information.
Since the point cloud data acquired by the lidar is relative to the lidar coordinate system, in order for the point cloud data finally obtained by the forklift to be in the forklift coordinate system, coordinate conversion needs to be performed on the coordinates of each point in the point cloud.
Specifically, the rotation matrix of the laser radar coordinate system relative to the forklift coordinate system is calculated according to the Euler angle information. The coordinate system first has to be rotated; taking the coordinate transformation in a two-dimensional plane as an example, assume the rotation matrix of the laser radar coordinate system relative to the forklift coordinate system is R and the rotation angle is $\theta$. Point P has coordinates $(x_c, y_c)$ in the forklift coordinate system $X_cOY_c$ and coordinates $(x_l, y_l)$ in the laser radar coordinate system $X_lOY_l$. Writing P in polar form, with $r$ the distance from the common origin $O$ to $P$ and $\alpha$ the angle of $OP$ in the laser radar coordinate system, gives the coordinate-system conversion relation of formula (1):

$$x_l = r\cos\alpha,\quad y_l = r\sin\alpha,\quad x_c = r\cos(\alpha+\theta),\quad y_c = r\sin(\alpha+\theta) \tag{1}$$

Expanding formula (1), $x_c$ and $y_c$ are:

$$x_c = r\cos\alpha\cos\theta - r\sin\alpha\sin\theta = x_l\cos\theta - y_l\sin\theta \tag{2}$$

$$y_c = r\sin\alpha\cos\theta + r\cos\alpha\sin\theta = x_l\sin\theta + y_l\cos\theta \tag{3}$$

From formulas (2) and (3), the following coordinate transformation matrix equation can be derived:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_l \\ y_l \end{bmatrix} \tag{4}$$

According to formula (4), the rotation matrix R is:

$$R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \tag{5}$$
generalizing from two dimensions to three dimensions, the right hand rule can imagine that there is a Z axis pointing perpendicular to the XOY plane at the screen at point O in fig. 2: referring to fig. 3, the coordinate rotation in the two-dimensional space is generalized to the rotation around the z-axis in the three-dimensional space, the rotation angleThe heading angle in euler angle (also called yaw angle yaw), since the z-axis is not changed before and after rotation, the above formula (4) can be written as follows:
(6)
from this, a course angle rotation matrix can be obtainedThe method comprises the following steps:
(7)
referring to FIG. 4, the rotation about the y-axis of the coordinate axis can be deduced from the above equation (6)The formula of pitch angle pictch of (2) is as follows:
(8)
from this, pitch pictch rotation matrix can be obtainedThe method comprises the following steps:
(9)
referring to FIG. 5, rotation about the coordinate axis x can also be deducedThe roll angle roll is given by:
(10)
thereby obtaining the roll angle roll rotation matrixThe method comprises the following steps:
(11)
roll angle roll rotation matrix here, since no roll angle is assumedThe method comprises the following steps of setting an identity matrix:
(12)
rotating the coordinate axes according to different sequences to obtain different rotation matrixes R, R having six forms respectively
(13)
According to the Euler angle measurement mode, each rotation carries out external rotation according to the coordinate axis of the forklift chassis coordinate system, the external rotation matrix is a left-hand matrix, the rotation is carried out according to the sequence of X-Y-Z, and the obtained rotation matrix isAnd calculating a corresponding rotation matrix through the angle value of the Euler angle sent by the main control system.
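By way of illustration, a minimal numpy sketch of equations (7), (9), (11) and the X-Y-Z extrinsic composition $R = R_zR_yR_x$ of equation (13); the function name and the convention that angles are given in radians are assumptions, not part of the patent.

```python
import numpy as np

def rotation_from_euler(yaw: float, pitch: float, roll: float = 0.0) -> np.ndarray:
    """Composite rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll) for extrinsic X-Y-Z rotations."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])   # heading, eq. (7)
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])   # pitch, eq. (9)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])   # roll, identity when roll == 0
    return Rz @ Ry @ Rx
```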
The translation matrix of the laser radar coordinate system relative to the forklift coordinate system is then calculated according to the installation position of the laser radar on the mine shovel. It is obtained by measuring the three-dimensional coordinates x, y and z from the origin of the laser radar at its installation position to the origin of the forklift coordinate system, and is:

$$T = \begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{14}$$

According to the rotation matrix and the translation matrix, the conversion relation between the coordinates $(X_c, Y_c, Z_c)$ of a point in the forklift coordinate system and the coordinates $(X_l, Y_l, Z_l)$ of the same point in the laser radar coordinate system is:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R \begin{bmatrix} X_l \\ Y_l \\ Z_l \end{bmatrix} + T \tag{15}$$
referring to fig. 6, each coordinate point in the point cloud data of the material level is converted into a coordinate position under a forklift coordinate system according to the conversion relation.
S104, identifying and dividing the effective material point cloud of the material surface according to the coordinate position of the coordinate point in the point cloud data of the material surface under the forklift coordinate system and the image information.
The obtained point cloud data contains the material surface, the ground and interference points; to guarantee the accuracy of the point cloud position information sent to the main control system, the effective material information must be segmented out of the image and the point cloud.
The material surface point cloud is first downsampled. Considering both the point cloud processing time and the limit on the number of material points that can be sent, the point cloud is filtered first to reduce the number of points and speed up processing. Uniform sampling filtering is used in order to preserve the original position coordinates of the points as much as possible: a sphere of specified radius is constructed to downsample the point cloud, and within each sphere the point closest to the sphere centre is kept as the downsampled point. This reduces the size of the point cloud without altering the original position information of the retained points, and is therefore more accurate than other filtering methods. A sketch of this step is given below.
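A minimal numpy approximation of the uniform sampling filter, under the assumption that it behaves like PCL-style uniform sampling: space is divided into leaves whose size plays the role of the specified radius, and in each occupied leaf the original point closest to the leaf centre is kept. The function name and default are illustrative.

```python
import numpy as np

def uniform_sample(points: np.ndarray, leaf_size: float = 0.1) -> np.ndarray:
    """Keep, for every occupied cell of size `leaf_size`, the single input point closest to
    the cell centre, so retained points keep their original coordinates (no averaging)."""
    cells = np.floor(points / leaf_size).astype(np.int64)      # integer cell index per point
    centres = (cells + 0.5) * leaf_size                        # geometric centre of each cell
    dist = np.linalg.norm(points - centres, axis=1)            # distance of each point to its cell centre
    # sort by cell (x, y, z) and then by distance, so the first row of each cell group is the closest point
    order = np.lexsort((dist, cells[:, 2], cells[:, 1], cells[:, 0]))
    cells_sorted = cells[order]
    first = np.ones(len(points), dtype=bool)
    first[1:] = np.any(cells_sorted[1:] != cells_sorted[:-1], axis=1)
    return points[order[first]]
```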
Interference points in the point cloud are then removed. The filtered material surface point cloud still contains many stray points which, if sent to the main control system, would affect path planning, so sparse outliers must be removed. The main steps of the outlier removal algorithm are: compute the average distance from each point to its neighbouring points, assume the resulting distances follow a Gaussian distribution whose shape is determined by the mean and the standard deviation, and define points whose average distance falls outside the standard range as outliers and remove them.
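A sketch of this statistical outlier removal step using scipy's k-d tree; the neighbour count and standard-deviation ratio are illustrative parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_outliers(points: np.ndarray, k: int = 20, std_ratio: float = 2.0) -> np.ndarray:
    """Drop points whose mean distance to their k nearest neighbours exceeds
    mean + std_ratio * std of that statistic over the whole cloud."""
    tree = cKDTree(points)
    # query k+1 neighbours because the closest neighbour of a point is the point itself
    dists, _ = tree.query(points, k=k + 1)
    mean_d = dists[:, 1:].mean(axis=1)
    thresh = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d < thresh]
```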
The ground point cloud is then removed. Because the ground point cloud is gentle and lies roughly in one plane, the ground can be removed by fitting the largest plane in the point cloud. The plane is fitted with the random sample consensus algorithm (RANSAC). First, three non-collinear points are randomly selected from the material surface point cloud and the equation $ax + by + cz + d = 0$ of the plane through them is calculated; the distance $d_i$ of every point in the cloud to this plane is then computed and compared with a distance threshold $\varepsilon$: points with $d_i < \varepsilon$ are regarded as inliers of the model, the remaining points as outliers, and the number of inliers is recorded. At the end of each iteration an iteration-termination factor is calculated from the expected error rate, the current best inlier count, the total number of samples and the current iteration count, and it is decided from the inlier count whether to stop iterating. The above steps are repeated until the set number of iterations is reached; the plane model with the largest number of inliers is taken to be the ground, the indices of those points are recorded, and deleting them removes the ground.
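A compact sketch of the RANSAC ground removal described above; the adaptive termination factor is omitted for brevity, and the distance threshold, iteration count and function name are illustrative.

```python
import numpy as np

def remove_ground_ransac(points: np.ndarray, dist_thresh: float = 0.1,
                         iterations: int = 200, rng_seed: int = 0) -> np.ndarray:
    """Fit the dominant plane with RANSAC and return the cloud with that plane's inliers removed."""
    rng = np.random.default_rng(rng_seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iterations):
        i, j, k = rng.choice(len(points), size=3, replace=False)
        n = np.cross(points[j] - points[i], points[k] - points[i])   # plane normal from 3 samples
        norm = np.linalg.norm(n)
        if norm < 1e-9:                       # degenerate (collinear) sample, skip
            continue
        n = n / norm
        d = -n.dot(points[i])
        dist = np.abs(points @ n + d)         # point-to-plane distances
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return points[~best_inliers]              # drop the largest plane (assumed to be the ground)
```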
Before actual production, field material surface image data is acquired with the industrial camera installed on the electric shovel and annotated with polygon labels, and the annotated material surface image data is trained with a segmentation framework to generate a segmentation model. Feeding on-site material surface image data into the segmentation model yields a result mask image in which the pixel values of the material surface and the background differ; the image position information of the material surface is then obtained with threshold segmentation and contour detection.
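For the threshold segmentation and contour detection step, a small OpenCV (4.x) sketch, assuming the mask is a single-channel 8-bit image in which material pixels are non-zero; the threshold value and minimum contour area are illustrative. The segmentation model itself is outside this sketch.

```python
import cv2
import numpy as np

def material_region_from_mask(mask: np.ndarray, min_area: float = 500.0):
    """Threshold the segmentation mask and return the contour and bounding box of the
    largest material region found by contour detection."""
    _, binary = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) > min_area]
    if not contours:
        return None, None
    largest = max(contours, key=cv2.contourArea)
    return largest, cv2.boundingRect(largest)   # (x, y, w, h) in image pixels
```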
The final material point cloud is then obtained by segmentation. Prior-art approaches obtain approximate material surface position information from the image alone, but the positional accuracy is insufficient. To obtain more accurate coordinate information, the point cloud is mapped onto the image through registration iteration using the point cloud-image fusion principle, and the accurate material surface position information is obtained by further segmenting the point cloud that falls within the material surface area segmented from the image.
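The patent does not give the camera intrinsics or the lidar-to-camera extrinsics, so the following sketch only illustrates the mapping idea under a pinhole-camera assumption; K, R_cl, t_cl and the function name are hypothetical inputs.

```python
import numpy as np

def points_in_material_region(points_xyz: np.ndarray, K: np.ndarray,
                              R_cl: np.ndarray, t_cl: np.ndarray,
                              mask: np.ndarray) -> np.ndarray:
    """Project lidar points into the camera image with assumed intrinsics K and
    lidar-to-camera extrinsics (R_cl, t_cl), then keep only points whose projection
    falls on a material pixel of the segmentation mask."""
    cam = points_xyz @ R_cl.T + t_cl                 # lidar frame -> camera frame
    in_front = cam[:, 2] > 0.1                       # keep points in front of the camera
    cam = cam[in_front]
    uv = cam @ K.T                                   # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    h, w = mask.shape[:2]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    keep = np.zeros(len(cam), dtype=bool)
    keep[inside] = mask[v[inside], u[inside]] > 0    # material pixels assumed non-zero
    return points_xyz[in_front][keep]
```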
Because of the actual installation conditions and the way the point cloud is acquired, some points other than the ground points and the discrete outliers still remain; these points are relatively dense and hard to remove by filtering, so the largest cluster of material points is segmented out by Euclidean cluster segmentation.
Firstly randomly selecting seed points in a point cloud, searching the radius r range of the seed points by using a k-d tree, and classifying the points and the seed points into the same cluster if the points exist in the range; and selecting new seed points from the cluster, and continuously executing the searching process in the radius in the last step until the number of points in the cluster is not increased any more, and ending the clustering of the cluster. The point range of the clusters, namely the maximum and minimum points limited by each cluster, is set through the input parameters of the algorithm. If the clustering cluster number is within the threshold value range, reserving the clustering result, otherwise, removing; and selecting new seed points again in the rest point clouds, and continuing to execute the steps until all points in the point clouds are traversed. At this time, a plurality of clustering clusters which meet the point cloud point number threshold values are stored through a plurality of iterations, and the clustering with the most points is selected from the clustering clusters, namely the required final material point cloud.
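A self-contained sketch of this Euclidean clustering step with a k-d tree, which keeps only clusters whose point count lies within the configured range and returns the largest one; the search radius and size limits are illustrative parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

def largest_euclidean_cluster(points: np.ndarray, radius: float = 0.3,
                              min_pts: int = 100, max_pts: int = 500000) -> np.ndarray:
    """Region-growing Euclidean clustering; returns the points of the largest valid cluster."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster = {seed}
        frontier = [seed]
        while frontier:                                  # grow the cluster from its seed points
            p = frontier.pop()
            for q in tree.query_ball_point(points[p], r=radius):
                if q in unvisited:
                    unvisited.discard(q)
                    cluster.add(q)
                    frontier.append(q)
        if min_pts <= len(cluster) <= max_pts:           # keep clusters within the size limits
            clusters.append(np.fromiter(cluster, dtype=int))
    if not clusters:
        return np.empty((0, 3))
    biggest = max(clusters, key=len)
    return points[biggest]
```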
S105, the effective material point cloud is sent to a mining shovel main control system.
And sending the position information of the points of the material point cloud to the mine shovel main control system for subsequent path planning of the mine shovel main control system.
Referring to fig. 7, the application provides a mining electric shovel material surface recognition device based on fusion perception, which comprises:
an acquisition unit 71 for acquiring point cloud data and image information of a material level; the point cloud data are collected by a laser radar installed on the mine shovel, and the image information is collected by a camera installed on the mine shovel.
And the receiving unit 72 is used for receiving Euler angle information of the mine shovel state sent by the mine shovel main control system.
And the conversion unit 73 is used for converting coordinate points in the point cloud data of the material surface into coordinate positions under a forklift coordinate system according to the installation position of the laser radar on the mining shovel and the Euler angle information.
And the identifying unit 74 is used for identifying the effective material point cloud for dividing the material surface according to the coordinate position of the coordinate point in the point cloud data of the material surface under the forklift coordinate system and the image information.
And the sending unit 75 is used for sending the effective material point cloud to the mining shovel main control system.
Referring to fig. 8, the application provides a mining electric shovel material surface recognition system based on fusion perception, which comprises: the laser radar and image fusion sensing device 81, the mine shovel main control system 82 and the processor 83; a laser radar 84 and a camera 85 are arranged in the laser radar and image fusion sensing device 81, and the laser radar and image fusion sensing device 81 is arranged on the mining shovel 86; the laser radar 84 and the camera 85 are in communication connection with the processor 83, and the processor 83 is in communication connection with the mining shovel main control system 82.
The laser radar and image fusion sensing device 81 is used for collecting point cloud data and image information of the material surface.
The processor 83 is configured to receive euler angle information of a mine shovel state sent by the mine shovel main control system; according to the installation position of the laser radar on the mining shovel and the Euler angle information, converting coordinate points in the point cloud data of the material surface into coordinate positions under a forklift coordinate system; identifying and dividing an effective material point cloud of the material surface according to the coordinate position of a coordinate point in the point cloud data of the material surface under a forklift coordinate system and the image information; and sending the effective material point cloud to an ore shovel main control system.
The mine shovel main control system 82 is used for sending scanning instructions and Euler angle information of the mine shovel state to the processor and receiving effective material point clouds sent by the processor.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the application described herein may be capable of being practiced otherwise than as specifically illustrated and described.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, but various modifications and variations can be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (9)

1. The mining electric shovel material surface identification method based on fusion perception is characterized by comprising the following steps of:
acquiring point cloud data and image information of a material surface; the point cloud data are collected by a laser radar installed on the mine shovel, and the image information is collected by a camera installed on the mine shovel;
receiving Euler angle information of the state of the ore shovel sent by an ore shovel main control system;
according to the Euler angle information, calculating a rotation matrix of the laser radar coordinate system relative to the forklift coordinate system; according to the installation position of the laser radar on the mining shovel, calculating a translation matrix of a laser radar coordinate system relative to a forklift coordinate system; according to the rotation matrix and the translation matrix, obtaining a conversion relation of point coordinates under a laser radar coordinate system relative to point coordinates under a forklift coordinate system; converting each coordinate point in the point cloud data of the material surface into a coordinate position under a forklift coordinate system according to the conversion relation;
identifying and dividing an effective material point cloud of the material surface according to the coordinate position of a coordinate point in the point cloud data of the material surface under a forklift coordinate system and the image information;
and sending the effective material point cloud to an ore shovel main control system.
2. The fusion perception-based mining electric shovel material surface identification method according to claim 1, wherein calculating a rotation matrix of a laser radar coordinate system relative to a forklift coordinate system according to the euler angle information comprises:
assuming that the rotation matrix of the laser radar coordinate system relative to the forklift coordinate system is R and the rotation angle is $\theta$, point P having coordinates $(x_c, y_c)$ in the forklift coordinate system $X_cOY_c$ and coordinates $(x_l, y_l)$ in the laser radar coordinate system $X_lOY_l$; writing P in polar form, with $r$ the distance from the common origin $O$ to $P$ and $\alpha$ the angle of $OP$ in the laser radar coordinate system, obtaining the coordinate-system conversion relation of formula (1):

$$x_l = r\cos\alpha,\quad y_l = r\sin\alpha,\quad x_c = r\cos(\alpha+\theta),\quad y_c = r\sin(\alpha+\theta) \tag{1}$$

obtaining $x_c$ and $y_c$ according to formula (1) as:

$$x_c = r\cos\alpha\cos\theta - r\sin\alpha\sin\theta = x_l\cos\theta - y_l\sin\theta \tag{2}$$

$$y_c = r\sin\alpha\cos\theta + r\cos\alpha\sin\theta = x_l\sin\theta + y_l\cos\theta \tag{3}$$

deriving from formulas (2) and (3) the following coordinate transformation matrix equation:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_l \\ y_l \end{bmatrix} \tag{4}$$

obtaining the rotation matrix R according to formula (4) as:

$$R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \tag{5}$$

generalizing the coordinate rotation in two-dimensional space to a rotation about the z axis in three-dimensional space, the rotation angle being the heading angle yaw of the Euler angles; since the z coordinate is unchanged before and after the rotation, formula (4) can be written as:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} \cos(yaw) & -\sin(yaw) & 0 \\ \sin(yaw) & \cos(yaw) & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_l \\ y_l \\ z_l \end{bmatrix} \tag{6}$$

thereby obtaining the heading angle rotation matrix $R_z$ as:

$$R_z = \begin{bmatrix} \cos(yaw) & -\sin(yaw) & 0 \\ \sin(yaw) & \cos(yaw) & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{7}$$

deducing from formula (6) the formula for a rotation about the y axis by the pitch angle pitch:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} \cos(pitch) & 0 & \sin(pitch) \\ 0 & 1 & 0 \\ -\sin(pitch) & 0 & \cos(pitch) \end{bmatrix} \begin{bmatrix} x_l \\ y_l \\ z_l \end{bmatrix} \tag{8}$$

thereby obtaining the pitch angle rotation matrix $R_y$ as:

$$R_y = \begin{bmatrix} \cos(pitch) & 0 & \sin(pitch) \\ 0 & 1 & 0 \\ -\sin(pitch) & 0 & \cos(pitch) \end{bmatrix} \tag{9}$$

likewise deducing the formula for a rotation about the x axis by the roll angle roll:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(roll) & -\sin(roll) \\ 0 & \sin(roll) & \cos(roll) \end{bmatrix} \begin{bmatrix} x_l \\ y_l \\ z_l \end{bmatrix} \tag{10}$$

thereby obtaining the roll angle rotation matrix $R_x$ as:

$$R_x = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(roll) & -\sin(roll) \\ 0 & \sin(roll) & \cos(roll) \end{bmatrix} \tag{11}$$

setting the roll angle rotation matrix $R_x$ to the identity matrix, since no roll angle is assumed:

$$R_x = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{12}$$

rotating about the coordinate axes in different orders yields different rotation matrices R, R having six possible forms:

$$R = R_xR_yR_z,\; R_xR_zR_y,\; R_yR_xR_z,\; R_yR_zR_x,\; R_zR_xR_y,\; R_zR_yR_x \tag{13}$$

according to the Euler angle measurement convention, each rotation is an extrinsic rotation about a coordinate axis of the forklift chassis coordinate system, the extrinsic rotation matrices are left-multiplied, the rotations are applied in the order X-Y-Z, and the resulting rotation matrix is $R = R_zR_yR_x$; the corresponding rotation matrix is calculated from the Euler angle values sent by the main control system.
3. The fusion perception-based mining electric shovel material surface identification method according to claim 2, wherein calculating a translation matrix of a laser radar coordinate system relative to a forklift coordinate system according to the installation position of the laser radar on the mining shovel comprises:
the translation matrix between the laser radar and forklift coordinate systems is obtained by measuring the three-dimensional coordinates x, y and z from the origin of the laser radar at its installation position to the origin of the forklift coordinate system, and is:

$$T = \begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{14}$$
4. the mining electric shovel material surface identification method based on fusion perception according to claim 3, wherein the obtaining of the conversion relation of the point coordinates in the laser radar coordinate system relative to the point coordinates in the forklift coordinate system according to the rotation matrix and the translation matrix comprises the following steps:
obtaining, according to the rotation matrix and the translation matrix, the conversion relation between the coordinates $(X_c, Y_c, Z_c)$ of a point in the forklift coordinate system and the coordinates $(X_l, Y_l, Z_l)$ of the same point in the laser radar coordinate system as:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R \begin{bmatrix} X_l \\ Y_l \\ Z_l \end{bmatrix} + T \tag{15}$$
5. the fusion perception-based mining electric shovel material surface identification method according to claim 4, wherein the identification of the effective material point cloud for dividing the material surface according to the coordinate position of the coordinate point in the point cloud data of the material surface under the forklift coordinate system and the image information comprises the following steps:
uniformly sampling and filtering coordinate points in point cloud data of a material surface;
performing outlier removal on the uniformly sampled and filtered point cloud data;
fitting out the maximum plane in the point cloud data, and removing the ground point cloud;
inputting image information of a material surface into a segmentation model to obtain a result mask image, wherein the pixel values of the material surface and the background are different; obtaining image position information of the material surface by using threshold segmentation and contour detection;
and (3) mapping coordinate positions of coordinate points in the point cloud data of the material surface on an image of the material surface through registration iteration by utilizing a point cloud image fusion principle, and obtaining an effective material point cloud by further dividing the point cloud in a material surface area of image segmentation.
6. The mining electric shovel material surface identification method based on fusion perception according to claim 5, wherein the coordinate positions of coordinate points in point cloud data of the material surface under a forklift coordinate system are mapped onto an image of the material surface through registration iteration by utilizing a point cloud image fusion principle, and effective material point clouds are obtained by further dividing point clouds in a material surface area of image division, and the method comprises the following steps:
randomly selecting seed points from the point cloud, searching the radius r range of the seed points by using a k-d tree, and classifying the points and the seed points into the same cluster if the points exist in the range;
selecting new seed points from the cluster, and continuing to execute the searching process in the radius in the last step until the number of points in the cluster is not increased any more, and ending the cluster clustering;
setting a point range of the clusters through input parameters of an algorithm, wherein the point range is the maximum and minimum points limited by each cluster;
if the clustering cluster number is within the threshold value range, reserving the clustering result, otherwise, removing;
selecting new seed points again in the rest point clouds, and continuing to execute the steps until all points in the point clouds are traversed; and storing a plurality of clustering clusters which accord with the point cloud point number threshold values through repeated iteration, and selecting the cluster with the most points from the clustering clusters to serve as the required final effective material point cloud.
7. The fusion awareness-based mining electric shovel material surface identification method according to claim 1, wherein before receiving euler angle information of a mining shovel state sent by a mining shovel main control system, the method further comprises:
setting a communication protocol and a specific message format of the processor and the mine shovel main control system, wherein the communication protocol comprises a heartbeat packet format, a notification message requirement sent by the mine shovel main control system to the processor, and a response and a scanning result message requirement sent by the processor to the mine shovel main control system;
connecting to the mine shovel main control system according to the ip address of the main control system and the port number it has opened for listening, and, after the connection is established, the processor starting to send heartbeat packets to the main control system to confirm the connection state between the processor and the main control system;
when the direction of the mine shovel laser radar faces the material surface, the mine shovel main control system sends scanning notification and Euler angle information of the mine shovel state to the processor, the Euler angle information comprises a rotation angle, a roll angle and a pitch angle, and the processor sends a scanning starting signal to the mine shovel main control system after receiving the notification.
8. Mining electric shovel material face recognition device based on fusion perception, characterized by comprising:
the acquisition unit is used for acquiring point cloud data and image information of the material surface; the point cloud data are collected by a laser radar installed on the mine shovel, and the image information is collected by a camera installed on the mine shovel;
the receiving unit is used for receiving Euler angle information of the state of the mine shovel sent by the mine shovel main control system;
the conversion unit is used for calculating a rotation matrix of the laser radar coordinate system relative to the forklift coordinate system according to the Euler angle information; according to the installation position of the laser radar on the mining shovel, calculating a translation matrix of a laser radar coordinate system relative to a forklift coordinate system; according to the rotation matrix and the translation matrix, obtaining a conversion relation of point coordinates under a laser radar coordinate system relative to point coordinates under a forklift coordinate system; converting each coordinate point in the point cloud data of the material surface into a coordinate position under a forklift coordinate system according to the conversion relation;
the identification unit is used for identifying and dividing the effective material point cloud of the material surface according to the coordinate position of the coordinate point in the point cloud data of the material surface under the forklift coordinate system and the image information;
and the sending unit is used for sending the effective material point cloud to the mine shovel main control system.
9. Mining electric shovel material face identification system based on fusion perception, characterized by comprising: the mining shovel comprises a laser radar and image fusion sensing device, a mining shovel main control system and a processor; the laser radar and image fusion sensing device is arranged on the mining shovel; the laser radar and the camera are in communication connection with the processor, and the processor is in communication connection with the mine shovel main control system;
the laser radar and image fusion sensing device is used for collecting point cloud data and image information of the material surface;
the processor is used for receiving the Euler angle information of the mine shovel state sent by the mine shovel main control system; calculating a rotation matrix of the laser radar coordinate system relative to the forklift coordinate system according to the Euler angle information; calculating a translation matrix of the laser radar coordinate system relative to the forklift coordinate system according to the installation position of the laser radar on the mine shovel; obtaining, according to the rotation matrix and the translation matrix, the conversion relation of point coordinates under the laser radar coordinate system relative to point coordinates under the forklift coordinate system; converting each coordinate point in the point cloud data of the material surface into a coordinate position under the forklift coordinate system according to the conversion relation; identifying and segmenting the effective material point cloud of the material surface according to the coordinate positions of the coordinate points in the point cloud data of the material surface under the forklift coordinate system and the image information; and sending the effective material point cloud to the mine shovel main control system;
the mine shovel main control system is used for sending the scanning notification and the Euler angle information of the mine shovel state to the processor and receiving the effective material point cloud sent by the processor.
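For illustration only: one generic way to combine the converted point cloud with the image information when extracting the effective material point cloud is to project each point into the camera image and keep only the points that land on an image region segmented as material; the intrinsic matrix K, the forklift-to-camera extrinsics R_cam / t_cam and the binary mask below are hypothetical inputs, and the identification and segmentation procedure actually claimed may differ (for example, it may additionally rely on clustering).

import numpy as np

def filter_by_material_mask(points_forklift, mask, K, R_cam, t_cam):
    # points_forklift: (N, 3) cloud in the forklift coordinate system.
    # mask: (H, W) boolean image, True where the material surface was detected.
    # K, R_cam, t_cam: assumed camera intrinsics and forklift->camera extrinsics.
    cam_pts = points_forklift @ R_cam.T + t_cam      # forklift -> camera frame
    in_front = cam_pts[:, 2] > 0.0                   # keep points in front of the camera
    uvw = cam_pts[in_front] @ K.T                    # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]                    # perspective divide
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    h, w = mask.shape
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h) # projections inside the image
    idx = np.flatnonzero(in_front)[inside]           # original indices of those points
    hits = mask[v[inside], u[inside]]                # which of them fall on material pixels
    keep = np.zeros(len(points_forklift), dtype=bool)
    keep[idx[hits]] = True
    return points_forklift[keep]                     # the effective material point cloud

The filtered cloud returned by such a routine would then be the effective material point cloud sent back to the mine shovel main control system.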
CN202310924852.9A 2023-07-26 2023-07-26 Mining electric shovel material surface identification method, device and system based on fusion perception Active CN116630411B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310924852.9A CN116630411B (en) 2023-07-26 2023-07-26 Mining electric shovel material surface identification method, device and system based on fusion perception

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310924852.9A CN116630411B (en) 2023-07-26 2023-07-26 Mining electric shovel material surface identification method, device and system based on fusion perception

Publications (2)

Publication Number Publication Date
CN116630411A (en) 2023-08-22
CN116630411B (en) 2023-09-29

Family

ID=87610353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310924852.9A Active CN116630411B (en) 2023-07-26 2023-07-26 Mining electric shovel material surface identification method, device and system based on fusion perception

Country Status (1)

Country Link
CN (1) CN116630411B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117553686B (en) * 2024-01-12 2024-05-07 成都航空职业技术学院 Laser radar point cloud-based carriage bulk cargo overrun detection method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200098480A (en) * 2017-09-13 2020-08-20 Velodyne Lidar, Inc. Multi-resolution, simultaneous position measurement and mapping based on 3-D LIDAR measurements

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109285188A (en) * 2017-07-21 2019-01-29 百度在线网络技术(北京)有限公司 Method and apparatus for generating the location information of target object
CN108152831A (en) * 2017-12-06 2018-06-12 中国农业大学 A kind of laser radar obstacle recognition method and system
CN113443555A (en) * 2021-06-24 2021-09-28 上海振华重工(集团)股份有限公司 Method for determining position of grab bucket, method for detecting position of grab bucket and storage medium
CN114372914A (en) * 2022-01-12 2022-04-19 吉林大学 Mechanical laser radar point cloud preprocessing method applied to mining electric shovel
CN114543666A (en) * 2022-01-20 2022-05-27 大连理工大学 Stockpile surface prediction method based on mine field environment perception
CN115761550A (en) * 2022-12-20 2023-03-07 江苏优思微智能科技有限公司 Water surface target detection method based on laser radar point cloud and camera image fusion

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
RotPredictor: Unsupervised Canonical Viewpoint Learning for Point Cloud Classification; Jin Fang et al.; 2020 International Conference on 3D Vision (3DV); pp. 987-996 *
Research and Implementation of a Calibration Method for UAV Laser Scanning and Surveying Systems; Xiong Guangyang; China Master's Theses Full-text Database, Engineering Science and Technology II (No. 2); pp. C031-285 *

Also Published As

Publication number Publication date
CN116630411A (en) 2023-08-22

Similar Documents

Publication Publication Date Title
CN107735794B (en) Condition detection using image processing
CN110717983A (en) Building facade three-dimensional reconstruction method based on knapsack type three-dimensional laser point cloud data
CN116630411B (en) Mining electric shovel material surface identification method, device and system based on fusion perception
CN105260988A (en) High-precision map data processing method and high-precision map data processing device
CN112380312B (en) Laser map updating method based on grid detection, terminal and computer equipment
CN105468033A (en) Control method for medical suspension alarm automatic obstacle avoidance based on multi-camera machine vision
CN110749895B (en) Laser radar point cloud data-based positioning method
CN112414403B (en) Robot positioning and attitude determining method, equipment and storage medium
CN112327326A (en) Two-dimensional map generation method, system and terminal with three-dimensional information of obstacles
CN112835064B (en) Mapping positioning method, system, terminal and medium
CN112965063A (en) Robot mapping and positioning method
CN111880195A (en) Tower crane anti-collision method and system based on laser radar
CN114841944B (en) Tailing dam surface deformation inspection method based on rail-mounted robot
CN113516108B (en) Construction site dust suppression data matching processing method based on data identification
Yang et al. 3D building scene reconstruction based on 3D LiDAR point cloud
CN115830234A (en) Point cloud processing method and system for power transmission line modeling
CN114758063A (en) Local obstacle grid map construction method and system based on octree structure
CN116540726A (en) Intelligent obstacle avoidance method, system and medium for patrol robot
CN115017578A (en) Intelligent actual measurement method and device for building, UGV and storage medium
CN116579989B (en) Tunnel punching inclination angle correction method based on depth camera
CN109559374B (en) Efficient mapping system based on point cloud data
CN116753945A (en) Navigation method of industrial inspection robot based on multi-sensor fusion
CN111860084A (en) Image feature matching and positioning method and device and positioning system
CN112348950A (en) Topological map node generation method based on laser point cloud distribution characteristics
CN117876624B (en) Complex environment track planning method based on unmanned aerial vehicle remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant