CN116817904B - Door machine detecting system - Google Patents

Door machine detecting system

Info

Publication number
CN116817904B
CN116817904B (application CN202311092301.7A)
Authority
CN
China
Prior art keywords
point cloud
cloud data
dimensional point
data
inertial navigation
Prior art date
Legal status
Active
Application number
CN202311092301.7A
Other languages
Chinese (zh)
Other versions
CN116817904A (en)
Inventor
牛金玉
严征
李佳阳
赵博飞
Current Assignee
LeiShen Intelligent System Co Ltd
Original Assignee
LeiShen Intelligent System Co Ltd
Priority date
Filing date
Publication date
Application filed by LeiShen Intelligent System Co Ltd filed Critical LeiShen Intelligent System Co Ltd
Priority to CN202311092301.7A
Publication of CN116817904A
Application granted
Publication of CN116817904B
Legal status: Active
Anticipated expiration

Classifications

    • G01C21/1652 Inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G01C21/3841 Creation or updating of map data from two or more sources, e.g. probe vehicles
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • Y02T90/00 Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a gantry crane detection system, relating to the technical field of automatic operation. The system comprises: a detection assembly for acquiring three-dimensional point cloud data and combined inertial navigation data of a preset detection area vertically below the trunk bridge; and a computing processing unit for performing extrinsic calibration between the three-dimensional point cloud data and the combined inertial navigation data, performing pose transformation on the three-dimensional point cloud data based on the calibration result and the combined inertial navigation data to obtain target three-dimensional point cloud data, and assigning the target three-dimensional point cloud data to N sub-heading angles. The computing processing unit is also used for translating the target three-dimensional point cloud data to the rotation center of the gantry crane to obtain a three-dimensional point cloud map in the gantry crane coordinate system, updating the target three-dimensional point cloud data of each sub-heading angle, and detecting the targets to be detected according to the three-dimensional point cloud map. In this way, the three-dimensional point cloud map can be updated in real time and a complete three-dimensional point cloud map can be constructed.

Description

Door machine detecting system
Technical Field
The invention relates to the technical field of automatic operation, and in particular to a gantry crane detection system.
Background
The portal crane (gantry crane) is an important shore device for port operation; through its slewing operation it grabs material to complete the loading and unloading work of port bulk cargo ships. In the prior art, a plurality of laser radar sensors fixedly arranged around the material area are used to scan the material area and the other operation areas of the gantry crane respectively and obtain the corresponding three-dimensional point cloud data; the three-dimensional point cloud data from the plurality of laser radar sensors are then processed to detect cabin information and material information. However, since this approach relies on the laser radar sensors alone, the point cloud map constructed from the three-dimensional point cloud data may be incomplete, which limits the diversity of detection targets, and the point cloud map cannot be updated in real time while the gantry crane operates.
Disclosure of Invention
In view of the above, the present invention aims to overcome the defects of the prior art and to provide a gantry crane detection system that solves the problems of the point cloud map being incomplete and not updatable in real time.
The invention provides a gantry crane detection system, wherein the gantry crane comprises a trunk bridge, and the system comprises:
the detection assembly, comprising a pan-tilt head, at least one laser radar and at least one combined inertial navigation unit, wherein the pan-tilt head is mounted at the tail end of the trunk bridge, the at least one laser radar and the at least one combined inertial navigation unit are mounted on the pan-tilt head and rigidly connected to each other, and the detection assembly is used for acquiring three-dimensional point cloud data and combined inertial navigation data of a preset detection area vertically below the trunk bridge;
the computing processing unit, which is in communication connection with the detection assembly and the gantry crane respectively, and is used for performing extrinsic calibration between the three-dimensional point cloud data and the combined inertial navigation data, equally dividing the heading angle of the combined inertial navigation data according to a preset resolution to obtain N sub-heading angles, performing pose transformation on the three-dimensional point cloud data based on the result of the extrinsic calibration and the combined inertial navigation data to obtain target three-dimensional point cloud data, and assigning the target three-dimensional point cloud data to the N sub-heading angles, wherein N is an integer greater than 1.
The computing processing unit is further used for controlling the trunk bridge to rotate horizontally by a first preset angle while keeping its luffing amplitude unchanged, computing the rotation center of the gantry crane from the combined inertial navigation data recorded during the horizontal rotation, translating the target three-dimensional point cloud data corresponding to each sub-heading angle to the rotation center so as to obtain a three-dimensional point cloud map in the gantry crane coordinate system, updating the target three-dimensional point cloud data of each sub-heading angle, and detecting the targets to be detected according to the three-dimensional point cloud map.
In an embodiment, the target to be detected includes a cabin of a port bulk cargo loading and unloading ship and ground materials, and the detecting the target to be detected according to the three-dimensional point cloud map includes: and detecting the cabin and the ground materials according to the three-dimensional point cloud map.
In an embodiment, after the cabin and the ground material are detected according to the three-dimensional point cloud map, the computing processing unit is further configured to detect the cabin and the ground material again according to the three-dimensional point cloud map and update a detection result of the cabin and the ground material when the door machine rotates to a second preset angle.
In an embodiment, the detecting the cabin according to the three-dimensional point cloud map includes:
acquiring cabin point cloud data according to the three-dimensional point cloud map and a preset cabin detection area, wherein the preset cabin detection area at least comprises a hatch, a cover plate and a deck of the cabin;
extracting deck point cloud data from the cabin point cloud data based on a preset plane extraction algorithm;
fitting according to the deck point cloud data and the cabin point cloud data above the deck to obtain a hatch boundary of the hatch;
and extracting the three-dimensional point cloud data located below the deck plane and within the hatch boundary, and calculating the highest point and the lowest point of the material in the hatch.
In an embodiment, the fitting to obtain the hatch boundary of the hatch according to the deck point cloud data and the cabin point cloud data above the deck includes:
projecting the deck point cloud data and the cabin point cloud data positioned above the deck onto a deck plane to obtain projection point cloud data;
performing data processing on the projection point cloud data to obtain hatch boundary point cloud data;
And performing linear fitting on the hatch boundary point cloud data to obtain a hatch boundary of the hatch, and calculating angular point coordinates of the hatch boundary.
In an embodiment, the performing data processing on the projection point cloud data to obtain hatch boundary point cloud data includes:
calculating the geometric center of the projection point cloud data, and calculating the length of the line connecting each projection point of the projection point cloud data to the geometric center;
performing angle equally dividing processing on the projection point cloud data by taking the geometric center as a circle center, and determining the coordinates of the projection point with the shortest connecting line length in each equally dividing angle;
and obtaining hatch boundary point cloud data according to the set of coordinates.
In an embodiment, the detecting the cabin according to the three-dimensional point cloud map further includes:
detecting the height of the cover plate according to the hatch boundary;
extracting the cabin point cloud data above the cover plate height to obtain obstacle point cloud data;
and carrying out cluster analysis on the obstacle point cloud data to obtain coordinate information and size information of each obstacle on the cabin.
In an embodiment, the detecting the ground material according to the three-dimensional point cloud map includes:
Acquiring ground material point cloud data according to the three-dimensional point cloud map and a preset ground detection area;
extracting ground point cloud data from the ground material point cloud data based on a preset plane extraction algorithm, and determining a ground plane according to the ground point cloud data;
and carrying out equal division processing on the ground material point cloud data above the ground plane according to the preset material detection areas, and calculating the highest point and the lowest point of each preset material detection area.
In an embodiment, the gantry crane further comprises a grapple, the grapple is connected with the trunk bridge through a rope, and the computing processing unit is further used for acquiring real-time frame point cloud data of at least one laser radar;
the target to be detected further comprises the grapple, and the detecting the target to be detected according to the three-dimensional point cloud map further comprises: and tracking the grapple according to the real-time frame point cloud data and the three-dimensional point cloud map.
In an embodiment, the tracking the grapple according to the real-time frame point cloud data and the three-dimensional point cloud map includes:
performing differential calculation on the real-time frame point cloud data and the three-dimensional point cloud map to obtain differential point cloud data, wherein the differential point cloud data comprises grapple point cloud data and rope point cloud data;
Based on a preset point cloud registration algorithm, matching the differential point cloud data by using a preset grapple point cloud template;
if the matching succeeds, outputting the grapple point cloud data;
and tracking the center coordinates and the moving speed of the grapple according to the grapple point cloud data.
In an embodiment, the performing differential computation on the real-time frame point cloud data and the three-dimensional point cloud map to obtain differential point cloud data includes:
performing grid division on the real-time frame point cloud data and the three-dimensional point cloud map according to the same standard;
traversing all grids, and taking a point as a differential point if the point exists in the grid of the real-time frame point cloud data and the point does not exist in the grid of the three-dimensional point cloud map;
and obtaining differential point cloud data according to the set of differential points.
In an embodiment, the tracking the center coordinates and the moving speed of the grapple according to the grapple point cloud data includes:
calculating the center coordinates of the grapples according to the grapple point cloud data;
and tracking the center coordinates and the moving speed of the grapple by using a preset filtering algorithm.
In an embodiment, the computing processing unit performs extrinsic calibration on the three-dimensional point cloud data and the combined inertial navigation data, including:
Calculating calibration external parameters of the three-dimensional point cloud data and the combined inertial navigation data through a preset calibration algorithm;
and converting the calibrated external parameters into an external parameter rotation translation matrix.
In an embodiment, before the course angle of the combined inertial navigation data is equally divided according to a preset resolution to obtain N sub-course angles, the calculation processing unit is further configured to initialize the three-dimensional point cloud data, the combined inertial navigation data, the calibration external parameters, and the preset resolution.
In an embodiment, the performing pose transformation on the three-dimensional point cloud data based on the processing result of the extrinsic calibration and the combined inertial navigation data to obtain target three-dimensional point cloud data includes:
converting the combined inertial navigation data into a combined inertial navigation matrix;
converting the calibrated external parameters into an external parameter rotation translation matrix;
multiplying the combined inertial navigation matrix and the external reference rotation translation matrix to obtain a pose transformation matrix;
multiplying the pose transformation matrix and the three-dimensional point cloud data to obtain target three-dimensional point cloud data.
In an embodiment, the combined inertial navigation data includes inertial navigation attitude information and inertial navigation coordinate information, and the calculating the rotation center of the gantry crane according to the combined inertial navigation data includes: and calculating the rotation center of the gantry crane according to the inertial navigation coordinate information.
The invention discloses a gantry crane detection system in which the detection assembly acquires three-dimensional point cloud data and combined inertial navigation data of a preset detection area vertically below the trunk bridge; the computing processing unit performs extrinsic calibration between the three-dimensional point cloud data and the combined inertial navigation data, performs pose transformation on the three-dimensional point cloud data based on the calibration result and the combined inertial navigation data to obtain target three-dimensional point cloud data, and assigns the target three-dimensional point cloud data to N sub-heading angles; the computing processing unit further translates the target three-dimensional point cloud data to the rotation center of the gantry crane to obtain a three-dimensional point cloud map in the gantry crane coordinate system, updates the target three-dimensional point cloud data of each sub-heading angle, and detects the targets to be detected according to the three-dimensional point cloud map. In this way, a three-dimensional point cloud map is obtained from the three-dimensional point cloud data and the combined inertial navigation data and updated in real time; a plurality of targets to be detected are detected according to the map and the detection results are stored, so that point clouds of the plurality of targets can be obtained and a complete point cloud map constructed.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a gantry crane system according to an embodiment of the present disclosure;
FIG. 2 is another schematic structural diagram of the gantry crane system according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of a bulk carrier according to an embodiment of the present disclosure;
FIG. 4 is a schematic flow diagram of a gantry crane system according to an embodiment of the present disclosure;
fig. 5 shows another flow diagram of the gantry crane system provided by an embodiment of the present disclosure;
fig. 6 shows a further flow diagram of a gantry crane system provided by an embodiment of the present disclosure.
Description of main elements:
101 - trunk bridge; 102 - grapple; 103 - translation base; 104 - rotation center; 20 - detection assembly.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly on" another element, there are no intervening elements present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in this specification is for the purpose of describing particular embodiments only and is not intended to limit the application. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Example 1
This embodiment provides a gantry crane detection system. During the slewing operation of the gantry crane, the system detects the bulk carrier to obtain its cabin information and information on the bulk material stacked irregularly in the cabin hatch, and builds and updates a complete point cloud map in real time. According to this map, the grapple grabs material on the ground and stacks it in the bulk carrier hatch, completing the loading and unloading work of the port bulk cargo ship.
Specifically, referring to fig. 1, the gantry crane 10 includes a trunk bridge 101, and the system includes:
the detection assembly 20, comprising a pan-tilt head 201, at least one laser radar 202 and at least one combined inertial navigation unit 203, wherein the pan-tilt head 201 is mounted at the tail end of the trunk bridge 101, the at least one laser radar 202 and the at least one combined inertial navigation unit 203 are mounted on the pan-tilt head 201 and rigidly connected to each other, and the detection assembly 20 is used for acquiring three-dimensional point cloud data and combined inertial navigation data of a preset detection area vertically below the trunk bridge 101.
It should be noted that, in the present embodiment, the detection assembly 20 includes one pan-tilt head 201, one laser radar 202 and one combined inertial navigation unit 203; the laser radar 202 and the combined inertial navigation unit 203 are mounted on the pan-tilt head 201 and rigidly connected to each other. In other embodiments there may be multiple laser radars, such as 2 or 3, and likewise multiple combined inertial navigation units. The numbers of laser radars and combined inertial navigation units are not limited.
Referring to fig. 2, the gantry crane 10 further includes a translation base 103 and a rotation center 104; the gantry crane 10 can move on a track through the translation base 103 and rotate 360 degrees about the rotation center 104. The pan-tilt head 201 is mounted at the tail end of the trunk bridge 101, and the laser radar 202 and the combined inertial navigation unit 203 mounted on it point vertically downward, acquiring three-dimensional point cloud data and combined inertial navigation data of the preset detection area in real time; the laser radar 202 and the combined inertial navigation unit 203 transmit the acquired data to the computing processing unit 30 in real time through a wired or wireless network. The three-dimensional point cloud data generally comprise X, Y and Z coordinates and reflection intensity; the combined inertial navigation data typically include pitch angle, roll angle, heading (or yaw) angle, longitude, latitude, and altitude.
The computing processing unit 30 is in communication connection with the detection assembly 20 and the gantry crane 10 respectively, and is configured to perform extrinsic calibration between the three-dimensional point cloud data and the combined inertial navigation data, equally divide the heading angle of the combined inertial navigation data according to a preset resolution to obtain N sub-heading angles, perform pose transformation on the three-dimensional point cloud data based on the result of the extrinsic calibration and the combined inertial navigation data to obtain target three-dimensional point cloud data, and assign the target three-dimensional point cloud data to the N sub-heading angles, where N is an integer greater than 1.
In this embodiment, the computing processing unit 30 is communicatively coupled to the detection assembly 20 and the gantry crane 10 respectively; it receives and processes the three-dimensional point cloud data and the combined inertial navigation data transmitted by the detection assembly 20, and controls the operation of the gantry crane 10.
In a specific embodiment, the performing the extrinsic calibration on the three-dimensional point cloud data and the combined inertial navigation data includes: and calculating the calibration external parameters of the three-dimensional point cloud data and the combined inertial navigation data through a preset calibration algorithm.
In this embodiment, the computing processing unit 30 calculates the calibration external parameters between the three-dimensional point cloud data and the combined inertial navigation data through a preset calibration algorithm; the calibration external parameters generally include pitch angle, roll angle, heading (or yaw) angle, translation X, translation Y, and translation Z. The preset calibration algorithm is not limited.
In a specific embodiment, before the course angle of the combined inertial navigation data is equally divided according to a preset resolution to obtain N sub-course angles, the calculation processing unit is further configured to perform an initialization process on the calibration external parameter.
In this embodiment, the computing processing unit 30 initializes parameters such as the three-dimensional point cloud data, the combined inertial navigation data, the calibration external parameters and the preset resolution. Because the gantry crane 10 can rotate 360 degrees about the rotation center 104, the heading angle in the combined inertial navigation data ranges over 0-360 degrees. After the parameter initialization, the computing processing unit 30 divides 360 degrees into a number of sub-heading angles according to the preset resolution, which may be 0.5, 1 or 2 degrees and so on. For example, at a preset resolution of 1 degree, 360 degrees is equally divided into 360 sub-heading angles of 1 degree each.
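By way of illustration, the sub-heading-angle division can be sketched as follows (Python/NumPy; the function and parameter names are illustrative assumptions, not from the patent):

```python
import numpy as np

def heading_to_bin(heading_deg: float, resolution_deg: float = 1.0) -> int:
    """Map a heading angle in [0, 360) to its sub-heading-angle bin index."""
    n_bins = int(round(360.0 / resolution_deg))  # e.g. 360 bins at 1 deg
    return int((heading_deg % 360.0) // resolution_deg) % n_bins

# Example: at 1 deg resolution, a heading of 15.4 deg falls into bin 15, so the
# corresponding point cloud would be stored under the 15 deg sub-heading angle.
assert heading_to_bin(15.4) == 15
```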
In a specific embodiment, the performing pose transformation on the three-dimensional point cloud data based on the processing result of the extrinsic calibration and the combined inertial navigation data to obtain target three-dimensional point cloud data includes: converting the combined inertial navigation data into a combined inertial navigation matrix; converting the calibrated external parameters into an external parameter rotation translation matrix; multiplying the combined inertial navigation matrix and the external reference rotation translation matrix to obtain a pose transformation matrix; multiplying the pose transformation matrix and the three-dimensional point cloud data to obtain target three-dimensional point cloud data.
In this embodiment, after initializing the parameters, the computing processing unit 30 converts the combined inertial navigation data into a combined inertial navigation matrix, converts the calibration external parameters into an extrinsic rotation-translation matrix, and multiplies the combined inertial navigation matrix by the extrinsic rotation-translation matrix to obtain a pose transformation matrix. The three-dimensional point cloud data are then multiplied by the pose transformation matrix to obtain the target three-dimensional point cloud data, which are assigned to and stored in the corresponding sub-heading angles.
The three-dimensional point cloud data before the pose transformation are in the laser radar coordinate system; the pose transformation converts them into the combined inertial navigation coordinate system, and the result is the target three-dimensional point cloud data. The target three-dimensional point cloud data are assigned to the sub-heading angle corresponding to the heading angle in the combined inertial navigation data. For example, if the heading angle in the combined inertial navigation data is 15 degrees, the three-dimensional point cloud data acquired at that moment are stored in the 15 degree sub-heading angle.
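By way of illustration, the pose transformation chain can be sketched as follows (Python/NumPy; the Z-Y-X Euler convention and all names are assumptions, since the patent does not fix a convention):

```python
import numpy as np

def rt_matrix(roll, pitch, yaw, tx, ty, tz):
    """Build a 4x4 rotation-translation matrix from Euler angles (radians)
    and a translation; Z-Y-X (yaw-pitch-roll) order is assumed here."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [tx, ty, tz]
    return T

def to_target_cloud(points_xyz, ins_matrix, extrinsic_matrix):
    """Pose transform: multiply the INS matrix by the extrinsic matrix,
    then apply the result to the Nx3 lidar points in homogeneous form."""
    pose = ins_matrix @ extrinsic_matrix
    homog = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    return (homog @ pose.T)[:, :3]
```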
It should be noted that 360 degrees is divided into a number of sub-heading angles according to the preset resolution, and the target three-dimensional point cloud data are stored into the corresponding sub-heading angles. That is, when the gantry crane rotates to a sub-heading angle, only the corresponding part of the target point cloud data is stored, instead of storing all the target point cloud data at once after a full 360 degree rotation. Storing the point cloud data per sub-heading angle greatly reduces the cache pressure on the computing processing unit 30 and improves the timeliness of point cloud updates.
The computing processing unit 30 is further configured to control the trunk bridge 101 to rotate horizontally by a first preset angle while keeping the luffing amplitude unchanged, calculate the rotation center of the gantry crane from the combined inertial navigation data recorded during the horizontal rotation, translate the target three-dimensional point cloud data corresponding to each sub-heading angle to the rotation center to obtain a three-dimensional point cloud map in the gantry crane coordinate system, update the target three-dimensional point cloud data of each sub-heading angle, and detect the targets to be detected according to the three-dimensional point cloud map.
In this embodiment, the computing processing unit 30 can control the luffing amplitude of the trunk bridge 101. When the computing processing unit 30 keeps the amplitude unchanged and the gantry crane 10 only rotates horizontally, the computing processing unit 30 starts to build the map. The computing processing unit 30 stores the target three-dimensional point cloud data and the combined inertial navigation data into each sub-heading angle in real time; when the change of the heading angle in the stored combined inertial navigation data exceeds the first preset angle, i.e. the rotation range of the gantry crane exceeds the first preset angle, the computing processing unit 30 superimposes the target three-dimensional point cloud data of all sub-heading angles stored while the gantry crane 10 rotated through the first preset angle, obtaining the target three-dimensional point cloud map; the map construction is thereby completed. The first preset angle may be 90 degrees, or another angle set according to the practical application. For example, with a first preset angle of 90 degrees, mapping is considered complete once the rotation range of the gantry crane exceeds 90 degrees, i.e. the heading angle in the combined inertial navigation data has changed by 90 degrees, for instance from 0 to 90 degrees or from 20 to 110 degrees.
Further, after the map construction is completed, the computing processing unit 30 calculates the rotation center of the gantry crane from the combined inertial navigation data, translates the target three-dimensional point cloud data of each sub-heading angle in the target three-dimensional point cloud map to the rotation center of the gantry crane to obtain the three-dimensional point cloud map in the gantry crane coordinate system, and updates the target three-dimensional point cloud data of each sub-heading angle to three-dimensional point cloud data in the gantry crane coordinate system.
Further, the computing processing unit 30 begins detecting the targets to be detected as soon as the three-dimensional point cloud map in the gantry crane coordinate system is obtained.
In a specific embodiment, the combined inertial navigation data includes inertial navigation coordinate information, and the calculating the rotation center of the gantry crane according to the combined inertial navigation data includes: and calculating the rotation center of the gantry crane according to the inertial navigation coordinate information.
In this embodiment, after the mapping is completed, the calculation processing unit 30 calculates the rotation center of the gantry crane according to the inertial navigation coordinate information in the combined inertial navigation data stored in the mapping process, where the inertial navigation coordinate information includes longitude, latitude, and altitude.
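The patent does not specify how the rotation center is derived from the inertial navigation coordinates; one plausible approach, sketched below under that assumption, is a least-squares (Kasa) circle fit to the horizontal sensor trajectory recorded while the crane slews (Python/NumPy; names illustrative):

```python
import numpy as np

def fit_rotation_center(xy):
    """Least-squares (Kasa) circle fit to Nx2 horizontal positions.
    Solves x^2 + y^2 = 2*cx*x + 2*cy*y + c, with r^2 = c + cx^2 + cy^2."""
    A = np.column_stack([2 * xy[:, 0], 2 * xy[:, 1], np.ones(len(xy))])
    b = (xy ** 2).sum(axis=1)
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([cx, cy]), np.sqrt(c + cx**2 + cy**2)  # center, radius
```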
It should be noted that, after the computing processing unit 30 completes the first mapping, if the program is closed or the gantry crane is moved, a second mapping is performed and the rotation center of the gantry crane is recalculated, yielding the three-dimensional point cloud map in the gantry crane coordinate system for the second mapping.
In a specific embodiment, the target to be detected includes a cabin of a port bulk cargo loading and unloading ship and ground materials, and the detecting the target to be detected according to the three-dimensional point cloud map includes: and detecting the cabin and the ground materials according to the three-dimensional point cloud map.
In this embodiment, once the computing processing unit 30 obtains the three-dimensional point cloud map in the gantry crane coordinate system, it immediately detects the cabin and the ground material and stores the detection results.
Referring to fig. 4, in an embodiment, the detecting the cabin according to the three-dimensional point cloud map includes the following steps S401 to S407, and each step is described below.
Step S401, acquiring cabin point cloud data according to the three-dimensional point cloud map and a preset cabin detection area, wherein the preset cabin detection area at least comprises a hatch, a cover plate and a deck of the cabin.
It should be noted that, referring to fig. 3, for a bulk carrier the cover plate is located above the hatch of the cabin and the deck lies beside the hatch; therefore, to construct the point cloud map more completely and to facilitate the loading and unloading of the bulk material, the deck, the hatch and the cover plate need to be detected one by one.
Step S402, extracting deck point cloud data from the cabin point cloud data based on a preset plane extraction algorithm.
It should be noted that the computing processing unit 30 extracts the deck point cloud data from the three-dimensional point cloud map in the gantry crane coordinate system using a preset plane extraction algorithm; such algorithms include, but are not limited to, the covariance method and the Random Sample Consensus (RANSAC) algorithm.
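By way of illustration, a compact RANSAC plane fit of the kind named above might look like this (Python/NumPy; the threshold and iteration count are assumptions):

```python
import numpy as np

def ransac_plane(points, n_iters=500, dist_thresh=0.05, seed=0):
    """Fit the dominant plane (e.g. the deck) in an Nx3 cloud.
    Returns ((normal, d), inlier_indices) with normal . p + d = 0."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = np.array([], dtype=int), None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:               # skip degenerate (collinear) samples
            continue
        normal /= norm
        d = -normal.dot(p0)
        inliers = np.flatnonzero(np.abs(points @ normal + d) < dist_thresh)
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers
```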
Step S403, fitting to obtain a hatch boundary of the hatch according to the deck point cloud data and the cabin point cloud data located above the deck.
In a specific embodiment, the fitting to obtain the hatch boundary of the hatch according to the deck point cloud data and the cabin point cloud data located above the deck includes: projecting the deck point cloud data and the cabin point cloud data positioned above the deck onto a deck plane to obtain projection point cloud data; performing data processing on the projection point cloud data to obtain hatch boundary point cloud data; and performing linear fitting on the hatch boundary point cloud data to obtain a hatch boundary of the hatch, and calculating angular point coordinates of the hatch boundary.
It should be noted that the hatch boundary obtained by the computing processing unit 30 here is rectangular, so 4 corner coordinates need to be calculated. In other embodiments, if the hatch boundary has another shape, the corresponding corner coordinates are calculated; a pentagonal hatch boundary, for example, has 5 corner coordinates.
In a specific embodiment, the performing data processing on the projection point cloud data to obtain hatch boundary point cloud data includes: calculating the geometric center of the projection point cloud data, and calculating the connection line lengths of all projection points of the projection point cloud data and the geometric center connection line respectively; performing angle equally dividing processing on the projection point cloud data by taking the geometric center as a circle center, and determining the coordinates of the projection point with the shortest connecting line length in each equally dividing angle; and obtaining hatch boundary point cloud data according to the set of coordinates.
When the computing processing unit 30 performs the angular equal-division processing on the projection point cloud data with the geometric center as the center, the full 360 degrees around the geometric center is divided into equal angular bins, generally of 2 degrees, so that the projection point cloud data are divided into 180 bins; the bin size can be adjusted according to the practical application. Methods of extracting the hatch boundary point cloud data include, but are not limited to, this angle equal-division method.
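By way of illustration, the angle equal-division extraction can be sketched as follows (Python/NumPy; the 2 degree bin size follows the text, the names are illustrative):

```python
import numpy as np

def hatch_boundary_points(proj_xy, bin_deg=2.0):
    """From Nx2 projected points, keep the point closest to the geometric
    center within each angular bin; the set of kept points approximates
    the hatch boundary."""
    center = proj_xy.mean(axis=0)
    rel = proj_xy - center
    radius = np.linalg.norm(rel, axis=1)
    angle = np.degrees(np.arctan2(rel[:, 1], rel[:, 0])) % 360.0
    bins = (angle // bin_deg).astype(int)            # 180 bins at 2 deg
    boundary = []
    for b in np.unique(bins):
        members = np.flatnonzero(bins == b)
        boundary.append(proj_xy[members[np.argmin(radius[members])]])
    return np.asarray(boundary)
```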
Step S404, extracting the three-dimensional point cloud data located below the deck plane and within the hatch boundary, and calculating the highest point and the lowest point of the material in the hatch.
It should be noted that, once the cabin detection yields the highest and lowest points of the bulk material in the cabin hatch, it is known where to grab the material and where to place it.
In this embodiment, when the calculation processing unit 30 detects the cabin, the calculation processing unit 30 obtains cabin point cloud data according to the three-dimensional point cloud map and a preset cabin detection area, where the preset cabin detection area includes at least a hatch, a cover plate and a deck of the cabin, and the cabin point cloud data includes a hatch point cloud, a cover plate point cloud and a deck point cloud of the cabin. And extracting deck point cloud data from the cabin point cloud data by using a preset plane extraction algorithm.
Further, cabin point cloud data above the deck are obtained according to the deck point cloud data, and the deck point cloud data and the cabin point cloud data above the deck are projected onto a deck plane to obtain projection point cloud data.
Further, the geometric center of the projection point cloud data is calculated, together with the length of the line connecting each projection point to the geometric center; the projection point cloud data are divided into equal angular bins around the geometric center; the coordinates of the projection point with the shortest connecting line in each bin are determined; and the hatch boundary point cloud data are obtained as the set of these coordinates.
Further, performing linear fitting on the hatch boundary point cloud data to obtain a hatch boundary of the hatch, and calculating corner coordinates of the hatch boundary.
Further, according to the deck point cloud data and the angular point coordinates of the hatch boundary, extracting three-dimensional point cloud data which is positioned below a deck plane and in the hatch boundary to obtain material point cloud data in the hatch, and calculating the highest point and the lowest point of bulk materials stacked in the hatch.
Step S405, detecting the cover plate height according to the hatch boundary.
It should be noted that, by extending the hatch boundary outward by 1-2 meters in the horizontal direction, part of the point cloud data of the cover plate covering the cabin hatch can be extracted; traversing this cover plate point cloud then gives the height of the cover plate.
Step S406, extracting the cabin point cloud data above the cover plate height to obtain obstacle point cloud data.
Since whatever is stacked on the cover plate is not material to be handled, the cabin point cloud data above the cover plate height are used as obstacle point cloud data.
Step S407, performing cluster analysis on the obstacle point cloud data to obtain coordinate information and size information of each obstacle on the cabin.
It should be noted that, with the obstacle coordinate information and size information obtained from the cabin detection, it is known how to grab and place the material while avoiding the obstacles.
Referring to fig. 5, in a specific embodiment, the detection of the ground material according to the three-dimensional point cloud map includes steps S501-S503, and each step is described below.
Step S501, acquiring ground material point cloud data according to the three-dimensional point cloud map and a preset ground detection area.
It should be noted that the gantry crane needs to load ground materials onto the bulk cargo ship, so that the ground materials also need to be detected. The three-dimensional point cloud map obtained in the rotation process of the gantry crane already contains ground material information, so that ground material point cloud data can be obtained according to a preset ground detection area.
Step S502, extracting ground point cloud data from the ground material point cloud data based on a preset plane extraction algorithm, and determining a ground plane according to the ground point cloud data.
It should be noted that, when the ground point cloud data is acquired based on the preset plane extraction algorithm, the preset plane extraction algorithm includes, but is not limited to, a covariance method and a random sampling consistency algorithm (Random Sample Consensus, RANSAC).
Step S503, performing equal division processing on the ground material point cloud data above the ground plane according to the preset material detection areas, and calculating the highest point and the lowest point of each preset material detection area.
In this embodiment, for convenience of calculation, the material detection area is set to be rectangular, the rectangular area is divided into a number of small rectangular cells, and the highest and lowest points are calculated for each cell. In other embodiments, the detection area may take other shapes, such as circular or triangular. The division ratio can be adjusted according to the area of the material detection area, for example 3:3 or 5:5. For instance, if the material detection area is 20 m x 30 m and the division ratio is 5:5, then 20 m / 5 = 4 m and 30 m / 5 = 6 m, so each small rectangular cell is 4 m x 6 m. In other embodiments, other division ratios may be chosen according to the application, such as 4:4 or 6:6.
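By way of illustration, the equal-division step for the 20 m x 30 m example can be sketched as follows (Python/NumPy; the ranges and names are illustrative assumptions):

```python
import numpy as np

def per_cell_extremes(points, x_range=(0.0, 20.0), y_range=(0.0, 30.0), split=5):
    """Divide a rectangular material area into split x split cells and
    return per-cell highest/lowest z (NaN where a cell has no points)."""
    dx = (x_range[1] - x_range[0]) / split   # 20 m / 5 = 4 m
    dy = (y_range[1] - y_range[0]) / split   # 30 m / 5 = 6 m
    hi = np.full((split, split), np.nan)
    lo = np.full((split, split), np.nan)
    ix = np.clip(((points[:, 0] - x_range[0]) // dx).astype(int), 0, split - 1)
    iy = np.clip(((points[:, 1] - y_range[0]) // dy).astype(int), 0, split - 1)
    for i, j, z in zip(ix, iy, points[:, 2]):
        hi[i, j] = z if np.isnan(hi[i, j]) else max(hi[i, j], z)
        lo[i, j] = z if np.isnan(lo[i, j]) else min(lo[i, j], z)
    return hi, lo
```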
It should be noted that, after the highest point and the lowest point of the ground material are obtained, the ground material can conveniently be grabbed at the corresponding highest or lowest point.
In a specific embodiment, after the cabin and the ground material are detected according to the three-dimensional point cloud map, the computing processing unit is further configured to detect the cabin and the ground material again according to the three-dimensional point cloud map when the door 10 rotates to a second preset angle, and update the detection results of the cabin and the ground material.
In this embodiment, the calculation processing unit 30 will store all the data obtained when the cabin is detected, including cabin point cloud data, deck point cloud data, projection point cloud data, hatch boundary point cloud data, coordinates of corner points, highest and lowest points of the material in the hatch, and coordinate information and size information of the obstacle. The ship cabin point cloud data, the deck point cloud data, the projection point cloud data, the hatch boundary point cloud data and other point cloud data can be used for constructing a more complete point cloud map.
Further, after the computing processing unit 30 completes the first mapping and the first detection, it controls the gantry crane 10 to rotate and grab material. While the gantry crane 10 rotates, the combined inertial navigation unit 203 synchronously acquires combined inertial navigation data; when the heading angle change exceeds the second preset angle, i.e. the gantry crane 10 has rotated past the second preset angle and completed one grab cycle, the computing processing unit 30 performs a second detection of the cabin and the ground material, updates the detection results, and synchronously updates the three-dimensional point cloud map. The second preset angle may be set to 90 degrees, or to another angle according to the practical application. Thereafter, every time the gantry crane 10 rotates through the second preset angle, i.e. every completed grab cycle, the cabin detection and the ground material detection are performed once.
In a specific embodiment, the gantry crane 10 further includes a grapple 102, the grapple 102 is connected to the trunk bridge 101 through a rope, and the computing processing unit 30 is further configured to acquire real-time frame point cloud data of the at least one laser radar 202; the target to be detected further includes the grapple 102, and detecting the target to be detected according to the three-dimensional point cloud map further includes: tracking the grapple 102 according to the real-time frame point cloud data and the three-dimensional point cloud map.
In this embodiment, referring again to fig. 2, the door machine 10 further includes a grapple 102, and after the first drawing is completed, the grapple 102 is detected.
It should be noted that the real-time frame point cloud data obtained by the computing processing unit 30 are the three-dimensional point cloud data of each laser radar frame; the frame rate is 10 Hz, i.e. one frame every 0.1 seconds.
In a specific embodiment, the tracking of the grapple 102 according to the real-time frame point cloud data and the three-dimensional point cloud map includes steps S601 to S604, each of which is described below.
Step S601, carrying out differential calculation on the real-time frame point cloud data and the three-dimensional point cloud map to obtain differential point cloud data, wherein the differential point cloud data comprise grapple point cloud data and rope point cloud data.
In a specific embodiment, the performing differential computation on the real-time frame point cloud data and the three-dimensional point cloud map to obtain differential point cloud data includes: performing grid division on the real-time frame point cloud data and the three-dimensional point cloud map according to the same standard; traversing all grids, and taking a point as a differential point if the point exists in the grid of the real-time frame point cloud data and the point does not exist in the grid of the three-dimensional point cloud map; and obtaining differential point cloud data according to the set of differential points.
It should be noted that the grid method used in this embodiment is only one of the ways the computing processing unit 30 may obtain the differential point cloud; other methods can also be used.
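By way of illustration, the grid differencing can be sketched as a voxel-set difference (Python/NumPy; the 0.2 m voxel size is an assumption):

```python
import numpy as np

def differential_points(frame_pts, map_pts, voxel=0.2):
    """Keep real-time frame points whose voxel contains no map point;
    these are the moving objects (grapple and rope)."""
    map_voxels = set(map(tuple, np.floor(map_pts / voxel).astype(int)))
    keys = np.floor(frame_pts / voxel).astype(int)
    mask = np.array([tuple(k) not in map_voxels for k in keys], dtype=bool)
    return frame_pts[mask]
```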
Step S602, based on a preset point cloud registration algorithm, the differential point cloud data are matched by using a preset grapple point cloud template.
It should be noted that, when the computing processing unit 30 matches the grapple 102 based on the preset point cloud registration algorithm, the algorithm includes, but is not limited to, the Normal Distributions Transform (NDT) algorithm.
Step S603, if the matching succeeds, outputting the grapple point cloud data.
It should be noted that the output grapple point cloud data correspond to the best-matching preset grapple point cloud template.
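NDT registration itself is too long to sketch here; as a deliberately simplified stand-in for the select-best-template step (not the NDT algorithm the text names), each template can be scored by its mean nearest-neighbor distance to the differential cloud after centroid alignment (Python/SciPy; the acceptance threshold is an assumption):

```python
import numpy as np
from scipy.spatial import cKDTree

def best_template(diff_pts, templates, accept_thresh=0.15):
    """Score each Nx3 grapple template against the differential cloud;
    return (best_index, score), or (None, inf) if nothing matches."""
    tree = cKDTree(diff_pts)
    best_idx, best_score = None, np.inf
    for i, tpl in enumerate(templates):
        aligned = tpl - tpl.mean(axis=0) + diff_pts.mean(axis=0)
        score = tree.query(aligned)[0].mean()   # mean nearest-neighbor distance
        if score < best_score:
            best_idx, best_score = i, score
    return (best_idx, best_score) if best_score < accept_thresh else (None, np.inf)
```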
Step S604, tracking the center coordinates and the moving speed of the grapple 102 according to the grapple point cloud data.
In a specific embodiment, the tracking the center coordinates and the moving speed of the grapple according to the grapple point cloud data includes: calculating the center coordinates of the grapples according to the grapple point cloud data; and tracking the center coordinates and the moving speed of the grapple by using a preset filtering algorithm.
When the computing processing unit 30 tracks the center coordinates and the moving speed of the grapple 102 with a preset filtering algorithm, algorithms such as the Kalman filter and the extended Kalman filter can be used.
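By way of illustration, a minimal constant-velocity Kalman filter for tracking the grapple center might look like this (Python/NumPy; the noise magnitudes are placeholder assumptions):

```python
import numpy as np

class GrappleTracker:
    """Constant-velocity Kalman filter over the state [x y z vx vy vz]."""
    def __init__(self, dt=0.1, q=0.5, r=0.05):            # 10 Hz lidar frames
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                    # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.Q, self.R = q * np.eye(6), r * np.eye(3)
        self.x, self.P = np.zeros(6), np.eye(6)

    def step(self, center_xyz):
        # Predict, then update with the measured grapple center.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        y = center_xyz - self.H @ self.x
        K = self.P @ self.H.T @ np.linalg.inv(self.H @ self.P @ self.H.T + self.R)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3], self.x[3:]                      # position, velocity estimate
```

The filter's one-step prediction is what supports the next-frame grapple position prediction discussed at the end of this embodiment.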
In this embodiment, first, the calculation processing unit 30 obtains real-time frame point cloud data of the laser radar, the real-time frame point cloud data and the three-dimensional point cloud map are subjected to grid division according to the same standard, all grids are traversed, and if points exist in the grids of the real-time frame point cloud data and no points exist in the grids of the three-dimensional point cloud map, the points are used as differential points; and obtaining differential point cloud data according to the set of differential points, wherein the differential point cloud data comprises grapple point cloud data and rope point cloud data.
Further, the grapple point cloud data of different forms are collected in advance, the grapple point cloud data of a plurality of different forms are segmented, and the grapple point cloud data are respectively stored as a plurality of grapple point cloud templates.
Further, according to a preset point cloud registration algorithm, matching grapple positions in current differential point cloud data by using a plurality of grapple point cloud templates; selecting a grapple point cloud template with the best matching position, and outputting grapple point cloud data of the grapple point cloud template; calculating the center coordinates of the grapple according to the grapple point cloud data; and calculating the center coordinates and the movement speed of the grapple by using a preset filtering algorithm.
It should be noted that grapple detection may also be performed before the calculation processing unit 30 builds the map. Before mapping, the preset grapple point cloud template is matched against the real-time frame point cloud based on the preset point cloud registration algorithm, so as to obtain the preset grapple point cloud template that best matches the grapple position in the real-time frame point cloud, and grapple tracking is performed according to the grapple point cloud data of that template. This matching method can also be used after the map is built; however, matching directly on real-time frame point cloud data is subject to interference from other point cloud data, so the grapple point cloud data obtained by matching against differential point cloud data after mapping are more accurate.
It should also be noted that detecting the grapple in real time and tracking the center coordinates and moving speed of the current grapple provides a prediction for the grapple matching of the next frame, i.e., the coordinate position of the grapple in the next frame is predicted. This reduces the deviation of the grapple center coordinates and prevents excessive grapple swing, thereby preventing the grapple from touching foreign objects such as the ship, the door machine, or a truck when grabbing material.
The embodiment of the disclosure provides a door machine detection system, wherein a detection component is used for acquiring three-dimensional point cloud data and combined inertial navigation data of a preset detection area vertically below the trunk bridge; the computing processing unit is used for performing external parameter calibration on the three-dimensional point cloud data and the combined inertial navigation data, performing pose transformation on the three-dimensional point cloud data based on the processing result of the external parameter calibration and the combined inertial navigation data to obtain target three-dimensional point cloud data, and correspondingly dividing the target three-dimensional point cloud data into N sub-heading angles; the computing processing unit is further used for translating the target three-dimensional point cloud data to the rotation center of the door machine to obtain a three-dimensional point cloud map under the door machine coordinate system, updating the target three-dimensional point cloud data of each sub-heading angle, and detecting the targets to be detected according to the three-dimensional point cloud map. In this way, a three-dimensional point cloud map is obtained from the three-dimensional point cloud data and the combined inertial navigation data and is updated in real time, a plurality of targets to be detected are detected according to the map, and the detection results are stored, so that point cloud maps of the plurality of targets to be detected are obtained and a complete point cloud map is constructed.
Any particular values in the examples shown and described herein are to be construed as merely illustrative and not limiting; other exemplary embodiments may therefore have different values.
It should be noted that like reference numerals and letters denote like items in the figures; once an item is defined in one figure, it need not be further defined or explained in subsequent figures.
The above examples represent only a few embodiments of the present invention and are described in relative detail, but they are not to be construed as limiting the scope of the invention. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the invention.

Claims (15)

1. A door machine detection system, said door machine including a trunk bridge, said system comprising:
the detection assembly comprises a tripod head, at least one laser radar and at least one combined inertial navigation unit, wherein the tripod head is arranged at the tail end of the trunk bridge, the at least one laser radar and the at least one combined inertial navigation unit are arranged on the tripod head and are rigidly connected to each other, and the detection assembly is used for acquiring three-dimensional point cloud data and combined inertial navigation data of a preset detection area vertically below the trunk bridge;
the computing processing unit is respectively in communication connection with the detection assembly and the door machine, and is used for performing external parameter calibration on the three-dimensional point cloud data and the combined inertial navigation data, equally dividing the heading angle of the combined inertial navigation data according to a preset resolution to obtain N sub-heading angles, performing pose transformation on the three-dimensional point cloud data based on the processing result of the external parameter calibration and the combined inertial navigation data to obtain target three-dimensional point cloud data, and correspondingly dividing the target three-dimensional point cloud data into the N sub-heading angles, wherein N is an integer greater than 1; the processing result of the external parameter calibration comprises: the calibrated external parameters of the three-dimensional point cloud data and the combined inertial navigation data;
the computing processing unit is further used for controlling the trunk bridge to rotate horizontally by a first preset angle while keeping its amplitude unchanged, calculating the rotation center of the door machine according to the combined inertial navigation data acquired during the horizontal rotation, translating the target three-dimensional point cloud data corresponding to each sub-heading angle to the rotation center to obtain a three-dimensional point cloud map under the door machine coordinate system, updating the target three-dimensional point cloud data of each sub-heading angle, and detecting a target to be detected according to the three-dimensional point cloud map;
wherein the performing pose transformation on the three-dimensional point cloud data based on the processing result of the external parameter calibration and the combined inertial navigation data to obtain the target three-dimensional point cloud data comprises:
converting the combined inertial navigation data into a combined inertial navigation matrix;
converting the calibrated external parameters into an external parameter rotation-translation matrix;
multiplying the combined inertial navigation matrix by the external parameter rotation-translation matrix to obtain a pose transformation matrix;
and multiplying the pose transformation matrix by the three-dimensional point cloud data to obtain the target three-dimensional point cloud data.
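By way of a non-limiting illustration of the four steps recited above, the chain can be sketched as follows; the matrix names T_ins and T_ext and the homogeneous-coordinate layout are assumptions of this sketch:

```python
import numpy as np

def to_target_frame(points, T_ins, T_ext):
    """Apply the recited chain: the pose transformation matrix is the
    combined inertial navigation matrix multiplied by the external
    parameter rotation-translation matrix, then applied to the points.

    points: (N, 3) lidar points; T_ins, T_ext: 4x4 homogeneous matrices.
    """
    T = T_ins @ T_ext                            # pose transformation matrix
    homo = np.hstack([points, np.ones((len(points), 1))])   # (N, 4)
    return (homo @ T.T)[:, :3]                   # target three-dimensional points
```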
2. The door machine detection system according to claim 1, wherein the target to be detected includes a cabin of a port bulk cargo loading and unloading ship and ground materials, and the detecting the target to be detected according to the three-dimensional point cloud map includes: detecting the cabin and the ground materials according to the three-dimensional point cloud map.
3. The door machine detection system according to claim 2, wherein after the cabin and the ground materials are detected according to the three-dimensional point cloud map, the calculation processing unit is further configured to, when the door machine rotates to a second preset angle, detect the cabin and the ground materials again according to the three-dimensional point cloud map and update the detection results of the cabin and the ground materials.
4. The door machine detection system according to claim 2, wherein the detecting the cabin according to the three-dimensional point cloud map comprises:
acquiring cabin point cloud data according to the three-dimensional point cloud map and a preset cabin detection area, wherein the preset cabin detection area at least comprises a hatch, a cover plate and a deck of the cabin;
extracting deck point cloud data from the cabin point cloud data based on a preset plane extraction algorithm;
fitting according to the deck point cloud data and the cabin point cloud data above the deck to obtain a hatch boundary of the hatch;
and extracting the three-dimensional point cloud data located below the plane of the deck and within the hatch boundary, and calculating the highest point and the lowest point of the material in the hatch.
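One non-limiting sketch of the plane-extraction step of claim 4, using RANSAC plane segmentation as the preset plane extraction algorithm (the threshold, iteration count, and function name are assumptions):

```python
import numpy as np
import open3d as o3d

def extract_deck(hold_pts, dist=0.1):
    """RANSAC-segment the dominant plane, taken here to be the deck.

    hold_pts: (N, 3) cabin point cloud; dist: inlier threshold in metres
    (0.1 m is an assumed value).
    """
    pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(hold_pts))
    plane, inliers = pcd.segment_plane(distance_threshold=dist,
                                       ransac_n=3, num_iterations=1000)
    a, b, c, d = plane                           # plane: a*x + b*y + c*z + d = 0
    deck_pts = hold_pts[inliers]                 # deck point cloud data
    rest_pts = np.delete(hold_pts, inliers, axis=0)
    return (a, b, c, d), deck_pts, rest_pts
```

Points of rest_pts on the material side of the fitted plane that also fall inside the hatch boundary would then yield the highest and lowest material points as the max/min of their height.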
5. The door machine detection system according to claim 4, wherein the fitting according to the deck point cloud data and the cabin point cloud data above the deck to obtain the hatch boundary of the hatch comprises:
projecting the deck point cloud data and the cabin point cloud data positioned above the deck onto a deck plane to obtain projection point cloud data;
performing data processing on the projection point cloud data to obtain hatch boundary point cloud data;
and performing linear fitting on the hatch boundary point cloud data to obtain a hatch boundary of the hatch, and calculating angular point coordinates of the hatch boundary.
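As a compact, hedged stand-in for the straight-line fitting and corner computation of claim 5 (it assumes an approximately rectangular hatch and uses a minimum-area rectangle rather than per-edge line fits):

```python
import numpy as np
import cv2

def hatch_corners(boundary_xy):
    """Fit the hatch outline and return its four corner coordinates.

    boundary_xy: (N, 2) hatch boundary points on the deck plane, e.g. as
    produced by the angular binning of claim 6 below.
    """
    rect = cv2.minAreaRect(boundary_xy.astype(np.float32))  # ((cx, cy), (w, h), angle)
    return cv2.boxPoints(rect)                   # (4, 2) corner point coordinates
```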
6. The door machine detection system according to claim 5, wherein the performing data processing on the projection point cloud data to obtain hatch boundary point cloud data includes:
calculating the geometric center of the projection point cloud data, and calculating, for each projection point, the length of the connection line between that projection point and the geometric center;
performing equal-angle division on the projection point cloud data with the geometric center as the circle center, and determining the coordinates of the projection point with the shortest connection-line length within each divided angle;
and obtaining hatch boundary point cloud data according to the set of coordinates.
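A non-limiting sketch of this angular binning (the bin count is an assumed resolution, not a value from the claims):

```python
import numpy as np

def boundary_points(proj_xy, n_bins=360):
    """Per equal angle bin around the geometric center, keep the
    projection point with the shortest connection line.

    proj_xy: (N, 2) projected points on the deck plane.
    """
    center = proj_xy.mean(axis=0)                # geometric center
    rel = proj_xy - center
    dist = np.linalg.norm(rel, axis=1)           # connection-line lengths
    ang = np.arctan2(rel[:, 1], rel[:, 0])       # angle of each projection point
    bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    keep = []
    for b in range(n_bins):
        idx = np.where(bins == b)[0]
        if idx.size:                             # shortest line within this bin
            keep.append(proj_xy[idx[np.argmin(dist[idx])]])
    return np.asarray(keep)                      # hatch boundary point cloud data
```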
7. The door machine detection system according to claim 4, wherein the detecting the cabin according to the three-dimensional point cloud map further comprises:
detecting the height of the cover plate according to the hatch boundary;
extracting the cabin point cloud data above the cover plate height to obtain obstacle point cloud data;
and carrying out cluster analysis on the obstacle point cloud data to obtain coordinate information and size information of each obstacle on the cabin.
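Claim 7 does not name a clustering algorithm; DBSCAN is one common choice, sketched below with assumed eps/min_points values:

```python
import numpy as np
import open3d as o3d

def obstacles_on_cabin(above_cover_pts, eps=0.5, min_points=10):
    """Cluster above-cover points into obstacles and report each
    cluster's center coordinates and bounding-box size."""
    pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(above_cover_pts))
    labels = np.asarray(pcd.cluster_dbscan(eps=eps, min_points=min_points))
    out = []
    for k in range(labels.max() + 1):            # label -1 marks noise points
        pts = above_cover_pts[labels == k]
        out.append({"center": pts.mean(axis=0),
                    "size": pts.max(axis=0) - pts.min(axis=0)})
    return out
```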
8. The door machine detection system according to claim 2, wherein the detecting the ground materials according to the three-dimensional point cloud map comprises:
acquiring ground material point cloud data according to the three-dimensional point cloud map and a preset ground detection area;
extracting ground point cloud data from the ground material point cloud data based on a preset plane extraction algorithm, and determining a ground plane according to the ground point cloud data;
and performing equal division processing on the ground material point cloud data above the ground plane according to preset material detection areas, and calculating the highest point and the lowest point of each preset material detection area.
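A hedged sketch of the per-region extremes of claim 8; the region counts, ranges, and function name are assumptions:

```python
import numpy as np

def region_extremes(material_pts, x_range, y_range, nx=4, ny=4):
    """Equally divide above-ground material points into nx * ny detection
    regions and return each region's highest and lowest z."""
    xs = np.linspace(x_range[0], x_range[1], nx + 1)
    ys = np.linspace(y_range[0], y_range[1], ny + 1)
    result = {}
    for i in range(nx):
        for j in range(ny):
            m = ((material_pts[:, 0] >= xs[i]) & (material_pts[:, 0] < xs[i + 1]) &
                 (material_pts[:, 1] >= ys[j]) & (material_pts[:, 1] < ys[j + 1]))
            if m.any():
                z = material_pts[m, 2]
                result[(i, j)] = (z.max(), z.min())   # highest, lowest point
    return result
```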
9. The door machine detection system according to claim 2, wherein the door machine further comprises a grapple connected to the trunk bridge by a rope, and the computing processing unit is further configured to obtain real-time frame point cloud data of the at least one laser radar;
the target to be detected further comprises the grapple, and the detecting the target to be detected according to the three-dimensional point cloud map further comprises: and tracking the grapple according to the real-time frame point cloud data and the three-dimensional point cloud map.
10. The door machine detection system according to claim 9, wherein the tracking the grapple according to the real-time frame point cloud data and the three-dimensional point cloud map comprises:
performing differential calculation on the real-time frame point cloud data and the three-dimensional point cloud map to obtain differential point cloud data, wherein the differential point cloud data comprises grapple point cloud data and rope point cloud data;
based on a preset point cloud registration algorithm, matching the differential point cloud data by using a preset grapple point cloud template;
if the matching succeeds, outputting the grapple point cloud data;
and tracking the center coordinates and the moving speed of the grapple according to the grapple point cloud data.
11. The door machine detection system according to claim 10, wherein the performing differential computation on the real-time frame point cloud data and the three-dimensional point cloud map to obtain the differential point cloud data comprises:
performing grid division on the real-time frame point cloud data and the three-dimensional point cloud map according to the same standard;
traversing all grids, and if a point exists in a grid of the real-time frame point cloud data while no point exists in the corresponding grid of the three-dimensional point cloud map, taking that point as a differential point;
and obtaining the differential point cloud data from the set of differential points.
12. The door machine detection system according to claim 10, wherein the tracking the center coordinates and the moving speed of the grapple according to the grapple point cloud data comprises:
calculating the center coordinates of the grapple according to the grapple point cloud data;
and tracking the center coordinates and the moving speed of the grapple by using a preset filtering algorithm.
13. The door machine detection system according to any one of claims 1 to 12, wherein the performing external parameter calibration on the three-dimensional point cloud data and the combined inertial navigation data comprises:
calculating the calibrated external parameters through a preset calibration algorithm.
14. The door machine detection system according to claim 13, wherein the computing processing unit is further configured to initialize the three-dimensional point cloud data, the combined inertial navigation data, the calibrated external parameters, and the preset resolution before equally dividing the heading angle of the combined inertial navigation data according to the preset resolution to obtain the N sub-heading angles.
15. The door machine detection system according to any one of claims 1 to 12, wherein the combined inertial navigation data include inertial navigation coordinate information, and the calculating the rotation center of the door machine according to the combined inertial navigation data comprises: calculating the rotation center of the door machine according to the inertial navigation coordinate information.
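By way of a non-limiting illustration of claim 15 (and the rotation-center computation recited in claim 1), a least-squares circle fit over the inertial navigation positions logged during the constant-amplitude horizontal rotation could look as follows; the Kasa formulation, two-dimensional input, and function name are assumptions of this sketch:

```python
import numpy as np

def rotation_center(ins_xy):
    """Kasa circle fit: solve x^2 + y^2 = 2*cx*x + 2*cy*y + c in the
    least-squares sense; (cx, cy) approximates the rotation center.

    ins_xy: (N, 2) horizontal inertial navigation coordinates.
    """
    A = np.hstack([2 * ins_xy, np.ones((len(ins_xy), 1))])
    b = (ins_xy ** 2).sum(axis=1)                # x^2 + y^2 per sample
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx ** 2 + cy ** 2)      # slewing radius, if needed
    return np.array([cx, cy]), radius
```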
CN202311092301.7A · filed 2023-08-29 · Door machine detecting system · granted as CN116817904B · status: Active

Priority Applications (1)

CN202311092301.7A (CN116817904B) · priority and filing date 2023-08-29 · Door machine detecting system


Publications (2)

CN116817904A (en) · published 2023-09-29
CN116817904B (en) · granted 2023-11-10


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1350716A1 (en) * 2002-04-05 2003-10-08 Coeclerici Logistics S.p.A. Floating facility equipped with cranes
CN113538566A (en) * 2021-07-15 2021-10-22 武汉港迪智能技术有限公司 Cargo ship hatch position obtaining method and system based on laser radar
CN114581606A (en) * 2022-02-23 2022-06-03 武汉港迪智能技术有限公司 Bulk door machine cabin based on 3D laser radar and modeling method for materials in cabin
CN114988052A (en) * 2022-04-20 2022-09-02 北京汇力智能科技有限公司 Automatic compensation method and device in dynamic ship unloading, storage medium and electronic equipment
CN115731459A (en) * 2022-10-19 2023-03-03 唐山港集团港机船舶维修有限公司 Hatch edge identification method for large port machinery
CN116359929A (en) * 2023-02-21 2023-06-30 国能黄骅港务有限责任公司 Cabin positioning and scanning identification method, system and storage medium for cabin materials




Legal Events

PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant