CN110793512A - Pose recognition method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN110793512A
Authority
CN
China
Prior art keywords
point cloud
pose
cloud data
target
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910860105.7A
Other languages
Chinese (zh)
Inventor
曹金
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Bingtong Intelligent Technology Co Ltd
Original Assignee
Shanghai Bingtong Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Bingtong Intelligent Technology Co Ltd filed Critical Shanghai Bingtong Intelligent Technology Co Ltd
Priority to CN201910860105.7A priority Critical patent/CN110793512A/en
Publication of CN110793512A publication Critical patent/CN110793512A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 Photometry, e.g. photographic exposure meter
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Automation & Control Theory (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

Embodiments of the invention relate to the technical field of automatic navigation equipment control, and disclose a pose recognition method, a pose recognition apparatus, an electronic device and a storage medium. The method comprises the following steps: acquiring point cloud data, where the point cloud data comprises a distance measurement value and a light intensity value from each point to a sensing device of the automatic navigation equipment, and the distance measurement value is used for calculating the coordinate information of each point; identifying a target point cloud from the point cloud data according to the light intensity values, where the target point cloud is used for calculating the pose of a target object; and calculating the pose of the target object according to the coordinate information of the points in the target point cloud and outputting the pose, so that the automatic navigation equipment navigates according to the output pose. Embodiments of the invention can reduce the cost and implementation difficulty of automatic navigation control of AGVs and the like, which helps promote the development of AGV applications.

Description

Pose recognition method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of control of automatic navigation equipment, in particular to a pose identification method, a pose identification device, electronic equipment and a storage medium.
Background
With the development of science and technology, AGVs (Automated Guided Vehicles) are used ever more widely in production and daily life. In the field of industrial AGV transportation, the AGV chassis generally needs to move accurately to a docking station in order to load and unload goods.
The inventors found that the related art has at least the following problem: for an AGV to reach a target station accurately, the pose of the target station must first be identified accurately; however, existing pose recognition methods are generally costly and complex to implement, which is not conducive to the development of AGV applications.
Disclosure of Invention
Embodiments of the invention aim to provide a pose recognition method and apparatus, an electronic device and a storage medium, so as to reduce the cost and implementation difficulty of automatic navigation control of AGVs and promote the development of AGV applications.
In order to solve the above technical problem, an embodiment of the present invention provides a pose identification method applied to an automatic navigation device, where the method includes:
acquiring point cloud data; the point cloud data comprises a distance measurement value and a light intensity value from each point to a sensing device of the automatic navigation equipment, and the distance measurement value is used for calculating coordinate information of each point;
identifying and obtaining target point cloud from the point cloud data according to the light intensity value; wherein the target point cloud is used for calculating the pose of a target object;
calculating the pose of the target object according to the coordinate information of the points in the target point cloud and outputting the pose; and the automatic navigation equipment navigates according to the output pose.
The embodiment of the invention also provides a pose recognition device, which is applied to automatic navigation equipment, and the device comprises:
the acquisition module is used for acquiring point cloud data; the point cloud data comprises a distance measurement value and a light intensity value from each point to a sensing device of the automatic navigation equipment, and the distance measurement value is used for calculating coordinate information of each point;
the point cloud identification module is used for identifying target point cloud from the point cloud data according to the light intensity value; wherein the target point cloud is used for calculating the pose of a target object;
the calculation module is used for calculating the pose of the target object according to the coordinate information of the points in the target point cloud and outputting the pose; and the automatic navigation equipment navigates according to the output pose.
An embodiment of the present invention also provides an electronic device, including: a memory storing a computer program and a processor running the computer program to implement the pose identification method as described above.
Embodiments of the present invention also provide a storage medium storing a computer-readable program for causing a computer to execute the pose identification method described above.
Compared with the prior art, embodiments of the invention identify the target point cloud according to the light intensity values of the point cloud data and calculate the pose of the target object from the coordinate information of the points in the target point cloud, thereby enabling automatic navigation of the automatic navigation equipment. The light intensity values and coordinate information of the point cloud data are inexpensive to acquire, and the calculations involved in pose recognition are easy to implement, which helps promote the development of AGV applications.
As an embodiment, the identifying and obtaining the target point cloud from the point cloud data according to the light intensity value specifically includes:
clustering the point cloud data according to distances in geometric space, and grouping the point cloud data into clusters according to the clustering result;
and matching geometric features corresponding to the clustered point cloud data with preset geometric features to identify the target point cloud. Grouping the point cloud data into clusters makes it convenient to remove interfering point cloud data.
As an embodiment, the sensing device is a laser radar, the preset geometric features are geometric features of a rectangular reflector, the rectangular reflector is arranged on the target object and has the same height as the laser radar, and the automatic navigation equipment acquires the point cloud data through the laser radar;
the method for identifying the target point cloud by matching the geometric features corresponding to the clustered point cloud data with preset geometric features specifically comprises the following steps:
and identifying the point cloud data of which the physical length corresponding to the clustered point cloud data is matched with the length of the reflector as the target point cloud. The length of the reflector is matched with the physical length corresponding to the clustered point cloud data, so that the target point cloud can be conveniently and accurately identified.
As an embodiment, before the clustering the point cloud data according to the distance of the geometric space, the method further includes:
and filtering point cloud data of which the light intensity value is smaller than a preset light intensity threshold value in the acquired point cloud data. Therefore, the method is beneficial to filtering out interference point cloud data in the environment where the target object is located and reducing the calculated amount of pose identification.
As one embodiment, the pose includes target coordinates and a direction;
the calculating according to the coordinate information of the points in the target point cloud to obtain and output the pose of the target object specifically comprises:
calculating the center of gravity of the target point cloud to obtain the target coordinates;
and fitting according to the target point cloud to obtain a straight line where the reflector is located, and taking the normal direction of the straight line as the direction.
As an embodiment, after the straight line on which the reflector lies is obtained by fitting the target point cloud and the normal direction of the straight line is taken as the direction, the method further includes:
calculating to obtain error parameters of the pose;
and determining whether the error parameter of the pose meets a preset condition, and if the error parameter meets the preset condition, outputting the pose.
As one embodiment, the error parameter is the sum of squares of the distances from each point in the target point cloud to the fitted straight line;
the determining whether the error parameter of the pose meets a preset condition specifically comprises:
if the sum of squares is smaller than a preset sum of squares threshold, determining that the error parameter meets a preset condition;
and if the square sum is greater than or equal to the preset square sum threshold value, determining that the error parameter does not meet the preset condition. Therefore, the accuracy of the pose calculation result can be accurately evaluated, and the automatic navigation accuracy of the AGV is improved.
Drawings
Fig. 1 is a flowchart of a pose identification method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of an automatic navigation device acquiring point cloud data according to a first embodiment of the present invention;
fig. 3 is a flowchart of a pose identification method according to a second embodiment of the present invention;
fig. 4 is a schematic structural view of a pose recognition apparatus according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to a fourth embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in order to provide a better understanding of the present invention; however, the technical solutions claimed in the present invention can be implemented without these technical details, and various changes and modifications may be made based on the following embodiments.
The first embodiment of the invention relates to a pose recognition method applied to automatic navigation equipment, including but not limited to automated guided vehicles. The method comprises the following steps: acquiring point cloud data, where the point cloud data comprises a distance measurement value and a light intensity value from each point to a sensing device of the automatic navigation equipment, and the distance measurement value is used for calculating the coordinate information of each point; identifying a target point cloud from the point cloud data according to the light intensity values, where the target point cloud is used for calculating the pose of the target object; the pose of the target object has three degrees of freedom (x, y, theta), where (x, y) are the target coordinates of the target object and theta is the direction of the target object; and calculating the pose of the target object according to the coordinate information of the points in the target point cloud and outputting the pose, so that the automatic navigation equipment navigates according to the output pose. Compared with the prior art, embodiments of the invention identify the target point cloud according to the light intensity values of the point cloud data and calculate the pose of the target object from the coordinate information of the points in the target point cloud, thereby enabling automatic navigation of automated guided vehicles and the like. The light intensity values and coordinate information of the point cloud data are inexpensive to acquire, and the calculations involved in pose recognition are easy to implement, which helps promote the development of AGV applications.
The pose recognition method according to the present embodiment will be described in detail below with reference to fig. 1. The method includes steps 101 to 107.
Step 101: Acquiring point cloud data.
As shown in fig. 2, the sensing device of the automated guided vehicle includes a laser radar; the laser radar 2 may be disposed on a chassis 1 of the automated guided vehicle, and a reflector 3 is disposed on the target object. The position of the target object may be fixed, such as a workstation at a fixed position, or the target object may be movable. When the automated guided vehicle docks with the target object, the pose of the target object needs to be identified accurately. The reflector may be rectangular, and the rectangular reflector may be mounted at the same height as the laser radar so that the point cloud data can be conveniently acquired through the laser radar. It should be understood that this embodiment places no particular limitation on the position of the laser radar, the shape and position of the reflector, and the like. The laser radar can scan at a certain frequency to obtain the point cloud data. The point cloud data comprises a distance value and a light intensity value from each point to the laser radar of the automated guided vehicle, and the distance value is used for calculating the coordinate information of each point.
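For illustration only (this code is not part of the original disclosure), a minimal Python sketch of turning a 2D laser scan into point cloud data with coordinates and intensity might look as follows; the function name, the (N, 3) array layout of [x, y, intensity], and the assumption of a fixed angular increment per beam are choices made here, not taken from the patent:

```python
import numpy as np

def scan_to_point_cloud(ranges, intensities, angle_min, angle_increment):
    """Return an (N, 3) array of [x, y, intensity] in the lidar frame.

    Each beam contributes one point; its coordinates are computed from the
    measured distance and the beam angle, and its intensity is kept alongside.
    """
    ranges = np.asarray(ranges, dtype=float)
    intensities = np.asarray(intensities, dtype=float)
    angles = angle_min + angle_increment * np.arange(len(ranges))
    x = ranges * np.cos(angles)   # coordinate information derived from the distance values
    y = ranges * np.sin(angles)
    return np.column_stack([x, y, intensities])
```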
Step 102: Filtering out the point cloud data whose light intensity value is smaller than a preset light intensity threshold from the acquired point cloud data.
In practical applications, the preset light intensity threshold can be determined according to the light intensity values of the point cloud data generated by the reflector. The reflector is generally made of a material with high reflection intensity, so the light intensity values of the point cloud data formed by the reflector are larger than those of the point cloud data of most objects in the environment. The point cloud data obtained in step 101 may include not only the point cloud data generated by reflection from the reflector (hereinafter also referred to as the target point cloud) but also point cloud data of the environment in which the target object is located. Because the reflectivity of the reflector is high, the light intensity values of the point cloud data generated by the reflector are large, while those of the point cloud data generated by the environment are small; filtering out the point cloud data whose light intensity value is smaller than the preset light intensity threshold therefore eliminates interfering information from the acquired point cloud data, which helps reduce the amount of computation for pose recognition and improves its efficiency.
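As a hedged illustration of this filtering step (again not from the patent), the sketch below keeps only points whose intensity reaches an assumed threshold; the threshold value is hypothetical and would in practice be tuned to the reflector material and the lidar's intensity scale:

```python
import numpy as np

INTENSITY_THRESHOLD = 200.0  # hypothetical preset light intensity threshold

def filter_by_intensity(cloud, threshold=INTENSITY_THRESHOLD):
    """Keep only points whose intensity (third column) is at least the threshold."""
    cloud = np.asarray(cloud)
    return cloud[cloud[:, 2] >= threshold]
```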
Step 103: Clustering the point cloud data according to distances in geometric space, and grouping the point cloud data into clusters according to the clustering result.
Specifically, clustering is performed according to the distances between different points in physical space, and the clustering algorithm may adopt algorithms known in the art, such as KD-Tree or K-means, which are not described again here. Clustering distinguishes the point cloud data formed by different high-reflection-intensity sources in the filtered point cloud data. The clustered point cloud data may be divided into one or more clusters, depending on the acquired point cloud data and the clustering algorithm. For example, when the filtered point cloud data includes point cloud data formed by a reflection source A and by the reflector, clustering may divide it into a cluster of point cloud data corresponding to reflection source A and a cluster of point cloud data corresponding to the reflector.
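For illustration, one simple way to group a scan-ordered point cloud by geometric distance is to split it wherever the gap between neighbouring points exceeds a limit; this is only a sketch, and a KD-Tree or K-means based clustering as mentioned above could be substituted. The `max_gap` value is an assumption:

```python
import numpy as np

def cluster_by_distance(cloud, max_gap=0.10):
    """Split a scan-ordered (N, 3) cloud into clusters wherever consecutive
    points are farther apart than max_gap (in metres)."""
    cloud = np.asarray(cloud)
    if len(cloud) == 0:
        return []
    clusters, current = [], [cloud[0]]
    for prev, point in zip(cloud[:-1], cloud[1:]):
        if np.linalg.norm(point[:2] - prev[:2]) > max_gap:
            clusters.append(np.array(current))  # close the current cluster at the gap
            current = []
        current.append(point)
    clusters.append(np.array(current))
    return clusters
```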
Step 104: Matching the geometric features corresponding to the clustered point cloud data with preset geometric features to identify the target point cloud.
The preset geometric features are specifically geometric features of the rectangular reflector, such as the length of the reflector; correspondingly, the geometric feature of the clustered point cloud data is the length of the geometric shape corresponding to that point cloud data, which can be calculated from the distances between its points. Step 104 specifically comprises identifying, as the target point cloud, the clustered point cloud data whose corresponding physical length matches the length of the reflector. Here, matching may mean that the ratio of the absolute value of the difference between the physical length (i.e. the actual length) of the reflector and the measured length corresponding to the point cloud data to the physical length of the reflector is within a preset range, for example |physical length - measured length| / physical length < 0.1, that is, the relative deviation between the measured value and the actual value is less than 10%. For example, the physical length of the reflector may be 100 cm; when the measured length corresponding to a clustered point cloud is 95 cm, that cluster may be determined to be the target point cloud. It should be understood that in practical applications the geometric features may also include dimensions in two directions, for example matching both the length and the width of the reflector. Step 104 therefore further removes point cloud data that does not belong to the target point cloud, which improves the accuracy of pose recognition of the target object and further reduces the amount of computation.
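A minimal sketch of this length check, assuming the clusters produced by the previous step and a hypothetical reflector length, might be written as follows; measuring the cluster length as the distance between its two end points is an assumption that relies on the scan ordering:

```python
import numpy as np

REFLECTOR_LENGTH = 1.0   # assumed physical length of the reflector in metres

def find_target_cluster(clusters, physical_length=REFLECTOR_LENGTH, tolerance=0.1):
    """Return the first cluster whose measured length matches the reflector length
    within the relative tolerance described above, or None if no cluster matches."""
    for cluster in clusters:
        xy = cluster[:, :2]
        measured = np.linalg.norm(xy[-1] - xy[0])  # end-to-end length of the scan-ordered cluster
        if abs(physical_length - measured) / physical_length < tolerance:
            return cluster
    return None
```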
Step 105: Calculating the center of gravity of the target point cloud to obtain the target coordinates.
The center of gravity of the target point cloud corresponds to the coordinates of the geometric center of the reflector. Specifically, the coordinate values of all points in the target point cloud may be summed and then divided by the number of points in the target point cloud to obtain the center of gravity of the target point cloud, and thus the target coordinates (x, y).
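For illustration, the center-of-gravity computation reduces to a coordinate mean over the target point cloud (array layout as in the sketches above; not part of the original disclosure):

```python
import numpy as np

def target_coordinates(target_cloud):
    """Centre of gravity of the target point cloud, used as the target coordinates (x, y)."""
    xy = np.asarray(target_cloud)[:, :2]
    return xy.mean(axis=0)
```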
Step 106: Fitting the target point cloud to obtain the straight line on which the reflector lies, and taking the normal direction of the straight line as the direction (theta).
The direction theta is the orientation of the reflector relative to the laser radar, and (x, y, theta) is the pose of the target object to be output.
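The patent does not specify a particular fitting algorithm; as one hedged possibility, a least-squares line fit can supply both the direction theta and the line parameters that the error check of the second embodiment could reuse. The sketch below assumes the reflector is not nearly vertical in the lidar frame (a total-least-squares or RANSAC fit could be substituted to remove that assumption):

```python
import numpy as np

def target_direction(target_cloud):
    """Fit a line y = slope * x + intercept to the target points and return the
    normal direction theta of that line (radians, lidar frame) plus the line parameters."""
    target_cloud = np.asarray(target_cloud)
    x, y = target_cloud[:, 0], target_cloud[:, 1]
    slope, intercept = np.polyfit(x, y, 1)   # least-squares fit of degree 1
    line_angle = np.arctan2(slope, 1.0)      # direction of the fitted line
    theta = line_angle + np.pi / 2.0         # normal of the line, taken as the direction
    return theta, slope, intercept
```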
Step 107: Outputting the pose.
That is, (x, y, theta) is output, and the automated guided vehicle can automatically control its travelling direction according to the output pose so as to dock accurately with the target object.
It should be noted that, in practical applications, the automated guided vehicle may also obtain the point cloud data through devices such as a depth camera, and accordingly the reflector may be replaced by, for example, a two-dimensional code. In step 101, after the point cloud data is obtained by scanning, color information such as RGB (red, green, blue) values in the point cloud data may be removed before pose recognition is performed.
Compared with the prior art, this embodiment acquires the point cloud data through the laser radar and the reflector, filters and clusters the point cloud data using the light intensity values and distance information in the point cloud data to identify the target point cloud, and then calculates the pose of the target object from the target point cloud. The requirements on hardware are therefore low and the calculations are easy to implement, which reduces the control cost of AGVs and promotes the development of AGV applications.
The second embodiment of the present invention relates to a pose recognition method that is further improved on the basis of the first embodiment. The main improvement is as follows: in this embodiment the accuracy of the calculated pose is evaluated, and only poses of sufficient accuracy are output, thereby improving the control accuracy and efficiency of the AGV.
As shown in fig. 3, the pose recognition method of this embodiment includes steps 301 to 307. Step 301 is the same as step 101 of the first embodiment, step 302 corresponds to steps 102 to 104 of the first embodiment, and steps 303 and 304 correspond to steps 105 and 106 of the first embodiment, respectively; these are not described again here.
Step 305: Calculating the error parameter of the pose.
Step 306: Determining whether the error parameter of the pose meets a preset condition; if so, executing step 307, and if not, re-executing steps 301 to 306 until the error parameter meets the preset condition.
In this embodiment, the error parameter may be the sum of squares of the distances from each point in the target point cloud to the straight line on which the reflector lies. Step 306 specifically comprises: if the sum of squares is smaller than a preset sum-of-squares threshold, determining that the error parameter meets the preset condition (i.e. the pose accuracy is good); if the sum of squares is greater than or equal to the preset sum-of-squares threshold, determining that the error parameter does not meet the preset condition. In practical applications, the distance between the laser radar and the reflector may be large, or the interference in the environment may be strong, so that the deviation of the pose calculation result is large. By evaluating the accuracy of the pose, this embodiment can discard pose data of poor accuracy and recalculate to obtain pose data of good accuracy, which improves the docking precision between the AGV and the target object.
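For illustration, and reusing the line parameters from the fitting sketch above, the sum-of-squares error and the acceptance test might be written as follows; the threshold value is an assumption, not a value from the patent:

```python
import numpy as np

ERROR_THRESHOLD = 1e-3   # assumed preset sum-of-squares threshold

def pose_error(target_cloud, slope, intercept):
    """Sum of squared perpendicular distances from each target point to the
    fitted line slope * x - y + intercept = 0."""
    target_cloud = np.asarray(target_cloud)
    x, y = target_cloud[:, 0], target_cloud[:, 1]
    distances = np.abs(slope * x - y + intercept) / np.sqrt(slope ** 2 + 1.0)
    return float(np.sum(distances ** 2))

def pose_is_acceptable(target_cloud, slope, intercept, threshold=ERROR_THRESHOLD):
    """Accept the pose only when the error parameter is below the threshold."""
    return pose_error(target_cloud, slope, intercept) < threshold
```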
Step 307: Outputting the pose.
Compared with the prior art, this embodiment acquires the point cloud data through the laser radar and the reflector, filters and clusters the point cloud data using the light intensity values and distance information in the point cloud data to identify the target point cloud, and then calculates the pose of the target object from the target point cloud. The requirements on hardware are therefore low and the calculations are easy to implement, which reduces the control cost of AGVs and promotes the development of AGV applications. In addition, this embodiment evaluates the calculated pose, so that pose data of poor accuracy can be discarded and the AGV navigates according to pose data of good accuracy, which improves the docking precision between the AGV and the target object.
A third embodiment of the present invention relates to a pose recognition apparatus applied to an automatic navigation device. Referring to fig. 4, the pose recognition apparatus 400 according to the present embodiment includes:
an obtaining module 402, configured to obtain point cloud data, where the point cloud data includes a distance value and a light intensity value from each point to a sensing device of an automatic navigation apparatus, and the distance value is used to calculate coordinate information of each point;
a point cloud identification module 404, configured to identify a target point cloud from the point cloud data according to the light intensity value; the target point cloud is used for calculating the pose of the target object; and
the calculation module 406 is configured to calculate and obtain a pose of the target object according to the coordinate information of the point in the target point cloud and output the pose; and the automatic navigation equipment navigates according to the output pose.
A fourth embodiment of the present invention relates to an electronic apparatus. As shown in fig. 5, the electronic apparatus includes: a memory 502 and a processor 501;
The memory 502 stores instructions executable by the processor 501; when executed, the instructions implement: acquiring point cloud data, where the point cloud data comprises a distance measurement value and a light intensity value from each point to a sensing device of the automatic navigation equipment, and the distance measurement value is used for calculating the coordinate information of each point; identifying a target point cloud from the point cloud data according to the light intensity values, where the target point cloud is used for calculating the pose of the target object; and calculating the pose of the target object according to the coordinate information of the points in the target point cloud and outputting the pose, so that the automatic navigation equipment navigates according to the output pose.
The electronic device includes one or more processors 501 and a memory 502, and one processor 501 is taken as an example in fig. 5. The processor 501 and the memory 502 may be connected by a bus or other means, and fig. 5 illustrates the connection by the bus as an example. Memory 502, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The processor 501 executes various functional applications and data processing of the apparatus by running nonvolatile software programs, instructions, and modules stored in the memory 502, that is, implements the above-described pose recognition method.
The memory 502 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function, and the like. Further, the memory 502 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, memory 502 may optionally include memory located remotely from processor 501, which may be connected to an external device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more modules are stored in the memory 502, and when executed by the one or more processors 501, perform the pose recognition method of any of the above-described method embodiments.
The above device can execute the method provided by the embodiments of the present invention and has the corresponding functional modules and beneficial effects. For technical details not described in detail in this embodiment, reference may be made to the method provided by the embodiments of the present invention.
Compared with the prior art, this embodiment acquires the point cloud data through the laser radar and the reflector, filters and clusters the point cloud data using the light intensity values and distance information in the point cloud data to identify the target point cloud, and then calculates the pose of the target object from the target point cloud. The requirements on hardware are therefore low and the calculations are easy to implement, which reduces the control cost of AGVs and promotes the development of AGV applications. In addition, this embodiment evaluates the calculated pose, so that pose data of poor accuracy can be discarded and the AGV navigates according to pose data of good accuracy, which improves the docking precision between the AGV and the target object.
A fifth embodiment of the invention is directed to a non-volatile storage medium storing a computer-readable program for causing a computer to perform some or all of the above method embodiments.
That is, those skilled in the art will understand that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware. The program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples of carrying out the invention, and that various changes in form and detail may be made in practice without departing from the spirit and scope of the invention.

Claims (10)

1. A pose recognition method is applied to automatic navigation equipment and is characterized by comprising the following steps:
acquiring point cloud data; the point cloud data comprises a distance measurement value and a light intensity value from each point to a sensing device of the automatic navigation equipment, and the distance measurement value is used for calculating coordinate information of each point;
identifying and obtaining target point cloud from the point cloud data according to the light intensity value; wherein the target point cloud is used for calculating the pose of a target object;
calculating the pose of the target object according to the coordinate information of the points in the target point cloud and outputting the pose; and the automatic navigation equipment navigates according to the output pose.
2. The pose identification method according to claim 1, wherein the identifying a target point cloud from the point cloud data according to a light intensity value specifically comprises:
clustering the point cloud data according to distances in geometric space, and grouping the point cloud data into clusters according to the clustering result;
and matching the geometric features corresponding to the clustered point cloud data with preset geometric features to identify the target point cloud.
3. The pose identification method according to claim 2, wherein the sensing device is a laser radar, the preset geometric features are geometric features of a rectangular reflector, the rectangular reflector is arranged on the target object and is as high as the laser radar, and the automatic navigation equipment acquires the point cloud data through the laser radar;
the method for identifying the target point cloud by matching the geometric features corresponding to the clustered point cloud data with preset geometric features specifically comprises the following steps:
and identifying the point cloud data of which the physical length corresponding to the clustered point cloud data is matched with the length of the reflector as the target point cloud.
4. The pose recognition method according to claim 2 or 3, wherein before the clustering the point cloud data by the distance in the geometric space, the method further comprises:
and filtering point cloud data of which the light intensity value is smaller than a preset light intensity threshold value in the acquired point cloud data.
5. The pose recognition method according to claim 3, wherein the pose contains target coordinates and an orientation;
the calculating according to the coordinate information of the points in the target point cloud to obtain and output the pose of the target object specifically comprises:
calculating the gravity center of the target point cloud to obtain the target coordinate;
and fitting according to the target point cloud to obtain a straight line where the reflector is located, and taking the normal direction of the straight line as the direction.
6. The pose recognition method according to claim 5, wherein after the straight line where the reflector is located is obtained according to the target point cloud fitting, and a normal direction of the straight line is taken as the direction, the method further comprises:
calculating to obtain error parameters of the pose;
and determining whether the error parameter of the pose meets a preset condition, and if the error parameter meets the preset condition, outputting the pose.
7. The pose identification method according to claim 6, wherein the error parameter is a sum of squares of distances from each point in the target point cloud to the straight line fitted;
the determining whether the error parameter of the pose meets a preset condition specifically comprises:
if the sum of squares is smaller than a preset sum of squares threshold, determining that the error parameter meets a preset condition;
and if the square sum is greater than or equal to the preset square sum threshold value, determining that the error parameter does not meet the preset condition.
8. A pose recognition apparatus applied to an automatic navigation device, the apparatus comprising:
the acquisition module is used for acquiring point cloud data; the point cloud data comprises a distance measurement value and a light intensity value from each point to a sensing device of the automatic navigation equipment, and the distance measurement value is used for calculating coordinate information of each point;
the point cloud identification module is used for identifying target point cloud from the point cloud data according to the light intensity value; wherein the target point cloud is used for calculating the pose of a target object;
the calculation module is used for calculating the pose of the target object according to the coordinate information of the points in the target point cloud and outputting the pose; and the automatic navigation equipment navigates according to the output pose.
9. An electronic device, comprising: a memory and a processor, the memory storing a computer program that is executed by the processor to implement the pose identification method according to any one of claims 1 to 7.
10. A storage medium characterized by storing a computer-readable program for causing a computer to execute a pose identification method according to any one of claims 1 to 7.
CN201910860105.7A 2019-09-11 2019-09-11 Pose recognition method and device, electronic equipment and storage medium Pending CN110793512A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910860105.7A CN110793512A (en) 2019-09-11 2019-09-11 Pose recognition method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910860105.7A CN110793512A (en) 2019-09-11 2019-09-11 Pose recognition method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110793512A true CN110793512A (en) 2020-02-14

Family

ID=69427100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910860105.7A Pending CN110793512A (en) 2019-09-11 2019-09-11 Pose recognition method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110793512A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180259647A1 (en) * 2015-11-16 2018-09-13 Panasonic Intellectual Property Management Co., Ltd. Imaging device and solid-state imaging element used in same
CN109211236A (en) * 2017-06-30 2019-01-15 沈阳新松机器人自动化股份有限公司 navigation locating method, device and robot
CN109285188A (en) * 2017-07-21 2019-01-29 百度在线网络技术(北京)有限公司 Method and apparatus for generating the location information of target object
CN110196429A (en) * 2018-04-02 2019-09-03 北京航空航天大学 Vehicle target recognition methods, storage medium, processor and system
CN108921865A (en) * 2018-06-27 2018-11-30 南京大学 A kind of jamproof sub-pix line fitting method
CN109084738A (en) * 2018-07-06 2018-12-25 上海宾通智能科技有限公司 A kind of height-adjustable calibration system and scaling method
CN108955666A (en) * 2018-08-02 2018-12-07 苏州中德睿博智能科技有限公司 A kind of hybrid navigation method, apparatus and system based on laser radar and reflector
CN110188687A (en) * 2019-05-30 2019-08-30 爱驰汽车有限公司 Landform recognition methods, system, equipment and the storage medium of automobile

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111366896A (en) * 2020-03-05 2020-07-03 三一机器人科技有限公司 Method and device for detecting reflective column, electronic equipment and readable storage medium
CN111352118A (en) * 2020-03-25 2020-06-30 三一机器人科技有限公司 Method and device for matching reflecting columns, laser radar positioning method and equipment terminal
CN111352118B (en) * 2020-03-25 2022-06-21 三一机器人科技有限公司 Method and device for matching reflecting columns, laser radar positioning method and equipment terminal
CN113470111A (en) * 2020-03-31 2021-10-01 纳恩博(北京)科技有限公司 Positioning method, positioning device, positioning apparatus, and positioning medium
CN111533051B (en) * 2020-05-08 2021-12-17 三一机器人科技有限公司 Tray pose detection method and device, forklift and freight system
CN111533051A (en) * 2020-05-08 2020-08-14 三一机器人科技有限公司 Tray pose detection method and device, forklift and freight system
CN111596299A (en) * 2020-05-19 2020-08-28 三一机器人科技有限公司 Light reflection column tracking and positioning method and device and electronic equipment
CN112034481A (en) * 2020-09-02 2020-12-04 亿嘉和科技股份有限公司 Automatic cable identification method based on reflective sticker and laser radar
CN112034481B (en) * 2020-09-02 2024-08-02 亿嘉和科技股份有限公司 Automatic cable identification method based on reflective sticker and laser radar
WO2022078467A1 (en) * 2020-10-14 2022-04-21 深圳市杉川机器人有限公司 Automatic robot recharging method and apparatus, and robot and storage medium
CN112818715A (en) * 2020-12-31 2021-05-18 北京云迹科技有限公司 Pose identification method and device, electronic equipment and storage medium
CN112731445A (en) * 2021-01-04 2021-04-30 上海木蚁机器人科技有限公司 Goods shelf identification and positioning method, system and device and readable storage medium
CN112784799A (en) * 2021-02-01 2021-05-11 三一机器人科技有限公司 AGV (automatic guided vehicle) backward pallet and obstacle identification method and device and AGV
CN113281775A (en) * 2021-05-26 2021-08-20 珠海市一微半导体有限公司 Charging seat positioning method based on laser scanning information, chip and robot
CN113281775B (en) * 2021-05-26 2024-07-09 珠海一微半导体股份有限公司 Charging seat positioning method based on laser scanning information, chip and robot
CN113624225A (en) * 2021-09-15 2021-11-09 沈阳飞机设计研究所扬州协同创新研究院有限公司 Pose calculation method for mounting engine positioning pin
CN113624225B (en) * 2021-09-15 2023-07-04 沈阳飞机设计研究所扬州协同创新研究院有限公司 Pose resolving method for mounting engine positioning pins
CN113689504A (en) * 2021-10-25 2021-11-23 上海仙工智能科技有限公司 Point cloud accurate positioning method and device based on describable shape and storage medium
CN114266266A (en) * 2021-11-12 2022-04-01 上海宾通智能科技有限公司 Simple and quick composite two-dimensional code identification system
CN114266266B (en) * 2021-11-12 2023-10-20 上海宾通智能科技有限公司 Composite two-dimensional code recognition system
WO2024051054A1 (en) * 2022-09-08 2024-03-14 劢微机器人科技(深圳)有限公司 Transportation obstacle avoidance method, apparatus and device for agv, and storage medium

Similar Documents

Publication Publication Date Title
CN110793512A (en) Pose recognition method and device, electronic equipment and storage medium
CN111797734B (en) Vehicle point cloud data processing method, device, equipment and storage medium
EP3489895B1 (en) Industrial vehicles with point fix based localization
WO2022188663A1 (en) Target detection method and apparatus
EP3505868A1 (en) Method and apparatus for adjusting point cloud data acquisition trajectory, and computer readable medium
CN114820391B (en) Point cloud processing-based storage tray detection and positioning method and system
CN110927732A (en) Pose recognition method, electronic device, and storage medium
WO2021082380A1 (en) Laser radar-based pallet recognition method and system, and electronic device
WO2023005384A1 (en) Repositioning method and device for mobile equipment
CN112935703B (en) Mobile robot pose correction method and system for identifying dynamic tray terminal
CN111964680A (en) Real-time positioning method of inspection robot
CN115147333A (en) Target detection method and device
US11668794B2 (en) Sensor calibration
CN117008151A (en) Goods shelf identification method, robot and storage medium
CN116243329A (en) High-precision multi-target non-contact ranging method based on laser radar and camera fusion
CN115453549A (en) Method for extracting environment right-angle point coordinate angle based on two-dimensional laser radar
CN115600118A (en) Tray leg identification method and system based on two-dimensional laser point cloud
CN115236696A (en) Method and device for determining obstacle, electronic equipment and storage medium
CN114228411A (en) Connection control method, device, equipment and storage medium
CN114265083A (en) Robot position identification method and device by using laser radar
CN110455274B (en) AGV initial positioning method and positioning system based on chamfer distance shape matching
CN114688992B (en) Method and device for identifying reflective object, electronic equipment and storage medium
CN114972495A (en) Grabbing method and device for object with pure plane structure and computing equipment
CN113140007B (en) Concentrated point cloud-based set card positioning method and device
CN116468787A (en) Position information extraction method and device of forklift pallet and domain controller

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20200214)