CN112379390A - Pose measurement method, device and system based on heterogeneous data and electronic equipment - Google Patents
Pose measurement method, device and system based on heterogeneous data and electronic equipment
- Publication number
- CN112379390A CN112379390A CN202011296352.8A CN202011296352A CN112379390A CN 112379390 A CN112379390 A CN 112379390A CN 202011296352 A CN202011296352 A CN 202011296352A CN 112379390 A CN112379390 A CN 112379390A
- Authority
- CN
- China
- Prior art keywords
- pose
- target object
- image acquisition
- acquisition device
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The application provides a pose measurement method, apparatus and system based on heterogeneous data, and an electronic device. The electronic device acquires a first pose and spatial position information of a target object through a laser radar, or, when the spatial position information detected by the laser radar indicates that the target object has moved into the image acquisition range of an image acquisition device, obtains a second pose of the target object from the image collected by the image acquisition device. The method therefore combines the large detection range of the laser radar with the high pose-detection accuracy of the image acquisition device, while the spatial position information detected by the laser radar reduces the amount of computation spent on the image collected by the image acquisition device.
Description
Technical Field
The application relates to the field of data processing, in particular to a pose measurement method, a pose measurement device, a pose measurement system and electronic equipment based on heterogeneous data.
Background
In the flight test of an aircraft, the position and attitude of the aircraft in the air reflect, to a certain extent, how sound the aircraft design is and how accurate the flight control is. Capturing the in-air pose information of the aircraft with a non-contact sensing system therefore makes it possible to verify the performance of the aircraft effectively and to find weaknesses in its design.
Existing non-contact sensing systems mainly comprise laser radar and vision measurement systems. A laser radar can cover a large detection range, but the accuracy with which it detects the position and attitude has a certain upper limit. A vision measurement system can capture the pose with higher accuracy, but, limited by its detection principle, its detection range is small.
Therefore, both lidar and vision measurement systems have their own limitations.
Disclosure of Invention
In a first aspect, an embodiment of the present application provides a pose measurement method based on heterogeneous data, applied to an electronic device that is communicatively connected with an image acquisition device and a laser radar. The method includes:
acquiring a first pose and spatial position information of a target object through the laser radar;
judging, according to the spatial position information, whether the target object moves into an image acquisition range of the image acquisition device;
and if the target object moves into the image acquisition range of the image acquisition device, obtaining a second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device.
In one possible embodiment, the first pose is located in a first base coordinate system and the second pose is located in a second base coordinate system; the method further comprises the following steps:
acquiring a first conversion matrix and a second conversion matrix, wherein the first conversion matrix represents a pose conversion relation between the first base coordinate system and a third base coordinate system, and the second conversion matrix represents a pose conversion relation between the second base coordinate system and the third base coordinate system;
acquiring the first pose or the second pose;
and processing the first pose according to a first conversion matrix to obtain a first conversion pose of the target object in a third base coordinate system, or processing the second pose according to a second conversion matrix to obtain a second conversion pose of the target object in the third base coordinate system.
In a possible implementation, the step of obtaining the first transformation matrix and the second transformation matrix includes:
acquiring a first test pose of the same three-dimensional calibration object in the first base coordinate system, a second test pose in the second base coordinate system and a third test pose in the third base coordinate system;
obtaining the first conversion matrix according to the first test pose and the third test pose;
and obtaining the second transformation matrix according to the second test pose and the third test pose.
In one possible embodiment, the image acquisition device is a binocular vision system.
In a possible implementation, the step of obtaining the second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device includes:
determining the target object from the image acquired by the image acquisition device according to the spatial position information;
identifying the target object to obtain the coordinates of the mark points in a preset coordinate system, wherein the preset coordinate system is established based on the mark points;
and processing the coordinates of the mark points in the preset coordinate system through a bundle adjustment algorithm to obtain the second pose.
In a possible implementation, before the acquiring, through the laser radar, of the first pose and spatial position information of the target object, the method further includes:
acquiring point cloud data through the laser radar;
processing the point cloud data through a grid map method to obtain a grid map plane image of the point cloud data;
and traversing the grid map plane image according to the shape and the size of the target object, and judging whether the target object is detected.
In a second aspect, an embodiment of the present application provides a pose measurement apparatus based on heterogeneous data, which includes:
a data acquisition module, configured to acquire a first pose and spatial position information of a target object through the laser radar;
a data processing module, configured to judge, according to the spatial position information, whether the target object moves into an image acquisition range of the image acquisition device;
the data acquisition module is further configured to, if the target object moves into the image acquisition range of the image acquisition device, obtain a second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device.
In a third aspect, an embodiment of the present application provides a pose measurement system based on heterogeneous data, where the pose measurement system based on heterogeneous data includes an electronic device, an image acquisition device, and a laser radar;
the laser radar sends the collected point cloud data to the electronic equipment;
the electronic equipment acquires a first pose and spatial position information of a target object according to the point cloud data, and judges, according to the spatial position information, whether the target object moves into an image acquisition range of the image acquisition device;
if the target object moves to the image acquisition range of the image acquisition device, the image acquisition device sends the acquired image to the electronic equipment;
and the electronic equipment obtains the second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device.
In a fourth aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a memory and a processor, where the memory stores machine executable instructions executable by the processor, and the machine executable instructions, when executed by the processor, implement the method for measuring pose based on heterogeneous data.
In a fifth aspect, an embodiment of the present application provides a storage medium storing a computer program; when the computer program is executed by a processor, the pose measurement method based on heterogeneous data is implemented.
Compared with the prior art, the method has the following beneficial effects:
the embodiment of the application provides a pose measurement method, a pose measurement device, a pose measurement system and electronic equipment based on heterogeneous data. The electronic equipment acquires the first pose and the spatial position information of the target object by acquiring the laser radar, or acquires the second pose of the target object from the image acquired by the image acquisition device based on the spatial position information detected by the laser radar when the target object moves to the image acquisition range of the image acquisition device. Therefore, the detection range of the laser radar and the detection precision of the image acquisition device to the pose are considered, and meanwhile, the calculation amount of the image acquired by the image acquisition device is reduced by utilizing the spatial position information detected by the laser radar.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating steps of a pose measurement method based on heterogeneous data according to an embodiment of the present disclosure;
fig. 3 is a schematic view of a scenario for acquiring a transformation matrix according to an embodiment of the present application;
FIG. 4 is a schematic view of a scene for capturing the pose of an aircraft according to an embodiment of the present application;
FIGS. 5-7 are schematic diagrams of grid map plane images provided by embodiments of the present application;
FIG. 8 is a schematic diagram of a marker provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of a pose measurement apparatus based on heterogeneous data according to an embodiment of the present application.
Icon: 110-pose measurement apparatus based on heterogeneous data; 120-a memory; 130-a processor; 500-laser radar; 600-an image acquisition device; 700-a three-dimensional calibration object; 7001-plate; 2000-detection range of the laser radar; 3000-image acquisition range of the image acquisition device; 900-an aircraft; 4000-grid map plane image; 5000-mark points; 1101-a data acquisition module; 1102-data processing module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present application, it is noted that the terms "first", "second", "third", and the like are used merely for distinguishing between descriptions and are not intended to indicate or imply relative importance.
In the related art, current non-contact sensing systems mainly comprise laser radar and vision measurement systems. A laser radar can cover a large detection range, but the accuracy of the pose it captures is low. A vision measurement system can capture the pose with higher accuracy, but, limited by its detection principle, its detection range is small. Therefore, both laser radar and vision measurement systems have their own limitations.
In view of this, the embodiment of the present application provides a pose measurement method based on heterogeneous data, which is applied to an electronic device, where the electronic device is in communication connection with an image acquisition device and a laser radar. The image acquisition device is used for capturing a second pose of the target object in a second base coordinate system, and the laser radar is used for capturing a first pose of the target object in a first base coordinate system.
It should be noted that although the detection range of a conventional radar and the accuracy with which it detects the position of a target object can meet the requirements, the embodiment of the present application uses a laser radar instead, considering the high cost of conventional radar, its limitation on the material of the target object (some materials absorb the radar's electromagnetic waves), and its poor accuracy in detecting the pose of the target object. Although the laser radar can detect the pose of the target object within a certain accuracy range, its pose-detection accuracy still has an upper limit; therefore, in the embodiment of the present application, the image acquisition device and the laser radar are combined so as to take advantage of both the detection range of the laser radar and the pose-detection accuracy of the image acquisition device.
For the electronic device, referring to fig. 1, the electronic device includes a pose measurement apparatus 110 based on heterogeneous data, a memory 120, and a processor 130.
The memory 120, the processor 130 and the other elements are electrically connected to one another, directly or indirectly, to enable the transmission or interaction of data; for example, they may be electrically connected via one or more communication buses or signal lines. The pose measurement apparatus 110 based on heterogeneous data includes at least one software functional module that may be stored in the memory 120 in the form of software or firmware, or solidified in the operating system (OS) of the electronic device. The processor 130 is configured to execute the executable modules stored in the memory 120, such as the software functional modules and computer programs included in the pose measurement apparatus 110. When the computer-executable instructions corresponding to the pose measurement apparatus 110 are executed by the processor, the pose measurement method based on heterogeneous data is implemented.
The memory 120 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), or the like. The memory 120 is used for storing programs, and the processor 130 executes the programs after receiving execution instructions.
The processor 130 may be an integrated circuit chip having signal processing capabilities. The Processor may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
With respect to the above pose measurement method based on heterogeneous data, please refer to fig. 2, which is a flowchart of the steps of the method according to an embodiment of the present application. The steps are described in detail below.
Step S103: acquiring the first pose and spatial position information of the target object through the laser radar.
The laser radar, as a common target detection device, can be used to detect both the pose information and the spatial position information of a target object.
Step S104: judging, according to the spatial position information, whether the target object moves into the image acquisition range of the image acquisition device.
In the embodiment of the application, the detection range of the image acquisition device is affected by its image acquisition range, the weather, the environment and other factors, so there is a certain difference between the detection range of the image acquisition device and that of the laser radar. Therefore, while the target object is still far away, the electronic device takes the pose detected by the laser radar as the first pose of the target object.
When the electronic device detects, based on the spatial position information of the target object, that the target object has entered the image acquisition range of the image acquisition device, it switches to the second pose of the target object acquired through the image acquisition device.
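As a non-authoritative illustration of this switching decision, the following minimal sketch (Python/NumPy) checks whether a lidar-reported position, already expressed in the camera's base coordinate system, lies inside the image acquisition range; the function name in_capture_range, the rectangular field-of-view model and the numeric limits are assumptions for illustration and are not specified in the application.

```python
import numpy as np

def in_capture_range(p_cam, x_fov_deg=30.0, y_fov_deg=20.0, min_z=5.0, max_z=50.0):
    """Return True if a 3D point expressed in the camera's base coordinate
    system (z along the optical axis) falls inside an assumed rectangular
    field of view and working-distance window."""
    x, y, z = p_cam
    if not (min_z <= z <= max_z):
        return False
    # Compare the point's angular offsets against half of each field of view.
    return (abs(np.degrees(np.arctan2(x, z))) <= x_fov_deg / 2.0 and
            abs(np.degrees(np.arctan2(y, z))) <= y_fov_deg / 2.0)

# Example: a lidar detection already transformed into the camera frame.
print(in_capture_range(np.array([1.0, 0.5, 20.0])))  # True for these limits
```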
Step S105: if the target object moves into the image acquisition range of the image acquisition device, obtaining a second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device.
In the embodiment of the application, the image acquisition device and the laser radar are combined: when the second pose of the target object is obtained through the image acquisition device, the position of the target object in the image is determined from the spatial position information detected by the laser radar. The electronic device can therefore locate the target object accurately without running image recognition over the whole image collected by the image acquisition device, which reduces the amount of computation needed to obtain the second pose of the target object.
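A minimal sketch of this localization step is given below, assuming a standard pinhole camera model with a known lidar-to-camera transform and intrinsic matrix; the names roi_from_lidar_point, T_cam_lidar and K, and the calibration values, are illustrative assumptions rather than values from the application.

```python
import numpy as np

def roi_from_lidar_point(p_lidar, T_cam_lidar, K, half_size=80):
    """Project a lidar-detected 3D position into the image and return a square
    region of interest (x_min, y_min, x_max, y_max) around it, so that marker
    identification only has to run inside this window."""
    p_h = np.append(p_lidar, 1.0)            # homogeneous 3D point
    p_cam = (T_cam_lidar @ p_h)[:3]          # lidar frame -> camera frame
    u, v, w = K @ p_cam                      # pinhole projection
    u, v = u / w, v / w
    return (int(u) - half_size, int(v) - half_size,
            int(u) + half_size, int(v) + half_size)

# Assumed calibration values, for illustration only.
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 360.0],
              [0.0, 0.0, 1.0]])
T_cam_lidar = np.eye(4)
print(roi_from_lidar_point(np.array([0.4, -0.2, 15.0]), T_cam_lidar, K))
```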
Therefore, with the above pose measurement method based on heterogeneous data, the electronic device acquires the first pose and spatial position information of the target object through the laser radar, or, when the target object moves into the image acquisition range of the image acquisition device, obtains the second pose of the target object from the image collected by the image acquisition device based on the spatial position information detected by the laser radar. The method thus combines the detection range of the laser radar with the pose-detection accuracy of the image acquisition device, while the spatial position information detected by the laser radar reduces the amount of computation spent on the image collected by the image acquisition device.
In one possible embodiment, it is worth mentioning that operations such as image recognition on the images collected by the image acquisition device consume a large amount of computing resources. When the electronic device is used for field work, reducing the amount of computation also reduces its power consumption; therefore, determining the position of the target object in the image from the spatial position information detected by the laser radar, when obtaining the second pose through the image acquisition device, can extend the operating time of the electronic device in the field.
The embodiment of the application aims to combine the advantage of the large detection range of the laser radar with the advantage of the high pose-detection accuracy of the image acquisition device, and the electronic device is communicatively connected with both the laser radar and the image acquisition device. To keep the system stable, the laser radar and the image acquisition device are fixed on the same carrier, and the relative position between them is kept fixed.
Because the laser radar and the image acquisition device are mounted at different positions, the first pose of the target object detected by the laser radar and the second pose detected by the image acquisition device are located in different base coordinate systems. To facilitate subsequent analysis and processing, the first pose and the second pose need to be unified into the same base coordinate system, i.e. a third base coordinate system.
For the third base coordinate system, the base coordinate system used by the laser radar may be selected, or the base coordinate system used by the image acquisition device may be selected. Of course, other base coordinate systems may also be chosen; the embodiment of the present application is not limited in this respect.
Step S106: acquiring a first conversion matrix and a second conversion matrix, wherein the first conversion matrix represents the pose conversion relation between the first base coordinate system and the third base coordinate system, and the second conversion matrix represents the pose conversion relation between the second base coordinate system and the third base coordinate system.
Step S107: acquiring the first pose or the second pose.
Step S108: processing the first pose according to the first conversion matrix to obtain a first converted pose of the target object in the third base coordinate system, or processing the second pose according to the second conversion matrix to obtain a second converted pose of the target object in the third base coordinate system.
Therefore, through the above steps of the pose measurement method based on heterogeneous data, acquiring either the first pose of the target object captured by the laser radar or the second pose captured by the image acquisition device takes account of both the detection range of the laser radar and the pose-detection accuracy of the image acquisition device; the first pose or the second pose is then converted into the same base coordinate system through the corresponding conversion matrix, which facilitates subsequent data analysis and processing.
Given that the laser radar and the image acquisition device are mounted at different positions on the same carrier, and in order to facilitate subsequent analysis of the pose data of the target object (the pose reflects, to a certain extent, how sound the design of the target object is and how accurate its flight control is), the first conversion matrix and the second conversion matrix need to be obtained.
As a possible implementation, the electronic device obtains a first test pose of the same three-dimensional calibration object in the first base coordinate system, a second test pose in the second base coordinate system, and a third test pose in the third base coordinate system; obtains the first conversion matrix according to the first test pose and the third test pose; and obtains the second conversion matrix according to the second test pose and the third test pose.
Fig. 3 is a schematic diagram of the pose conversion relationship provided in the embodiment of the present application. Here, the first base coordinate system in which the laser radar is located is selected as the third base coordinate system; that is, the first base coordinate system and the third base coordinate system are the same base coordinate system. Therefore, the first pose acquired by the laser radar 500 is itself the first converted pose in the third base coordinate system.
As shown in fig. 3, the three-dimensional calibration object 700 is placed in the field of view shared by the laser radar 500 and the image acquisition device 600 (i.e. in their common detection region). Since the first base coordinate system in which the laser radar is located is selected as the third base coordinate system (the two are the same base coordinate system), the pose of the calibration object 700 in the first base coordinate system, detected by the electronic device through the laser radar 500, is also the third test pose of the calibration object 700 and can be denoted T1. The second test pose of the calibration object 700 in the second base coordinate system, detected by the electronic device through the image acquisition device 600, is denoted T2. The second conversion matrix T3, which represents the pose conversion relation between the second base coordinate system and the third base coordinate system, can then be expressed (treating the poses as homogeneous transformation matrices) as T3 = T1 · T2^(-1).
it should be noted that the second test pose detected by the image capture device 600 and the first test pose detected by the laser radar 500 are pose information of the three-dimensional calibration object 700 at the same time. The image acquisition device 600 and the laser radar 500 are also required to be synchronously arranged, so that the image acquisition device 600 and the laser radar 500 can simultaneously detect the pose of the three-dimensional calibration object 700.
Referring again to fig. 3, as one possible implementation of the three-dimensional calibration object 700, the calibration object 700 includes three flat plates 7001 of different sizes that are perpendicular to one another, with squares of different colors arranged at intervals on the surface of each plate 7001. The different sizes make the individual plates 7001 easy to distinguish and identify, and the differently colored squares spaced over each plate 7001 facilitate stereoscopic vision recognition.
In addition, the image acquisition device 600 requires the target to be within a specific image acquisition range in order to achieve higher pose-detection accuracy.
Therefore, as a possible example of the above-mentioned target object, referring to fig. 4, the electronic device detects the aircraft 900 in the detection range 2000 of the laser radar through the laser radar 500, and determines whether the aircraft 900 enters the image capturing range 3000 of the image capturing device. If the aircraft 900 enters the image capturing range 3000 of the image capturing device, the electronic device obtains the second pose of the aircraft 900 in the second base coordinate system through the image capturing device 600.
In the embodiment of the application, before the electronic device obtains the first pose and spatial position information of the target object, the target object in the space needs to be identified. Therefore, before step S103, the pose measurement method based on heterogeneous data further includes:
and S100, acquiring point cloud data through a laser radar.
And S101, processing the point cloud data through a raster map method to obtain a raster map plane image of the point cloud data.
And step S102, traversing the grid map plane image according to the shape and the size of the target object, and judging whether the target object is detected.
Taking the aircraft 900 as an example again, the above steps are described below with reference to the grid map plane images shown in figs. 5, 6 and 7. When the electronic device detects the aircraft 900 through the laser radar 500, it obtains point cloud data of the aircraft 900 and processes the data with the grid map method to obtain the grid map plane image 4000 of the point cloud data.
Because of its particular outline, the aircraft 900 produces a characteristic shape in the grid map plane image 4000. Based on this, the electronic device traverses the pixel cells of the grid map plane image 4000 according to the shape and size of the aircraft 900 to be detected and judges whether the aircraft 900 is detected.
In addition, patterns formed by other objects may also appear in the grid map plane image 4000; the electronic device can filter these out according to the distribution and gray values of the patterns in the grid map plane image 4000.
It should be understood that the point cloud data obtained by the electronic device through the laser radar 500 is often very large, and ordinary devices can hardly bear the corresponding amount of computation; the obtained point cloud data therefore needs to be further simplified. Common methods for simplifying point cloud data include, but are not limited to, the grid map, feature map and terrain map methods.
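A rough, non-authoritative sketch of the grid map method of step S101 and the shape-and-size traversal of step S102 is shown below; the cell size, coordinate ranges, template and matching threshold are assumed values, and the function names grid_map_image and find_target are hypothetical.

```python
import numpy as np

def grid_map_image(points, cell=0.5, x_range=(0.0, 100.0), y_range=(-50.0, 50.0)):
    """Flatten a lidar point cloud onto the ground plane and count the points
    falling into each cell, giving a 2D grid map plane image of the scene."""
    w = int((x_range[1] - x_range[0]) / cell)
    h = int((y_range[1] - y_range[0]) / cell)
    img = np.zeros((h, w), dtype=np.float32)
    for x, y, _z in points:
        i = int((y - y_range[0]) / cell)
        j = int((x - x_range[0]) / cell)
        if 0 <= i < h and 0 <= j < w:
            img[i, j] += 1.0
    return img

def find_target(img, template, threshold=0.7):
    """Slide a binary template (the target's expected footprint, scaled to the
    grid resolution) over the grid image and return the best-matching cell
    position, or None if no window matches well enough."""
    th, tw = template.shape
    occ = (img > 0).astype(np.float32)
    best, best_pos = 0.0, None
    for i in range(occ.shape[0] - th + 1):
        for j in range(occ.shape[1] - tw + 1):
            score = (occ[i:i + th, j:j + tw] * template).sum() / template.sum()
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos if best >= threshold else None
```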
In the embodiment of the application, once the target object enters the image acquisition range of the image acquisition device, the electronic device needs to obtain the second pose of the target object from the image collected by the image acquisition device. It is worth noting that the planar images captured by a monocular camera are not sufficient to detect the pose of the aircraft 900; therefore, in the embodiment of the present application, the image acquisition device 600 is a binocular vision system.
Again taking the aircraft 900 as an example, referring to fig. 6, the surface of the aircraft 900 is provided with mark points 5000, so that the electronic device can recognize the aircraft 900 through the binocular vision system and obtain the coordinates of the mark points 5000 in a preset coordinate system, where the preset coordinate system is established based on the mark points 5000.
As a possible example of the embodiment of the present application, the electronic device identifies the mark points on the surface of the aircraft with a sub-pixel mark point extraction method, which specifically comprises mark point ROI extraction, anisotropic diffusion filtering, edge extraction and Zernike-moment sub-pixel positioning.
The mark point ROI extraction segments the image by binarization to obtain high-brightness mark point candidate regions, which are then compared with a mark point template to obtain the final mark point ROI regions. To achieve accurate sub-pixel mark point positioning, further operations such as filtering and edge extraction need to be performed on the mark point ROI regions.
Anisotropic diffusion filtering is used to smooth noisy images. In general, different filters could be chosen, such as Gaussian filtering, mean filtering or median filtering. In this example, anisotropic diffusion filtering is used: the image is regarded as a substance field, the gray value of each pixel is equivalent to the density of the substance, and smoothing the image is equivalent to diffusion between substances. By selecting a suitable diffusion coefficient, smoothing can be increased within regions and reduced between regions, so that image noise is removed while edge blurring is kept low.
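A minimal sketch of this kind of anisotropic (Perona-Malik style) diffusion filtering is given below; the iteration count, conduction constant kappa and step size gamma are assumed values, not parameters taken from the application.

```python
import numpy as np

def anisotropic_diffusion(img, iterations=15, kappa=30.0, gamma=0.15):
    """Smooth the interior of regions while preserving strong edges by
    weighting each directional gradient with an edge-stopping (conduction)
    function before adding it back to the image."""
    out = img.astype(np.float32).copy()
    for _ in range(iterations):
        # Finite differences towards the four neighbours
        # (np.roll wraps at the borders, which is acceptable for a sketch).
        dn = np.roll(out, -1, axis=0) - out
        ds = np.roll(out, 1, axis=0) - out
        de = np.roll(out, -1, axis=1) - out
        dw = np.roll(out, 1, axis=1) - out
        # Conduction coefficients: small across strong edges, close to 1 in
        # flat areas, so diffusion acts mainly inside regions.
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        out += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return out
```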
The edge extraction extracts pixel-level edge points in the ROI image; sub-pixel edge points are then searched for in the vicinity of these coarse locations. Most pixel-level edge detection algorithms search using the gradient information of the image; in the embodiment of the application, a Sobel edge operator is used to extract the corresponding mark point edges.
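A simple sketch of pixel-level Sobel edge extraction over a mark point ROI is shown below; the threshold and function name are illustrative, and a production implementation would normally use an optimized library routine rather than explicit loops.

```python
import numpy as np

def sobel_edges(roi, threshold=50.0):
    """Pixel-level edge extraction with the Sobel operator: compute the
    horizontal and vertical gradients and keep the pixels whose gradient
    magnitude exceeds a threshold."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
    ky = kx.T
    roi = roi.astype(np.float32)
    gx = np.zeros_like(roi)
    gy = np.zeros_like(roi)
    for i in range(1, roi.shape[0] - 1):
        for j in range(1, roi.shape[1] - 1):
            patch = roi[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold   # boolean map of coarse edge pixels
```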
The Zernike-moment sub-pixel positioning provides the final localization and, being based on the Zernike moment method, has high positioning accuracy. The edge of the target image is assumed to follow a step model; the edge parameters are solved from the equations relating the moments before and after rotation, exploiting the rotation invariance of the moments, so that sub-pixel positioning is achieved.
Further, the electronic device processes the coordinates of the mark points 5000 in the preset coordinate system through a bundle adjustment algorithm to obtain the second pose of the aircraft 900 in the second base coordinate system.
To obtain the coordinates of the mark points 5000 in the preset coordinate system, referring to fig. 8, the electronic device first obtains the three-dimensional coordinates (P1, P2, P3, …, Pn) of the mark points 5000 on the surface of the aircraft 900 through the binocular vision system, a total station or an electronic distance meter. Three of the mark points (Pi, Pj, Po) are selected (or the centroid of the whole set of mark points together with two additional mark points) to define a plane. The line through the mark points Po and Pi is taken as the x-axis; the line through Po within that plane and perpendicular to the x-axis is taken as the y-axis; and the normal vector of the plane containing the x- and y-axes is taken as the z-axis, thereby establishing the preset coordinate system. The electronic device then obtains the coordinates of each mark point 5000 in the preset coordinate system from the relative positional relationships between the mark points 5000.
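One consistent reading of this coordinate-system construction is sketched below; because the passage above is only partially recoverable, the exact axis convention is an assumption, and the names marker_frame and to_marker_coords are hypothetical.

```python
import numpy as np

def marker_frame(p_o, p_i, p_j):
    """Build the preset coordinate system from three mark points: x-axis along
    P_o -> P_i, z-axis along the normal of the plane of the three points,
    y-axis completing a right-handed frame. Returns the 4x4 transform from the
    preset coordinate system to the measurement frame, with origin at P_o."""
    x = p_i - p_o
    x = x / np.linalg.norm(x)
    n = np.cross(x, p_j - p_o)        # normal of the plane through the points
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)                # in-plane axis perpendicular to x
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p_o
    return T

def to_marker_coords(points, T):
    """Express measured 3D mark point coordinates in the preset system."""
    pts_h = np.hstack([points, np.ones((points.shape[0], 1))])
    return (np.linalg.inv(T) @ pts_h.T).T[:, :3]
```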
The embodiment of the application also provides a pose measurement apparatus 110 based on heterogeneous data. The pose measurement apparatus 110 includes at least one functional module that can be stored in the memory 120 in the form of software. Referring to fig. 9, divided by function, the pose measurement apparatus 110 based on heterogeneous data may include:
the data acquisition module 1101 is configured to acquire the first position and spatial position information of the target object through a laser radar.
In this embodiment of the application, when the computer-executable instructions corresponding to the data obtaining module 1101 are executed by the processor, step S103 in fig. 2 is implemented, and for the detailed description of the data obtaining module 1101, reference may be made to the detailed description of step S103.
a data processing module 1102, configured to judge, according to the spatial position information, whether the target object moves into the image acquisition range of the image acquisition device.
The data obtaining module 1101 is further configured to, if the target object moves into the image acquisition range, obtain a second pose of the target object according to the spatial position information and the image acquired by the image acquisition device.
In this embodiment of the application, when the computer-executable instructions corresponding to the data processing module 1102 are executed by the processor, step S104 and step S105 in fig. 2 are implemented, and for the detailed description of the data processing module 1102, refer to the detailed description of step S104 and step S105.
The embodiment of the application also provides a pose measurement system based on heterogeneous data; the pose measurement system based on heterogeneous data includes an electronic device, an image acquisition device and a laser radar;
and the laser radar sends the collected point cloud data to the electronic equipment.
The electronic device acquires a first pose and spatial position information of the target object according to the point cloud data, and judges, according to the spatial position information, whether the target object moves into the image acquisition range of the image acquisition device.
And if the target object moves to the image acquisition range, the image acquisition device sends the acquired image to the electronic equipment.
And the electronic equipment acquires a second pose of the target object according to the spatial position information and the image acquired by the image acquisition device.
The embodiment of the present application further provides an electronic device, which includes a memory 120 and a processor 130, where the memory 120 stores machine executable instructions capable of being executed by the processor 130, and when the machine executable instructions are executed by the processor 130, the pose measurement method based on the heterogeneous data is implemented.
The embodiment of the application also provides a storage medium storing a computer program; when the computer program is executed by a processor, the pose measurement method based on heterogeneous data is implemented.
In summary, the present application provides a pose measurement method, apparatus and system based on heterogeneous data, and an electronic device. The electronic device acquires the first pose and spatial position information of the target object through the laser radar, or, when the target object moves into the image acquisition range of the image acquisition device, obtains the second pose of the target object from the image collected by the image acquisition device based on the spatial position information detected by the laser radar. The method therefore combines the detection range of the laser radar with the pose-detection accuracy of the image acquisition device, while the spatial position information detected by the laser radar reduces the amount of computation spent on the image collected by the image acquisition device.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only for various embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and all such changes or substitutions are included in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A pose measurement method based on heterogeneous data is applied to electronic equipment, the electronic equipment is in communication connection with an image acquisition device and a laser radar, and the method comprises the following steps:
acquiring a first pose and spatial position information of a target object through the laser radar;
judging whether the target object moves into an image acquisition range of the image acquisition device or not according to the spatial position information;
and if the target object moves into the image acquisition range of the image acquisition device, obtaining a second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device.
2. The method of pose measurement based on heterogeneous data according to claim 1, wherein the first pose is located in a first base coordinate system and the second pose is located in a second base coordinate system; the method further comprises the following steps:
acquiring a first conversion matrix and a second conversion matrix, wherein the first conversion matrix represents a pose conversion relation between the first base coordinate system and a third base coordinate system, and the second conversion matrix represents a pose conversion relation between the second base coordinate system and the third base coordinate system;
acquiring the first pose or the second pose;
and processing the first pose according to a first conversion matrix to obtain a first conversion pose of the target object in a third base coordinate system, or processing the second pose according to a second conversion matrix to obtain a second conversion pose of the target object in the third base coordinate system.
3. The pose measurement method based on the heterogeneous data according to claim 2, wherein the step of acquiring the first transformation matrix and the second transformation matrix comprises:
acquiring a first test pose of the same three-dimensional calibration object in the first base coordinate system, a second test pose in the second base coordinate system and a third test pose in the third base coordinate system;
obtaining the first conversion matrix according to the first test pose and the third test pose;
and obtaining the second transformation matrix according to the second test pose and the third test pose.
4. The pose measurement method based on the heterogeneous data according to claim 1, wherein the image acquisition device is a binocular vision system.
5. The pose measurement method based on heterogeneous data according to claim 1, wherein the surface of the target object is provided with mark points, and the step of obtaining the second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device comprises:
determining the target object from the image acquired by the image acquisition device according to the spatial position information;
identifying the target object to obtain the coordinates of the mark points in a preset coordinate system, wherein the preset coordinate system is established based on the mark points;
and processing the coordinates of the mark points in the preset coordinate system through a bundle adjustment algorithm to obtain the second pose.
6. The method according to claim 1, wherein before the acquiring the first pose and spatial position information of the target object by the lidar, the method further comprises:
acquiring point cloud data through the laser radar;
processing the point cloud data through a grid map method to obtain a grid map plane image of the point cloud data;
and traversing the grid map plane image according to the shape and the size of the target object, and judging whether the target object is detected.
7. A pose measurement apparatus based on heterogeneous data, characterized in that the pose measurement apparatus based on heterogeneous data comprises:
a data acquisition module, used for acquiring a first pose and spatial position information of a target object through the laser radar;
the data processing module is used for judging whether the target object moves into the image acquisition range of the image acquisition device according to the spatial position information;
the data acquisition module is further configured to, if the target object moves into the image acquisition range of the image acquisition device, obtain a second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device.
8. The pose measurement system based on the heterogeneous data is characterized by comprising electronic equipment, an image acquisition device and a laser radar;
the laser radar sends the collected point cloud data to the electronic equipment;
the electronic equipment acquires a first pose and spatial position information of a target object according to the point cloud data, and judges, according to the spatial position information, whether the target object moves into an image acquisition range of the image acquisition device;
if the target object moves to the image acquisition range of the image acquisition device, the image acquisition device sends the acquired image to the electronic equipment;
and the electronic equipment obtains the second pose of the target object according to the spatial position information of the target object acquired by the laser radar and the image collected by the image acquisition device.
9. An electronic device, comprising a memory and a processor, wherein the memory stores machine executable instructions executable by the processor, and the machine executable instructions, when executed by the processor, implement the pose measurement method based on heterogeneous data according to any one of claims 1-6.
10. A storage medium characterized in that the storage medium stores a computer program which, when executed by a processor, implements the heterogeneous data-based pose measurement method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011296352.8A CN112379390B (en) | 2020-11-18 | 2020-11-18 | Pose measurement method, device and system based on heterogeneous data and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011296352.8A CN112379390B (en) | 2020-11-18 | 2020-11-18 | Pose measurement method, device and system based on heterogeneous data and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112379390A true CN112379390A (en) | 2021-02-19 |
CN112379390B CN112379390B (en) | 2024-09-27 |
Family
ID=74585813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011296352.8A Active CN112379390B (en) | 2020-11-18 | 2020-11-18 | Pose measurement method, device and system based on heterogeneous data and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112379390B (en) |
-
2020
- 2020-11-18 CN CN202011296352.8A patent/CN112379390B/en active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1940711A (en) * | 2005-09-27 | 2007-04-04 | 欧姆龙株式会社 | Front image taking device |
KR20070036405A (en) * | 2005-09-29 | 2007-04-03 | 에프엠전자(주) | Sensing system in a traveling railway vehicle for sensing a human body or an obstacle on a railway track |
US20130305805A1 (en) * | 2011-01-31 | 2013-11-21 | Agency For Defense Development | Device, system and method for calibration of camera and laser sensor |
CN110583014A (en) * | 2016-10-11 | 2019-12-17 | 深圳市前海腾际创新科技有限公司 | method and system for detecting and locating intruders using laser detection and ranging device |
CN107443377A (en) * | 2017-08-10 | 2017-12-08 | 埃夫特智能装备股份有限公司 | Sensor robot coordinate system conversion method and Robotic Hand-Eye Calibration method |
US20190086548A1 (en) * | 2017-09-19 | 2019-03-21 | Topcon Corporation | Data processing device, data processing method, and data processing program |
CN108337915A (en) * | 2017-12-29 | 2018-07-27 | 深圳前海达闼云端智能科技有限公司 | Three-dimensional builds drawing method, device, system, high in the clouds platform, electronic equipment and computer program product |
CN108828606A (en) * | 2018-03-22 | 2018-11-16 | 中国科学院西安光学精密机械研究所 | Laser radar and binocular visible light camera-based combined measurement method |
CN208350997U (en) * | 2018-07-04 | 2019-01-08 | 北京国泰星云科技有限公司 | A kind of object movement monitoring system |
CN109035309A (en) * | 2018-07-20 | 2018-12-18 | 清华大学苏州汽车研究院(吴江) | Pose method for registering between binocular camera and laser radar based on stereoscopic vision |
CN210225575U (en) * | 2019-10-17 | 2020-03-31 | 济南和普威视光电技术有限公司 | Radar linkage monitoring camera device |
CN110929669A (en) * | 2019-11-29 | 2020-03-27 | 北京百度网讯科技有限公司 | Data labeling method and device |
CN111812668A (en) * | 2020-07-16 | 2020-10-23 | 南京航空航天大学 | Winding inspection device, positioning method thereof and storage medium |
Non-Patent Citations (4)
Title |
---|
叶泽田 (Ye Zetian): "Theory, Methods and Applications of Digital Simulation of Land Surface Space" (《地表空间数字模拟理论方法及应用》), Beijing: Surveying and Mapping Press (测绘出版社), 30 April 2010, pages 134-135 *
张慧智 (Zhang Huizhi): "Pose Measurement and Error Analysis of Moving Targets Based on Laser Vision Technology" (基于激光视觉技术的运动目标位姿测量与误差分析), Laser Journal (《激光杂志》), vol. 41, no. 4, 30 April 2020 (2020-04-30), pages 81-85 *
李涛 (Li Tao); 吴云 (Wu Yun): "A Stereo Vision Measurement Technique for Non-Cooperative Satellite Targets" (一种非合作卫星目标立体视觉测量技术), Aerospace Control and Application (空间控制技术与应用), no. 04, 15 August 2017 (2017-08-15) *
贾子永 (Jia Ziyong); 任国全 (Ren Guoquan); 李冬伟 (Li Dongwei); 程子阳 (Cheng Ziyang): "Target Pilot Vehicle Recognition Method Based on Fusion of Vision and Lidar Information" (视觉与激光雷达信息融合的目标领航车识别方法), Fire Control & Command Control (火力与指挥控制), no. 06, 15 June 2018 (2018-06-15) *
Also Published As
Publication number | Publication date |
---|---|
CN112379390B (en) | 2024-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107463918B (en) | Lane line extraction method based on fusion of laser point cloud and image data | |
CN106651752B (en) | Three-dimensional point cloud data registration method and splicing method | |
CN107025663B (en) | Clutter scoring system and method for 3D point cloud matching in vision system | |
CN109544599B (en) | Three-dimensional point cloud registration method based on camera pose estimation | |
JP6649796B2 (en) | Object state specifying method, object state specifying apparatus, and carrier | |
CN109801333B (en) | Volume measurement method, device and system and computing equipment | |
CN111627075B (en) | Camera external parameter calibration method, system, terminal and medium based on aruco code | |
CN111640158B (en) | End-to-end camera and laser radar external parameter calibration method based on corresponding mask | |
JP2016166853A (en) | Location estimation device and location estimation method | |
CN103727930A (en) | Edge-matching-based relative pose calibration method of laser range finder and camera | |
KR102073468B1 (en) | System and method for scoring color candidate poses against a color image in a vision system | |
CN109815822B (en) | Patrol diagram part target identification method based on generalized Hough transformation | |
JP6844235B2 (en) | Distance measuring device and distance measuring method | |
Yuan et al. | Combining maps and street level images for building height and facade estimation | |
CN113223135A (en) | Three-dimensional reconstruction device and method based on special composite plane mirror virtual image imaging | |
US20220148153A1 (en) | System and method for extracting and measuring shapes of objects having curved surfaces with a vision system | |
KR20180098945A (en) | Method and apparatus for measuring speed of vehicle by using fixed single camera | |
CN113945937A (en) | Precision detection method, device and storage medium | |
CN117496467A (en) | Special-shaped lane line detection method based on fusion of monocular camera and 3D LIDAR | |
JP5928010B2 (en) | Road marking detection apparatus and program | |
CN101782386B (en) | Non-visual geometric camera array video positioning method and system | |
CN114485433A (en) | Three-dimensional measurement system, method and device based on pseudo-random speckles | |
CN112379390A (en) | Pose measurement method, device and system based on heterogeneous data and electronic equipment | |
CN115115619A (en) | Feature point extraction method, device and equipment based on circle fitting and storage medium | |
Ziqiang et al. | Research of the algorithm calculating the length of bridge crack based on stereo vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |