WO2021190485A1 - Data acquisition device, data correction method and apparatus, and electronic device - Google Patents

Data acquisition device, data correction method and apparatus, and electronic device

Info

Publication number
WO2021190485A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
ranging
module
point cloud
acquisition device
Application number
PCT/CN2021/082315
Other languages
English (en)
French (fr)
Inventor
盛哲
董子龙
谭平
Original Assignee
Alibaba Group Holding Limited (阿里巴巴集团控股有限公司)
Application filed by Alibaba Group Holding Limited
Priority to EP21774125.5A (published as EP4130651A4)
Publication of WO2021190485A1
Priority to US17/816,842 (published as US20230012240A1)


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques, for measuring contours or curvatures
    • G01B21/045 Measuring arrangements for measuring coordinates of points; correction of measurements
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/08 Systems using reflection of electromagnetic waves other than radio waves, determining position data of a target, for measuring distance only
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • G01S7/4808 Evaluating distance, position or velocity data
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements, relating to scanning
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor

Definitions

  • the present disclosure relates to the field of computer technology, and in particular to a data acquisition device, a data correction method and apparatus, and an electronic device.
  • Three-dimensional reconstruction technology is one of the research hotspots in the field of computer vision in industry and academia. According to the different objects to be reconstructed, it can be divided into three-dimensional reconstruction of objects, three-dimensional reconstruction of scenes, three-dimensional reconstruction of human bodies, etc.
  • the 3D scene acquisition equipment in the related art usually uses depth cameras to collect images and depth information of the surrounding environment.
  • the 3D scene acquisition device is equipped with three depth cameras facing level, upward, and downward, which collect the depth maps and color maps respectively.
  • the bottom of the 3D scene acquisition device is equipped with a rotating motor, so that the device can rotate in the horizontal direction while the depth cameras photograph the scene under its control.
  • the embodiments of the present disclosure provide a data acquisition device, a data correction method and apparatus, and an electronic device.
  • an embodiment of the present disclosure provides a data acquisition device, including: a rotation module, a first ranging module, and an image acquisition module; wherein,
  • the rotation module is used to drive the data collection device to rotate in a first direction
  • the first ranging module is adapted to rotate in the first direction with the data collection device, is also adapted to rotate in a second direction, and is adapted to measure first ranging data, wherein the first direction is different from the second direction;
  • the image acquisition module is adapted to rotate with the data acquisition device in the first direction, and is also adapted to acquire image data in a three-dimensional scene.
  • the first direction and the second direction are perpendicular to each other.
  • the data collection device further includes: a second ranging module for acquiring second ranging data; the ranging error of the second ranging module is smaller than the ranging error of the first ranging module.
  • when the rotation module drives the data collection device to rotate one full turn in the first direction, the second ranging module rotates with it in the first direction to measure the second ranging data.
  • the rotation module is arranged under the data acquisition device, the first ranging module is arranged on a first side of the data acquisition device, and the second ranging module is arranged on a second side of the data acquisition device; the plane of the first side and the plane of the second side are perpendicular to each other, and the lens direction of the image acquisition module is opposite to the ranging direction of the second ranging module.
  • the first ranging module and the second ranging module are both laser ranging modules.
  • the first ranging module is a single-line lidar.
  • the center of the lens of the image acquisition module is located on the extension line of the rotation axis of the rotation module.
  • the image acquisition module collects the image data after rotating to a preset rotation angle in the first direction.
  • the data acquisition device further includes: a micro control unit and a main control unit; wherein,
  • the micro control unit is respectively connected with the rotation module, the first ranging module, and the second ranging module; it is used to control the rotation module and to obtain, in real time, the rotation angle of the rotation module, the first ranging data, and the second ranging data. The micro control unit is also used to time-synchronize the acquired rotation angle, first ranging data, and second ranging data and output them to the main control unit;
  • the main control unit is connected to the image acquisition module, acquires the image data from it, and processes the rotation angle, the first ranging data, and the second ranging data received from the micro control unit.
  • the main control unit obtains omnidirectional point cloud data in the three-dimensional scene by processing multiple sets of the first ranging data; the omnidirectional point cloud data includes the three-dimensional space coordinates, in the three-dimensional scene, of the measured points on object surfaces. The multiple sets of the first ranging data include data collected while the first ranging module rotates one full turn in the first direction with the data collection device and rotates many turns in the second direction.
  • the main control unit further uses the second ranging data to perform error correction on the omnidirectional point cloud data, and the error correction method is as follows:
  • the main control unit obtains first point cloud data according to the second ranging data;
  • the first point cloud data includes the three-dimensional space coordinates of the target points on the object surface corresponding to the second ranging data;
  • the main control unit obtains the second point cloud data corresponding to the target point from the omnidirectional point cloud data;
  • the main control unit determines error data according to the first point cloud data and the second point cloud data, and corrects the omnidirectional point cloud data according to the error data.
  • the main control unit also processes the image data to obtain a corresponding panoramic image.
  • the main control unit also obtains a three-dimensional scene model by processing the omnidirectional point cloud data and the image data.
  • an embodiment of the present disclosure provides a data correction method, including:
  • acquiring the first ranging data and the second ranging data, which are respectively collected by the first ranging module and the second ranging module of the data collection device of the first aspect;
  • acquiring first point cloud data according to the second ranging data, where the first point cloud data includes the three-dimensional space coordinates of the target points on the object surface corresponding to the second ranging data;
  • obtaining, from the omnidirectional point cloud data, second point cloud data corresponding to the target points; and determining error data according to the first point cloud data and the second point cloud data, and correcting the omnidirectional point cloud data according to the error data.
  • an embodiment of the present disclosure provides a data correction device, including:
  • the first acquisition module is configured to acquire the first ranging data and the second ranging data, which are respectively collected by the first ranging module and the second ranging module of the data collection device of the first aspect;
  • the second acquisition module is configured to acquire first point cloud data according to the second ranging data;
  • the first point cloud data includes the three-dimensional space coordinates of the target points on the object surface corresponding to the second ranging data;
  • an extraction module is configured to obtain, from the omnidirectional point cloud data, second point cloud data corresponding to the target points;
  • the determining module is configured to determine error data according to the first point cloud data and the second point cloud data, and correct the omnidirectional point cloud data according to the error data.
  • the function can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • the structure of the foregoing device includes a memory and a processor; the memory is used to store one or more computer instructions that support the device in executing the corresponding method above, and the processor is configured to execute the computer instructions stored in the memory.
  • the above-mentioned apparatus may also include a communication interface, which is used for the above-mentioned apparatus to communicate with other equipment or a communication network.
  • embodiments of the present disclosure provide an electronic device including a memory and a processor, wherein the memory is used to store one or more computer instructions, and the one or more computer instructions are executed by the processor to implement the method described in any one of the above aspects.
  • embodiments of the present disclosure provide a computer-readable storage medium for storing computer instructions used by any of the foregoing devices, including computer instructions used to execute the methods involved in any of the foregoing aspects.
  • the data collection device in the embodiments of the present disclosure uses the first ranging module to measure depth information in the three-dimensional scene, that is, the distances from the first ranging module to object surfaces in the scene. Since the first ranging module can be chosen from products such as lidars, which cover a wide range of measuring distances with high precision and small error, a single collection point covers a wider area, and fewer collection points are needed in a relatively open environment, which reduces cost. In addition, the data acquisition device can collect, through the image acquisition module, multiple images that can be stitched into a panorama. Because the first ranging module can rotate many turns in the second direction during one full turn in the first direction, it can ultimately collect first ranging data covering all object surfaces in the three-dimensional scene. The data collection device in the embodiments of the present disclosure can therefore acquire three-dimensional scene data with higher accuracy and wider coverage at a lower cost.
  • Fig. 1 shows a schematic structural diagram of a data collection device according to an embodiment of the present disclosure
  • Fig. 2 shows a schematic structural diagram of a data collection device according to an embodiment of the present disclosure
  • Fig. 3 shows a schematic diagram of a circuit design of a three-dimensional acquisition device according to an embodiment of the present disclosure
  • Fig. 4 shows a schematic diagram of coordinates of a measured point in a three-dimensional space coordinate system in point cloud data according to an embodiment of the present disclosure
  • Fig. 5 shows a schematic diagram of a process of using a data acquisition device to collect data and perform three-dimensional scene reconstruction according to an embodiment of the present disclosure
  • Fig. 6 shows a flowchart of a data correction method according to an embodiment of the present disclosure
  • Fig. 7 shows a schematic structural diagram of an electronic device suitable for implementing a data correction method according to an embodiment of the present disclosure.
  • Fig. 1 shows a schematic structural diagram of a data collection device 100 according to an embodiment of the present disclosure.
  • the data acquisition device 100 includes: a rotation module 101, a first ranging module 102, and an image acquisition module 103; among them,
  • the rotation module 101 is used to drive the data collection device 100 to rotate in a first direction
  • the first ranging module 102 is adapted to rotate in the first direction with the data collection device 100, is also adapted to rotate in a second direction, and is adapted to measure first ranging data, wherein the first direction and the second direction are different;
  • the image acquisition module 103 is adapted to rotate with the data acquisition device 100 in the first direction, and is also adapted to acquire image data in a three-dimensional scene.
  • the rotation module 101 may be a pan/tilt, and the pan/tilt may be a horizontal pan/tilt, which is used to support the data collection device 100 and drive the data collection device 100 to rotate in the first direction.
  • the first direction is parallel to the ground plane.
  • the first ranging module 102 may rotate in the second direction, and continuously collect distance data in the three-dimensional scene during the process of rotating in the second direction.
  • the first direction and the second direction may be different.
  • while the rotation module 101 drives the data collection device to rotate in the first direction, the first ranging module 102 rotates in the second direction to measure the first ranging data; the first ranging data may include the distances from the first ranging module 102 to the object surfaces scanned, in the first direction, on the plane formed by the rotation of the first ranging module 102 in the second direction.
  • the first direction and the second direction may be perpendicular to each other. For example, when the data acquisition device 100 is placed horizontally, the first direction may be parallel to the ground plane and the second direction perpendicular to the ground plane.
  • the rotation axis of the first ranging module 102 in the second direction and the rotation axis of the rotation module 101 in the first direction are perpendicular to each other.
  • the first ranging module 102 may be a laser ranging module, the first ranging module 102 may be rotated by a brushless motor, and the first ranging module 102 may include a laser transmitter and a laser receiver, And when the first ranging module 102 rotates in the second direction under the drive of the brushless motor, the laser transmitter and the laser receiver rotate together, and the laser transmitter and the laser receiver rotate on the same plane.
  • the laser transmitter may be a single-point laser transmitter, so the laser beam it emits may be perpendicular to the rotation axis of the first ranging module 102, and the laser plane swept by the beam during rotation in the second direction may be parallel to the rotation axis of the rotation module 101; that is, when the first ranging module 102 rotates one full turn in the second direction, the laser plane formed by the emitted beam is perpendicular to the first direction in which the rotation module drives the data acquisition device.
  • the laser transmitter continuously emits a laser beam from the laser emitting hole, and the beam, reflected by object surfaces, is then received by the laser receiver in the laser receiving hole.
  • the first ranging module 102 may calculate the physical distance from the first ranging module 102 to the reflection point on the surface of the object according to the time difference between the emitted laser beam and the received laser beam after being reflected.
  • the physical distance between the first ranging module 102 and the reflection point on the object surface can also be calculated by triangulation, the phase method, etc., as determined by the actual application; it is not limited here.
  • the first ranging module may also be another type of ranging module, such as an ultrasonic ranging module.
  • each full turn of the first ranging module 102 in the second direction collects a set of first ranging data; the set includes the distances from the first ranging module 102 to the multiple laser reflection points on the curve where the vertical rotation plane formed by the emitted laser beam intersects the surrounding object surfaces.
  • the rotation speed of the first ranging module 102 in the second direction is much greater than that of the rotation module 101 in the first direction. Therefore, while the rotation module 101 drives the data collection device 100 through one full turn in the first direction, the first ranging module 102 can rotate many turns in the second direction and collect multiple sets of first ranging data at multiple horizontal rotation angles; together these sets can cover the object surfaces in the three-dimensional scene in all directions.
  • the first ranging module 102 may use a single-line laser ranging radar, such as a rotating triangulation single-line laser radar, a phase-based single-line laser radar, a ToF single-line laser radar, and the like.
  • the first ranging module 102 may also use a multi-line lidar, such as a ToF multi-line lidar.
  • the laser ranging radar determines the distance between the measured object and the test point on the laser ranging radar by emitting a laser beam to the measured object and receiving the reflected wave of the laser beam.
  • the distance calculation methods of laser ranging radars usually fall into three types: the time-of-flight (ToF) method, the triangulation method, and the phase method.
  • the time-of-flight method calculates the distance by recording the time difference between the emitted light and the reflected light; a laser ranging radar using it is called a ToF lidar.
  • the triangulation method calculates the distance from the pixel offset of the reflected light on the sensor; a laser ranging radar using it is called a triangulation lidar.
  • the phase method calculates the distance from the phase difference between the emitted light and the reflected light; a laser ranging radar using it is called a phase-method lidar.
  • the single-line lidar collects single-line distances on object surfaces in all 360° directions by mounting one laser ranging device on a rotating motor that rotates continuously.
  • the multi-line lidar collects multi-line distances on object surfaces simultaneously by mounting multiple laser ranging devices on the rotating motor.
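  • As a rough illustration of the three calculation methods above (this sketch and its parameter values are not taken from the patent), the following computes a distance each way:

```python
import math

# Hypothetical illustration of the three lidar distance formulas named above.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(dt_s: float) -> float:
    """Time of flight: the beam travels to the target and back in dt_s seconds."""
    return C * dt_s / 2.0

def phase_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Phase method: distance from the phase shift of an amplitude-modulated
    beam (unambiguous only within half a modulation wavelength)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def triangulation_distance(focal_mm: float, baseline_mm: float,
                           pixel_offset_mm: float) -> float:
    """Triangulation: similar triangles between emitter, lens, and sensor;
    the result is in the same unit as the baseline."""
    return focal_mm * baseline_mm / pixel_offset_mm

print(tof_distance(33.4e-9))                    # about 5.0 m
print(phase_distance(math.pi / 3, 10e6))        # about 2.5 m at 10 MHz modulation
print(triangulation_distance(4.0, 50.0, 0.04))  # 5000 mm, i.e. 5 m
```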
  • the image acquisition module 103 may be a camera, a video camera, or the like; in some embodiments it may be a camera or video camera with a wide-angle lens. While the rotation module 101 drives the data collection device 100 through one full turn in the first direction, the image acquisition module 103 can collect image data.
  • the data collection device in the embodiments of the present disclosure uses the first ranging module to measure depth information in the three-dimensional scene, that is, the distances from the first ranging module to object surfaces in the scene. Since the first ranging module can be chosen from products such as lidars, which cover a wide range of measuring distances with high precision and small error, a single collection point covers a wider area, and fewer collection points are needed in a relatively open environment, which reduces cost. In addition, the data acquisition device can collect, through the image acquisition module, multiple images that can be stitched into a panorama. Because the first ranging module can rotate many turns in the second direction during one full turn in the first direction, it can ultimately collect first ranging data covering all object surfaces in the three-dimensional scene. The data collection device in the embodiments of the present disclosure can therefore acquire three-dimensional scene data with higher accuracy and wider coverage at a lower cost.
  • the data collection device 100 further includes a second ranging module.
  • the second ranging module may be a high-precision ranging module, which may be used to obtain second ranging data in a three-dimensional scene.
  • the high-precision ranging module may be a ranging module whose ranging error is on the millimeter level.
  • the ranging error of the first ranging module may be on the centimeter level; the ranging error of the second ranging module is therefore smaller than that of the first ranging module.
  • both the first ranging module and the second ranging module may use a laser ranging module.
  • the first ranging module 102 may use a single-line laser ranging radar.
  • the single-line lidar has certain measurement errors.
  • a common triangulation single-line lidar has an error of about ±1% to ±2% of the measured distance, while a ToF single-line lidar has an error of about ±3 cm that is independent of distance. Therefore, if the application scenario requires highly accurate depth information in the 3D scene data, a second ranging module, that is, a high-precision laser ranging module whose ranging error is smaller than that of the first ranging module 102, can be provided on the 3D acquisition device.
  • the second ranging module can also be used to obtain second ranging data. Since the multiple sets of first ranging data already cover the object surfaces in the three-dimensional scene in all directions, and the purpose of the second ranging data is to correct the first ranging data, the second ranging module only needs to measure a part of the object surface.
  • the first ranging data for the same part of the object surface measured by the second ranging module can be extracted from the first ranging data and compared with the second ranging data to determine the measurement error; all of the first ranging data are then corrected according to that measurement error.
  • the laser wavelengths used by the first ranging module 102 and the second ranging module may be different.
  • the first ranging module 102 may use infrared light
  • the second ranging module may use visible light. In this way, it is possible to prevent the laser beams emitted by the first ranging module 102 and the second ranging module from affecting each other.
  • suppose that in the first ranging data the distance from the first ranging module 102 to a point A on the object surface is r1, and in the second ranging data the distance from the second ranging module to the same point A is r2.
  • processing r1 yields B1, the three-dimensional coordinate of point A in the three-dimensional scene, and processing r2 yields B2.
  • the error data of B1 relative to B2 can then be determined by comparing B1 and B2 (the accuracy of the second ranging module is higher, so the data it obtains is taken as the correct data when determining the error data).
  • the error data can be calculated by comprehensively considering the distance data of a group of points on the surface of the object.
  • the specifics can be determined according to the actual situation, and there is no limitation here.
  • the second ranging module rotates in the first direction to measure the second ranging data.
  • the rotation module 101 is arranged under the data acquisition device 100, the first ranging module 102 is arranged on a first side of the data acquisition device 100, and the second ranging module is arranged on a second side of the data acquisition device 100; the plane of the first side and the plane of the second side intersect and may, for example, be perpendicular to each other. The lens direction of the image acquisition module is opposite to the direction of the laser beam emitted by the second ranging module.
  • the rotation module 101 may be arranged at the bottom of the data acquisition device 100 to support the entire data acquisition device 100, and when the rotation module 101 rotates around the rotation axis, it can drive the entire data acquisition device 100 to rotate .
  • the first ranging module 102 can be arranged on the first side, and the second ranging module on the second side that intersects the plane of the first side; the direction of the laser beam emitted by the second ranging module may be parallel to the scanning plane of the laser beam emitted by the first ranging module 102. As the data acquisition device 100 rotates in the first direction, the scanning plane of the laser beam emitted by the second ranging module is parallel to the ground plane.
  • the angle between the direction of the laser beam emitted by the second ranging module and the scanning plane of the laser beam emitted by the first ranging module 102 can be set large, for example close to 90 degrees, so that even if the two modules use the same or similar laser wavelengths, their beams do not affect each other. In addition, because the rotation module 101 is located at the bottom of the data collection device 100, it does not block the beams emitted by either module while the rotation module 101 and the first ranging module 102 rotate.
  • the image acquisition module 103 may be arranged above the data acquisition device 100, and the direction in which its lens acquires images may be opposite to the direction in which the second ranging module emits the laser beam. For example, when the image acquisition module 103 acquires image data at a head-up angle, the direction in which it acquires images is opposite to the direction in which the second ranging module acquires distance data.
  • Fig. 2 shows a schematic structural diagram of a data collection device according to an embodiment of the present disclosure.
  • the data acquisition equipment is equipped with a lidar, a camera module, a high-precision ranging module, and a pan/tilt motor.
  • the pan/tilt motor is set at the bottom of the data acquisition device, which can support the data acquisition device and drive the data acquisition device to rotate around the rotation axis A in the first direction;
  • the lidar can be mounted on a cylindrical brushless motor whose bottom plane is set close to one side of the data acquisition device (the left side in the figure). The body of the lidar can also be cylindrical, and its cylindrical side is provided with a laser emitting hole and a laser receiving hole (hidden in the figure); the laser transmitter, arranged in the laser emitting hole, emits the laser beam in the second direction through that hole, and the beam reflected by object surfaces enters the laser receiving hole and is received by the laser receiver arranged there.
  • the lidar can rotate around the rotation axis B in the second direction.
  • the camera module is installed on top of the data acquisition device through a bracket; its lens direction can be the first direction, that is, it collects images at a head-up angle, and because it can use a wide-angle lens, the multiple images it collects can be stitched into a panorama.
  • the high-precision laser ranging module is arranged on the other side of the data acquisition device (it is blocked in the figure, and the figure shows the location of the high-precision laser ranging module on the back).
  • the high-precision laser ranging module emits a laser beam in the first direction, and the emitting direction is opposite to the lens direction of the camera module (that is, the laser emitting direction of the high-precision laser ranging module is the direction inward in the figure).
  • the lens center of the image acquisition module 103 is located on the extension line of the rotation axis of the rotation module 101.
  • setting the lens center of the image acquisition module 103 on the extension line of the rotation axis of the rotation module 101 leaves no parallax between the multiple images acquired at the preset rotation angles, so the panorama obtained by stitching those images is better.
  • the image acquisition module 103 collects image data after rotating to a preset rotation angle in the first direction.
  • the image acquisition module 103 can acquire multiple images, which can be stitched together to form a panoramic image; the rotation module 101 therefore rotates the data acquisition device 100 in the first direction, and each time a preset rotation angle is reached, the image acquisition module 103 collects image data.
  • there may be multiple preset rotation angles, for example six at intervals of 60 degrees, namely 0, 60, 120, 180, 240, and 300 degrees.
  • the preset rotation angle may be determined according to the actual hardware equipment and actual requirements of the image acquisition module 103. For example, if a panoramic image is required, only multiple images that can be spliced to form a panoramic image may be collected.
  • the data collection device 100 may also include a micro control unit and a main control unit.
  • the micro-control unit can be a single-chip microcomputer, etc., and a real-time system can be run on the micro-control unit.
  • the micro control unit is connected to the various sensors on the data acquisition device and is used to obtain, in real time, the data of each sensor on the data acquisition device 100, time-synchronize each acquired data channel with a time stamp, and send it to the main control unit for processing.
  • the sensors provided on the data collection device 100 include, but are not limited to, a first ranging module 102, a second ranging module, an inertial measurement unit (IMU), and the like.
  • the micro-control unit is also connected to the rotation module 101, and can control the rotation module 101, and obtain the rotation angle and the like from the rotation motor of the rotation module 101.
  • the main control unit is connected to the image acquisition module, and is used to acquire image data from the image acquisition module 103, and receive data of each sensor from the micro control unit, and then perform corresponding processing on the data of each sensor.
  • the main control unit may be a processing unit such as a CPU for running a non-real-time operating system such as Linux.
  • Fig. 3 shows a schematic diagram of a circuit design of a three-dimensional acquisition device according to an embodiment of the present disclosure.
  • the microcontroller unit MCU is provided with a serial port, a motor drive interface, a lidar drive interface, an IMU interface, etc.
  • the main control unit is provided with a camera interface, a WiFi and/or Bluetooth configuration interface, etc.
  • the micro control unit MCU is connected to the pan/tilt motor through the motor drive interface, to the lidar and the high-precision laser ranging module through the lidar drive interface, to the inertial measurement unit through the IMU interface, and to the main control unit through the serial port.
  • the main control unit connects to the camera module through the camera interface, configures WiFi/Bluetooth devices through the WiFi/Bluetooth configuration interface, and communicates through it with external devices such as an iPad or mobile phone.
  • the circuit design of the three-dimensional acquisition device also includes a power management module, which is used to provide power to each module on the three-dimensional acquisition device.
  • the IMU can measure the 3-axis acceleration and 3-axis angular velocity of the data acquisition device; from the measured acceleration and angular velocity, the angle of the data acquisition device relative to the ground plane can be calculated, which can be used in 3D scene modeling, as sketched below.
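  • A minimal sketch (not from the patent) of how a static accelerometer reading can yield the device's tilt relative to the ground plane: when the device is stationary, the accelerometer measures only gravity.

```python
import math

def tilt_from_accel(ax: float, ay: float, az: float) -> tuple:
    """Return (pitch, roll) in degrees from accelerometer axes in m/s^2,
    assuming the device is at rest so the reading is the gravity vector."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

print(tilt_from_accel(0.5, 0.0, 9.79))  # roughly (-2.9, 0.0) degrees
```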
  • the main control unit obtains omnidirectional point cloud data in the three-dimensional scene by processing multiple sets of first ranging data; the omnidirectional point cloud data includes the three-dimensional space coordinates, in the three-dimensional scene, of the measured points on object surfaces. The multiple sets of first ranging data include data collected while the first ranging module 102 rotates one full turn in the first direction with the data collection device 100 and rotates many turns in the second direction.
  • the first ranging module 102 rotates multiple turns in the second direction and can measure tens of thousands of times per second; a set of first ranging data can therefore be collected on every full turn, and multiple sets are obtained through continuous rotation over many turns.
  • each set of first ranging data includes the distances to the measured points on object surfaces obtained by the laser scanning the three-dimensional scene in a vertical plane; the multiple sets of first ranging data collected while the data collection device 100 rotates one full turn in the first direction together include distance data for the measured points on object surfaces in all directions of the 3D scene.
  • the distance data of a measured point on an object surface in the three-dimensional scene is the physical distance from the test point (such as the laser emitting point) on the first ranging module 102 to that measured point.
  • the rotation angle of the first ranging module 102 in the second direction and the rotation angle of the rotation module 101 in the first direction are recorded in real time at each acquisition; once the distance data are determined, the rotation angles and distances are used to compute the three-dimensional space coordinates of each point in the three-dimensional space coordinate system.
  • the omnidirectional point cloud data includes depth information in the three-dimensional scene.
  • Fig. 4 shows a schematic diagram of the coordinates of the measured point in the three-dimensional space coordinate system in the point cloud data according to an embodiment of the present disclosure.
  • the three-dimensional space coordinates of the measured point are determined from the following quantities (the formula itself survives only as a figure in the original document):
  • x, y, z: the coordinates of the target point along the three axes of three-dimensional space;
  • r: the distance from the target point to the first ranging module 102;
  • the horizontal rotation angle: the rotation angle of the rotation module 101 in the first direction when r is measured, that is, the horizontal rotation angle of the first ranging module 102 in the first direction;
  • d_x: the distance from the center point of the first ranging module 102 to the rotation axis of the rotation module 101;
  • d_z: the vertical drop from the center point of the first ranging module 102 to the lens center point of the image acquisition module 103;
  • d_L: the distance from the laser emitting point of the first ranging module 102 to the rotation axis of the first ranging module 102 in the second direction;
  • θ_C: the yaw angle of the first ranging module 102 relative to the global coordinate system.
  • the global coordinate system may be a geodetic coordinate system, that is, a three-dimensional spatial coordinate system with the geodetic plane as the XY plane.
  • the rotation axis of the first ranging module 102 in the second direction is theoretically strictly parallel to the ground plane, but because of equipment assembly and other factors it deviates slightly. It is therefore necessary to calibrate in advance the pitch angle of the first ranging module 102 relative to the global coordinate system and its yaw angle θ_C relative to the global coordinate system, that is, the assembly inclination angles of the rotation axis of the first ranging module 102 in the second direction relative to the horizontal and vertical planes of the global coordinate system.
  • the laser plane swept by the beam emitted by the first ranging module 102 can be strictly perpendicular to the horizontal plane of the global coordinate system, that is, the ground plane, and since the rotation axis of the rotation module 101 is also perpendicular to the ground plane, the direction of the emitted laser beam and the rotation axis of the first ranging module 102 in the second direction are theoretically perpendicular to each other. However, hardware assembly and other factors may leave the emitted beam not strictly perpendicular to that rotation axis, so the pitch angle θ_L of the laser beam relative to the rotation axis (that is, the angle between the laser beam and the rotation axis minus 90 degrees) must also be calibrated.
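  • As a rough illustration only, the following sketch converts one ranging sample to global coordinates under an idealized geometry: it assumes the beam leaves radially from the module's second-direction rotation axis, uses the offsets d_x, d_z, d_L as defined above, and ignores the calibration angles just discussed. The patent's exact formula is given only in its original figures.

```python
import math

# Idealized sketch, not the patent's exact formula. r is the measured
# distance; theta is the first ranging module's angle in the second
# (vertical) direction; phi is the rotation module's angle in the first
# (horizontal) direction; d_x, d_z, d_L are the mounting offsets above.
def to_global(r: float, theta: float, phi: float,
              d_x: float, d_z: float, d_L: float):
    # The beam starts d_L from the module's rotation axis, so the reflection
    # point lies at radius d_L + r from that axis within the vertical scan plane.
    radial = d_x + (d_L + r) * math.cos(theta)  # horizontal reach from pan axis
    height = d_z + (d_L + r) * math.sin(theta)  # height relative to lens center
    # Rotate the scan plane about the vertical pan axis by phi.
    return (radial * math.cos(phi), radial * math.sin(phi), height)

print(to_global(r=5.0, theta=0.3, phi=1.0, d_x=0.05, d_z=-0.10, d_L=0.02))
```

  • In a full implementation, theta and phi would first be adjusted by the calibrated pitch angle, yaw angle θ_C, and beam pitch θ_L before this conversion.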
  • the main control unit further uses the second ranging data to perform error correction on the omnidirectional point cloud data, and the error correction method is as follows:
  • the main control unit obtains the first point cloud data according to the second distance measurement data;
  • the first point cloud data includes the three-dimensional space coordinates of the target points on the object surface corresponding to the second ranging data, and the main control unit obtains, from the omnidirectional point cloud data, the second point cloud data corresponding to the target points;
  • the main control unit determines error data according to the first point cloud data and the second point cloud data, and corrects the omnidirectional point cloud data according to the error data.
  • while the data acquisition device rotates in the first direction, the second ranging module continuously emits a laser beam, receives the beam reflected by object surfaces in the three-dimensional scene, and calculates the distances from the points on the object surfaces to the second ranging module. After the second ranging module rotates one full turn in the first direction, a set of second ranging data has been measured.
  • the second distance measurement data may include distance data of points on the surface of the object obtained by sweeping a circle of the three-dimensional scene on the horizontal rotation plane by the laser beam emitted by the second distance measurement module, and these points may be referred to as target points.
  • the three-dimensional space coordinates of the target points in the global coordinate system can be determined according to the set of second ranging data, and these three-dimensional space coordinates can constitute the first point cloud data.
  • since the omnidirectional laser ranging data obtained from the multiple sets of first ranging data covers the points on object surfaces in the three-dimensional scene in all directions, the second point cloud data corresponding to the above target points can be extracted from it; that is, the three-dimensional space coordinates of the target points are extracted from the omnidirectional laser ranging data to form the second point cloud data.
  • since the first point cloud data is more accurate, it can be used as the reference to calculate the error between the two point clouds. Once the error between the first point cloud data and the second point cloud data has been calculated, the omnidirectional point cloud data can be corrected with it, giving the omnidirectional point cloud data higher accuracy. In this way, the high-precision laser ranging module complements the single-line lidar, and high-precision point cloud data can be realized with a low-cost solution.
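  • A minimal numeric sketch of this correction, assuming correspondences between the two point clouds are already established and modeling the error as a constant offset (the patent leaves the exact error model open):

```python
import numpy as np

# Sketch only: first_pc is the high-precision reference cloud, second_pc the
# corresponding points extracted from the omnidirectional cloud, omni_pc the
# full omnidirectional cloud. Row i of first_pc and second_pc is assumed to
# be the same physical target point.
def correct_omni(first_pc: np.ndarray, second_pc: np.ndarray,
                 omni_pc: np.ndarray) -> np.ndarray:
    error = (first_pc - second_pc).mean(axis=0)  # mean per-axis offset
    return omni_pc + error                       # shift the whole cloud

# Synthetic example: the lidar cloud is biased by +2 cm along z.
ref = np.array([[1.0, 0.0, 1.5], [0.0, 2.0, 1.5], [1.0, 1.0, 0.5]])
corrected = correct_omni(ref, ref + [0, 0, 0.02],
                         np.random.rand(100, 3) + [0, 0, 0.02])
```

  • For a triangulation lidar, whose error grows with range, a distance-proportional model that fits a scale factor instead of a constant offset would be the natural variant.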
  • the main control unit also processes the image data to obtain a corresponding panoramic image.
  • the image acquisition module 103 may acquire multiple images, such as 6 images, during the process of the data acquisition device 100 rotating one circle in the first direction, and the acquisition angle interval between each image in the first direction is 60 degrees.
  • the main control unit can stitch the color images from the various viewing angles, after removing their lens distortion, into a low-resolution, boundary-preserving panoramic image, which lets the collecting user confirm at collection time whether the collection task is complete.
  • for the stitching method of the panoramic image, refer to the prior art; it is not repeated here.
  • the main control unit also obtains a three-dimensional scene model by processing omnidirectional point cloud data and image data.
  • the main control unit can perform a series of 3D scene modeling operations on the omnidirectional point cloud data and the multiple images collected by the image acquisition module 103 to obtain a 3D scene model, which the collecting user can also use at collection time to confirm whether the collection task is complete.
  • the processing method of 3D scene modeling can refer to the existing technology, which will not be repeated here.
  • the data collection device 100 can store the collected data locally and upload the data from multiple collection points to the cloud, where more sophisticated algorithms integrate them into a complete three-dimensional scene model.
  • Fig. 5 shows a schematic diagram of a process of using a data acquisition device to collect data and perform a three-dimensional scene reconstruction according to an embodiment of the present disclosure.
  • data are collected at N collection points by the data collection device.
  • the data collected at each collection point can include a full range of three-dimensional point cloud data and 6 color images.
  • a cloud 3D reconstruction engine can then process the collected data into N panoramic images and a texture model.
  • the texture model and the N panoramic images can be published on the web for users to roam the real scene, for example for VR property viewing.
  • the embodiments of the present disclosure can be applied to a variety of scenarios, such as a home improvement scenario.
  • the data collection device of the embodiment of the present disclosure can be used to collect data in the room to be decorated, and to perform three-dimensional reconstruction of the room to be decorated based on the collected data to obtain a texture model and a panoramic view of the room to be decorated.
  • the texture model and panorama can be sent to the relevant home improvement personnel in the current process, such as home improvement designers and furniture designers, who can base the decoration plan and furniture design plan of the room to be decorated on them; this speeds up the entire home improvement process and saves home improvement costs.
  • the process of collecting data by the data collecting device may be as follows:
  • the control terminal can connect to the data collection device via WiFi/Bluetooth, and the collection process may be as follows: 1. On the control terminal, open the control software for the data collection device, create a new collection project, and start the collection process. 2. After the data collection device receives the start instruction, it begins to rotate and collect data in the 3D scene; wait for it to finish collecting at the current collection point. 3. If the device reports success, move it to the next point to be collected and repeat step 2; if it reports failure, the collected data is incomplete or failed to match the existing map, so adjust the location and collect again. 4. After all data have been collected, upload them to the cloud, which performs the 3D reconstruction of the scene.
  • the data collection process of the data collection device in step 2 above is as follows:
  • the micro control unit activates the first ranging module, the image acquisition module, the second ranging module, and the other sensors; it controls the rotation module to rotate quickly to a rotation angle of 0° and stop, where the image acquisition module collects image 0; it then controls the rotation module to rotate slowly to a rotation angle of 60° and stop.
  • during the rotation, the first ranging module and the second ranging module continue to measure, and the first ranging data and second ranging data they return are obtained in real time.
  • the micro-control unit transmits the acquired data to the main control unit at the same time, which is processed by the main control unit.
  • the first distance measurement data is used to generate point cloud data;
  • the image acquisition module collects image k at the rotation angle (60*k)° (0 ≤ k ≤ 5), and while the rotation module rotates from (60*k)° to (60*(k+1))°, the first ranging module and the second ranging module collect ranging data; when the rotation module returns to the rotation angle of 0°, the collection process ends;
  • the main control unit can then verify the integrity of the data and check whether the data collected at this collection point splices successfully with the existing map; if not, the user can be prompted to re-collect.
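  • For clarity, the collection sequence above can be summarized as the following control-loop sketch; the motor, camera, lidar, and rangefinder objects are hypothetical stand-ins for the actual drivers, which the patent does not describe at this level:

```python
# Hypothetical driver objects; interfaces are assumptions, not the patent's.
def collect_at_point(motor, camera, lidar, precise_rangefinder):
    images, rangings = [], []
    for k in range(6):                      # six stops, 60 degrees apart
        motor.rotate_to(60 * k)             # fast move to the capture angle
        images.append(camera.capture())     # image k at (60*k) degrees
        # Slow sweep to the next stop; both ranging modules keep measuring,
        # and each sample is tagged with the current rotation angle.
        for angle in motor.sweep_to(60 * (k + 1)):
            rangings.append((angle, lidar.read(), precise_rangefinder.read()))
    # The final sweep to 360 degrees leaves the module back at 0 degrees.
    return images, rangings
```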
  • Fig. 6 shows a flowchart of a data correction method according to an embodiment of the present disclosure. As shown in Figure 6, the data correction method includes the following steps:
  • step S601: the first ranging data and the second ranging data are acquired, wherein the first ranging data and the second ranging data are respectively collected by the first ranging module and the second ranging module of the data collection device;
  • step S602: first point cloud data is acquired according to the second ranging data; the first point cloud data includes the three-dimensional space coordinates of the target points on the object surface corresponding to the second ranging data;
  • step S603: second point cloud data corresponding to the target points is obtained from the omnidirectional point cloud data;
  • step S604: error data is determined according to the first point cloud data and the second point cloud data, and the omnidirectional point cloud data is corrected according to the error data.
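  • Put together, steps S601 to S604 form a single pipeline. The sketch below is an assumed, simplified stand-in for the operations described in this method, not the patent's algorithm; the conversion and correspondence logic in particular are illustrative:

```python
import numpy as np

def to_point_cloud(second_ranging: np.ndarray) -> np.ndarray:
    """S602 (simplified): rows of (angle_rad, distance) mapped to (N, 3)
    points in the horizontal scan plane at lens height z = 0."""
    ang, dist = second_ranging[:, 0], second_ranging[:, 1]
    return np.stack([dist * np.cos(ang), dist * np.sin(ang),
                     np.zeros_like(dist)], axis=1)

def extract_corresponding(omni_pc: np.ndarray, targets: np.ndarray) -> np.ndarray:
    """S603 (simplified): nearest omnidirectional point for each target point."""
    d2 = ((omni_pc[None, :, :] - targets[:, None, :]) ** 2).sum(axis=2)
    return omni_pc[d2.argmin(axis=1)]

def data_correction(second_ranging: np.ndarray, omni_pc: np.ndarray) -> np.ndarray:
    first_pc = to_point_cloud(second_ranging)             # S602
    second_pc = extract_corresponding(omni_pc, first_pc)  # S603
    error = (first_pc - second_pc).mean(axis=0)           # S604: error data
    return omni_pc + error                                # S604: correction
```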
  • for details of the first ranging data and the second ranging data, refer to the description of the data collection device in the foregoing embodiments. It is understandable that the data correction method can be executed on the main control unit of the data acquisition device or on another electronic device, which can obtain the first ranging data and the second ranging data from the data acquisition device.
  • while the data acquisition device rotates one full turn in the first direction, the second ranging module continuously emits a laser beam, receives the beam reflected by object surfaces in the three-dimensional scene, and calculates the distances from the points on the object surfaces to the second ranging module. After the second ranging module rotates one full turn in the first direction with the three-dimensional acquisition device, a set of second ranging data has been measured; it includes the distances to the points on object surfaces swept by the second ranging module's laser beam across the three-dimensional scene in the horizontal rotation plane. These points can be called target points.
  • the three-dimensional space coordinates of the target points in the global coordinate system can be determined according to the set of second ranging data, and these three-dimensional space coordinates can constitute the first point cloud data.
  • since the omnidirectional laser ranging data obtained from the multiple sets of first ranging data covers the points on object surfaces in the three-dimensional scene in all directions, the second point cloud data corresponding to the above target points can be extracted from it; that is, the three-dimensional space coordinates of the target points are extracted from the omnidirectional laser ranging data to form the second point cloud data.
  • the second point cloud data can be used as the reference data to calculate the error between the two. After the error between the first point cloud data and the second point cloud data is calculated, the omnidirectional point cloud data can be corrected by the error, so that the omnidirectional point cloud data has a higher accuracy. In this way, the high-precision laser ranging module can be used to complement the single-line lidar, and high-precision point cloud data can be obtained with a low-cost solution.
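A hedged sketch of steps S601-S604, assuming corresponding points can be found by nearest-neighbor search and that the error is modeled as a simple mean 3D offset; the disclosure itself does not fix an error model.

```python
import numpy as np
from scipy.spatial import cKDTree

def correct_omni_cloud(omni_cloud, first_cloud):
    """Correct an omnidirectional point cloud against high-precision references.

    omni_cloud:  (N, 3) points from the single-line lidar (first ranging module)
    first_cloud: (M, 3) target-point coordinates from the second ranging module
    """
    tree = cKDTree(omni_cloud)
    _, idx = tree.query(first_cloud)        # S603: second point cloud data,
    second_cloud = omni_cloud[idx]          # i.e. the matching omni points
    error = (first_cloud - second_cloud).mean(axis=0)  # S604: error data
    return omni_cloud + error               # corrected omnidirectional cloud
```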
  • According to an embodiment of the present disclosure, a data correction apparatus can be implemented as part or all of an electronic device through software, hardware, or a combination of the two.
  • The data correction apparatus includes:
  • a first obtaining module, configured to obtain first ranging data and second ranging data, where the first ranging data and the second ranging data are collected by the first ranging module and the second ranging module on the data acquisition device, respectively;
  • a second obtaining module, configured to obtain first point cloud data according to the second ranging data, the first point cloud data including the three-dimensional space coordinates of the target points on object surfaces corresponding to the second ranging data;
  • an extraction module, configured to obtain second point cloud data corresponding to the target points from the omnidirectional point cloud data;
  • a determining module, configured to determine error data according to the first point cloud data and the second point cloud data, and to correct the omnidirectional point cloud data according to the error data.
  • The data correction apparatus in this embodiment corresponds to the data correction method in the embodiment shown in Fig. 6; for specific details, refer to the above description of the data correction method, which is not repeated here.
  • Fig. 7 is a schematic structural diagram of an electronic device suitable for implementing the data correction method according to an embodiment of the present disclosure.
  • As shown in Fig. 7, the electronic device 700 includes a processing unit 701, which can be implemented as a CPU, GPU, FPGA, NPU, or other processing unit. The processing unit 701 can execute various processes in the embodiments of any of the above methods of the present disclosure according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage part 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data required for the operation of the electronic device 700.
  • The processing unit 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
  • The following components are connected to the I/O interface 705: an input part 706 including a keyboard, a mouse, and the like; an output part 707 including a cathode ray tube (CRT) or liquid crystal display (LCD), speakers, and the like; a storage part 708 including a hard disk and the like; and a communication part 709 including a network interface card such as a LAN card or a modem. The communication part 709 performs communication processing via a network such as the Internet.
  • A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 710 as needed, so that a computer program read from it can be installed into the storage part 708 as needed.
  • In particular, according to the embodiments of the present disclosure, any method described above with reference to the embodiments of the present disclosure can be implemented as a computer software program. For example, the embodiments of the present disclosure include a computer program product, which includes a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for executing any method in the embodiments of the present disclosure. In such embodiments, the computer program can be downloaded and installed from a network through the communication part 709 and/or installed from the removable medium 711.
  • The flowcharts and block diagrams in the figures illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram can represent a module, program segment, or part of code, which contains one or more executable instructions for realizing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks can occur in an order different from that marked in the figures; for example, two blocks shown in succession can actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. Each block in the block diagrams and/or flowcharts, and combinations of such blocks, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • The units or modules involved in the embodiments described in the present disclosure can be implemented in software or in hardware. The described units or modules can also be provided in a processor, and the names of these units or modules do not, under certain circumstances, constitute a limitation on the units or modules themselves.
  • As another aspect, the present disclosure also provides a computer-readable storage medium, which can be the computer-readable storage medium included in the apparatus described in the above embodiments, or a computer-readable storage medium that exists separately and is not assembled into a device. The computer-readable storage medium stores one or more programs, and the programs are used by one or more processors to execute the methods described in the present disclosure.

Abstract

The embodiments of the present disclosure disclose a data acquisition device and a data correction method, apparatus, and electronic device. The data acquisition device includes a rotation module, a first ranging module, and an image acquisition module, where the rotation module is configured to drive the data acquisition device to rotate in a first direction; the first ranging module is adapted to rotate with the data acquisition device in the first direction, is further adapted to rotate in a second direction, and is further adapted to measure first ranging data, where the first direction and the second direction are different; and the image acquisition module is adapted to rotate with the data acquisition device in the first direction and is further adapted to collect image data in a three-dimensional scene. The data acquisition device in this technical solution is low in cost, and the collected data is of high accuracy.

Description

Data acquisition device, data correction method and apparatus, and electronic device
Technical Field
The present disclosure relates to the field of computer technology, and in particular to a data acquisition device and a data correction method, apparatus, and electronic device.
Background
Three-dimensional reconstruction is one of the research hotspots of computer vision in industry and academia; depending on the object being reconstructed, it can be divided into three-dimensional reconstruction of objects, of scenes, of human bodies, and so on. For three-dimensional scene reconstruction, the three-dimensional scene acquisition devices in the related art usually use depth cameras to collect images and depth information of the surrounding environment. Such a device carries three depth cameras, looking straight ahead, downward, and upward, to collect depth maps and color images respectively; a rotation motor installed at the bottom lets the device rotate in the horizontal direction while the depth cameras photograph the scene. In this way, multiple color images and multiple depth maps can be collected at one collection point, and the actual distance of the object corresponding to each pixel can be obtained from the depth maps. However, the depth data collected in this way has large errors and a limited depth range, and is not suitable for relatively open scenes such as outdoors.
Summary
The embodiments of the present disclosure provide a data acquisition device and a data correction method, apparatus, and electronic device.
In a first aspect, the embodiments of the present disclosure provide a data acquisition device, including a rotation module, a first ranging module, and an image acquisition module, where:
the rotation module is configured to drive the data acquisition device to rotate in a first direction;
the first ranging module is adapted to rotate with the data acquisition device in the first direction, to rotate in a second direction, and to measure first ranging data, where the first direction and the second direction are different;
the image acquisition module is adapted to rotate with the data acquisition device in the first direction and to collect image data in a three-dimensional scene.
Further, the first direction and the second direction are perpendicular to each other.
Further, the data acquisition device also includes a second ranging module configured to obtain second ranging data, where the ranging error of the second ranging module is smaller than the ranging error of the first ranging module.
Further, while the rotation module drives the data acquisition device through one full rotation in the first direction, the second ranging module rotates in the first direction and measures the second ranging data.
Further, the rotation module is arranged below the data acquisition device, the first ranging module is arranged on a first side of the data acquisition device, and the second ranging module is arranged on a second side of the data acquisition device, the plane of the first side being perpendicular to the plane of the second side; the lens direction of the image acquisition module is opposite to the ranging direction of the second ranging module.
Further, the first ranging module and the second ranging module are both laser ranging modules.
Further, the first ranging module is a single-line lidar.
Further, the lens center of the image acquisition module is located on the extension line of the rotation axis of the rotation module.
Further, the image acquisition module collects the image data after rotating to a preset rotation angle in the first direction.
Further, the data acquisition device also includes a micro control unit and a main control unit, where:
the micro control unit is connected to the rotation module, the first ranging module, and the second ranging module, respectively; the micro control unit is configured to control the rotation module and to obtain in real time the rotation angle of the rotation module, the first ranging data, and the second ranging data; the micro control unit is further configured to time-synchronize the obtained rotation angle, first ranging data, and second ranging data, and to output them to the main control unit;
the main control unit is connected to the image acquisition module, obtains the image data from the image acquisition module, and processes the rotation angle, the first ranging data, and the second ranging data received from the micro control unit.
Further, the main control unit obtains omnidirectional point cloud data in the three-dimensional scene by processing multiple groups of the first ranging data, the omnidirectional point cloud data including the three-dimensional space coordinates, in the three-dimensional scene, of measured points on object surfaces; the multiple groups of the first ranging data include data collected while the first ranging module rotates one full circle in the first direction with the data acquisition device and spins multiple circles in the second direction.
Further, the main control unit also uses the second ranging data to perform error correction on the omnidirectional point cloud data, in the following way:
the main control unit obtains first point cloud data according to the second ranging data, the first point cloud data including the three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data, and the main control unit obtains second point cloud data corresponding to the target points from the omnidirectional point cloud data;
the main control unit determines error data according to the first point cloud data and the second point cloud data, and corrects the omnidirectional point cloud data according to the error data.
Further, the main control unit also processes the image data to obtain a corresponding panorama.
Further, the main control unit also obtains a three-dimensional scene model by processing the omnidirectional point cloud data and the image data.
In a second aspect, the embodiments of the present disclosure provide a data correction method, including:
obtaining first ranging data and second ranging data, where the first ranging data and the second ranging data are collected by the first ranging module and the second ranging module on the data acquisition device of the first aspect, respectively;
obtaining first point cloud data according to the second ranging data, the first point cloud data including the three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data;
obtaining second point cloud data corresponding to the target points from the omnidirectional point cloud data;
determining error data according to the first point cloud data and the second point cloud data, and correcting the omnidirectional point cloud data according to the error data.
In another aspect, the embodiments of the present disclosure provide a data correction apparatus, including:
a first obtaining module, configured to obtain first ranging data and second ranging data, where the first ranging data and the second ranging data are collected by the first ranging module and the second ranging module on the data acquisition device of the first aspect, respectively;
a second obtaining module, configured to obtain first point cloud data according to the second ranging data, the first point cloud data including the three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data;
an extraction module, configured to obtain second point cloud data corresponding to the target points from the omnidirectional point cloud data;
a determining module, configured to determine error data according to the first point cloud data and the second point cloud data, and to correct the omnidirectional point cloud data according to the error data.
The functions can be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions.
In one possible design, the structure of the above apparatus includes a memory and a processor, the memory being configured to store one or more computer instructions that support the apparatus in executing the corresponding method above, and the processor being configured to execute the computer instructions stored in the memory. The above apparatus can further include a communication interface for communicating with other devices or a communication network.
In a third aspect, the embodiments of the present disclosure provide an electronic device including a memory and a processor, where the memory is configured to store one or more computer instructions, and the one or more computer instructions are executed by the processor to implement the method of any of the above aspects.
In a fourth aspect, the embodiments of the present disclosure provide a computer-readable storage medium for storing computer instructions used by any of the above apparatuses, containing the computer instructions involved in executing the method of any of the above aspects.
The technical solutions provided by the embodiments of the present disclosure can have the following beneficial effects:
The data acquisition device in the embodiments of the present disclosure uses the first ranging module to measure the depth information in the three-dimensional scene, that is, the distance data from the first ranging module to object surfaces in the scene. Since the first ranging module can be chosen from products such as lidars that cover various measuring ranges with high accuracy, and a lidar's error is small, a single collection point covers a wide area, so the number of collection points can be reduced in relatively open environments, which reduces cost. In addition, the data acquisition device in the embodiments of the present disclosure can also collect, through the image acquisition module, multiple images that can be stitched into a panorama. Since the first ranging module can rotate multiple circles in the second direction while rotating one circle in the first direction, first ranging data covering object surfaces in all directions of the three-dimensional scene can finally be collected. Therefore, the data acquisition device in the embodiments of the present disclosure can collect three-dimensional scene data with higher accuracy and wider coverage at a lower cost.
It should be understood that the above general description and the following detailed description are only exemplary and explanatory, and do not limit the present disclosure.
Brief Description of the Drawings
Other features, objects, and advantages of the present disclosure will become more apparent from the following detailed description of non-limiting embodiments taken in conjunction with the accompanying drawings. In the drawings:
Fig. 1 shows a schematic structural diagram of a data acquisition device according to an embodiment of the present disclosure;
Fig. 2 shows a schematic structural diagram of a data acquisition device according to an embodiment of the present disclosure;
Fig. 3 shows a schematic circuit-design diagram of a three-dimensional acquisition device according to an embodiment of the present disclosure;
Fig. 4 shows a schematic diagram of the coordinates, in a three-dimensional coordinate system, of a measured point in point cloud data according to an embodiment of the present disclosure;
Fig. 5 shows a schematic diagram of a process of collecting data with the data acquisition device and performing three-dimensional scene reconstruction according to an embodiment of the present disclosure;
Fig. 6 shows a flowchart of a data correction method according to an embodiment of the present disclosure;
Fig. 7 is a schematic structural diagram of an electronic device suitable for implementing the data correction method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily implement them. In addition, for the sake of clarity, parts irrelevant to the description of the exemplary embodiments are omitted from the drawings.
In the present disclosure, it should be understood that terms such as "including" or "having" are intended to indicate the existence of the features, numbers, steps, actions, components, parts, or combinations thereof disclosed in this specification, and are not intended to exclude the possibility that one or more other features, numbers, steps, actions, components, parts, or combinations thereof exist or are added.
It should also be noted that, in the absence of conflict, the embodiments of the present disclosure and the features in the embodiments can be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows a schematic structural diagram of a data acquisition device 100 according to an embodiment of the present disclosure. As shown in Fig. 1, the data acquisition device 100 includes a rotation module 101, a first ranging module 102, and an image acquisition module 103, where:
the rotation module 101 is configured to drive the data acquisition device 100 to rotate in a first direction;
the first ranging module 102 is adapted to rotate with the data acquisition device 100 in the first direction, to rotate in a second direction, and to measure first ranging data, where the first direction and the second direction are different;
the image acquisition module 103 is adapted to rotate with the data acquisition device 100 in the first direction and to collect image data in a three-dimensional scene.
In this embodiment, the rotation module 101 can be a gimbal, which can be a horizontal gimbal configured to support the data acquisition device 100 and drive the data acquisition device 100 to rotate in the first direction. When the data acquisition device 100 is placed horizontally, the first direction is parallel to the ground plane.
The first ranging module 102 can rotate in the second direction and continuously collect distance data in the three-dimensional scene during that rotation. The first direction and the second direction can be different. While the rotation module 101 drives the data acquisition device to rotate in the first direction, the first ranging module 102 rotates in the second direction and measures first ranging data; the first ranging data can therefore include the distance data, to the first ranging module 102, of the object surfaces scanned in the first direction by the second plane formed by the rotation of the first ranging module 102 in the second direction.
In some embodiments, the first direction and the second direction can be perpendicular to each other. In other embodiments, when the data acquisition device 100 is placed horizontally, the first direction can be parallel to the ground plane while the second direction is perpendicular to the ground plane.
In some embodiments, the rotation axis of the first ranging module 102 in the second direction is perpendicular to the rotation axis of the rotation module 101 in the first direction.
In some embodiments, the first ranging module 102 can be a laser ranging module rotated by a brushless motor, and can include one laser emitter and one laser receiver. While the brushless motor drives the first ranging module 102 to rotate in the second direction, the laser emitter and the laser receiver rotate together, in the same plane. The laser emitter can be a single-point laser emitter, so the direction of the laser beam it emits can be perpendicular to the rotation axis of the first ranging module 102, and the laser plane formed by the beam during rotation in the second direction can be parallel to the rotation axis of the rotation module 101; that is, while the first ranging module 102 rotates one circle in the second direction, the laser plane formed by the emitted beam is perpendicular to the first plane formed by the rotation module driving the data acquisition device to rotate in the first direction.
While the first ranging module 102 rotates, the laser emitter continuously emits a laser beam from the laser emission hole; after being reflected by an object surface, the beam is received by the laser receiver in the laser receiving hole. In some embodiments, the first ranging module 102 can calculate the physical distance from the first ranging module 102 to the reflection point on the object surface from the time difference between the emitted laser beam and the reflected beam that is received. In other embodiments, the physical distance between the first ranging module 102 and the reflection point can also be calculated by triangulation, the phase method, and the like, depending on the actual application, which is not limited here.
It is understandable that the first ranging module can also be another kind of ranging module, such as an ultrasonic ranging module.
After the first ranging module 102 rotates one circle in the second direction, a group of first ranging data can be collected, which includes the distance data, to the first ranging module 102, of multiple laser reflection points on the curve where the vertical rotation plane formed by the laser beam emitted by the first ranging module 102 intersects the surrounding object surfaces. The rotation speed of the first ranging module 102 in the second direction is much greater than the rotation speed of the rotation module 101 in the first direction, so while the rotation module 101 drives the data acquisition device 100 through one rotation in the first direction, the first ranging module 102 can rotate multiple circles in the second direction and collect multiple groups of first ranging data at multiple horizontal rotation angles; these multiple groups of first ranging data can cover object surfaces in all directions of the three-dimensional scene.
In some embodiments, to reduce cost, the first ranging module 102 can be a single-line laser ranging lidar, such as a rotary triangulation single-line lidar, a phase-method single-line lidar, or a ToF single-line lidar. It is understandable that the first ranging module 102 can also be a multi-line lidar, such as a ToF multi-line lidar. A laser ranging lidar determines the distance between the measured object and the test point on the lidar by emitting a laser beam toward the measured object and receiving the reflected wave of the beam. There are usually three ways for a laser ranging lidar to compute distance: the time-of-flight (TOF) method, the triangulation method, and the phase method. The time-of-flight method computes the distance by recording the time difference between the emitted and reflected light, and a lidar using it is called a ToF lidar; the triangulation method computes the distance from the triangular pixel offset of the reflected light, and a lidar using it is called a triangulation lidar; the phase method computes the distance from the phase difference between the emitted and reflected light, and a lidar using it is called a phase-method lidar. A single-line lidar places one laser ranging device on a rotating motor and rotates it continuously to collect single-line distances to object surfaces over 360°; a multi-line lidar places multiple laser ranging devices on the rotating motor to collect multi-line distances to object surfaces simultaneously. A toy time-of-flight computation is sketched below.
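In this toy illustration of the time-of-flight principle, the distance is the speed of light times half the round-trip time of the pulse; the timestamps are made up for illustration.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(t_emit_s, t_receive_s):
    """Distance to the reflecting surface: c * (round-trip time) / 2."""
    return C * (t_receive_s - t_emit_s) / 2.0

print(tof_distance(0.0, 66.7e-9))  # a ~66.7 ns round trip is roughly 10 m
```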
The image acquisition module 103 can be a camera, a webcam, or the like. In some embodiments, the image acquisition module 103 can be a camera with a wide-angle lens. While the rotation module 101 drives the data acquisition device 100 through one rotation in the first direction, the image acquisition module 103 can collect image data.
The data acquisition device in the embodiments of the present disclosure uses the first ranging module to measure the depth information in the three-dimensional scene, that is, the distance data from the first ranging module to object surfaces in the scene. Since the first ranging module can be chosen from products such as lidars that cover various measuring ranges with high accuracy, and a lidar's error is small, a single collection point covers a wide area, so the number of collection points can be reduced in relatively open environments, which reduces cost. In addition, the data acquisition device in the embodiments of the present disclosure can also collect, through the image acquisition module, multiple images that can be stitched into a panorama. Since the first ranging module can rotate multiple circles in the second direction while rotating one circle in the first direction, first ranging data covering object surfaces in all directions of the three-dimensional scene can finally be collected. Therefore, the data acquisition device in the embodiments of the present disclosure can collect three-dimensional scene data with higher accuracy and wider coverage at a lower cost.
In an optional implementation of this embodiment, the data acquisition device 100 also includes a second ranging module. The second ranging module can be a high-precision ranging module configured to obtain second ranging data in the three-dimensional scene. The high-precision ranging module can be a ranging module whose ranging error is at the millimeter level, while the ranging error of the first ranging module can be at the centimeter level; the ranging error of the second ranging module is therefore smaller than the ranging error of the first ranging module.
In some embodiments, both the first ranging module and the second ranging module can be laser ranging modules.
In some embodiments, to reduce cost, the first ranging module 102 can be a single-line laser ranging lidar. Single-line lidars have a certain measurement error; for example, the error of a common triangulation single-line lidar is about ±1% to 2% of the measured distance, while the error of a ToF single-line lidar is about ±3 cm regardless of distance. Therefore, if the actual application requires higher accuracy for the depth information in the three-dimensional scene data, a second ranging module, that is, a high-precision laser ranging module, can be arranged on the three-dimensional laser acquisition device, with a ranging error smaller than that of the first ranging module 102. During data collection, while the first ranging module 102 obtains multiple groups of first ranging data, the second ranging module can also obtain second ranging data. Since the multiple groups of first ranging data can cover the object surfaces in the three-dimensional scene in all directions, and the purpose of the second ranging data obtained by the second ranging module is to correct the first ranging data, the second ranging module only needs to measure part of the object surfaces. During measurement error correction, the part of the first ranging data that measures the same object surfaces as the second ranging data can be extracted from the first ranging data and compared with the second ranging data to determine the measurement error, and all of the first ranging data can then be corrected according to this measurement error.
In some embodiments, the laser wavelengths used by the first ranging module 102 and the second ranging module can be different; for example, the first ranging module 102 can use infrared light while the second ranging module uses visible light. In this way, the laser beams emitted by the first ranging module 102 and the second ranging module do not affect each other.
As an example, suppose that in the first ranging data the distance from the first ranging module 102 to point A on an object surface is r1, while in the second ranging data the distance from the second ranging module to point A is r2; processing r1 determines the three-dimensional space coordinates of point A in the scene as B1, while processing r2 determines them as B2. The error data of B1 relative to B2 can then be determined by comparing B1 and B2 (the second ranging module has higher accuracy, so the data obtained by the second ranging module is taken as the correct data when determining the error data).
The above is only an example; in practical application scenarios the error data can be calculated by jointly considering the distance data of a group of points on the object surface, depending on the actual situation, which is not limited here. A minimal version of this computation is sketched below.
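Here the per-point offsets are averaged over a group of surface points; the aggregation is an illustrative assumption, since the disclosure leaves it open.

```python
import numpy as np

def error_data(b1_points, b2_points):
    """Per-axis error of lidar-derived coordinates B1 relative to the
    high-precision reference coordinates B2, averaged over a point group."""
    return (np.asarray(b1_points) - np.asarray(b2_points)).mean(axis=0)

b1 = [[1.02, 0.00, 0.51], [2.03, 1.01, 0.49]]  # coordinates from r1 (lidar)
b2 = [[1.00, 0.00, 0.50], [2.00, 1.00, 0.50]]  # coordinates from r2 (reference)
print(error_data(b1, b2))  # -> [0.025 0.005 0.   ]
```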
In an optional implementation of this embodiment, while the rotation module 101 drives the data acquisition device 100 through one rotation in the first direction, the second ranging module rotates in the first direction and measures the second ranging data.
In an optional implementation of this embodiment, the rotation module 101 is arranged below the data acquisition device 100, the first ranging module 102 is arranged on a first side of the data acquisition device 100, and the second ranging module is arranged on a second side of the data acquisition device 100, the plane of the first side intersecting (for example, being perpendicular to) the plane of the second side; the lens direction of the image acquisition module is opposite to the direction in which the second ranging module emits its laser beam.
In this optional implementation, the rotation module 101 can be arranged at the bottom of the data acquisition device 100 to support the whole device, and when the rotation module 101 rotates about its rotation axis, it drives the whole data acquisition device 100 to rotate. The first ranging module 102 can be arranged on the first side, and the second ranging module on the second side whose plane intersects that of the first side; the direction of the laser beam emitted by the second ranging module can be parallel to the scanning plane of the laser beam emitted by the first ranging module 102. As the data acquisition device 100 rotates in the first direction, the scanning plane of the laser beam emitted by the second ranging module is parallel to the ground plane.
In some embodiments, the angle between the direction of the laser beam emitted by the second ranging module and the scanning plane of the laser beam emitted by the first ranging module 102 can be set large, for example close to 90 degrees; in that case, even if the two modules use the same or similar laser wavelengths, their laser beams do not affect each other, and since the rotation module 101 is located at the bottom of the data acquisition device 100, it does not block the beams of either module while the rotation module 101 and the first ranging module 102 rotate. The image acquisition module 103 can be arranged on top of the data acquisition device 100, and the lens direction in which it captures images can be opposite to the direction in which the second ranging module emits its laser beam; for example, when the image acquisition module 103 captures image data at eye level, its capture direction is opposite to the direction in which the second ranging module collects distance data.
Fig. 2 shows a schematic structural diagram of a data acquisition device according to an embodiment of the present disclosure. As shown in Fig. 2, the data acquisition device carries a lidar, a camera module, a high-precision ranging module, and a gimbal motor. The gimbal motor is arranged at the bottom of the data acquisition device; it can support the device and drive it to rotate about rotation axis A in the first direction. The lidar can be mounted on a cylindrical brushless motor whose bottom plane can be attached to one side face of the device (the left side in the figure); the lidar body can also be cylindrical, with one laser emission hole and one laser receiving hole (occluded in the figure) on its cylindrical side face. The laser emitter is arranged in the emission hole and can emit a laser beam in the second direction through the hole; after being reflected by an object surface, the beam enters the receiving hole and is received by the laser receiver arranged there. The lidar can rotate about rotation axis B in the second direction. The camera module is mounted above the device through a bracket, and its lens direction can be the first direction, that is, the camera module captures images at eye level; it can use a wide-angle lens, so the multiple captured images can be stitched into a panorama. The high-precision laser ranging module is arranged on another side face of the device (occluded in the figure, which shows its mounting position on the back); it emits a laser beam in the first direction, opposite to the lens direction of the camera module (that is, into the page in the figure).
In an optional implementation of this embodiment, the lens center of the image acquisition module 103 is located on the extension line of the rotation axis of the rotation module 101. Arranging the lens center of the image acquisition module 103 on the extension line of the rotation axis of the rotation module 101 makes the multiple images captured by the image acquisition module 103 at the multiple preset rotation angles free of parallax, so the panorama stitched from these images is of better quality.
In an optional implementation of this embodiment, the image acquisition module 103 collects image data after rotating to a preset rotation angle in the first direction.
The image acquisition module 103 can capture multiple images that can be stitched into a panorama, so while the rotation module 101 drives the data acquisition device 100 to rotate in the first direction, the image acquisition module 103 can collect image data after rotating to a preset rotation angle. There can be multiple preset rotation angles, for example six, one every 60 degrees; that is, the six preset rotation angles are 0, 60, 120, 180, 240, and 300 degrees. It is understandable that the preset rotation angles can depend on the actual hardware of the image acquisition module 103 and the actual requirements; for example, when a panorama is needed, it suffices to capture multiple images that can be stitched into a panorama.
In an optional implementation of this embodiment, the data acquisition device 100 can also include a micro control unit and a main control unit. The micro control unit can be a microcontroller or the like, on which a real-time system can run. The micro control unit is connected to the sensors on the data acquisition device and is configured to obtain, in real time, the data of the sensors arranged on the data acquisition device 100, to time-stamp the obtained data and time-synchronize it, and to send it to the main control unit for processing. The sensors arranged on the data acquisition device 100 include, but are not limited to, the first ranging module 102, the second ranging module, and an inertial measurement unit (IMU). The micro control unit is also connected to the rotation module 101; it can control the rotation module 101 and obtain the rotation angle and the like from the rotation motor of the rotation module 101.
The main control unit is connected to the image acquisition module, is configured to obtain the image data from the image acquisition module 103, receives the data of each sensor from the micro control unit, and then processes the sensor data accordingly. The main control unit can be a processing unit such as a CPU, configured to run a non-real-time operating system such as Linux. A small synchronization sketch follows below.
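As a sketch of the micro control unit's time synchronization: every sample is time-stamped, and the gimbal rotation angle can be interpolated at each ranging timestamp before the pair is handed to the main control unit. Linear interpolation and the sample values below are assumptions for illustration.

```python
import numpy as np

angle_t = np.array([0.00, 0.10, 0.20, 0.30])  # s, angle sample timestamps
angle   = np.array([0.0, 6.0, 12.0, 18.0])    # deg, gimbal rotation angle
range_t = np.array([0.05, 0.15, 0.25])        # s, lidar sample timestamps
range_m = np.array([3.12, 3.08, 2.95])        # m, measured distances

# rotation angle of the gimbal at the instant each distance was measured
angle_at_range = np.interp(range_t, angle_t, angle)
print(list(zip(range_t, angle_at_range, range_m)))
```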
Fig. 3 shows a schematic circuit-design diagram of the three-dimensional acquisition device according to an embodiment of the present disclosure. As shown in Fig. 3, in the circuit design of the three-dimensional acquisition device, the micro control unit MCU provides a serial port, a motor drive interface, a lidar drive interface, an IMU interface, and so on, while the main control unit provides a camera interface, a WiFi and/or Bluetooth configuration interface, and so on. The MCU connects to the gimbal motor through the motor drive interface, to the lidar and the high-precision laser ranging module through the lidar drive interface, to the inertial measurement unit through the IMU interface, and to the main control unit through the serial port. The main control unit connects to the camera module through the camera interface, configures WiFi/Bluetooth devices through the WiFi/Bluetooth configuration interface, and communicates with external devices such as an iPad or a mobile phone through that interface. The circuit design of the three-dimensional acquisition device also includes a power management module configured to supply power to each module of the device. The IMU can measure the 3-axis acceleration and 3-axis angular velocity of the data acquisition device, from which the angle of the data acquisition device relative to the ground plane can be deduced; this angle can be used in three-dimensional scene modeling, as sketched below.
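A small sketch of estimating the device's angle relative to the ground plane from the IMU's 3-axis acceleration. Using gravity alone (a static device) is an illustrative simplification; a real pipeline would also fuse the measured angular velocity.

```python
import math

def tilt_angles(ax, ay, az):
    """Roll and pitch in degrees from one 3-axis accelerometer reading."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

print(tilt_angles(0.0, 0.17, 9.80))  # device tilted about 1 degree about x
```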
In an optional implementation of this embodiment, the main control unit obtains omnidirectional point cloud data in the three-dimensional scene by processing multiple groups of first ranging data; the omnidirectional point cloud data includes the three-dimensional space coordinates, in the three-dimensional scene, of measured points on object surfaces. The multiple groups of first ranging data include the data collected while the first ranging module 102 rotates one full circle in the first direction with the data acquisition device 100 and spins multiple circles in the second direction.
In this optional implementation, while the data acquisition device 100 rotates one circle in the first direction, the first ranging module 102 spins multiple circles in the second direction, and the first ranging module 102 can measure the distances of tens of thousands of points per second; a group of first ranging data can therefore be collected per revolution, and multiple groups of first ranging data can be obtained by continuously rotating over multiple revolutions. One group of first ranging data includes the distance data of the measured points on object surfaces obtained as the laser sweeps the three-dimensional scene once in the vertical plane, while the multiple groups of first ranging data collected after the data acquisition device 100 rotates one circle in the first direction include the distance data of measured points on object surfaces in all directions of the three-dimensional scene.
The distance data of a measured point on an object surface in the three-dimensional scene is the physical distance from the test point on the first ranging module 102 (for example, the laser emission point) to the measured point on the object surface. In addition, since the rotation angle of the first ranging module 102 in the second direction and the rotation angle of the rotation module 101 in the first direction at each measurement are recorded in real time during collection, once the distance data is determined, the three-dimensional space coordinates of each point in the three-dimensional coordinate system can be calculated from the above rotation angles and the distance data.
After the three-dimensional space coordinates of the multiple measured points in the multiple groups of first ranging data covering object surfaces in all directions of the three-dimensional scene are determined according to the above method, an omnidirectional laser point cloud can be constructed by marking the multiple measured points in three-dimensional space. The omnidirectional point cloud data includes the depth information of the three-dimensional scene.
Fig. 4 shows a schematic diagram of the coordinates, in a three-dimensional coordinate system, of a measured point in the point cloud data according to an embodiment of the present disclosure. As shown in Fig. 4, the three-dimensional space coordinates of the measured point can be determined by a group of formulas that appear only as embedded images in the published document (placeholders appb-000001 through appb-000004) and cannot be reproduced here; the one relation legible in the text is
θ_2 = θ_1 + θ_C
In these formulas, x, y, and z are the coordinates of the target point along the three coordinate directions of three-dimensional space; r is the distance from the target point to the first ranging module 102; θ (θ_1) is the horizontal rotation angle of the rotation module 101 in the first direction when r is measured, that is, the horizontal rotation angle of the first ranging module 102 in the first direction; d_x denotes the distance from the center point of the first ranging module 102 to the rotation axis of the rotation module 101; d_z denotes the vertical height difference between the center point of the first ranging module 102 and the lens center of the image acquisition module 103; d_L is the distance from the laser emission point of the first ranging module 102 to the rotation axis about which the first ranging module 102 rotates in the second direction; φ is the vertical rotation angle of the first ranging module 102 in the second direction when r is measured; φ_C is the pitch angle of the first ranging module 102 relative to the global coordinate system; θ_C is the yaw angle of the first ranging module 102 relative to the global coordinate system; and θ_L is the pitch angle of the laser beam emitted by the first ranging module 102 relative to the rotation axis about which the first ranging module 102 rotates in the second direction.
The global coordinate system can be the ground coordinate system, that is, a three-dimensional coordinate system whose XY plane is the ground plane. In theory, the axis about which the first ranging module 102 spins in the second direction is strictly parallel to the ground plane, but because of device assembly and similar factors the axis has a certain error relative to the ground plane. It is therefore necessary to calibrate in advance the pitch angle φ_C of the first ranging module 102 relative to the global coordinate system, that is, the assembly inclination of the rotation axis of the first ranging module 102 in the second direction relative to the vertical plane of the global coordinate system, as well as the yaw angle θ_C of the first ranging module 102 relative to the global coordinate system. In addition, in theory the laser beam emitted by the first ranging module 102 can be strictly perpendicular to the horizontal plane of the global coordinate system, that is, the ground plane; and since the rotation axis of the rotation module 101 is also perpendicular to the ground plane, the direction of the emitted laser beam and the rotation axis of the first ranging module 102 in the second direction are in theory mutually perpendicular. However, hardware assembly and similar factors may leave the emitted beam not strictly perpendicular to that rotation axis, so the pitch angle θ_L of the emitted beam relative to the rotation axis (that is, the angle between the beam and the axis minus 90 degrees) must also be calibrated in advance. A hedged sketch of composing these quantities into coordinates follows below.
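Because the exact formulas survive only as images in the published text, the sketch below merely illustrates one plausible way to compose the quantities defined above (r, θ, φ, d_x, d_z, d_L, θ_C, φ_C, θ_L) into global coordinates; the composition itself is an assumption, not the patent's formula.

```python
import numpy as np

def lidar_sample_to_xyz(r, theta1, phi1, d_x, d_z, d_L,
                        theta_C=0.0, phi_C=0.0, theta_L=0.0):
    """Map one range sample to global XYZ (illustrative composition only)."""
    theta2 = theta1 + theta_C            # the one relation legible in the text
    phi2 = phi1 + phi_C                  # assumed analogous vertical correction
    # point in the vertical scan plane: laser origin offset d_L from the spin
    # axis, beam tilted by theta_L off its ideal direction
    u = r * np.cos(phi2 + theta_L) + d_L * np.cos(phi2) + d_x
    z = r * np.sin(phi2 + theta_L) + d_L * np.sin(phi2) + d_z
    # rotate the scan plane about the vertical pan axis by theta2
    return np.array([u * np.cos(theta2), u * np.sin(theta2), z])

print(lidar_sample_to_xyz(r=5.0, theta1=0.5, phi1=1.0,
                          d_x=0.05, d_z=0.10, d_L=0.02))
```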
In an optional implementation of this embodiment, the main control unit also uses the second ranging data to perform error correction on the omnidirectional point cloud data, as follows:
the main control unit obtains first point cloud data according to the second ranging data, the first point cloud data including the three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data, and the main control unit obtains second point cloud data corresponding to the target points from the omnidirectional point cloud data;
the main control unit determines error data according to the first point cloud data and the second point cloud data, and corrects the omnidirectional point cloud data according to the error data.
In this optional implementation, while the data acquisition device 100 rotates one circle in the first direction, the second ranging module continuously emits a laser beam, receives the beam reflected by object surfaces in the three-dimensional scene, and computes the distance data from points on the object surfaces to the second ranging module. After the second ranging module has rotated one circle in the first direction with the three-dimensional acquisition device, a group of second ranging data can be measured, which can include the distance data of the points on object surfaces swept by the laser beam of the second ranging module across the three-dimensional scene once in the horizontal rotation plane; these points can be called target points. The three-dimensional space coordinates of these target points in the global coordinate system can be determined from this group of second ranging data, and these coordinates can constitute the first point cloud data.
As described above, the omnidirectional laser ranging data obtained from the multiple groups of first ranging data covers the points on object surfaces in the three-dimensional scene in all directions, so the second point cloud data corresponding to the above target points can be extracted from the omnidirectional laser ranging data; that is, the three-dimensional space coordinates of the target points are extracted from the omnidirectional laser ranging data to form the second point cloud data.
Since the first point cloud data and the second point cloud data correspond to the same target points, and the second point cloud data is obtained by the more accurate second ranging module, the second point cloud data is more accurate and can be used as the reference data to calculate the error between the two. After the error between the first point cloud data and the second point cloud data is calculated, the omnidirectional point cloud data can be corrected with the error, making the omnidirectional point cloud data more accurate. In this way, the high-precision laser ranging module complements the single-line lidar, achieving high-precision point cloud data with a low-cost solution.
In an optional implementation of this embodiment, the main control unit also processes the image data to obtain a corresponding panorama.
The image acquisition module 103 can capture multiple images, for example six, while the data acquisition device 100 rotates one circle in the first direction, with a 60-degree interval in the first direction between the capture angles of adjacent images. The main control unit can stitch the undistorted color images of the various viewing angles into a low-resolution, boundary-preserving panorama, so that the collecting user can confirm while collecting data whether the collection task has been completed. For the panorama stitching method, refer to the prior art, which is not repeated here; a minimal off-the-shelf sketch follows below.
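Since stitching is deferred to the prior art, this minimal sketch uses OpenCV's off-the-shelf stitcher; the file names are hypothetical placeholders for the six 60-degree-spaced captures.

```python
import cv2

# assumes the six captures exist on disk as capture_0.jpg ... capture_5.jpg
images = [cv2.imread(f"capture_{k}.jpg") for k in range(6)]
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, pano = stitcher.stitch(images)
if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", pano)  # preview for the collecting user
else:
    print("stitching failed; consider re-collecting:", status)
```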
In an optional implementation of this embodiment, the main control unit also obtains a three-dimensional scene model by processing the omnidirectional point cloud data and the image data.
The main control unit can perform a series of three-dimensional scene modeling steps on the omnidirectional point cloud data and the multiple images captured by the image acquisition module 103 to obtain a three-dimensional scene model, which can also be used by the collecting user to confirm while collecting data whether the collection task has been completed. For the three-dimensional scene modeling method, refer to the prior art, which is not repeated here.
When the area of the three-dimensional scene is large, data can be collected at multiple collection points. The data acquisition device 100 can store the collected data locally and upload it to the cloud after the data collection at the multiple collection points has finished; the cloud combines the data collected at the multiple collection points and uses more refined algorithms to obtain a complete three-dimensional scene model.
Fig. 5 shows a schematic diagram of the process of collecting data with the data acquisition device and performing three-dimensional scene reconstruction according to an embodiment of the present disclosure. As shown in Fig. 5, the data acquisition device collects data at N collection points, and the data collected at each collection point can include one omnidirectional three-dimensional point cloud and six color images. After these data are transmitted to the cloud over the network, they can be processed by a cloud three-dimensional reconstruction engine into N panoramas and one textured model. The textured model and the N panoramas can be published on the web for users to roam the real scene, for example VR house viewing. The embodiments of the present disclosure can be applied to many scenarios, such as home decoration. In a home-decoration scenario, the data acquisition device of the embodiments of the present disclosure can collect data in a room to be decorated, and the room can be three-dimensionally reconstructed from the collected data to obtain a textured model and panoramas of the room to be decorated. The textured model and the panoramas can be sent to the relevant personnel of the current decoration stage, such as interior designers and furniture designers, who can use them to formulate decoration plans, furniture designs, and the like for the room, which can speed up the whole decoration process and save decoration costs.
In some embodiments, the flow of data collection by the data acquisition device can be as follows:
1. Place the data acquisition device on a tripod at a collection point and power it on; a control terminal can connect to the data acquisition device over WiFi/Bluetooth, open the control software on the device, create a new collection project, and start the collection flow. 2. After receiving the instruction to start collection, the data acquisition device begins rotating and collecting data in the three-dimensional scene; wait until the device finishes collecting at the current collection point. 3. If collection is reported successful, the device can be moved to the next collection point and step 2 repeated; if collection is reported failed, the collected data is incomplete or failed to match the existing map, and it is recommended to adjust the position and re-collect. 4. After all data have been collected, upload the data to the cloud, where the three-dimensional scene reconstruction is performed.
The flow of data collection by the data acquisition device in step 2 above is as follows:
The micro control unit starts the sensors such as the first ranging module, the image acquisition module, and the second ranging module. The micro control unit controls the rotation module to rotate quickly to the 0° rotation angle and stop, and the image acquisition module captures image 0. The micro control unit then controls the rotation module to rotate slowly to the 60° rotation angle and stop; during the rotation the first ranging module and the second ranging module range continuously, the first ranging data and second ranging data they return are obtained in real time, and the micro control unit simultaneously transfers the obtained data to the main control unit for processing, for example using the first ranging data to generate point cloud data. The above steps are repeated so that the image acquisition module captures image k at the rotation angle (60*k)° (0 ≤ k ≤ 5), while the first ranging module and the second ranging module collect ranging data as the rotation module rotates from (60*k)° to (60*(k+1))°. When the rotation module returns to the 0° rotation angle, the collection process ends. The main control unit can verify the integrity of the data and check whether the data collected at this collection point is successfully stitched with the existing map; if this fails, the user can be prompted to re-collect.
The details of the embodiments of the present disclosure are described in detail below through specific embodiments.
Fig. 6 shows a flowchart of a data correction method according to an embodiment of the present disclosure. As shown in Fig. 6, the data correction method includes the following steps:
in step S601, first ranging data and second ranging data are obtained, where the first ranging data and the second ranging data are collected by the first ranging module and the second ranging module on the above data acquisition device, respectively;
in step S602, first point cloud data is obtained according to the second ranging data, the first point cloud data including the three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data;
in step S603, second point cloud data corresponding to the target points is obtained from the omnidirectional point cloud data;
in step S604, error data is determined according to the first point cloud data and the second point cloud data, and the omnidirectional point cloud data is corrected according to the error data.
In this embodiment, for the details of the first ranging data and the second ranging data, refer to the description of the data acquisition device in the above embodiments. It is understandable that the data correction method can be executed by the main control unit on the data acquisition device or on other electronic devices, which can obtain the first ranging data and the second ranging data from the data acquisition device.
While the data acquisition device rotates one circle in the first direction, the second ranging module continuously emits a laser beam, receives the beam reflected by object surfaces in the three-dimensional scene, and computes the distance data from points on the object surfaces to the second ranging module. After the second ranging module has rotated one circle in the first direction with the three-dimensional acquisition device, a group of second ranging data can be measured, which can include the distance data of the points on object surfaces swept by the laser beam of the second ranging module across the three-dimensional scene once in the horizontal rotation plane; these points can be called target points. The three-dimensional space coordinates of these target points in the global coordinate system can be determined from this group of second ranging data, and these coordinates can constitute the first point cloud data.
As described above, the omnidirectional laser ranging data obtained from the multiple groups of first ranging data covers the points on object surfaces in the three-dimensional scene in all directions, so the second point cloud data corresponding to the above target points can be extracted from the omnidirectional laser ranging data; that is, the three-dimensional space coordinates of the target points are extracted from the omnidirectional laser ranging data to form the second point cloud data.
Since the first point cloud data and the second point cloud data correspond to the same target points, and the second point cloud data is obtained by the more accurate second ranging module, the second point cloud data is more accurate and can be used as the reference data to calculate the error between the two. After the error between the first point cloud data and the second point cloud data is calculated, the omnidirectional point cloud data can be corrected with the error, making the omnidirectional point cloud data more accurate. In this way, the high-precision laser ranging module complements the single-line lidar, obtaining high-precision point cloud data with a low-cost solution.
The following are apparatus embodiments of the present disclosure, which can be used to execute the method embodiments of the present disclosure.
A data correction apparatus according to an embodiment of the present disclosure can be implemented as part or all of an electronic device through software, hardware, or a combination of the two. The data correction apparatus includes:
a first obtaining module, configured to obtain first ranging data and second ranging data, where the first ranging data and the second ranging data are collected by the first ranging module and the second ranging module on the above data acquisition device, respectively;
a second obtaining module, configured to obtain first point cloud data according to the second ranging data, the first point cloud data including the three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data;
an extraction module, configured to obtain second point cloud data corresponding to the target points from the omnidirectional point cloud data;
a determining module, configured to determine error data according to the first point cloud data and the second point cloud data, and to correct the omnidirectional point cloud data according to the error data.
The data correction apparatus in this embodiment corresponds to the data correction method in the embodiment shown in Fig. 6; for specific details, refer to the above description of the data correction method, which is not repeated here.
Fig. 7 is a schematic structural diagram of an electronic device suitable for implementing the data correction method according to an embodiment of the present disclosure.
As shown in Fig. 7, the electronic device 700 includes a processing unit 701, which can be implemented as a CPU, GPU, FPGA, NPU, or other processing unit. The processing unit 701 can execute various processes in the embodiments of any of the above methods of the present disclosure according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage part 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data required for the operation of the electronic device 700. The processing unit 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input part 706 including a keyboard, a mouse, and the like; an output part 707 including a cathode ray tube (CRT) or liquid crystal display (LCD), speakers, and the like; a storage part 708 including a hard disk and the like; and a communication part 709 including a network interface card such as a LAN card or a modem. The communication part 709 performs communication processing via a network such as the Internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 710 as needed, so that a computer program read from it can be installed into the storage part 708 as needed.
In particular, according to the embodiments of the present disclosure, any of the methods described above with reference to the embodiments of the present disclosure can be implemented as a computer software program. For example, the embodiments of the present disclosure include a computer program product, which includes a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for executing any of the methods in the embodiments of the present disclosure. In such embodiments, the computer program can be downloaded and installed from a network through the communication part 709 and/or installed from the removable medium 711.
The flowcharts and block diagrams in the figures illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram can represent a module, program segment, or part of code, which contains one or more executable instructions for realizing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks can occur in an order different from that marked in the figures; for example, two blocks shown in succession can actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. Each block in the block diagrams and/or flowcharts, and combinations of such blocks, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units or modules involved in the embodiments described in the present disclosure can be implemented in software or in hardware. The described units or modules can also be provided in a processor, and the names of these units or modules do not, under certain circumstances, constitute a limitation on the units or modules themselves.
As another aspect, the present disclosure also provides a computer-readable storage medium, which can be the computer-readable storage medium included in the apparatus described in the above embodiments, or a computer-readable storage medium that exists separately and is not assembled into a device. The computer-readable storage medium stores one or more programs, and the programs are used by one or more processors to execute the methods described in the present disclosure.
The above description covers only the preferred embodiments of the present disclosure and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions formed by replacing the above features with technical features of similar functions disclosed in (but not limited to) the present disclosure.

Claims (18)

  1. A data acquisition device, comprising: a rotation module, a first ranging module, and an image acquisition module; wherein,
    the rotation module is configured to drive the data acquisition device to rotate in a first direction;
    the first ranging module is adapted to rotate with the data acquisition device in the first direction, is further adapted to rotate in a second direction, and is further adapted to measure first ranging data, wherein the first direction and the second direction are different;
    the image acquisition module is adapted to rotate with the data acquisition device in the first direction, and is further adapted to collect image data in a three-dimensional scene.
  2. The data acquisition device according to claim 1, wherein the first direction and the second direction are perpendicular to each other.
  3. The data acquisition device according to claim 1 or 2, further comprising: a second ranging module configured to obtain second ranging data; a ranging error of the second ranging module is smaller than a ranging error of the first ranging module.
  4. The data acquisition device according to claim 3, wherein, while the rotation module drives the data acquisition device through one full rotation in the first direction, the second ranging module rotates in the first direction and measures the second ranging data.
  5. The data acquisition device according to claim 3, wherein the rotation module is arranged below the data acquisition device, the first ranging module is arranged on a first side of the data acquisition device, the second ranging module is arranged on a second side of the data acquisition device, and a plane of the first side intersects a plane of the second side; a lens direction of the image acquisition module is opposite to a ranging direction of the second ranging module.
  6. The data acquisition device according to any one of claims 1-2 and 4-5, wherein the first ranging module and the second ranging module are both laser ranging modules.
  7. The data acquisition device according to claim 5, wherein the first ranging module is a single-line lidar.
  8. The data acquisition device according to any one of claims 1-2 and 4-5, wherein a lens center of the image acquisition module is located on an extension line of a rotation axis of the rotation module.
  9. The data acquisition device according to any one of claims 1-2 and 4-5, wherein the image acquisition module collects the image data after rotating to a preset rotation angle in the first direction.
  10. The data acquisition device according to claim 3, further comprising: a micro control unit and a main control unit; wherein,
    the micro control unit is connected to the rotation module, the first ranging module, and the second ranging module, respectively, and the micro control unit is configured to control the rotation module and to obtain, in real time, a rotation angle of the rotation module, the first ranging data, and the second ranging data; the micro control unit is further configured to time-synchronize the obtained rotation angle, first ranging data, and second ranging data and to output them to the main control unit;
    the main control unit is connected to the image acquisition module, obtains the image data from the image acquisition module, and processes the rotation angle, the first ranging data, and the second ranging data received from the micro control unit.
  11. The data acquisition device according to claim 10, wherein the main control unit obtains omnidirectional point cloud data in the three-dimensional scene by processing multiple groups of the first ranging data, the omnidirectional point cloud data including three-dimensional space coordinates, in the three-dimensional scene, of measured points on object surfaces; the multiple groups of the first ranging data include data collected while the first ranging module rotates one full circle in the first direction with the data acquisition device and spins multiple circles in the second direction.
  12. The data acquisition device according to claim 10 or 11, wherein the main control unit further uses the second ranging data to perform error correction on the omnidirectional point cloud data, in the following way:
    the main control unit obtains first point cloud data according to the second ranging data, the first point cloud data including three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data, and the main control unit obtains, from the omnidirectional point cloud data, second point cloud data corresponding to the target points;
    the main control unit determines error data according to the first point cloud data and the second point cloud data, and corrects the omnidirectional point cloud data according to the error data.
  13. The data acquisition device according to claim 10 or 11, wherein the main control unit further processes the image data to obtain a corresponding panorama.
  14. The data acquisition device according to claim 11, wherein the main control unit further obtains a three-dimensional scene model by processing the omnidirectional point cloud data and the image data.
  15. A data correction method, comprising:
    obtaining first ranging data and second ranging data, wherein the first ranging data and the second ranging data are collected by the first ranging module and the second ranging module on the data acquisition device according to any one of claims 3-14, respectively;
    obtaining first point cloud data according to the second ranging data, the first point cloud data including three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data;
    obtaining, from omnidirectional point cloud data, second point cloud data corresponding to the target points;
    determining error data according to the first point cloud data and the second point cloud data, and correcting the omnidirectional point cloud data according to the error data.
  16. A data correction apparatus, comprising:
    a first obtaining module, configured to obtain first ranging data and second ranging data, wherein the first ranging data and the second ranging data are collected by the first ranging module and the second ranging module on the data acquisition device according to any one of claims 3-14, respectively;
    a second obtaining module, configured to obtain first point cloud data according to the second ranging data, the first point cloud data including three-dimensional space coordinates of target points on object surfaces corresponding to the second ranging data;
    an extraction module, configured to obtain, from omnidirectional point cloud data, second point cloud data corresponding to the target points;
    a determining module, configured to determine error data according to the first point cloud data and the second point cloud data, and to correct the omnidirectional point cloud data according to the error data.
  17. An electronic device, comprising a memory and a processor; wherein,
    the memory is configured to store one or more computer instructions, and the one or more computer instructions are executed by the processor to implement the method according to claim 15.
  18. A computer-readable storage medium having computer instructions stored thereon, wherein the computer instructions, when executed by a processor, implement the method according to claim 15.
PCT/CN2021/082315 2020-03-24 2021-03-23 数据采集设备及数据校正方法、装置、电子设备 WO2021190485A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21774125.5A EP4130651A4 (en) 2020-03-24 2021-03-23 DATA COLLECTION DEVICE, DATA CORRECTION METHOD AND DEVICE, AND ELECTRONIC DEVICE
US17/816,842 US20230012240A1 (en) 2020-03-24 2022-08-02 Data acquisition device, data correction method and apparatus, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010214154.6A 2020-03-24 2020-03-24 Data acquisition device, data correction method and apparatus, and electronic device
CN202010214154.6 2020-03-24

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/816,842 Continuation US20230012240A1 (en) 2020-03-24 2022-08-02 Data acquisition device, data correction method and apparatus, and electronic device

Publications (1)

Publication Number Publication Date
WO2021190485A1 true WO2021190485A1 (zh) 2021-09-30

Family

ID=77807458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/082315 WO2021190485A1 (zh) Data acquisition device, data correction method and apparatus, and electronic device

Country Status (4)

Country Link
US (1) US20230012240A1 (zh)
EP (1) EP4130651A4 (zh)
CN (1) CN113446956B (zh)
WO (1) WO2021190485A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113960574A (zh) * 2021-10-27 2022-01-21 合肥英睿系统技术有限公司 Laser ranging calibration method, apparatus, device, and storage medium
CN114125434A (zh) * 2021-11-26 2022-03-01 重庆盛泰光电有限公司 3D correction device for a TOF camera
CN114894167B (zh) * 2022-04-06 2024-01-30 西北工业大学 Automatic cave surveying and mapping system and method based on multi-sensor technology
CN114581611B (zh) * 2022-04-28 2022-09-20 阿里巴巴(中国)有限公司 Virtual scene construction method and apparatus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06147844A (ja) 1992-09-18 1994-05-27 East Japan Railway Co Apparatus and method for measuring the clearance of overhead lines
CN102232176A (zh) 2009-03-25 2011-11-02 法罗技术股份有限公司 Method for optically scanning and measuring an environment
CN105067023A (zh) 2015-08-31 2015-11-18 中国科学院沈阳自动化研究所 Panoramic three-dimensional laser sensor data calibration method and apparatus
CN105928457A (zh) 2016-06-21 2016-09-07 大连理工大学 Omnidirectional three-dimensional laser color scanning system and method
CN108603933A (zh) 2016-01-12 2018-09-28 三菱电机株式会社 System and method for fusing sensor outputs with different resolutions
CN110196431A (zh) 2019-07-09 2019-09-03 南京信息工程大学 ARM-based low-cost indoor 3D laser scanning and ranging system and method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101008571A (zh) 2007-01-29 2007-08-01 中南大学 Three-dimensional environment perception method for a mobile robot
CN204679637U (zh) 2015-05-25 2015-09-30 北京雷动云合智能技术有限公司 High-precision CCD multi-point ranging device with dual-laser calibration
CN105445749A (zh) 2015-11-13 2016-03-30 中国人民解放军空军装备研究院雷达与电子对抗研究所 Multi-pulse laser ranging system and method based on wavelength division
JP6771994B2 (ja) 2016-08-17 2020-10-21 株式会社トプコン Measuring method and laser scanner
CN107376283A (zh) 2017-08-25 2017-11-24 江苏理工学院 Tension-based basketball shooting skill trainer
CN108267748A (zh) 2017-12-06 2018-07-10 香港中文大学(深圳) Omnidirectional three-dimensional point cloud map generation method and system
CN207557478U (zh) 2017-12-25 2018-06-29 深圳市杉川机器人有限公司 Laser ranging device and robot
CN209625377U (zh) 2018-12-28 2019-11-12 广州运维电力科技有限公司 Video monitoring device for protecting cables against external damage
CN109633669A (zh) 2018-12-28 2019-04-16 湘潭大学 Method for improving ranging accuracy in welding using dual-band lasers
CN113566762A (zh) 2020-04-28 2021-10-29 上海汽车集团股份有限公司 Height measurement device and method

Also Published As

Publication number Publication date
EP4130651A4 (en) 2023-08-23
EP4130651A1 (en) 2023-02-08
CN113446956B (zh) 2023-08-11
US20230012240A1 (en) 2023-01-12
CN113446956A (zh) 2021-09-28

Similar Documents

Publication Publication Date Title
WO2021190485A1 (zh) Data acquisition device, data correction method and apparatus, and electronic device
US10896497B2 (en) Inconsistency detecting system, mixed-reality system, program, and inconsistency detecting method
US11032527B2 (en) Unmanned aerial vehicle surface projection
US8699005B2 (en) Indoor surveying apparatus
US8982118B2 (en) Structure discovery in a point cloud
GB2591857A (en) Photographing-based 3D modeling system and method, and automatic 3D modeling apparatus and method
WO2022077296A1 (zh) Three-dimensional reconstruction method, gimbal payload, movable platform, and computer-readable storage medium
CN110383004A (zh) Information processing device, aerial photography path generation method, program, and recording medium
US11403499B2 (en) Systems and methods for generating composite sets of data from different sensors
US20210264666A1 (en) Method for obtaining photogrammetric data using a layered approach
JP2020510903A (ja) Tracking image collection for digital capture of environments, and associated systems and methods
JP2019027960A (ja) Method for creating longitudinal sectional views from three-dimensional point cloud data, surveying data processing device therefor, and surveying system
EP4121944A1 (en) Machine vision determination of location based on recognized surface features and use thereof to support augmented reality
US20230324556A1 (en) Support system for mobile coordinate scanner
US11943539B2 (en) Systems and methods for capturing and generating panoramic three-dimensional models and images
WO2022078437A1 (zh) Three-dimensional processing device and method between moving objects
Frueh Automated 3D model generation for urban environments
WO2022078438A1 (zh) Indoor 3D information acquisition device
RU2723239C1 (ru) System for obtaining a realistic terrain model for a virtual world and method of its operation
CN113888702A (zh) Device and method for indoor high-precision real-time modeling and spatial positioning based on multiple TOF lidars and an RGB camera
Fernández-Hernandez et al. A new trend for reverse engineering: Robotized aerial system for spatial information management
WO2020107487A1 (zh) Image processing method and unmanned aerial vehicle
CN112672134A (zh) Mobile-terminal-based three-dimensional information acquisition control device and method
US20240176025A1 (en) Generating a parallax free two and a half (2.5) dimensional point cloud using a high resolution image
Hadsell et al. Complex terrain mapping with multi-camera visual odometry and realtime drift correction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21774125

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021774125

Country of ref document: EP

Effective date: 20221024