CN113077509A - Space mapping calibration method and space mapping system based on synchronous positioning and mapping - Google Patents

Space mapping calibration method and space mapping system based on synchronous positioning and mapping

Info

Publication number
CN113077509A
CN113077509A (application CN202010004109.8A)
Authority
CN
China
Prior art keywords
mapping
robot
target point
space
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010004109.8A
Other languages
Chinese (zh)
Inventor
吴奕旻
吕昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yitu Information Technology Co ltd
Original Assignee
Shanghai Yitu Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yitu Information Technology Co ltd filed Critical Shanghai Yitu Information Technology Co ltd
Priority to CN202010004109.8A priority Critical patent/CN113077509A/en
Publication of CN113077509A publication Critical patent/CN113077509A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808 Evaluating distance, position or velocity data

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The application provides a spatial mapping calibration method and a spatial mapping system based on synchronous positioning and mapping (simultaneous localization and mapping, SLAM). A robot localizes and maps a space by a SLAM method to obtain a map of the space; target point locations are marked on the map to generate a target point sequence; the robot visits the target point locations in turn, obtaining the physical space coordinates of each target point location on arrival while its position in the camera image is determined to obtain camera pixel coordinates, until the robot has traversed all target point locations in the sequence; the mapping relationship between the physical space coordinates of all target point locations and the camera pixel coordinates is determined to obtain a mapping relationship model of physical space coordinates and camera pixel coordinates; and the spatial mapping is calibrated based on the mapping relationship model. The method and system reduce the cost of manual surveying and improve the accuracy of the mapping result.

Description

Space mapping calibration method and space mapping system based on synchronous positioning and mapping
Technical Field
The present application relates to the field of robots, and in particular to a spatial mapping calibration method and a spatial mapping calibration system based on synchronous positioning and mapping, i.e., simultaneous localization and mapping (SLAM).
Background
At present, when the actual spatial coordinates of an object in an overhead camera's image are determined manually, an operator cannot simply and directly compute them from the pixel coordinates of the object in the image, because of camera lens distortion and optical perspective. In the prior art, the correspondence between physical coordinates in the space and pixel coordinates in the overhead camera image is obtained by field measurement, usually performed manually: for example, a worker walks back and forth within the coverage area of an overhead camera; the pixel coordinates of a camera-recognizable marker on the worker's shoulder are located in the camera image while the worker records the current physical space coordinates, forming one pair of corresponding measurements; the worker repeats this many times to form multiple pairs of measurement data with a mapping relationship, and the correspondence is then established by an algorithm.
Although this measurement approach can accomplish spatial mapping, the worker's body easily sways during measurement, causing the sampled point to drift in the camera image; human error is introduced when the current physical space coordinates are recorded; and random sampling leaves the sample points insufficiently dense. As a result, the collected data contains large errors, the overall mapping result is unsatisfactory, and the application of the mapping result is directly affected.
Disclosure of Invention
In view of this, the present application provides a spatial mapping calibration method and a spatial mapping calibration system based on synchronous positioning and mapping, which can solve the problems of the high cost of manual mapping and inaccurate mapping data in existing spatial mapping.
To solve this technical problem, the present application adopts the following technical scheme:
In a first aspect, an embodiment of the present application provides a spatial mapping calibration method based on synchronous positioning and mapping, including: localizing and mapping a space with a robot by a SLAM method to obtain a map of the space, wherein the robot automatically explores a 2D map of the space; marking target point locations on the map to generate a target point sequence, wherein the target point locations are as dense as practical, for example selected at 30 cm intervals or denser; visiting the target point locations in sequence with the robot, obtaining the physical space coordinates of each target point location through the robot on arrival, and simultaneously determining the position of the robot in the camera image to obtain camera pixel coordinates, until the robot has traversed all target point locations in the sequence, wherein the pixel coordinates of the robot in the camera image are marked by a visual tag, for example an AprilTag or a two-dimensional code; determining the mapping relationship between the physical space coordinates of all target point locations and the camera pixel coordinates to obtain a mapping relationship model of the physical space coordinates and the camera pixel coordinates; and calibrating the spatial mapping based on the mapping relationship model.
Preferably, the robot uses a lidar for localization and mapping.
Preferably, the lidar is a single-line or multi-line lidar.
Preferably, the interval between target point locations may be set between 5 and 50 cm, and preferably to 30 cm, although other embodiments of the present application may be configured with intervals of less than 5 cm or greater than 50 cm, which is not limited here.
Preferably, the pixel coordinates of the robot in the camera image and its physical three-dimensional space coordinates are determined through a visual tag on the robot.
In a second aspect, an embodiment of the present application provides a spatial mapping calibration system, including a robot, a camera for photographing the robot, and a control device for controlling the robot, where the robot includes:
a localization and mapping module for localizing and mapping the space by a SLAM method to obtain a map of the space;
the control device is used to mark target point locations on the map to generate a target point sequence and to control the robot to visit the target point locations in sequence;
the camera is used to photograph the robot when it reaches each target point location and to record the pixel coordinates of the robot in the camera image;
the robot reports its physical space coordinates to the control device through the localization and mapping module when it reaches a target point location; the control device is further used to determine the mapping relationship between the physical space coordinates of all target point locations and the camera pixel coordinates, obtain a mapping relationship model of the physical space coordinates and the camera pixel coordinates, and calibrate spatial mapping based on the mapping relationship model.
Preferably, the localization and mapping module performs localization and mapping using a lidar.
Preferably, the lidar is a single-line or multi-line lidar.
Preferably, the interval between target point locations may be set between 5 and 50 cm, and preferably to 30 cm, although other embodiments of the present application may be configured with intervals of less than 5 cm or greater than 50 cm, which is not limited here.
Preferably, the pixel coordinates in the camera image and the physical three-dimensional space coordinates are determined through a visual tag on the localization and mapping module.
The technical scheme of the application has at least one of the following beneficial effects:
according to the space surveying and mapping calibration method based on synchronous positioning and mapping and the space surveying and mapping robot, the robot can automatically carry out space surveying and mapping through the robot, the robot is accurate in walking, and positioning errors caused by artificial shaking are avoided, so that the accuracy of space surveying and mapping results is improved, and the problems of high cost and inaccuracy of surveying and mapping data in the conventional space surveying and mapping process are solved.
Drawings
FIG. 1 is a flowchart of a spatial mapping calibration method based on synchronous positioning and mapping according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a spatial mapping system according to an embodiment of the present application.
Detailed Description
In order that the content of the present application may be more clearly understood, specific embodiments of the present application will be described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present application but are not intended to limit the scope of the present application.
The spatial mapping calibration method based on synchronous positioning and mapping according to an embodiment of the present application is described below with reference to the accompanying drawings. FIG. 1 shows a flowchart of the spatial mapping calibration method; as shown in FIG. 1, the method includes the following steps:
and step S110, positioning and drawing the space by using the robot through a synchronous positioning and drawing method to obtain a map of the space.
Specifically, the robot uses a SLAM method, localizing and mapping through its lidar in the space where it is located, so as to discover all explorable areas and draw a 2D map of that space. The robot can be started remotely through a worker's handheld device, so that after startup it automatically draws a preliminary 2D map using SLAM.
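By way of illustration only (the patent does not name a software framework), the sketch below assumes a ROS 1 environment in which the robot's SLAM stack publishes the finished 2D map as a standard nav_msgs/OccupancyGrid on the conventional /map topic; the node name is an arbitrary choice, not part of the patent.

```python
# Minimal sketch: receive the 2D occupancy grid built by the SLAM stack.
import numpy as np
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(msg):
    # Occupancy convention: -1 unknown, 0 free, 100 occupied.
    grid = np.asarray(msg.data, dtype=np.int8).reshape(
        msg.info.height, msg.info.width)
    res = msg.info.resolution                      # metres per cell
    origin = (msg.info.origin.position.x, msg.info.origin.position.y)
    rospy.loginfo("map %dx%d, %.3f m/cell, origin %s",
                  msg.info.width, msg.info.height, res, origin)

rospy.init_node("map_reader")
rospy.Subscriber("/map", OccupancyGrid, on_map)
rospy.spin()
```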
Step S120: marking target point locations on the map to generate a target point sequence. Target point locations are generated at fixed intervals over the 2D map obtained in the previous step; the interval may be set to 5-50 cm, and other values may also be set, which is not limited here. For example, one target point location is marked every 30 cm, so that the target point locations uniformly cover the 2D map of the space. In addition, the user may add or delete points as needed; for example, when the user judges that some area of the 2D map needs more measurement, target point locations can be added there manually, ensuring the accuracy of the spatial mapping.
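A minimal sketch of this grid-marking step follows, assuming the occupancy grid and metadata from step S110; the 30 cm default and the free-cell test mirror the description above, while the function name is purely illustrative.

```python
import numpy as np

def make_targets(grid, resolution, origin, spacing_m=0.30):
    """Generate target point locations over the free cells of a 2D map.

    grid       -- 2D int8 array (-1 unknown, 0 free, 100 occupied)
    resolution -- metres per cell
    origin     -- (x, y) world coordinates of cell (0, 0)
    """
    step = max(1, int(round(spacing_m / resolution)))  # cells per 30 cm
    targets = []
    for row in range(0, grid.shape[0], step):
        for col in range(0, grid.shape[1], step):
            if grid[row, col] == 0:                    # free space only
                targets.append((origin[0] + col * resolution,
                                origin[1] + row * resolution))
    return targets  # the target point sequence, open to manual add/delete
```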
Step S130: visiting the target point locations in sequence with the robot, obtaining the physical space coordinates of each target point location on arrival, and simultaneously determining the position of the robot in the camera image to obtain camera pixel coordinates, until the robot has traversed all target point locations in the sequence. While walking to each target point location in turn, the robot obtains its physical space coordinates through its built-in navigation components and a simultaneous localization and mapping (SLAM) method; at the same time, the camera photographs a visual tag on the robot, for example an AprilTag marker, and acquires the pixel coordinates of the AprilTag in the camera image. After these two sets of data are acquired as the coordinate pair of the current target point location, the robot walks to the next target point location, and the process repeats until the robot has acquired the coordinate pairs of all target point locations.
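The collection loop might look like the sketch below. Here navigate_to, get_robot_pose and detect_tag_center are assumed helpers standing in for the robot's navigation stack, its SLAM pose estimate and the camera-side tag detector, none of which the patent pins down; the JSON output anticipates the storage format suggested in step S140.

```python
import json

def collect_pairs(targets, navigate_to, get_robot_pose, detect_tag_center,
                  out_path="coordinate_pairs.json"):
    """Visit each target point and record one (physical, pixel) coordinate pair."""
    pairs = []
    for x, y in targets:
        navigate_to(x, y)                  # drive to the current target point
        wx, wy = get_robot_pose()          # physical coordinates from SLAM
        px, py = detect_tag_center()       # tag centre in the camera image
        pairs.append({"world": [wx, wy], "pixel": [px, py]})
    with open(out_path, "w") as f:
        json.dump(pairs, f, indent=2)      # one coordinate pair per target
    return pairs
```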
Step S140: determining the mapping relationship between the physical space coordinates of all target point locations and the camera pixel coordinates to obtain a mapping relationship model of the physical space coordinates and the camera pixel coordinates. After the robot has walked through all target point locations, a complete set of target point coordinate pairs is available: for each target point location, the robot's physical space coordinates at that point and the pixel coordinates of the robot's visual tag (e.g., AprilTag or two-dimensional code) in the camera image are stored together as one coordinate pair, for which the JSON format may be chosen. The mapping relationship model of physical space coordinates and camera pixel coordinates is then computed from the coordinate pairs obtained at all target point locations.
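The patent leaves the form of the mapping relationship model open. One plausible choice, assuming the tag centre stays on a single ground plane, is a planar homography fitted with OpenCV, as sketched below from the JSON coordinate pairs of step S130.

```python
import json
import cv2
import numpy as np

with open("coordinate_pairs.json") as f:
    pairs = json.load(f)

world = np.float32([p["world"] for p in pairs])  # physical floor coordinates
pixel = np.float32([p["pixel"] for p in pairs])  # tag centres in the image

# RANSAC tolerates occasional tag mis-detections among the pairs.
H, inlier_mask = cv2.findHomography(world, pixel, cv2.RANSAC, 5.0)
print("world->pixel homography:\n", H)
print("inlier ratio:", float(inlier_mask.mean()))
```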
Step S150: calibrating the spatial mapping based on the mapping relationship model to obtain a higher-precision spatial mapping result.
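Applied in the inverse direction, the same model answers the question raised in the background section, recovering physical floor coordinates for an arbitrary pixel in the overhead camera image; a small sketch under the homography assumption above:

```python
import cv2
import numpy as np

def pixel_to_world(H, u, v):
    """Map an image pixel back to physical floor coordinates."""
    H_inv = np.linalg.inv(H)              # invert the world->pixel model
    pt = np.float32([[[u, v]]])           # shape (1, 1, 2) as OpenCV expects
    wx, wy = cv2.perspectiveTransform(pt, H_inv)[0, 0]
    return float(wx), float(wy)
```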
According to one embodiment of the application, the robot uses a lidar for localization and mapping; preferably, the lidar is a single-line or multi-line lidar, with which an accurate map can be obtained.
According to an embodiment of the application, the position of the robot in the camera image is determined through a visual tag on the robot, such as an AprilTag, to obtain camera pixel coordinates. The AprilTag marker is located on the top of the robot and can be arranged facing the camera, so that the camera can accurately photograph the marker and record its coordinate pair, which comprises the pixel coordinates in the camera image and the physical three-dimensional space coordinates. A mapping relationship model of the physical space coordinates and the camera pixel coordinates is then obtained from their mapping relationship, and the spatial mapping is calibrated based on the mapping relationship model.
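For the detection side, one option (an assumption; the patent names AprilTag but no particular library) is the aruco module of OpenCV, which ships an AprilTag 36h11 dictionary:

```python
import cv2

# Assumes the robot carries a tag from the AprilTag 36h11 family and the
# legacy cv2.aruco API (OpenCV <= 4.6); newer releases use ArucoDetector.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
frame = cv2.imread("camera_frame.png")            # one overhead camera image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
if ids is not None:
    center = corners[0][0].mean(axis=0)           # pixel centre of the tag
    print("tag", int(ids[0]), "at pixel", center)
```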
Thus, through the steps of the spatial mapping calibration method based on synchronous positioning and mapping described above, the robot performs spatial mapping automatically; it moves precisely and avoids the positioning errors caused by human sway, improving the accuracy of spatial mapping results and solving the problems of costly manual mapping and inaccurate mapping data in existing spatial mapping.
Based on the above description, the spatial mapping system of the present application is described below with reference to a specific embodiment. FIG. 2 is a schematic structural diagram of the spatial mapping system of the present application; as shown in FIG. 2, the system includes a robot 1002, a camera 1003 for photographing the robot, and a control device 1001 for controlling the robot, and the robot 1002 includes:
a localization and mapping module for localizing and mapping the space by a SLAM method to obtain a map of the space;
the control device is used to mark target point locations on the map to generate a target point sequence and to control the robot 1002 to visit the target point locations in sequence;
the camera 1003 is used to photograph the robot 1002 when it reaches each target point location and to record the pixel coordinates of the robot 1002 in the camera image;
when the robot 1002 reaches a target point location, it obtains its physical space coordinates through the localization and mapping module and reports them to the control device; the control device is further configured to determine the mapping relationship between the physical space coordinates of all target point locations and the camera pixel coordinates, obtain a mapping relationship model of the physical space coordinates and the camera pixel coordinates, and calibrate spatial mapping based on the mapping relationship model.
Preferably, the localization and mapping module performs localization and mapping using a lidar; preferably, the lidar is a single-line or multi-line lidar.
Preferably, the distance between target point locations is 30 cm; in other embodiments of the present application it may be higher or lower than 30 cm, for example 40 cm or 20 cm, and it may be a distance pre-stored in the system or set by a worker as needed, which is not limited here.
According to an embodiment of the application, the robot 1002 is located in the view of the camera 1003 through a visual tag on the localization and mapping module, such as an AprilTag marker, to obtain camera pixel coordinates; the AprilTag markers may be provided on the body of the robot.
It should be noted that the workflow of each component of the spatial mapping system provided in this embodiment of the present application has been described in detail in the above embodiment; reference may be made to the spatial mapping calibration method of the above embodiment, which is not repeated here.
In the present application, the robot may move on a movable chassis driven by servo motors, which may in turn be driven by a motor driver board; other structures and operations of the spatial mapping system according to the embodiments of the present application are understood and easily implemented by those skilled in the art and will not be described in detail.
Thus, with the spatial mapping calibration method based on synchronous positioning and mapping and the spatial mapping system of the present application, the robot performs spatial mapping automatically; it moves precisely and avoids the positioning errors caused by human sway, improving the accuracy of spatial mapping results and solving the problems of costly manual mapping and inaccurate mapping data in existing spatial mapping.
The foregoing are preferred embodiments of the present application. It should be noted that those skilled in the art may make modifications and refinements without departing from the principles described herein, and such modifications and refinements shall also fall within the scope of protection of the present application.

Claims (10)

1. A spatial mapping calibration method based on synchronous positioning and mapping, characterized by comprising the following steps:
localizing and mapping a space with a robot by a synchronous positioning and mapping method to obtain a map of the space;
marking target point locations on the map to generate a target point sequence;
visiting the target point locations in sequence with the robot, obtaining the physical space coordinates of each target point location through the robot on arrival, and simultaneously determining the position of the robot in a camera image to obtain camera pixel coordinates, until the robot has traversed all the target point locations in the target point sequence;
determining the mapping relationship between the physical space coordinates of all the target point locations and the camera pixel coordinates to obtain a mapping relationship model of the physical space coordinates and the camera pixel coordinates;
and calibrating the spatial mapping based on the mapping relationship model.
2. The spatial mapping calibration method based on synchronous positioning and mapping of claim 1, wherein the robot performs localization and mapping using a lidar.
3. The spatial mapping calibration method based on synchronous positioning and mapping of claim 2, wherein the lidar is a single-line or multi-line lidar.
4. The spatial mapping calibration method based on synchronous positioning and mapping of claim 1, wherein the pixel coordinates and the physical three-dimensional space coordinates of the robot in the camera image are determined by a visual tag on the robot.
5. The method of claim 4, wherein the visual tag is an AprilTag marker or a two-dimensional code.
6. A spatial mapping calibration system, characterized by comprising a robot, a camera for photographing the robot, and a control device for controlling the robot,
the robot includes:
a localization and mapping module for localizing and mapping the space by a synchronous positioning and mapping method to obtain a map of the space;
the control device is used to mark target point locations on the map to generate a target point sequence and to control the robot to visit the target point locations in sequence;
the camera is used to photograph the robot when it reaches each target point location and to record the pixel coordinates of the robot in the camera image;
the robot reports its physical space coordinates to the control device through the localization and mapping module when it reaches a target point location,
the control device is further configured to determine the mapping relationship between the physical space coordinates of all the target point locations and the camera pixel coordinates, obtain a mapping relationship model of the physical space coordinates and the camera pixel coordinates, and calibrate spatial mapping based on the mapping relationship model.
7. The system of claim 6, wherein the localization and mapping module performs localization and mapping using a lidar.
8. The system of claim 7, wherein the lidar is a single-line or multi-line lidar.
9. The system of claim 6, wherein the control device determines the pixel coordinates and the physical three-dimensional space coordinates of the robot in the camera image via a visual tag on the robot.
10. The system of claim 9, wherein the visual tag is an AprilTag marker or a two-dimensional code.
CN202010004109.8A 2020-01-03 2020-01-03 Space mapping calibration method and space mapping system based on synchronous positioning and mapping Pending CN113077509A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010004109.8A CN113077509A (en) 2020-01-03 2020-01-03 Space mapping calibration method and space mapping system based on synchronous positioning and mapping

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010004109.8A CN113077509A (en) 2020-01-03 2020-01-03 Space mapping calibration method and space mapping system based on synchronous positioning and mapping

Publications (1)

Publication Number Publication Date
CN113077509A true CN113077509A (en) 2021-07-06

Family

ID=76608472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010004109.8A Pending CN113077509A (en) 2020-01-03 2020-01-03 Space mapping calibration method and space mapping system based on synchronous positioning and mapping

Country Status (1)

Country Link
CN (1) CN113077509A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114543807A (en) * 2022-01-14 2022-05-27 安徽海博智能科技有限责任公司 High-precision evaluation method for SLAM algorithm in extreme scene

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108734654A (en) * 2018-05-28 2018-11-02 深圳市易成自动驾驶技术有限公司 It draws and localization method, system and computer readable storage medium
CN108803591A (en) * 2017-05-02 2018-11-13 北京米文动力科技有限公司 A kind of ground drawing generating method and robot
CN109308077A (en) * 2018-09-06 2019-02-05 广州极飞科技有限公司 A kind of mapping method based on aircraft, apparatus and system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108803591A (en) * 2017-05-02 2018-11-13 北京米文动力科技有限公司 A kind of ground drawing generating method and robot
CN108734654A (en) * 2018-05-28 2018-11-02 深圳市易成自动驾驶技术有限公司 It draws and localization method, system and computer readable storage medium
CN109308077A (en) * 2018-09-06 2019-02-05 广州极飞科技有限公司 A kind of mapping method based on aircraft, apparatus and system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114543807A (en) * 2022-01-14 2022-05-27 安徽海博智能科技有限责任公司 High-precision evaluation method for SLAM algorithm in extreme scene
CN114543807B (en) * 2022-01-14 2023-10-20 安徽海博智能科技有限责任公司 High-precision evaluation method of SLAM algorithm in extreme scene

Similar Documents

Publication Publication Date Title
US11441899B2 (en) Real time position and orientation tracker
CN107037880A (en) Space orientation attitude determination system and its method based on virtual reality technology
EP3062066A1 (en) Determination of object data by template-based UAV control
EP2914927B1 (en) Visual positioning system
CN108051837A (en) Multiple-sensor integration indoor and outdoor mobile mapping device and automatic three-dimensional modeling method
Hinsken et al. Triangulation of LH systems ADS40 imagery using Orima GPS/IMU
US9207677B2 (en) Vehicle positioning method and its system
EP3086283A1 (en) Providing a point cloud using a surveying instrument and a camera device
KR102016636B1 (en) Calibration apparatus and method of camera and rader
CN109751992B (en) Indoor three-dimensional space-oriented positioning correction method, positioning method and equipment thereof
US20070257836A1 (en) Site survey tracking
CN106541404A (en) A kind of Robot visual location air navigation aid
US20200400431A1 (en) Movable marking system, controlling method for movable marking apparatus, and computer readable recording medium
CN105659107A (en) Optical tracking
CN105116886A (en) Robot autonomous walking method
US20180356222A1 (en) Device, system and method for displaying measurement gaps
CN106851575A (en) The method and locating calibration device of a kind of unified locating base station coordinate system
US10962609B2 (en) Calibration system and method for magnetic tracking in virtual reality systems
CN113077509A (en) Space mapping calibration method and space mapping system based on synchronous positioning and mapping
KR20190063967A (en) Method and apparatus for measuring position using stereo camera and 3D barcode
CN212254095U (en) Land mapping device for territory planning
CN114322990B (en) Acquisition method and device for data for constructing mobile robot map
CN112556681A (en) Visual-based orchard machine navigation positioning method
CN110222552A (en) Positioning system and method and computer-readable storage medium
CN111089568B (en) Road sign calibration instrument based on RTK + camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination