CN113190007A - Map contour correction method, chip and mobile robot - Google Patents

Map contour correction method, chip and mobile robot

Info

Publication number
CN113190007A
Authority
CN
China
Prior art keywords
mobile robot
distance
obstacle
robot
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110501046.1A
Other languages
Chinese (zh)
Inventor
赖钦伟
肖刚军
何再生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202110501046.1A
Publication of CN113190007A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D1/0236 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a map contour correction method, a chip and a mobile robot. The method comprises the following steps: the mobile robot measures the boundary distance of the surrounding environment through a single-point ranging sensor and generates a boundary contour line, wherein the mobile robot stores a grid map of the current environment and the grid map comprises obstacle grids; based on the boundary contour line, the mobile robot aligns the obstacle grids on the grid map, thereby correcting the contour of the grid map. By controlling a high-precision single-point ranging sensor, the method collects the distances to the obstacles surrounding the mobile robot and generates an accurate boundary contour line, which is used to correct the errors of the obstacle grids on the grid map, so the map boundary becomes more accurate. The method requires no complex equipment or algorithm, so its cost is low, its data computation load is small, and its efficiency and practicability are high.

Description

Map contour correction method, chip and mobile robot
Technical Field
The invention relates to the field of SLAM mapping, in particular to a map contour correction method, a chip and a mobile robot.
Background
Visual SLAM (Simultaneous Localization and Mapping) and laser SLAM are key technologies in robot positioning and navigation. SLAM technologies based on cameras and on laser radars each have advantages and disadvantages: a laser radar returns point clouds with high-precision distance information and offers higher robustness, but its cost is extremely high, which is unfavorable for marketization; in contrast, visual SLAM is low in cost, but its robustness is poor, its requirements on environmental illumination are high, and the processed map data are generally sparse, so real scale information cannot be obtained. As a result, the map built by visual SLAM has poor accuracy and relatively poor boundaries.
Disclosure of Invention
In order to solve the above problems, the invention provides a map contour correction method, a chip and a mobile robot; the problem of an inaccurate visual SLAM map boundary can be solved by adding only a single-point ranging sensor. The specific technical scheme of the invention is as follows:
a mobile robot comprising a vision sensor, further comprising: and the single-point distance measuring sensor is arranged on the periphery of the mobile robot body and used for detecting the distance between the mobile robot and the obstacle. Compared with the prior art, the mobile robot provided by the invention is provided with only one single-point distance measuring sensor, the problem of inaccurate boundary of the visual SLAM map is solved, no complex equipment is needed, the cost is lower, and the efficiency and the practicability are higher.
Furthermore, the single-point ranging sensor is arranged on the front side surface of the robot body and used for detecting the distance between the front of the robot and an obstacle; or it is arranged on the left side surface of the robot body and used for detecting the distance between the left side of the robot and an obstacle; or it is arranged on the right side surface of the robot body and used for detecting the distance between the right side of the robot and an obstacle; or it is arranged on the rear side surface of the robot body and used for detecting the distance between the rear of the robot and an obstacle.
Furthermore, a plurality of single-point ranging sensors are arranged at intervals on the side surfaces around the mobile robot body. Multiple single-point ranging sensors help the mobile robot acquire boundary information of the surrounding environment more quickly.
Further, the mobile robot is provided with a collision sensor, a fitting unit and an alignment unit. The collision sensor is used for detecting the position of an obstacle, so that the mobile robot marks an obstacle grid on the grid map according to the collision position; the fitting unit is used for generating a boundary contour line on the grid map according to the distance between the mobile robot and the obstacle; the alignment unit is used for aligning the obstacle grids on the grid map according to the boundary contour line. The mobile robot stores a grid map of the current environment, and the grid map comprises obstacle grids. Using the alignment unit to align the obstacle grids marked via the collision sensor eliminates the errors produced during marking, so a more accurate map boundary can be obtained.
A map contour correction method, the method comprising: the mobile robot measures the boundary distance of the surrounding environment through a single-point ranging sensor and generates a boundary contour line, wherein the mobile robot stores a grid map of the current environment and the grid map comprises obstacle grids; based on the boundary contour line, the mobile robot aligns the obstacle grids on the grid map, thereby correcting the contour of the grid map. Compared with the prior art, in order to obtain a more accurate map boundary, the method acquires the distances between the mobile robot and the obstacles in the surrounding environment by controlling a high-precision single-point ranging sensor and generates a precise boundary contour line, which is used to correct the errors of the obstacle grids on the grid map. The map boundary therefore becomes more accurate; no complex equipment or algorithm is required, the cost is lower, the data computation load is small, and the efficiency and practicability are higher.
Further, the method for the mobile robot to measure the boundary distance of the surrounding environment through the single-point ranging sensor and generate the boundary contour line comprises the following steps: after the single-point ranging sensor is started, the mobile robot rotates in place to obtain the distances to the surrounding obstacles; the corresponding positions are then marked on the grid map according to those distances and fitted to generate the boundary contour line. In this way an accurate boundary reference can be obtained.
Further, the fitting method is the least squares method, which keeps the computational complexity low and the calculation efficient.
Further, based on the boundary contour line, the mobile robot aligns the obstacle grids on the grid map by adjusting to the boundary contour line those obstacle grids whose distance from it is less than or equal to a preset distance, thereby completing the alignment. Aligning only the obstacle grids within a certain range of the boundary contour line eliminates the errors made when marking the obstacle grids and improves the accuracy of the map boundary.
A chip for storing computer program code which, when executed, implements the steps of the map contour correction method. Compared with the prior art, the chip provided by the invention enables the mobile robot to build a map with an accurate boundary; it needs no complex equipment or algorithms, and has low cost, a small data computation load, and high efficiency and practicability.
Drawings
Fig. 1 is a flowchart illustrating a map contour correction method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of barrier grid alignment according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the accompanying drawings in the embodiments of the present invention. It should be understood that the following specific examples are illustrative only and are not intended to limit the invention.
A vision-navigation sweeping robot in the prior art has only visual positioning information and collision sensor information, and cannot detect whether an obstacle in front of it is a continuous surface. For example, when the robot collides with a wall, it can only calculate possible wall distance points from the collision position and then generate collision grids one by one to represent the wall. The locations of these collision grids are often subject to error because of the limited accuracy of the vision and collision sensors. Relying on a visual sensor alone therefore cannot provide an accurate map boundary, and the boundary of the visual SLAM map differs from the actual environment boundary.
Accordingly, as shown in fig. 1, an embodiment of the present invention provides a method for correcting a contour of a map, the method including:
the mobile robot measures the boundary distance of the surrounding environment through a single-point ranging sensor and generates a boundary contour line, wherein the mobile robot stores a grid map of the current environment and the grid map comprises obstacle grids; based on the boundary contour line, the mobile robot aligns the obstacle grids on the grid map, thereby correcting the contour of the grid map.
The grid map of the current environment stored by the mobile robot is saved into the robot's storage space after the map is first created, and it is marked with obstacle grids and walkable grids. In general, a grid map is a gray-scale image: a black grid represents an obstacle, a white grid represents a walkable area, and gray grids of different shades represent the probability that an obstacle is present on that grid. Optionally, the mobile robot may also correct the map boundary during the first mapping process. For convenience of description, however, this embodiment assumes that the mobile robot does not correct the map boundary when the map is first created; instead, during a second traversal of the environment (re-mapping, or performing its work), it aligns the obstacle grids generated during the first mapping with the boundary contour line scanned by the single-point ranging sensor, thereby completing the invention.
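The patent does not specify how the grid map is stored; the following is a minimal sketch, assuming a probability-valued occupancy grid, of one way the obstacle and walkable grids described above could be represented (the class name, resolution and thresholds are illustrative assumptions, not taken from the patent).

    # Minimal occupancy-grid sketch (names and thresholds are assumptions).
    import numpy as np

    class GridMap:
        def __init__(self, width, height, resolution=0.05):
            self.resolution = resolution                  # meters per grid cell
            # 0.5 = unknown; values near 1.0 = likely obstacle; near 0.0 = walkable
            self.prob = np.full((height, width), 0.5, dtype=np.float32)

        def mark_obstacle(self, gx, gy):
            self.prob[gy, gx] = 1.0                       # "black" obstacle grid

        def mark_walkable(self, gx, gy):
            self.prob[gy, gx] = 0.0                       # "white" walkable grid

        def is_obstacle(self, gx, gy, threshold=0.65):
            return self.prob[gy, gx] >= threshold

        def world_to_grid(self, x, y):
            # convert a world coordinate (meters) to grid indices
            return int(round(x / self.resolution)), int(round(y / self.resolution))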
In this embodiment, the method for the mobile robot to measure the boundary distance of the surrounding environment through the single-point ranging sensor and generate the boundary contour line comprises the following steps: after the single-point ranging sensor is started, the mobile robot rotates in place to obtain the distances to the surrounding obstacles; the corresponding positions are then marked on the grid map according to those distances and fitted to generate the boundary contour line. A plurality of single-point ranging sensors can be arranged so that the distances to the surrounding obstacles are measured effectively. Here, a single sensor is taken as an example: it is arranged at the head of the mobile robot and acquires the distances to surrounding obstacles through rotation. In some implementations, a light pulse is emitted from the single-point ranging sensor on the mobile robot at time T1, and the pulse reflected back from an object returns to the sensor at time T2. The light pulse travels at approximately the speed of light C, so the distance between the mobile robot and the obstacle follows from a simple round-trip calculation, D = C × (T2 − T1) / 2. The mobile robot can therefore continuously acquire the distances to obstacles in the surrounding environment by scanning while it moves. Compared with a laser radar, the single-point ranging sensor has a simple structure, is not prone to damage, and is low in cost.
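As a hedged sketch only (the patent does not give an implementation, so the function and variable names below are assumptions), the in-place rotation scan described above could be converted into obstacle points for the grid map roughly as follows:

    import math

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def tof_distance(t_emit, t_return):
        """Round-trip time of flight converted to one-way distance (meters)."""
        return SPEED_OF_LIGHT * (t_return - t_emit) / 2.0

    def scan_to_points(robot_x, robot_y, scan):
        """scan: list of (heading_rad, t_emit, t_return) samples taken while rotating in place.
        Returns obstacle points in world coordinates to be marked on the grid map."""
        points = []
        for heading, t_emit, t_return in scan:
            d = tof_distance(t_emit, t_return)
            points.append((robot_x + d * math.cos(heading),
                           robot_y + d * math.sin(heading)))
        return points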
After obtaining the distance information, the mobile robot marks the corresponding position on the grid map, for example by placing a point, and that point is the position of the obstacle. After the mobile robot has traversed the entire environment, all obstacles are marked on the grid map in the form of points. It should be noted that one obstacle may be composed of a plurality of points; a single point does not by itself represent a whole obstacle. This embodiment is not concerned with the type, number or size of the obstacles; the main purpose is to know, by fitting the points marked after the single-point ranging sensor has measured them, whether an obstacle is a continuous surface or a set of separate small objects (such as tables and chairs). Displaying the scanned boundary information on the map then yields a more accurate and better-looking map. Preferably, the mobile robot fits the points on the grid map using the least squares method to obtain the boundary contour line.
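The patent names the least squares method but gives no formulation, so the following is only a sketch under that assumption; it fits one straight wall segment through the marked points (a total-least-squares variant is used here so that vertical walls are handled as well):

    import numpy as np

    def fit_wall_segment(points):
        """points: Nx2 array of marked obstacle points belonging to one wall segment.
        Returns (a, b, c) for the line a*x + b*y + c = 0 with a^2 + b^2 = 1."""
        pts = np.asarray(points, dtype=float)
        centroid = pts.mean(axis=0)
        # The right singular vector for the smallest singular value of the
        # centered points is the line normal (total least squares fit).
        _, _, vt = np.linalg.svd(pts - centroid)
        a, b = vt[-1]
        c = -(a * centroid[0] + b * centroid[1])
        return a, b, c

    def point_line_distance(p, line):
        a, b, c = line
        return abs(a * p[0] + b * p[1] + c)

A curved or segmented boundary would be fitted piecewise, one such line per segment, before being drawn on the grid map as the boundary contour line.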
In this embodiment, based on the boundary contour line, the mobile robot aligns the obstacle grids on the grid map by adjusting to the boundary contour line those obstacle grids whose distance from it is less than or equal to a preset distance, thereby completing the alignment. As shown in fig. 2, the dotted line 201 represents the boundary contour line obtained by scanning with the single-point ranging sensor, and the black grids 202 represent obstacle grids. Part of the obstacle grids clearly deviate from the boundary contour line because of the lower accuracy of the collision sensor, which is why an accurate boundary contour line is needed as the reference for the adjustment. Fig. 2 shows the aligned obstacle grids at 203. It should be noted that only the obstacle grids close to the boundary contour line are adjusted; otherwise, obstacle grids that do not belong to the boundary would be "misplaced". The distance between an obstacle grid and the boundary contour line refers to the shortest distance between the two.
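A minimal sketch of this alignment step, assuming the boundary contour line is available as (a, b, c) from the fitting sketch above (the threshold value and all names are illustrative assumptions): obstacle grids within the preset distance are projected onto the line, and the rest are left untouched.

    def align_obstacle_grids(obstacle_cells, line, resolution=0.05, max_dist=0.10):
        """obstacle_cells: list of (gx, gy) obstacle grid indices.
        line: (a, b, c) with a^2 + b^2 = 1, e.g. from fit_wall_segment above.
        Cells whose distance to the line is <= max_dist (meters) are snapped onto it."""
        a, b, c = line
        aligned = []
        for gx, gy in obstacle_cells:
            x, y = gx * resolution, gy * resolution      # cell center in meters
            d = a * x + b * y + c                        # signed distance to the line
            if abs(d) <= max_dist:
                x, y = x - d * a, y - d * b              # project onto the boundary contour line
            aligned.append((int(round(x / resolution)), int(round(y / resolution))))
        return aligned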
The embodiment of the invention also provides a mobile robot, which is a vision robot and comprises a single-point ranging sensor. The single-point ranging sensor is arranged on the periphery of the mobile robot body and is used for detecting the distance between the mobile robot and an obstacle. A plurality of single-point ranging sensors can be arranged at intervals on the side surfaces around the mobile robot body; multiple single-point ranging sensors help the mobile robot acquire boundary information of the surrounding environment more quickly. For example, a single-point ranging sensor arranged on the front side surface of the robot body detects the distance between the front of the robot and an obstacle; one arranged on the left side surface detects the distance between the left side of the robot and an obstacle; one arranged on the right side surface detects the distance between the right side of the robot and an obstacle; and one arranged on the rear side surface detects the distance between the rear of the robot and an obstacle. Compared with the prior art, the mobile robot provided by the invention only needs an added single-point ranging sensor to solve the problem of an inaccurate visual SLAM map boundary; no complex equipment is needed, the cost is lower, and the efficiency and practicability are higher.
The mobile robot is further provided with a collision sensor, a fitting unit and an alignment unit. The collision sensor enables the mobile robot, while walking, to mark obstacle grids on the grid map according to the collision positions, which represent the positions of the obstacles. The fitting unit marks points on the grid map according to the distance between the mobile robot and the obstacle and then fits them to generate the boundary contour line; the fitting method is preferably the least squares method. The alignment unit aligns the obstacle grids on the grid map according to the boundary contour line so as to correct the map boundary. The mobile robot stores a grid map of the current environment, and the grid map comprises obstacle grids. The fitting unit and the alignment unit are virtual devices.
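For completeness, a hedged sketch of the collision sensor's marking step (the bumper-offset model and all names are assumptions for illustration; GridMap refers to the sketch given earlier): a collision at the bumper is converted into an obstacle grid mark that the alignment unit may later correct.

    import math

    def mark_collision(grid_map, robot_x, robot_y, robot_heading, bumper_offset=0.17):
        """Mark an obstacle grid at the estimated contact point of a frontal collision.
        bumper_offset: distance (meters) from the robot center to the bumper."""
        cx = robot_x + bumper_offset * math.cos(robot_heading)
        cy = robot_y + bumper_offset * math.sin(robot_heading)
        gx, gy = grid_map.world_to_grid(cx, cy)
        grid_map.mark_obstacle(gx, gy)       # this mark may later be aligned to the contour line
        return gx, gy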
The invention also discloses a chip, which is used for storing computer program code and can be arranged in the mobile robot. When executed, the computer program code implements the steps of the map contour correction method; alternatively, the chip implements the functions of the units in the mobile robot when executing the computer program code. Illustratively, the computer program code may be partitioned into one or more modules/units that are stored in the chip and executed by the chip to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program code in the mobile robot; for example, the computer program code may be partitioned into the fitting unit and the alignment unit of the mobile robot. The chip enables the mobile robot to build a map with an accurate map boundary without complex equipment or algorithms, and has low cost, a small data computation load, and high efficiency and practicability.
Obviously, the above-mentioned embodiments are only a part of embodiments of the present invention, not all embodiments, and the technical solutions of the embodiments may be combined with each other. Furthermore, if terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., appear in the embodiments, their indicated orientations or positional relationships are based on those shown in the drawings only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred devices or elements must have a specific orientation or be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. If the terms "first", "second", "third", etc. appear in the embodiments, they are for convenience of distinguishing between related features, and they are not to be construed as indicating or implying any relative importance, order or number of features.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. These programs may be stored in a computer-readable storage medium (such as a ROM, a RAM, a magnetic or optical disk, or various other media that can store program codes). Which when executed performs steps comprising the method embodiments described above.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A mobile robot comprising a vision sensor, characterized by further comprising: a single-point distance measuring sensor arranged on the periphery of the mobile robot body and used for detecting the distance between the mobile robot and an obstacle.
2. The mobile robot as claimed in claim 1, wherein the single-point distance measuring sensor is provided on a front side surface of the body in front of the robot for detecting a distance between the front of the robot and the obstacle; or the single-point distance measuring sensor is arranged on the left side surface of the left side body of the robot and is used for detecting the distance between the left side of the robot and the obstacle; or the single-point distance measuring sensor is arranged on the right side surface of the machine body on the right side of the robot and is used for detecting the distance between the right side of the robot and the obstacle; or the single-point distance measuring sensor is arranged on the rear side surface of the machine body at the rear side of the robot and used for detecting the distance between the rear side of the robot and the obstacle.
3. The mobile robot of claim 1, wherein the plurality of single-point distance measuring sensors are spaced apart from each other on a side surface of the outer periphery of the body of the mobile robot.
4. A mobile robot according to claim 1, characterized in that the mobile robot is provided with a collision sensor, a fitting unit and an alignment unit, wherein,
the collision sensor is used for detecting the position of the obstacle, so that the mobile robot marks an obstacle grid on the grid map according to the collision position;
the fitting unit is used for generating a boundary contour line on the grid map according to the distance between the mobile robot and the obstacle;
the alignment unit is used for aligning the obstacle grids on the grid map according to the boundary contour line;
the mobile robot stores a grid map of the current environment, wherein the grid map comprises an obstacle grid.
5. A map contour correction method, characterized in that the method comprises:
the mobile robot measures the boundary distance of the surrounding environment through a single-point ranging sensor and generates a boundary contour line, wherein the mobile robot stores a grid map of the current environment, and the grid map comprises obstacle grids;
based on the boundary contour line, the mobile robot aligns the obstacle grids on the grid map, thereby correcting the contour of the grid map.
6. The method of claim 5, wherein the method of measuring the boundary distance of the surrounding environment and generating the boundary contour line by the mobile robot through the single-point distance measuring sensor comprises the steps of:
after the single-point distance measuring sensor is started, the mobile robot rotates in situ to obtain the distance of surrounding obstacles;
and marking and fitting corresponding positions on the grid map according to the distance of surrounding obstacles to generate the boundary contour line.
7. The method of claim 6, wherein the fitting method is a least squares method.
8. The map contour correction method according to claim 5, wherein the mobile robot aligns the obstacle grids on the grid map based on the boundary contour line by adjusting to the boundary contour line those obstacle grids whose distance from it is less than or equal to a predetermined distance.
9. A chip for storing computer program code, wherein the computer program code when executed implements the steps of the map contour correction method of any one of claims 5 to 8.
CN202110501046.1A 2021-05-08 2021-05-08 Map contour correction method, chip and mobile robot Pending CN113190007A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110501046.1A CN113190007A (en) 2021-05-08 2021-05-08 Map contour correction method, chip and mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110501046.1A CN113190007A (en) 2021-05-08 2021-05-08 Map contour correction method, chip and mobile robot

Publications (1)

Publication Number Publication Date
CN113190007A true CN113190007A (en) 2021-07-30

Family

ID=76984417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110501046.1A Pending CN113190007A (en) 2021-05-08 2021-05-08 Map contour correction method, chip and mobile robot

Country Status (1)

Country Link
CN (1) CN113190007A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN208334652U (en) * 2018-07-04 2019-01-04 深圳市朗空亿科科技有限公司 A kind of walking robot and its obstacle detection system
CN111572526A (en) * 2019-02-19 2020-08-25 长城汽车股份有限公司 Positioning method and system for automatic driving system
CN111427360A (en) * 2020-04-20 2020-07-17 珠海市一微半导体有限公司 Map construction method based on landmark positioning, robot and robot navigation system
CN112578392A (en) * 2020-11-25 2021-03-30 珠海市一微半导体有限公司 Environment boundary construction method based on remote sensor and mobile robot
CN112698654A (en) * 2020-12-25 2021-04-23 珠海市一微半导体有限公司 Single-point TOF-based mapping and positioning method, chip and mobile robot

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023138373A1 (en) * 2022-01-24 2023-07-27 追觅创新科技(苏州)有限公司 Map processing method and system, and self-moving device
WO2024037262A1 (en) * 2022-08-16 2024-02-22 珠海一微半导体股份有限公司 Narrow passage navigation method for robot, chip, and robot
CN116382315A (en) * 2023-06-01 2023-07-04 深之蓝(天津)水下智能科技有限公司 Picture construction method and system thereof, underwater robot, storage medium and electronic equipment
CN116382315B (en) * 2023-06-01 2023-10-03 深之蓝(天津)水下智能科技有限公司 Picture construction method and system thereof, underwater robot, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong
Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.
Address before: 519000 room 105-514, No. 6, Baohua Road, Hengqin new area, Zhuhai City, Guangdong Province (centralized office area)
Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.
RJ01 Rejection of invention patent application after publication
Application publication date: 20210730