CN112711257B - Edge method based on single-point TOF of robot, chip and mobile robot - Google Patents

Edge method based on single-point TOF of robot, chip and mobile robot

Info

Publication number
CN112711257B
CN112711257B (application CN202011559833.3A)
Authority
CN
China
Prior art keywords
robot
point cloud
cloud data
edge
tof
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011559833.3A
Other languages
Chinese (zh)
Other versions
CN112711257A (en)
Inventor
陈卓标
周和文
黄惠保
杨武
赖钦伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202011559833.3A priority Critical patent/CN112711257B/en
Publication of CN112711257A publication Critical patent/CN112711257A/en
Application granted granted Critical
Publication of CN112711257B publication Critical patent/CN112711257B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a single-point-TOF-based edge-following method for a robot, a chip and a mobile robot. The method comprises the following steps: S1: the robot moves a set distance and acquires point cloud data; S2: the robot fits a straight line to the point cloud data to obtain a point cloud line; S3: the robot obtains, based on its current position, the line distance between the robot and the point cloud line and the line angle of the point cloud line; S4: the robot follows the edge according to the line distance and the line angle, and acquires point cloud data during edge-following to update the line distance and the line angle. The robot first follows the edge in a conventional manner to collect point cloud data, then performs single-point-TOF edge-following using that data, and keeps acquiring point cloud data while following the edge to update the edge-following data, giving high accuracy and stronger robustness.

Description

Edge method based on single-point TOF of robot, chip and mobile robot
Technical Field
The invention relates to the technical field of electronics, and in particular to a single-point-TOF-based edge-following method for a robot, a chip and a mobile robot.
Background
SLAM robots based on inertial navigation, vision and laser are increasingly common, and the household sweeping/cleaning robot is a representative example: it localizes and maps the indoor environment in real time by fusing data from vision, laser, a gyroscope, an accelerometer and wheel odometry, and then navigates on the built map to clean. To clean garbage along walls, and to avoid pushing obstacles along or getting tangled in cable-like obstacles when travelling through cluttered environments, the robot needs to walk along edges. However, most current robots follow edges with a traditional PID controller, which produces noticeable shaking and insufficiently smooth motion during edge-following.
Disclosure of Invention
To solve the above problems, the invention provides a single-point-TOF-based edge-following method for a robot, a chip and a mobile robot, in which a single-point TOF sensor is used as the edge-following sensor, giving the robot stronger robustness during edge-following. The specific technical scheme of the invention is as follows:
A single-point-TOF-based edge-following method for a robot, comprising the following steps: S1: the robot moves a set distance and acquires point cloud data; S2: the robot fits a straight line to the point cloud data to obtain a point cloud line; S3: the robot obtains, based on its current position, the line distance between the robot and the point cloud line and the line angle of the point cloud line; S4: the robot follows the edge according to the line distance and the line angle, and acquires point cloud data during edge-following to update the line distance and the line angle. The robot first follows the edge in a conventional manner to collect point cloud data, then performs single-point-TOF edge-following using that data, and keeps acquiring point cloud data while following the edge to update the edge-following data, giving high accuracy and stronger robustness.
In one or more aspects of the present invention, in step S1 the robot acquires the point cloud data while following the edge under PID control. Acquiring the point cloud data while already following the edge makes the acquired data more accurate.
In one or more aspects of the present invention, in step S1 the robot acquires the point cloud data as follows: at the start of edge-following the robot establishes a world coordinate system with its centre position as the origin; while moving, it obtains IMU data from the IMU module and a measured distance from the TOF module; it computes the coordinates of its current centre position from the IMU data, then establishes a robot coordinate system with the current centre position as the origin and the robot's forward direction as the x axis, from which the coordinates of the TOF module are obtained; the point cloud data are then computed from the current centre-position coordinates, the TOF module coordinates and the measured distance. Acquiring points on the obstacle via the world and robot coordinate systems gives high accuracy with a small amount of computation.
In one or more aspects of the present invention, the robot obtains the point cloud data from the following calculation formula: ox = cos(rθ)*tx - sin(rθ)*ty + rx + d*cos(rθ+tθ); oy = sin(rθ)*tx + cos(rθ)*ty + ry + d*sin(rθ+tθ); where the coordinates of the current robot centre position are (rx, ry, rθ), the coordinates of the TOF module are (tx, ty, tθ), the TOF measurement is d, and the coordinates of the obstacle point are (ox, oy).
In one or more aspects of the present invention, in step S2 the robot performs the line fitting with the least-squares method to obtain the point cloud line.
In one or more aspects of the present invention, in step S3 the line distance is the distance between the robot's centre position and the fitted line, and the line angle is the angle between the point cloud line and the robot's forward direction; the robot first obtains the coordinates of the point cloud line in the robot coordinate system and then derives the line distance and the line angle from them. Adjusting the robot's position relative to the wall edge using the line distance and line angle improves the accuracy of edge-following.
In one or more aspects of the present invention, in step S4 the robot adjusts its distance to the obstacle and its forward direction during edge-following according to the line distance and the line angle.
In one or more aspects of the present invention, in step S4 the robot keeps a preset number of point cloud data during edge-following; once the number of acquired points reaches the preset number, each time a new point is acquired the earliest-acquired point is deleted. Keeping only the set number of points reduces the memory the robot needs to store the point cloud data.
In one or more aspects of the present invention, in step S4, steps S2 to S4 are executed again after the robot acquires new point cloud data. Edge-following thus proceeds on point cloud data that is continuously updated from the environment, which is highly practical.
In one or more aspects of the present invention, in step S4 the robot acquires point cloud data once each time it advances a set distance. This prevents the point cloud from being computed repeatedly at the same position and the acquired data from becoming too dense.
A chip with a built-in control program, wherein the control program is used to control a robot to execute the above single-point-TOF-based edge-following method. A robot equipped with the chip can use the method, which is highly practical.
A mobile robot equipped with a main control chip, wherein the main control chip is the above chip. The robot acquires the point cloud data and follows edges by the above method, which reduces the robot's production cost.
Drawings
FIG. 1 is a schematic flow chart of the single-point-TOF-based edge-following method of the robot of the present invention;
FIG. 2 is a schematic diagram of the coordinates of the mobile robot of the present invention;
FIG. 3 shows the straight-line fitting formula of the present invention;
FIG. 4 is a schematic structural view of the mobile robot of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout.
In the description of the present invention, it should be noted that orientation terms such as "center", "lateral", "longitudinal", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description, do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and are therefore not to be construed as limiting the scope of protection of the present invention.
Furthermore, the terms "first", "second" and the like are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features. In the description of the invention, "at least" means one or more unless specifically defined otherwise.
In the present invention, unless otherwise explicitly specified and limited, the terms "assembled", "connected" and "coupled" are to be understood broadly: a connection may be fixed, detachable or integral; it may be mechanical; it may be direct or through an intermediate medium, or may be an internal communication between two elements. The specific meaning of these terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the present invention, unless otherwise specified and limited, a first feature being "above" or "below" a second feature includes the case where the two features are in direct contact as well as the case where they are not in direct contact but contact each other through an additional feature between them. Moreover, a first feature being "above", "over" or "on top of" a second feature includes the first feature being directly above or obliquely above the second feature, or simply means that the first feature is at a higher level than the second feature; a first feature being "below", "under" or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or simply means that the first feature is at a lower level than the second feature.
The technical scheme and beneficial effects of the invention are made clearer and more definite by the following further description of specific embodiments with reference to the drawings. The embodiments described below are exemplary, are intended to explain the invention, and are not to be construed as limiting it.
Referring to Fig. 1, a single-point-TOF-based edge-following method for a robot comprises the following steps: S1: the robot moves a set distance and acquires point cloud data; S2: the robot fits a straight line to the point cloud data to obtain a point cloud line; S3: the robot obtains, based on its current position, the line distance between the robot and the point cloud line and the line angle of the point cloud line; S4: the robot follows the edge according to the line distance and the line angle, and acquires point cloud data during edge-following to update the line distance and the line angle. The robot first follows the edge in a conventional manner to collect point cloud data, then performs single-point-TOF edge-following using that data, and keeps acquiring point cloud data while following the edge to update the edge-following data, giving high accuracy and stronger robustness.
As one example, the robot acquires the point cloud data while following the edge under PID control, and the distance travelled along the edge under PID control may be 50 cm. Acquiring the point cloud data while already following the edge makes the acquired data more accurate.
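The patent does not spell out the PID controller used for this bootstrap phase; the sketch below (Python) is only an illustration of conventional PID wall-following, and the gains, target distance and robot interface are assumptions rather than values from the patent.

    class WallPID:
        """Conventional PID wall-following used only to bootstrap step S1.
        Gains, target distance and the robot interface are illustrative
        assumptions, not values taken from the patent."""

        def __init__(self, kp=2.0, ki=0.0, kd=0.5, target=0.05):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.target = target        # desired wall distance in metres (assumed)
            self.integral = 0.0
            self.prev_err = 0.0

        def step(self, side_distance, dt):
            """Return an angular-velocity correction from one lateral TOF reading."""
            err = side_distance - self.target
            self.integral += err * dt
            deriv = (err - self.prev_err) / dt
            self.prev_err = err
            # positive output steers toward the wall, negative steers away
            return self.kp * err + self.ki * self.integral + self.kd * deriv

In step S1 the robot would drive forward at constant speed, feed the lateral TOF reading into step() each control cycle and apply the returned value as an angular-velocity correction until roughly 50 cm of wall has been covered.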
Referring to Fig. 2, the robot acquires the point cloud data as follows. At the start of edge-following the robot establishes a world coordinate system with its centre position as the origin; while moving, it obtains IMU data from the IMU module and a measured distance from the TOF module; it computes the coordinates of its current centre position from the IMU data, then establishes a robot coordinate system with the current centre position as the origin and the robot's forward direction as the x axis, from which the coordinates of the TOF module are obtained; the point cloud data are then computed from the current centre-position coordinates, the TOF module coordinates and the measured distance. Acquiring points on the obstacle via the world and robot coordinate systems gives high accuracy with a small amount of computation. Specifically, when the mobile robot starts working, it takes its centre position as the base coordinate and establishes a world coordinate system with that base coordinate as the origin. As the robot moves, IMU data are acquired through the IMU module; from the IMU data collected between time t1 and time t2, the relative motion of the robot over that interval can be estimated, so the robot coordinates at time t2 can be computed from the robot coordinates at time t1. For a planar mobile robot, the IMU data only need to include the magnitude of the angular velocity (its direction is perpendicular to the ground), and together with the encoder counts of the two wheels the pose of the robot can be estimated (this is a known technique commonly used in inertial-navigation robots). Knowing the current robot coordinates, the mobile robot establishes a robot coordinate system with those coordinates as the origin and the direction straight ahead as the x axis; the y axis can be set as required, and in the figure the y axis points to the left of the robot. The TOF coordinates of the TOF module relative to the robot coordinate system are obtained from the position parameters set when the TOF module is installed; for example, in the figure the robot coordinates are (1.0, 1.0, 1.5707) and the TOF coordinates are (0.035, -0.165, -1.5707). Then, from the distance to the obstacle measured by the TOF module, the robot coordinates and the TOF coordinates, the obstacle coordinates are obtained by the formula ox = cos(rθ)*tx - sin(rθ)*ty + rx + d*cos(rθ+tθ); oy = sin(rθ)*tx + cos(rθ)*ty + ry + d*sin(rθ+tθ); where the robot coordinates are (rx, ry, rθ), the TOF coordinates are (tx, ty, tθ), the TOF measurement is d, and the obstacle coordinates are (ox, oy). Point cloud data are scan data recorded in the form of points; in this application a single-point TOF module is used, and the coordinates of a point on the obstacle constitute the point cloud data. The mobile robot acquires a series of such point cloud data as it moves.
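The formula above translates directly into code. The sketch below (Python) re-implements it and plugs in the example poses quoted from Fig. 2; the 0.10 m range is an assumed value used purely for illustration.

    import math

    def tof_point_to_world(robot_pose, tof_pose, d):
        """Project a single-point TOF range d (metres) into world coordinates.
        robot_pose = (rx, ry, rtheta): robot centre in the world frame.
        tof_pose   = (tx, ty, ttheta): TOF module in the robot frame.
        Implements: ox = cos(rθ)*tx - sin(rθ)*ty + rx + d*cos(rθ+tθ)
                    oy = sin(rθ)*tx + cos(rθ)*ty + ry + d*sin(rθ+tθ)
        """
        rx, ry, rth = robot_pose
        tx, ty, tth = tof_pose
        ox = math.cos(rth) * tx - math.sin(rth) * ty + rx + d * math.cos(rth + tth)
        oy = math.sin(rth) * tx + math.cos(rth) * ty + ry + d * math.sin(rth + tth)
        return ox, oy

    # Example with the poses quoted above and an assumed 0.10 m range:
    print(tof_point_to_world((1.0, 1.0, 1.5707), (0.035, -0.165, -1.5707), 0.10))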
Referring to Fig. 3, the robot performs the line fitting with the least-squares method to obtain the point cloud line: the line equation is taken as y = a + bx, the normal equations are set up, and a and b are solved for.
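A minimal sketch of this fit via the normal equations (Python); the guard against a degenerate, near-vertical point set is an added assumption rather than something the patent specifies.

    def fit_line(points):
        """Least-squares fit of y = a + b*x to a list of (x, y) point cloud data.
        Solves the normal equations:
            n*a  + Sx*b  = Sy
            Sx*a + Sxx*b = Sxy
        where Sx = sum(x), Sy = sum(y), Sxx = sum(x*x), Sxy = sum(x*y).
        """
        n = len(points)
        sx = sum(x for x, _ in points)
        sy = sum(y for _, y in points)
        sxx = sum(x * x for x, _ in points)
        sxy = sum(x * y for x, y in points)
        det = n * sxx - sx * sx
        if abs(det) < 1e-9:
            # near-vertical or degenerate point set; an added guard, not from the patent
            raise ValueError("cannot fit y = a + b*x to these points")
        b = (n * sxy - sx * sy) / det
        a = (sy - b * sx) / n
        return a, b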
As one embodiment, the line distance is the distance between the robot's centre position and the fitted line, and the line angle is the angle between the point cloud line and the robot's forward direction; the robot first obtains the coordinates of the point cloud line in the robot coordinate system and then derives the line distance and the line angle from them. Adjusting the robot's position relative to the wall edge using the line distance and line angle improves the accuracy of edge-following.
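Assuming the fitted line is expressed in the robot coordinate system as y = a + b*x with the x axis pointing forward, the line distance and line angle follow from elementary geometry, as in this sketch (Python); the example numbers are illustrative only.

    import math

    def line_distance_angle(a, b):
        """Line distance and line angle for y = a + b*x in the robot frame.
        distance: perpendicular distance from the robot centre (frame origin)
                  to the line, i.e. |a| / sqrt(1 + b*b).
        angle:    angle between the line and the robot's forward (x) axis,
                  i.e. atan(b), in radians.
        """
        distance = abs(a) / math.sqrt(1.0 + b * b)
        angle = math.atan(b)
        return distance, angle

    # e.g. a wall fitted as y = 0.06 + 0.05*x lies about 6 cm to the left of the
    # robot and is tilted about 2.9 degrees from its heading
    print(line_distance_angle(0.06, 0.05))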
As one embodiment, the robot adjusts its distance to the obstacle and its forward direction during edge-following according to the line distance and the line angle. The robot keeps a preset number of point cloud data during edge-following; once the number of acquired points reaches the preset number, each time a new point is acquired the earliest-acquired point is deleted. Keeping only the set number of points reduces the memory the robot needs to store the point cloud data. After the robot acquires new point cloud data, steps S2 to S4 are executed again, so edge-following proceeds on point cloud data that is continuously updated from the environment, which is highly practical. The robot acquires point cloud data once each time it advances a set distance; for example, historical points are deleted once the machine has moved more than 50 cm: assuming one point is recorded per 1 cm of travel, the first point is deleted when the 51st point is recorded, so the stored points always cover the most recent 50 cm of travel. This prevents the point cloud from being computed repeatedly at the same position and the acquired data from becoming too dense.
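A sketch of the fixed-size point buffer described above, using the 1 cm sampling step and 50-point window from the example (Python); the class and method names are illustrative, not from the patent.

    from collections import deque

    SAMPLE_STEP = 0.01   # record one point per 1 cm of travel (example above)
    WINDOW_SIZE = 50     # keep roughly the last 50 cm of points (example above)

    class PointCloudWindow:
        """Fixed-size point buffer: the oldest point is dropped automatically."""

        def __init__(self):
            self.points = deque(maxlen=WINDOW_SIZE)
            self.dist_since_sample = 0.0

        def on_move(self, distance_travelled, current_point):
            """Call each control cycle; stores a point every SAMPLE_STEP metres.
            Returns True when a new point was stored, i.e. when steps S2 to S4
            should be run again on the updated cloud."""
            self.dist_since_sample += distance_travelled
            if self.dist_since_sample >= SAMPLE_STEP:
                self.points.append(current_point)   # the 51st point evicts the 1st
                self.dist_since_sample = 0.0
                return True
            return False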
A chip with a built-in control program, wherein the control program is used to control a robot to execute the above single-point-TOF-based edge-following method. A robot equipped with the chip can use the method, which is highly practical.
A mobile robot equipped with a main control chip, wherein the main control chip is the above chip. The robot acquires the point cloud data and follows edges by the above method, which reduces the robot's production cost.
Referring to Fig. 4, a point cloud data acquisition structure of the robot comprises a main body 1 and a controller; a single-point TOF module 2 is provided on the left-front or right-front side of the main body, the TOF module 2 is electrically connected to the controller, and the detection direction of the TOF module 2 is parallel to the wheel-axle line of the robot. With the detection direction of the TOF module 2 perpendicular to the wall surface, the robot can conveniently obtain its distance to the wall and use the acquired data directly to correct its pose during edge-following, without complex computation.
As one example, the TOF module 2 comprises a TOF sensor of model VL6180, which is low-cost and highly practical. The TOF sensor comprises an emitter and a receiver arranged horizontally. The spacing between the centre line of the TOF module 2 and the centre line of the main body 1 is 30 mm to 40 mm, with the centre line of the main body 1 parallel to the wheel axle of the robot; this spacing lets the machine follow edges better around columns and corners, so the acquired data are more accurate.
As one embodiment, the front end of the main body 1 is provided with a bumper bar 3, the bumper bar 3 is provided with a round hole 4, and the TOF module 2 is arranged on one side of the round hole 4 and detects obstacles through the round hole 4. The main body 1 comprises an IMU module, which comprises a six-axis gyroscope and an encoder disc, both electrically connected to the controller.
In the description of this specification, reference to the terms "one embodiment", "preferred", "example", "specific example" or "some examples" means that a particular feature, structure, material or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the invention. Schematic uses of these terms in this specification do not necessarily refer to the same embodiment or example, and the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The connections described in the specification have obvious effects and practical effectiveness.
From the above description of the structure and principles, it should be understood by those skilled in the art that the present invention is not limited to the above-described embodiments, but rather that modifications and substitutions using known techniques in the art on the basis of the present invention fall within the scope of the present invention, which is defined by the appended claims.

Claims (6)

1. A single-point-TOF-based edge-following method for a robot, the method comprising the following steps:
S1: the robot moves a set distance to acquire point cloud data;
S2: the robot fits a straight line to the point cloud data to obtain a point cloud line;
S3: the robot obtains, based on its current position, the line distance between the robot and the point cloud line and the line angle of the point cloud line;
S4: the robot follows the edge according to the line distance and the line angle, and acquires point cloud data during edge-following to update the line distance and the line angle;
in S1, the robot acquires the point cloud data while following the edge under PID (proportional integral derivative) control;
in S3, the line distance is the distance between the robot's centre position and the line, and the line angle is the angle between the point cloud line and the robot's forward direction; the robot first obtains the coordinates of the point cloud line in the robot coordinate system, and then obtains the line distance and the line angle from those coordinates;
in S4, the robot adjusts its distance to the obstacle and its forward direction during edge-following according to the line distance and the line angle;
in S4, the robot keeps a preset number of point cloud data during edge-following; after the number of acquired points reaches the preset number, each time a new point is acquired, the earliest-acquired point is deleted;
in S4, the robot acquires point cloud data once each time it advances a set distance;
in S4, after the robot acquires the new point cloud data, steps S2 to S4 are executed.
2. The single-point-TOF-based edge-following method for a robot according to claim 1, wherein in step S1 the robot acquires the point cloud data as follows: at the start of edge-following the robot establishes a world coordinate system with its centre position as the origin; while moving, it obtains IMU data from the IMU module and a measured distance from the TOF module; it computes the coordinates of its current centre position from the IMU data, then establishes a robot coordinate system with the current centre position as the origin and the robot's forward direction as the x axis, from which the coordinates of the TOF module are obtained; and it obtains the point cloud data based on the current centre-position coordinates, the TOF module coordinates and the measured distance.
3. The single-point-TOF-based edge-following method for a robot according to claim 2, wherein the robot obtains the point cloud data from the following calculation formula:
ox=cos(rθ)*tx-sin(rθ)*ty+rx+d*cos(rθ+tθ);
oy=sin(rθ)*tx+cos(rθ)*ty+ry+d*sin(rθ+tθ);
the coordinates of the current robot center position are (rx, ry, rθ), the coordinates of the TOF module are (tx, ty, tθ), the TOF data are d, and the coordinates of the obstacle are (ox, oy).
4. The single-point-TOF-based edge-following method for a robot according to claim 1, wherein in step S2 the robot performs the line fitting with the least-squares method to obtain the point cloud line.
5. A chip with a built-in control program, wherein the control program is used to control a robot to execute the single-point-TOF-based edge-following method for a robot according to any one of claims 1 to 4.
6. A mobile robot equipped with a master chip, characterized in that said master chip is the chip of claim 5.
CN202011559833.3A 2020-12-25 2020-12-25 Edge method based on single-point TOF of robot, chip and mobile robot Active CN112711257B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011559833.3A CN112711257B (en) 2020-12-25 2020-12-25 Edge method based on single-point TOF of robot, chip and mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011559833.3A CN112711257B (en) 2020-12-25 2020-12-25 Edge method based on single-point TOF of robot, chip and mobile robot

Publications (2)

Publication Number Publication Date
CN112711257A CN112711257A (en) 2021-04-27
CN112711257B true CN112711257B (en) 2024-06-18

Family

ID=75546277

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011559833.3A Active CN112711257B (en) 2020-12-25 2020-12-25 Edge method based on single-point TOF of robot, chip and mobile robot

Country Status (1)

Country Link
CN (1) CN112711257B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113219488A (en) * 2021-05-08 2021-08-06 珠海市一微半导体有限公司 Robot mapping method
CN115137267B (en) * 2022-07-13 2024-03-26 浙江欣奕华智能科技有限公司 Obstacle avoidance walking method and device of cleaning robot, electronic equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110211228A (en) * 2019-04-30 2019-09-06 北京云迹科技有限公司 For building the data processing method and device of figure
CN112034837A (en) * 2020-07-16 2020-12-04 珊口(深圳)智能科技有限公司 Method for determining working environment of mobile robot, control system and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108931983B (en) * 2018-09-07 2020-04-24 深圳市银星智能科技股份有限公司 Map construction method and robot thereof
CN110335295B (en) * 2019-06-06 2021-05-11 浙江大学 Plant point cloud acquisition registration and optimization method based on TOF camera
CN110673107B (en) * 2019-08-09 2022-03-08 北京智行者科技有限公司 Road edge detection method and device based on multi-line laser radar
CN111947649A (en) * 2020-06-21 2020-11-17 珠海市一微半导体有限公司 Robot positioning method based on data fusion, chip and robot
CN111948673A (en) * 2020-06-21 2020-11-17 珠海市一微半导体有限公司 Method and robot for updating laser data based on IMU data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110211228A (en) * 2019-04-30 2019-09-06 北京云迹科技有限公司 For building the data processing method and device of figure
CN112034837A (en) * 2020-07-16 2020-12-04 珊口(深圳)智能科技有限公司 Method for determining working environment of mobile robot, control system and storage medium

Also Published As

Publication number Publication date
CN112711257A (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN112698654B (en) Single-point TOF-based map building and positioning method, chip and mobile robot
EP3603372B1 (en) Moving robot, method for controlling the same, and terminal
US9908240B1 (en) Ground plane compensation for legged robots
CN112711257B (en) Edge method based on single-point TOF of robot, chip and mobile robot
AU2013270671B2 (en) Carpet drift estimation using differential sensors or visual measurements
Goel et al. Robust localization using relative and absolute position estimates
US20210141389A1 (en) Autonomous Map Traversal with Waypoint Matching
CN112183133B (en) Aruco code guidance-based mobile robot autonomous charging method
JP6108818B2 (en) Moving device and position recognition method of moving device
CN111624997A (en) Robot control method and system based on TOF camera module and robot
Yoshida et al. A sensor platform for outdoor navigation using gyro-assisted odometry and roundly-swinging 3D laser scanner
CN110488818B (en) Laser radar-based robot positioning method and device and robot
CN111123911A (en) Legged intelligent star catalogue detection robot sensing system and working method thereof
CN112859860A (en) Robot system and path planning method thereof
CN112684813A (en) Docking method and device for robot and charging pile, robot and readable storage medium
CN112880683A (en) Robot positioning control method, system and chip based on reference linear distance
CN112747746A (en) Point cloud data acquisition method based on single-point TOF, chip and mobile robot
CN112308033A (en) Obstacle collision warning method based on depth data and visual chip
WO2023050545A1 (en) Outdoor automatic operation control system and method based on machine vision, and device
CN111007522A (en) Position determination system of mobile robot
CN213814413U (en) Food delivery biped robot control system
CN114330832A (en) Intelligent express package distribution system and working method thereof
CN114789439B (en) Slope positioning correction method, device, robot and readable storage medium
CN213748478U (en) Point cloud data acquisition structure of robot and robot
Ax et al. Optical position stabilization of an UAV for autonomous landing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.

GR01 Patent grant