CN105004368A - Collision detection method, device and system for autonomous robot - Google Patents

Collision detection method, device and system for autonomous robot

Info

Publication number
CN105004368A
CN105004368A (application CN201510367447.7A; granted as CN105004368B)
Authority
CN
China
Prior art keywords
information
autonomous robot
collision
coordinate
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510367447.7A
Other languages
Chinese (zh)
Other versions
CN105004368B (en)
Inventor
吴泽晓
徐成
郭盖华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen LD Robot Co Ltd
Original Assignee
Shenzhen Inmotion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Inmotion Technologies Co Ltd filed Critical Shenzhen Inmotion Technologies Co Ltd
Priority to CN201510367447.7A priority Critical patent/CN105004368B/en
Publication of CN105004368A publication Critical patent/CN105004368A/en
Application granted granted Critical
Publication of CN105004368B publication Critical patent/CN105004368B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The present application discloses a collision detection method, device and system for an autonomous robot. Acquired acceleration data and angular velocity data are processed: attitude information of the autonomous robot is obtained through data fusion, coordinate specific-force information is obtained through coordinate conversion, and a lookup is performed in a preset fuzzy rule table according to the attitude information and the coordinate specific-force information. This yields the collision situation of the autonomous robot, so that corresponding measures can be taken according to the collision situation to ensure that the autonomous robot continues to operate normally.

Description

Collision detection method, device and system for an autonomous robot
Technical field
The present application relates to robotics, and more particularly to a collision detection method, device and system for an autonomous robot.
Background technology
As living standards continue to improve, autonomous robots that assist with household and other work are being accepted by more and more people. An autonomous robot inevitably encounters collisions during use and must take appropriate measures promptly after a collision so that its normal operation is not affected; timely detection of the collision situation is therefore a prerequisite for ensuring its normal operation.
Summary of the invention
In view of this, the present application provides a collision detection method, device and system for an autonomous robot, used to determine the collision situation and to serve as the basis for taking corresponding measures after a collision, so as to ensure normal operation.
To achieve these goals, the proposed scheme is as follows:
A collision detection method for an autonomous robot comprises the following operations:
performing data fusion on acceleration data and angular velocity data of the autonomous robot to obtain attitude information of the autonomous robot;
performing specific-force conversion on the acceleration data to obtain specific-force information of the autonomous robot;
performing coordinate transformation on the specific-force information to obtain coordinate specific-force information for each axis;
using the attitude information and the coordinate specific-force information as a comparison basis, and comparing this basis against a preset fuzzy rule table to obtain the collision situation of the autonomous robot.
Optionally, performing data fusion on the acceleration data and the angular velocity data comprises:
performing Kalman filtering on the acceleration data and the angular velocity data to carry out the data fusion.
Optionally, comparing the comparison basis against the preset fuzzy rule table to obtain the collision situation of the autonomous robot comprises:
comparing the attitude information against multiple preset attitude-information thresholds of the fuzzy rule table and the coordinate specific-force information against multiple preset coordinate specific-force thresholds of the fuzzy rule table, and obtaining the collision situation from the comparison results.
Optionally, performing coordinate transformation on the specific-force information to obtain the coordinate specific-force information for each axis comprises:
performing an Euler-angle coordinate transformation on the specific-force information to obtain the specific-force information for each axis in the earth coordinate system.
Optionally, the method further comprises:
obtaining the collision location according to position information of the autonomous robot and the collision situation.
A collision detection device for an autonomous robot comprises a data fusion module, a specific-force conversion module, a coordinate transformation module and a table lookup module, wherein:
the data fusion module is configured to perform data fusion on the acceleration data and the angular velocity data to obtain the attitude information of the autonomous robot;
the specific-force conversion module is configured to perform specific-force conversion on the acceleration data to obtain the specific-force information of the autonomous robot;
the coordinate transformation module is configured to perform coordinate transformation on the specific-force information to obtain coordinate specific-force information for each axis;
the table lookup module is configured to use the attitude information and the coordinate specific-force information as a comparison basis and to compare this basis against a preset fuzzy rule table to obtain the collision situation of the autonomous robot.
Optionally, the data fusion module is a Kalman filtering module.
Optionally, the table lookup module is configured to compare the attitude information against multiple preset attitude-information thresholds of the fuzzy rule table and the coordinate specific-force information against multiple preset coordinate specific-force thresholds of the fuzzy rule table, and to obtain the collision situation from the comparison results.
Optionally, the device further comprises a collision location calculation module, wherein:
the collision location calculation module is configured to obtain the collision location according to position information of the autonomous robot and the collision situation.
A collision detection system for an autonomous robot comprises an inertial sensor and the collision detection device described above, wherein:
the inertial sensor is configured to acquire the acceleration data and the angular velocity data.
As can be seen from the above technical scheme, the present application discloses a collision detection method, device and system for an autonomous robot. The method, device and system process the acquired acceleration data and angular velocity data: attitude information of the autonomous robot is obtained through data fusion, the coordinate specific-force information of the autonomous robot is obtained through coordinate transformation, and a lookup is performed in a preset fuzzy rule table according to the attitude information and coordinate specific-force information to obtain the collision situation of the autonomous robot, so that corresponding measures can be taken according to the collision situation to ensure that the autonomous robot operates normally.
Accompanying drawing explanation
In order to explain the embodiments of the present application or the technical schemes in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of a collision detection method for an autonomous robot provided by an embodiment of the present application;
Fig. 2 is a flow chart of a collision detection method for an autonomous robot provided by another embodiment of the present application;
Fig. 3 is a schematic diagram of a collision detection device for an autonomous robot provided by a further embodiment of the present application;
Fig. 4 is a schematic diagram of a collision detection device for an autonomous robot provided by a further embodiment of the present application;
Fig. 5 is a schematic diagram of a collision detection system for an autonomous robot provided by a further embodiment of the present application.
Embodiment
The technical schemes in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
Embodiment one
Fig. 1 is a flow chart of the collision detection method for an autonomous robot provided by an embodiment of the present application.
As shown in Fig. 1, the collision detection method provided by this embodiment comprises the following steps:
S101: calculate the attitude information of the autonomous robot.
After the acceleration data and angular velocity data of the autonomous robot are acquired, the two are fused to obtain the attitude information of the autonomous robot.
The acceleration data are the accelerations along the three axes, and the angular velocity data are the angular velocities about the three axes. Data fusion is then carried out by a Kalman filtering algorithm to obtain the attitude information.
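The patent specifies Kalman filtering but does not publish the filter equations. As an illustrative sketch, the fusion of one Euler angle can be written as a one-state Kalman filter that blends the gyro-integrated angle with the accelerometer-derived angle; the noise parameters `q` and `r` below are assumptions, not values from the patent.

```python
import math

class AngleKalman:
    """Minimal 1-state Kalman filter fusing a gyro-integrated angle with
    an accelerometer-derived angle (pitch or roll). Noise parameters are
    illustrative placeholders."""
    def __init__(self, q=0.001, r=0.03):
        self.angle = 0.0   # fused angle estimate (rad)
        self.p = 1.0       # estimate variance
        self.q = q         # process noise (gyro drift)
        self.r = r         # measurement noise (accelerometer)

    def update(self, gyro_rate, accel_angle, dt):
        # Predict: integrate the angular-rate measurement over dt.
        self.angle += gyro_rate * dt
        self.p += self.q
        # Correct with the accelerometer-derived angle.
        k = self.p / (self.p + self.r)
        self.angle += k * (accel_angle - self.angle)
        self.p *= (1.0 - k)
        return self.angle

def accel_pitch(ax, ay, az):
    """Pitch angle implied by a static accelerometer reading in g."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))
```

In practice one such filter (or one joint multi-state filter) would run per attitude angle, fed by the three-axis IMU sample stream.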
S102: perform specific-force conversion on the acceleration data.
The specific-force conversion is applied to the acceleration data, preferably calculating from the three axial acceleration readings mentioned above to obtain the specific-force information along the three axes. The specific force equals the acceleration reading multiplied by the gravitational acceleration and characterizes the forces acting along the three axes.
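A minimal sketch of this conversion, assuming (as the definition above implies) that the accelerometer reports readings in units of g; the function name is illustrative, not from the patent:

```python
G = 9.80665  # standard gravitational acceleration, m/s^2

def specific_force(accel_g):
    """Convert raw three-axis accelerometer readings expressed in g into
    specific force along each body axis (m/s^2), per the definition that
    specific force equals the acceleration reading times g."""
    return tuple(a * G for a in accel_g)
```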
S103: transform the specific-force information into another coordinate system.
The specific-force information is subjected to an Euler-angle coordinate transformation, preferably one that transforms its coordinate system into the earth coordinate system characterizing the actual geographic position. For ease of distinction, the transformed result is referred to as the earth-coordinate specific-force information.
S104: obtain the collision situation from the preset fuzzy rule table.
The fuzzy rule table is a two-dimensional lookup table obtained from engineering practice. It contains multiple preset attitude-information thresholds and multiple preset earth-coordinate specific-force thresholds, together with the collision situation corresponding to each pair of these parameters; the collision situations include forward collision, backward collision, leftward collision, rightward collision, and being lifted with the wheels idling.
When the table is consulted, the attitude information is compared one by one against the preset attitude-information thresholds while the earth-coordinate specific-force information is compared one by one against the preset earth-coordinate specific-force thresholds. When the attitude information matches a given attitude-information threshold and, at the same time, the earth-coordinate specific-force information matches a given earth-coordinate specific-force threshold, the collision situation corresponding to that pair of thresholds is judged to have occurred. This determines what kind of collision the autonomous robot is currently undergoing.
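Since the table's threshold values come from engineering practice and are not published in the patent, the lookup can only be illustrated with placeholder numbers. The sketch below mimics the table's structure, mapping attitude plus earth-frame specific force to one of the five collision situations; every threshold and the lift test are assumptions.

```python
def classify(attitude, f_earth, thr=3.0):
    """Illustrative stand-in for the fuzzy rule table: attitude (pitch,
    roll in rad) and earth-frame specific force (m/s^2) in, collision
    situation out. All numeric thresholds are placeholders."""
    pitch, roll = attitude
    fx, fy, fz = f_earth
    # Lifted: near-level attitude but almost no vertical support force.
    if fz < 2.0 and abs(pitch) < 0.2 and abs(roll) < 0.2:
        return "lifted / wheels idling"
    if fx > thr:
        return "forward collision"
    if fx < -thr:
        return "backward collision"
    if fy > thr:
        return "leftward collision"
    if fy < -thr:
        return "rightward collision"
    return "no collision"
```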
As can be seen from the above technical scheme, this embodiment provides a collision detection method for an autonomous robot. The method processes the acquired acceleration data and angular velocity data: attitude information of the autonomous robot is obtained through data fusion, the coordinate specific-force information of the autonomous robot is obtained through coordinate transformation, and a lookup is performed in the preset fuzzy rule table according to the attitude information and coordinate specific-force information to obtain the collision situation, so that corresponding measures can be taken according to the collision situation to ensure that the autonomous robot operates normally.
Embodiment two
After a collision of the autonomous robot has been detected, if the location where the collision occurred can also be obtained, the robot can take corresponding avoidance measures based on the specific conditions of the space it occupies: for example, it can turn around or retreat after hitting a wall, or choose to climb over or steer around a smaller obstacle in time. For this purpose, the present application also provides the following embodiment for determining the collision location.
Fig. 2 is a flow chart of the collision detection method for an autonomous robot provided by another embodiment of the present application.
The collision detection method provided by this embodiment builds on the previous embodiment with a partial improvement; the complete flow is shown in Fig. 2.
S201: calculate the attitude information of the autonomous robot.
After the acceleration data and angular velocity data of the autonomous robot are acquired, the two are fused to obtain the attitude information of the autonomous robot.
The acceleration data are the accelerations along the three axes, and the angular velocity data are the angular velocities about the three axes. Data fusion is then carried out by a Kalman filtering algorithm to obtain the attitude information.
S202: perform specific-force conversion on the acceleration data.
The specific-force conversion calculates from the three axial acceleration readings mentioned above to obtain the specific-force information along the three axes. The specific force equals the acceleration reading multiplied by the gravitational acceleration and characterizes the forces acting along the three axes.
S203: transform the specific-force information into another coordinate system.
The specific-force information is subjected to an Euler-angle coordinate transformation, preferably one that transforms its coordinate system into the earth coordinate system characterizing the actual geographic position. For ease of distinction, the transformed result is referred to as the earth-coordinate specific-force information.
S204: obtain the collision situation from the preset fuzzy rule table.
The fuzzy rule table is a two-dimensional lookup table obtained from engineering practice. It contains multiple preset attitude-information thresholds and multiple preset earth-coordinate specific-force thresholds, together with the collision situation corresponding to each pair of these parameters; the collision situations include forward collision, backward collision, leftward collision, rightward collision, and being lifted with the wheels idling.
S205: determine the collision location.
An autonomous robot is generally provided with an odometer that outputs position information reflecting its movement, comprising a left-wheel encoder and a right-wheel encoder. From this position information, the actual geographic position of the robot while moving can be obtained; the geographic position at the moment of collision is the collision location.
The autonomous robot can then take corresponding avoidance or handling measures according to the collision location and its surrounding environment.
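The patent does not give the odometry equations. One standard way the left and right encoder readings could yield the position used here is differential-drive dead reckoning; the wheel-base model below is an assumption for illustration.

```python
import math

def odom_step(x, y, heading, d_left, d_right, wheel_base):
    """One differential-drive odometry update from the left/right encoder
    distance increments (metres). Standard dead reckoning; the kinematic
    model is an assumption, as the patent only says the odometer
    supplies position information."""
    d = (d_left + d_right) / 2.0               # distance of the centre point
    dtheta = (d_right - d_left) / wheel_base   # heading change
    # Integrate along the mid-arc heading for better accuracy.
    x += d * math.cos(heading + dtheta / 2.0)
    y += d * math.sin(heading + dtheta / 2.0)
    return x, y, heading + dtheta
```

At the instant the table lookup reports a collision, the current `(x, y)` is recorded as the collision location.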
Embodiment three
Fig. 3 is a schematic diagram of the collision detection device for an autonomous robot provided by a further embodiment of the present application.
As shown in Fig. 3, the collision detection device provided by this embodiment comprises a data fusion module 10, a specific-force conversion module 20, a coordinate transformation module 30 and a table lookup module 40.
The data fusion module 10 performs data fusion on the acceleration data and angular velocity data of the autonomous robot, obtaining and outputting the attitude information of the autonomous robot.
The acceleration data are the accelerations along the three axes, and the angular velocity data are the angular velocities about the three axes. The data fusion is carried out by a Kalman filtering algorithm to obtain and output the attitude information; correspondingly, the data fusion module 10 is a Kalman filtering module.
The specific-force conversion module 20 performs specific-force conversion on the acceleration data, calculating from the three axial acceleration readings the specific-force information along the three axes. The specific force equals the acceleration reading multiplied by the gravitational acceleration and characterizes the forces acting along the three axes.
The coordinate transformation module 30 performs coordinate transformation on the specific-force information, preferably an Euler-angle coordinate transformation that converts its coordinate system into the earth coordinate system characterizing the actual geographic position. For ease of distinction, the transformed result is referred to as the earth-coordinate specific-force information.
The table lookup module 40 obtains the collision situation from the preset fuzzy rule table.
The fuzzy rule table is a two-dimensional lookup table obtained from engineering practice. It contains multiple preset attitude-information thresholds and multiple preset earth-coordinate specific-force thresholds, together with the collision situation corresponding to each pair of these parameters; the collision situations include forward collision, backward collision, leftward collision, rightward collision, and being lifted with the wheels idling.
When consulting the fuzzy rule table, the table lookup module 40 compares the attitude information one by one against the preset attitude-information thresholds while comparing the earth-coordinate specific-force information one by one against the preset earth-coordinate specific-force thresholds. When the attitude information matches a given attitude-information threshold and, at the same time, the earth-coordinate specific-force information matches a given earth-coordinate specific-force threshold, the collision situation corresponding to that pair of thresholds is judged to have occurred. This determines what kind of collision the autonomous robot is currently undergoing.
As can be seen from the above technical scheme, this embodiment provides a collision detection device for an autonomous robot. The device processes the acquired acceleration data and angular velocity data: attitude information of the autonomous robot is obtained through data fusion, the coordinate specific-force information is obtained through coordinate transformation, and a lookup is performed in the preset fuzzy rule table according to the attitude information and coordinate specific-force information to obtain the collision situation, so that corresponding measures can be taken according to the collision situation to ensure that the autonomous robot operates normally.
Embodiment four
Fig. 4 is a schematic diagram of the collision detection device for an autonomous robot provided by a further embodiment of the present application.
For the reasons set forth in embodiment two, namely enabling the autonomous robot to take corresponding avoidance measures based on the specific conditions of the space it occupies (for example, turning around or retreating after hitting a wall, or climbing over or steering around a smaller obstacle in time), a collision location calculation module 50 is added on the basis of the previous embodiment.
The collision location calculation module 50 obtains the real-time geographic position of the robot while moving from the position information output by its odometer 100, and combines the collision situation obtained by the table lookup module 40 with this real-time geographic position to obtain the collision location. The autonomous robot can then take corresponding avoidance or handling measures according to the collision location and its surrounding environment.
Embodiment five
Fig. 5 is a schematic diagram of the collision detection system for an autonomous robot provided by a further embodiment of the present application.
As shown in Fig. 5, the collision detection system provided by this embodiment comprises the collision detection device 200 provided by the preceding embodiments, with the addition of an inertial sensor 60 for acquiring the above-mentioned acceleration data and angular velocity data.
The inertial sensor 60 is mounted at an appropriate position on the autonomous robot, acquires the acceleration data and angular velocity data according to the motion of the robot, and outputs them to the collision detection device 200, which uses these data to calculate the collision situation.
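Putting the preceding modules together, the data flow of the system can be sketched end to end: one IMU sample in, one collision verdict out. Every numeric threshold, the complementary filter standing in for the Kalman filter, and the pitch-only rotation are simplifying assumptions made for brevity.

```python
import math

G = 9.80665  # standard gravity, m/s^2

def detect(accel_g, gyro, prev_pitch, dt, alpha=0.98, thr=3.0):
    """One step of the pipeline: fuse attitude, convert to specific
    force, rotate to the earth frame, and look up the collision rule.
    Thresholds and the 1-axis simplifications are illustrative only."""
    ax, ay, az = accel_g
    # 1. Data fusion (complementary filter standing in for the Kalman filter).
    acc_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    pitch = alpha * (prev_pitch + gyro[1] * dt) + (1 - alpha) * acc_pitch
    # 2. Specific-force conversion (g -> m/s^2).
    fx, fz = ax * G, az * G
    # 3. Body-to-earth transform (pitch-only rotation for brevity).
    fex = fx * math.cos(pitch) + fz * math.sin(pitch)
    fez = -fx * math.sin(pitch) + fz * math.cos(pitch)
    # 4. Rule-table lookup with placeholder thresholds.
    if fex > thr:
        verdict = "forward collision"
    elif fex < -thr:
        verdict = "backward collision"
    elif fez < 2.0:
        verdict = "lifted"
    else:
        verdict = "no collision"
    return pitch, verdict
```

In a full implementation each step would be its own module, mirroring modules 10 to 40 above, with the fused state carried between samples.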
Compared with collision detection using multiple collision sensors, where the detection capability of each sensor is inversely proportional to the volume of the object, so that many collision sensors are needed to detect the collision situation, imposing a considerable processing burden and cost, the present system needs only one inertial sensor together with the collision detection device to implement the collision detection function, thereby simplifying processing and reducing overall cost.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and identical or similar parts of the embodiments may be understood with reference to one another. The above description of the disclosed embodiments enables those skilled in the art to implement or use the present application. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present application. Therefore, the present application is not limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A collision detection method for an autonomous robot, characterized in that it comprises the following operations:
performing data fusion on acceleration data and angular velocity data of the autonomous robot to obtain attitude information of the autonomous robot;
performing specific-force conversion on the acceleration data to obtain specific-force information of the autonomous robot;
performing coordinate transformation on the specific-force information to obtain coordinate specific-force information for each axis;
using the attitude information and the coordinate specific-force information as a comparison basis, and comparing this basis against a preset fuzzy rule table to obtain the collision situation of the autonomous robot.
2. The collision detection method according to claim 1, characterized in that performing data fusion on the acceleration data and the angular velocity data comprises:
performing Kalman filtering on the acceleration data and the angular velocity data to carry out the data fusion.
3. The collision detection method according to claim 1, characterized in that comparing the comparison basis against the preset fuzzy rule table to obtain the collision situation of the autonomous robot comprises:
comparing the attitude information against multiple preset attitude-information thresholds of the fuzzy rule table and the coordinate specific-force information against multiple preset coordinate specific-force thresholds of the fuzzy rule table, and obtaining the collision situation from the comparison results.
4. The collision detection method according to claim 1, characterized in that performing coordinate transformation on the specific-force information to obtain the coordinate specific-force information for each axis comprises:
performing an Euler-angle coordinate transformation on the specific-force information to obtain the specific-force information for each axis in the earth coordinate system.
5. The collision detection method according to any one of claims 1 to 4, characterized in that it further comprises:
obtaining the collision location according to position information of the autonomous robot and the collision situation.
6. A collision detection device for an autonomous robot, characterized in that it comprises a data fusion module, a specific-force conversion module, a coordinate transformation module and a table lookup module, wherein:
the data fusion module is configured to perform data fusion on the acceleration data and the angular velocity data to obtain the attitude information of the autonomous robot;
the specific-force conversion module is configured to perform specific-force conversion on the acceleration data to obtain the specific-force information of the autonomous robot;
the coordinate transformation module is configured to perform coordinate transformation on the specific-force information to obtain coordinate specific-force information for each axis;
the table lookup module is configured to use the attitude information and the coordinate specific-force information as a comparison basis and to compare this basis against a preset fuzzy rule table to obtain the collision situation of the autonomous robot.
7. The collision detection device according to claim 6, characterized in that the data fusion module is a Kalman filtering module.
8. The collision detection device according to claim 6, characterized in that the table lookup module is configured to compare the attitude information against multiple preset attitude-information thresholds of the fuzzy rule table and the coordinate specific-force information against multiple preset coordinate specific-force thresholds of the fuzzy rule table, and to obtain the collision situation from the comparison results.
9. The collision detection device according to any one of claims 6 to 8, characterized in that it further comprises a collision location calculation module, wherein:
the collision location calculation module is configured to obtain the collision location according to position information of the autonomous robot and the collision situation.
10. A collision detection system for an autonomous robot, characterized in that it comprises an inertial sensor and the collision detection device according to any one of claims 6 to 9, wherein:
the inertial sensor is configured to acquire the acceleration data and the angular velocity data.
CN201510367447.7A 2015-06-29 2015-06-29 Collision detection method, device and system for autonomous robot Active CN105004368B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510367447.7A CN105004368B (en) 2015-06-29 2015-06-29 Collision detection method, device and system for autonomous robot

Publications (2)

Publication Number Publication Date
CN105004368A true CN105004368A (en) 2015-10-28
CN105004368B CN105004368B (en) 2018-03-20

Family

ID=54377131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510367447.7A Active CN105004368B (en) 2015-06-29 2015-06-29 Collision detection method, device and system for autonomous robot

Country Status (1)

Country Link
CN (1) CN105004368B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107038874A (en) * 2017-06-01 2017-08-11 广东工业大学 A kind of traffic accident monitoring method and device
CN108367437A (en) * 2015-12-08 2018-08-03 库卡德国有限公司 Identify the method that robots arm is bumped against with object and the robot with robots arm
CN111750873A (en) * 2019-03-26 2020-10-09 东元电机股份有限公司 Mobile platform picture data correction system
CN112033398A (en) * 2020-07-24 2020-12-04 江苏美的清洁电器股份有限公司 Collision detection system and method for sweeping robot
SE2250611A1 (en) * 2022-05-20 2023-11-21 Husqvarna Ab Autonomous work tool and method for operation thereof

Citations (5)

Publication number Priority date Publication date Assignee Title
WO2010079660A1 (en) * 2009-01-06 2010-07-15 有限会社レプトリノ Force sensor
CN102426391A (en) * 2011-09-05 2012-04-25 华南理工大学 Method for determining whether there is collision during robot operation
CN102445920A (en) * 2010-09-07 2012-05-09 罗伯特·博世有限公司 Collision detection method for a drive unit
CN102554939A (en) * 2010-12-30 2012-07-11 沈阳新松机器人自动化股份有限公司 Method and device for collision protection of industrial robot
CN104269075A (en) * 2014-10-14 2015-01-07 武汉理工大学 Navigation mark collision monitoring system based on various sensors

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
魏延辉 等 (Wei Yanhui et al.): "Research on a combined positioning method for a two-wheeled self-balancing robot", 《机械与电子》 (Machinery & Electronics) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108367437A * 2015-12-08 2018-08-03 KUKA Deutschland GmbH Method for identifying a collision of a robot arm with an object, and robot having a robot arm
CN107038874A * 2017-06-01 2017-08-11 Guangdong University of Technology Traffic accident monitoring method and device
CN107038874B * 2017-06-01 2024-01-23 Guangdong University of Technology Traffic accident monitoring method and device
CN111750873A (en) * 2019-03-26 2020-10-09 东元电机股份有限公司 Mobile platform picture data correction system
CN112033398A (en) * 2020-07-24 2020-12-04 江苏美的清洁电器股份有限公司 Collision detection system and method for sweeping robot
CN112033398B (en) * 2020-07-24 2023-01-06 美智纵横科技有限责任公司 Collision detection system and method for sweeping robot
SE2250611A1 (en) * 2022-05-20 2023-11-21 Husqvarna Ab Autonomous work tool and method for operation thereof

Also Published As

Publication number Publication date
CN105004368B (en) 2018-03-20

Similar Documents

Publication Publication Date Title
US10823576B2 (en) Systems and methods for robotic mapping
CN105004368A (en) Collision detection method, device and system for autonomous robot
US11059174B2 (en) System and method of controlling obstacle avoidance of robot, robot and storage medium
CN103294059B Mobile robot positioning system and method based on hybrid navigation band
CN107168186B Four-point automatic leveling control system based on six-axis combination sensors and its working method
CN104635730B Autonomous charging method for a robot
CN104777835A Omni-directional automatic forklift and 3D stereoscopic vision navigation and positioning method
CN205121338U (en) AGV navigation based on image recognition and wireless network
CN104916216A (en) Map construction method and system thereof
CN106682563A (en) Lane line detection self-adaptive adjusting method and device
CN113593284B (en) Method and device for planning path of vehicle in mine roadway and electronic equipment
CN107539887A Anti-collision early-warning auxiliary system for a construction crane group
CN102034005B (en) Shield machine posture simulation detection system for shield tunnel construction
CN108759829A Local obstacle-avoidance route planning method for an intelligent forklift
CN103309351A (en) Maintenance robot obstacle avoidance planning method
CN104569958A (en) Target positioning method and system based on ultrasonic wave and inertial navigation combination
CN108544491A Mobile robot obstacle avoidance method considering the two factors of distance and direction
CN205175416U Mobile robot positioning system based on laser and an inertial measurement unit
CN116734757A (en) Tunnel surrounding rock deformation monitoring and early warning method based on unmanned aerial vehicle-mounted laser scanner
Kim et al. Safety control of automatic excavator for swing collision avoidance
CN102707301A (en) Positioning device and positioning method thereof
CN115540854A (en) Active positioning method, equipment and medium based on UWB assistance
CN111595328B (en) Real obstacle map construction and navigation method and system based on depth camera
CN205028160U Measurement-resolving and control device for autonomous landing of an unmanned aerial vehicle
CN104540098B (en) Location positioning method, system and server based on crowdsourcing
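The abstract of the present application describes a pipeline: fuse acceleration and angular-velocity data into an attitude estimate, convert coordinates to obtain the robot's specific-force information, then look up a preset fuzzy rule table to classify the collision situation. The patent text itself gives no code; the following is a minimal illustrative sketch of that kind of pipeline, in which the filter weight, bin thresholds, and rule-table labels are all assumptions and not taken from the patent.

```python
import math

ALPHA = 0.98  # complementary-filter weight (assumed)
DT = 0.01     # sample period in seconds (assumed)

def fuse_attitude(pitch_prev, gyro_rate, ax, az):
    """Data fusion: blend the integrated gyro rate with the
    accelerometer's gravity-derived pitch angle (complementary filter)."""
    accel_pitch = math.atan2(ax, az)
    return ALPHA * (pitch_prev + gyro_rate * DT) + (1 - ALPHA) * accel_pitch

def body_specific_force(pitch, ax, az):
    """Coordinate conversion: rotate the sensed specific force into the
    level frame so gravity can be subtracted, leaving motion/impact force."""
    fx = ax * math.cos(pitch) - az * math.sin(pitch)
    fz = ax * math.sin(pitch) + az * math.cos(pitch)
    return fx, fz - 9.81  # remove gravity (m/s^2)

def lookup_collision(pitch, fx):
    """Table lookup: coarse attitude and specific-force bins map to a
    collision verdict (bins and labels are illustrative only)."""
    tilt = "large" if abs(pitch) > math.radians(10) else "small"
    force = "large" if abs(fx) > 3.0 else "small"
    table = {
        ("small", "small"): "no collision",
        ("small", "large"): "frontal collision",
        ("large", "small"): "climbing/slope",
        ("large", "large"): "collision on slope",
    }
    return table[(tilt, force)]

# One sample step: nearly level robot, small forward force.
pitch = fuse_attitude(0.0, 0.02, 0.3, 9.7)
fx, _ = body_specific_force(pitch, 0.3, 9.7)
print(lookup_collision(pitch, fx))  # prints "no collision"
```

A real implementation would run this per sample and debounce the verdict over several readings before reacting; the two-bin table stands in for whatever fuzzy rule granularity the patent actually claims.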

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180115

Address after: 16th floor, Building B1, Nanshan Zhiyuan, No. 1001 Xueyuan Road, Taoyuan Street, Nanshan District, Shenzhen, Guangdong 518055

Applicant after: SHENZHEN LD ROBOT Co.,Ltd.

Address before: Floor 6, Building 2, Block 8, Tongfuyu Industrial City, Tanglang, Xili, Nanshan District, Shenzhen, Guangdong 518055

Applicant before: INMOTION TECHNOLOGIES Co.,Ltd.

GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 room 1601, building 2, Vanke Cloud City phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province (16th floor, block a, building 6, Shenzhen International Innovation Valley)

Patentee after: Shenzhen Ledong robot Co.,Ltd.

Address before: 16th floor, Building B1, Nanshan Zhiyuan, No. 1001 Xueyuan Road, Taoyuan Street, Nanshan District, Shenzhen, Guangdong 518055

Patentee before: SHENZHEN LD ROBOT Co.,Ltd.
