Summary of the Invention
In view of the above, the present application provides a collision detection method, apparatus and system for an autonomous robot, which determine the collision situation after a collision occurs and serve as the basis for taking corresponding measures, so as to ensure that the robot continues to work normally.
To achieve the above object, the proposed scheme is as follows:
A collision detection method for an autonomous robot comprises the following operations:
performing data fusion on the acceleration data and the angular velocity data of the autonomous robot to obtain attitude information of the autonomous robot;
performing specific-force conversion on the acceleration data to obtain specific-force information of the autonomous robot;
performing a coordinate transformation on the specific-force information to obtain the specific-force information of each axis;
using the attitude information and the per-axis specific-force information as a comparison basis, and comparing this basis against a preset fuzzy reasoning table to obtain the collision situation of the autonomous robot.
Optionally, performing data fusion on the acceleration data and the angular velocity data comprises:
performing Kalman filtering on the acceleration data and the angular velocity data to carry out the data fusion.
Optionally, comparing the comparison basis against the preset fuzzy reasoning table to obtain the collision situation of the autonomous robot comprises:
comparing the attitude information with multiple preset attitude information thresholds of the fuzzy reasoning table, and comparing the per-axis specific-force information with multiple preset coordinate specific-force thresholds of the fuzzy reasoning table, respectively, and obtaining the collision situation according to the comparison results.
Optionally, performing the coordinate transformation on the specific-force information to obtain the specific-force information of each axis comprises:
performing an Euler-angle coordinate transformation on the specific-force information to obtain the specific-force information of each axis in the earth coordinate system.
Optionally, the method further comprises:
obtaining the collision location according to the position information of the autonomous robot and the collision situation.
A collision detection device for an autonomous robot comprises a data fusion module, a specific-force conversion module, a coordinate transformation module and a table look-up module, wherein:
the data fusion module is configured to perform data fusion on the acceleration data and the angular velocity data to obtain the attitude information of the autonomous robot;
the specific-force conversion module is configured to perform specific-force conversion on the acceleration data to obtain the specific-force information of the autonomous robot;
the coordinate transformation module is configured to perform a coordinate transformation on the specific-force information to obtain the specific-force information of each axis;
the table look-up module is configured to use the attitude information and the per-axis specific-force information as a comparison basis, and to compare this basis against a preset fuzzy reasoning table to obtain the collision situation of the autonomous robot.
Optionally, the data fusion module is a Kalman filtering module.
Optionally, the table look-up module is configured to compare the attitude information with multiple preset attitude information thresholds of the fuzzy reasoning table, and to compare the per-axis specific-force information with multiple preset coordinate specific-force thresholds listed in the fuzzy reasoning table, respectively, and to obtain the collision situation according to the comparison results.
Optionally, the device further comprises a collision location calculation module, wherein:
the collision location calculation module is configured to obtain the collision location according to the position information of the autonomous robot and the collision situation.
A collision detection system for an autonomous robot comprises an inertial sensor and the collision detection device as described in any one of claims 6 to 9, wherein:
the inertial sensor is configured to acquire the acceleration data and the angular velocity data.
As can be seen from the above technical scheme, the present application discloses a collision detection method, apparatus and system for an autonomous robot. The method, apparatus and system process the acquired acceleration data and angular velocity data: the attitude information of the autonomous robot is obtained by data fusion, the per-axis specific-force information of the autonomous robot is obtained by a coordinate transformation, and a look-up operation is then performed on a preset fuzzy reasoning table according to the attitude information and the specific-force information, thereby obtaining the collision situation of the autonomous robot. Corresponding measures can then be taken according to the collision situation, so as to ensure that the autonomous robot continues to work normally.
Embodiments
The technical schemes in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative work shall fall within the scope of protection of the present application.
Embodiment one
Fig. 1 is a flowchart of a collision detection method for an autonomous robot provided by an embodiment of the present application.
As shown in Fig. 1, the collision detection method provided by this embodiment comprises the following steps:
S101: calculate the attitude information of the autonomous robot.
After the acceleration data and the angular velocity data of the autonomous robot are acquired, data fusion is performed on both to obtain the attitude information of the autonomous robot.
The acceleration data are the acceleration values along the three axes, and the angular velocity data are the angular velocity values about the three axes. Data fusion is then performed by a Kalman filtering algorithm to obtain the attitude information.
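As a concrete illustration of the Kalman filtering step above, the following minimal single-axis sketch fuses a gyro rate (prediction) with an accelerometer-derived angle (measurement). The noise parameters `Q` and `R` and the helper `accel_pitch` are illustrative assumptions, not values specified in the present application.

```python
import math

def kalman_attitude_step(angle, P, gyro_rate, accel_angle, dt,
                         Q=1e-4, R=1e-2):
    """One Kalman-filter step for a single attitude angle.

    Predicts by integrating the gyro rate, then corrects with an
    accelerometer-derived angle. Q and R are illustrative process
    and measurement noise variances."""
    # Predict: integrate the angular rate over dt
    angle_pred = angle + gyro_rate * dt
    P_pred = P + Q
    # Update: correct the prediction with the accelerometer angle
    K = P_pred / (P_pred + R)                  # Kalman gain
    angle_new = angle_pred + K * (accel_angle - angle_pred)
    P_new = (1.0 - K) * P_pred
    return angle_new, P_new

def accel_pitch(ax, az):
    """Pitch angle (rad) inferred from two accelerometer axes."""
    return math.atan2(ax, az)
```

In practice this step would be run once per sensor sample for each attitude axis, with the filtered angles forming the attitude information used in the later table look-up.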
S102: perform specific-force conversion on the acceleration data.
Specific-force conversion is performed on the acceleration data, preferably by calculating from the three axial acceleration values described above to obtain the specific-force values along the three axes. The specific force equals the acceleration multiplied by the gravitational acceleration, and characterizes the forces experienced along the three axes.
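Under the definition given above (acceleration multiplied by the gravitational acceleration), the conversion can be sketched as follows; this assumes the accelerometer reports readings in units of g, which the text does not state explicitly.

```python
G = 9.80665  # standard gravitational acceleration, m/s^2

def specific_force(accel_g):
    """Convert three-axis accelerometer readings given in units of g
    into specific-force values in m/s^2, per the definition in the
    text (acceleration multiplied by the gravitational acceleration)."""
    ax, ay, az = accel_g
    return (ax * G, ay * G, az * G)
```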
S103: perform a coordinate-system transformation on the specific-force information.
An angular coordinate transformation is performed on the specific-force information; preferably, its coordinate system is transformed by the Euler-angle coordinate transformation method into the earth coordinate system, which characterizes the actual geographic position. For ease of distinction, the transformed values are referred to as the earth-coordinate specific-force information.
S104: obtain the collision situation according to the preset fuzzy reasoning table.
The fuzzy reasoning table is a two-dimensional look-up table obtained from engineering practice. It comprises multiple preset attitude information thresholds and multiple preset earth-coordinate specific-force thresholds, together with the collision situation associated with each pair of these two parameters. The collision situations include a forward collision, a backward collision, a leftward collision, a rightward collision, and being lifted and idling.
When the fuzzy reasoning table is looked up, the attitude information is compared one by one with the multiple preset attitude information thresholds, and at the same time the earth-coordinate specific-force information is compared one by one with the multiple preset earth-coordinate specific-force thresholds. When the attitude information matches a certain preset attitude threshold and the earth-coordinate specific-force information simultaneously matches a certain preset specific-force threshold, it is judged that the collision situation corresponding to both thresholds has occurred, thereby determining which type of collision the autonomous robot is currently undergoing.
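The two-parameter look-up described above can be sketched as a list of threshold intervals. The table rows and every threshold value below are purely hypothetical stand-ins; the real table is said to come from engineering practice and is not disclosed.

```python
# Hypothetical fuzzy reasoning table: each row pairs an attitude
# interval and an earth-coordinate specific-force interval with a
# collision situation. All numbers are illustrative only.
FUZZY_TABLE = [
    # (att_lo, att_hi, force_lo, force_hi, collision situation)
    (-0.1, 0.1,   5.0,  50.0, "forward collision"),
    (-0.1, 0.1, -50.0,  -5.0, "backward collision"),
    ( 0.1, 1.0,   5.0,  50.0, "leftward collision"),
    (-1.0, -0.1,  5.0,  50.0, "rightward collision"),
    (-0.1, 0.1,  -1.0,   1.0, "lifted and idling"),
]

def look_up_collision(attitude, force):
    """Return the collision situation whose attitude AND
    specific-force thresholds both match, or None otherwise."""
    for a_lo, a_hi, f_lo, f_hi, situation in FUZZY_TABLE:
        if a_lo <= attitude <= a_hi and f_lo <= force <= f_hi:
            return situation
    return None
```

Note that a row fires only when both conditions hold simultaneously, mirroring the requirement that the attitude match and the specific-force match occur at the same time.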
As can be seen from the above technical scheme, this embodiment provides a collision detection method for an autonomous robot. The method processes the acquired acceleration data and angular velocity data: the attitude information of the autonomous robot is obtained by data fusion, the per-axis specific-force information is obtained by a coordinate transformation, and a look-up operation is then performed on a preset fuzzy reasoning table according to the attitude information and the specific-force information, thereby obtaining the collision situation of the autonomous robot. Corresponding measures can then be taken according to the collision situation, so as to ensure that the autonomous robot continues to work normally.
Embodiment two
After the collision situation of the autonomous robot is detected, if the location where the collision occurred can also be obtained, corresponding avoidance measures can be taken according to the specific conditions of the surrounding space. For example, the robot can turn around or back up after hitting a wall, and can choose to drive straight over or steer around a smaller obstacle in time. To this end, the present application also provides the following embodiment for determining the collision location.
Fig. 2 is a flowchart of a collision detection method for an autonomous robot provided by another embodiment of the present application.
The collision detection method provided by this embodiment makes a partial improvement on the basis of the previous embodiment; the complete flow is shown in Fig. 2.
S201: calculate the attitude information of the autonomous robot.
After the acceleration data and the angular velocity data of the autonomous robot are acquired, data fusion is performed on both to obtain the attitude information of the autonomous robot.
The acceleration data are the acceleration values along the three axes, and the angular velocity data are the angular velocity values about the three axes. Data fusion is then performed by a Kalman filtering algorithm to obtain the attitude information.
S202: perform specific-force conversion on the acceleration data.
Specific-force conversion is performed on the acceleration data by calculating from the three axial acceleration values described above to obtain the specific-force values along the three axes. The specific force equals the acceleration multiplied by the gravitational acceleration, and characterizes the forces experienced along the three axes.
S203: perform a coordinate-system transformation on the specific-force information.
An angular coordinate transformation is performed on the specific-force information; preferably, its coordinate system is transformed by the Euler-angle coordinate transformation method into the earth coordinate system, which characterizes the actual geographic position. For ease of distinction, the transformed values are referred to as the earth-coordinate specific-force information.
S204: obtain the collision situation according to the preset fuzzy reasoning table.
The fuzzy reasoning table is a two-dimensional look-up table obtained from engineering practice. It comprises multiple preset attitude information thresholds and multiple preset earth-coordinate specific-force thresholds, together with the collision situation associated with each pair of these two parameters. The collision situations include a forward collision, a backward collision, a leftward collision, a rightward collision, and being lifted and idling.
S205: determine the collision location.
The autonomous robot is generally provided with an odometer, comprising a left encoder and a right encoder, which outputs position information reflecting the robot's movement. From this position information, the true geographic location of the autonomous robot while moving can be obtained, and the geographic location at the moment of collision is the collision location.
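The position information from the left and right encoders can be turned into a location by standard differential-drive dead reckoning, sketched below. The odometer interface is not specified in the text, so the wheel-base parameter and the per-step encoder displacements are assumptions.

```python
import math

def dead_reckon(x, y, theta, d_left, d_right, wheel_base):
    """Update a differential-drive pose (x, y, heading theta) from
    the distances d_left and d_right travelled by the left and right
    wheels since the last update. wheel_base is the wheel separation."""
    d_center = (d_left + d_right) / 2.0        # forward displacement
    d_theta = (d_right - d_left) / wheel_base  # heading change
    # Move along the average heading over the step
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

The pose at the instant the table look-up reports a collision would then be recorded as the collision location.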
The autonomous robot can thus take corresponding avoidance or handling measures according to the collision location and the surrounding environment.
Embodiment three
Fig. 3 is a schematic diagram of a collision detection device for an autonomous robot provided by yet another embodiment of the present application.
As shown in Fig. 3, the collision detection device provided by this embodiment comprises a data fusion module 10, a specific-force conversion module 20, a coordinate transformation module 30 and a table look-up module 40.
The data fusion module 10 is configured to perform data fusion on the acceleration data and the angular velocity data of the autonomous robot, and to obtain and output the attitude information of the autonomous robot.
The acceleration data are the acceleration values along the three axes, and the angular velocity data are the angular velocity values about the three axes. Data fusion is performed by a Kalman filtering algorithm to obtain and output the attitude information; accordingly, the data fusion module 10 is a Kalman filtering module.
The specific-force conversion module 20 is configured to perform specific-force conversion on the acceleration data, calculating from the three axial acceleration values output by the data fusion module 10 to obtain the specific-force values along the three axes. The specific force equals the acceleration multiplied by the gravitational acceleration, and characterizes the forces experienced along the three axes.
The coordinate transformation module 30 is configured to perform a coordinate transformation on the specific-force information, preferably an Euler-angle coordinate transformation, which transforms its coordinate system into the earth coordinate system characterizing the actual geographic position. For ease of distinction, the transformed values are referred to as the earth-coordinate specific-force information.
The table look-up module 40 is configured to obtain the collision situation according to the preset fuzzy reasoning table.
The fuzzy reasoning table is a two-dimensional look-up table obtained from engineering practice. It comprises multiple preset attitude information thresholds and multiple preset earth-coordinate specific-force thresholds, together with the collision situation associated with each pair of these two parameters. The collision situations include a forward collision, a backward collision, a leftward collision, a rightward collision, and being lifted and idling.
When looking up the fuzzy reasoning table, the table look-up module 40 compares the attitude information one by one with the multiple preset attitude information thresholds, and at the same time compares the earth-coordinate specific-force information one by one with the multiple preset earth-coordinate specific-force thresholds. When the attitude information matches a certain preset attitude threshold and the earth-coordinate specific-force information simultaneously matches a certain preset specific-force threshold, it is judged that the collision situation corresponding to both thresholds has occurred, thereby determining which type of collision the autonomous robot is currently undergoing.
As can be seen from the above technical scheme, this embodiment provides a collision detection device for an autonomous robot. The device processes the acquired acceleration data and angular velocity data: the attitude information of the autonomous robot is obtained by data fusion, the per-axis specific-force information is obtained by a coordinate transformation, and a look-up operation is then performed on a preset fuzzy reasoning table according to the attitude information and the specific-force information, thereby obtaining the collision situation of the autonomous robot. Corresponding measures can then be taken according to the collision situation, so as to ensure that the autonomous robot continues to work normally.
Embodiment four
Fig. 4 is a schematic diagram of a collision detection device for an autonomous robot provided by yet another embodiment of the present application.
For the reasons set forth in Embodiment two, and to enable the autonomous robot to take corresponding avoidance measures according to the specific conditions of the surrounding space (for example, turning around or backing up after hitting a wall, or choosing to drive straight over or steer around a smaller obstacle in time), a collision location calculation module 50 is added on the basis of the previous embodiment.
The collision location calculation module 50 is configured to obtain the real-time geographic location of the robot while moving from the position information output by its odometer 100, and to combine the collision situation obtained by the table look-up module 40 with this real-time geographic location to obtain the collision location. The autonomous robot can thus take corresponding avoidance or handling measures according to the collision location and the surrounding environment.
Embodiment five
Fig. 5 is a schematic diagram of a collision detection system for an autonomous robot provided by yet another embodiment of the present application.
As shown in Fig. 5, the collision detection system provided by this embodiment comprises the collision detection device 200 provided by the embodiments above, to which an inertial sensor 60 for acquiring the above acceleration data and angular velocity data is added.
The inertial sensor 60 is installed at a corresponding position on the autonomous robot, acquires the acceleration data and angular velocity data according to the robot's motion, and outputs them to the collision detection device 200. The collision detection device 200 uses these acceleration data and angular velocity data to calculate the collision situation.
Compared with methods that perform collision detection using multiple collision sensors, in which the detection capability of each sensor is inversely proportional to the volume it must cover, so that many collision sensors are needed to detect an object's collision situation at a considerable burden in both processing and cost, the present system needs only one inertial sensor together with the collision detection device to realize the collision detection function, thereby simplifying processing and reducing the overall cost.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts the embodiments may be referred to one another. The above description of the disclosed embodiments enables those skilled in the art to implement or use the present application. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present application. Therefore, the present application shall not be limited to the embodiments shown herein, but shall be accorded the widest scope consistent with the principles and novel features disclosed herein.