Summary of the invention
The present disclosure provides a localization method and device for a robot, an electronic device, and a computer-readable storage medium, so as to address deficiencies in the related art.
According to a first aspect of embodiments of the present disclosure, a localization method for a robot is provided, the robot being configured with an image acquisition unit and a distance measuring unit. The method includes:
determining, according to current image data collected by the image acquisition unit, history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical moment;
obtaining history pose information of the robot corresponding to the time when the history image data was collected;
determining current pose information of the robot according to ranging data currently collected by the distance measuring unit; and
localizing the current position of the robot according to the history pose information and the current pose information.
Optionally, the method further includes: determining whether a kidnapping event has occurred to the robot.
Localizing the current position of the robot according to the history pose information and the current pose information includes: when it is determined that a kidnapping event has occurred to the robot, localizing the position of the robot after the kidnapping event according to the history pose information and the current pose information.
Optionally, determining whether a kidnapping event has occurred to the robot includes: determining that a kidnapping event has occurred when the image data collected by the image acquisition unit and/or the ranging data collected by the distance measuring unit change abruptly.
Optionally, determining the history image data matching the current image data according to the current image data collected by the image acquisition unit includes: determining that the history image data matches the current image data when the similarity between the two exceeds a preset threshold.
Optionally, determining the history image data matching the current image data according to the current image data collected by the image acquisition unit includes: determining that the history image data matches the current image data when the two contain one or more of the same collected objects.
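The two matching criteria above (similarity threshold, shared collected objects) can be combined in a small sketch. The feature representation and threshold are assumptions for illustration; a real system might use ORB/SIFT descriptors or a learned embedding instead.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two image feature vectors
    (e.g. colour histograms or embeddings)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def images_match(curr_feat, hist_feat, curr_objects, hist_objects,
                 sim_threshold=0.9):
    """A history image matches the current image if the feature
    similarity exceeds a preset threshold, or if the two images
    contain at least one common detected object."""
    if cosine_similarity(curr_feat, hist_feat) > sim_threshold:
        return True
    return bool(set(curr_objects) & set(hist_objects))
```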
Optionally, localizing the current position of the robot according to the history pose information and the current pose information includes:
determining, in the history pose information, target history pose information matching the current pose information; and
localizing the current position of the robot in a map constructed from the distance measuring unit according to the target history pose information.
Optionally, localizing the current position of the robot according to the history pose information and the current pose information includes:
obtaining a three-dimensional environment map, the three-dimensional environment map being constructed from the history image data and the history pose information;
determining, based on the three-dimensional environment map, pose information corresponding to the current image data; and
localizing the current position of the robot in the map constructed from the distance measuring unit according to the determined pose information.
Optionally, the three-dimensional environment map is constructed in advance from the collected history image data and the history pose information; alternatively, the three-dimensional environment map is constructed from the history image data and the history pose information after the history image data matching the current image data has been determined.
Optionally, the method further includes:
determining whether the current pose information contains an error; and
correcting the current pose information using the history pose information when it is determined that the current pose information contains an error.
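The correction step above admits a very simple sketch: when the current pose is judged unreliable, fall back to the history pose associated with the matched history image. The trigger conditions and the (x, y, heading) pose representation are illustrative assumptions, not the disclosure's actual error check.

```python
def correct_pose(current_pose, history_pose, sensor_blocked, wheel_slipping):
    """Return the pose to trust. If the current pose from the ranging
    unit is judged unreliable (sensor blocked or wheels slipping),
    fall back to the pose recorded with the matched history image.

    Poses are (x, y, heading) tuples; this logic is a simplified
    stand-in for the disclosure's error determination.
    """
    if sensor_blocked or wheel_slipping:
        return history_pose
    return current_pose
```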
Optionally, determining whether the current pose information contains an error includes: determining that the current pose information contains an error when the distance measuring unit is currently blocked or the robot is slipping.
Optionally, determining whether the current pose information contains an error includes: determining that the current pose information contains an error when it does not match any of the history pose information.
Optionally, the method further includes:
searching, during movement in which the robot performs a specific operation, for history image data matching the image data collected by the image acquisition unit, and counting the corresponding number of matches; and
when the ratio of the number of matches of any history image data to the number of times the specific operation has been performed is below a preset threshold, deleting that history image data and the pose information of the robot corresponding to the time when that history image data was collected.
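The pruning rule above can be sketched as follows; the data layout (dicts keyed by an image identifier) and the ratio threshold are illustrative assumptions.

```python
def prune_history(history, match_counts, run_count, ratio_threshold=0.2):
    """Drop history entries whose match rate over repeated runs of
    the specific operation falls below a preset threshold.

    history: dict mapping image_id -> pose recorded at collection.
    match_counts: dict mapping image_id -> number of runs in which
    that image was matched. run_count: how many times the specific
    operation has been performed. Returns the retained entries.
    """
    return {
        image_id: pose
        for image_id, pose in history.items()
        if match_counts.get(image_id, 0) / run_count >= ratio_threshold
    }
```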
Optionally, all history image data and history pose information are stored in a preset database, and the method further includes:
determining, upon receiving an update instruction for the preset database, a missing region according to the map constructed by the robot during movement; and
collecting image data and corresponding pose information in the missing region via the image acquisition unit, so as to update the preset database.
Optionally, the robot is configured with corresponding cleaning strategies for different scene types, and the method further includes: while the robot performs a cleaning operation, adopting the corresponding cleaning strategy according to a scene recognition result for the image data collected by the image acquisition unit.
According to a second aspect of embodiments of the present disclosure, a localization device for a robot is provided, the robot being configured with an image acquisition unit and a distance measuring unit. The device includes:
an image data determination unit, which determines, according to current image data collected by the image acquisition unit, history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical moment;
a pose obtaining unit, which obtains history pose information of the robot corresponding to the time when the history image data was collected;
a pose determination unit, which determines current pose information of the robot according to ranging data currently collected by the distance measuring unit; and
a localization unit, which localizes the current position of the robot according to the history pose information and the current pose information.
Optionally, the device further includes: a kidnapping event determination unit, which determines whether a kidnapping event has occurred to the robot.
The localization unit includes: a first localization subunit, which, when it is determined that a kidnapping event has occurred to the robot, localizes the position of the robot after the kidnapping event according to the history pose information and the current pose information.
Optionally, the kidnapping event determination unit includes: a kidnapping event determination subunit, which determines that a kidnapping event has occurred when the image data collected by the image acquisition unit and/or the ranging data collected by the distance measuring unit change abruptly.
Optionally, the localization unit includes:
a first determination subunit, which determines, in the history pose information, target history pose information matching the current pose information; and
a second localization subunit, which localizes the current position of the robot in the map constructed from the distance measuring unit according to the target history pose information.
Optionally, the localization unit includes:
an obtaining subunit, which obtains a three-dimensional environment map constructed from the history image data and the history pose information;
a second determination subunit, which determines, based on the three-dimensional environment map, pose information corresponding to the current image data; and
a third localization subunit, which localizes the current position of the robot in the map constructed from the distance measuring unit according to the determined pose information.
Optionally, the three-dimensional environment map is constructed in advance from the collected history image data and the history pose information; alternatively, the three-dimensional environment map is constructed from the history image data and the history pose information after the history image data matching the current image data has been determined.
Optionally, the device further includes:
a judging unit, which determines whether the current pose information contains an error; and
a correction unit, which corrects the current pose information using the history pose information when it is determined that the current pose information contains an error.
Optionally, the judging unit includes: a first determination subunit, which determines that the current pose information contains an error when the distance measuring unit is currently blocked or the robot is slipping.
Optionally, the judging unit includes: a second determination subunit, which determines that the current pose information contains an error when it does not match any of the history pose information.
Optionally, the device further includes:
a statistics unit, which searches, during movement in which the robot performs a specific operation, for history image data matching the image data collected by the image acquisition unit, and counts the corresponding number of matches; and
a deletion unit, which, when the ratio of the number of matches of any history image data to the number of times the specific operation has been performed is below a preset threshold, deletes that history image data and the pose information of the robot corresponding to the time when that history image data was collected.
Optionally, all history image data and history pose information are stored in a preset database, and the device further includes:
a missing region determination unit, which determines, upon receiving an update instruction for the preset database, a missing region according to the map constructed by the robot during movement; and
an updating unit, which collects image data and corresponding pose information in the missing region via the image acquisition unit, so as to update the preset database.
Optionally, the robot is configured with corresponding cleaning strategies for different scene types, and the device further includes: a strategy adjustment unit, which, while the robot performs a cleaning operation, adopts the corresponding cleaning strategy according to a scene recognition result for the image data collected by the image acquisition unit.
Optionally, the image data determination unit includes: a first image data determination subunit, which determines that the history image data matches the current image data when the similarity between the two exceeds a preset threshold.
Optionally, the image data determination unit includes: a second image data determination subunit, which determines that the history image data matches the current image data when the two contain one or more of the same collected objects.
According to a third aspect of embodiments of the present disclosure, a robot is provided, configured with an image acquisition unit and a distance measuring unit. The robot further includes:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor implements the method described in any of the above embodiments by executing the instructions.
According to a fourth aspect of embodiments of the present disclosure, a computer-readable storage medium is provided, having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of any of the methods in the above embodiments.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects.
As can be seen from the above embodiments, the present disclosure configures an image acquisition unit on a robot already equipped with a distance measuring unit, so that the robot can collect image data while moving and establish a mapping between the image data and the pose information at the time of collection. During subsequent movement, the history pose information corresponding to the history image data that matches the current image data can then serve as a reference and, together with the current pose information determined by the distance measuring unit, as a basis for localizing the robot's current position, thereby improving the accuracy of the robot's self-localization and further improving its operating efficiency.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Specific embodiment
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of devices and methods consistent with some aspects of the present application as detailed in the appended claims.
The terms used in the present application are for the purpose of describing particular embodiments only and are not intended to limit the present application. The singular forms "a", "an", "said", and "the" used in the present application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the present application to describe various information, such information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present application, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
In the related art, taking a sweeping robot as an example: an LDS (Laser Distance Sensor) is typically configured on the sweeping robot, and localization during cleaning is performed using a SLAM algorithm. However, when localizing via the LDS alone, on the one hand the localization accuracy is low; on the other hand, when the sweeping robot is kidnapped, the LDS may fail to detect the kidnapping event. This is illustrated below with reference to Fig. 1.
Referring to Fig. 1, Fig. 1 is a schematic diagram of a kidnapping event occurring to a sweeping robot according to an exemplary embodiment. As shown in Fig. 1, when the sweeping robot 100 performs a cleaning operation in a rectangular environment 10, if the sweeping robot 100 is moved from position A to position B (i.e., a kidnapping event has occurred to the sweeping robot 100), the sweeping robot 100 cannot determine from the LDS that it has been kidnapped, because positions A and B yield similar distance data. In fact, since the sweeping robot 100 cannot detect the kidnapping event via the LDS, it instead concludes that it is still at position A (while actually being at position B, i.e., a relocation error) and continues to perform the cleaning operation according to the cleaning strategy for position A (for example, keeping the cleaning route unchanged), which reduces cleaning efficiency.
Therefore, the present disclosure improves the localization scheme of robots with autonomous movement capability, so as to solve the above technical problems in the related art, as described in detail in the following embodiments.
The robot 100 provided by the present disclosure may be (but is not limited to) an automatic cleaning device such as a sweeping robot, a mopping robot, or a combined sweeping and mopping robot. The robot 100 may include a machine body, a perception system, a control system, a drive system, a cleaning system, an energy system, and a human-machine interaction system. Specifically:
The machine body includes a forward portion and a rearward portion and has an approximately circular shape (circular both front and rear), but may also have other shapes, including but not limited to an approximate D-shape with a squared front and a rounded rear.
The perception system includes a position determining device located above the machine body, and sensing devices such as a buffer located at the forward portion of the machine body, cliff sensors, ultrasonic sensors, infrared sensors, a magnetometer, an accelerometer, a gyroscope, and an odometer, which provide the control system with various position and motion state information of the machine. The position determining device includes, but is not limited to, a camera and a laser distance measuring device (LDS). The following takes a laser distance measuring device based on triangulation as an example to illustrate how position determination is performed. The basic principle of triangulation is the proportional relationship of similar triangles, which will not be repeated here.
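As a sketch of the similar-triangles relationship just mentioned: with a baseline b between emitter and sensor, lens focal length f, and measured spot offset x on the image sensor, similar triangles give distance d = b·f/x. The parameter names and values below are illustrative, not taken from the disclosure.

```python
def triangulation_distance(baseline, focal_length, pixel_offset):
    """Distance to the target by laser triangulation.

    Similar triangles give baseline / distance = pixel_offset / focal_length,
    hence distance = baseline * focal_length / pixel_offset.
    All lengths must be in the same unit (e.g. millimetres).
    """
    return baseline * focal_length / pixel_offset
```

Note that as the target gets farther away, pixel_offset shrinks toward the sensor's pixel pitch, which is why long-range accuracy degrades (as discussed further below in the source text).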
The laser distance measuring device includes a light emitting unit and a light receiving unit. The light emitting unit may include a light source that emits light, and the light source may include a light emitting element, such as a light emitting diode (LED) that emits infrared or visible light. Preferably, the light source may be a light emitting element that emits a laser beam. In this embodiment, a laser diode (LD) is taken as an example of the light source. Specifically, owing to the monochromatic, directional, and collimated properties of a laser beam, a laser light source can make the measurement more accurate than other light sources. For example, compared with a laser beam, the infrared or visible light emitted by an LED is affected by the surrounding environment (such as the color or texture of objects), which may reduce measurement accuracy. The laser diode (LD) may be a point laser, which measures two-dimensional position information of an obstacle, or a line laser, which measures three-dimensional position information of an obstacle within a certain range.
The light receiving unit may include an image sensor, on which light spots reflected or scattered by obstacles are formed. The image sensor may be a set of unit pixels in a single row or multiple rows. These light receiving elements can convert optical signals into electrical signals. The image sensor may be a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor; a CMOS sensor is preferred due to its cost advantage. Moreover, the light receiving unit may include a light receiving lens assembly. Light reflected or scattered by obstacles may travel through the lens assembly to form an image on the image sensor. The lens assembly may include a single lens or multiple lenses.
A base may support the light emitting unit and the light receiving unit, which are arranged on the base and separated from each other by a specific distance. To measure the obstacle situation around the robot over 360 degrees, the base may be rotatably arranged on the main body; alternatively, the base itself may remain stationary while a rotating element rotates the emitted and received light. The rotational angular velocity of the rotating element can be obtained by arranging an optocoupler and a code disc: the optocoupler senses the tooth gaps on the code disc, and the instantaneous angular velocity is obtained by dividing the angular distance between tooth gaps by the time taken to pass from one gap to the next. The denser the tooth gaps on the code disc, the higher the accuracy and precision of the measurement, but the more precise the structure must be and the higher the computational load; conversely, the sparser the tooth gaps, the lower the accuracy and precision, but the simpler the structure and the smaller the computational load, which can reduce cost.
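The optocoupler/code-disc computation described above amounts to dividing the angle between adjacent tooth gaps by the time measured between them; a one-line sketch (gap count and timing values illustrative):

```python
import math

def instantaneous_angular_velocity(gap_count, gap_interval_s):
    """Instantaneous angular velocity of the code disc, in rad/s.

    With gap_count tooth gaps evenly spaced around the disc, the
    angle between adjacent gaps is 2*pi/gap_count; dividing by the
    time the optocoupler measured between two consecutive gaps gives
    the instantaneous angular velocity.
    """
    return (2 * math.pi / gap_count) / gap_interval_s
```

The accuracy/cost trade-off in the text corresponds directly to gap_count: more gaps give finer time resolution per revolution at the cost of a more precise disc and more interrupts to process.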
A data processing device connected to the light receiving unit, such as a DSP, records the obstacle distance values at all angles relative to the robot's 0-degree direction and sends them to a data processing unit in the control system, such as an application processor (AP) including a CPU. The CPU runs a particle-filter-based localization algorithm to obtain the robot's current position and builds a map from these positions for navigation. Simultaneous localization and mapping (SLAM) is preferably used as the localization algorithm.
Although a laser distance measuring device based on triangulation can in principle measure distance values at arbitrarily large distances beyond a certain range, long-range measurement in practice, e.g., beyond 6 meters, is very difficult, mainly due to the size limitation of the pixel units on the sensor of the light receiving unit, and also due to the photoelectric conversion rate of the sensor, the data transmission rate between the sensor and the DSP, and the computation speed of the DSP. Temperature also affects the measured values of the laser distance measuring device in ways the system cannot tolerate, mainly because thermal expansion of the structure between the light emitting unit and the light receiving unit changes the angle between incident and emergent light, and because the units themselves are subject to temperature drift. After the laser distance measuring device has been used for a long time, deformation accumulated from factors such as temperature changes and vibration also severely affects the measurement results. The accuracy of the measurement results directly determines the accuracy of map construction and is the basis for the robot's further strategy implementation, and is thus particularly important.
The forward portion of the machine body may carry a buffer. While the driving wheel modules propel the robot across the floor during cleaning, the buffer detects one or more events (or objects) in the travel path of the robot 100, such as obstacles or walls, via a sensing system such as infrared sensors, and the robot can control the driving wheel modules in response to the detected event (or object), for example by moving away from the obstacle.
The control system is arranged on a circuit board in the machine body, and includes a non-transitory memory, such as a hard disk, flash memory, or random-access memory, and a communicating computation processor, such as a central processing unit or application processor. The application processor uses a localization algorithm, such as SLAM, to draw an instant map of the environment in which the robot is located according to the obstacle information fed back by the laser distance measuring device. Combined with the distance and velocity information fed back by the buffer, cliff sensors, ultrasonic sensors, infrared sensors, magnetometer, accelerometer, gyroscope, odometer, and other sensing devices, it comprehensively judges the current working state of the sweeper, such as crossing a threshold, getting onto a carpet, being located at a cliff, being stuck from above or below, having a full dust box, or being picked up, and can also provide specific next-step action strategies for different situations, so that the operation of the robot better meets the owner's requirements and provides a better user experience. Furthermore, the control system can plan the most efficient and reasonable cleaning path and cleaning method based on the instant map drawn by SLAM, greatly improving the cleaning efficiency of the robot.
The drive system can maneuver the robot 100 across the floor based on drive commands having distance and angle information, such as x, y, and θ components. The drive system includes a driving wheel module, which can control the left wheel and the right wheel simultaneously; to control the machine's motion more precisely, the driving wheel module preferably includes a left driving wheel module and a right driving wheel module. The left and right driving wheel modules are opposed along a lateral axis defined by the main body. To enable the robot to move more stably on the floor or to have stronger mobility, the robot may include one or more driven wheels, including but not limited to universal wheels. The driving wheel module includes a travel wheel, a drive motor, and a control circuit for the drive motor, and may also be connected to a circuit for measuring the driving current and to an odometer. The driving wheel module can be detachably connected to the main body for easy disassembly and maintenance. The driving wheel may have a biased drop suspension system, movably fastened, e.g., rotatably attached, to the robot main body, and receiving a spring bias that biases it downward and away from the robot main body. The spring bias allows the driving wheel to maintain contact and traction with the floor with a certain landing force, while the cleaning elements of the robot 100 also contact the floor 10 with a certain pressure.
The cleaning system may be a dry cleaning system and/or a wet cleaning system. As a dry cleaning system, the main cleaning function is provided by the sweeping system formed by a roller brush structure, a dust box structure, a fan structure, an air outlet, and the connecting components between the four. The roller brush structure, which has a certain interference with the floor, sweeps up debris on the floor and carries it to the front of a suction inlet between the roller brush structure and the dust box structure, from where the debris is drawn into the dust box structure by the suction airflow that is generated by the fan structure and passes through the dust box structure. The dust pickup capability of the sweeper can be characterized by the dust pickup efficiency (DPU) of debris, which is influenced by the roller brush structure and its materials, by the wind utilization rate of the air duct formed by the suction inlet, the dust box structure, the fan structure, the air outlet, and the connecting components between the four, and by the type and power of the fan, making it a complex system design problem. Compared with an ordinary plug-in vacuum cleaner, improving dust pickup capability is more meaningful for a cleaning robot with limited energy, because it directly and effectively reduces the energy requirement: a machine that could originally clean 80 square meters on one charge can evolve to clean 180 square meters or more on one charge. The service life of the battery also increases greatly as the number of charges decreases, so that the frequency with which the user must replace the battery also decreases. More intuitively and importantly, improved dust pickup capability is the most obvious and important aspect of user experience: the user can immediately conclude whether the sweeping/wiping is clean. The dry cleaning system may also include a side brush having a rotating shaft that is angled relative to the floor, for moving debris into the roller brush region of the cleaning system.
The energy system includes a rechargeable battery, such as a nickel-metal hydride battery or a lithium battery. The rechargeable battery may be connected to a charging control circuit, a battery pack charging temperature detection circuit, and a battery under-voltage monitoring circuit, which are in turn connected to a single-chip microcontroller control circuit. The host charges by connecting to a charging pile via charging electrodes arranged on the side or underside of the body.
The human-machine interaction system includes buttons on the host panel for the user to select functions; it may also include a display screen and/or indicator lights and/or a speaker, which present the current machine state or function options to the user; and it may further include a mobile phone client program. For a path-navigation-type cleaning device, the mobile phone client can show the user a map of the environment in which the device is located, as well as the device's current position, providing richer and more user-friendly function items.
The robot provided by the present disclosure is configured with an image acquisition unit and a distance measuring unit; the image acquisition unit collects image data, and the distance measuring unit collects ranging data. The image acquisition unit and the distance measuring unit may be included in the position determining device of the above perception system. For example, the image acquisition unit may be a camera, and the distance measuring unit may be a laser distance measuring device. As another example, the image acquisition unit and the distance measuring unit may be integrated into a single camera; for instance, a depth camera with a TOF (Time of Flight) function, or a camera using 3D structured light technology, may be used. Of course, the present disclosure does not limit the specific hardware forms of the image acquisition unit and the distance measuring unit.
Based on the above robot structure, the present disclosure provides a localization method for a robot. As shown in Fig. 2, the method may include the following steps.
In step 202, history image data matching the current image data is determined according to the current image data collected by the image acquisition unit, the history image data having been collected by the image acquisition unit at a historical moment.
In this embodiment, image data collected at a "historical moment" can be understood as image data collected during a previous operation of the robot (i.e., temporally before the present, while the robot was moving). Taking an automatic cleaning device as an example (of course, the method is not limited to automatic cleaning devices, and may apply to any other robot with autonomous movement capability), the image data collected by the automatic cleaning device during its first cleaning may serve as the history image data; alternatively, the image data collected by the automatic cleaning device before the current cleaning process (i.e., during historical cleaning processes) may serve as the history image data. It should be noted that the localization scheme of the present disclosure applies to the automatic cleaning device cleaning the same environment.
In this embodiment, in one case "matching" can be understood as the similarity (or degree of match) between the current image data and the history image data exceeding a certain threshold. In another case, "matching" can be understood as the current image data and the history image data containing one or more of the same photographed objects.
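The similarity-threshold form of "matching" described above can be sketched as follows. This is an illustrative sketch only: the normalized-correlation measure, the function names, and the threshold value are assumptions, not taken from the disclosure.

```python
import numpy as np

def frame_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two equal-size grayscale frames."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def find_matching_history(current, history_frames, threshold=0.8):
    """Indices of history frames whose similarity to the current frame exceeds the threshold."""
    return [i for i, h in enumerate(history_frames)
            if frame_similarity(current, h) > threshold]
```

In practice any of the matching algorithms named later in the disclosure (MAD, sum of absolute differences, normalized cross-correlation, or a learned model) could take the place of `frame_similarity`.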
In step 204, the history pose information of the robot, recorded when the history image data were collected, is obtained.
In this embodiment, while the robot collects image data during its movement, it simultaneously records its own pose information at that time and establishes a mapping relationship between the image data and the pose information. The pose information may include any parameters describing the relative position between the robot and the photographed object (i.e., the object the image acquisition unit shoots in order to obtain the image data), such as the distance between them, the angle, and the attitude of the robot. Taking an automatic cleaning device configured with an LDS as the distance measuring unit and a camera as the image acquisition unit as an example, the camera and the LDS work simultaneously to collect their respective data. For example, during a cleaning run at the "historical moment", while collecting image data through the camera, the device uses the ranging data collected by the LDS to build, via a SLAM algorithm, a map of the environment being cleaned and to determine its own current position information in that map; meanwhile, it collects its own current attitude information through other sensing devices (for example, a gyroscope, an accelerometer, an electronic compass, and the like), and then determines the pose information from the position information and the attitude information.
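The mapping relationship between a collected frame and the pose recorded at collection time can be sketched as a simple keyed store. This is an illustrative Python sketch; the `Pose` fields and the dictionary layout are assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    x: float        # position in the SLAM map (assumed metres)
    y: float
    heading: float  # attitude from gyroscope/compass (assumed radians)

@dataclass
class HistoryStore:
    # frame_id -> (image data, pose recorded when that frame was captured)
    records: dict = field(default_factory=dict)

    def add(self, frame_id, frame, pose: Pose):
        self.records[frame_id] = (frame, pose)

    def pose_for(self, frame_id) -> Pose:
        """Look up the history pose recorded when the frame was captured (step 204)."""
        return self.records[frame_id][1]
```

A real implementation would persist this store in the preset database mentioned later in the disclosure rather than in memory.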
It should be noted that, from the above description of "matching", there may be multiple image data matching a given image data, the multiple image data corresponding to different angles, positions, attitudes, and so on. For example, during cleaning, the automatic cleaning device may photograph the same tea table at different angles, distances, and attitudes.
In step 206, the current pose information of the robot is determined according to ranging data currently collected by the distance measuring unit.
In this embodiment, the current pose information of the robot can be determined from the ranging data and the robot's current attitude information. Taking an automatic cleaning device configured with an LDS as an example, the device uses the ranging data collected by the LDS to build, via a SLAM algorithm, a map of the environment being cleaned and to determine its own current position information in that map. Meanwhile, it collects its own current attitude information through other sensing devices (for example, a gyroscope, an accelerometer, an electronic compass, and the like), and then determines the current pose information from the position information and the attitude information.
In step 208, the current position of the robot is located according to the history pose information and the current pose information.
In this embodiment, because the robot collects image data through the image acquisition unit while moving and establishes mapping relationships with the pose information recorded at collection time, in subsequent movement the history pose information corresponding to the history image data matching the current image data can serve as a reference and, together with the determined current pose information, as the basis for locating the robot's current position, thereby improving localization accuracy and further improving the robot's working efficiency.
As stated for step 202, there may be multiple history image data matching the current image data; correspondingly, the history pose information obtained in step 204 may also be multiple. The ways of performing localization using the multiple obtained history pose information and the current pose information (determined in step 206) may include the following two:
In one embodiment, target history pose information matching the current pose information may first be determined among the history pose information, and the robot's current position may then be located, according to the target history pose information, in the map constructed from the distance measuring unit's data.
In another embodiment, a three-dimensional environment map may first be obtained, the three-dimensional environment map being constructed from the history image data and the history pose information; pose information corresponding to the current image data is then determined based on the three-dimensional environment map, and the robot's current position in the map constructed from the distance measuring unit's data (built from all the ranging data collected by the distance measuring unit) is located according to the determined pose information. The three-dimensional map may be generated in real time during localization, or generated in advance. In other words, in one case the three-dimensional environment map is constructed in advance from the collected history image data and history pose information; in the other case, the three-dimensional environment map is constructed from the history image data and history pose information only after the history image data matching the current image data have been determined.
In the localization scheme for the robot provided by the disclosure, when a kidnapping event occurs to the robot, the position of the robot after the kidnapping event can be accurately relocated. As an exemplary embodiment, on the basis of the embodiment shown in Fig. 2, the method may further include, before step 202: determining whether a kidnapping event occurs to the robot. Then, in the case where a kidnapping event occurs to the robot, the current image data in step 202 should be understood as the image data collected by the image acquisition unit after the kidnapping event; similarly, the ranging data currently collected by the distance measuring unit in step 206 should be understood as the ranging data collected by the distance measuring unit after the kidnapping event, so that the current pose information should be understood as the current pose information determined, after the kidnapping event, from the ranging data collected by the distance measuring unit. Accordingly, in the case where a kidnapping event occurs to the robot, the localization operation executed in step 208 may further include: when it is determined that a kidnapping event occurs to the robot, locating, according to the history pose information and the current pose information, the position of the robot after the kidnapping event. For how the localization is specifically performed, reference may be made to the related description of step 208, which is not repeated here.
As for the condition for determining that a kidnapping event occurs to the robot, the data collected by the image acquisition unit and the distance measuring unit configured on the robot may be referred to. For example, when a kidnapping event occurs, the image data collected by the image acquisition unit will change abruptly, and the ranging data collected by the distance measuring unit will also change abruptly. Therefore, as an exemplary embodiment, when the image data collected by the image acquisition unit and/or the ranging data collected by the distance measuring unit change abruptly, it may be determined that a kidnapping event occurs to the robot. It can be seen that adding the variation of the image data collected by the image acquisition unit to the basis for determining whether a kidnapping event occurs enables accurate detection of kidnapping events, which facilitates subsequent relocation.
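The abrupt-change rule described above can be sketched as a simple check on consecutive readings. This is a hedged sketch under stated assumptions: the mean-absolute-difference measure, the threshold values, and the "image and/or ranging" combination rule are illustrative, not taken from the disclosure.

```python
import numpy as np

def data_mutated(prev: np.ndarray, curr: np.ndarray, threshold: float) -> bool:
    """Treat a large mean absolute difference between consecutive readings as an abrupt change."""
    return float(np.abs(curr - prev).mean()) > threshold

def kidnapping_detected(prev_img, curr_img, prev_scan, curr_scan,
                        img_thresh=30.0, scan_thresh=0.5) -> bool:
    # An abrupt change in the camera frames and/or the LDS scan is
    # treated as a kidnapping event.
    return (data_mutated(prev_img, curr_img, img_thresh) or
            data_mutated(prev_scan, curr_scan, scan_thresh))
```

The disclosure explicitly leaves the change-detection algorithm open, so any more robust detector could be substituted for `data_mutated`.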
In the localization scheme for the robot provided by the disclosure, the current pose information (i.e., the current pose information in step 206) may be verified and, when found to be erroneous, corrected. As an exemplary embodiment, on the basis of the embodiment shown in Fig. 2, it may be determined whether the current pose information is erroneous, and when the current pose information is determined to be erroneous, it is corrected using the history pose information, which helps improve the accuracy of the map constructed from the distance measuring unit's data. In one embodiment, when the distance measuring unit is currently blocked or the robot slips, the current pose information may be determined to be erroneous; in another embodiment, when the current pose information does not match any of the history pose information, the current pose information may be determined to be erroneous.
In the embodiment shown in Fig. 2, localization accuracy is improved by using the history pose information corresponding to the history image data matching the current image data as a reference. It can be seen that whether the history image data and history pose information can reflect the actual position of the robot is of great importance. Therefore, the history image data and history pose information may be maintained in the following ways.
In one embodiment, during the robot's movement while executing a specific operation (taking an automatic cleaning device as an example, the movement is the device's cleaning run), history image data matching the image data collected by the image acquisition unit are searched for, and the corresponding match counts are recorded. Based on these statistics, when the ratio of the match count of any history image data to the number of times the specific operation has been executed (for an automatic cleaning device, the number of cleaning runs) is less than a preset threshold, that history image data and the pose information recorded when it was collected may be deleted.
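The pruning rule above can be sketched as follows. The input shapes (a match-count dictionary and a run counter) and the threshold value are illustrative assumptions.

```python
def prune_history(match_counts: dict, run_count: int, ratio_threshold: float) -> dict:
    """Keep only history frames whose match-count / run-count ratio meets the threshold;
    frames below it (and, by extension, their stored poses) would be deleted."""
    if run_count == 0:
        return dict(match_counts)  # nothing to judge before the first completed run
    return {frame_id: n for frame_id, n in match_counts.items()
            if n / run_count >= ratio_threshold}
```

This keeps the preset database from accumulating stale frames, e.g. of furniture that has since been moved and therefore rarely matches anymore.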
In another embodiment, all history image data and history pose information may be stored in a preset database. When an update instruction for the preset database is received, a depletion region may be determined from the map constructed by the robot during movement (for example, built from the distance measuring unit's data), and image data and corresponding pose information may be collected in that region through the image acquisition unit to update the preset database. The update instruction may be issued to the robot by a user through a mobile terminal (which establishes a communication connection with the robot); alternatively, it may be generated by the robot according to a preset update cycle.
In this embodiment, based on the images of the cleaned environment collected during the robot's movement, the cleaning strategy of the robot (which may, for example, be a sweeping robot) can be adjusted according to the scene type of the environment, so as to improve cleaning efficiency and the user experience. For example, the robot may be configured with a corresponding cleaning strategy for each scene type; then, while executing a cleaning operation, the robot can adopt the corresponding cleaning strategy according to the scene recognition result for the image data collected by the image acquisition unit.
For ease of understanding, the localization scheme of the disclosure is described in detail below, with reference to the accompanying drawings and concrete scenarios, taking an automatic cleaning device as the robot (the scheme is, of course, not limited to automatic cleaning devices and applies to any other robot with an autonomous movement function).
Referring to Fig. 3, Fig. 3 is a flowchart of a localization method for another robot according to an exemplary embodiment. As shown in Fig. 3, the method is applied to an automatic cleaning device (configured with a camera and an LDS, i.e., the image acquisition unit is the camera and the distance measuring unit is the LDS) and may include the following steps:
In step 302, current image data are collected.
In step 304, history image data matching the current image data are determined.
In this embodiment, the automatic cleaning device may take the image data collected during its first cleaning run as the history image data; alternatively, it may take the image data collected before the current cleaning run (i.e., during a history cleaning run of the same environment) as the history image data. Moreover, a mapping relationship between the history image data and the device's own pose information at the time of collection is established and stored in a preset database. The pose information may include any parameters describing the relative position between the automatic cleaning device and the photographed object (i.e., the object the camera shoots in order to obtain the image data), such as the distance between them, the angle, and the attitude of the robot. For example, the camera and the LDS both work during cleaning, and the pose information includes position information and attitude information. Then, while collecting image data through the camera, the automatic cleaning device uses the ranging data collected by the LDS to build, via a SLAM algorithm, a map of the environment being cleaned and to determine its own current position information in that map; meanwhile, it collects its own attitude information through other sensing devices (for example, a gyroscope, an accelerometer, an electronic compass, and the like), and then determines the pose information from the position information and the attitude information.
It should be noted that, when determining the history image data matching the current image data, in one case "matching" can be understood as the similarity (or degree of match) between the current image data and the history image data exceeding a certain threshold. For example, an image matching algorithm (the MAD algorithm, the sum of absolute differences, the normalized cross-correlation algorithm, and the like) or a machine learning model may be used to select, among all the history image data in the preset database, the image data whose similarity with the current image data exceeds a preset threshold as the "history image data matching the current image data" of step 304. In another case, "matching" can be understood as the current image data and the history image data containing one or more of the same photographed objects. For example, the objects contained in the current image data (for example, a cup, a tea table, a television, and the like) may be recognized (for instance, by a neural-network-based image recognition method, a wavelet-moment-based image recognition method, or the like), and the image data in the preset database that also contain the recognized objects (there may be multiple such objects; the rule may require containing all of them, or only some of them) are taken as the "history image data matching the current image data" of step 304. In yet another case, both of the above conditions may be used together as the basis for determining a "match". For example, the two image data may be determined to "match" when the similarity between them exceeds a certain threshold and/or they contain one or more of the same photographed objects.
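The shared-object form of "matching" can be sketched as a set comparison over recognized object labels. This is an illustrative sketch: the recognizer itself (e.g. a neural-network classifier) is assumed and not shown, and the `require_all` switch merely illustrates the "all objects vs. some objects" choice mentioned above.

```python
def frames_match(objects_current: set, objects_history: set,
                 require_all: bool = False) -> bool:
    """Decide whether two frames 'match' by the object labels recognized in them."""
    if require_all:
        # History frame must contain every object seen in the current frame.
        return objects_current <= objects_history
    # Otherwise at least one shared photographed object suffices.
    return bool(objects_current & objects_history)
```

The combined rule (similarity threshold and/or shared objects) would simply OR or AND this predicate with the similarity check.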
In step 306, the history pose information of the automatic cleaning device, recorded when the history image data were collected, is obtained.
In this embodiment, based on the established mapping relationships, the history pose information corresponding to the history image data determined in step 304 can be looked up in the preset database.
In step 308, the current pose information of the automatic cleaning device is determined according to the ranging data currently collected by the LDS.
In this embodiment, the automatic cleaning device can use the ranging data collected by the LDS to build, via a SLAM algorithm, a map of the environment being cleaned and to determine its own current position information in that map. Meanwhile, it collects its own current attitude information through other sensing devices (for example, a gyroscope, an accelerometer, an electronic compass, and the like), and then determines the current pose information from the position information and the attitude information.
In step 310, target history pose information matching the current pose information is determined among the history pose information.
In this embodiment, there may be multiple history image data matching the current image data; correspondingly, the history pose information obtained in step 306 may also be multiple. For example, the multiple history image data obtained in step 304 all contain the same tea table, with the automatic cleaning device at a different shooting distance, shooting angle, and attitude for each of them.
When determining the target history pose information matching the current pose information, the history pose information obtained in step 306 may be compared one by one with the current pose information, and the history pose information close to, or even identical to, the current pose information is taken as the target history pose information. For example, among the history pose information obtained in step 306, the entries whose distance differs from the distance in the current pose information by less than a preset distance threshold, and whose angle differs from the angle in the current pose information by less than a preset angle threshold, may be taken as the target history pose information.
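The threshold-based selection of target history poses can be sketched as a filter. This is an illustrative sketch: representing a pose as a `(distance, angle)` tuple and the particular threshold values are assumptions, not taken from the disclosure.

```python
import math

def select_target_poses(history_poses, current_pose,
                        dist_thresh=0.3, angle_thresh=math.radians(15)):
    """Keep history poses whose distance and angle differ from the current
    pose by less than the preset thresholds."""
    cur_d, cur_a = current_pose
    return [p for p in history_poses
            if abs(p[0] - cur_d) < dist_thresh
            and abs(p[1] - cur_a) < angle_thresh]
```

A full pose would also include attitude; the same element-wise comparison extends naturally to additional components.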
In step 312, the current position of the automatic cleaning device in the map constructed from the distance measuring unit's data is located according to the target history pose information.
In this embodiment, the automatic cleaning device can construct the map of the environment being cleaned using a SLAM algorithm, and then locate its current position in the map according to the target history pose information. The automatic cleaning device can then determine the route for executing the cleaning operation according to the map and the current localization.
Localization accuracy is thus improved by using the history pose information corresponding to the history image data matching the current image data as a reference. It can be seen that whether the history image data and history pose information can reflect the actual position of the automatic cleaning device is of great importance. Therefore, the history image data and history pose information in the preset database may be maintained in the following ways.
In one embodiment, during each cleaning run of the automatic cleaning device, history image data matching the image data collected by the camera may be searched for, and the corresponding match counts recorded. Based on these statistics, when the ratio of the match count of any history image data to the number of cleaning runs (the process from starting to clean the environment to finishing it may be understood as one cleaning run, i.e., one increment of the run count) is less than a preset threshold, that history image data and the pose information of the automatic cleaning device recorded when it was collected may be deleted.
In another embodiment, when an update instruction for the preset database is received, a depletion region may be determined from the map constructed by the automatic cleaning device during cleaning (for example, built from the LDS data), and the automatic cleaning device may then collect image data and corresponding pose information in that region through the camera to update the preset database. For example, for the same teacup, the automatic cleaning device rotates to photograph the teacup at different angles and attitudes, recording the corresponding pose information at the same time. The update instruction may be issued to the automatic cleaning device by a user through a mobile terminal or server (which establishes a communication connection with the device); alternatively, it may be generated by the device according to a preset update cycle.
Referring to Fig. 4, Fig. 4 is a flowchart of a localization method for yet another robot according to an exemplary embodiment. As shown in Fig. 4, the method is applied to an automatic cleaning device (configured with a camera and an LDS, i.e., the image acquisition unit is the camera and the distance measuring unit is the LDS) and may include the following steps:
In step 402, current image data are collected.
In step 404, history image data matching the current image data are determined.
In step 406, the history pose information of the automatic cleaning device, recorded when the history image data were collected, is obtained.
In this embodiment, the details of steps 402-406 are similar to steps 302-306 above and are not repeated here.
In step 408, a three-dimensional environment map is obtained.
In this embodiment, the three-dimensional environment map is constructed from the history image data obtained in step 404 and the history pose information obtained in step 406. In one case, the three-dimensional environment map may be constructed at the time the history image data and history pose information are collected; in other words, the three-dimensional map is built as soon as they are collected, and is simply read when step 408 is executed. In the other case, the three-dimensional environment map is constructed from the history image data and history pose information after the history image data matching the current image data have been determined; in other words, the three-dimensional map is built from the history image data and the history pose information only after they are obtained in step 404 and step 406 respectively.
In step 410, pose information corresponding to the current image data is determined based on the three-dimensional environment map.
In this embodiment, the pose information corresponding to the current image data can be determined from the three-dimensional environment map by a PnP (Perspective-n-Point) algorithm. Of course, other algorithms for computing pose information may also be used, and the disclosure is not limited in this respect.
In step 412, the current position of the automatic cleaning device in the map constructed from the LDS data is located according to the determined pose information.
In this embodiment, the map can be built from all the ranging data collected by the LDS (including the ranging data currently collected by the LDS during this cleaning run). After the current position is determined, the automatic cleaning device can further determine the route for executing the cleaning operation according to the map and the current position.
From the embodiments shown in Figs. 3-4, it can be seen that, because the automatic cleaning device collects image data through the image acquisition unit during cleaning and establishes mapping relationships with the pose information at collection time, in subsequent cleaning runs the history pose information corresponding to the history image data matching the current image data can serve as a reference and, together with the determined current pose information, as the basis for locating the device's current position, thereby improving localization accuracy and further improving the device's cleaning efficiency.
Referring to Fig. 5, Fig. 5 is a flowchart of a relocation method for a robot according to an exemplary embodiment. As shown in Fig. 5, the method is applied to an automatic cleaning device (configured with a camera and an LDS, i.e., the image acquisition unit is the camera and the distance measuring unit is the LDS) and may include the following steps:
In step 502, currently collected data are obtained.
In this embodiment, the collected data include the current image data collected by the camera and the ranging data currently collected by the LDS.
In step 504, it is judged whether the obtained data change abruptly; if so, proceed to step 506; otherwise, return to step 502.
In this embodiment, when a kidnapping event occurs to the automatic cleaning device (a kidnapping event can be understood as the device moving other than along the normal route, or at the normal speed, of the cleaning operation, for example being forcibly moved away from its cleaning position by the user), the image data collected by the camera will change abruptly, and the ranging data collected by the LDS will also change abruptly. Therefore, whether a kidnapping event has occurred can be judged from whether the currently collected image data and/or ranging data change abruptly. It can be seen that adding the variation of the camera's image data to the basis for determining whether a kidnapping event occurs enables accurate detection of kidnapping events, which facilitates subsequent relocation. An abrupt change in image data can be understood as an abrupt change in the information contained in temporally adjacent image frames; for example, the change in the proportion of shared content between adjacent frames exceeds a certain threshold, or adjacent frames contain no common photographed object. Of course, the disclosure does not limit the algorithm used to detect abrupt changes in image data. Similarly, an abrupt change in ranging data can be understood as an abrupt change between temporally adjacent ranging data (a single ranging data set may contain ranging data for all angles around the automatic cleaning device); the disclosure likewise does not limit the algorithm used to detect abrupt changes in ranging data.
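The "proportion of shared content" criterion for image-data changes can be sketched as follows. This is an illustrative sketch: the per-pixel tolerance and the minimum-shared-proportion threshold are assumptions, and a real detector could equally compare recognized object sets instead of raw pixels.

```python
import numpy as np

def shared_content_ratio(prev: np.ndarray, curr: np.ndarray, tol: float = 5.0) -> float:
    """Proportion of pixels that stay (nearly) identical between adjacent frames."""
    return float((np.abs(prev - curr) <= tol).mean())

def image_mutated(prev: np.ndarray, curr: np.ndarray, min_shared: float = 0.5) -> bool:
    """Declare an abrupt change when too little content is shared between frames."""
    return shared_content_ratio(prev, curr) < min_shared
```

The analogous check for ranging data would compare temporally adjacent scans element-wise in the same way.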
For example, continuing the Fig. 1 example of the related art: even if positions A and B are similar in terms of ranging data (no abrupt change), as long as the surroundings of position A and position B differ (for example, the sweeping robot 100 can capture the television 101 with its camera at position A but cannot capture the television 101 at position B), the image data collected by the camera will change abruptly while the sweeping robot 100 is moved from position A to position B (i.e., the difference between the image data collected at position A and the image data collected at position B is large), so that it can be judged that a kidnapping event has occurred to the sweeping robot 100, rather than reaching the conclusion of the related art (that no kidnapping event occurred to the sweeping robot 100).
In step 506, history image data matching the current image data are determined.
In this embodiment, in the case where a kidnapping event occurs to the automatic cleaning device, the current image data collected in step 502 should be understood as the image data collected by the camera after the kidnapping event; similarly, the ranging data currently collected by the LDS in step 502 should be understood as the ranging data collected by the LDS after the kidnapping event.
In step 508, the history pose information of the automatic cleaning device, recorded when the history image data were collected, is obtained.
In step 510, the pose information of the automatic cleaning device after being kidnapped is obtained.
In step 512, target history pose information matching the post-kidnapping pose information is determined among the history pose information.
In this embodiment, a person skilled in the art will understand that, in the case where a kidnapping event occurs to the automatic cleaning device, the post-kidnapping pose information is the current pose information of the automatic cleaning device determined from the ranging data currently collected by the LDS in step 502.
In step 514, the position of the automatic cleaning device after being kidnapped is located according to the target history pose information.
In this embodiment, a person skilled in the art will understand that, in the case where a kidnapping event occurs to the automatic cleaning device, step 514 is similar to step 312 above, i.e., the current position is located in the map constructed from the distance measuring unit's data. After the post-kidnapping position is determined, the automatic cleaning device can re-plan the route for executing the cleaning operation according to the constructed map and the post-kidnapping position.
It should be noted that steps 506-514 follow the same principles as steps 304-312 above and are not repeated here.
Referring to Fig. 6, Fig. 6 is a flowchart of a relocation method for another robot according to an exemplary embodiment. As shown in Fig. 6, the method is applied to an automatic cleaning device (configured with a camera and an LDS, i.e., the image acquisition unit is the camera and the distance measuring unit is the LDS) and may include the following steps:
In step 602, current collected data are obtained.
In the present embodiment, collected data include that the collected current image date of camera and LDS are currently adopted
The ranging data collected.
In step 604, judge whether the data got mutate, if being mutated, be transferred to step 606;
Otherwise return step 602.
In the present embodiment, step 602-604 is similar with above-mentioned steps 502-504, and details are not described herein.
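As a non-limiting sketch of the mutation check in steps 602-604 (the function name, thresholds, and flat-list data representation below are assumptions for illustration, not taken from the disclosure), consecutive frames of image and ranging data could be compared against preset change thresholds:

```python
def data_mutated(prev_image, curr_image, prev_ranges, curr_ranges,
                 image_thresh=0.5, range_thresh=0.5):
    """Return True when either data stream changes abruptly between frames.

    A large normalized mean absolute difference between consecutive image
    frames (flat grayscale pixel lists, values 0-255), or between
    consecutive LDS range scans, is treated as a mutation that may
    indicate a kidnapping event.
    """
    image_change = sum(abs(a - b) for a, b in zip(prev_image, curr_image)) / (
        255.0 * len(prev_image))
    mean_prev = sum(prev_ranges) / len(prev_ranges)
    range_change = sum(abs(a - b) for a, b in zip(prev_ranges, curr_ranges)) / (
        len(prev_ranges) * (mean_prev + 1e-9))
    return image_change > image_thresh or range_change > range_thresh

# Identical frames: no mutation.  A sudden jump in brightness or range: mutation.
dark = [0] * 16
bright = [255] * 16
ranges = [1.0, 1.0, 1.0]
```

With this sketch, a mutation in either stream alone suffices to trigger step 606, matching the "and/or" condition stated for the kidnapping event determination.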
In step 606, the history image data that match the current image data are determined.
In the present embodiment, when a kidnapping event occurs for the automatic cleaning device, the current image data collected in step 602 should be understood as the image data collected by the camera after the kidnapping event; similarly, the ranging data currently collected by the LDS in step 602 should be understood as the ranging data collected by the LDS after the kidnapping event.
In step 608, the history pose information of the automatic cleaning device at the time the history image data were collected is obtained.
In step 610, the corresponding three-dimensional environment composition is obtained.
In step 612, the pose information corresponding to the current image data is determined based on the three-dimensional environment composition.
In step 614, the position of the automatic cleaning device after the kidnapping is located according to the determined pose information.
In the present embodiment, it will be apparent to a person skilled in the art that, when a kidnapping event occurs for the automatic cleaning device, the current location is the post-kidnapping position; that is, step 614 is similar to the above step 412. After the post-kidnapping position is determined, the automatic cleaning device may further re-plan the route for executing the cleaning operation according to the constructed map and the post-kidnapping position.
It should be noted that steps 606-614 follow the same principle as the above steps 404-412, and the details are not repeated here. It can be seen that, by configuring an image acquisition unit on the automatic cleaning device already equipped with a distance measuring unit, the disclosure enables the automatic cleaning device to collect image data during cleaning and to establish a mapping relation between those image data and the pose information at the time of collection. In subsequent cleaning, the history pose information corresponding to the history image data that match the current image data can serve as a reference and, together with the current pose information determined by the distance measuring unit, be used as the basis for locating the current position of the automatic cleaning device, thereby improving the accuracy of the self-localization of the automatic cleaning device and further improving its cleaning efficiency.
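The relocation flow summarized above might be sketched as follows; the database layout, function names, and match threshold are illustrative assumptions only, not details fixed by the disclosure:

```python
def relocate(current_image, current_lds_pose, history_db, match_fn,
             threshold=0.8):
    """Locate the device using a matched history pose as the reference.

    history_db is a list of (history_image, history_pose) pairs recorded
    during earlier cleaning runs.  The best-scoring history pose above the
    match threshold is returned as the relocation result.
    """
    best_pose, best_score = None, threshold
    for hist_image, hist_pose in history_db:
        score = match_fn(current_image, hist_image)
        if score >= best_score:
            best_pose, best_score = hist_pose, score
    # current_lds_pose would be fused with best_pose in a complete
    # implementation; this sketch returns the matched reference directly.
    return best_pose  # None means no history image matched

db = [("kitchen", (1.0, 2.0, 0.0)), ("hall", (5.0, 0.0, 1.57))]
exact = lambda a, b: 1.0 if a == b else 0.0
```

The key design point is that image matching alone only recovers a coarse reference; combining it with the ranging-based current pose is what anchors the device in the constructed map.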
As can be seen from the embodiments shown in Figs. 5-6, by adding the change of the image data collected by the camera as a basis for determining whether a kidnapping event occurs for the automatic cleaning device, accurate detection of kidnapping events can be achieved, which facilitates the subsequent relocation. Referring to Fig. 7, Fig. 7 is a flowchart of verifying current pose information according to an exemplary embodiment. As shown in Fig. 7, the method is applied to the automatic cleaning device in any of the above embodiments and may include the following steps:
In step 702, the current pose information is obtained.
In the present embodiment, the current pose information is the current pose information in any of the above embodiments (the current pose information of the automatic cleaning device determined from the ranging data currently collected by the LDS).
In step 704, it is judged whether the current pose information contains an error; if an error exists, the flow proceeds to step 706; otherwise, the flow returns to step 702.
In the present embodiment, in one case, when the LDS is currently blocked (for example, the ranging data of the LDS remain constant) or the automatic cleaning device slips (for example, the data of the accelerometer and the odometer of the automatic cleaning device do not match), the current pose information can be determined to contain an error; in another case, when the current pose information does not match any of the history pose information, the current pose information can be determined to contain an error.
In step 706, the history pose information is obtained.
In the present embodiment, the history pose information is the history pose information in any of the above embodiments (for example, the history pose information in step 306).
In step 708, the current pose information is corrected using the history pose information.
In the present embodiment, when the current pose information is determined to contain an error, correcting the current pose information using the history pose information helps improve the accuracy of the map constructed from the LDS data.
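A minimal sketch of the error check of step 704 and the correction of step 708, assuming simple list-based scan and tuple-based pose representations (all names and tolerances below are hypothetical):

```python
def pose_has_error(recent_lds_scans, accel_speed, odom_speed, slip_tol=0.5):
    """Detect the two error conditions described above.

    1. LDS blocked: the range readings stay constant over recent scans.
    2. Slippage: the accelerometer-derived and odometer-derived speeds
       disagree by more than slip_tol.
    """
    lds_blocked = all(scan == recent_lds_scans[0] for scan in recent_lds_scans)
    slipping = abs(accel_speed - odom_speed) > slip_tol
    return lds_blocked or slipping

def correct_pose(current_pose, history_pose, weight=1.0):
    """Correct an erroneous current pose using the matched history pose;
    weight=1.0 replaces it outright, smaller weights blend the two."""
    return tuple(weight * h + (1.0 - weight) * c
                 for c, h in zip(current_pose, history_pose))
```

The blend weight is a design knob: outright replacement trusts the history pose fully, while a partial weight hedges against the history pose itself being stale.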
Fig. 8 is a flowchart of an automatic cleaning device executing a cleaning operation according to an exemplary embodiment. As shown in Fig. 8, the method is applied to the automatic cleaning device in any of the above embodiments and may include the following steps:
In step 802, current image data are collected.
In step 804, the scene type of the current image data is identified.
In the present embodiment, a machine learning model for scene recognition (which may be obtained by training on sample data labeled with scene types) may be used for the identification. Alternatively, image data matching the current image data may be searched for in a preset scene type database (which stores image data of various scene types, for example living room, toilet, bedroom, etc.), and the scene type of the found image data may be taken as the scene type indicated by the current image data. Alternatively, a user may control the automatic cleaning device to move to each scene to capture image data and label the corresponding scene type, so that the automatic cleaning device can subsequently identify the scene type of the environment being cleaned according to the labels. Of course, the disclosure does not limit the manner of identifying the scene type.
In step 806, the corresponding cleaning strategy is looked up.
In the present embodiment, since the automatic cleaning device can capture images of the environment being cleaned through the camera during cleaning, the cleaning strategy of the automatic cleaning device can be adjusted according to the scene type of that environment, so as to improve the cleaning efficiency and the user experience.
For example, for the scene types of the environment being cleaned, the cleaning strategies shown in Table 1 may be configured:

  Scene type       Cleaning strategy
  Living room      intensive cleaning mode
  Toilet           no cleaning
  Bedroom          silent mode
  ...              ...

  Table 1

Of course, Table 1 is only an example of configuring cleaning strategies; the specific cleaning strategies can be flexibly set by the user according to the actual situation, and the disclosure does not limit this. It can be seen that, by configuring corresponding cleaning strategies for different scene types, the different cleaning requirements of users can be met, which helps improve the user experience.
In step 808, cleaning is performed according to the cleaning strategy found.
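Steps 806-808 might be sketched as a simple table lookup; the scene keys and mode names below are illustrative assumptions mirroring Table 1, not values fixed by the disclosure:

```python
# Hypothetical strategy table mirroring Table 1; keys and mode names are
# illustrative assumptions only.
CLEANING_STRATEGIES = {
    "living_room": "intensive",
    "toilet": "skip",
    "bedroom": "silent",
}

def strategy_for_scene(scene_type, default="standard"):
    """Look up the cleaning strategy configured for the recognized scene
    type (step 806); unknown scenes fall back to a default mode."""
    return CLEANING_STRATEGIES.get(scene_type, default)
```

A user-editable mapping of this kind is what allows the strategies to be "flexibly set according to the actual situation" without changing the cleaning logic itself.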
Corresponding to the foregoing embodiments of the localization method of the automatic cleaning device, the disclosure further provides embodiments of a positioning device of the automatic cleaning device.
Fig. 9 is a block diagram of a positioning device of a robot according to an exemplary embodiment. Referring to Fig. 9, the robot is configured with an image acquisition unit and a distance measuring unit; the device includes an image data determination unit 901, a pose acquisition unit 902, a pose determination unit 903 and a positioning unit 904.
The image data determination unit 901 is configured to determine, according to the current image data collected by the image acquisition unit, the history image data that match the current image data, the history image data being collected by the image acquisition unit at a historical moment.
The pose acquisition unit 902 is configured to obtain the history pose information of the robot corresponding to the time the history image data were collected.
The pose determination unit 903 is configured to determine the current pose information of the robot according to the ranging data currently collected by the distance measuring unit.
The positioning unit 904 is configured to locate the current position of the robot according to the history pose information and the current pose information.
As shown in Figure 10, Figure 10 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, this embodiment further includes a kidnapping event determination unit 905, and the positioning unit 904 includes a first positioning subunit 9041.
The kidnapping event determination unit 905 is configured to determine whether a kidnapping event occurs for the robot.
The first positioning subunit 9041 is configured to, when it is determined that a kidnapping event occurs for the robot, locate the position of the robot after the kidnapping event according to the history pose information and the current pose information.
As shown in Figure 11, Figure 11 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Figure 10, the kidnapping event determination unit 905 includes a kidnapping event determination subunit 9051.
The kidnapping event determination subunit 9051 is configured to determine that a kidnapping event occurs for the robot when the image data collected by the image acquisition unit and/or the ranging data collected by the distance measuring unit mutate.
As shown in Figure 12, Figure 12 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, the positioning unit 904 includes a first determination subunit 9042 and a second positioning subunit 9043.
The first determination subunit 9042 is configured to determine the target history pose information in the history pose information that matches the current pose information.
The second positioning subunit 9043 is configured to locate, according to the target history pose information, the current position of the robot in the map constructed from the data of the distance measuring unit.
It should be noted that the structures of the first determination subunit 9042 and the second positioning subunit 9043 in the device embodiment shown in Figure 12 may also be included in the device embodiment of Figure 10, and the disclosure does not limit this.
As shown in Figure 13, Figure 13 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, the positioning unit 904 includes an acquisition subunit 9044, a second determination subunit 9045 and a third positioning subunit 9046.
The acquisition subunit 9044 is configured to obtain a three-dimensional environment composition, the three-dimensional environment composition being constructed from the history image data and the history pose information.
The second determination subunit 9045 is configured to determine, based on the three-dimensional environment composition, the pose information corresponding to the current image data.
The third positioning subunit 9046 is configured to locate, according to the determined pose information, the current position of the robot in the map constructed from the data of the distance measuring unit.
Optionally, the three-dimensional environment composition is constructed in advance from the collected history image data and the history pose information; alternatively, the three-dimensional environment composition is constructed from the history image data and the history pose information after the history image data matching the current image data have been determined.
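A heavily simplified sketch of determining pose information from the three-dimensional environment composition (steps 610-614): a real system would recover a full 6-DoF pose, e.g. with a PnP solver, whereas this toy version assumes known feature correspondences, ignores rotation, and uses hypothetical names throughout:

```python
def pose_from_composition(observed, composition):
    """Estimate the device position from a three-dimensional environment
    composition, with rotation ignored for simplicity.

    observed:    {feature_id: (dx, dy, dz)} offsets of recognized features
                 relative to the device, from the current image data.
    composition: {feature_id: (x, y, z)} world coordinates of the same
                 features, built from history image data and history poses.
    The position estimate is the mean of (world point - relative offset)
    over all matched features.
    """
    matched = [fid for fid in observed if fid in composition]
    if not matched:
        return None
    est = [0.0, 0.0, 0.0]
    for fid in matched:
        for i in range(3):
            est[i] += composition[fid][i] - observed[fid][i]
    return tuple(v / len(matched) for v in est)
```

Averaging over several matched features, as here, is what makes the estimate tolerant of noise in any single observation.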
It should be noted that the structures of the acquisition subunit 9044, the second determination subunit 9045 and the third positioning subunit 9046 in the device embodiment shown in Figure 13 may also be included in the device embodiment of Figure 10, and the disclosure does not limit this.
As shown in Figure 14, Figure 14 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, this embodiment further includes a judging unit 906 and a correction unit 907.
The judging unit 906 is configured to determine whether the current pose information contains an error.
The correction unit 907 is configured to correct the current pose information using the history pose information when it is determined that the current pose information contains an error.
It should be noted that the structures of the judging unit 906 and the correction unit 907 in the device embodiment shown in Figure 14 may also be included in the device embodiment of Figure 10, and the disclosure does not limit this.
As shown in Figure 15, Figure 15 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Figure 14, the judging unit 906 includes a first judging subunit 9061.
The first judging subunit 9061 is configured to determine that the current pose information contains an error when the distance measuring unit is currently blocked or the robot slips.
As shown in Figure 16, Figure 16 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Figure 14, the judging unit 906 includes a second judging subunit 9062.
The second judging subunit 9062 is configured to determine that the current pose information contains an error when the current pose information does not match any of the history pose information.
As shown in Figure 17, Figure 17 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, this embodiment further includes a statistics unit 908 and a deletion unit 909.
The statistics unit 908 is configured to, during the movement of the robot in executing a specific operation, search for the history image data that match the image data collected by the image acquisition unit, and count the corresponding number of matches.
The deletion unit 909 is configured to, when the ratio of the number of matches of any history image data to the number of executions of the specific operation is less than a preset threshold, delete that history image data and the pose information of the robot corresponding to the time that history image data were collected.
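The statistics and deletion logic above might be sketched as follows, assuming a hypothetical dictionary layout for the stored history records (the key names and threshold are assumptions, not taken from the disclosure):

```python
def prune_history(history, run_count, ratio_thresh=0.1):
    """Drop history records whose match ratio falls below the threshold.

    history maps a history-image key to a record holding its accumulated
    match count and the pose stored when the image was collected; records
    matched in fewer than ratio_thresh of the run_count executions of the
    specific operation are deleted together with their poses.
    """
    return {key: rec for key, rec in history.items()
            if rec["matches"] / run_count >= ratio_thresh}

db = {
    "moved_chair": {"matches": 2, "pose": (0.5, 1.0, 0.0)},
    "wall_corner": {"matches": 80, "pose": (3.0, 3.0, 1.6)},
}
```

Pruning rarely-matched records keeps the database dominated by stable landmarks (walls, fixtures) rather than objects that have since been moved.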
It should be noted that the structures of the statistics unit 908 and the deletion unit 909 in the device embodiment shown in Figure 17 may also be included in the device embodiments of Figure 10 and Figure 14, and the disclosure does not limit this.
As shown in Figure 18, Figure 18 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, all the history image data and the history pose information are stored in a preset database; this embodiment further includes a depletion region determination unit 910 and an updating unit 911.
The depletion region determination unit 910 is configured to, when an update instruction for the preset database is received, determine a depletion region according to the map constructed by the robot during its movement.
The updating unit 911 is configured to collect image data and corresponding pose information in the depletion region through the image acquisition unit, so as to update the preset database.
It should be noted that the structures of the depletion region determination unit 910 and the updating unit 911 in the device embodiment shown in Figure 18 may also be included in the device embodiments of Figure 10, Figure 14 and Figure 17, and the disclosure does not limit this.
As shown in Figure 19, Figure 19 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, the robot is configured with corresponding cleaning strategies for different scene types; this embodiment further includes a strategy adjustment unit 912.
The strategy adjustment unit 912 is configured to, during the execution of a cleaning operation by the robot, adopt the corresponding cleaning strategy for cleaning according to the scene recognition result of the image data collected by the image acquisition unit.
It should be noted that the structure of the strategy adjustment unit 912 in the device embodiment shown in Figure 19 may also be included in the device embodiments of Figure 10, Figure 14, Figure 17 and Figure 18, and the disclosure does not limit this.
As shown in Figure 20, Figure 20 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, the image data determination unit 901 includes a first image data determination subunit 9011.
The first image data determination subunit 9011 is configured to determine that the history image data match the current image data when the similarity (or matching degree) between the current image data and the history image data exceeds a preset threshold.
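A minimal sketch of the similarity test performed by the first image data determination subunit 9011; the pixel-difference similarity measure and threshold below are assumptions for illustration, and a practical system would compare feature descriptors rather than raw pixels:

```python
def images_match(current, history, threshold=0.8):
    """Declare a match when similarity exceeds the preset threshold.

    Similarity is 1 minus the normalized mean absolute difference of two
    equal-length flat grayscale pixel lists (values 0-255).
    """
    diff = sum(abs(a - b) for a, b in zip(current, history)) / (
        255.0 * len(current))
    return (1.0 - diff) >= threshold
```

The threshold trades false matches against missed ones: raising it makes relocation more conservative but may reject genuinely revisited scenes viewed under changed lighting.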
As shown in Figure 21, Figure 21 is a block diagram of another positioning device of a robot according to an exemplary embodiment. On the basis of the foregoing embodiment shown in Fig. 9, the image data determination unit 901 includes a second image data determination subunit 9012.
The second image data determination subunit 9012 is configured to determine that the history image data match the current image data when the current image data and the history image data contain one or more of the same collected objects.
Regarding the devices in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related methods, and will not be elaborated here.
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the partial descriptions of the method embodiments for the relevant parts. The device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the disclosure, and a person of ordinary skill in the art can understand and implement them without creative effort.
Correspondingly, the disclosure also provides a robot. The robot is configured with an image acquisition unit and a distance measuring unit, and further includes a processor and a memory for storing instructions executable by the processor. The processor runs the executable instructions to implement the localization method of the robot as described in any of the above embodiments. For example, the method may include: determining, according to the current image data collected by the image acquisition unit, the history image data that match the current image data, the history image data being collected by the image acquisition unit at a historical moment; obtaining the history pose information of the robot corresponding to the time the history image data were collected; determining the current pose information of the robot according to the ranging data currently collected by the distance measuring unit; and locating the current position of the robot according to the history pose information and the current pose information.
Correspondingly, the disclosure also provides a terminal. The terminal includes a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for implementing the localization method of the robot as described in any of the above embodiments. For example, the method may include: determining, according to the current image data collected by the image acquisition unit, the history image data that match the current image data, the history image data being collected by the image acquisition unit at a historical moment; obtaining the history pose information of the robot corresponding to the time the history image data were collected; determining the current pose information of the robot according to the ranging data currently collected by the distance measuring unit; and locating the current position of the robot according to the history pose information and the current pose information.
Figure 22 is a block diagram of a positioning device 2200 for a robot according to an exemplary embodiment. For example, the device 2200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to Figure 22, the device 2200 may include one or more of the following components: a processing component 2202, a memory 2204, a power component 2206, a multimedia component 2208, an audio component 2210, an input/output (I/O) interface 2212, a sensor component 2214, and a communication component 2216.
The processing component 2202 typically controls the overall operations of the device 2200, such as the operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 2202 may include one or more processors 2220 to execute instructions to perform all or part of the steps of the above methods. In addition, the processing component 2202 may include one or more modules to facilitate the interaction between the processing component 2202 and other components. For example, the processing component 2202 may include a multimedia module to facilitate the interaction between the multimedia component 2208 and the processing component 2202.
The memory 2204 is configured to store various types of data to support the operation of the device 2200. Examples of such data include instructions for any application or method operated on the device 2200, contact data, phonebook data, messages, pictures, videos, and the like. The memory 2204 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk or an optical disk.
The power component 2206 provides power to the various components of the device 2200. The power component 2206 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 2200.
The multimedia component 2208 includes a screen providing an output interface between the device 2200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 2208 includes a front camera and/or a rear camera. When the device 2200 is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 2210 is configured to output and/or input audio signals. For example, the audio component 2210 includes a microphone (MIC) configured to receive external audio signals when the device 2200 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 2204 or sent via the communication component 2216. In some embodiments, the audio component 2210 further includes a speaker for outputting audio signals.
The I/O interface 2212 provides an interface between the processing component 2202 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
The sensor component 2214 includes one or more sensors for providing state assessments of various aspects of the device 2200. For example, the sensor component 2214 can detect the open/closed state of the device 2200 and the relative positioning of components, such as the display and keypad of the device 2200; the sensor component 2214 can also detect a position change of the device 2200 or a component of the device 2200, the presence or absence of user contact with the device 2200, the orientation or acceleration/deceleration of the device 2200, and a temperature change of the device 2200. The sensor component 2214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 2214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 2214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 2216 is configured to facilitate wired or wireless communication between the device 2200 and other devices. The device 2200 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 2216 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 2216 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 2200 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as the memory 2204 including instructions, which can be executed by the processor 2220 of the device 2200 to perform the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will readily occur to those skilled in the art after considering the specification and practicing the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include common knowledge or conventional technical means in the art not disclosed herein. The specification and examples are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the disclosure is limited only by the appended claims.