CN109431381A - Robot localization method and apparatus, electronic device, and storage medium - Google Patents

Robot localization method and apparatus, electronic device, and storage medium

Info

Publication number: CN109431381A
Application number: CN201811268293.6A
Authority: CN (China)
Prior art keywords: history, pose information, robot, image data, current
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN109431381B
Inventors: 曹晶瑛, 罗晗, 王磊, 薛英男, 蔡为燕, 吴震
Current assignee: Beijing Stone Innovation Technology Co., Ltd. (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee / application filed by: Beijing Rockrobo Technology Co., Ltd.
Priority applications: CN201811268293.6A (granted as CN109431381B); CN202210535950.9A (granted as CN114847803B)


Classifications

    • A: Human necessities › A47L: Domestic washing or cleaning; suction cleaners in general › A47L 11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings › A47L 11/24: Floor-sweeping machines, motor-driven
    • A47L 11/28: Floor-scrubbing machines, motor-driven
    • A47L 11/40: Parts or details of machines not provided for in groups A47L 11/02 to A47L 11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers › A47L 11/4011: Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • G: Physics › G06T: Image data processing or generation, in general › G06T 7/00: Image analysis › G06T 7/70: Determining position or orientation of objects or cameras
    • G06V: Image or video recognition or understanding › G06V 10/70: Arrangements using pattern recognition or machine learning › G06V 10/74: Image or video pattern matching; proximity measures in feature spaces › G06V 10/75: Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; coarse-fine approaches, e.g. multi-scale approaches; using context analysis; selection of dictionaries
    • G06V 20/00: Scenes; scene-specific elements › G06V 20/10: Terrestrial scenes

Abstract

The present disclosure relates to a robot localization method and apparatus, an electronic device, and a computer-readable storage medium. The method may include: determining, from current image data collected by the image acquisition unit, history image data that matches the current image data, the history image data having been collected by the image acquisition unit at a historical moment; obtaining the history pose information of the robot at the time the history image data was collected; determining the current pose information of the robot from ranging data currently collected by the ranging unit; and localizing the current position of the robot based on the history pose information and the current pose information. The localization scheme of the present disclosure improves localization accuracy and thereby further improves the robot's working efficiency.

Description

Robot localization method and apparatus, electronic device, and storage medium
Technical field
The present disclosure relates to the field of robotics, and in particular to a robot localization method and apparatus, an electronic device, and a storage medium.
Background
With advances in technology, a variety of robots with autonomous movement capabilities have emerged, such as automatic cleaning devices like sweeping robots and floor-mopping robots. An automatic cleaning device can perform cleaning operations automatically by actively perceiving its surroundings. For example, in the related art, SLAM (Simultaneous Localization And Mapping) is used to build a map of the environment currently to be cleaned, and cleaning operations are performed according to the constructed map.
However, localization methods in the related art suffer from inaccurate positioning, which tends to reduce the robot's working efficiency.
Summary of the invention
The disclosure provides the localization method and device, electronic equipment, computer readable storage medium of a kind of robot, with solution Certainly deficiency in the related technology.
According to the first aspect of the embodiments of the present disclosure, a kind of localization method of robot is provided, the robot is configured with Image acquisition units and distance measuring unit;The described method includes:
According to the collected current image date of described image acquisition unit, determination matches with the current image date History image data, the history image data collect by described image acquisition unit in the historical juncture;
Obtain the robot history posture information corresponding when the history image data are collected;
According to the current collected ranging data of the distance measuring unit, the current posture information of the robot is determined;
According to the history posture information and the current posture information, the current location of the robot is determined Position.
Optionally, the method
further includes: determining whether a kidnapping event has occurred to the robot;
and localizing the current position of the robot based on the history pose information and the current pose information includes: when it is determined that a kidnapping event has occurred to the robot, localizing the position of the robot after the kidnapping event based on the history pose information and the current pose information.
Optionally, determining whether a kidnapping event has occurred to the robot includes:
determining that a kidnapping event has occurred to the robot when the image data collected by the image acquisition unit and/or the ranging data collected by the ranging unit changes abruptly.
Optionally, determining, from the current image data collected by the image acquisition unit, history image data that matches the current image data includes:
determining that the history image data matches the current image data when the similarity between the current image data and the history image data exceeds a preset threshold.
Optionally, determining, from the current image data collected by the image acquisition unit, history image data that matches the current image data includes:
determining that the history image data matches the current image data when the current image data and the history image data contain one or more of the same photographed objects.
Optionally, localizing the current position of the robot based on the history pose information and the current pose information includes:
determining target history pose information that matches the current pose information among the history pose information;
localizing, based on the target history pose information, the current position of the robot in the map constructed from the ranging unit's data.
Optionally, localizing the current position of the robot based on the history pose information and the current pose information includes:
obtaining a three-dimensional environment map, the three-dimensional environment map being built from the history image data and the history pose information;
determining, based on the three-dimensional environment map, the pose information corresponding to the current image data;
localizing, based on the determined pose information, the current position of the robot in the map constructed from the ranging unit's data.
Optionally, the three-dimensional environment map is built in advance from the collected history image data and the history pose information; alternatively, the three-dimensional environment map is built from the history image data and the history pose information after the history image data matching the current image data has been determined.
Optionally, the method further includes:
determining whether the current pose information is erroneous;
when it is determined that the current pose information is erroneous, correcting the current pose information using the history pose information.
Optionally, determining whether the current pose information is erroneous includes:
determining that the current pose information is erroneous when the ranging unit is currently occluded or the robot is slipping.
Optionally, determining whether the current pose information is erroneous includes:
determining that the current pose information is erroneous when it matches none of the history pose information.
Optionally, the method further includes:
during movement in which the robot performs a specific operation, searching for history image data that matches the image data collected by the image acquisition unit, and counting the corresponding number of matches;
when the ratio of the match count of any history image data to the number of times the specific operation has been performed is below a preset threshold, deleting that history image data and the robot's pose information corresponding to the time that history image data was collected.
Optionally, all history image data and history pose information are stored in a preset database; the method further includes:
upon receiving an update instruction for the preset database, determining the cleaned region from the map the robot has constructed during movement;
collecting image data and corresponding pose information in the cleaned region via the image acquisition unit to update the preset database.
Optionally, the robot is configured with corresponding cleaning strategies for different scene types; the method further includes:
while the robot performs cleaning operations, adopting the corresponding cleaning strategy based on the scene recognition result for the image data collected by the image acquisition unit.
According to a second aspect of the embodiments of the present disclosure, a robot localization apparatus is provided, the robot being equipped with an image acquisition unit and a ranging unit. The apparatus includes:
an image data determination unit that determines, from current image data collected by the image acquisition unit, history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical moment;
a pose acquisition unit that obtains the history pose information of the robot at the time the history image data was collected;
a pose determination unit that determines the current pose information of the robot from ranging data currently collected by the ranging unit;
a localization unit that localizes the current position of the robot based on the history pose information and the current pose information.
Optionally, the apparatus
further includes: a kidnapping-event determination unit that determines whether a kidnapping event has occurred to the robot;
and the localization unit includes: a first localization subunit that, when it is determined that a kidnapping event has occurred to the robot, localizes the position of the robot after the kidnapping event based on the history pose information and the current pose information.
Optionally, the kidnapping-event determination unit includes:
a kidnapping-event determination subunit that determines that a kidnapping event has occurred to the robot when the image data collected by the image acquisition unit and/or the ranging data collected by the ranging unit changes abruptly.
Optionally, the localization unit includes:
a first determination subunit that determines target history pose information matching the current pose information among the history pose information;
a second localization subunit that localizes, based on the target history pose information, the current position of the robot in the map constructed from the ranging unit's data.
Optionally, the localization unit includes:
an acquisition subunit that obtains a three-dimensional environment map, the three-dimensional environment map being built from the history image data and the history pose information;
a second determination subunit that determines, based on the three-dimensional environment map, the pose information corresponding to the current image data;
a third localization subunit that localizes, based on the determined pose information, the current position of the robot in the map constructed from the ranging unit's data.
Optionally, the three-dimensional environment map is built in advance from the collected history image data and the history pose information; alternatively, the three-dimensional environment map is built from the history image data and the history pose information after the history image data matching the current image data has been determined.
Optionally, the apparatus further includes:
a judging unit that determines whether the current pose information is erroneous;
a correction unit that, when it is determined that the current pose information is erroneous, corrects the current pose information using the history pose information.
Optionally, the judging unit includes:
a first determination subunit that determines that the current pose information is erroneous when the ranging unit is currently occluded or the robot is slipping.
Optionally, the judging unit includes:
a second determination subunit that determines that the current pose information is erroneous when it matches none of the history pose information.
Optionally, the apparatus further includes:
a statistics unit that, during movement in which the robot performs a specific operation, searches for history image data matching the image data collected by the image acquisition unit and counts the corresponding number of matches;
a deletion unit that, when the ratio of the match count of any history image data to the number of times the specific operation has been performed is below a preset threshold, deletes that history image data and the robot's pose information corresponding to the time that history image data was collected.
Optionally, all history image data and history pose information are stored in a preset database; the apparatus further includes:
a cleaned-region determination unit that, upon receiving an update instruction for the preset database, determines the cleaned region from the map the robot has constructed during movement;
an update unit that collects image data and corresponding pose information in the cleaned region via the image acquisition unit to update the preset database.
Optionally, the robot is configured with corresponding cleaning strategies for different scene types; the apparatus further includes:
a strategy adjustment unit that, while the robot performs cleaning operations, adopts the corresponding cleaning strategy based on the scene recognition result for the image data collected by the image acquisition unit.
Optionally, the image data determination unit includes:
a first image data determination subunit that determines that the history image data matches the current image data when the similarity between the current image data and the history image data exceeds a preset threshold.
Optionally, the image data determination unit includes:
a second image data determination subunit that determines that the history image data matches the current image data when the current image data and the history image data contain one or more of the same photographed objects.
According to a third aspect of the embodiments of the present disclosure, a robot is provided, the robot being equipped with an image acquisition unit and a ranging unit. The robot further includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor implements the method of any of the above embodiments by executing the executable instructions.
According to a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, on which computer instructions are stored, wherein the instructions, when executed by a processor, implement the steps of any of the methods in the above embodiments.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
As can be seen from the above embodiments, the present disclosure adds an image acquisition unit to a robot already equipped with a ranging unit, so that the robot can collect image data while moving and map that image data to the pose information at the time of collection. During subsequent movement, the history pose information corresponding to the history image data that matches the current image data can serve as a reference and, together with the current pose information determined by the ranging unit, be used to localize the robot's current position, improving the accuracy of the robot's self-localization and further improving its working efficiency.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram of a robot undergoing a kidnapping event, according to an exemplary embodiment.
Fig. 2 is a flowchart of a robot localization method, according to an exemplary embodiment.
Figs. 3-4 are flowcharts of another robot localization method, according to an exemplary embodiment.
Fig. 5 is a flowchart of a robot relocalization method, according to an exemplary embodiment.
Fig. 6 is a flowchart of another robot relocalization method, according to an exemplary embodiment.
Fig. 7 is a flowchart of verifying current pose information, according to an exemplary embodiment.
Fig. 8 is a flowchart of an automatic cleaning device performing a cleaning operation, according to an exemplary embodiment.
Fig. 9 is a block diagram of a robot localization apparatus, according to an exemplary embodiment.
Figs. 10-21 are block diagrams of other robot localization apparatuses, according to an exemplary embodiment.
Fig. 22 is a structural schematic diagram of a robot localization apparatus, according to an exemplary embodiment.
Detailed Description
Exemplary embodiments are described in detail here, with examples illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; rather, they are merely examples of apparatuses and methods consistent with some aspects of this application as detailed in the appended claims.
The terminology used in this application is for the purpose of describing particular embodiments only and is not intended to limit this application. The singular forms "a", "said", and "the" used in this application and the appended claims are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this application to describe various information, such information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this application, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
In the related art, taking a sweeping robot as an example: the sweeping robot is usually equipped with an LDS (Laser Distance Sensor) and localizes itself during cleaning using a SLAM algorithm. However, when localizing with the LDS, on the one hand the localization accuracy is low; on the other hand, when the sweeping robot is kidnapped, the LDS may be unable to detect the kidnapping event. This is illustrated below with reference to Fig. 1.
Referring to Fig. 1, Fig. 1 is a schematic diagram of a sweeping robot undergoing a kidnapping event, according to an exemplary embodiment. As shown in Fig. 1, when the sweeping robot 100 performs a cleaning operation in a rectangular environment 10, if the sweeping robot 100 is moved from position A to position B (i.e., a kidnapping event has occurred to the sweeping robot 100), the sweeping robot 100 cannot determine from the LDS that it has been kidnapped, since positions A and B produce similar distance data. In fact, because the sweeping robot 100 cannot determine from the LDS that it has been kidnapped, it instead concludes that it is still at position A (while it is actually at position B, i.e., a relocalization error) and continues the cleaning operation according to its cleaning strategy for position A (for example, keeping the cleaning route unchanged), which reduces cleaning efficiency.
Therefore, the present disclosure improves the localization method of robots with autonomous movement capabilities to solve the above technical problems in the related art. Detailed descriptions are given below with reference to the embodiments.
The robot 100 provided by the present disclosure may be (but is not limited to) an automatic cleaning device such as a sweeping robot, a floor-mopping robot, or a combined sweeping-and-mopping robot. The robot 100 may include a machine body, a perception system, a control system, a drive system, a cleaning system, an energy system, and a human-machine interaction system.
Among them:
The machine body includes a forward portion and a rearward portion and has an approximately circular shape (circular both front and rear), but may also have other shapes, including but not limited to an approximate D-shape that is straight in front and rounded in the rear.
The perception system includes a position determining device located above the machine body, a bumper located at the forward portion of the machine body, and sensing devices such as cliff sensors, ultrasonic sensors, infrared sensors, a magnetometer, an accelerometer, a gyroscope, and an odometer, which provide the control system with various position and motion state information of the machine. The position determining device includes, but is not limited to, a camera and a laser distance sensor (LDS). The following takes a laser distance sensor based on triangulation as an example to illustrate how position determination is performed. The basic principle of triangulation is the proportional relationship of similar triangles, which is not repeated here.
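As a brief illustration of that similar-triangles relationship (the patent does not spell it out, so the names below are assumptions): with a baseline s between the laser emitter and the image sensor, a receiving lens of focal length f, and the reflected spot imaged at offset x from the sensor center, the range follows as d = f·s/x.

```python
def triangulation_distance(baseline_m: float, focal_px: float, offset_px: float) -> float:
    """Similar-triangles range estimate: d = f * s / x.
    baseline_m: emitter-to-sensor baseline s (m); focal_px: lens focal
    length f in pixels; offset_px: measured spot offset x in pixels."""
    if offset_px <= 0:
        raise ValueError("spot offset must be positive")
    return focal_px * baseline_m / offset_px

# Example: 5 cm baseline, 700 px focal length, 35 px offset -> 1.0 m range.
print(triangulation_distance(0.05, 700.0, 35.0))
```

Note how the formula also explains the long-range limits discussed later: as distance grows, the offset x shrinks toward the size of a single pixel, so resolution degrades.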
The laser distance sensor includes a light-emitting unit and a light-receiving unit. The light-emitting unit may include a light source that emits light; the light source may include a light-emitting element, such as an LED emitting infrared light or visible light. Preferably, the light source is a light-emitting element that emits a laser beam. In this embodiment, a laser diode (LD) is taken as an example of the light source. Specifically, owing to the monochromatic, directional, and collimated properties of a laser beam, a laser light source allows more accurate measurement than other light sources. For example, compared with a laser beam, the infrared or visible light emitted by an LED is affected by the surrounding environment (such as the color or texture of objects), which may reduce measurement accuracy. The laser diode (LD) may be a point laser, which measures two-dimensional position information of obstacles, or a line laser, which measures three-dimensional position information of obstacles within a certain range.
The light-receiving unit may include an image sensor on which light spots reflected or scattered by obstacles are formed. The image sensor may be a set of unit pixels in a single row or multiple rows; these light-receiving elements convert optical signals into electrical signals. The image sensor may be a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor, the CMOS sensor being preferred for its cost advantage. Moreover, the light-receiving unit may include a receiving lens assembly. Light reflected or scattered by obstacles may travel through the receiving lens assembly to form an image on the image sensor. The lens assembly may include a single lens or multiple lenses.
A base may support the light-emitting unit and the light-receiving unit, which are arranged on the base and separated from each other by a specific distance. To measure obstacles around the robot in all 360 degrees, the base may be rotatably arranged on the main body, or the base itself may remain stationary while a rotating element rotates the emitted and received light. The angular velocity of the rotating element can be obtained with an opto-coupler and a code disc: the opto-coupler senses the tooth gaps on the code disc, and the instantaneous angular velocity is obtained by dividing the gap spacing by the time taken to slip past a gap. The denser the gaps on the code disc, the higher the accuracy and precision of measurement, but the more precise the structure must be and the higher the computational load; conversely, sparser gaps give lower accuracy and precision but allow a simpler structure and lower computational load, reducing cost.
A data processing device connected to the light-receiving unit, such as a DSP, records the obstacle distance values at all angles relative to the robot's zero-degree direction and sends them to a data processing unit in the control system, such as an application processor (AP) containing a CPU. The CPU runs a particle-filter-based localization algorithm to obtain the robot's current position and builds a map from it for navigation. Simultaneous Localization And Mapping (SLAM) is preferably used as the localization algorithm.
Although the laser distance sensor based on triangulation can in principle measure distance values at arbitrary distances beyond a certain range, long-range measurement in practice, e.g., beyond 6 meters, is very difficult, mainly because of the size limitation of the pixel units on the sensor of the light-receiving unit, and also because of the photoelectric conversion speed of the sensor, the data transmission speed between the sensor and the DSP, and the computing speed of the DSP. Temperature also affects the measured values in ways the system cannot tolerate, mainly because thermal expansion of the structure between the light-emitting unit and the light-receiving unit changes the angle between incident and emitted light, and the light-emitting and light-receiving units themselves also exhibit temperature drift. After long-term use of the laser distance sensor, deformation accumulated from many factors such as temperature changes and vibration can also seriously affect the measurement results. The accuracy of the measurement results directly determines the accuracy of map construction and is the basis for the robot's further strategy implementation, making it particularly important.
The forward portion of the machine body may carry a bumper. While the driving wheel modules propel the robot across the floor during cleaning, the bumper detects one or more events (or objects) in the travel path of the robot 100 via a sensing system, such as an infrared sensor. The robot can respond to the events (or objects) detected by the bumper, such as obstacles or walls, by controlling the driving wheel modules, for example moving away from the obstacle.
The control system is arranged on a circuit board in the machine body and includes non-transitory memory, such as a hard disk, flash memory, or random-access memory, and a communicating computing processor, such as a central processing unit or application processor. Using a localization algorithm such as SLAM, the application processor draws an instant map of the robot's environment from the obstacle information fed back by the laser distance sensor. Combining the distance and velocity information fed back by the bumper, cliff sensors, ultrasonic sensors, infrared sensors, magnetometer, accelerometer, gyroscope, odometer, and other sensing devices, it comprehensively judges the sweeper's current working state, such as crossing a threshold, climbing onto a carpet, standing at a cliff, being stuck above or below, having a full dust box, or being picked up, and provides specific next-step action strategies for the different situations, so that the robot's operation better meets the owner's requirements and offers a better user experience. Further, the control system can plan the most efficient and reasonable cleaning path and cleaning method based on the instant map drawn by SLAM, greatly improving the robot's cleaning efficiency.
The drive system steers the robot 100 across the floor based on drive commands with distance and angle information, such as x, y, and θ components. The drive system includes driving wheel modules that can control the left and right wheels simultaneously; to control the machine's movement more precisely, the driving wheel modules preferably include a left driving wheel module and a right driving wheel module. The left and right driving wheel modules are opposed along a lateral axis defined by the main body. To move more stably on the floor or to have stronger mobility, the robot may include one or more driven wheels, including but not limited to universal wheels. A driving wheel module includes a travel wheel, a drive motor, and a control circuit for the drive motor, and can also be connected to a circuit measuring the drive current and to the odometer. The driving wheel modules can be detachably connected to the main body for easy disassembly and maintenance. A driving wheel may have a biased drop suspension system, movably fastened, e.g., rotatably attached, to the robot main body, and receiving a spring bias directed downward and away from the robot main body. The spring bias allows the driving wheel to maintain contact and traction with the floor with a certain grounding force, while the cleaning elements of the robot 100 also contact the floor 10 with a certain pressure.
The cleaning system may be a dry cleaning system and/or a wet cleaning system. In a dry cleaning system, the main cleaning function is provided by the sweeping system composed of the roller brush structure, the dust box structure, the fan structure, the air outlet, and the connecting parts between the four. The roller brush structure, which has a certain interference with the floor, sweeps up the rubbish on the floor and carries it to the front of the suction inlet between the roller brush structure and the dust box structure, whence it is sucked into the dust box by the suctioning airflow generated by the fan and passing through the dust box. The dust pick-up capability of the sweeper can be characterized by the dust pick-up efficiency (DPU). The DPU is influenced by the roller brush structure and its materials, by the airflow utilization rate of the air duct composed of the suction inlet, the dust box structure, the fan structure, the air outlet, and the connecting parts between the four, and by the type and power of the fan; it is a complex system design problem. Compared with an ordinary plug-in vacuum cleaner, improving dust pick-up capability matters more for a cleaning robot with limited energy, because it directly reduces the energy requirement: a machine that originally cleaned 80 square meters on one charge can evolve to clean 180 square meters or more on one charge. The service life of the battery also increases greatly as the number of charges decreases, so the frequency with which the user must replace the battery decreases as well. More intuitively and importantly, dust pick-up capability is the most obvious and important aspect of user experience: the user can immediately judge whether the sweeping/mopping is clean. The dry cleaning system may also include a side brush with a rotating shaft angled relative to the floor, for moving debris into the roller brush region of the cleaning system.
The energy system includes a rechargeable battery, such as a nickel-metal hydride battery or a lithium battery. The rechargeable battery can be connected to a charging control circuit, a battery pack charging temperature detection circuit, and a battery under-voltage monitoring circuit, which in turn are connected to a single-chip microcontroller control circuit. The host charges by connecting to the charging pile through charging electrodes arranged on the side or bottom of the fuselage.
The human-machine interaction system includes keys on the host panel for the user to select functions; it may also include a display screen and/or indicator lights and/or a speaker, which show the user the current machine state or function options; and it may also include a mobile phone client program. For a path-navigation-type cleaning device, the mobile phone client can show the user a map of the environment in which the device is located along with the machine's position, providing richer and more user-friendly function items.
The robot provided by the present disclosure is equipped with an image acquisition unit and a ranging unit; the image acquisition unit collects image data, and the ranging unit collects ranging data. The image acquisition unit and the ranging unit may be included in the position determining device of the above perception system. For example, the image acquisition unit may be a camera, and the ranging unit may be a laser distance sensor. Alternatively, the image acquisition unit and the ranging unit may be integrated into one camera; for example, a depth camera with a TOF (Time of Flight) function or a camera using 3D structured-light technology may be used. Of course, the present disclosure does not limit the specific hardware form of the image acquisition unit and the ranging unit.
Based on the above robot structure, the present disclosure provides a robot localization method. As shown in Fig. 2, the method may include the following steps:
In step 202, history image data matching the current image data is determined from the current image data collected by the image acquisition unit, the history image data having been collected by the image acquisition unit at a historical moment.
In this embodiment, the image data collected at a "historical moment" can be understood as image data collected during a previous run of the robot (i.e., before the current time, while the robot was moving). Taking an automatic cleaning device as an example (of course, it is not limited to automatic cleaning devices and may be any other robot with autonomous movement capability), the image data collected by the automatic cleaning device during its first cleaning run can be used as the history image data; alternatively, the image data collected before the current cleaning run (i.e., during history cleaning runs) can be used as the history image data. It should be noted that the localization scheme of the present disclosure applies to the automatic cleaning device cleaning the same environment.
In this embodiment, in one case, "matching" can be understood as the similarity (or matching degree) between the current image data and the history image data exceeding a certain threshold. In another case, "matching" can be understood as the current image data and the history image data containing one or more of the same photographed objects. A sketch of both criteria is shown below.
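The following minimal sketch illustrates the two notions of "matching" under stated assumptions: the threshold value is hypothetical, similarity is computed here as normalized cross-correlation (one of several options the patent leaves open), and object labels are assumed to come from a separate recognizer.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # hypothetical preset threshold

def similarity(current: np.ndarray, history: np.ndarray) -> float:
    """Normalized cross-correlation between two equal-sized grayscale frames."""
    a = current.astype(float) - current.mean()
    b = history.astype(float) - history.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def is_match(current: np.ndarray, history: np.ndarray,
             current_objects: set[str], history_objects: set[str]) -> bool:
    # Case 1: overall similarity exceeds the preset threshold.
    if similarity(current, history) > SIMILARITY_THRESHOLD:
        return True
    # Case 2: both frames contain at least one of the same photographed objects.
    return bool(current_objects & history_objects)
```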
In step 204, the history pose information of the robot at the time the history image data was collected is obtained.
In this embodiment, while the robot collects image data during movement, it records its own pose information at that moment and establishes a mapping between the image data and the pose information. The pose information may include parameters describing the relative position between the robot and the photographed object (i.e., the object photographed by the image acquisition unit, from which the image acquisition unit obtains the image data of that object), such as the distance and angle between them and the robot's attitude. Taking an automatic cleaning device equipped with an LDS and a camera as an example (i.e., the LDS serves as the ranging unit and the camera as the image acquisition unit), the camera and the LDS work simultaneously to collect the corresponding data. For example, during cleaning at a "historical moment", while collecting image data with the camera, the automatic cleaning device uses the ranging data collected by the LDS to build, via the SLAM algorithm, a map of the environment in which the cleaning operation is performed, and determines its current position information in that map; meanwhile, it collects its current attitude information from other sensing devices (e.g., gyroscope, accelerometer, electronic compass), and then determines the pose information from the position information and the attitude information. A sketch of this image-to-pose mapping follows.
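A minimal sketch of the image-to-pose mapping, under assumed names (the `Pose` record, the keying scheme, and the in-memory dict stand in for whatever storage the preset database actually uses):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    x: float      # position in the SLAM-built map (m)
    y: float
    theta: float  # attitude from gyroscope/accelerometer/compass (rad)

# Preset database: each history frame is stored with the pose at collection time.
history_db: dict[int, tuple[np.ndarray, Pose]] = {}

def record(frame_id: int, frame: np.ndarray, pose: Pose) -> None:
    """Establish the image-to-pose mapping while the robot moves."""
    history_db[frame_id] = (frame, pose)

def history_pose(frame_id: int) -> Pose:
    """Step 204: look up the pose mapped to a matched history frame."""
    return history_db[frame_id][1]
```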
It should be noted that, as can be seen from the above description of "matching", there may be multiple pieces of history image data matching a given piece of image data, with the multiple pieces corresponding to different angles, positions, attitudes, etc. For example, during cleaning, the automatic cleaning device may photograph the same tea table at different angles, distances, and attitudes.
In step 206, the current pose information of the robot is determined from the ranging data currently collected by the ranging unit.
In this embodiment, the current pose information of the robot can be determined from the ranging data and the robot's current attitude information. Taking an automatic cleaning device equipped with an LDS as an example, the automatic cleaning device uses the ranging data collected by the LDS to build, via the SLAM algorithm, a map of the environment in which the cleaning operation is performed, and determines its current position information in that map. Meanwhile, it collects its current attitude information from other sensing devices (e.g., gyroscope, accelerometer, electronic compass), and then determines the current pose information from the position information and the attitude information.
In step 208, the current position of the robot is localized based on the history pose information and the current pose information.
In this embodiment, since the robot collects image data via the image acquisition unit during movement and maps it to the pose information at the time of collection, during the robot's subsequent movement the history pose information corresponding to the history image data matching the current image data can serve as a reference and, together with the determined current pose information, be used to localize the robot's current position, improving localization accuracy and further improving the robot's working efficiency.
As can be seen from step 202, there may be multiple pieces of history image data matching the current image data; in other words, there may correspondingly be multiple pieces of history pose information obtained in step 204. The multiple obtained pieces of history pose information and the current pose information (determined in step 206) can then be used for localization in the following two ways:
In one embodiment, the target history pose information matching the current pose information can first be determined among the history pose information, and the current position of the robot in the map constructed from the ranging unit's data can then be localized according to the target history pose information.
In another embodiment, a three-dimensional environment map can first be obtained, the three-dimensional environment map being built from the history image data and the history pose information; the pose information corresponding to the current image data is determined based on the three-dimensional environment map; and the current position of the robot in the map constructed from the ranging unit's data (built from all the ranging data collected by the ranging unit) is then localized according to the determined pose information. The three-dimensional map may be generated in real time during localization or generated in advance. In other words, in one case, the three-dimensional environment map is built in advance from the collected history image data and the history pose information; in the other case, the three-dimensional environment map is built from the history image data and the history pose information after the history image data matching the current image data has been determined.
In the robot localization scheme provided by the present disclosure, when a kidnapping event occurs to the robot, the position of the robot after the kidnapping event can be accurately relocalized. As an exemplary embodiment, based on the embodiment shown in Fig. 2, the method may further include, before step 202: determining whether a kidnapping event has occurred to the robot. Then, when a kidnapping event has occurred, the current image data in step 202 should be understood as the image data collected by the image acquisition unit after the kidnapping event; similarly, the ranging data currently collected by the ranging unit in step 206 should be understood as the ranging data collected by the ranging unit after the kidnapping event, i.e., the current pose information should be understood as the current pose information determined from the ranging data collected by the ranging unit after the kidnapping event. Therefore, when a kidnapping event occurs, the localization operation performed in step 208 may further include: when it is determined that a kidnapping event has occurred to the robot, localizing the position of the robot after the kidnapping event based on the history pose information and the current pose information. For how the localization is specifically performed, refer to the relevant description of step 208 above, which is not repeated here.
As for the condition for determining that a kidnapping event has occurred, reference can be made to the data collected by the image acquisition unit and the ranging unit of the robot. For example, when a kidnapping event occurs, the image data collected by the image acquisition unit changes abruptly, and so does the ranging data collected by the ranging unit. Therefore, as an exemplary embodiment, when the image data collected by the image acquisition unit and/or the ranging data collected by the ranging unit changes abruptly, it can be determined that a kidnapping event has occurred to the robot. As can be seen, adding the change in the image data collected by the image acquisition unit as a basis for determining whether a kidnapping event has occurred enables accurate detection of kidnapping events, which aids subsequent relocalization. A sketch of this abrupt-change test is shown below.
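A minimal sketch of the abrupt-change criterion, assuming consecutive camera frames and LDS scans as NumPy arrays; the jump thresholds are illustrative, not from the patent:

```python
import numpy as np

IMAGE_JUMP = 0.5  # hypothetical threshold: fraction of full-scale change
RANGE_JUMP = 0.3  # hypothetical threshold: relative change in scan distances

def kidnapped(prev_frame: np.ndarray, frame: np.ndarray,
              prev_scan: np.ndarray, scan: np.ndarray) -> bool:
    """Flag a kidnapping event when consecutive camera frames and/or
    consecutive LDS scans differ far more than normal motion allows."""
    image_change = np.abs(frame.astype(float) - prev_frame.astype(float)).mean() / 255.0
    range_change = np.abs(scan - prev_scan).mean() / max(float(scan.mean()), 1e-6)
    return image_change > IMAGE_JUMP or range_change > RANGE_JUMP
```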
In the robot localization scheme provided by the present disclosure, the current pose information (determined in step 206) can be verified and, when found erroneous, corrected. As an exemplary embodiment, based on the embodiment shown in Fig. 2, it can be determined whether the current pose information is erroneous and, when it is, the current pose information can be corrected using the history pose information, which helps improve the accuracy of the map constructed from the ranging unit's data.
In one embodiment, when the ranging unit is currently occluded or the robot is slipping, it can be determined that the current pose information is erroneous; in another embodiment, when the current pose information matches none of the history pose information, it can be determined that the current pose information is erroneous. A sketch combining both checks follows.
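A sketch of the verify-and-correct step, reusing the hypothetical `Pose` record from the database sketch above; the tolerances and the nearest-pose correction rule are assumptions, and the angle comparison ignores wraparound for brevity:

```python
def verify_and_correct(current_pose, history_poses, occluded: bool, slipping: bool,
                       pos_tol: float = 0.3, ang_tol: float = 0.2):
    """Treat the LDS-derived pose as erroneous if the sensor is occluded,
    the wheels slip, or no history pose agrees with it; then fall back
    to the history pose closest to the suspect current pose."""
    def agrees(h) -> bool:
        return (abs(h.x - current_pose.x) < pos_tol and
                abs(h.y - current_pose.y) < pos_tol and
                abs(h.theta - current_pose.theta) < ang_tol)

    erroneous = occluded or slipping or not any(agrees(h) for h in history_poses)
    if not erroneous or not history_poses:
        return current_pose
    return min(history_poses,
               key=lambda h: (h.x - current_pose.x) ** 2 + (h.y - current_pose.y) ** 2)
```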
In the embodiment shown in Fig. 2, localization accuracy is improved by using as a reference the history pose information corresponding to the history image data matching the current image data. Clearly, it is essential that the history image data and history pose information reflect the actual position of the robot. Therefore, the history image data and history pose information can be maintained in the following ways.
In one embodiment, during movement in which the robot performs a specific operation (taking an automatic cleaning device as an example, the movement is the cleaning run of the automatic cleaning device), history image data matching the image data collected by the image acquisition unit is searched for, and the corresponding match count is recorded. Based on the match-count statistics, when the ratio of the match count of any history image data to the number of times the specific operation has been performed (taking an automatic cleaning device as an example, the number of cleaning runs) is below a preset threshold, that history image data and the robot's pose information corresponding to the time it was collected can be deleted, as in the sketch below.
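A minimal sketch of this pruning rule, under assumed names: the ratio threshold is hypothetical, `database` is the image-to-pose store sketched earlier, and `match_counts` is assumed to be maintained during each run.

```python
MIN_MATCH_RATIO = 0.2  # hypothetical preset threshold

def prune(database: dict, match_counts: dict, runs: int) -> None:
    """Delete rarely matched history records, e.g. images of furniture that
    has since been moved; they no longer reflect the environment."""
    if runs <= 0:
        return
    for key in list(database):
        if match_counts.get(key, 0) / runs < MIN_MATCH_RATIO:
            del database[key]  # removes both the image and its paired pose
```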
In another embodiment, all history image data and history pose information can be stored in a preset database. Upon receiving an update instruction for the preset database, the cleaned region can be determined from the map the robot has constructed during movement (for example, built from the ranging unit's data), and image data and corresponding pose information can be collected in the cleaned region via the image acquisition unit to update the preset database. The update instruction may be issued to the robot by the user through a mobile terminal (which establishes a communication connection with the robot), or generated by the robot according to a preset update cycle.
In this embodiment, since the robot captures images of the cleaned environment during movement, the cleaning strategy of the robot (for example, a sweeping robot) can be adjusted according to the scene type of the cleaned environment, improving cleaning efficiency and user experience. For example, the robot is configured with corresponding cleaning strategies for different scene types; then, while the robot performs cleaning operations, the corresponding cleaning strategy can be adopted based on the scene recognition result for the image data collected by the image acquisition unit, as sketched below.
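A sketch of the scene-to-strategy lookup; the scene labels and strategy fields are hypothetical examples, since the patent does not enumerate concrete scene types or strategies:

```python
# Hypothetical scene-to-strategy table; labels and fields are illustrative.
CLEANING_STRATEGIES = {
    "carpet":  {"suction": "max",    "mop": False},
    "kitchen": {"suction": "high",   "mop": True},
    "bedroom": {"suction": "normal", "mop": False},
}
DEFAULT_STRATEGY = {"suction": "normal", "mop": False}

def strategy_for(scene_label: str) -> dict:
    """Pick the cleaning strategy matching the scene recognized in the
    camera's image data, falling back to a default for unknown scenes."""
    return CLEANING_STRATEGIES.get(scene_label, DEFAULT_STRATEGY)
```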
For ease of understanding, the localization scheme of the present disclosure is described in detail below, taking the robot being an automatic cleaning device as an example (of course, it is not limited to automatic cleaning devices and may be any other robot with autonomous movement capability), with reference to the drawings and concrete scenarios.
Referring to Fig. 3, Fig. 3 is a flowchart of another robot localization method, according to an exemplary embodiment. As shown in Fig. 3, the method is applied to an automatic cleaning device (equipped with a camera and an LDS, i.e., the image acquisition unit is the camera and the ranging unit is the LDS) and may include the following steps:
In step 302, current image data is collected.
In step 304, history image data matching the current image data is determined.
In this embodiment, the automatic cleaning device may use the image data collected during its first cleaning run as the history image data; alternatively, it may use the image data collected before the current cleaning run (i.e., during history cleaning runs of the same environment) as the history image data. Furthermore, the history image data and the device's own pose information at the time of collection are mapped to each other and stored in a preset database. The pose information may include parameters describing the relative position between the automatic cleaning device and the photographed object (i.e., the object photographed by the camera, from which the camera obtains the image data of that object), such as the distance and angle between them and the robot's attitude. For example, the camera and the LDS are both in working state during cleaning, and the pose information includes position information and attitude information. Then, while collecting image data with the camera, the automatic cleaning device uses the ranging data collected by the LDS to build, via the SLAM algorithm, a map of the environment in which the cleaning operation is performed, and determines its current position information in that map; meanwhile, it collects its attitude information from other sensing devices (e.g., gyroscope, accelerometer, electronic compass), and then determines the pose information from the position information and the attitude information.
It should be noted that, when determining the history image data matching the current image data: in one case, "matching" can be understood as the similarity (or matching degree) between the current image data and the history image data exceeding a certain threshold. For example, an image matching algorithm (MAD, i.e., mean absolute differences; sum of absolute differences; normalized cross-correlation; etc.) or a machine learning model can be used to determine, among all history image data in the preset database, the image data whose similarity with the current image data exceeds the preset threshold, as the "history image data matching the current image data" in step 304; a sketch follows after this paragraph. In another case, "matching" can be understood as the current image data and the history image data containing one or more of the same photographed objects. For example, the objects contained in the current image data (e.g., cups, tea tables, televisions) can be recognized (for example, by a neural-network-based image recognition method, a wavelet-moment-based image recognition method, etc.), and the image data in the preset database that also contains the recognized objects (there may be multiple; the requirement can be configured as containing all of the objects or only some of them) is used as the "history image data matching the current image data" in step 304. In yet another case, both of the above conditions can serve as the basis for determining a "match". For example, when the similarity between the current image data and the history image data exceeds a certain threshold and/or the two contain one or more of the same photographed objects, the two can be determined to "match".
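A minimal sketch of the MAD criterion named above (lower scores mean more similar); the threshold and the dict-shaped database are assumptions:

```python
import numpy as np

def mad(current: np.ndarray, history: np.ndarray) -> float:
    """Mean absolute difference between two equal-sized grayscale frames."""
    return float(np.abs(current.astype(np.int16) - history.astype(np.int16)).mean())

def best_matches(current: np.ndarray, database: dict, max_mad: float = 20.0) -> list:
    """Return the keys of all history frames whose MAD score passes the
    (hypothetical) threshold, i.e. the candidate set for step 304."""
    return [k for k, (frame, _pose) in database.items() if mad(current, frame) <= max_mad]
```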
In step 306, the history pose information of the automatic cleaning device at the time the history image data was collected is obtained.
In this embodiment, the history pose information corresponding to the history image data determined in step 304 can be looked up in the preset database based on the established mapping relationship.
In step 308, the current pose information of the automatic cleaning device is determined according to the ranging data currently collected by the LDS.
In this embodiment, the automatic cleaning device can use the ranging data collected by the LDS to build, via a SLAM algorithm, a map of the environment in which the cleaning operation is performed, and determine its own current position in that map. Meanwhile, it acquires its own current posture information from other sensing devices (for example, a gyroscope, an accelerometer, an electronic compass, etc.), and determines the current pose information from the position information and the posture information.
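One plausible shape for the current pose information of step 308, combining the SLAM position with the IMU heading, is sketched below; the field layout is an assumption:

    import math
    from dataclasses import dataclass

    @dataclass
    class Pose:
        x: float    # position in the SLAM map, from LDS ranging data
        y: float
        yaw: float  # heading, from gyroscope/electronic compass

    def current_pose(slam_xy: tuple, imu_yaw: float) -> Pose:
        # Normalize the heading into (-pi, pi] and fuse it with the
        # position estimated from the ranging data.
        yaw = math.atan2(math.sin(imu_yaw), math.cos(imu_yaw))
        return Pose(x=slam_xy[0], y=slam_xy[1], yaw=yaw)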
In step 310, the target history pose information matching the current pose information is determined from among the history pose information.
In this embodiment, there may be multiple pieces of history image data matching the current image data, so the history pose information obtained in step 306 may likewise comprise multiple pieces. For example, the multiple pieces of history image data obtained in step 304 may all contain the same tea table, while the shooting distance, shooting angle, posture, etc. of the automatic cleaning device differed when each piece of history image data was shot.
When determining the target history pose information matching the current pose information, each piece of history pose information obtained in step 306 can be compared with the current pose information one by one, and the history pose information that is close to, or even identical to, the current pose information is taken as the target history pose information. For example, among the history pose information obtained in step 306, the history pose information whose distance differs from the distance contained in the current pose information by less than a preset distance threshold, and whose angle differs from the angle contained in the current pose information by less than a preset angle threshold, may be taken as the target history pose information, as in the sketch below.
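A minimal sketch of step 310 under the distance/angle criterion above; poses are reduced to (x, y, yaw) tuples and both thresholds are assumed values:

    import math

    def match_target_history(current: tuple, history: list,
                             dist_thresh: float = 0.3,
                             angle_thresh: float = math.radians(15)) -> list:
        # Return indices of history poses close to the current pose.
        cx, cy, cyaw = current
        matches = []
        for i, (hx, hy, hyaw) in enumerate(history):
            dist = math.hypot(hx - cx, hy - cy)
            # Wrap the angular difference into (-pi, pi].
            dyaw = math.atan2(math.sin(hyaw - cyaw), math.cos(hyaw - cyaw))
            if dist < dist_thresh and abs(dyaw) < angle_thresh:
                matches.append(i)
        return matches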
In step 312, the current position of the automatic cleaning device in the map constructed from the distance measuring unit is located according to the target history pose information.
In this embodiment, the automatic cleaning device can use a SLAM algorithm to construct a map of the environment in which the cleaning operation is performed, and then locate its current position according to the target history pose information. The automatic cleaning device can then determine the route for performing the cleaning operation according to the map and the current position.
Taking as a reference the history pose information corresponding to the history image data that matches the current image data thus improves localization accuracy. It follows that whether the history image data and history pose information reflect the actual position of the automatic cleaning device is crucial. Therefore, the history image data and history pose information in the preset database can be maintained in the following ways.
In one embodiment, during each cleaning run of the automatic cleaning device, the history image data matching the image data acquired by the camera can be looked up, and the corresponding number of matches counted. Based on these statistics, when the ratio of the number of matches of any piece of history image data to the number of cleaning runs (the process from starting to clean the required environment to finishing it may be understood as one cleaning run, i.e., one recorded cleaning) falls below a preset threshold, that piece of history image data, together with the pose information of the automatic cleaning device at the time it was collected, can be deleted.
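A sketch of this pruning rule, assuming each database entry carries a running match count; the entry layout and the 10% threshold are assumptions:

    def prune_history(entries: list, total_runs: int,
                      ratio_thresh: float = 0.1) -> list:
        # Keep only entries whose match-count-to-cleaning-run ratio stays
        # at or above the preset threshold; the rest are deleted.
        # Each entry is assumed to be {"image": ..., "pose": ...,
        # "match_count": int}; total_runs is assumed >= 1.
        return [e for e in entries
                if e["match_count"] / total_runs >= ratio_thresh]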
In another embodiment, upon receiving an update instruction for the preset database, an update region can be determined according to the map constructed by the automatic cleaning device during cleaning (for example, the map built from the LDS); the automatic cleaning device can then acquire image data and corresponding pose information in that update region through the camera to update the preset database. For example, for the same teacup, the automatic cleaning device rotates so as to shoot the teacup at different angles and postures while recording the corresponding pose information. The update instruction may be issued to the automatic cleaning device by a user through a mobile terminal or a server (which establishes a communication connection with the automatic cleaning device); alternatively, the update instruction may be generated by the automatic cleaning device itself according to a preset update cycle.
Referring to Fig. 4, Fig. 4 is a flowchart of a localization method of yet another robot according to an exemplary embodiment. As shown in Fig. 4, the method is applied to an automatic cleaning device configured with a camera and an LDS (i.e., the image acquisition unit is the camera and the distance measuring unit is the LDS), and may include the following steps:
In step 402, current image data is acquired.
In step 404, history image data matching the current image data is determined.
In step 406, the history pose information of the automatic cleaning device at the time the history image data was collected is obtained.
In this embodiment, the specific process of steps 402-406 is similar to steps 302-306 above and is not repeated here.
In step 408, a three-dimensional environment map is obtained.
In this embodiment, the three-dimensional environment map is constructed from the history image data obtained in step 404 and the history pose information obtained in step 406. In one case, the three-dimensional environment map may have been constructed at the time the history image data and the history pose information were collected; in other words, the three-dimensional map was already built when they were collected, and executing step 408 simply reads that map. In another case, the three-dimensional environment map is constructed from the history image data and the history pose information only after the history image data matching the current image data has been determined; in other words, the three-dimensional map is built only after the history image data has been obtained in step 404 and the history pose information in step 406.
In step 410, the pose information corresponding to the current image data is determined based on the three-dimensional environment map.
In this embodiment, the pose information corresponding to the current image data can be determined from the three-dimensional environment map by a PnP (Perspective-n-Point) algorithm, as sketched below. Of course, other algorithms for calculating pose information may also be used; the present disclosure places no limitation on this.
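For illustration, a minimal PnP sketch using OpenCV's solvePnP; the 3D points, their detected 2D projections, and the camera intrinsics are placeholder values, and establishing the 2D-3D correspondences (e.g., by feature matching against the three-dimensional environment map) is assumed to have happened already:

    import cv2
    import numpy as np

    # Placeholder 3-D map points and their projections in the current image.
    object_points = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                              [0.5, 0.5, 0.0], [0.0, 0.5, 0.0],
                              [0.0, 0.0, 0.5], [0.5, 0.0, 0.5]])
    image_points = np.array([[320.0, 240.0], [420.0, 242.0], [418.0, 330.0],
                             [322.0, 328.0], [318.0, 150.0], [422.0, 152.0]])
    K = np.array([[600.0, 0.0, 320.0],   # assumed camera intrinsics
                  [0.0, 600.0, 240.0],
                  [0.0, 0.0, 1.0]])

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)       # world-to-camera rotation
        camera_position = -R.T @ tvec    # camera (robot) position in map frame
        print("estimated position:", camera_position.ravel())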
In step 412, the current position of the automatic cleaning device in the map constructed from the LDS is located according to the determined pose information.
In this embodiment, the map can be constructed from all the ranging data collected by the LDS during the current cleaning run (including the currently collected ranging data). After the current position is determined, the automatic cleaning device can further determine the route for performing the cleaning operation according to the map and the current position.
As can be seen from the embodiments shown in Figs. 3-4 above, the automatic cleaning device acquires image data through the image acquisition unit during cleaning and establishes a mapping relationship between that image data and the pose information at the time of acquisition. In subsequent cleaning runs, the history pose information corresponding to the history image data matching the current image data can then serve as a reference and, together with the determined current pose information, as the basis for locating the current position of the automatic cleaning device, thereby improving localization accuracy and further boosting the cleaning efficiency of the automatic cleaning device.
Referring to Fig. 5, Fig. 5 is a flowchart of a relocation method of a robot according to an exemplary embodiment. As shown in Fig. 5, the method is applied to an automatic cleaning device configured with a camera and an LDS (i.e., the image acquisition unit is the camera and the distance measuring unit is the LDS), and may include the following steps:
In step 502, the currently collected data is obtained.
In this embodiment, the collected data includes the image data currently collected by the camera and the ranging data currently collected by the LDS.
In step 504, it is judged whether the obtained data has changed abruptly; if so, proceed to step 506; otherwise return to step 502.
In this embodiment, when a kidnapping event occurs to the automatic cleaning device (a kidnapping event can be understood as the automatic cleaning device not moving according to the normal route or speed of the cleaning operation, for example being forcibly moved away from its cleaning position by a user during cleaning), the image data acquired by the camera will change abruptly, and the ranging data collected by the LDS will also change abruptly. Therefore, whether a kidnapping event has occurred can be judged by whether the currently acquired image data and/or ranging data changes abruptly. It can be seen that, by adding the change in the camera's acquired image data as a basis for determining whether a kidnapping event has occurred, accurate detection of kidnapping events can be achieved, which facilitates subsequent relocation. Here, an abrupt change in image data can be understood as an abrupt change, in the temporal sequence, of the information contained in adjacent image frames; for example, the proportion of identical content between adjacent frames changing by more than a certain threshold, or adjacent frames containing no identical photographed object, etc. Of course, the present disclosure does not limit the algorithm used to identify whether the image data changes abruptly. Similarly, an abrupt change in ranging data can be understood as an abrupt change between temporally adjacent ranging data (a single piece of ranging data may contain ranging measurements at all angles around the automatic cleaning device); again, the present disclosure does not limit the algorithm used to identify whether the ranging data changes abruptly.
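A minimal sketch of the abrupt-change test in step 504; the pixel-difference and scan-difference criteria, along with all thresholds, are assumptions (the disclosure leaves the algorithm open):

    import numpy as np

    def image_jump(prev: np.ndarray, curr: np.ndarray,
                   frac_thresh: float = 0.5) -> bool:
        # Abrupt image change: more than frac_thresh of the pixels differ
        # noticeably between adjacent frames.
        changed = np.mean(np.abs(curr.astype(float) - prev.astype(float)) > 25)
        return changed > frac_thresh

    def scan_jump(prev: np.ndarray, curr: np.ndarray,
                  mean_thresh: float = 0.4) -> bool:
        # Abrupt ranging change between adjacent 360-degree LDS scans,
        # measured as the mean absolute range difference (in meters).
        return float(np.mean(np.abs(curr - prev))) > mean_thresh

    def kidnapped(prev_img, curr_img, prev_scan, curr_scan) -> bool:
        # A kidnapping event is suspected when the image data and/or the
        # ranging data changes abruptly.
        return image_jump(prev_img, curr_img) or scan_jump(prev_scan, curr_scan)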
For example, continuing the Fig. 1 example from the related art: even if position A and position B are similar in terms of ranging data (i.e., no abrupt change occurs), as long as the surroundings of position A and position B differ (for example, the sweeping robot 100 can capture the television 101 with its camera at position A but not at position B), the image data acquired by the camera will change abruptly while the sweeping robot 100 is moved from position A to position B (this can be understood as a large difference between the image data collected at position A and that collected at position B). The sweeping robot 100 can thereby be judged to have undergone a kidnapping event, instead of reaching the conclusion of the related art (that no kidnapping event occurred).
In step 506, history image data matching the current image data is determined.
In this embodiment, when a kidnapping event has occurred, the current image data collected in step 502 should be understood as the image data acquired by the camera after the kidnapping event; likewise, the ranging data currently collected by the LDS in step 502 should be understood as the ranging data collected by the LDS after the kidnapping event.
In step 508, the history pose information of the automatic cleaning device at the time the history image data was collected is obtained.
In step 510, the pose information of the automatic cleaning device after being kidnapped is obtained.
In step 512, the target history pose information matching the post-kidnapping pose information is determined from among the history pose information.
In this embodiment, those skilled in the art will understand that, when a kidnapping event has occurred, the post-kidnapping pose information of the automatic cleaning device is the current pose information determined from the ranging data currently collected by the LDS in step 502.
In step 514, the position of the automatic cleaning device after being kidnapped is located according to the target history pose information.
In this embodiment, those skilled in the art will understand that, when a kidnapping event has occurred, step 514 is similar to step 312 above, i.e., the current position in the map constructed from the distance measuring unit is located. After determining the post-kidnapping position, the automatic cleaning device can re-plan the route for performing the cleaning operation according to the constructed map and the post-kidnapping position.
It should be noted that steps 506-514 follow the same principle as steps 304-312 above and are not repeated here.
Referring to Fig. 6, Fig. 6 is a flowchart of a relocation method of another robot according to an exemplary embodiment. As shown in Fig. 6, the method is applied to an automatic cleaning device configured with a camera and an LDS (i.e., the image acquisition unit is the camera and the distance measuring unit is the LDS), and may include the following steps:
In step 602, the currently collected data is obtained.
In this embodiment, the collected data includes the image data currently collected by the camera and the ranging data currently collected by the LDS.
In step 604, it is judged whether the obtained data has changed abruptly; if so, proceed to step 606; otherwise return to step 602.
In this embodiment, steps 602-604 are similar to steps 502-504 above and are not repeated here.
In step 606, history image data matching the current image data is determined.
In this embodiment, when a kidnapping event has occurred, the current image data collected in step 602 should be understood as the image data acquired by the camera after the kidnapping event; likewise, the ranging data currently collected by the LDS in step 602 should be understood as the ranging data collected by the LDS after the kidnapping event.
In step 608, the history pose information of the automatic cleaning device at the time the history image data was collected is obtained.
In step 610, the corresponding three-dimensional environment map is obtained.
In step 612, the pose information corresponding to the current image data is determined based on the three-dimensional environment map.
In step 614, the position of the automatic cleaning device after being kidnapped is located according to the determined pose information.
In this embodiment, those skilled in the art will understand that, when a kidnapping event has occurred, the current position is the post-kidnapping position, i.e., step 614 is similar to step 412 above. After determining the post-kidnapping position, the automatic cleaning device can re-plan the route for performing the cleaning operation according to the constructed map and the post-kidnapping position.
It should be noted that steps 606-614 follow the same principle as steps 404-412 above and are not repeated here. As can be seen, by configuring an image acquisition unit on an automatic cleaning device that has a distance measuring unit, the present disclosure enables the device to acquire image data during cleaning and to establish a mapping relationship with the pose information at the time of acquisition. In subsequent cleaning runs, the history pose information corresponding to the history image data matching the current image data can serve as a reference and, together with the current pose information determined from the distance measuring unit, as the basis for locating the current position of the automatic cleaning device, thereby improving the accuracy of self-localization and further boosting the cleaning efficiency of the automatic cleaning device.
As can be seen from the embodiments shown in Figs. 5-6 above, adding the change in the image data acquired by the camera as a basis for determining whether a kidnapping event has occurred to the automatic cleaning device enables accurate detection of kidnapping events and thus facilitates subsequent relocation. Referring to Fig. 7, Fig. 7 is a flowchart of verifying the current pose information according to an exemplary embodiment. As shown in Fig. 7, the method is applied to the automatic cleaning device in any of the above embodiments and may include the following steps:
In step 702, the current pose information is obtained.
In this embodiment, the current pose information is the current pose information in any of the above embodiments (i.e., the current pose information of the automatic cleaning device determined from the ranging data currently collected by the LDS).
In step 704, it is judged whether the current pose information is erroneous; if an error exists, proceed to step 706; otherwise return to step 702.
In this embodiment, in one case, when the LDS is currently blocked (for example, the ranging data of the LDS remains unchanged) or the automatic cleaning device slips (for example, the data of the accelerometer and the odometer of the automatic cleaning device do not match), the current pose information can be determined to be erroneous; in another case, when the current pose information does not match any of the history pose information, the current pose information can be determined to be erroneous.
In step 706, the history pose information is obtained.
In this embodiment, the history pose information is the history pose information in any of the above embodiments (for example, the history pose information in step 306).
In step 708, the current pose information is corrected using the history pose information.
In this embodiment, when the current pose information is determined to be erroneous, correcting it with the history pose information helps improve the accuracy of the map constructed from the LDS, as sketched below.
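A sketch of the two error tests in step 704 and the simple correction of step 708; all tolerances, and the correction strategy of directly substituting the matching history pose, are assumptions:

    import numpy as np

    def pose_error(scans: list, accel_speed: float, odom_speed: float,
                   scan_eps: float = 1e-3, speed_eps: float = 0.05) -> bool:
        # LDS blocked: successive scans are essentially unchanged.
        blocked = (len(scans) >= 2 and
                   np.max(np.abs(np.asarray(scans[-1]) -
                                 np.asarray(scans[-2]))) < scan_eps)
        # Slipping: accelerometer and odometer speeds disagree.
        slipping = abs(accel_speed - odom_speed) > speed_eps
        return blocked or slipping

    def correct_pose(current: tuple, target_history: tuple) -> tuple:
        # Step 708: fall back to the matching history pose.
        return target_history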
Fig. 8 is a flowchart of an automatic cleaning device performing a cleaning operation according to an exemplary embodiment. As shown in Fig. 8, the method is applied to the automatic cleaning device in any of the above embodiments and may include the following steps:
In step 802, current image data is acquired.
In step 804, the scene type of the current image data is identified.
In this embodiment, a machine learning model for scene recognition (which can be obtained by training on sample data labeled with scene types) may be used for the identification. Alternatively, image data matching the current image data may be looked up in a preset scene type database (which stores image data of various scene types, such as living room, bathroom, bedroom, etc.), and the scene type of the matched image data is taken as the scene type indicated by the current image data. Alternatively, the user may control the automatic cleaning device to move to each scene and capture image data there, labeling the corresponding scene type, so that the device can later recognize the scene type of the environment being cleaned from those labels. Of course, the present disclosure places no limitation on the manner in which the scene type is identified.
In step 806, the corresponding cleaning strategy is looked up.
In this embodiment, since the automatic cleaning device can capture images of the environment being cleaned through the camera during cleaning, its cleaning strategy can be adjusted according to the scene type of that environment, so as to improve cleaning efficiency and the user experience.
For example, for the scene types of the environment being cleaned, cleaning strategies may be configured as shown in Table 1:
Scene type      Cleaning strategy
Living room     Use high-power cleaning mode
Bathroom        Do not clean
Bedroom         Use silent mode
……              ……

Table 1
Of course, Table 1 is only one example of configuring cleaning strategies; the user can flexibly set the specific cleaning strategies according to the actual situation, and the present disclosure places no limitation on this. It can be seen that configuring corresponding cleaning strategies for different scene types can meet users' differing cleaning requirements and helps improve the user experience.
In step 808, cleaning is performed according to the cleaning strategy found, as in the sketch below.
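One way to encode Table 1 for the lookup of steps 806-808 is a simple dictionary; the scene labels, mode names, and fallback are illustrative assumptions:

    CLEANING_STRATEGY = {
        "living_room": "high_power",   # use high-power cleaning mode
        "bathroom": "skip",            # do not clean
        "bedroom": "silent",           # use silent mode
    }

    def pick_strategy(scene_type: str, default: str = "standard") -> str:
        # Fall back to a default mode for scene types not in the table.
        return CLEANING_STRATEGY.get(scene_type, default)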
Corresponding to the foregoing embodiments of the localization method of the automatic cleaning device, the present disclosure further provides embodiments of a positioning apparatus of the automatic cleaning device.
Fig. 9 is a block diagram of a positioning apparatus of a robot according to an exemplary embodiment. Referring to Fig. 9, the robot is configured with an image acquisition unit and a distance measuring unit, and the apparatus includes an image data determination unit 901, a pose acquisition unit 902, a pose determination unit 903 and a positioning unit 904.
The image data determination unit 901 is configured to determine, according to the current image data collected by the image acquisition unit, the history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical time;
the pose acquisition unit 902 is configured to obtain the history pose information corresponding to the robot at the time the history image data was collected;
the pose determination unit 903 is configured to determine the current pose information of the robot according to the ranging data currently collected by the distance measuring unit;
the positioning unit 904 is configured to locate the current position of the robot according to the history pose information and the current pose information.
As shown in Fig. 10, Fig. 10 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, this embodiment further includes a kidnapping event determination unit 905, and the positioning unit 904 includes a first positioning subunit 9041.
The kidnapping event determination unit 905 is configured to determine whether a kidnapping event occurs to the robot;
the first positioning subunit 9041 is configured to, when it is determined that a kidnapping event has occurred to the robot, locate the position of the robot after the kidnapping event according to the history pose information and the current pose information.
As shown in Fig. 11, Fig. 11 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 10, the kidnapping event determination unit 905 includes a kidnapping event determination subunit 9051.
The kidnapping event determination subunit 9051 is configured to determine that a kidnapping event occurs to the robot when the image data acquired by the image acquisition unit and/or the ranging data collected by the distance measuring unit changes abruptly.
As shown in Fig. 12, Fig. 12 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, the positioning unit 904 includes a first determination subunit 9042 and a second positioning subunit 9043.
The first determination subunit 9042 is configured to determine, from among the history pose information, the target history pose information matching the current pose information;
the second positioning subunit 9043 is configured to locate, according to the target history pose information, the current position of the robot in the map constructed from the distance measuring unit.
It should be noted that the structures of the first determination subunit 9042 and the second positioning subunit 9043 in the apparatus embodiment shown in Fig. 12 may also be included in the apparatus embodiment of Fig. 10; the present disclosure is not limited in this respect.
As shown in Fig. 13, Fig. 13 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, the positioning unit 904 includes an acquisition subunit 9044, a second determination subunit 9045 and a third positioning subunit 9046.
The acquisition subunit 9044 is configured to obtain a three-dimensional environment map constructed from the history image data and the history pose information;
the second determination subunit 9045 is configured to determine the pose information corresponding to the current image data based on the three-dimensional environment map;
the third positioning subunit 9046 is configured to locate, according to the determined pose information, the current position of the robot in the map constructed from the distance measuring unit.
Optionally, the three-dimensional environment map is constructed in advance from the collected history image data and the history pose information; alternatively, the three-dimensional environment map is constructed from the history image data and the history pose information after the history image data matching the current image data has been determined.
It should be noted that the structures of the acquisition subunit 9044, the second determination subunit 9045 and the third positioning subunit 9046 in the apparatus embodiment shown in Fig. 13 may also be included in the apparatus embodiment of Fig. 10; the present disclosure is not limited in this respect.
As shown in Fig. 14, Fig. 14 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, this embodiment further includes a judging unit 906 and a correction unit 907.
The judging unit 906 is configured to determine whether the current pose information is erroneous;
the correction unit 907 is configured to correct the current pose information using the history pose information when it is determined that the current pose information is erroneous.
It should be noted that the structures of the judging unit 906 and the correction unit 907 in the apparatus embodiment shown in Fig. 14 may also be included in the apparatus embodiment of Fig. 10; the present disclosure is not limited in this respect.
As shown in Fig. 15, Fig. 15 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 14, the judging unit 906 includes a first judging subunit 9061.
The first judging subunit 9061 is configured to determine that the current pose information is erroneous when the distance measuring unit is currently blocked or the robot slips.
As shown in Fig. 16, Fig. 16 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 14, the judging unit 906 includes a second judging subunit 9062.
The second judging subunit 9062 is configured to determine that the current pose information is erroneous when the current pose information does not match any of the history pose information.
As shown in Fig. 17, Fig. 17 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, this embodiment further includes a statistics unit 908 and a deletion unit 909.
The statistics unit 908 is configured to, during a moving process in which the robot performs a specific operation, look up the history image data matching the image data acquired by the image acquisition unit and count the corresponding number of matches;
the deletion unit 909 is configured to, when the ratio of the number of matches of any piece of history image data to the number of times the specific operation has been performed is less than a preset threshold, delete that piece of history image data together with the pose information of the robot corresponding to the time it was collected.
It should be noted that the structures of the statistics unit 908 and the deletion unit 909 in the apparatus embodiment shown in Fig. 17 may also be included in the apparatus embodiments of Fig. 10 and Fig. 14; the present disclosure is not limited in this respect.
As shown in Fig. 18, Fig. 18 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, all the history image data and history pose information are stored in a preset database; this embodiment further includes an update region determination unit 910 and an update unit 911.
The update region determination unit 910 is configured to, when an update instruction for the preset database is received, determine the update region according to the map constructed by the robot during its movement;
the update unit 911 is configured to acquire image data and corresponding pose information in the update region through the image acquisition unit so as to update the preset database.
It should be noted that the structures of the update region determination unit 910 and the update unit 911 in the apparatus embodiment shown in Fig. 18 may also be included in the apparatus embodiments of Fig. 10, Fig. 14 and Fig. 17; the present disclosure is not limited in this respect.
As shown in Fig. 19, Fig. 19 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, the robot is configured with corresponding cleaning strategies for different scene types; this embodiment further includes a strategy adjustment unit 912.
The strategy adjustment unit 912 is configured to, while the robot performs a cleaning operation, adopt the corresponding cleaning strategy according to the scene recognition result for the image data acquired by the image acquisition unit.
It should be noted that the structure of the strategy adjustment unit 912 in the apparatus embodiment shown in Fig. 19 may also be included in the apparatus embodiments of Fig. 10, Fig. 14, Fig. 17 and Fig. 18; the present disclosure is not limited in this respect.
As shown in Fig. 20, Fig. 20 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, the image data determination unit 901 includes a first image data determination subunit 9011.
The first image data determination subunit 9011 is configured to determine that the history image data matches the current image data when the similarity (or degree of match) between the current image data and the history image data exceeds a preset threshold.
As shown in Fig. 21, Fig. 21 is a block diagram of a positioning apparatus of another robot according to an exemplary embodiment. On the basis of the embodiment shown in Fig. 9, the image data determination unit 901 includes a second image data determination subunit 9012.
The second image data determination subunit 9012 is configured to determine that the history image data matches the current image data when the current image data and the history image data contain one or more of the same photographed objects.
Regarding the apparatuses in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method, and is not elaborated here.
Since the apparatus embodiments substantially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant parts. The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separated, and components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the disclosed solution, which those of ordinary skill in the art can understand and implement without creative effort.
Correspondingly, the present disclosure further provides a robot configured with an image acquisition unit and a distance measuring unit, the robot further including: a processor; and a memory for storing processor-executable instructions; wherein the processor runs the executable instructions to implement the localization method of the robot as described in any of the above embodiments. For example, the method may include: determining, according to the current image data collected by the image acquisition unit, history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical time; obtaining the history pose information corresponding to the robot at the time the history image data was collected; determining the current pose information of the robot according to the ranging data currently collected by the distance measuring unit; and locating the current position of the robot according to the history pose information and the current pose information.
Correspondingly, the present disclosure further provides a terminal including a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for implementing the localization method of the robot as described in any of the above embodiments. For example, the method may include: determining, according to the current image data collected by the image acquisition unit, history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical time; obtaining the history pose information corresponding to the robot at the time the history image data was collected; determining the current pose information of the robot according to the ranging data currently collected by the distance measuring unit; and locating the current position of the robot according to the history pose information and the current pose information.
Fig. 22 is a block diagram of a positioning apparatus 2200 for a robot according to an exemplary embodiment. For example, the apparatus 2200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 22, the apparatus 2200 may include one or more of the following components: a processing component 2202, a memory 2204, a power component 2206, a multimedia component 2208, an audio component 2210, an input/output (I/O) interface 2212, a sensor component 2214, and a communication component 2216.
The processing component 2202 typically controls the overall operation of the apparatus 2200, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 2202 may include one or more processors 2220 to execute instructions so as to perform all or part of the steps of the methods described above. In addition, the processing component 2202 may include one or more modules to facilitate interaction between the processing component 2202 and other components; for example, it may include a multimedia module to facilitate interaction between the multimedia component 2208 and the processing component 2202.
The memory 2204 is configured to store various types of data to support operation of the apparatus 2200. Examples of such data include instructions for any application or method operated on the apparatus 2200, contact data, phonebook data, messages, pictures, video, etc. The memory 2204 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power component 2206 provides power to the various components of the apparatus 2200. The power component 2206 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 2200.
The multimedia component 2208 includes a screen providing an output interface between the apparatus 2200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 2208 includes a front camera and/or a rear camera. When the apparatus 2200 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 2210 is configured to output and/or input audio signals. For example, the audio component 2210 includes a microphone (MIC) configured to receive external audio signals when the apparatus 2200 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may be further stored in the memory 2204 or transmitted via the communication component 2216. In some embodiments, the audio component 2210 also includes a speaker for outputting audio signals.
The I/O interface 2212 provides an interface between the processing component 2202 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, etc. These buttons may include, but are not limited to: a home button, volume buttons, a start button, and a lock button.
The sensor component 2214 includes one or more sensors for providing status assessments of various aspects of the apparatus 2200. For example, the sensor component 2214 may detect the on/off status of the apparatus 2200 and the relative positioning of components (such as the display and keypad of the apparatus 2200), and may also detect a change of position of the apparatus 2200 or of a component of the apparatus 2200, the presence or absence of user contact with the apparatus 2200, the orientation or acceleration/deceleration of the apparatus 2200, and a temperature change of the apparatus 2200. The sensor component 2214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, and may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 2214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 2216 is configured to facilitate wired or wireless communication between the apparatus 2200 and other devices. The apparatus 2200 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 2216 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 2216 also includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 2200 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the above methods.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 2204 including instructions executable by the processor 2220 of the apparatus 2200 to perform the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow its general principles and include common knowledge or conventional techniques in the art not disclosed herein. The specification and examples are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (30)

1. A localization method of a robot, wherein the robot is configured with an image acquisition unit and a distance measuring unit, the method comprising:
determining, according to current image data collected by the image acquisition unit, history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical time;
obtaining history pose information corresponding to the robot at the time the history image data was collected;
determining current pose information of the robot according to ranging data currently collected by the distance measuring unit; and
locating a current position of the robot according to the history pose information and the current pose information.
2. The method according to claim 1, wherein
the method further comprises: determining whether a kidnapping event occurs to the robot; and
the locating the current position of the robot according to the history pose information and the current pose information comprises: when it is determined that a kidnapping event has occurred to the robot, locating, according to the history pose information and the current pose information, a position of the robot after the kidnapping event.
3. The method according to claim 2, wherein the determining whether a kidnapping event occurs to the robot comprises:
when the image data acquired by the image acquisition unit and/or the ranging data collected by the distance measuring unit changes abruptly, determining that a kidnapping event occurs to the robot.
4. The method according to claim 1, wherein the determining, according to the current image data collected by the image acquisition unit, the history image data matching the current image data comprises:
when a similarity between the current image data and the history image data exceeds a preset threshold, determining that the history image data matches the current image data.
5. The method according to claim 1, wherein the determining, according to the current image data collected by the image acquisition unit, the history image data matching the current image data comprises:
when the current image data and the history image data contain one or more of the same photographed objects, determining that the history image data matches the current image data.
6. The method according to claim 1, wherein the locating the current position of the robot according to the history pose information and the current pose information comprises:
determining, from among the history pose information, target history pose information matching the current pose information; and
locating, according to the target history pose information, the current position of the robot in a map constructed from the distance measuring unit.
7. The method according to claim 1, wherein the locating the current position of the robot according to the history pose information and the current pose information comprises:
obtaining a three-dimensional environment map, the three-dimensional environment map being constructed from the history image data and the history pose information;
determining pose information corresponding to the current image data based on the three-dimensional environment map; and
locating, according to the determined pose information, the current position of the robot in a map constructed from the distance measuring unit.
8. The method according to claim 7, wherein the three-dimensional environment map is constructed in advance from the collected history image data and the history pose information; or the three-dimensional environment map is constructed from the history image data and the history pose information after the history image data matching the current image data has been determined.
9. The method according to claim 1, further comprising:
determining whether the current pose information is erroneous; and
when it is determined that the current pose information is erroneous, correcting the current pose information using the history pose information.
10. The method according to claim 9, wherein the determining whether the current pose information is erroneous comprises:
when the distance measuring unit is currently blocked or the robot slips, determining that the current pose information is erroneous.
11. The method according to claim 9, wherein the determining whether the current pose information is erroneous comprises:
when the current pose information does not match any of the history pose information, determining that the current pose information is erroneous.
12. The method according to claim 1, further comprising:
during a moving process in which the robot performs a specific operation, looking up history image data matching the image data acquired by the image acquisition unit, and counting a corresponding number of matches; and
when a ratio of the number of matches of any piece of history image data to the number of times the specific operation has been performed is less than a preset threshold, deleting said history image data together with the pose information of the robot corresponding to the time said history image data was collected.
13. The method according to claim 1, wherein all the history image data and the history pose information are stored in a preset database, the method further comprising:
when an update instruction for the preset database is received, determining an update region according to a map constructed by the robot during its movement; and
acquiring image data and corresponding pose information in the update region to update the preset database.
14. The method according to claim 1, wherein the robot is configured with corresponding cleaning strategies for different scene types, the method further comprising:
while the robot performs a cleaning operation, adopting the corresponding cleaning strategy according to a scene recognition result for the image data acquired by the image acquisition unit.
15. A positioning apparatus of a robot, wherein the robot is configured with an image acquisition unit and a distance measuring unit, the apparatus comprising:
an image data determination unit, configured to determine, according to current image data collected by the image acquisition unit, history image data matching the current image data, the history image data having been collected by the image acquisition unit at a historical time;
a pose acquisition unit, configured to obtain history pose information corresponding to the robot at the time the history image data was collected;
a pose determination unit, configured to determine current pose information of the robot according to ranging data currently collected by the distance measuring unit; and
a positioning unit, configured to locate a current position of the robot according to the history pose information and the current pose information.
16. The apparatus according to claim 15,
further comprising: a kidnapping event determination unit, configured to determine whether a kidnapping event occurs to the robot;
wherein the positioning unit comprises: a first positioning subunit, configured to, when it is determined that a kidnapping event has occurred to the robot, locate, according to the history pose information and the current pose information, a position of the robot after the kidnapping event.
17. The apparatus according to claim 16, wherein the kidnapping event determination unit comprises:
a kidnapping event determination subunit, configured to determine that a kidnapping event occurs to the robot when the image data acquired by the image acquisition unit and/or the ranging data collected by the distance measuring unit changes abruptly.
18. The apparatus according to claim 15, wherein the positioning unit comprises:
a first determination subunit, configured to determine, from among the history pose information, target history pose information matching the current pose information; and
a second positioning subunit, configured to locate, according to the target history pose information, the current position of the robot in a map constructed from the distance measuring unit.
19. The apparatus according to claim 15, wherein the positioning unit comprises:
an acquisition subunit, configured to obtain a three-dimensional environment map, the three-dimensional environment map being constructed from the history image data and the history pose information;
a second determination subunit, configured to determine pose information corresponding to the current image data based on the three-dimensional environment map; and
a third positioning subunit, configured to locate, according to the determined pose information, the current position of the robot in a map constructed from the distance measuring unit.
20. The apparatus according to claim 19, wherein the three-dimensional environment map is constructed in advance from the collected history image data and the history pose information; or the three-dimensional environment map is constructed from the history image data and the history pose information after the history image data matching the current image data has been determined.
21. The apparatus according to claim 15, further comprising:
a judging unit, configured to determine whether the current pose information is erroneous; and
a correction unit, configured to correct the current pose information using the history pose information when it is determined that the current pose information is erroneous.
22. The apparatus according to claim 21, wherein the judging unit comprises:
a first judging subunit, configured to determine that the current pose information is erroneous when the distance measuring unit is currently blocked or the robot slips.
23. The apparatus according to claim 21, wherein the judging unit comprises:
a second judging subunit, configured to determine that the current pose information is erroneous when the current pose information does not match any of the history pose information.
24. The apparatus according to claim 15, further comprising:
a statistics unit, configured to, during a moving process in which the robot performs a specific operation, look up history image data matching the image data acquired by the image acquisition unit and count a corresponding number of matches; and
a deletion unit, configured to, when a ratio of the number of matches of any piece of history image data to the number of times the specific operation has been performed is less than a preset threshold, delete said history image data together with the pose information of the robot corresponding to the time said history image data was collected.
25. The apparatus according to claim 15, wherein all the history image data and the history pose information are stored in a preset database, the apparatus further comprising:
an update region determination unit, configured to, when an update instruction for the preset database is received, determine an update region according to a map constructed by the robot during its movement; and
an update unit, configured to acquire image data and corresponding pose information in the update region through the image acquisition unit so as to update the preset database.
26. The device according to claim 15, wherein the robot is configured with cleaning strategies corresponding to different scene types, the device further comprising:
a strategy adjusting unit, configured to clean with the corresponding cleaning strategy, while the robot performs a cleaning operation, according to the scene recognition result for the image data collected by the image acquisition unit.
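Claim 26 amounts to a lookup from a recognized scene type to a cleaning strategy; a toy table-driven version follows. The scene labels and parameter values are invented, since the patent does not enumerate them.

```python
CLEANING_STRATEGIES = {
    "carpet":   {"suction": "max",    "water": "off"},
    "hardwood": {"suction": "medium", "water": "low"},
    "tile":     {"suction": "medium", "water": "high"},
}

def strategy_for_scene(scene_label: str) -> dict:
    # Fall back to a conservative default for unrecognized scenes.
    return CLEANING_STRATEGIES.get(scene_label,
                                   {"suction": "medium", "water": "off"})
```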
27. The device according to claim 15, wherein the image data determining unit comprises:
a first image data determining subunit, configured to determine that the history image data matches the current image data when the similarity between the current image data and the history image data exceeds a preset threshold.
28. The device according to claim 15, wherein the image data determining unit comprises:
a second image data determining subunit, configured to determine that the history image data matches the current image data when the current image data and the history image data contain one or more of the same collected objects.
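One plausible concrete form of the matching tests in claims 27-28, using ORB features: two images "match" when the fraction of cross-checked descriptor matches exceeds a preset threshold, which also loosely approximates "containing the same collected objects". The patent does not prescribe ORB or any particular similarity measure; this is an assumption.

```python
import cv2

def images_match(img_a, img_b, threshold=0.3):
    orb = cv2.ORB_create(nfeatures=500)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return False  # too little texture in one of the images to decide
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    good = matcher.match(des_a, des_b)
    similarity = len(good) / max(min(len(kp_a), len(kp_b)), 1)
    return similarity > threshold
```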
29. A robot, wherein the robot is configured with an image acquisition unit and a distance measuring unit, the robot further comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor runs the executable instructions to implement the method according to any one of claims 1-14.
30. A computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method according to any one of claims 1-14.
CN201811268293.6A 2018-10-29 2018-10-29 Robot positioning method and device, electronic device and storage medium Active CN109431381B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210535950.9A CN114847803B (en) 2018-10-29 Positioning method and device of robot, electronic equipment and storage medium
CN201811268293.6A CN109431381B (en) 2018-10-29 2018-10-29 Robot positioning method and device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811268293.6A CN109431381B (en) 2018-10-29 2018-10-29 Robot positioning method and device, electronic device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210535950.9A Division CN114847803B (en) 2018-10-29 Positioning method and device of robot, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109431381A true CN109431381A (en) 2019-03-08
CN109431381B CN109431381B (en) 2022-06-07

Family

ID=65550206

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811268293.6A Active CN109431381B (en) 2018-10-29 2018-10-29 Robot positioning method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN109431381B (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109993794A (en) * 2019-03-29 2019-07-09 北京猎户星空科技有限公司 A kind of robot method for relocating, device, control equipment and storage medium
CN110414353A (en) * 2019-06-24 2019-11-05 炬星科技(深圳)有限公司 Robot booting positioning, operation method for relocating, electronic equipment and storage medium
CN111220148A (en) * 2020-01-21 2020-06-02 珊口(深圳)智能科技有限公司 Mobile robot positioning method, system and device and mobile robot
CN111239761A (en) * 2020-01-20 2020-06-05 西安交通大学 Method for indoor real-time establishment of two-dimensional map
CN111443033A (en) * 2020-04-26 2020-07-24 武汉理工大学 Floor sweeping robot carpet detection method
CN111696157A (en) * 2019-03-12 2020-09-22 北京京东尚科信息技术有限公司 Method, system, device and storage medium for determining image relocation
CN111784661A (en) * 2019-12-31 2020-10-16 山东信通电子股份有限公司 Adjusting method, device, equipment and medium of power transmission line detection equipment
CN112013840A (en) * 2020-08-19 2020-12-01 安克创新科技股份有限公司 Sweeping robot and map construction method and device thereof
CN112041634A (en) * 2020-08-07 2020-12-04 苏州珊口智能科技有限公司 Mobile robot positioning method, map building method and mobile robot
CN112212853A (en) * 2020-09-01 2021-01-12 北京石头世纪科技股份有限公司 Robot positioning method and device, and storage medium
WO2021008439A1 (en) * 2019-07-12 2021-01-21 北京石头世纪科技股份有限公司 Automatic cleaning device control method and apparatus, device and medium
CN112418046A (en) * 2020-11-17 2021-02-26 武汉云极智能科技有限公司 Fitness guidance method, storage medium and system based on cloud robot
CN112414391A (en) * 2019-08-20 2021-02-26 北京京东乾石科技有限公司 Robot repositioning method and device
CN112444251A (en) * 2019-08-29 2021-03-05 长沙智能驾驶研究院有限公司 Vehicle driving position determining method and device, storage medium and computer equipment
CN112631303A (en) * 2020-12-26 2021-04-09 北京云迹科技有限公司 Robot positioning method and device and electronic equipment
CN112766023A (en) * 2019-11-04 2021-05-07 北京地平线机器人技术研发有限公司 Target object posture determining method, device, medium and equipment
CN112880691A (en) * 2019-11-29 2021-06-01 北京初速度科技有限公司 Global positioning initialization method and device
CN112987764A (en) * 2021-02-01 2021-06-18 鹏城实验室 Landing method, landing device, unmanned aerial vehicle and computer-readable storage medium
CN113256710A (en) * 2021-05-21 2021-08-13 深圳市慧鲤科技有限公司 Method and device for displaying foresight in game, computer equipment and storage medium
WO2021189784A1 (en) * 2020-03-23 2021-09-30 南京科沃斯机器人技术有限公司 Scenario reconstruction method, system and apparatus, and sweeping robot
CN113554754A (en) * 2021-07-30 2021-10-26 中国电子科技集团公司第五十四研究所 Indoor positioning method based on computer vision
CN114941448A (en) * 2021-02-07 2022-08-26 广东博智林机器人有限公司 Mortar cleaning method, device, system and storage medium
CN115177178A (en) * 2021-04-06 2022-10-14 美智纵横科技有限责任公司 Cleaning method, cleaning device and computer storage medium
WO2023202256A1 (en) * 2022-04-22 2023-10-26 追觅创新科技(苏州)有限公司 Coordinate-based repositioning method and system, and cleaning robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106203341A (en) * 2016-07-11 2016-12-07 百度在线网络技术(北京)有限公司 A kind of Lane detection method and device of unmanned vehicle
CN107340522A (en) * 2017-07-10 2017-11-10 浙江国自机器人技术有限公司 A kind of method, apparatus and system of laser radar positioning
CN107796397A (en) * 2017-09-14 2018-03-13 杭州迦智科技有限公司 A kind of Robot Binocular Vision localization method, device and storage medium
CN107845114A (en) * 2017-11-10 2018-03-27 北京三快在线科技有限公司 Construction method, device and the electronic equipment of map
CN108125622A (en) * 2017-12-15 2018-06-08 珊口(上海)智能科技有限公司 Control method, system and the clean robot being applicable in
CN108544494A (en) * 2018-05-31 2018-09-18 珠海市微半导体有限公司 A kind of positioning device, method and robot based on inertia and visual signature
US20180297207A1 (en) * 2017-04-14 2018-10-18 TwoAntz, Inc. Visual positioning and navigation device and method thereof

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106203341A (en) * 2016-07-11 2016-12-07 百度在线网络技术(北京)有限公司 A kind of Lane detection method and device of unmanned vehicle
US20180297207A1 (en) * 2017-04-14 2018-10-18 TwoAntz, Inc. Visual positioning and navigation device and method thereof
CN107340522A (en) * 2017-07-10 2017-11-10 浙江国自机器人技术有限公司 A kind of method, apparatus and system of laser radar positioning
CN107796397A (en) * 2017-09-14 2018-03-13 杭州迦智科技有限公司 A kind of Robot Binocular Vision localization method, device and storage medium
CN107845114A (en) * 2017-11-10 2018-03-27 北京三快在线科技有限公司 Construction method, device and the electronic equipment of map
CN108125622A (en) * 2017-12-15 2018-06-08 珊口(上海)智能科技有限公司 Control method, system and the clean robot being applicable in
CN108544494A (en) * 2018-05-31 2018-09-18 珠海市微半导体有限公司 A kind of positioning device, method and robot based on inertia and visual signature

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111696157A (en) * 2019-03-12 2020-09-22 北京京东尚科信息技术有限公司 Method, system, device and storage medium for determining image relocation
CN109993794A (en) * 2019-03-29 2019-07-09 北京猎户星空科技有限公司 A kind of robot method for relocating, device, control equipment and storage medium
CN110414353A (en) * 2019-06-24 2019-11-05 炬星科技(深圳)有限公司 Robot booting positioning, operation method for relocating, electronic equipment and storage medium
WO2021008439A1 (en) * 2019-07-12 2021-01-21 北京石头世纪科技股份有限公司 Automatic cleaning device control method and apparatus, device and medium
CN112414391A (en) * 2019-08-20 2021-02-26 北京京东乾石科技有限公司 Robot repositioning method and device
CN112444251B (en) * 2019-08-29 2023-06-13 长沙智能驾驶研究院有限公司 Vehicle driving position determining method and device, storage medium and computer equipment
CN112444251A (en) * 2019-08-29 2021-03-05 长沙智能驾驶研究院有限公司 Vehicle driving position determining method and device, storage medium and computer equipment
CN112766023B (en) * 2019-11-04 2024-01-19 北京地平线机器人技术研发有限公司 Method, device, medium and equipment for determining gesture of target object
CN112766023A (en) * 2019-11-04 2021-05-07 北京地平线机器人技术研发有限公司 Target object posture determining method, device, medium and equipment
CN112880691B (en) * 2019-11-29 2022-12-02 北京魔门塔科技有限公司 Global positioning initialization method and device
CN112880691A (en) * 2019-11-29 2021-06-01 北京初速度科技有限公司 Global positioning initialization method and device
CN111784661A (en) * 2019-12-31 2020-10-16 山东信通电子股份有限公司 Adjusting method, device, equipment and medium of power transmission line detection equipment
CN111784661B (en) * 2019-12-31 2023-09-05 山东信通电子股份有限公司 Adjustment method, device, equipment and medium of transmission line detection equipment
CN111239761A (en) * 2020-01-20 2020-06-05 西安交通大学 Method for indoor real-time establishment of two-dimensional map
CN111220148A (en) * 2020-01-21 2020-06-02 珊口(深圳)智能科技有限公司 Mobile robot positioning method, system and device and mobile robot
WO2021189784A1 (en) * 2020-03-23 2021-09-30 南京科沃斯机器人技术有限公司 Scenario reconstruction method, system and apparatus, and sweeping robot
CN111443033A (en) * 2020-04-26 2020-07-24 武汉理工大学 Floor sweeping robot carpet detection method
CN112041634A (en) * 2020-08-07 2020-12-04 苏州珊口智能科技有限公司 Mobile robot positioning method, map building method and mobile robot
WO2022037369A1 (en) * 2020-08-19 2022-02-24 安克创新科技股份有限公司 Robotic vacuum cleaner and map construction method and apparatus therefor
CN112013840A (en) * 2020-08-19 2020-12-01 安克创新科技股份有限公司 Sweeping robot and map construction method and device thereof
CN112013840B (en) * 2020-08-19 2022-10-28 安克创新科技股份有限公司 Sweeping robot and map construction method and device thereof
CN112212853A (en) * 2020-09-01 2021-01-12 北京石头世纪科技股份有限公司 Robot positioning method and device, and storage medium
WO2022048153A1 (en) * 2020-09-01 2022-03-10 北京石头世纪科技股份有限公司 Positioning method and apparatus for robot, and storage medium
CN112418046A (en) * 2020-11-17 2021-02-26 武汉云极智能科技有限公司 Fitness guidance method, storage medium and system based on cloud robot
CN112631303A (en) * 2020-12-26 2021-04-09 北京云迹科技有限公司 Robot positioning method and device and electronic equipment
CN112631303B (en) * 2020-12-26 2022-12-20 北京云迹科技股份有限公司 Robot positioning method and device and electronic equipment
CN112987764A (en) * 2021-02-01 2021-06-18 鹏城实验室 Landing method, landing device, unmanned aerial vehicle and computer-readable storage medium
CN112987764B (en) * 2021-02-01 2024-02-20 鹏城实验室 Landing method, landing device, unmanned aerial vehicle and computer-readable storage medium
CN114941448A (en) * 2021-02-07 2022-08-26 广东博智林机器人有限公司 Mortar cleaning method, device, system and storage medium
CN114941448B (en) * 2021-02-07 2023-09-05 广东博智林机器人有限公司 Mortar cleaning method, device, system and storage medium
CN115177178A (en) * 2021-04-06 2022-10-14 美智纵横科技有限责任公司 Cleaning method, cleaning device and computer storage medium
CN113256710A (en) * 2021-05-21 2021-08-13 深圳市慧鲤科技有限公司 Method and device for displaying foresight in game, computer equipment and storage medium
CN113554754A (en) * 2021-07-30 2021-10-26 中国电子科技集团公司第五十四研究所 Indoor positioning method based on computer vision
WO2023202256A1 (en) * 2022-04-22 2023-10-26 追觅创新科技(苏州)有限公司 Coordinate-based repositioning method and system, and cleaning robot

Also Published As

Publication number Publication date
CN114847803A (en) 2022-08-05
CN109431381B (en) 2022-06-07

Similar Documents

Publication Publication Date Title
CN109431381A (en) Localization method and device, electronic equipment, the storage medium of robot
CN109947109B (en) Robot working area map construction method and device, robot and medium
CN106175606B (en) The method, apparatus that robot and its realization independently manipulate
CN205671994U (en) Automatic cleaning equipment
CN106239517B (en) The method, apparatus that robot and its realization independently manipulate
CN105990876B (en) Charging pile, identification method and device thereof and automatic cleaning equipment
CN114521836B (en) Automatic cleaning equipment
CN109920424A (en) Robot voice control method and device, robot and medium
CN109998421A (en) Mobile clean robot combination and persistence drawing
CN110051289A (en) Robot voice control method and device, robot and medium
CN110507253A (en) Cleaning robot and control method thereof
CN106226755B (en) Robot
US20220125270A1 (en) Method for controlling automatic cleaning device, automatic cleaning device, and non-transitory storage medium
CN207488823U (en) A kind of mobile electronic device
CN110313867A (en) Autonomous scavenging machine, the cleaning method of autonomous scavenging machine and program
CN211022482U (en) Cleaning robot
CN109932726A (en) Robot ranging calibration method and device, robot and medium
CN110136704A (en) Robot voice control method and device, robot and medium
EP4209754A1 (en) Positioning method and apparatus for robot, and storage medium
CN114557633B (en) Cleaning parameter configuration method, device, equipment and medium for automatic cleaning equipment
CN109920425A (en) Robot voice control method and device, robot and medium
KR102581196B1 (en) Airport robot and computer readable recording medium of performing operating method of thereof
CN208207201U (en) robot
CN112269379B (en) Obstacle identification information feedback method
CN114847803B (en) Positioning method and device of robot, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information

Address after: 100192 No. 6016, 6017, 6018, Block C, No. 8 Heiquan Road, Haidian District, Beijing

Applicant after: Beijing Roborock Technology Co.,Ltd.

Address before: 100192 No. 6016, 6017, 6018, Block C, No. 8 Heiquan Road, Haidian District, Beijing

Applicant before: BEIJING ROCKROBO TECHNOLOGY Co.,Ltd.

SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220422

Address after: 102200 No. 8008, floor 8, building 16, yard 37, Chaoqian Road, Changping Park, Zhongguancun Science and Technology Park, Changping District, Beijing

Applicant after: Beijing Stone Innovation Technology Co.,Ltd.

Address before: 100192 No. 6016, 6017, 6018, Block C, No. 8 Heiquan Road, Haidian District, Beijing

Applicant before: Beijing Roborock Technology Co.,Ltd.

GR01 Patent grant