CN108245099A - Robot moving method and device - Google Patents

Robot moving method and device

Info

Publication number
CN108245099A
Authority
CN
China
Prior art keywords
target
camera
robot
detecting
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810036671.1A
Other languages
Chinese (zh)
Inventor
王声平
周毕兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Infinite Power Development Co., Ltd.
Original Assignee
Shenzhen Water World Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Water World Co Ltd
Priority to CN201810036671.1A
Priority to PCT/CN2018/077604
Publication of CN108245099A
Legal status: Pending (Current)

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061: Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; Automatic obstacle detection

Abstract

The present invention discloses a robot moving method and device. The method comprises the following steps: performing sound source localization by a voice collection device to determine the direction of a target; orienting a camera toward the target and detecting the target by the camera; when the target is detected, performing visual positioning by the camera to determine the position of the target; and moving to the position of the target. In the robot moving method provided by the embodiments of the present invention, the direction of the target is determined by sound source localization so that the camera can be oriented toward the target, and the camera is then used for visual positioning to determine the accurate position of the target. This enables the robot to move quickly and accurately to a position in front of the user (the target), solves the technical problem that a robot cannot move accurately to a position in front of the user, and allows the orientation and distance between the robot and the user to be adjusted precisely, ensuring that the robot can better receive the user's voice instructions and greatly improving the user experience.

Description

Robot moving method and device
Technical field
The present invention relates to the field of robot technology, and in particular to a robot moving method and device.
Background technology
With the development of science and technology, more and more intelligent household appliances have entered the home, greatly improving the comfort and convenience of people's lives. Among them, the sweeping robot is one of the intelligent household appliances most favored by users. A sweeping robot, also known as an automatic sweeper, intelligent vacuum cleaner, or robot vacuum cleaner, relies on a certain degree of artificial intelligence to complete floor-cleaning work in a room automatically and autonomously.
Current sweeping robots also include a speech recognition function. Through speech recognition, a sweeping robot can receive a user's voice instructions and perform corresponding actions according to those instructions. When the user is far from the sweeping robot, the robot determines the user's position by sound source localization technology and then moves to the user's position to receive the voice instructions.
However, the positioning accuracy of sound source localization is not high and can only determine an approximate position, so the sweeping robot cannot move promptly and accurately to a position in front of the user. Situations often arise in which the robot does not face the user or remains too far from the user, with the result that the sweeping robot cannot accurately receive the user's voice instructions, which affects the user experience.
Summary of the invention
The main object of the present invention is to provide a robot moving method and device, aiming to solve the technical problem that a robot cannot move accurately to a position in front of the user.
To achieve this object, an embodiment of the present invention proposes a robot moving method, the method comprising the following steps:
performing sound source localization by a voice collection device to determine the direction of a target;
orienting a camera toward the target, and detecting the target by the camera;
when the target is detected, performing visual positioning by the camera to determine the position of the target;
moving to the position of the target.
Optionally, the camera is a monocular camera, and the step of performing visual positioning by the camera to determine the position of the target includes:
acquiring an image of the target by the monocular camera;
moving a certain distance toward the direction of the target, and acquiring an image of the target again by the monocular camera;
determining the position of the target according to the two images of the target acquired before and after the movement.
Optionally, the camera is a binocular camera, and the step of performing visual positioning by the camera to determine the position of the target includes:
acquiring an image of the target by the binocular camera;
determining the position of the target according to the image of the target.
Optionally, the step of detecting the target by the camera includes:
when the target is not detected, avoiding the obstacle that blocks the target until the target is detected.
Optionally, the step of avoiding the obstacle that blocks the target until the target is detected includes:
moving toward the direction of the target;
when an obstacle is encountered, moving along the boundary of the obstacle until the target is detected.
Optionally, the step of avoiding the obstacle that blocks the target until the target is detected includes: moving laterally with respect to the direction of the target until the target is detected.
Optionally, the step of detecting the target by the camera includes:
performing face recognition detection by the camera;
when a face is detected, determining that the target is detected.
Optionally, the step of moving to the position of the target includes:
planning a moving path with the current position as the start point and the position of the target as the end point;
moving along the moving path to the end point.
Optionally, the method further includes: when the position of the target changes, planning the moving path again.
Optionally, the robot is a sweeping robot.
An embodiment of the present invention also proposes a robot moving device, the device comprising:
a direction determining module, configured to perform sound source localization by a voice collection device to determine the direction of a target;
a target detection module, configured to orient a camera toward the target and to detect the target by the camera;
a position determining module, configured to perform visual positioning by the camera when the target is detected, to determine the position of the target;
a movement control module, configured to control the robot to move to the position of the target.
Optionally, the camera is a monocular camera, and the position determining module includes:
a first acquisition unit, configured to acquire an image of the target by the monocular camera;
a second acquisition unit, configured to control the robot to move a certain distance toward the direction of the target and to acquire an image of the target again by the monocular camera;
a first determination unit, configured to determine the position of the target according to the two images of the target acquired before and after the movement.
Optionally, the camera is a binocular camera, and the position determining module includes:
a third acquisition unit, configured to acquire an image of the target by the binocular camera;
a second determination unit, configured to determine the position of the target according to the image of the target.
Optionally, the device further includes an obstacle avoidance module, the obstacle avoidance module being configured to:
when the target is not detected, avoid the obstacle that blocks the target until the target detection module detects the target.
Optionally, the obstacle avoidance module includes:
a first movement unit, configured to control the robot to move toward the direction of the target;
a second movement unit, configured to control the robot, when an obstacle is encountered, to move along the boundary of the obstacle until the target detection module detects the target.
Optionally, the obstacle avoidance module includes a third movement unit, the third movement unit being configured to: control the robot to move laterally with respect to the direction of the target until the target detection module detects the target.
Optionally, the target detection module includes:
a face recognition unit, configured to perform face recognition detection by the camera;
a target determination unit, configured to determine, when a face is detected, that the target is detected.
Optionally, the movement control module includes:
a path planning unit, configured to plan a moving path with the current position as the start point and the position of the target as the end point;
a movement control unit, configured to control the robot to move along the moving path to the end point.
Optionally, the movement control module further includes a path update unit, the path update unit being configured to: when the position of the target changes, plan the moving path again.
An embodiment of the present invention also proposes a sweeping robot, including a memory, a processor, and at least one application program stored in the memory and configured to be executed by the processor, the application program being configured to perform the robot moving method described above.
In the robot moving method provided by the embodiments of the present invention, the direction of the target is determined by sound source localization so that the camera can be oriented toward the target, and the camera is then used for visual positioning to determine the accurate position of the target. This enables the robot to move quickly and accurately to a position in front of the user (the target), solves the technical problem that a robot cannot move accurately to a position in front of the user, and allows the orientation and distance between the robot and the user to be adjusted precisely, ensuring that the robot can better receive the user's voice instructions and greatly improving the user experience.
Description of the drawings
Fig. 1 is a flow chart of a first embodiment of the robot moving method of the present invention;
Fig. 2 is a structural diagram of the robot in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the camera of the robot being oriented toward the target in an embodiment of the present invention;
Fig. 4 is a flow chart of a second embodiment of the robot moving method of the present invention;
Fig. 5 is a schematic diagram of the robot avoiding an obstacle that blocks the target in an embodiment of the present invention;
Fig. 6 is another schematic diagram of the robot avoiding an obstacle that blocks the target in an embodiment of the present invention;
Fig. 7 is a module diagram of a first embodiment of the robot moving device of the present invention;
Fig. 8 is a module diagram of the target detection module in Fig. 7;
Fig. 9 is a module diagram of the position determining module in Fig. 7;
Fig. 10 is another module diagram of the position determining module in Fig. 7;
Fig. 11 is a module diagram of the movement control module in Fig. 7;
Fig. 12 is another module diagram of the movement control module in Fig. 7;
Fig. 13 is a module diagram of a second embodiment of the robot moving device of the present invention;
Fig. 14 is a module diagram of the obstacle avoidance module in Fig. 13.
The realization of the objects, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit the present invention.
The embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the accompanying drawings are exemplary, are only used to explain the present invention, and are not to be construed as limiting the claims.
Those skilled in the art will understand that, unless expressly stated otherwise, the singular forms "a", "an", "said", and "the" used herein may also include the plural forms. It should be further understood that the word "comprising" used in the specification of the present invention refers to the presence of the stated features, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It should be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may also be present. In addition, "connected" or "coupled" as used herein may include wireless connection or wireless coupling. The phrase "and/or" as used herein includes all or any unit and all combinations of one or more of the associated listed items.
Those skilled in the art will understand that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. It should also be understood that terms such as those defined in general dictionaries should be understood to have meanings consistent with their meanings in the context of the prior art and, unless specifically defined as here, will not be interpreted in an idealized or overly formal sense.
The robot moving method and device of the embodiments of the present invention can be applied to various robots and are particularly suitable for sweeping robots. The following description takes a sweeping robot as an example.
With reference to Fig. 1, a first embodiment of the robot moving method of the present invention is proposed. The method comprises the following steps:
S11: performing sound source localization by a voice collection device to determine the direction of a target.
The voice collection device in the embodiment of the present invention is preferably a microphone array. As shown in Fig. 2, the sweeping robot 100 is provided with a microphone array composed of four microphones 101 and a camera 102. The sweeping robot collects the sound emitted by the sound source with the microphone array and performs sound source localization using sound source localization technology, thereby determining the direction of the sound source, that is, of the target. Sound source localization technology is a relatively mature prior art and is not described in detail here.
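As an illustration of the kind of processing involved, the following is a minimal sketch of bearing estimation from a single microphone pair, assuming a far-field source and a known microphone spacing. The function names, the 0.08 m spacing, and the simulated signal are illustrative assumptions only; the patent does not specify a particular localization algorithm.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature


def estimate_bearing(sig_left, sig_right, mic_spacing_m, sample_rate_hz):
    """Estimate the bearing of a sound source from one microphone pair.

    The time difference of arrival (TDOA) is taken from the peak of the
    cross-correlation, then converted to an angle with the far-field
    relation angle = arcsin(c * tau / d). Returns degrees from broadside.
    """
    corr = np.correlate(sig_left, sig_right, mode="full")
    lag_samples = np.argmax(corr) - (len(sig_right) - 1)
    tau = lag_samples / sample_rate_hz
    ratio = np.clip(SPEED_OF_SOUND * tau / mic_spacing_m, -1.0, 1.0)
    return np.degrees(np.arcsin(ratio))


if __name__ == "__main__":
    fs = 16000
    t = np.arange(0, 0.1, 1.0 / fs)
    tone = np.sin(2 * np.pi * 800 * t) * np.hanning(t.size)
    # Simulate a source off to one side: the left channel lags by 2 samples.
    left = np.pad(tone, (2, 0))[: tone.size]
    print("bearing: %.1f deg" % estimate_bearing(left, tone, 0.08, fs))
```

With a full four-microphone array, bearings from several such pairs would normally be fused to resolve the direction unambiguously.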
The target mainly refers to a person (the user), but it may also be any other object capable of emitting sound; the present invention does not limit this.
S12: orienting the camera toward the target, and detecting the target by the camera. It is judged whether the target is detected; when the target is detected, the flow proceeds to step S13; when the target is not detected, the flow ends.
After the direction of the target is determined, the sweeping robot orients the camera toward the target, for example by rotating the camera to aim it at the target or by rotating the entire robot to aim at the target, and then starts the camera and detects the target by the camera. As shown in Fig. 3, the camera 102 of the sweeping robot 100 is aimed at the target 200, so that the target 200 is within the field of view of the camera 102 (the region between the two extended side lines of the camera 102).
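For illustration only, a minimal sketch of this alignment step, assuming the bearing from sound source localization and the robot heading are expressed in degrees in the same frame; the 60 degree field of view and the function names are assumptions, not values given in the patent.

```python
def rotation_toward(current_heading_deg, target_bearing_deg):
    """Smallest signed rotation (degrees) that turns the robot, and with it
    the camera, toward the bearing reported by sound source localization.
    Positive values mean counter-clockwise."""
    return (target_bearing_deg - current_heading_deg + 180.0) % 360.0 - 180.0


def target_in_view(bearing_offset_deg, horizontal_fov_deg=60.0):
    """True once the target bearing falls inside the camera's field of view,
    i.e. between the two extended side lines shown in Fig. 3."""
    return abs(bearing_offset_deg) <= horizontal_fov_deg / 2.0


# Example: robot heading 10 deg, sound source at 125 deg -> rotate by +115 deg
# before the target can enter the camera's field of view.
offset = rotation_toward(10.0, 125.0)
print(offset, target_in_view(offset))
```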
When the target is a person, the sweeping robot can perform face recognition detection by the camera; when a face is detected, it determines that the target is detected, and otherwise determines that the target is not detected. Face recognition technology is a relatively mature prior art and is not described in detail here.
In addition to identifying and detecting a person by face recognition technology, other visual features of the human body can also be recognized and detected to realize target detection; the present invention does not enumerate them here.
When the target is another sound-emitting object, target detection can be realized by recognizing and detecting the visual features of that sound-emitting object.
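As a rough illustration of the face-based detection described above, the sketch below uses OpenCV's bundled Haar cascade face detector; it assumes an OpenCV installation that ships the cascade files (cv2.data.haarcascades) and a camera reachable as device 0, and it is not the specific detector used by the patent.

```python
import cv2


def detect_person(frame_bgr):
    """Return True if at least one face is visible in the camera frame."""
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    detector = cv2.CascadeClassifier(cascade_path)
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # device 0: whichever camera is attached
    ok, frame = cap.read()
    if ok:
        print("target detected" if detect_person(frame) else "no target in view")
    cap.release()
```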
S13: performing visual positioning by the camera to determine the position of the target.
When the target is detected, the sweeping robot performs visual positioning on the target by the camera using visual positioning technology, and determines the position of the target.
Optionally, when the camera is a monocular camera, the sweeping robot may first acquire an image of the target in place by the monocular camera, then move a certain distance toward the direction of the target and acquire an image of the target again by the monocular camera, and finally calculate the three-dimensional coordinates of the target by analyzing the two images of the target acquired before and after the movement, thereby determining the position of the target. The specific method of calculating the three-dimensional coordinates of the target from images acquired at two positions is the same as in the prior art and is not repeated here.
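One simple way to realize a two-image monocular estimate, assuming the target's apparent size (for example the pixel height of its bounding box) can be measured in both images and the forward travel between them is known from odometry; the patent does not prescribe this particular formula, so the sketch below is only illustrative.

```python
def monocular_depth_from_advance(size_before_px, size_after_px, forward_travel_m):
    """Distance to the target at the first viewpoint, from two images taken by
    one camera before and after driving straight toward the target.

    Under the pinhole model the apparent size is inversely proportional to
    distance, so size_after / size_before = z_before / (z_before - travel),
    which rearranges to the expression returned below.
    """
    if size_after_px <= size_before_px:
        raise ValueError("the target should appear larger after moving toward it")
    return forward_travel_m * size_after_px / (size_after_px - size_before_px)


# Example: a bounding box grows from 80 px to 100 px after a 0.5 m advance,
# giving z_before = 2.5 m, i.e. about 2.0 m remaining after the advance.
print(monocular_depth_from_advance(80.0, 100.0, 0.5))
```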
Optionally, when the camera is a binocular camera, the sweeping robot may directly acquire images of the target in place by the binocular camera; each acquisition obtains two images with parallax, and the three-dimensional coordinates of the target are calculated by analyzing the two images with parallax, thereby determining the position of the target. The specific method of calculating the three-dimensional coordinates of the target by analyzing two images with parallax is the same as in the prior art and is not repeated here.
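For the binocular case, a rectified stereo pair gives depth directly from disparity. The sketch below shows the standard relation Z = f * B / d under the assumption of a calibrated, rectified camera pair; the example numbers are purely illustrative.

```python
def stereo_depth(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d,
    where d = x_left - x_right is the horizontal disparity in pixels."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity


# Example: focal length 700 px, 6 cm baseline, 14 px disparity -> 3.0 m depth.
print(stereo_depth(700.0, 0.06, 320.0, 306.0))
```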
S14: moving to the position of the target.
After the position of the target is determined, the sweeping robot moves to the position of the target, stops moving once it has moved to within a certain distance of the target, then receives the voice instruction of the target by the voice collection device, and performs the corresponding action according to the voice instruction.
When moving to the position of the target, the sweeping robot may plan a moving path with its current position as the start point and the position of the target as the end point, then move along the moving path to the end point, finally arriving at the position of the target.
Further, when the position of the target changes during movement, the sweeping robot plans the moving path again and moves to the end point along the newly planned path. Thus, when the target moves, the sweeping robot can also follow the target, realizing real-time tracking and accompanying of the target.
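A minimal sketch of this plan-then-re-plan behaviour on an occupancy grid, using breadth-first search as a stand-in for whatever path planner the robot actually uses; the grid, coordinates, and function names are illustrative assumptions.

```python
from collections import deque


def plan_path(grid, start, goal):
    """Shortest route on a 4-connected occupancy grid by breadth-first search.

    grid: list of rows, 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples. Returns the list of cells from start to
    goal, or None if the goal cannot be reached.
    """
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None


if __name__ == "__main__":
    grid = [[0, 0, 0, 0],
            [1, 1, 0, 1],
            [0, 0, 0, 0]]
    route = plan_path(grid, (0, 0), (2, 0))
    print(route)
    # If vision later reports the target at (2, 3), plan again from the cell
    # the robot has reached; this corresponds to the "plan the moving path
    # again" step used to follow a moving target.
    print(plan_path(grid, route[2], (2, 3)))
```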
Determining the accurate position of the target by combining sound source localization with visual positioning enables the robot to move quickly and accurately to a position in front of the user (the target), solving the technical problem that a robot cannot move accurately to a position in front of the user.
With reference to Fig. 4, a second embodiment of the robot moving method of the present invention is proposed. The method comprises the following steps:
S21: performing sound source localization by a voice collection device to determine the direction of a target.
S22: orienting the camera toward the target, and detecting the target by the camera. It is judged whether the target is detected; when the target is not detected, the flow proceeds to step S23; when the target is detected, the flow proceeds to step S24.
S23: avoiding the obstacle that blocks the target until the target is detected.
S24: performing visual positioning by the camera to determine the position of the target.
S25: moving to the position of the target.
Considering that there may be an obstacle between the sweeping robot and the target, when the sweeping robot does not detect the target in place, it indicates that the field of view of the camera of the sweeping robot is blocked by an obstacle. Therefore, the present embodiment adds step S23 on the basis of the first embodiment: when the sweeping robot does not detect the target in place, it avoids the obstacle that blocks the target until the target is detected, so that even if the sweeping robot is blocked by an obstacle, it can automatically avoid the obstacle and reach the user's side.
The sweeping robot can avoid the obstacle that blocks the target in the following ways:
Optionally, as shown in Fig. 5, when the line of sight of the camera 102 of the sweeping robot 100 is blocked by an obstacle 300 and the target 200 cannot be detected, the sweeping robot 100 first moves toward the direction of the target 200; when it encounters the obstacle 300, it then moves along the boundary of the obstacle 300 until the target 200 is detected. While moving along the boundary of the obstacle 300, the camera 102 always faces the target 200, and when the robot reaches the edge of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target 200 can be detected. When the target 200 is detected, the sweeping robot 100 performs visual positioning by the camera 102, determines the position of the target 200, and moves to the position of the target 200.
Optionally, as shown in Fig. 6, when the line of sight of the camera 102 of the sweeping robot 100 is blocked by an obstacle 300 and the target 200 cannot be detected, the sweeping robot 100 moves laterally with respect to the direction of the target 200 until the target 200 is detected. The lateral direction preferably forms an acute angle with the direction of the target 200, for example in the range of 45 to 90 degrees, so as to minimize the moving distance; a right angle or an obtuse angle is of course also possible. While moving laterally, the camera 102 always faces the target 200, and when the robot reaches the edge of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target 200 can be detected. When the target 200 is detected, the sweeping robot 100 performs visual positioning by the camera 102, determines the position of the target 200, and moves to the position of the target 200.
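These two avoidance manoeuvres can be thought of as a small state machine, sketched below with placeholder sensing inputs (target_visible, obstacle_ahead) and textual commands standing in for real motion primitives; this is an illustrative reading of Figs. 5 and 6, not an implementation disclosed by the patent.

```python
from enum import Enum, auto


class AvoidState(Enum):
    GO_TO_TARGET = auto()
    FOLLOW_BOUNDARY = auto()


def avoidance_step(state, target_visible, obstacle_ahead):
    """One control-loop tick of a Bug-style avoidance behaviour.

    Drive toward the sound-source bearing; when an obstacle is met, slide
    along its boundary (camera still pointed at the target bearing) until the
    target comes back into view. Returns (new_state, command string).
    """
    if state is AvoidState.GO_TO_TARGET:
        if obstacle_ahead:
            return AvoidState.FOLLOW_BOUNDARY, "hug the obstacle edge"
        return AvoidState.GO_TO_TARGET, "drive toward the target bearing"
    if target_visible:
        return AvoidState.GO_TO_TARGET, "resume driving toward the target"
    return AvoidState.FOLLOW_BOUNDARY, "keep following the obstacle boundary"


# Example tick: blocked while heading for the target -> start boundary following.
print(avoidance_step(AvoidState.GO_TO_TARGET, target_visible=False, obstacle_ahead=True))
```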
In the robot moving method of the embodiments of the present invention, the direction of the target is determined by sound source localization so that the camera can be oriented toward the target, and the camera is then used for visual positioning to determine the accurate position of the target. This enables the robot to move quickly and accurately to a position in front of the user (the target), solves the technical problem that a robot cannot move accurately to a position in front of the user, and allows the orientation and distance between the robot and the user to be adjusted precisely, ensuring that the robot can better receive the user's voice instructions and greatly improving the user experience.
Moreover, in the embodiments of the present invention, the robot can also actively recognize the user and realize following and similar operations, so the interactive experience is richer and the user experience is better.
With reference to Fig. 7, a first embodiment of the robot moving device of the present invention is proposed. The device includes a direction determining module 10, a target detection module 20, a position determining module 30, and a movement control module 40, wherein: the direction determining module 10 is configured to perform sound source localization by a voice collection device to determine the direction of a target; the target detection module 20 is configured to orient a camera toward the target and to detect the target by the camera; the position determining module 30 is configured to perform visual positioning by the camera when the target is detected, to determine the position of the target; and the movement control module 40 is configured to control the robot to move to the position of the target.
The voice collection device in the embodiment of the present invention is preferably a microphone array. As shown in Fig. 2, the sweeping robot 100 is provided with a microphone array composed of four microphones 101 and a camera 102. The direction determining module 10 collects the sound emitted by the sound source with the microphone array and performs sound source localization using sound source localization technology, thereby determining the direction of the sound source, that is, of the target. Sound source localization technology is a relatively mature prior art and is not described in detail here. The target mainly refers to a person (the user), but it may also be any other object capable of emitting sound; the present invention does not limit this.
After the direction of the target is determined, the target detection module 20 orients the camera toward the target, for example by rotating the camera to aim it at the target or by controlling the entire robot to rotate and aim at the target, and then starts the camera and detects the target by the camera. As shown in Fig. 3, the target detection module 20 controls the camera 102 of the sweeping robot 100 to aim at the target 200, so that the target 200 is within the field of view of the camera 102 (the region between the two extended side lines of the camera 102).
When the target is a person, the target detection module 20 can perform target detection by face recognition technology. As shown in Fig. 8, the target detection module 20 includes a face recognition unit 21 and a target determination unit 22, wherein: the face recognition unit 21 is configured to perform face recognition detection by the camera; and the target determination unit 22 is configured to determine, when a face is detected, that the target is detected, and otherwise to determine that the target is not detected. Face recognition technology is a relatively mature prior art and is not described in detail here.
In addition to identifying and detecting a person by face recognition technology, the target detection module 20 can also recognize and detect other visual features of the human body to realize target detection; the present invention does not enumerate them here.
When the target is another sound-emitting object, the target detection module 20 can realize target detection by recognizing and detecting the visual features of that sound-emitting object.
When the target is detected, the position determining module 30 performs visual positioning on the target by the camera using visual positioning technology, and determines the position of the target.
Optionally, when the camera is a monocular camera, the position determining module 30, as shown in Fig. 9, includes a first acquisition unit 31, a second acquisition unit 32, and a first determination unit 33, wherein: the first acquisition unit 31 is configured to acquire an image of the target by the monocular camera; the second acquisition unit 32 is configured to control the robot to move a certain distance toward the direction of the target and to acquire an image of the target again by the monocular camera; and the first determination unit 33 is configured to calculate the three-dimensional coordinates of the target by analyzing the two images of the target acquired before and after the movement, thereby determining the position of the target. The specific method of calculating the three-dimensional coordinates of the target from images acquired at two positions is the same as in the prior art and is not repeated here.
Optionally, when the camera is a binocular camera, the position determining module 30, as shown in Fig. 10, includes a third acquisition unit 34 and a second determination unit 35, wherein: the third acquisition unit 34 is configured to acquire images of the target by the binocular camera, each acquisition obtaining two images with parallax; and the second determination unit 35 is configured to determine the position of the target according to the images of the target, for example by calculating the three-dimensional coordinates of the target by analyzing the two images with parallax. The specific method of calculating the three-dimensional coordinates of the target by analyzing two images with parallax is the same as in the prior art and is not repeated here.
After the position of the target is determined, the movement control module 40 controls the robot to move to the position of the target, and the robot stops moving once it has moved to within a certain distance of the target. The robot can then receive the voice instruction of the target by the voice collection device and perform the corresponding action according to the voice instruction.
The movement control module 40, as shown in Fig. 11, includes a path planning unit 41 and a movement control unit 42, wherein: the path planning unit 41 is configured to plan a moving path with the current position as the start point and the position of the target as the end point; and the movement control unit 42 is configured to control the robot to move along the moving path to the end point, finally arriving at the position of the target.
Further, as shown in Fig. 12, the movement control module 40 may also include a path update unit 43, the path update unit 43 being configured to: when the position of the target changes, plan the moving path again. The movement control unit 42 then moves along the newly planned moving path to the end point. Thus, when the target moves, the sweeping robot can also follow the target, realizing real-time tracking and accompanying of the target.
Determining the accurate position of the target by combining sound source localization with visual positioning enables the robot to move quickly and accurately to a position in front of the user (the target), solving the technical problem that a robot cannot move accurately to a position in front of the user.
With reference to Fig. 13, a second embodiment of the robot moving device of the present invention is proposed. The present embodiment adds an obstacle avoidance module 50 on the basis of the first embodiment, the obstacle avoidance module being configured to: when the target detection module 20 does not detect the target, avoid the obstacle that blocks the target until the target detection module 20 detects the target, so that even if the sweeping robot is blocked by an obstacle, it can automatically avoid the obstacle and reach the user's side.
In the embodiment of the present invention, the obstacle avoidance module 50, as shown in Fig. 14, includes a first movement unit 51 and a second movement unit 52, wherein: the first movement unit 51 is configured to control the robot to move toward the direction of the target; and the second movement unit 52 is configured to control the robot, when an obstacle is encountered, to move along the boundary of the obstacle until the target detection module 20 detects the target.
As shown in Fig. 5, when the line of sight of the camera 102 of the robot 100 is blocked by an obstacle 300 and the target detection module 20 cannot detect the target 200, the first movement unit 51 controls the robot 100 to move toward the direction of the target 200; when the obstacle 300 is encountered, the second movement unit 52 controls the robot 100 to move along the boundary of the obstacle 300 until the target detection module 20 detects the target 200. While moving along the boundary of the obstacle 300, the camera 102 of the robot 100 always faces the target 200, and when the robot reaches the edge of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target detection module 20 is able to detect the target 200.
In other embodiments, the obstacle avoidance module 50 includes a third movement unit, the third movement unit being configured to: control the robot to move laterally with respect to the direction of the target until the target detection module 20 detects the target.
As shown in Fig. 6, when the line of sight of the camera 102 of the robot 100 is blocked by an obstacle 300 and the target detection module 20 cannot detect the target 200, the third movement unit controls the robot 100 to move laterally with respect to the direction of the target 200 until the target 200 is detected. The lateral direction preferably forms an acute angle with the direction of the target 200, for example in the range of 45 to 90 degrees, so as to minimize the moving distance; a right angle or an obtuse angle is of course also possible. While moving laterally, the camera 102 of the robot 100 always faces the target 200, and when the robot reaches the edge of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target detection module 20 is able to detect the target 200.
In the robot moving device of the embodiments of the present invention, the direction of the target is determined by sound source localization so that the camera can be oriented toward the target, and the camera is then used for visual positioning to determine the accurate position of the target. This enables the robot to move quickly and accurately to a position in front of the user (the target), solves the technical problem that a robot cannot move accurately to a position in front of the user, and allows the orientation and distance between the robot and the user to be adjusted precisely, ensuring that the robot can better receive the user's voice instructions and greatly improving the user experience.
The present invention also proposes a sweeping robot, including a memory, a processor, and at least one application program stored in the memory and configured to be executed by the processor, the application program being configured to perform the robot moving method. The robot moving method includes the following steps: performing sound source localization by a voice collection device to determine the direction of a target; orienting a camera toward the target, and detecting the target by the camera; when the target is detected, performing visual positioning by the camera to determine the position of the target; and moving to the position of the target. The robot moving method described in this embodiment is the robot moving method of the above embodiments of the present invention and is not described in detail here.
It will be understood by those skilled in the art that the present invention includes devices for performing one or more of the operations described herein. These devices may be specially designed and manufactured for the required purposes, or may include known devices in general-purpose computers. These devices have computer programs stored therein that are selectively activated or reconfigured. Such computer programs may be stored in a device-readable (for example, computer-readable) medium, or in any type of medium suitable for storing electronic instructions and respectively coupled to a bus. The computer-readable medium includes, but is not limited to, any type of disk (including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROM (Read-Only Memory), RAM (Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, magnetic cards, or optical cards. That is, a readable medium includes any medium that stores or transmits information in a form readable by a device (for example, a computer).
Those skilled in the art will understand that each block of these structural diagrams and/or block diagrams and/or flow diagrams, and combinations of blocks therein, can be implemented by computer program instructions. Those skilled in the art will understand that these computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing method for implementation, so that the solutions specified in one or more blocks of the structural diagrams and/or block diagrams and/or flow diagrams disclosed in the present invention are executed by the processor of the computer or of the other programmable data processing method.
Those skilled in the art will understand that the steps, measures, and solutions in the various operations, methods, and flows that have been discussed in the present invention may be alternated, changed, combined, or deleted. Further, other steps, measures, and solutions in the various operations, methods, and flows that have been discussed in the present invention may also be alternated, changed, rearranged, decomposed, combined, or deleted. Further, steps, measures, and solutions in the prior art that correspond to the various operations, methods, and flows disclosed in the present invention may also be alternated, changed, rearranged, decomposed, combined, or deleted.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the present invention. Any equivalent structural or flow transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of protection of the present invention.

Claims (10)

1. A robot moving method, characterized in that it comprises the following steps:
performing sound source localization by a voice collection device to determine the direction of a target;
orienting a camera toward the target, and detecting the target by the camera;
when the target is detected, performing visual positioning by the camera to determine the position of the target;
moving to the position of the target.
2. The robot moving method according to claim 1, characterized in that the camera is a monocular camera, and the step of performing visual positioning by the camera to determine the position of the target includes:
acquiring an image of the target by the monocular camera;
moving a certain distance toward the direction of the target, and acquiring an image of the target again by the monocular camera;
determining the position of the target according to the two images of the target acquired before and after the movement.
3. The robot moving method according to claim 1, characterized in that the camera is a binocular camera, and the step of performing visual positioning by the camera to determine the position of the target includes:
acquiring an image of the target by the binocular camera;
determining the position of the target according to the image of the target.
4. The robot moving method according to any one of claims 1-3, characterized in that the step of detecting the target by the camera includes:
when the target is not detected, avoiding the obstacle that blocks the target until the target is detected.
5. The robot moving method according to any one of claims 1-3, characterized in that the step of detecting the target by the camera includes:
performing face recognition detection by the camera;
when a face is detected, determining that the target is detected.
6. A robot moving device, characterized in that it comprises:
a direction determining module, configured to perform sound source localization by a voice collection device to determine the direction of a target;
a target detection module, configured to orient a camera toward the target and to detect the target by the camera;
a position determining module, configured to perform visual positioning by the camera when the target is detected, to determine the position of the target;
a movement control module, configured to control the robot to move to the position of the target.
7. The robot moving device according to claim 6, characterized in that the camera is a monocular camera, and the position determining module includes:
a first acquisition unit, configured to acquire an image of the target by the monocular camera;
a second acquisition unit, configured to control the robot to move a certain distance toward the direction of the target and to acquire an image of the target again by the monocular camera;
a first determination unit, configured to determine the position of the target according to the two images of the target acquired before and after the movement.
8. The robot moving device according to claim 6, characterized in that the camera is a binocular camera, and the position determining module includes:
a third acquisition unit, configured to acquire an image of the target by the binocular camera;
a second determination unit, configured to determine the position of the target according to the image of the target.
9. The robot moving device according to any one of claims 6-8, characterized in that the device further includes an obstacle avoidance module, the obstacle avoidance module being configured to:
when the target is not detected, avoid the obstacle that blocks the target until the target detection module detects the target.
10. The robot moving device according to any one of claims 6-8, characterized in that the target detection module includes:
a face recognition unit, configured to perform face recognition detection by the camera;
a target determination unit, configured to determine, when a face is detected, that the target is detected.
CN201810036671.1A 2018-01-15 2018-01-15 Robot moving method and device Pending CN108245099A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810036671.1A CN108245099A (en) 2018-01-15 2018-01-15 Robot moving method and device
PCT/CN2018/077604 WO2019136808A1 (en) 2018-01-15 2018-02-28 Robot moving method, robot moving device, floor sweeping robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810036671.1A CN108245099A (en) 2018-01-15 2018-01-15 Robot moving method and device

Publications (1)

Publication Number Publication Date
CN108245099A true CN108245099A (en) 2018-07-06

Family

ID=62727331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810036671.1A Pending CN108245099A (en) 2018-01-15 2018-01-15 Robot moving method and device

Country Status (2)

Country Link
CN (1) CN108245099A (en)
WO (1) WO2019136808A1 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110025260A (en) * 2017-12-20 2019-07-19 东芝生活电器株式会社 Autonomous driving body and autonomous driving body system
CN110881909A (en) * 2019-12-20 2020-03-17 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN110916576A (en) * 2018-12-13 2020-03-27 成都家有为力机器人技术有限公司 Cleaning method based on voice and image recognition instruction and cleaning robot
CN110946519A (en) * 2019-12-20 2020-04-03 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN110946518A (en) * 2019-12-20 2020-04-03 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN111008571A (en) * 2019-11-15 2020-04-14 万翼科技有限公司 Indoor garbage treatment method and related product
CN111012252A (en) * 2019-12-20 2020-04-17 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN111067354A (en) * 2018-10-19 2020-04-28 佛山市顺德区美的饮水机制造有限公司 Water dispenser and moving method and device thereof
CN112043206A (en) * 2020-09-01 2020-12-08 珠海格力电器股份有限公司 Sweeping and mopping integrated machine and cleaning method thereof
CN112597910A (en) * 2020-12-25 2021-04-02 北京小狗吸尘器集团股份有限公司 Method and device for monitoring human activities by using sweeping robot
WO2021062681A1 (en) * 2019-09-30 2021-04-08 中新智擎科技有限公司 Automatic meal delivery method and apparatus, and robot
CN112656309A (en) * 2020-12-25 2021-04-16 北京小狗吸尘器集团股份有限公司 Function execution method and device of sweeper, readable storage medium and electronic equipment
CN112703504A (en) * 2018-10-19 2021-04-23 深圳新物种科技有限公司 Object identification method and device, electronic equipment and computer readable storage medium
TWI731331B (en) * 2019-05-10 2021-06-21 中興保全科技股份有限公司 Mobile security device
CN113679298A (en) * 2021-08-27 2021-11-23 美智纵横科技有限责任公司 Robot control method, robot control device, robot, and readable storage medium
WO2022143285A1 (en) * 2020-12-31 2022-07-07 深圳市杉川机器人有限公司 Cleaning robot and distance measurement method therefor, apparatus, and computer-readable storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1696854A (en) * 2004-05-14 2005-11-16 三星光州电子株式会社 Mobile robot and system and method of compensating for path diversions
US20060005160A1 (en) * 1997-08-18 2006-01-05 National Instruments Corporation Image acquisition device
CN101295016A (en) * 2008-06-13 2008-10-29 河北工业大学 Sound source independent searching and locating method
CN102138769A (en) * 2010-01-28 2011-08-03 深圳先进技术研究院 Cleaning robot and cleaning method thereby
CN103054522A (en) * 2012-12-31 2013-04-24 河海大学 Cleaning robot system based on vision measurement and measurement and control method of cleaning robot system
CN104188598A (en) * 2014-09-15 2014-12-10 湖南格兰博智能科技有限责任公司 Automatic ground cleaning robot
CN104887155A (en) * 2015-05-21 2015-09-09 南京创维信息技术研究院有限公司 Intelligent sweeper
CN106489104A (en) * 2014-11-26 2017-03-08 艾罗伯特公司 System and method for the use of the optics range sensorses in mobile robot
CN106527444A (en) * 2016-11-29 2017-03-22 深圳市元征科技股份有限公司 Control method of cleaning robot and the cleaning robot
CN107491069A (en) * 2017-08-31 2017-12-19 珠海市微半导体有限公司 Robot runs into the processing method and chip of barrier

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI481980B (en) * 2012-12-05 2015-04-21 Univ Nat Chiao Tung Electronic apparatus and navigation method thereof
CN105929827B (en) * 2016-05-20 2020-03-10 北京地平线机器人技术研发有限公司 Mobile robot and positioning method thereof
CN106203259A (en) * 2016-06-27 2016-12-07 旗瀚科技股份有限公司 The mutual direction regulating method of robot and device
CN106210511A (en) * 2016-06-30 2016-12-07 纳恩博(北京)科技有限公司 A kind of method and apparatus positioning user
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060005160A1 (en) * 1997-08-18 2006-01-05 National Instruments Corporation Image acquisition device
CN1696854A (en) * 2004-05-14 2005-11-16 三星光州电子株式会社 Mobile robot and system and method of compensating for path diversions
CN101295016A (en) * 2008-06-13 2008-10-29 河北工业大学 Sound source independent searching and locating method
CN102138769A (en) * 2010-01-28 2011-08-03 深圳先进技术研究院 Cleaning robot and cleaning method thereby
CN103054522A (en) * 2012-12-31 2013-04-24 河海大学 Cleaning robot system based on vision measurement and measurement and control method of cleaning robot system
CN104188598A (en) * 2014-09-15 2014-12-10 湖南格兰博智能科技有限责任公司 Automatic ground cleaning robot
CN106489104A (en) * 2014-11-26 2017-03-08 艾罗伯特公司 System and method for the use of the optics range sensorses in mobile robot
CN104887155A (en) * 2015-05-21 2015-09-09 南京创维信息技术研究院有限公司 Intelligent sweeper
CN106527444A (en) * 2016-11-29 2017-03-22 深圳市元征科技股份有限公司 Control method of cleaning robot and the cleaning robot
CN107491069A (en) * 2017-08-31 2017-12-19 珠海市微半导体有限公司 Robot runs into the processing method and chip of barrier

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110025260A (en) * 2017-12-20 2019-07-19 东芝生活电器株式会社 Autonomous driving body and autonomous driving body system
CN111067354B (en) * 2018-10-19 2022-06-07 佛山市顺德区美的饮水机制造有限公司 Water dispenser and moving method and device thereof
CN111067354A (en) * 2018-10-19 2020-04-28 佛山市顺德区美的饮水机制造有限公司 Water dispenser and moving method and device thereof
CN112703504A (en) * 2018-10-19 2021-04-23 深圳新物种科技有限公司 Object identification method and device, electronic equipment and computer readable storage medium
CN110916576A (en) * 2018-12-13 2020-03-27 成都家有为力机器人技术有限公司 Cleaning method based on voice and image recognition instruction and cleaning robot
TWI731331B (en) * 2019-05-10 2021-06-21 中興保全科技股份有限公司 Mobile security device
WO2021062681A1 (en) * 2019-09-30 2021-04-08 中新智擎科技有限公司 Automatic meal delivery method and apparatus, and robot
CN111008571B (en) * 2019-11-15 2023-04-18 万翼科技有限公司 Indoor garbage treatment method and related product
CN111008571A (en) * 2019-11-15 2020-04-14 万翼科技有限公司 Indoor garbage treatment method and related product
CN110946518A (en) * 2019-12-20 2020-04-03 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN111012252A (en) * 2019-12-20 2020-04-17 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN110946519A (en) * 2019-12-20 2020-04-03 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN110881909A (en) * 2019-12-20 2020-03-17 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN112043206A (en) * 2020-09-01 2020-12-08 珠海格力电器股份有限公司 Sweeping and mopping integrated machine and cleaning method thereof
CN112597910A (en) * 2020-12-25 2021-04-02 北京小狗吸尘器集团股份有限公司 Method and device for monitoring human activities by using sweeping robot
CN112656309A (en) * 2020-12-25 2021-04-16 北京小狗吸尘器集团股份有限公司 Function execution method and device of sweeper, readable storage medium and electronic equipment
WO2022143285A1 (en) * 2020-12-31 2022-07-07 深圳市杉川机器人有限公司 Cleaning robot and distance measurement method therefor, apparatus, and computer-readable storage medium
CN113679298A (en) * 2021-08-27 2021-11-23 美智纵横科技有限责任公司 Robot control method, robot control device, robot, and readable storage medium

Also Published As

Publication number Publication date
WO2019136808A1 (en) 2019-07-18

Similar Documents

Publication Publication Date Title
CN108245099A (en) Robot moving method and device
CN105701447B (en) Guest-meeting robot
CN106527444B (en) Control method of cleaning robot and cleaning robot
KR101394809B1 (en) A method and systems for obtaining an improved stereo image of an object
CN110710852A (en) Meal delivery method, system, medium and intelligent device based on meal delivery robot
CN103529855A (en) Rotary adjustable binocular vision target recognition and positioning device and application thereof in agricultural fruit harvesting machinery
US11639000B2 (en) Geometrically appropriate tool selection assistance for determined work site dimensions
CN110632915A (en) Robot recharging path planning method, robot and charging system
CN109934127A (en) Pedestrian's recognition and tracking method based on video image and wireless signal
JP4677060B1 (en) Position calibration information collection device, position calibration information collection method, and position calibration information collection program
WO2017197919A1 (en) Wireless charging positioning method, device, and system, and electric vehicle
CN109840982B (en) Queuing recommendation method and device and computer readable storage medium
CN109857112A (en) Obstacle Avoidance and device
CN114093052A (en) Intelligent inspection method and system suitable for machine room management
CN105629196A (en) Positioning system based on machine vision and dynamic fingerprint and corresponding method
CN103389486A (en) Control method and electronic device
CN111248815B (en) Method, device and equipment for generating working map and storage medium
CN112985263B (en) Method, device and equipment for detecting geometrical parameters of bow net
CN106155093A (en) A kind of robot based on computer vision follows the system and method for human body
CN109146866A (en) The method and device that robot handles weld seam
CN113675923A (en) Charging method, charging device and robot
CN106303409A (en) A kind of destination object combined tracking method and destination object combine tracking device
Kim et al. Recognition and localization of generic objects for indoor navigation using functionality
CN103006332A (en) Scalpel tracking method and device and digital stereoscopic microscope system
CN110881909A (en) Control method and device of sweeper

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20190905

Address after: Room 402, 4th floor, Kanghe Sheng Building, New Energy Innovation Industrial Park, No. 1 Chuangsheng Road, Nanshan District, Shenzhen City, Guangdong Province, 518000

Applicant after: Shenzhen Infinite Power Development Co., Ltd.

Address before: 518000 Block 503,602, Garden City Digital Building B, 1079 Nanhai Avenue, Shekou, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN WOTE WODE CO., LTD.

RJ01 Rejection of invention patent application after publication

Application publication date: 20180706