CN109965783A - Mobile robot - Google Patents

Mobile robot

Info

Publication number
CN109965783A
CN109965783A (application CN201910067596.XA)
Authority
CN
China
Prior art keywords
mobile robot
barrier
signal
intersection point
obstacle detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201910067596.XA
Other languages
Chinese (zh)
Inventor
南帅帅 (Nan Shuaishuai)
李焕强 (Li Huanqiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Feike Robot Co Ltd
Original Assignee
Shenzhen Feike Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Feike Robot Co Ltd
Priority to CN201910067596.XA
Publication of CN109965783A
Legal status: Withdrawn (Current)

Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 - Floor-sweeping machines, motor-driven
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 - Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated

Abstract

The present invention discloses a mobile robot that includes a body, a processing unit arranged in the body, and a proximity sensor. The proximity sensor includes a signal emitter and a signal receiver; the signal emitter has a main optical axis, the signal receiver has a field-of-view central axis, and the main optical axis and the field-of-view central axis intersect at a first intersection point. The processing unit is configured to control the mobile robot to execute a predefined action when it determines that an obstacle is located at the first intersection point. The present invention improves the accuracy of obstacle avoidance of the mobile robot.

Description

Mobile robot
Technical field
The present invention relates to the field of robot technology, and in particular to a mobile robot.
Background art
At present, mobile robots such as sweeping robots and other cleaning equipment are used more and more widely. Obstacle detection and avoidance remains a major problem that such robots face. In traditional obstacle-avoidance methods, differences in obstacle color and material cause large differences in the strength of the reflected signal; that is, obstacles of different materials reflect different signals. As a result, the robot's detection of obstacles is prone to misjudgment, which in turn affects the robot's movement.
Summary of the invention
To solve the above problems, embodiments of the present invention disclose a mobile robot.
In a first aspect, an embodiment of the present invention discloses a mobile robot, comprising:
a body;
a processing unit arranged in the body; and
a proximity sensor arranged on the body, comprising:
a signal emitter having a main optical axis; and
a signal receiver having a field-of-view central axis; the main optical axis and the field-of-view central axis intersect at a first intersection point;
the processing unit is configured to control the mobile robot to execute a predefined action when it determines that an obstacle is located at the first intersection point.
In a second aspect, an embodiment of the present invention discloses an obstacle-avoidance method applied to a mobile robot. The mobile robot includes a signal emitter having a main optical axis and a signal receiver having a field-of-view central axis; the main optical axis and the field-of-view central axis intersect at a first intersection point. The obstacle-avoidance method includes the following steps:
judging whether an obstacle is located at the first intersection point;
when it is determined that the obstacle is located at the first intersection point, controlling the mobile robot to execute a predefined action.
In a third aspect, an embodiment of the present invention discloses a computer-readable storage medium storing program instructions which, when invoked, are used to execute the obstacle-avoidance method of the second aspect. With the mobile robot, obstacle-avoidance method, and computer-readable storage medium disclosed herein, the mobile robot is controlled to execute the predefined action as soon as the obstacle is judged to be located at the first intersection point; misjudgments caused by differences in obstacle reflectivity can thus be avoided, and the accuracy of the judgment is improved.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a block diagram of the mobile robot in an embodiment of the present invention.
Fig. 2 is a schematic perspective view of the mobile robot in an embodiment of the present invention.
Fig. 3 is a schematic bottom view of the mobile robot in an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of the proximity sensor in an embodiment of the present invention.
Fig. 5 is a schematic diagram of the relationship between the proximity sensor and the first intersection point.
Fig. 6 is a schematic diagram of the angular relationship between the signal emitter and the signal receiver of Fig. 4.
Fig. 7 is a schematic diagram of the curves corresponding to different obstacles in an "output voltage vs. distance" coordinate system.
Fig. 8 is a schematic diagram of the relationship between the intersection points of the signal emitter and signal receiver and the output voltage of the signal receiver.
Fig. 9 is a flowchart of the obstacle-avoidance method in an embodiment of the present invention.
Fig. 10 is a flowchart of the obstacle-avoidance method in another embodiment of the present invention.
Specific embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
It should be noted that the terms used in the embodiments of the present application are only for the purpose of describing specific embodiments and are not intended to limit the application. The singular forms "a", "said", and "the" used in the embodiments of the present application and in the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated listed items.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified and limited, the terms "mounted", "connected", and "coupled" should be understood broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection, an indirect connection through an intermediary, or internal communication between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances. In this application, "A and/or B" includes A and B, A or B.
Referring to Fig. 1, the embodiments of the present invention are described by taking a cleaning robot as an example of the mobile robot 100. In other alternative embodiments, the mobile robot 100 may also be a companion robot, a food-delivery robot, a reception robot, a remote-controlled camera robot, or the like.
An embodiment of the mobile robot 100 provided by the present application includes: a battery unit 10, a driving unit 20, a left wheel 21, a right wheel 22, a guide wheel 23, a cleaning unit 30, a processing unit 40, a storage unit 50, and a detection device 70; the detection device 70 includes a proximity sensor 60.
The battery unit 10 includes a rechargeable battery, a charging circuit connected to the rechargeable battery, and electrodes of the rechargeable battery. The number of rechargeable batteries is one or more, and the rechargeable battery can provide the electrical energy required for the operation of the mobile robot. The electrodes may be arranged on the side or the bottom of the body of the mobile robot. The battery unit 10 may also include a battery parameter detection component for detecting battery parameters such as voltage, current, and battery temperature. When the operating mode of the mobile robot 100 switches to a recharging mode, the mobile robot 100 starts looking for a charging pile and uses the charging pile to charge itself.
The driving unit 20 includes a motor that applies a driving force. The driving unit 20 is connected to the cleaning unit 30, the left wheel 21, the right wheel 22, and the guide wheel 23. Under the control of the processing unit 40, the driving unit 20 can drive the cleaning unit 30, the left wheel 21, the right wheel 22, and the guide wheel 23. Alternatively, the driving unit 20 includes a cleaning driving subunit, a left-wheel driving subunit, a right-wheel driving subunit, and a guide-wheel driving subunit; the cleaning driving subunit is connected to the cleaning unit 30, the left-wheel driving subunit is connected to the left wheel 21, the right-wheel driving subunit is connected to the right wheel 22, and the guide-wheel driving subunit is connected to the guide wheel 23.
The left wheel 21 and the right wheel 22 (the left and right wheels may also be called travel wheels or driving wheels) are arranged symmetrically on opposite sides of the bottom of the machine body of the mobile robot 100. During cleaning they perform motor actions that include moving forward, moving backward, and rotating. The guide wheel 23 may be arranged at the front or the rear of the machine body.
Referring to Fig. 2 and Fig. 3 together, Fig. 2 and Fig. 3 are schematic views of the mobile robot 100 from two different perspectives. As shown in Fig. 3, the cleaning unit 30 includes a main brush 31 and one or more side brushes 32. The main brush 31 is mounted at the bottom of the body 101 of the mobile robot 100. Optionally, the main brush 31 is a drum-shaped brush that rotates like a roller relative to the contact surface. The side brushes 32 are mounted at the left and right edge portions of the front end of the bottom surface of the mobile robot, largely in front of the travel wheels. The side brushes 32 are used to clean the areas that the main brush 31 cannot reach. Moreover, a side brush 32 can not only rotate in place but can also be mounted so that it protrudes outward from the mobile robot 100, thereby enlarging the area that the mobile robot 100 can clean.
Specifically, the bottom of the mobile robot 100 is provided with the left wheel 21, the right wheel 22, the guide wheel 23, the cleaning unit 30, and the battery unit 10. The cleaning unit 30 includes the main brush 31 and the side brushes 32. The rechargeable battery of the battery unit 10 is enclosed inside the mobile robot 100 by a cover to prevent it from falling out. One of the electrodes 11 and 12 of the rechargeable battery is the positive electrode and the other is the negative electrode.
The detection device 70 is used to detect the surrounding environment of the mobile robot 100 so as to find environmental objects such as obstacles, walls, steps, and the charging pile used to charge the mobile robot 100. The detection device 70 is also used to provide the processing unit 40 with various position information and motion state information of the mobile robot. The detection device 70 may further include a cliff sensor, an ultrasonic sensor, an infrared sensor, a magnetometer, a three-axis accelerometer, a gyroscope, an odometer, an LDS (laser distance sensor), a camera, a Hall sensor, and the like. This embodiment does not limit the number or the positions of the detection devices 70.
The processing unit 40 is arranged on a circuit board in the body 101 of the mobile robot 100. It can draw an instantaneous map of the environment in which the mobile robot 100 is located according to the information about surrounding environmental objects fed back by the detection device 70 and a preset positioning algorithm. The processing unit 40 can also comprehensively judge the current working state of the mobile robot according to the distance information and speed information fed back by devices such as the cliff sensor, ultrasonic sensor, infrared sensor, magnetometer, accelerometer, gyroscope, and odometer. The processing unit 40 can be implemented by one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field-programmable gate arrays (FPGA), processing units, micro-processing units, microprocessors, or other electronic components, and is used to execute the obstacle-avoidance method disclosed in the embodiments of the present application.
The storage unit 50 is used to store instructions and data. The data include, but are not limited to, map data and temporary data generated while the mobile robot 100 is controlled to operate, such as position data and speed data of the mobile robot 100. The processing unit 40 can read the instructions stored in the storage unit 50 and execute the corresponding functions. The storage unit 50 may include a random access memory (RAM) and a non-volatile memory (NVM). The non-volatile memory may include a hard disk drive (HDD), a solid-state drive (SSD), a silicon disk drive (SDD), a read-only memory (ROM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
It can be understood that, in one or more embodiments, the mobile robot 100 may also include an input/output unit, a position measurement unit, a wireless communication unit, a display unit, and the like.
It should be noted that the connection relationships between the units or components in the mobile robot 100 are not limited to those shown in Fig. 1. For example, the processing unit 40 may be connected to other units or components through a bus.
It should be noted that the mobile robot 100 may also include other units or components, or may include only some of the above units or components. This embodiment does not limit this; the above mobile robot is used only as an example.
In one embodiment, the proximity sensor 60 is arranged on the outer periphery of the body 101 and is used to detect the surrounding environment of the mobile robot 100 so as to find environmental objects such as obstacles and walls. The number of proximity sensors 60 may be more than one; for example, a plurality of proximity sensors 60 may be arranged at intervals along the outer periphery of the body 101.
Referring also to Fig. 4, in one embodiment the proximity sensor 60 includes a paired signal emitter 61 and signal receiver 62. The signal emitter 61 may be an infrared emitter, a laser emitter, or an ultraviolet emitter; correspondingly, the signal receiver 62 may be an infrared receiver, a laser receiver, or an ultraviolet receiver.
In the embodiments of the present invention, the description uses an infrared emitter and an infrared signal receiver. The signal emitter 61 is configured to emit infrared light toward a detection surface (a wall or another obstacle), and the signal receiver 62 responds to infrared light from the environment and/or infrared light emitted by the signal emitter 61 and reflected by the detection surface; that is, the signal emitter 61 includes an infrared emitting tube and the signal receiver 62 includes an infrared receiving tube. In practice, ambient infrared light is mainly produced by sunlight, lamps, and the like, and in most cases the environment in which the mobile robot 100 works is inevitably affected by sunlight or lamps. Therefore, to reduce the influence of ambient infrared light, the infrared emitting tube is turned on for a period of time and the signal of the infrared receiving tube is acquired (denoted X1); the infrared emitting tube is then turned off for a period of time and the signal of the infrared receiving tube is acquired again (denoted X2). Since signal X1 is influenced both by the ambient infrared light and by the infrared light emitted by the signal emitter 61, while signal X2 is influenced only by the ambient infrared light, taking the difference between X1 and X2 eliminates the influence of the ambient infrared light. X1 and X2 are sampled at a plurality of sampling points, and the obstacle-detection signal intensity values at the plurality of sampling points can be obtained from X1 and X2. In one embodiment, the obstacle-detection value is the absolute value of the difference between X1 and X2 (denoted X3). X3 is only one possible embodiment; other embodiments, such as X3 = X2 - X1, are also possible.
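As a rough illustration of the differencing scheme described above, the following Python sketch computes one obstacle-detection intensity value by subtracting the ambient-only reading from the emitter-on reading. The callables, durations, and sample count are illustrative assumptions standing in for whatever driver interface the robot actually uses; they are not part of the disclosure.

```python
import time

def sample_intensity(read_receiver, set_emitter, on_s=0.001, off_s=0.001, n=8):
    """Return one obstacle-detection intensity value X3 = |X1 - X2|.

    read_receiver: callable returning one raw infrared-receiver reading.
    set_emitter:   callable taking True/False to switch the IR emitting tube.
    Both are placeholders for the robot's real driver interface.
    """
    set_emitter(True)
    time.sleep(on_s)
    # X1: ambient infrared plus the emitter's own reflected light.
    x1 = sum(read_receiver() for _ in range(n)) / n

    set_emitter(False)
    time.sleep(off_s)
    # X2: ambient infrared only.
    x2 = sum(read_receiver() for _ in range(n)) / n

    # Differencing removes the ambient contribution; the absolute value is one
    # possible form of the intensity value (X3 = X2 - X1 is another mentioned form).
    return abs(x1 - x2)
```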
In the prior art, X3 is compared with a preset value to judge whether the mobile robot 100 is close to a detection surface such as a wall or an obstacle. When the mobile robot 100 is not yet close to a detection surface, the infrared light emitted by the signal emitter 61 of the proximity sensor 60 radiates into the surroundings and only a very small amount of reflected infrared light is received by the signal receiver 62, so the difference X3 is very small. When the mobile robot 100 is close to a detection surface such as a wall or an obstacle, the infrared light emitted by the signal emitter 61 strikes the detection surface and a large amount of reflected infrared light is received by the signal receiver 62, so the difference X3 is larger. Therefore, whether the mobile robot 100 is close to a detection surface such as a wall or an obstacle can be judged from the difference X3. When it is judged that the mobile robot 100 is close to such a detection surface, the mobile robot 100 is usually controlled to execute an avoidance action such as decelerating, turning, or backing up.
For example, when the mobile robot 100 encounters a detection surface such as a wall or an obstacle covered with a highly reflective white material, most of the infrared light emitted by the signal emitter 61 of the proximity sensor 60 is reflected because the white material is strongly reflective, and the difference X3 is therefore large. Combined with the judging method described above, a detection surface covered with white material is easily misjudged by the mobile robot 100 as being close, so the robot starts the wall-following mode while still far from the white wall, which leads to missed cleaning.
As another example, when the mobile robot 100 encounters a detection surface such as a wall or an obstacle covered with a black light-absorbing material, most of the infrared light emitted by the signal emitter 61 of the proximity sensor 60 is absorbed because the black material is strongly absorbent, and only a very small amount of reflected infrared light reaches the signal receiver; limited by the characteristics of the infrared receiving tube itself, the tube may not even turn on, so the difference X3 is zero or very small. Combined with the judging method described above, a detection surface covered with black light-absorbing material is easily misjudged by the mobile robot 100 as not yet being close, so the robot keeps moving at its original speed or even accelerates, causing the mobile robot 100 to collide violently with the wall or obstacle.
Therefore, in the embodiments of the present application, the obstacle-detection signal intensity value can be represented by X3; of course, other forms of the obstacle-detection signal intensity value can be obtained by transforming X3. To reduce the misjudgment of obstacles, whether the mobile robot 100 is close to a detection surface such as a wall or an obstacle can subsequently be judged from the rate of change of the obstacle-detection signal intensity value.
In one embodiment of the present invention, the signal emitter 61 has a main optical axis F0 and the signal receiver 62 has a field-of-view central axis R0, and the main optical axis F0 and the field-of-view central axis R0 intersect at a first intersection point B. The processing unit 40 is also used to judge whether an obstacle Z is located at the first intersection point B; when it determines that the obstacle Z is located at the first intersection point B, the processing unit 40 controls the mobile robot 100 to execute a predefined action.
It can be understood that, since the mobile robot 100 is continuously moving during operation while the position of the obstacle Z is relatively fixed, the position of the obstacle relative to the mobile robot 100 changes as the mobile robot 100 moves; when the obstacle Z is located at the first intersection point B, the mobile robot 100 executes the predefined action.
Because there is a reaction time between the moment the mobile robot 100 detects the position of the obstacle Z and the moment it makes the corresponding judgment and executes the predefined action, and the mobile robot 100 is still moving during this time, the two positions do not coincide exactly. For example, when the mobile robot 100 detects that the obstacle Z is located at the first intersection point B, the robot may already have moved forward 1 mm because of the reaction time, so that, relative to the direction of travel of the mobile robot 100, the first intersection point B has moved 1 mm past the obstacle Z. Therefore, when the obstacle Z lies within an allowable-error range corresponding to the first intersection point B, the obstacle Z can be regarded as being located at the first intersection point B. The allowable-error range can be determined according to the processing speed of the processing unit 40 used in the mobile robot 100 and the reaction speed of the driving unit 20. In one embodiment, the allowable-error range corresponding to the first intersection point B may be a surrounding region centered on the first intersection point B with a preset error as its radius.
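A minimal sketch of this tolerance test, assuming the allowable error is expressed as a radius around the first intersection point B and that an estimated obstacle distance is available (both the names and the way the distance is estimated are illustrative assumptions):

```python
def at_first_intersection(obstacle_distance_cm, t1_cm, allowed_error_cm=0.5):
    """True when the obstacle falls inside the allowable-error band centred on
    the first intersection point B, i.e. within allowed_error_cm of the
    configured distance T1 between the proximity sensor and point B."""
    return abs(obstacle_distance_cm - t1_cm) <= allowed_error_cm
```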
The signal emitted by the signal emitter 61 is strongest along the main optical axis F0, and the signal receiver 62 is most sensitive to signals arriving along the field-of-view central axis R0.
With the mobile robot 100 disclosed in the present application, as soon as it is judged that the obstacle Z is located at the first intersection point B, the processing unit 40 controls the mobile robot 100 to execute the predefined action; misjudgments caused by differences in the reflectivity of the obstacle Z can thus be avoided, and the accuracy of the judgment is improved.
In one embodiment, the predefined action includes at least one of the following: continuing to move toward the obstacle, wall-following, turning, turning and then wall-following, backing up, and turning and then moving forward. The wall-following referred to in this application is the wall-following mode known to those skilled in the art, i.e., a mode in which the robot travels along a wall or along the boundary of an obstacle.
For example, if it is detected that the obstacle Z is located at the first intersection point B in front of the mobile robot 100, the mobile robot 100 is controlled to turn and travel in another direction; or the processing unit 40 controls the mobile robot 100 to back up and then turn to continue the cleaning action; or the processing unit 40 controls the mobile robot 100 to clean along the edge of the obstacle, i.e., to execute the wall-following mode. If it is detected that the obstacle Z is located at the first intersection point B to the side of the mobile robot 100 (not in the direction of travel), the mobile robot 100 is controlled to follow the wall.
In one embodiment, the detection device 70 further includes an image sensor 71. The image sensor is used to acquire image data of the obstacle Z, and the processing unit 40 determines the type of the obstacle Z from the acquired image data and executes the predefined action corresponding to that type. For example, when the obstacle Z is a vase, the robot backs up; when the obstacle Z is a wall, the robot executes the wall-following mode. The image sensor may be, for example, a camera: image data of the obstacle Z can be acquired by the camera and the type of the obstacle Z determined using image techniques. The image techniques include, but are not limited to, image recognition, image matching, neural network algorithms, deep learning, and the like.
In one embodiment, the specific predefined action that the processing unit 40 controls the mobile robot 100 to execute is determined according to the current operating mode of the mobile robot 100 and the type of the obstacle Z. For example, when the mobile robot 100 is in a cleaning mode and the obstacle Z is judged to be a wall, the mobile robot 100 cleans along the wall; when the mobile robot 100 is in a recharging mode and the obstacle Z is judged to be a table or another obstacle that blocks travel, the mobile robot 100 changes its path to go around the obstacle and continue, or turns around. The specific behavior to execute can be set according to specific requirements.
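One conceivable way to organize this mode-and-type-dependent selection is a small lookup table; the mode names, type labels, and action strings below are illustrative assumptions rather than values named in the disclosure:

```python
# Hypothetical lookup table: (operating mode, obstacle type) -> predefined action.
ACTION_TABLE = {
    ("cleaning", "wall"):    "follow_wall",
    ("cleaning", "vase"):    "back_up_then_turn",
    ("recharging", "table"): "detour_and_continue",
}

def select_action(mode, obstacle_type, default="turn"):
    """Pick the predefined action for the current mode and recognised obstacle
    type, falling back to a default action for unlisted combinations."""
    return ACTION_TABLE.get((mode, obstacle_type), default)
```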
In one embodiment, the image sensor may include one or more of a depth sensor, an RGB image sensor, and a structured-light image sensor.
The depth sensor includes a two-dimensional camera that captures an image of the object to be captured, and an infrared sensor. The depth sensor outputs the image captured by the two-dimensional camera together with the range information obtained by the infrared sensor.
The RGB sensor can capture RGB images, also called color images; for example, the RGB sensor can be used to photograph a charging pile to obtain an RGB image that includes the charging pile.
In one embodiment, the signal emitter 61 and the signal receiver 62 are arranged side by side and at an angle to each other, so that the main optical axis F0 and the field-of-view central axis R0 are arranged at an angle and intersect at the first intersection point B outside the body 101.
In other embodiments, the signal emitter 61 and the signal receiver 62 may also be arranged in parallel. In that case, an optical element only needs to be arranged in the outgoing light path of the signal emitter 61 and in the received-light path of the signal receiver 62, respectively; the optical elements change the direction of the main optical axis F0 and of the field-of-view central axis R0, so that the main optical axis F0 after its optical element and the field-of-view central axis R0 after its optical element are arranged at an angle to each other.
Specifically, the signal emitter 61 also has a first emission boundary line F1 and a second emission boundary line F2 located on either side of the main optical axis F0. The signal receiver 62 also has a first field-of-view boundary line R1 and a second field-of-view boundary line R2 located on either side of the field-of-view central axis R0. The first emission boundary line F1 is closer to the signal receiver 62 than the second emission boundary line F2, and the first field-of-view boundary line R1 is closer to the signal emitter 61 than the second field-of-view boundary line R2. The first emission boundary line F1 and the first field-of-view boundary line R1 intersect at a second intersection point A, and the second emission boundary line F2 and the second field-of-view boundary line R2 intersect at a third intersection point C. The second intersection point A is closer to the mobile robot 100 than the first intersection point B, and the third intersection point C is farther from the mobile robot 100 than the first intersection point B.
The first emission boundary line F1 and the second emission boundary line F2 are the boundary lines at which the signal emitted by the signal emitter 61 is weakest; outside these two boundary lines, the signal emitted by the signal emitter 61 cannot be detected. The first field-of-view boundary line R1 and the second field-of-view boundary line R2 are the boundary lines at which the receiving sensitivity of the signal receiver 62 is weakest; that is, when the signal reflected by the obstacle Z lies outside the first field-of-view boundary line R1 and the second field-of-view boundary line R2, the signal receiver 62 does not receive the signal reflected by the obstacle Z.
As shown in Fig. 5, the distance between the first intersection point B and the proximity sensor 60 represents the distance between the obstacle Z and the proximity sensor 60 at the moment the mobile robot 100 detects the obstacle Z and needs to execute the predefined action. Here, the distance between the first intersection point B and the proximity sensor 60 refers to the distance from the first intersection point B to the line K between the signal emitter 61 and the signal receiver 62, i.e., the length T1 of the perpendicular segment from the first intersection point B to the line K. The line K between the signal emitter 61 and the signal receiver 62 refers to the line L between the center point of the signal emitter 61 and the center point of the signal receiver 62.
In one embodiment, the distance T1 between the first intersection point B and the proximity sensor 60 ranges from 0.1 cm to 15 cm. It can be understood that the distance D1 between the outer edge of the body 101 of the mobile robot 100 and the first intersection point B is set according to the avoidance requirement; for example, if the avoidance action must be executed when the outer edge of the body 101 is 2 cm from the obstacle, D1 can be set to 2 cm. The distance D2 between the proximity sensor 60 and the outer edge of the body 101 is also set in advance and can be adjusted according to the design requirement. For example, in practical applications, if the distance D2 between the proximity sensor 60 and the outer edge of the body 101 is set to 2 cm and the avoidance action must be executed when the distance between the outer edge of the body 101 and the obstacle reaches 2 cm, the distance T1 between the first intersection point B and the proximity sensor 60 needs to be set to 4 cm.
Of course, in some embodiments, the distance T1 between the first intersection point B and the proximity sensor 60 may also range from 0.1 cm to 5 cm. In this embodiment, since the distance between the obstacle and the mobile robot 100 is relatively small, the predefined action may include at least one of the following: turning, wall-following, turning and then wall-following, backing up, and turning and then moving forward.
In one embodiment, the predefined action can be pre-configured and includes at least one of the following: continuing to move toward the obstacle, wall-following, turning, turning and then wall-following, backing up, and turning and then moving forward. For example, if the distance T1 between the first intersection point B and the proximity sensor 60 is relatively large, the robot can first continue to move toward the obstacle and then turn and follow the wall, in order to prevent missed cleaning. If the distance T1 between the first intersection point B and the proximity sensor 60 is relatively small, the robot can turn and follow the wall directly.
In one embodiment, while the wall-following mode is being executed, the proximity sensor 60 detects the signal of the obstacle Z, and the processing unit 40, in response to that signal, controls the movement of the mobile robot 100 so that the obstacle Z stays in the region near the first intersection point B. It should be understood that, because of many factors such as the processing speed of the processing unit 40 used in the mobile robot 100 and the reaction speed of the driving unit 20, the position of the obstacle Z may deviate within an allowed range near the first intersection point B. In this way the mobile robot 100 can execute the wall-following mode stably.
During the wall-following mode, the way the proximity sensor 60 detects the signal of the obstacle Z is similar to the way the obstacle-detection signal mentioned in this application is acquired, and the details are not repeated here.
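One way to read this wall-following behaviour is as a simple extremum-seeking loop that steers so the side sensor's intensity stays near its peak, i.e. the wall stays near the first intersection point B. The sketch below is an assumption about what such a loop might look like; the gain, kick-start step, and steering interface are not specified in the disclosure:

```python
def wall_follow_correction(prev_intensity, intensity, prev_correction, step=0.02):
    """Crude extremum-seeking step for wall-following: keep steering in the
    direction that raised the side sensor's intensity (the wall moving toward
    point B), and reverse the steering direction once the intensity falls."""
    if prev_correction == 0.0:
        prev_correction = step        # kick-start with a small correction
    if intensity >= prev_intensity:
        return prev_correction        # intensity still climbing: keep direction
    return -prev_correction           # intensity falling: steer the other way
```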
In one embodiment, referring again to Fig. 6, which is a schematic diagram of the angular relationship between the signal emitter 61 and the signal receiver 62: by configuring the distance L between the signal emitter 61 and the signal receiver 62 and the angle θ between the signal emitter 61 and the signal receiver 62, the distance T1 between the first intersection point B and the proximity sensor 60 is configured, thereby adjusting the position of the first intersection point B. In one embodiment, the distance L between the signal emitter 61 and the signal receiver 62 is in the range of 1 cm to 5 cm. For example, as shown in Fig. 6, given L = 4 cm, T1 = 2 cm, and the mounting-angle parameters of the signal emitter 61 and the signal receiver 62 relative to the body 101, the angle θ can be derived, and the distance T1 between the first intersection point B and the proximity sensor 60 is thus configured.
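Assuming a symmetric mounting in which the emitter and receiver sit a distance L apart on line K and both axes tilt toward each other by the same amount (the symmetry is my assumption; the text only states that L and θ together determine T1), the relation between the included angle θ and the intersection distance T1 can be sketched as:

```python
import math

def t1_from_geometry(l_cm, theta_deg):
    """Perpendicular distance T1 from line K to the first intersection point B,
    for a symmetric emitter/receiver pair separated by L whose axes meet at an
    included angle theta."""
    return (l_cm / 2.0) / math.tan(math.radians(theta_deg) / 2.0)

def theta_from_geometry(l_cm, t1_cm):
    """Inverse relation: included angle needed to place point B at distance T1."""
    return 2.0 * math.degrees(math.atan((l_cm / 2.0) / t1_cm))

# With the example values from the text, L = 4 cm and T1 = 2 cm give theta = 90 degrees.
```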
In practical applications, the technical staff obtained, through a large number of experiments and statistical analysis of the data, the curves corresponding to different obstacles in the "output voltage vs. distance" coordinate system shown in Fig. 7. It should be noted that the "output voltage" in this coordinate system is a quantized form of the induced signal strength output by the signal receiver 62: the larger the output voltage, the stronger the signal. The "distance" in the coordinate system is the distance between the signal receiver 62 and the obstacle. Curve I corresponds to an obstacle with low reflectivity (such as a black light-absorbing material), and curve II corresponds to an obstacle with high reflectivity (such as a white reflective material). It can be seen from the figure that, regardless of the obstacle's reflectivity, when the distance between the obstacle and the proximity sensor 60 is greater than T1, the signal strength output by the signal receiver 62 gradually increases as the distance decreases; when the distance between the obstacle and the proximity sensor 60 is less than T1, the signal strength output by the signal receiver 62 gradually weakens as the distance decreases. In other words, whatever the obstacle's reflectivity, the signal strength is greatest when the obstacle is at the first intersection point B. Therefore, whether the obstacle is located at the first intersection point B can be judged from the signal strength output by the signal receiver 62: as long as the point at which the output signal of the signal receiver 62 is strongest is found, the obstacle can be determined to be located at the first intersection point B at that moment.
Referring also to Fig. 8, Fig. 8 is a schematic diagram of the relationship between the intersection points of the signal emitter 61 and the signal receiver 62 and the output voltage of the signal receiver 62. It can be seen from Fig. 8 that when the distance between the obstacle and the proximity sensor 60 is greater than the distance T2 between the third intersection point C and the proximity sensor 60, the signal receiver 62 produces essentially no output voltage; as the mobile robot 100 moves, when the obstacle is between the third intersection point C and the first intersection point B, the output voltage of the signal receiver 62 gradually increases; when the obstacle is between the first intersection point B and the second intersection point A, the output voltage of the signal receiver 62 gradually weakens; and when the distance between the obstacle and the proximity sensor 60 is less than the distance T3 between the second intersection point A and the proximity sensor 60, the signal receiver 62 again produces essentially no output voltage.
Therefore, in some embodiments, during the travel of the mobile robot 100 the proximity sensor 60 acquires obstacle-detection signal intensity values so as to obtain the obstacle-detection signal intensity values corresponding to a plurality of sampling points, and the processing unit 40 judges whether the obstacle Z is located at the first intersection point B according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points.
In one exemplary way, the processing unit 40 also acquires motion parameters of the mobile robot 100 and, from the obstacle-detection signal intensity values corresponding to the plurality of sampling points and the motion parameters corresponding to those sampling points, obtains a curve of the obstacle-detection signal versus the motion parameter; the first intersection point B is then calculated from the parameters of the curve. The motion parameters include, but are not limited to, distance, speed, and time, where, as shown in Figs. 7 and 8, the distance means the distance between the obstacle and the proximity sensor 60. The obstacle-detection signal includes a voltage signal, a current signal, or a photoelectric signal; for example, the X3 mentioned above can be a voltage signal, a current signal, or a photoelectric signal. In other words, by transforming the distance parameter and the originally collected signal, other curves equivalent to those in Figs. 7 and 8 can be obtained; only the physical meaning of the horizontal or vertical axis differs, while the principle is the same as that of the curves in Figs. 7 and 8. Examples include a "current vs. distance" curve, a "current vs. time" curve, a "current vs. speed" curve, a "voltage vs. time" curve, a "voltage vs. speed" curve, a "photoelectric signal vs. distance" curve, a "photoelectric signal vs. time" curve, a "photoelectric signal vs. speed" curve, and so on.
In one embodiment, judging whether the obstacle is located at the first intersection point B according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points includes: in the curve of the obstacle-detection signal versus the motion parameter, if an extreme point of the curve is obtained at the current sampling point, determining that the obstacle is located at the first intersection point B.
Specifically, the processing unit 40 calculates the difference between the obstacle-detection signal intensity values of two adjacent sampling points and divides the difference by the travel distance of the mobile robot 100 between those two sampling points to obtain the differential value of the signal intensity difference with respect to the travel distance; if the differential value at the current sampling point is 0, it is determined that the obstacle is located at the first intersection point B.
It can be understood that, since the signal strength output by the signal receiver 62 is greatest when the obstacle is located at the first intersection point B, the signal strength corresponding to point B is the extreme point of the aforementioned curve. Therefore, by differentiating the curve, when the derivative equals 0 it can be determined that the signal strength at the current sampling point is the turning point of the curve, i.e., that the obstacle is located at the first intersection point B at that moment.
In other embodiments, judging whether the obstacle is located at the first intersection point B according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points includes: comparing the obstacle-detection signal intensity value of the current sampling point with the obstacle-detection signal intensity value of the previous sampling point; when the obstacle-detection signal intensity value of the current sampling point is, for the first time, not greater than the obstacle-detection signal intensity value of the previous sampling point, determining that the obstacle is located at the first intersection point B.
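The two detection rules just described (a zero of the derivative with respect to travel distance, or the first sample whose intensity no longer exceeds the previous one) can be sketched as follows; the streaming interface of (travel distance, intensity) pairs is an illustrative assumption:

```python
def detect_first_intersection(samples):
    """samples: iterable of (travel_distance, intensity) pairs taken while the
    robot approaches the obstacle.  Returns the travel distance at which the
    obstacle is judged to be at point B: the first sample whose intensity stops
    increasing, which is where the finite-difference derivative of intensity
    with respect to travel distance falls to zero or below."""
    prev_dist, prev_val = None, None
    for dist, val in samples:
        if prev_val is not None and dist != prev_dist:
            # Finite-difference derivative d(intensity)/d(distance).
            derivative = (val - prev_val) / (dist - prev_dist)
            if derivative <= 0:
                return prev_dist      # peak reached: obstacle was at B here
        prev_dist, prev_val = dist, val
    return None                       # peak not reached within the samples
```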
In one embodiment, at each sampling point the processing unit 40 controls the signal emitter 61 to turn on for a first preset duration and acquires a first signal from the signal receiver 62; it then turns the signal emitter 61 off for a second preset duration and acquires a second signal from the signal receiver 62. The processing unit 40 obtains the obstacle-detection signal intensity value at each sampling point from the first signal and the second signal at that sampling point. For example, while the signal emitter 61 is on, the processing unit 40 obtains a plurality of on-state sample values through a sampling terminal and averages them to obtain the first signal; while the signal emitter 61 is off, the processing unit 40 obtains a plurality of off-state sample values through the sampling terminal and averages them to obtain the second signal.
In some embodiments, to further improve the accuracy of judging the obstacle's position, the proximity sensor 60 further includes a spacer 63. The spacer 63 is arranged between the signal emitter 61 and the signal receiver 62 and is used to change the first emission boundary line F1 and the first field-of-view boundary line R1. In this embodiment, providing the spacer 63 changes the first emission boundary line F1 and the first field-of-view boundary line R1 and thereby changes the position of the second intersection point A, so that the first intersection point B and the second intersection point A are closer together. This speeds up the decay of the obstacle-detection signal strength from strong to weak past the peak, making it easier to determine the point of maximum obstacle-detection signal strength and thereby improving the accuracy of the judgment.
In one embodiment, the proximity sensor 60 further includes a translucent cover 64 used to cover the proximity sensor 60; by providing the spacer, the second intersection point A is located outside the translucent cover 64, so that the second intersection point A is closer to the first intersection point B.
Further, in order not to affect the position of the first intersection point B, the spacer 63 should be located between the main optical axis F0 and the field-of-view central axis R0, and the outer surface of the spacer 63 should not touch the main optical axis F0 or the field-of-view central axis R0. In addition, to reduce the influence of light leaking from the signal emitter 61 on the light intensity received by the signal receiver 62, the spacer 63 is preferably a black spacer; it can be made of EVA (ethylene-vinyl acetate copolymer) or another light-blocking material.
Referring again to Fig. 9, which is a flowchart of the obstacle-avoidance method disclosed in an embodiment of the present application. The obstacle-avoidance method is applied to the mobile robot 100 described above and includes the following steps.
Step S91: judge whether the obstacle is located at the first intersection point; if so, execute step S92.
In some embodiments, during the travel of the mobile robot 100 the proximity sensor 60 acquires obstacle-detection signal intensity values so as to obtain the obstacle-detection signal intensity values corresponding to a plurality of sampling points, and the processing unit 40 judges whether the obstacle Z is located at the first intersection point B according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points.
In one embodiment, judging whether the obstacle is located at the first intersection point B according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points includes: in the curve of the obstacle-detection signal versus the motion parameter, if an extreme point of the curve is obtained at the current sampling point, determining that the obstacle is located at the first intersection point B.
Specifically, the processing unit 40 calculates the difference between the obstacle-detection signal intensity values of two adjacent sampling points and divides the difference by the travel distance of the mobile robot 100 between those two sampling points to obtain the differential value of the signal intensity difference with respect to the travel distance; if the differential value at the current sampling point is 0, it is determined that the obstacle is located at the first intersection point B.
In other embodiments, judging whether the obstacle is located at the first intersection point B according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points includes: comparing the obstacle-detection signal intensity value of the current sampling point with the obstacle-detection signal intensity value of the previous sampling point; when the obstacle-detection signal intensity value of the current sampling point is, for the first time, not greater than the obstacle-detection signal intensity value of the previous sampling point, determining that the obstacle is located at the first intersection point B.
In one embodiment, at each sampling point the processing unit 40 controls the signal emitter 61 to turn on for a first preset duration and acquires a first signal from the signal receiver 62; it then turns the signal emitter 61 off for a second preset duration and acquires a second signal from the signal receiver 62. The processing unit 40 obtains the obstacle-detection signal intensity value at each sampling point from the first signal and the second signal at that sampling point. For example, while the signal emitter 61 is on, the processing unit 40 obtains a plurality of on-state sample values through a sampling terminal and averages them to obtain the first signal; while the signal emitter 61 is off, the processing unit 40 obtains a plurality of off-state sample values through the sampling terminal and averages them to obtain the second signal.
Step S92: control the mobile robot 100 to execute a predefined action.
In one embodiment, the predefined action includes at least one of the following: continuing to move toward the obstacle, turning, turning and then wall-following, backing up, and turning and then moving forward.
Referring again to Fig. 10, which is a flowchart of the obstacle-avoidance method provided by another embodiment of the present application. Compared with the method of Fig. 9, the difference is that if the judgment in step S91 is negative, the obstacle-avoidance method further includes the following step.
Step S101: control the mobile robot to continue its existing behavior.
Here, the existing behavior refers to the behavior that the mobile robot 100 is currently executing; for example, controlling the cleaning robot to continue moving forward, to continue cleaning, and so on.
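Pulling the pieces together, the loop of Figs. 9 and 10 could be sketched roughly as follows; all of the `robot` methods are assumptions standing in for the routines discussed above and are not named in the disclosure:

```python
def avoidance_loop(robot):
    """Steps S91/S101/S92 as a loop: keep the current behaviour until an
    obstacle is judged to be at the first intersection point, then execute
    the predefined action."""
    while robot.is_running():
        intensity = robot.sample_intensity()          # emitter-on/off differencing
        robot.record_sample(intensity)
        if robot.obstacle_at_first_intersection():    # peak / comparison test (S91)
            action = robot.select_action()            # mode + obstacle type
            robot.execute(action)                     # predefined action (S92)
        else:
            robot.continue_current_behavior()         # existing behaviour (S101)
```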
In the several embodiments provided by this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division of the units is only a division by logical function, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purposes of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a device (which may be a personal computer, a server, a network device, a robot, a single-chip microcomputer, a chip, etc.) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The above are preferred embodiments of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications are also regarded as falling within the protection scope of the present invention.

Claims (15)

1. A mobile robot, characterized by comprising:
a body;
a processing unit arranged in the body; and
a proximity sensor arranged on the body, comprising:
a signal emitter having a main optical axis; and
a signal receiver having a field-of-view central axis; the main optical axis and the field-of-view central axis intersect at a first intersection point;
the processing unit is configured to control the mobile robot to execute a predefined action when it determines that an obstacle is located at the first intersection point.
2. The mobile robot of claim 1, wherein the distance between the first intersection point and the proximity sensor ranges from 0.1 cm to 15 cm.
3. The mobile robot of claim 2, wherein the predefined action comprises at least one of the following: continuing to move toward the obstacle, wall-following, turning, turning and then wall-following, backing up, and turning and then moving forward.
4. The mobile robot of claim 1, wherein the distance between the first intersection point and the proximity sensor ranges from 0.1 cm to 5 cm.
5. The mobile robot of claim 4, wherein the predefined action comprises at least one of the following: turning, wall-following, turning and then wall-following, backing up, and turning and then moving forward.
6. The mobile robot of claim 2 or 4, wherein the distance between the signal emitter and the signal receiver is in the range of 1 cm to 5 cm.
7. The mobile robot of claim 3 or 5, wherein, while the wall-following mode is being executed, the proximity sensor detects the signal of the obstacle, and the processing unit, in response to the signal, controls the movement of the mobile robot so that the obstacle stays in the region near the first intersection point.
8. The mobile robot of claim 1, wherein the signal emitter also has a first emission boundary line and a second emission boundary line located on either side of the main optical axis; the signal receiver also has a first field-of-view boundary line and a second field-of-view boundary line located on either side of the field-of-view central axis; the first emission boundary line is closer to the signal receiver than the second emission boundary line; the first field-of-view boundary line is closer to the signal emitter than the second field-of-view boundary line; the first emission boundary line and the first field-of-view boundary line intersect at a second intersection point; the second emission boundary line and the second field-of-view boundary line intersect at a third intersection point; the second intersection point is closer to the proximity sensor than the first intersection point, and the third intersection point is farther from the proximity sensor than the first intersection point; the proximity sensor further comprises a spacer; the spacer is located between the main optical axis and the field-of-view central axis without touching the main optical axis or the field-of-view central axis, and the spacer is used to change the first emission boundary line and the first field-of-view boundary line.
9. The mobile robot of claim 8, wherein the proximity sensor further comprises a translucent cover used to cover the proximity sensor, and the spacer causes the position of the second intersection point to be located outside the translucent cover.
10. The mobile robot of claim 1, wherein, during the travel of the mobile robot, the proximity sensor acquires obstacle-detection signal intensity values so as to obtain the obstacle-detection signal intensity values corresponding to a plurality of sampling points, and the processing unit judges whether the obstacle is located at the first intersection point according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points.
11. The mobile robot of claim 10, wherein judging whether the obstacle is located at the first intersection point according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points comprises: calculating the difference between the obstacle-detection signal intensity values of two adjacent sampling points, and dividing the difference by the travel distance of the mobile robot between the two sampling points to obtain the differential value of the obstacle-detection signal intensity difference with respect to the travel distance; when the differential value is 0, determining that the obstacle is located at the first intersection point.
12. The mobile robot of claim 10, wherein judging whether the obstacle is located at the first intersection point according to the obstacle-detection signal intensity values corresponding to the plurality of sampling points comprises: comparing the obstacle-detection signal intensity value of the current sampling point with the obstacle-detection signal intensity value of the previous sampling point; when the obstacle-detection signal intensity value of the current sampling point is, for the first time, not greater than the obstacle-detection signal intensity value of the previous sampling point, determining that the obstacle is located at the first intersection point.
13. The mobile robot of any one of claims 1 to 12, wherein, at each sampling point, the processing unit controls the signal emitter to turn on for a first preset duration and acquires a first signal from the signal receiver, and then turns the signal emitter off for a second preset duration and acquires a second signal from the signal receiver; the processing unit obtains the obstacle-detection signal intensity value at each sampling point from the first signal and the second signal at that sampling point.
14. The mobile robot of any one of claims 1 to 12, wherein the mobile robot further comprises an image sensor used to acquire image data of the obstacle; the processing unit determines the type of the obstacle from the image data and executes the predefined action corresponding to the type of the obstacle.
15. The mobile robot of any one of claims 1 to 12, wherein the mobile robot further comprises an image sensor used to acquire image data of the obstacle; the processing unit determines the type of the obstacle from the image data and, according to the type of the obstacle and the current operating mode of the mobile robot, executes the corresponding predefined action.
CN201910067596.XA 2019-01-24 2019-01-24 Mobile robot Withdrawn CN109965783A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910067596.XA CN109965783A (en) 2019-01-24 2019-01-24 Mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910067596.XA CN109965783A (en) 2019-01-24 2019-01-24 Mobile robot

Publications (1)

Publication Number Publication Date
CN109965783A true CN109965783A (en) 2019-07-05

Family

ID=67076699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910067596.XA Withdrawn CN109965783A (en) 2019-01-24 2019-01-24 Mobile robot

Country Status (1)

Country Link
CN (1) CN109965783A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112401752A (en) * 2020-11-04 2021-02-26 北京石头世纪科技股份有限公司 Method, device, medium and electronic equipment for detecting unknown obstacles
WO2022095488A1 (en) * 2020-11-04 2022-05-12 北京石头创新科技有限公司 Method and apparatus for detecting unknown obstacle, and medium and electronic device
CN113208510A (en) * 2021-05-13 2021-08-06 深圳市云视机器人有限公司 Sweeping robot device capable of being along wall
CN113208510B (en) * 2021-05-13 2022-04-15 深圳市云视机器人有限公司 Sweeping robot device capable of being along wall

Similar Documents

Publication Publication Date Title
US9474427B2 (en) Robot cleaner and method for controlling the same
CN111067439B (en) Obstacle processing method and cleaning robot
KR101887055B1 (en) Robot cleaner and control method for thereof
EP3505311B1 (en) Mobile robot and control method therefor
AU2016299576B2 (en) Mobile robot and control method thereof
CN110300537B (en) Dust collector and control method thereof
US9914219B2 (en) Robot cleaner and controlling method thereof
KR102388448B1 (en) Moving robot and controlling method thereof
EP3082543B1 (en) Autonomous mobile robot
CN109730590A (en) Clean robot and the method for clean robot auto-returned charging
EP1696296B1 (en) Robot cleaner and method of control thereof
CN110179404A (en) Clean robot, clean method and computer readable storage medium
CN110477825A (en) Clean robot, recharging method, system and readable storage medium storing program for executing
KR20090018562A (en) Robot cleaner and control method of the same of
CN104000541A (en) Floor sweeping robot with threshold detection function and threshold detection method thereof
CN109965783A (en) Mobile robot
US20220080600A1 (en) Particle filters and wifi robot localization and mapping
WO2023179616A1 (en) Cleaning robot, and method and apparatus for same returning to base, and method and apparatus for same moving out of base
WO2023193618A1 (en) Automatic cleaning devices, control method and storage medium
KR102206388B1 (en) Lawn mover robot and controlling method for the same
KR20190119222A (en) Robot cleaner
JP7329125B2 (en) Mobile robot and its control method
KR102115192B1 (en) A robot cleaner and an operation method for the same
KR20200100019A (en) Moving Robot and controlling method
CN218500628U (en) Cleaning device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20190705