CN109984688A - Edge cleaning method for a robot, and robot - Google Patents
Edge cleaning method for a robot, and robot
- Publication number
- CN109984688A (application number CN201910313177.XA)
- Authority
- CN
- China
- Prior art keywords
- robot
- carpet
- clarity
- far
- judge
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/32—Carpet-sweepers
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4011—Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/04—Automatic control of the travelling movement; Automatic obstacle detection
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/06—Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning
Abstract
The present invention is applicable to the field of robot technology and provides an edge cleaning method for a robot, and a robot. The edge cleaning method comprises: judging whether the robot is moving on a carpet; if the robot is moving on a carpet, controlling the robot to perform cleaning movement in a direction away from the carpet; judging whether the robot has moved away from the carpet; and, if the robot has moved away from the carpet, controlling the robot to perform cleaning movement in a direction approaching the carpet. By controlling the robot to always perform cleaning movement in a direction close to the carpet, the present invention enables the robot to clean the outer edge of the entire carpet as far as possible, which helps improve cleaning efficiency and overall coverage.
Description
Technical field
The present invention relates to the field of robot technology, and more particularly to an edge cleaning method for a robot, a robot, and a computer-readable storage medium.
Background art
At present, carpets are laid on more and more floors to create a quiet and tidy atmosphere.
However, robots in the prior art can only clean part of the edge of a carpet, and their cleaning efficiency and overall coverage are not high.
Therefore, it is necessary to propose a new technical scheme to solve the above technical problems.
Summary of the invention
In view of this, the embodiments of the present invention provide an edge cleaning method for a robot, and a robot. By controlling the robot to always perform cleaning movement in a direction close to the carpet, the robot can be made to clean the outer edge of the entire carpet as far as possible, which helps improve cleaning efficiency and overall coverage.
A first aspect of the embodiments of the present invention provides an edge cleaning method for a robot, comprising:
judging whether the robot is moving on a carpet;
if the robot is moving on a carpet, controlling the robot to perform cleaning movement in a direction away from the carpet;
judging whether the robot has moved away from the carpet; and
if the robot has moved away from the carpet, controlling the robot to perform cleaning movement in a direction approaching the carpet.
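The four steps of the first aspect form a simple sense-act loop. The following Python sketch is illustrative only; the predicates `on_carpet` and `away_from_carpet` and the motion callback are hypothetical stand-ins for the hardware-specific routines the claims do not specify.

```python
# Minimal control-loop sketch of the claimed edge-cleaning method.
# The sensor predicates and the motion primitive are hypothetical
# placeholders, not part of the disclosure.

def edge_clean_step(on_carpet, away_from_carpet, move):
    """Run one decision step of the edge-cleaning loop.

    on_carpet, away_from_carpet: zero-argument predicates (sensor queries).
    move: callable taking 'away' or 'toward' (direction relative to carpet).
    Returns the direction chosen, for inspection.
    """
    if on_carpet():                 # S101: is the robot on the carpet?
        move('away')                # S102: clean while leaving the carpet
        return 'away'
    if away_from_carpet():          # S103: has it left the carpet?
        move('toward')              # S104: clean while approaching the edge
        return 'toward'
    return 'continue'               # otherwise keep the current motion


# Example: a robot currently on the carpet is steered away from it.
chosen = edge_clean_step(lambda: True, lambda: False, lambda d: None)
print(chosen)  # -> away
```

Calling the step repeatedly makes the robot oscillate around the carpet boundary, which is the hugging behavior the claims describe.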
A second aspect of the embodiments of the present invention provides a robot, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor implements the method of the first aspect when executing the computer program.
A third aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the method of the first aspect.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects. In this embodiment, it is first judged whether the robot is moving on a carpet; if the robot is moving on a carpet, the robot is controlled to perform cleaning movement in a direction away from the carpet; it is then judged whether the robot has moved away from the carpet; and if the robot has moved away from the carpet, the robot is controlled to perform cleaning movement in a direction approaching the carpet. By controlling the robot to always perform cleaning movement in a direction close to the carpet, the embodiments of the present invention enable the robot to clean the outer edge of the entire carpet as far as possible, which helps improve cleaning efficiency and overall coverage and offers strong usability and practicality.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the embodiments or the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1-a is a schematic flowchart of the edge cleaning method for a robot provided by Embodiment 1 of the present invention;
Fig. 1-b is a schematic effect diagram of the edge cleaning method for a robot provided by Embodiment 1 of the present invention;
Fig. 1-c is a schematic effect diagram of edge cleaning performed by a robot along a zigzag path;
Fig. 2 is a schematic flowchart of the edge cleaning method for a robot provided by Embodiment 2 of the present invention;
Fig. 3 is a schematic flowchart of the edge cleaning method for a robot provided by Embodiment 3 of the present invention;
Fig. 4 is a schematic structural diagram of the robot provided by Embodiment 4 of the present invention.
Specific embodiments
In the following description, specific details such as particular system structures and technologies are set forth for the purpose of illustration rather than limitation, so as to provide a thorough understanding of the embodiments of the present invention. However, it will be clear to those skilled in the art that the present invention may also be implemented in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the present invention.
It should be understood that, when used in this specification and the appended claims, the term "comprising" indicates the presence of the described features, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or sets thereof.
It should also be understood that the terms used in this specification are for the purpose of describing specific embodiments only and are not intended to limit the present invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" used in this specification and the appended claims refers to, and includes, any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to a determination". Similarly, the phrase "if it is determined" or "if [the described condition or event] is determined" may be interpreted, depending on the context, as "once it is determined", "in response to determining", "once [the described condition or event] is determined", or "in response to determining [the described condition or event]".
It should be understood that the magnitude of the step numbers in this embodiment does not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.
It should be noted that "first", "second", and the like in this embodiment are used to distinguish different regions, modules, and so on; they neither represent a sequential order nor restrict "first" and "second" to different types.
In order to explain the technical solutions of the present invention, specific embodiments are described below.
Embodiment 1
Fig. 1-a is a schematic flowchart of the edge cleaning method for a robot provided by Embodiment 1 of the present invention, and Fig. 1-b is a schematic effect diagram of the same method. The method may comprise the following steps:
S101: judge whether the robot is moving on a carpet.
Here, edge cleaning refers to the behavior in which the robot continuously uses its own sensors to detect the carpet boundary in the room it is in, and performs cleaning movement along, and close to, that boundary. The robot is a type of intelligent household appliance, including sweeping and floor-washing robots, floor-mopping robots, automatic cleaning machines, and intelligent vacuum cleaners, which can automatically complete floor cleaning in a room by means of certain artificial intelligence. The carpet is a carpet laid on the floor of the region to be cleaned, including long-pile carpets and plush carpets.
It should be noted that when the robot of the present invention encounters a carpet during edge cleaning, it does not treat the carpet as a conventional obstacle to be avoided, but continues to move forward onto the carpet; the situation in which the robot moves on the carpet may therefore occur.
In one embodiment, whether the robot is moving on a carpet can be judged by image processing techniques.
S102: if the robot is moving on a carpet, control the robot to perform cleaning movement in a direction away from the carpet.
In one embodiment, if the robot is moving on a carpet, the robot is controlled to move in a clockwise or counterclockwise direction so as to leave the central area of the carpet and reach its edge region. It should be understood that the motion trajectory of the robot then appears S-shaped.
It should be understood that whether the robot is controlled to move clockwise or counterclockwise depends on the mounting position of the optical-flow sensor. Specifically, when the optical-flow sensor is mounted at the left-front bottom of the robot, the robot should be controlled to move clockwise when it is moving on the carpet; when the optical-flow sensor is mounted at the right-front bottom of the robot, the robot should be controlled to move counterclockwise when it is moving on the carpet. Here, left-front and right-front, as well as clockwise and counterclockwise, are defined relative to the current direction of motion of the robot.
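The mounting-dependent rule above, together with its mirror in step S104, can be condensed into a small lookup. This is a sketch under the stated assumption that the arc direction simply reverses once the robot leaves the carpet; `sensor_side` is a hypothetical parameter naming the mounting side.

```python
# Sketch of the direction rule tied to the optical-flow sensor mounting.
# 'left' / 'right' is the sensor side relative to the robot's heading.

def turn_direction(sensor_side, on_carpet):
    """Return 'cw' or 'ccw' for the cleaning arc.

    On the carpet the robot arcs away from the carpet centre (S102);
    off the carpet it reverses the arc to hug the carpet edge (S104).
    """
    if sensor_side not in ('left', 'right'):
        raise ValueError('sensor is mounted left-front or right-front')
    leaving = {'left': 'cw', 'right': 'ccw'}[sensor_side]
    return leaving if on_carpet else ('ccw' if leaving == 'cw' else 'cw')

print(turn_direction('left', True))    # -> cw   (leave the carpet)
print(turn_direction('left', False))   # -> ccw  (approach the edge)
```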
S103: judge whether the robot has moved away from the carpet.
In one embodiment, whether the robot has moved away from the carpet can be judged by image processing techniques.
S104: if the robot has moved away from the carpet, control the robot to perform cleaning movement in a direction approaching the carpet.
It should be understood that the direction approaching the carpet in step S104 is opposite to the direction away from the carpet in step S102.
In one embodiment, if the robot has moved away from the carpet, the robot is controlled to move counterclockwise or clockwise so as to approach the edge region of the carpet.
It should be understood that, here too, whether the robot is controlled to move counterclockwise or clockwise depends on the mounting position of the optical-flow sensor. Specifically, when the optical-flow sensor is mounted at the left-front bottom of the robot, the robot should be controlled to move counterclockwise once it has moved away from the carpet; when the optical-flow sensor is mounted at the right-front bottom of the robot, the robot should be controlled to move clockwise once it has moved away from the carpet. As before, left-front, right-front, clockwise, and counterclockwise are defined relative to the current direction of motion of the robot.
Therefore, compared with the zigzag approach of Fig. 1-c, Embodiment 1 of the present invention controls the robot in a timely manner to perform cleaning movement in a direction away from the carpet once the robot has moved onto the carpet, and controls the robot in a timely manner to perform cleaning movement in a direction approaching the carpet once the robot has moved back onto the floor, so that the robot always cleans close to the outer edge of the carpet. This helps improve cleaning efficiency and coverage and offers strong usability and practicality.
Embodiment 2
Fig. 2 is a schematic flowchart of the edge cleaning method for a robot provided by Embodiment 2 of the present invention, which further refines and explains steps S101 and S103 of Embodiment 1. The method may comprise the following steps:
S201: judge whether the robot is moving on a carpet by means of an optical-flow sensor.
The optical-flow sensor is an optical sensor that converts optical signals into electrical signals. It mainly comprises a light source and a receiving unit; the light source includes a laser light source and/or an LED light source, and the receiving unit receives the light emitted by the light source after it has been reflected by a target object.
In one embodiment, the optical-flow sensor can be controlled to acquire image information of the region to be cleaned in real time.
It should be understood that the image information acquired by the optical-flow sensor depends on where it is mounted on the robot. When the optical-flow sensor is mounted at the left-front bottom or the right-front bottom of the robot, it acquires only the image information of the surface the robot is currently moving on: either only the floor, or only the carpet.
Since objects of different materials reflect light differently, pictures acquired of a floor laid only with ceramic tiles and of a floor laid only with carpet differ in clarity, other conditions being equal. By setting a suitable interval range, it can therefore be preliminarily determined whether the robot is currently moving on a carpet. Specifically, in one embodiment, step S201 may comprise:
A1: acquiring a first image of the region to be cleaned by the optical-flow sensor;
A2: processing the acquired first image to obtain a first clarity of the first image;
A3: judging whether the first clarity is within a first pre-set interval range;
A4: if the first clarity is within the first pre-set interval range, determining that the robot is moving on a carpet.
Here, the first image is a certain frame currently acquired by the optical-flow sensor, and is not necessarily the first frame of images. The first pre-set interval range may be the clarity of a reference picture, acquired in advance, that contains only a carpet of a certain color.
In one embodiment, the first pre-set interval range may be a specific numerical value.
In one embodiment, the processing in step A2 may include performing grayscale processing and noise reduction on the first image.
In one embodiment, step A2 may specifically comprise:
B1: performing grayscale processing on the acquired first image to obtain a first grayscale image;
B2: obtaining the gray value of each pixel in the first grayscale image;
B3: obtaining the first clarity of the first image according to the gray values of the pixels.
In one embodiment, step B3 may comprise:
C1: according to the gray values of the pixels, calculating the gray difference between each pair of neighboring pixels;
C2: squaring each calculated gray difference and summing the squares to obtain the first clarity of the first image.
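Steps B1 to C2 describe a sharpness measure built from squared gray-level differences. A minimal sketch, assuming a 2-D grayscale image given as a list of rows and taking both horizontal and vertical neighbors (the patent does not fix the neighborhood), is:

```python
# Sketch of the clarity (sharpness) measure of steps B1-C2: grayscale the
# frame, take differences between neighbouring pixels, square them, sum.

def clarity(gray):
    """Sum of squared gray-level differences between horizontal and
    vertical neighbours of a 2-D grayscale image (list of rows)."""
    total = 0
    rows, cols = len(gray), len(gray[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:                 # horizontal neighbour
                total += (gray[r][c + 1] - gray[r][c]) ** 2
            if r + 1 < rows:                 # vertical neighbour
                total += (gray[r + 1][c] - gray[r][c]) ** 2
    return total

flat  = [[100, 100], [100, 100]]   # uniform surface -> low clarity
edged = [[0, 255], [0, 255]]       # sharp edges -> high clarity
print(clarity(flat), clarity(edged))  # -> 0 130050
```

A blurry carpet frame yields small neighbor differences and hence a low score, which is exactly the signal the interval test in step A3 consumes.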
S202: if the robot is moving on a carpet, control the robot to increase its cleaning power and perform cleaning movement in a direction away from the carpet.
Here, increasing the cleaning power is relative to the robot's current cleaning power, and may specifically mean increasing the robot's current suction and/or the rotation speed of its brush.
It should be understood that after increasing its cleaning power, the robot can more quickly perform cleaning movement in a direction away from the carpet.
Step S202 is otherwise substantially the same as step S102 in Embodiment 1; for the specific implementation process, refer to the description of step S102, which is not repeated here.
S203: judge whether the robot has moved away from the carpet by means of the optical-flow sensor.
Since the optical-flow sensor in the present invention acquires only the image information of the surface the robot is currently moving on, it can be determined that the robot has moved away from the carpet when the acquired image information changes from that of the robot moving on the carpet to that of the robot moving on the floor. Specifically, in one embodiment, step S203 may comprise:
D1: acquiring a third image of the region to be cleaned by the optical-flow sensor;
D2: processing the acquired third image to obtain a third clarity of the third image;
D3: judging whether the third clarity is within a third pre-set interval range;
D4: if the third clarity is within the third pre-set interval range, determining that the robot has moved away from the carpet.
Here, the third image is a certain frame acquired at a moment after the current moment, and is not necessarily the third frame of images. The third pre-set interval range corresponds to the first pre-set interval range, and may be the clarity of a reference picture, acquired in advance, that contains only ceramic tiles of a certain color.
In one embodiment, the third pre-set interval range is larger than the first pre-set interval range.
In one embodiment, the third clarity in step D2 can be calculated using the method for calculating the first clarity described in step B3 of this embodiment.
S204: if the robot has moved away from the carpet, control the robot to perform cleaning movement in a direction approaching the carpet.
Step S204 is identical to step S104 in Embodiment 1; for the specific implementation process, refer to the description of step S104, which is not repeated here.
S205: judge whether the robot has completed the edge-cleaning task; if the robot has completed the edge-cleaning task, control the robot to reduce its cleaning power.
Here, reducing the cleaning power is relative to the cleaning power increased in step S202, and may specifically mean reducing the robot's current suction and/or the rotation speed of its brush.
It should be understood that reducing the cleaning power helps the robot save electricity.
In one embodiment, after the robot has completed the edge-cleaning task, the robot can be controlled to reduce its current cleaning power to the initial cleaning power.
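The power adjustments of steps S202 and S205 amount to boosting suction and brush speed on carpet and restoring the initial settings afterwards. The following sketch uses illustrative numbers and a hypothetical boost factor; neither appears in the disclosure.

```python
# Sketch of the power adjustments in S202 and S205: boost suction and
# brush speed while on carpet, then restore the initial settings once
# the edge-cleaning task is done. All numbers are illustrative.

class CleaningPower:
    def __init__(self, suction, brush_rpm):
        self.initial = (suction, brush_rpm)
        self.suction, self.brush_rpm = suction, brush_rpm

    def boost(self, factor=1.5):
        """S202: raise suction and/or brush speed on carpet."""
        self.suction = self.initial[0] * factor
        self.brush_rpm = self.initial[1] * factor

    def restore(self):
        """S205: drop back to the initial power after the task."""
        self.suction, self.brush_rpm = self.initial

power = CleaningPower(suction=10.0, brush_rpm=1200.0)
power.boost()
print(power.suction, power.brush_rpm)   # -> 15.0 1800.0
power.restore()
print(power.suction)                    # -> 10.0
```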
It should be understood that the robot completing the edge-cleaning task may mean that the robot has returned to its starting position, or that the robot was forced to terminate the cleaning task early because it encountered an obstacle during edge cleaning, for example when the carpet is laid in a corner.
S206: mark information relevant to the carpet on an electronic map.
In one embodiment, the profile information, orientation information, and area information relevant to the carpet can all be marked on the electronic map.
In one embodiment, the information relevant to the carpet can be marked on the electronic map in a manner that distinguishes it from other obstacles.
In one embodiment, the electronic map is the environment map constructed by the robot after cleaning along the walls.
In one embodiment, the electronic map is a grid map.
In one embodiment, after the information relevant to the carpet has been marked on the electronic map, the electronic map can be stored accordingly.
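Marking the carpet with a label distinct from ordinary obstacles, as in step S206, might look like the following grid-map sketch; the cell codes are illustrative, not part of the disclosure.

```python
# Sketch of step S206: labelling carpet cells on a grid map with a code
# distinct from the one used for ordinary obstacles.
# Illustrative cell values: 0 free, 1 obstacle, 2 carpet.

FREE, OBSTACLE, CARPET = 0, 1, 2

def mark_carpet(grid, cells):
    """Label the given (row, col) cells as carpet on the grid map."""
    for r, c in cells:
        grid[r][c] = CARPET
    return grid

grid = [[FREE] * 4 for _ in range(3)]
grid[0][0] = OBSTACLE                      # a wall cell from earlier mapping
mark_carpet(grid, [(1, 1), (1, 2), (2, 1), (2, 2)])
print(grid)
# -> [[1, 0, 0, 0], [0, 2, 2, 0], [0, 2, 2, 0]]
```

Because the carpet label differs from the obstacle label, a later cleaning pass can treat the marked region specially (for example, by raising suction) instead of avoiding it.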
Therefore, compared with Embodiment 1, Embodiment 2 of the present invention gives a specific implementation of judging by means of an optical-flow sensor whether the robot is moving on a carpet: when the clarity of the acquired picture is low, it can be preliminarily judged that the robot is currently moving on a carpet, so that the robot is controlled to perform cleaning movement in a direction away from the carpet and approach the outer edge of the carpet. In addition, marking information relevant to the carpet on the electronic map facilitates subsequent cleaning of the region where the carpet is located, offering strong usability and practicality.
Embodiment 3
Fig. 3 is a schematic flowchart of the edge cleaning method for a robot provided by Embodiment 3 of the present invention, which further refines and explains steps S101 and S103 of Embodiment 1. The method may comprise the following steps:
S301: acquire a first image of the region to be cleaned by the optical-flow sensor, process the acquired first image to obtain a first clarity of the first image, and judge whether the first clarity is within the first pre-set interval range.
Step S301 is substantially the same as step S201 in Embodiment 2; for the specific implementation process, refer to the description of step S201, which is not repeated here.
S302: if the first clarity is within the first pre-set interval range, control the robot to switch the current first light source to a second light source, acquire a second image of the region to be cleaned by the optical-flow sensor, process the acquired second image to obtain a second clarity of the second image, calculate the difference between the first clarity and the second clarity, and judge whether the calculated difference is within a second pre-set interval range; if the difference is within the second pre-set interval range, determine that the robot is moving on a carpet.
It should be understood that the second image is a certain frame acquired at a moment after the current moment, and is not necessarily the second frame of images; the second pre-set interval range is used to further judge whether the robot is currently moving on a carpet.
In one embodiment, if the first light source is a laser light source, the second light source is an LED light source; if the first light source is an LED light source, the second light source is a laser light source.
In one embodiment, the second clarity can be calculated using the method for calculating the first clarity described in step B3 of Embodiment 2.
Table 1

| Material | Stainless steel floor | Heterochromatic patterned carpet | Brown patterned carpet | Shag carpet | Red carpet | White wall |
|---|---|---|---|---|---|---|
| LED | 338 | 153.9 | 330.8 | 380.3 | 246.7 | 99.7 |
| Laser | 3417.4 | 91.4 | 85.5 | 89.7 | 82.7 | 371.3 |
| Difference | 3079.4 | -62.5 | -245.3 | -290.6 | -164 | 271.6 |

| Material | Blue carpet | Black carpet | Dark brown plank | White plank | Ceramic tile | Black ceramic tiles |
|---|---|---|---|---|---|---|
| LED | 237.8 | 182.6 | 184.5 | 184.4 | 94.3 | 0 |
| Laser | 89.4 | 0 | 284.5 | 483.4 | 1072.3 | 1565.6 |
| Difference | -148.4 | -182.6 | 100 | 299 | 978 | 1565.6 |
It should be noted that, as Table 1 above shows, when the light source is switched from the laser light source to the LED light source, the clarity of a picture containing only carpet increases, so the difference between the first clarity and the second clarity calculated at this point is negative. Therefore, in one embodiment, the second pre-set interval range can be set to (-∞, 0).
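Using the values in Table 1, the embodiment-three check reduces to the sign of the clarity difference. This is a sketch under the assumption that the first frame is captured under the laser source and the second under the LED source, so the difference falls in the interval (-∞, 0) on carpet.

```python
# Sketch of the embodiment-three check: capture one frame under the laser
# source and one under the LED source; on carpet the laser frame is the
# blurrier of the two, so (laser_clarity - led_clarity) is negative.
# Sample values below are taken from Table 1.

def is_carpet(laser_clarity, led_clarity):
    """Carpet if the laser frame is less sharp than the LED frame."""
    return (laser_clarity - led_clarity) < 0

print(is_carpet(89.7, 380.3))    # shag carpet  -> True
print(is_carpet(1072.3, 94.3))   # ceramic tile -> False
```

Note from Table 1 that hard surfaces (steel, tile, planks) give large positive differences, so the sign test cleanly separates the two classes in the tabulated samples.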
S303: if the robot is moving on a carpet, control the robot to increase its cleaning power and perform cleaning movement in a direction away from the carpet.
S304: judge whether the robot has moved away from the carpet by means of the optical-flow sensor.
S305: if the robot has moved away from the carpet, control the robot to perform cleaning movement in a direction approaching the carpet.
S306: judge whether the robot has completed the edge-cleaning task; if the robot has completed the edge-cleaning task, control the robot to reduce its cleaning power.
S307: mark information relevant to the carpet on the electronic map.
Steps S303 to S307 are identical to steps S202 to S206 in Embodiment 2; for the specific implementation process, refer to the description of steps S202 to S206, which is not repeated here.
Therefore, compared with Embodiment 1, Embodiment 3 of the present invention gives a specific implementation of judging by means of an optical-flow sensor whether the robot is moving on a carpet, in which a further judgment is made after the preliminary judgment that the robot is moving on a carpet; this helps determine accurately whether the robot is currently moving on a carpet. In addition, marking information relevant to the carpet on the electronic map facilitates subsequent cleaning of the region where the carpet is located, offering strong usability and practicality.
Embodiment 4
Fig. 4 is a schematic structural diagram of the robot provided by Embodiment 4 of the present invention. As shown in Fig. 4, the robot 4 of this embodiment comprises a processor 40, a memory 41, and a computer program 42 stored in the memory 41 and runnable on the processor 40. When executing the computer program 42, the processor 40 implements the steps of method Embodiment 1 above, for example steps S101 to S104 shown in Fig. 1-a; or the steps of method Embodiment 2 above, for example steps S201 to S206 shown in Fig. 2; or the steps of method Embodiment 3 above, for example steps S301 to S307 shown in Fig. 3.
The robot may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will understand that Fig. 4 is only an example of the robot 4 and does not constitute a limitation on the robot 4; the robot may include more or fewer components than illustrated, combine certain components, or use different components. For example, the robot may also include input/output devices, network access devices, buses, and the like.
The processor 40 may be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the robot 4, such as a hard disk or memory of the robot 4. The memory 41 may also be an external storage device of the robot 4, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card (Flash Card) equipped on the robot 4. Further, the memory 41 may include both the internal storage unit and an external storage device of the robot 4. The memory 41 is used to store the computer program and the other programs and data needed by the robot, and may also be used to temporarily store data that has been output or is to be output.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, devices, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described or recorded in detail in a certain embodiment, reference may be made to the relevant descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the modules, units, and/or method steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions differently for each particular application, but such implementations should not be considered beyond the scope of the present invention.
In the several embodiments provided by the present invention, it should be understood that the disclosed system, device and method may be implemented in other ways. For example, the device embodiments described above are merely illustrative: the division of the units is only a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes of the methods of the above embodiments of the present invention may also be completed by instructing relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and, when executed by a processor, the steps of each of the above method embodiments can be realized. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of the technical features therein; such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. An edge-cleaning method for a robot, characterized by comprising:
judging whether the robot is moving on a carpet;
if the robot is moving on a carpet, controlling the robot to perform a cleaning movement in a direction away from the carpet;
judging whether the robot is away from the carpet;
if the robot is away from the carpet, controlling the robot to perform a cleaning movement in a direction toward the carpet.
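The four steps of claim 1 amount to a simple steering rule that makes the robot oscillate along the carpet boundary: head away while on the carpet, head back once clear. A minimal sketch in Python follows; the `Heading` enum, the "keep current heading" fallback, and the idea that the two sensor judgments arrive as booleans are illustrative assumptions, not details from the claim.

```python
from enum import Enum, auto
from typing import Optional

class Heading(Enum):
    AWAY_FROM_CARPET = auto()
    TOWARD_CARPET = auto()

def edge_cleaning_step(on_carpet: bool, away_from_carpet: bool) -> Optional[Heading]:
    """One decision step of the claim-1 loop.

    Steer away while on the carpet, steer back toward it once clear,
    so repeated steps sweep the robot along the carpet edge while cleaning.
    """
    if on_carpet:
        return Heading.AWAY_FROM_CARPET   # clean while moving off the carpet
    if away_from_carpet:
        return Heading.TOWARD_CARPET      # clean while moving back toward it
    return None                           # near the edge: keep current heading
```

Repeatedly applying this rule to fresh sensor readings (claims 2 and 5 obtain them from an optical flow sensor) traces the carpet boundary.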
2. The method according to claim 1, characterized in that judging whether the robot is moving on a carpet comprises:
judging, by an optical flow sensor, whether the robot is moving on a carpet.
3. The method according to claim 2, characterized in that judging by the optical flow sensor whether the robot is moving on a carpet comprises:
acquiring, by the optical flow sensor, a first image of an area to be cleaned;
processing the acquired first image to obtain a first clarity of the first image;
judging whether the first clarity is within a first preset interval;
if the first clarity is within the first preset interval, determining that the robot is moving on a carpet.
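Claim 3's carpet test reduces to computing a sharpness ("clarity") score for one image and checking it against a preset interval. The claim does not name a clarity metric, so the sketch below assumes variance of a 4-neighbour Laplacian, a common focus measure; the interval bounds are placeholders.

```python
def clarity(image):
    """Clarity (sharpness) score for a grayscale image given as a 2-D list:
    variance of the 4-neighbour Laplacian over interior pixels.
    (Assumed metric; the claim only says the image is 'processed'.)"""
    h, w = len(image), len(image[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (image[y - 1][x] + image[y + 1][x]
                   + image[y][x - 1] + image[y][x + 1]
                   - 4.0 * image[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def on_carpet(first_image, first_interval=(0.0, 50.0)):
    """Claim 3: the robot is judged to be on carpet when the first image's
    clarity falls inside a first preset interval (bounds are placeholders)."""
    lo, hi = first_interval
    return lo <= clarity(first_image) <= hi
```

The intuition is that carpet pile close to the optical flow sensor defocuses the near-field image, pushing its clarity into a characteristic band that hard floor does not produce.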
4. The method according to claim 3, characterized in that, if the first clarity is within the first preset interval, determining that the robot is moving on a carpet comprises:
if the first clarity is within the first preset interval, controlling the robot to switch from the current first light source to a second light source;
acquiring, by the optical flow sensor, a second image of the area to be cleaned;
processing the acquired second image to obtain a second clarity of the second image;
calculating the difference between the first clarity and the second clarity;
judging whether the calculated difference is within a second preset interval;
if the difference is within the second preset interval, determining that the robot is moving on a carpet.
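Claim 4 adds a confirmation stage: a first-stage clarity hit triggers a light-source switch and a second measurement, and carpet is confirmed only if the clarity difference also lands in a second preset interval. A hedged sketch follows; the callable names, interval bounds, and the use of plain callbacks for the hardware steps are all illustrative assumptions.

```python
from typing import Callable

def confirm_on_carpet(first_clarity: float,
                      switch_to_second_light: Callable[[], None],
                      second_clarity_fn: Callable[[], float],
                      first_interval=(0.0, 50.0),
                      second_interval=(-10.0, 10.0)) -> bool:
    """Claim 4: confirm a first-stage carpet hit under a second light source.

    The two hardware steps (switching the light source, acquiring and
    processing the second image) are passed in as callbacks so the decision
    logic stays testable.
    """
    lo1, hi1 = first_interval
    if not (lo1 <= first_clarity <= hi1):
        return False                      # first clarity outside the first interval: not carpet
    switch_to_second_light()              # switch from the first to the second light source
    second_clarity = second_clarity_fn()  # acquire the second image, obtain its clarity
    diff = first_clarity - second_clarity # difference of the two clarities
    lo2, hi2 = second_interval
    return lo2 <= diff <= hi2             # carpet only if the difference is also in range
```

The second measurement guards against false positives: a surface whose clarity merely happens to fall in the first band is unlikely to also show a carpet-like clarity shift when the illumination changes.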
5. The method according to claim 1, characterized in that judging whether the robot is away from the carpet comprises:
judging, by an optical flow sensor, whether the robot is away from the carpet.
6. The method according to claim 1, characterized in that, before controlling the robot to perform the cleaning movement in the direction away from the carpet, the method further comprises:
controlling the robot to increase its cleaning intensity.
7. The method according to any one of claims 1 to 6, characterized in that the method further comprises:
judging whether the robot has completed its edge-cleaning task;
if the robot has completed the edge-cleaning task, controlling the robot to reduce its cleaning intensity.
8. The method according to claim 7, characterized in that, after controlling the robot to reduce its cleaning intensity, the method further comprises:
marking information relevant to the carpet on an electronic map.
9. A robot, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910313177.XA CN109984688A (en) | 2019-04-18 | 2019-04-18 | A kind of clean method in Robot side and robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910313177.XA CN109984688A (en) | 2019-04-18 | 2019-04-18 | A kind of clean method in Robot side and robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109984688A true CN109984688A (en) | 2019-07-09 |
Family
ID=67134183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910313177.XA Pending CN109984688A (en) | 2019-04-18 | 2019-04-18 | A kind of clean method in Robot side and robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109984688A (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110477820A (en) * | 2019-08-16 | 2019-11-22 | 云鲸智能科技(东莞)有限公司 | Clean robot along barrier clean method, clean robot and storage medium |
CN110522360A (en) * | 2019-09-05 | 2019-12-03 | 深圳市杉川机器人有限公司 | Carpet detection method, device, sweeping robot and computer storage medium |
CN110870721A (en) * | 2019-11-26 | 2020-03-10 | 上海高仙自动化科技发展有限公司 | Control method and device for cleaning robot, cleaning robot and storage medium |
CN112515537A (en) * | 2020-11-20 | 2021-03-19 | 深圳市银星智能科技股份有限公司 | Walking ground recognition method and cleaning robot |
CN112790672A (en) * | 2021-02-10 | 2021-05-14 | 北京石头世纪科技股份有限公司 | Automatic cleaning equipment control method and device, medium and electronic equipment |
CN113693492A (en) * | 2021-02-10 | 2021-11-26 | 北京石头世纪科技股份有限公司 | Cleaning robot escaping method and device, medium and electronic equipment |
CN113693522A (en) * | 2021-02-10 | 2021-11-26 | 北京石头世纪科技股份有限公司 | Cleaning robot escaping method and device, medium and electronic equipment |
CN113974507A (en) * | 2021-09-30 | 2022-01-28 | 云鲸智能(深圳)有限公司 | Carpet detection method and device for cleaning robot, cleaning robot and medium |
WO2022095320A1 (en) * | 2020-11-04 | 2022-05-12 | 深圳市普森斯科技有限公司 | Carpet recognition-based cleaning method for robot mop, electronic device, and storage medium |
CN114587210A (en) * | 2021-11-16 | 2022-06-07 | 北京石头创新科技有限公司 | Cleaning robot control method and control device |
CN114641229A (en) * | 2019-08-26 | 2022-06-17 | 苏州宝时得电动工具有限公司 | Cleaning robot and control method thereof |
CN114652217A (en) * | 2022-03-02 | 2022-06-24 | 美智纵横科技有限责任公司 | Control method, cleaning robot, and storage medium |
CN115211763A (en) * | 2022-07-14 | 2022-10-21 | 北京石头世纪科技股份有限公司 | Identification method and equipment for automatic cleaning equipment and storage medium |
US11612295B2 (en) | 2021-01-04 | 2023-03-28 | Beijing Roborock Technology Co., Ltd. | Autonomous cleaning device |
WO2024067852A1 (en) * | 2022-09-28 | 2024-04-04 | 云鲸智能(深圳)有限公司 | Ground medium detection method and apparatus and cleaning device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040087183A (en) * | 2003-04-04 | 2004-10-13 | 엘지전자 주식회사 | Edge detection apparatus and method for mobile robot |
CN107045352A (en) * | 2017-05-31 | 2017-08-15 | 珠海市微半导体有限公司 | Based on how infrared robot obstacle-avoiding device, its control method and Robot side control method |
CN108115681A (en) * | 2017-11-14 | 2018-06-05 | 深圳先进技术研究院 | Learning by imitation method, apparatus, robot and the storage medium of robot |
CN108873186A (en) * | 2017-09-25 | 2018-11-23 | 北京石头世纪科技有限公司 | Optics module and Intelligent mobile equipment with it |
CN109241254A (en) * | 2018-08-06 | 2019-01-18 | 深圳市玖胜云智联科技有限公司 | A kind of corpus acquisition method and acquisition device applied to robot |
CN109330504A (en) * | 2018-10-27 | 2019-02-15 | 珊口(深圳)智能科技有限公司 | Clean robot and its mopping device |
CN109602338A (en) * | 2018-11-26 | 2019-04-12 | 深圳乐动机器人有限公司 | A kind of method, sweeping robot and floor-mopping robot cleaning ground |
- 2019-04-18: CN CN201910313177.XA patent/CN109984688A/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040087183A (en) * | 2003-04-04 | 2004-10-13 | 엘지전자 주식회사 | Edge detection apparatus and method for mobile robot |
CN107045352A (en) * | 2017-05-31 | 2017-08-15 | 珠海市微半导体有限公司 | Based on how infrared robot obstacle-avoiding device, its control method and Robot side control method |
CN108873186A (en) * | 2017-09-25 | 2018-11-23 | 北京石头世纪科技有限公司 | Optics module and Intelligent mobile equipment with it |
CN108115681A (en) * | 2017-11-14 | 2018-06-05 | 深圳先进技术研究院 | Learning by imitation method, apparatus, robot and the storage medium of robot |
CN109241254A (en) * | 2018-08-06 | 2019-01-18 | 深圳市玖胜云智联科技有限公司 | A kind of corpus acquisition method and acquisition device applied to robot |
CN109330504A (en) * | 2018-10-27 | 2019-02-15 | 珊口(深圳)智能科技有限公司 | Clean robot and its mopping device |
CN109602338A (en) * | 2018-11-26 | 2019-04-12 | 深圳乐动机器人有限公司 | A kind of method, sweeping robot and floor-mopping robot cleaning ground |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110477820A (en) * | 2019-08-16 | 2019-11-22 | 云鲸智能科技(东莞)有限公司 | Clean robot along barrier clean method, clean robot and storage medium |
CN110477820B (en) * | 2019-08-16 | 2021-11-16 | 云鲸智能科技(东莞)有限公司 | Obstacle following cleaning method for cleaning robot, and storage medium |
CN114641229A (en) * | 2019-08-26 | 2022-06-17 | 苏州宝时得电动工具有限公司 | Cleaning robot and control method thereof |
CN110522360A (en) * | 2019-09-05 | 2019-12-03 | 深圳市杉川机器人有限公司 | Carpet detection method, device, sweeping robot and computer storage medium |
CN110870721A (en) * | 2019-11-26 | 2020-03-10 | 上海高仙自动化科技发展有限公司 | Control method and device for cleaning robot, cleaning robot and storage medium |
WO2022095320A1 (en) * | 2020-11-04 | 2022-05-12 | 深圳市普森斯科技有限公司 | Carpet recognition-based cleaning method for robot mop, electronic device, and storage medium |
CN112515537B (en) * | 2020-11-20 | 2022-03-08 | 深圳市银星智能科技股份有限公司 | Walking ground recognition method and cleaning robot |
CN112515537A (en) * | 2020-11-20 | 2021-03-19 | 深圳市银星智能科技股份有限公司 | Walking ground recognition method and cleaning robot |
US11612295B2 (en) | 2021-01-04 | 2023-03-28 | Beijing Roborock Technology Co., Ltd. | Autonomous cleaning device |
CN115089056A (en) * | 2021-02-10 | 2022-09-23 | 北京石头创新科技有限公司 | Automatic cleaning equipment control method and device, medium and electronic equipment |
CN113693492A (en) * | 2021-02-10 | 2021-11-26 | 北京石头世纪科技股份有限公司 | Cleaning robot escaping method and device, medium and electronic equipment |
CN112790672A (en) * | 2021-02-10 | 2021-05-14 | 北京石头世纪科技股份有限公司 | Automatic cleaning equipment control method and device, medium and electronic equipment |
WO2022171144A1 (en) * | 2021-02-10 | 2022-08-18 | 北京石头创新科技有限公司 | Automatic cleaning device control method and apparatus, and medium and electronic device |
CN113693522A (en) * | 2021-02-10 | 2021-11-26 | 北京石头世纪科技股份有限公司 | Cleaning robot escaping method and device, medium and electronic equipment |
CN113693492B (en) * | 2021-02-10 | 2022-12-02 | 北京石头创新科技有限公司 | Cleaning robot escaping method and device, medium and electronic equipment |
CN113974507A (en) * | 2021-09-30 | 2022-01-28 | 云鲸智能(深圳)有限公司 | Carpet detection method and device for cleaning robot, cleaning robot and medium |
CN113974507B (en) * | 2021-09-30 | 2023-09-12 | 云鲸智能(深圳)有限公司 | Carpet detection method and device for cleaning robot, cleaning robot and medium |
WO2023088063A1 (en) * | 2021-11-16 | 2023-05-25 | 北京石头创新科技有限公司 | Control method and apparatus for cleaning robot |
CN114587210A (en) * | 2021-11-16 | 2022-06-07 | 北京石头创新科技有限公司 | Cleaning robot control method and control device |
CN114652217A (en) * | 2022-03-02 | 2022-06-24 | 美智纵横科技有限责任公司 | Control method, cleaning robot, and storage medium |
CN114652217B (en) * | 2022-03-02 | 2023-10-27 | 美智纵横科技有限责任公司 | Control method, cleaning robot, and storage medium |
CN115211763A (en) * | 2022-07-14 | 2022-10-21 | 北京石头世纪科技股份有限公司 | Identification method and equipment for automatic cleaning equipment and storage medium |
CN115211763B (en) * | 2022-07-14 | 2023-08-29 | 北京石头世纪科技股份有限公司 | Identification method and equipment for automatic cleaning equipment and storage medium |
WO2024067852A1 (en) * | 2022-09-28 | 2024-04-04 | 云鲸智能(深圳)有限公司 | Ground medium detection method and apparatus and cleaning device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109984688A (en) | A kind of clean method in Robot side and robot | |
CN109602338A (en) | A kind of method, sweeping robot and floor-mopping robot cleaning ground | |
Delage et al. | Automatic single-image 3d reconstructions of indoor manhattan world scenes | |
CN109664309A (en) | A kind of clean method, apparatus of intelligent robot and robot | |
CN102169093B (en) | Multi-station machine vision imaging detection method and system based on graphics processor | |
CN109375618A (en) | The navigation barrier-avoiding method and terminal device of clean robot | |
CN110464263A (en) | A kind of method and robot controlling robot cleaner | |
JP2019514126A5 (en) | ||
CN107703937A (en) | Automatic Guided Vehicle system and its conflict evading method based on convolutional neural networks | |
US11747278B2 (en) | Deposit detection device for detecting a partial covering location, a not-adhere location, and a diffuse reflection location | |
CN104282011A (en) | Method and device for detecting interference stripes in video images | |
CN109506331A (en) | A kind of method and air purifier of air cleaning | |
CN109001757A (en) | A kind of parking space intelligent detection method based on 2D laser radar | |
CN109508002A (en) | A kind of robot cleans the method, apparatus and robot on ground | |
CN104574274A (en) | Image processing method and system applying same | |
CN112388631A (en) | Method and device for cleaning planning area by robot | |
CN108009978A (en) | A kind of non-parallel triangle rasterization cellular construction of obstruction | |
CN110123208A (en) | A kind of method and robot controlling robot cleaner | |
EP3324367B1 (en) | Identifying primitives in input index stream | |
CN110315538A (en) | A kind of method, apparatus showing barrier on the electronic map and robot | |
CN109343701A (en) | A kind of intelligent human-machine interaction method based on dynamic hand gesture recognition | |
CN112085838A (en) | Automatic cleaning equipment control method and device and storage medium | |
CN112087573B (en) | Drawing of an environment | |
CN112150538B (en) | Method and device for determining vehicle pose in three-dimensional map construction process | |
CN107392387A (en) | A kind of dispatching method of AGV optimal control times |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20210330
Address after: 518000 room 1601, building 2, Wanke Yuncheng phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province
Applicant after: SHENZHEN LEDONG ROBOT Co.,Ltd.
Address before: 518000 18 / F, building B1, Nanshan wisdom Park, Tanglang, Xili Town, Nanshan District, Shenzhen City, Guangdong Province
Applicant before: Shenzhen Lexing World Technology Co.,Ltd.
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20190709 |
RJ01 | Rejection of invention patent application after publication |