CN105455743B - Robot cleaner and control method of a robot cleaner - Google Patents
- Publication number
- CN105455743B (application CN201510626586.7A)
- Authority
- CN
- China
- Prior art keywords
- room
- image
- robot cleaner
- label
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The present invention provides a robot cleaner and a control method for a robot cleaner. The control method includes: a step (a) of obtaining images of the surroundings while traveling in a cleaning area; a step (b) of deriving, from the images obtained in step (a), a feature distribution for each room according to a rule defined for that room; a step (c) of obtaining an image of the surroundings at the current position; a step (d) of applying, to the image obtained in step (c), the per-room rules applied in step (b) to derive comparison groups composed of feature distributions; and a step (e) of comparing the comparison groups derived in step (d) with the feature distributions of the rooms derived in step (b) to determine the room in which the robot cleaner is currently located. The robot cleaner of the present invention can accurately determine its own position based on global localization.
Description
Technical field
The present invention relates to a control method for a vacuum cleaner, and more particularly to a robot cleaner and a control method for a robot cleaner.
Background technology
Robots were developed for industrial use and have taken charge of part of factory automation. Recently, the fields in which robots are applied have continued to expand: medical robots, aerospace robots, and the like have been developed, and home robots usable in ordinary households have also been produced. Among these robots, a robot that can travel under its own power is called a mobile robot.
A representative mobile robot used in the home is the robot cleaner, a household appliance that cleans by traveling over an area to be cleaned and sucking in dust and foreign matter. A robot cleaner has a rechargeable battery and can travel autonomously; when the remaining battery charge is insufficient, or when cleaning is finished, it finds its way to the charging dock on its own and recharges.
An existing robot cleaner finds the charging dock by detecting, with an infrared sensor, the infrared (IR) light emitted from the dock. In addition, various methods by which a robot cleaner generates a map of the cleaning area on its own, based on peripheral information detected while traveling, are publicly known; a map generated in this way includes the position of the charging dock.
However, when charging is needed, the robot cleaner must return accurately to the charging dock, which presupposes that the robot cleaner can determine its own exact position on the map. If an external cause forcibly changes the position of a traveling robot cleaner (for example, when a user moves a traveling robot cleaner to another room), the robot cleaner can no longer determine where it is located on the map being generated or on an already generated map, and as a result cannot return to the charging dock. In this case, the robot cleaner searches again, from its current position, for the infrared signal sent from the charging dock. Such a search, carried out without knowing its own position, may happen to detect the signal, but in most cases the robot wanders while searching and may ultimately run the battery down completely. The same problem occurs when the robot cleaner regenerates a map of its surroundings from the position to which it was moved: as long as it cannot determine its own position on the overall map of the cleaning area, whether it can return to the charging dock depends on whether the signal from the dock can be detected from the changed position; that is, it must still search for the signal repeatedly. Therefore, a global localization method is needed by which a robot cleaner can determine its current position on the entire map.
Moreover, the failure to return to the charging dock described above is only one of the many problems that arise in a robot cleaner without a global localization capability; various other problems may also occur. For example, techniques for setting the cleaning range using a terminal (for example, a remote control or a smartphone) that communicates with the robot cleaner over a wireless network have recently become increasingly common, and in this case as well, setting the cleaning range via the terminal can only be realized accurately on the basis of global localization.
Summary of the invention
The technical problems to be solved by the present invention are as follows.
First, to provide a robot cleaner that travels based on global localization, and a control method therefor.
Second, to provide a robot cleaner that can accurately perform the action of returning to the charging dock, and a control method therefor.
Third, to provide a robot cleaner that, even if moved by a user to an arbitrary other position while traveling, can promptly identify its own position, and a control method therefor.
Fourth, to provide a robot cleaner that, even when it has lost its own position on the map, can promptly re-identify its current position based on an image obtained at the current position, and a control method therefor.
The control method of a robot cleaner of the present invention includes: a step (a) of obtaining images of the surroundings while traveling in a cleaning area; a step (b) of deriving, from the images obtained in step (a), a feature distribution for each room according to a rule defined for that room; a step (c) of obtaining an image of the surroundings at the current position; a step (d) of applying, to the image obtained in step (c), the per-room rules applied in step (b) to derive comparison groups composed of feature distributions; and a step (e) of comparing the comparison groups derived in step (d) with the feature distributions of the rooms derived in step (b) to determine the room in which the robot cleaner is currently located.
In another aspect, the control method of a robot cleaner of the present invention includes: obtaining images of the surroundings while moving in a cleaning area; according to a rule defined for each room, i) detecting features in the images obtained in each room, ii) deriving labels for the detected features, iii) assigning points to each label, and iv) deriving and storing a point distribution of the labels for each room; after the point distribution of the labels has been stored for each room, obtaining an image of the surroundings, generating labels for the features of that image according to the per-room rules applied in i), ii), iii), and iv), assigning points to the generated labels, and deriving comparison groups each composed of a point distribution of labels based on the rule of one room; and determining the room in which the robot cleaner is currently located based on the comparison groups and the per-room point distributions of labels derived in step iv).
The robot cleaner of the present invention includes: an image acquisition unit that obtains images of the surroundings; a feature distribution learning module that derives, from the images obtained by the image acquisition unit, a feature distribution for each room according to a rule defined for that room; and a location identification module that applies the per-room rules to an image obtained by the image acquisition unit at the current position to derive comparison groups composed of feature distributions, and compares the comparison groups with the per-room feature distributions derived by the feature distribution learning module to determine the room in which the robot cleaner is currently located.
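As an illustration only, the learning and localization pipeline described above can be sketched in a few lines of Python. Everything concrete here is an assumption made for the sketch: the patent does not fix how a room rule maps a feature to a label and a point value, how distributions are stored, or which distance measure is used for the comparison in step (e); an L1 distance is used below purely for illustration.

```python
# Minimal sketch of the claimed method (steps a-e). A "room rule" is
# modeled as a function mapping a raw feature to (label, points); an
# image's feature distribution is the per-label sum of points; the room
# whose learned distribution is closest to the comparison group wins.

def image_distribution(features, rule):
    """Label the features of one image under a room's rule and
    accumulate points per label (the 'histogram' of the image)."""
    dist = {}
    for f in features:
        label, points = rule(f)
        dist[label] = dist.get(label, 0.0) + points
    return dist

def room_distribution(images, rule):
    """Average the per-image distributions of one room (step b)."""
    total, n = {}, len(images)
    for feats in images:
        for label, pts in image_distribution(feats, rule).items():
            total[label] = total.get(label, 0.0) + pts
    return {label: pts / n for label, pts in total.items()}

def distance(d1, d2):
    """L1 distance between two label-point distributions (assumed)."""
    labels = set(d1) | set(d2)
    return sum(abs(d1.get(l, 0.0) - d2.get(l, 0.0)) for l in labels)

def localize(current_features, room_rules, room_dists):
    """Steps c-e: build one comparison group per room rule and pick
    the room whose learned distribution matches best."""
    best_room, best_d = None, float("inf")
    for room, rule in room_rules.items():
        comparison = image_distribution(current_features, rule)
        d = distance(comparison, room_dists[room])
        if d < best_d:
            best_room, best_d = room, d
    return best_room
```

With toy rules such as `lambda f: ("P1" if f < 5 else "P2", 1.0)`, `localize` returns the room whose learned distribution best matches the comparison group built from the features of the current image.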
Description of the drawings
Fig. 1 is a perspective view showing a robot cleaner according to an embodiment of the present invention and a charging dock for charging the robot cleaner.
Fig. 2 is a view showing the upper surface of the robot cleaner shown in Fig. 1.
Fig. 3 is a front view of the robot cleaner shown in Fig. 1.
Fig. 4 is a bottom view of the robot cleaner shown in Fig. 1.
Fig. 5 is a block diagram showing the control relationships between the main components of the robot cleaner.
Fig. 6 is an image captured in an arbitrary room in the cleaning area.
Fig. 7 is a schematic diagram showing the process by which the feature distribution learning module learns the map.
Fig. 8 is a view showing feature vectors extracted using the Scale-Invariant Feature Transform (SIFT) algorithm to detect features.
Fig. 9 is a view showing acquired images classified by room in the case where there are N rooms in the cleaning area.
Fig. 10 is a view in which the labels assigned to the features detected in the images classified by room are displayed on the acquired images.
Fig. 11 is a view showing the process of deriving the feature distribution of each room from the histograms derived from the images obtained in each of the N rooms.
Fig. 12 is a schematic diagram showing the process by which the location identification module identifies the position of the robot cleaner.
Fig. 13 is a view showing N comparison histograms generated by applying the per-room rules (the rules applied in the feature distribution learning process) to the image obtained at the current position of the robot cleaner, generating labels, and assigning points to the labels.
Fig. 14 is a view showing the matching of the histograms of the N rooms (the histograms of room 1 to room N) against the N comparison histograms (comparison histogram 1 to comparison histogram N).
Fig. 15 is a view comparing the comparison histogram K derived from the image obtained at the current position with the histogram derived from the images obtained in room K during the feature distribution learning process.
Detailed description of the embodiments
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, in which identical reference numerals indicate identical components.
The advantages and features of the present invention, and the methods for achieving them, will become clear with reference to the drawings and the embodiments described in detail below. However, the present invention is not limited to the embodiments disclosed below and may be implemented in various different forms. The present embodiments are provided to make the disclosure of the invention complete and to fully convey the scope of the invention to those skilled in the art; the present invention is defined by the scope of the claims.
Fig. 1 is a perspective view showing a robot cleaner 100 according to an embodiment of the present invention and a charging dock 200 for charging the robot cleaner. Fig. 2 is a view showing the upper surface of the robot cleaner shown in Fig. 1. Fig. 3 is a front view of the robot cleaner shown in Fig. 1. Fig. 4 is a bottom view of the robot cleaner shown in Fig. 1. Fig. 5 is a block diagram showing the control relationships between the main components of the robot cleaner.
Referring to Figs. 1 to 4, the robot cleaner 100 may include a main body 110 and an image acquisition unit 120 that obtains images of the surroundings of the main body 110. Hereinafter, in defining the parts of the main body 110, the part facing the ceiling of the cleaning area is defined as the upper surface portion (see Fig. 2), the part facing the floor of the cleaning area is defined as the bottom surface portion (see Fig. 4), and, of the parts surrounding the main body 110 between the upper surface portion and the bottom surface portion, the part facing the direction of travel is defined as the front portion (see Fig. 3).
The robot cleaner 100 is provided with at least one driving wheel 136 for moving the main body 110, and the driving wheel 136 is driven by a drive motor 139. Driving wheels 136 may be provided on the left and right sides of the main body 110, respectively; these are referred to below as the left wheel 136L and the right wheel 136R.
The left wheel 136L and the right wheel 136R may be driven by a single drive motor, but, if necessary, a left-wheel drive motor for driving the left wheel 136L and a right-wheel drive motor for driving the right wheel 136R may be provided. By making the rotational speeds of the left wheel 136L and the right wheel 136R different, the direction of travel of the main body 110 can be changed to the left or to the right.
A suction inlet 110h for sucking in air may be formed in the bottom surface portion of the main body 110, and the main body 110 may be provided with a suction device (not shown) that provides suction so that air is drawn in through the suction inlet 110h, and a dust container (not shown) for storing the dust drawn in together with the air through the suction inlet 110h.
The main body 110 may include a case 111 forming a space that accommodates the various components constituting the robot cleaner 100. An opening for inserting and removing the dust container may be formed in the case 111, and a dust container cover 112 that opens and closes the opening is provided so as to be rotatable relative to the case 111.
The robot cleaner 100 may be provided with a roll-type main brush 134 having bristles exposed through the suction inlet 110h, and an auxiliary brush 135 located on the front side of the bottom surface portion of the main body 110 and having bristles configured as multiple radially extending blades. By the rotation of these brushes 134, 135, dust is removed from the floor of the cleaning area, and the dust thus separated from the floor is drawn in through the suction inlet 110h and stored in the dust container.
The battery 138 supplies power not only to the drive motor but to the overall operation of the robot cleaner 100. When the battery 138 needs charging due to discharge, the robot cleaner 100 returns to the charging dock 200 to charge, and during such return travel the robot cleaner 100 can detect the position of the charging dock 200 on its own.
The charging dock 200 may include a signal transmitting unit (not shown) that transmits a prescribed return signal. The return signal may be, but is not limited to, an ultrasonic signal or an infrared signal.
The robot cleaner 100 may include a signal detecting unit (not shown) that receives the return signal. The charging dock 200 transmits an infrared signal through the signal transmitting unit, and the signal detecting unit may include an infrared sensor that detects the infrared signal. The robot cleaner 100 moves to the position of the charging dock 200 according to the infrared signal transmitted from the charging dock 200 and docks with the charging dock 200. By such docking, charging is carried out between the charging terminal 133 of the robot cleaner 100 and the charging terminal 210 of the charging dock 200.
The image acquisition unit 120 may include a digital camera for photographing the cleaning area. The digital camera may include: at least one optical lens; an image sensor (for example, a CMOS image sensor) including multiple photodiodes (for example, pixels) on which an image is formed by the light passing through the optical lens; and a digital signal processor (DSP) that constructs an image based on the signals output from the photodiodes. The digital signal processor can generate not only still images but also moving images composed of frames of still images.
Preferably, the image acquisition unit 120 is provided on the upper surface portion of the main body 110 and obtains images of the ceiling of the cleaning area; however, the position and imaging range of the image acquisition unit 120 are not limited to this. As an example, the image acquisition unit 120 may obtain images of the area in front of the main body 110.
In addition, the robot cleaner 100 may further include a cliff detection sensor 132 that detects whether a drop-off exists in the floor of the cleaning area, and a lower camera sensor 139 that obtains images of the floor.
Referring to Fig. 5, the robot cleaner 100 may include a control unit 140 and a storage unit 150. The control unit 140 controls the overall operation of the robot cleaner 100 by controlling its various components (for example, the image acquisition unit 120, the operation unit 137, and the drive motor 139), and may in particular include a travel control module 141, a feature detection module 142, a feature distribution learning module 143, and a location identification module 144.
The storage unit 150 stores the various kinds of information needed to control the robot cleaner 100, and may include a volatile or non-volatile storage medium. The storage medium stores data readable by a microprocessor, and may include a hard disk drive (HDD), a solid-state disk (SSD), a silicon disk drive (SDD), ROM, RAM, CD-ROM, magnetic tape, a floppy disk, an optical data storage device, and the like.
In addition, the storage unit 150 may store a map of the cleaning area. The map may be input from an external terminal that can exchange information with the robot cleaner 100 by wired or wireless communication, or may be learned and generated by the robot cleaner 100 itself. In the former case, examples of the external terminal include a remote control, a PDA, a laptop, a smartphone, or a tablet computer on which an application for setting the map is installed.
The map may show the positions of the rooms in the cleaning area. The current position of the robot cleaner 100 may be displayed on the map, and the current position of the robot cleaner 100 on the map may be updated during travel.
The travel control module 141 controls the travel of the robot cleaner 100, driving the drive motor 139 according to the travel settings. In addition, the travel control module 141 can determine the movement path of the robot cleaner 100 based on the operation of the drive motor 139. As an example, the travel control module 141 can determine the current or past movement speed and distance traveled of the robot cleaner 100 based on the rotational speed of the driving wheels 136, and can determine the current or past turning process based on the direction of rotation of each driving wheel 136L, 136R. Based on the travel information of the robot cleaner 100 determined in this way, the position of the robot cleaner 100 on the map can be updated.
While the robot cleaner 100 travels, the image acquisition unit 120 obtains images of the surroundings of the robot cleaner 100. Hereinafter, an image obtained by the image acquisition unit 120 is referred to as an "acquired image". Preferably, the image acquisition unit 120 obtains at least one acquired image for each room on the map. Fig. 6 is an image captured in one of the rooms in the cleaning area; various features on the ceiling, such as a lighting fixture 11 and a corner 12, can be identified in the image.
The feature detection module 142 detects features in each acquired image. In the field of computer vision, various methods of detecting features from an image (feature detection) are well known. The features include edges, corners, blobs, ridges, and the like, and various feature detectors for detecting these features are well known: for example, Canny, Sobel, the Plessey detector of Harris and Stephens, SUSAN, Shi-Tomasi, level-curve curvature, FAST, Laplacian of Gaussian, Difference of Gaussians, Determinant of Hessian, MSER, PCBR, and grey-level blob detectors.
Fig. 8 is a view showing feature vectors extracted using the SIFT algorithm to detect features. The SIFT algorithm is an image recognition algorithm that selects easily identifiable keypoints in an image, such as corner points, then computes, for the pixels in a prescribed region around each keypoint, a histogram of the distribution characteristics of the brightness gradients (the direction of brightness change and the sharpness of the change), and derives a 128-dimensional vector from the bin values of the histogram.
SIFT can detect features that are invariant to the scale, rotation, and brightness changes of the subject; therefore, even if the robot cleaner 100 photographs the same area while changing its posture, invariant (that is, rotation-invariant) features can be detected. Of course, the algorithm is not limited to SIFT, and other algorithms may also be applied (for example, Histogram of Oriented Gradients (HOG), Haar features, Ferns, Local Binary Patterns (LBP), and the Modified Census Transform (MCT)).
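The histogram-of-gradients idea behind the SIFT descriptor can be illustrated with a toy sketch. This is not the SIFT algorithm itself (real SIFT builds a 4x4 grid of 8-bin orientation histograms around each keypoint, giving the 128 dimensions mentioned above, with Gaussian weighting and normalization); it only shows the core step of binning gradient orientations, weighted by gradient magnitude, into a single 8-bin histogram whose bin values are read off as a descriptor.

```python
# Toy illustration of an orientation histogram for one image patch
# around a keypoint: central-difference gradients, binned by angle,
# weighted by magnitude.
import math

def orientation_histogram(patch, bins=8):
    """patch: 2-D list of brightness values around a keypoint."""
    hist = [0.0] * bins
    h, w = len(patch), len(patch[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            dx = patch[y][x + 1] - patch[y][x - 1]   # horizontal gradient
            dy = patch[y + 1][x] - patch[y - 1][x]   # vertical gradient
            magnitude = math.hypot(dx, dy)
            angle = math.atan2(dy, dx) % (2 * math.pi)
            hist[int(angle / (2 * math.pi) * bins) % bins] += magnitude
    return hist
```

For a patch whose brightness rises linearly from left to right, every gradient points in the same direction, so all of the weight lands in a single bin; a rotated copy of the patch shifts the weight to a different bin, which is why real SIFT first normalizes the patch to a dominant orientation to obtain rotation invariance.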
The feature distribution learning module 143 generates, based on the features detected by the feature detection module 142, labels corresponding to the features, assigns points to the labels, and derives the point distribution of the labels of each image (hereinafter, the label-point distribution of the image) and, from the label-point distributions of the images acquired in this way, the point distribution of the labels of each room (hereinafter, the label-point distribution of the room). This will be described in more detail later; in short, the label-point distribution of an image and the label-point distribution of a room are applied when the robot cleaner 100 cannot determine its own position from the map (for example, when the position of a robot cleaner 100 traveling according to the map is suddenly changed) and its current position must be determined by comparison with an image obtained at that position. Hereinafter, the series of processes up to deriving the label-point distributions of the rooms is referred to as the feature distribution learning process, and the process of identifying the current position of the robot cleaner 100 based on the label-point distributions of the rooms and of the images is referred to as the position identification process. Through the feature distribution learning process and the position identification process, the current position of the robot cleaner 100 can be determined over the entire extent of the cleaning area (global localization).
Fig. 7 is a schematic diagram showing the process of learning the map by the feature distribution learning module. Referring to Fig. 7, as described above, the feature detection module 142 performs feature detection on images acquired at multiple positions in the cleaning area. The images obtained in this way are stored in the storage unit 150 as a database classified by room. When there are N rooms in the cleaning area, at least one image is preferably obtained in each room, and the features are detected from the images obtained in this way. The images obtained in each room may consist of images taken while the robot cleaner 100 changes its posture in the room (for example, images obtained while the robot cleaner 100 rotates in place) or images taken while it changes position. The operation in which the robot cleaner 100 obtains images in a room while taking arbitrary postures or changing position is referred to as a feature detection operation. The feature detection operation may be performed in all of the rooms in the cleaning area; hereinafter, the process of covering the entire extent of the cleaning area while performing the feature detection operation is referred to as a feature detection run. The feature detection run may be performed according to a prescribed instruction input through the operation unit 137, or may be performed when the robot cleaner 100 returns to the charging dock 200 to charge. In the latter case in particular, once the feature detection run has been carried out and images have been obtained over the entire extent of the cleaning area and stored in the storage unit 150, the feature detection run need not be performed again even when the robot cleaner later returns to the charging dock.
Labels 721 are generated for the features detected in the acquired images, and points 722 are assigned to them. The generation of labels and the assignment of points are carried out according to a rule defined for each room, and the rule preferably differs from room to room.
As an example, Fig. 9 shows acquired images classified by room in the case where there are N rooms in the cleaning area. The arrows in eight directions shown in each image are descriptors representing features in the acquired image, obtained using the SIFT algorithm. The feature distribution learning module 143 classifies the descriptors according to the rule defined for each room and generates the same label for similar features. The labels are defined according to the rule defined for each room.
Fig. 10 is a view in which the labels assigned to the features detected in the images classified by room are displayed on the acquired images. In the embodiment, the features are classified into five labels according to the rule defined for each room, but the number of labels is not limited to this.
For the features detected in the images obtained in room 1, labels P11, P12, P13, P14, and P15 conforming to the rule of room 1 are generated. Likewise, for the features detected in the images obtained in room N, labels PN1, PN2, PN3, PN4, and PN5 conforming to the rule of room N are generated. In this way, labels are generated for all N rooms according to the rule defined for each room. The generation of labels is performed by the feature distribution learning module 143.
In addition, the feature distribution learning module 143 assigns points to the labels according to the rule defined for each room. As an example, as shown in Fig. 10, the labels P11, P12, P13, P14, and P15 generated from the acquired images obtained in room 1 are assigned 1.5, 2.1, 1.0, 2.3, and 1.5 points, respectively, while the labels PN1, PN2, PN3, PN4, and PN5 generated from the acquired images obtained in another room (for example, room N) are assigned points according to a standard different from that of room 1. In the embodiment, PN1, PN2, PN3, PN4, and PN5 are assigned 0.8, 1.1, 1.2, 2.5, and 1.3 points, respectively.
Labels have now been generated for each acquired image and points have been assigned to the generated labels, so the feature distribution of each image can be derived. Hereinafter, the feature distribution of each image is represented as a histogram in which the length of each bar is the appearance frequency of each label in the acquired image multiplied by its points.
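The per-image histogram described above (frequency multiplied by points, per label) can be sketched numerically using the example point values of the embodiment for room 1 (P11 to P15 scored 1.5, 2.1, 1.0, 2.3, 1.5). The feature counts in the example image are invented for illustration.

```python
# Sketch of one image's label-point histogram: each bar is the
# label's appearance frequency in the acquired image times its points.
from collections import Counter

# Example point values from the embodiment (room 1's rule).
ROOM1_POINTS = {"P11": 1.5, "P12": 2.1, "P13": 1.0, "P14": 2.3, "P15": 1.5}

def image_histogram(labels, points_table):
    """labels: the label of each feature detected in one acquired image."""
    freq = Counter(labels)
    return {lab: freq[lab] * pts for lab, pts in points_table.items()}

# Hypothetical image with 3 features labelled P11 and 1 labelled P14:
h = image_histogram(["P11", "P11", "P11", "P14"], ROOM1_POINTS)
# h == {"P11": 4.5, "P12": 0.0, "P13": 0.0, "P14": 2.3, "P15": 0.0}
```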
The feature distribution learning module 143 can derive the feature distribution of each room (for example, the label-point distribution of the room) from the feature distributions of the individual images (for example, the label-point distributions of the images). Fig. 11 is a view showing the process of deriving the feature distribution of each room from the histograms 1110 of the individual images acquired in each of the N rooms. Hereinafter, the feature distribution of each room is represented by a histogram 1120 and referred to as the histogram of the room.
The histogram 1120 of each room is obtained by averaging, per label, the points in the histograms of the individual images derived for that room. That is, as shown in Fig. 11, in the histograms of the individual images derived from the M images obtained in room 1 (the histogram of image 1 (room 1) through the histogram of image M (room 1)), the points are averaged per label (P11 to P15) to derive the histogram of room 1. The feature distribution learning module 143 carries out the above process for all of the rooms, whereby the histograms of the N rooms can be derived; the histograms of the rooms derived in this way are stored in the storage unit 150.
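The per-label averaging just described can be sketched as a label-wise mean of the M per-image histograms of a room. The example values below are invented for illustration.

```python
# Sketch of the room histogram: the label-wise mean of the M
# per-image histograms derived for that room.
def room_histogram(image_histograms):
    labels = image_histograms[0].keys()
    m = len(image_histograms)
    return {lab: sum(h[lab] for h in image_histograms) / m for lab in labels}

# Two hypothetical per-image histograms for room 1 (M = 2):
room1 = room_histogram([
    {"P11": 4.5, "P12": 0.0, "P13": 1.0, "P14": 2.3, "P15": 0.0},
    {"P11": 1.5, "P12": 4.2, "P13": 1.0, "P14": 0.0, "P15": 3.0},
])
# room1 == {"P11": 3.0, "P12": 2.1, "P13": 1.0, "P14": 1.15, "P15": 1.5}
```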
Since the feature distribution of each room (the histogram of the room) reflects the features of the corresponding room, it can serve as an index for identifying that room, while the feature distribution of each image can serve as an index for identifying, more specifically, the position within the room where the image was acquired. In particular, while the robot cleaner 100 travels normally under map-based control, the traveling control module 141 continuously monitors the position of the robot cleaner 100 on the map; therefore, when an image is acquired through the image acquisition unit 120, the position in the room at which the robot cleaner 100 is located is known. Accordingly, when considered together with the map-based location information of the robot cleaner 100, the feature distribution of each image can serve as an index identifying the specific position in the room where the image was acquired.
An index for identifying the room (the feature distribution of each room) and an index for identifying the position within the room (the feature distribution of each image) have now been obtained; the location identification process using these indices is described below.
Figure 12 is a schematic diagram showing the process by which the location identification module identifies the position of the robot cleaner. Referring to Figure 12, the location identification process comprises: a step in which the robot cleaner 100 acquires an image of the surroundings (for example, the ceiling) at its current location; a step of detecting features from the acquired image; and a step of applying the rule determined for each room (the rule applied to each room during feature distribution learning) to the detected features and obtaining N feature distributions.
Here, the processes of detecting features from the acquired image, generating labels, and assigning points to the labels can be performed according to the per-room rules applied in the feature distribution learning process. The number of per-room rules equals the number of rooms (N), and the N feature distributions obtained from the acquired image by applying these rules (hereinafter exemplified as histograms) form a comparative group for identifying the room in which the robot cleaner 100 is currently located. By comparing the feature distributions constituting this comparative group (hereinafter, the feature distributions of the comparative group) with the feature distributions of the rooms obtained during feature distribution learning, the location identification module 144 can determine the room from which the image acquired at the current location was obtained.
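Building the comparative group described above amounts to running the same current image through each room's rule. The sketch below is an assumption-laden simplification: it collapses each "rule determined for a room" to a per-room labeling function plus a points table (the real rule also governs feature detection), and all names and sample rules are hypothetical.

```python
def comparative_group(features, room_rules):
    """Apply each room's rule to the same feature set, yielding N histograms.

    room_rules: list of (label_fn, points) pairs, one per room, where
    label_fn maps a raw feature to that room's label and points maps
    label -> score under that room's rule.
    """
    group = []
    for label_fn, points in room_rules:
        labels = [label_fn(f) for f in features]
        freq = {}
        for lab in labels:
            freq[lab] = freq.get(lab, 0) + 1
        # bar length = occurrence frequency * points, per label
        group.append({lab: freq.get(lab, 0) * pts for lab, pts in points.items()})
    return group

# Two hypothetical room rules applied to the same four raw features.
rules = [
    (lambda f: "P1%d" % (f % 2 + 1), {"P11": 1.0, "P12": 2.0}),
    (lambda f: "P2%d" % (f % 3 + 1), {"P21": 0.5, "P22": 1.5, "P23": 2.5}),
]
group = comparative_group([0, 1, 2, 3], rules)
```

Because labeling criteria and points tables differ per room, the N histograms in the group generally differ even though they come from one image, which is what makes them discriminative.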
Because the feature distribution of each room is obtained by averaging, label by label, the points over the multiple images acquired in that room, it may not exactly match any feature distribution of the comparative group. The location identification module 144 therefore finds the room whose feature distribution most closely matches one of the feature distributions of the comparative group, and can thereby determine the room from which the image at the current location was acquired.
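One way to realize the closest-match test just described is sketched below. The patent does not specify a distance metric, so Euclidean distance is an assumption here, as are the function name `closest_room` and the sample data.

```python
import math

def closest_room(comparison_histograms, room_histograms):
    """comparison_histograms[k] was produced by applying room k's rule to
    the current image; room_histograms[k] was learned from room k's images.
    Returns the index of the room whose learned histogram is closest to
    the comparison histogram produced under that same room's rule."""
    def dist(a, b):
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    scores = [dist(c, r)
              for c, r in zip(comparison_histograms, room_histograms)]
    return scores.index(min(scores))

# Hypothetical two-room example: room 1's pair matches far better.
comps = [{"a": 1.0, "b": 2.0}, {"a": 4.0, "b": 0.5}]
rooms = [{"a": 9.0, "b": 9.0}, {"a": 4.1, "b": 0.4}]
best = closest_room(comps, rooms)
```

Note that each comparison histogram is only ever measured against the room histogram built under the same rule, since histograms from different rules live over different label sets.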
Hereinafter, the location identification process is described in more detail with reference to Figures 13 to 15. Figure 13 shows the process of applying the rule determined for each room (the rule applied in the feature distribution learning process) to the image acquired at the current location of the robot cleaner, generating labels, assigning points to the labels, and thereby obtaining N comparison histograms. The N comparison histograms are obtained from the same image; however, because each comparison histogram is obtained by applying the rule determined for a different room, the label generation criteria and the points assignment criteria differ, and the N comparison histograms obtained are, preferably, different from one another. The location identification module 144 then compares these N comparison histograms with the histograms of the N rooms obtained in the feature distribution learning process.
Figure 14 shows the matching of the histograms of the N rooms (the histogram of room 1 through the histogram of room N) with the N comparison histograms (comparison histogram 1 through comparison histogram N). As shown, the result of comparing each room histogram with the corresponding comparison histogram is that comparison histogram K, obtained by applying the rule of room K to the image acquired at the current location, is closest to the histogram of room K obtained in the feature distribution learning process. In this case, the location identification module 144 therefore determines that the robot cleaner 100 is currently located in room K.
As described above, the scheme in which the robot cleaner 100 determines its own position from the image acquired at its current position makes it possible to determine the current location even when the position of the robot cleaner 100 on the map changes discontinuously, for example when the robot is moved without passing through the traveling process of the traveling control module 141, so that its position on the map cannot be determined from the travel information obtained by the traveling control module 141 alone.
As another example, suppose that while the robot cleaner 100 is autonomously traveling and cleaning the living room according to the map stored in the storage part 150, the user picks it up, moves it to a room other than the living room, and cleaning continues in that room. Under the conventional scheme, which relies only on the travel record obtained by the traveling control module 141 to determine the robot's position on the map, the control unit 140 could only conclude that the robot cleaner 100 is still in the living room. According to the present invention, however, by performing the location identification process based on an image acquired in the new room, the room in which the robot cleaner 100 is currently located can be determined. The robot cleaner 100 thus re-establishes which room it occupies on the map, and map-based traveling control with whole-area position identification becomes feasible.
Figure 15 shows the comparison of comparison histogram K, obtained from the image acquired at the current location, with the histograms obtained from the images acquired in room K during the feature distribution learning process. Room K is the room of the current location of the robot cleaner 100 as determined by the location identification module 144, and the M histograms obtained from the M images acquired in room K during feature distribution learning are denoted the histogram of room-K image 1 through the histogram of room-K image M. Comparison histogram K is closest to the histogram of room-K image L, so the location identification module 144 can determine that the robot cleaner 100 is located, within current room K, at the position where image L was obtained among the M images. The place where image L was obtained can be determined on the map from the travel record acquired by the traveling control module 141 during the feature distribution learning process.
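The within-room step of Figure 15 can be sketched the same way as the room-matching step: compare comparison histogram K with the histograms of room K's M training images and take the closest one. Euclidean distance is again an assumption (the patent leaves the metric unspecified), and the names and sample histograms below are hypothetical.

```python
import math

def closest_image(comparison_hist, image_histograms):
    """Return the index of the training-image histogram closest to the
    comparison histogram of the identified room; that index plays the
    role of 'image L' in Figure 15."""
    def dist(a, b):
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    dists = [dist(comparison_hist, h) for h in image_histograms]
    return dists.index(min(dists))

comp_k = {"PK1": 1.0, "PK2": 2.0}           # histogram of the current image
room_k_images = [                            # histograms of room K's M images
    {"PK1": 5.0, "PK2": 5.0},
    {"PK1": 1.1, "PK2": 2.2},
    {"PK1": 0.0, "PK2": 9.0},
]
image_l = closest_image(comp_k, room_k_images)
```

The winning index is then mapped back to a map position via the travel record logged when that training image was acquired.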
The robot cleaner and the control method of the robot cleaner according to the present invention have the following effects:
First, because position identification covers the whole area, the robot can accurately determine its own position.
Second, the action of returning to the charging dock can be executed accurately.
Third, even if the robot is moved by the user to an arbitrary other position while traveling, it can re-identify its own position afterwards, preventing the prior-art phenomenon of wandering because the robot cannot determine its own position.
Fourth, even when the robot loses its position on the map, it can quickly re-identify its current position based on an image acquired at the current location.
Fifth, the image acquired at the robot cleaner's current position is compared not with the features of all images acquired in the feature distribution learning process, but with the per-room feature distributions that aggregate the feature distributions obtained from the images of each room, thereby identifying the room in which the cleaner is currently located; the amount of data required for feature matching and the time consumed can thus be reduced.
This application claims priority from Korean patent application No. 10-2014-0131526, filed on September 30, 2014, the entire contents of which are incorporated herein by reference.
Claims (18)
1. A control method of a robot cleaner, characterized by comprising:
a step a of acquiring, for each of a plurality of rooms in a cleaning zone, a plurality of images of that room;
a step b of obtaining, based on the images acquired in step a and according to a rule determined for each room, per-room histograms each representing the feature distribution of one room, wherein the binary codes of the per-room histogram of a given room describe the features obtained from the plurality of images acquired in step a that correspond to that room, and each binary code of the per-room histogram is assigned points;
a step c of acquiring an image of the surroundings at a current location;
a step d of obtaining, for the image acquired in step c, per-image histograms each representing one feature distribution of the image, wherein each per-image histogram is obtained by applying the rule determined for one of the rooms to the features of the image acquired in step c, and the binary codes representing the features are assigned points; and
a step e of comparing the per-room histograms obtained in step b with the per-image histograms obtained in step d, so as to determine the room in which the robot cleaner is currently located.
2. The control method of a robot cleaner according to claim 1, characterized in that
step b comprises:
a step b-1 of obtaining the feature distributions of the images acquired in step a; and
a step b-2 of obtaining, based on the feature distributions of the images obtained in step b-1, the per-room histogram of each room from which the images were acquired.
3. The control method of a robot cleaner according to claim 2, characterized in that
step b-1 comprises:
a step of detecting features from the images acquired in step a,
a step of generating binary labels for the detected features, and
a step of assigning the points to the labels and obtaining the per-label points distribution of each image; and
in step b-2, a per-label points distribution is obtained for each room based on the per-label points distributions of the images.
4. The control method of a robot cleaner according to claim 3, characterized in that
in step b-1, the assignment of the points to the labels is performed according to the rule determined for each room.
5. The control method of a robot cleaner according to claim 3, characterized in that
the per-label points distribution obtained for each room in step b-2 is obtained by averaging, room by room, the per-label points distributions of the images acquired in that room.
6. The control method of a robot cleaner according to claim 2, characterized by further comprising:
a step of comparing, among the per-image histograms obtained in step d, the per-image histogram to which the rule of the room determined in step e as the room where the robot cleaner is currently located was applied, with the histograms obtained in step b from the images of that room, so as to determine in which region of the room the robot cleaner is located.
7. The control method of a robot cleaner according to claim 1, characterized by further comprising:
a step of determining the position of the robot cleaner based on a stored map of the cleaning zone;
wherein steps c to e are performed when the position of the robot cleaner cannot be determined from the map.
8. The control method of a robot cleaner according to claim 7, characterized in that
the map is input from an external terminal that communicates wirelessly with the robot cleaner.
9. The control method of a robot cleaner according to claim 1, characterized in that
the image of the surroundings is an image of the ceiling of the cleaning zone.
10. The control method of a robot cleaner according to claim 1, characterized in that
the rules determined for the respective rooms are different from one another.
11. A robot cleaner, characterized by comprising:
an image acquisition unit that acquires images of the surroundings;
a feature distribution learning module that obtains, based on the images acquired by the image acquisition unit and according to a rule determined for each room, per-room histograms each representing the feature distribution of one room, wherein the binary codes of the per-room histogram of a given room represent the features obtained from the plurality of images corresponding to that room, and each binary code of the per-room histogram is assigned points; and
a location identification module that obtains per-image histograms each representing one feature distribution of the image acquired by the image acquisition unit at the current location of the robot cleaner, each per-image histogram being obtained by applying the rule determined for one of the rooms to the features of the image acquired at the current location, the binary codes representing the features being assigned points;
wherein the location identification module compares the per-room histograms with the per-image histograms, so as to determine the room in which the robot cleaner is currently located.
12. The robot cleaner according to claim 11, characterized in that
the feature distribution learning module obtains the feature distributions of the images acquired by the image acquisition unit and, based on the feature distributions of the images, obtains the per-room histogram of each room from which the images were acquired.
13. The robot cleaner according to claim 11, characterized by further comprising:
a feature detection module that detects features from the images acquired by the image acquisition unit;
wherein the feature distribution learning module generates labels corresponding to the binary codes, assigns the points to the labels, obtains the per-label points distribution of each image, and obtains the per-label points distribution of each room based on the per-label points distributions of the images.
14. The robot cleaner according to claim 13, characterized in that
the assignment of the points to the labels is performed according to the rule determined for each room.
15. The robot cleaner according to claim 13, characterized in that
the per-label points distribution obtained for each room is obtained by averaging, room by room, the per-label points distributions of the images acquired in that room.
16. The robot cleaner according to claim 12, characterized in that
the location identification module compares, among the per-image histograms, the per-image histogram to which the rule of the room determined by the location identification module as the room where the robot cleaner is currently located was applied, with the histograms obtained from the images of that room, so as to determine in which region of the room the robot cleaner is located.
17. The robot cleaner according to claim 11, characterized in that
the image acquisition unit is capable of photographing the ceiling of the cleaning zone.
18. A control method of a robot cleaner, characterized by comprising:
acquiring images of the surroundings while moving within a cleaning zone;
according to a rule determined for each room, i) detecting features from the images acquired in each room, ii) obtaining labels for the detected features, iii) assigning points to each label, and iv) obtaining and storing a per-label points distribution for each room;
after the per-label points distributions have been stored for the rooms, acquiring an image of the surroundings, generating labels for the features of the acquired image by applying the rules determined for the rooms that were applied in i) to iv), assigning points to the generated labels, obtaining a comparative group consisting of the per-label points distributions based on the rules determined for the rooms, and determining the room in which the robot cleaner is currently located based on the comparative group and the per-label points distributions obtained for the rooms in iv).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2014-0131526 | 2014-09-30 | ||
KR1020140131526A KR101629649B1 (en) | 2014-09-30 | 2014-09-30 | A robot cleaner and control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105455743A CN105455743A (en) | 2016-04-06 |
CN105455743B true CN105455743B (en) | 2018-08-31 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103054522A (en) * | 2012-12-31 | 2013-04-24 | 河海大学 | Cleaning robot system based on vision measurement and measurement and control method of cleaning robot system |
CN103479303A (en) * | 2012-06-08 | 2014-01-01 | Lg电子株式会社 | Robot cleaner, controlling method of the same, and robot cleaning system |
Non-Patent Citations (1)
Title |
---|
Towards 3D Point cloud based object maps for household environments; Radu Bogdan Rusu et al.; Robotics and Autonomous Systems; 2008-12-31; Vol. 56; pp. 927-941 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101629649B1 (en) | A robot cleaner and control method thereof | |
KR102314539B1 (en) | Controlling method for Artificial intelligence Moving robot | |
CN109890574B (en) | Mobile robot and control method thereof | |
KR102203434B1 (en) | A robot cleaner and control method thereof | |
US11400600B2 (en) | Mobile robot and method of controlling the same | |
KR102235271B1 (en) | Moving Robot and controlling method | |
Lai et al. | A large-scale hierarchical multi-view rgb-d object dataset | |
KR102032285B1 (en) | Moving Robot and controlling method | |
US11547261B2 (en) | Moving robot and control method thereof | |
KR20190031431A (en) | Method and system for locating, identifying and counting articles | |
US11348276B2 (en) | Mobile robot control method | |
KR20180125010A (en) | Control method of mobile robot and mobile robot | |
KR102024094B1 (en) | A moving-robot using artificial intelligence and a controlling method for the same | |
Verma et al. | Object identification for inventory management using convolutional neural network | |
Lakshmi et al. | Neuromorphic vision: From sensors to event‐based algorithms | |
Ullah et al. | Rotation invariant person tracker using top view | |
KR102669126B1 (en) | Moving Robot and controlling method for thereof | |
Manderson et al. | Texture-aware SLAM using stereo imagery and inertial information | |
CN105455743B (en) | The control method of robot cleaner and robot cleaner | |
Roa-Garzón et al. | Vision-based solutions for robotic manipulation and navigation applied to object picking and distribution | |
Hadi et al. | Fusion of thermal and depth images for occlusion handling for human detection from mobile robot | |
KR20180048088A (en) | Robot cleaner and control method thereof | |
Ikeda et al. | A method to recognize 3D shapes of moving targets based on integration of inclined 2D range scans | |
WO2022089548A1 (en) | Service robot and control method therefor, and mobile robot and control method therefor | |
KR102048363B1 (en) | A moving-robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |