CN109953695A - Autonomous traveling body - Google Patents

Autonomous traveling body

Info

Publication number
CN109953695A
CN109953695A
Authority
CN
China
Prior art keywords
mentioned
mark
unit
traveling
autonomous driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811522888.XA
Other languages
Chinese (zh)
Inventor
金山将也
丸谷裕树
杉本淳一
渡边浩太
洪庚杓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Lifestyle Products and Services Corp
Original Assignee
Toshiba Lifestyle Products and Services Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Lifestyle Products and Services Corp filed Critical Toshiba Lifestyle Products and Services Corp
Publication of CN109953695A publication Critical patent/CN109953695A/en
Pending legal-status Critical Current


Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 - Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2805 - Parameters or conditions being sensed
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface

Abstract

The present invention provides an autonomous traveling body (11) that enables a user to know that a mark defining a no-travel region has been set. The autonomous traveling body (11) includes a main body casing capable of traveling autonomously, a travel control section (61), a detection sensor (42) or mark detection section (69), and a communication unit (26). The travel control section (61) controls the travel of the main body casing. The detection sensor (42) or the mark detection section (69) detects a mark that is set in a travel target area and indicates a no-travel position. The communication unit (26) outputs information indicating that the mark has been set.

Description

Autonomous traveling body
Technical field
Embodiments of the present invention relate to an autonomous traveling body that travels autonomously.
Background art
Conventionally, in robots capable of autonomous travel, such as electric vacuum cleaners of the autonomous traveling type, there is known a system in which a no-entry region is set by placing, at a position the robot is not to enter, a virtual boundary based on infrared light or a boundary based on a tape, for example, so that the robot detects the boundary and travels without crossing it.
In recent years, there are cases where a camera is mounted on the robot and map data of a room or the like is created using images captured by the camera. In this case, since the robot cannot enter the no-entry region described above, the region is reflected in the map data merely as a simple wall or the like, and there is a concern that the user may forget that a no-entry region has been set, or forget the location where the no-entry region was set.
Summary of the invention
The problem to be solved by the present invention is to provide an autonomous traveling body capable of notifying a user that a mark defining a no-travel region has been set.
The autonomous traveling body of an embodiment includes a main body capable of traveling autonomously, a travel control unit, a detection unit, and an output unit. The travel control unit controls the travel of the main body. The detection unit detects a mark that is set in a travel target area and indicates a no-travel position. The output unit outputs information indicating that the mark has been set.
With the above configuration, the user can know that a mark defining a no-travel region has been set.
Brief description of the drawings
Fig. 1 is a block diagram showing the internal structure of an autonomous traveling body according to an embodiment.
Fig. 2 is a perspective view showing the autonomous traveling body.
Fig. 3 is a plan view of the autonomous traveling body as seen from below.
Fig. 4 is an explanatory diagram schematically showing an autonomous traveling body system including the autonomous traveling body.
Fig. 5 is a perspective view showing an example of a mark detected by a detection unit of the autonomous traveling body.
Fig. 6A is a perspective view showing another example of a mark detected by a detection unit of the autonomous traveling body.
Fig. 6B is a perspective view showing yet another example of a mark detected by the detection unit.
Fig. 7 is a flowchart showing travel control of the autonomous traveling body.
Fig. 8 is a plan view schematically showing an operation of detecting a boundary by the detection unit of the autonomous traveling body.
Fig. 9A is a plan view schematically showing an example of an actual travel target area.
Fig. 9B is a plan view showing map data corresponding to Fig. 9A.
Fig. 10A is a plan view schematically showing an example of an actual travel target area.
Fig. 10B is a plan view showing map data corresponding to Fig. 10A.
Fig. 11 is an explanatory diagram showing an example of a notification made when the period for which a mark has been set in the autonomous traveling body exceeds a predetermined period.
Description of embodiments
Hereinafter, the configuration of an embodiment will be described with reference to the drawings.
In Figs. 1 to 4, reference numeral 11 denotes an autonomous traveling body. The autonomous traveling body 11, together with a charging device (not shown) serving as a base unit, i.e. a charging base for the autonomous traveling body 11, constitutes an autonomous traveling body apparatus. In the present embodiment, the autonomous traveling body 11 is a self-propelled robot cleaner that cleans a floor, which is the surface to be cleaned, while autonomously traveling on that floor. Here, the self-propelled autonomous traveling body 11 is not limited to equipment that travels fully autonomously, and also includes equipment that travels under remote operation by an external device such as a remote controller. Furthermore, the autonomous traveling body 11 communicates, by wired communication or by wireless communication such as Wi-Fi or Bluetooth, with a home gateway 14 serving as a relay unit arranged in the cleaning area or the like, and can thereby perform wired or wireless communication, via a network 15 such as the Internet, with a server 16 serving as a data storage unit and with a general-purpose external device 17 having a display function or the like. In addition, the autonomous traveling body 11 can communicate wirelessly with the external device 17 inside the building or the like via the home gateway 14. Thus, the autonomous traveling body 11 constitutes an autonomous traveling body system 18 together with the external device 17 via the home gateway 14, the network 15, the server 16, and the like.
The autonomous traveling body 11 includes a main body casing 20 serving as a main body. The autonomous traveling body 11 also includes driving wheels 21 serving as a travel driving section. Further, the autonomous traveling body 11 may include a cleaning unit 22 that cleans dust on the floor. The autonomous traveling body 11 also includes a sensor unit 23. Further, the autonomous traveling body 11 may include an imaging unit 24. The autonomous traveling body 11 also includes a communication unit 26 serving as an output unit having a receiving section and a transmitting section. Further, the autonomous traveling body 11 includes a control unit 28, i.e. a controller, serving as control means. The autonomous traveling body 11 may also include a battery serving as a power supply unit for supplying power. In the following description, the directions along the traveling direction of the main body casing 20, indicated by arrows FR and RR in Fig. 2, are referred to as the front-rear direction, and the left-right direction, i.e. the two lateral directions orthogonal to the front-rear direction, is referred to as the width direction.
The main body casing 20 is formed in a shape capable of housing various devices and components. In the main body casing 20, a suction port 31 or the like serving as a dust collecting port may be provided in the lower portion facing the floor, or the like.
The driving wheels 21 are traveling wheels that cause the main body casing 20 to travel on the floor in the advancing direction and the retreating direction. In the present embodiment, a pair of driving wheels 21 is provided on the left and right of the main body casing 20, but the arrangement is not limited to this. The driving wheels 21 are driven by motors 33 serving as driving units. Instead of the driving wheels 21, crawlers (endless tracks) or the like may be used as the travel driving section.
The cleaning unit 22 removes dust on the floor. The cleaning unit 22 has, for example, a function of collecting and capturing dust on the floor through the suction port 31, a function of wiping the floor, and the like. The cleaning unit 22 may include at least one of an electric blower 35 that sucks in dust together with air through the suction port 31, a rotary brush 36 serving as a rotary cleaning body that is rotatably attached to the suction port 31 and scrapes up dust, a brush motor 37 that drives and rotates the rotary brush 36, side brushes 38 serving as auxiliary cleaning sections, i.e. swirling cleaning sections, that are rotatably attached to the peripheral portion of the main body casing 20 and gather dust, and side brush motors 39 that drive the side brushes 38. The cleaning unit 22 is configured to capture dust from the suction port 31 into a dust collecting unit 40.
The sensor unit 23 includes, for example, a detection sensor 42 serving as one detection unit. The sensor unit 23 may also include an obstacle detection unit, such as an infrared sensor or an ultrasonic sensor, that detects unevenness of the floor such as steps, and walls or obstacles that obstruct the travel of the autonomous traveling body 11. The sensor unit 23 may further include a dust amount detection unit that detects the amount of dust sucked in.
The detection sensor 42 detects a mark set in the travel target area such as a room. As the detection sensor 42, for example, an infrared sensor serving as an infrared detection unit that detects infrared light may be used, or a magnetic sensor serving as a magnetic detection unit that detects magnetism may be used, and both of these sensors may also be used together.
The mark detected by the detection sensor 42 indicates a no-travel position of the travel target area on the map data. As shown in Fig. 5, the mark MA1 may be a virtual wall or the like formed by an infrared signal IR output linearly in a predetermined direction from an entry prevention device VG set in the travel target area A, or may be a magnetic tape or the like affixed to the floor surface. In other words, the mark MA1 detected by the detection sensor 42 can be detected by at least one of infrared light and magnetism. The mark MA1 is detected by the detection sensor 42 at a position within a predetermined distance from the main body casing 20. That is, the position of the autonomous traveling body 11 at the time when a mark is detected by the detection sensor 42 is taken as a no-travel position. Further, as described later, a boundary can be obtained by connecting a plurality of no-travel positions indicated by the mark MA1. The boundary is a line that divides the travel target area of the main body casing 20 into a travelable region and a no-travel region. Therefore, by detecting a plurality of no-travel positions and obtaining a boundary based on them, the no-travel region can be obtained.
The imaging unit 24 includes cameras 51 serving as imaging unit bodies. The imaging unit 24 may also include a lamp 53 serving as an illumination unit that illuminates the imaging range of the cameras 51.
Each camera 51 is a digital camera that faces the traveling direction, i.e. the front, of the main body casing 20, and captures digital still images and/or moving images with a predetermined horizontal angle of view, for example 105°, in a direction parallel to the floor on which the main body casing 20 is placed. In the present embodiment, two cameras 51 are mounted as a pair on the main body casing 20. These cameras 51 are arranged apart from each other on the left and right at the front of the main body casing 20. Each camera 51 has a lens, a diaphragm, a shutter, an imaging element such as a CCD, an imaging control circuit, and the like.
The communication unit 26 can communicate, via the home gateway 14 and the network 15, with the server 16 and with the external device 17 outside the building. The communication unit 26 can also communicate with the external device 17 inside the building via the home gateway 14. The communication unit 26 can transmit data to the external device 17 and can receive signals from the external device 17. The communication unit 26 includes a wireless LAN device or the like serving as a wireless communication unit and as a cleaner signal receiving unit for wireless communication with the external device 17. An access point function may be mounted on the communication unit 26 so that it communicates wirelessly with the external device 17 directly, without going through the home gateway 14. A web server function may also be added to the communication unit 26.
For the control unit 28, a microcomputer including a CPU serving as a control unit body, a ROM, a RAM, and the like can be used. The control unit 28 is electrically connected to the motors 33, the cleaning unit 22, the sensor unit 23, the imaging unit 24, the communication unit 26, and the like. The control unit 28 of the present embodiment includes a travel control section 61 serving as a travel control unit. The control unit 28 also includes a cleaning control section 62 serving as a cleaning control unit. The control unit 28 further includes a sensor connection section 63 serving as a sensor control unit. The control unit 28 also includes a communication control section 64 serving as a communication control unit. The control unit 28 further includes an image data acquisition section 66 that acquires image data from the cameras 51 of the imaging unit 24. The control unit 28 also includes a processing section 67. The control unit 28 further includes a memory 68 serving as a storage unit. In addition, the control unit 28 is electrically connected to the battery. The control unit 28 may also include a charging control section that controls charging of the battery.
The travel control section 61 controls the driving of the driving wheels 21 by controlling the driving of the motors 33. The travel control section 61 has a cleaning mode, which is a travel mode in which a travel route is set in the travel target area, such as the room in which the autonomous traveling body 11 is placed, based on map data representing the travelable region, and the main body casing 20 is made to travel autonomously along the travel route within the travelable region by controlling the driving of the motors 33.
The cleaning control section 62 controls the operation of the cleaning unit 22. In the present embodiment, the cleaning control section 62 controls the outputs of the electric blower 35, the brush motor 37, and the side brush motors 39.
The sensor connection section 63 is electrically connected to the sensor unit 23. The sensor connection section 63 acquires the detection results of the sensor unit 23 and can output them to the travel control section 61 and the processing section 67.
The communication control section 64 is electrically connected to the communication unit 26. The communication control section 64 is also electrically connected to the processing section 67. The communication control section 64 processes the signals and data transmitted from the communication unit 26 and the signals and data received by the communication unit 26.
The image data acquisition section 66 acquires the images captured by the cameras 51 and outputs them to the processing section 67. The image data acquisition section 66 may have an image correction function of applying image processing, such as correction of distortion caused by the lenses of the cameras 51, removal of noise, adjustment of contrast, and alignment of the image centres, to the raw image data captured by the cameras 51.
The processing section 67 is electrically connected to the sensor connection section 63, the communication control section 64, and the image data acquisition section 66. The processing section 67 performs various kinds of image processing based on the data of the images captured by the cameras 51.
That is, the processing section 67 has a function of sensing the surroundings to create an area map representing the travelable region and of acquiring the self-position of the autonomous traveling body 11 in the area map. The processing section 67 of the present embodiment has a SLAM (Simultaneous Localization and Mapping) function. The SLAM function mounted on the autonomous traveling body 11 of the present embodiment uses stereo camera images. That is, the processing section 67 of the present embodiment extracts feature points, such as the corners of furniture, at the same position in the respective captured images of the two cameras 51, and obtains the distance from the imaging position to each feature point using the parallax between the two cameras 51. By repeating this distance acquisition, the situation around the autonomous traveling body 11, such as the shape of walls and the positions and sizes of furniture, becomes known, so that the processing section 67 can create, based on the surrounding situation, a travel map representing the travelable region of the main body casing 20. Further, the processing section 67 can acquire the self-position by associating the created map with the ranging information based on the captured images. In other words, the processing section 67 functions as a map creation unit and a self-position acquisition unit. When the self-position is acquired by the processing section 67, it is not necessarily required to create the map itself; a map acquired from outside may also be used. In addition, since a technique using an infrared sensor, a technique using an angular velocity sensor and an acceleration sensor in combination, or another well-known technique may be employed to realize the SLAM function instead of using stereo camera images, detailed description thereof is omitted. The map data created by the processing section 67 can be stored in the memory 68. When the shapes, arrangements, and the like of obstacles in the created map data do not match the shapes and arrangements of the detected surroundings, the processing section 67 can correct the map data as appropriate.
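As a rough illustration of the stereo ranging step described above, the following sketch estimates the distance to a matched feature point from the horizontal disparity between the two camera images, assuming an ideal rectified stereo pair; the focal length, baseline, and example pixel coordinates are placeholder values and do not correspond to anything disclosed in this embodiment.

```python
# Minimal sketch of stereo ranging for SLAM feature points (assumed, not from the patent).
# For a rectified stereo pair: distance = focal_length_px * baseline_m / disparity_px.

from dataclasses import dataclass

@dataclass
class StereoRig:
    focal_length_px: float  # focal length in pixels (hypothetical value)
    baseline_m: float       # spacing between the two cameras 51, 51 (hypothetical value)

    def distance_to_feature(self, x_left_px: float, x_right_px: float) -> float:
        """Distance from the imaging position to a feature point seen in both images."""
        disparity = x_left_px - x_right_px
        if disparity <= 0:
            raise ValueError("feature must have positive disparity in a rectified pair")
        return self.focal_length_px * self.baseline_m / disparity

# Example: a furniture corner observed at x=420 px (left image) and x=396 px (right image).
rig = StereoRig(focal_length_px=600.0, baseline_m=0.06)
print(round(rig.distance_to_feature(420.0, 396.0), 2), "m")  # 1.5 m
```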
Further, the processing section 67 detects a mark included in the image data captured by the cameras 51, or in the image data output from the image data acquisition section 66, and analyzes the information represented by the detected mark. That is, the processing section 67 and the cameras 51 constitute a mark detection section 69 serving as a detection unit that detects a mark set in the travel target area.
Here, the mark detected by the mark detection section 69 indicates a no-travel position, i.e. a no-entry region, of the travel target area on the map data. As shown in Fig. 6A or Fig. 6B, the mark MA2 may be a setting device BS placed in the travel target area A, may be a marker MK such as an AR marker or a QR code provided on the housing of the setting device BS, on a wall, or the like, or may be a tape or the like with a predetermined pattern set in advance and affixed to the floor surface. That is, the mark detected by the mark detection section 69 is detected based on image data, for example based on feature points or tones in the image data. Specifically, the mark detection section 69 detects the shape of a mark based on the feature points in the image data. A feature point in image data is a point at which a specific position on an object can be determined from changes in the color information of each pixel. In addition, the mark detection section 69 can detect, based on the image data, the position of the detected mark, i.e. its relative direction and distance with respect to the self-position of the autonomous traveling body 11.
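As a hedged sketch of how the relative direction and distance of a detected marker MK might be estimated from a single image, the following uses a pinhole-camera approximation: the bearing comes from the marker's horizontal pixel offset from the image centre, and the range from its apparent width given a known physical marker width. The intrinsics and marker size below are illustrative assumptions, not values from the embodiment.

```python
# Hedged sketch: bearing and range of a marker from one camera image (pinhole model).
import math

def marker_bearing_and_range(center_x_px: float, width_px: float,
                             image_width_px: int = 640,
                             focal_length_px: float = 600.0,
                             marker_width_m: float = 0.08):
    """Return (bearing_rad, range_m) of a marker relative to the camera axis.

    bearing > 0 means the marker lies to the right of the traveling direction.
    All intrinsics and the physical marker width are placeholder assumptions.
    """
    offset_px = center_x_px - image_width_px / 2.0
    bearing = math.atan2(offset_px, focal_length_px)
    distance = focal_length_px * marker_width_m / width_px
    return bearing, distance

# Example: marker centred at x=480 px and 40 px wide in the image.
b, d = marker_bearing_and_range(480.0, 40.0)
print(f"bearing {math.degrees(b):.1f} deg, distance {d:.2f} m")  # ~14.9 deg, 1.20 m
```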
The setting device BS shown in Figs. 6A and 6B has a housing 71. The housing 71 of the present embodiment is formed in a rectangular parallelepiped shape. That is, the housing 71 has four side surface portions 72 to 75 standing from the respective sides of a quadrilateral lower surface placed on the floor. A marker MK may be provided on each of the side surface portions 72 to 75, or the marker MK may be provided on only one of the side surface portions 72 to 75 so that it can be distinguished from the other side surface portions. In addition, a plurality of setting devices BS may be arranged in the travel target area A to set a boundary BO between the travelable region A1 and the no-travel region A2, or a single setting device BS may be arranged in the travel target area A to set the position at which a surrounding boundary BO is to be placed, with the predetermined region enclosed by the boundary BO set as the no-travel region A2.
The marker MK is formed with an arbitrary shape, tone, or the like that can be identified as a two-dimensional barcode or by analysis of the image data captured by the cameras 51. The marker MK can be attached to and detached from the side surface portions 72 to 75 of the setting device BS, a wall, or the like.
In addition, the processing section 67 shown in Fig. 1 has a function as an information processing unit that associates, on the map data, the position of the mark detected by the detection sensor 42 or the mark detection section 69, i.e. the no-travel position that must not be traveled or entered according to the detection of the mark, with the estimated self-position. Here, when the no-travel position is a position detected by the detection sensor 42 from the infrared signal IR output from the entry prevention device VG or from the magnetic tape affixed to the floor, i.e. a position detected when it is within a predetermined distance of the self-position of the main body casing 20, the self-position of the main body casing 20 at the time the mark is detected can be directly associated with the no-travel position on the map data. On the other hand, when the mark is a marker or the like detected by the mark detection section 69, i.e. a mark detected while it is within the imaging range of the cameras 51, at a position a predetermined distance away from the self-position of the main body casing 20, the relative position of the mark with respect to the self-position of the main body casing 20 can be associated with the no-travel position. In other words, in this case the self-position of the main body casing 20 at the time the mark is detected is indirectly associated with the no-travel position on the map data. Further, the processing section 67 obtains, based on the position of the mark detected by the detection sensor 42 or the mark detection section 69, a boundary that divides the travel target area on the map data into a travelable region and a no-travel region. The processing section 67 also acquires the period during which a no-travel position continues to exist at the same position. The acquisition of the boundary and the like by the processing section 67 will be described later.
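The direct and indirect association described above could, for instance, be expressed as follows: for a proximity detection by the detection sensor 42, the robot's own pose is registered as the no-travel position, whereas for a camera-based detection the marker's relative bearing and distance are first transformed into map coordinates. This is only an assumed formulation of the bookkeeping, not the patented implementation.

```python
# Assumed sketch of registering a no-travel position on the map data.
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def register_no_travel_position(no_travel_positions: List[Point],
                                self_pose: Tuple[float, float, float],
                                relative: Optional[Tuple[float, float]] = None) -> None:
    """Append a no-travel position to the map data.

    self_pose -- (x, y, heading_rad) of the main body casing 20 (estimated self-position)
    relative  -- None for a proximity detection (direct association: the self-position
                 itself becomes the no-travel position); otherwise (bearing_rad, distance_m)
                 of a marker detected by the mark detection section (indirect association).
    """
    x, y, heading = self_pose
    if relative is None:
        no_travel_positions.append((x, y))
    else:
        bearing, dist = relative
        no_travel_positions.append((x + dist * math.cos(heading + bearing),
                                    y + dist * math.sin(heading + bearing)))
```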
For the memory 68, a nonvolatile memory such as a flash memory can be used. The memory 68 stores the map data of the travel target area created by the processing section 67. The map data can be transmitted via the communication unit 26 as data that can be displayed by the external device 17.
The battery supplies power to the cleaning unit 22, the sensor unit 23, the imaging unit 24, the communication unit 26, the control unit 28, and the like. As the battery, a rechargeable secondary battery is used in the present embodiment. Therefore, in the present embodiment, charging terminals 77 for charging the battery are exposed at the bottom of the main body casing 20.
The charging device serves as the base to which the autonomous traveling body 11 returns when traveling or cleaning is finished. The charging device may incorporate a charging circuit such as a constant current circuit. The charging device is provided with terminals for charging the battery. These charging terminals are electrically connected to the charging circuit, and are mechanically and electrically connected to the charging terminals 77 of the autonomous traveling body 11 that has returned to the charging device.
The home gateway 14, also called an access point or the like, is installed in the building and is connected to the network 15 by, for example, a wired connection.
The server 16 is a computer such as a cloud server connected to the network 15, and can store various kinds of data.
The external device 17 is a general-purpose device, such as a PC, a tablet terminal, a tablet PC, a smartphone, or a mobile phone, that can perform wired or wireless communication with the network 15 via the home gateway 14 inside the building, and can perform wired or wireless communication with the network 15 outside the building. The external device 17 has a display 79 such as a liquid crystal display, and has at least a display function of displaying images on the display 79. The display 79 may also be given a touch panel function; that is, the display 79 may function as an input unit. An application or program for displaying the map data transmitted from the communication unit 26 of the autonomous traveling body 11 may be installed on the external device 17. The external device 17 may also be provided in advance with a function of transmitting and receiving e-mail.
Next, the operation of the above embodiment will be described.
First, an outline of cleaning by the autonomous traveling body 11, from start to end, will be described with reference to the flowchart shown in Fig. 7. When the autonomous traveling body 11 leaves the charging device and starts cleaning as shown in step S1, it cleans while traveling along a predetermined travel route, for example by zigzag traveling, based on the map data stored in the memory 68, as shown in step S2. Next, in step S3, it is determined whether an object such as an obstacle or a mark has been detected. If it is determined in step S3 that an object has been detected, the map data is updated in step S4. If it is determined in step S3 that no object has been detected, or after the map data has been updated, it is determined in step S5 whether cleaning is finished, based on whether the entire travelable region has been traveled. If it is determined in step S5 that cleaning is not finished or that the entire travelable region has not yet been traveled, the process returns to step S2 and cleaning is continued. If it is determined in step S5 that cleaning is finished or that the entire travelable region has been traveled, the autonomous traveling body returns to the charging device in step S6 and cleaning ends. When cleaning ends, charging of the battery is started at a predetermined timing.
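The steps S1 to S6 of the flowchart could be paraphrased as the following control loop; the function names are placeholders for the operations described in the text and are not part of the disclosure.

```python
# Paraphrase of the Fig. 7 flowchart (steps S1-S6); all callables are hypothetical stand-ins.
def cleaning_run(robot):
    robot.leave_charging_device()                 # S1: detach from the charger and start cleaning
    while True:
        robot.travel_and_clean_along_route()      # S2: e.g. zigzag travel based on the map data
        if robot.detected_obstacle_or_mark():     # S3: object such as an obstacle or a mark?
            robot.update_map_data()               # S4: reflect the detection in the map data
        if robot.entire_travelable_region_done(): # S5: finished cleaning?
            break
    robot.return_to_charging_device()             # S6: return and end cleaning; charge later
```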
To describe the above control more specifically, when a preset cleaning start time arrives, or when the communication unit 26 receives a cleaning start instruction or the like transmitted by the external device 17 or the like, the control unit 28 switches to the travel mode and starts cleaning. At this time, if map data representing the travelable region is not stored in the memory 68, the obstacles and the like around the main body casing 20 are detected through a predetermined operation such as zigzag traveling by the detection sensor 42 of the sensor unit 23, the obstacle detection unit, the cameras 51, the processing section 67, and the like, and map data is created by the processing section 67.
That is, the image data acquisition section 66 acquires image data from the cameras 51 and performs image processing such as correction of lens distortion. When images are captured with the cameras 51, illumination by the lamp 53 may also be used as necessary.
Then, in the processing section 67, SLAM processing is performed based on the detection results of the detection sensor 42 and the obstacle detection unit of the sensor unit 23 acquired via the sensor connection section 63 and on the image data acquired via the image data acquisition section 66, whereby self-position acquisition and map creation are carried out.
During map creation, when a mark is detected by the detection sensor 42 or the mark detection section 69, the processing section 67 obtains, based on the position of the mark, a boundary that divides the travel target area on the map data into a travelable region and a no-travel region, and further obtains the travelable region and the no-travel region based on the boundary.
The following two methods are conceivable as methods by which the processing section 67 obtains the boundary, the travelable region, and the no-travel region.
The first method is that, when the positions of a plurality of marks are detected by the detection sensor 42, the processing section 67 obtains, based on these positions, a boundary that divides the travel target area into a travelable region and a no-travel region, sets the side of the boundary on which the self-position of the main body casing 20 lies as the travelable region, and sets the side of the boundary opposite to the self-position of the main body casing 20 as the no-travel region.
Specifically, each time the processing section 67 detects a nearby mark using the detection sensor 42, it plots the mark on the map data in association with the estimated self-position, and obtains the line connecting the plotted positions as the boundary. When the autonomous traveling body 11 detects a mark using the detection sensor 42, its position is plotted and the travel control section 61 controls the driving of the motors 33 so that the main body casing 20 temporarily moves away from the mark. Next, the travel control section 61 controls the driving of the driving wheels 21 so that the main body casing 20 travels toward the mark again, and when the mark is detected by the detection sensor 42, its position is plotted and the travel control section 61 again controls the driving of the motors 33 so that the main body casing 20 temporarily moves away from the mark. By repeating this operation, a plurality of positions P of the mark are detected, and the processing section 67 obtains the boundary by connecting these positions P as shown in Fig. 8; further, the side of the boundary on which the self-position of the main body casing 20 lies can be set as the travelable region A1, and the side of the boundary opposite to the self-position of the main body casing 20 can be set as the no-travel region and registered in the map data. In other words, the processing section 67 can show, in the map data, the direction that the main body casing 20 does not enter because the mark is set, as the no-travel region.
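A minimal sketch of this first method might connect the detected positions P into a polyline boundary and classify which side of that boundary a given point lies on relative to the self-position; everything below (the ordering of points, the signed-area test) is an assumed simplification for a roughly straight boundary, not the patent's algorithm.

```python
# Assumed sketch of the first method: build a boundary from detected positions P and
# treat the far side as the no-travel region.
from typing import List, Tuple

Point = Tuple[float, float]

def boundary_from_positions(positions: List[Point]) -> List[Point]:
    """Connect the detected mark positions P into a polyline boundary BO
    (here simply ordered along x; a real implementation would order along the wall)."""
    return sorted(positions)

def is_travelable_side(boundary: List[Point], self_position: Point, probe: Point) -> bool:
    """True if `probe` lies on the same side of the boundary segment as the
    self-position of the main body casing, i.e. inside the travelable region A1."""
    (x1, y1), (x2, y2) = boundary[0], boundary[-1]

    def side(p: Point) -> float:
        return (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1)

    return side(probe) * side(self_position) > 0

# Example: three detections of the virtual wall, robot at (1.0, -1.0).
bo = boundary_from_positions([(0.0, 0.0), (2.0, 0.1), (1.0, 0.05)])
print(is_travelable_side(bo, (1.0, -1.0), (1.0, -0.5)))  # True  (travelable region A1)
print(is_travelable_side(bo, (1.0, -1.0), (1.0, 0.5)))   # False (no-travel region A2)
```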
Fig. 9 B indicates the example by this method to the map datum MP of actual traveling subject area A creation shown in Fig. 9 A Son.In map datum MP, can running region A1 impaled by solid line, boundary B O is represented by dashed line, also, is unable to running region A2 is only indicated by defined label M.Map datum MP is sent from communication unit 26 to external equipment 17, can be shown in external equipment 17 display 79.
In addition, the 2nd method is that basis is detected by label detection portion 69 based on the image data taken by camera 51 Mark out, processing unit 67, which is obtained, running region and can be unable to running region on map datum to traveling subject area The boundary divided, and relative to boundary by the self-position side of main body casing 20 be set as can running region, relative to side The side opposite with the self-position of main body casing 20 is set as being unable to the method for running region by boundary.
Specifically, the processing section 67 identifies, based on the image data captured by the cameras 51, 51, the positions of the marks detected by the mark detection section 69 and the type and direction of the boundary represented by the marks. For example, as shown in Fig. 6A, in a case where a no-travel region A2 is set with two setting devices BS as the boundary, the autonomous traveling body 11 identifies, by means of the processing section 67, the marks MA2 provided on the side surface portions 72 to 75 of the setting devices BS, obtains the boundary BO set between the setting devices BS, BS based on the marks MA2, and records in the map data, for example, that the boundary BO is set along the positions of the side surface portions of the setting devices BS, BS that do not face each other. For example, when the detected mark MA2 is an upward arrow or a lateral arrow, this can be interpreted as meaning that entry beyond the direction of the arrow, as seen from the main body casing 20, is prohibited. That is, the processing section 67 can show, in the map data, the direction that the main body casing 20 does not enter because the mark MA2 is set, as the no-travel region. In this case, for example, the line or surface connecting the side surface portions of a plurality of setting devices BS bearing upward arrows as the marks MA2, or the extension line or extension surface, in the direction of the arrow, of the side surface portion of a setting device BS bearing a lateral arrow as the mark MA2, is obtained as the boundary BO. In addition, when the detected mark MA2 is a downward arrow, this can be interpreted as meaning that the self-position of the main body casing 20 is already within the no-travel region A2. In this way, even when the main body casing 20 has entered the no-travel region A2, it can be determined from the type of the mark MA2 that it has entered the no-travel region A2, so that the travel control section 61 can control the driving of the motors 33 so that the autonomous traveling body 11, i.e. the main body casing 20, stops traveling or exits the no-travel region A2.
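The interpretation of the marker types in the second method might, purely as an assumed illustration, be encoded as a small dispatch: an upward or lateral arrow contributes a boundary on the far side of the marker, a downward arrow signals that the robot is already inside the no-travel region, and a circle mark (described next) requests an enclosing no-travel region around the setting device. The method names on the map and travel objects are placeholders.

```python
# Assumed dispatch for the marker types described for Figs. 6A/6B; not the patented logic.
from enum import Enum, auto

class MarkType(Enum):
    ARROW_UP = auto()       # boundary along the line connecting the marked side surfaces
    ARROW_LATERAL = auto()  # boundary along the extension of the marked side surface
    ARROW_DOWN = auto()     # the self-position is already inside the no-travel region A2
    CIRCLE = auto()         # enclose the setting device BS itself as the no-travel region

def handle_mark(mark_type: MarkType, map_data, travel_control):
    if mark_type in (MarkType.ARROW_UP, MarkType.ARROW_LATERAL):
        map_data.add_boundary_beyond_marker()             # far side becomes no-travel region
    elif mark_type is MarkType.ARROW_DOWN:
        travel_control.stop_or_leave_no_travel_region()   # stop, or exit the region A2
    elif mark_type is MarkType.CIRCLE:
        map_data.add_enclosing_no_travel_region()         # region of predetermined shape around BS
```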
Further, with the mark MA2, the no-travel region A2 may also be set so as to surround the setting device BS on which the mark MA2 is provided. For example, as shown in Fig. 6B, when the mark MA2 is a circle or the like, i.e. a shape, color, or pattern different from the mark MA2 shown in Fig. 6A, and the mark MA2 is detected by the mark detection section 69, the processing section 67 can obtain a boundary BO of a predetermined shape surrounding the mark MA2 or the setting device BS, obtain the inside of the boundary BO as the no-travel region A2, and register each of them in the map data.
As described above, in either mode in which the user sets the no-travel region by a visually based display, i.e. the mark MA2, placed in the travel target area of the autonomous traveling body 11, the no-travel region can be obtained by the processing section 67 performing image recognition on the mark MA2. That is, by performing image recognition on the mark MA2 and analyzing and identifying the content represented by the mark MA2, the processing section 67 can obtain the no-travel region in the travel target area. Obtaining the no-travel region means obtaining the boundary that divides the travelable region from the no-travel region. When the boundary has been obtained and the main body casing 20 travels up to the boundary, the self-position at that time is regarded as a no-travel position and the traveling direction is changed.
In order to indicate the travel target area or a no-travel position to the autonomous traveling body 11 by the visual display of the mark MA2, it suffices to make a display in a predetermined manner, in terms of shape, tone, and the like, that can be identified by the autonomous traveling body 11.
In addition, since the cameras 51 can, according to their angle of view, capture positions at a predetermined distance in the traveling direction of the autonomous traveling body 11, the processing section 67 can obtain, based on feature points and the like, the shape of the no-travel region located on the side of the boundary BO opposite to the self-position, and can show it in the map data.
The map data MP in which the boundary BO, the travelable region A1, and the no-travel region A2 are registered can be transmitted from the communication unit 26 to the external device 17 via the home gateway 14. That is, the autonomous traveling body 11 can output, from the communication unit 26, information indicating that a mark defining a no-travel region has been set in the travel target area, and information indicating the boundary, the travelable region, the no-travel region, and the like. In the present embodiment, the information output from the communication unit 26 is map data or the like that can be read by the external device 17, but it may also be text information or the like to be displayed on the external device 17.
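The information output from the communication unit 26 is described only as map data or text readable by the external device 17; as one assumed serialization, a payload could carry the boundary, the two regions, and a flag that a mark has been set, for example:

```python
# One possible (assumed) JSON payload for the data sent from the communication unit 26
# to the external device 17; the field names are illustrative, not from the disclosure.
import json

payload = {
    "mark_set": True,                                   # "a mark has been set" notification
    "boundary_BO": [[0.0, 0.0], [2.0, 0.1]],            # polyline on the map data MP
    "travelable_region_A1": [[0, -3], [4, -3], [4, 0], [0, 0]],
    "no_travel_region_A2": [[0, 0], [4, 0], [4, 1], [0, 1]],
    "marking_M": {"type": "virtual_wall", "position": [2.0, 0.5]},
}
print(json.dumps(payload))
```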
Figure 10 B is shown through this method to the map datum MP of actual traveling subject area A creation shown in Figure 10 A Example.In map datum MP, running region A1 and running region A2 can be unable to impaled respectively by solid line, and boundary B O by Dotted line indicates.In addition, marking M as defined in showing in being unable to running region A2.Map datum MP is sent to from communication unit 26 External equipment 17 can be shown in the display 79 of external equipment 17.
In addition, the processing section 67 can acquire the period during which a no-travel position or a mark continues to exist at the same position.
Next, the travel control section 61 creates a travel route based on the map data. When map data is stored in the memory 68 in advance, the creation of map data is omitted, and the travel control section 61 creates a travel route based on the map data stored in the memory 68. With regard to the boundary shown in the map data, the travel route is set so as not to cross the boundary into the no-travel region.
Then, the travel control section 61 causes the main body casing 20 to travel autonomously along the set travel route by controlling the motors 33, while the cleaning control section 62 operates the cleaning unit 22 to clean the floor of the travelable region. In the cleaning unit 22, the electric blower 35, the brush motor 37, or the side brush motors 39 are driven by the cleaning control section 62, for example, and dust on the floor is captured into the dust collecting unit 40 through the suction port 31. In addition, when the autonomous traveling body 11, while traveling autonomously, newly detects a mark defining a no-travel region that is not recorded in the map data, the processing section 67 obtains the boundary, the no-travel region, and the travelable region as described above, reflects them in the map data, and stores them in the memory 68. Similarly, when the autonomous traveling body 11 detects, by means of the sensor unit 23, the cameras 51, and the processing section 67, the three-dimensional coordinates of an object such as an obstacle in the travelable region that is not recorded in the map data, the processing section 67 reflects them in the map data and stores them in the memory 68.
Then, the map data is transmitted from the communication control section 64 and the communication unit 26 to the external device 17 via the network 15, or transmitted directly to the external device 17, and is displayed on the display 79 so that the user can read it on the external device 17.
When cleaning ends, the autonomous traveling body 11 returns to the charging device and connects to the charging device.
According to the embodiment described above, the autonomous traveling body 11 outputs, by means of the communication unit 26, information indicating that a mark indicating a no-travel position has been set in the travel target area, so that the user can easily know that such a mark has been set. As a result, even if the user has forgotten that a mark has been set, it is possible to avoid creating a region that the autonomous traveling body 11 does not travel for a long period of time. In particular, when the autonomous traveling body 11 has a cleaning function, it is possible to avoid creating a region that is not cleaned for a long period of time.
By having the communication unit 26 output, to the external device 17, information indicating the no-travel position on the map data, the user can know the no-travel position by means of the external device 17; here, the communication unit 26 has a communication function of communicating with the external device 17, and the self-position of the main body casing 20 on the map data representing the travel target area and the position of the mark detected by the detection sensor 42 or the mark detection section 69 are associated on the map data. Therefore, even if the user has forgotten the position where the mark was set, the mark can easily be found, without searching, by referring to the no-travel position.
In addition, when a mark that is invisible to the user, such as the infrared signal IR output from the entry prevention device VG, is used, it is difficult to determine whether the no-travel region has been set as intended. In view of this, when the positions of a plurality of marks are detected by the detection sensor 42, a boundary that divides the travel target area on the map data into a travelable region and a no-travel region is obtained based on these positions, and information indicating the boundary is output from the communication unit 26 to the external device 17, so that the actual effect of the no-travel region setting brought about by the user's placement of the mark can easily be grasped via the external device 17.
In addition, since the mark detection section 69 detects the mark based on the image data captured by the cameras 51, 51 that image the surroundings of the main body casing 20, the autonomous traveling body 11 can easily detect the boundary based on the type, shape, tone, and the like of the mark with only simple travel control.
Furthermore, by obtaining, on the map data of the travel target area, the region on the side of the boundary opposite to the self-position as the no-travel region, and outputting information indicating the no-travel region in the map data from the communication unit 26 to the external device 17, the user can easily grasp the region that the autonomous traveling body 11 does not travel or does not clean because of the setting of the mark.
In addition, the autonomous traveling body 11 may acquire, in the processing section 67, the period during which a boundary including a no-travel position continues to exist at the same position, and output information related to that period from the communication unit 26. In this case, the user can easily grasp the period for which the mark has been set.
Specifically, when a no-travel position continues to exist at the same position for a specified period or longer, notifying the user of this fact enables the user to know that there is a region that has not been traveled or cleaned for a long time because of the setting of the mark. As a result, the user can be prompted to temporarily remove the mark and to clean the region that has long gone uncleaned with the cleaning unit 22 of the cleaner or the like. As for the notification, for example, an arbitrary message MSG may be displayed together with the map data MP as shown in Fig. 11, or the notification may be made by e-mail or the like. In addition, since the user may in some cases wish to keep the mark set for a long period of time, whether or not the notification is made may be settable by the user. Furthermore, when a plurality of boundaries exist in the travel target area, the user may be able to set, for each boundary, whether or not the notification is made.
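The persistence check described here could be sketched as follows: each boundary keeps the time at which its no-travel position was first registered, and a notification is emitted once the elapsed period exceeds a user-settable threshold, per boundary and only if notification is enabled for that boundary. The threshold and data layout are assumptions for illustration, not part of the disclosure.

```python
# Assumed sketch of per-boundary persistence tracking and notification gating.
import time

class BoundaryRecord:
    def __init__(self, first_seen: float, notify_enabled: bool = True):
        self.first_seen = first_seen          # when the no-travel position first appeared
        self.notify_enabled = notify_enabled  # user setting: notify for this boundary or not

def boundaries_to_notify(records: dict, threshold_s: float, now: float = None) -> list:
    """Return the ids of boundaries whose no-travel position has persisted at the same
    position for at least `threshold_s` seconds and for which notification is enabled."""
    now = time.time() if now is None else now
    return [bid for bid, rec in records.items()
            if rec.notify_enabled and (now - rec.first_seen) >= threshold_s]

# Example: two boundaries, notify after 14 days (an assumed threshold).
records = {"BO-1": BoundaryRecord(first_seen=0.0),
           "BO-2": BoundaryRecord(first_seen=0.0, notify_enabled=False)}
print(boundaries_to_notify(records, threshold_s=14 * 24 * 3600, now=15 * 24 * 3600))  # ['BO-1']
```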
In addition, when the travel route over which the autonomous traveling body 11 has traveled is displayed on the external device 17 as map data, map data different from the actual travel target area may be displayed because of the setting of marks and the presence of obstacles. In that case, it is conceivable that the user has difficulty distinguishing a no-travel region set by the placement of a mark from a region that cannot be traveled because of the presence of an obstacle. Therefore, by having the communication unit 26 output information, such as the marking M shown in Fig. 9B, indicating in the map data the direction that is not entered because a mark is set, the no-travel region set by the placement of a mark and the region that cannot be traveled because of an obstacle can easily be distinguished.
In the above embodiment, it is sufficient for the autonomous traveling body to include at least one of the detection sensor 42 and the mark detection section 69.
While several embodiments of the invention have been described, these embodiments are presented by way of example only and are not intended to limit the scope of the invention. These novel embodiments can be carried out in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are also included in the invention described in the claims and its equivalents.

Claims (7)

1. An autonomous traveling body, characterized by comprising:
a main body capable of traveling autonomously;
a travel control unit that controls the travel of the main body;
a detection unit that detects a mark set in a travel target area and indicating a no-travel position; and
an output unit that outputs information indicating that the mark has been set.
2. The autonomous traveling body according to claim 1, characterized by further comprising:
a storage unit that stores map data representing the travel target area;
a self-position acquisition unit that acquires a self-position of the main body in the map data; and
an information processing unit that associates, on the map data, the position of the mark detected by the detection unit with the self-position acquired by the self-position acquisition unit,
wherein the output unit has a communication function of communicating with an external device, and outputs information indicating the no-travel position on the map data to the external device.
3. The autonomous traveling body according to claim 2, characterized in that
when positions of a plurality of the marks are detected by the detection unit, the information processing unit obtains, based on these positions, a boundary that divides the travel target area on the map data into a travelable region and a no-travel region, and
the output unit outputs information indicating the boundary to the external device.
4. The autonomous traveling body according to claim 3, characterized in that
the information processing unit obtains, as the no-travel region, a region on the side of the boundary opposite to the self-position in the map data of the travel target area, and
the output unit outputs information indicating the no-travel region in the map data to the external device.
5. The autonomous traveling body according to any one of claims 2 to 4, characterized in that
the information processing unit acquires a period during which the no-travel position continues to exist at the same position, and
the output unit outputs information related to the period.
6. The autonomous traveling body according to any one of claims 2 to 5, characterized in that
the output unit outputs information indicating, in the map data, a direction that is not entered because the mark is set.
7. The autonomous traveling body according to any one of claims 1 to 6, characterized by further comprising
a camera that images surroundings of the main body,
wherein the detection unit detects the mark based on image data captured by the camera.
CN201811522888.XA 2017-12-13 2018-12-13 Autonomous traveling body Pending CN109953695A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-238818 2017-12-13
JP2017238818A JP7014586B2 (en) 2017-12-13 2017-12-13 Autonomous vehicle

Publications (1)

Publication Number Publication Date
CN109953695A true CN109953695A (en) 2019-07-02

Family

ID=67023299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811522888.XA Pending CN109953695A (en) 2017-12-13 2018-12-13 Autonomous driving body

Country Status (2)

Country Link
JP (1) JP7014586B2 (en)
CN (1) CN109953695A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111486849B (en) * 2020-05-29 2021-08-27 北京大学 Mobile visual navigation method and system based on two-dimensional code road sign

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008028931A1 (en) * 2008-06-18 2009-12-24 BSH Bosch und Siemens Hausgeräte GmbH Robot i.e. dust collecting robot, drive movement controlling method, involves stopping drive movement of robot during determination of coordinate values based on comparison of coordinate values of virtual partial region and/or virtual wall
CN102262407A (en) * 2010-05-31 2011-11-30 恩斯迈电子(深圳)有限公司 Guide device and operating system
US20140107838A1 (en) * 2012-05-07 2014-04-17 Joseph Y. Ko Movement operation system for autonomous moving cleaning apparatus
CN204091896U (en) * 2014-09-15 2015-01-14 湖南格兰博智能科技有限责任公司 A kind of virtual wall device
CN205485623U (en) * 2016-03-15 2016-08-17 群耀光电科技(苏州)有限公司 From virtual wall of combined type of walking device and beacon system
CN106037591A (en) * 2015-04-09 2016-10-26 美国iRobot公司 Restricting movement of a mobile robot
CN205903230U (en) * 2016-06-13 2017-01-25 厦门安泰迪智能家居有限公司 Sweep floor virtual wall device of three -dimensional of robot of intelligence
CN106997202A (en) * 2017-03-24 2017-08-01 上海思岚科技有限公司 The implementation method of virtual wall is set by mobile applications

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106537186B (en) * 2014-11-26 2021-10-08 艾罗伯特公司 System and method for performing simultaneous localization and mapping using a machine vision system

Also Published As

Publication number Publication date
JP2019106060A (en) 2019-06-27
JP7014586B2 (en) 2022-02-01

Similar Documents

Publication Publication Date Title
TWI653022B (en) Autonomous mobile body
JP6632232B2 (en) Method for cleaning or treating a room with a free-standing mobile device and a free-standing mobile device
CN110091326B (en) Mobile robot and control method for mobile robot
TWI757570B (en) Robot cleaners and controlling method thereof
US11151864B2 (en) System and method for monitoring a property using drone beacons
CN109431381A (en) Localization method and device, electronic equipment, the storage medium of robot
WO2017219529A1 (en) Target tracking method, device, and system, remote monitoring system, and electronic apparatus
JP5898022B2 (en) Self-propelled equipment
JP7007078B2 (en) Vacuum cleaner
CN110636789B (en) Electric vacuum cleaner
CN110063687A (en) Self-propelled electric dust collector
CN108459597B (en) Mobile electronic device and method for processing tasks in task area
CN106659343A (en) Electric cleaner
CN207488823U (en) A kind of mobile electronic device
JP6609463B2 (en) Self-propelled electronic device
CN110325938A (en) Electric dust collector
CN106054872B (en) Mobile robot and its location recognition method
GB2569926A (en) Electric vacuum cleaner
JP2013235351A (en) Self-propelled electronic equipment
CN107229274B (en) Position indication method, terminal device, self-propelled device, and program
KR102423573B1 (en) A robot cleaner using artificial intelligence and control method thereof
CN109938642A (en) Electric dust collector
CN108780319A (en) Oftware updating method, system, mobile robot and server
CN109254580A (en) The operation method of service equipment for self-traveling
JP2017027417A (en) Image processing device and vacuum cleaner

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20190702