WO2018117616A1 - Mobile robot - Google Patents

Mobile robot

Info

Publication number
WO2018117616A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
mobile robot
information
head
cleaning
Prior art date
Application number
PCT/KR2017/015069
Other languages
English (en)
Korean (ko)
Inventor
박신영
김규희
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Publication of WO2018117616A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 Manipulators mounted on wheels or on carriages
    • B25J5/007 Manipulators mounted on wheels or on carriages mounted on wheels
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0003 Home robots, i.e. small robots for domestic use
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by motion, path, trajectory planning
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/008 Manipulators for service tasks
    • B25J11/0085 Cleaning
    • B25J13/00 Controls for manipulators
    • B25J13/06 Control stands, e.g. consoles, switchboards
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081 Touching devices, e.g. pressure-sensitive
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2836 Installation of the electric equipment characterised by the parts which are controlled
    • A47L9/2852 Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Definitions

  • The present invention relates to a mobile robot, and more particularly, to a mobile robot whose information a user can conveniently check and which the user can conveniently control.
  • Robots have been developed for industrial use and have formed part of factory automation. Recently, the fields of application of robots have further expanded: medical robots, aerospace robots, and the like have been developed, and home robots that can be used in ordinary homes have also been made. Among these robots, a robot capable of traveling by its own power is called a mobile robot.
  • A representative example of a mobile robot is a robot cleaner, a device that cleans an area by suctioning dust and foreign matter while traveling around the area by itself.
  • A robot cleaner disclosed in the prior art includes a main body having a traveling unit, a recognition unit provided on the main body to recognize information about the area to be cleaned, a storage unit that stores information about the cleaning area, a display unit provided on the main body to display cleaning-related information, and a control unit that recognizes and compares pre-stored information about the cleaning area with information about the current cleaning area and controls the current cleaning progress to be displayed on the display unit.
  • However, because the display unit included in such a robot cleaner is exposed on the outside, there is a risk of damage.
  • An object of the present invention is to provide a mobile robot and a control method thereof in which a user can conveniently check and control information.
  • An object of the present invention is to provide a mobile robot and a control method thereof that can improve the user's ease of use by providing a variety of user interface screens.
  • An object of the present invention is to provide a mobile robot and a control method thereof that can reduce the risk of accidents between humans and mobile robots by providing a user interface screen in a specific situation.
  • An object of the present invention is to provide a mobile robot and a control method thereof that can be conveniently controlled by a separate device such as a portable terminal.
  • In order to achieve the above objects, a mobile robot according to an aspect of the present invention includes a main body including an openable head, a traveling part for moving the main body, a display housed inside the main body, and a control unit that controls a user interface screen to be displayed on the display when the head is opened, so that a user can easily check a variety of information and control the mobile robot conveniently and safely.
  • FIG. 1 is a view showing a front portion of a mobile robot according to an embodiment of the present invention.
  • FIG. 2 is a view showing a rear portion of the mobile robot according to an embodiment of the present invention.
  • FIGS. 3 to 5 are views illustrating a state in which the head of a mobile robot according to an embodiment of the present invention is opened.
  • FIG. 6 is a block diagram showing a control relationship between the major components of a mobile robot according to an embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a control method of a mobile robot and a portable terminal according to an embodiment of the present invention.
  • FIGS. 9 to 17 are views referred to in describing a method for controlling a mobile robot according to an embodiment of the present invention.
  • The suffixes "module" and "unit" for components used in the following description are given merely for ease of preparing this specification and do not by themselves have special meanings or roles. Therefore, "module" and "unit" may be used interchangeably.
  • the mobile robot 100 refers to a robot that can move itself by using a wheel or the like, and may be a home helper robot or a robot cleaner.
  • a robot cleaner having a cleaning function among mobile robots will be described with reference to the drawings, but the present invention is not limited thereto.
  • FIG. 1 is a view showing a front portion of a mobile robot according to an embodiment of the present invention
  • Figure 2 is a view showing a rear portion of the mobile robot according to an embodiment of the present invention.
  • FIGS. 3 to 5 are views illustrating a state in which the head of a mobile robot according to an embodiment of the present invention is opened.
  • FIG. 6 is a block diagram showing a control relationship between the major components of a mobile robot according to an embodiment of the present invention.
  • the mobile robot 100 includes a main body 101 and a traveling unit 160 for moving the main body 101.
  • a part facing the ceiling in the driving zone is defined as the upper surface portion
  • a part facing the bottom in the driving zone is defined as the bottom surface portion
  • the main body 101 is disposed between the upper surface portion and the bottom portion.
  • the front part is defined as a part facing the running direction among the parts forming the circumference
  • the part facing the opposite direction to the front part is defined as a rear part.
  • the mobile robot 100 includes a driving unit 160 for moving the main body 101.
  • the driving unit 160 includes at least one driving wheel 136 for moving the main body 101.
  • the driving unit 160 includes a driving motor (not shown) connected to the driving wheel 136 to rotate the driving wheel.
  • The driving wheels 136 may be provided on the left and right sides of the main body 101, hereinafter referred to as the left wheel and the right wheel, respectively.
  • the left wheel and the right wheel may be driven by one driving motor, but a left wheel driving motor for driving the left wheel and a right wheel driving motor for driving the right wheel may be provided as necessary.
  • the driving direction of the main body 101 can be switched to the left or the right by varying the rotational speeds of the left and right wheels.
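  • As an illustration of this differential steering, the following sketch (not part of the patent disclosure; the wheel dimensions and function names are assumed) converts a desired forward speed and turn rate of the main body into left-wheel and right-wheel speeds. Driving the two wheels at different speeds turns the robot toward the slower wheel.

```python
# Differential-drive kinematics sketch: body motion (v, w) -> wheel speeds.
# WHEEL_RADIUS and WHEEL_BASE are assumed example values.

WHEEL_RADIUS = 0.035  # wheel radius in meters (assumption)
WHEEL_BASE = 0.23     # spacing between left and right wheels in meters (assumption)

def wheel_speeds(v: float, w: float) -> tuple[float, float]:
    """Return (left, right) wheel angular speeds in rad/s for a desired
    forward speed v (m/s) and turn rate w (rad/s) of the main body."""
    v_left = v - w * WHEEL_BASE / 2.0   # linear speed of the left wheel
    v_right = v + w * WHEEL_BASE / 2.0  # linear speed of the right wheel
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

print(wheel_speeds(0.3, 0.0))  # equal speeds: straight ahead
print(wheel_speeds(0.2, 0.8))  # right wheel faster: curve to the left
```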
  • An inlet (not shown) through which air is sucked in may be formed in the bottom of the main body 101, and a suction device (not shown) that provides suction power so that air can be sucked in through the inlet may be provided in the main body 101.
  • a dust container 160 for collecting dust sucked with air through the suction port may be provided.
  • the dust container 160 may be provided with a dust container cover 161 to discard the dust therein.
  • The main body 101 may include a body part 102 forming a space in which various components constituting the mobile robot 100 are accommodated, and a head part 110 arranged on the upper side of the body part 102 so as to be openable and closable.
  • The head part 110 may include an openable head 111 and a fastening part 112 to which the head 111 is coupled so as to be openable and closable.
  • a switch or a sensor for detecting whether the head 111 is opened or closed may be disposed at the head 111 and / or the fastening part 112.
  • the user can open and close the head 111 to insert and remove the dust container 160 inside the main body 101.
  • A roll-shaped main brush (not shown) having bristles exposed through the inlet, and an auxiliary brush 135 located at the front of the bottom of the main body 101 and having bristles consisting of a plurality of radially extending wings, may be provided.
  • The rotation of these brushes 135 separates dust from the floor in the travel zone, and the dust separated from the floor is sucked in through the inlet and collected in the dust container 160.
  • the mobile robot 100 may include a display 182 accommodated in the main body 101.
  • The mobile robot 100 may include a power supply unit (not shown) having a rechargeable battery (not shown) to supply power into the mobile robot 100.
  • The power supply unit supplies driving power and operating power to the respective components of the mobile robot 100, and when the remaining power is insufficient, the battery may be charged by receiving a charging current from a charging stand (not shown).
  • the mobile robot 100 may further include a battery detector (not shown) that detects a charging state of a battery (not shown) and transmits a detection result to the controller 140.
  • the battery is connected to the battery sensing unit so that the battery remaining amount and the charging state are transmitted to the controller 140.
  • the battery remaining amount may be displayed on the display 182 of the output unit 180.
  • the battery (not shown) supplies not only a driving motor but also power necessary for the overall operation of the mobile robot 100.
  • The body part 102 may include an openable cover 103 for battery checking and/or replacement. The user can open the cover 103 to check the battery condition or replace the battery.
  • The mobile robot 100 may perform driving to return to a charging stand (not shown) for charging, and during this return driving, it may detect the position of the charging stand by itself.
  • the charging station may include a signal transmitter (not shown) for transmitting a predetermined return signal.
  • the return signal may be an ultrasonic signal or an infrared signal, but is not limited thereto.
  • the mobile robot 100 may include a signal detector (not shown) that receives a return signal.
  • the charging station may transmit an infrared signal through the signal transmitter, and the signal detector may include an infrared sensor that detects the infrared signal.
  • the mobile robot 100 docks with the charging station by moving to the position of the charging station according to the infrared signal transmitted from the charging station. By this docking, charging is performed between the charging terminal (not shown) of the mobile robot 100 and the charging terminal (not shown) of the charging stand.
  • the image acquisition unit 120 captures the surroundings of the main body 101, the driving zone, the external environment, and the like, and may include a camera module.
  • the camera module may include a digital camera.
  • The digital camera may include at least one optical lens, an image sensor (e.g., a CMOS image sensor) including a plurality of photodiodes (e.g., pixels) on which an image is formed by the light passing through the optical lens, and a digital signal processor (DSP) that forms an image based on the signals output from the photodiodes.
  • The digital signal processor can generate not only still images but also moving images composed of frames of still images.
  • Such cameras may be installed at several locations for photographing efficiency.
  • The image photographed by the camera may be used to recognize the kind of material, such as dust, hair, or flooring, present in the corresponding space, to check whether cleaning has been performed, or to confirm the cleaning time.
  • The image acquisition unit 120 may include an upper camera 120a provided to obtain an image of the ceiling in the driving zone, a front camera 120b provided to obtain an image of the front of the main body 101, and a depth camera 120c.
  • However, the number, arrangement, type, and shooting range of the cameras included in the image acquisition unit 120 are not necessarily limited thereto.
  • The front camera 120b may capture an image of an obstacle or a cleaning area existing ahead in the moving direction of the mobile robot 100, as well as an image for user recognition.
  • the image acquisition unit 120 may acquire an image by photographing the periphery of the main body 101, and the obtained image may be stored in the storage unit 150.
  • the mobile robot 100 may include a sensor unit 170 including sensors for sensing various data related to the operation and state of the mobile robot.
  • the head 111 may include a sensor (not shown) for detecting whether the head 111 is open or closed.
  • the sensor for detecting whether the head 111 is opened or closed may use a variety of known sensors.
  • The sensor unit 170 may include an obstacle detecting sensor 131 for detecting an obstacle ahead.
  • the sensor unit 170 may further include a cliff detection sensor (not shown) for detecting the presence of a cliff on the floor in the driving zone, and a lower camera sensor (not shown) for acquiring an image of the floor.
  • the obstacle detecting sensor 131 may include a plurality of sensors installed at regular intervals on the outer circumferential surface of the mobile robot 100.
  • the sensor unit 170 may include a first sensor and a second sensor disposed on the front surface of the main body 101 to be spaced apart from left and right.
  • the obstacle detecting sensor 131 may include an infrared sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, a position sensitive device (PSD) sensor, and the like.
  • the position and type of the sensor included in the obstacle detecting sensor 131 may vary according to the type of the mobile robot, and the obstacle detecting sensor 131 may include more various sensors.
  • The sensor unit 170 may include light detection and ranging (LiDAR) sensors 132a and 132b.
  • Using laser light, the LiDARs 132a and 132b can detect objects such as obstacles based on the time of flight (TOF) between a transmitted signal and a received signal, or based on the phase difference between the transmitted and received signals.
  • the lidars 132a and 132b may be provided in plurality.
  • the lidars 132a and 132b detect a first lidar 132a for detecting an object located in front of the mobile robot 100 and an object located behind the mobile robot 100. It may include a second lidar 132b.
  • The LiDARs 132a and 132b can detect the distance to an object, the speed of the object relative to the robot, and the position of the object.
  • the lidars 132a and 132b may be provided as part of the configuration of the obstacle detecting sensor 131.
  • The LiDARs 132a and 132b may also serve as sensors for creating a map.
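  • As a minimal sketch of the time-of-flight principle mentioned above (illustrative only; the patent does not specify an implementation), the distance to an object is half the round-trip travel time of the laser pulse multiplied by the speed of light:

```python
# Time-of-flight ranging sketch: distance = c * round_trip / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(t_transmit_s: float, t_receive_s: float) -> float:
    """Distance in meters from transmit/receive timestamps in seconds."""
    round_trip = t_receive_s - t_transmit_s
    return SPEED_OF_LIGHT * round_trip / 2.0

print(tof_distance(0.0, 66.7e-9))  # a ~66.7 ns round trip is roughly 10 m
```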
  • the map generation module 143 may generate a map of the driving zone.
  • The map generation module 143 may generate a map by processing images acquired through the image acquisition unit 120, and may also create the map, supplementally or independently, based on the sensing data of the LiDARs 132a and 132b.
  • The obstacle detecting sensor 131 detects an object, in particular an obstacle, in the driving (moving) direction of the mobile robot and transmits obstacle information to the controller 140. That is, the obstacle detecting sensor 131 may detect protrusions, household fixtures, furniture, walls, wall edges, and the like existing on the moving path of the mobile robot 100 or in front of or beside it, and transmit the information to the control unit.
  • the controller 140 may detect the position of the obstacle based on at least two signals received through the ultrasonic sensor, and control the movement of the mobile robot 100 according to the detected position of the obstacle.
  • the obstacle detecting sensor 131 provided on the outer surface of the main body 101 may include a transmitter and a receiver.
  • the ultrasonic sensor may be provided such that at least one transmitter and at least two receivers are staggered from each other.
  • the signal may be emitted at various angles, and the signal reflected by the obstacle may be received at various angles.
  • the signal received by the obstacle detecting sensor 131 may be subjected to signal processing such as amplification and filtering, and then the distance and direction to the obstacle may be calculated.
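  • The following sketch illustrates this kind of processing in simplified form (a far-field approximation, not the patent's implementation; the receiver spacing is an assumed value): the echo round-trip time gives the distance, and the arrival-time difference between two staggered receivers gives an approximate bearing.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C
BASELINE = 0.10         # spacing between the two receivers in meters (assumption)

def echo_distance(round_trip_s: float) -> float:
    """Obstacle distance in meters from the echo round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def bearing_deg(arrival_time_diff_s: float) -> float:
    """Approximate bearing of the obstacle (0 = straight ahead), from the
    difference in echo arrival times at the two receivers."""
    path_diff = SPEED_OF_SOUND * arrival_time_diff_s
    s = max(-1.0, min(1.0, path_diff / BASELINE))  # clamp against noise
    return math.degrees(math.asin(s))

print(echo_distance(0.01))  # ~1.7 m away
print(bearing_deg(100e-6))  # ~20 degrees off-axis
```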
  • the sensor unit 170 may further include a motion detection sensor for detecting the operation of the mobile robot 100 according to the driving of the main body 101 and outputs the motion information.
  • a motion sensor may be a gyro sensor, a wheel sensor, an acceleration sensor, or the like.
  • the gyro sensor detects the rotation direction and detects the rotation angle when the mobile robot 100 moves according to the driving mode.
  • the gyro sensor detects the angular velocity of the mobile robot 100 and outputs a voltage value proportional to the angular velocity.
  • the controller 140 calculates the rotation direction and the rotation angle by using the voltage value output from the gyro sensor.
  • the wheel sensor is connected to the left wheel and the right wheel to sense the number of revolutions of the wheel.
  • the wheel sensor may be a rotary encoder.
  • the rotary encoder detects and outputs the number of revolutions of the left and right wheels.
  • The controller 140 may calculate the rotational speed of the left and right wheels using the sensed number of revolutions. In addition, the controller 140 may calculate the rotation angle using the difference between the rotational speeds of the left wheel and the right wheel.
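  • A minimal odometry sketch of this computation follows (the constants and names are illustrative; the patent does not give formulas): wheel travel is derived from the encoder counts, the average of the two wheels gives the distance moved, and the left/right difference divided by the wheel spacing gives the rotation angle.

```python
import math

TICKS_PER_REV = 360         # encoder resolution (assumption)
WHEEL_CIRCUMFERENCE = 0.22  # meters (assumption)
WHEEL_BASE = 0.23           # meters between left and right wheels (assumption)

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Advance the pose (x, y, heading in radians) by one encoder interval."""
    d_left = left_ticks / TICKS_PER_REV * WHEEL_CIRCUMFERENCE
    d_right = right_ticks / TICKS_PER_REV * WHEEL_CIRCUMFERENCE
    d_center = (d_left + d_right) / 2.0        # distance traveled
    d_theta = (d_right - d_left) / WHEEL_BASE  # rotation angle from the
                                               # left/right speed difference
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

pose = update_pose(0.0, 0.0, 0.0, left_ticks=90, right_ticks=110)
print(pose)  # moved ~6 cm while turning slightly to the left
```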
  • the acceleration sensor detects a change in the speed of the mobile robot 100, for example, a change in the mobile robot 100 due to start, stop, direction change, collision with an object, and the like.
  • The acceleration sensor is attached adjacent to a main wheel or an auxiliary wheel and can detect slipping or idling of the wheel.
  • the acceleration sensor may be built in the controller 140 to detect a speed change of the mobile robot 100. That is, the acceleration sensor detects the impact amount according to the speed change and outputs a voltage value corresponding thereto. Thus, the acceleration sensor can perform the function of the electronic bumper.
  • The controller 140 may calculate a position change of the mobile robot 100 based on the motion information output from the motion detection sensor. This position is a relative position, in contrast to the absolute position obtained using image information.
  • Through such relative position recognition, the mobile robot can improve the performance of position recognition that uses image information and obstacle information.
  • the mobile robot 100 may include an output unit 180 to display reservation information, a battery state, an operation mode, an operation state, an error state, etc. as an image or output a sound.
  • the output unit 180 may include a display 182 for displaying a user interface screen as an image such as reservation information, a battery state, an operation mode, an operation state, an error state, and the like.
  • The display 182 may be disposed on the dust container cover 161.
  • the display 182 may be provided separately from the dust container 160.
  • When the head 111 is opened, the display 182 may slide or be pushed forward so that the user can more easily see the user interface screen provided through the display 182.
  • the display 182 may be configured as a touch screen by forming a mutual layer structure with the touch pad.
  • the display 182 may be used as an input device capable of inputting information by a user's touch in addition to the output device.
  • the output unit 180 may further include a sound output unit 181 for outputting an audio signal.
  • the sound output unit 181 may output a notification message such as a warning sound, an operation mode, an operation state, an error state, etc. under the control of the controller 140.
  • the sound output unit 181 may convert an electrical signal from the controller 140 into an audio signal and output the audio signal.
  • a speaker or the like may be provided.
  • the mobile robot 100 includes a controller 140 for processing and determining various types of information, such as recognizing a current position, and a storage 150 for storing various data.
  • the mobile robot 100 may further include a communication unit 190 for transmitting and receiving data with the mobile terminal.
  • The mobile terminal includes an application for controlling the mobile robot 100, displays a map of the driving area to be cleaned by the mobile robot 100 upon execution of the application, and can designate a specific area on the map to be cleaned.
  • Examples of the mobile terminal may include a remote controller, a PDA, a laptop, a smartphone, a tablet, and the like, having an application for setting a map.
  • the mobile terminal may communicate with the mobile robot 100 to display a current location of the mobile robot together with a map, and information about a plurality of areas may be displayed. In addition, the mobile terminal updates and displays its position as the mobile robot travels.
  • the controller 140 controls the overall operation of the mobile robot 100 by controlling the image acquisition unit 120, the driving unit 160, the display 182, etc. constituting the mobile robot 100.
  • the storage unit 150 records various types of information necessary for the control of the mobile robot 100 and may include a volatile or nonvolatile recording medium.
  • The recording medium stores data that can be read by a microprocessor and includes a hard disk drive (HDD), solid state disk (SSD), silicon disk drive (SDD), ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • the storage unit 150 may store a map for the driving zone.
  • the map may be input by a mobile terminal, a server, or the like, which may exchange information with the mobile robot 100 through wired or wireless communication, or may be generated by the mobile robot 100 by itself.
  • the map may display the locations of the rooms in the driving zone.
  • the current position of the mobile robot 100 may be displayed on the map, and the current position of the mobile robot 100 on the map may be updated during the driving process.
  • the portable terminal can store a map identical to the map stored in the storage 150.
  • the storage unit 150 may store cleaning history information. Such cleaning history information may be generated every time cleaning is performed.
  • The map of the driving zone stored in the storage unit 150 may be a navigation map used for driving during cleaning, a SLAM (Simultaneous Localization and Mapping) map used for location recognition, a learning map used for learning cleaning by storing information on obstacles and the like, a global location map used for global location recognition, or an obstacle recognition map in which information about recognized obstacles is recorded.
  • Maps may be stored and managed in the storage unit 150 separately for each use, but maps may not always be clearly classified by use.
  • a plurality of pieces of information may be stored in one map to be used for at least two purposes.
  • the controller 140 may include a driving control module 141, a position recognition module 142, a map generation module 143, and an obstacle recognition module 144.
  • the driving control module 141 controls the driving of the mobile robot 100, and controls the driving of the driving unit 160 according to the driving setting.
  • The driving control module 141 may determine the driving path of the mobile robot 100 based on the operation of the driving unit 160. For example, the driving control module 141 may determine the current or past moving speed and distance traveled of the mobile robot 100 based on the rotational speed of the driving wheels 136, and may also determine the current or past direction-change process based on the rotational direction of each driving wheel. Based on the driving information of the mobile robot 100 thus identified, the position of the mobile robot 100 on the map may be updated.
  • the map generation module 143 may generate a map of the driving zone.
  • the map generation module 143 may generate a map by processing the image acquired through the image acquisition unit 120. That is, a cleaning map corresponding to the cleaning area can be created.
  • the map generation module 143 may recognize the global location by processing the image acquired through the image acquisition unit 120 at each location in association with the map.
  • the location recognition module 142 estimates and recognizes the current location.
  • The position recognition module 142 detects a location in association with the map generation module 143 using the image information of the image acquisition unit 120, so that the current position can be estimated and recognized even when the position of the mobile robot 100 changes suddenly.
  • The mobile robot 100 may recognize its position during continuous driving through the position recognition module 142, and may also learn a map and estimate its current position through the map generation module 143 and the obstacle recognition module 144 without the position recognition module 142.
  • the image acquisition unit 120 acquires images around the mobile robot 100.
  • an image acquired by the image acquisition unit 120 is defined as an 'acquisition image'.
  • The acquired image includes various features such as lights on the ceiling, edges, corners, blobs, and ridges.
  • the map generation module 143 detects a feature from each of the acquired images.
  • Various methods of detecting a feature from an image are well known in the field of computer vision technology.
  • Several feature detectors are known that are suitable for the detection of these features. Examples include Canny, Sobel, Harris & Stephens / Plessey, SUSAN, Shi & Tomasi, Level curve curvature, FAST, Laplacian of Gaussian, Difference of Gaussians, Determinant of Hessian, MSER, PCBR, and Gray-level blobs detectors.
  • the map generation module 143 calculates a descriptor based on each feature point.
  • The map generation module 143 may convert feature points into descriptors using the scale-invariant feature transform (SIFT) technique for feature detection.
  • the descriptor may be expressed as an n-dimensional vector.
  • SIFT can detect features that are invariant to the scale, rotation, and brightness changes of the photographed target, so that invariant (e.g., rotation-invariant) features of the same area can be detected even when the mobile robot 100 photographs it in different postures.
  • In addition to SIFT, other feature detection techniques, such as HOG (Histogram of Oriented Gradients), Haar features, Ferns, LBP (Local Binary Pattern), and MCT (Modified Census Transform), may also be applied.
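  • For concreteness, a minimal feature-extraction sketch using OpenCV's SIFT implementation is shown below. OpenCV is used here only as a stand-in and the image path is hypothetical; the patent does not prescribe a library.

```python
import cv2

# Load an acquired image (hypothetical file name) in grayscale.
image = cv2.imread("ceiling_view.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(image, None)

# Each detected feature point (corner, blob, etc.) is described by a
# 128-dimensional vector summarizing the surrounding brightness gradients.
print(len(keypoints), descriptors.shape)  # e.g., 412 (412, 128)
```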
  • Based on the descriptor information obtained from the acquired image of each position, the map generation module 143 classifies at least one descriptor of each acquired image into a plurality of groups according to a predetermined sub-classification rule, and converts the descriptors included in each group into lower representative descriptors according to a predetermined sub-representation rule.
  • As another example, all descriptors gathered from the acquired images in a predetermined zone may be classified into a plurality of groups according to the predetermined sub-classification rule, and the descriptors included in the same group may each be converted into lower representative descriptors according to the predetermined sub-representation rule.
  • the map generation module 143 may obtain a feature distribution of each location through the above process.
  • Each positional feature distribution can be represented by a histogram or an n-dimensional vector.
  • the map generation module 143 may estimate an unknown current position based on a descriptor calculated from each feature point without passing through a predetermined sub classification rule and a predetermined sub representative rule.
  • When the current position of the mobile robot 100 becomes unknown due to a position jump or the like, the current position may be estimated based on data such as previously stored descriptors or lower representative descriptors.
  • The mobile robot 100 obtains an acquired image through the image acquisition unit 120 at the unknown current position. Various features, such as lights on the ceiling, edges, corners, blobs, and ridges, are identified through the image.
  • the position recognition module 142 detects features from the acquired image. Description of the various methods of detecting features from an image in the field of computer vision technology and the various feature detectors suitable for the detection of these features are described above.
  • the position recognition module 142 calculates a recognition descriptor through a recognition descriptor calculating step based on each recognition feature point.
  • the recognition feature point and the recognition descriptor are for explaining a process performed by the location recognition module 142 and are distinguished from terms describing the process performed by the map generation module 143.
  • They merely refer, in different terms, to features of the world outside the mobile robot 100.
  • The position recognition module 142 may convert a recognition feature point into a recognition descriptor using the scale-invariant feature transform (SIFT) technique for feature detection.
  • the recognition descriptor may be expressed as an n-dimensional vector.
  • SIFT is an image recognition technique that selects feature points that can be easily identified in an acquired image, such as corner points, and then obtains an n-dimensional vector whose value in each dimension is a numerical measure of the distribution of brightness gradients (the direction of brightness change and the degree of change) of the pixels belonging to a predetermined area around each feature point.
  • Based on at least one piece of recognition descriptor information obtained from the acquired image of the unknown current position, the position recognition module 142 converts the information, according to a predetermined sub-conversion rule, into a form (a sub-recognition feature distribution) that can be compared with the position information to be compared (for example, the feature distribution of each position).
  • Each positional feature distribution may then be compared with the recognition feature distribution to calculate a similarity. A similarity (probability) may be calculated for the location corresponding to each position, and the location with the greatest calculated probability may be determined to be the current location.
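  • The following sketch illustrates this last step (cosine similarity is used as one reasonable measure; the patent does not mandate a specific one, and the histograms below are made-up examples): the stored feature distribution of each mapped position is compared with the recognition feature distribution, and the best-scoring position is taken as the current position.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def estimate_position(recognition_dist, position_dists):
    """Return the mapped position whose feature distribution is most
    similar to the recognition feature distribution."""
    scores = {pos: cosine_similarity(recognition_dist, dist)
              for pos, dist in position_dists.items()}
    return max(scores, key=scores.get)

# Hypothetical 8-bin feature histograms for three mapped positions.
stored = {
    "living_room": np.array([4, 0, 2, 7, 1, 0, 3, 5], dtype=float),
    "kitchen":     np.array([1, 6, 0, 2, 8, 1, 0, 2], dtype=float),
    "hallway":     np.array([0, 1, 5, 0, 2, 6, 4, 1], dtype=float),
}
current = np.array([3, 0, 2, 6, 1, 1, 3, 4], dtype=float)
print(estimate_position(current, stored))  # -> "living_room"
```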
  • the controller 140 may divide the driving zone and generate a map composed of a plurality of regions, or recognize the current position of the main body 101 based on the pre-stored map.
  • the controller 140 may transmit the generated map to the mobile terminal, the server, etc. through the communication unit 190.
  • the controller 140 may store the map in the storage unit.
  • The controller 140 transmits updated information to the portable terminal so that the map stored in the portable terminal is the same as that stored in the mobile robot 100.
  • Accordingly, the mobile robot 100 may clean the designated area in response to a cleaning command from the mobile terminal, and the current position of the mobile robot may be displayed on the mobile terminal.
  • The map may be divided into a plurality of areas, may include connection paths connecting the plurality of areas, and may include information about obstacles in each area.
  • the controller 140 determines whether the current position of the mobile robot matches the position on the map.
  • the cleaning command may be input from a remote controller, a display, or a portable terminal.
  • The controller 140 recognizes the current position, recovers the current position of the mobile robot 100 based on it, and may control the driving unit 160 to move to the designated area.
  • The position recognition module 142 may analyze the acquired image input from the image acquisition unit 120 and estimate the current position based on the map.
  • the obstacle recognition module 144 or the map generation module 143 may also recognize the current position in the same manner.
  • the driving control module 141 calculates a driving route from the current position to the designated region and controls the driving unit 160 to move to the designated region.
  • the driving control module 141 may divide the entire driving zone into a plurality of areas according to the received cleaning pattern information, and set at least one area as a designated area.
  • the driving control module 141 may calculate a driving route according to the received cleaning pattern information, travel along the driving route, and perform cleaning.
  • the controller 140 may store the cleaning record in the storage 150 when the cleaning of the set designated area is completed.
  • The control unit 140 may transmit the operation state or cleaning state of the mobile robot 100 to the mobile terminal and the server at predetermined intervals through the communication unit 190.
  • the portable terminal displays the position of the mobile robot along with the map on the screen of the running application based on the received data, and also outputs information on the cleaning state.
  • The mobile robot 100 moves in one direction until an obstacle or a wall surface is detected, and when an obstacle is recognized through the sensor unit 170 and the obstacle recognition module 144, it determines a driving pattern, such as going straight or rotating, according to the attributes of the recognized obstacle.
  • For example, if the attribute of the recognized obstacle indicates a kind of obstacle that can be passed over, the mobile robot 100 may continue to go straight. If, on the other hand, the attribute indicates an obstacle that cannot be passed, the mobile robot 100 rotates, moves a certain distance, and then moves in the direction opposite to the initial movement direction until an obstacle is detected again, thereby driving in a zigzag pattern.
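  • This decision logic can be summarized as a small state machine, sketched below (the obstacle categories, action names, and lane handling are assumptions for illustration, not taken from the patent):

```python
PASSABLE = {"threshold", "carpet_edge"}  # obstacle kinds assumed passable

def on_obstacle(kind: str, state: dict) -> list[str]:
    """Return an action sequence for a recognized obstacle attribute."""
    if kind in PASSABLE:
        return ["go_straight"]  # low obstacle: drive over it
    # Impassable: turn, shift over one lane width, turn again, and sweep
    # back in the opposite direction, producing the zigzag pattern.
    turn = "turn_left" if state["sweep"] == "right" else "turn_right"
    state["sweep"] = "left" if state["sweep"] == "right" else "right"
    return [turn, "advance_lane_width", turn, "go_straight"]

state = {"sweep": "right"}
print(on_obstacle("threshold", state))  # ['go_straight']
print(on_obstacle("wall", state))       # turn-around to the left
print(on_obstacle("wall", state))       # next turn-around goes right
```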
  • As described above, the mobile robot 100 according to an embodiment of the present invention includes a main body 101 including an openable and closable head 111, a traveling part 160 for moving the main body 101, a display 182 accommodated in the main body 101, and a controller 140 that controls a user interface screen to be displayed on the display 182 when the head 111 is opened.
  • When the head 111 is opened, the controller 140 may control the driving unit 160 to stop the movement. In this case, the controller 140 may control the driving unit 160 to resume the movement when the head 111 is closed.
  • The user may open the head 111 to check information, enter settings, or perform other operations, or a child may open the head 111 of the mobile robot out of curiosity. If the mobile robot continues to travel in such a case, a safety accident such as a collision may occur.
  • Accordingly, the controller 140 displays the user interface screen on the display 182 only when the head 111 is opened, stops the movement of the mobile robot 100, and resumes the movement when the head 111 is closed, so that collisions between people and the mobile robot 100 can be prevented and safety can be improved.
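  • A minimal sketch of this head interlock follows (the class and method names are illustrative, not taken from the patent): opening the head stops the drive and shows the user interface screen, and closing it hides the screen and resumes movement.

```python
class DrivingUnit:
    def stop(self): print("driving unit: stopped")
    def resume(self): print("driving unit: resumed")

class Display:
    def show_ui_screen(self): print("display: UI screen shown")
    def hide_ui_screen(self): print("display: UI screen hidden")

class MobileRobotController:
    def __init__(self, driving_unit: DrivingUnit, display: Display):
        self.driving_unit = driving_unit
        self.display = display

    def on_head_state_changed(self, head_open: bool) -> None:
        if head_open:
            self.driving_unit.stop()       # avoid collisions while the head is open
            self.display.show_ui_screen()  # the UI is provided only while open
        else:
            self.display.hide_ui_screen()
            self.driving_unit.resume()     # continue the interrupted task

robot = MobileRobotController(DrivingUnit(), Display())
robot.on_head_state_changed(head_open=True)   # user opens the head
robot.on_head_state_changed(head_open=False)  # user closes the head
```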
  • When a user who is away from the mobile robot 100 controls it through another device, such as a mobile terminal, rather than through the input means of the mobile robot 100 itself, moving and operating the mobile robot 100 while the user is manipulating the mobile terminal may increase the risk of an accident, such as a collision, in public places such as airports, train stations, terminals, department stores, and marts.
  • However, since the user interface screen is displayed on the display 182 only when the head 111 is opened and the movement of the mobile robot 100 is stopped, a safety means is secured by which the user can operate the robot while checking the state of the mobile robot 100.
  • the mobile robot 100 may further include a communication unit 190 for transmitting predetermined information to a preset mobile terminal when the head 111 is opened.
  • the predetermined information may include data indicating that the head 111 is opened and / or data for providing the user interface screen.
  • the user can check information using the user interface screen displayed on the display means of the portable terminal, and input various commands such as reservation setting, return command, and history information confirmation.
  • the controller 140 may control a screen corresponding to the predetermined command to be displayed on the display 182.
  • the screen corresponding to the predetermined command is a screen corresponding to a command input by the user operating a user interface screen displayed on the display means of the portable terminal.
  • the screen corresponding to the predetermined command may be a detailed screen for adding a cleaning reservation setting.
  • the screen corresponding to the predetermined command may be a detailed screen for setting the cleaning area and / or the cleaning mode.
  • the screen corresponding to the predetermined command may be displayed on the display means of the portable terminal.
  • The user interface screen and the screens corresponding to user manipulation may be displayed not only on the display 182 of the mobile robot 100 but also on the display means of the mobile terminal.
  • the screen provided through the display 182 of the mobile robot 100 and the screen provided through the display means of the portable terminal may be configured in the same manner.
  • the screen provided through the display 182 of the mobile robot 100 and the screen provided through the display means of the portable terminal may configure at least some items differently.
  • the home user interface screen provided on the display 182 of the mobile robot 100 may display a previous cleaning history.
  • the home user interface screen provided through the display means of the portable terminal may include an item which can be moved by directly operating the mobile robot 100.
  • the controller 140 may control the display 182 to further display a pop-up for guiding the head 111 to be closed.
  • a pop-up for guiding the closing of the head 111 may be displayed on the display means of the portable terminal.
  • the controller 140 may control to perform an operation corresponding to the received predetermined command.
  • the mobile robot 100 may return to a predetermined designated place in response to an immediate return command.
  • the mobile robot 100 may operate according to the reservation setting when the reserved date and time arrive.
  • The display 182 may be configured as a touch screen that receives a user's touch input, and when a predetermined command is received through the display 182, the controller 140 may control a screen corresponding to the predetermined command to be displayed on the display 182.
  • the controller 140 may control the display 182 to further display a pop-up for guiding the head 111 to be closed.
  • the controller 140 may control to perform an operation corresponding to the received predetermined command.
  • By controlling the display 182 to be turned on only when needed, the controller 140 prevents the display 182 from being driven unnecessarily and reduces power consumption.
  • The user interface screen may include a state information area including state information of the mobile robot 100, a main menu area including a plurality of main menu items, and a management area including operation reservation information and operation history information.
  • The main menu area may include a cleaning start item for inputting a cleaning operation command, a return item for inputting a command to move to a designated place, and a cleaning setting item for inputting a cleaning area and a cleaning mode.
  • When a selection input for one of the main menu items is received, the controller 140 may control the display 182 to display a detailed screen corresponding to the selected item.
  • In this case, the controller 140 may control the display 182 to further display a pop-up guiding the head 111 to be closed.
  • the state information may include at least one of current time information, battery state information, component replacement information, and error information.
  • The management area may include set reservation schedule information and a reservation add button for adding a reservation schedule, and may also include information on the most recent cleaning history and a cleaning history button that can call up a plurality of pieces of cleaning history information.
  • the management area may further include an item capable of displaying the most recent cleaning history on a map.
  • the user interface screen will be described later in detail with reference to FIGS. 9 to 17.
  • FIG. 7 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
  • When the head 111 is opened, the controller 140 may control a user interface (UI) screen to be displayed on the display 182 (S730).
  • More specifically, the controller 140 may determine whether the mobile robot 100 is moving (S720) and, if it is not moving, control the UI screen to be displayed on the display 182 (S730).
  • If the mobile robot 100 is moving, the controller 140 may stop the movement of the mobile robot 100 (S725) and then control the UI screen to be displayed on the display 182 (S730).
  • the UI screen may also be provided to a user's portable terminal.
  • the user may check information included in the user interface screen or input a command for controlling the mobile robot 100 through the display 182 or the portable terminal configured as a touch screen.
  • the mobile robot 100 may receive a user input through the display 182 configured as a touch screen or receive a user input from a preset mobile terminal through the communication unit 190 (S740).
  • When a predetermined command is received, the controller 140 may control a feedback screen corresponding to the predetermined command to be displayed on the display 182 (S750).
  • the feedback screen may be a detailed screen for adding a cleaning reservation setting.
  • the feedback screen may be a detailed screen for setting a cleaning area and / or a cleaning mode.
  • the controller 140 may control the display 182 to further display a pop-up for guiding the head 111 to be closed.
  • the feedback screen may be a feedback / guidance screen including a pop-up for guiding the head 111 to be closed.
  • a pop-up for guiding the closing of the head 111 may be displayed on the display means of the portable terminal.
  • the controller 140 may terminate the provision of the user interface screen and perform an operation corresponding to the received predetermined command (input).
  • the mobile robot 100 may return to a predetermined designated place in response to an immediate return command.
  • the mobile robot 100 may operate according to the reservation setting when the reserved date and time arrive.
  • the feedback / guide screen may also be displayed on a display of the portable terminal.
  • FIG. 8 is a flowchart illustrating a control method of a mobile robot and a portable terminal according to an embodiment of the present invention.
  • the mobile robot 100 may transmit predetermined information to the preset portable terminal 200 (S830).
  • the predetermined information may include data indicating that the head 111 is opened and / or data for providing the user interface screen.
  • the mobile terminal 200 may display a user interface screen for controlling the mobile robot 100 on the display means based on the information received from the mobile robot 100 (S840).
  • the user may check information using the user interface screen displayed on the display means of the portable terminal 200 and input various commands such as reservation setting, return command, and history information check (S850).
  • Meanwhile, the mobile robot 100 may determine whether it is moving before or after transmitting the information (S830) and, if it is moving, stop the movement (S820).
  • The mobile robot 100 determines whether the head is closed (S880) and, if the head is closed, may perform an operation corresponding to the received command (S890).
  • When a predetermined command is received from the portable terminal 200 (S860), the mobile robot 100 may transmit to the portable terminal 200 a response signal indicating that the command has been received, or the information requested by the portable terminal 200 (S870).
  • the portable terminal 200 may display a screen corresponding to a predetermined command input by the user on the display means.
  • a message for instructing to close the head 111 of the mobile robot 100 may be further displayed on the display means of the mobile terminal 200.
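  • To make this exchange concrete, the sketch below simulates the sequence of FIG. 8 in a single process (the JSON message shapes and class names are assumptions for illustration; the patent does not define a wire format): the robot notifies the terminal that the head is open, receives a command and acknowledges it, and executes the command only once the head is closed.

```python
import json

class PortableTerminal:
    def on_notification(self, msg: str) -> str:
        if json.loads(msg)["event"] == "head_opened":
            print("terminal: displaying UI screen for the robot")
        # The user picks a command on the terminal UI (S850/S860).
        return json.dumps({"command": "start_cleaning"})

class MobileRobot:
    def __init__(self, terminal: PortableTerminal):
        self.terminal = terminal
        self.pending = None

    def notify_head_opened(self):  # transmit information to the terminal (S830)
        reply = self.terminal.on_notification(json.dumps({"event": "head_opened"}))
        self.pending = json.loads(reply)["command"]
        print(f"robot: acknowledged command '{self.pending}'")  # response (S870)

    def close_head(self):  # head closed (S880) -> perform the operation (S890)
        if self.pending:
            print(f"robot: executing '{self.pending}'")

robot = MobileRobot(PortableTerminal())
robot.notify_head_opened()
robot.close_head()
```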
  • FIGS. 9 to 17 are views referred to in describing a method for controlling a mobile robot according to an embodiment of the present invention.
  • a user interface screen is provided through the display 182 or a predetermined portable terminal.
  • predetermined screens may be displayed on the display 182 or on display means of a predetermined portable terminal.
  • a screen including a welcome message, a screen for authenticating a user, and the like may be displayed on the display 182 or a display means of a preset mobile terminal.
  • FIG. 9 illustrates a user interface screen for inputting a password when a password input method is used for user authentication; however, the present invention is not limited thereto, and other user authentication methods may be used.
  • the password input screen may include a state information area 910 including state information of the mobile robot and an input button area 920 for inputting a password.
  • The status information area 910 may be implemented as a bar type, including a status bar disposed at the top of the screen.
  • the status information area 910 may include notification items such as parts replacement and error, current time / date information, and battery information.
  • The input button area 920 may include a text guide, such as 'Please enter a four-digit password' or 'Try again'.
  • the input button area 920 may include a password input button such as a numeric panel and a confirmation key.
  • The input button area 920 may include a four-digit password input window, in which input numbers are displayed and then hidden, and a number delete key.
  • a user interface screen for controlling the mobile robot 100 may be provided through the display 182 or the portable terminal 200 provided in the mobile robot 100.
  • FIG. 10 illustrates a user interface screen provided through the display 182 provided in the mobile robot 100
  • FIG. 11 illustrates a user interface screen provided through the display means provided in the mobile terminal 200.
  • the user interface screens illustrated in FIGS. 10 and 11 are screens including main menu items of a higher depth and may be referred to as a home user interface screen, a home screen, or a main screen.
  • The home user interface screen 1000 provided through the display 182 may include a state information area 1010 including state information of the mobile robot 100, a main menu area 1020 including a plurality of main menu items, and a management area 1030 including operation reservation information and operation history information.
  • the state information displayed in the state information area 1010 may include at least one of current time information, battery state information, component replacement information, and error information.
  • The main menu area 1020 is an area for displaying the main menu and may include a cleaning start item for inputting a cleaning operation command, a return item for inputting a command to move to a designated place, and a cleaning setting item for inputting a cleaning area and a cleaning mode.
  • When a selection input for one of the main menu items is received, the controller 140 may control the display 182 to display a detailed screen corresponding to the selected item.
  • the controller 140 may control the display 182 to further display a pop-up for guiding the head 111 to be closed.
  • the controller 140 may control to perform an operation corresponding to a predetermined command.
  • the management area 1030 may include operation reservation information and operation history information.
  • the management area 1030 may include a set reservation schedule information and a reservation add button for adding a reservation schedule.
  • The management area 1030 may include information on the most recent cleaning history and a cleaning history button that can call up a plurality of pieces of cleaning history information.
  • The management area 1030 may further include an item for displaying the most recent cleaning history on a map, and by selecting this item, the user can have the most recent cleaning history reproduced on the map.
  • an item capable of displaying the most recent cleaning history on a map may be included only in a user interface screen provided through the display 182 of the mobile robot 100.
  • the management area 1030 may be divided into an area 1031 indicating reservation management and an area 1032 indicating cleaning history management.
  • the region 1031 indicating reservation management may include a reservation schedule item for displaying a reservation time / day in a card type and a reservation addition button.
  • The area 1032 indicating cleaning history management may include a recent cleaning history with the last cleaning date/time information, a cleaning history button, a cleaning area map display item, and 2x/4x/8x playback buttons.
  • The home user interface screen 1100 provided through the mobile terminal 200 may include a state information area 1110 including state information of the mobile robot 100, a main menu area 1120 including a plurality of main menu items, and a management area 1130 including operation reservation information and a mobile robot manipulation menu.
  • the state information displayed in the state information area 1110 may include at least one of current time information, battery state information, component replacement information, and error information.
  • The main menu area 1120 is an area for displaying the main menu and may include a cleaning start item for inputting a cleaning operation command, a return item for inputting a command to move to a designated place, and a cleaning setting item for inputting a cleaning area and a cleaning mode.
  • the user may operate the mobile terminal 200 to select one of the items included in the main menu area 1120 or input a predetermined command, and the mobile terminal 200 may transmit a control signal including the predetermined command to the mobile robot 100.
  • the controller 140 may determine whether the head 111 is closed when a predetermined command is received through the communication unit 190.
  • if the head 111 is open, the controller 140 may transmit a signal indicating that the head 111 is open to the mobile terminal 200 through the communication unit 190.
  • upon receiving this signal, the mobile terminal 200 may display a message prompting the user to close the head 111.
  • if the head 111 is closed, the controller 140 may control the mobile robot to perform the operation corresponding to the predetermined command.
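  • the exchange just described could be sketched as a simple message protocol (a sketch under assumed names; the patent does not specify message formats):

        // Hypothetical terminal-robot messages (formats assumed, not from the patent).
        sealed class Message {
            data class Command(val name: String) : Message() // terminal -> robot
            object HeadOpen : Message()                      // robot -> terminal
            object Ack : Message()                           // robot -> terminal
        }

        class RobotSide(
            private val send: (Message) -> Unit,   // via the communication unit
            private val headClosed: () -> Boolean
        ) {
            fun onReceive(msg: Message) {
                if (msg is Message.Command) {
                    if (!headClosed()) {
                        send(Message.HeadOpen)     // terminal then shows "close the head"
                    } else {
                        // ... perform the operation corresponding to msg.name ...
                        send(Message.Ack)
                    }
                }
            }
        }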
  • the management area 1130 may include operation reservation information and a mobile robot operation menu.
  • the management area 1130 may be divided into an area 1131 indicating reservation management and a robot operation area 1132.
  • the region 1131 indicating reservation management may include a reservation schedule item displaying the reserved time/day in card form, a reservation delete button (X button), and a reservation add button.
  • the robot manipulation area 1132 is a region for directly manipulating the robot, and may include four-direction movement keys, a stop button, and a suction on/off button.
  • the robot manipulation area 1132 may be included only in the user interface screen provided through the mobile terminal 200.
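  • these manipulation controls could map onto drive commands roughly as below (a sketch; key, command, and wire-format names are assumptions):

        // Hypothetical mapping from manipulation-area controls to drive commands.
        enum class DirectionKey { UP, DOWN, LEFT, RIGHT }

        sealed class RobotCommand {
            data class Move(val key: DirectionKey) : RobotCommand() // four-direction key
            object Stop : RobotCommand()                            // stop button
            data class Suction(val on: Boolean) : RobotCommand()    // suction on/off
        }

        // Encode a tapped control as a string for the control signal (format assumed).
        fun encode(cmd: RobotCommand): String = when (cmd) {
            is RobotCommand.Move -> "move:" + cmd.key.name.lowercase()
            RobotCommand.Stop -> "stop"
            is RobotCommand.Suction -> if (cmd.on) "suction:on" else "suction:off"
        }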
  • when a selection input for a main menu item is received, a detailed screen corresponding to the selected item may be provided through the display 182 or the mobile terminal 200.
  • FIG. 12 illustrates the cleaning start detail screen displayed when the cleaning start item for inputting a cleaning operation command is selected from the main menu areas 1020 and 1120.
  • the cleaning start detail screen may include a state information area 1210 including state information of the mobile robot 100, a sub-menu area 1220 including one or more sub-menu items, and a monitoring area 1230 for displaying cleaning start information.
  • the sub-menu area 1220 may include sub-menu items such as pause and cancel cleaning.
  • the monitoring area 1230 may include at least one of: a status information area 1231 including current status information such as waiting or cleaning, a status progress bar, and cleaning percentage information; a cleaning time information area 1232 indicating the cleaning time in hours/minutes; a cleaning mode information area 1233 indicating the currently set cleaning mode, such as a zigzag pattern; and a map area 1234 displaying the current position of the mobile robot 100 and the area it is in charge of on the map.
  • the cleaning start detail screen illustrated in FIG. 12 is a screen displayed when the user selects the cleaning start main menu, and its display may be omitted depending on the setting.
  • in that case, when the user selects the cleaning start main menu, the mobile robot 100 may immediately start the cleaning operation if the head 111 is closed, without displaying the cleaning start detail screen of FIG. 12.
  • the cleaning start detail screen illustrated in FIG. 12 may be provided only through the mobile terminal 200.
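  • one way to express this "skip the detail screen" behavior is a minimal sketch assuming a boolean setting (all names are hypothetical, not the patent's implementation):

        // Hypothetical setting controlling whether the detail screen is shown.
        data class UiSettings(val showCleaningStartDetail: Boolean)

        class CleaningStartFlow(
            private val settings: UiSettings,
            private val headClosed: () -> Boolean,
            private val showDetailScreen: () -> Unit,
            private val startCleaning: () -> Unit
        ) {
            fun onCleaningStartSelected() {
                when {
                    settings.showCleaningStartDetail -> showDetailScreen()
                    headClosed() -> startCleaning()  // start immediately, no detail screen
                    else -> { /* head open: a close-head pop-up would be shown instead */ }
                }
            }
        }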
  • if the head 111 is open, a pop-up 1300 may be displayed to guide the user to close the head 111.
  • the pop-up 1300 is a guide screen for prompting the head to be closed, and may include an image 1310 and/or text 1320 indicating that the head 111 is open.
  • the pop-up 1300 may be displayed on top of the currently displayed screen, regardless of which screen it is.
  • FIG. 14 illustrates the return detail screen displayed when the return item for inputting a move command to a designated place is selected from the main menu areas 1020 and 1120.
  • the return detail screen may include a state information area 1410 including state information of the mobile robot 100, a sub-menu area 1420 including one or more sub-menu items, and a monitoring area 1430 for displaying cleaning information.
  • the sub-menu area 1420 may include sub-menu items such as pause and cancel return.
  • the monitoring area 1430 may include at least one of: a status information area 1431 including current status information such as waiting or cleaning, a status progress bar, and cleaning percentage information; a cleaning time information area 1432 displaying the cleaning time in hours/minutes; a cleaning mode information area 1433 showing the currently set cleaning mode, such as a zigzag pattern; and a map area displaying the current position of the mobile robot 100 and the area it is in charge of on the map.
  • the display of the return detail screen may be omitted depending on the setting.
  • in that case, the mobile robot 100 may immediately start the return operation if the head 111 is closed, without displaying the return detail screen of FIG. 14.
  • the return detail screen illustrated in FIG. 14 may be provided only through the mobile terminal 200.
  • likewise, if the head 111 is open, a pop-up 1500 guiding the user to close the head 111 may be displayed.
  • the pop-up 1500 is a guide screen for prompting the head to be closed, and may include an image 1510 and/or text 1520 indicating that the head 111 is open.
  • the guide information 1320 and 1520 provided as text may include information about an operation to be performed after the head 111 is closed.
  • FIG. 16 illustrates the setting detail screen displayed when the cleaning setting item, which allows inputs regarding the cleaning area and the cleaning mode, is selected from the main menu areas 1020 and 1120.
  • the setting detail screen may include a state information area 1610 including state information of the mobile robot 100, a sub-menu area 1620 including one or more sub-menu items such as move to home screen and move to previous screen, and a detail menu area 1630 for displaying the cleaning setting detail menu.
  • the detail menu area 1630 may include a cleaning area detail menu button 1631 for setting a cleaning area and a cleaning mode detail menu button 1632 for setting a cleaning mode.
  • when the cleaning area detail menu button 1631 is selected, the screen may be switched to the cleaning area designation screen shown in FIG. 17.
  • the cleaning area designation screen may include a state information area 1710 including state information of the mobile robot 100, a sub-menu area 1720 including one or more sub-menu items such as move to home screen and move to previous screen, and a control area 1730 for setting a cleaning area.
  • the control area 1730 is an area for manipulating the cleaning area designation, and may include a map screen 1731 for specifying a cleaning area by touch and/or drag, an input area 1732 for entering horizontal/vertical meter (m) values, and a save/cancel button 1733.
  • the user may set the cleaning area by dragging from the point on the map screen 1731 where the mobile robot is waiting.
  • the cleaning area may be automatically formed in the form of a box.
  • that is, a cleaning area may be formed corresponding to the size of the box drawn from the point where the mobile robot is waiting.
  • the size of the cleaning area may be fine-tuned by entering exact numbers in the input area 1732.
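  • this drag-then-fine-tune flow could be sketched geometrically as follows (all types and function names are hypothetical; units in meters):

        // Hypothetical box geometry for the dragged cleaning area (meters).
        data class Point(val x: Double, val y: Double)
        data class CleaningBox(val origin: Point, val widthM: Double, val heightM: Double)

        // Box formed by dragging from the robot's waiting position to dragEnd.
        fun boxFromDrag(robotAt: Point, dragEnd: Point): CleaningBox =
            CleaningBox(
                origin = robotAt,
                widthM = kotlin.math.abs(dragEnd.x - robotAt.x),
                heightM = kotlin.math.abs(dragEnd.y - robotAt.y)
            )

        // Exact horizontal/vertical values typed into input area 1732 override the drag.
        fun withExactSize(box: CleaningBox, widthM: Double, heightM: Double): CleaningBox =
            box.copy(widthM = widthM, heightM = heightM)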
  • the mobile robot according to the present invention is not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.
  • the control method of the mobile robot may be implemented as processor-readable code on a processor-readable recording medium.
  • the processor-readable recording medium includes all kinds of recording devices in which data readable by a processor is stored. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; it may also be implemented in the form of a carrier wave, such as transmission over the Internet.
  • the processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

One aspect of the present invention relates to a mobile robot comprising: a body including a head that can be opened or closed; a driving unit for moving the body; a display installed in the body; and a controller for causing the display to display a user interface screen when the head is opened. Accordingly, the present invention enables easy confirmation of various information and convenient, safe control of a mobile robot.
PCT/KR2017/015069 2016-12-23 2017-12-20 Robot mobile WO2018117616A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160177923A KR102070210B1 (ko) 2016-12-23 2016-12-23 이동 로봇
KR10-2016-0177923 2016-12-23

Publications (1)

Publication Number Publication Date
WO2018117616A1 true WO2018117616A1 (fr) 2018-06-28

Family

ID=62626802

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/015069 WO2018117616A1 (fr) 2016-12-23 2017-12-20 Robot mobile

Country Status (2)

Country Link
KR (1) KR102070210B1 (fr)
WO (1) WO2018117616A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110967703A (zh) * 2018-09-27 2020-04-07 广东美的生活电器制造有限公司 使用激光雷达和摄像头的室内导航方法及室内导航装置
CN109176559A (zh) * 2018-10-24 2019-01-11 上海思依暄机器人科技股份有限公司 一种机器人及控制方法、系统
CN111319045A (zh) * 2018-12-13 2020-06-23 东莞市豪铖电子科技有限公司 一种多功能机器人及控制方法
CN111319045B (zh) * 2018-12-13 2023-09-19 深圳小牛黑科技有限公司 一种多功能机器人及控制方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102241603B1 (ko) * 2018-07-25 2021-04-16 엘지전자 주식회사 이동 로봇
KR102471163B1 (ko) 2022-09-15 2022-11-25 주식회사 라스테크 배터리 안전 교체 가능한 이동형 로봇

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100814330B1 (ko) * 2006-11-28 2008-03-18 주식회사 한울로보틱스 학습보조 및 교사 도우미 로봇 시스템
KR20110018211A (ko) * 2009-08-17 2011-02-23 엘지전자 주식회사 로봇 청소기 및 이를 포함한 로봇 청소기 제어 시스템
KR20130092729A (ko) * 2012-02-13 2013-08-21 엘지전자 주식회사 로봇청소기 및 그 제어방법
US20150224640A1 (en) * 2005-09-30 2015-08-13 Irobot Corporation Companion robot for personal interaction
KR20160120841A (ko) * 2015-04-08 2016-10-19 (주)이산솔루션 청소 로봇을 이용한 스마트 청소 시스템

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002355779A (ja) * 2001-06-01 2002-12-10 Sharp Corp ロボット型インタフェース装置およびその制御方法
KR100629119B1 (ko) * 2005-04-20 2006-09-27 주식회사 유진로봇 로봇용 엘시디 폴더의 개폐 장치 및 그 제어 방법
KR102082754B1 (ko) * 2013-07-11 2020-04-16 삼성전자주식회사 청소 로봇 및 그 제어 방법
KR102158690B1 (ko) * 2013-12-27 2020-10-23 엘지전자 주식회사 로봇 청소기, 로봇 청소기 시스템 및 그 제어방법

Also Published As

Publication number Publication date
KR20180074141A (ko) 2018-07-03
KR102070210B1 (ko) 2020-01-28

Similar Documents

Publication Publication Date Title
WO2018117616A1 (fr) Robot mobile
WO2018139796A1 (fr) Robot mobile et procédé de commande associé
AU2020247141B2 (en) Mobile robot and method of controlling the same
WO2021006677A2 (fr) Robot mobile faisant appel à l'intelligence artificielle et son procédé de commande
WO2018155999A2 (fr) Robot mobile et son procédé de commande
WO2018139865A1 (fr) Robot mobile
WO2018038488A1 (fr) Robot mobile et son procédé de commande
WO2018135870A1 (fr) Système de robot mobile et son procédé de commande
WO2021006556A1 (fr) Robot mobile et son procédé de commande
WO2017188800A1 (fr) Robot mobile et son procédé de commande
WO2015183005A1 (fr) Dispositif mobile, robot nettoyeur et son procédé de commande
WO2020060267A1 (fr) Robot de nettoyage et procédé permettant d'exécuter une tâche associée
WO2021006542A1 (fr) Robot mobile faisant appel à l'intelligence artificielle et son procédé de commande
AU2020244635B2 (en) Mobile robot control method
EP3774202A1 (fr) Robot de nettoyage et procédé permettant d'exécuter une tâche associée
WO2019017521A1 (fr) Dispositif de nettoyage et procédé de commande associé
WO2020122541A1 (fr) Robot nettoyeur et son procédé de commande
WO2021172936A1 (fr) Robot mobile et son procédé de commande
WO2016048077A1 (fr) Robot de nettoyage et son procédé de commande
WO2020004824A1 (fr) Pluralité de dispositifs de nettoyage autonomes et procédé de commande associé
WO2019221523A1 (fr) Dispositif de nettoyage et procédé de commande dudit dispositif de nettoyage
WO2020045732A1 (fr) Procédé de commande de robot mobile
WO2020256370A1 (fr) Robot mobile et son procédé de commande
WO2021006547A2 (fr) Robot mobile et son procédé de commande
WO2019177418A1 (fr) Robot mobile et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17884958

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17884958

Country of ref document: EP

Kind code of ref document: A1