WO2019004773A1 - Mobile terminal and robot system comprising said mobile terminal - Google Patents

Mobile terminal and robot system comprising said mobile terminal

Info

Publication number
WO2019004773A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
area
mobile
cleaning
map
Prior art date
Application number
PCT/KR2018/007399
Other languages
English (en)
Korean (ko)
Inventor
유경만
김규희
손병곤
이창현
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020170083623A (published as KR20190003119A)
Priority claimed from KR1020170083624A (published as KR20190003120A)
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Publication of WO2019004773A1

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00: Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28: Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00: Manipulators not otherwise provided for
    • B25J13/00: Controls for manipulators
    • B25J13/06: Control stands, e.g. consoles, switchboards
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06: Safety devices
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486: Drag-and-drop
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725: Cordless telephones

Definitions

  • the present invention relates to a robot system including a mobile terminal and a plurality of mobile robots, and more particularly to a robot system including a plurality of mobile robots, each cleaning an allocated cleaning zone, and a mobile terminal for monitoring and controlling them.
  • Robots have been developed for industrial use and have been part of factory automation. In recent years, medical robots, aerospace robots, and the like have been developed, and household robots that can be used in ordinary homes are being developed. Among these robots, those capable of traveling on their own are called mobile robots.
  • a representative example of the mobile robot is a cleaning robot (a robot cleaner), which is a device for cleaning a corresponding area by suctioning dust or foreign matter around the robot while running a certain area by itself.
  • Using a cleaning robot has the advantage that the working capacity is higher than when cleaning is performed by personnel, that the robot can work without rest, and that there is no restriction on the cleaning work time.
  • cleaning robots are also effective at swiftly keeping large public spaces, such as airports, train stations, department stores, and harbors, clean throughout the year, including vulnerable areas that need frequent cleaning.
  • a plurality of cleaning robots need to be disposed and operated for cleaning and managing such a large space.
  • a plurality of cleaning robots perform a cleaning operation for the assigned cleaning area.
  • An object of the present invention is to provide a robot system, and a control method thereof, that enable a user to conveniently check and control information on a plurality of robots.
  • An object of the present invention is to provide a robot system, and a control method thereof, that enable a user to easily find an individual robot.
  • An object of the present invention is to provide a robot system in which multiple cleaning areas can be simply designated for individual cleaning robots so that cleaning is performed on specific areas, thereby enabling supplementary cleaning and concentrated cleaning.
  • according to an aspect of the present invention, there is provided a mobile terminal including a display unit, a wireless communication unit for receiving information on a plurality of mobile robots, and a control unit for displaying, on the display unit, a main screen based on the received information on the mobile robots.
  • the main screen includes a map area including at least one map divided into a plurality of areas, and a robot information area including robot information items for the mobile robots assigned to the plurality of areas, so that the user can conveniently check and control the information of the robots included in the robot system.
  • according to another aspect of the present invention, there is provided a robot system including: a plurality of mobile robots; a server for receiving status information from the plurality of mobile robots; and a mobile terminal for displaying, on a display unit, a main screen based on the received information on the plurality of mobile robots.
  • the main screen includes a map area including one or more maps divided into a plurality of zones, and a robot information area including robot information items for the mobile robots allocated to the plurality of zones.
  • a large space can be effectively cleaned.
  • multiple cleaning areas can be easily designated for the individual cleaning robots to perform cleaning for specific areas, thereby performing supplementary cleaning and concentrated cleaning.
  • FIG. 1 is a front view of a mobile robot included in a robot system according to an embodiment of the present invention.
  • FIG. 2 is a rear view of a mobile robot included in a robot system according to an embodiment of the present invention.
  • FIG. 3 is a block diagram showing a control relationship between main components of a mobile robot included in the robot system according to an embodiment of the present invention.
  • FIG. 4 is a conceptual diagram of a robot system including a plurality of mobile robots and a mobile terminal according to an embodiment of the present invention.
  • FIG. 5 is a simplified block diagram of a mobile terminal according to an embodiment of the present invention.
  • FIGS. 6 and 7 are views showing a state in which a head of a mobile robot is opened according to an embodiment of the present invention.
  • FIG. 8 is a view illustrating a user interface screen provided through a display of a mobile robot according to an embodiment of the present invention.
  • FIGS. 9 to 14B are diagrams illustrating a user interface screen provided through a display unit of a mobile terminal according to an exemplary embodiment of the present invention and a control method of the robot system using the user interface screen.
  • FIGS. 15A to 15C are diagrams referred to in explaining the robot search function according to the embodiment of the present invention.
  • FIGS. 16 to 19 are diagrams for explaining a user interface screen provided through a display unit of a mobile terminal according to an embodiment of the present invention and a control method of the robot system using the same.
  • FIGS. 20 to 26 are views referred to in explaining a user interface screen provided through a display unit included in the mobile terminal and a control method of the robot system using the user interface screen according to the embodiment of the present invention.
  • FIG. 27 is a conceptual diagram of a robot system including a plurality of mobile robots and a mobile terminal according to an embodiment of the present invention.
  • the suffixes "module" and "part" for components used in the following description are given merely for convenience of description and do not carry special significance or role by themselves. Accordingly, the terms "module" and "part" may be used interchangeably.
  • the mobile robot 100 refers to a robot that can move by itself using wheels or the like, and may be a robot cleaner or the like.
  • hereinafter, a mobile robot having a cleaning function will be described as an example with reference to the drawings, but the present invention is not limited thereto.
  • FIG. 1 is a front view of a mobile robot included in a robot system according to an embodiment of the present invention.
  • FIG. 2 is a rear view of a mobile robot included in the robot system according to an embodiment of the present invention.
  • FIG. 3 is a block diagram showing a control relationship between main components of a mobile robot included in the robot system according to an embodiment of the present invention.
  • FIG. 4 is a conceptual diagram of a robot system including a plurality of mobile robots and a mobile terminal according to an embodiment of the present invention.
  • the robot system according to an embodiment of the present invention may include a plurality of the mobile robots exemplified with reference to FIGS. 1 to 3.
  • the robot system according to an embodiment of the present invention may further include a server, a mobile terminal, and the like.
  • a robot system includes a plurality of mobile robots 100a, 100b, 100c, 100d, and 100e, and a mobile terminal 200 that can monitor and control the plurality of mobile robots.
  • the robot system includes a plurality of mobile robots 100a, 100b, 100c, 100d and 100e, a mobile terminal 200, and a server (not shown) of a robot manufacturer or a service provider.
  • the robot system may further include a robot other than a mobile robot having a cleaning function.
  • the mobile robot 100 may include a main body 101 and a traveling unit 160 for moving the main body 101.
  • the portion facing the ceiling in the running zone is defined as the upper surface portion
  • the portion facing the floor in the running zone is defined as the bottom portion
  • the portion forming the circumference of the main body 101 between the upper surface portion and the bottom surface portion is defined as the side surface portion.
  • a portion facing the running direction is defined as a front portion
  • a portion facing the opposite direction of the front portion is defined as a rear portion.
  • the mobile robot (100) includes a traveling unit (160) for moving the main body (101).
  • the driving unit 160 includes at least one driving wheel 136 for moving the main body 101.
  • the driving unit 160 includes a driving motor (not shown) connected to the driving wheels 136 to rotate the driving wheels.
  • the driving wheels 136 may be provided on the left and right sides of the main body 101, respectively, and are hereinafter referred to as left and right wheels, respectively.
  • the left wheel and the right wheel may be driven by a single drive motor, but may be provided with a left wheel drive motor for driving the left wheel and a right wheel drive motor for driving the right wheel, respectively, if necessary.
  • the running direction of the main body 101 can be switched to the left or right side by making a difference in the rotational speeds of the left and right wheels.
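  • by way of illustration only (not part of the original disclosure), the following minimal Python sketch shows the standard differential-drive kinematics this paragraph describes: equal wheel speeds move the body straight, and a speed difference between the left and right wheels turns it. The wheel speeds, axle length, and time step are assumed values.

```python
import math

def differential_drive_step(x, y, heading, v_left, v_right,
                            axle_length=0.4, dt=0.05):
    """Advance a differential-drive pose (x, y, heading) by one time step.

    v_left / v_right: linear speeds of the left and right wheels (m/s).
    """
    v = (v_left + v_right) / 2.0              # forward speed of the body center
    omega = (v_right - v_left) / axle_length  # yaw rate from the speed difference

    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += omega * dt
    return x, y, heading

# Equal speeds -> straight travel; a faster right wheel turns the body left.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = differential_drive_step(*pose, v_left=0.20, v_right=0.30)
print(pose)
```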
  • a suction port (not shown) for sucking air can be formed on the bottom surface of the main body 101.
  • a suction device (not shown) for providing a suction force to suck air through the suction port is provided in the main body 101,
  • a dust container (not shown) for collecting the dust sucked together with the air through the suction port is provided in the main body 101.
  • a dust container cover (not shown) may be provided in the dust container to discard the dust therein.
  • the main body 101 includes a body part 102 forming a space for accommodating various components constituting the mobile robot 100, and a head part 110 disposed on the upper side of the body part 102 so as to be openable and closable.
  • the head unit 110 may include a head 111 that can be opened and closed and a coupling unit 112 coupled to the head 111 so as to be openable and closable.
  • a switch or a sensor for detecting whether the head 111 is open or closed may be disposed on the head 111 and / or the coupling part 112.
  • the head 111 may further include an emergency operation button 114.
  • the emergency operation button 114 may perform a function of immediately stopping the operation of the mobile robot 100 while the mobile robot 100 is stopped or running.
  • the emergency operation button 114 may be located at the rear of the mobile robot 100 so that it can be easily operated even while the mobile robot 100 is traveling forward.
  • the mobile robot 100 may include a display 182 accommodated in the main body 101.
  • the display 182 may be implemented as the mobile terminal 200 described later.
  • the mobile terminal 200 can be accommodated and kept in the main body 101, and can be taken out and used when necessary.
  • the mobile robot 100 may include the mobile terminal 200 therein.
  • the user can open the head 111 and insert and remove the dust container in the main body 101.
  • the head portion 110 may be openable and closable in a plurality of ways or may include one or more opening and closing mechanism structures.
  • for example, a button may be provided on the head part 110, and an opening and closing mechanism may be implemented such that a predetermined cover opens toward the user when the user presses the button.
  • the display 182 may be arranged in a predetermined area partitioned within the inner storage space, and the opening/closing mechanism operated by the user's button press may be configured as a dedicated opening/closing structure for the area where the display 182 is disposed. Accordingly, when the user presses the button, the display 182 can be configured to protrude toward the user.
  • the mobile robot 100 may include a main brush (not shown) having bristles exposed through the suction port, and an auxiliary brush 135 located on the front side of the bottom surface of the main body 101.
  • the mobile robot 100 may include a power supply unit (not shown) provided with a rechargeable battery (not shown) to supply power to the mobile robot 100.
  • the power supply unit supplies driving power and operating power to the respective components of the mobile robot 100 and can be charged by receiving a charging current from a charging stand (not shown) when the remaining power is insufficient.
  • the mobile robot 100 may further include a battery sensing unit (not shown) that senses a charged state of a battery (not shown) and transmits the sensed result to the control unit 140.
  • the battery is connected to the battery sensing unit, and the battery remaining amount and the charging state are transmitted to the control unit 140.
  • the battery remaining amount may be displayed on the display 182 of the output unit 180.
  • the battery (not shown) supplies not only the drive motor but also the power required for the overall operation of the mobile robot 100.
  • the body part 102 may include an openable cover 103 for battery checking and/or replacement. The user can open the cover 103 to check the battery condition or replace the battery.
  • the mobile robot 100 can return to the charging stand (not shown) for charging. During the return travel, the mobile robot 100 can detect the position of the charging stand by itself.
  • the charging stand may include a signal transmitting unit (not shown) for transmitting a predetermined return signal.
  • the return signal may be an ultrasonic signal or an infrared signal, but is not limited thereto.
  • the mobile robot 100 may include a signal sensing unit (not shown) for receiving a return signal.
  • the charging stand may transmit an infrared signal through the signal transmitting unit, and the signal sensing unit may include an infrared sensor that senses the infrared signal.
  • the mobile robot 100 moves to the position of the charging base according to the infrared signal transmitted from the charging base and docks with the charging base. Through such docking, charging is performed between the charging terminal (not shown) of the mobile robot 100 and the charging terminal (not shown) of the charging base.
  • the image acquiring unit 120 photographs the periphery of the main body 101, a traveling zone, an external environment, and the like, and may include a camera module.
  • the camera module may include a digital camera.
  • a digital camera may include at least one optical lens, an image sensor (e.g., a CMOS image sensor) including a plurality of photodiodes (e.g., pixels) on which an image is formed by light passing through the optical lens, and a digital signal processor (DSP) that forms an image based on signals output from the photodiodes.
  • the digital signal processor is capable of generating moving images composed of still frames as well as still images.
  • the image captured by the camera can be used to identify the kind of material such as dust, hair, floor, etc. existing in the space, whether it is cleaned, or to confirm the cleaning time.
  • the image acquiring unit 120 may acquire an image by photographing the periphery of the main body 101, and the acquired image may be stored in the storage unit 130.
  • the image acquiring unit 120 may include an upper camera 120a provided on the upper front side of the main body 101 to acquire an image of the ceiling within the traveling region, a front camera 120b, and a depth camera 120c.
  • the number, arrangement, type, and photographing range of the cameras provided in the image obtaining unit 120 are not necessarily limited to these.
  • the upper camera 120a can acquire an image of the ceiling in the travel zone and the mobile robot 100 can use the image acquired by the upper camera 120a for SLAM (Simultaneous Localization and Mapping).
  • the front camera 120b can photograph an image of a user, a situation of an obstacle, a cleaning area, or the like existing in the front of the mobile robot 100 in the traveling direction.
  • the status notification light 151 may be embodied as a circular ring around the upper camera 120a.
  • the mobile robot 100 may include a sensor unit 170 including sensors for sensing various data related to the operation and state of the mobile robot.
  • for example, the sensor unit 170 may include a sensor (not shown) for detecting whether the head 111 is open or closed.
  • various known sensors can be used as the sensor for detecting whether the head 111 is open or closed.
  • the sensor unit 170 may include an obstacle detection sensor 131 for sensing an obstacle ahead.
  • the sensor unit 170 may further include a cliff detection sensor (not shown) for detecting the presence or absence of a cliff on the floor in the driving area, and a lower camera sensor (not shown) for acquiring a bottom image.
  • the obstacle detection sensor 131 may include a plurality of sensors installed on the outer circumferential surface of the mobile robot 100 at regular intervals.
  • the sensor unit 170 may include a first sensor and a second sensor disposed on the front surface of the main body 101 so as to be spaced left and right.
  • the obstacle detection sensor 131 may include an infrared sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, a position sensitive device (PSD) sensor, and the like.
  • the position and type of the sensor included in the obstacle detection sensor 131 may vary depending on the type of the mobile robot, and the obstacle detection sensor 131 may include more various sensors.
  • the sensor unit 170 may include lidar (Light Detection and Ranging) sensors 132a and 132b.
  • the lidar sensors 132a and 132b can detect an object such as an obstacle using laser light, based on the time of flight (TOF) between a transmitted signal and a received signal, or on the phase difference between the transmitted signal and the received signal.
  • a plurality of lidar sensors 132a and 132b may be provided.
  • the lidar sensors 132a and 132b may include a first lidar 132a for detecting an object located in front of the mobile robot 100 and a second lidar 132b for detecting an object located behind the mobile robot 100.
  • the lidar sensors 132a and 132b can detect the distance to an object, the relative speed of the object, and the position of the object.
  • the lidar sensors 132a and 132b may be provided as part of the obstacle detection sensor 131.
  • the lidar sensors 132a and 132b may be provided as sensors for creating a map.
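  • as an illustrative aside (assuming an idealized pulsed lidar and round-trip timing, which the disclosure does not specify), the time-of-flight ranging mentioned above reduces to halving the round-trip distance of the laser pulse:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """One-way range from a lidar time-of-flight measurement.

    The pulse travels to the object and back, so the distance to the
    object is half of the round-trip path length.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A ~33.4 ns round trip corresponds to an object about 5 m away.
print(tof_distance(33.4e-9))  # ~5.0 m
```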
  • the map generation module 143 may generate a map of the running area.
  • the map generation module 143 can generate a map by processing the image acquired through the image acquisition unit 120, and can supplementarily or independently generate the map based on the sensing data of the lidar sensors 132a and 132b.
  • the obstacle detection sensor 131 senses an object, particularly an obstacle, existing in the traveling direction (movement direction) of the mobile robot, and transmits the obstacle information to the control unit 140. That is, the obstacle detection sensor 131 can sense protrusions, household appliances, furniture, walls, wall corners, and the like existing on the moving path of the mobile robot 100, in front of it or at its side, and transmit the information to the control unit 140.
  • the controller 140 detects the position of the obstacle based on at least two signals received through the ultrasonic sensor, and controls the movement of the mobile robot 100 according to the position of the detected obstacle.
  • the obstacle detection sensor 131 provided on the outer surface of the main body 101 may include a transmitting unit and a receiving unit.
  • the ultrasonic sensor may be provided such that at least one transmitting portion and at least two receiving portions are staggered from each other. Also, it is possible to radiate signals at various angles and to receive signals reflected from obstacles at various angles.
  • the signal received by the obstacle detection sensor 131 may undergo signal processing such as amplification, filtering, and the like, and then the distance and direction to the obstacle may be calculated.
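  • purely as a sketch of the geometry implied by one transmitter and two laterally spaced receivers (the receiver spacing, echo times, and the far-field approximation are assumptions, not values from the disclosure):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ultrasonic_range_and_bearing(t_left, t_right, receiver_spacing=0.10):
    """Estimate obstacle distance and bearing from two echo times.

    t_left / t_right: round-trip echo times (s) at the left and right
    receivers. The mean gives the range; the path-length difference
    over the receiver spacing approximates the bearing (far field).
    """
    d_left = SPEED_OF_SOUND * t_left / 2.0
    d_right = SPEED_OF_SOUND * t_right / 2.0
    distance = (d_left + d_right) / 2.0
    sin_bearing = max(-1.0, min(1.0, (d_left - d_right) / receiver_spacing))
    return distance, math.degrees(math.asin(sin_bearing))

dist, bearing = ultrasonic_range_and_bearing(5.93e-3, 5.81e-3)
print(f"{dist:.2f} m at {bearing:+.1f} deg")
```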
  • the sensor unit 170 may further include a motion detection sensor for detecting motion of the mobile robot 100 according to driving of the main body 101 and outputting motion information.
  • a gyro sensor, a wheel sensor, an acceleration sensor, or the like can be used as the motion detection sensor.
  • the gyro sensor senses the direction of rotation and detects the rotation angle when the mobile robot 100 moves according to the operation mode.
  • the gyro sensor detects the angular velocity of the mobile robot 100 and outputs a voltage value proportional to the angular velocity.
  • the control unit 140 calculates the rotation direction and the rotation angle using the voltage value output from the gyro sensor.
  • the wheel sensor is connected to the left and right wheels to detect the number of revolutions of the wheel.
  • the wheel sensor may be a rotary encoder.
  • the rotary encoder detects and outputs the number of rotations of the left and right wheels.
  • the control unit 140 can calculate the rotational speeds of the left and right wheels using the number of rotations. Also, the control unit 140 can calculate the rotation angle using the difference in the number of rotations of the left and right wheels.
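  • a minimal sketch of the wheel-encoder computation just described (the encoder resolution, wheel radius, and axle length are assumed example values): tick deltas give each wheel's travel and speed, and the difference in wheel travel gives the rotation angle.

```python
import math

TICKS_PER_REV = 1024   # assumed encoder resolution
WHEEL_RADIUS = 0.05    # m, assumed
AXLE_LENGTH = 0.40     # m, left-right wheel separation, assumed

def odometry_update(dticks_left, dticks_right, dt):
    """Wheel speeds and heading change from encoder tick deltas over dt.

    Returns (v_left, v_right, dtheta): each wheel's linear speed and
    the body rotation angle implied by the difference in wheel travel.
    """
    dist_left = 2 * math.pi * WHEEL_RADIUS * dticks_left / TICKS_PER_REV
    dist_right = 2 * math.pi * WHEEL_RADIUS * dticks_right / TICKS_PER_REV
    v_left, v_right = dist_left / dt, dist_right / dt
    dtheta = (dist_right - dist_left) / AXLE_LENGTH  # radians
    return v_left, v_right, dtheta

# The right wheel turned 40 ticks more than the left in 50 ms: a slight left turn.
print(odometry_update(300, 340, 0.05))
```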
  • the acceleration sensor senses a change in the speed of the mobile robot 100, for example, a change in the mobile robot 100 due to a start, a stop, a change of direction, a collision with an object or the like.
  • the acceleration sensor is attached to the main wheel or the adjoining positions of the auxiliary wheels, so that the slip or idling of the wheel can be detected.
  • the acceleration sensor may be built into the control unit 140 and can detect a speed change of the mobile robot 100. That is, the acceleration sensor detects the amount of impact according to the speed change and outputs a corresponding voltage value. Thus, the acceleration sensor can perform the function of an electronic bumper.
  • the control unit 140 can calculate the positional change of the mobile robot 100 based on the motion information output from the motion detection sensor. Such a position is a relative position, as opposed to the absolute position obtained using image information.
  • the mobile robot can improve the performance of position recognition using image information and obstacle information through such relative position recognition.
  • the mobile robot 100 may include an output unit 180 to display reservation information, battery status, operation mode, operation status, error status, and the like as an image, or to output them as sound.
  • the output unit 180 may include a display 182 for displaying the user interface screen such as the reservation information, the battery status, the operation mode, the operation status, and the error status as an image.
  • the display 182 may be slid or pushed out to the front so that the user can more easily see the user interface screen provided through the display 182.
  • the display 182 may be configured as a touch screen by forming a mutual layer structure with the touch pad.
  • the display 182 may be used as an input device capable of inputting information by a user's touch in addition to the output device.
  • the output unit 180 may further include a sound output unit 181 for outputting an audio signal.
  • the sound output unit 181 can output a sound message such as a warning sound, an operation mode, an operation state, an error state, etc., under the control of the control unit 140.
  • the sound output unit 181 can convert an electric signal from the control unit 140 into an audio signal and output it.
  • the sound output section 181 may include a speaker or the like.
  • the mobile robot 100 includes a controller 140 for processing and determining various information such as recognizing a current position, and a storage unit 130 for storing various data.
  • the mobile robot 100 may further include a communication unit 190 for transmitting and receiving data to and from a mobile terminal, a server, another mobile robot, a guide robot, and the like.
  • the mobile terminal 200 is provided with an application for controlling the mobile robot 100, displays a map of the traveling area to be cleaned by the mobile robot 100 through the execution of the application, and can designate an area to be cleaned on the map.
  • the mobile terminal 200 may be, for example, a remote controller, a PDA, a laptop, a smart phone, or a tablet on which an application for setting a map is mounted.
  • the mobile terminal 200 can communicate with the mobile robot 100 to display the current position of the mobile robot together with the map, and information about a plurality of regions can be displayed. In addition, the mobile terminal 200 updates its position and displays it according to the traveling of the mobile robot.
  • the mobile robot 100 can share data or transmit / receive data with the server or other robots through the communication unit 190.
  • the mobile robot 100 can transmit current position information, current state information, and the like to the server through the communication unit 190.
  • a plurality of mobile robots can share status information such as cleaning progress status information with each other. Also, the shared state information can be transmitted to the server and managed, and the server can transmit various information to the user's mobile terminal.
  • the control unit 140 controls the overall operation of the mobile robot 100 by controlling the image acquisition unit 120, the driving unit 160, the display 182, and the like, which constitute the mobile robot 100.
  • the storage unit 130 records various kinds of information required for controlling the mobile robot 100, and may include a volatile or nonvolatile recording medium.
  • the storage medium stores data that can be read by a microprocessor, and may include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
  • the storage unit 130 may store a map of a driving area.
  • the map may be input by a mobile terminal, a server, or the like capable of exchanging information with the mobile robot 100 through wired or wireless communication, or may be one generated by the mobile robot 100 itself.
  • the map may indicate the location of the compartmentalized zones within the driving zone.
  • the current position of the mobile robot 100 can be displayed on the map, and the current position of the mobile robot 100 on the map can be updated in the course of travel.
  • the mobile terminal may store the same map as the map stored in the storage unit 130.
  • the storage unit 130 may store cleaning history information. Such cleaning history information can be generated each time cleaning is performed.
  • the map of the traveling area stored in the storage unit 130 may include a navigation map used for traveling during cleaning, a simultaneous localization and mapping (SLAM) map used for position recognition, a learning map used for learning-based cleaning, a global position map used for global position recognition, an obstacle recognition map recording information on recognized obstacles, and the like.
  • the maps can be stored and managed in the storage unit 130 separately for each use.
  • the maps may not be clearly classified for each use.
  • a plurality of pieces of information may be stored in one map for use in at least two applications.
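  • one hypothetical way to organize such a per-use map store (the names and the numpy grid representation are illustrative, not from the disclosure) is a mapping from use to grid, where a single grid may be registered under several uses:

```python
import numpy as np

class MapStore:
    """Keeps one occupancy grid per use; a grid may serve several uses."""

    def __init__(self):
        self._maps = {}

    def put(self, grid, *uses):
        for use in uses:
            self._maps[use] = grid

    def get(self, use):
        return self._maps[use]

store = MapStore()
shared = np.zeros((200, 200), dtype=np.uint8)
store.put(shared, "navigation", "slam")                 # one map, two uses
store.put(np.zeros((200, 200), dtype=np.uint8), "obstacle_recognition")
assert store.get("navigation") is store.get("slam")     # same underlying grid
```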
  • the control unit 140 may include a travel control module 141, a location recognition module 142, a map generation module 143, and an obstacle recognition module 144.
  • the travel control module 141 controls the travel of the mobile robot 100, and controls the travel of the travel unit 160 according to the travel setting.
  • the travel control module 141 can grasp the travel path of the mobile robot 100 based on the operation of the travel unit 160.
  • the travel control module 141 can grasp the current or past traveling speed and traveling distance of the mobile robot 100 based on the rotational speed of the driving wheels 136, and can also grasp the current or past direction-change process.
  • the position of the mobile robot 100 on the map can be updated.
  • the map generating module 143 may generate a map of the running area.
  • the map generation module 143 can process the image acquired through the image acquisition unit 120 to generate a map. That is, a cleaning map corresponding to the cleaning area can be created.
  • the map generation module 143 can process the image acquired through the image acquisition unit 120 at each position and recognize the global position in association with the map.
  • the position recognition module 142 estimates and recognizes the current position.
  • the position recognition module 142 grasps the position in cooperation with the map generation module 143 by using the image information of the image acquisition unit 120, so that the current position can be estimated even if the position of the mobile robot 100 suddenly changes.
  • the mobile robot 100 can recognize its position during continuous traveling through the position recognition module 142, and can also learn the map and estimate its current position through the map generation module 143 and the obstacle recognition module 144 without the position recognition module 142.
  • the image acquisition unit 120 acquires images around the mobile robot 100.
  • the image acquired by the image acquisition unit 120 is defined as an 'acquired image'.
  • the acquired image includes various features such as lights, edges, corners, blobs, ridges, etc., located on the ceiling.
  • the map generation module 143 detects the feature from each of the acquired images.
  • Various methods for detecting features from an image in the field of Computer Vision are well known.
  • Feature detectors suitable for detecting these features are known. For example, Canny, Sobel, Harris & Stephens / Plessey, SUSAN, Shi & Tomasi, Level curve curvature, FAST, Laplacian of Gaussian, Difference of Gaussians, Determinant of Hessian, MSER, PCBR and Gray-level blobs detector.
  • the map generation module 143 calculates a descriptor based on each feature point.
  • the map generation module 143 may convert a feature point into a descriptor using a Scale Invariant Feature Transform (SIFT) technique for feature detection.
  • the descriptor may be denoted by an n-dimensional vector.
  • SIFT can detect features that are invariant to the scale, rotation, and brightness change of the object to be photographed, so that the same region can be detected (i.e., rotation-invariant) even if it is photographed with a different attitude of the mobile robot 100.
  • various other techniques (e.g., Histogram of Oriented Gradients (HOG), Haar features, Ferns, Local Binary Pattern (LBP), and Modified Census Transform (MCT)) may also be applied.
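  • for readers who want to reproduce the descriptor step, a short sketch with OpenCV follows (an assumed tool choice; requires opencv-python 4.4 or later, and the image path is a placeholder): each SIFT descriptor is the n-dimensional (here 128-dimensional) vector the text refers to.

```python
import cv2

img = cv2.imread("ceiling.png", cv2.IMREAD_GRAYSCALE)  # placeholder image

sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

# Each row of `descriptors` is a 128-dimensional vector describing the
# brightness-gradient distribution around one detected feature point.
print(len(keypoints), descriptors.shape)  # e.g. 350 (350, 128)
```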
  • the map generation module 143 classifies at least one descriptor for each acquired image into a plurality of groups according to a predetermined lower classification rule, on the basis of the descriptor information obtained through the acquired image of each position, and can convert the descriptors included in the same group into lower representative descriptors according to a predetermined lower representative rule.
  • the map generation module 143 can obtain the feature distribution of each position through such a process.
  • Each position feature distribution can be represented as a histogram or an n-dimensional vector.
  • the map generation module 143 can estimate an unknown current position based on descriptors calculated from each feature point, without going through a predetermined lower classification rule and a predetermined lower representative rule.
  • when the current position of the mobile robot 100 becomes unknown due to a position jump or the like, the current position can be estimated based on data such as previously stored descriptors or lower representative descriptors.
  • the mobile robot 100 acquires an acquired image through the image acquisition unit 120 at an unknown current position. Through the image, various features such as lights, edges, corners, blobs, ridges, etc., are found on the ceiling.
  • the position recognition module 142 detects the features from the acquired image. Various methods of detecting features from an image in the field of computer vision technology and descriptions of various feature detectors suitable for detecting these features are as described above.
  • the position recognition module 142 calculates the recognition descriptor through the recognition descriptor calculation step based on each recognition feature point.
  • the terms "recognition feature point" and "recognition descriptor" are used to describe the process performed by the position recognition module 142, and to distinguish it from the terms used to describe the process performed by the map generation module 143.
  • they merely describe the characteristics of the world outside the mobile robot 100 with different terms.
  • the location recognition module 142 may convert the recognition feature points into recognition descriptors using the Scale Invariant Feature Transform (SIFT) technique for feature detection.
  • the recognition descriptor may be denoted by an n-dimensional vector.
  • SIFT is an image recognition technique that selects easily distinguishable feature points, such as corner points, from the acquired image and then obtains an n-dimensional vector in which the degree of change in each direction of the brightness-gradient distribution of the pixels belonging to a certain region around each feature point is expressed as a numerical value for each dimension.
  • based on at least one piece of recognition descriptor information obtained through the acquired image of the unknown current position, the position recognition module 142 converts the information into a form (a lower recognition feature distribution) comparable with the position information to be compared (for example, the feature distribution of each position).
  • each position feature distribution can be compared with each recognition feature distribution to calculate each similarity.
  • a similarity (probability) is calculated for each position, and the position for which the greatest probability is calculated can be determined as the current position.
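  • a minimal sketch of that comparison follows (cosine similarity over per-position feature histograms; the metric and the toy data are assumptions, since the disclosure does not fix a particular similarity measure):

```python
import numpy as np

def estimate_position(recognition_hist, position_hists):
    """Return the stored position whose feature distribution best
    matches the distribution observed at the unknown position."""
    def cosine(a, b):
        return float(np.dot(a, b) /
                     (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    scores = {pos: cosine(recognition_hist, h)
              for pos, h in position_hists.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

stored = {"zone_A": np.array([9., 1., 0., 2.]),
          "zone_B": np.array([1., 8., 3., 0.])}
print(estimate_position(np.array([8., 2., 1., 2.]), stored))  # ('zone_A', ...)
```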
  • the control unit 140 can recognize the current location of the main body 101 based on the pre-stored map, or can generate a map composed of a plurality of regions by dividing the traveling region.
  • control unit 140 can transmit the generated map to the mobile terminal, the server, and the like through the communication unit 190. Also, as described above, the control unit 140 can store the map in the storage unit when the map is received from the mobile terminal, the server, and the like.
  • the controller 140 transmits updated information to the mobile terminal 200 so that the maps stored in the mobile terminal 200 and in the mobile robot 100 are kept the same.
  • accordingly, the mobile robot 100 can clean a designated area in response to a cleaning command from the mobile terminal, and the current position of the mobile robot can be displayed on the mobile terminal.
  • in the map, the cleaning area may be divided into a plurality of areas, and the map may include connection paths connecting the plurality of areas and information about obstacles in each area.
  • the control unit 140 determines whether the position on the map matches the current position of the mobile robot 100 or not.
  • the cleaning command may be input from the remote controller, the display, or the mobile terminal 200.
  • when a cleaning command is input, the controller 140 recognizes and restores the current position of the mobile robot 100, and can control the traveling unit 160 to move to the designated area.
  • the position recognition module 142 can analyze the acquired image input from the image obtaining unit 120 and estimate the current position based on the map.
  • the obstacle recognition module 144 or the map generation module 143 may also recognize the current position in the same manner.
  • the travel control module 141 calculates the travel route from the current position to the designated area and controls the travel unit 160 to move to the designated area.
  • the travel control module 141 can divide the entire traveling area into a plurality of areas and set one or more areas as designated areas according to the received cleaning pattern information.
  • the travel control module 141 may calculate the travel route in accordance with the received cleaning pattern information, travel along the travel route, and perform cleaning.
  • the control unit 140 may store the cleaning history in the storage unit 130 when the cleaning of the designated area is completed.
  • the control unit 140 may transmit the operation state or the cleaning state of the mobile robot 100 to the mobile terminal 200 and the server through the communication unit 190 at predetermined intervals.
  • the mobile terminal 200 displays the position of the mobile robot together with the map on the screen of the application being executed based on the received data, and also outputs information about the cleaning state.
  • the mobile robot 100 moves in one direction until an obstacle or a wall surface is sensed, and when the obstacle is recognized through the sensor unit 170 and the obstacle recognition module 144, it can determine a traveling pattern, such as going straight or rotating, according to the attributes of the recognized obstacle.
  • for example, if the recognized obstacle is of a kind that can be passed over, the mobile robot 100 can continue to go straight.
  • otherwise, the mobile robot 100 rotates, moves a certain distance, and then moves in the direction opposite to the initial moving direction up to the distance at which an obstacle is sensed; that is, it can travel in a zigzag pattern.
  • the mobile robot 100 can perform obstacle recognition and avoidance based on machine learning.
  • the control unit 140 may include an obstacle recognition module 144 for recognizing, in an input image, obstacles learned in advance through machine learning, and a travel control module 141 for controlling the driving of the traveling unit 160 based on the attributes of the recognized obstacle.
  • the mobile robot 100 may include an obstacle recognition module 144 in which an attribute of an obstacle is learned by machine learning.
  • machine learning means that a computer learns from data and solves a problem by itself, even though a person does not directly instruct the computer with logic.
  • deep learning, based on artificial neural networks (ANN) for constructing artificial intelligence, is an artificial intelligence technology that allows a computer to learn by itself like a human being, without a person teaching it, as a way of teaching a computer how humans think.
  • the ANN may be implemented in a software form or a hardware form such as a chip.
  • the obstacle recognition module 144 may include an artificial neural network (ANN) in the form of software or hardware in which the property of the obstacle is learned.
  • the obstacle recognition module 144 may include a deep neural network (DNN) such as a convolutional neural network (CNN), a recurrent neural network (RNN), or a deep belief network (DBN).
  • the obstacle recognition module 144 may determine an attribute of the obstacle included in the input image data based on the weights among the nodes included in the DNN.
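  • as a rough, non-authoritative sketch of such a network (PyTorch assumed; the layer sizes and the obstacle-attribute labels are illustrative, not the trained network of the disclosure), a small CNN maps a camera crop to attribute scores through its learned node weights:

```python
import torch
import torch.nn as nn

CLASSES = ["human_body", "furniture", "wall", "cable"]  # assumed labels

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(CLASSES)),  # sized for 64x64 input crops
)

crop = torch.rand(1, 3, 64, 64)         # partial image area around the obstacle
with torch.no_grad():
    probs = model(crop).softmax(dim=1)  # weights among the nodes decide the attribute
print(CLASSES[int(probs.argmax())])
```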
  • the control unit 140 can control to extract a partial area of the image acquired by the image acquisition unit 120, according to the direction of the obstacle detected by the sensor unit 170.
  • the image obtaining unit 120 may obtain an image within a predetermined angle range in the moving direction of the mobile robot 100.
  • the obstacle recognition module 144 can recognize the obstacle based on the data learned by the machine learning in the image acquired by the image acquisition unit 120.
  • the obstacle recognition module 144 may identify the type of object to be cleaned.
  • the mobile robot 100 may include a lamp unit 150 having various lamps with a signal function for informing people of the traveling state of the mobile robot 100.
  • the lamp unit 150 may include a status notification lamp 151 for outputting light representing the current state of the mobile robot 100, a forward direction indicator 152 disposed on the front surface of the main body 101 and turned on according to the traveling direction of the mobile robot 100, and a backward direction indicator 153 disposed on the rear surface of the main body 101 and turned on according to the traveling direction of the mobile robot 100.
  • the lamps 151, 152, and 153 of the lamp unit 150 may include one or more light sources.
  • the status notification light 151, the forward direction indicator 152, and the backward direction indicator 153 may include one or more light emitting diodes (LEDs).
  • the light emitting diode (LED) may be a single color light emitting diode (LED) such as Red, Blue, Green, and White. According to an embodiment, the light emitting diode (LED) may be a multicolor light emitting diode (LED) capable of reproducing a plurality of colors.
  • the status indicator 151 may include a plurality of light emitting diodes (LEDs); the plurality of LEDs may all emit white light to provide white illumination, or red, blue, and green LEDs may be combined to provide illumination of a specific color or white light.
  • the status notification light 151 may output light of a first color (yellowish white) indicating a standby/stop state, a second color (bluish green) indicating a cleaning-in-progress state, and a third color indicating another predetermined state, and may output light of a corresponding fourth color (sky blue) when a robot search signal is received.
  • the status notification light 151 may be turned off after a predetermined time period after outputting light of a first color indicating a standby / stop state.
  • when the status notification light 151 includes a plurality of light emitting diodes, the plurality of light emitting diodes may be turned on simultaneously and then sequentially turned off at regular intervals.
  • alternatively, all the light emitting diodes of the status notification light 151 may blink at the same time while outputting light of the second color indicating the cleaning-in-progress state.
  • the status notification light 151 may display the current progress state through the color of the output light, and may serve as a kind of signal light notifying people of stopping, avoidance, and state changes for avoiding the human body.
  • the status notification light 151 may provide red light.
  • the state notifying light 151 may blink light of a specific color to notify people of the avoidance driving.
  • the mobile robot 100 can be set to avoid an obstacle immediately when a human body is sensed or to avoid obstacles depending on whether a person is moving after waiting for a predetermined time.
  • the state notifying lamp 151 can output light of a predetermined color steadily, or blink, for a predetermined time before and during avoidance running.
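  • the state-to-color behavior described above can be summarized as a lookup table; the following sketch encodes it (the RGB triples are illustrative guesses for the named colors, and the blink flag mirrors the avoidance behavior described):

```python
from enum import Enum, auto

class RobotState(Enum):
    STANDBY_STOP = auto()
    CLEANING = auto()
    ROBOT_SEARCH = auto()
    HUMAN_AVOIDANCE = auto()

STATUS_LIGHT_COLOR = {
    RobotState.STANDBY_STOP: (255, 244, 214),   # yellowish white
    RobotState.CLEANING: (0, 180, 150),         # bluish green
    RobotState.ROBOT_SEARCH: (135, 206, 235),   # sky blue
    RobotState.HUMAN_AVOIDANCE: (255, 0, 0),    # red, blinks while avoiding
}

def light_command(state):
    r, g, b = STATUS_LIGHT_COLOR[state]
    return {"rgb": (r, g, b), "blink": state is RobotState.HUMAN_AVOIDANCE}

print(light_command(RobotState.HUMAN_AVOIDANCE))
```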
  • the sound output unit 181 can output and provide voice reminders to people, such as "Please take a moment."
  • the forward direction indicator 152 and the backward direction indicator 153 may indicate the traveling direction and an emergency stop.
  • the forward direction indicator 152 and the backward direction indicator 153 may be turned on and blink according to the traveling direction of the mobile robot 100. In addition, the forward direction indicator 152 and the backward direction indicator 153 may be synchronized with each other and turned on or blink together.
  • the front direction indicator light 152 and the rear direction indicator light 153 may include indicator lamps corresponding to the left and right directions, respectively.
  • the front direction indicator light 152 may include a left direction indicator light 152L and a right direction indicator light 152R.
  • the rear direction indicator light 153 may include a left direction indicator 153L and a right direction indicator 153R.
  • among the turn signal lamps 152L, 152R, 153L, and 153R, the lamp corresponding to the direction of the turn may turn on or blink.
  • the forward direction indicator light 152 may include a leftward direction indicator 152L and a rightward direction indicator 152R arranged adjacent to each other, but the present invention is not limited to this arrangement of the direction indicators.
  • the forward direction indicator light 152 may also be arranged such that the indicator light 152L corresponding to the left direction and the indicator light 152R corresponding to the right direction are spaced apart from each other by a predetermined distance.
  • the forward direction indicator light 152 and the rearward direction indicator light 153 can also notify departure from the assigned area by emergency flashing.
  • control unit 140 may control the lamp unit 150.
  • control unit 140 may control the status notification light 151 to output lights of different colors according to the current state of the mobile robot 100. Also, the controller 140 may control the status notification lamp 151 to blink at predetermined intervals for a predetermined period of time.
  • the control unit 140 may control the forward direction indicator 152 and the backward direction indicator 153 to be turned on according to the traveling direction of the mobile robot 100. In addition, the controller 140 may control the forward direction indicator 152 and the backward direction indicator 153 to flicker at predetermined intervals for a predetermined time.
  • the forward direction indicator 152 and the backward direction indicator 153 may be synchronously driven.
  • the controller 140 can control the forward direction indicator 152 and the backward direction indicator 153 to turn on or blink in the same manner according to a left or right turn.
  • in a specific situation, the control unit 140 may control the forward direction indicator 152 and the backward direction indicator 153 to output light of the same color as the light output from the status indicator 151.
  • the status indicator light 151 may have a circular ring shape.
  • the status notification light 151 may be in the shape of a circular ring disposed along the outer edge of the front surface of one of the cameras included in the image acquisition unit 120.
  • the controller 140 can recognize the attribute of an obstacle based on the image acquired by the image acquiring unit 120.
  • controller 140 may control driving of the driving unit 160 based on the recognized property of the obstacle.
  • control unit 140 may control the movement of the main body 101 to stop when the recognized obstacle is at least a part of the human body.
  • the controller 140 may control the mobile robot to resume its previous movement when movement of the recognized human body is detected within a predetermined waiting time.
  • the control unit 140 may control the mobile robot to travel while avoiding the recognized human body if movement of the recognized human body is not detected within the predetermined waiting time.
  • controller 140 may control the sound output unit 181 to output a predetermined sound when the recognized obstacle property is at least a part of the human body.
  • the entire area can be set as a plurality of cleaning areas. Further, one or more mobile robots can be allocated to each cleaning area.
  • the allocation of the cleaning areas may be performed automatically by dividing the entire area so that the plurality of cleaning areas have similar sizes, based on the maximum area of the entire area, as in the sketch below.
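  • one simple heuristic for such an automatic division into zones of similar area is shown below (the left-to-right sweep and the occupancy-grid representation are assumptions for illustration; a real system may instead partition by rooms or walls):

```python
import numpy as np

def split_into_similar_zones(free_mask, n_robots):
    """Assign each free cell a zone id so that zones have similar areas.

    free_mask: 2D boolean occupancy-grid mask of cleanable cells.
    Sweeps left to right and cuts so that each zone receives a
    near-equal number of free cells.
    """
    zone_map = np.full(free_mask.shape, -1, dtype=int)
    ys, xs = np.nonzero(free_mask)
    order = np.argsort(xs)                     # left-to-right sweep
    chunks = np.array_split(order, n_robots)   # near-equal cell counts
    for zone_id, idx in enumerate(chunks):
        zone_map[ys[idx], xs[idx]] = zone_id
    return zone_map

mask = np.ones((40, 100), dtype=bool)          # toy 40x100 free area
zones = split_into_similar_zones(mask, 5)
print([int((zones == z).sum()) for z in range(5)])  # ~800 cells each
```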
  • a plurality of cleaning zones may be manually designated and stored by an administrator.
  • a robot system according to an embodiment of the present invention includes a plurality of mobile robots 100a, 100b, 100c, 100d, and 100e and a mobile terminal 200, and each of the plurality of mobile robots 100a, 100b, 100c, 100d, and 100e is assigned a cleaning zone and performs a cleaning operation for the assigned cleaning zone.
  • One or more mobile robots can be assigned to each cleaning zone. It is also possible to allocate a plurality of mobile robots to one or more cleaning zones depending on the setting of the user.
  • At least one of the plurality of mobile robots 100a, 100b, 100c, 100d, and 100e may include the mobile terminal 200 therein. That is, the mobile robots 100a, 100b, 100c, 100d, and 100e according to an embodiment of the present invention include a storage space inside the main body, and the mobile terminal 200 can be accommodated in the storage space.
  • FIG. 5 is a simplified block diagram of a mobile terminal according to an embodiment of the present invention.
  • the mobile terminal 200 may include a wireless communication unit 210, an A/V input unit 220, a user input unit 230, a sensing unit 240, an output unit 250, a memory 260, an interface unit 270, a control unit 280, and a power supply unit 290.
  • the mobile terminal 200 includes a display unit 251, a wireless communication unit 210 for exchanging data with other electronic devices such as a server, and a control unit 280 for controlling a mobile robot related application screen to be displayed on the display unit 251.
  • the mobile robot related application screen may include a main screen capable of controlling a plurality of mobile robots.
  • the main screen may be based on information on the mobile robots received through the wireless communication unit 210.
  • the main screen may include a map area including one or more maps divided into a plurality of areas and a robot information area including robot information items for a mobile robot allocated to the plurality of areas.
  • the user interface screen and the control of the mobile robot provided through the mobile terminal 200, such as the main screen, will be described later in detail with reference to FIGS. 9 to 26.
  • the wireless communication unit 210 can directly receive position information, status information, and the like from the mobile robot, or receive position information, status information, and the like of the mobile robot through the server.
  • the wireless communication unit 210 may include a broadcast receiving module 211, a mobile communication module 213, a wireless Internet module 215, a short distance communication module 217, and a GPS module 219.
  • the broadcast receiving module 211 may receive at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel, a terrestrial channel, and the like.
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module 211 may be stored in the memory 260.
  • the mobile communication module 213 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to a voice call signal, a video call signal, or a text / multimedia message transmission / reception.
  • the wireless Internet module 215 refers to a module for wireless Internet access, and the wireless Internet module 215 can be embedded in the mobile terminal 200 or externally.
  • the wireless Internet module 215 can perform wireless communication based on Wi-Fi or Wi-Fi Direct.
  • the short-range communication module 217 is for short-range communication, and may use at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Near Field Communication (NFC), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.
  • the short-range communication module 217 may support, through wireless local area networks, communication between the mobile terminal 200 and a wireless communication system, between the mobile terminal 200 and another mobile terminal, or between the mobile terminal 200 and a network in which an external server is located.
  • the short-range wireless communication network may be a short-range wireless personal area network.
  • the GPS (Global Position System) module 219 may receive position information from a plurality of GPS satellites.
  • the wireless communication unit 210 can exchange data with the server using one or more communication modules.
  • the wireless communication unit 210 may include an antenna 205 for wireless communication, and may include an antenna for receiving broadcast signals in addition to an antenna for communication.
  • the A / V (Audio / Video) input unit 220 is for inputting an audio signal or a video signal, and may include a camera 221 and a microphone 223.
  • the user input unit 230 generates key input data that the user inputs for controlling the operation of the terminal.
  • the user input unit 230 may include a key pad, a dome switch, a touch pad (static pressure / static electricity), and the like.
  • when the touch pad has a mutual layer structure with the display unit 251, it may be called a touch screen.
  • the sensing unit 240 senses the current state of the mobile terminal 200, such as the open/close state of the mobile terminal 200 or the position of the mobile terminal 200, and can generate a corresponding sensing signal.
  • the sensing unit 240 may include a sensing sensor 241, a pressure sensor 243, a motion sensor 245, and the like.
  • the motion sensor 245 can detect the movement or the position of the mobile terminal 200 using an acceleration sensor, a gyro sensor, a gravity sensor, or the like.
  • the gyro sensor is a sensor for measuring the angular velocity, and it can sense the direction (angle) of rotation about the reference direction.
  • the output unit 250 may include a display unit 251, an acoustic output module 253, an alarm unit 255, and a haptic module 257.
  • the display unit 251 displays and outputs information processed by the mobile terminal 200.
  • the display unit 251 and the touch pad have a mutual layer structure to constitute a touch screen
  • the display unit 251 may be used as an input device capable of inputting information by a user's touch in addition to the output device.
  • the audio output module 253 outputs audio data received from the wireless communication unit 210 or stored in the memory 260.
  • the sound output module 253 may include a speaker, a buzzer, and the like.
  • the alarm unit 255 outputs a signal for notifying the occurrence of an event of the mobile terminal 200. For example, it is possible to output a signal in a vibration mode.
  • the haptic module 257 generates various tactile effects that the user can feel.
  • a typical example of the haptic effect generated by the haptic module 257 is a vibration effect.
  • the memory 260 may store a program for the processing and control of the control unit 280, and may perform a function of temporarily storing input or output data (e.g., a phone book, a message, a still image, etc.).
  • the interface unit 270 serves as an interface with all the external devices connected to the mobile terminal 200.
  • the interface unit 270 may receive data from the external device or supply power to the respective components in the mobile terminal 200, and may transmit data in the mobile terminal 200 to the external device.
  • the control unit 280 typically controls the operation of the respective units to control the overall operation of the mobile terminal 200. For example, it may perform related control and processing for voice calls, data communication, video calls, and the like.
  • the control unit 280 may include a multimedia playback module 281 for multimedia playback.
  • the multimedia playback module 281 may be implemented as hardware within the control unit 280, or as software separate from the control unit 280.
  • the power supply unit 290 receives external power and internal power under the control of the controller 280 and supplies power necessary for operation of the respective components.
  • FIG. 5 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted depending on the specifications of the mobile terminal 200 actually implemented.
  • constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary.
  • the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.
  • FIGS. 6 and 7 are views showing a state in which a head of a mobile robot is opened according to an embodiment of the present invention.
  • the mobile robot 100 may include a display 182 housed inside the main body 101.
  • the display 182 may be disposed in the dust container cover 136.
  • the display 182 may be provided separately from the dust container 135.
  • the display 182 may be slid or pushed to the front so that the user can more easily recognize the user interface screen provided through the display 182.
  • FIG. 8 is a view illustrating a user interface screen provided through a display 182 of a mobile robot according to an embodiment of the present invention.
  • a user interface screen may be provided through the display 182.
  • a user interface screen may be provided through the mobile terminal 200 irrespective of whether the head 111 is opened.
  • a predetermined screen may be displayed on the display 182 or the display unit 251 prior to providing the main screen including the top menu items.
  • a screen including a message for welcoming the user may be displayed on the display 182 or the display unit 251.
  • the main screen 800 provided through the display 182 may include a status bar area 810 in which current date/time information and the like are displayed, a monitoring area 820 in which various information of the mobile robot 100 is displayed, and a main menu area 830 including a plurality of main menu items.
  • an icon corresponding to predetermined status information such as a network status may be displayed in the status bar area 810.
  • the information displayed in the monitoring area 820 may include battery information, status information, and status guidance for each error condition.
  • the battery information may include an icon corresponding to the current battery charge value and the battery state.
  • the icon corresponding to the battery status may be displayed as a graphic image that indicates the level corresponding to the current battery charge level by dividing the battery charge status into a plurality of levels.
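  • A minimal sketch of such a level mapping, assuming four icon levels (the level count and thresholds are illustrative, not values from the disclosure):

```typescript
// Map a battery percentage onto a small number of icon levels.
function batteryIconLevel(percent: number, levels: number = 4): number {
  const clamped = Math.max(0, Math.min(100, percent));
  // With 4 levels: [0,25) -> 0, [25,50) -> 1, [50,75) -> 2, [75,100] -> 3.
  return Math.min(levels - 1, Math.floor((clamped / 100) * levels));
}
```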
  • the state information may indicate the current state of the mobile robot 100, such as standby, paused, emergency error occurrence, battery discharged, accessory state, and the like.
  • the status guidance message may be a guidance message corresponding to the current state of the mobile robot 100.
  • for example, a side brush error may be displayed in the monitoring area 820 as a graphic and/or text, and a guidance message such as 'please remove foreign substances in the side brush' may be displayed.
  • for example, the monitoring area 820 may display 'Go to charging station' or 'Pause to go to charging station' as status information, together with a guidance message such as 'If you close the head of the cleaning robot, the robot can go back to the charging station!'.
  • the monitoring area 820 may include an emergency call button for calling an upper manager, and an operator can quickly call an upper manager by pressing an emergency call button if necessary.
  • the main menu area 830 is an area for displaying the main menu.
  • the main menu area 830 may include a return item 831 for inputting a move command to a specified place (for example, a charging station), a cleaning start item 832, and a robot management item 833.
  • the control unit 140 may control a guidance message guiding the user to close the head 111 to be displayed when a selection input for the return item 831 or the cleaning start item 832 is received.
  • for example, the control unit 140 may control the display 182 to display a pop-up message guiding the user to close the head 111 when a selection input for the return item 831 or the cleaning start item 832 is received.
  • control unit 140 may control to perform an operation corresponding to the input command.
  • the control unit 140 can control the robot management detailed screen to be displayed on the display 182 when a selection input to the robot management item 833 is received.
  • the robot management detailed screen may include an accessory status item, an error history item, a password setting item, a robot version information item, and the like.
  • FIGS. 9 to 14B are diagrams illustrating a user interface screen provided through a display unit of a mobile terminal according to an exemplary embodiment of the present invention and a control method of the robot system using the user interface screen.
  • the mobile terminal 200 includes a display unit 251, a wireless communication unit 210 for exchanging data with other electronic devices such as a mobile robot and a server, and a control unit 280 for controlling the mobile robot related application screen to be displayed on the display unit 251.
  • the mobile robot related application screen may include a main screen capable of controlling a plurality of mobile robots.
  • a predetermined screen may be displayed on the display unit 251 prior to providing the main screen including the top-level menu items.
  • a screen including a message for welcoming the user may be displayed on the display unit 251.
  • the main screen 900 may include a map area 930 including one or more maps 931 and 932 divided into a plurality of zones A, B, C, D, and E, and a robot information area 940 including robot information items 941, 942, 943, 944, and 945 for the mobile robots allocated to the plurality of zones A, B, C, D, and E.
  • the main screen 900 may be based on information on the mobile robots received through the wireless communication unit 210.
  • the wireless communication unit 210 can receive information on the mobile robots from the mobile robots directly or via the server.
  • the information on the received mobile robots may include at least one of position information, state information, disposed zone information, and battery state information of the mobile robots.
  • the main screen 900 provided through the display unit 251 may include a status bar area 910 for displaying current date/time information, connected communication network information, battery information of the mobile terminal 200, and the like, a title area 920 in which name information and application related buttons such as a notification center and basic settings are placed, a map area 930 in which one or more maps are displayed, and a robot information area 940 in which robot information is displayed.
  • in the map area 930, a full map of the service area can be displayed.
  • the map area 930 may display one or more maps set by the administrator among the entire maps.
  • Each of the maps 931 and 932 may be a SLAM map used by the mobile robot 100 when traveling, or may be a map generated in a different manner.
  • each of the maps 931 and 932 may be a simplified map based on SLAM maps or other maps.
  • FIG. 9 illustrates a case where the mobile robot provides a cleaning service at an airport, and therefore, the maps 931 and 932 may be maps for some areas in the airport.
  • the first map 931 may be divided into three zones of the airport land area.
  • the second map 932 may be a first floor map of the airport land area.
  • in the map area 930, a scroll button for vertically scrolling the screen, a switch button for changing the displayed map, and the like may be further displayed.
  • the first map 931 can be divided into a plurality of zones A, B and C.
  • the second map 932 may be divided into a plurality of zones D and E.
  • Icons of the mobile robots disposed in the respective zones may be displayed in the plurality of zones A, B, C, D, and E.
  • the current position of the allocated mobile robot can be displayed in each of the plurality of zones A, B, C, D, and E included in the map area 930; that is, in each zone, the icon of the mobile robot may be displayed at a point corresponding to the current position of the mobile robot.
  • when a touch input to the icon of a mobile robot is received, the control unit 280 may control the display unit 251 to display the individual robot detailed screen of the corresponding mobile robot.
  • state information of the mobile robots allocated to each of the plurality of zones A, B, C, D, and E included in the map area 930 can be displayed as text.
  • for example, status information may be displayed as text in the A, C, and D zones.
  • the colors of the plurality of zones A, B, C, D, and E included in the map area 930 may vary depending on the status information of the mobile robot.
  • the area where the mobile robot having a failure is disposed is displayed in red
  • the area where the mobile robot is waiting is displayed in green
  • the area where the mobile robot is being cleaned can be displayed in blue.
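  • The status-to-color rule above can be sketched as a simple lookup; the hex values are illustrative placeholders for red, green, and blue:

```typescript
// Map a robot's status to the fill colour of its zone in the map area.
type ZoneStatus = 'failure' | 'waiting' | 'cleaning';

function zoneColor(status: ZoneStatus): string {
  switch (status) {
    case 'failure':  return '#e53935'; // red: robot in the zone has a failure
    case 'waiting':  return '#43a047'; // green: robot is waiting
    case 'cleaning': return '#1e88e5'; // blue: robot is cleaning
    default:         return '#9e9e9e'; // grey fallback for unknown status
  }
}
```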
  • the allocation area of the mobile robots can be easily changed by drag & drop.
  • robot information items 941, 942, 943, 944, and 945 corresponding to a plurality of mobile robots can be displayed.
  • the robot information items 941, 942, 943, 944, and 945 may include name information, status information, disposed zone information, battery status information, and one or more operation buttons of the corresponding mobile robot.
  • the state information of the mobile robot may include a current state such as during cleaning, waiting, going to a charging station, during an emergency stop, or during a pause. Further, when the state information of the mobile robot is being cleaned, information on the cleaning progress time can be further displayed.
  • the robot information items 941, 942, 943, 944, and 945 may include a cleaning start button that can remotely start cleaning and / or a recharge button to return to the charging station.
  • a pause button may be displayed on the robot information items 941, 942, 943, 944, and 945.
  • the robot information area 940 may further include a scroll button for scrolling the screen up and down, a switching button for changing the displayed items, and the like.
  • when a touch input to a robot information item is received, the control unit 280 may control the display unit 251 to display the individual robot detailed screen of the corresponding mobile robot (first mobile robot).
  • FIG. 10 illustrates an individual robot detail screen 1000 of the 'airport cleaning robot 1' displayed when the 'A zone' or the 'airport cleaning robot 1' robot information item 941 is selected.
  • the individual robot detailed screen 1000 provided through the display unit 251 may include a status bar area 1010 for displaying date and time information, connected network information, battery information of the mobile terminal 200, and the like, a title area 1020 in which individual robot basic setting buttons are placed, a map and remote control area 1030 including a map item 1031 corresponding to the assigned zone and a remote control item 1032 including at least one operation button for controlling the corresponding mobile robot, and a detailed menu and manual control area 1040 including a detailed menu item 1041 including a plurality of upper menus and a manual control item 1042 for manually manipulating the corresponding mobile robot.
  • the map item 1031 of the map and remote control area 1030 may include a map of the 'A zone' allocated to the 'airport cleaning robot 1' and current location information of the 'airport cleaning robot 1' displayed on the map.
  • the remote control item 1032 of the map and remote control area 1030 may include a cleaning start button that can remotely start cleaning and/or a go-to-charging-station button that can command a return to the charging station.
  • the map and remote control area 1030 can be switched to the screen as shown in FIG. 11.
  • the current position of the mobile robot, the area where the cleaning is completed, and the area where the cleaning is not performed are displayed on the map, and the cleaning progress information 1132 may be displayed in the map item 1131.
  • the cleaning progress information 1132 may include information on the cleaning progress time and the cleaning progress rate.
  • the remote control item 1132 may include a pause button.
  • the pause button may be displayed at a position where the cleaning start button is displayed.
  • the map and remote control area 1030 can be switched to the screen as shown in FIG. 12.
  • the current position of the mobile robot may be displayed on the map in the map item 1231, and the remote control item 1232 may include a pause button.
  • the pause button may be displayed at a position where the go to charging station button is displayed.
  • the detailed menu and manual control area 1040 may include a manual control item 1042 for manually manipulating the mobile robot.
  • the manual control item 1042 may include an operation button for moving the mobile robot 100 forward, backward, turning or stopping, and an operation button for operating the suction strength.
  • the manual control item 1042 may be set to a locked state.
  • the locked state of the manual control item 1042 can be released by a predetermined pre-set unlock command.
  • a notification screen may be displayed upon entry of the individual robot detailed screen 1000 or according to other notification settings.
  • FIG. 13 shows an example of the notification screen 1300.
  • the notification screen 1300 may include a status bar area 1310 displaying current date/time information, connected communication network information, battery information of the mobile terminal 200, and the like, a title area 1320 in which name information and individual robot basic setting buttons are arranged, and a list area 1330 in which one or more pieces of notification information are provided in a list form.
  • one or more notification messages 1331 and 1332 may be displayed in the list area 1330, and a predetermined number of notification messages 1331 and 1332 may be displayed in order of recency.
  • the list area 1330 may include a delete all button 1335 of the notification messages 1331 and 1332. Also, if the individual notification messages 1331 and 1332 are touched for a predetermined time or longer, the notification messages 1331 and 1332 can be individually deleted.
  • a close button of the notification screen 1300 may be disposed at the lower end of the list area 1330.
  • when an emergency occurs, a pop-up for notifying the emergency can be displayed regardless of the screen currently displayed on the display unit 251.
  • the detailed menu item 1041 may include a cleaning log item for confirming the cleaning history of the corresponding mobile robot, an accessory status item for checking the accessory status information, and a setting item for changing settings.
  • the control unit 280 may control the display unit 251 to display a setting screen including the lower menu items included in the setting item when the setting item is selected.
  • FIG. 14A shows an example of the setting screen 1400.
  • the setting screen 1400 may include a status bar area 1410 for displaying current date/time information, communication network information, battery information of the mobile terminal 200, and the like, a title area in which name information is displayed, and a setting menu area 1430 including a plurality of sub menu items 1431, 1432, 1433, 1434, and 1435.
  • the setting menu area 1430 may include a robot name item 1431 for changing the name of the robot, a robot search item 1432 for inputting a command for searching for a predetermined robot, a travel distance and time item 1433, a notification setting item 1434 capable of setting notification related settings, and a robot version information item 1435.
  • in a detailed area 1440 located on the right side of the setting menu area 1430, a list of a plurality of robot items 1441, 1442, 1443, 1444, and 1445 may be displayed.
  • the control unit 280 can control the wireless communication unit 210 to transmit a predetermined signal to the selected robot.
  • a plurality of mobile robots can be arranged and used.
  • when the shapes of the mobile robots are all the same or similar, it is difficult for the user to identify a specific robot.
  • a plurality of mobile robots can be kept at a place where charging stations are provided, or at a predetermined place where theft and safety accidents can be prevented. In this case, since a plurality of robots are concentrated and aligned at close positions, it is even more difficult for the user to identify an individual mobile robot.
  • FIGS. 15A to 15C are diagrams referred to in explaining the robot search function according to the embodiment of the present invention.
  • the robot system may include a plurality of mobile robots and a mobile terminal.
  • the mobile terminal 200 can transmit a robot search signal for searching for a predetermined mobile robot among the plurality of mobile robots.
  • the setting screen 1500 provided through the display unit 251 of the mobile terminal 200 may display current date/time information, communication network information, battery information of the mobile terminal 200, and the like.
  • the setting screen 1500 may include a detailed area 1540 in which detailed information is displayed when any one of the plurality of sub menu items is selected.
  • in the detailed area 1540, a list of robot items 1541, 1542, 1543, 1544, and 1545 corresponding to the robots included in the robot system may be displayed.
  • the user may touch one of the robot items 1541, 1542, 1543, 1544, and 1545 in the list displayed in the detailed area 1540 to instruct to transmit a signal for searching for a specific robot.
  • the control unit 280 controls the wireless communication unit 210 to transmit the robot search signal for searching for the mobile robot corresponding to the selected robot item 1542 among the plurality of mobile robots.
  • the predetermined mobile robot can receive the robot search signal from the mobile terminal 200 or the server.
  • the mobile robot that receives the signal based on the robot search signal can perform at least one of a state notification, a predetermined sound output, and a movement of one or more times in a predetermined direction.
  • upon receiving the signal based on the robot search signal, the mobile robot can illuminate the status notification light 151 or the like a predetermined number of times or for a predetermined time or longer.
  • the status notification light 151 of the mobile robot 100b that has received the robot search signal among the plurality of mobile robots 100a, 100b, 100c, 100d, and 100e can output light of a predetermined color at least once.
  • the status notification light 151 may output light of a fourth color (sky blue) three times.
  • the status notification light 151 of the mobile robot 100b can output light of a predetermined color for a predetermined time or longer.
  • the mobile robot 100b which has received the signal based on the robot search signal, can output a predetermined sound through the acoustic output unit 181.
  • the mobile robot 100b can output a sound or a sound effect that the user can hear, such as 'here'.
  • the mobile robot 100b receiving the signal based on the robot search signal can move at least once in a predetermined direction.
  • the mobile robot 100b receiving the signal based on the robot search signal can advance a predetermined distance so as to be clearly distinguished from the other mobile robots 100a, 100c, 100d, and 100e aligned at the charging station.
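  • Taken together, a robot-side handler for the search signal might dispatch the three behaviours above roughly as follows; the actuator interface and all values are assumptions for illustration, not the disclosed implementation:

```typescript
// Hypothetical actuator interface of the mobile robot.
interface Actuators {
  blinkStatusLight(color: string, times: number): void;
  playSound(clipName: string): void;
  moveForward(distanceMeters: number): void;
}

// React to a received robot search signal with the behaviours described above.
function onRobotSearchSignal(act: Actuators): void {
  act.blinkStatusLight('skyblue', 3); // e.g. output the fourth colour three times
  act.playSound('here');              // audible cue such as 'here'
  act.moveForward(0.3);               // advance to stand out from aligned robots
}
```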
  • the wireless communication unit 210 of the mobile terminal 200 may transmit a robot search signal to the mobile robot corresponding to the selected robot item 1542, or may transmit a robot search signal to the server.
  • a robot system may include a server (not shown) that receives a robot search signal for searching for a predetermined mobile robot among the plurality of mobile robots, and transmits a signal corresponding to the robot search signal to the predetermined mobile robot.
  • the server may receive status information from the plurality of mobile robots and may transmit information on the plurality of mobile robots to the mobile terminal 200 based on the received status information.
  • the mobile terminal 200 displays on the display unit 251 a main screen based on information on a plurality of mobile robots received from the server, and the main screen displays one or more maps divided into a plurality of zones And a robot information area including robot information items for the mobile robot to be allocated to the plurality of zones.
  • FIGS. 16 to 19 are diagrams for explaining a user interface screen provided through a display unit of a mobile terminal according to an embodiment of the present invention and a control method of the robot system using the same.
  • FIG. 16 illustrates a setting screen 1600 provided through the display unit 251.
  • the setting screen 1600 may include a status bar area 1610 for displaying current date/time information, communication network information, battery information of the mobile terminal 200, and the like, a title area in which name information is displayed, and a setting menu area 1630 including a plurality of sub menu items.
  • the setting screen 1600 may include a detailed area 1640 for displaying detailed information when any one of the plurality of sub menu items is selected.
  • in the detailed area 1640, travel information items 1641, 1642, 1643, and 1644 of the robots included in the robot system may be displayed.
  • the travel information items 1641, 1642, 1643, and 1644 may include an icon of each mobile robot, the distance traveled so far, and the travel time.
  • as described above, the detailed menu item 1041 may include a cleaning log item for confirming the cleaning history of the corresponding mobile robot, an accessory status item for checking the accessory status information, and a setting item for changing settings.
  • FIG. 17 illustrates a cleaning log screen displayed on the display unit 251 when the cleaning log item is selected.
  • the cleaning log screen 1700 may include a status bar area 1710 displaying current date/time information, communication network information, battery information of the mobile terminal 200, and the like, a title area 1720 in which name information is placed, and a detailed area 1730 including cleaning history information.
  • the cleaning history information may include the cleaning execution date and time, and a map image at the time of cleaning completion or interruption.
  • the cleaning history information can be sorted by date and provided in order of recency.
  • FIG. 18 illustrates an accessory status screen displayed on the display unit 251 when the accessory status item is selected.
  • the accessory status screen 1800 may include a status bar area 1810 displaying current date/time information, communication network information, battery information of the mobile terminal 200, and the like, a title area 1820 in which name information is placed, and a detail area 1830 including accessory status information.
  • the accessory status information may include accessory status information of the dirt receptacle 1831, the battery 1832, the main brush 1833, the side brush 1834, and the like.
  • when an accessory needs replacement, a message indicating that replacement is required may be displayed, and when the user presses the replacement completion button after replacing the accessory, the replacement completion date may be displayed.
  • the determination of the necessity of replacement of accessories may basically be based on the elapsed time after the accessory has been replaced.
  • a predetermined sensor may be disposed on an accessory to detect factors related to the life of the accessory, thereby determining the life of the accessory and whether replacement is necessary.
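  • An elapsed-time check of this kind could look like the following sketch; the accessory names and service lives are invented example values, not figures from the disclosure:

```typescript
// Assumed service lives, in hours of use since the last replacement.
const serviceLifeHours: Record<string, number> = {
  mainBrush: 300,
  sideBrush: 200,
  dustContainer: 150,
};

// An accessory needs replacement once its elapsed use reaches its service life.
function needsReplacement(part: string, hoursSinceReplaced: number): boolean {
  const life = serviceLifeHours[part];
  return life !== undefined && hoursSinceReplaced >= life;
}
```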
  • each mobile robot itself stores data on the area setting, and can perform the cleaning work in the assigned cleaning area accordingly.
  • the administrator can rearrange and reconfigure the zones initially set in the robots disposed in public places such as airports and terminals through the user interface screen provided through the display unit 251 of the mobile terminal 200.
  • FIG. 19 shows an example of a main screen provided through the display unit 251 according to an embodiment of the present invention.
  • the position data of the area setting of each mobile robot can be stored in the storage unit 130 of the mobile robot 100.
  • the mobile terminal 200 can receive location data of the mobile robots 100 through a server connection.
  • the mobile terminal 200 receiving the area setting location data of the plurality of mobile robots can store the area setting information of the corresponding mobile robot as each mobile robot object.
  • the main screen 1900 may include a map area 1910 including one or more maps 1911 and 1912 divided into a plurality of zones A, B, C, D, and E, and a robot information area 1920 including robot information items 1921, 1922, 1923, 1924, and 1925 for the mobile robots assigned to the plurality of zones A, B, C, D, and E.
  • the main screen 1900 may be based on information about the mobile robots received through the wireless communication unit 210.
  • the information on the received mobile robots may include at least one of position information, state information, disposed zone information, and battery state information of the mobile robots.
  • the robot information items 1921, 1922, 1923, 1924, and 1925 may include name information, status information, disposed zone information, battery status information, and one or more operation buttons of the corresponding mobile robot.
  • the state information of the mobile robot may include a current state such as during cleaning, waiting, going to a charging station, during an emergency stop, or during a pause. Further, when the state information of the mobile robot is being cleaned, information on the cleaning progress time can be further displayed.
  • the robot information items 1921, 1922, 1923, 1924, and 1925 may include a cleaning start button that can remotely start cleaning and / or a recharge button to return to the charging station.
  • a pause button may be displayed on the robot information items 1921, 1922, 1923, 1924, and 1925.
  • the first map 1911 may be divided into a plurality of zones A, B, and C.
  • the second map 1912 may be divided into a plurality of zones D and E.
  • the colors of the plurality of zones A, B, C, D, and E included in the map area 1910 may vary depending on the status information of the mobile robot.
  • for example, the zones C, D, and E in which the state of the mobile robot cannot be grasped due to a failure or network condition may be displayed in red, the zone A in which the mobile robot is on standby may be displayed in green, and the zone B in which the mobile robot is cleaning may be displayed in blue.
  • Icons of the mobile robots disposed in the respective zones may be displayed in the plurality of zones A, B, C, D, and E.
  • the current position of the assigned mobile robot can be displayed in each of the plurality of zones A, B, C, D, and E included in the map area 1910; that is, in each zone, the icon of the mobile robot may be displayed at a point corresponding to the current position of the mobile robot.
  • the zones allocated to two mobile robots can be mutually exchanged.
  • FIG. 19 shows an example in which the mobile robots allocated to the 'C' zone and the 'D' zone are interchanged.
  • control unit 280 may exchange position data stored corresponding to the corresponding mobile robots in response to a drag and drop input.
  • the exchanged position data can be transmitted to the mapped mobile robots through the server.
  • the robot can be allocated to the changed area to perform cleaning.
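  • The drag-and-drop exchange can be sketched as a plain swap of the stored assignments before they are pushed back through the server; the record shape below is an assumption for illustration:

```typescript
// Hypothetical per-robot assignment record kept by the terminal.
interface Assignment {
  robotId: string;
  zoneId: string;
  zonePositionData: unknown; // area-setting position data for the zone
}

// Swap the zones (and their position data) of the two robots involved in the
// drag-and-drop; both records would then be transmitted via the server.
function swapZones(a: Assignment, b: Assignment): void {
  [a.zoneId, b.zoneId] = [b.zoneId, a.zoneId];
  [a.zonePositionData, b.zonePositionData] =
    [b.zonePositionData, a.zonePositionData];
}
```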
  • the administrator can easily rearrange and reconfigure the zones initially set in the plurality of robots arranged at the airport, the terminal, and the like through the screen of the mobile terminal.
  • the map and remote control area 1030 may include a designated area cleaning setting item 1035 that can designate a cleaning area of the first mobile robot.
  • FIGS. 20 to 26 are views referred to in explaining a user interface screen provided through a display unit included in the mobile terminal and a control method of the robot system using the user interface screen according to the embodiment of the present invention.
  • the screen can be switched to the designated area cleaning setting screen as shown in FIG. 20.
  • the control unit 280 may receive the current state information of the first mobile robot through the wireless communication unit 210, and may control the designated area cleaning setting item 1035 to be displayed in an activated state only when the first mobile robot is in the standby state.
  • control unit 280 may control switching to the designated area cleaning setting screen corresponding to the selection of the designated area cleaning setting item 1035 only when the first mobile robot is in the standby (stop) state.
  • the control unit 280 may control the first mobile robot to stop.
  • the designated area cleaning setting screen 2000 may include a map area 2010 in which a map 2005 of the area allocated to the first mobile robot is displayed, and a menu area 2020 in which a plurality of menu buttons are displayed.
  • the map 2005 may be an entire map of the 'A' area allocated to the first mobile robot.
  • the current position of the first mobile robot can be displayed on the map 2005; that is, the icon 2001 of the first mobile robot can be displayed at a point corresponding to the current position of the first mobile robot.
  • control unit 280 can control the map 2005 to display the cleaned area and the non-cleaned area separately.
  • the user can designate an uncleaned area so that supplementary cleaning is carried out, and can set an area determined to require centralized management as an intensive cleaning area so that the set area information is maintained.
  • when the mobile robot performs a cleaning operation on an assigned area and detects a moving obstacle such as a person while traveling, the robot may clean another area first and then return to attempt cleaning.
  • when the mobile robot returns and a dynamic obstacle such as a person is detected again, the robot can move on to clean another place in the allocated area without cleaning that place. As a result, a non-cleaned area may remain in the cleaning area.
  • a non-cleaning area may be displayed on the map 2005 to assist the user in setting an area and order cleaning.
  • the area can be set as a cleaning area to help perform concentrated cleaning.
  • for a designated cleaning area, it is possible to set the area to be preferentially cleaned before the start of cleaning or during cleaning, or to be cleaned repeatedly as an intensive cleaning area.
  • the plurality of menu buttons are operation buttons that can be used for area designation, and may include a map enlarging button 2021, a map reducing button 2022, an initialize button 2023 for designated areas, an area designation button 2024, a cleaning start button 2025, and a cancel button 2026.
  • the menu area 2020 may further include a save button in addition to a cleaning start button 2025 and a cancel button 2026.
  • the map enlarging button 2021 is a menu button for enlarging the map 2005 displayed in the map area 2010.
  • the map reducing button 2022 is a menu button for reducing the map 2005 displayed in the map area 2010.
  • the user can enlarge or reduce the map 2005 by touching the map enlarging button 2021 or the map reducing button 2022 as necessary.
  • the map 2005 may be enlarged by a pinch-out input by touching two points on the map 2005 and dragging them away from each other.
  • the map 2005 may be reduced by a pinch-in input by touching two points on the map 2005 and dragging them in a direction approaching each other.
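  • A generic sketch of the pinch gesture arithmetic follows; the zoom limits are assumed values, not parameters from the disclosure:

```typescript
type Pt = { x: number; y: number };

// Scale factor from a pinch gesture: the ratio of the current distance
// between the two touch points to their distance when the gesture started.
function pinchScale(start: [Pt, Pt], now: [Pt, Pt], baseScale: number): number {
  const dist = (pair: [Pt, Pt]) =>
    Math.hypot(pair[0].x - pair[1].x, pair[0].y - pair[1].y);
  const scaled = baseScale * (dist(now) / dist(start));
  return Math.max(0.5, Math.min(4, scaled)); // clamp to assumed zoom limits
}
```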
  • the initialization button 2023 may initialize all operations performed during the area designation process.
  • the setting of the designated area can be canceled.
  • the zooming operation of the map can be canceled to return to the state of the map displayed at the beginning.
  • the area designation button 2024 is a button for starting area designation. After touching the area designation button 2024, one or more areas can be designated on the map 2005.
  • the enlargement / reduction function of the map 2005 may be set to be inactivated.
  • the cleaning start button 2025 is an operation button for instructing start of cleaning for the first mobile robot.
  • the user can select the start cleaning button 2025 and instruct the first mobile robot to start cleaning for the designated area.
  • the cleaning start button 2025 is in the inactive state, and can be switched to the active state when at least one area designation is performed.
  • the cleaning start button 2025 may also function as a storage button. For example, after the user designates one or more areas, information on the designated area may be stored in the memory 260 when the user selects the start cleaning button 2025.
  • information on the designated area can be transmitted to the server and / or the first mobile robot through the wireless communication unit 210.
  • the cancel button 2026 is an operation button for terminating the designated area process and returning to the main screen.
  • the cancel button 2026 may also function as a save button. For example, if the user touches the cancel button 2026 without selecting the cleaning start button 2025 after designating one or more areas, the information about the designated areas can be stored in the memory 260. Thereafter, when the user enters the designated area cleaning setting screen 2000 again, the last screen at the time of cancellation can be displayed based on the stored information.
  • the user can designate one or more areas on the map 2005.
  • alternatively, the area may be set to be directly designated, without touching the area designation button 2024, by a drag-and-drop operation of touching the map 2005, dragging, and then releasing the touch.
  • when an input of touching a first point 2201 on the map 2005 and dragging to a second point 2202 is received, the control unit 280 may set, as a first designation area 2210, an area corresponding to a rectangle having the first point 2201 and the second point 2202 as corners.
  • the control unit 280 can set the first designation area 2210 in the form of a box in response to a user's drag and drop input.
  • the user does not need to perform a lot of touch and / or drag input in order to input the entire area to be designated, and can designate one designated area by one straight drag-and-drop input.
  • when the first point 2201 and the second point 2202 are touched in sequence, the controller 280 can set the area corresponding to the rectangle having the first point 2201 and the second point 2202 as corners as the first designation area 2210.
  • the area designation may be set to be performed by two sequential touch inputs instead of a drag and drop input.
  • controller 280 may display the first designation area 2210 in a different color from the rest of the map 2005.
  • controller 280 may control to display the horizontal size 2221 and the vertical size 2222 of the first designation area 2210 outside the first designation area.
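  • The geometry of this gesture reduces to taking the touch-down and touch-up points as opposite corners of the rectangle; a sketch, with an assumed map-scale constant for the metre labels:

```typescript
type Point = { x: number; y: number };
type Rect = { x: number; y: number; w: number; h: number };

// The drag's start and end points become opposite corners of the rectangle.
function rectFromDrag(p1: Point, p2: Point): Rect {
  return {
    x: Math.min(p1.x, p2.x),
    y: Math.min(p1.y, p2.y),
    w: Math.abs(p1.x - p2.x),
    h: Math.abs(p1.y - p2.y),
  };
}

// Label such as '3.2 m x 2.5 m' drawn outside the designated area;
// pixelsPerMeter is an assumed scale of the displayed map.
function sizeLabel(r: Rect, pixelsPerMeter: number): string {
  const w = (r.w / pixelsPerMeter).toFixed(1);
  const h = (r.h / pixelsPerMeter).toFixed(1);
  return `${w} m x ${h} m`;
}
```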
  • in response to the selection of the cleaning start button 2025 included in the designated area cleaning setting screen 2000, the control unit 280 may control the wireless communication unit 210 to transmit information on the first designation area 2210 to the first mobile robot or the server.
  • the controller 280 may control the first designation area 2210 to display a number indicating a creation order or a cleaning order.
  • the meaning of numbers displayed in the first designation area 2210 can be set or changed by a user or a manufacturer.
  • the second designation area 2310 can be set in the same manner as the setting of the first designation area 2210.
  • when an input of touching a third point and dragging to a fourth point is received, the control unit 280 can set the area corresponding to the rectangle having the third point and the fourth point as corners as the second designation area 2310.
  • '1' may be automatically displayed in the first designation area 2210, and '2' may be displayed in the second designation area 2310 generated later.
  • the actual width and height of the second designation area 2310 may be displayed in meters outside the second designation area 2310.
  • FIG. 24 illustrates a case where the first, second, and third designation areas 2210, 2310, and 2410 are set.
  • the user can easily cancel the setting of the designated area by touching any one of the plurality of designated areas.
  • for example, when a touch input to the second designation area 2310 is received, the control unit 280 can cancel the setting of the second designation area 2310 and delete it from the screen.
  • the control unit 280 can change the number displayed in the third designation area 2510 to '2' according to the deletion of the second designation area 2310.
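  • A sketch of this delete-and-renumber step (the record shape is assumed for illustration):

```typescript
interface DesignatedArea {
  id: number;    // internal identifier of the designated area
  order: number; // number displayed in the area (creation/cleaning order)
}

// Remove the touched area and renumber the remainder so the displayed
// numbers stay consecutive, e.g. a former '3' becomes '2'.
function removeArea(areas: DesignatedArea[], id: number): DesignatedArea[] {
  return areas
    .filter(a => a.id !== id)
    .sort((a, b) => a.order - b.order)
    .map((a, i) => ({ ...a, order: i + 1 }));
}
```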
  • the control unit 280 can control the wireless communication unit 210 to transmit, to the first mobile robot or the server, a control signal instructing cleaning of the first and second designation areas 2210 and 2310 in a predetermined order.
  • the predetermined order may be set to any one of a creation order of the first and second designated areas, a distance order between the first mobile robot and the first and second designated areas, or a cleaning order manually set by the user.
  • for example, the movement of the first mobile robot can be minimized by performing cleaning while moving to the designated areas in order of proximity to the current position of the first mobile robot.
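  • The distance ordering can be sketched as a sort by Euclidean distance from the robot's current position. Note this is a simplification that sorts once from the starting position; a greedy nearest-neighbour walk would re-evaluate the nearest remaining area after each one is cleaned:

```typescript
interface CleaningTarget {
  order: number;          // manual/creation order, unused in this sketch
  cx: number; cy: number; // centre of the designated area on the map
}

// Order designated areas nearest-first from the robot's current position.
function byDistance(
  targets: CleaningTarget[],
  robot: { x: number; y: number },
): CleaningTarget[] {
  const d = (t: CleaningTarget) => Math.hypot(t.cx - robot.x, t.cy - robot.y);
  return [...targets].sort((a, b) => d(a) - d(b));
}
```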
  • control unit 280 can switch the screen displayed on the display unit 251 to the individual robot detailed screen of the first mobile robot in accordance with the start of cleaning of the first mobile robot.
  • in the map and remote control area 2610, a map 2611 of the 'A' zone allocated to the first mobile robot and cleaning progress information 2612 including information on the cleaning progress time may be displayed.
  • the current position of the first mobile robot and a designated area set in the map 2611 can be displayed.
  • the remote control item 2613 of the map and remote control area 2610 may include operation buttons such as a pause button and a go-to-charging-station button, and status information can be displayed in the remote control item 2613.
  • the manual control item 2620 can be set to the locked state.
  • the locked state of the manual control item 2620 can be released by a predetermined pre-set unlock command.
  • FIG. 27 is a conceptual diagram of a robot system including a plurality of mobile robots and a mobile terminal according to an embodiment of the present invention.
  • a robot system according to an embodiment of the present invention may include a plurality of mobile robots 100a, 100b, ... and a mobile terminal 200.
  • the robot system may include a plurality of mobile robots 100a and 100b, a mobile terminal 200, and a server 50 of a robot manufacturer or a service provider.
  • the plurality of mobile robots 100a and 100b may transmit map data in which the position information of the mobile robots 100a and 100b is stored to the server 50 via the communication units 260a and 190b, according to communication standards such as LTE and Wi-Fi.
  • the map data may be an entire map of the service area stored in the storage units 200a and 130b of the mobile robots 100a and 100b or a map of the area allocated to the mobile robots 100a and 100b.
  • the control units 210a and 140b of the mobile robots 100a and 100b can determine the current position based on the data obtained from the sensor units 240a and 170b or the image acquisition unit, and can control the map data stored in the mobile robot 100a or 100b to be transmitted to the server 50.
  • the control units 210a and 140b may control the location information of the mobile robots 100a and 100b to be transmitted to the server 50, and the server 50 may combine the received location information into previously stored map data.
  • the server 50 may transmit, to the mobile terminal 200, map data including the start (current) positions of the mobile robots 100a and 100b, or user interface (UI) data capable of displaying a map screen including the start (current) positions of the mobile robots 100a and 100b.
  • the control unit 280 of the mobile terminal 200 may control the display unit 251 to display a user interface screen, such as the main screen or the individual robot detailed screen described above, on the basis of the data received through the wireless communication unit 210.
  • the server 50 can receive the status information of the mobile robots 100a and 100b and transmit the status information to the mobile terminal 200.
  • the mobile terminal 200 can receive status information on the plurality of mobile robots from the server 50 and display a main screen based on the received information on the plurality of mobile robots on the display unit 251.
  • the main screen may include a map area including one or more maps divided into a plurality of areas and a robot information area including robot information items for a mobile robot allocated to the plurality of areas.
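  • The relay described above can be condensed into a small sketch: robots report to the server, which keeps the latest state per robot and hands the terminal a payload to render the main screen from. All names are illustrative assumptions:

```typescript
interface StatusReport {
  robotId: string;
  position: { x: number; y: number };
  status: string; // e.g. 'cleaning', 'standby'
}

class RelayServer {
  private latest = new Map<string, StatusReport>();

  // Called when a robot reports; merge the report into the stored state.
  onRobotReport(report: StatusReport): void {
    this.latest.set(report.robotId, report);
  }

  // Payload the terminal renders its main screen from.
  payloadForTerminal(): StatusReport[] {
    return [...this.latest.values()];
  }
}
```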
  • the user can easily set the cleaning area in the individual mobile robot by a drag and drop operation for finishing the touch after touching and dragging on the map displayed on the user interface screen.
  • another cleaning area can be set on the same screen by the same method, so that multiple cleaning areas can be set within any one cleaning zone.
  • control unit 280 may control the multi-cleaning area information to be stored in the memory 260.
  • the control unit 280 may control the wireless communication unit 210 so that the set multiple cleaning area information is transmitted to the server 50 or the mobile robots 100a and 100b.
  • the control units 210a and 140b of the mobile robots 100a and 100b can control to store the multiple cleaning area information received through the communication units 240a and 170b in the storage units 200a and 130b.
  • the control units 210a and 140b can control the multiple cleaning area information to be sorted and stored in a predetermined order.
  • the predetermined order may be set to any one of a creation order of a designated area among the multiple cleaning areas, a distance order with respect to the mobile robots 100a and 100b, or a cleaning order manually set by a user.
  • for example, the movement of the mobile robots 100a and 100b can be minimized by moving to the areas in order of proximity to the current position of the mobile robots 100a and 100b.
  • the mobile robots 100a and 100b sequentially move the multiple cleaning areas according to the predetermined order, and perform the cleaning operation.
  • a large space can be effectively cleaned.
  • multiple cleaning areas can be easily designated for the individual cleaning robots to perform cleaning for specific areas, thereby performing supplementary cleaning and concentrated cleaning.
  • the mobile robot, the mobile terminal, and the robot system according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.
  • the control method of the mobile robot, the mobile terminal, and the robot system according to the embodiment of the present invention can be implemented as processor-readable code on a processor-readable recording medium.
  • the processor-readable recording medium includes all kinds of recording apparatuses in which data that can be read by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave, such as transmission over the Internet.
  • the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

According to one aspect, the present invention relates to a mobile terminal comprising: a display unit; a wireless communication unit for receiving information on multiple mobile robots; and a control unit for controlling a main screen, based on the received information on the multiple mobile robots, to be displayed on the display unit, the main screen comprising: a map area including at least one map divided into multiple zones; and a robot information area including robot information items on the mobile robots assigned to the multiple zones, so that a user can conveniently check and control information on the robots included in a robot system.
PCT/KR2018/007399 2017-06-30 2018-06-29 Terminal mobile et système de robot comprenant ledit terminal mobile WO2019004773A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020170083623A KR20190003119A (ko) 2017-06-30 2017-06-30 이동 단말기 및 이를 포함하는 이동 로봇
KR10-2017-0083623 2017-06-30
KR10-2017-0083624 2017-06-30
KR1020170083624A KR20190003120A (ko) 2017-06-30 2017-06-30 복수의 이동 로봇을 포함하는 로봇 시스템 및 이동 단말기

Publications (1)

Publication Number Publication Date
WO2019004773A1 true WO2019004773A1 (fr) 2019-01-03

Family

ID=64742409

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/007399 WO2019004773A1 (fr) 2017-06-30 2018-06-29 Terminal mobile et système de robot comprenant ledit terminal mobile

Country Status (1)

Country Link
WO (1) WO2019004773A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120126772A (ko) * 2011-05-12 2012-11-21 엘지전자 주식회사 청소 장치, 및 복수의 로봇 청소기를 이용한 협동 청소 방법
KR20130039622A (ko) * 2011-10-12 2013-04-22 엘지전자 주식회사 로봇 청소기, 이의 원격 제어 시스템, 및 단말 장치
JP2016515311A (ja) * 2013-01-18 2016-05-26 アイロボット コーポレイション 移動ロボットを備える環境管理システム及び移動ロボットを用いる方法
KR20150014237A (ko) * 2013-07-29 2015-02-06 삼성전자주식회사 자동 청소 시스템, 청소 로봇 및 그 제어 방법
KR20150061398A (ko) * 2013-11-27 2015-06-04 한국전자통신연구원 군집 로봇의 협력 청소 방법 및 제어 장치

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111685654A (zh) * 2019-03-13 2020-09-22 北京奇虎科技有限公司 扫地机状态切换方法及装置
EP3986679A4 (fr) * 2019-06-18 2023-08-09 LG Electronics Inc. Robot mobile et son procédé de commande
CN114355877A (zh) * 2021-11-25 2022-04-15 烟台杰瑞石油服务集团股份有限公司 一种多机器人作业区域的分配方法和装置
CN114355877B (zh) * 2021-11-25 2023-11-03 烟台杰瑞石油服务集团股份有限公司 一种多机器人作业区域的分配方法和装置

Similar Documents

Publication Publication Date Title
AU2019335976B2 (en) A robot cleaner and a controlling method for the same
WO2021010757A1 (fr) Robot mobile et son procédé de commande
AU2019262468B2 (en) A plurality of robot cleaner and a controlling method for the same
AU2019262467B2 (en) A plurality of robot cleaner and a controlling method for the same
WO2018155999A2 (fr) Robot mobile et son procédé de commande
AU2019430311B2 (en) Plurality of autonomous mobile robots and controlling method for the same
WO2018038553A1 (fr) Robot mobile et procédé de commande associé
WO2020050494A1 (fr) Robot nettoyeur et son procédé de commande
WO2019212239A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
WO2019004742A1 (fr) Système de robot comprenant un robot mobile et un terminal mobile
WO2020050489A1 (fr) Robot nettoyeur et son procédé de commande
WO2017200353A1 (fr) Robot nettoyeur
WO2020218652A1 (fr) Purificateur d'air
AU2019262477B2 (en) Plurality of autonomous mobile robots and controlling method for the same
WO2019212276A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
WO2019212240A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
WO2016028021A1 (fr) Robot de nettoyage et son procédé de commande
WO2021006542A1 (fr) Robot mobile faisant appel à l'intelligence artificielle et son procédé de commande
WO2020050566A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
WO2019004773A1 (fr) Terminal mobile et système de robot comprenant ledit terminal mobile
EP3787461A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
EP3993962A1 (fr) Robot mobile et son procédé de commande
WO2017116131A1 (fr) Robot de nettoyage et procédé de commande de robot de nettoyage
WO2020050565A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de ces derniers
WO2023286901A1 (fr) Robot mobile, station d'accueil, et système robotique les comportant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18822682

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18822682

Country of ref document: EP

Kind code of ref document: A1