WO2013085085A1 - Automatic moving apparatus and manual operation method thereof


Info

Publication number: WO2013085085A1
Authority: WO (WIPO, PCT)
Prior art keywords: robot cleaner, traveling, moving apparatus, automatic moving, mode
Application number: PCT/KR2011/009492
Other languages: French (fr)
Inventors: Seokbyung Oh, Jiwoon Hwang, Kwonyul Choi, Hyungshin Park
Original assignee: LG Electronics Inc.
Application filed by LG Electronics Inc.
Priority applications: PCT/KR2011/009492 (WO2013085085A1), US14/354,493 (US9776332B2), KR1020147012430A (KR101910382B1)
Publication: WO2013085085A1

Classifications

    • B25J 13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J 9/0003: Home robots, i.e. small robots for domestic use
    • B25J 9/1612: Programme controls characterised by the hand, wrist, grip control
    • B25J 9/1689: Teleoperation
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J 9/1697: Vision controlled systems
    • G05D 1/0016: Control of position, course or altitude of land, water, air, or space vehicles associated with a remote control arrangement, characterised by the operator's input device
    • G05D 1/0033: Control associated with a remote control arrangement, by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • G05D 1/0038: Control associated with a remote control arrangement, by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G05D 1/0276: Control of position or course in two dimensions, specially adapted to land vehicles, using signals provided by a source external to the vehicle
    • A47L 2201/04: Automatic control of the travelling movement; automatic obstacle detection

Definitions

  • the present invention relates to a manually operable automatic moving apparatus and a manual operation method thereof.
  • robots have been developed for industrial purposes and have played a role in factory automation. Recently, the application fields of robots have extended: robots for medical purposes, space navigation robots, and the like, and even home robots that may be used in general houses, have been developed.
  • a typical example of home robots is a robot cleaner, a type of electronic device that sucks in dust or foreign materials around it while traveling in a certain region.
  • the robot cleaner generally includes a rechargeable battery and an obstacle sensor for avoiding a hindrance or an obstacle during traveling, whereby the robot cleaner can perform cleaning while traveling.
  • an aspect of the present invention provides various interfaces allowing users to directly manipulate an automatic moving apparatus manually, thereby enhancing user convenience and efficiency.
  • an automatic moving apparatus including: a storage unit configured to store a traveling method; an image detection unit configured to acquire a captured image; a driving unit having one or more wheels and driving the wheels according to a driving signal; and a control unit configured to extract a traveling direction from the traveling method stored in the storage unit in a first mode, extract a traveling direction indicated by a sensing target from the captured image acquired by the image detection unit in a second mode, and generate a driving signal for moving the automatic moving apparatus in the extracted traveling direction.
  • an automatic moving apparatus including: a storage unit configured to store a traveling method; a communication unit configured to receive control information from an external terminal device; a driving unit having one or more wheels and driving the wheels according to a driving signal; and a control unit configured to extract a traveling direction from the traveling method stored in the storage unit in a first mode, extract a traveling direction from the control information received by the communication unit in a second mode, and generate a driving signal for moving the automatic moving apparatus in the extracted traveling direction.
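  • To make the mode structure shared by the two configurations above concrete, the following is a minimal Python sketch of the dispatch logic; the class and method names are hypothetical, since the disclosure does not prescribe an implementation:

    from enum import Enum

    class Mode(Enum):
        AUTOMATIC = 1  # first mode: follow the traveling method in the storage unit
        MANUAL = 2     # second mode: follow the sensing target or the terminal device

    class AutoMover:
        """Hypothetical skeleton of the claimed automatic moving apparatus."""

        def __init__(self, storage, image_detector, comm, driver):
            self.storage = storage                # storage unit (traveling method)
            self.image_detector = image_detector  # image detection unit (captured image)
            self.comm = comm                      # communication unit (terminal device)
            self.driver = driver                  # driving unit (wheels)
            self.mode = Mode.AUTOMATIC

        def step(self):
            if self.mode == Mode.AUTOMATIC:
                direction = self.storage.next_direction()
            else:
                # first configuration: direction indicated by a sensing target;
                # second configuration: direction from received control information
                direction = (self.image_detector.target_direction()
                             or self.comm.control_direction())
            self.driver.drive(direction)  # generate and apply the driving signal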
  • the user can manually manipulate the automatic moving apparatus through simple manipulation. Accordingly, a cleaning effect can be maximized, and in particular, both convenience and utility of the automatic moving apparatus can be obtained by providing an automatic manipulation mode and a manual manipulation mode.
  • the user can simply manipulate an automatic moving apparatus manually. Accordingly, the cleaning effect is maximized when the automatic moving apparatus is a robot cleaner.
  • the automatic moving apparatus provides both an automatic manipulation mode and a manual manipulation mode, thereby enhancing user convenience and efficiency.
  • FIG. 1 is a block diagram of a manual control system of a robot cleaner according to an embodiment of the present invention.
  • FIG. 2 is a perspective view showing an external appearance of the robot cleaner 100 according to an embodiment of the present invention.
  • FIG. 3 is a schematic block diagram of the robot cleaner 100 according to an embodiment of the present invention.
  • FIG. 4 is a detailed block diagram of the robot cleaner 100 according to an embodiment of the present invention.
  • FIG. 5 is a side view showing an external appearance of the robot cleaner 100 according to an embodiment of the present invention.
  • FIG. 6 is a view showing a lower portion of the external appearance of the robot cleaner 100 according to an embodiment of the present invention.
  • FIG. 7 is a block diagram of a terminal device 200 according to an embodiment of the present invention.
  • FIG. 8 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a first embodiment of the present invention.
  • FIGS. 9a to 9c are views showing configuration screens of the robot cleaner according to an embodiment of the present invention.
  • FIGS. 10a to 10c are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
  • FIGS. 11a and 11b are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
  • FIGS. 12a to 12c are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
  • FIG. 13 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a second embodiment of the present invention.
  • FIGS. 14a to 14d are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • FIGS. 15a to 15c are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • FIGS. 16a to 16d are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • FIG. 17 is a different conceptual view showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • FIGS. 18a and 18b are views showing screens of setting a scenario of the robot cleaner according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a manual control system of a robot cleaner according to an embodiment of the present invention.
  • a system for manually controlling a robot cleaner includes a robot cleaner 100 and terminal devices 200a to 200c.
  • the robot cleaner 100 may extract a control command from a control signal to perform the control command, and capture an image of surroundings according to the control command to generate image information.
  • the terminal devices 200a to 200c may be connected to the robot cleaner 100, and receive the image information from the robot cleaner 100 and store the received image information.
  • the terminal devices 200a to 200c are classified into a mobile or portable terminal and a stationary terminal according to whether the terminal devices are movable.
  • the terminal devices 200a to 200c include both a mobile or portable terminal and a stationary terminal.
  • the terminal devices 200a to 200c may be further classified into handheld terminals and vehicle-mounted terminals; here, the terminal devices 200a to 200c include both a handheld terminal and a vehicle-mounted terminal.
  • the terminal devices may include cell phones (PCS phones), smart phones, laptop computers, digital broadcast terminals, PDAs (Personal Digital Assistants), PMPs (Portable Multimedia Players), navigation devices, and the like.
  • a networking scheme of the manual control system may differ according to types of terminal devices.
  • the manual control system may use a mobile communication network such as 3G, CDMA, WCDMA, or the like, and the robot cleaner and the terminal device transmit and receive radio signals to and from a base station, an external terminal, a server, or the like.
  • the robot cleaner 100 includes a communication unit 110 and a control unit 120. Also, the robot cleaner 100 may further include an image detection unit 131, an obstacle detection unit 132, and a location recognition unit 133.
  • the communication unit 110 receives a control signal from an external terminal device and transmits a response signal with respect to the control signal or one or more data.
  • the control unit 120 controls the robot cleaner according to the control signal and generates the response signal according to the control results.
  • the control unit 120 extracts a cleaning start command or a cleaning stop command from the control signal, and performs cleaning according to the cleaning start command or stops cleaning according to the cleaning stop command.
  • the communication unit 110 is connected to the terminal device according to one of wired, wireless, and satellite communication schemes, namely, one communication scheme among currently available communication schemes, to transmit and receive data to and from the terminal device.
  • the communication unit 110 transmits status information, obstacle information, location information, image information, a cleaning map, or the like, of the robot cleaner 100.
  • the communication unit 110 may perform communication with the terminal device according to one communication scheme among short-range wireless communication schemes such as Infrared Data Association (IrDA), wireless local area network (WLAN), ZigBeeTM, and the like.
  • the robot cleaner may include a communication unit according to a communication scheme available for the smart phone.
  • the communication unit 110 may receive cleaning reservation information of the robot cleaner 100 from the terminal device 200.
  • the control unit 120 performs cleaning by using the cleaning reservation information.
  • the robot cleaner 100 may further include the image detection unit 131 installed to face upwardly or forwardly and having an upper camera sensor to capture an image of the surroundings of the robot cleaner to detect image information.
  • when the image detection unit 131 includes a plurality of upper camera sensors, the camera sensors may be formed on an upper portion or a side portion of the robot cleaner 100 at certain distances or at certain angles.
  • the image detection unit 131 may also be used as a different type of a location recognition unit.
  • the image detection unit 131 may further include a lens connected to the camera and focusing an object, an adjusting unit adjusting the camera, and a lens adjusting unit for adjusting the lens.
  • the control unit 120 may extract a feature point from the image information captured by the image detection unit, recognize a location of the robot cleaner by using the feature point, and generate a cleaning map with respect to a cleaning area.
  • the robot cleaner further includes the obstacle detection unit 132 including one or more sensors, detecting an obstacle around the robot cleaner by using sensing signals of the sensors, and outputting obstacle information.
  • the control unit 120 generates a cleaning map by using the obstacle information.
  • the obstacle detection unit 132 may include first sensors 132a installed at certain intervals on a front side of the robot cleaner 100, namely, on an outer circumferential surface of the robot cleaner 100, as shown in FIG. 2 or 5. Also, the obstacle detection unit 132 may include a second sensor 132b installed to have a face protruded to an outer side of a main body. The positions and types of the first and second sensors may vary according to types of the robot cleaner, and the obstacle detection unit may further include a variety of sensors. The first sensor 132a senses an object, in particular, an obstacle, present in a direction in which the robot cleaner travels, and transfers the sensing information to the control unit 120.
  • the first sensor senses a protrusion, household goods, furniture, a wall face, a wall corner, and the like, existing in a path along which the robot cleaner moves, and transfers corresponding information to the control unit 120.
  • the first sensor may be an infrared ray sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, or the like.
  • the second sensor 132b may sense an obstacle present at a front side or a lateral side and transfers obstacle information to the control unit 120.
  • the second sensor 132b senses a protrusion, household goods, furniture, a wall face, a wall corner, and the like, existing in a path along which the robot cleaner moves, and transfers corresponding information to the control unit 120.
  • the second sensor 132b may be an infrared ray sensor, an ultrasonic sensor, an RF sensor, a position sensitive device (PSD) sensor, or the like.
  • the obstacle detection unit 132 may further include a precipice sensor 132c installed on a lower surface of the main body and sensing an obstacle on a bottom surface, e.g., a precipice.
  • the precipice sensor 132c may be configured to stably obtain a measurement value regardless of a reflectivity of the bottom surface, a color difference, or the like, and may be configured to have a type of an infrared module such as a PSD sensor.
  • the obstacle detection unit 132 may further include a charge signal sensor (not shown) receiving a guidance signal transmitted from a charging station.
  • the robot cleaner 100 may check a location and direction of the charging station upon receiving the guidance signal generated by the charging station.
  • the charging station transmits a guidance signal indicating a direction and distance to allow the robot cleaner 100 to be returned.
  • the robot cleaner 100 determines a current location thereof and sets a movement direction to return to the charging station.
  • a charging signal sensor may be, for example, an infrared ray sensor, an ultrasonic sensor, an RF sensor, or the like. In general, the infrared ray sensor is used.
  • the charging signal sensor may be provided within the robot cleaner 100 or at an outer side of the robot cleaner 100.
  • the charging signal sensor may be installed at a lower portion of the output unit 180 or in the vicinity of the image detection unit 131.
  • the output unit 180 may display a remaining battery capacity on a screen. Also, the terminal device 200 may receive a charged state of the battery, a remaining battery capacity, or the like, from the robot cleaner 100 and display the same at one side of the screen of a display unit.
  • the robot cleaner may further include the location recognition unit 133 having one or more sensors, recognizing a location of the robot cleaner by using sensing signals of the sensors and outputting location information.
  • the control unit 120 may correct a cleaning map by using the location information recognized by the location recognition unit 133.
  • the location recognition unit 133 includes a lower camera sensor 133a provided on the rear side of the robot cleaner and capturing an image of the lower side, namely, the bottom surface (the surface to be cleaned).
  • the lower camera sensor 133a may be an optical flow sensor, which converts an image of a lower portion input from an image sensor provided in the sensor to generate image data having a certain format.
  • the lower camera sensor may sense a location of the robot cleaner 100 irrespective of sliding of the robot cleaner 100.
  • the control unit 120 compares and analyzes image data captured by the lower camera sensor according to time to calculate a movement distance and movement direction to thus calculate a location of the robot cleaner 100. Since the lower side of the robot cleaner 100 is observed by using the lower camera sensor, the control unit can make a reliable correction resistant to sliding with respect to the location calculated by a different unit.
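  • one way to realize this frame-to-frame comparison is phase correlation between consecutive lower-camera images. The sketch below uses OpenCV's phaseCorrelate, an illustrative choice not named in the disclosure, to estimate the per-frame shift and accumulate it into a position:

    import cv2
    import numpy as np

    def track_position(frames):
        """Accumulate the (x, y) displacement seen by the lower camera sensor.

        frames: iterable of BGR floor images in time order (assumed format).
        Returns the accumulated shift in pixels; a floor-scale factor would
        convert it to metric distance.
        """
        pos = np.zeros(2)
        prev = None
        for frame in frames:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
            if prev is not None:
                # phase correlation returns the (dx, dy) shift between two frames
                (dx, dy), _response = cv2.phaseCorrelate(prev, gray)
                pos += (dx, dy)
            prev = gray
        return pos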
  • the location recognition unit 133 may further include an acceleration sensor sensing a change in the speed of the robot cleaner 100, e.g., a change in a movement speed according to start, stop, a change of direction, collision with an object, or the like.
  • the acceleration sensor may be attached to a position adjacent to a main wheel or an auxiliary wheel to sense sliding or idle rotation of the wheel.
  • the speed is calculated by using acceleration sensed by the acceleration sensor, and compared with a reference speed to thus check or correct the location of the robot cleaner 100.
  • the acceleration sensor is installed in the control unit 120 to sense a change in speed of the robot cleaner when the robot cleaner performs cleaning operation or makes a movement. Namely, the acceleration sensor senses impulse according to the change in speed to output a corresponding voltage value.
  • the acceleration sensor may perform a function of an electronic bumper.
  • the location recognition unit 133 may further include a gyro sensor sensing a rotation direction of the robot cleaner and sensing a rotation angle when the robot cleaner moves or performs cleaning.
  • the gyro sensor may sense an angular velocity of the robot cleaner and output a voltage value proportional to the angular velocity.
  • the control unit 120 calculates a rotation direction and rotation angle by using the voltage value output from the gyro sensor.
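  • the calculation described here is a plain integration of angular velocity over time; a sketch, assuming the sensor voltage has already been scaled to rad/s by a hypothetical datasheet factor:

    def integrate_gyro(samples, dt, volts_to_rad_per_s=1.0):
        """Integrate gyro output into a rotation angle.

        samples: gyro output voltages taken every dt seconds.
        volts_to_rad_per_s: hypothetical scale factor from the sensor datasheet.
        Returns the angle in radians; its sign gives the rotation direction.
        """
        angle = 0.0
        for v in samples:
            omega = v * volts_to_rad_per_s  # voltage proportional to angular velocity
            angle += omega * dt             # rectangular integration
        return angle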
  • the location recognition unit 133 may further include a wheel sensor 162 connected to the left and right main wheels 161 to sense the number of rotations of the main wheels.
  • the wheel sensor 162 mainly uses a rotary encoder, and when the robot cleaner 100 performs cleaning or moves, the wheel sensor 162 senses the number of rotations of the left and right main wheels and outputs the same.
  • the control unit 120 may calculate the rotation speed of the left and right wheels by using the number of rotations.
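  • from the encoder counts, a standard differential-drive pose update follows once the wheel geometry is known; a sketch (wheel radius, track width, and encoder resolution are assumed parameters, not values from the disclosure):

    import math

    def update_pose(x, y, theta, ticks_left, ticks_right,
                    ticks_per_rev=360, wheel_radius=0.03, track_width=0.25):
        """Dead-reckon a new pose from left/right encoder tick deltas."""
        d_left = 2 * math.pi * wheel_radius * ticks_left / ticks_per_rev
        d_right = 2 * math.pi * wheel_radius * ticks_right / ticks_per_rev
        d = (d_left + d_right) / 2                  # travel of the robot center
        d_theta = (d_right - d_left) / track_width  # heading change
        x += d * math.cos(theta + d_theta / 2)
        y += d * math.sin(theta + d_theta / 2)
        return x, y, theta + d_theta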
  • the control unit 120 may precisely recognize a location of the robot cleaner 100 by using the sensing information from the acceleration sensor, the gyro sensor, the wheel sensor 162, the lower camera sensor and the image information from the image detection unit. Also, the control unit 120 may precisely generate a cleaning map by using the obstacle information detected by the obstacle detection unit and the location recognized by the image detection unit.
  • the communication unit 110 transmits data including the image information, the obstacle information, the location information, the cleaning map, the cleaning area, and the like, to the terminal device 200.
  • the robot cleaner 100 may further include a storage unit 140 storing one or more information among the image information, the obstacle information, the location information, the cleaning map, and the cleaning area.
  • the storage unit 140 stores a control program for controlling (or driving) the robot cleaner 100 and corresponding data.
  • the storage unit 140 may store a cleaning method or a traveling method.
  • as the storage unit 140, a non-volatile memory is commonly used.
  • the non-volatile memory (NVM, or NVRAM) is a storage device that maintains stored information even when power is not supplied.
  • the non-volatile memory includes a ROM, a flash memory, a magnetic computer storage device (e.g., a hard disk, a diskette drive, or a magnetic tape), an optical disk drive, a magnetic RAM, PRAM, and the like.
  • the robot cleaner 100 may further include a cleaning unit 150.
  • the cleaning unit 150 may include a dust container storing collected dust, a suction fan providing power for sucking dust from a cleaning area, and a suction motor rotating the suction fan to suck air, and suck dust or a foreign object around the robot cleaner.
  • the cleaning unit 150 may further include a rotary brush 151 rotatably mounted at a lower portion of the main body of the robot cleaner and a side brush 152 cleaning a corner, or the like, of the cleaning area such as a wall face, or the like, while rotating about a vertical shaft of the main body of the robot cleaner 100 (by being centered thereon).
  • the rotary brush 151, rotating about a horizontal shaft of the main body of the robot cleaner 100, stirs up dust from the floor, carpet, or the like, into the air.
  • a plurality of blades are provided in a spiral direction on an outer circumferential surface of the rotary brush 151.
  • a brush may be provided between the spiral blades.
  • the robot cleaner includes left and right main wheels 161 formed at both sides of a lower portion thereof to allow the robot cleaner to move.
  • a handle may be installed on both sides of the main wheels to allow the user to easily grasp it.
  • the robot cleaner may further include the driving unit 160 connected to the left and right main wheels 161 so as to be driven.
  • the driving unit 160 may include certain wheel motors for rotating the wheels. By driving the wheel motors, the driving unit 160 moves the robot cleaner 100.
  • the wheel motors are connected to the main wheels to rotate the main wheels, respectively. The wheel motors operate independently of each other and are rotatable in both directions.
  • the robot cleaner 100 includes one or more auxiliary wheels on a rear surface thereof to support the main body of the robot cleaner 100, minimize frictional contact between the lower surface of the main body and the bottom surface (i.e., the surface to be cleaned), and allow the robot cleaner 100 to smoothly move.
  • the robot cleaner 100 may further include an input unit 170 directly receiving a control command.
  • through the input unit 170, the user may input a command for outputting one or more of the information items stored in the storage unit 140.
  • the input unit 170 may be configured as one or more buttons.
  • the input unit 170 may include an OK button, a setting button, or the like.
  • the OK button inputs a command for confirming obstacle information, location information, image information, a cleaning area, or a cleaning map.
  • the setting button inputs a command for setting the foregoing information items.
  • the input unit 170 may include a resetting button for inputting a command for resetting the information items, a delete button, a cleaning start button, a stop button, or the like.
  • the input unit 170 may include a button for setting or deleting reservation information. Also, the input unit 170 may further include a button for setting or changing a cleaning mode. Also, the input unit 170 may further include a button for receiving a command for returning to the charging station. As shown in FIG. 2, the input unit 170 may be installed as a hard key, a soft key, a touch pad, or the like, at an upper portion of the robot cleaner. Also, the input unit 170 may have a form of a touch screen along with the output unit 180.
  • the output unit 180 may be provided on the upper portion of the robot cleaner 100.
  • the installation position or the installation form of the output unit 180 may vary.
  • the output unit 180 may display reservation information, a battery state, a cleaning method or a traveling method such as intensive cleaning, a space extension, a zigzag operation, or the like, on the screen thereof.
  • the output unit 180 may output a current state of the respective units constituting the robot cleaner, and a current cleaning state.
  • the output unit 180 may display obstacle information, location information, image information, an internal map, a cleaning area, a cleaning map, or the like, on the screen.
  • the output unit 180 may be configured as any one of elements among a light emitting diode (LED), a liquid crystal display (LCD), a plasma display panel (PDP), and an organic light emitting diode (OLED).
  • LED light emitting diode
  • LCD liquid crystal display
  • PDP plasma display panel
  • OLED organic light emitting diode
  • the robot cleaner may further include a power source unit 190.
  • the power source unit 190 may include a rechargeable battery to provide power to the robot cleaner 100.
  • the power source unit supplies driving power to the respective units and operation power for the robot cleaner 100 to move or perform cleaning.
  • when the remaining power capacity is insufficient, the robot cleaner moves to the charging station and is charged upon receiving a charge current.
  • the robot cleaner further includes a battery sensing unit (not shown) sensing charge state of the battery and transmitting sensing results to the control unit 120.
  • the battery 191 is connected to the battery sensing unit, and a remaining battery capacity and charge state of the battery are transferred to the control unit 120.
  • the remaining battery capacity may be displayed on the screen of the output unit.
  • the battery 191 may be positioned at the center of the lower portion of the robot cleaner, or may be positioned at one of the left and right portions thereof so that the dust container can be placed at the central portion of the lower portion of the main body.
  • the robot cleaner 100 may further include a counterweight to offset the concentration of the battery's weight.
  • the control unit 120 may set a reference value for the remaining battery capacity in advance and compare the remaining battery capacity with the reference value. When the sensed remaining capacity is equal to or smaller than the reference value, the control unit 120 moves the robot cleaner 100 to the charging station so that it can be charged. For example, the control unit 120 may stop the current operation of the robot cleaner 100 and move it to the charging station to be charged, according to a charge command from the terminal device 200. In another example, the control unit 120 may extract a charge command from a received control signal and, depending on the result of comparing the remaining battery capacity with the reference value, either perform the charge command or continue the previous operation.
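  • the comparison logic described above reduces to a small decision routine; a sketch with hypothetical method names on the robot object:

    def on_battery_report(remaining, reference, robot, charge_command=False):
        """Return to the charging station when the battery is low or charging is ordered."""
        if remaining <= reference or charge_command:
            robot.stop_current_operation()
            robot.return_to_charging_station()
        else:
            robot.continue_previous_operation()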
  • the terminal device 200 may include a wireless communication unit 210, a controller 220, and a display unit 230.
  • the wireless communication unit 210 transmits a control signal generated by the controller 220 to the robot cleaner 100 and receives one or more data items including image information or a cleaning map from the robot cleaner 100.
  • the one or more data items refer to image information, obstacle information, location information, a cleaning map, state information, and the like.
  • the controller 220 generates a control signal and generates a control screen by using the data.
  • the control command includes a cleaning start command or a cleaning stop command.
  • the wireless communication unit 210 may include one or more modules allowing for wireless communication in a network between the terminal device 200 and a wireless communication system, between terminal devices, or between a terminal device and the robot cleaner 100.
  • the wireless communication unit 210 may include a broadcast receiving module, a mobile communication module, a wireless Internet module, a short-range communication module, a location information module, and the like.
  • the broadcast receiving module receives broadcast signals and/or broadcast associated information from an external broadcast management server through a broadcast channel.
  • the mobile communication module transmits and/or receives a radio signal to and/or from at least one of a base station, an external terminal and a server over a mobile communication network.
  • the radio signal may include a voice call signal, a video call signal and/or various types of data according to text and/or multimedia message transmission and/or reception.
  • the wireless Internet module which refers to a module supporting wireless Internet access, may be built-in or externally installed to the terminal device.
  • the short-range communication module is a module supporting short-range communication.
  • BluetoothTM, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), ZigBeeTM, and the like, may be used.
  • the display unit 230 includes a touch recognition region 231 for receiving a control command, and displays a control screen. Also, the display unit 230 may display an icon according to a communication scheme (e.g., Wi-Fi, 3G), displays communication sensitivity, and a remaining battery capacity of the robot cleaner. As shown in FIG. 9a, the display unit 230 may display a touch recognition region including a first region 232 including the control screen, receiving the control command, and having a certain size, and a second region 233 having a size smaller than or equal to the first region 232.
  • alternatively, a touch recognition region may not be formed; instead, an input unit for receiving a control command and an output unit for displaying a control screen may be provided separately.
  • the display unit 230 may alternately display a cleaning start icon receiving a cleaning start command and a cleaning stop icon receiving a cleaning stop command on the touch recognition region. Also, the display unit 230 may further include a mode icon for setting a cleaning mode. Here, when a touch input with respect to the mode icon is received, the controller 220 may generate a mode setting screen and the display unit 230 may display a mode setting screen. Also, the display unit 230 may further include a cleaning reservation icon for setting a cleaning reservation. Here, when a touch input with respect to the cleaning reservation icon is received, the controller 220 may generate a cleaning reservation screen and the display unit displays the cleaning reservation screen.
  • the display unit 230 may display information processed in the terminal device 200. Namely, the display unit 230 displays a control screen. For example, when the terminal device 200 is in a phone call mode, the display unit 230 may display a User Interface (UI) or a Graphic User Interface (GUI) associated with a call.
  • the display unit 230 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, and a three-dimensional (3D) display.
  • the display unit 230 may be used as both an input device and an output device, i.e., as a touch screen.
  • the touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, etc.
  • the touch sensor may be configured to convert a pressure applied to a particular portion of the display unit 230 or a change in capacitance at a particular portion of the display unit 230 into an electrical input signal.
  • the touch sensor may be configured to sense the pressure when a touch is applied, as well as a touched position or area.
  • the touch sensor may be a proximity sensor, which senses a pointer positioned close to the screen without the pointer actually being in contact with the screen.
  • the proximity sensor refers to a sensor for sensing the presence or absence of an object that accesses a certain sensing surface or an object that exists nearby by using the force of electromagnetism or infrared rays without a mechanical contact.
  • touch recognition includes a proximity touch, in which the pointer is positioned close to the touch screen without contact, and a contact touch, in which the pointer is actually in contact with the touch screen.
  • the terminal device may further include a memory 240.
  • the memory 240 may store a program for operating the controller 220. Also, the memory 240 may store input/output data (e.g., phonebook, messages, still images, videos, and the like). The memory 240 may also store, in advance, patterned control signals for controlling the robot cleaner together with the corresponding control commands.
  • the terminal device may further include an A/V (audio/video) input unit, a user input unit, a sensing unit, an interface unit, a power supply unit, and the like.
  • the A/V input unit, which is for inputting an audio signal or a video signal, may include a camera, a microphone, and the like.
  • the user input unit generates input data for the user to control the operation of the terminal device.
  • the user input unit may be configured as a key pad, a dome switch, a touch pad (pressure/capacitance), a jog wheel, a jog switch, and the like.
  • when the touch pad is overlaid on the display unit 230 in a layered manner, it may form a touch screen.
  • the sensing unit senses a current status of the terminal device such as an opened or closed state of the terminal device, a location of the terminal device, the presence or absence of user contact with the terminal device, the orientation of the terminal device, an acceleration or deceleration movement and direction of the terminal device, etc., and generates commands or signals for controlling the operation of the terminal device.
  • the interface unit serves as a passage to every external device connected to the terminal device 200.
  • the interface unit may receive data or power from an external device and transfer the received input to elements within the terminal device 200 or may transfer data within the terminal device to an external device.
  • the power supply unit receives external power or internal power and supplies appropriate power required for operating respective elements and components under the control of the controller 220.
  • the controller 220 typically controls the general operations of the terminal device. For example, in case of a mobile phone or a smart phone, the controller 220 performs controlling and processing associated with voice calls, data communications, video calls, and the like. Also, the controller 220 may perform a pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively.
  • the controller 220 may generate a control signal corresponding to a control command with respect to the robot cleaner 100 and generate a control screen by using data and a response signal.
  • the control command includes a movement command, a patrol command, a charge command, a setting change, and the like, in addition to cleaning commands such as cleaning start, cleaning stop, or the like.
  • FIG. 8 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a first embodiment of the present invention.
  • a process for controlling an operation of a robot cleaner may include a step (S110) of checking a manipulation mode of a robot cleaner, a step (S130) of extracting a traveling direction from a traveling method previously stored in the storage unit when the manipulation mode is an automatic mode in step (S120), a step (S140) of extracting a traveling direction indicated by a sensing target from a captured image obtained by the image detection unit 131 when the manipulation mode is a manual mode in step (S120), a step (S150) of generating a driving signal for moving the robot cleaner in the traveling direction extracted in step S130 or the traveling direction extracted in step S140, and a step (S160) of driving one or more wheels provided in the robot cleaner according to the generated driving signal.
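  • the flow of FIG. 8 maps onto a short dispatch routine; a sketch with hypothetical method names, annotated with the step numbers used above:

    def control_step(robot):
        """One pass through steps S110 to S160 of the first embodiment."""
        mode = robot.check_manipulation_mode()                    # S110, S120
        if mode == "automatic":
            direction = robot.storage.stored_direction()          # S130
        else:
            direction = robot.image_detector.target_direction()   # S140
        signal = robot.make_driving_signal(direction)             # S150
        robot.driving_unit.drive_wheels(signal)                   # S160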
  • FIGS. 9a to 9c are views showing configuration screens of the robot cleaner according to an embodiment of the present invention.
  • the robot cleaner 100 or the terminal device 200 may provide items for changing the setting of the robot cleaner 100.
  • the robot cleaner 100 or the terminal device 200 may provide an item for setting a manipulation method of the robot cleaner 100.
  • a manipulation setting screen 300 providing items for setting a manipulation method of the robot cleaner 100 includes an automatic manipulation setting item 301 and a manual manipulation setting item 303.
  • the automatic manipulation setting item 301 and the manual manipulation setting item 303 include a detailed setting item 302 or 304, respectively.
  • the robot cleaner 100 may be automatically or manually manipulated according to an item selected from the manipulation setting screen 300.
  • when the automatic manipulation setting item 301 is selected, the robot cleaner 100 is manipulated according to a traveling method previously stored in the storage unit 140. However, when the manual manipulation setting item 303 is selected, the robot cleaner 100 may be manipulated by a sensing target detected from a captured image obtained by the image detection unit 131, or manipulated by the terminal device 200 transmitting control information to the communication unit 110.
  • an automatic manipulation setting screen image 310 is displayed.
  • the automatic manipulation setting screen image 310 provides an interface allowing for selecting one of a plurality of scenarios stored in the storage unit 140.
  • when one scenario 312 is selected, the robot cleaner 100 is manipulated according to the selected scenario 312.
  • the scenario stores a method for performing cleaning by the robot cleaner 100, including content regarding a cleaning time, order, position, method, and the like.
  • a manual manipulation setting screen image 320 is displayed.
  • the manual manipulation setting screen image 320 provides an interface allowing for selecting one of a plurality of scenarios stored in the storage unit 140.
  • when one scenario 322 is selected, the robot cleaner 100 is manipulated according to the selected scenario 322.
  • FIGS. 10a to 10c are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
  • the storage unit 140 stores a traveling method and the image detection unit 131 obtains a captured image.
  • the driving unit 160 includes one or more wheels 161 and drives the wheels 161 according to a driving signal.
  • the control unit 120 extracts a traveling direction from the traveling method stored in the storage unit in a first mode, and extracts a traveling direction indicated by a sensing target from the captured image obtained by the image detection unit in a second mode.
  • the control unit 120 generates a driving signal for moving the robot cleaner in the extracted traveling direction.
  • the control unit 120 may also extract a traveling distance, as well as the traveling direction, to generate a driving signal for moving the robot cleaner by the corresponding distance in the corresponding direction.
  • the robot cleaner 100 may obtain an image of the sensing target through the image detection unit 131 having the upper camera sensor.
  • the sensing target refers to a moving object.
  • the sensing target may refer to a moving particular subject having feature information such as a hand.
  • the user may place his hand 500 at a certain distance above the upper part of the robot cleaner 100 when the robot cleaner is within a certain area 400, such as in the user's home.
  • the robot cleaner 100 may capture an image of an upper direction of the robot cleaner 100 in real time to detect the user’s hand 500.
  • the robot cleaner 100 may generate a driving signal for moving the robot cleaner 100 in an appropriate direction so that the user’s hand 500 can be placed at an upper side of a certain position, such as the center of the robot cleaner, all the time.
  • the user may move his hand 500 so as to be placed at an upper portion of the right side of the robot cleaner 100.
  • the robot cleaner 100 may generate a driving signal for moving the robot cleaner 100 in a rightward direction such that the detected user’s hand 500 may be placed at a central upper side of the robot cleaner.
  • as the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the user’s hand 500 can be continuously placed at the central upper side of the robot cleaner 100.
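  • keeping the detected hand over the center of the robot amounts to a proportional controller on the hand's pixel offset; a sketch under that reading, with hand detection abstracted away and the gain chosen arbitrarily:

    def follow_hand(frame_w, frame_h, hand_x, hand_y, gain=0.005):
        """Convert the hand's offset from the image center into wheel commands.

        Returns (left, right) wheel speeds in [-1, 1]. Because the camera faces
        upward, the hand's offset in the image indicates the direction the
        robot must move to bring the hand back over its center.
        """
        ex = hand_x - frame_w / 2  # lateral offset drives turning
        ey = hand_y - frame_h / 2  # longitudinal offset drives forward motion
        forward = max(-1.0, min(1.0, gain * ey))
        turn = max(-1.0, min(1.0, gain * ex))
        return forward - turn, forward + turn  # differential-drive mixing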
  • the user can simply manipulate the robot cleaner 100 to perform cleaning on a contamination area.
  • this can be effective when, in the automatic manipulation mode, the robot cleaner 100 fails to clean a particular contamination area, when the user wants the robot cleaner 100 to clean an area intensively, or when the user wants a particular contamination area cleaned immediately.
  • FIGS. 11a and 11b are different conceptual views showing a process of driving the robot cleaner in the manual manipulation mode according to the first embodiment of the present invention.
  • the user may indicate a contamination area 610 with his hand 500 in the area in which the robot cleaner 100 is able to capture an image in a certain area 600 such as in the user’s home.
  • the robot cleaner 100 may capture an image of an upper direction of the robot cleaner 100 in real time to detect the user’s hand 500.
  • the robot cleaner 100 may generate a driving signal for moving the robot cleaner 100 toward the position indicated by the user’s hand 500.
  • as the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the robot cleaner 100 can move to the position indicated by the user’s hand 500.
  • FIGS. 12a to 12c are different conceptual views showing a process of driving the robot cleaner in the manual manipulation mode according to the first embodiment of the present invention.
  • the control unit 120 extracts a series of traveling directions and a series of traveling distances indicated by the sensing target from the captured image, stores the extracted series of traveling directions and the extracted series of traveling distances in the storage unit 140, and generates a series of driving signals for moving the robot cleaner 100 in the stored traveling directions by the traveling distances.
  • the user may indicate a first contamination area 710 with his hand 500 within the area in which the robot cleaner 100 may capture an image in the particular area 700. Then, the user may indicate a second contamination area 720 with his hand 500 within the area in which the robot cleaner 100 may capture an image.
  • the robot cleaner 100 may capture an image of an upper direction of the robot cleaner 100 in real time to detect the user’s hand 500. Also, the robot cleaner 100 may sequentially store the points indicated by the user’s hand 500. After the user finishes the behavior of indicating a particular point with his hand 500, the robot cleaner 100 sequentially retrieves the stored points.
  • with reference to FIG. 12b, the robot cleaner 100 first generates a driving signal for retrieving the first contamination area 710 and moving to it. As the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the robot cleaner 100 can move to the first contamination area 710 to perform cleaning.
  • with reference to FIG. 12c, the robot cleaner 100 then generates a driving signal for retrieving the second contamination area 720 and moving to it. As the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the robot cleaner 100 may move to the second contamination area 720 to perform cleaning.
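  • the record-then-replay behavior of FIGS. 12a to 12c is essentially a first-in, first-out queue of indicated targets; a minimal sketch, with the driver object hypothetical:

    from collections import deque

    class IndicatedTargets:
        """Store points indicated by the user's hand, then visit them in order."""

        def __init__(self):
            self.points = deque()

        def record(self, direction, distance):
            # called while the user is still indicating contamination areas
            self.points.append((direction, distance))

        def replay(self, driver):
            # called after the user finishes indicating; retrieve points in order
            while self.points:
                direction, distance = self.points.popleft()
                driver.move(direction, distance)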
  • FIG. 13 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a second embodiment of the present invention.
  • a process of controlling an operation of the robot cleaner according to the second embodiment of the present invention includes: a step (S210) of checking a manipulation mode of the robot cleaner, a step (S230) of extracting a traveling direction from a traveling method previously stored in the storage unit 140 when the manipulation mode is an automatic mode in step (S220), a step (S240) of extracting a traveling direction from control information obtained by the communication unit 110 when the manipulation mode is a manual mode in step (S220), a step (S250) of generating a driving signal for moving the robot cleaner in the traveling direction extracted in step (S230) or the traveling direction extracted in step (S240), and a step (S260) of driving one or more wheels provided in the robot cleaner according to the generated driving signal.
  • FIGS. 14a to 14d are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • the storage unit 140 stores the traveling method and the communication unit 110 receives control information from the terminal device 200.
  • the driving unit 160 includes one or more wheels 161 and drives the wheels 161 according to a driving signal.
  • the control unit 120 extracts a traveling direction from the traveling method stored in the storage unit in a first mode, and extracts a traveling direction from the control information received by the communication unit 110 in a second mode. Also, the control unit 120 generates a driving signal for moving the robot cleaner in the extracted traveling direction. Meanwhile, the control unit 120 may also extract a traveling distance, as well as the traveling direction, to generate a driving signal for moving the robot cleaner by the corresponding distance in the corresponding direction.
  • a connection may be established between the robot cleaner 100 and the terminal device 200 according to a user’s connection request.
  • the robot cleaner 100 may receive control information from the terminal device 200 and may be driven according to the received control information.
  • the terminal device 200 may manually control the operation of the robot cleaner 100.
  • the robot cleaner 100 and the terminal device 200 may be connected by using a wireless Internet technology such as WLAN (Wireless LAN) (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), or the like.
  • the robot cleaner 100 and the terminal device 200 may be connected by using a short-range communication technology such as BluetoothTM, RFID (Radio Frequency Identification), IrDA (infrared Data Association), UWB (Ultra Wideband), ZigBeeTM, or the like.
  • the control information transmitted from the terminal device 200 to the robot cleaner 100 may include a traveling method.
  • the traveling method may include a traveling direction and may further include a traveling distance.
  • the control information may explicitly include, for example, a parameter regarding a direction in which the robot cleaner 100 is to travel or parameters regarding a direction and a distance.
  • the control information may implicitly include, for example, a parameter regarding a direction in which the robot cleaner 100 is to travel or parameters regarding the direction and a distance, such as a parameter regarding a direction in which the terminal device 200 moves or parameters regarding the direction and a distance.
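  • the explicit/implicit distinction can be pictured as one message type with two groups of optional fields; an illustrative format, not taken from the disclosure:

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class ControlInfo:
        """Control information sent from the terminal device (illustrative)."""
        direction: Optional[float] = None           # explicit: robot direction, radians
        distance: Optional[float] = None            # explicit: robot distance, meters
        terminal_direction: Optional[float] = None  # implicit: terminal movement direction
        terminal_distance: Optional[float] = None   # implicit: terminal movement distance

    def extract_travel(info: ControlInfo) -> Tuple[Optional[float], Optional[float]]:
        """Prefer explicit parameters; otherwise derive travel from the terminal's movement."""
        if info.direction is not None:
            return info.direction, info.distance
        return info.terminal_direction, info.terminal_distance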
  • the terminal device 200 may display a manual control screen image 800.
  • the manual control screen image 800 may include a cleaning map region 810, a manipulation tool region 820, and a captured image region 830.
  • the cleaning map region 810 may display a cleaning map 812 with respect to a cleaning area generated by the control unit 120 by using image information captured by the image detection unit 131.
  • the cleaning map 812 may additionally indicate a current location of the robot cleaner 100.
  • the manipulation tool region 820 may display a tool 822 for manipulating traveling of the robot cleaner 100.
  • the tool 822 manipulating traveling of the robot cleaner 100 may include a direction item 822-1 enabling at least two-dimensional movement of the robot cleaner 100 and an indicator item 822-2 indicating a manipulation direction.
  • the captured image region 830 displays the image information received from the robot cleaner 100.
  • the image information may be image information generated by capturing an image of the surroundings of the robot cleaner through the image detection unit 131 of the robot cleaner 100.
  • the image information may include a captured image of a front side of the robot cleaner 100.
  • the user may input a command for moving the robot cleaner 100 in the particular direction by using the traveling manipulation tool 822 displayed on the manipulation tool region 820.
  • the user may input a command for moving the robot cleaner 100 in a corresponding direction by dragging the indicator item 822-2 in the particular direction.
  • when the user wants to move the robot cleaner 100 to a particular location, the user may input a command for moving the robot cleaner 100 to the corresponding location by using the cleaning map 812 displayed on the cleaning map region 810. For example, the user may input a command for moving the robot cleaner 100 to the corresponding location by touching the particular portion of the cleaning map 812.
  • when the user wants to move the robot cleaner 100 to a particular position viewed in the captured image region 830, the user may input a command for moving the robot cleaner 100 to the corresponding location by using the captured image displayed on the captured image region 830. For example, the user may input a command for moving the robot cleaner 100 to the corresponding location by touching a particular portion of the captured image.
  • the terminal device 200 may transmit the corresponding command as control information having a format that can be interpreted by the robot cleaner 100.
  • the robot cleaner 100 Upon receiving the control information, the robot cleaner 100 extracts a traveling direction, or a traveling direction and a distance from the control information, and generates a driving signal for moving the robot cleaner 100 in the corresponding direction or in the corresponding direction and by the corresponding distance.
  • as the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven to move the robot cleaner 100 in the corresponding direction, or in the corresponding direction and by the corresponding distance.
  • FIGS. 15a to 15c are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • the terminal device 200 may transmit information regarding a movement of the terminal device 200 to the robot cleaner 100.
  • the terminal device 200 may include an acceleration sensor (not shown) and/or a gyro sensor (not shown).
  • the user may place the terminal device 200 at a certain distance above the upper part of the robot cleaner 100 when the robot cleaner is within a certain area 900, such as the user's home.
  • the terminal device 200 may detect a movement by using the acceleration sensor and/or gyro sensor installed therein and transmit information regarding the detected movement to the robot cleaner 100.
  • the robot cleaner 100 may analyze the received movement information and generate a driving signal for moving the robot cleaner 100 in a direction, or in a direction and by a distance, corresponding to the movement of the terminal device 200 (an illustrative sketch of this exchange appears after this list).
  • the user may move the terminal device 200 to the right side of the robot cleaner 100 in a particular area 900.
  • the robot cleaner 100 may generate a driving signal for moving the robot cleaner 100 in the rightward direction according to the movement of the terminal device 200.
  • when the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move in the rightward direction.
  • FIGS. 16a to 16d are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • a connection may be established between the robot cleaner 100 and the terminal device 200 according to a user’s connection request.
  • the robot cleaner 100 may receive control information from the terminal device 200 and may be driven according to the received control information.
  • the terminal device 200 may manually control the operation of the robot cleaner 100.
  • the terminal device 200 may display a manual control screen image 1000.
  • the manual control screen image 1000 may include a cleaning map reduction region 1010, a manipulation tool region 1020, and a cleaning map magnification region 1030.
  • the cleaning map reduction region 1010 displays a reduced cleaning map 1012 with respect to a cleaning region generated by the control unit 120 by using image information captured by the image detection unit 131.
  • the reduced cleaning map 1012 may additionally indicate a current location of the robot cleaner 100.
  • the manipulation tool region 1020 displays a tool 1022 for manipulating traveling of the robot cleaner 100.
  • the tool 1022 for manipulating traveling of the robot cleaner 100 may include a direction item enabling at least a 2D movement of the robot cleaner 100 and an indicator item indicating a manipulation direction.
  • the cleaning map magnification region 1030 displays a magnified cleaning map 1040 with respect to the generated cleaning region.
  • the magnified cleaning map 1040 may include one or more contamination regions 1042 and 1044 and an indicator 1050 reflecting the location of the robot cleaner 100.
  • the user may sequentially select spots desired to be cleaned on the magnified cleaning map 1040.
  • the user may sequentially touch the first contamination region 1042 and the second contamination region 1044 on the magnified cleaning map 1040.
  • the user may terminate the selection of spots by selecting an ‘OK’ button displayed on the cleaning map magnification region 1030.
  • the terminal device 200 transmits position information regarding the selected spots to the robot cleaner 100 as control information.
  • the robot cleaner 100 analyzes the received control information and sequentially stores the spots selected by the user.
  • the robot cleaner 100 generates a driving signal for retrieving the first contamination region 1042 and moving the robot cleaner 100 to the first contamination region 1042.
  • when the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move to the first contamination region 1042 to perform cleaning.
  • with reference to FIG. 16d, the robot cleaner 100 subsequently generates a driving signal for retrieving the second contamination region 1044 and moving the robot cleaner 100 to the second contamination region 1044.
  • when the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move to the second contamination region 1044 to perform cleaning.
  • the user may correct the cleaning map generated by the robot cleaner 100 on the cleaning map magnification region 1030.
  • the terminal device 200 may provide a user interface allowing for a correction of the cleaning map, and when the cleaning map is corrected through the provided user interface, the terminal device 200 may transmit the corrected content to the robot cleaner 100.
  • FIG. 17 is a different conceptual view showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
  • the terminal device 200 may have a camera (not shown).
  • the terminal device 200 may provide a captured screen image 1100 of a region to be cleaned by the robot cleaner 100.
  • the captured cleaning region screen image 1100 may include a guide region 1110 providing guide information, a preview region 1120 providing a preview of a captured image, and a menu region 1130 displaying a capture menu.
  • the terminal device 200 may transmit the captured image to the robot cleaner 100.
  • the robot cleaner 100 compares the captured image with previously captured image information, extracts feature information to determine the corresponding position, and then generates a driving signal for moving the robot cleaner 100 to that position (an illustrative matching sketch appears after this list).
  • when the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move to the position captured by the terminal device 200 to perform cleaning.
  • FIGS. 18a and 18b are views showing screens of setting a scenario of the robot cleaner according to an embodiment of the present invention.
  • in the foregoing manual manipulation mode, the traveling direction, or the traveling direction and the distance, are determined manually by the user, and the robot cleaner 100 may be driven accordingly.
  • the traveling direction, or the traveling direction and distance, together with a traveling order, a traveling time, and the like, may be stored as a scenario.
  • the robot cleaner 100 or the terminal device 200 may provide a menu 1212 for storing the scenario performed in the manual manipulation mode through a cleaning completion screen image 1200. Also, with reference to FIG. 18b, the user may drive the robot cleaner 100 according to the traveling method performed in the manual manipulation mode by selecting the scenario 314 newly added to the automatic manipulation setting screen 310.
  • the foregoing method can be implemented as processor-readable codes on a program-recorded medium.
  • the processor-readable medium may include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
  • the processor-readable medium also includes implementations in the form of carrier waves or signals (e.g., transmission via the Internet).
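By way of illustration of the terminal-movement mode above (FIGS. 15a to 15c), the following minimal Python sketch maps a single accelerometer sample on the terminal device 200 to a coarse travel direction for the robot cleaner 100. The function names, axis convention, and threshold are assumptions for illustration only; a real implementation would filter and integrate the sensor data, and the patent does not define any of these details.

```python
# Hypothetical sketch of the FIG. 15 exchange: the terminal device maps an
# accelerometer sample to a coarse direction and the robot mirrors it.
# Axis convention, threshold, and all names are illustrative assumptions.

def terminal_motion_direction(ax, ay, threshold=0.5):
    """Map one accelerometer sample (m/s^2) to a direction, or None if idle."""
    if abs(ax) < threshold and abs(ay) < threshold:
        return None                      # terminal is (roughly) stationary
    if abs(ax) >= abs(ay):
        return "right" if ax > 0 else "left"
    return "forward" if ay > 0 else "backward"

def on_control_info(robot, direction):
    """Robot-cleaner side: drive the main wheels in the mirrored direction."""
    if direction is not None:
        robot.driving_unit.apply(robot.make_driving_signal(direction))
```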
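Similarly, for the photographed-spot mode above (FIG. 17), one plausible way to match the image sent by the terminal device 200 against previously captured image information is local-feature matching. The sketch below uses ORB features from OpenCV purely as an example; the patent only states that feature information is compared, without naming an algorithm, and `stored` is a hypothetical list of (grayscale image, position) pairs.

```python
import cv2

# Illustrative feature matching for FIG. 17: pick the stored position whose
# image best matches the photo sent by the terminal. ORB + brute-force
# Hamming matching is an assumed example technique, not the patent's method.

def best_matching_position(query_img, stored):
    """stored: iterable of (grayscale image, position); returns a position."""
    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    _, query_desc = orb.detectAndCompute(query_img, None)
    best_pos, best_score = None, -1
    for img, pos in stored:
        _, desc = orb.detectAndCompute(img, None)
        if query_desc is None or desc is None:
            continue
        score = len(matcher.match(query_desc, desc))  # count of good matches
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```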

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

Various interfaces allowing users to directly manipulate an automatic moving apparatus manually, thus enhancing user convenience and efficiency, are provided. An automatic moving apparatus includes: a storage unit configured to store a traveling method; an image detection unit configured to acquire a captured image; a driving unit having one or more wheels and driving the wheels according to a driving signal; and a control unit configured to extract a traveling direction from the traveling method stored in the storage unit in a first mode, extract a traveling direction indicated by a sensing target from the captured image acquired by the image detection unit in a second mode, and generate a driving signal for moving the automatic moving apparatus in the extracted traveling direction.

Description

AUTOMATIC MOVING APPARATUS AND MANUAL OPERATION METHOD THEREOF
The present invention relates to a manually operable automatic moving apparatus and a manual operation method thereof.
In general, robots have been developed for industrial purposes and have played a role in factory automation. Recently, the application fields of robots have been extended; robots for medical purposes, space navigation robots, and even home robots that may be used in general houses have been developed.
A typical example of a home robot is the robot cleaner, an electronic device that sucks in dust or foreign material around it while traveling in a certain region. The robot cleaner generally includes a rechargeable battery and an obstacle sensor for avoiding hindrances or obstacles during traveling, whereby the robot cleaner can perform cleaning while traveling.
Therefore, an aspect of the present invention provides various interfaces allowing users to directly manipulate an automatic moving apparatus manually, thereby enhancing user convenience and efficiency.
According to an aspect of the present invention, there is provided an automatic moving apparatus including: a storage unit configured to store a traveling method; an image detection unit configured to acquire a captured image; a driving unit having one or more wheels and driving the wheels according to a driving signal; and a control unit configured to extract a traveling direction from the traveling method stored in the storage unit in a first mode, extract a traveling direction indicated by a sensing target from the captured image acquired by the image detection unit in a second mode, and generate a driving signal for moving the automatic moving apparatus in the extracted traveling direction.
According to another aspect of the present invention, there is provided an automatic moving apparatus including: a storage unit configured to store a traveling method; a communication unit configured to receive control information from an external terminal device; a driving unit having one or more wheels and driving the wheels according to a driving signal; and a control unit configured to extract a traveling direction from the traveling method stored in the storage unit in a first mode, extract a traveling direction from the control information received by the communication unit in a second mode, and generate a driving signal for moving the automatic moving apparatus in the extracted traveling direction.
According to embodiments of the present invention, the user can manually manipulate the automatic moving apparatus through simple manipulation. Accordingly, a cleaning effect can be maximized, and in particular, both convenience and utility of the automatic moving apparatus can be obtained by providing an automatic manipulation mode and a manual manipulation mode.
According to the embodiments of the present invention, the user can simply manipulate an automatic moving apparatus manually. Accordingly, the cleaning effect is maximized when the automatic moving apparatus is a robot cleaner. Especially, the automatic moving apparatus provides both an automatic manipulation mode and a manual manipulation mode, thereby enhancing user convenience and efficiency.
FIG. 1 is a block diagram of a manual control system of a robot cleaner according to an embodiment of the present invention.
FIG. 2 is a perspective view showing an external appearance of the robot cleaner 100 according to an embodiment of the present invention.
FIG. 3 is a schematic block diagram of the robot cleaner 100 according to an embodiment of the present invention.
FIG. 4 is a detailed block diagram of the robot cleaner 100 according to an embodiment of the present invention.
FIG. 5 is a side view showing an external appearance of the robot cleaner 100 according to an embodiment of the present invention.
FIG. 6 is a view showing a lower portion of the external appearance of the robot cleaner 100 according to an embodiment of the present invention.
FIG. 7 is a block diagram of a terminal device 200 according to an embodiment of the present invention.
FIG. 8 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a first embodiment of the present invention.
FIGS. 9a to 9c are views showing configuration screens of the robot cleaner according to an embodiment of the present invention.
FIGS. 10a to 10c are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
FIGS. 11a and 11b are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
FIGS. 12a to 12c are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
FIG. 13 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a second embodiment of the present invention.
FIGS. 14a to 14d are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
FIGS. 15a to 15c are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
FIGS. 16a to 16d are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
FIG. 17 is a different conceptual view showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
FIGS. 18a and 18b are views showing screens of setting a scenario of the robot cleaner according to an embodiment of the present invention.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings such that they can be easily practiced by those skilled in the art to which the present invention pertains. As those skilled in the art would realize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present invention. In order to clarify the present invention, parts not related to the description will be omitted, and similar reference numerals are used for similar parts throughout the specification.
FIG. 1 is a block diagram of a manual control system of a robot cleaner according to an embodiment of the present invention.
With reference to FIG. 1, a system for manually controlling a robot cleaner includes a robot cleaner 100 and terminal devices 200a to 200c. The robot cleaner 100 may extract a control command from a control signal to perform the control command, and capture an image of surroundings according to the control command to generate image information. The terminal devices 200a to 200c may be connected to the robot cleaner 100, and receive the image information from the robot cleaner 100 and store the received image information.
The terminal devices 200a to 200c are classified into mobile or portable terminals and stationary terminals according to whether they are movable; here, the terminal devices 200a to 200c include both types. Also, the terminal devices 200a to 200c are classified into handheld terminals and vehicle mount terminals, and here they likewise include both types. For example, the terminal devices may include cell phones (PCS phones), smart phones, laptop computers, digital broadcast terminals, PDAs (Personal Digital Assistants), PMPs (Portable Multimedia Players), navigation devices, and the like. The networking scheme of the manual control system may differ according to the type of terminal device. For example, in the case of a mobile phone or a smart phone, the manual control system may use a mobile communication network such as 3G, CDMA, WCDMA, or the like, and the robot cleaner and the terminal device transmit and receive radio signals to and from a base station, an external terminal, a server, or the like.
With reference to FIGS. 2 and 3, the robot cleaner 100 according to an embodiment of the present invention includes a communication unit 110 and a control unit 120. Also, the robot cleaner 100 may further include an image detection unit 131, an obstacle detection unit 132, and a location recognition unit 133. The communication unit 110 receives a control signal from an external terminal device and transmits a response signal with respect to the control signal or one or more data. The control unit 120 controls the robot cleaner according to the control signal and generates the response signal according to the control results. The control unit 120 extracts a cleaning start command or a cleaning stop command from the control signal, and performs cleaning according to the cleaning start command or stops cleaning according to the cleaning stop command.
The communication unit 110 is connected to the terminal device according to one of the currently available communication schemes to transmit and receive data to and from the terminal device. The communication unit 110 transmits status information, obstacle information, location information, image information, a cleaning map, or the like, of the robot cleaner 100. Also, the communication unit 110 may perform communication with the terminal device according to one of the short-range wireless communication schemes, such as infrared data association (IrDA), wireless local area network (WLAN), ZigBee™, and the like. For example, when the terminal device is a smart phone, the robot cleaner may include a communication unit according to a communication scheme available for the smart phone. Also, the communication unit 110 may receive cleaning reservation information of the robot cleaner 100 from the terminal device 200. Here, the control unit 120 performs cleaning by using the cleaning reservation information.
With reference to FIGS. 2 and 3, the robot cleaner 100 may further include the image detection unit 131, installed to face upward or forward and having an upper camera sensor, to capture an image of the surroundings of the robot cleaner and detect image information. When the image detection unit 131 includes a plurality of upper camera sensors, the camera sensors may be formed on an upper portion or a side portion of the robot cleaner 100 at a certain distance or angle from one another. The image detection unit 131 may also be used as a different type of location recognition unit. The image detection unit 131 may further include a lens connected to the camera and focusing on an object, an adjusting unit adjusting the camera, and a lens adjusting unit adjusting the lens. As the lens, one having an angle of view wide enough to capture every surrounding region, e.g., every region of the ceiling, is used. The control unit 120 may extract a feature point from the image information captured by the image detection unit, recognize the location of the robot cleaner by using the feature point, and generate a cleaning map with respect to a cleaning area.
With reference to FIGS. 3 and 4, the robot cleaner according to an embodiment of the present invention further includes the obstacle detection unit 132 including one or more sensors, detecting an obstacle around the robot cleaner by using sensing signals of the sensors, and outputting obstacle information. Here, the control unit 120 generates a cleaning map by using the obstacle information.
The obstacle detection unit 132 may include first sensors 132a installed at certain intervals on the front side of the robot cleaner 100, namely, along the outer circumferential surface of the robot cleaner 100, as shown in FIG. 2 or 5. Also, the obstacle detection unit 132 may include a second sensor 132b installed so that its face protrudes outward from the main body. The positions and types of the first and second sensors may vary according to the type of the robot cleaner, and the obstacle detection unit may further include a variety of sensors. The first sensor 132a senses an object, in particular an obstacle, present in the direction in which the robot cleaner travels and transfers the sensing information to the control unit 120. Namely, the first sensor senses protrusions, household goods, furniture, wall faces, wall corners, and the like, existing in the path along which the robot cleaner moves, and transfers corresponding information to the control unit 120. The first sensor may be an infrared ray sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, or the like. The second sensor 132b may sense an obstacle present at a front or lateral side and transfer the obstacle information to the control unit 120. Namely, the second sensor 132b senses protrusions, household goods, furniture, wall faces, wall corners, and the like, existing in the path along which the robot cleaner moves, and transfers corresponding information to the control unit 120. The second sensor 132b may be an infrared ray sensor, an ultrasonic sensor, an RF sensor, a position sensitive device (PSD) sensor, or the like.
As shown in FIG. 6, the obstacle detection unit 132 may further include a precipice sensor 132c installed on the lower surface of the main body and sensing an obstacle on the bottom surface, e.g., a precipice. The precipice sensor 132c may be configured to stably obtain a measurement value regardless of the reflectivity of the bottom surface, a color difference, or the like, and may be configured as an infrared module, such as a PSD sensor.
Also, the obstacle detection unit 132 may further include a charge signal sensor (not shown) receiving a guidance signal transmitted from a charging station. The robot cleaner 100 may check the location and direction of the charging station upon receiving the guidance signal generated by the charging station. The charging station transmits a guidance signal indicating a direction and distance to allow the robot cleaner 100 to return. Upon receiving the signal transmitted from the charging station, the robot cleaner 100 determines its current location and sets a movement direction to return to the charging station. The charge signal sensor may be, for example, an infrared ray sensor, an ultrasonic sensor, an RF sensor, or the like; in general, an infrared ray sensor is used. The charge signal sensor may be provided within the robot cleaner 100 or at an outer side of the robot cleaner 100. For example, the charge signal sensor may be installed at a lower portion of the output unit 180 or in the vicinity of the image detection unit 131.
The output unit 180 may display a remaining battery capacity on a screen. Also, the terminal device 200 may receive a charged state of the battery, a remaining battery capacity, or the like, from the robot cleaner 100 and display the same at one side of the screen of a display unit.
With reference to FIGS. 3 and 4, the robot cleaner according to an embodiment of the present invention may further include the location recognition unit 133 having one or more sensors, recognizing a location of the robot cleaner by using sensing signals of the sensors and outputting location information. Here, the control unit 120 may correct a cleaning map by using the location information recognized by the location recognition unit 133.
As shown in FIG. 6, the location recognition unit 133 includes a lower camera sensor 133a provided on the rear side of the robot cleaner and capturing an image of the lower side, namely, the bottom surface or surface to be cleaned. The lower camera sensor 133a may be an optical flow sensor, which converts an image of the lower side input from an image sensor provided therein to generate image data in a certain format. The lower camera sensor can sense the location of the robot cleaner 100 irrespective of sliding of the robot cleaner 100. The control unit 120 compares and analyzes the image data captured by the lower camera sensor over time to calculate a movement distance and a movement direction, and thereby calculates the location of the robot cleaner 100. Since the lower side of the robot cleaner 100 is observed by using the lower camera sensor, the control unit can make a reliable correction, resistant to sliding, to the location calculated by a different unit.
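The displacement estimation that such an optical flow sensor performs can be illustrated with a small sketch. The following uses phase correlation from OpenCV to recover the shift between two consecutive downward-facing frames; this is merely one standard technique, chosen here as an assumption, since the text does not specify how the sensor computes the flow.

```python
import cv2
import numpy as np

# Illustrative only: estimate the (dx, dy) floor shift between two consecutive
# grayscale frames from a downward-facing camera via phase correlation.

def floor_shift(prev_gray, curr_gray):
    """Return the (dx, dy) translation, in pixels, between two frames."""
    a = np.float32(prev_gray)            # phaseCorrelate requires float arrays
    b = np.float32(curr_gray)
    (dx, dy), _response = cv2.phaseCorrelate(a, b)
    return dx, dy

# Accumulating floor_shift() over time, scaled by the camera's mm-per-pixel
# factor, yields the movement distance and direction described above.
```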
The location recognition unit 133 may further include an acceleration sensor sensing a change in the speed of the robot cleaner 100, e.g., a change in a movement speed according to start, stop, a change of direction, collision with an object, or the like. The acceleration sensor may be attached to a position adjacent to a main wheel or an auxiliary wheel to sense sliding or idle rotation of the wheel. Here, the speed is calculated by using acceleration sensed by the acceleration sensor, and compared with a reference speed to thus check or correct the location of the robot cleaner 100. However, in general, the acceleration sensor is installed in the control unit 120 to sense a change in speed of the robot cleaner when the robot cleaner performs cleaning operation or makes a movement. Namely, the acceleration sensor senses impulse according to the change in speed to output a corresponding voltage value. Thus, the acceleration sensor may perform a function of an electronic bumper.
The location recognition unit 133 may further include a gyro sensor sensing a rotation direction of the robot cleaner and sensing a rotation angle when the robot cleaner moves or performs cleaning. The gyro sensor may sense an angular velocity of the robot cleaner and output a voltage value proportional to the angular velocity. The control unit 120 calculates a rotation direction and rotation angle by using the voltage value output from the gyro sensor.
With reference to FIG. 4, the location recognition unit 133 may further include a wheel sensor 162 connected to the left and right main wheels 161 to sense the number of rotations of the main wheels. The wheel sensor 162 mainly uses a rotary encoder, and when the robot cleaner 100 performs cleaning or moves, the wheel sensor 162 senses the number of rotations of the left and right main wheels and outputs the same. The control unit 120 may calculate the rotation speed of the left and right wheels by using the number of rotations.
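As a concrete illustration of how encoder counts translate into position, the sketch below dead-reckons a differential-drive pose from left/right tick deltas. The constants are assumed example values, not figures from the patent.

```python
import math

# Illustrative differential-drive dead reckoning from wheel-encoder ticks
# (wheel sensor 162). All constants are hypothetical example values.

TICKS_PER_REV = 512          # encoder resolution, assumed
WHEEL_RADIUS = 0.035         # meters, assumed
WHEEL_BASE = 0.23            # left-right wheel separation in meters, assumed

def update_pose(x, y, heading, left_ticks, right_ticks):
    """Advance (x, y, heading) by the tick deltas since the last update."""
    per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    dl, dr = left_ticks * per_tick, right_ticks * per_tick
    d = (dl + dr) / 2.0                  # distance traveled by the robot center
    dtheta = (dr - dl) / WHEEL_BASE      # heading change
    x += d * math.cos(heading + dtheta / 2.0)
    y += d * math.sin(heading + dtheta / 2.0)
    return x, y, heading + dtheta
```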
The control unit 120 may precisely recognize a location of the robot cleaner 100 by using the sensing information from the acceleration sensor, the gyro sensor, the wheel sensor 162, the lower camera sensor and the image information from the image detection unit. Also, the control unit 120 may precisely generate a cleaning map by using the obstacle information detected by the obstacle detection unit and the location recognized by the image detection unit. The communication unit 110 transmits data including the image information, the obstacle information, the location information, the cleaning map, the cleaning area, and the like, to the terminal device 200.
With reference to FIG. 3 or 4, the robot cleaner 100 may further include a storage unit 140 storing one or more kinds of information among the image information, the obstacle information, the location information, the cleaning map, and the cleaning area. The storage unit 140 stores a control program for controlling (or driving) the robot cleaner 100 and corresponding data. Also, the storage unit 140 may store a cleaning method or a traveling method. As the storage unit 140, a non-volatile memory is commonly used. Here, a non-volatile memory (NVM, or NVRAM) is a storage device that maintains stored information even when power is not supplied. Non-volatile memories include a ROM, a flash memory, a magnetic computer storage device (e.g., a hard disk, a diskette drive, or a magnetic tape), an optical disk drive, a magnetic RAM, a PRAM, and the like.
With reference to FIGS. 3 and 4, the robot cleaner 100 may further include a cleaning unit 150. The cleaning unit 150 may include a dust container storing collected dust, a suction fan providing power for sucking dust from a cleaning area, and a suction motor rotating the suction fan to suck in air, thereby collecting dust or foreign objects around the robot cleaner. With reference to FIG. 6, the cleaning unit 150 may further include a rotary brush 151 rotatably mounted at a lower portion of the main body of the robot cleaner and a side brush 152 cleaning a corner of the cleaning area, a wall face, or the like, while rotating about a vertical shaft of the main body of the robot cleaner 100. The rotary brush 151, rotating about a horizontal shaft of the main body of the robot cleaner 100, floats dust from the floor, carpet, or the like, into the air. A plurality of blades are provided in a spiral direction on the outer circumferential surface of the rotary brush 151. A brush may be provided between the spiral blades.
With reference to FIG. 6, the robot cleaner includes left and right main wheels 161 formed at both sides of a lower portion thereof to allow the robot cleaner to move. A handle may be installed on both sides of the main wheels to allow the user to easily grasp. With reference to FIG. 3 or 4, the robot cleaner may further include the driving unit 160 connected to the left and right main wheels 161 so as to be driven. The driving unit 160 may include certain wheel motors for rotating the wheels. By driving the wheel motors, the driving unit 160 moves the robot cleaner 100. The wheel motors are connected to the main wheels to rotate the main wheels, respectively. The wheel motors mutually independently operate and can be rotatable in both directions. Also, the robot cleaner 100 includes one or more auxiliary wheels on a rear surface thereof to support the main body of the robot cleaner 100, minimize frictional contact between the lower surface of the main body and the bottom surface (i.e., the surface to be cleaned), and allow the robot cleaner 100 to smoothly move.
With reference to FIGS. 3 and 4, the robot cleaner 100 may further include an input unit 170 directly receiving a control command. Also, the user may input a command for outputting one or more of the information items stored in the storage unit 140. The input unit 170 may be configured as one or more buttons. For example, the input unit 170 may include an OK button and a setting button. The OK button inputs a command for confirming obstacle information, location information, image information, a cleaning area, or a cleaning map. The setting button inputs a command for setting the foregoing information items. The input unit 170 may include a reset button for inputting a command for resetting the information items, a delete button, a cleaning start button, a stop button, or the like. In another example, the input unit 170 may include a button for setting or deleting reservation information. Also, the input unit 170 may further include a button for setting or changing a cleaning mode, and a button for receiving a command for returning to the charging station. As shown in FIG. 2, the input unit 170 may be installed as hard keys, soft keys, a touch pad, or the like, at an upper portion of the robot cleaner. Also, the input unit 170 may take the form of a touch screen together with the output unit 180.
As shown in FIG. 2, the output unit 180 may be provided on the upper portion of the robot cleaner 100. Of course, the installation position or form of the output unit 180 may vary. For example, the output unit 180 may display reservation information, a battery state, and a cleaning or traveling method such as intensive cleaning, space extension, or a zigzag operation on its screen. The output unit 180 may output the current state of the respective units constituting the robot cleaner and the current cleaning state. Also, the output unit 180 may display obstacle information, location information, image information, an internal map, a cleaning area, a cleaning map, or the like, on the screen. The output unit 180 may be configured as any one of a light emitting diode (LED), a liquid crystal display (LCD), a plasma display panel (PDP), and an organic light emitting diode (OLED).
With reference to FIG. 4, the robot cleaner may further include a power source unit 190. The power source unit 190 may include a rechargeable battery to provide power to the robot cleaner 100. The power source unit supplies driving power to the respective units and operation power for the robot cleaner 100 to move or perform cleaning. When the remaining power capacity is insufficient, the robot cleaner moves to the charging station and is charged upon receiving a charge current. The robot cleaner further includes a battery sensing unit (not shown) sensing the charge state of the battery and transmitting the sensing results to the control unit 120. The battery 191 is connected to the battery sensing unit, and the remaining battery capacity and charge state of the battery are transferred to the control unit 120. The remaining battery capacity may be displayed on the screen of the output unit. As shown in FIG. 6, the battery 191 may be positioned at the center of the lower portion of the robot cleaner, or may be positioned at one of the left and right portions so that the dust container can be placed at the central portion of the lower portion of the main body. In the latter case, the robot cleaner 100 may further include a counterweight to offset the weight concentration of the battery.
The control unit 120 may set a reference remaining battery capacity in advance and compare the remaining battery capacity with the reference value. When the sensed remaining capacity is equal to or less than the reference value, the control unit 120 moves the robot cleaner 100 to the charging station to charge the robot cleaner 100. For example, the control unit 120 may stop the current operation of the robot cleaner 100 and move it to the charging station to be charged, according to a charge command from the terminal device 200. In another example, the control unit 120 may extract a charge command and either perform it, according to the result of comparing the remaining battery capacity with the reference value, or continue the previous operation.
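The comparison logic just described is simple enough to state directly. The following sketch assumes a hypothetical `robot` object exposing the battery sensing unit's reading; the threshold is an example value, since the patent only speaks of a preset reference.

```python
# Illustrative low-battery check; the threshold and all attribute names are
# assumptions, standing in for the battery sensing unit and control unit 120.

REFERENCE_CAPACITY = 0.15    # e.g. 15% remaining, hypothetical

def check_battery(robot):
    """Return to the charging station when capacity falls to the reference."""
    if robot.battery.remaining_fraction() <= REFERENCE_CAPACITY:
        robot.stop_current_operation()
        robot.return_to_charging_station()
```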
With reference to FIG. 7, the terminal device 200 according to an embodiment of the present invention may include a wireless communication unit 210, a controller 220, and a display unit 230.
The wireless communication unit 210 transmits a control signal generated by the controller 220 to the robot cleaner 100 and receives one or more data items including image information or a cleaning map from the robot cleaner 100. Here, the one or more data items refer to image information, obstacle information, location information, a cleaning map, state information, and the like. The controller 220 generates a control signal and generates a control screen by using the data. Here, the control command includes a cleaning start command or a cleaning stop command. The wireless communication unit 210 may include one or more modules allowing for wireless communication in a network between the terminal device 200 and a wireless communication system, between terminal devices, or between a terminal device and the robot cleaner 100. For example, the wireless communication unit 210 may include a broadcast receiving module, a mobile communication module, a wireless Internet module, a short-range communication module, a location information module, and the like.
The broadcast receiving module receives broadcast signals and/or broadcast associated information from an external broadcast management server through a broadcast channel. The mobile communication module transmits and/or receives a radio signal to and/or from at least one of a base station, an external terminal and a server over a mobile communication network. Here, the radio signal may include a voice call signal, a video call signal and/or various types of data according to text and/or multimedia message transmission and/or reception. The wireless Internet module, which refers to a module supporting wireless Internet access, may be built-in or externally installed to the terminal device. Here, as a wireless Internet technique, a WLAN (Wireless LAN), Wi-Fi, Wibro (Wireless Broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like, may be used. The short-range communication module is a module supporting short-range communication. Here, as a short-range communication technology, Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), ZigBeeTM, and the like, may be used.
The display unit 230 includes a touch recognition region 231 for receiving a control command, and displays a control screen. Also, the display unit 230 may display an icon according to the communication scheme (e.g., Wi-Fi, 3G), the communication sensitivity, and the remaining battery capacity of the robot cleaner. As shown in FIG. 9a, the display unit 230 may display a touch recognition region including a first region 232 including the control screen, receiving the control command, and having a certain size, and a second region 233 having a size smaller than or equal to the first region 232. Of course, in the case of a mobile phone, a notebook computer, or the like, which does not have a touch screen (or touch pad), a touch recognition region may not be formed; instead, an input unit for receiving a control command and an output unit displaying a control screen may be provided separately.
The display unit 230 may alternately display a cleaning start icon receiving a cleaning start command and a cleaning stop icon receiving a cleaning stop command on the touch recognition region. Also, the display unit 230 may further include a mode icon for setting a cleaning mode. Here, when a touch input with respect to the mode icon is received, the controller 220 may generate a mode setting screen and the display unit 230 may display a mode setting screen. Also, the display unit 230 may further include a cleaning reservation icon for setting a cleaning reservation. Here, when a touch input with respect to the cleaning reservation icon is received, the controller 220 may generate a cleaning reservation screen and the display unit displays the cleaning reservation screen.
The display unit 230 may display information processed in the terminal device 200. Namely, the display unit 230 displays a control screen. For example, when the terminal device 200 is in a phone call mode, the display unit 230 may display a User Interface (UI) or a Graphic User Interface (GUI) associated with a call. The display unit 230 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, and a three-dimensional (3D) display.
When the display unit 230 and a touch sensor for sensing a touch operation are overlaid in a layered manner, the display unit 230 may be used as a touch screen for both input and output. The touch sensor may have the form of, for example, a touch film, a touch sheet, or a touch pad. The touch sensor may be configured to convert a pressure applied to a particular portion of the display unit 230, or a change in capacitance at a particular portion of the display unit 230, into an electrical input signal. The touch sensor may be configured to sense the pressure of a touch as well as the touched position or area. The touch sensor may also be a proximity sensor, which senses a pointer positioned close to the screen without actually being in contact with it. The proximity sensor refers to a sensor that senses the presence or absence of an object approaching a certain sensing surface, or an object existing nearby, by using the force of electromagnetism or infrared rays without mechanical contact. In an embodiment of the present invention, touch recognition includes a proximity touch, in which the pointer is positioned close to the touch screen without contact, and a contact touch, in which the pointer is actually in contact with the touch screen.
With reference to FIG. 7, the terminal device may further include a memory 240. The memory 240 may store a program for operating the controller 220. Also, the memory 240 may store input/output data (e.g., a phonebook, messages, still images, videos, and the like). The memory 240 may store, in advance, patterns of control signals for controlling the robot cleaner and the corresponding control commands.
The terminal device may further include an A/V (audio/video) input unit, a user input unit, a sensing unit, an interface unit, a power supply unit, and the like.
The A/V input unit, which is to input an audio signal or a video signal, may include a camera, a microphone, and the like. The user input unit generates input data to control an operation of the terminal device by the user. The user input unit may be configured as a key pad, a dome switch, a touch pad (pressure/capacitance), a jog wheel, a jog switch, and the like. In particular, when the touch pad is overlaid on the display unit 230 in a layered manner, it may form a touch screen. The sensing unit senses a current status of the terminal device such as an opened or closed state of the terminal device, a location of the terminal device, the presence or absence of user contact with the terminal device, the orientation of the terminal device, an acceleration or deceleration movement and direction of the terminal device, etc., and generates commands or signals for controlling the operation of the terminal device.
The interface unit serves as a passage to and from every external device connected to the terminal device 200. The interface unit may receive data or power from an external device and transfer the received input to elements within the terminal device 200, or may transfer data within the terminal device to an external device. The power supply unit receives external power or internal power and supplies the power required for operating the respective elements and components under the control of the controller 220.
The controller 220 typically controls the general operations of the terminal device. For example, in case of a mobile phone or a smart phone, the controller 220 performs controlling and processing associated with voice calls, data communications, video calls, and the like. Also, the controller 220 may perform a pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively.
The controller 220 may generate a control signal corresponding to a control command with respect to the robot cleaner 100 and generate a control screen by using data and a response signal. Here, the control command includes a movement command, a patrol command, a charge command, a setting change, and the like, in addition to cleaning commands such as cleaning start, cleaning stop, or the like.
FIG. 8 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a first embodiment of the present invention.
A process for controlling an operation of a robot cleaner according to an embodiment of the present invention may include a step (S110) of checking a manipulation mode of a robot cleaner, a step (S130) of extracting a traveling direction from a traveling method previously stored in the storage unit when the manipulation mode is an automatic mode in step (S120), a step (S140) of extracting a traveling direction indicated by a sensing target from a captured image obtained by the image detection unit 131 when the manipulation mode is a manual mode in step (S120), a step (S150) of generating a driving signal for moving the robot cleaner in the traveling direction extracted in step S130 or the traveling direction extracted in step S140, and a step (S160) of driving one or more wheels provided in the robot cleaner according to the generated driving signal.
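For illustration, the S110 to S160 flow can be summarized as a single control step. The sketch below is not the patent's implementation; every attribute and helper name (`manipulation_mode`, `storage.next_direction`, and so on) is a hypothetical stand-in for the storage unit 140, image detection unit 131, and driving unit 160 described above.

```python
# Hypothetical sketch of the first-embodiment control flow (S110 - S160).

AUTO_MODE = "auto"           # any other value is treated as the manual mode

def control_step(robot):
    # S110 / S120: check the manipulation mode of the robot cleaner
    if robot.manipulation_mode == AUTO_MODE:
        # S130: extract a traveling direction from the stored traveling method
        direction = robot.storage.next_direction()
    else:
        # S140: extract the direction indicated by a sensing target in the image
        image = robot.image_detector.capture()
        direction = robot.find_sensing_target_direction(image)
    if direction is not None:
        # S150 / S160: generate a driving signal and drive the wheels
        robot.driving_unit.apply(robot.make_driving_signal(direction))
```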
FIGS. 9a to 9c are views showing configuration screens of the robot cleaner according to an embodiment of the present invention.
The robot cleaner 100 or the terminal device 200 may provide items for changing the setting of the robot cleaner 100. For example, the robot cleaner 100 or the terminal device 200 may provide an item for setting a manipulation method of the robot cleaner 100.
With reference to FIG. 9a, a manipulation setting screen 300 providing items for setting a manipulation method of the robot cleaner 100 includes an automatic manipulation setting item 301 and a manual manipulation setting item 303. The automatic manipulation setting item 301 and the manual manipulation setting item 303 include a detailed setting item 302 or 304, respectively. The robot cleaner 100 may be automatically or manually manipulated according to an item selected from the manipulation setting screen 300.
When the automatic manipulation setting item 301 is selected, the robot cleaner 100 is manipulated according to a traveling method previously stored in the storage unit 140. However, when the manual manipulation setting item 303 is selected, the robot cleaner 100 may be manipulated by a sensing target detected from a captured image obtained by the image detection unit 131 or manipulated by the terminal device 200 transmitting control information to the communication unit 110.
With reference to FIG. 9b, when the automatic manipulation setting item 301 is selected in FIG. 9a, an automatic manipulation setting screen image 310 is displayed. The automatic manipulation setting screen image 310 provides an interface allowing for selecting one of a plurality of scenarios stored in the storage unit 140. When one scenario 312 is selected, the robot cleaner 100 is manipulated according to the selected scenario 312. Here, the scenario stores a method for performing cleaning by the robot cleaner 100, including content regarding a cleaning time, order, position, method, and the like.
With reference to FIG. 9c, when the manual manipulation setting item 303 is selected in FIG. 9a, a manual manipulation setting screen image 320 is displayed. The manual manipulation setting screen image 320 provides an interface allowing for selecting one of a plurality of scenarios stored in the storage unit 140. When one scenario 322 is selected, the robot cleaner 100 is manipulated according to the selected scenario 322.
FIGS. 10a to 10c are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the first embodiment of the present invention.
According to the first embodiment of the present invention, the storage unit 140 stores a traveling method and the image detection unit 131 obtains a captured image. Also, the driving unit 160 includes one or more wheels 161 and drives the wheels 161 according to a driving signal. The control unit 120 extracts a traveling direction from the traveling method stored in the storage unit in a first mode, and extracts a traveling direction indicated by a sensing target from the captured image obtained by the image detection unit in a second mode. Also, the control unit 120 generates a driving signal for moving the robot cleaner in the extracted traveling direction. Meanwhile, the control unit 120 may also extract a traveling distance, as well as the traveling direction, to generate a driving signal for moving the robot cleaner by the corresponding distance in the corresponding direction.
In the manual manipulation mode, the robot cleaner 100 may obtain an image of the sensing target through the image detection unit 131 having the upper camera sensor. Here, the sensing target refers to a moving object. For example, the sensing target may be a particular moving object having feature information, such as the user's hand.
With reference to FIG. 10a, the user may place his hand 500 at a certain distance above the upper part of the robot cleaner 100 when the robot cleaner is within a certain area 400, such as the user's home. The robot cleaner 100 may capture an image of the upper direction of the robot cleaner 100 in real time to detect the user's hand 500. Within the area 400, the robot cleaner 100 may generate a driving signal for moving itself in an appropriate direction so that the user's hand 500 remains above a certain position, such as the center of the robot cleaner, at all times.
Namely, with reference to FIG. 10b, the user may move his hand 500 so as to be placed at an upper portion of the right side of the robot cleaner 100. With reference to FIG. 10c, the robot cleaner 100 may generate a driving signal for moving the robot cleaner 100 in a rightward direction such that the detected user’s hand 500 may be placed at a central upper side of the robot cleaner. When the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the user’s hand 500 can be continuously placed at the central upper side of the robot cleaner 100.
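The geometry of this hand-following behavior is straightforward to sketch. The function below computes a unit direction from the hand's offset relative to the center of the upward-facing camera image; the hand detector itself is assumed to exist elsewhere, and the deadband value is an illustrative assumption.

```python
import math

# Illustrative geometry for keeping a detected hand centered above the robot:
# move toward the hand's offset in the upward-facing camera image.
# Hand detection is assumed to be done elsewhere; only the geometry is shown.

def follow_hand(image_w, image_h, hand_cx, hand_cy, deadband=20):
    """Return a (dx, dy) unit direction toward the hand, or None if centered."""
    off_x = hand_cx - image_w / 2.0
    off_y = hand_cy - image_h / 2.0
    dist = math.hypot(off_x, off_y)
    if dist < deadband:                 # hand is already roughly centered
        return None
    return off_x / dist, off_y / dist
```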
Accordingly, the user can simply manipulate the robot cleaner 100 to perform cleaning on a contamination area. In particular, this can be effective when, in the automatic manipulation mode, the robot cleaner 100 does not clean a particular contamination area, when it should clean an area intensively, or when the user wants a particular contamination area cleaned immediately.
FIGS. 11a and 11b are different conceptual views showing a process of driving the robot cleaner in the manual manipulation mode according to the first embodiment of the present invention.
With reference to FIG. 11a, the user may indicate a contamination area 610 with his hand 500 within the image-capture range of the robot cleaner 100 in a certain area 600, such as the user's home. The robot cleaner 100 may capture an image of the upper direction of the robot cleaner 100 in real time to detect the user's hand 500.
With reference to FIG. 11b, the robot cleaner 100 may generate a driving signal for moving the robot cleaner 100 toward the position indicated by the user’s hand 500. When the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the robot cleaner 100 can move to the position indicated by the user’s hand 500.
FIGS. 12a to 12c are different conceptual views showing a process of driving the robot cleaner in the manual manipulation mode according to the first embodiment of the present invention.
According to an embodiment of the present invention, the control unit 120 extracts a series of traveling directions and a series of traveling distances indicated by the sensing target from the captured image, stores the extracted series of traveling directions and the extracted series of traveling distances in the storage unit 140, and generates a series of driving signals for moving the robot cleaner 100 in the stored traveling directions by the traveling distances.
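A minimal sketch of this record-and-replay behavior follows. The `move_to` callback stands in for generating driving signals toward a stored position; both it and the class name are hypothetical.

```python
from collections import deque

# Illustrative record-and-replay of a series of indicated points
# (FIGS. 12a to 12c). All names are hypothetical stand-ins.

class PointRecorder:
    def __init__(self):
        self.points = deque()

    def record(self, x, y):
        """Store each position the sensing target indicated, in order."""
        self.points.append((x, y))

    def replay(self, move_to):
        """After the user stops indicating, visit the stored points in turn."""
        while self.points:
            move_to(*self.points.popleft())
```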
With reference to FIG. 12a, in a particular area 700, such as the user's home, the user may indicate a first contamination area 710 with his hand 500 within the image-capture range of the robot cleaner 100, and may then indicate a second contamination area 720 in the same manner. The robot cleaner 100 may capture an image of the upper direction of the robot cleaner 100 in real time to detect the user's hand 500. Also, the robot cleaner 100 may sequentially store the points indicated by the user's hand 500. After the user finishes indicating points with his hand 500, the robot cleaner 100 sequentially retrieves the stored points.
With reference to FIG. 12b, first, the robot cleaner 100 generates a driving signal for retrieving the first contamination area 710 and moving the robot cleaner 100 to the first contamination area 710. As the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the robot cleaner 100 can move to the first contamination area 710 to perform cleaning.
With reference to FIG. 12c, the robot cleaner 100 generates a driving signal for retrieving the second contamination area 720 and moving the robot cleaner 100 to the second contamination area 720. As the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and as a result, the robot cleaner 100 may move to the second contamination area 720 to perform cleaning.
FIG. 13 is a flow chart illustrating a process of controlling an operation of the robot cleaner according to a second embodiment of the present invention.
A process of controlling an operation of the robot cleaner according to the second embodiment of the present invention includes: a step (S210) of checking a manipulation mode of the robot cleaner, a step (S230) of extracting a traveling direction from a traveling method previously stored in the storage unit 140 when the manipulation mode is an automatic mode in step (S220), a step (S240) of extracting a traveling direction from control information obtained by the communication unit 110 when the manipulation mode is a manual mode in step (S220), a step (S250) of generating a driving signal for moving the robot cleaner in the traveling direction extracted in step (S230) or the traveling direction extracted in step (S240), and a step (S260) of driving one or more wheels provided in the robot cleaner according to the generated driving signal.
FIGS. 14a to 14d are conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
According to the second embodiment of the present invention, the storage unit 140 stores the traveling method and the communication unit 110 receives control information from the terminal device 200. Also, the driving unit 160 includes one or more wheels 161 and drives the wheels 161 according to a driving signal. The control unit 120 extracts a traveling direction from the traveling method stored in the storage unit in a first mode, and extracts a traveling direction from the control information received by the communication unit 110 in a second mode. Also, the control unit 120 generates a driving signal for moving the robot cleaner in the extracted traveling direction. Meanwhile, the control unit 120 may also extract a traveling distance, as well as the traveling direction, to generate a driving signal for moving the robot cleaner by the corresponding distance in the corresponding direction.
For example, a connection may be established between the robot cleaner 100 and the terminal device 200 according to a user’s connection request. Accordingly, the robot cleaner 100 may receive control information from the terminal device 200 and may be driven according to the received control information. In other words, the terminal device 200 may manually control the operation of the robot cleaner 100.
The robot cleaner 100 and the terminal device 200 may be connected by using a wireless Internet technology such as WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), or the like. Alternatively, the robot cleaner 100 and the terminal device 200 may be connected by using a short-range communication technology such as Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee™, or the like.
The control information transmitted from the terminal device 200 to the robot cleaner 100 may include a traveling method. The traveling method may include a traveling direction and may further include a traveling distance. The control information may explicitly include a parameter regarding a direction in which the robot cleaner 100 is to travel, or parameters regarding both a direction and a distance. Alternatively, the control information may include such parameters implicitly, for example, as a parameter regarding a direction in which the terminal device 200 moves, or as parameters regarding that direction and a distance.
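One plausible encoding of such control information is sketched below. The JSON framing and field names are assumptions chosen for illustration; the disclosure does not define a wire format.

```python
import json

# Explicit form: the terminal names the robot cleaner's traveling
# direction (and optionally a distance) directly.
explicit_msg = json.dumps({
    "kind": "explicit",
    "direction_deg": 90.0,
    "distance_m": 1.5,
})

# Implicit form: the terminal reports its own movement; the robot
# cleaner derives the traveling direction (and distance) from it.
implicit_msg = json.dumps({
    "kind": "implicit",
    "terminal_direction_deg": 90.0,
    "terminal_distance_m": 0.3,
})

def extract_traveling_direction(msg):
    info = json.loads(msg)
    if info["kind"] == "explicit":
        return info["direction_deg"]
    # Here the terminal's movement direction is adopted directly;
    # a real mapping could be more elaborate.
    return info["terminal_direction_deg"]
```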
With reference to FIG. 14a, when a connection is established between the robot cleaner 100 and the terminal device 200, the terminal device 200 may display a manual control screen image 800. The manual control screen image 800 may include a cleaning map region 810, a manipulation tool region 820, and a captured image region 830.
The cleaning map region 810 may display a cleaning map 812 with respect to a cleaning area generated by the control unit 120 by using image information captured by the image detection unit 131. The cleaning map 812 may additionally indicate a current location of the robot cleaner 100.
The manipulation tool region 820 may display a tool 822 for manipulating traveling of the robot cleaner 100. The tool 822 for manipulating traveling of the robot cleaner 100 may include a direction item 822-1 enabling at least two-dimensional movement of the robot cleaner 100 and an indicator item 822-2 indicating a manipulation direction.
The captured image region 830 displays the image information received from the robot cleaner 100. The image information may be generated by capturing an image of the surroundings of the robot cleaner 100 through the image detection unit 131 of the robot cleaner 100. For example, the image information may include a captured image of a front side of the robot cleaner 100.
With reference to FIG. 14b, when the user wants to move the robot cleaner 100 in a particular direction, the user may input a command for moving the robot cleaner 100 in the particular direction by using the traveling manipulation tool 822 displayed on the manipulation tool region 820. For example, the user may input a command for moving the robot cleaner 100 in a corresponding direction by dragging the indicator item 822-2 in the particular direction.
With reference to FIG. 14c, when the user wants to move the robot cleaner 100 to a particular location, the user may input a command for moving the robot cleaner 100 to the corresponding location by using the cleaning map 812 displayed on the cleaning map region 810. For example, the user may input a command for moving the robot cleaner 100 to the corresponding location by touching a particular portion of the cleaning map 812.
With reference to FIG. 14d, when the user wants to move the robot cleaner 100 to a particular position viewed in the captured image region 830, the user may input a command for moving the robot cleaner 100 to the corresponding location by using the captured image displayed on the captured image region 830. For example, the user may input a command for moving the robot cleaner 100 to the corresponding location by touching a particular portion of the captured image.
As shown in FIGS. 14b to 14d, when the command for moving the robot cleaner 100 in a particular direction or to a particular location is input, the terminal device 200 may transmit the corresponding command as control information having a format that can be interpreted by the robot cleaner 100. Upon receiving the control information, the robot cleaner 100 extracts a traveling direction, or a traveling direction and a distance, from the control information, and generates a driving signal for moving the robot cleaner 100 in the corresponding direction, or in the corresponding direction and by the corresponding distance. As the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven to move the robot cleaner 100 in the corresponding direction, or in the corresponding direction and by the corresponding distance.
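For a cleaner driven by left and right main wheels, a direction command is commonly realized as a turn-then-go differential drive. The sketch below illustrates one such conversion; the heading convention (degrees measured clockwise), the constants, and the function name are assumptions, not the disclosed driving-signal format.

```python
import math

def wheel_velocities(target_heading_deg, current_heading_deg, speed=0.2):
    """Convert a commanded traveling direction into (left, right) main
    wheel velocities. Headings are measured clockwise, so a positive
    heading error is corrected by a clockwise turn: left wheel forward,
    right wheel backward. Illustrative sketch only."""
    # Normalize the heading error into [-180, 180).
    error = (target_heading_deg - current_heading_deg + 180.0) % 360.0 - 180.0
    if abs(error) > 5.0:                 # not yet facing the target
        turn = math.copysign(speed, error)
        return (turn, -turn)             # rotate in place
    return (speed, speed)                # aligned: drive straight ahead
```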
FIGS. 15a to 15c are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
When a connection is established between the robot cleaner 100 and the terminal device 200, the terminal device 200 may transmit information regarding a movement of the terminal device 200 to the robot cleaner 100. To this end, the terminal device 200 may include an acceleration sensor (not shown) and/or a gyro sensor (not shown).
With reference to FIG. 15a, the user may place the terminal device 200 at a certain distance above the upper part of the robot cleaner 100 when the robot cleaner 100 is within a particular area 900, such as the user's home. The terminal device 200 may detect a movement by using the acceleration sensor and/or gyro sensor installed therein and transmit information regarding the detected movement to the robot cleaner 100. The robot cleaner 100 may analyze the received information regarding the movement and generate a driving signal for moving the robot cleaner 100 in a direction, or in a direction and by a distance, corresponding to the movement of the terminal device 200.
Namely, with reference to FIG. 15b, the user may move the terminal device 200 to the right side of the robot cleaner 100 in a particular area 900. With reference to FIG. 15c, the robot cleaner 100 may generate a driving signal for moving the robot cleaner 100 in the rightward direction according to the movement of the terminal device 200. When the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move in the rightward direction.
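Deriving a traveling direction from the terminal's own movement needs only a coarse reading of the acceleration axes. A minimal sketch follows; the axis orientation, the threshold, and the function name are assumptions, and a real implementation would filter and integrate the accelerometer and gyro signals over time.

```python
def direction_from_motion(ax, ay, threshold=0.5):
    """Infer a coarse traveling direction for the robot cleaner from
    the terminal device's acceleration in its own x/y plane. The axis
    convention and threshold (in m/s^2) are illustrative assumptions."""
    if abs(ax) < threshold and abs(ay) < threshold:
        return None                          # no deliberate movement
    if abs(ax) >= abs(ay):
        return "right" if ax > 0 else "left"
    return "forward" if ay > 0 else "backward"
```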
FIGS. 16a to 16d are different conceptual views showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
For example, a connection may be established between the robot cleaner 100 and the terminal device 200 according to a user’s connection request. Accordingly, the robot cleaner 100 may receive control information from the terminal device 200 and may be driven according to the received control information. In other words, the terminal device 200 may manually control the operation of the robot cleaner 100.
With reference to FIG. 16a, when the connection is established between the robot cleaner 100 and the terminal device 200, the terminal device 200 may display a manual control screen image 1000. The manual control screen image 1000 may include a cleaning map reduction region 1010, a manipulation tool region 1020, and a cleaning map magnification region 1030.
The cleaning map reduction region 1010 displays a reduced cleaning map 1012 with respect to a cleaning region generated by the control unit 120 by using image information captured by the image detection unit 131. The reduced cleaning map 1012 may additionally indicate a current location of the robot cleaner 100.
The manipulation tool region 1020 displays a tool 1022 for manipulating traveling of the robot cleaner 100. The tool 1022 for manipulating traveling of the robot cleaner 100 may include a direction item enabling at least two-dimensional movement of the robot cleaner 100 and an indicator item indicating a manipulation direction.
The cleaning map magnification region 1030 displays a magnified cleaning map 1040 with respect to the generated cleaning region. The magnified cleaning map 1040 may include one or more contamination regions 1042 and 1044 and an indicator 1050 indicating the current location of the robot cleaner 100.
With reference to FIG. 16b, the user may sequentially select spots to be cleaned on the magnified cleaning map 1040. For example, the user may sequentially touch the first contamination region 1042 and the second contamination region 1044 on the magnified cleaning map 1040. Then, the user may finish the selection of spots by selecting an ‘OK’ button displayed on the cleaning map magnification region 1030. The terminal device 200 transmits position information regarding the selected spots to the robot cleaner 100 as control information. The robot cleaner 100 analyzes the control information and sequentially stores the spots selected by the user.
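The selected spots thus reach the robot cleaner 100 as an ordered list of cleaning-map coordinates. A minimal sketch of receiving and queuing them is given below; the message shape and all names are assumptions for illustration.

```python
def handle_spot_selection(robot, control_info):
    # control_info is assumed to carry an ordered list of cleaning-map
    # coordinates, e.g. {"spots": [[x1, y1], [x2, y2]]}.
    for x, y in control_info["spots"]:
        robot.spot_queue.append((x, y))   # preserve the selection order

def clean_next_spot(robot):
    # First selected, first cleaned, as in FIGS. 16c and 16d.
    if robot.spot_queue:
        x, y = robot.spot_queue.pop(0)
        robot.drive_to(x, y)
        robot.clean_here()
```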
With reference to FIG. 16c, the robot cleaner 100 retrieves the stored first contamination region 1042 and generates a driving signal for moving the robot cleaner 100 to the first contamination region 1042. When the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move to the first contamination region 1042 to perform cleaning.
With reference to FIG. 16d, the robot cleaner 100 subsequently retrieves the stored second contamination region 1044 and generates a driving signal for moving the robot cleaner 100 to the second contamination region 1044. When the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move to the second contamination region 1044 to perform cleaning.
Meanwhile, the user may correct the cleaning map generated by the robot cleaner 100 on the cleaning map magnification region 1030. In this case, the terminal device 200 may provide a user interface allowing for a correction of the cleaning map, and when the cleaning map is corrected through the provided user interface, the terminal device 200 may transmit the corrected content to the robot cleaner 100.
FIG. 17 is a different conceptual view showing a process of driving the robot cleaner in a manual manipulation mode according to the second embodiment of the present invention.
Meanwhile, the terminal device 200 may have a camera (not shown). In this case, the terminal device 200 may provide a captured screen image 1100 of a region to be cleaned by the robot cleaner 100. The captured cleaning region screen image 1100 may include a guide region 1110 providing guide information, a preview region 1120 providing a preview of a captured image, and a menu region 1130 displaying a capture menu.
When an image capture button is selected by the user, the terminal device 200 may transmit the captured image to the robot cleaner 100. The robot cleaner 100 compares the captured image with previously captured image information and extracts feature information to determine a corresponding position. Then, the robot cleaner 100 generates a driving signal for moving the robot cleaner 100 to the corresponding position. When the driving signal is applied to the driving unit 160, the left and right main wheels 161 are driven, and accordingly, the robot cleaner 100 can move to the position captured by the terminal device 200 to perform cleaning.
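The disclosure states only that feature information is extracted and compared; the sketch below shows one plausible realization using ORB features and brute-force Hamming matching from OpenCV. The function name and the matching threshold are assumptions.

```python
import cv2

def _gray(img):
    # ORB works on single-channel images; convert if needed.
    return cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) if img.ndim == 3 else img

def match_location(terminal_img, stored_imgs):
    """Compare the terminal's photograph against previously captured
    image information and return the index of the best-matching stored
    image, or None if nothing matches. Illustrative sketch only."""
    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    _, query_desc = orb.detectAndCompute(_gray(terminal_img), None)
    best_idx, best_score = None, 0
    for idx, img in enumerate(stored_imgs):
        _, desc = orb.detectAndCompute(_gray(img), None)
        if query_desc is None or desc is None:
            continue
        matches = matcher.match(query_desc, desc)
        # Count only close descriptor matches; more of them means the
        # two images more likely show the same location.
        score = sum(1 for m in matches if m.distance < 40)
        if score > best_score:
            best_idx, best_score = idx, score
    return best_idx
```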
FIGS. 18a and 18b are views showing screens of setting a scenario of the robot cleaner according to an embodiment of the present invention.
In the foregoing manual manipulation mode, the traveling direction, or the traveling direction and the distance, is determined manually by the user, and the robot cleaner 100 may be driven accordingly. In this case, the traveling direction or the traveling direction and distance, together with a traveling order, a traveling time, and the like, may be stored as a scenario.
As shown in FIG. 18a, the robot cleaner 100 or the terminal device 200 may provide a menu 1212 for storing the scenario performed in the manual manipulation mode through a cleaning completion screen image 1200. Also, with reference to FIG. 18b, the user may drive the robot cleaner 100 according to a traveling method performed in the manual manipulation mode by selecting a scenario 314 newly added in the automatic manipulation setting screen 310.
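Recording the manual session as a scenario amounts to logging each performed step in order and replaying the log later as an automatic traveling method. A minimal sketch, with assumed field and method names, is shown below.

```python
import time

class ScenarioRecorder:
    """Records the directions/distances performed in the manual
    manipulation mode and replays them later as an automatic traveling
    method. The stored fields are assumptions based on the description."""

    def __init__(self):
        self.steps = []

    def record(self, direction_deg, distance_m):
        # Each manual command becomes one ordered scenario step.
        self.steps.append({
            "direction_deg": direction_deg,
            "distance_m": distance_m,
            "timestamp": time.time(),
        })

    def replay(self, robot):
        # Re-drive the robot through the recorded steps in order.
        for step in self.steps:
            robot.move(step["direction_deg"], step["distance_m"])
```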
In the embodiments of the present invention, the foregoing method can be implemented as processor-readable code on a program-recorded medium. The processor-readable medium may include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. The processor-readable medium also includes implementations in the form of carrier waves or signals (e.g., transmission via the Internet).
As the present invention may be embodied in several forms without departing from the characteristics thereof, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within its scope as defined in the appended claims, and therefore all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds are therefore intended to be embraced by the appended claims.

Claims (16)

  1. An automatic moving apparatus comprising:
    a storage unit configured to store a traveling method;
    an image detection unit configured to acquire a captured image;
    a driving unit having one or more wheels and driving the wheels according to a driving signal; and
    a control unit configured to extract a traveling direction from the traveling method stored in the storage unit in a first mode, extract a traveling direction indicated by a sensing target from the captured image acquired by the image detection unit in a second mode, and generate a driving signal for moving the automatic moving apparatus in the extracted traveling direction.
  2. The automatic moving apparatus of claim 1, wherein the control unit further extracts a traveling distance from the traveling method stored in the storage unit in the first mode, further extracts a traveling distance indicated by the sensing target from the captured image obtained by the image detection unit in the second mode, and generates a driving signal for moving the automatic moving apparatus in the extracted traveling direction and by the extracted traveling distance.
  3. The automatic moving apparatus of claim 2, wherein the control unit extracts a series of traveling directions and a series of traveling distances indicated by the sensing target from the captured image obtained by the image detection unit in the second mode, stores the series of extracted traveling directions and traveling distances in the storage unit, and generates a series of driving signals for moving the automatic moving apparatus in the series of stored traveling directions and by the series of stored traveling distances.
  4. The automatic moving apparatus of claim 1, wherein the control unit detects movement information of the sensing target from the captured image obtained by the image detection unit in the second mode, and extracts the traveling direction based on the detected movement information.
  5. The automatic moving apparatus of claim 1, wherein the storage unit stores the driving result of the driving unit as a new traveling method in the second mode.
  6. A method for controlling an operation of an automatic moving apparatus, the method comprising:
    determining a manipulation mode of the automatic moving apparatus;
    extracting a traveling direction from a traveling method stored in a storage unit when the manipulation mode is a first mode;
    extracting a traveling direction indicated by a sensing target from a captured image obtained by an image detection unit when the manipulation mode is a second mode;
    generating a driving signal for moving the automatic moving apparatus in the extracted traveling direction; and
    driving wheels of the automatic moving apparatus according to the generated driving signal.
  7. An automatic moving apparatus comprising:
    a storage unit configured to store a traveling method;
    a communication unit configured to receive control information from an external terminal device;
    a driving unit having one or more wheels and driving the wheels according to a driving signal; and
    a control unit configured to extract a traveling direction from the traveling method stored in the storage unit in a first mode, extract a traveling direction from the control information received by the communication unit in a second mode, and generate a driving signal for moving the automatic moving apparatus in the extracted traveling direction.
  8. The automatic moving apparatus of claim 7, wherein the control unit further extracts a traveling distance from the traveling method stored in the storage unit in the first mode, further extracts a traveling distance from the control information received by the communication unit in the second mode, and generates a driving signal for moving the automatic moving apparatus in the extracted traveling direction and by the extracted traveling distance.
  9. The automatic moving apparatus of claim 7, further comprising:
    an image detection unit capturing an image of surroundings to generate image information,
    wherein the communication unit transmits the image information to the terminal device.
  10. The automatic moving apparatus of claim 7, wherein the control information is information regarding a movement of the terminal device.
  11. The automatic moving apparatus of claim 7, wherein the control unit generates a cleaning map, and the communication unit transmits the cleaning map to the terminal device.
  12. The automatic moving apparatus of claim 11, wherein the control information includes a series of location information on the cleaning map.
  13. The automatic moving apparatus of claim 11, wherein the control information includes image information captured by the terminal device, and the control unit analyzes the image information to detect a location on the cleaning map.
  14. The automatic moving apparatus of claim 11, wherein the communication unit receives correction information of the cleaning map and corrects the cleaning map by using the correction information.
  15. The automatic moving apparatus of claim 11, wherein the storage unit stores the driving result of the driving unit as a new traveling method.
  16. A method for controlling an operation of an automatic moving apparatus, the method comprising:
    determining a manipulation mode of the automatic moving apparatus;
    extracting a traveling direction from a traveling method stored in a storage unit when the manipulation mode is a first mode;
    extracting a traveling direction from control information received from an external terminal device when the manipulation mode is a second mode;
    generating a driving signal for moving the automatic moving apparatus in the extracted traveling direction; and
    driving wheels of the automatic moving apparatus according to the generated driving signal.
PCT/KR2011/009492 2011-12-08 2011-12-08 Automatic moving apparatus and manual operation method thereof WO2013085085A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/KR2011/009492 WO2013085085A1 (en) 2011-12-08 2011-12-08 Automatic moving apparatus and manual operation method thereof
US14/354,493 US9776332B2 (en) 2011-12-08 2011-12-08 Automatic moving apparatus and manual operation method thereof
KR1020147012430A KR101910382B1 (en) 2011-12-08 2011-12-08 Automatic moving apparatus and manual operation method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2011/009492 WO2013085085A1 (en) 2011-12-08 2011-12-08 Automatic moving apparatus and manual operation method thereof

Publications (1)

Publication Number Publication Date
WO2013085085A1 true WO2013085085A1 (en) 2013-06-13

Family

ID=48574428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/009492 WO2013085085A1 (en) 2011-12-08 2011-12-08 Automatic moving apparatus and manual operation method thereof

Country Status (3)

Country Link
US (1) US9776332B2 (en)
KR (1) KR101910382B1 (en)
WO (1) WO2013085085A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015018355A1 (en) * 2013-08-07 2015-02-12 苏州宝时得电动工具有限公司 Automatic work system, automatic walking device, and control method thereof
CN104423797A (en) * 2013-09-05 2015-03-18 Lg电子株式会社 Robot cleaner system and control method thereof
CN104414590A (en) * 2013-08-23 2015-03-18 Lg电子株式会社 Robot cleaner and method for controlling a robot cleaner
CN104783736A (en) * 2014-01-17 2015-07-22 Lg电子株式会社 Robot cleaner and method of performing human care using same
EP2957206A1 (en) * 2014-06-16 2015-12-23 LG Electronics Inc. Robot cleaner and method for controlling the same
EP2977844A1 (en) * 2014-07-22 2016-01-27 Vorwerk & Co. Interholding GmbH Method for cleaning or processing a room using an automatically moved device
CN106413501A (en) * 2014-05-28 2017-02-15 三星电子株式会社 Mobile device, robot cleaner, and method for controlling the same
KR20170033374A (en) * 2014-08-18 2017-03-24 도시바 라이프스타일 가부시키가이샤 Autonomous moving body
CN107028558A (en) * 2016-02-03 2017-08-11 原相科技股份有限公司 Computer-readable medium storing and automatic sweeping machine
EP3187954A4 (en) * 2014-08-27 2018-04-25 Toshiba Lifestyle Products & Services Corporation Autonomous travel body device
WO2018148874A1 (en) * 2017-02-15 2018-08-23 深圳市前海中康汇融信息技术有限公司 Smart robot based on camera navigation and control method therefor
CN108568819A (en) * 2018-04-20 2018-09-25 郑州科技学院 A kind of intelligent robot autonomous control method based on artificial intelligence
EP3679850A1 (en) * 2015-03-23 2020-07-15 LG Electronics Inc. -1- Robot cleaner, robot cleaning system having the same
US10716445B2 (en) 2016-01-28 2020-07-21 Pixart Imaging Inc. Automatic clean machine control method and automatic clean machine

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8950792B2 (en) * 2012-03-15 2015-02-10 Irobot Corporation Compliant solid-state bumper for robot
KR20140010685A (en) * 2012-07-16 2014-01-27 삼성전자주식회사 Method and apparatus for moving object in mobile terminal
US9056396B1 (en) * 2013-03-05 2015-06-16 Autofuss Programming of a robotic arm using a motion capture system
KR102061511B1 (en) * 2013-04-26 2020-01-02 삼성전자주식회사 Cleaning robot, home monitoring apparatus and method for controlling the same
KR102094347B1 (en) * 2013-07-29 2020-03-30 삼성전자주식회사 Auto-cleaning system, cleaning robot and controlling method thereof
US10143347B2 (en) * 2013-11-13 2018-12-04 Lg Electronics Inc. Cleaning device and control method therefor
KR102118049B1 (en) * 2013-12-19 2020-06-09 엘지전자 주식회사 robot cleaner, robot cleaner system and a control method of the same
DE102014111217A1 (en) 2014-08-06 2016-02-11 Vorwerk & Co. Interholding Gmbh Floor cleaning device for dry and damp cleaning and method for operating a self-propelled floor cleaning device
US9563201B1 (en) 2014-10-31 2017-02-07 State Farm Mutual Automobile Insurance Company Feedback to facilitate control of unmanned aerial vehicles (UAVs)
KR101659037B1 (en) * 2015-02-16 2016-09-23 엘지전자 주식회사 Robot cleaner, remote controlling system and method of the same
US20160274787A1 (en) * 2015-03-19 2016-09-22 Denso Wave Incorporated Apparatus for operating robots
US10048851B2 (en) 2015-03-19 2018-08-14 Denso Wave Incorporated Apparatus for operating robots
DE102015109775B3 (en) 2015-06-18 2016-09-22 RobArt GmbH Optical triangulation sensor for distance measurement
EP3344104B1 (en) * 2015-09-03 2020-12-30 Aktiebolaget Electrolux System of robotic cleaning devices
DE102015114883A1 (en) 2015-09-04 2017-03-09 RobArt GmbH Identification and localization of a base station of an autonomous mobile robot
US10496262B1 (en) * 2015-09-30 2019-12-03 AI Incorporated Robotic floor-cleaning system manager
DE102015119501A1 (en) 2015-11-11 2017-05-11 RobArt GmbH Subdivision of maps for robot navigation
DE102015119865B4 (en) 2015-11-17 2023-12-21 RobArt GmbH Robot-assisted processing of a surface using a robot
DE102015121666B3 (en) 2015-12-11 2017-05-24 RobArt GmbH Remote control of a mobile, autonomous robot
DE102016102644A1 (en) 2016-02-15 2017-08-17 RobArt GmbH Method for controlling an autonomous mobile robot
JP6846897B2 (en) * 2016-03-24 2021-03-24 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Position indication method, position indicator, self-propelled device and program
FR3050672B1 (en) * 2016-04-29 2018-11-23 Les Companions AUTOMATE FOR TREATING A SURFACE
EP3494446B1 (en) * 2016-08-05 2023-09-13 Robart GmbH Method and apparatus for controlling an autonomous mobile robot
US10112298B2 (en) * 2016-08-09 2018-10-30 International Business Machines Corporation Assigning tasks to a robot device for execution
KR102613624B1 (en) * 2016-10-10 2023-12-15 엘지전자 주식회사 Cleaning robot for airport and method thereof
KR20180097917A (en) * 2017-02-24 2018-09-03 삼성전자주식회사 Electronic apparatus and method for controlling thereof
EP3590014B1 (en) 2017-03-02 2021-11-17 Robart GmbH Method for controlling an autonomous, mobile robot
CN108536135B (en) * 2017-03-06 2023-10-27 苏州宝时得电动工具有限公司 Path control method and device and cleaning robot
JP6885160B2 (en) * 2017-03-31 2021-06-09 カシオ計算機株式会社 Mobile devices, control methods and programs for mobile devices
DE102017007908A1 (en) * 2017-08-21 2019-02-21 Hochschule Bochum Method for controlling the movement of a mobile robot
JP7052652B2 (en) * 2018-09-06 2022-04-12 トヨタ自動車株式会社 Mobile robots, remote terminals, mobile robot control programs, and remote terminal control programs
DE102019204623A1 (en) * 2019-04-02 2020-10-08 BSH Hausgeräte GmbH Controlling a tillage robot
US20210282613A1 (en) * 2020-03-12 2021-09-16 Irobot Corporation Control of autonomous mobile robots
US11731278B1 (en) * 2020-04-20 2023-08-22 Google Llc Robot teleoperation using mobile device motion sensors and web standards
US11737627B2 (en) * 2020-10-03 2023-08-29 Viabot Inc. Methods for setting and programming zoning for use by autonomous modular robots
CN117666402A (en) * 2022-08-30 2024-03-08 北京小米移动软件有限公司 Interaction method, interaction device, electronic equipment and storage medium

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3478824D1 (en) * 1983-10-26 1989-08-03 Automax Kk Control system for mobile robot
JP2815606B2 (en) * 1989-04-25 1998-10-27 株式会社トキメック Control method of concrete floor finishing robot
JP2000029517A (en) * 1998-07-10 2000-01-28 Fuji Heavy Ind Ltd Traveling controller for autonomous traveling vehicle
EP1279081B1 (en) * 2000-05-01 2012-01-04 iRobot Corporation Method and system for remote control of mobile robot
US6457206B1 (en) * 2000-10-20 2002-10-01 Scott H. Judson Remote-controlled vacuum cleaner
AU767561B2 (en) * 2001-04-18 2003-11-13 Samsung Kwangju Electronics Co., Ltd. Robot cleaner, system employing the same and method for reconnecting to external recharging device
US6507773B2 (en) * 2001-06-14 2003-01-14 Sharper Image Corporation Multi-functional robot with remote and video system
US6667592B2 (en) * 2001-08-13 2003-12-23 Intellibot, L.L.C. Mapped robot system
US7113847B2 (en) * 2002-05-07 2006-09-26 Royal Appliance Mfg. Co. Robotic vacuum with removable portable vacuum and semi-automated environment mapping
KR100466321B1 (en) * 2002-10-31 2005-01-14 삼성광주전자 주식회사 Robot cleaner, system thereof and method for controlling the same
KR100468107B1 (en) * 2002-10-31 2005-01-26 삼성광주전자 주식회사 Robot cleaner system having external charging apparatus and method for docking with the same apparatus
JP2004299025A (en) * 2003-04-01 2004-10-28 Honda Motor Co Ltd Mobile robot control device, mobile robot control method and mobile robot control program
JP3841220B2 (en) * 2004-01-30 2006-11-01 船井電機株式会社 Autonomous traveling robot cleaner
US20060020369A1 (en) * 2004-03-11 2006-01-26 Taylor Charles E Robot vacuum cleaner
KR100590549B1 (en) * 2004-03-12 2006-06-19 삼성전자주식회사 Remote control method for robot using 3-dimensional pointing method and robot control system thereof
KR100677252B1 (en) * 2004-09-23 2007-02-02 엘지전자 주식회사 Remote observation system and method in using robot cleaner
US7474945B2 (en) * 2004-12-14 2009-01-06 Honda Motor Company, Ltd. Route generating system for an autonomous mobile robot
JP4266211B2 (en) * 2005-03-23 2009-05-20 株式会社東芝 Robot device, method of moving robot device, and program
US9373149B2 (en) * 2006-03-17 2016-06-21 Fatdoor, Inc. Autonomous neighborhood vehicle commerce network and community
US8108092B2 (en) * 2006-07-14 2012-01-31 Irobot Corporation Autonomous behaviors for a remote vehicle
US8326469B2 (en) * 2006-07-14 2012-12-04 Irobot Corporation Autonomous behaviors for a remote vehicle
KR100821162B1 (en) * 2007-03-30 2008-04-14 삼성전자주식회사 Control method and system of cleaning robot
US8577126B2 (en) * 2007-04-11 2013-11-05 Irobot Corporation System and method for cooperative remote vehicle behavior
US8255092B2 (en) * 2007-05-14 2012-08-28 Irobot Corporation Autonomous behaviors for a remote vehicle
KR100877072B1 (en) * 2007-06-28 2009-01-07 삼성전자주식회사 Method and apparatus of building map for a mobile robot and cleaning simultaneously
KR101372482B1 (en) * 2007-12-11 2014-03-26 삼성전자주식회사 Method and apparatus of path planning for a mobile robot
US20100023185A1 (en) * 2008-07-28 2010-01-28 Torc Technologies, Llc Devices and methods for waypoint target generation and mission spooling for mobile ground robots
US9026315B2 (en) * 2010-10-13 2015-05-05 Deere & Company Apparatus for machine coordination which maintains line-of-site contact
US8989972B2 (en) * 2008-09-11 2015-03-24 Deere & Company Leader-follower fully-autonomous vehicle with operator on side
US8606404B1 (en) * 2009-06-19 2013-12-10 Bissell Homecare, Inc. System and method for controlling a cleaning apparatus
CN102596517B (en) * 2009-07-28 2015-06-17 悠进机器人股份公司 Control method for localization and navigation of mobile robot and mobile robot using same
KR101055567B1 (en) * 2009-09-18 2011-08-08 성균관대학교산학협력단 Intelligent User Interface Device and Control Method for Service Robot
KR101741583B1 (en) * 2009-11-16 2017-05-30 엘지전자 주식회사 Robot cleaner and controlling method thereof
FR2957266B1 (en) * 2010-03-11 2012-04-20 Parrot METHOD AND APPARATUS FOR REMOTE CONTROL OF A DRONE, IN PARTICULAR A ROTATING SAIL DRONE.
JP5526942B2 (en) * 2010-03-31 2014-06-18 ソニー株式会社 Robot apparatus, control method and program for robot apparatus
KR20110119118A (en) * 2010-04-26 2011-11-02 엘지전자 주식회사 Robot cleaner, and remote monitoring system using the same
US9002535B2 (en) * 2010-05-11 2015-04-07 Irobot Corporation Navigation portals for a remote vehicle control user interface
GB2494081B (en) * 2010-05-20 2015-11-11 Irobot Corp Mobile human interface robot
US8918213B2 (en) * 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US9014848B2 (en) * 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
WO2012094349A2 (en) * 2011-01-05 2012-07-12 Orbotix, Inc. Self-propelled device with actively engaged drive system
CN103459099B (en) * 2011-01-28 2015-08-26 英塔茨科技公司 Mutually exchange with a moveable tele-robotic
US20120215380A1 (en) * 2011-02-23 2012-08-23 Microsoft Corporation Semi-autonomous robot that supports multiple modes of navigation
KR101856502B1 (en) * 2011-04-07 2018-05-11 엘지전자 주식회사 Robot cleaner, remote controlling system and method of the same
KR101842460B1 (en) * 2011-04-12 2018-03-27 엘지전자 주식회사 Robot cleaner, and remote monitoring system and method of the same
US20120277914A1 (en) * 2011-04-29 2012-11-01 Microsoft Corporation Autonomous and Semi-Autonomous Modes for Robotic Capture of Images and Videos
US9582000B2 (en) * 2011-09-07 2017-02-28 Lg Electronics Inc. Robot cleaner, and system and method for remotely controlling the same
US9211648B2 (en) * 2012-04-05 2015-12-15 Irobot Corporation Operating a mobile robot
US9675226B2 (en) * 2012-10-26 2017-06-13 Lg Electronics Inc. Robot cleaner system and control method of the same
US8996224B1 (en) * 2013-03-15 2015-03-31 Google Inc. Detecting that an autonomous vehicle is in a stuck condition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002085305A (en) * 2000-09-12 2002-03-26 Toshiba Tec Corp Robot cleaner and robot cleaner system
JP2002321180A (en) * 2001-04-24 2002-11-05 Matsushita Electric Ind Co Ltd Robot control system
JP2006321001A (en) * 2005-05-18 2006-11-30 Matsushita Electric Works Ltd Autonomous moving robot and moving status recording system thereof
KR20100045585A (en) * 2008-10-24 2010-05-04 전자부품연구원 Method for detecting position of robot

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015018355A1 (en) * 2013-08-07 2015-02-12 苏州宝时得电动工具有限公司 Automatic work system, automatic walking device, and control method thereof
US9974422B2 (en) 2013-08-23 2018-05-22 Lg Electronics Inc. Robot cleaner and method for controlling a robot cleaner
CN104414590A (en) * 2013-08-23 2015-03-18 Lg电子株式会社 Robot cleaner and method for controlling a robot cleaner
EP2839769A3 (en) * 2013-08-23 2015-04-15 LG Electronics Inc. Robot cleaner and method for controlling the same
CN104423797B (en) * 2013-09-05 2018-03-30 Lg电子株式会社 Robot cleaner system and its control method
CN104423797A (en) * 2013-09-05 2015-03-18 Lg电子株式会社 Robot cleaner system and control method thereof
KR20150086074A (en) * 2014-01-17 2015-07-27 엘지전자 주식회사 robot cleaner and caring method of human using the same
EP2897015A3 (en) * 2014-01-17 2015-10-14 LG Electronics Inc. Robot cleaner for cleaning and for monitoring of persons who require caring for
CN104783736A (en) * 2014-01-17 2015-07-22 Lg电子株式会社 Robot cleaner and method of performing human care using same
CN104783736B (en) * 2014-01-17 2018-04-10 Lg电子株式会社 Robot cleaner and people's treatment method using the robot cleaner
US9427863B2 (en) 2014-01-17 2016-08-30 Lg Electronics Inc. Robot cleaner and method of caring for human using the same
KR102104896B1 (en) 2014-01-17 2020-05-29 엘지전자 주식회사 robot cleaner and caring method of human using the same
US10291765B2 (en) 2014-05-28 2019-05-14 Samsung Electronics Co., Ltd. Mobile device, robot cleaner, and method for controlling the same
EP3149862A4 (en) * 2014-05-28 2018-03-07 Samsung Electronics Co., Ltd. Mobile device, robot cleaner, and method for controlling the same
CN106413501A (en) * 2014-05-28 2017-02-15 三星电子株式会社 Mobile device, robot cleaner, and method for controlling the same
EP2957206A1 (en) * 2014-06-16 2015-12-23 LG Electronics Inc. Robot cleaner and method for controlling the same
US9582711B2 (en) 2014-06-16 2017-02-28 Lg Electronics Inc. Robot cleaner, apparatus and method for recognizing gesture
JP2016024820A (en) * 2014-07-22 2016-02-08 フォルヴェルク・ウント・ツェーオー、インターホールディング・ゲーエムベーハーVorwerk & Compagnie Interholding Gesellshaft Mit Beschrankter Haftung Method for cleaning or processing room by independent mobile device and independent mobile device
CN105302131A (en) * 2014-07-22 2016-02-03 德国福维克控股公司 Method for cleaning or processing a room using an automatically moved device
EP2977844A1 (en) * 2014-07-22 2016-01-27 Vorwerk & Co. Interholding GmbH Method for cleaning or processing a room using an automatically moved device
CN105302131B (en) * 2014-07-22 2020-03-17 德国福维克控股公司 Method for cleaning or treating a space by means of an autonomously operable device
CN106662876A (en) * 2014-08-18 2017-05-10 东芝生活电器株式会社 Autonomous moving body
KR101968948B1 (en) * 2014-08-18 2019-04-15 도시바 라이프스타일 가부시키가이샤 Autonomous moving body
EP3185095A4 (en) * 2014-08-18 2018-04-18 Toshiba Lifestyle Products & Services Corporation Autonomous moving body
KR20170033374A (en) * 2014-08-18 2017-03-24 도시바 라이프스타일 가부시키가이샤 Autonomous moving body
US10213080B2 (en) 2014-08-27 2019-02-26 Toshiba Lifestyle Products & Services Corporation Autonomous traveling body device
EP3187954A4 (en) * 2014-08-27 2018-04-25 Toshiba Lifestyle Products & Services Corporation Autonomous travel body device
EP3679850A1 (en) * 2015-03-23 2020-07-15 LG Electronics Inc. -1- Robot cleaner, robot cleaning system having the same
EP3679849A1 (en) * 2015-03-23 2020-07-15 LG Electronics Inc. -1- Robot cleaner and robot cleaning system having the same
US10716445B2 (en) 2016-01-28 2020-07-21 Pixart Imaging Inc. Automatic clean machine control method and automatic clean machine
CN107028558A (en) * 2016-02-03 2017-08-11 原相科技股份有限公司 Computer-readable medium storing and automatic sweeping machine
WO2018148874A1 (en) * 2017-02-15 2018-08-23 深圳市前海中康汇融信息技术有限公司 Smart robot based on camera navigation and control method therefor
CN108568819A (en) * 2018-04-20 2018-09-25 郑州科技学院 A kind of intelligent robot autonomous control method based on artificial intelligence

Also Published As

Publication number Publication date
KR101910382B1 (en) 2018-10-22
KR20140102646A (en) 2014-08-22
US9776332B2 (en) 2017-10-03
US20140303775A1 (en) 2014-10-09

Similar Documents

Publication Publication Date Title
WO2013085085A1 (en) Automatic moving apparatus and manual operation method thereof
WO2011059296A2 (en) Robot cleaner and method for controlling same
KR101297255B1 (en) Mobile robot, and system and method for remotely controlling the same
KR101412590B1 (en) Robot cleaner, and system and method for remotely controlling the same
WO2017018848A1 (en) Mobile robot and control method thereof
WO2011062396A9 (en) Robot cleaner and controlling method thereof
US9037296B2 (en) Robot cleaner, and system and method for remotely controlling the same
WO2019124913A1 (en) Robot cleaners and controlling method thereof
WO2010114235A1 (en) Mobile robot with single camera and method for recognizing 3d surroundings of the same
WO2012008703A2 (en) Robot cleaner and controlling method of the same
WO2012008702A2 (en) Robot cleaner and controlling method of the same
WO2012002603A1 (en) Method and apparatus for providing the operation state of an external device
WO2018043957A1 (en) Robot cleaner
KR101378883B1 (en) Robot cleaner, terminal, and system and method for remotely controlling the robot
US20130056032A1 (en) Robot cleaner, and system and method for remotely controlling the same
KR101356161B1 (en) Robot cleaner, and system and method for remotely controlling the same
KR101287474B1 (en) Mobile robot, and system and method for remotely controlling the same
WO2015026099A1 (en) Display device and method of displaying screen on said display device
WO2021117976A1 (en) Charging device
KR101352518B1 (en) Mobile robot, terminal, and system and method for remotely controlling the robot
WO2020153628A1 (en) Robot and control method thereof
EP3478143A1 (en) Robot cleaner
WO2022177131A1 (en) Robot and control method of same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11876970

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14354493

Country of ref document: US

ENP Entry into the national phase

Ref document number: 20147012430

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11876970

Country of ref document: EP

Kind code of ref document: A1