WO2012157951A2 - Method for controlling component for network device - Google Patents

Method for controlling component for network device

Info

Publication number
WO2012157951A2
WO2012157951A2 PCT/KR2012/003830
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning
area
component
remote device
cleaned
Prior art date
Application number
PCT/KR2012/003830
Other languages
French (fr)
Other versions
WO2012157951A3 (en)
Inventor
Donghyun SHIN
Minjin OH
Original Assignee
Lg Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Publication of WO2012157951A2 publication Critical patent/WO2012157951A2/en
Publication of WO2012157951A3 publication Critical patent/WO2012157951A3/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04Automatic control of the travelling movement; Automatic obstacle detection
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/90Additional features
    • G08C2201/93Remote control using other portable devices, e.g. mobile phone, PDA, laptop

Definitions

  • a method for controlling a component for a network device includes: cleaning an area to be cleaned by a component; transmitting cleaning area information to a remote device capable of communicating with the component; receiving additional cleaning area information by the component from the remote device; and cleaning the additional cleaning area by the component.
  • Fig. 3 is a flowchart illustrating a first exemplary use of the network device.
  • the robot cleaner 1 may further include an impact sensor 20.
  • a cleaning-completed area in the area to be cleaned may be stored in the memory unit 16. That is, the cleaning-completed area and cleaning-uncompleted area in the whole area to be cleaned may be differentiated from each other to be stored.
  • the left-wheel motor 17 and the right-wheel motor 18 may be independently operated. Therefore, the robot cleaner 1 may move not only forward and backward, but also leftward and rightward.
  • the remote device 2 may include a control unit 21, an input unit 22 for inputting commands and information, a communication unit 23 capable of communicating with the second communication unit 15, a voice output unit 24 for outputting a voice, a memory unit 25 for storing information, and an information display unit 26 on which information is displayed.
  • a configuration of the remote device 2 is not limited to the embodiment described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)
  • Computer And Data Communications (AREA)

Abstract

A method for controlling a component includes: transmitting map information on a whole area to be cleaned, which is stored in the component, to a remote device; receiving a cleaning sequence for the whole area to be cleaned, from the remote device, by the component; and cleaning the whole area to be cleaned according to the received cleaning sequence by the component.

Description

METHOD FOR CONTROLLING COMPONENT FOR NETWORK DEVICE
The present disclosure relates to a method for controlling a component for a network device.
In general, a robot cleaner is a device for automatically cleaning a certain area desired to be cleaned. A typical robot cleaner has a simple cleaning function, but recently various functions have been added to a robot cleaner.
In addition, a typical robot cleaner communicates just with a charging device for charging a battery, and cannot communicate with other products.
Embodiments provide a method for controlling a component constituting a network device so that the component performs another function in addition to an original function thereof.
In one embodiment, a method for controlling a component for a network device includes: transmitting map information on a whole area to be cleaned, which is stored in the component, to a remote device; receiving a cleaning sequence for the whole area to be cleaned, from the remote device, by the component; and cleaning the whole area to be cleaned according to the received cleaning sequence by the component.
In another embodiment, a method for controlling a component for a network device includes: cleaning an area to be cleaned by a component; transmitting cleaning area information to a remote device capable of communicating with the component; receiving additional cleaning area information by the component from the remote device; and cleaning the additional cleaning area by the component.
In further another embodiment, a method for controlling a component for a network device includes: periodically obtaining images through an imaging device of the component; comparing a currently obtained image with a previously obtained image by the component; and transmitting the currently obtained image and the previously obtained image to a remote device when the currently obtained image is different from the previously obtained image according to a result of the comparing by the component.
According to the proposed invention, since a user may designate a cleaning sequence for the whole area to be cleaned, a robot cleaner, which is an example of a component of a network device, can efficiently perform a cleaning operation. For instance, by designating a shortest movement path, a movement distance of the robot cleaner can be reduced.
Further, when a particular area is selected multiple times, the particular area can be intensively cleaned.
Further, a cleaning state can be easily checked by the user, and, if necessary, an area can be additionally selected so as to perform the cleaning operation again. Thus, a cleaning-completed area in the whole area to be cleaned can be increased.
Further, the user can remotely check a change of a state of the inside of a home. Therefore, when an intruder exists or fire occurs in the home, the user can rapidly cope with the situation.
Fig. 1 is a schematic diagram illustrating a network device according to an embodiment.
Fig. 2 is a block diagram illustrating a structure of the network device according to an embodiment.
Fig. 3 is a flowchart illustrating a first exemplary use of the network device.
Fig. 4 is a flowchart illustrating a second exemplary use of the network device.
Fig. 5 is a flowchart illustrating a third exemplary use of the network device.
Fig. 6 is a flowchart illustrating a fourth exemplary use of the network device.
Reference will now be made in detail to the embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings.
Fig. 1 is a schematic diagram illustrating a network device according to an embodiment, and Fig. 2 is a block diagram illustrating a structure of the network device according to an embodiment.
Referring to Figs. 1 and 2, the network device includes a robot cleaner 1, which is an example of a component and automatically moves to clean, and one or more communication components 2 and 3 capable of communicating with the robot cleaner 1.
The communication components 2 and 3 may be another home appliance 3 or a cell phone other than the robot cleaner 1. In the present embodiment, the cell phone may be called a remote device 2. The other home appliance may be, for example, a washing machine, a refrigerator, a cooker, or an air conditioner and may include a communication unit and an imaging device.
In detail, the robot cleaner 1 may include a control unit 11, an imaging device 12 for obtaining image information, an input unit 13 for inputting a command, a first communication unit 14 for communicating with a charging device to charge a battery (not illustrated), a second communication unit 15 for communicating with the communication components 2 and 3, a memory unit 16 for storing information, a plurality of wheels (not illustrated) for movement, a left-wheel motor 17 and a right-wheel motor 18 for rotating the wheels, and a voice output unit 19 for outputting a voice.
In addition, the robot cleaner 1 may further include an impact sensor 20.
The imaging device 12 may obtain a still image or video, and at least one imaging device may be provided to the robot cleaner 1. In one embodiment, a plurality of imaging devices 12 may be provided to the robot cleaner 1 to obtain image information on various areas. One or more of the imaging devices may be provided to an upper side portion of the robot cleaner 1. Also, one or more of the imaging devices may be provided to a side circumferential portion of the robot cleaner 1.
The robot cleaner 1 may perform a mapping operation for an area to be cleaned by using the imaging device 12. That is, when a mapping command is inputted to the robot cleaner 1, the robot cleaner 1 moves around to obtain an image through the imaging device 12, and maps a cleanable area by using the obtained information. Herein, the mapping command may be inputted through the input unit 13 of the robot cleaner 1 or an input unit 22 of the remote device 2.
The cleanable area is set as the area to be cleaned, and the mapped information is stored in the memory unit 16. And, the robot cleaner 1 uses the mapped information to move and perform a cleaning operation. In the present description, since the mapping process of the robot cleaner 1 is just exemplary and may be implemented using a known technique, a detailed description thereof is omitted.
During a cleaning process, a cleaning-completed area in the area to be cleaned may be stored in the memory unit 16. That is, the cleaning-completed area and cleaning-uncompleted area in the whole area to be cleaned may be differentiated from each other to be stored.
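The separation between cleaning-completed and cleaning-uncompleted areas described above can be sketched as follows; the class and method names and the cell representation are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of storing cleaning-completed cells separately from
# the cleaning-uncompleted remainder of the mapped area.

class CleaningMap:
    def __init__(self, cells):
        self.all_cells = set(cells)   # whole mapped area to be cleaned
        self.completed = set()        # cleaning-completed area

    def mark_cleaned(self, cell):
        # record a cell as cleaned only if it belongs to the mapped area
        if cell in self.all_cells:
            self.completed.add(cell)

    def uncompleted(self):
        # cleaning-uncompleted area, kept distinct from the completed one
        return self.all_cells - self.completed

m = CleaningMap([(0, 0), (0, 1), (1, 0), (1, 1)])
m.mark_cleaned((0, 0))
m.mark_cleaned((1, 1))
print(sorted(m.uncompleted()))  # → [(0, 1), (1, 0)]
```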
Through the input units 13 and 22, a cleaning start command or cleaning scheduling command for the robot cleaner 1 may be inputted. Also, through at least one of the input units 13 and 22, a cleaning pattern (moving pattern) of the robot cleaner may be set. The cleaning pattern may include a zigzag pattern, a random pattern, a cell basis pattern (i.e., after completing cleaning of a certain cell, cleaning operation is performed to a next cell), and a user-customized pattern.
The left-wheel motor 17 and the right-wheel motor 18 may be independently operated. Therefore, the robot cleaner 1 may move not only forward and backward, but also leftward and rightward.
The voice output unit 19 may be, e.g., a speaker. Operation information on the robot cleaner 1 may be outputted as a voice through the voice output unit 19. Also, the voice output unit 19 may output information received from the outside as a voice. For instance, when external text information is received by the robot cleaner 1, the text information is converted into a voice and is outputted through the voice output unit 19.
The impact sensor 20 may detect an amount of impact on the robot cleaner 1. Herein, when the amount of impact detected by the impact sensor 20 is greater than a reference amount, the control unit 11 may control the imaging device 12 so that an image is obtained by the imaging device 12. And, when the image is obtained by the imaging device 12 in response to the detected amount of impact greater than the reference amount, the obtained image may be transmitted to the remote device 2.
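The impact-triggered capture described above amounts to a simple threshold check; the sketch below is a hypothetical illustration, where the reference value, units, and callback names are assumptions rather than details from the patent.

```python
# Hypothetical threshold check: capture and transmit an image only when
# the detected impact exceeds the reference amount.

REFERENCE_IMPACT = 5.0  # assumed to exceed impacts from normal cleaning

def on_impact(amount, capture_image, send_to_remote):
    """Return True when an image was captured and sent to the remote device."""
    if amount > REFERENCE_IMPACT:
        send_to_remote(capture_image())
        return True
    return False

sent = []
on_impact(9.0, lambda: "snapshot", sent.append)  # above reference: transmits
on_impact(1.0, lambda: "snapshot", sent.append)  # normal collision: ignored
print(sent)  # → ['snapshot']
```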
Meanwhile, the remote device 2 may include a control unit 21, an input unit 22 for inputting commands and information, a communication unit 23 capable of communicating with the second communication unit 15, a voice output unit 24 for outputting a voice, a memory unit 25 for storing information, and an information display unit 26 on which information is displayed. In the present description, provided that the robot cleaner 1 and the remote device 2 communicate with each other, a configuration of the remote device 2 is not limited to the embodiment described above.
Through the input unit 22, an operation command for the robot cleaner 1 may be inputted. The inputted command may be transmitted to the robot cleaner 1 through the communication unit 23.
In the present embodiment, the remote device 2 and the robot cleaner 1 may wirelessly communicate with each other. As an example, the remote device 2 and the robot cleaner 1 may communicate with each other using a ZigBee, Wi-Fi, or Bluetooth module. However, a communication method for the remote device 2 and the robot cleaner 1 is not limited to the embodiment described above.
The operation information on the robot cleaner 1, the image information obtained by the imaging device 12 of the robot cleaner 1, the map information on the area to be cleaned, and the like may be displayed on the information display unit 26.
Hereinafter, exemplary uses of the network device will be described.
Fig. 3 is a flowchart illustrating a first exemplary use of the network device.
Referring to Fig. 3, to perform the cleaning operation using the robot cleaner 1, a map information request command is inputted through the remote device 2, and the inputted command may be transmitted to the robot cleaner 1. Then, the map information on the whole area to be cleaned, which is stored in the memory unit 16 of the robot cleaner 1, is transmitted to the remote device 2 in operation S1. The map information transmitted to the remote device 2 may be displayed on the information display unit 26 of the remote device 2.
Next, a user may designate a cleaning sequence for the map information displayed on the information display unit 26. As an example, the map displayed on the information display unit 26 may be divided into a plurality of areas, and then, a cleaning sequence for the areas may be set. Herein, the user may select a particular area multiple times to intensively clean the particular area. Herein, when the particular area is cleaned multiple times, a time for cleaning the particular area may be lengthened in comparison with other areas, or the particular area may be cleaned again after completing cleaning of the particular area and other areas.
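One way to interpret the multiple-selection behavior above is sketched below: a multiply selected area either gets a lengthened cleaning time or an extra pass after the first round over all areas. The function name, area labels, and base time are illustrative assumptions.

```python
from collections import Counter

# Hypothetical planner for a user's ordered area selection, covering the
# patent's two options for a multiply selected area: a lengthened cleaning
# time, or cleaning it again after the other areas are done.

def plan_from_selection(selection, base_time=60):
    counts = Counter(selection)
    first_pass = list(dict.fromkeys(selection))            # each area once, in order
    extra_passes = [a for a, n in counts.items()
                    for _ in range(n - 1)]                 # repeat-cleaning option
    times = {a: base_time * n for a, n in counts.items()}  # lengthened-time option
    return first_pass + extra_passes, times

order, times = plan_from_selection(["kitchen", "living room", "kitchen", "hall"])
print(order)             # → ['kitchen', 'living room', 'hall', 'kitchen']
print(times["kitchen"])  # → 120
```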
As another example, after designating a particular point on the map displayed on the information display unit 26, the cleaning sequence may be set by dragging the point. That is, the cleaning pattern (moving pattern) may be set by the user.
After designating the cleaning sequence for the whole area to be cleaned, the cleaning start command is inputted from the remote device 2 to the robot cleaner 1.
Then, the robot cleaner 1 moves around based on the designated cleaning sequence to thereby perform the cleaning operation in operation S3. While the cleaning operation is performed by the robot cleaner 1, in the case where the amount of impact detected by the impact sensor 20 is greater than the reference amount, the image information (still image or video) obtained by the imaging device 12 may be transmitted to the remote device 2. The reference amount may be set greater than an amount of impact that could possibly occur while the robot cleaner 1 normally performs the cleaning operation.
When the robot cleaner 1 moves, an obstacle may be avoided by using the imaging device 12 or an additionally provided obstacle sensor. Therefore, even when the robot cleaner 1 collides with an obstacle, the amount of impact detected by the impact sensor 20 is smaller than the reference amount.
When an external force is applied to the robot cleaner 1, or when an obstacle appears too abruptly for the robot cleaner 1 to detect it, the amount of impact may be greater than the reference amount. An amount of impact greater than the reference amount may indicate an environment different from a normal environment for the robot cleaner 1. Therefore, the imaging device 12 obtains an image, and the obtained image is transmitted to the remote device 2 so that the user may monitor the image, which may show the inside of the home.
According to the present exemplary use, since the user may designate the cleaning sequence for the whole area to be cleaned, the robot cleaner may efficiently perform the cleaning operation. For instance, when a shortest movement path is designated, a movement distance of the robot cleaner may be reduced. As another example, when a particular area is selected multiple times, the particular area may be intensively cleaned.
Fig. 4 is a flowchart illustrating a second exemplary use of the network device.
Referring to Fig. 4, when a cleaning command is inputted to the robot cleaner 1, the robot cleaner starts cleaning according to a set cleaning pattern in operation S11. The cleaning command may be inputted through the robot cleaner 1 or the remote device 2. While the robot cleaner 1 performs the cleaning operation, cleaning-completed area information may be stored.
And, when the robot cleaner 1 determines that the cleaning is completed, the cleaning-completed area information is transmitted from the robot cleaner 1 to the remote device 2.
Herein, the robot cleaner 1 may transmit information, in which the cleaning-completed area and cleaning-uncompleted area in the whole area to be cleaned are differentiated from each other, to the remote device 2. Then, the cleaning-completed area and cleaning-uncompleted area may be distinguishably displayed on the information display unit 26 of the remote device 2.
For instance, the robot cleaner 1 may transmit information, in which the cleaning-completed area and cleaning-uncompleted area are differentiated from each other using a line or plane, to the remote device 2. Or, the robot cleaner 1 may transmit information, in which a color (or brightness) of the whole area to be cleaned is different from that of the cleaning completed area, to the remote device 2.
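The color-based differentiation above might be transmitted as a simple cell-to-color mapping, as in this hypothetical sketch; the function name, color values, and data layout are assumptions, not part of the patent.

```python
# Hypothetical encoding of the information the cleaner might transmit so
# that the remote device can display cleaning-completed and
# cleaning-uncompleted areas in different colors.

def encode_for_display(all_cells, completed,
                       done_color="green", todo_color="gray"):
    # map every cell of the whole area to a display color
    return {cell: (done_color if cell in completed else todo_color)
            for cell in all_cells}

view = encode_for_display({"entrance", "living room"}, {"entrance"})
print(view["entrance"], view["living room"])  # → green gray
```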
The user may check the cleaning-completed area to determine an area that needs to be additionally cleaned, and the area that needs to be additionally cleaned (movement destination area) may be selected by using the remote device 2. That is, the user may designate an area to be cleaned.
When the movement destination area is selected, the robot cleaner 1 moves to the selected movement destination area and performs the cleaning operation in operation S14.
According to the second exemplary use, a cleaning state may be easily checked by the user, and, if necessary, an area may be additionally selected so as to perform the cleaning operation again. Thus, the cleaning-completed area in the whole area to be cleaned may increase.
Fig. 5 is a flowchart illustrating a third exemplary use of the network device.
Referring to Fig. 5, an absence mode setting command may be inputted to the robot cleaner 1 in operation S21. The absence mode setting command may be inputted through the input unit of the robot cleaner 1 or the remote device 2. The absence mode may be called an automatic monitoring mode.
When the absence mode setting command is inputted, the robot cleaner 1 moves to a set location and then stops in operation S22. The user may designate the set location on the map of the area to be cleaned, and the set location may be changed. As an example, the set location may be such a location as to obtain a great amount of information on the inside of a home through the imaging device 12 of the robot cleaner 1. As an example, the set location may be a center portion of a living room. Or, the set location may be such a location as to easily detect an intruder entering the home. For instance, the set location may be such a location as to watch an entrance door.
After the robot cleaner 1 moves to the set location, an image is obtained periodically (on a second basis or minute basis) or intermittently in operation S23. And, the control unit 11 of the robot cleaner 1 compares a currently obtained image to a previously obtained image. And, the control unit 11 determines whether the currently obtained image is different from the previously obtained image in operation S24.
When there is a difference between the currently obtained image and the previously obtained image, the component 1 may transmit image difference notification information to the remote device 2 in operation S25. Also, the component 1 may transmit the currently obtained image and the previously obtained image to the remote device 2 in operation S25. Herein, the component 1 may separately transmit the currently obtained image and the previously obtained image to the remote device 2, or may transmit information, in which the currently obtained image and the previously obtained image are overlapped and a difference between the two images is marked, to the remote device.
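One iteration of the absence-mode loop in operations S23 through S25 could be sketched as below; the capture, comparison, and notification callbacks are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical single iteration of the absence-mode monitoring loop:
# capture an image, compare it with the previous one, and notify the
# remote device when a difference is found.

def monitor_step(capture, previous, differ, notify):
    current = capture()
    if previous is not None and differ(current, previous):
        notify(current, previous)  # could also send an overlapped, marked image
    return current  # becomes "previous" for the next iteration

events = []
notify = lambda c, p: events.append((c, p))
differ = lambda a, b: a != b
prev = monitor_step(lambda: "scene A", None, differ, notify)  # first image: no compare
prev = monitor_step(lambda: "scene A", prev, differ, notify)  # unchanged: no notify
prev = monitor_step(lambda: "scene B", prev, differ, notify)  # changed: notify
print(events)  # → [('scene B', 'scene A')]
```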
According to setting of the robot cleaner, when there is a difference between the currently obtained image and the previously obtained image, the voice output unit 19 of the robot cleaner 1 may generate an alarming voice. Also, when a particular text or voice command is inputted through the remote device 2, the inputted information is transmitted to the robot cleaner 1 so that a voice is outputted from the voice output unit 19 of the robot cleaner 1.
Or, when a text or voice information is inputted through the remote device 2, the inputted information is transmitted to the other home appliance 3 so that a voice is outputted from the other home appliance 3.
The user may easily check a state of the inside of a home by checking the image difference. Therefore, when an intruder exists or fire occurs in the home, the user may rapidly contact a police station or fire station.
In the case of the above-described example, a plurality of set locations may be provided. Therefore, after the image comparison operation is performed at one location, the image comparison operation is performed at another location. The movement among the set locations may be repeated.
As another example, when it is difficult to check the image difference through the remote device 2, an image recapturing command may be inputted to the robot cleaner 1 and/or the other home appliance 3 so that additional image information is transmitted to the remote device 2 and the user may check the additional image information.
As another example, when the amount of impact detected by the impact sensor is greater than the reference amount even though there is no difference between the currently obtained image and the previously obtained image, the imaging device 12 may obtain an image and the obtained image may be transmitted to the remote device 2.
Fig. 6 is a flowchart illustrating a fourth exemplary use of the network device.
Referring to Fig. 6, a manual mode selection command is inputted to the robot cleaner 1 in operation S31. As an example, the manual mode may be selected to monitor the inside of a home by the user, and may be called a manual monitoring mode.
The manual mode selection command may be inputted through the remote device 2. When the manual mode selection command is inputted, the imaging device 12 of the robot cleaner 1 is activated so that image information is obtained by the imaging device 12. In the present exemplary use, video information may be obtained by the imaging device 12. Further, the obtained video information may be transmitted to the remote device 2 in real time in operation S32. Therefore, the inside of a home where the robot cleaner 1 is located may be monitored in real time via the information display unit 26 of the remote device 2.
In the case where the user needs to monitor the overall inside of a home, the robot cleaner 1 may be manually moved by using the remote device 2.
For instance, forward, backward, left and right moving units may be activated according to a movement command inputted through the input unit 22 of the remote device 2 so that the robot cleaner 1 moves in operation S33. Herein, a moving speed of the robot cleaner 1 may be selected using the input unit 22, and an image capturing angle of the imaging device 12 may also be adjusted. Although not illustrated, a driving unit may be provided to the robot cleaner 1 to horizontally or vertically rotate the imaging device 12, thereby adjusting the image capturing angle. As another example, since the map information on the area to be cleaned may be displayed on the remote device 2, a particular location on the map may be designated so that the robot cleaner 1 moves thereto.
Therefore, according to the present exemplary use, the user may monitor the inside of a home by moving the robot cleaner by using the input unit of the remote device.
According to the proposed exemplary use, since a user may designate a cleaning sequence for the whole area to be cleaned, a robot cleaner, which is an example of a component of a network device, can efficiently perform a cleaning operation. For instance, by designating a shortest movement path, a movement distance of the robot cleaner can be reduced.
Further, when a particular area is selected multiple times, the particular area can be intensively cleaned.
Further, a cleaning state can be easily checked by the user, and, if necessary, an area can be additionally selected so as to perform the cleaning operation again. Thus, a cleaning-completed area in the whole area to be cleaned can be increased.
Further, the user can remotely check a change of a state of the inside of a home. Therefore, when an intruder exists or fire occurs in the home, the user can rapidly cope with the situation.
Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.

Claims (20)

  1. A method for controlling a component for a network device, the method comprising:
    transmitting map information on a whole area to be cleaned, which is stored in the component, to a remote device;
    receiving a cleaning sequence for the whole area to be cleaned, from the remote device, by the component; and
    cleaning the whole area to be cleaned according to the received cleaning sequence by the component.
  2. The method according to claim 1, further comprising receiving an input by the remote device to set the cleaning sequence.
  3. The method according to claim 1, wherein the whole area to be cleaned is divided into a plurality of areas, and cleaning the whole area comprises:
    cleaning a particular area among the plurality of areas; and
    moving to a next area to clean the next area.
  4. The method according to claim 1, wherein the whole area to be cleaned is divided into a plurality of areas, and cleaning the whole area comprises:
    cleaning a particular area from the plurality of areas multiple times.
  5. The method according to claim 4, wherein cleaning the particular area multiple times comprises cleaning the particular area again after the plurality of areas are cleaned.
  6. The method according to claim 1, wherein the whole area to be cleaned is divided into a plurality of areas, and cleaning the whole area comprises cleaning a particular area among the plurality of areas for a time period that is longer than that of at least one other area among the plurality of areas.
  7. The method according to claim 1, further comprising:
    detecting an amount of impact during the cleaning by the component; and
    obtaining image information by an imaging device of the component when the detected amount of impact is greater than a reference amount.
  8. The method according to claim 7, further comprising: transmitting the image information obtained by the imaging device of the component to the remote device.
  9. A method for controlling a component for a network device, the method comprising:
    cleaning an area to be cleaned by the component;
    transmitting cleaning area information to a remote device capable of communicating with the component;
    receiving additional cleaning area information by the component from the remote device; and
    cleaning the additional cleaning area by the component.
  10. The method according to claim 9, wherein transmitting cleaning area information comprises transmitting cleaning-completed area information when the component determines that the cleaning is completed.
  11. The method according to claim 9, wherein transmitting cleaning area information comprises transmitting a cleaning-completed area and a cleaning-uncompleted area in the whole area to be cleaned, which are differentiated from each other, to the remote device.
  12. The method according to claim 9, wherein transmitting cleaning area information comprises transmitting the cleaning-completed area in the whole area to be cleaned that is marked with a line or plane, to the remote device.
  13. The method according to claim 9, wherein transmitting cleaning area information comprises transmitting a cleaning-completed area that is differentiated from a cleaning-uncompleted area in the whole area to be cleaned by using a color or brightness, to the remote device.
  14. A method for controlling a component for a network device, the method comprising:
    periodically obtaining images through an imaging device of the component;
    comparing a currently obtained image with a previously obtained image by the component; and
    transmitting the currently obtained image and the previously obtained image to a remote device when the currently obtained image is different from the previously obtained image according to a result of the comparing by the component.
  15. The method according to claim 14, further comprising moving to a preset location by the component to obtain an image.
  16. The method according to claim 14, further comprising moving to a location by the component after receiving a moving command from the remote device.
  17. The method according to claim 14, further comprising: detecting an amount of impact by the component;
    obtaining an image when the detected amount of impact is greater than a reference amount by the imaging device of the component; and
    transmitting the obtained image by the component to the remote device.
  18. The method according to claim 14, further comprising generating an alarm by the component when the currently obtained image is different from the previously obtained image according to a result of the comparing by the component.
  19. The method according to claim 14, further comprising:
    receiving information by the component from the remote device; and
    converting the received information into a voice to output the voice by the component.
  20. The method according to claim 14, further comprising:
    transmitting an image capturing command to a home appliance by a remote device; and
    receiving by the remote device an image captured by the home appliance.
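Claims 14 through 18 describe a monitoring loop: periodically obtain an image, compare it with the previous one, and, on a difference, transmit both images and raise an alarm. The sketch below illustrates one period of that loop under stated assumptions; image capture, transmission, and the alarm are stubbed out, and the byte-equality comparison stands in for whatever image comparison the component actually performs.

```python
# Hedged sketch of the monitoring flow in claims 14-18 (illustrative only).
def monitor_step(capture, transmit, alarm, state):
    """One period: obtain an image, compare with the previous one (claim 14),
    transmit both and raise an alarm when they differ (claims 14 and 18)."""
    current = capture()
    previous = state.get("previous")
    if previous is not None and current != previous:
        transmit(current, previous)  # send both images to the remote device
        alarm()                      # generate an alarm on the component
    state["previous"] = current
    return state

# Illustrative run with canned "images" (plain byte strings).
frames = iter([b"room-empty", b"room-empty", b"room-intruder"])
sent, alarms = [], []
state = {}
for _ in range(3):
    state = monitor_step(lambda: next(frames),
                         lambda cur, prev: sent.append((cur, prev)),
                         lambda: alarms.append("alarm"),
                         state)
```

Only the third frame differs from its predecessor, so exactly one transmission and one alarm occur; claim 17's impact-triggered capture would reuse the same `capture`/`transmit` stubs with a threshold check in place of the image comparison.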
PCT/KR2012/003830 2011-05-17 2012-05-16 Method for controlling component for network device WO2012157951A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020110046476A KR101760950B1 (en) 2011-05-17 2011-05-17 Controlling mehtod of network system
KR10-2011-0046476 2011-05-17

Publications (2)

Publication Number Publication Date
WO2012157951A2 true WO2012157951A2 (en) 2012-11-22
WO2012157951A3 WO2012157951A3 (en) 2013-01-24

Family

ID=47177481

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/003830 WO2012157951A2 (en) 2011-05-17 2012-05-16 Method for controlling component for network device

Country Status (2)

Country Link
KR (1) KR101760950B1 (en)
WO (1) WO2012157951A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2977844A1 (en) * 2014-07-22 2016-01-27 Vorwerk & Co. Interholding GmbH Method for cleaning or processing a room using an automatically moved device
AU2015203001B2 (en) * 2014-07-23 2016-05-26 Lg Electronics Inc. Robot cleaner and method for controlling the same
EP3091888A4 (en) * 2014-01-10 2017-10-11 Diversey, Inc. Cleaning apparatus data management system and method
EP3184014A4 (en) * 2014-08-19 2017-11-08 Samsung Electronics Co., Ltd. Cleaning robot, and control apparatus, control system and control method for cleaning robot
CN111093447A (en) * 2017-09-26 2020-05-01 伊莱克斯公司 Movement control of a robotic cleaning device
CN111685672A (en) * 2019-03-12 2020-09-22 德国福维克控股公司 Ground processing equipment and system composed of ground processing equipment and external terminal equipment
US10860029B2 (en) 2016-02-15 2020-12-08 RobArt GmbH Method for controlling an autonomous mobile robot
US11175670B2 (en) 2015-11-17 2021-11-16 RobArt GmbH Robot-assisted processing of a surface using a robot
US11188086B2 (en) 2015-09-04 2021-11-30 RobArtGmbH Identification and localization of a base station of an autonomous mobile robot
US11550054B2 (en) 2015-06-18 2023-01-10 RobArtGmbH Optical triangulation sensor for distance measurement
US11709489B2 (en) 2017-03-02 2023-07-25 RobArt GmbH Method for controlling an autonomous, mobile robot
US11768494B2 (en) 2015-11-11 2023-09-26 RobArt GmbH Subdivision of maps for robot navigation
US11789447B2 (en) 2015-12-11 2023-10-17 RobArt GmbH Remote control of an autonomous mobile robot

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
KR102082754B1 (en) * 2013-07-11 2020-04-16 삼성전자주식회사 Cleaning robot and method for controlling the same

Citations (3)

Publication number Priority date Publication date Assignee Title
US20040204792A1 (en) * 2003-03-14 2004-10-14 Taylor Charles E. Robotic vacuum with localized cleaning algorithm
US6888565B1 (en) * 1999-08-31 2005-05-03 Canon Kabushiki Kaisha Apparatus and method for remote-controlling image sensing apparatus in image sensing system
US20060178777A1 (en) * 2005-02-04 2006-08-10 Samsung Electronics Co., Ltd. Home network system and control method thereof

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8239992B2 (en) * 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
US6888565B1 (en) * 1999-08-31 2005-05-03 Canon Kabushiki Kaisha Apparatus and method for remote-controlling image sensing apparatus in image sensing system
US20040204792A1 (en) * 2003-03-14 2004-10-14 Taylor Charles E. Robotic vacuum with localized cleaning algorithm
US20060178777A1 (en) * 2005-02-04 2006-08-10 Samsung Electronics Co., Ltd. Home network system and control method thereof

Cited By (20)

Publication number Priority date Publication date Assignee Title
EP3091888A4 (en) * 2014-01-10 2017-10-11 Diversey, Inc. Cleaning apparatus data management system and method
US10362913B2 (en) 2014-01-10 2019-07-30 Diversey, Inc. Cleaning apparatus data management system and method
EP2977844A1 (en) * 2014-07-22 2016-01-27 Vorwerk & Co. Interholding GmbH Method for cleaning or processing a room using an automatically moved device
AU2015203001B2 (en) * 2014-07-23 2016-05-26 Lg Electronics Inc. Robot cleaner and method for controlling the same
US9782050B2 (en) 2014-07-23 2017-10-10 Lg Electronics Inc. Robot cleaner and method for controlling the same
EP3184014A4 (en) * 2014-08-19 2017-11-08 Samsung Electronics Co., Ltd. Cleaning robot, and control apparatus, control system and control method for cleaning robot
US10365659B2 (en) 2014-08-19 2019-07-30 Sasmung Electronics Co., Ltd. Robot cleaner, control apparatus, control system, and control method of robot cleaner
EP3970590A1 (en) * 2014-08-19 2022-03-23 Samsung Electronics Co., Ltd. Method and system for controlling a robot cleaner
US11550054B2 (en) 2015-06-18 2023-01-10 RobArtGmbH Optical triangulation sensor for distance measurement
US11188086B2 (en) 2015-09-04 2021-11-30 RobArtGmbH Identification and localization of a base station of an autonomous mobile robot
US11768494B2 (en) 2015-11-11 2023-09-26 RobArt GmbH Subdivision of maps for robot navigation
US11175670B2 (en) 2015-11-17 2021-11-16 RobArt GmbH Robot-assisted processing of a surface using a robot
US11789447B2 (en) 2015-12-11 2023-10-17 RobArt GmbH Remote control of an autonomous mobile robot
US10860029B2 (en) 2016-02-15 2020-12-08 RobArt GmbH Method for controlling an autonomous mobile robot
US11709497B2 (en) 2016-02-15 2023-07-25 RobArt GmbH Method for controlling an autonomous mobile robot
US11709489B2 (en) 2017-03-02 2023-07-25 RobArt GmbH Method for controlling an autonomous, mobile robot
CN111093447A (en) * 2017-09-26 2020-05-01 伊莱克斯公司 Movement control of a robotic cleaning device
US11921517B2 (en) 2017-09-26 2024-03-05 Aktiebolaget Electrolux Controlling movement of a robotic cleaning device
CN111685672A (en) * 2019-03-12 2020-09-22 德国福维克控股公司 Ground processing equipment and system composed of ground processing equipment and external terminal equipment
CN111685672B (en) * 2019-03-12 2023-02-21 德国福维克控股公司 Ground processing equipment and system composed of ground processing equipment and external terminal equipment

Also Published As

Publication number Publication date
KR20120128485A (en) 2012-11-27
KR101760950B1 (en) 2017-07-24
WO2012157951A3 (en) 2013-01-24

Similar Documents

Publication Publication Date Title
WO2012157951A2 (en) Method for controlling component for network device
WO2016182374A1 (en) Remote control method and device using wearable device
WO2016114463A1 (en) Driving robot, and charging station docking method for the driving robot
WO2013133486A1 (en) Device, method and timeline user interface for controlling home devices
US9742582B2 (en) House monitoring system
US10764540B2 (en) Monitoring system
US9852608B2 (en) Monitoring system
CN102833416A (en) Method of mobile phone flipping over for mute control and mobile phone
WO2011062396A9 (en) Robot cleaner and controlling method thereof
JP2015204012A (en) Monitor camera system
WO2016003227A1 (en) Method for controlling washing machine
WO2016079887A1 (en) Monitoring system
WO2020241933A1 (en) Master robot controlling slave robot and method of operating same
US9485111B2 (en) Monitoring system
WO2014092259A1 (en) Home network system having wirelessly rechargeable digital door lock
JP7145691B2 (en) Elevator, information equipment, elevator system, elevator recovery method, and image display program
WO2020045700A1 (en) Docking station apparatus for smart moveable apparatus for companion animal care, and system comprising same
KR101311951B1 (en) Fire detecing and alarm system
WO2020180108A1 (en) Mobile robot and method for controlling same
WO2013180317A1 (en) Apparatus for remotely controlling ip camera using touch input of mobile terminal
CN202837955U (en) On-line monitoring and dynamic demonstration system for carrying roller automatic production line
WO2016010369A1 (en) Action recognition toy
WO2020262712A1 (en) Image display method and mobile robot for implementing same
KR20060061962A (en) Apparatus for displaying external video signal in home automation video phone
WO2009022238A2 (en) Utility outlets capable of presenting images

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12786669

Country of ref document: EP

Kind code of ref document: A2