WO2024043488A1 - Robot for managing an electronic device and control method therefor


Info

Publication number
WO2024043488A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
robot
information
identified
confirmation signal
Application number
PCT/KR2023/009125
Other languages
English (en)
Korean (ko)
Inventor
황지웅
최예은
이승준
한훈
강현주
공진아
조용희
Original Assignee
삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Application filed by 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2024043488A1

Classifications

    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J13/00 Controls for manipulators
    • B25J13/003 Controls for manipulators by means of an audio-responsive input
    • B25J13/006 Controls for manipulators by means of a wireless system for controlling one or several manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/026 Acoustical sensing devices
    • G16Y20/00 Information sensed or collected by the things
    • G16Y20/10 Information sensed or collected by the things relating to the environment, e.g. temperature; relating to location
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation

Definitions

  • This disclosure relates to a robot that manages electronic devices placed in a certain space and a method of controlling the same.
  • Electronic devices may include smartphones, tablet PCs, desktop computers, laptop computers, audio devices, speakers, TVs, microwave ovens, and the like.
  • Existing home appliances such as refrigerators, washing machines, and dryers are also evolving to include communication functions, voice recognition functions, and gesture recognition functions.
  • An electronic device that includes a communication function can connect to another electronic device or a server through a network and exchange various information with the other electronic device or server.
  • An electronic device that includes a voice recognition function or a gesture recognition function may recognize a user's voice or gesture and perform an operation corresponding to the recognized voice or gesture.
  • Robots are used not only for industrial purposes such as product assembly and inspection, but also widely in daily life, performing functions such as serving, delivery, security, and management.
  • A robot includes a camera, a communication interface, a memory storing 3D information corresponding to a specific space, and one or more processors.
  • When an electronic device is identified in an image acquired while the robot travels in the specific space, the one or more processors may obtain identification information of the identified electronic device.
  • The one or more processors may transmit an operation confirmation signal corresponding to the type of the identified electronic device to the electronic device through the communication interface.
  • When the operation of the electronic device according to the operation confirmation signal is identified, the one or more processors may identify location information of the electronic device within the specific space.
  • The one or more processors may update the 3D information by mapping the electronic device to the specific space based on the identification information of the electronic device and the location information of the electronic device.
  • When an electronic device is identified in an image acquired while the robot travels in a specific space, a robot control method may obtain identification information of the identified electronic device.
  • The control method may transmit an operation confirmation signal corresponding to the type of the identified electronic device to the electronic device.
  • When the operation of the electronic device according to the operation confirmation signal is identified, the control method may identify location information of the electronic device within the specific space.
  • The control method may update stored 3D information by mapping the electronic device to the specific space based on the identification information of the electronic device and the location information of the electronic device.
  • A program recorded on a non-transitory computer-readable storage medium, which performs a method for controlling a robot, may obtain identification information of an identified electronic device when the electronic device is identified in an image acquired while the robot travels in a specific space.
  • The program may transmit an operation confirmation signal corresponding to the type of the identified electronic device to the electronic device.
  • When the operation of the electronic device according to the operation confirmation signal is identified, the program can identify location information of the electronic device within the specific space.
  • The program may update the stored 3D information by mapping the electronic device to the specific space based on the identification information of the electronic device and the location information of the electronic device.
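Taken together, the claimed control method reduces to a short identify–confirm–locate–map loop. The following Python sketch is purely illustrative of that flow; every name in it (identify, send_confirmation, and so on) is a hypothetical stand-in, not an API from the disclosure.

```python
# Illustrative sketch of the claimed control flow; all function names are
# hypothetical stand-ins for the robot's camera, communication, and mapping logic.

def control_method(frames, identify, send_confirmation, locate, space_map3d):
    """Identify devices in captured frames, confirm them, and map them in 3D."""
    for frame in frames:                                # images acquired while traveling
        for device in frame["devices"]:                 # electronic devices seen in the image
            ident = identify(device)                    # obtain identification information
            confirmed = send_confirmation(device, ident["type"])  # operation confirmation signal
            if confirmed:                               # operation according to the signal identified
                location = locate(device)               # location within the specific space
                space_map3d[ident["name"]] = location   # update stored 3D information

# Canned example data standing in for real sensing and communication:
space_map3d = {}
control_method(
    [{"devices": ["lamp-1"]}],
    identify=lambda d: {"name": d, "type": "light"},
    send_confirmation=lambda d, t: True,
    locate=lambda d: {"area": "room", "surface": "ceiling", "coord": (3, 5)},
    space_map3d=space_map3d,
)
print(space_map3d)   # {'lamp-1': {'area': 'room', 'surface': 'ceiling', 'coord': (3, 5)}}
```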
  • FIG. 1 is a diagram illustrating a robot system according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram explaining the configuration of a robot according to an embodiment of the present disclosure.
  • FIG. 3 is a block diagram explaining the specific configuration of a robot according to an embodiment of the present disclosure.
  • FIGS. 4 to 6 are diagrams illustrating a process for identifying an electronic device according to an embodiment of the present disclosure.
  • FIG. 7 is a diagram illustrating a process of mapping an identified electronic device to 3D information according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram illustrating a process for simultaneously identifying a plurality of electronic devices according to an embodiment of the present disclosure.
  • FIG. 9 is a timing diagram explaining the operation of a robot system that identifies an electronic device according to an embodiment of the present disclosure.
  • FIG. 10 is a diagram illustrating a process for identifying a moving electronic device according to an embodiment of the present disclosure.
  • FIGS. 11 and 12 are diagrams illustrating a process for confirming the influence of surrounding objects according to an embodiment of the present disclosure.
  • FIG. 13 is a timing diagram explaining the operation of a robot system that identifies a moving electronic device according to an embodiment of the present disclosure.
  • FIG. 14 is a diagram explaining a robot that performs a user command according to an embodiment of the present disclosure.
  • FIGS. 15 and 16 are diagrams illustrating a terminal device executing a user command according to an embodiment of the present disclosure.
  • FIG. 17 is a flowchart explaining a method of controlling a robot according to an embodiment of the present disclosure.
  • A “module” or “unit” for a component used in this specification performs at least one function or operation. A “module” or “unit” may perform a function or operation by hardware, software, or a combination of hardware and software. Additionally, a plurality of “modules” or “units”, except for a “module” or “unit” that must be implemented on specific hardware, may be integrated into at least one module and implemented on at least one processor. Singular expressions include plural expressions unless the context clearly dictates otherwise.
  • The order of steps should be understood as non-limiting unless a preceding step must be performed logically and temporally before a subsequent step. In other words, except for such cases, performing a process described as a subsequent step before a process described as a preceding step does not affect the nature of the disclosure, and the scope of rights should likewise be defined regardless of the order of the steps.
  • The term “A or B” not only selectively indicates either A or B, but is also defined to include both A and B.
  • The term “include” means that additional components may be included in addition to the elements listed as included.
  • Each embodiment may be implemented or operated independently, but embodiments may also be implemented or operated in combination.
  • FIG. 1 is a diagram illustrating a robot system according to an embodiment of the present disclosure.
  • The robot system 1000 may include a robot 100, an electronic device 200, a server 300, and a terminal device 400.
  • The robot 100 may store map data and 3D information of a specific space.
  • The robot 100 can explore the specific space and generate the map data and 3D information.
  • Alternatively, the robot 100 may receive the map data and 3D information of the specific space from the server 300.
  • Specific spaces may include a home, a school, a workplace, etc.
  • For example, the robot 100 can explore a room, kitchen, living room, etc. in a home.
  • The robot 100 can generate two-dimensional map data while exploring each space.
  • The robot 100 can identify the floor, walls, and ceiling of the room, kitchen, and living room.
  • The robot 100 can generate 3D information using the identified floor, wall, and ceiling information.
  • The robot 100 may also identify objects placed in each space. For example, objects may include a dining table, a table, a bookshelf, a desk, etc.
  • An electronic device 200 may be placed in each space.
  • The electronic device 200 may be a network device connected to a network, or a general device not connected to a network.
  • General devices not connected to a network may include electronic devices controlled with a remote control or the like.
  • Herein, network devices are referred to as IoT (Internet of Things) devices, and general devices that are controlled by a remote control and not connected to the network are referred to as IR (infrared) devices.
  • The electronic device 200 may include a refrigerator, TV, robot vacuum cleaner, washing machine, dryer, air conditioner, light, speaker, sensor, power supply, router, etc.
  • The electronic devices 200 listed above are merely examples, and the electronic device 200 is not limited thereto.
  • The server 300 may communicate with the electronic device 200 (e.g., an IoT device) and the robot 100.
  • The server 300 may receive information from the electronic device 200 placed in a specific space and transmit the information of the electronic device 200 to the robot 100.
  • For example, the electronic device 200 placed in a specific space may transmit its information to the server 300.
  • The information transmitted to the server 300 may include identification information such as type, name, model name, and control-related information.
  • The server 300 may transmit the identification information received from the electronic device 200 to the robot 100.
  • The robot 100 can explore the space and identify the placed electronic device 200.
  • For example, the robot 100 may identify the electronic device 200 based on an image acquired through a camera in the specific space.
  • The robot 100 may transmit an operation confirmation signal corresponding to the type of the identified electronic device 200 to the electronic device 200.
  • The electronic device 200 may perform an operation according to the received operation confirmation signal.
  • When the robot 100 identifies the operation of the electronic device 200, it can identify the location information of the electronic device in the space.
  • For example, the robot 100 may receive information related to lights placed in the space from the server 300.
  • The robot 100 can identify lights placed on the walls of a room from images acquired through the camera during spatial exploration.
  • The robot 100 can transmit an operation confirmation signal to a light and identify the operation of the light.
  • When the operation is identified, the robot 100 can identify the location information of the light.
  • The location information of the light may include the room, the wall, and coordinate information of the light.
  • The robot 100 can match the identification information and location information of the light.
  • The robot 100 can update the 3D information by mapping the matched identification and location information of the light to the space.
  • The robot 100 may transmit the 3D information, in which the identification information and location information of the electronic device 200 are mapped to the space, to the server 300 and/or the terminal device 400.
  • The server 300 may store the received information.
  • The terminal device 400 may display a control user interface (UI) based on the received information.
  • The terminal device 400 may receive a command to control the electronic device 200 through the displayed control UI.
  • The terminal device 400 may transmit the input control command to the corresponding electronic device 200.
  • The electronic device 200 may perform a control operation according to the received control command.
  • Although FIG. 1 illustrates the robot system 1000 as including the server 300, some or all functions of the server 300 may be performed by the robot 100.
  • For example, the robot 100 can perform all of the functions of the server 300 described above.
  • Alternatively, the robot 100 may request additional necessary information from the server 300 while performing the functions of the server 300 described above.
  • FIG. 2 is a block diagram explaining the configuration of a robot according to an embodiment of the present disclosure.
  • The robot 100 includes a camera 110, a communication interface 120, a memory 130, and a processor 140.
  • The camera 110 can capture images of the surrounding environment of the robot 100.
  • The processor 140 may identify a space, an electronic device 200, an object, etc. from a captured image.
  • Additionally, the camera 110 may capture an image of the user.
  • The processor 140 may identify user-related information, including the user's location, gaze direction, and pointing direction, from the captured image of the user.
  • The processor 140 may recognize a control command based on the user's facial expression, motion, gaze, etc., and perform a control operation corresponding to the recognized control command or the recognized area.
  • The camera 110 may include a CCD sensor or a CMOS sensor.
  • Additionally, the camera 110 may include an RGB camera or a depth camera.
  • The communication interface 120 can communicate with an external device.
  • The communication interface 120 may receive data from an external device and transmit data to the external device.
  • The external device may include the electronic device 200, the server 300, and the terminal device 400.
  • For example, the communication interface 120 may receive identification information of the electronic device 200 from the server 300. Additionally, the communication interface 120 may transmit 3D information, in which the identification information and location information of the electronic device 200 are mapped to the specific space, to the terminal device 400. Additionally, the communication interface 120 may transmit an operation confirmation signal, a test signal, and a control signal to the electronic device 200.
  • The communication interface 120 can communicate with an external device using at least one of communication methods such as Wi-Fi, Wi-Fi Direct, Bluetooth, ZigBee, DLNA, Wide, 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), and LTE (Long Term Evolution).
  • The communication interface 120 may be referred to as a communication device, a communication unit, a communication module, a transceiver, etc.
  • The memory 130 may store algorithms, data, software, etc. related to the operation of the robot 100. The algorithms, data, software, etc. stored in the memory 130 may be loaded by the processor 140 under the control of the processor 140 to perform data processing. Additionally, the memory 130 may store identification information, location information, 3D information, etc. of the electronic device 200.
  • The memory 130 may be implemented in the form of ROM, RAM, HDD, SSD, a memory card, etc.
  • The processor 140 can control each component of the robot 100.
  • For example, the processor 140 may control the camera 110 to capture images and control the communication interface 120 to transmit and receive data with an external device.
  • The robot 100 may include one processor 140 or a plurality of processors 140.
  • The processor 140 may identify the electronic device 200 from an image acquired through the camera 110 and obtain information about the identified electronic device 200. For example, the processor 140 may obtain the identification information of the electronic device 200 identified in the image from the identification information received from the server 300. Alternatively, if there is no identification information for the electronic device 200 (e.g., the identification information is not stored, has not been received, or no stored identification information corresponds to the electronic device identified in the image), the processor 140 can control the communication interface 120 to transmit a signal requesting the identification information of the electronic device 200 to the server 300 and to receive the identification information of the electronic device 200 from the server 300.
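As a rough sketch of this lookup-with-fallback behavior (the cache shape and the request_from_server callback are illustrative assumptions, not the disclosed interfaces):

```python
# Sketch of identification lookup with a server fallback; the cache layout
# and the request_from_server callback are illustrative assumptions.

def get_identification(cache, detected_model, request_from_server):
    """Return identification info for a device seen in an image, asking the
    server when the info is not stored locally."""
    ident = cache.get(detected_model)
    if ident is None:                        # not stored, not received, or no match
        ident = request_from_server(detected_model)
        if ident is not None:
            cache[detected_model] = ident    # remember it for later identifications
    return ident

cache = {"lamp-xy100": {"type": "light", "name": "Lamp 1"}}
server = lambda m: {"type": "speaker", "name": "Speaker 1"} if m == "spk-a2" else None
print(get_identification(cache, "spk-a2", server))   # fetched from the server stub
```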
  • The processor 140 may control the communication interface 120 to transmit an operation confirmation signal corresponding to the type of the identified electronic device 200 to the electronic device 200, and may identify the operation of the electronic device 200.
  • When the type of the electronic device 200 is a light output device (e.g., a light), the processor 140 controls the communication interface 120 to transmit an operation confirmation signal for outputting light to the electronic device 200, and can control the camera 110 to capture an image of the electronic device 200 that outputs light.
  • When the type of the electronic device 200 is a sound output device (e.g., a speaker), the processor 140 controls the communication interface 120 to transmit an operation confirmation signal for outputting sound to the electronic device 200, and can control the microphone to receive the sound output from the electronic device 200.
  • When the type of the electronic device 200 is a communication device, the processor 140 transmits an operation confirmation signal for outputting a response signal to the electronic device 200, and controls the communication interface 120 to receive the response signal output from the electronic device 200.
  • The operation confirmation signal may include a signal that turns the electronic device 200 on/off.
  • The processor 140 identifies the operation of the electronic device 200 corresponding to the transmitted operation confirmation signal, and can thereby check whether the electronic device 200 included in the image is the electronic device 200 corresponding to the identification information.
  • The electronic device 200 may be a device connected to a network or a device not connected to a network.
  • A device connected to a network may be an IoT device, and a device not connected to a network may be an IR device.
  • The processor 140 may control the camera 110 to photograph the IR device and transmit the captured image to the server 300.
  • The server 300 may identify the IR device based on the received image and obtain identification information of the identified IR device.
  • The server 300 may transmit the identification information of the IR device to the robot 100. Accordingly, the robot 100 can obtain the identification information of the IR device from the server 300 based on the captured image of the IR device.
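A minimal sketch of this image-based identification of an IR device follows; capture and query_server are hypothetical stand-ins for the camera and the server's recognition service.

```python
# Sketch of IR-device identification: photograph the device from several
# angles and let the server recognize it. Both callbacks are stand-ins.

def identify_ir_device(capture, query_server, angles=("front", "side", "back")):
    """Send multiple captured images of an unconnected device to the server."""
    images = [capture(angle) for angle in angles]   # multiple captured images
    return query_server(images)                     # shape/text recognition is server-side

fake_capture = lambda angle: f"jpeg-bytes({angle})"
fake_server = lambda imgs: {"type": "fan", "model": "FAN-123"}
print(identify_ir_device(fake_capture, fake_server))
```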
  • When a plurality of IoT devices are placed, the processor 140 may identify the plurality of IoT devices simultaneously.
  • Here, 'simultaneous' includes not only operations coinciding at a specific point in time but also operations performed sequentially at very short time intervals. For example, in addition to transmitting a control signal to multiple IoT devices at 1 second after the start of operation, it may also include cases where control signals are sequentially transmitted to the respective IoT devices at 1 second, 1.1 seconds, 1.2 seconds, 1.3 seconds, and so on.
  • That is, the processor 140 can simultaneously transmit distinct operation confirmation signals to a plurality of IoT devices and simultaneously identify the operations of the plurality of IoT devices.
  • The distinct operation confirmation signals may include signals with distinct colors, blinking patterns, gradients, frequencies, sound output patterns, etc.
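One way to picture such distinct signals is to draw each device's signal from a pool of unique color/blink combinations, as in the sketch below; the signal vocabulary is illustrative, not the disclosed wire format.

```python
# Sketch of assigning distinct operation confirmation signals so several IoT
# devices can be verified at once; the color/blink vocabulary is illustrative.

from itertools import product

def assign_distinct_signals(device_ids):
    """Give each device a unique (color, blink period) combination."""
    combos = product(["red", "green", "blue"], [0.5, 1.0, 2.0])   # 9 unique pairs
    return {dev: {"color": c, "blink_s": b}
            for dev, (c, b) in zip(device_ids, combos)}

signals = assign_distinct_signals(["light-1", "light-2", "light-3"])
print(signals)   # every device receives a distinguishable pattern
```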
  • The processor 140 may identify location information of the electronic device 200.
  • The location information may include not only two-dimensional location information such as room, living room, and kitchen, but also three-dimensional location information including the left wall, right wall, or front wall based on the direction the robot 100 is facing, the ceiling, the floor, an object, and coordinate information for each area.
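A device's location record therefore combines a 2D area, a surface, and grid coordinates; a minimal data structure might look like the following, where the field names are assumptions for illustration.

```python
# Minimal sketch of a location record combining 2D and 3D location information;
# the field names are assumptions for illustration.

from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class DeviceLocation:
    area: str                 # two-dimensional location, e.g. "room", "living room"
    surface: str              # "ceiling", "floor", "left wall", ... relative to the robot
    coord: Tuple[int, int]    # grid coordinates on that surface

print(DeviceLocation(area="room", surface="ceiling", coord=(3, 5)))
```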
  • The processor 140 may update the 3D information by mapping the electronic device 200 to the specific space based on the identification information and location information of the electronic device 200. When a plurality of electronic devices 200 are present, the processor 140 can simultaneously identify the location information of the plurality of electronic devices 200.
  • The processor 140 may control the communication interface 120 to transmit the 3D information, to which the identification information and location information of the electronic device 200 are mapped, to the terminal device 400.
  • The terminal device 400 may display a control UI for controlling the electronic device 200 based on the received information.
  • The robot 100 may detect changes in the environment after generating (or updating) the 3D information of the electronic devices 200 placed in the specific space.
  • A change in the environment may include the placement of a new electronic device 200, the absence of an existing electronic device 200, the placement of a new object, etc.
  • When a new electronic device 200 is identified, the processor 140 may transmit an existing operation confirmation signal corresponding to the type of the identified electronic device 200 to the new electronic device 200.
  • When the operation of the new electronic device 200 according to the existing operation confirmation signal is identified, the processor 140 may identify the new electronic device 200 as an existing electronic device whose location has changed.
  • The processor 140 may update the 3D information by modifying the location information of the existing electronic device.
  • The processor 140 may identify a new object in the image acquired through the camera 110.
  • The processor 140 can control the communication interface 120 to transmit information related to the location of the new object to a robot cleaner.
  • The robot cleaner can update its cleaning map based on the received information. That is, the processor 140 may control the communication interface 120 to transmit information related to the location of the new object to the robot cleaner so that the robot cleaner updates its cleaning map.
  • The processor 140 may transmit a test signal corresponding to the type of the electronic device 200 to the electronic device 200 through the communication interface 120.
  • The processor 140 identifies the operating performance of the electronic device 200 according to the test signal, and when the operating performance is outside a preset range, can control the communication interface 120 to transmit information related to the operating performance to the terminal device 400. A detailed description will be provided later.
  • The processor 140 can control the camera 110 to photograph the user.
  • The processor 140 may identify user-related information, including the user's location, gaze direction, and pointing direction, from the user's image acquired through the camera 110.
  • The processor 140 may identify the control target electronic device 200 based on the identified user-related information. A detailed description will be provided later.
  • FIG. 3 is a block diagram explaining the specific configuration of a robot according to an embodiment of the present disclosure.
  • The robot 100 includes a camera 110, a communication interface 120, a memory 130, a processor 140, a user interface 150, a microphone 155, a sensor 160, a speaker 165, a display 170, a driving unit 175, and a power supply unit 180.
  • The camera 110, communication interface 120, and memory 130 may be the same as those described in FIG. 2.
  • The user interface 150 may receive commands, etc. from the user.
  • The user interface 150 may be implemented in various forms.
  • For example, the user interface 150 may include a keyboard, buttons, a key pad, a touch pad, a touch screen, etc.
  • The user interface 150 may also be called an input device, an input unit, an input module, etc.
  • The microphone 155 can receive the user's voice.
  • The processor 140 may recognize a control command based on the input voice and perform a control operation corresponding to the recognized control command.
  • The sensor 160 may detect information related to the user or the surrounding environment.
  • The processor 140 may perform control operations based on the sensed information.
  • For example, the sensor 160 may include a distance sensor, an image sensor, a tracking sensor, an angle sensor, an acceleration sensor, a gravity sensor, a gyro sensor, a geomagnetic sensor, a direction sensor, a motion recognition sensor, a proximity sensor, a voltmeter, an ammeter, a barometer, a hygrometer, a thermometer, an illumination sensor, a heat sensor, a touch sensor, an infrared sensor, an ultrasonic sensor, etc.
  • The distance sensor can be implemented as an ultrasonic sensor, a laser sensor, LiDAR, etc.
  • The speaker 165 outputs a sound signal on which sound signal processing has been performed.
  • For example, the speaker 165 may output a user's input command, status-related information or operation-related information of the robot 100, a notification of an abnormality in an electronic device to be diagnosed, etc., as a voice or notification sound.
  • The display 170 may display information in a visual manner.
  • For example, the display 170 may be implemented as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, or a touch screen.
  • When the display 170 is implemented as a touch screen, the robot 100 can receive control commands through the touch screen.
  • The driving unit 175 may move the robot 100 or perform a preset control operation. Alternatively, the driving unit 175 may rotate the camera 110, or the part including the camera 110, up, down, left, and right.
  • The driving unit 175 may operate under the control of the processor 140 based on stored map data and sensed information.
  • The driving unit 175 may be implemented in the form of a module including wheels for movement, or as a robot arm that performs a preset control operation.
  • The power supply unit 180 can receive power from the outside and supply power to each component of the robot 100.
  • The robot 100 may include all of the above-described components or only some of them. Alternatively, the robot 100 may include additional components other than those described above.
  • FIGS. 4 to 6 are diagrams illustrating a process for identifying an electronic device according to an embodiment of the present disclosure.
  • FIG. 7 is a diagram illustrating a process of mapping an identified electronic device to 3D information according to an embodiment of the present disclosure.
  • FIGS. 4 to 6 show electronic devices 200a, 200b, 200c, and 200d and a robot 100 placed in a specific space.
  • For example, the specific space may be a room 10.
  • New electronic devices 200a, 200b, 200c, and 200d may be installed in the room 10.
  • The robot 100 may receive identification information of the newly installed electronic devices 200a, 200b, 200c, and 200d from the server 300. Since there is no location information for the newly installed electronic devices 200a, 200b, 200c, and 200d, the robot 100 may classify them as unknown. The robot 100 can move to the room 10 while exploring each space. The robot 100 can identify the electronic devices 200a, 200b, 200c, and 200d installed in the room 10 using the camera.
  • Since the robot 100 knows the identification information of the electronic devices 200a, 200b, 200c, and 200d, it can know the type of each electronic device.
  • The robot 100 may transmit an operation confirmation signal corresponding to the type of each electronic device to each of the electronic devices 200a, 200b, 200c, and 200d. As shown in FIG. 4, the robot 100 may transmit an operation confirmation signal corresponding to the type of the second electronic device 200b to the second electronic device 200b. Since the second electronic device 200b receives an operation confirmation signal corresponding to its type from the robot 100, it can perform an operation based on the operation confirmation signal.
  • For example, the operation confirmation signal may be a control signal for outputting light.
  • Alternatively, the operation confirmation signal may be a control signal for outputting sound.
  • Alternatively, the operation confirmation signal may be a control signal that outputs a response signal.
  • For example, the second electronic device 200b shown in FIG. 4 may be a lamp, and the operation confirmation signal may be a control signal that outputs light corresponding to the lamp type.
  • The second electronic device 200b may output light based on the received operation confirmation signal.
  • The robot 100 can capture an image of the second electronic device 200b outputting light using the camera, and can thereby check the operation of the second electronic device 200b.
  • Alternatively, the operation confirmation signal may be a control signal for outputting sound.
  • In this case, the robot 100 may transmit an operation confirmation signal including sound type and volume information to the second electronic device 200b and check the sound output by the second electronic device 200b.
  • Alternatively, the operation confirmation signal may be a control signal that outputs a response signal.
  • In this case, the robot 100 may identify the second electronic device 200b based on the signal strength and response message of the second electronic device 200b. Additionally, the second electronic device 200b may include an indicator that displays its operating status. When performing an operation based on the operation confirmation signal, the second electronic device 200b may indicate the operation status using the indicator.
  • The robot 100 captures an image of the second electronic device 200b in operation and can use the captured image to identify the operation of the second electronic device 200b as a sound output device or a communication device.
  • When the operation is confirmed, the robot 100 can identify the location information of the second electronic device 200b.
  • For example, the room 10 may include a left wall 11, a right wall 12, a front wall 13, a ceiling 14, and a floor 15.
  • The robot 100 can capture images of the room 10 using the camera and identify each surface of the room 10.
  • The robot 100 may identify the location of the second electronic device 200b as the ceiling 14.
  • The robot 100 can draw a virtual grid on the ceiling 14.
  • The robot 100 can set an arbitrary point in the virtual grid as a reference point.
  • The robot 100 can extract the coordinates of each electronic device 200a, 200b, and 200c based on the reference point and the virtual grid.
  • For example, the robot 100 can determine the coordinates of the first electronic device 200a to be (3, 3), the coordinates of the second electronic device 200b to be (3, 5), and the coordinates of the third electronic device 200c to be (3, 7).
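The grid-and-reference-point step can be pictured as snapping a metric position on the surface to integer cells. In the sketch below, the 0.5 m cell size and the corner reference point are assumptions chosen so the lamp lands on cell (3, 5).

```python
# Sketch of extracting virtual-grid coordinates from a metric position on a
# surface; cell size and the reference point are illustrative assumptions.

def grid_coordinates(position_m, reference_m, cell_size_m=0.5):
    """Snap an (x, y) position in meters to integer grid cells from the reference."""
    return (round((position_m[0] - reference_m[0]) / cell_size_m),
            round((position_m[1] - reference_m[1]) / cell_size_m))

# A lamp 1.5 m and 2.5 m from the ceiling reference point lands on cell (3, 5):
print(grid_coordinates((1.5, 2.5), (0.0, 0.0)))   # -> (3, 5)
```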
  • The robot 100 may identify the second electronic device 200b based on the identification information of the second electronic device 200b received from the server 300. Additionally, the robot 100 may determine that the location of the second electronic device 200b is the room 10, the ceiling 14, and the coordinates (3, 5). The robot 100 can match the identification information and location information of the second electronic device 200b. Additionally, the robot 100 may generate three-dimensional information about the specific space before the electronic devices 200a, 200b, 200c, and 200d are installed. For example, the robot 100 may generate the 3D information of the specific space through exploration in advance, or may generate the 3D information by receiving information about the specific space from the server 300. Accordingly, the robot 100 can update the pre-generated 3D information by mapping the matching information, in which the identification information and location information are matched, to the pre-generated 3D information.
  • The robot 100 determines the location information of the first electronic device 200a, the third electronic device 200c, and the fourth electronic device 200d through a similar process, and can match the identification information and location information of each electronic device 200a, 200c, and 200d. Additionally, the robot 100 may update the 3D information by mapping the matching information of each electronic device 200a, 200c, and 200d to the 3D information.
  • The first to fourth electronic devices 200a, 200b, 200c, and 200d described above may be IoT devices connected to a network. However, there may also be IR devices located in the room that are not connected to the network. An IR device may be a device that is not connected to a network and is controlled by a remote control.
  • A fifth electronic device 200e, which is an IR device, is shown.
  • The robot 100 may not know the identification information of the fifth electronic device 200e, which is not connected to the network.
  • When the robot 100 searches the room 10 using the camera, it can identify the fifth electronic device 200e. However, since the robot 100 does not know the identification information of the fifth electronic device 200e, it cannot generate matching information for the fifth electronic device 200e.
  • The robot 100 can capture an image of the fifth electronic device 200e and transmit the captured image to the server 300.
  • The robot 100 can capture one image and transmit it to the server 300, or can capture multiple images and transmit them to the server 300.
  • For example, the robot 100 can capture images of the fifth electronic device 200e from various angles, such as the front, side, and back of the fifth electronic device 200e, and transmit the plurality of captured images to the server 300.
  • Additionally, the robot 100 may recognize text displayed on the fifth electronic device 200e and transmit an image capturing the recognized text area to the server 300.
  • The server 300 may receive the image of the fifth electronic device 200e from the robot 100.
  • The server 300 may recognize the shape, form, and text of the fifth electronic device 200e from the received image.
  • The server 300 may obtain identification information of the fifth electronic device 200e through a search.
  • The server 300 may transmit the acquired identification information of the fifth electronic device 200e to the robot 100.
  • The robot 100 may identify the location information of the fifth electronic device 200e based on the received identification information of the fifth electronic device 200e.
  • The robot 100 may transmit an operation confirmation signal corresponding to the type of the fifth electronic device 200e to the fifth electronic device 200e based on the identification information of the fifth electronic device 200e.
  • For example, when the fifth electronic device 200e is a fan, the operation confirmation signal may be a control signal for turning the fan on/off or a control signal for operating the fan.
  • The robot 100 can check the operation of the fan using the camera. When the robot 100 checks the operation of the fan, it can identify the location information of the fan. The robot 100 can confirm that the fan is located on the floor 15.
  • The robot 100 can create a virtual grid on the floor 15 and set a reference point.
  • The robot 100 can acquire coordinate information of the fan based on the virtual grid and the reference point.
  • For example, the robot 100 can identify location information of the fan, such as the room 10, the floor 15, and the coordinates (1, 8).
  • The robot 100 can generate matching information by matching the identification information and location information of the fan.
  • The robot 100 can update the 3D information by mapping the matching information to the 3D information.
  • The robot 100 may set an electronic device 200 whose location information is unknown to unknown (1a).
  • For example, the first to fourth electronic devices 200a, 200b, 200c, and 200d shown in FIG. 5 may be lights.
  • The robot 100 may know the identification information of the first to fourth lights but not their location information. The identification information may include type information of the electronic device. Accordingly, the robot 100 can set the first to fourth lights to unknown lighting (1a).
  • The robot 100 can confirm the operation of each of the first to fourth lights by transmitting an operation confirmation signal to each of them.
  • The robot 100 may check the operation of each of the first to fourth lights and generate matching information that matches the identification information and location information of each light.
  • The robot 100 can update the 3D information 1b by mapping the generated matching information to the 3D information 1b.
  • The robot 100 may transmit the updated 3D information 1b to the terminal device 400.
  • The terminal device 400 may display a control UI including images of each space and the deployed electronic devices based on the updated 3D information 1b.
  • The robot 100 can move to another space and perform a similar process.
  • The robot 100 can simultaneously check the operation of a plurality of electronic devices.
  • FIG. 8 is a diagram illustrating a process for simultaneously identifying a plurality of electronic devices according to an embodiment of the present disclosure.
  • In FIG. 8, a plurality of electronic devices 200a, 200b, 200c, 200d, 200e, and 200f are shown.
  • The robot 100 knows identification information, including the types, of the plurality of electronic devices 200a, 200b, 200c, 200d, 200e, and 200f.
  • For example, the electronic devices may be lights, and the robot 100 may set each light to unknown lighting (3a).
  • The robot 100 can simultaneously transmit an operation confirmation signal corresponding to each light to the plurality of lights.
  • The operation confirmation signals simultaneously transmitted to the plurality of lights may be distinct operation confirmation signals.
  • Each distinct operation confirmation signal may be a signal with a distinct color, blinking pattern, gradation, frequency, sound output pattern, etc.
  • For example, the robot 100 may transmit a signal of a first blinking pattern to the first electronic device 200a and a signal of a second blinking pattern to the fifth electronic device 200e.
  • The robot 100 may transmit a signal of a first color to the third electronic device 200c and a signal of a second color to the fourth electronic device 200d.
  • The robot 100 may transmit a gradient signal to the sixth electronic device 200f.
  • Each electronic device 200a, 200b, 200c, 200d, 200e, and 200f may perform an operation corresponding to the received signal.
  • The robot 100 can simultaneously capture images of the plurality of operating electronic devices 200a, 200b, 200c, 200d, 200e, and 200f using the camera.
  • The robot 100 can simultaneously identify the plurality of electronic devices 200a, 200b, 200c, 200d, 200e, and 200f by identifying the operation of each electronic device. Additionally, the robot 100 can simultaneously identify the locations of the plurality of electronic devices based on the acquired images.
  • The robot 100 may simultaneously generate matching information 3b that matches the identification information and location information of the plurality of electronic devices 200a, 200b, 200c, 200d, 200e, and 200f.
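Conceptually, simultaneous identification then amounts to pairing each assigned signal with the detection that exhibits it; a toy version of that matching step follows, where the detection fields are assumptions.

```python
# Sketch of pairing assigned distinct signals with detections from one image,
# so all devices are identified simultaneously; detection fields are assumed.

def match_devices(assigned, detections):
    """Map each device to the image position showing its assigned signal."""
    matches = {}
    for dev, sig in assigned.items():
        for det in detections:
            if det["color"] == sig["color"] and det["blink_s"] == sig["blink_s"]:
                matches[dev] = det["position"]      # grid position from the image
                break
    return matches

assigned = {"light-1": {"color": "red", "blink_s": 0.5},
            "light-2": {"color": "green", "blink_s": 1.0}}
detections = [{"color": "green", "blink_s": 1.0, "position": (3, 5)},
              {"color": "red", "blink_s": 0.5, "position": (3, 3)}]
print(match_devices(assigned, detections))   # {'light-1': (3, 3), 'light-2': (3, 5)}
```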
  • The robot 100 may update the 3D information based on the generated matching information 3b, and can transmit the updated 3D information 3b to the terminal device 400.
  • The terminal device 400 may display a control UI including images of each space and the deployed electronic devices based on the updated 3D information 3b.
  • Since the plurality of electronic devices 200a, 200b, 200c, 200d, 200e, and 200f simultaneously receive distinct operation confirmation signals from the robot 100, they may be devices connected to the network, and a device connected to the network may be an IoT device.
  • Similarly, when the electronic devices are audio devices, the robot 100 may transmit operation confirmation signals distinguished by on/off pattern, frequency, and sound output pattern to each audio device.
  • FIG. 9 is a timing diagram explaining the operation of a robot system that identifies an electronic device according to an embodiment of the present disclosure.
  • The robot 100 may receive information on the electronic device 200 from the server 300 (S110).
  • For example, the electronic device 200 may transmit its identification information to the server 300.
  • The server 300 may transmit the received identification information of the electronic device 200 to the robot 100.
  • The identification information may include type, name, model name, control-related information, etc.
  • The robot 100 can search for electronic devices (S120).
  • For example, the robot 100 can search for the electronic device 200 located in a specific space using the camera.
  • When the electronic device 200 is discovered, its identification information can be obtained.
  • For example, the robot 100 may obtain identification information corresponding to the type of the discovered electronic device 200 from the received identification information.
  • Alternatively, the robot 100 may capture an image of the electronic device 200 and transmit the captured image to the server 300.
  • In this case, the robot 100 can obtain the identification information of the electronic device 200 from the server 300.
  • The robot 100 may transmit an operation confirmation signal to the electronic device 200 (S130).
  • The operation confirmation signal may be a signal corresponding to the type of the electronic device 200.
  • For example, the operation confirmation signal may be a control signal that outputs light.
  • Alternatively, the operation confirmation signal may be a control signal that outputs sound.
  • Alternatively, the operation confirmation signal may be a control signal that outputs a response signal.
  • Additionally, the operation confirmation signal may include a signal that turns the electronic device 200 on/off.
  • When there are a plurality of electronic devices 200, the robot 100 may simultaneously transmit a distinct operation confirmation signal to each of the plurality of electronic devices 200.
  • In this way, the robot 100 can simultaneously identify the operations of the plurality of electronic devices 200.
  • Here, 'simultaneous' may include not only the same point in time but also a sequential series of times that can be regarded as the same time.
  • The electronic device 200 operates according to the received operation confirmation signal (S140), and the robot 100 can identify the electronic device (S150). For example, if the electronic device 200 is a light output device, the robot 100 can use the camera to check operations such as light output and blinking of the electronic device 200. If the electronic device 200 is a sound output device, the robot 100 can confirm the operation of the electronic device 200 based on the operation image of the electronic device 200 captured using the camera and information such as the sound output from the speaker, the sound output pattern, frequency, and volume. If the electronic device 200 is a communication device, the robot 100 can check the operation of the electronic device 200 based on the operation image of the electronic device 200 captured using the camera and information such as frequency, signal strength, and response message.
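The check in S140/S150 is thus a type-dependent dispatch over different evidence sources; the compact sketch below uses an evidence dict as a stand-in for camera, microphone, and radio input.

```python
# Sketch of the type-dependent operation check (S140/S150); the evidence dict
# stands in for camera frames, microphone input, and received packets.

def operation_identified(dev_type, evidence):
    """Confirm a device's operation using the cue appropriate to its type."""
    if dev_type == "light":
        return evidence.get("light_detected", False)          # camera: output/blinking
    if dev_type == "speaker":
        return evidence.get("sound_pattern_matches", False)   # microphone: pattern, volume
    if dev_type == "communication":
        return evidence.get("response_received", False)       # radio: response signal
    return False

print(operation_identified("light", {"light_detected": True}))   # True
```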
  • Next, the robot 100 may identify location information of the electronic device 200 within the specific space.
  • The robot 100 matches the identification information and location information of the electronic device (S160), and the robot 100 can map them to the 3D information (S170).
  • For example, the robot 100 may generate matching information that matches the identification information and location information of the electronic device 200. Additionally, the robot 100 can update the 3D information by mapping the electronic device 200 to the specific space.
  • The robot 100 can transmit the matching information or 3D information to the server 300, and the server 300 can update the stored information of the electronic device 200 (S180). Additionally, the robot 100 may transmit the 3D information updated by mapping the matching information to a terminal device.
  • FIG. 10 is a diagram illustrating a process for identifying a moving electronic device according to an embodiment of the present disclosure.
  • For example, the first space may be the room 10 and the second space may be the living room 20.
  • The electronic device 200 may be a device for which the robot 100 generated matching information and updated the 3D information, as described in the example above.
  • The robot 100 can photograph the space while exploring it.
  • When a new electronic device is identified, the robot 100 may transmit an existing operation confirmation signal corresponding to the type of the identified electronic device to the new electronic device 200.
  • The robot 100 may receive identification information of the placed electronic device from the server 300. If the robot 100 discovers a new electronic device in a specific space but does not receive its identification information from the server 300, there is a possibility that the new electronic device is an electronic device that has moved from another space. Accordingly, when the robot 100 identifies the operation of the new electronic device 200 according to the existing operation confirmation signal, it can identify the new electronic device 200 as an existing electronic device whose location has changed.
  • The robot 100 includes matching information that matches the identification information and the existing location information of the existing electronic device. Accordingly, the robot 100 can identify the new location information of the electronic device 200 and modify the existing matching information. Additionally, the robot 100 can update the 3D information by modifying the location information of the electronic device 200.
  • Meanwhile, the performance of the electronic device 200 may vary depending on the influence of the surrounding environment. Alternatively, as a new object is placed around the electronic device 200, the performance of the electronic device 200 may be affected by the placed object. Accordingly, the robot 100 can test the performance of the electronic device 200 when the location of the electronic device changes or when a new object is placed.
  • FIGS. 11 and 12 are diagrams illustrating a process for confirming the influence of surrounding objects according to an embodiment of the present disclosure.
  • For example, the second electronic device 200b may be a lamp, and a bookshelf may be placed under the lamp.
  • The bookshelf placed under the light can affect the brightness of the room produced by the light.
  • The robot 100 may identify a new object 600a from an image acquired through the camera.
  • When the new object is identified, the robot 100 may transmit a test signal corresponding to the type of the second electronic device 200b to the second electronic device 200b.
  • The test signal may be the same signal as the operation confirmation signal.
  • For example, the test signal may be a signal that turns on the electronic device.
  • The second electronic device 200b may perform an operation according to the received test signal.
  • The robot 100 may identify the operating performance of the second electronic device 200b according to the test signal. For example, if the second electronic device 200b is a lamp, illuminance, etc. can be measured. If the second electronic device 200b is a speaker, sound volume, etc. can be measured. If the second electronic device 200b is a communication device, signal strength and data loss rate can be measured.
  • Meanwhile, the robot 100 may identify the operating performance of an electronic device by transmitting a test signal to the electronic device when the electronic device is first deployed. Accordingly, if the second electronic device 200b is an existing electronic device, the robot 100 may already store information about the operating performance of the second electronic device 200b. The robot 100 may compare the measured operating performance of the second electronic device 200b with the stored operating performance. If the operating performance of the second electronic device 200b is outside a preset range, the robot 100 may generate information 5 related to the operating performance. Additionally, the robot 100 may transmit the generated information 5 related to the operating performance to the terminal device 400. The terminal device 400 may output the received information related to the operating performance.
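A bare-bones version of this baseline comparison could look as follows; the 20% tolerance is an assumed stand-in for the unspecified preset range.

```python
# Sketch of the performance check: compare a measured value (illuminance,
# volume, signal strength, ...) with the stored baseline; the 20% tolerance
# is an assumed stand-in for the preset range.

def performance_report(device, measured, baseline, tolerance=0.20):
    """Return a report for the terminal device when performance leaves the range."""
    deviation = abs(measured - baseline) / baseline
    if deviation > tolerance:
        return {"device": device, "measured": measured,
                "baseline": baseline, "deviation": round(deviation, 2)}
    return None   # within the preset range: nothing to notify

# A lamp whose illuminance dropped from 800 lx to 500 lx is reported:
print(performance_report("Lamp 2", measured=500.0, baseline=800.0))
```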
  • In FIG. 12, a new object 600b placed on the floor of the room 10 is shown.
  • The new object 600b shown in FIG. 12 is placed on the floor.
  • The object 600b placed on the floor may affect the robot cleaner 200c when it cleans.
  • The robot 100 may identify the new object 600b located on the floor from the acquired image.
  • The robot 100 may transmit the floor information 7, among the 3D information related to the location of the new object 600b, to the robot cleaner 200c among the electronic devices.
  • The robot cleaner 200c can explore the floor and create a cleaning map. If a new object 600b is placed, the robot cleaner 200c must discover the new object 600b and update the existing cleaning map, which takes time for identifying the object and updating the map. However, when the robot 100 discovers the object 600b and transmits the floor information 7, among the three-dimensional information, on which the object 600b is placed to the robot cleaner 200c, the robot cleaner 200c can quickly update the cleaning map.
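This hand-off can be pictured as forwarding just the floor layer of the 3D information together with the new object's footprint; the message shape in the sketch below is hypothetical.

```python
# Sketch of forwarding only the floor layer of the 3D information, plus the
# new object's footprint, to the robot cleaner; the message shape is assumed.

def floor_update_message(space_map3d, new_object):
    """Build the update the robot cleaner needs to refresh its cleaning map."""
    return {"floor": space_map3d.get("floor", {}),
            "obstacle": {"coord": new_object["coord"], "size": new_object["size"]}}

msg = floor_update_message({"floor": {"area": "room"}},
                           {"coord": (1, 8), "size": (1, 1)})
print(msg)   # sent to the robot cleaner over the communication interface
```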
  • FIG. 13 is a timing diagram explaining the operation of a robot system that identifies a moving electronic device according to an embodiment of the present disclosure.
  • The robot 100 can discover a new electronic device 200 (S205).
  • For example, the robot 100 can generate three-dimensional information by mapping the matching information, which matches the identification information and location information of the installed electronic device, to the space. Afterwards, the robot 100 may discover a new electronic device 200 in addition to the existing electronic devices during space exploration. If the robot 100 does not receive identification information of the electronic device from the server 300, there is a possibility that the discovered electronic device 200 is an existing electronic device.
  • Accordingly, the robot 100 may transmit an existing operation confirmation signal to the discovered electronic device 200 (S210).
  • The existing operation confirmation signal is a signal corresponding to the type of the discovered electronic device 200. If the discovered electronic device 200 is an electronic device that has moved from another space, it may operate according to the existing operation confirmation signal (S215).
  • The robot 100 can check the operation of the electronic device 200 and identify it as an existing electronic device (S220). If the electronic device 200 is an existing electronic device, only the location information among the matching information or 3D information may be different. Accordingly, the robot 100 can obtain the location information where the electronic device is placed and replace the location information included in the existing information with the acquired location information. That is, the robot 100 can update the 3D information related to the electronic device 200 (S225).
  • the robot 100 may transmit a test signal to the electronic device 200 to check the operating performance of the electronic device 200 (S230).
  • the test signal may be a signal that controls light emission, sound output, data transmission and reception, etc.
  • the electronic device 200 may perform an operation according to the received test signal (S235).
  • the robot 100 can check the operation of the electronic device 200 and identify the operating performance of the electronic device 200 (S240).
  • the robot 100 may compare the identified operation performance with the stored operation performance of the existing electronic device.
  • the robot 100 may generate information related to the operation performance (S245). If the identified operation performance is outside the preset range, the robot 100 may transmit the information related to the operation performance to the terminal device 400.
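  • The flow of steps S210 to S245 might be summarized as in the following sketch; the data structures and helper functions are stubs assumed for illustration, not part of the disclosure:
```python
# Minimal sketch of the FIG. 13 flow. The stored 3D information and the
# observation stub are illustrative stand-ins.

three_d_info = {"lamp-1": {"type": "light", "location": ("ceiling", (3, 5))}}

def observe_operation(device_id):
    # Stub: pretend the device operated according to the existing operation
    # confirmation signal if it is already known to the robot.
    return device_id in three_d_info

def handle_discovered_device(device_id, new_location):
    # S210/S215/S220: transmit the existing operation confirmation signal
    # and check whether the discovered device operates accordingly.
    if observe_operation(device_id):
        # S225: existing device that moved -- only its location changed.
        three_d_info[device_id]["location"] = new_location
        # S230-S245: a test signal and performance check could follow here.
        return "existing device, location updated"
    return "new device, register identification and location information"

print(handle_discovered_device("lamp-1", ("ceiling", (7, 2))))
```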
  • FIG. 14 is a diagram explaining a robot that performs a user command according to an embodiment of the present disclosure.
  • the robot 100 can recognize the voice of the user 90 and perform control operations.
  • the first electronic device 200a and the second electronic device 200b may be lights, the first electronic device 200a may be light number 1, and the second electronic device 200b may be light number 2.
  • the robot 100 can turn on the second electronic device 200b, which is light number 2. That is, the robot 100 can recognize the user's utterance and transmit a turn-on control signal to the second electronic device 200b.
  • the user 90 may utter “Turn this on,” without referring to a specific electronic device.
  • the robot 100 located near the user 90 can acquire the user's image.
  • the robot 100 can recognize the user 90's utterance and identify user-related information from the acquired image.
  • user-related information may include the user's location, gaze direction, pointing direction, etc.
  • the robot 100 may identify the user's gaze direction and pointing direction from the acquired image of the user, and may identify that “this” uttered by the user refers to the second electronic device 200b, that is, light number 2.
  • the robot 100 transmits a turn-on control signal to the second electronic device 200b, and the second electronic device 200b may perform a turn-on operation according to the received control signal.
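  • One way such an ambiguous reference could be resolved is to pick the device whose direction best matches the user's pointing direction; the positions and vectors below are illustrative assumptions, and a real system might fuse gaze and pointing cues:
```python
# Minimal sketch: resolving "this" by choosing the device closest to the
# user's pointing direction. Device positions (x, y, z in meters) and the
# direction vector are illustrative assumptions.
import math

devices = {
    "light-1": (3.0, 3.0, 2.4),   # 200a, on the ceiling
    "light-2": (3.0, 5.0, 2.4),   # 200b, on the ceiling
}

def angle_to(user_pos, direction, target):
    """Angle between the pointing direction and the ray user -> target."""
    to_target = [t - u for t, u in zip(target, user_pos)]
    dot = sum(d * t for d, t in zip(direction, to_target))
    norm = math.hypot(*direction) * math.hypot(*to_target)
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def resolve_this(user_pos, direction):
    return min(devices, key=lambda d: angle_to(user_pos, direction, devices[d]))

# The user stands near (3, 4, 1.6) and points up toward light number 2.
print(resolve_this((3.0, 4.0, 1.6), (0.0, 0.7, 0.7)))  # -> "light-2"
```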
  • the user 90 may control the electronic device 200 using the terminal device 400.
  • FIGS. 15 and 16 are diagrams illustrating a terminal device executing a user command according to an embodiment of the present disclosure.
  • Referring to FIG. 15, map data of a specific space is shown.
  • the robot 100 may generate map data 50 based on navigation or data received from the server 300.
  • the map data 50 may include the respective areas of the space.
  • the map data 50 may display areas such as Bathroom 1, Room 1, Kitchen, Living room, Room 2, Bathroom 2, and Room 3.
  • the user 90 can select an area where an electronic device for control is located. For example, user 90 may select Room 3.
  • the terminal device 400 can display 3D information on electronic devices placed in Room 3.
  • the robot 100 can identify the spatial location and coordinates of the wall, floor, and ceiling as well as the area where the electronic device is located. That is, the robot 100 can identify the plane and coordinates where the first to fourth electronic devices 200a, 200b, 200c, and 200d are located. Additionally, the robot 100 can identify the object 600c, the surface on which the object 600c is located, and its coordinates. And, the robot 100 can identify the location of the fifth electronic device 200e.
  • the robot 100 can identify the location of the first electronic device 200a as the ceiling at coordinates (3, 3), the location of the second electronic device 200b as the ceiling at coordinates (3, 5), the location of the third electronic device 200c as the ceiling at coordinates (3, 7), the location of the fourth electronic device 200d as the ceiling at coordinates (5, 2) to (5, 8), the location of the object 600c as coordinates (1, 8), and the location of the fifth electronic device 200e as the top surface of the object 600c.
  • the robot 100 may generate matching information by matching (synchronizing) the identification information and location information of each electronic device 200a, 200b, 200c, 200d, and 200e, and may generate three-dimensional information 51 by mapping the matching information to the space. One possible representation is sketched below.
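  • A minimal sketch of one possible representation, with an assumed schema and the coordinates identified above:
```python
# Minimal sketch of matching information mapped into 3D information 51.
# The schema and identifiers are illustrative assumptions.

matching_info = [
    # (identification information, surface, coordinates), per FIGS. 15/16
    {"id": "200a", "surface": "ceiling", "coords": (3, 3)},
    {"id": "200b", "surface": "ceiling", "coords": (3, 5)},
    {"id": "200c", "surface": "ceiling", "coords": (3, 7)},
    {"id": "200d", "surface": "ceiling", "coords": [(5, 2), (5, 8)]},
    # 200e sits on the top surface of object 600c, located at (1, 8).
    {"id": "200e", "surface": "object:600c:top", "coords": (1, 8)},
]

# 3D information 51: matching information mapped onto a named space,
# ready to be transmitted to the terminal device 400.
three_d_info_51 = {"space": "Room 3", "devices": matching_info}
```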
  • the robot 100 can transmit the generated 3D information 51 to the terminal device 400. And, as shown in FIG. 16, the terminal device 400 may display a control UI based on the received 3D information 51. The user 90 may select the first electronic device 200a. The terminal device 400 may transmit a control signal to the first electronic device 200a according to the user's input command. Alternatively, the terminal device 400 may control the first electronic device 200a through the robot 100.
  • because the terminal device 400 displays a control UI including the 3D information 51, the user can intuitively control the electronic device.
  • FIG. 17 is a flowchart explaining a method of controlling a robot according to an embodiment of the present disclosure.
  • the robot acquires identification information of an identified electronic device (S310). For example, while the robot is traveling in a specific space, it can identify an electronic device from an image acquired through a camera and obtain identification information of the electronic device. Electronic devices may include network-connected IoT devices or non-network-connected IR devices. If the electronic device is an IR device, the robot can transmit a captured image of the electronic device to the server and obtain identification information of the electronic device corresponding to the image from the server. A sketch of this branching follows.
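  • A minimal sketch of the branching in S310; the `query_server` call and the device fields are hypothetical stand-ins, since the disclosure does not specify a concrete API:
```python
# Minimal sketch of S310: obtaining identification information for IoT
# devices directly and for IR devices via a server image lookup.

def query_server(image):
    # Stub for the server 300 lookup: returns identification information
    # for an IR device recognized in the captured image.
    return {"id": "ir-remote-aircon", "type": "IR"}

def identify_device(detected):
    if detected.get("network_connected"):      # IoT device
        return detected["broadcast_id"]        # e.g., obtained over Wi-Fi/BLE
    # Non-network IR device: send the captured image to the server and
    # receive the corresponding identification information.
    return query_server(detected["image"])["id"]

print(identify_device({"network_connected": False, "image": b"jpeg-bytes"}))
```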
  • the robot can transmit an operation confirmation signal to the electronic device (S320).
  • the robot may transmit an operation confirmation signal corresponding to the type of the identified electronic device to the electronic device.
  • the operation confirmation signal may be a signal that outputs light.
  • the operation confirmation signal may be a signal that outputs sound.
  • the operation confirmation signal may be a signal that outputs a response signal.
  • the robot can simultaneously transmit distinct operation confirmation signals to each of the plurality of IoT devices and simultaneously identify the operations of the plurality of IoT devices.
  • a distinct operation confirmation signal may be a signal that is distinguished using color, flashing pattern, gradation, frequency, sound output pattern, etc.
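  • A minimal sketch of assigning distinguishable confirmation signals to several IoT devices at once; the patterns are illustrative assumptions:
```python
# Minimal sketch: assigning a distinct operation confirmation signal to each
# IoT device so all devices can be verified in a single pass.
from itertools import cycle

PATTERNS = cycle([
    {"kind": "light", "value": "blink 2 Hz, red"},
    {"kind": "light", "value": "blink 5 Hz, blue"},
    {"kind": "sound", "value": "880 Hz beep, short-short-long"},
])

def assign_confirmation_signals(device_ids):
    """Map each device to a distinguishable operation confirmation signal."""
    return {dev: pattern for dev, pattern in zip(device_ids, PATTERNS)}

signals = assign_confirmation_signals(["200a", "200b", "200c"])
for dev, sig in signals.items():
    print(f"send to {dev}: {sig['kind']} -> {sig['value']}")
```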
  • the robot can identify location information of electronic devices in space (S330).
  • Location information may include surface and coordinate information in three-dimensional space where the electronic device is located.
  • the robot may identify location information of the electronic device, such as the first electronic device being located on the ceiling at coordinates (3, 5).
  • Coordinate information can be calculated individually for each surface. For example, the position (1, 1) on the ceiling can be calculated based on a first reference point set in the virtual grid of the ceiling, and the position (1, 1) on the right wall can be calculated based on a second reference point set in the virtual grid of the right wall.
  • the robot can identify an object and identify location information of an electronic device located on the object. For example, the robot can identify a table and identify location information about the second electronic device located on the table, such as the top of the table.
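  • A minimal sketch of per-surface grid coordinates, assuming each surface has its own reference point and an illustrative 0.5 m cell size:
```python
# Minimal sketch: per-surface grid coordinates. Each surface has its own
# reference point (grid origin); the cell size is an assumption.

REFERENCE_POINTS = {
    "ceiling":    (0.0, 0.0),   # first reference point
    "right_wall": (0.0, 0.0),   # second reference point (wall-local axes)
}
CELL_SIZE = 0.5  # meters per grid cell, illustrative

def to_grid(surface, u, v):
    """Convert surface-local metric coordinates to grid coordinates."""
    ox, oy = REFERENCE_POINTS[surface]
    return (int((u - ox) // CELL_SIZE) + 1, int((v - oy) // CELL_SIZE) + 1)

# The same metric offset yields (1, 1) on each surface because each grid
# is anchored to its own reference point.
print(to_grid("ceiling", 0.2, 0.3))     # -> (1, 1)
print(to_grid("right_wall", 0.2, 0.3))  # -> (1, 1)
```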
  • the robot can update the stored 3D information by mapping the electronic device to space (S340).
  • the robot can generate matching information that matches the identification information and location information of the electronic device.
  • 3D information can be created (updated) by mapping the generated matching information to a specific space.
  • the generated (updated) 3D information may be transmitted to the terminal device and/or server.
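  • A minimal sketch of step S340, with the transmission stubbed out and the entry schema assumed:
```python
# Minimal sketch of S340: adding or updating a device entry in the stored
# 3D information and forwarding the result. The transport is a stub.

three_d_info = {"space": "Room 3", "devices": {}}

def upsert_device(dev_id, surface, coords):
    """New device: add matching info. Existing device: modify its location."""
    three_d_info["devices"][dev_id] = {"surface": surface, "coords": coords}

def transmit(destination, payload):
    print(f"to {destination}: {payload}")  # stub for terminal/server upload

upsert_device("200a", "ceiling", (3, 3))   # newly identified device
upsert_device("200a", "ceiling", (3, 4))   # same device after it moved
transmit("terminal device 400", three_d_info)
```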
  • the robot may transmit an existing operation confirmation signal corresponding to the type of the identified electronic device to the new electronic device.
  • if the new electronic device operates according to the existing operation confirmation signal, the robot can identify the new electronic device as an existing electronic device whose location has changed.
  • the robot can update the 3D information by modifying the location information of the existing electronic device.
  • the robot can enable the robot cleaner to update the cleaning map by transmitting information related to the location of a new object to the robot cleaner.
  • the robot can test the operational performance of the electronic device by transmitting a test signal corresponding to the type of the electronic device to the electronic device. If the operational performance of the electronic device is outside a preset range, the robot can transmit information related to the operational performance to the terminal device.
  • the robot, when located near the user, can receive the user's voice command.
  • the robot can identify user-related information from the user's image acquired through a camera and identify the electronic device to be controlled based on the identified user-related information.
  • user-related information may include the user's location, gaze direction, pointing direction, etc.
  • when a robot finds an electronic device, it can identify whether the device is a new or an existing electronic device. If the device is a new electronic device, the robot can perform a process of matching its identification information and location information and mapping the result to map data. If the device is an existing electronic device, the robot can perform a process of updating the existing location information. Additionally, when the robot detects a new electronic device, a change in location, or a change in the surrounding environment, it can perform a process to test the operational performance of the electronic device.
  • a computer program product may include the S/W program itself or a non-transitory computer readable medium in which the S/W program is stored.
  • a non-transitory readable medium refers to a medium that stores data semi-permanently and can be read by a device, rather than a medium that stores data for a short period of time, such as registers, caches, and memories.
  • the various applications or programs described above may be stored and provided on non-transitory readable media such as CD, DVD, hard disk, Blu-ray disk, USB, memory card, ROM, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • Acoustics & Sound (AREA)
  • Artificial Intelligence (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Environmental & Geological Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Computing Systems (AREA)
  • Manipulator (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)

Abstract

The disclosure relates to a robot and a control method for the robot. The robot comprises a camera, a communication interface, a memory for storing three-dimensional information corresponding to a specific space, and at least one processor. When an electronic device is identified from an image obtained by the camera while the robot travels in the specific space, the at least one processor obtains identification information of the identified electronic device. The at least one processor transmits, to the electronic device through the communication interface, an operation confirmation signal corresponding to the type of the identified electronic device. When an operation of the electronic device according to the operation confirmation signal is identified, the at least one processor identifies location information of the electronic device in the specific space. The at least one processor updates the three-dimensional information by mapping the electronic device to the specific space based on the identification information of the electronic device and the location information of the electronic device.
PCT/KR2023/009125 2022-08-24 2023-06-29 Robot pour la gestion de dispositif électronique et son procédé de commande WO2024043488A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0105950 2022-08-24
KR1020220105950A KR20240028574A (ko) 2022-08-24 2022-08-24 전자 장치를 관리하는 로봇 및 그 제어 방법

Publications (1)

Publication Number Publication Date
WO2024043488A1 true WO2024043488A1 (fr) 2024-02-29

Family

ID=90013480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/009125 WO2024043488A1 (fr) 2022-08-24 2023-06-29 Robot pour la gestion de dispositif électronique et son procédé de commande

Country Status (2)

Country Link
KR (1) KR20240028574A (fr)
WO (1) WO2024043488A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140316636A1 (en) * 2013-04-23 2014-10-23 Samsung Electronics Co., Ltd. Moving robot, user terminal apparatus and control method thereof
KR20180084305A (ko) * 2017-01-16 2018-07-25 엘지전자 주식회사 이동 로봇
KR20180087779A (ko) * 2017-01-25 2018-08-02 엘지전자 주식회사 이동 로봇
KR20180136833A (ko) * 2017-06-15 2018-12-26 엘지전자 주식회사 3차원 공간의 이동 객체를 식별하는 방법 및 이를 구현하는 로봇
KR20190102148A (ko) * 2019-03-08 2019-09-03 엘지전자 주식회사 로봇

Also Published As

Publication number Publication date
KR20240028574A (ko) 2024-03-05

Similar Documents

Publication Publication Date Title
WO2019135514A1 (fr) Robot domestique mobile et procédé de commande du robot domestique mobile
WO2018128355A1 (fr) Robot et dispositif électronique servant à effectuer un étalonnage œil-main
WO2016060370A1 (fr) Terminal pour un internet des objets et son procédé de fonctionnement
WO2020013413A1 (fr) Procédé de commande d'un appareil électronique et support d'enregistrement lisible par ordinateur
WO2018128475A1 (fr) Commande de réalité augmentée de dispositifs de l'internet des objets
WO2016072635A1 (fr) Dispositif terminal utilisateur et procédé de commande de celui-ci, et système de fourniture de contenus
WO2016060371A1 (fr) Terminal de l'internet des objets, et procédé de fonctionnement correspondant
WO2016085110A1 (fr) Système d'éclairage et procédé pour enregistrer un dispositif d'éclairage
WO2015034135A1 (fr) Dispositif d'affichage et son procédé de commande
WO2017065535A1 (fr) Dispositif électronique et son procédé de commande
WO2020241933A1 (fr) Robot maître commandant un robot esclave et son procédé de fonctionnement
WO2016204357A1 (fr) Terminal mobile et procédé de commande correspondant
WO2020055112A1 (fr) Dispositif électronique, et procédé pour l'identification d'une position par un dispositif électronique
WO2020096288A1 (fr) Appareil d'affichage et son procédé de commande
WO2020017834A1 (fr) Système comprenant une pluralité de dispositifs d'affichage, et procédé de commande associé
WO2015046681A1 (fr) Dispositif numérique et son procédé de commande
WO2019009486A1 (fr) Système et procédé de suivi optique
EP3387821A1 (fr) Dispositif électronique et son procédé de commande
WO2022250300A1 (fr) Procédé et appareil électronique pour acquérir une carte de sol d'un agencement de pièce
WO2024043488A1 (fr) Robot pour la gestion de dispositif électronique et son procédé de commande
WO2020027442A1 (fr) Procédé de stockage d'informations sur la base d'une image acquise par l'intermédiaire d'un module de caméra, et dispositif électronique l'utilisant
WO2014073939A1 (fr) Procédé et appareil de capture et d'affichage d'image
EP3167405A1 (fr) Appareil et système de traitement d'image numérique et leur procédé de commande
WO2020241973A1 (fr) Appareil d'affichage et son procédé de commande
WO2022060023A1 (fr) Dispositif électronique et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23857526

Country of ref document: EP

Kind code of ref document: A1