WO2022211224A1 - Robot remote control method and system, and building in which a robot robust against communication latency is driven - Google Patents


Info

Publication number
WO2022211224A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
control command
time
delay time
server
Prior art date
Application number
PCT/KR2022/000045
Other languages
English (en)
Korean (ko)
Inventor
윤영환
박경식
Original Assignee
네이버랩스 주식회사
Priority date
Filing date
Publication date
Priority claimed from KR1020210043612A external-priority patent/KR102484773B1/ko
Application filed by 네이버랩스 주식회사
Publication of WO2022211224A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process

Definitions

  • the present invention relates to remote control of robots, and more particularly to a robot remote control method and system capable of controlling a robot in consideration of communication delay.
  • a brainless robot system includes a plurality of brainless robots and a brain server. In such a system, each brainless robot reports its sensor information and status to a remote or edge-located brain server, and receives and processes control commands, such as movement commands, from that server.
  • a delay may occur due to communication between the robot and the server.
  • communication delay is unavoidable in a brainless robot system and can significantly affect control of the robot.
  • Korean Patent Laid-Open No. 10-2011-0079872, "Method and Apparatus for Time and Frequency Transmission in a Communication Network," discloses a method for calculating communication delay by periodically exchanging messages between a client device and a server.
  • an object of the present invention is to provide a method and system for remotely controlling a robot; more specifically, a robot remote control method and system capable of controlling the robot in consideration of the communication delay between the robot and the server.
  • another object of the present invention is to provide a robot remote control method and system for safely avoiding obstacles on the movement path of the robot in consideration of communication delay.
  • according to the present invention, a method for remotely controlling a robot through communication between the robot and a server includes: calculating a communication delay time between the robot and the server based on time information transmitted and received between them; generating a control command related to the driving of the robot based on the communication delay time; and transmitting the control command to the robot so that the robot travels according to it. The control command is generated based on the expected position of the robot at the time the robot receives the control command.
  • likewise, the system for remotely controlling a robot through communication between the robot and the server includes a communication unit configured to transmit and receive data to and from the robot, and a control unit that calculates the communication delay time between the robot and the server based on time information transmitted and received between them, generates a control command related to the driving of the robot based on that delay time, and transmits the control command to the robot so that the robot travels accordingly. Here too, the control command is generated based on the expected position of the robot at the time the robot receives it.
  • a program according to the present invention, executed by one or more processes in an electronic device and storable in a computer-readable recording medium, includes commands for calculating the communication delay time between the robot and the server based on time information transmitted and received between them, generating a control command related to the driving of the robot based on the communication delay time, and transmitting the control command to the robot so that the robot travels according to it. The control command may be generated based on the expected position of the robot at the time the robot receives it.
  • a building according to the present invention, in which a robot driven by a cloud server travels, includes an indoor area in which the robot travels, and the cloud server controls the robot based on time information transmitted and received between the robot and the cloud server.
  • the expected position of the robot is characterized in that it is calculated based on the communication delay time.
  • the time at which the robot will receive the control command is calculated based on the communication delay time, and the expected position of the robot is the position the robot is expected to reach by that time while traveling according to the control command previously transmitted to it.
  • the cloud server transmits a first message including time information to the robot and receives a second message corresponding to the first message from the robot. The communication delay time is calculated based on the time information included in the first message and the time point at which the second message is received; the second message includes time information related to the time point at which the first message was received and the time point at which the second message was transmitted.
  • the robot senses an obstacle in the indoor area and, based on that sensing, a safety area is set around the obstacle to avoid a collision with it. The control command includes a command to stop the robot's driving, or to move the robot out of the safety area, when the robot enters the safety area of the obstacle; at least one of the size and shape of the safety area is set based on the communication delay time.
  • the safety area for the obstacle is set with the obstacle as its reference: it is expanded around the obstacle when the communication delay time increases, and reduced when the communication delay time decreases.
  • the safety area is generated to include the area where the obstacle is expected to be located at the time the robot receives a control command from the cloud server.
  • the area in which the obstacle is expected to be located is determined based on at least one of the movement path and the movement speed of the obstacle.
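For illustration only, the delay-dependent sizing of the safety area described above can be sketched as follows. This is a minimal sketch, not part of the disclosure: the function names, the linear growth rule, and the fixed margin are all assumptions; the patent only requires that the area grow with the communication delay and cover the obstacle's expected position.

```python
def safety_radius(base_radius: float, obstacle_speed: float,
                  delay_s: float, margin: float = 0.1) -> float:
    """Radius (m) of a circular safety area around an obstacle.

    The area must cover where the obstacle may be by the time the robot
    actually receives a control command, so it grows with the communication
    delay and with the obstacle's movement speed.
    """
    return base_radius + obstacle_speed * delay_s + margin

# The area expands as the delay increases and shrinks as it decreases.
r_short_delay = safety_radius(base_radius=0.5, obstacle_speed=1.0, delay_s=0.05)
r_long_delay = safety_radius(base_radius=0.5, obstacle_speed=1.0, delay_s=0.30)
assert r_long_delay > r_short_delay
```

A stationary obstacle (`obstacle_speed=0`) keeps a delay-independent radius under this rule, which matches the intuition that prediction error only matters for moving obstacles.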
  • the communication delay time may also be calculated based on communication delay times received from a plurality of controlled robots other than the robot.
  • in a building according to the present invention, in which a robot is remotely controlled through communication between the robot and a cloud server, the cloud server calculates the communication delay time between the robot and the server based on time information transmitted and received between them, generates, based on that delay time, a control command related to the driving of the robot so that the robot avoids obstacles existing in the indoor area, and controls the robot to travel according to the control command. In this way, a robot controlled by the cloud server can travel in the building.
  • the robot remote control method and system according to the present invention calculate the expected position of the robot at the time it receives a control command, based on the communication delay time between the robot and the server, and perform driving control based on that expected position. The present invention thus enables preemptive, predictive control that takes communication delay into account.
  • the present invention enables the robot to safely travel without colliding with the obstacle even in a situation in which the robot cannot immediately respond to a change in the state of the obstacle due to communication delay.
  • FIGS. 1 and 2 are conceptual views for explaining a robot driving control method and system according to the present invention.
  • FIG. 3 is a flowchart for explaining a robot driving control method according to the present invention.
  • FIGS. 4A and 4B are conceptual diagrams illustrating a communication delay time calculation method according to an embodiment of the present invention.
  • FIG. 5 is a conceptual diagram for explaining a robot remote control method according to the present invention.
  • FIG. 6 is a conceptual diagram illustrating a state in which a robot avoids an obstacle according to an embodiment of the present invention.
  • FIG. 7 is a conceptual diagram illustrating the driving of the robot when the robot enters the safety area.
  • FIGS. 8A, 8B, and 9 are conceptual views illustrating embodiments in which the size and shape of a safety area are set differently according to the type and moving state of an obstacle.
  • the present invention provides a method and system for controlling the driving of a robot, and more specifically, for controlling the driving of the robot in consideration of communication delay.
  • first, a robot driving control system will be described. FIGS. 1 and 2 are conceptual views for explaining a robot driving control method and system according to the present invention.
  • the robot capable of providing various services as described above may be configured to travel in a space 10 as shown in FIG. 1 in order to perform an assigned task.
  • the robot may be configured to travel in at least one of an indoor space and an outdoor space as necessary.
  • the indoor space may be various spaces such as a department store, an airport, a hotel, a school, a building, a subway station, a train station, a bookstore, and the like.
  • the robot, as described above, may be arranged in various spaces to provide useful services to humans.
  • the present invention proposes a method for more accurately controlling a robot remotely using a camera disposed in space.
  • a camera 20 may be disposed in the space 10 in which the robot is located. The number of cameras 20 disposed in the space 10 is not limited, and as shown, a plurality of cameras 20a and 20b may be disposed in the space 10. The cameras 20 disposed in the space 10 may be of various types; in particular, closed-circuit television (CCTV) cameras already disposed in the space may be utilized in the present invention.
  • through the robot remote control system 300, it is possible to remotely manage and control the robot 100.
  • the robot remote control system 300 may utilize at least one of an image received from a camera 20 (e.g., CCTV) disposed in the space 10, an image received from the robot, information received from a sensor provided in the robot, and information received from various sensors provided in the space, in order to control the movement of the robot or to perform other appropriate control on the robot.
  • the robot remote control system 300 includes at least one of a communication unit 310 , a storage unit 320 , a display unit 330 , an input unit 340 , and a control unit 350 .
  • the communication unit 310 may be configured to communicate with various devices disposed in the space 10 by wire or wirelessly.
  • the communication unit 310 may communicate with the robot 100 as shown.
  • the communication unit 310 may be configured to receive an image photographed from a camera provided in the robot 100 through communication with the robot 100 .
  • the communication unit 310 may be configured to communicate with at least one external server (or external storage, 200 ).
  • the external server 200 may be configured to include at least one of the cloud server 210 and the database 220 as shown.
  • the external server 200 may be configured to perform at least a part of the role of the control unit 350. That is, data processing or data operations may be performed in the external server 200, and the present invention places no particular limitation on this.
  • the communication unit 310 may support various communication methods according to a communication standard of a communicating device.
  • the communication unit 310 may support wireless Internet technologies such as WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX, HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and 5G (5th Generation Mobile Telecommunication), as well as short-range communication technologies such as Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra-Wideband), ZigBee, NFC (Near Field Communication), and Wireless USB (Wireless Universal Serial Bus).
  • the storage unit 320 may be configured to store various information related to the present invention.
  • the storage unit 320 may be provided in the robot remote control system 300 itself.
  • at least a portion of the storage unit 320 may mean at least one of the cloud server 210 and the database 220 . That is, it can be understood that the storage unit 320 is a space in which information necessary for robot control according to the present invention is stored, and there is no restriction on the physical space. Accordingly, hereinafter, the storage unit 320 , the cloud server 210 , and the database 220 are not separately distinguished, and are all referred to as the storage unit 320 .
  • the cloud server 210 may mean “cloud storage”.
  • information about the robot 100 may be stored in the storage unit 320 .
  • information on the robot 100 may be very diverse; for example, it may include: i) identification information for identifying the robot 100 disposed in the space 10 (e.g., serial number, TAG information, QR code information), ii) task information assigned to the robot 100, iii) driving route information set in the robot 100, iv) location information of the robot 100, v) status information of the robot 100 (e.g., power state, whether a failure has occurred, battery state), and vi) image information received from a camera provided in the robot 100.
  • a map (or map information) for the space 10 may be stored in the storage unit 320 .
  • the map may be formed of at least one of a two-dimensional map or a three-dimensional map.
  • the map for the space 10 may refer to a map that can be used to determine a current location on the robot 100 or set a traveling route of the robot.
  • the position of the robot 100 may be determined based on an image received from the robot 100 or information received from the robot 100 .
  • the map for the space 10 stored in the storage unit 320 may be composed of data that enables a location to be estimated based on an image or sensing information.
  • the map for the space 10 may be a map prepared based on Simultaneous Localization and Mapping (SLAM) by at least one robot that moves the space 10 in advance.
  • various types of information may be stored in the storage unit 320 .
  • the display unit 330 may be configured to output an image received from at least one of a camera provided in the robot 100 and a camera 20 disposed in the space 10 .
  • the display unit 330 is provided in a device of an administrator who remotely manages the robot 100 , and may be provided in the remote control room 300a as shown in FIG. 2 .
  • the display unit 330 may be a display provided in a mobile device. As such, the present invention does not limit the type of the display unit.
  • the input unit 340 is for receiving information from the user (or administrator), and may serve as a medium between the user (or administrator) and the robot remote control system 300. More specifically, the input unit 340 may refer to an input means for receiving, from a user, a control command for controlling the robot 100.
  • the type of the input unit 340 is not particularly limited; the input unit 340 may include at least one of a mechanical input means (a mechanical key, e.g., a mouse, a joystick, a physical button, a dome switch, a jog wheel, or a jog switch) and a touch input means.
  • the touch input means may consist of a virtual key, a soft key, or a visual key displayed on a touch screen through software processing, or of a touch key disposed on a part other than the touch screen.
  • the virtual key or visual key can be displayed on the touch screen in various forms, for example as graphics, text, icons, video, or a combination thereof.
  • the display unit 330 may be configured as a touch screen. In this case, the display unit 330 may perform both a role of outputting information and a role of receiving information.
  • the control unit 350 may be configured to control the overall operation of the robot remote control system 300 related to the present invention.
  • the controller 350 may process signals, data, information, etc. input or output through the above-described components, or may provide or process information or functions appropriate to the user.
  • the control unit 350 may estimate the position of the robot 100.
  • the position estimation of the robot 100 may be performed by the robot 100 itself. That is, the robot 100 may estimate the current position in the manner described above based on the image received from the robot 100 itself. Then, the robot 100 may transmit the estimated position information to the controller 350 . In this case, the controller 350 may perform a series of controls based on the position information received from the robot.
  • the movement path of the robot can be set in space by using map information previously stored in the storage unit 320 .
  • the controller 350 may control the robot 100 to move from its current location to a specific destination.
  • the present invention specifies the current position information and the destination position information of the robot, sets a path to reach the destination, and controls the robot to move according to the set path to reach the destination.
  • the control for moving the robot to the destination may be performed through the robot remote control system 300 (hereinafter referred to as a server).
  • the server monitors the position of the robot, and generates driving information related to the driving of the robot based on the monitored position of the robot.
  • the server transmits driving information to the robot at regular time intervals, and the robot drives in space based on the information included in the driving information.
  • the driving information includes information related to driving of the robot after the robot receives the driving information until the next driving information is received.
  • the driving information includes information related to at least one of a moving speed, a moving direction, and a moving distance of the robot.
  • the robot drives by controlling at least one of a moving speed, a moving direction, and a moving distance based on the driving information.
  • for example, suppose the driving information is transmitted to the robot every 1 second, and the first driving information transmitted to the robot specifies that the robot travel at 1 m/s in a first direction.
  • the second driving information transmitted to the robot specifies that the robot travel at 2 m/s in a second direction for 0.5 seconds. From the time the robot receives the first driving information until it receives the second, the robot travels 1 m in the first direction; from the time the second driving information is received, the robot travels 1 m in the second direction and then stops.
  • the period of transmitting the driving information is for convenience of explanation, and the actual period of transmitting the driving information may be 0.1 seconds or less.
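The example above can be sketched in code. This is an illustrative sketch only: the `DrivingInfo` structure, the heading-in-radians convention, and the `apply` helper are assumptions and not part of the disclosure; the patent only specifies that each piece of driving information carries a speed, direction, and distance or duration.

```python
import math
from dataclasses import dataclass

@dataclass
class DrivingInfo:
    heading: float   # direction of travel, radians
    speed: float     # m/s
    duration: float  # seconds this command is followed

def apply(x: float, y: float, info: DrivingInfo) -> tuple[float, float]:
    """Position after driving according to one piece of driving information."""
    d = info.speed * info.duration
    return (x + d * math.cos(info.heading), y + d * math.sin(info.heading))

# First command: 1 m/s in the first direction (heading 0) for 1 second.
x, y = apply(0.0, 0.0, DrivingInfo(heading=0.0, speed=1.0, duration=1.0))
# Second command: 2 m/s in the second direction (heading pi/2) for 0.5 seconds.
x, y = apply(x, y, DrivingInfo(heading=math.pi / 2, speed=2.0, duration=0.5))
# As in the example above, the robot moves 1 m in each direction, then stops.
```

In a real system the robot would keep executing the current command until the next one arrives, rather than for a fixed stored duration.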
  • the server periodically transmits driving information to the robot so that the robot can travel along a preset movement path to reach a destination.
  • the control command transmitted to the robot may include the above-described driving information.
  • the server receives at least one of information sensed from at least one sensor included in the robot and image information captured by a camera included in the robot, and generates driving information for the robot.
  • the server when the server receives sensing information about an obstacle disposed on the movement path of the robot from the robot, the server generates driving information for the robot to avoid the obstacle.
  • the robot transmits the sensing information or image information collected by the robot to the server, and the server generates driving information based on the information received from the robot and then transmits it to the robot.
  • communication delay may occur during data transmission/reception between the server and the robot.
  • the communication delay between the server and the robot makes it difficult to control the robot's running.
  • the time when the robot transmits information to the server and the time when the server receives it differ by the communication delay. For this reason, when the current position of the robot is calculated from the information received from the robot, the calculated position corresponds to where the robot was one communication delay earlier.
  • the server generates driving information based on the monitored position of the robot, and transmits the generated driving information to the robot. Due to the communication delay, the time when the server transmits the driving information to the robot and the time when the robot receives the driving information are different. For this reason, the robot travels according to the driving information at a location different from the position at which the server transmits the driving information.
  • the present invention provides a method for controlling the driving of a robot in consideration of the communication delay time between the server and the robot.
  • a method for remotely controlling a robot in consideration of a communication delay time will be described in detail.
  • FIG. 3 is a flowchart for explaining a method for remotely controlling a robot according to the present invention, FIGS. 4A and 4B are conceptual diagrams illustrating a communication delay time calculation method according to an embodiment of the present invention, and FIG. 5 is a conceptual diagram for explaining a robot remote control method according to the present invention.
  • first, a step of calculating the communication delay time between the robot and the server based on time information transmitted and received between the robot and the server is performed (S110).
  • the communication delay time may be calculated by transmitting and receiving a message including time information between the robot and the server, and using the time information included in the message.
  • the time information defines information about a specific time based on a reference time system.
  • the time information may include information regarding a time point at which a message is received or information regarding a time point at which a message is transmitted.
  • the present invention is not limited thereto, and the time information may include information regarding various time points.
  • the reference time system may vary; for example, it may be at least one of UT (Universal Time), UTC (Coordinated Universal Time), KST (Korea Standard Time), GMT (Greenwich Mean Time), Unix time, and GPS time.
  • the communication delay time may be calculated by the server or the robot.
  • the subject for calculating the communication delay time is not specifically limited.
  • the communication delay time may be calculated by the robot. Specifically, the steps of the server receiving the first message including the time information from the robot, and the server transmitting the second message corresponding to the first message to the robot may be performed.
  • the first message includes time information about the time t1 at which the robot transmits the first message.
  • the second message may include time information related to the time t1 at which the first message was transmitted, and the time tp the server took, from receiving the first message, to generate and transmit the second message.
  • in other words, the second message echoes the time t1 at which the robot transmitted the first message, together with the time tp the server spent, after receiving the first message, generating a control command related to the robot and transmitting the second message containing it. However, the second message does not necessarily include a control command; when the second message is used only for measuring the delay time, it may contain no control-command-related information.
  • the robot calculates the communication delay time using the time t2 at which it receives the second message, the time information included in the second message, and the time information included in the first message.
  • the interval between the time t2 at which the robot receives the second message and the time t1 at which it transmitted the first message comprises the time tp the server took to perform the robot-related control and the communication delays of the two message transmissions.
  • the communication delay time t_latency1 may be calculated as in Equation 1 below.
  • [Equation 1] t_latency1 = (t2 - t1 - tp) / 2
  • that is, the communication delay time so obtained is the average of the delay incurred in transmitting the first message and the delay incurred in transmitting the second message.
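The robot-side exchange (first message, second message, Equation 1) can be sketched as follows. This is a minimal sketch: `send_to_server` is a hypothetical stand-in for the real transport, and, as Equation 1 implies, the up-link and down-link delays are assumed symmetric.

```python
import time

def robot_measure_delay(send_to_server, now=time.monotonic):
    """Robot-side one-shot communication delay measurement (Equation 1).

    `send_to_server` delivers the first message and blocks until the
    server's second message arrives; the reply echoes t1 and reports the
    server's processing time tp.
    """
    t1 = now()                          # first message transmitted at t1
    reply = send_to_server({"t1": t1})  # server replies with second message
    t2 = now()                          # second message received at t2
    tp = reply["tp"]                    # server-side processing time
    # Equation 1: half of the round trip, minus server processing time.
    return (t2 - t1 - tp) / 2.0
```

Per the text above, the robot could then send the result back to the server in a third message, or piggyback the time information on the sensing data it already transmits periodically.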
  • the robot transmits a third message including the communication delay time to the server.
  • the server can utilize the communication delay time between the robot and the server to generate a control command for the robot.
  • the robot may generate a message including separate time information and transmit it to the server, or may include the time information in specific information that the robot periodically transmits to the server and transmit it.
  • the robot periodically transmits sensing information sensed by a sensor included in the robot to the server.
  • a separate channel may be formed in the sensing information to include time information.
  • the communication delay time may be calculated by the server. Specifically, the steps of the server transmitting the first message including the time information to the robot, and the server receiving the second message corresponding to the first message from the robot may be performed.
  • the first message includes time information on the time t1 when the server transmits the first message.
  • the second message includes time information related to the time t1 at which the server transmitted the first message, and the time tp1 the robot took, from receiving the first message, to transmit the second message.
  • the server calculates the communication delay time using the time t2 at which it receives the second message, the time information included in the second message, and the time information included in the first message.
  • the interval between the time t2 at which the server receives the second message and the time t1 at which it transmitted the first message comprises the time tp1 the robot took to transmit the second message corresponding to the first message and the communication delays of the two message transmissions.
  • the communication delay time t_latency1 may then be calculated as in Equation 1 above, with tp1 in place of tp.
  • the communication delay time may be calculated based on the communication delay time calculated by the plurality of other control target robots.
  • the server may use, as the communication delay time of a specific robot, the communication delay time received from another controlled robot located in the area corresponding to the specific robot's current position, or the average of the communication delay times received from a plurality of other controlled robots.
  • the server may calculate the average communication delay time based on the communication delay time for a plurality of robots located in the same area. Specifically, the server may divide pre-stored map information into a plurality of zones, and set an average value of communication delay times of a plurality of robots located in a specific zone as a representative communication delay time for a specific zone. The representative communication delay time may be updated every preset time. The server may set a representative communication delay time for each of the plurality of zones, and when a specific robot enters a specific zone, it may generate a control command based on the representative communication delay time corresponding to the specific zone.
  • the present invention can thus apply a communication delay time to the driving control of every controlled robot without individually measuring the delay for each one.
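The zone-based scheme above, in which each zone carries a representative delay averaged over the robots inside it, can be sketched as follows (`representative_delays` and the report format are assumptions for illustration):

```python
from collections import defaultdict
from statistics import mean

def representative_delays(robot_reports):
    """robot_reports: iterable of (zone_id, measured_delay_seconds).

    Returns the average measured delay per zone, which serves as that
    zone's representative communication delay for any robot entering it.
    """
    by_zone = defaultdict(list)
    for zone, delay in robot_reports:
        by_zone[zone].append(delay)
    return {zone: mean(delays) for zone, delays in by_zone.items()}

# Two robots reporting from zone "A", one from zone "B"
reps = representative_delays([("A", 0.08), ("A", 0.12), ("B", 0.05)])
# reps["A"] is approximately 0.10, reps["B"] approximately 0.05
```

A robot entering zone "A" would then be controlled using `reps["A"]` in place of a freshly measured delay, and the table would be recomputed every preset interval.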
  • the server generates a control command related to the robot's driving based on the communication delay time.
  • the control command may be generated based on an expected position of the robot when the robot receives the control command.
  • the server receives the position information of the robot from the robot, or calculates the position of the robot in space based on the information received from the robot.
  • the server may correct the position information of the robot based on the communication delay time. Specifically, the server calculates the distance the robot has moved during the communication delay time, starting from the position information received from the robot or the position calculated from information received from the robot. In doing so, the server calculates the expected position the robot will reach by the time it receives the control command, assuming the robot continues following the driving information previously transmitted to it.
  • the server uses the driving information last transmitted to the robot to correct the position of the robot.
  • for example, if the communication delay time is 1 second and the driving information last transmitted to the robot instructs it to drive at a speed of 1 m/s in the 12 o'clock direction, the server corrects the robot's current position by shifting the position received from the robot 1 m in the 12 o'clock direction.
  • this delay time is chosen for convenience of explanation; the actual delay time may be under 100 ms.
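The correction in the example above is simple dead reckoning along the last commanded heading. A sketch under assumed conventions (the clock-face heading mapping below is illustrative, not from the patent):

```python
import math

def corrected_position(reported_xy, heading_deg, speed_mps, delay_s):
    """Shift the reported position along the last commanded heading by
    the distance covered during the communication delay.

    heading_deg: 0 degrees = the 12 o'clock direction (+y axis),
    measured clockwise -- an illustrative convention.
    """
    x, y = reported_xy
    dist = speed_mps * delay_s
    rad = math.radians(heading_deg)
    return (x + dist * math.sin(rad), y + dist * math.cos(rad))

# The example from the text: 1 s delay, 1 m/s toward 12 o'clock
# shifts the estimate 1 m "up" from the reported position.
pos = corrected_position((0.0, 0.0), 0.0, 1.0, 1.0)
```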
  • the server calculates the expected position of the robot at the point in time when the robot receives the control command. Specifically, the server calculates the expected time at which the robot receives the control command by using the estimated time required to generate and transmit the control command related to the robot and the previously calculated communication delay time.
  • the robot's position information is periodically transmitted to the server while the robot is driving, and the server periodically transmits the driving information to the robot in response thereto.
  • the robot receives the driving information from the server after 'communication delay time + server processing time + communication delay time' from the time the robot's location information is transmitted to the server.
  • the server calculates the movement path of the robot during the time interval between the time the position information is received from the robot and the expected time at which the robot receives the control command. Specifically, based on the position information received from the robot, the server calculates the expected position of the robot at the expected time at which the robot receives the control command. In doing so, before transmitting the new control command, the server assumes the robot continues driving according to the control command previously transmitted to it.
  • the server calculates the expected position of the robot after 'communication delay time + server processing time + communication delay time' elapses, and generates driving information of the robot based on the expected position.
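The 'communication delay time + server processing time + communication delay time' horizon can be sketched as a straight-line projection, assuming the robot keeps executing its previous velocity command (names and the constant-velocity assumption are illustrative):

```python
def expected_position(reported_xy, velocity_xy, one_way_delay_s,
                      server_processing_s):
    """Project the robot's position to the moment it will receive the
    next command: uplink delay + server processing + downlink delay.
    Assumes the robot keeps following its previous velocity command.
    """
    horizon = one_way_delay_s + server_processing_s + one_way_delay_s
    x, y = reported_xy
    vx, vy = velocity_xy
    return (x + vx * horizon, y + vy * horizon)

# 0.2 s each way plus 0.1 s of server work -> a 0.5 s horizon,
# matching the 0.5 m shift in the Fig. 5 example at 1 m/s.
pos = expected_position((0.0, 0.0), (1.0, 0.0), 0.2, 0.1)
```

The server would then plan the new path (e.g. the second path 520b) from this projected position rather than from the stale reported one.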
  • the robot 100 transmits its current position information to the server and moves along path 530 according to the driving information preset in the robot.
  • in order to reach the destination 510 from its initial position, the robot must travel along the first path 520a. However, by the time the robot receives the driving information corresponding to the first path 520a, it has moved to another location 100'. If the robot travels along the first path 520a from location 100', it cannot reach the destination.
  • before transmitting a new control command to the robot 100, the server calculates the expected position of the robot based on the driving information previously transmitted to it.
  • the previously transmitted driving information defines a driving direction corresponding to 530 and a driving speed of 1 m/s.
  • the server calculates the expected interval between the time the robot transmits its location information from the initial position and the time the robot receives the control command as 0.5 seconds. The server then takes the position 0.5 m along the traveling direction corresponding to 530 from the robot's initial position as the expected position 100' of the robot.
  • the server therefore transmits to the robot a second path 520b that can reach the destination from the position 100' the robot will occupy when it receives the driving information.
  • in this way, based on the communication delay time between the robot and the server, the present invention calculates the expected position of the robot at the time the robot receives the control command, and performs driving control of the robot based on that expected position. The present invention can thereby perform preemptive, predictive control that accounts for communication delay.
  • the robot performs running control to avoid the obstacles disposed in the space.
  • FIG. 6 is a conceptual diagram illustrating a state in which a robot avoids an obstacle according to an embodiment of the present invention
  • FIG. 7 is a conceptual diagram illustrating the driving of the robot when the robot enters a safe area
  • FIGS. 8a, 8b and 9 are conceptual diagrams illustrating an embodiment in which the size and shape of the safe area are set differently according to the type and moving state of the obstacle.
  • the robot senses obstacles included in the driving space and transmits the sensed information to the server. Based on the sensing of the obstacle, the server may set a safe area for the obstacle so that the robot avoids a collision with the obstacle.
  • the safe area means a virtual area formed around the obstacle based on the position where the obstacle is disposed, and is an area in which the minimum safety distance required for the robot to avoid the obstacle is set.
  • from the time the robot transmits sensing information about an obstacle to the server until the time it receives a control command for avoiding the obstacle, the robot drives according to the preset driving information.
  • the safe area may be set based on the distance within which the robot and the obstacle cannot collide even if no control command is input to the robot between the time the robot transmits the sensing information about the obstacle to the server and the time it receives the control command to avoid the obstacle.
  • when the robot moves toward an obstacle outside the safe area, it can modify its movement path through communication with the server; but when it moves toward the obstacle inside the safe area, it may collide with the obstacle. Accordingly, when the robot detects an obstacle, the server sets a safe area around the obstacle so that the robot drives outside it.
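One way to size such a safe area is to cover the distance the robot travels during the whole reaction window (uplink delay, server processing, downlink delay) plus a fixed margin. A sketch under those assumptions (`margin_m` is an illustrative parameter, not from the patent):

```python
def safe_area_radius(robot_speed_mps, one_way_delay_s,
                     server_processing_s, margin_m=0.2):
    """Minimum radius around an obstacle inside which the robot could
    not be stopped in time: the distance the robot covers between
    sending the obstacle sensing data and receiving the avoidance
    command, plus a fixed safety margin.
    """
    reaction_s = one_way_delay_s + server_processing_s + one_way_delay_s
    return robot_speed_mps * reaction_s + margin_m

# 1 m/s robot, 0.2 s each way, 0.1 s server work:
# 1.0 * 0.5 + 0.2 = 0.7 m radius
r = safe_area_radius(1.0, 0.2, 0.1)
```

Because the radius scales with the delay terms, re-measuring the communication delay directly yields the expansion and contraction of the safe area described below.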
  • information on the location, size, and shape of the safe area may be stored matched to the map information, or transmitted to the robot that sensed the obstacle.
  • the server establishes a safe area for obstacles and creates a movement path that can be driven while avoiding the set safe area.
  • when the server receives sensing information about an obstacle from the robot 100, it sets the safe area 620 based on the detected obstacle 610. Thereafter, the server establishes a movement path 630 that avoids the safe area, and transmits information about the safe area 620 and driving information for the movement path 630 to the robot.
  • the server transmits a control command to the robot to stop the robot's movement or to move the robot out of the safe area.
  • information on the safe area may be stored in the robot, and when the robot enters the pre-stored safe area, it may be programmed to stop running or to move out of the safe area.
  • the information about the safe area stored in the robot may be updated by the server.
  • a safety area 720 is set for the robot based on a specific obstacle 710 .
  • when the robot enters the safety area 720 while driving in direction 730a toward the obstacle, the robot stops and waits until it receives a new control command from the server, or, even without receiving a control command from the server, travels in direction 730b to move out of the safe area.
  • when the server receives sensing information about an obstacle from the robot, it sets a safe area for the obstacle based on the communication delay time. Specifically, the server varies the size and shape of the safe area according to the time interval between when the robot transmits the sensing information about the obstacle and when it receives the control command for obstacle avoidance.
  • the time at which the robot receives the control command for obstacle avoidance may be calculated based on the pre-calculated communication delay time and server expected processing time.
  • the server may set the size of the safe area in proportion to the time interval from when the robot transmits sensing information about the obstacle to the server to the time when the control command for obstacle avoidance is received.
  • the server may periodically update the communication delay time with each robot.
  • when the communication delay time increases, the server expands the safe area set around a specific obstacle; when the communication delay time decreases, the server reduces it.
  • the location, size, and shape of the safety area may vary depending on the type of obstacle, the moving direction of the obstacle, the moving path of the obstacle, and the moving speed of the obstacle.
  • the server may determine at least one of the size and shape of the safety area based on the type of obstacle. Specifically, the server determines the type of obstacle based on information related to the obstacle received from the robot.
  • the server may classify the obstacle as either a fixed obstacle or a moving obstacle.
  • alternatively, the server may set the obstacle type by selecting one of: another controlled robot, a fixed obstacle, a person, or another moving obstacle.
  • for a fixed obstacle, the server sets the safe area by considering only the distance the robot moves during the communication delay time. Accordingly, the safe area for a fixed obstacle may be smaller than the safe area for a moving obstacle.
  • the server may determine at least one of the size and shape of the safety area based on at least one of the moving path and the moving speed of the obstacle.
  • the server determines the location of the obstacle based on information related to the obstacle received from the robot 100 .
  • the position of the obstacle detected by the above method is the position of the obstacle at the point in time when the robot transmits information related to the obstacle.
  • by the time the robot receives the control command for avoiding the obstacle, the obstacle may have moved to another position 810', and the robot may also have moved to another position 100'. Accordingly, the distance between the obstacle and the robot may shrink rapidly.
  • the server may set the size of the safe area for the obstacle in proportion to the obstacle's moving speed. The faster the obstacle moves, the faster it approaches the robot, so the server can set a relatively large safe area for a fast-moving obstacle.
  • the server may determine the shape of the safe area based on the obstacle's moving direction. Specifically, the server may make the safe area relatively wide in the direction in which the obstacle moves and relatively narrow in the directions in which it does not.
  • the server may set the shape of the safety area differently based on whether it is possible to calculate the movement path of the obstacle.
  • the safe area is created to include an area in which the obstacle is located when the robot receives a control command.
  • the safe area is generated in a form corresponding to the obstacle's movement speed.
  • when the obstacle is another controlled robot 810, the server can predict its movement path from the driving information previously transmitted to that robot. Based on that driving information, the server calculates the expected position 810' of the other robot at the time the robot receives the control command.
  • the safe area is set to include the expected position 810' of the other robot.
  • the safety area 820 corresponding to the other robot may be formed in an oval shape elongated in the moving direction of the other robot.
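An elongated safe area like 820 can be modeled as an ellipse whose major axis follows the obstacle's heading and grows with the distance the obstacle can cover before the avoidance command arrives. The axis-scaling rule below is an illustrative assumption, not the patent's formula:

```python
import math

def point_in_safe_ellipse(point, obstacle_xy, obstacle_heading_deg,
                          base_radius_m, obstacle_speed_mps,
                          reaction_time_s):
    """Elliptical safe area elongated along the obstacle's heading.

    Semi-major axis: base radius plus the distance the obstacle can
    cover during the reaction time. Semi-minor axis: the base radius.
    obstacle_heading_deg is measured from the +x axis (an assumed
    convention). Returns True if `point` lies inside the safe area.
    """
    a = base_radius_m + obstacle_speed_mps * reaction_time_s  # along motion
    b = base_radius_m                                         # across motion
    dx = point[0] - obstacle_xy[0]
    dy = point[1] - obstacle_xy[1]
    rad = math.radians(obstacle_heading_deg)
    # rotate into the obstacle's frame, heading along +x
    u = dx * math.cos(rad) + dy * math.sin(rad)
    v = -dx * math.sin(rad) + dy * math.cos(rad)
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0

# Obstacle at the origin heading +x at 1 m/s, 0.5 s reaction window,
# 0.5 m base radius: a = 1.0 m along motion, b = 0.5 m across it.
inside = point_in_safe_ellipse((0.9, 0.0), (0.0, 0.0), 0.0, 0.5, 1.0, 0.5)
outside = point_in_safe_ellipse((0.0, 0.9), (0.0, 0.0), 0.0, 0.5, 1.0, 0.5)
```

A point 0.9 m ahead of the obstacle falls inside the area while a point 0.9 m to the side does not, matching the oval elongated in the moving direction described above.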
  • the server may set the safety area in a circle centered on the obstacle.
  • the safety area may be set in a circular shape centered on the obstacle. The width of the safety area may be relatively larger than that of other types of obstacles.
  • the server may reduce the size of the safe area from the second size 920b to the first size 920a, or expand it from the second size 920b to the third size 920c, based on the moving speed of the obstacle 910.
  • by setting the safe area in consideration of the characteristics of the obstacle, the robot can travel safely without colliding with the obstacle even in situations in which it cannot immediately respond to a change in the obstacle's state due to communication delay.
  • the present invention described above may be implemented as a program storable in a computer-readable medium, which is executed by one or more processes in a computer.
  • the present invention may be implemented as computer-readable code or instructions on a medium in which a program is recorded. That is, the present invention may be provided in the form of a program.
  • the computer-readable medium includes all types of recording devices in which data readable by a computer system is stored.
  • examples of computer-readable media include hard disk drives (HDD), solid state disks (SSD), silicon disk drives (SDD), ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices.
  • the computer-readable medium may be a server or a cloud storage that includes a storage and that an electronic device can access through communication.
  • the computer may download the program according to the present invention from a server or cloud storage through wired or wireless communication.
  • the computer described above is an electronic device equipped with a processor, that is, a central processing unit (CPU), and there is no particular limitation on its type.

Abstract

The present invention relates to remote control of a robot, and to a robot remote-control method and system capable of controlling a robot in consideration of communication latency. Disclosed is a method for remotely controlling a robot through communication between the robot and a server, the method comprising the steps of: calculating a communication latency time between the robot and the server on the basis of time information transmitted and received between the robot and the server; generating a control command related to the driving of the robot on the basis of the communication latency time; and transmitting the control command to the robot such that the robot is driven according to the control command, the control command being generated on the basis of an expected position of the robot at the time the robot receives the control command.
PCT/KR2022/000045 2021-04-02 2022-01-04 Procédé et système de commande à distance de robot, et bâtiment dans lequel un robot robuste vis-à-vis d'une latence de communication est piloté WO2022211224A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2021-0081914 2021-04-02
KR1020210043612A KR102484773B1 (ko) 2021-04-02 2021-04-02 로봇 원격 제어 방법 및 시스템
KR10-2021-0043612 2021-04-02
KR1020210081914A KR102485644B1 (ko) 2021-04-02 2021-06-23 통신 지연에 강인한 로봇이 주행하는 건물

Publications (1)

Publication Number Publication Date
WO2022211224A1 true WO2022211224A1 (fr) 2022-10-06

Family

ID=83456425

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/000045 WO2022211224A1 (fr) 2021-04-02 2022-01-04 Procédé et système de commande à distance de robot, et bâtiment dans lequel un robot robuste vis-à-vis d'une latence de communication est piloté

Country Status (2)

Country Link
KR (1) KR102485644B1 (fr)
WO (1) WO2022211224A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130027349A (ko) * 2011-09-07 2013-03-15 엘지전자 주식회사 로봇 청소기, 단말 장치, 및 로봇 청소기의 원격 제어 시스템과 방법
KR20140126539A (ko) * 2013-04-23 2014-10-31 삼성전자주식회사 이동로봇, 사용자단말장치 및 그들의 제어방법
JP2016520944A (ja) * 2013-06-03 2016-07-14 コントロールワークス プライベート リミテッド ロボットデバイスのオフボードナビゲーションの方法及び装置
JP2020500763A (ja) * 2016-12-01 2020-01-16 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd 静止及び移動体に対する無人航空機の飛行方法
KR20210015211A (ko) * 2019-08-01 2021-02-10 엘지전자 주식회사 실시간으로 클라우드 슬램을 수행하는 방법 및 이를 구현하는 로봇과 클라우드 서버


Also Published As

Publication number Publication date
KR102485644B1 (ko) 2023-01-05
KR102485644B9 (ko) 2023-12-07
KR20220137506A (ko) 2022-10-12

Similar Documents

Publication Publication Date Title
WO2020105898A1 (fr) Système de drone en vol autonome basé sur des mégadonnées et procédé de vol autonome associé
WO2018070663A1 (fr) Robot d'aéroport et son procédé de fonctionnement
WO2018070664A1 (fr) Robot auxiliaire pour aéroport et procédé de fonctionnement de celui-ci
WO2017039077A1 (fr) Dispositif de station de base mobile destiné à gérer des catastrophes, à l'aide de drone, et son procédé de fonctionnement
WO2015167080A1 (fr) Procédé et appareil de commande de véhicule aérien sans pilote
WO2016049906A1 (fr) Procédé et dispositif d'instructions de vol, et aéronef
WO2016104917A1 (fr) Procédé de fourniture d'informations d'arrivée, serveur et dispositif d'affichage
WO2020045896A1 (fr) Procédé et appareil permettant d'améliorer une procédure de transfert intercellulaire pour prendre en charge un transfert intercellulaire conditionnel dans un système de communication sans fil
WO2017048067A1 (fr) Terminal et procédé pour mesurer un emplacement de celui-ci
WO2017090800A1 (fr) Système de surveillance et de sécurité utilisant un robot
WO2018043821A1 (fr) Système de guidage et d'indication d'itinéraire, utilisant des informations météorologiques, pour aéronef sans pilote, procédé associé, et support d'enregistrement sur lequel est enregistré un programme informatique
EP3351023A1 (fr) Terminal et procédé pour mesurer un emplacement de celui-ci
WO2021125510A1 (fr) Procédé et dispositif de navigation dans un environnement dynamique
WO2022211224A1 (fr) Procédé et système de commande à distance de robot, et bâtiment dans lequel un robot robuste vis-à-vis d'une latence de communication est piloté
WO2024049057A1 (fr) Système et procédé de surveillance intelligente de risque d'effondrement
WO2022225134A1 (fr) Procédé et système de commande à distance de robot, et bâtiment dans lequel un robot monte dans un ascenseur en une position optique d'attente et est entraîné
KR20220139137A (ko) 로봇의 통신 장애 복구 방법 및 시스템
KR101815466B1 (ko) 무인 비행체를 이용한 실종자의 위치추적 시스템 및 이의 제어방법
WO2022211225A1 (fr) Procédé et système de commande de déplacement d'un robot, et bâtiment, dans lequel un robot ayant un trajet de déplacement, dans ce bâtiment, commandé en fonction d'un encombrement dans l'espace, se déplace
WO2018062598A1 (fr) Procédé et système de stockage de distribution de données
WO2022215838A1 (fr) Procédé et système de récupération après une défaillance de communication d'un robot, et bâtiment dans lequel un robot résistant à des conditions de défaillance de réseau se déplace
KR102484773B1 (ko) 로봇 원격 제어 방법 및 시스템
WO2018088703A1 (fr) Système de partage de parc de stationnement prenant en compte des niveaux de compétence de conduite, procédé associé et support d'enregistrement enregistré avec un programme d'ordinateur
WO2024034911A1 (fr) Dispositif électronique, système et procédé de commande de robot
WO2022196909A1 (fr) Procédé et système de commande à distance de robots, et bâtiment ayant des robots mobiles répondant de manière flexible à des obstacles

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22781345

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE