WO2023120918A1 - Robot-friendly building, and charging control method and system for robot - Google Patents

Robot-friendly building, and charging control method and system for robot

Info

Publication number
WO2023120918A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
charging station
charging
point
marker
Prior art date
Application number
PCT/KR2022/015878
Other languages
English (en)
Korean (ko)
Inventor
박순용
김덕화
Original Assignee
네이버랩스 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 네이버랩스 주식회사 filed Critical 네이버랩스 주식회사
Publication of WO2023120918A1 publication Critical patent/WO2023120918A1/fr

Classifications

    • B25J5/00 Manipulators mounted on wheels or on carriages
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/008 Manipulators for service tasks
    • B25J13/089 Determining the position of the robot with reference to its environment
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/005 Accessories using batteries, e.g. as a back-up power source
    • B25J19/02 Sensing devices
    • B25J19/023 Optical sensing devices including video camera means
    • G05D1/244 Arrangements for determining position or orientation using passive navigation aids external to the vehicle, e.g. markers, reflectors or magnetic means
    • G05D1/43 Control of position or course in two dimensions
    • G06K19/06009 Record carriers with optically detectable marking
    • H02J7/00 Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries

Definitions

  • the present invention relates to a robot-friendly building. More specifically, the present invention relates to a building in which robots and humans coexist in the same space and in which robots provide useful services to humans.
  • the present invention relates to a charging control method and system for a robot that can accurately dock a robot that needs to be charged to a charging station.
  • the present invention relates to a method and system for controlling charging of a robot that can be applied to a robot-friendly building, and in particular to a system for controlling the movement and posture of a robot so that the robot docks accurately at a charging station and performs charging.
  • robots have reached a level where they can safely coexist with humans in an indoor space.
  • robots have been replacing human labor and tasks, and in particular, various methods for robots to directly provide services to people in indoor spaces are being actively researched.
  • robots provide route guidance services in public places such as airports, stations, and department stores, and serving services in restaurants. Furthermore, robots provide delivery services in which mail and parcels are delivered in office spaces, co-living spaces, and the like. In addition, robots provide various services such as cleaning services, crime prevention services, and logistics handling services. The type and range of services provided by robots are expected to increase exponentially in the future, and the level of service provision is expected to continue to evolve.
  • robots provide various services not only in outdoor spaces but also in the indoor spaces of buildings such as offices, apartments, department stores, schools, hospitals, and amusement facilities. In this case, robots are controlled to provide various services while moving through the indoor space.
  • the present invention provides a method and system for controlling charging of a robot.
  • the present invention relates to a method and system for controlling charging of a robot, which can control charging of the robot in consideration of both the remaining battery level of the robot and the surrounding environment of the robot.
  • the present invention relates to a method and system for controlling charging of a robot through a charging station.
  • the present invention relates to a charging control method and system for a robot that controls a charging terminal of the robot to come into contact with a charging terminal of a charging station.
  • the present invention is to provide a robot-friendly building in which robots and humans coexist and provide useful services to humans.
  • the robot-friendly building according to the present invention can expand the types and ranges of services that can be provided by robots by providing various robot-friendly facility infrastructures that can be used by robots.
  • the robot-friendly building according to the present invention can manage the driving of service-providing robots more systematically by organically controlling a plurality of robots and the facility infrastructure using a cloud system that works in conjunction with the robots.
  • the robot-friendly building according to the present invention can provide various services to people more safely, quickly, and accurately.
  • the robot applied to the building according to the present invention can be implemented in a "brainless" form controlled by a cloud server; accordingly, a plurality of robots disposed in the building can be manufactured inexpensively without expensive sensors, and can nevertheless be controlled with high performance and high precision.
  • a method for controlling charging of a robot includes: receiving, from a server, identification information of a charging station at which the robot is to be charged, based on a charging event in the robot; controlling movement of the robot so that the robot moves to the place where the charging station is located, based on the identification information; recognizing a marker provided in the charging station through a sensor provided in the robot; and, based on the recognition, performing charging through the charging station.
  • the robot includes a body, a camera provided in the body, a traveling unit provided in the body and configured to travel in space, a communication unit communicating with a server, and a control unit that, based on a charging event, receives identification information of the charging station at which charging is to be performed from the server through the communication unit and controls the traveling unit so that the robot moves to the place where the charging station is located, based on the identification information. The control unit recognizes, through the camera, a marker provided in the charging station and, based on the recognition, controls the movement of the robot so that the robot faces the marker at a first point spaced apart from the charging station by a specific distance. Based on the robot being located at the first point, the control unit generates a movement path of the robot from the first point to a second point closer to the charging station than the first point, and charging may be performed through the charging station based on the robot moving toward the second point along the movement path.
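The two-point approach described above (face the marker at a first point, then move along the marker's normal to a closer second point) can be sketched geometrically. This is an illustrative sketch only, not the patent's implementation; the `Pose` type, function names, and distances are all hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # meters, expressed in the charging-station frame
    y: float
    theta: float  # heading in radians

def heading_to(marker: Pose, robot: Pose) -> float:
    """Heading the robot must adopt so that it faces the marker."""
    return math.atan2(marker.y - robot.y, marker.x - robot.x)

def docking_waypoints(marker: Pose, first_dist: float, second_dist: float):
    """First point: a pose `first_dist` in front of the marker, facing it.
    Second point: closer to the station, on the same marker normal."""
    def point_at(d: float) -> Pose:
        return Pose(marker.x - d * math.cos(marker.theta),
                    marker.y - d * math.sin(marker.theta),
                    marker.theta)
    return point_at(first_dist), point_at(second_dist)

# Example: marker at the station origin facing +x; approach from 1.0 m, dock at 0.1 m.
marker = Pose(0.0, 0.0, 0.0)
first, second = docking_waypoints(marker, first_dist=1.0, second_dist=0.1)
```

Because both waypoints lie on the marker's normal, a robot that reaches the first point already faces the marker, and the short path to the second point keeps the charging terminals aligned.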
  • the building according to the present invention includes a charging station where a robot performs charging, and the robot receives identification information of a charging station where the robot will perform charging from a server based on a charging event in the robot.
  • the charging system performs charging for the robot through collaboration between a server, a charging station, and a robot.
  • in the charging system, the robot receives the identification information of the charging station from the server; based on the identification information, the movement of the robot is controlled so that the robot moves to the place where the charging station is located; a marker provided in the charging station is recognized through a sensor provided in the robot; based on the recognition, the robot moves to a first point spaced apart from the charging station by a specific distance so that the robot faces the marker; and charging is then performed through the charging station.
  • the program according to the present invention may include instructions executing the steps of: receiving, from a server, identification information of a charging station at which the robot is to be charged, based on a charging event in the robot; controlling movement of the robot so that the robot moves to the place where the charging station is located; recognizing a marker provided in the charging station through a sensor provided in the robot; and, based on the recognition, performing charging through the charging station.
  • according to the robot charging control method and system of the present invention, when a charging event occurs in the robot, identification information of the charging station is received from the server, and the robot can move to the charging station based on the received identification information.
  • the robot charging control method and system according to the present invention can systematically manage the charging of the robot from the server's point of view.
  • the robot charging control method and system according to the present invention can perform a series of controls that move the robot toward the charging station while the robot faces the marker provided in the charging station. Through this, the charging terminal of the robot and the charging terminal of the charging station come into accurate contact, so that the robot can be charged smoothly.
  • the robot-friendly building according to the present invention can provide a new kind of space in which robot, autonomous-driving, AI, and cloud technologies are converged and connected, and in which these technologies are organically combined with the robots and the facility infrastructure provided in the building.
  • the robot-friendly building according to the present invention can manage the driving of service-providing robots more systematically by organically controlling a plurality of robots and the facility infrastructure using a cloud server that works in conjunction with the robots. Through this, the robot-friendly building according to the present invention can provide various services to people more safely, quickly, and accurately.
  • the robot applied to the building according to the present invention can be implemented in a "brainless" form controlled by a cloud server; accordingly, a plurality of robots disposed in the building can be manufactured inexpensively without expensive sensors, and can nevertheless be controlled with high performance and high precision.
  • robots and humans can coexist naturally in the same space because the tasks and movement situations assigned to the plurality of robots arranged in the building are taken into account, and the robots drive in a manner considerate of people.
  • FIGS. 1, 2 and 3 are conceptual diagrams for explaining a robot-friendly building according to the present invention.
  • FIGS. 4, 5 and 6 are conceptual diagrams for explaining a robot driving in a robot-friendly building and a system for controlling various facilities provided in the robot-friendly building according to the present invention.
  • FIGS. 7 and 8 are conceptual diagrams for explaining facility infrastructure provided in a robot-friendly building according to the present invention.
  • FIGS. 9 to 11 are conceptual diagrams for explaining a method of estimating the position of a robot traveling in a robot-friendly building according to the present invention.
  • FIGS. 12a, 12b and 13 are conceptual diagrams for explaining a method and system for controlling charging of a robot according to the present invention.
  • FIG. 14 is a conceptual diagram for explaining a charging station in the present invention.
  • FIG. 15 is a conceptual diagram for explaining a method for a robot to recognize a marker provided in a charging station according to the present invention.
  • FIGS. 16 and 17 are flowcharts for explaining a charging control method of a robot according to the present invention.
  • FIGS. 20, 21a, 21b, 22, 23a and 23b are conceptual diagrams for explaining a method of controlling the movement of a robot in the present invention.
  • the present invention relates to a robot-friendly building, and proposes a robot-friendly building in which humans and robots safely coexist and in which robots can provide beneficial services in a building.
  • the present invention provides a method of providing useful services to humans using robots, robot-friendly infrastructure, and various systems for controlling them.
  • humans and a plurality of robots can coexist, and various infrastructures (or facility infrastructures) that allow a plurality of robots to move freely within the building can be provided.
  • a building is a structure made for continuous residence, life, work, and the like, and may take various forms such as commercial, industrial, institutional, and residential buildings. A building may be a multi-story building having a plurality of floors, or a single-story building. However, in the present invention, for convenience of description, infrastructure or facility infrastructure applied to a multi-story building will be described as an example.
  • infrastructure or facility infrastructure is a facility provided in a building for service provision, robot movement, function maintenance, cleanliness maintenance, and the like, and its types and forms can be very diverse.
  • infrastructure provided in a building may be diverse, such as mobility facilities (e.g., robot passages, elevators, escalators, etc.), charging facilities, communication facilities, cleaning facilities, and structures (e.g., stairs, etc.).
  • these facilities are referred to as facilities, infrastructure, or facility infrastructure, and in some cases the terms are used interchangeably.
  • At least one of the building, various facility infrastructures, and robots provided in the building are controlled in conjunction with each other, so that the robot can safely and accurately provide various services within the building.
  • the building according to the present invention can be expressed in various ways, such as i) a building equipped with robot-used infrastructure, ii) a building equipped with robot-friendly infrastructure, iii) a robot-friendly building, iv) a building where robots and humans live together, or v) a building that provides various services using robots.
  • the term "robot-friendly" refers to a building in which robots coexist, and more specifically means that robots can drive in the building, that robots provide services there, that facility infrastructure usable by robots is built, or that facility infrastructure providing functions necessary for robots (e.g., charging, repair, washing, etc.) is established.
  • robot friendliness in the present invention can be used to mean having an integrated solution for the coexistence of robots and humans.
  • FIGS. 1, 2, and 3 are conceptual diagrams for explaining a robot-friendly building according to the present invention.
  • FIGS. 4, 5, and 6 are conceptual diagrams for explaining a robot driving in the robot-friendly building and a system for controlling various facilities provided in the robot-friendly building according to the present invention.
  • FIGS. 7 and 8 are conceptual diagrams for explaining facility infrastructure provided in a robot-friendly building according to the present invention.
  • a building is given a reference numeral “1000”, and a space (indoor space or indoor area) of the building 1000 is given a reference numeral “10” (see FIG. 8).
  • reference numerals 10a, 10b, and 10c are assigned to indoor spaces respectively corresponding to a plurality of floors constituting the indoor space of the building 1000 (see FIG. 8).
  • an indoor space or an indoor area is a concept opposite to the exterior of a building; it refers to the interior of a building protected by its exterior walls, and is not limited to an enclosed space.
  • a robot is assigned a reference numeral "R", and even where no reference numeral is written for a robot in the drawings or the specification, every robot may be understood as the robot R.
  • a person or human is given a reference numeral “U”, and a person or human can be named as a dynamic object.
  • the dynamic object does not necessarily mean only a person; it may be taken to include any object capable of movement, such as an animal (e.g., a dog or cat), at least one other robot (e.g., a user's personal robot or a robot providing other services), a drone, or a vacuum cleaner (e.g., a robot vacuum cleaner).
  • the building (building, structure, edifice; 1000) described in the present invention is not limited to a particular type, and may mean a structure built for people to live in, work in, breed animals in, or store things.
  • the building 1000 may be an office, officetel, apartment, residential/commercial complex, house, school, hospital, restaurant, government office, or the like, and the present invention may be applied to these various types of buildings.
  • a robot may run and provide various services.
  • a plurality of robots of one or more different types may be located in the building 1000; these robots travel in the building 1000 under the control of the server 20, provide services, and can use the various facility infrastructures of the building 1000.
  • the location of the server 20 may exist in various ways.
  • the server 20 may be located in at least one of an interior of the building 1000 and an exterior of the building 1000 . That is, at least a part of the server 20 may be located inside the building 1000 and the other part may be located outside the building 1000 .
  • the server 20 may be located entirely inside the building 1000 or located only outside the building 1000 . Therefore, in the present invention, the specific location of the server 20 is not particularly limited.
  • the server 20 may use at least one of a cloud computing server (cloud server 21) and an edge computing server (edge server 22).
  • in addition to cloud computing or edge computing, any method capable of controlling a robot can be applied to the server 20 of the present invention.
  • the server 20 according to the present invention may, in some cases, mix the cloud computing method and the edge computing method to perform control of at least one of the robot and the facility infrastructure provided in the building 1000.
  • the edge server 22 is an electronic device and may operate as a brain of the robot R. That is, each edge server 22 may wirelessly control at least one robot R. At this time, the edge server 22 may control the robot R based on the determined control period. The control period may be determined as the sum of time given to process data related to the robot R and time given to provide control commands to the robot R.
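As stated above, the control period is the sum of the time given to process data related to the robot and the time given to provide control commands. A minimal sketch of a loop budgeted this way is shown below; the specific budgets and function names are illustrative assumptions, not values from the patent.

```python
import time

PROCESSING_BUDGET_S = 0.020   # time allotted to process robot data (illustrative)
COMMAND_BUDGET_S = 0.010      # time allotted to deliver the control command (illustrative)
CONTROL_PERIOD_S = PROCESSING_BUDGET_S + COMMAND_BUDGET_S  # the control period

def control_cycle(process, send, sensor_data):
    """One edge-server control cycle: process the data, send the command,
    then pad out to the fixed control period so cycles stay regular."""
    start = time.monotonic()
    command = process(sensor_data)   # must fit within the processing budget
    send(command)                    # must fit within the command budget
    remaining = CONTROL_PERIOD_S - (time.monotonic() - start)
    if remaining > 0:
        time.sleep(remaining)
    return command
```

Keeping the period fixed (rather than running as fast as possible) gives the wirelessly controlled robot a predictable command cadence.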
  • the cloud server 21 may manage at least one of the robot R or the edge server 22 . At this time, the edge server 22 may operate as a server corresponding to the robot R, and may operate as a client corresponding to the cloud server 21.
  • the robot R and the edge server 22 may communicate wirelessly, and the edge server 22 and the cloud server 21 may communicate wired or wirelessly. At this time, the robot R and the edge server 22 may communicate through a wireless network capable of ultra-reliable and low latency communications (URLLC).
  • the wireless network may include at least one of a 5G network and WiFi-6 (WiFi ad/ay).
  • the 5G network may have features capable of ultra-reliable low-latency communication, as well as enhanced mobile broadband (eMBB) and massive machine type communications (mMTC).
  • the edge server 22 may include a mobile edge computing or multi-access edge computing (MEC) server and may be disposed in a base station.
  • the edge server 22 may communicate through a wireless network such as the Internet.
  • a plurality of edge servers may be connected through a wireless mesh network, and functions of the cloud server 21 may be distributed to the plurality of edge servers.
  • one of the edge servers operates as an edge server 22 for the robot R, and at least another one of the edge servers cooperates with any one of the edge servers. , can operate as a cloud server 21 for the robot R.
  • the network or communication network formed in the building 1000 according to the present invention includes at least one robot R configured to collect data, at least one edge server 22 configured to wirelessly control the robot R, and It may include communication between the cloud server 21 connected to the edge server 22 and configured to manage the robot R and the edge server 22 .
  • the edge server 22 may be configured to wirelessly receive the data from the robot R, determine a control command based on the data, and wirelessly transmit the control command to the robot R.
  • the edge server 22 determines whether to cooperate with the cloud server 21 based on the data; if it determines that there is no need to cooperate with the cloud server 21, the edge server 22 may be configured to determine the control command and transmit it within the predetermined control period.
  • the edge server 22 may be configured to determine the control command by communicating with the cloud server 21, based on the data, when it determines that cooperation with the cloud server 21 is required.
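The edge-first decision described in the two bullets above can be sketched as follows. The escalation criterion shown (a task flag on the data) is an assumption for illustration only; the patent leaves the cooperation criterion open.

```python
def handle_robot_data(data, edge_decide, cloud_decide, needs_cloud):
    """Edge-first control: determine the command locally within the control
    period when possible, otherwise cooperate with the cloud server."""
    if needs_cloud(data):
        return cloud_decide(data)   # cooperate with the cloud server
    return edge_decide(data)        # decide locally on the edge server

# Illustrative policy (assumed, not from the patent): escalate only heavy requests.
needs_cloud = lambda d: d.get("task") == "global_planning"
```

In this arrangement the edge server acts as a server toward the robot but as a client toward the cloud server, matching the roles described above.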
  • the robot R may be driven according to a control command.
  • the robot R may move a location or change a posture by changing a movement, and may perform a software update.
  • the server 20 is uniformly named as a “cloud server” and reference numeral “20” is given.
  • the term cloud server 20 may also be replaced with the term edge server 22 of edge computing.
  • cloud server may be variously changed to terms such as a cloud robot system, a cloud system, a cloud robot control system, and a cloud control system.
  • the cloud server 20 can perform integrated control of a plurality of robots traveling in the building 1000. That is, the cloud server 20 may i) monitor the plurality of robots R located in the building 1000, ii) assign missions (or tasks) to the plurality of robots, iii) directly control the facility infrastructure provided in the building 1000 so that the plurality of robots R successfully perform their missions, or iv) control the facility infrastructure through communication with a control system that controls the facility infrastructure.
  • the cloud server 20 may check state information of robots located in the building and provide (or support) various functions required for the robots.
  • various functions may include a charging function for robots, a washing function for contaminated robots, a standby function for robots whose missions have been completed, and the like.
  • the cloud server 20 may control the robots so that the robots use the various facility infrastructures provided in the building 1000. Furthermore, in order to provide various functions to the robots, the cloud server can directly control the facility infrastructure provided in the building 1000 or control it through communication with a control system that controls the facility infrastructure.
  • robots controlled by the cloud server 20 may travel in the building 1000 and provide various services.
  • the cloud server 20 may perform various controls based on information stored in the database, and in the present invention, the type and location of the database are not particularly limited.
  • the term database can be freely modified and used as long as it refers to a means for storing information, such as a memory, a storage unit, a cloud storage unit, an external storage unit, or an external server.
  • hereinafter, for convenience, the term "database" will be used uniformly.
  • the cloud server 20 may perform distributed control of robots based on various criteria such as the type of service provided by robots and the type of control for robots.
  • the cloud server 20 may have subordinate sub-servers.
  • the cloud server 20 may control the robot traveling in the building 1000 based on various artificial intelligence algorithms.
  • the cloud server 20 performs artificial-intelligence-based learning that uses data collected in the process of controlling the robots as learning data, and uses the result to control the robots accurately and efficiently. That is, the cloud server 20 may be configured to perform deep learning or machine learning. In addition, the cloud server 20 may perform deep learning or machine learning through simulation or the like, and control the robot using an artificial intelligence model built as a result.
  • the building 1000 may be equipped with various facility infrastructures for robot driving, providing robot functions, maintaining robot functions, performing missions of robots, or coexistence of robots and humans.
  • various facility infrastructures 1 and 2 capable of supporting driving (or movement) of the robot R may be provided in the building 1000 .
  • these facility infrastructures 1 and 2 support the movement of the robot R in the horizontal direction within a floor of the building 1000, or support the movement of the robot R in the vertical direction between different floors of the building 1000. In this way, the facility infrastructures 1 and 2 may form a transport system that supports movement of the robot.
  • the cloud server 20 controls the robot R to use these various facility infrastructures 1 and 2, and as shown in FIG. 1(b), the robot R can thereby move within the building 1000.
  • the robots according to the present invention may be controlled based on at least one of the cloud server 20 and a control unit provided in the robot itself, so as to travel within the building 1000 or provide services corresponding to assigned tasks.
  • the building according to the present invention is a building in which robots and people coexist, and the robots are made to drive while avoiding obstacles such as people (U), objects used by people (for example, strollers, carts, etc.), and animals; in some cases, the robots may output notification information 3 related to their driving.
  • the driving of the robot may be made to avoid an obstacle based on at least one of the cloud server 20 and a control unit provided in the robot.
  • the cloud server 20 may control the robot to move within the building 1000 while avoiding obstacles, based on information received through various sensors provided in the robot (eg, a camera (image sensor), a proximity sensor, an infrared sensor, etc.).
  • the robot traveling in the building through the process of FIG. 1 (a) to (c) is configured to provide services to people or target objects present in the building, as shown in FIG. 1 (d).
  • the type of service provided by the robot may be different for each robot. That is, various types of robots may exist depending on the purpose, the robot may have a different structure for each purpose, and a program suitable for the purpose may be installed in the robot.
  • the building 1000 may contain robots providing at least one of delivery, logistics work, guidance, interpretation, parking assistance, security, crime prevention, public order, cleaning, quarantine, disinfection, laundry, beverage preparation, food preparation, serving, fire suppression, medical assistance, and entertainment services. Services provided by robots may be various other than the examples listed above.
  • the cloud server 20 may assign an appropriate task to the robots in consideration of the purpose of each robot, and control the robots to perform the assigned task.
  • At least some of the robots described in the present invention may drive or perform missions under the control of the cloud server 20; in this case, the amount of data processed by the robot itself to drive or perform missions may be minimized.
  • such a robot may be referred to in the present invention as a brainless robot.
  • These brainless robots may depend on the control of the cloud server 20 for at least some control in performing actions such as driving, performing missions, performing charging, waiting, and washing within the building 1000 .
  • the building 1000 may be equipped with various facility infrastructures that can be used by robots; as shown in FIGS. 2, 3 and 4, this facility infrastructure is disposed within the building 1000 so that movement (or driving) of the robot may be supported or various functions may be provided to the robot.
  • the facility infrastructure may include facilities for supporting movement of the robot within a building.
  • Facilities supporting the movement of the robot may be either a robot-specific facility used exclusively by the robot or a common facility shared with humans.
  • facilities supporting movement of the robot may support movement of the robot in a horizontal direction or support movement of the robot in a vertical direction.
  • the robots may move horizontally or vertically using facilities within the building 1000. Movement in the horizontal direction may mean movement within the same floor, and movement in the vertical direction may mean movement between different floors. Therefore, in the present invention, moving up and down within the same floor may be referred to as horizontal movement.
  • Facilities supporting the movement of the robot may be various; for example, as shown in FIGS. 2 and 3, the building 1000 may be provided with robot passages (robot roads) 201, 202 and 203 supporting horizontal movement of the robot.
  • a robot passage may include a robot-only passage exclusively used by the robot.
  • the passage dedicated to the robot may be formed so that human access is fundamentally blocked, but is not necessarily limited thereto. That is, the passage dedicated to the robot may have a structure that allows people to pass through or approach it.
  • the passage dedicated to the robot may include at least one of a first exclusive passage (or first type passage 201 ) and a second exclusive passage (or second type passage 202 ).
  • the first exclusive passage and the second exclusive passage 201, 202 may be provided together on the same floor or may be provided on different floors.
  • the building 1000 may be provided with moving means 204 and 205 that support movement of the robot in a vertical direction.
  • These moving means 204 and 205 may include at least one of an elevator and an escalator.
  • the robot may move between different floors using an elevator 204 or an escalator 205 provided in the building 1000 .
  • an elevator 204 or escalator 205 may be made exclusively for robots, or may be made for shared use with people.
  • the building 1000 may include at least one of a robot-only elevator and a shared elevator, and similarly may include at least one of a robot-only escalator and a shared escalator.
  • the building 1000 may be equipped with a type of movement means that can be used for both vertical and horizontal movement.
  • a moving means in the form of a moving walkway may support a robot to move in a horizontal direction within a floor or to move in a vertical direction between floors.
  • the robot may move within the building 1000 horizontally or vertically under its own control or under the control of the cloud server 20.
  • the building 1000 may include at least one of an entrance door 206 (or automatic door) and an access control gate 207 that control access to the building 1000 or a specific area within the building 1000 .
  • At least one of the door 206 and the access control gate 207 may be made usable by a robot.
  • the robot may pass through an entrance door (or automatic door, 206) or an access control gate 207 under the control of the cloud server 20.
  • the access control gate 207 may be named in various ways, such as a speed gate.
  • the building 1000 may further include a waiting space facility 208 corresponding to a space where the robot waits, a charging facility 209 for charging the robot, and a washing facility 210 for cleaning the robot.
  • the building 1000 may include a facility 211 specialized for a specific service provided by the robot, and may include, for example, a facility for delivery service.
  • the building 1000 may include facilities for monitoring the robot (refer to reference numeral 212); various sensors (eg, cameras (or image sensors) 121) may be examples of such facilities.
  • the building 1000 according to the present invention may be provided with various facilities for service provision, robot movement, driving, function maintenance, cleanliness maintenance, and the like.
  • the building 1000 according to the present invention is interconnected with the cloud server 20, the robot R, and the facility infrastructure 200, so that the robots within the building 1000 can provide various services.
  • here, "interconnected" may mean that various data and control commands related to services provided in the building, robot movement, driving, function maintenance, cleanliness maintenance, etc. are transmitted and received, unidirectionally or bidirectionally, from at least one subject to at least one other subject through a network (or communication network).
  • the subject may be the building 1000, the cloud server 20, the robot R, the facility infrastructure 200, and the like.
  • the facility infrastructure 200 may include at least one of the various facilities (refer to reference numerals 201 to 213) and the control systems 201a, 202a, 203a, 204a, ... that control those facilities, as reviewed together with FIGS. 2 and 3.
  • the robot R traveling in the building 1000 is made to communicate with the cloud server 20 through the network 40, and can provide services within the building 1000 under the control of the cloud server 20.
  • the building 1000 may include a building system 1000a for communicating with or directly controlling various facilities provided in the building 1000 .
  • the building system 1000a may include a communication unit 110 , a sensing unit 120 , an output unit 130 , a storage unit 140 and a control unit 150 .
  • the communication unit 110 forms at least one of a wired communication network and a wireless communication network within the building 1000, enabling communication i) between the cloud server 20 and the robot R, ii) between the cloud server 20 and the building 1000, iii) between the cloud server 20 and the facility infrastructure 200, iv) between the facility infrastructure 200 and the robot R, and v) between the facility infrastructure 200 and the building 1000. That is, the communication unit 110 may serve as a medium of communication between different entities.
  • the communication unit 110 may also be called a base station, a router, or the like; the communication unit 110 can form a communication network so that the robot R, the cloud server 20, and the facility infrastructure 200 can communicate with each other within the building 1000.
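The role of the communication unit 110 as a medium between the subjects i) through v) above can be sketched as a simple message relay. This is only an illustration of the routing idea; the subject names and payload fields below are made up, not part of the disclosure:

```python
# hypothetical subject identifiers, one per entity in the patent's list
PEERS = {"cloud_server", "robot", "building", "facility_infra"}

class CommunicationUnit:
    """Sketch of the communication unit (110) as a relay that carries
    messages between the cloud server, robots, the building system, and
    the facility infrastructure."""

    def __init__(self):
        self.inboxes = {peer: [] for peer in PEERS}

    def send(self, src, dst, payload):
        # one unidirectional transmission; the reverse direction is
        # simply another send() call, giving bidirectional exchange
        if src not in PEERS or dst not in PEERS:
            raise ValueError("unknown subject")
        self.inboxes[dst].append((src, payload))

comm = CommunicationUnit()
comm.send("cloud_server", "robot", {"cmd": "move", "floor": 3})
print(comm.inboxes["robot"])
```

In practice this mediation happens over the wired/wireless networks described below (5G/4G, Wi-Fi, etc.), not an in-memory queue.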
  • being connected to the building 1000 through a communication network may mean being connected to at least one of the components included in the building system 1000a.
  • the plurality of robots R disposed in the building 1000 may communicate with the cloud server 20 through at least one of the wired and wireless communication networks formed through the communication unit 110, and may thereby be remotely controlled by the cloud server 20.
  • a communication network such as a wired communication network or a wireless communication network may be understood as a network 40 .
  • the building 1000, the cloud server 20, the robot R, and the facility infrastructure 200 may form the network 40 based on the communication network formed within the building 1000. Based on this network, the robot R may provide services corresponding to assigned tasks using various facilities provided in the building 1000 under the control of the cloud server 20 .
  • the facility infrastructure 200 includes at least one of the various facilities (see reference numerals 201 to 213) and the control systems 201a, 202a, 203a, 204a, ... that control them, as reviewed with FIGS. 2 and 3 (such a control system may also be termed a "control server").
  • for example, control systems 201a, 202a and 203a may exist for independently controlling the robot passages 201, 202 and 203, respectively, and in the case of the elevator (or robot-only elevator) 204, a control system 204a for controlling the elevator 204 may exist.
  • the sensing units 201b, 202b, 203b, 204b, ... included in each of the facility control systems 201a, 202a, 203a, 204a, ... may be provided in the facility itself to sense various information related to that facility.
  • the controllers 201c, 202c, 203c, 204c, ... included in each facility control system 201a, 202a, 203a, 204a, ... perform control for driving each facility and, through communication with the cloud server 20, may perform appropriate control so that the robot R uses the facility.
  • for example, the control system 204a of the elevator 204, through communication with the cloud server 20, may control the elevator 204 to stop at the floor where the robot R is located, so that the robot R can board the elevator 204.
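The elevator boarding handshake described above (cloud server relays the robot's floor to the elevator control system, which stops the car there) can be sketched minimally. The class and method names are illustrative assumptions, not APIs from the disclosure:

```python
class ElevatorControlSystem:
    """Hypothetical facility control system for the elevator (204): it
    accepts stop requests relayed by the cloud server for a robot."""

    def __init__(self):
        self.stop_queue = []  # floors the elevator will stop at

    def request_stop(self, floor, robot_id):
        if floor not in self.stop_queue:
            self.stop_queue.append(floor)
        # acknowledge back to the cloud server
        return {"robot": robot_id, "stop_at": floor, "accepted": True}

class CloudServer:
    """Only the boarding handshake is sketched here."""

    def board_robot(self, elevator, robot_id, robot_floor):
        # cloud server -> elevator control system, on behalf of the robot
        return elevator.request_stop(robot_floor, robot_id)

elevator = ElevatorControlSystem()
ack = CloudServer().board_robot(elevator, robot_id="R1", robot_floor=3)
print(ack["accepted"], elevator.stop_queue)  # True [3]
```

The same request/acknowledge pattern would apply to the other facility control systems (passages, gates, charging stations) under the patent's architecture.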
  • the facilities included in the building 1000 according to the present invention may be controlled by the cloud server 20 or the control unit 150 of the building 1000; in this case, the facility may not have a separate facility control system.
  • for convenience of explanation, each facility will be described below as having its own control system; however, as mentioned above, the role of the control system that controls a facility can be replaced by the cloud server 20 or the control unit 150 of the building 1000. In this case, the controllers 201c, 202c, 203c, 204c, ... of the facility control systems described herein may be expressed in terms of the cloud server 20 or the control unit 150 of the building.
  • the components of each facility control system 201a, 202a, 203a, 204a, ... shown in FIG. 4 are exemplary, and various components may be added or excluded according to the characteristics of each facility.
  • the robot R, the cloud server 20, and the facility control systems 201a, 202a, 203a, 204a, ... provide various services within the building 1000 using the facility infrastructure.
  • the robot R mainly travels in a building to provide various services.
  • the robot R may include at least one of a body unit, a driving unit, a sensing unit, a communication unit, an interface unit, and a power supply unit.
  • the body part includes a case (casing, housing, cover, etc.) constituting the exterior.
  • the case may be divided into a plurality of parts, and various electronic components are embedded in the space formed by the case.
  • the body part may be formed in different shapes according to various services exemplified in the present invention.
  • a container for storing goods may be provided on the upper part of the body part.
  • a suction port for sucking in dust using a vacuum may be provided at the lower part of the body.
  • the driving unit is configured to perform a specific operation according to a control command transmitted from the cloud server 20 .
  • the driving unit provides a means for moving the body of the robot within a specific space in relation to driving. More specifically, the drive unit includes a motor and a plurality of wheels, which are combined to perform functions of driving, changing direction, and rotating the robot R. As another example, the driving unit may include at least one of an end effector, a manipulator, and an actuator to perform an operation other than driving, such as pickup.
  • the sensing unit may include one or more sensors for sensing at least one of information within the robot (particularly, a driving state of the robot), environment information surrounding the robot, location information of the robot, and user information.
  • the sensing unit may include a camera (image sensor), a proximity sensor, an infrared sensor, a laser scanner (lidar sensor), an RGBD sensor, a geomagnetic sensor, an ultrasonic sensor, an inertial sensor, a UWB sensor, and the like.
  • the communication unit of the robot transmits and receives wireless signals from the robot to perform wireless communication between the robot R and the communication unit of the building, between the robot R and other robots, or between the robot R and the control system of the facility.
  • the communication unit may include a wireless Internet module, a short-distance communication module, a location information module, and the like.
  • the interface unit may be provided as a passage through which the robot R may be connected to an external device.
  • the interface unit may be a terminal (charging terminal, connection terminal, power terminal), port, or connector.
  • the power supply unit may be a device that supplies power to each component included in the robot R by receiving external power and internal power.
  • the power supply unit may be a device that generates electric energy inside the robot R and supplies it to each component.
  • the robot R has been described based on mainly traveling inside a building, but the present invention is not necessarily limited thereto.
  • the robot of the present invention may be in the form of a robot flying in a building, such as a drone. More specifically, a robot providing a guidance service may provide guidance about a building to a person while flying around a person in a building.
  • the robot may have a separate control unit as a lower controller of the cloud server 20 .
  • the control unit of the robot receives a driving control command from the cloud server 20 and controls the driving unit of the robot.
  • the control unit may calculate the torque or current to be applied to the motor using data sensed by the sensing unit of the robot. Using the calculated result, the motor or the like is driven by a position controller, a speed controller, a current controller, etc., and through this the robot executes the control commands of the cloud server 20.
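The position/speed/current controller cascade mentioned above can be illustrated with a deliberately simplified, proportional-only sketch. The gains and structure here are invented for illustration; real motor drives use tuned PI/PID stages at different loop rates:

```python
def cascaded_control(pos_target, pos, vel, current,
                     kp_pos=2.0, kp_vel=1.5, kp_cur=0.8):
    """Toy cascade: position error -> velocity setpoint -> current
    setpoint -> motor command. Each stage is proportional-only."""
    vel_target = kp_pos * (pos_target - pos)   # position controller
    cur_target = kp_vel * (vel_target - vel)   # speed controller
    command = kp_cur * (cur_target - current)  # current controller
    return command

# robot at rest, 1 m short of its target: a positive drive command results
print(cascaded_control(pos_target=1.0, pos=0.0, vel=0.0, current=0.0))
```

The outer loop turns a position error (from the cloud server's driving command) into a velocity setpoint, and each inner loop tracks the setpoint produced by the loop outside it, which matches the control-unit flow described in the text.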
  • the building 1000 may include a building system 1000a for communicating with or directly controlling various facilities provided in the building 1000 .
  • the building system 1000a may include at least one of a communication unit 110, a sensing unit 120, an output unit 130, a storage unit 140, and a control unit 150.
  • the communication unit 110 forms at least one of a wired communication network and a wireless communication network within the building 1000, enabling communication i) between the cloud server 20 and the robot R, ii) between the cloud server 20 and the building 1000, iii) between the cloud server 20 and the facility infrastructure 200, iv) between the facility infrastructure 200 and the robot R, and v) between the facility infrastructure 200 and the building 1000. That is, the communication unit 110 may serve as a medium of communication between different subjects.
  • the communication unit 110 is configured to include at least one of a mobile communication module 111, a wired Internet module 112, a wireless Internet module 113, and a short-distance communication module 114.
  • the communication unit 110 may support various communication methods based on the communication modules listed above.
  • the mobile communication module 111 may transmit and receive wireless signals to and from at least one of the building system 1000a, the cloud server 20, the robot R, and the facility infrastructure 200 on a mobile communication network built according to technical standards or communication schemes for mobile communication (eg, 5G, 4G, GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), etc.).
  • the robot R may transmit and receive radio signals with the mobile communication module 111 using the communication unit of the robot R described above.
  • the wired Internet module 112 provides communication in a wired manner, and may transmit and receive signals with at least one of the cloud server 20, the robot R, and the facility infrastructure 200 via a physical communication line as a medium.
  • the wireless Internet module 113 is a concept including the mobile communication module 111 and may mean a module capable of accessing the wireless Internet.
  • the wireless Internet module 113 is disposed in the building 1000 and is made to transmit and receive wireless signals with at least one of the building system 1000a, the cloud server 20, the robot R, and the facility infrastructure 200 in a communication network according to wireless Internet technologies.
  • Wireless Internet technologies can be very diverse; in addition to the communication technologies of the mobile communication module 111 described above, they include WLAN (Wireless LAN), Wi-Fi, Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband) and WiMAX (World Interoperability for Microwave Access). Furthermore, in the present invention, the wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology within a range including Internet technologies not listed above.
  • the short-range communication module 114 is for short-range communication and may perform short-range communication with at least one of the building system 1000a, the cloud server 20, the robot R, and the facility infrastructure 200 using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.
  • the communication unit 110 may include at least one of the communication modules described above, and these communication modules may be disposed in various spaces inside the building 1000 to form a communication network.
  • this communication network may enable i) the cloud server 20 and the robot R, ii) the cloud server 20 and the building 1000, iii) the cloud server 20 and the facility infrastructure 200, iv) the facility infrastructure 200 and the robot R, and v) the facility infrastructure 200 and the building 1000 to communicate with each other.
  • the building 1000 may include a sensing unit 120, and this sensing unit 120 may include various sensors. At least some of the information sensed through the sensing unit 120 of the building 1000 may be transmitted to at least one of the cloud server 20, the robot R, and the facility infrastructure 200 through the communication network formed through the communication unit 110. At least one of the cloud server 20, the robot R, and the facility infrastructure 200 can control the robot R or the facility infrastructure 200 using the information sensed through the sensing unit 120.
  • the types of sensors included in the sensing unit 120 may be very diverse.
  • the sensing unit 120 may be provided in the building 1000 to sense various pieces of information about the building 1000 .
  • the information sensed by the sensing unit 120 may be information about the robot R traveling in the building 1000, people located in the building 1000, obstacles, etc., as well as various environmental information related to the building (eg, temperature, humidity, etc.).
  • the sensing unit 120 may include at least one of an image sensor 121, a microphone 122, a bio sensor 123, a proximity sensor 124, an illuminance sensor 125, an infrared sensor 126, a temperature sensor 127, and a humidity sensor 128.
  • the image sensor 121 may correspond to a camera. As reviewed in FIG. 3 , a camera corresponding to the image sensor 121 may be disposed in the building 1000 . In this specification, the same reference numeral “121” as that of the image sensor 121 is assigned to the camera.
  • the number of cameras 121 disposed in the building 1000 is not limited.
  • the types of cameras 121 disposed in the building 1000 may vary; as an example, the camera 121 disposed in the building 1000 may be a closed-circuit television (CCTV) camera. Meanwhile, that the camera 121 is disposed in the building 1000 may mean that the camera 121 is disposed in the indoor space 10 of the building 1000.
  • the microphone 122 may be configured to sense various sound information generated in the building 1000 .
  • the biosensor 123 is for sensing biometric information and may sense biometric information (eg, fingerprint information, face information, iris information, etc.) of a person or animal located in the building 1000.
  • the proximity sensor 124 may be configured to sense an object (such as a robot or a person) approaching the proximity sensor 124 or located around the proximity sensor 124 .
  • the illuminance sensor 125 is configured to sense the illuminance around it, and the infrared sensor 126 has a built-in LED so that the building 1000 can be photographed in a dark room or at night.
  • the temperature sensor 127 may sense the temperature around the temperature sensor 127, and the humidity sensor 128 may sense the humidity around the humidity sensor 128.
  • in the present invention, there is no particular limitation on the types of sensors constituting the sensing unit 120; it is sufficient as long as the functions defined for each sensor are implemented.
  • the output unit 130 is a means for outputting at least one of visual, auditory, and tactile information to a person or robot R in the building 1000, and may include at least one of a display unit 131, an audio output unit 132, and a lighting unit 133.
  • Such an output unit 130 may be disposed at an appropriate location on the indoor space of the building 1000 according to needs or circumstances.
  • the storage unit 140 may be configured to store various information related to at least one of the building 1000, robots, and facility infrastructure.
  • the storage unit 140 may be provided in the building 1000 itself.
  • at least a part of the storage unit 140 may mean at least one of the cloud server 20 and an external database. That is, it can be understood that the storage unit 140 suffices as long as it is a space where various information according to the present invention is stored, and there is no restriction on physical space.
  • the controller 150 is a means for performing overall control of the building 1000 and can control at least one of the communication unit 110, the sensing unit 120, the output unit 130, and the storage unit 140.
  • the controller 150 may perform control of the robot by interworking with the cloud server 20 .
  • the control unit 150 may exist in the form of the cloud server 20; in this case, the building 1000 can be controlled together by the cloud server 20, which is the control unit of the robot R. Alternatively, the cloud server controlling the building 1000 may exist separately from the cloud server 20 controlling the robot R. In that case, the cloud server controlling the building 1000 and the cloud server 20 controlling the robot R may communicate with each other so that the robot R provides services, moves, or maintains its functions.
  • the control unit of the building 1000 may also be referred to as a "processor", and the processor may be configured to process various commands by performing basic arithmetic, logic, and input/output operations.
  • At least one of the building 1000, the robot R, the cloud server 20, and the facility infrastructure 200 forms a network 40 based on a communication network, and within the building 1000 Various services using robots may be provided.
  • the robot R, the facility infrastructure 200 provided in the building, and the cloud server 20 can be organically connected so that various services are provided by the robot. At least some of the robot R, the facility infrastructure 200, and the cloud server 20 may exist in the form of a platform for constructing a robot-friendly building.
  • the process of the robot R using the facility infrastructure 200 is described in more detail.
  • the robot R may travel in the indoor space 10 of the building 1000 or move using the facility infrastructure 200 for purposes such as performing missions (or providing services), driving, charging, maintaining cleanliness, and waiting, and may furthermore use the facility infrastructure 200.
  • that is, the robot R travels in the indoor space of the building 1000 or moves using the facility infrastructure 200 based on a certain "purpose" in order to achieve that purpose, and may furthermore use the facility infrastructure 200.
  • the purpose to be achieved by the robot may be specified based on various causes.
  • as the purpose to be achieved by the robot, there may be a first type of purpose and a second type of purpose.
  • the purpose of the first type may be for the robot to perform its original mission
  • the purpose of the second type may be for the robot to perform a mission or function other than the robot's original mission.
  • the purpose to be achieved by the robot according to the first type may be to perform the original mission of the robot. This purpose can also be understood as the “task” of the robot.
  • for example, a robot whose original mission is serving travels in the indoor space of the building 1000 or moves using the facility infrastructure 200 to achieve the purpose or task of providing the serving service, and may furthermore use the facility infrastructure 200.
  • likewise, a robot whose original mission is road guidance travels in the indoor space of the building 1000 or moves using the facility infrastructure 200 to achieve the purpose or task of providing the road guidance service, and may furthermore use the facility infrastructure 200.
  • a plurality of robots operated for different purposes may be located in the building according to the present invention. That is, different robots capable of performing different tasks may be deployed in the building, and different types of robots may be deployed in the building according to the needs of the manager of the building and various subjects who have moved into the building.
  • the building may contain robots providing at least one of delivery, logistics, guidance, interpretation, parking assistance, security, crime prevention, policing, cleaning, quarantine, disinfection, laundry, beverage preparation, food preparation, serving, fire suppression, medical assistance, and entertainment services. Services provided by robots may be various other than the examples listed above.
  • the purpose of the second type is for the robot to perform a mission or function other than the robot's original mission, and may be a purpose unrelated to the robot's original mission.
  • the purpose of the second type is not directly related to the robot performing its original mission, but may be an indirectly necessary mission or function.
  • in order to achieve the second type of purpose, the robot can drive in the indoor space of the building 1000, move using the facility infrastructure 200, and furthermore, use the facility infrastructure 200.
  • the robot may use a charging facility infrastructure to achieve a purpose according to a charging function, and may use a washing facility infrastructure to achieve a purpose according to a washing function.
  • the robot may drive in the indoor space of the building 1000, move using the facility infrastructure 200, and furthermore, use the facility infrastructure 200 in order to achieve a certain purpose.
  • the cloud server 20 may appropriately control each of the robots located in the building based on information corresponding to each of the plurality of robots located in the building stored in the database.
  • various information on each of the plurality of robots located in the building may be stored in the database, and information on the robot R may be very diverse. As examples, there may exist: i) identification information for identifying the robot R disposed in the space 10 (eg, serial number, TAG information, QR code information, etc.), ii) task information assigned to the robot R (eg, type of mission, operation according to the mission, target user information for the target of the mission, mission location, scheduled mission time, etc.), iv) location information of the robot R, v) status information of the robot R (eg, power status, failure status, cleaning status, battery status, etc.), vi) image information received from a camera installed in the robot R, and vii) motion information related to the motion of the robot R.
  • appropriate control of the robots may be related to control for operating the robots according to the first type of purpose or the second type of purpose described above.
  • the operation of the robot may refer to control allowing the robot to drive in the indoor space of the building 1000, move using the facility infrastructure 200, and furthermore, use the facility infrastructure 200.
  • Movement of the robot may be referred to as driving of the robot, and therefore, in the present invention, the movement path and the travel path may be used interchangeably.
• based on the information about each robot stored in the database, the cloud server 20 may assign an appropriate task to each robot according to the purpose (or original task) of that robot, and may perform control so that the assigned task is performed. At this time, the assigned task may be a task for achieving the first type of purpose described above.
  • the cloud server 20 may perform control to achieve the second type of purpose for each robot based on information about each robot stored in the database.
  • the robot that has received the control command for achieving the second type of purpose from the cloud server 20 moves to the charging facility infrastructure or washing facility infrastructure based on the control command, purpose can be achieved.
  • the terms “purpose” or “mission” will be used without distinguishing between the first type and the second type of purpose.
  • the purpose described below may be either a first type purpose or a second type purpose.
  • the mission described below may also be a mission to achieve a first type of objective or a second type of objective.
• the cloud server 20 may control the robot so that the robot performs a task corresponding to serving a target user.
  • the cloud server 20 may control the robot to move to the charging facility infrastructure so that the robot performs a task corresponding to charging.
• hereinafter, a method for the robot to perform a purpose or mission using the facility infrastructure 200 under the control of the cloud server 20 is described in more detail, without distinguishing between the first type purpose and the second type purpose.
  • a robot controlled by the cloud server 20 to perform a mission may also be named a “target robot”.
• the cloud server 20 may specify at least one robot to perform the mission upon request or based on its own judgment.
• the cloud server may receive requests from various entities located in the building, such as visitors, managers, residents, and workers, in various ways (e.g., user input through an electronic device or user input using a gesture method).
  • the request may be a service request for providing a specific service (or specific task) by the robot.
• the cloud server 20 may specify a robot capable of performing the corresponding service among a plurality of robots located in the building 1000.
• the cloud server 20 may specify a robot capable of responding to the request based on i) the type of service that the robot can perform, ii) the task previously assigned to the robot, iii) the current location of the robot, and iv) the state of the robot (e.g., power state, cleanliness state, battery state, etc.).
  • various information on each robot exists in the database, and the cloud server 20 may specify a robot to perform the mission based on the request based on the database.
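As an illustration of the selection criteria above, the following sketch picks a robot for a requested service from a database of robot records. The `Robot` fields, the battery threshold, and the tie-breaking rule are assumptions for illustration only, not part of the disclosure.

```python
# Hypothetical robot-selection sketch: filter by service capability,
# availability, and battery state, then prefer a robot on the request floor.
from dataclasses import dataclass

@dataclass
class Robot:
    robot_id: str
    services: set    # service types this robot can perform
    busy: bool       # whether a task is already assigned
    floor: int       # current floor of the robot
    battery: float   # remaining battery, 0.0 .. 1.0

def specify_robot(robots, service, request_floor):
    """Return the most suitable idle robot for the requested service."""
    candidates = [r for r in robots
                  if service in r.services and not r.busy and r.battery > 0.2]
    if not candidates:
        return None
    # Prefer robots already on the request floor, then the highest battery.
    return max(candidates,
               key=lambda r: (r.floor == request_floor, r.battery))
```

A real system would also weigh the previously assigned task (criterion ii) and richer state information; this sketch keeps only the criteria that are easy to show in a few lines.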
  • the cloud server 20 may specify at least one robot to perform the mission based on its own judgment.
  • the cloud server 20 may perform its own determination based on various causes.
  • the cloud server 20 may determine whether a service needs to be provided to a specific user or a specific space existing in the building 1000 .
• the cloud server 20 may extract a specific target for which a service needs to be provided, based on information sensed by and received from at least one of a sensing unit 120 (see FIGS. 4 to 6) existing in the building 1000, a sensing unit included in the facility infrastructure 200, and a sensing unit provided in a robot.
  • the specific object may include at least one of a person, space, or object.
  • Objects may refer to facilities, objects, and the like located in the building 1000 .
  • the cloud server 20 may specify the type of service required for the extracted specific target and control the robot to provide the specific service to the specific target.
  • the cloud server 20 may specify at least one robot to provide a specific service to a specific target.
  • the cloud server 20 may determine an object for which a service needs to be provided based on various determination algorithms.
• the cloud server 20 may specify the type of service required, such as road guidance, serving, or stair movement, based on information sensed by and received from at least one of the sensing unit 120 present in the building 1000 (see FIGS. 4 to 6), a sensing unit included in the facility infrastructure 200, and a sensing unit provided in a robot. The cloud server 20 may then specify a target for which the corresponding service is required. Furthermore, the cloud server 20 may specify a robot capable of providing the specified service so that the service is provided by the robot.
  • the cloud server 20 may determine a specific space in which a service needs to be provided based on various determination algorithms.
• the cloud server 20 may extract a specific space or object requiring service provision, such as a target user for delivery, a guest requiring guidance, a contaminated space, a contaminated facility, or a fire zone, based on information sensed by and received from at least one of the sensing unit 120 present in the building 1000 (see FIGS. 4 to 6), a sensing unit included in the facility infrastructure 200, and a sensing unit provided in a robot, and may specify a robot capable of providing the corresponding service to the specific space or object so that the service is provided by the robot.
  • the cloud server 20 may assign a mission to the robot and perform a series of controls necessary for the robot to perform the mission.
• the series of controls may include at least one of i) setting the movement path of the robot, ii) specifying the facility infrastructure to be used to move to the destination where the mission is to be performed, iii) communicating with the specified facility infrastructure, iv) controlling the specified facility infrastructure, v) monitoring the robot performing the mission, vi) evaluating the driving of the robot, and vii) monitoring whether the robot has completed the mission.
  • the cloud server 20 may specify a destination where the robot's mission is to be performed, and set a movement path for the robot to reach the destination.
  • the robot R may be controlled to move to a corresponding destination in order to perform a mission.
• the cloud server 20 may set a movement path for reaching the destination from the location where the robot starts performing the mission (hereinafter referred to as the “mission performance start location”).
  • the position where the robot starts to perform the mission may be the current position of the robot or the position of the robot at the time when the robot starts to perform the mission.
  • the cloud server 20 may generate a movement path of a robot to perform a mission based on a map (or map information) corresponding to the indoor space 10 of the building 1000 .
  • the map may include map information for each space of the plurality of floors 10a, 10b, 10c, ... constituting the indoor space of the building.
  • the movement route may be a movement route from a mission performance start location to a destination where the mission is performed.
  • map information and moving routes are described as being related to an indoor space, but the present invention is not necessarily limited thereto.
  • the map information may include information on an outdoor space, and the movement path may be a path leading from an indoor space to an outdoor space.
• the indoor space 10 of the building 1000 may be composed of a plurality of different floors 10a, 10b, 10c, 10d, ..., and the mission performance start location and the destination may be located on the same floor or on different floors.
  • the cloud server 20 may use map information on the plurality of floors 10a, 10b, 10c, 10d, ..., to create a movement path of a robot to perform a service within the building 1000.
  • the cloud server 20 may specify at least one facility that the robot must use or pass through to move to a destination among facility infrastructures (a plurality of facilities) disposed in the building 1000 .
• the cloud server 20 may specify at least one facility 204, 205 to assist the robot in moving between floors, and may create a movement route including the point where the specified facility is located.
  • the facility for assisting the robot to move between floors may be at least one of a robot-only elevator 204, a common elevator 213, and an escalator 205.
  • various types of facilities assisting the robot to move between floors may exist.
• the cloud server 20 may check a specific floor corresponding to the destination among the plurality of floors 10a, 10b, 10c, ... of the indoor space 10 and, based on the robot's mission performance start position (e.g., the position of the robot at the time of starting the corresponding task), determine whether the robot needs to move between floors to perform the service.
  • the cloud server 20 may include a facility (means) for assisting the robot to move between floors on the movement path based on the determination result.
  • the facility for assisting the robot to move between floors may be at least one of a robot-only elevator 204, a shared elevator 213, and an escalator 205.
  • the cloud server 20 may create a movement path so that a facility assisting the robot to move between floors is included in the movement path of the robot.
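One way to realize the route generation described above is to model the indoor space as a graph in which inter-floor facilities (elevators, escalators) appear as edges connecting floor graphs, so that any path search naturally includes the facility when floors differ. The node names and graph below are illustrative assumptions, not the disclosure's map format.

```python
# Minimal sketch of inter-floor route generation with breadth-first search.
# Floors 1 and 3 are connected only through a robot-only elevator node, so
# any route between them must pass through that facility.
from collections import deque

def plan_route(graph, start, goal):
    """Return a list of nodes from start to goal, or None if unreachable."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

# Illustrative graph: "elevator_204" stands in for the robot-only elevator.
graph = {
    "1F:lobby": ["1F:elevator_204"],
    "1F:elevator_204": ["3F:elevator_204"],
    "3F:elevator_204": ["3F:office"],
}
```

In practice, the per-floor maps of L444 would supply the nodes and edges, and a weighted search (e.g., Dijkstra) would replace plain BFS when travel costs matter.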
• when the robot-only passages 201 and 202 are located on the movement path of the robot, the cloud server 20 may create a movement path including the point where the robot-only passages 201 and 202 are located, so that the robot moves using them.
  • the robot passage may be formed of at least one of a first passage (or first type passage) 201 and a second passage (or second type passage) 202 .
  • the first exclusive passage and the second exclusive passage 201, 202 may be provided together on the same floor or may be provided on different floors.
  • the first exclusive passage 201 and the second exclusive passage 202 may have different heights relative to the floor of the building.
  • the cloud server 20 may control the driving characteristics of the robot on the robot-only passage to be changed based on the type of the robot-only passage used by the robot and the degree of congestion around the robot-only passage. As shown in FIGS. 3 and 8 , when the robot travels in the second exclusive passage, the cloud server 20 may change the driving characteristics of the robot based on the degree of congestion around the exclusive passage for the robot. Since the second exclusive passage is a passage accessible to humans or animals, safety and movement efficiency are considered together.
  • the driving characteristics of the robot may be related to the driving speed of the robot.
• the degree of congestion may be calculated based on an image received from at least one of a camera (or image sensor) 121 disposed in the building 1000 and a camera disposed in the robot. Based on such images, the cloud server 20 may control the traveling speed of the robot to be less than or equal to a predetermined speed when the robot-only passage where the robot is located and traveling is congested.
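The congestion-based speed control above can be sketched as a simple rule: count the people detected in the camera images around the passage and cap the robot's speed when the count exceeds a threshold. The threshold and speed values here are assumptions for illustration; real values would come from safety requirements.

```python
# Illustrative congestion-based speed control: detections per image stand in
# for the "degree of congestion" computed from cameras 121 or the robot's
# own camera.
def decide_speed(person_count, normal_speed=1.2, slow_speed=0.4,
                 congestion_threshold=3):
    """Return a travel speed in m/s; slow down when the passage is congested."""
    if person_count >= congestion_threshold:
        # Keep the robot at or below a predetermined (reduced) speed.
        return min(normal_speed, slow_speed)
    return normal_speed
```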
• the cloud server 20 may use the map information for the plurality of floors 10a, 10b, 10c, 10d, ... to generate a movement path of a robot to perform a service within the building 1000, and at this time may specify at least one facility that the robot must use or pass through to move to a destination among the facility infrastructure (a plurality of facilities) disposed in the building 1000. It is then possible to create a movement route that includes the at least one specified facility.
  • a robot traveling in the indoor space 10 to perform a service may sequentially use or pass through at least one facility along a movement path received from the cloud server 20 and drive to a destination.
  • the order of facilities to be used by the robot may be determined under the control of the cloud server 20 . Furthermore, the order of facilities to be used by the robot may be included in the information on the movement path received from the cloud server 20 .
• the facilities may include robot-only facilities (201, 202, 204, 208, 209, 211) exclusively used by robots and shared facilities (205, 206, 207, 213) jointly used by robots and humans.
• the robot-only facilities exclusively used by robots may include facilities (208, 209) that provide functions necessary for robots (e.g., charging function, washing function, standby function) and facilities (201, 202, 204, 211) used for robot movement.
• when creating a movement path for the robot, the cloud server 20 may create the path so that the robot moves (or passes) using a robot-only facility when such a facility exists on the path from the mission performance start position to the destination. That is, the cloud server 20 may create the movement path by giving priority to robot-only facilities, in order to increase the efficiency of the robot's movement. For example, when both the robot-only elevator 204 and the common elevator 213 exist on the movement path to the destination, the cloud server 20 may create a movement path including the robot-only elevator 204.
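The priority rule just described can be expressed as a cost discount: when several facilities can serve the same leg of the route, robot-only ones are made cheaper so a cost-minimizing planner picks them. The discount factor and the tuple layout are illustrative assumptions.

```python
# Sketch of preferring robot-only facilities during route planning.
# candidates: list of (name, travel_cost, robot_only) tuples for facilities
# that could serve the same leg (e.g., elevator 204 vs. common elevator 213).
def pick_facility(candidates):
    def effective_cost(c):
        name, cost, robot_only = c
        # Robot-only facilities get a discount so they win ties and near-ties.
        return cost * (0.5 if robot_only else 1.0)
    return min(candidates, key=effective_cost)[0]
```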
  • the robot traveling in the building 1000 according to the present invention can travel in the indoor space of the building 1000 to perform its duties using various facilities provided in the building 1000.
  • the cloud server 20 may communicate with a control system (or control server) of at least one facility used or scheduled to be used by the robot for smooth movement of the robot.
• the unique control systems for controlling the facilities may communicate with at least one of the cloud server 20, the robot R, and the building 1000, and appropriate control may be performed on each facility so that the robot R can use the facilities.
• the cloud server 20 may monitor the location of robots traveling in the building 1000 in real time or at preset time intervals.
• the cloud server 20 may monitor the location information of all of the plurality of robots traveling in the building 1000 or, if necessary, selectively monitor the location information of only a specific robot.
• the location information of the robot being monitored may be stored in the database in which the information of the robot is stored, and may be continuously updated as the location of the robot changes over time.
  • Methods for estimating the positional information of the robot located in the building 1000 can be very diverse. Hereinafter, an embodiment of estimating the positional information of the robot will be described.
• FIGS. 9 to 11 are conceptual diagrams for explaining a method of estimating the position of a robot traveling in a robot-friendly building according to the present invention.
• the cloud server 20 may receive an image of the space 10 captured by a camera (not shown) provided in the robot R, and may perform visual localization to estimate the position of the robot from the received image.
  • the camera is configured to capture (or sense) an image of the space 10, that is, an image of the robot R's surroundings.
• “robot image”: an image acquired using a camera provided in the robot R
• “space image”: an image acquired through a camera disposed in the space 10
  • the cloud server 20 is configured to acquire a robot image 910 through a camera (not shown) provided in the robot R. Also, the cloud server 20 may estimate the current location of the robot R using the obtained robot image 910 .
  • the cloud server 20 compares the robot image 910 with the map information stored in the database, and as shown in (b) of FIG. 9, the location information corresponding to the current location of the robot R (eg, “3rd floor A area (3, 1, 1)”) can be extracted.
  • the map of the space 10 may be a map prepared based on Simultaneous Localization and Mapping (SLAM) by at least one robot moving the space 10 in advance.
• SLAM: Simultaneous Localization and Mapping
  • the map of the space 10 may be a map generated based on image information.
  • the map of the space 10 may be a map generated by a vision (or visual) based SLAM technology.
• the cloud server 20 may specify coordinate information (for example, 3rd floor, area A (3, 1, 1)) as shown in (b) of FIG. 9.
• the specified coordinate information may be the current location information of the robot R.
  • the cloud server 20 compares the robot image 910 obtained from the robot R with a map generated by the vision (or visual) based SLAM technology to estimate the current location of the robot R.
• the cloud server 20 may specify the location information of the robot R by i) specifying the image most similar to the robot image 910 through an image comparison between the robot image 910 and the images constituting a pre-generated map, and ii) acquiring the location information matched to the specified image.
• the cloud server 20 may specify the current location of the robot using the acquired robot image 910.
• the cloud server 20 may extract location information (e.g., coordinate information) corresponding to the robot image 910 from map information previously stored in a database (which may also be named a “reference map”).
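The two-step procedure of L494 (find the most similar map image, then read off its matched location) can be sketched as a nearest-neighbor lookup over image features. The 3-value "feature vectors" below are placeholders; a real vision-based SLAM system would use keypoint descriptors or learned embeddings.

```python
# Toy sketch of vision-based localization against a "reference map":
# each map entry pairs an image feature vector with the location where
# that image was taken.
def estimate_location(robot_feature, reference_map):
    """reference_map: list of (feature_vector, location) pairs."""
    def distance(a, b):
        # Squared Euclidean distance as a stand-in for image similarity.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(reference_map, key=lambda entry: distance(entry[0], robot_feature))
    return best[1]  # location info matched to the most similar image
```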
• the position estimation of the robot R may be performed by the robot R itself. That is, the robot R may estimate its current position based on an image it receives, in the manner described above. The robot R may then transmit the estimated location information to the cloud server 20. In this case, the cloud server 20 may perform a series of controls based on the location information received from the robot.
  • the cloud server 20 can specify at least one camera 121 disposed in the indoor space 10 corresponding to the location information.
  • the cloud server 20 may specify the camera 121 disposed in the indoor space 10 corresponding to the location information from matching information related to the camera 121 stored in the database.
• for control of the robot R, the cloud server 20 may output the robot image 910 obtained from the robot R itself together with the image obtained from the camera 121 disposed in the space where the robot R is located on the display unit of a control system. Therefore, a manager who remotely manages and controls the robot R from inside or outside the building 1000 can perform remote control of the robot R in consideration of not only the robot image 910 obtained from the robot R, but also the image of the space where the robot R is located.
  • position estimation of a robot traveling in the indoor space 10 may be performed based on a tag 1010 provided in the indoor space 10, as shown in FIG. 10(a).
  • location information corresponding to a point to which the tag 1010 is attached may match and exist in the tag 1010 , as shown in (b) of FIG. 10 . That is, tags 1010 having different identification information may be provided at a plurality of different points in the indoor space 10 of the building 1000, respectively. Identification information of each tag and location information of a point where the tag is attached may match each other and exist in a database.
  • the tag 1010 may include location information matched to each tag 1010 .
• the robot R may recognize the tag 1010 provided in the space 10 by using a sensor provided in the robot R. Through this recognition, the robot R can determine its current location by extracting the location information included in the tag 1010. The extracted location information may be transmitted from the robot R to the cloud server 20 through the communication unit 110. Accordingly, the cloud server 20 may monitor the locations of the robots traveling in the building 1000 based on the location information received from the robots that have sensed the tags.
  • the robot R may transmit identification information of the recognized tag 1010 to the cloud server 20 .
  • the cloud server 20 may monitor the location of the robot within the building 1000 by extracting location information matched to the identification information of the tag 1010 from the database.
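The tag-based scheme above reduces to a database lookup: each tag's identification information is matched to the location where the tag is attached, so recognizing a tag yields the robot's position directly. The tag IDs and coordinates below are illustrative assumptions.

```python
# Minimal sketch of tag/marker-based localization: the database maps each
# tag's identification information to the point where the tag is attached.
TAG_DATABASE = {
    "tag-001": {"floor": 3, "zone": "A", "coord": (3, 1, 1)},
    "tag-002": {"floor": 1, "zone": "B", "coord": (1, 4, 2)},
}

def locate_robot_by_tag(tag_id, database=TAG_DATABASE):
    """Return the location matched to the recognized tag, or None if unknown."""
    return database.get(tag_id)
```

Whether the lookup happens on the robot (which then reports its location) or on the cloud server (which receives only the tag ID) matches the two variants described in L506 and L507.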
• the tag 1010 described above may be named in various ways.
  • this tag 1010 can be variously named as a QR code, a barcode, an identification mark, and the like.
  • the term of the tag reviewed above can be used by replacing it with “marker”.
  • Various information about the robot R can be stored in the database.
  • Various information about the robot R may include identification information (eg, serial number, TAG information, QR code information, etc.) for identifying the robot R located in the indoor space 10.
• identification information of the robot R may be included in an identification mark provided on the robot R, as shown in FIG. 11.
  • This identification mark can be sensed or scanned by the building control system 1000a or the facility infrastructure 200 .
  • identification marks 1101, 1102, and 1103 of the robot R may include identification information of the robot.
• the identification marks 1101, 1102, and 1103 may be a barcode 1101, serial information 1102, a QR code 1103, an RFID tag (not shown), an NFC tag (not shown), or the like. The barcode 1101, serial information 1102, QR code 1103, RFID tag, or NFC tag may be made to include the identification information of the robot on which the identification mark is provided (or attached).
• the robot identification information is information for distinguishing each robot, and different robots may have different identification information even if they are of the same type. Meanwhile, the information constituting the identification mark may be configured in various ways other than the barcode, serial information, QR code, RFID tag, or NFC tag discussed above.
• the cloud server 20 may extract the identification information of the robot R from an image received from a camera installed in the indoor space 10, a camera installed in another robot, or a camera provided in the facility infrastructure, and based on the extracted identification information, may identify and monitor the location of the robot.
  • a means for sensing the identification mark is not necessarily limited to a camera, and a sensing unit (eg, a scanning unit) may be used according to the shape of the identification mark.
  • a sensing unit may be provided in at least one of the indoor space 10 , robots, and facility infrastructure 200 .
• the cloud server 20 may determine the position of the robot R from an image received from the camera. At this time, the cloud server 20 may determine the location of the robot R based on at least one of the location information of the place where the camera is disposed and the location information of the robot in the image (precisely, the location information of the graphic object corresponding to the robot in an image taken with the robot as a subject).
  • identification information on cameras disposed in the indoor space 10 may be matched with location information on locations where cameras are disposed. Accordingly, the cloud server 20 may extract location information of the robot R by extracting the location information matched with the identification information of the camera that has taken the image from the database.
  • the cloud server 20 may determine the location of the robot R from scan information sensed by the scan unit. On the database, identification information on the scan unit disposed in the indoor space 10 may be matched with location information on a place where the scan unit is disposed. Accordingly, the cloud server 20 may extract location information of the robot R by extracting location information matched to the scanning unit that has scanned the identification mark provided in the robot from the database.
  • the cloud server 20 can efficiently and accurately control the robot within the building by monitoring the position of the robot.
  • the robot R capable of providing various services may perform charging in order to perform assigned tasks.
  • the robot R in the present invention may perform charging by receiving power (or electric energy) from the charging station S and accumulating it in a battery.
  • At least one or a plurality of charging stations S may exist in the space 10 in the building 1000, as shown in FIG. 12A.
  • the robot R may be charged while being docked with any one (eg, S2) of the plurality of charging stations S1, S2, S3, S4....
• “docking of the robot R and the charging station S” may be understood as contacting or combining the charging terminal 1351 of the robot R and the charging terminal 1440 of the charging station S (see (b) of FIG. 12b).
• the contact at this time may be understood as including not only physical contact between the robot R and the charging station S, but also being spaced apart within a charging distance.
• in consideration of both the state of the robot R requiring charging and the surrounding environment information of the robot R, any one of the plurality of charging stations S1, S2, S3, and S4 disposed in the space 10 or the building 1000 may be assigned to the robot R requiring charging.
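The allocation just described can be sketched as a scoring problem over candidate stations: charge only when the battery is low, exclude occupied stations, and among the remaining ones pick the nearest. The battery threshold, the station tuple layout, and the use of plain distance as the "surrounding environment information" are simplifying assumptions.

```python
# Sketch of assigning a charging station using remaining battery and a
# simple environment signal (distance to station, station occupancy).
def assign_station(robot_pos, battery, stations):
    """stations: list of (name, (x, y), occupied). Returns a station name or None."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    free = [s for s in stations if not s[2]]
    if not free or battery > 0.3:   # charge only when the battery is low
        return None
    # Among free stations, choose the one closest to the robot.
    return min(free, key=lambda s: dist(robot_pos, s[1]))[0]
```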
• the robot R may move to a position around the assigned charging station S and identify (specify or distinguish) the assigned charging station by using the marker M provided in the charging station S.
• the “marker M” may refer to a marker including information related to the charging station S (e.g., identification information (ID, serial number, etc.) that can distinguish the charging station, location information, etc.).
  • Such a marker M may be configured to include a specific pattern that can be sensed or identified by a sensor (or sensing unit 1320) of the robot R.
• “marker M” may be used interchangeably with “visual marker” or “visual fiducial marker”.
• at least one of the movement and posture of the robot R may be controlled so that the robot R accurately docks with the charging station S.
• under the control of the cloud server 20 or a controller provided in the robot R, the movement of the robot R may be controlled so that the robot R is positioned to face the charging station S.
• “the robot R and the charging station S face each other” can be understood as the front face RF of the robot and the front face SF of the charging station facing each other.
• the front RF of the robot may be the area of the main body of the robot R where the charging terminal 1351 of the robot R is disposed, or the area of the body of the robot R facing the direction of movement when the robot R moves forward.
• the “front SF of the charging station” may mean the area of the main body of the charging station S where the charging terminal 1480 of the charging station S and the marker M are disposed. In the present invention, the charging terminal 1480 and the marker M are described, as an example, as being disposed in an extended area of the main body of the charging station S, and this extended area of the main body is named the “front SF of the charging station”.
• in the present invention, the expressions “the robot R and the charging station S face each other”, “the robot R and the marker M provided in the charging station S face each other”, “the front RF of the robot and the front SF of the charging station face each other”, “the front RF of the robot and the charging station S face each other”, and “the front RF of the robot and the marker M provided in the charging station S face each other” may be used interchangeably.
• the movement of the robot R may be controlled so that the front RF of the robot and the front SF of the charging station face each other.
  • controlling the movement of the robot R so that the front surface RF of the robot and the front surface SF of the charging station face each other may be referred to as "arrangement of the robot R".
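The "arrangement" step above can be sketched as a heading-alignment loop: compute the bearing from the robot to the marker, then rotate until the heading error is within a tolerance. The proportional gain, tolerance, and simplified kinematics (pure rotation, no translation) are assumptions for illustration.

```python
# Sketch of turning the robot so its front RF faces the marker M on the
# charging station front SF. Angles are in radians.
import math

def heading_error(robot_pose, marker_pos):
    """robot_pose = (x, y, theta); marker_pos = (x, y)."""
    desired = math.atan2(marker_pos[1] - robot_pose[1],
                         marker_pos[0] - robot_pose[0])
    err = desired - robot_pose[2]
    return math.atan2(math.sin(err), math.cos(err))  # wrap to [-pi, pi]

def align(robot_pose, marker_pos, gain=0.5, tol=0.01, max_steps=200):
    """Return the robot heading after proportional rotation toward the marker."""
    x, y, theta = robot_pose
    for _ in range(max_steps):
        err = heading_error((x, y, theta), marker_pos)
        if abs(err) < tol:
            break
        theta += gain * err  # rotate a fraction of the error each step
    return theta
```

A full docking controller would also drive forward while keeping the error small until the charging terminals meet; this sketch shows only the facing step.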
  • control for charging the robot R may be performed under the control of at least one of the cloud server 20 and the robot R.
  • control of the movement of the robot R may be performed by the cloud server 20 or performed by the robot R itself.
• data processing for control of the robot R may be performed by either the cloud server 20 or the robot R. That is, data processing described as being performed by the cloud server 20 may instead be performed by the robot R, and data processing described as being performed by the robot R may instead be performed by the cloud server 20.
• in the present invention, when the robot R is in a state requiring charging, a charging station S may be allocated in consideration of both the remaining battery amount information of the robot R and the surrounding environment information of the robot R, and a charging control method and system 1300 of the robot R may be provided that allow the robot R to move to the place where the assigned charging station S is located and accurately dock with the charging station S.
  • the charging control method and system of the robot R may be used interchangeably with “robot charging control method and system” and “charging control method and system”.
  • FIGS. 12a, 12b and 13 are conceptual diagrams for explaining a method and system for controlling charging of a robot according to the present invention.
• FIG. 14 is a conceptual diagram for explaining a charging station in the present invention.
• FIG. 15 is a conceptual diagram for explaining a method for a robot to recognize a marker provided in a charging station in the present invention.
• FIGS. 16 and 17 are flowcharts for explaining the charging control method of the robot according to the present invention.
• FIGS. 18 and 19 are conceptual diagrams for explaining the movement of the robot in the present invention.
• FIGS. 20, 21a, 21b, 22, 23a and 23b are conceptual diagrams for explaining the method of controlling the movement of the robot in the present invention.
• the charging control system 1300 of the robot R may include at least one of the robot R, the cloud server 20, and the charging station S.
• the robot R is a robot that provides services within the building 1000, and may include at least one of a communication unit 1310, a sensing unit 1320, a storage unit 1330, a driving unit 1340, a charging unit 1350, and a control unit 1360.
  • the communication unit 1310 of the robot R may communicate with the cloud server 20 .
  • the communication unit 1310 may transmit the remaining battery capacity information of the robot R and the image acquired through the camera 1321 of the robot R to the cloud server 20, or may receive, from the cloud server 20, identification information of a specific charging station S disposed in a specific space 10 of the building 1000.
  • the remaining battery capacity information may be information related to the amount of battery (or amount of power) usable for driving (use or control or movement) of the robot R.
  • the identification information of the charging station S is information that can distinguish or specify the charging station S, and may include at least one of ID information of the charging station S, serial number information of the charging station S, location (or area) information of the charging station S, and a marker image corresponding to the marker M.
  • the sensing unit 1320 of the robot R can sense (photograph or recognize) the marker M.
  • the sensing unit 1320 may include a sensor capable of sensing the marker M.
  • the sensor may be a camera 1321 that captures images.
  • the storage unit 1330 of the robot R may be configured to store various information related to the present invention.
  • information about the robot R may be stored in the storage unit 1330 .
  • the information on the robot R may be very diverse and may include, as examples, at least one of: i) identification information of the robot R for identifying the robot R disposed in the space 10 (e.g., serial number, ID, etc.), ii) mission information assigned to the robot R, iii) driving route information set in the robot R, iv) location information of the robot R, v) status information of the robot R (battery remaining amount information), vi) information sensed by the sensing unit 1320 of the robot R, vii) operation information related to the operation of the robot R, and viii) charging history information of the robot R.
  • information about the charging station S may be stored in the storage unit 1330; for example, a charging algorithm for performing charging with the charging station S and information about predetermined points spaced apart from the charging station S can be stored.
  • various information received from the server 20 may be stored in the storage unit 1330, and a map of the building 1000 may be stored. Location information of the charging station S may be included in the map.
  • the driving unit 1340 of the robot R may include means for the robot R to move within a specific space 10 in relation to driving. More specifically, the driving unit 1340 of the robot R includes a motor and a plurality of wheels, which are combined to perform functions of driving, changing direction, and rotating the robot R.
  • movement of the robot R may be performed through control of the traveling unit 1340 .
  • the charging unit 1350 of the robot R may include a charging terminal 1351 configured to contact or couple to the charging terminal 1440 of the charging station S to receive power from the charging station S, and a battery (not shown) to accumulate power.
  • the charging terminal 1351 of the robot R may be disposed on the front RF of the robot.
  • the control unit 1360 of the robot R may be configured to control the overall operation of the robot R.
  • the control unit 1360 of the robot R may process data or information transmitted and received to and from the cloud server 20 or process information sensed through the sensing unit 1320 .
  • the controller 1360 of the robot R may control overall operations related to charging of the robot R. As described above, in the present invention, control related to charging of the robot R performed by the cloud server 20 may be performed by the control unit 1360 of the robot R.
  • the control unit 1360 of the robot R may control the traveling unit 1340 of the robot R to move to the place where the charging station S assigned to the robot R is located.
  • the control unit 1360 of the robot R may recognize the marker M of the charging station S assigned to the robot R, based on a comparison between the marker image received from the server 20 and the sensing information sensed by the sensing unit 1320.
  • the control unit 1360 of the robot R may control the movement of the robot R so that, based on recognizing the marker M of the charging station S assigned to the robot R, the robot R faces the charging station S at a first point (Approach Point, AP) and a second point (Stop Point, SP) (see FIG. 20).
  • the first point AP may be understood as a starting point from which the robot R moves toward the charging station S while facing the charging station S in order to attempt docking with the charging station S.
  • the second point SP may be understood as a destination where the robot R, which moves at a constant speed below a preset speed limit toward the charging station S, stops.
  • the cloud server 20 monitors whether a charging event for the robot R occurs, and when the charging event occurs, can perform overall control related to the charging of the robot R so that the robot R accurately docks with the charging station S to perform charging.
  • the cloud server 20 may assign the charging station S at which the robot R will perform charging, in consideration of at least one of the remaining battery capacity information of the robot R received from the robot R and the surrounding environment of the robot R.
  • the cloud server 20 may control the movement of the robot R so that the robot R moves to the assigned charging station S and, at the first point AP and the second point SP around the charging station S, the robot R and the charging station S face each other.
  • control of the robot R, which is described as being performed by the robot R in the present invention, may be performed through the cloud server 20.
  • data processing performed for charging control of the robot R may be performed in at least one of the cloud server 20 and the robot R.
  • the charging station (S) included in the charging control system 1300 according to the present invention may be disposed in the same building 1000 as the robot (R).
  • the charging station S may include at least one of a charging terminal 1440 that delivers power (or electrical energy) to the robot R, and a marker M including information related to the charging station S.
  • the “marker M” may include a predetermined pattern for guiding or indicating information related to the charging station S.
  • the marker M may refer to a visual reference marker generated based on a library or application for augmented reality, robotics, and calibration of the camera 1321, such as ARToolKit, ARTag, AprilTag, ArUco, and the like.
  • the marker M may include a plurality of grid patterns arranged on a grid map composed of grids at regular intervals. More specifically, the marker M may include a plurality of grid patterns of a first color (eg, white) and a plurality of grid patterns of a second color (eg, black). In other words, the marker M may be composed of a combination of a plurality of first color grids and a plurality of second color grids.
  • the marker M provided in the charging station S may include a first marker and a second marker having different sizes. That is, the marker M provided in the charging station S may be formed of a combination of the first marker 1410 and the second marker 1420.
  • a relatively large first marker 1410 is disposed at the bottom, and a relatively small second marker 1420 is disposed at the top.
  • a first marker 1410 and a second marker 1420 having different sizes may be vertically disposed on the marker M provided in the charging station S.
  • the first marker 1410 and the second marker 1420 may overlap on the plane (or virtual plane, 1430) of the grid map on which the grid patterns are aligned.
  • the robot (R) based on at least one of the distance between the marker (M) and the robot (R) provided in the charging station (S) and the angle of view of the camera 1321 provided in the robot (R), At least one of the first marker 1410 and the second marker 1420 may be recognized.
  • when the robot R is located at a distance, between the marker M provided in the charging station S and the robot R, from which the camera 1321 of the robot R can acquire an image 1510 including the entire first marker 1410, the robot R may perform recognition on the first marker 1410.
  • the robot R may perform recognition of the relatively large first marker 1410 from the image 1510 acquired from the camera 1321.
  • when the robot R is located at a distance from which the camera 1321 of the robot R cannot acquire the entire image of the first marker 1410, the robot R may perform recognition on the second marker 1420.
  • the robot R may perform recognition on the second marker 1420 having a relatively small size from the image 1520 obtained from the camera 1321 .
  • which marker among the first marker 1410 and the second marker 1420 is recognized by the robot R in the present invention may vary depending on the distance between the robot R and the charging station S.
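  • As an illustrative sketch only (the patent does not disclose concrete values), the distance-dependent choice between the two differently sized markers could be expressed as follows; the threshold distance is an assumed parameter:

```python
# Hypothetical sketch: select which of the two markers to recognize based on
# the estimated robot-to-station distance. Far away, the whole large (first)
# marker fits in the camera frame; up close, only the small (second) marker
# remains fully visible. The threshold is an assumption, not from the patent.

FULL_VIEW_DISTANCE_M = 1.5  # assumed distance at which the large marker still fits in the frame

def select_marker(distance_to_station_m: float) -> str:
    """Return which marker the robot should try to recognize at this distance."""
    if distance_to_station_m >= FULL_VIEW_DISTANCE_M:
        return "first"   # relatively large marker 1410
    return "second"      # relatively small marker 1420

print(select_marker(3.0))  # first
print(select_marker(0.5))  # second
```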
  • a process of receiving, from the cloud server 20, identification information of the charging station S at which the robot R will perform charging may proceed (S110, see FIG. 16).
  • the charging event in the robot R may occur, under the control of at least one of the robot R and the cloud server 20, based on the remaining battery level of the robot R being equal to or less than (or less than) a preset battery level.
  • the robot R may transmit remaining battery information to the cloud server 20 .
  • the robot R may transmit the remaining battery level information to the cloud server 20 at each predetermined period, or may transmit the remaining battery level information to the cloud server 20 when the remaining battery level is equal to or less than (or less than) a predetermined battery level. That is, the robot R may notify the cloud server 20 that charging is required by transmitting battery information to the cloud server 20.
  • the cloud server 20 may determine whether or not the robot R needs charging based on the remaining battery information received from the robot R.
  • the cloud server 20 may determine that a charging event has occurred in the robot that transmitted the corresponding battery information.
  • the cloud server 20 monitors the state of the robot R by receiving the remaining battery amount information from the robot R, and based on the remaining battery amount information of the robot R according to the monitoring, the robot R requires charging. It can recognize whether it is in a state or a state in which charging is not required.
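  • The server-side monitoring described above can be sketched as follows; this is an illustrative assumption of the logic, with an assumed threshold and data shapes:

```python
# Hypothetical sketch: the cloud server raises a charging event when a robot's
# reported remaining battery drops to or below a preset level. The threshold
# value and the report format are assumptions for illustration.

BATTERY_THRESHOLD = 30  # assumed preset battery level (percent)

def charging_event_occurred(remaining_battery: int) -> bool:
    """True when the reported remaining battery indicates charging is required."""
    return remaining_battery <= BATTERY_THRESHOLD

def monitor(reports: dict) -> list:
    """Return IDs of robots for which a charging event has occurred."""
    return [robot_id for robot_id, level in reports.items()
            if charging_event_occurred(level)]

print(monitor({"R1": 80, "R2": 25, "R3": 30}))  # ['R2', 'R3']
```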
  • the cloud server 20 may transmit identification information of a charging station S through which the robot R can perform charging to the robot R.
  • the identification information of the charging station S may include at least one of location information of a place where the charging station S is located and a marker image corresponding to the marker M provided in the charging station S.
  • the "location information of the place where the charging station S is located" described in the present invention may include at least one of: i) information on a specific area in the building 1000 where the charging station is located (e.g., area A on the 3rd floor), ii) node information corresponding to the charging station S (e.g., node 18), iii) coordinate information corresponding to the charging station S (e.g., (3, 1, 1)), and iv) movement path information from the current location of the robot R to the charging station S.
  • the cloud server 20 transmits node information (N18, 1830) of the place where the charging station (S3) is located to the robot (R), or movement path information from the current location (N2, 1820) of the robot (R) to the charging station (S) (Moving path from N2 to N18 via N3 and N13, 1840) can be transmitted to the robot R.
  • the location information of the place where the charging station S is located may include location information about a specific point defined based on the charging station S (e.g., a point, among a plurality of points spaced apart from the charging station S by a specific distance, located on a straight line perpendicular to the front SF of the charging station; this will be described in more detail later).
  • the cloud server 20 may extract location information of a place where the charging station S is located using map information stored in a database.
  • Location information of the charging station S disposed in the building 1000 or the space 10 may be matched with the map information.
  • the map of the space 10 may be a map prepared based on Simultaneous Localization and Mapping (SLAM) by at least one robot moving the space 10 in advance.
  • the map of the space 10 may be a map generated based on image information.
  • the cloud server 20 may assign to the robot R, among a plurality of charging stations (first to fifth charging stations S1, S2, S3, S4 and S5, see FIG. 18) in the building 1000 or space 10 where the robot R is located, a specific charging station (third charging station S3) existing within a distance movable with the remaining battery of the robot R, based on at least one of the remaining battery capacity information 1810 of the robot R and the surrounding environment information of the robot R.
  • the cloud server 20 may calculate a movable distance of the robot R from the current position (N2, 1820) of the robot R based on the remaining battery capacity information 1810 of the robot R, and may identify at least one charging station (first to third charging stations S1, S2 and S3) present within the movable distance.
  • the cloud server 20 may assign, among the identified at least one charging station (first to third charging stations S1, S2 and S3), the third charging station S3 existing at the position (N18, 1830) corresponding to the shortest path 1840 from the current position (N2, 1820) of the robot R, as the charging station S at which the robot R will perform charging.
  • when the third charging station S3 existing at the position corresponding to the shortest path 1840 is being used for charging another robot, the cloud server 20 may assign to the robot R the second charging station S2 existing at the position (N17, 1850) corresponding to the shortest path among the identified at least one charging station (S1, S2) excluding the third charging station S3.
  • that is, when the third charging station S3 existing at the position corresponding to the first shortest movement path 1840 is being used for charging another robot, the cloud server 20 may assign to the robot R the second charging station S2 existing at the position (N17, 1850) corresponding to the second shortest movement path.
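  • The allocation logic described above (stations reachable with the remaining battery, shortest path first, skipping stations in use) can be sketched as follows; the graph, node names, edge weights, and the battery-to-distance conversion are illustrative assumptions, not the disclosed implementation:

```python
import heapq

# Hypothetical sketch of charging-station allocation: among stations reachable
# within the robot's movable distance, pick the one with the shortest path
# from the robot's current node, skipping stations already in use.

def shortest_dists(graph, start):
    """Dijkstra: shortest distance from start to every reachable node."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return dist

def assign_station(graph, robot_node, movable_distance, stations, in_use):
    """Assign the reachable, unoccupied station with the shortest path; None if none."""
    dist = shortest_dists(graph, robot_node)
    candidates = [(dist[n], s) for s, n in stations.items()
                  if n in dist and dist[n] <= movable_distance and s not in in_use]
    return min(candidates)[1] if candidates else None

# Illustrative map: node names echo the figure, weights are assumed.
graph = {"N2": [("N3", 1)], "N3": [("N13", 1), ("N17", 3)],
         "N13": [("N18", 1)], "N17": []}
stations = {"S2": "N17", "S3": "N18"}
print(assign_station(graph, "N2", 5, stations, in_use=set()))   # S3 (shortest path)
print(assign_station(graph, "N2", 5, stations, in_use={"S3"}))  # S2 (next shortest)
```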
  • the cloud server 20 may transmit the identification information of the charging station S assigned to the robot R to the robot R, and may continuously monitor whether the robot R arrives at the place where the assigned charging station S is located, whether the robot R is docked to the assigned charging station S, or whether the robot R is charging through the assigned charging station S.
  • next, a process of controlling the movement of the robot R so that the robot R moves to the place where the charging station is located may proceed (S120, see FIG. 16).
  • the robot R may receive identification information including location information of a place where the charging station S is located and a marker image corresponding to the marker M provided in the charging station S from the cloud server 20 .
  • the robot R may control the traveling unit 1340 to move to a place where the charging station S included in the identification information is located.
  • the robot R may generate, based on the identification information on the third charging station S3, movement paths 1910 and 1920 from the point where the robot R is currently located to the place where the third charging station S3 is located.
  • Such a movement path may be generated from the server 20 and transmitted to the robot R.
  • the robot R may move to a place where the third charging station S3 is located by moving along the moving paths 1910 and 1920 .
  • the robot R may receive identification information of the charging station S assigned to the robot R from the cloud server 20, and move to a place where the charging station S is located based on the identification information ( S201, S202, S203, S204 and S205 see FIG. 17). At this time, the robot R may receive a marker image corresponding to the marker M provided in the charging station S from the cloud server 20 (S206, see FIG. 17).
  • the cloud server 20 may estimate the current location of the robot R by comparing the robot image 910 obtained from the robot R with a map generated by vision (or visual) based SLAM technology.
  • the cloud server 20 may specify the location information of the robot R by i) specifying an image most similar to the robot image 910 using an image comparison between the robot image 910 and images constituting a pre-generated map, and ii) acquiring location information matched to the specified image.
  • the cloud server 20 uses the obtained robot image 910 to specify the current location of the robot.
  • the cloud server 20 may extract location information (e.g., coordinate information) corresponding to the robot image 910 from map information previously stored in a database (which may also be named a "reference map").
  • the robot R or the server 20 may specify the current location of the robot R using the image obtained from the robot R, so that the robot R can be controlled to move to the place where the charging station S is located.
  • a process of recognizing the marker M provided in the charging station S through the sensor provided in the robot R may proceed (S130, see FIG. 16).
  • the robot R may control the camera 1321 to acquire an image including the marker M provided in the charging station S, based on arriving at the vicinity of the place where the charging station S is located.
  • the robot R may perform recognition of the marker by detecting the marker M provided in the charging station S from the image received from the camera 1321 .
  • since the marker M provided in the charging station S includes a first marker 1410 and a second marker 1420 having different sizes, the robot R may perform recognition of any one of the first marker 1410 and the second marker 1420 based on the distance between the location of the robot R and the charging station S.
  • the robot R detects the marker M from the image acquired through the camera 1321, and the detected marker M and the marker included in the identification information of the charging station S received from the cloud server 20 images can be compared.
  • when the detected marker M corresponds to the marker image, the robot R may perform recognition of the corresponding marker M.
  • performing recognition of the marker M may be understood as confirming (interpreting or extracting) information included in the marker M and estimating the pose of the marker M.
  • Recognition of such a marker M may be performed through a process of extracting a plurality of corner points of the marker M from an image including the marker M.
  • the marker M may be composed of a combination of a plurality of first color grids and a plurality of second color grids on the plane (or virtual plane, 1430) of the grid map on which the grid patterns are aligned.
  • a corner point described in the present invention may refer to a point where vertices of grids of different colors among vertices of grids in a marker meet.
  • the robot R may recognize the marker M by extracting a plurality of corner points of the marker M from the image, based on the coordinates of the plurality of corner points on the virtual plane and the coordinates of the extracted plurality of corner points in the image.
  • Coordinates of the extracted plurality of corner points in the image may be calculated using homography.
  • homography may include or refer to a matrix for converting coordinates of a plurality of corner points on a virtual plane into coordinates of a plurality of corner points in an image.
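  • The homography step can be sketched as follows; the matrix values here are an illustrative assumption (a simple scale-and-translate mapping), not an estimated homography, and the function only shows how plane coordinates map to image coordinates through homogeneous coordinates:

```python
# Hypothetical sketch: a 3x3 homography H maps a corner point (x, y) on the
# marker's virtual plane to pixel coordinates in the image via homogeneous
# coordinates: [xh, yh, w] = H * [x, y, 1], then divide by w.

def apply_homography(H, point):
    """Map a plane point (x, y) to image coordinates using homography H."""
    x, y = point
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)

# Example H: scale plane units by 100 pixels and shift to image center (320, 240).
H = [[100, 0, 320],
     [0, 100, 240],
     [0,   0,   1]]

corners_on_plane = [(0, 0), (1, 0), (1, 1), (0, 1)]
print([apply_homography(H, p) for p in corners_on_plane])
# [(320.0, 240.0), (420.0, 240.0), (420.0, 340.0), (320.0, 340.0)]
```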
  • a charging process or charging mode may be executed (or activated) in the robot R.
  • the robot (R) is equipped with a charging algorithm for charging with the charging station (S), and through the recognition of the marker (M), the robot (R) performs a series of charging with the charging station (S). process can be performed.
  • a series of processes for performing charging with the charging station S can be expressed as a charging process or a charging mode in the present invention.
  • at least one of the movement and posture of the robot R may be controlled so that the robot R is positioned facing the charging station S and accurately docks with the charging station S.
  • here, the robot R and the charging station S being positioned facing each other can be understood to mean that the front RF of the robot and the front SF of the charging station face each other (see (a) of FIG. 12a).
  • in the present invention, controlling the movement of the robot R so that the front RF of the robot and the front SF of the charging station S face each other can be expressed as "alignment of the robot R" or "aligning the robot R with respect to the charging station S".
  • the movement of the robot R may be to control the movement of the robot R so that the position of the robot R is maintained or changed, or to maintain or change the posture of the robot R.
  • control of the posture of the robot R may be performed by rotating (yaw) the robot R.
  • the movement of the robot R may be such that the robot R rotates in place, moves in a left-right horizontal direction, moves along a movement path, or changes its posture.
  • in the present invention, the robot R and the charging station S can be controlled to face each other at two different points so that the robot R is accurately docked to the charging station S. The two different points can mean a first point (Approaching Point, AP) and a second point (Stop Point, SP) having different separation distances from the charging station S.
  • first and second points may be points defined based on the charging station (S).
  • the first point and the second point may be defined based on the marker coordinate system of the marker M provided in the charging station S.
  • information on the first point and the second point may be stored in the robot R.
  • Information on the first point and the second point may exist in the charging algorithm installed in the robot R.
  • the first point AP can be understood as a starting point from which the robot R moves toward the charging station S while facing the charging station S, in order to attempt docking to the charging station S.
  • the first point AP may be located at a distance spaced apart from the charging station S by a first distance A on a straight line 2010 perpendicular to the front side SF of the charging station.
  • the first separation distance A may be a distance that allows a certain space to exist between the robot R and the charging station S (see (a) of FIG. 12b).
  • the second point SP may be understood as a destination where the robot R, which moves at a constant speed below a preset speed limit toward the charging station S, stops.
  • the second point SP may be located on a straight line 2010 perpendicular to the front side SF of the charging station and spaced apart from the charging station S by the second separation distance B.
  • the second separation distance B may be a distance at which the robot R and the charging station S are docked. That is, the second separation distance B may be a distance such that the charging terminal 1351 of the robot R and the charging terminal 1440 of the charging station S come into contact or are coupled (see (b) of FIG. 12B). ).
  • the first separation distance (A) may be relatively greater than the second separation distance (B), and the first point (AP) may be located farther from the charging station (S) than the second point (SP).
  • the first point AP and the second point SP may be located within the marker recognition area 2020 where the robot R can recognize the marker M provided in the charging station S.
  • when the maximum distance at which the robot R can recognize the marker M provided in the charging station S is defined as the third separation distance C, the marker recognizable area 2020 is an area within the third separation distance C from the charging station S, and the first separation distance A and the second separation distance B may be smaller than the third separation distance C.
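  • The geometry described above (AP and SP lying on the straight line perpendicular to the station front, at separation distances A and B, both inside the recognition radius C) can be sketched as follows; the numeric distances and the station pose are illustrative assumptions:

```python
import math

# Hypothetical sketch: compute the first point (AP) and second point (SP)
# along the outward normal of the charging station's front face.
# A > B, and both must lie within the marker-recognizable radius C.
# All numeric values are assumptions for illustration.

A, B, C = 1.0, 0.1, 3.0  # assumed first/second separation distances and recognition radius (m)

def point_in_front(station_xy, front_heading_rad, distance):
    """Point located `distance` along the outward normal of the station front."""
    sx, sy = station_xy
    return (sx + distance * math.cos(front_heading_rad),
            sy + distance * math.sin(front_heading_rad))

station = (0.0, 0.0)
heading = 0.0  # assume the station front faces the +x direction
ap = point_in_front(station, heading, A)  # first point (AP)
sp = point_in_front(station, heading, B)  # second point (SP), where docking occurs
assert B < A < C  # AP is farther than SP; both are within the recognizable area
print(ap, sp)  # (1.0, 0.0) (0.1, 0.0)
```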
  • the robot R based on recognizing the marker M provided in the charging station S, determines the first point AP and the second point SP based on the charging station S. Location information can be calculated.
  • the robot R may calculate location information for the first point AP and the second point SP using the charging algorithm stored in the storage unit 1330 of the robot R, based on the marker coordinate system related to the recognized marker M. The charging algorithm stored in the storage unit 1330 of the robot R may include distance information used to calculate the location information of the first point AP and the second point SP based on the marker coordinate system.
  • the first point AP and the second point SP described in the present invention are defined based on the marker coordinate system for the marker M recognized by the robot R, and the positions of the first point AP and the second point SP may be recognized based on the relative positions of the marker M coordinate system and the robot R.
  • data processing for recognizing the locations of the first point AP and the second point SP may also be performed by the cloud server 20 .
  • next, a process of controlling the movement of the robot R to the first point AP, so that the robot R faces the marker M at the first point AP spaced apart from the charging station S by a specific distance, may proceed (S140, see FIG. 16).
  • the cloud server 20 may control the movement of the robot R so that the robot R moves from the point where the marker M is recognized to the first point AP, by using the marker M recognized from the image.
  • the robot R may recognize the marker M provided in the charging station S from the image 2120 captured through the camera 1321, in the marker recognizable area 2020 around the charging station S.
  • the robot R may recognize the marker M provided in the charging station S while rotating in place (S207, FIG. 17).
  • the position corresponding to the point at which the robot R recognizes the marker M is named a first position 2111.
  • the controller of the robot R or the cloud server 20 may control the movement of the robot R to stop based on recognition of the marker M provided in the charging station S.
  • the robot R may stop at the first position 2111 where the marker M is recognized.
  • the control unit of the robot R or the cloud server 20 may control the movement of the robot R so that the central axis Y1 in one direction (e.g., y-axis) of the image 2120 acquired from the camera 1321 at the first position 2111 and the central axis Y2 in one direction (e.g., y-axis) of the marker M detected in the image coincide with each other (S208, FIG. 17).
  • the movement of the robot R may be to control the movement of the robot R so that the position of the robot R is maintained or changed, or to maintain or change the posture of the robot R.
  • for example, the controller of the robot R or the cloud server 20 may control the stopped robot R to rotate clockwise by a certain angle until the central axis Y1 of the image obtained from the camera 1321 coincides with the central axis Y2 of the marker.
  • here, the central axes Y1 and Y2 of the image and the marker M coinciding with each other can be understood as the central axis Y1 of the image and the central axis Y2 of the marker in the image being located in close proximity within a preset range.
  • that is, when the distance between the central axis Y1 of the image and the central axis Y2 of the marker is within a preset range in the image, it can be determined that they coincide even if they do not overlap.
  • this preset range is an error range, and may be set based on at least one of the characteristics of the image acquired by the camera 1321 of the robot R, the information recognized from the marker M provided in the charging station S, and location information.
  • the robot R or the cloud server 20 may additionally control the height of the robot R so that a central axis (not shown) in the x-axis direction of the image acquired by the camera 1321 and a central axis (not shown) in the x-axis direction of the marker M detected in the image coincide with each other.
  • when the central axis Y1 of the image and the central axis Y2 of the marker coincide with each other, the control unit of the robot R or the cloud server 20 may stop the movement of the robot R so that the coinciding state of the central axis Y1 of the image and the central axis Y2 of the marker is maintained.
  • the controller of the robot R or the cloud server 20 can control the rotation of the robot R to stop.
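  • The in-place alignment step can be sketched as follows; the pixel values, tolerance, and the pixels-per-degree rotation model are illustrative assumptions, not disclosed parameters:

```python
# Hypothetical sketch: rotate the stopped robot by small fixed increments
# until the marker's central axis (Y2) in the image comes within a preset
# tolerance of the image's central axis (Y1), then stop rotating.

IMAGE_CENTER_X = 320   # assumed central axis Y1 of a 640-px-wide image
TOLERANCE_PX = 5       # assumed preset error range
PX_PER_DEGREE = 10     # assumed shift of the marker axis per degree of yaw

def align(marker_axis_x: float, max_steps: int = 100) -> float:
    """Rotate until |Y1 - Y2| <= tolerance; return the final marker axis position."""
    for _ in range(max_steps):
        offset = marker_axis_x - IMAGE_CENTER_X
        if abs(offset) <= TOLERANCE_PX:
            break  # axes considered coincident within the error range
        # rotate one degree toward the marker (clockwise or counterclockwise)
        marker_axis_x -= PX_PER_DEGREE if offset > 0 else -PX_PER_DEGREE
    return marker_axis_x

final = align(402.0)
assert abs(final - IMAGE_CENTER_X) <= TOLERANCE_PX  # alignment achieved
print(final)  # 322.0
```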
  • thereafter, the controller of the robot R or the cloud server 20 may move the robot R toward the charging station S, while maintaining the central axis Y1 of the image and the central axis Y2 of the marker coinciding with each other, so that the robot R is positioned at a corresponding point 2112 within a specific distance from the charging station S (S209, FIG. 17).
  • the point 2112 corresponding to within a specific distance from the charging station S may mean a position spaced apart from the charging station S by the first separation distance A; this is intended to be the first point AP, but may not coincide with it, as shown in (c) of FIG. 21a.
  • a corresponding point 2112 within a specific distance from the charging station S will be referred to as the second position 2112 .
  • the controller of the robot R or the cloud server 20 controls the robot R to stop and rotate at the first position 2111, and then moves the robot R from the first position 2111 to the second position ( 2112) can be controlled to move.
  • the controller of the robot R or the cloud server 20 may calculate, in a state where the robot R has moved to the second position 2112, a relative positional relationship among the second position 2112, the position of the marker M, and the first point AP (S210, FIG. 17).
  • the cloud server 20 may calculate a relative positional relationship between the second position 2112, the position of the marker M, and the first point AP based on the marker coordinate system for the recognized marker M.
  • the controller of the robot R or the cloud server 20 may calculate i) a relative positional relationship between the second position 2112 and the first point AP, ii) a relative positional relationship between the first point AP and the marker M, and iii) a relative positional relationship between the marker M and the second position 2112.
  • the controller of the robot R or the cloud server 20 may calculate in which direction, and at what distance D, the first point AP is located from the second position 2112.
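The direction-and-distance computation of S210 can be illustrated with elementary 2-D pose math. The sketch below assumes the marker pose (x, y, heading) has been estimated in the robot's frame, and that the first point AP is defined at a fixed offset in the marker's own coordinate system; both assumptions are illustrative, not taken from the disclosure.

```python
import math

def point_in_robot_frame(marker_pose, offset_in_marker_frame):
    """Transform a point given in the marker coordinate system into the robot frame."""
    mx, my, mtheta = marker_pose          # marker position and heading in robot frame
    ox, oy = offset_in_marker_frame       # AP offset expressed in the marker frame
    x = mx + ox * math.cos(mtheta) - oy * math.sin(mtheta)
    y = my + ox * math.sin(mtheta) + oy * math.cos(mtheta)
    return x, y

def direction_and_distance(marker_pose, ap_offset):
    """In which direction, and at what distance D, AP lies from the robot."""
    x, y = point_in_robot_frame(marker_pose, ap_offset)
    return math.atan2(y, x), math.hypot(x, y)
```

For example, with a marker 2 m straight ahead facing the robot and AP 1 m in front of the marker, AP comes out 1 m directly ahead of the robot, which is the rotation target of S211.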
  • after the cloud server 20 calculates the relative positional relationship among the second position 2112, the marker M, and the first point AP, the rotation of the robot R may be controlled so that the front face RF of the robot faces the first point AP from the second position 2112 (S211, FIG. 17).
  • the robot R may be rotated clockwise so that the front surface RF faces the first point AP.
  • the controller of the robot R or the cloud server 20 may control the movement of the robot R so that the robot R, in a state of facing the first point AP at the second position 2112, moves to the first point AP (S212, FIG. 17).
  • the controller of the robot R or the cloud server 20 may generate a movement path from the second position 2112 to the first point AP based on the previously calculated relative positional relationship between the second position 2112 and the first point AP, and control the robot R to move along the movement path.
  • based on the movement of the robot R to the first point AP, the control unit of the robot R or the cloud server 20 may control the movement (or rotation) of the robot R so that, at the first point AP, the robot R faces the marker M provided in the charging station S (S213, FIG. 17).
  • the controller of the robot R or the cloud server 20 may control the movement (e.g., rotation or translation) of the robot R so that, at the first point AP, the robot R faces the marker M provided in the charging station S.
  • if the controller of the robot R or the cloud server 20 determines that, at the first point AP, the marker M is located in the 8 o'clock direction relative to the front RF of the robot, the robot R may be rotated counterclockwise so that the front RF of the robot faces the marker M.
  • the controller of the robot R or the cloud server 20 may control the robot R to rotate at the first point AP until the central axis Y1 of the image 2120 received from the camera 1321 of the robot R coincides with the central axis Y2 of the marker.
  • when the distance between the central axis Y1 of the image and the central axis Y2 of the marker is within a preset range in the image, the two axes can be determined to coincide even if they do not exactly overlap each other.
  • as described above, the robot R moves from the first position 2111, where the marker M is recognized, through the second position 2112 satisfying a certain criterion, to the first point AP, and may be controlled so that the central axis Y1 of the image and the central axis Y2 of the marker coincide.
  • through this, the robot R can be controlled to face the charging station S and, furthermore, to dock accurately with the charging station S.
  • a process of generating a movement path of the robot R from the first point AP to the second point SP, which is closer to the charging station S than the first point AP, may proceed (S150, see FIG. 16).
  • the controller of the robot R or the cloud server 20 may create a movement path from the first point AP to the second point SP so that the robot R, which has moved to the first point AP, faces the marker M provided in the charging station S at the second point SP (S214, FIG. 17).
  • the movement path from the first point AP to the second point SP will be referred to as a “final movement path”.
  • in the ideal state after the alignment related to the first point AP, the robot R is located at the first point AP and the front face RF of the robot faces the front face SF of the charging station.
  • errors may exist in i) estimating the posture of the marker M through recognition of the marker M and ii) controlling the movement of the robot R.
  • the error generated in the control related to the first point AP may occur because the central axis Y1 of the image and the central axis Y2 of the marker do not coincide, or because the robot R does not move exactly along the movement path.
  • as a result, the robot R may exist at a third position 2113 other than the first point AP.
  • the front (RF) of the robot may not face the charging station (S) or the marker (M) provided in the charging station (S).
  • accordingly, the final movement path of the robot R can be created by compensating for the error generated in the control related to the first point AP.
  • the actual location of the robot R can be specified using the image captured by the camera 1321 of the robot R.
  • the controller of the robot R or the cloud server 20 can estimate the actual location of the robot R by comparing the image obtained from the robot R with a map generated by vision (or visual)-based SLAM technology.
  • the actual position estimated by the controller of the robot R or the cloud server 20 is named the third position 2113 in the present invention.
  • the third location 2113 may or may not correspond to the first point AP.
  • the controller of the robot R or the cloud server 20 may generate a final movement path from the third position 2113 to the second point SP.
  • the controller of the robot R or the cloud server 20 may create the final movement path from the third position 2113 to the second point SP to include a first movement path 2310 composed of a curved section and a second movement path 2320 composed of a straight section.
  • the first movement path 2310 may be understood as a curved path from the third position 2113 to the fourth position 2114 included in the straight line connecting the first point AP and the second point SP.
  • the second movement path 2320 may be understood as a straight path from the fourth position 2114 to the second point SP.
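The curved-then-straight final movement path of S214 can be sketched as follows. The quadratic Bézier curve, the choice of control point, and the sampling resolution are illustrative assumptions; the disclosure only requires a curved section from the third position to a fourth position on the line through AP and SP, followed by a straight section to SP.

```python
def bezier(p0, p1, p2, t):
    """Quadratic Bezier point for parameter t in [0, 1]."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return x, y

def final_movement_path(third_pos, fourth_pos, ap, sp, n=10):
    """Curved section third_pos -> fourth_pos, then straight section fourth_pos -> SP."""
    # Curved section: Bezier from the third position to the fourth position,
    # using AP as the control point (an illustrative choice).
    curve = [bezier(third_pos, ap, fourth_pos, i / n) for i in range(n + 1)]
    # Straight section: linear interpolation from the fourth position to SP.
    straight = [(fourth_pos[0] + (sp[0] - fourth_pos[0]) * i / n,
                 fourth_pos[1] + (sp[1] - fourth_pos[1]) * i / n)
                for i in range(1, n + 1)]
    return curve + straight
```

The curved section absorbs the lateral error accumulated at the third position, so that the robot enters the straight section already aligned with the AP-SP line.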
  • the controller of the robot R or the cloud server 20 may control the robot R to follow the final movement path including the first movement path 2310 and the second movement path 2320 (S215, FIG. 17).
  • the controller of the robot R or the cloud server 20 can precisely control the movement of the robot R by controlling the robot R to follow the final movement path using a Kalman-filter-based feedback loop control method.
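A minimal sketch of Kalman-filter-based feedback loop control for path following: a one-dimensional filter estimates the robot's lateral deviation from the final movement path out of noisy camera measurements, and a proportional feedback term steers the robot back. The state model, gains, and noise values are all assumptions for illustration, not parameters from the disclosure.

```python
class Kalman1D:
    def __init__(self, q=1e-4, r=1e-2):
        self.x = 0.0   # estimated lateral deviation from the path (m)
        self.p = 1.0   # estimate variance
        self.q = q     # assumed process noise
        self.r = r     # assumed measurement noise

    def update(self, z: float) -> float:
        self.p += self.q                   # predict step (static state model)
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct with noisy measurement z
        self.p *= (1 - k)
        return self.x

def steering_correction(filt: Kalman1D, measured_deviation: float, gain: float = 1.5):
    """Feedback loop: angular-velocity command proportional to the filtered deviation."""
    return -gain * filt.update(measured_deviation)
```

Filtering before feeding back suppresses jitter in the marker-based deviation measurement, which is what makes the path following "precise" in the sense described above.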
  • based on the robot R moving, along the final movement path 2310 or 2320, to within a specific distance E from the second point SP, the controller of the robot R or the cloud server 20 may control the robot R to move at a constant speed not exceeding a preset limit speed.
  • a point spaced apart from the second point SP by a specific distance E is referred to as a fifth position 2115 .
  • from the fifth position 2115, the controller of the robot R or the cloud server 20 may control the robot R to move at a constant speed, not exceeding the preset limit speed, toward the second point SP.
  • while the robot R moves at a constant speed, the posture of the robot R may also be controlled so that the central axis Y1 of the received image and the central axis Y2 of the marker M coincide with each other.
  • while the robot R moves along the final movement path, the controller of the robot R or the cloud server 20 may control at least one of the position and the posture of the robot R so that the central axis Y1 of the image received from the camera 1321 of the robot R and the central axis Y2 of the marker M coincide with each other.
  • when the robot R is at the fifth position 2115, the control unit of the robot R or the cloud server 20 may, rather than making the robot R follow the final movement path, control either the position of the robot R or the rotation (or rotation angle) of the robot R so that the central axis Y1 of the image received from the camera 1321 of the robot R and the central axis Y2 of the marker M coincide.
  • that is, while the robot R moves at a constant speed from the fifth position 2115 at the preset limit speed, it can rotate so that the central axis Y1 of the image acquired from the camera 1321 and the central axis Y2 of the marker M coincide with each other.
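The final-approach behavior within the specific distance E can be sketched as a small control rule: outside E the path follower is in charge; inside E the robot holds a constant speed at the limit while small heading corrections keep the marker axis on the image axis. The speed limit, gain, and distance E are assumed values.

```python
SPEED_LIMIT = 0.10       # m/s, assumed preset limit speed
HEADING_GAIN = 0.005     # rad/s per pixel of axis offset (assumed)

def final_approach_command(distance_to_sp: float, axis_offset_px: float, e: float = 0.5):
    """Return (linear_velocity, angular_velocity) inside the constant-speed zone, else None."""
    if distance_to_sp > e:
        return None                              # not yet within E of the second point SP
    angular = -HEADING_GAIN * axis_offset_px     # steer the marker back onto the image axis
    return SPEED_LIMIT, angular                  # constant speed at the preset limit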
  • the robot R is controlled to move toward the second point SP along the final movement path, and based on the robot R moving toward the second point SP, a process of charging the robot R through the charging station S may proceed (S160, see FIG. 16).
  • the controller of the robot R or the cloud server 20 may control at least one of the movement and the posture of the robot R so that the robot R is docked with the charging station S and charging is performed through the charging station S.
  • the charging station S may supply power (or electrical energy) to the robot R based on contact (or coupling) between the charging terminal 1440 of the charging station S and the charging terminal 1351 of the robot R.
  • the robot R may perform charging by receiving power (or electric energy) from the charging station S through the contact terminal 1351 and accumulating the power in the battery.
  • in the present invention, it is possible to monitor whether the robot R is accurately docked to the charging station S and charging is being performed; if the robot R is not being charged, the robot R can be controlled to re-dock to the charging station S.
  • the controller of the robot R or the cloud server 20 may activate a timer and count time based on the robot R being located within the specific distance E from the second point SP (S216 and S217, FIG. 17).
  • counting the time by activating the timer may be understood as counting the time based on the time when the robot R reaches the fifth position 2115.
  • the controller of the robot R or the cloud server 20 may control the movement of the robot R so that the robot R moves at a constant speed, below the preset limit speed, while the central axis Y1 of the image received from the camera 1321 of the robot R coincides with the central axis Y2 of the marker M (S218, FIG. 17).
  • the controller of the robot R or the cloud server 20 may check whether current flows between the robot R and the charging station S. That is, when the robot R is located within the specific distance E from the second point SP, the controller of the robot R or the cloud server 20 can monitor whether the robot R is accurately docked to the charging station S and charging is being performed from the charging station S (S219, FIG. 17).
  • when it is confirmed that current flows between the robot R and the charging station S, the controller of the robot R or the cloud server 20 can control the robot R, which is moving at a constant speed below the preset limit speed, to stop.
  • when current flows between the robot R and the charging station S, the controller of the robot R or the cloud server 20 determines that the robot R is accurately docked to the charging station S and charging is being performed, and can control the robot R not to move any further (S220, FIG. 17). Furthermore, the cloud server 20 may terminate the charging control for the robot R (S221, FIG. 17). At this time, counting of the timer may also be terminated.
  • the controller of the robot R or the cloud server 20 may check the elapsed time according to the counting when it is confirmed that current does not flow between the robot R and the charging station S.
  • the controller of the robot R or the cloud server 20 may determine whether the elapsed time according to counting satisfies a preset time condition (S222, FIG. 17).
  • the “preset time condition” can be understood as determining whether current does not flow between the robot R and the charging station S because the robot R is still moving toward the second point SP, or whether current does not flow even though the robot R is docked with the charging station S.
  • This preset time condition may be set based on a time when the robot R is expected to arrive at the second point SP by moving at a constant speed below the preset limit speed from the fifth position 2115 .
  • the preset time condition may be related to whether the elapsed time according to counting is within (less than or equal to) the second time.
  • when the elapsed time according to the counting satisfies the preset condition, the controller of the robot R or the cloud server 20 determines that current does not flow between the robot R and the charging station S because the robot R is still moving toward the second point SP, and may maintain the constant-speed movement of the robot R so that the robot R moves toward the second point SP at the preset limit speed.
  • that is, when the elapsed time according to the counting is within the preset time, the controller of the robot R or the cloud server 20 can maintain the constant-speed movement of the robot R so that the robot R moves toward the second point SP at the preset limit speed.
  • when the elapsed time according to the counting does not satisfy the preset condition, the controller of the robot R or the cloud server 20 may determine that, although the robot R has arrived at the second point SP and attempted to dock with the charging station S, current does not flow between the robot R and the charging station S because the docking was not accurately made.
  • in order for the robot R to retry charging through the charging station S, the controller of the robot R or the cloud server 20 can control the robot R to move backward away from the second point SP and then move toward the first point AP again (S215, FIG. 17).
  • for the robot R that has moved backward away from the second point SP and arrived at or around the first point AP, the controller of the robot R or the cloud server 20 may retry charging the robot R through the charging station S by controlling the final-movement-path generation process and the movement along the final movement path described above to be re-performed.
  • when the robot R moves, along the regenerated final movement path, to within the specific distance E from the second point SP, the control unit of the robot R or the cloud server 20 controls the robot R to move at a constant speed toward the second point SP at the preset limit speed, and can re-count the time after initializing the timer.
  • the regenerated final movement path may be the same as or different from the existing final movement path.
  • a point spaced apart from the second point SP by the specific distance E may be the same as or different from the existing fifth position 2115.
  • a point spaced apart from the second point SP by the specific distance E and different from the fifth position 2115 may be referred to as a sixth location (not shown).
  • the process in which the controller of the robot R or the cloud server 20 determines whether the elapsed time, re-counted from the point in time when the robot R reaches the sixth location (not shown), satisfies the preset time condition can be repeated.
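The docking check and retry logic of S216–S222 reduces to a three-way decision taken while the timer counts: stop when charging current is detected, keep approaching while the timer is within the preset time, and otherwise back off and retry from the first point AP. The sketch below assumes a concrete time limit; the actual value is not given in the disclosure.

```python
PRESET_TIME_S = 10.0   # assumed preset time condition ("second time" limit)

def docking_decision(current_flows: bool, elapsed_s: float) -> str:
    """One iteration of the charging-monitor loop while the timer is counting."""
    if current_flows:
        # Docking succeeded: stop the robot and end charging control (S220/S221).
        return "stop_and_end_charging_control"
    if elapsed_s <= PRESET_TIME_S:
        # Still en route to the second point SP: keep the constant-speed approach.
        return "keep_constant_speed"
    # Time expired without current: docking missed; back off toward AP,
    # regenerate the final movement path, and reset the timer.
    return "back_off_and_retry"
```

Repeating this decision after each retry, with the timer re-initialized, reproduces the repeated process described above for the sixth location.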
  • data processing performed by the cloud server 20 in the present invention may also be performed by the controller 1360 of the robot R.
  • when current does not flow between the robot R and the charging station S despite the robot R attempting charging with the charging station S a predetermined number of times, the robot R may generate a failure event for charging with the charging station S and transmit the failure event to the cloud server 20.
  • based on the occurrence of a failure event for a specific charging station S, the cloud server 20 may reallocate to the robot R a charging station S different from the specific charging station S.
  • in order to reallocate a charging station S to the robot R, the cloud server 20 can, as described above, reassign a charging station S at which the robot R can perform charging, based on at least one of the remaining battery information of the robot R and the surrounding environment information of the robot R.
  • at this time, the remaining battery capacity information of the robot R and the current location of the robot R may have changed.
  • the remaining battery capacity of the robot R may be lower than when a previous charging event occurred, and the location of the robot R may correspond to a place where a pre-assigned charging station S is located.
  • the cloud server 20 may reallocate the charging station S to the robot R in consideration of at least one of the changed battery level information and environment information of the robot R.
  • a method of reallocating the charging station (S) to the robot (R) may be the same as the previously described method of allocating the charging station (S) to the robot (R).
  • the cloud server 20 may provide information related to a failure event for a specific charging station (S) to a user or system manager. For example, the cloud server 20 generates a report on a charging failure event of a specific charging station (S) including identification information of the specific charging station (S) and date and time information on which the failure event occurred, and the generated report can be provided to users or administrators.
  • the cloud server 20 may match the failure event with the identification information of the specific charging station S in which the failure event occurred and store the matching information in the database.
  • when allocating a charging station S, the cloud server 20 can exclude from the allocation target the charging station S to which the failure event is matched, and allocate one of the remaining charging stations S.
  • when the error of the charging station S is resolved, the cloud server 20 may include the corresponding charging station S in the allocation target again.
  • the cloud server 20 may not allocate the corresponding charging station S to the robot R until the error of the charging station S is resolved.
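The (re)allocation policy above can be sketched as: among stations with no outstanding failure event, pick the nearest one the robot can reach on its remaining battery. The straight-line distance model and the battery-as-reachable-distance simplification are illustrative assumptions standing in for the disclosure's battery and environment information.

```python
import math

def allocate_station(robot_pos, battery_range_m, stations, failed_ids):
    """Pick a charging station for the robot.

    robot_pos: (x, y) of the robot; battery_range_m: distance reachable on the
    remaining battery; stations: list of (station_id, (x, y)); failed_ids: set of
    station ids with an unresolved failure event (excluded from allocation).
    Returns the nearest reachable station id, or None if none qualifies.
    """
    reachable = [
        (math.dist(robot_pos, pos), sid)
        for sid, pos in stations
        if sid not in failed_ids and math.dist(robot_pos, pos) <= battery_range_m
    ]
    return min(reachable)[1] if reachable else None
```

Clearing a station id from `failed_ids` once its error is resolved restores it to the allocation target, matching the re-inclusion behavior described above.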
  • according to the robot charging control method and system of the present invention, when a charging event occurs in the robot, identification information of a charging station is received from the server, and the robot can move to the charging station based on the received identification information.
  • the robot charging control method and system according to the present invention can systematically manage the charging and discharging state of the robot and, by comprehensively considering the state of the robot and its surrounding environment, can move the robot to the charging station without the robot being discharged during the movement.
  • the robot charging control method and system according to the present invention can control the robot to move to the first point and the second point to face the marker provided in the charging station and perform charging.
  • the charging control method and system can ensure that the charging terminal of the robot and the charging terminal of the charging station are accurately contacted so that the robot is charged.
  • the present invention described above is executed by one or more processes in a computer and can be implemented as a program that can be stored in a computer-readable medium.
  • the present invention described above can be implemented as computer readable codes or instructions in a medium on which a program is recorded. That is, various control methods according to the present invention may be integrated or individually provided in the form of a program.
  • the computer-readable medium includes all types of recording devices in which data that can be read by a computer system is stored.
  • examples of computer-readable media include a Hard Disk Drive (HDD), a Solid State Disk (SSD), a Silicon Disk Drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.
  • the computer-readable medium may be a server or cloud storage that includes storage and can be accessed by electronic devices through communication.
  • the computer may download the program according to the present invention from a server or cloud storage through wired or wireless communication.
  • the above-described computer is an electronic device equipped with a processor, that is, a CPU (Central Processing Unit), and there is no particular limitation on its type.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a robot-friendly building. More particularly, the present invention relates to a building in which robots and humans coexist in the same space and which provides useful services to humans. Furthermore, the present invention relates to a charging control method and system for a robot, whereby a robot to be charged can dock accurately with a charging station.
PCT/KR2022/015878 2021-12-24 2022-10-18 Bâtiment "robot-friendly", et procédé et système de commande de charge pour robot WO2023120918A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210187181A KR102657615B1 (ko) 2021-12-24 2021-12-24 로봇 친화형 건물, 로봇의 충전 제어 방법 및 시스템
KR10-2021-0187181 2021-12-24

Publications (1)

Publication Number Publication Date
WO2023120918A1 true WO2023120918A1 (fr) 2023-06-29

Family

ID=86902841

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/015878 WO2023120918A1 (fr) 2021-12-24 2022-10-18 Bâtiment "robot-friendly", et procédé et système de commande de charge pour robot

Country Status (2)

Country Link
KR (2) KR102657615B1 (fr)
WO (1) WO2023120918A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120049533A (ko) * 2010-11-09 2012-05-17 삼성전자주식회사 로봇 시스템 및 그의 제어 방법
US20150332213A1 (en) * 2013-07-25 2015-11-19 IAM Robotics, LLC Autonomous mobile bin storage and retrieval system
KR20190053730A (ko) * 2017-11-10 2019-05-20 삼성전자주식회사 청소용 이동장치, 충전장치 및 그 제어방법
KR20190093800A (ko) * 2018-01-17 2019-08-12 엘지전자 주식회사 이동 로봇 및 이동 로봇의 제어방법
KR20200018219A (ko) * 2018-08-03 2020-02-19 엘지전자 주식회사 이동 로봇 및 그 제어방법


Also Published As

Publication number Publication date
KR20230097559A (ko) 2023-07-03
KR102657615B1 (ko) 2024-04-15
KR20240054242A (ko) 2024-04-25

Similar Documents

Publication Publication Date Title
AU2019335977B2 (en) A robot cleaner and a controlling method for the same
AU2019262468B2 (en) A plurality of robot cleaner and a controlling method for the same
AU2019262467B2 (en) A plurality of robot cleaner and a controlling method for the same
WO2020050494A1 (fr) Robot nettoyeur et son procédé de commande
AU2019430311B2 (en) Plurality of autonomous mobile robots and controlling method for the same
WO2020050489A1 (fr) Robot nettoyeur et son procédé de commande
WO2019212239A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
WO2019212240A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
AU2019262477B2 (en) Plurality of autonomous mobile robots and controlling method for the same
WO2019212276A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
AU2019262482A1 (en) Plurality of autonomous mobile robots and controlling method for the same
WO2020050566A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
WO2019004742A1 (fr) Système de robot comprenant un robot mobile et un terminal mobile
WO2022075610A1 (fr) Système de robot mobile
WO2023101228A1 (fr) Construction compatible avec des robots, et procédé et système de coopération utilisant une pluralité de robots
WO2023080490A1 (fr) Bâtiment adapté aux robots, robot et système de commande de multiples robots se déplaçant dans un bâtiment
WO2019004773A1 (fr) Terminal mobile et système de robot comprenant ledit terminal mobile
WO2023120918A1 (fr) Bâtiment "robot-friendly", et procédé et système de commande de charge pour robot
WO2022075615A1 (fr) Système de robot mobile
WO2022075616A1 (fr) Système de robot mobile
WO2023043117A1 (fr) Bâtiment adapté aux robots et procédé et système de commande de déplacement de robots dans un bâtiment
WO2023068551A1 (fr) Bâtiment adapté aux robots et procédé et système de commande de robots se déplaçant dans un bâtiment
WO2023068759A1 (fr) Procédé et système de contrôle de déplacement de robot dans un bâtiment
WO2023243834A1 (fr) Bâtiment adapté aux robots, et procédé et système de génération de carte pour fonctionnement de robots
WO2023113216A1 (fr) Bâtiment adapté aux robots, et procédé et système de livraison utilisant un robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22911557

Country of ref document: EP

Kind code of ref document: A1