WO2024063625A1 - Système de commande de robot collaboratif total basé sur un appareil de prise de vues à intelligence artificielle - Google Patents
- Publication number: WO2024063625A1 (application PCT/KR2023/014615)
- Authority: WIPO (PCT)
- Prior art keywords: support, collaborative robot, control server, central control, shock
Classifications
- A47J31/00: Apparatus for making beverages
- A47J31/42: Beverage-making apparatus with incorporated grinding or roasting means for coffee
- B25J9/16: Programme controls for programme-controlled manipulators
- B25J11/00: Manipulators not otherwise provided for
- B25J13/00: Controls for manipulators
- B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02: Sensing devices
- G06N3/04: Neural network architecture, e.g. interconnection topology
- G06N3/045: Combinations of networks
- G07F9/00: Details other than those peculiar to special kinds or types of apparatus
Definitions
- the present invention relates to a collaborative robot total control system based on an artificial intelligence camera.
- Robots have been developed for industrial use and have played a part in factory automation. Recently, the field of application of robots has expanded further: medical robots, aerospace robots, and the like have been developed, and household robots that can be used in ordinary homes are also being created. Among these robots, those that can navigate on their own are called artificial intelligence robots.
- Prior literature, U.S. patent US9361021, describes remotely controlling a robot to perform the function of caring for a patient, selectively displaying patient information through a display, and providing real-time images of the patient.
- the above-mentioned background technology is technical information that the inventor possessed for deriving the present invention, or acquired in the process of deriving it, and cannot necessarily be said to be known technology disclosed to the general public before the filing of the present application.
- the purpose of the present invention is to detect through an artificial intelligence camera the presence of an object recognized as a person within the movement radius of a plurality of operating collaborative robots and stop the operation of the corresponding collaborative robot to prevent damage to the objects (people) and the collaborative robot.
- a collaborative robot total control system based on an artificial intelligence camera may include a collaborative robot that makes coffee using a coffee machine.
- a collaborative robot total control system based on an artificial intelligence camera includes a central control server that communicates with the collaborative robot and controls the collaborative robot; An artificial intelligence camera that communicates with the central control server and takes pictures of the store where the collaborative robot is located; a database unit that receives and stores a method of controlling the collaborative robot by the central control server according to the position or movement of an object input to the artificial intelligence camera; and a communication unit that transmits data related to space, objects, and usage from the store manager or the collaborative robot to the central control server.
- the collaborative robot can perform in-store tasks including making or serving drinks.
- the central control server determines that there is a possibility of collision with the collaborative robot when an object recognized as a person through the artificial intelligence camera exists within the driving range of the multiple collaborative robots, and can stop the operation of the collaborative robot near the object.
- the central control server may call a store manager to manually control the collaborative robot that has stopped operating.
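The patent does not specify an algorithm for this stop decision. As a rough illustration only, the server-side check might look like the following sketch; all names (`Robot`, `stop_robots_near_person`, the 2-D store coordinate frame, and the safety radius value) are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Robot:
    robot_id: str
    x: float          # robot position in a store-floor coordinate frame (metres)
    y: float
    running: bool = True

def stop_robots_near_person(robots, person_xy, safety_radius):
    """Stop every robot whose distance to the detected person falls
    inside the safety radius; return the ids of the robots stopped.
    The store manager would then be called to resume them manually."""
    px, py = person_xy
    stopped = []
    for r in robots:
        if ((r.x - px) ** 2 + (r.y - py) ** 2) ** 0.5 <= safety_radius:
            r.running = False
            stopped.append(r.robot_id)
    return stopped
```

A robot outside the radius keeps operating, matching the claim that only the robot near the object is stopped.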
- the data related to space, objects, and use may mean a store drawing possessed by the store manager or data related to recognition of spaces and objects recognized by the artificial intelligence camera.
- the collaborative robot may include an artificial neural network in the form of software or hardware learned to recognize at least one of the properties of space and properties of objects including obstacles.
- the central control server may include an artificial neural network in the form of software or hardware trained to recognize at least one of the properties of space and properties of objects including obstacles.
- the collaborative robot may include a deep neural network including a Convolutional Neural Network (CNN) or a Deep Belief Network (DBN) learned through deep learning.
- the central control server may include a deep neural network including a Convolutional Neural Network (CNN) or a Deep Belief Network (DBN) learned through deep learning.
- the central control server may communicate with the collaborative robot and receive the current operating state of the collaborative robot in real time.
- the central control server may learn a deep neural network based on the data received from the collaborative robot and the data input through the communication unit, and then transmit the updated deep neural network structure data to the collaborative robot.
- the collaborative robot may have an artificial intelligence deep neural network structure updated by the updated deep neural network structure data of the central control server.
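The patent describes this update loop only at the level of "learn on the server, push structure data to the robot". As a toy stand-in (the weight dictionary, the mean-target update rule, and both function names are assumptions for illustration, not the patent's method), the server/robot halves might be sketched as:

```python
def train_update(server_weights, field_data, lr=0.1):
    """Toy stand-in for server-side retraining: nudge each named
    parameter toward the mean of the field data reported for it,
    returning the updated structure data to be pushed out."""
    updated = dict(server_weights)
    for name, samples in field_data.items():
        if name in updated and samples:
            target = sum(samples) / len(samples)
            updated[name] += lr * (target - updated[name])
    return updated

def apply_update(robot_weights, updated):
    """Robot side: replace the local network parameters with the
    structure data pushed from the central control server."""
    robot_weights.clear()
    robot_weights.update(updated)
    return robot_weights
```

In a real system the "structure data" would be serialized model weights (and possibly topology), but the push-then-replace flow is the same.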
- the collaborative robot may include: a first support formed in the shape of a cylinder, located at the bottom, and installed in a fixed position; a second support formed in the shape of ' ⁇ ', fixed to a side of the first support, whose driving is controlled by the first support; a third support located at the top of the second support, formed in the shape of 'L', whose driving is controlled by the first support; a fourth support connected to the upper end of the third support, whose driving is controlled by the first support; a cylindrical fifth support connected to the lower end of the fourth support; and a gripper located at the bottom of the fifth support and driven to grip a coffee making tool in the store.
- the first support communicates with the central control server and can control the second support, the third support, the fourth support, the fifth support, and the gripper in accordance with the control signal transmitted from the central control server.
- the collaborative robot may further include a protection member surrounding the third support.
- the protective member includes: a separable end surface whose two end faces are brought into contact with each other to surround the third support; an insertion groove that has a predetermined width, is formed long in the longitudinal direction, and has a plurality of rectangular grooves formed at regular intervals; a fitting frame formed to a width that can be force-fitted into the insertion groove, formed in an angled 'U' shape, and fitted into the insertion groove; and a buffer portion fitted inside the center formed by the shape of the fitting frame.
- the separable end surface is provided with an attachable material on both end faces; after the protective member is positioned to surround the third support, the two end faces are brought into contact and attached so that the protective member surrounds the third support.
- the buffer unit may be formed in a configuration capable of absorbing shock generated from the outside.
- the fitting frame may be provided with a separate attachment material so that the buffer part can be fixed to, and positioned on, the inside of the fitting frame.
- the buffer unit may include: a first absorber provided on one end, whose upper and lower ends are rounded and which primarily absorbs shock generated from the outside; a second absorber provided on the other end of the first absorber, which secondarily absorbs shock generated from the outside and has a plurality of air layers formed in the longitudinal direction to increase shock-absorption efficiency; a connecting air part through which the air displaced when the first absorber contracts toward the other end surface while primarily absorbing shock escapes into the air layers; and an elastic portion located inside the air layers, which thirdly absorbs shock occurring from the outside.
- the first absorber is made of a material capable of absorbing shock so that it can primarily absorb shock generated from the outside; when a shock occurs from the outside, it contracts toward the other end surface and thereby primarily absorbs the shock.
- the second absorber may be made of a material capable of absorbing shock so as to absorb the shock transmitted through the first absorber; when the shock is transmitted through the first absorber, it contracts toward the other end surface and thereby secondarily absorbs the shock.
- the width of the air layer may become narrow as the second absorber absorbs shock and shrinks in the direction of the other end surface.
- the connecting air portion is formed to be hollow inside, so that the air displaced by the first absorber as it contracts toward the other end surface can flow into the air layer.
- the air layer may have the other end face open so that air flowing in through the connecting air portion is discharged.
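The three-stage absorption described above (first absorber, then second absorber with air layers, then elastic portion) can be pictured with a toy numerical model; the per-stage absorption fractions below are arbitrary assumptions for illustration and are not given in the patent:

```python
def staged_absorption(shock, stages=(0.5, 0.3, 0.15)):
    """Toy model of the three-stage buffer: each stage removes a
    fraction of the *remaining* shock (first absorber, second
    absorber with air layers, elastic portion). Returns the
    residual shock transmitted to the third support."""
    remaining = shock
    for fraction in stages:
        remaining -= remaining * fraction
    return remaining
```

The point the model makes is that staged absorption compounds: even modest per-stage fractions leave only a small residual at the protected support.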
- An unmanned coffee production and sales system through advance reservation may include a production robot that produces coffee using a coffee machine.
- An unmanned coffee manufacturing and sales system through advance reservation includes a manufacturing control server that communicates with the manufacturing robot, and may further include a plurality of user terminals that communicate with the manufacturing control server and are registered to customers using the store.
- the manufacturing control server may include: a transceiver that communicates with the user terminal and receives from it a production menu and the current location information of the user terminal; an arrival prediction unit that analyzes the minimum expected time for the user to arrive at the store based on the current location information received from the user terminal; a production command unit that transmits a beverage production command to the manufacturing robot based on the beverage production time of the production menu received from the user terminal; a manufacturing information unit that stores the beverage manufacturing recipes made by the manufacturing robot within the store and the average production time for each recipe; and a pickup confirmation unit that, after confirming that the manufactured beverage is located at the pickup stand in the store, checks through the user terminal whether the user has picked up the beverage from the pickup stand.
- the production command unit compares the minimum expected time for the user to arrive at the store, analyzed by the arrival prediction unit, with the beverage production time of the production menu received from the user terminal, and sends a beverage production command to the manufacturing robot so that beverage production is completed, and the beverage placed on the pickup stand, in time for the minimum expected arrival time.
- when the manufacturing robot receives a beverage production command from the production command unit, it completes the beverage according to the beverage manufacturing recipe stored in the manufacturing information unit and then places the completed beverage on the pickup stand.
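The timing comparison above reduces to a simple scheduling rule. As a minimal sketch (function name and the minutes-based units are assumptions; the patent states only the comparison, not a formula):

```python
def schedule_production(arrival_eta_min, production_time_min):
    """Minutes the production command unit should wait before
    issuing the command, so the beverage finishes just as the
    customer arrives. Never negative: if the customer is closer
    than the production time, production starts immediately."""
    return max(0.0, arrival_eta_min - production_time_min)
```

For example, a customer 10 minutes away ordering a drink that takes 4 minutes would have production start after a 6-minute delay, so the drink reaches the pickup stand fresh.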
- the manufacturing robot may include: a first support formed in the shape of a cylinder, located at the bottom, and installed in a fixed position; a second support formed in the shape of ' ⁇ ', fixed to a side of the first support, whose driving is controlled by the first support; a third support located at the top of the second support, formed in the shape of 'L', whose driving is controlled by the first support; a fourth support connected to the upper end of the third support, whose driving is controlled by the first support; a cylindrical fifth support connected to the lower end of the fourth support; and a gripper located at the bottom of the fifth support and driven to grip a coffee making tool in the store.
- the first support communicates with the central control server and can control the second support, the third support, the fourth support, the fifth support, and the gripper in accordance with the control signal transmitted from the central control server.
- the gripper holds the portafilter, receives coffee bean powder from the grinder into the portafilter, and inserts the portafilter containing the coffee bean powder into the coffee machine; it then grips the beverage cup, receives water through the water purifier, and places the beverage cup at the coffee machine so that the coffee solution extracted from the coffee machine is received in the beverage cup. After extraction of the coffee solution is completed, the beverage cup can be placed on the pickup stand.
- the manufacturing robot may further include a protection member surrounding the third support.
- the protective member includes: a separable end surface whose two end faces are brought into contact with each other to surround the third support; an insertion groove that has a predetermined width, is formed long in the longitudinal direction, and has a plurality of rectangular grooves formed at regular intervals; a fitting frame formed to a width that can be force-fitted into the insertion groove, formed in an angled 'U' shape, and fitted into the insertion groove; and a buffer portion fitted inside the center formed by the shape of the fitting frame.
- the separable end surface is provided with an attachable material on both end faces; after the protective member is positioned to surround the third support, the two end faces are brought into contact and attached so that the protective member surrounds the third support.
- the buffer unit may be formed in a configuration capable of absorbing shock generated from the outside.
- the fitting frame may be provided with a separate attachment material so that the buffer part can be fixed to, and positioned on, the inside of the fitting frame.
- the buffer unit may include: a first absorber provided on one end, whose upper and lower ends are rounded and which primarily absorbs shock generated from the outside; a second absorber provided on the other end of the first absorber, which secondarily absorbs shock generated from the outside and has a plurality of air layers formed in the longitudinal direction to increase shock-absorption efficiency; a connecting air part through which the air displaced when the first absorber contracts toward the other end surface while primarily absorbing shock escapes into the air layers; and an elastic portion located inside the air layers, which thirdly absorbs shock occurring from the outside.
- the first absorber is made of a material capable of absorbing shock so that it can primarily absorb shock generated from the outside; when a shock occurs from the outside, it contracts toward the other end surface and thereby primarily absorbs the shock.
- the second absorber may be made of a material capable of absorbing shock so as to absorb the shock transmitted through the first absorber; when the shock is transmitted through the first absorber, it contracts toward the other end surface and thereby secondarily absorbs the shock.
- the width of the air layer may become narrow as the second absorber absorbs shock and shrinks in the direction of the other end surface.
- the connecting air portion is formed to be hollow inside, so that the air displaced by the first absorber as it contracts toward the other end surface can flow into the air layer.
- the air layer may have the other end face open so that air flowing in through the connecting air portion is discharged.
- the collaborative robot total control system based on an artificial intelligence camera proposed by the present invention detects the presence of an object recognized as a human within the movement radius of a plurality of operating collaborative robots and Damage to the object and the collaborative robot can be minimized by stopping the operation.
- the third support of the collaborative robot moves the most during the coffee-making process and may be subject to shock when struck by other members or by people; a protective member positioned to surround its outer surface can protect the collaborative robot from such shocks.
- the protective member is provided with a first absorber, a second absorber, and an elastic part that can absorb shock in stages, so it can absorb shocks ranging from small to large.
- the air layer, which is a part of the protective member, has an open other end, so air flowing in through the connecting air part, or air generated when the second absorber contracts, is discharged through the open end, enhancing the buffering role of the buffer unit.
- the shock applied to the third support can be minimized because the elastic portion absorbs residual shock or vibration not yet absorbed by the first and second absorbers.
- FIGS. 1 to 5 are diagrams showing the configuration of a collaborative robot total control system based on an artificial intelligence camera according to an embodiment of the present invention.
- FIGS. 6 to 11 are diagrams showing the configuration of a collaborative robot, which is a part of a collaborative robot total control system based on an artificial intelligence camera according to another embodiment of the present invention.
- the collaborative robot total control system 1000 based on an artificial intelligence camera may include: a central control server 100 that communicates with the collaborative robot 300 and controls the collaborative robot 300; an artificial intelligence camera 500 that communicates with the central control server 100 and takes pictures of the store where the collaborative robot 300 is located; a database unit 700 that receives and stores the method by which the central control server 100 controls the collaborative robot 300 according to the location or movement of an object input to the artificial intelligence camera 500; and a communication unit 900 that transmits data related to space, objects, and use from the store manager or the collaborative robot 300 to the central control server 100.
- the collaborative robot 300 can perform in-store tasks including making or serving beverages.
- the central control server 100 determines that there is a possibility of collision when an object recognized as a person through the artificial intelligence camera 500 exists within the driving range of the collaborative robots, and can stop the operation of the collaborative robot 300 near the object.
- the central control server 100 may call a store manager to manually control a collaborative robot that has stopped operating.
- Data related to space, objects, and use may refer to a store drawing possessed by a store manager, or data related to the recognition of space and objects (people, animals, and other moving objects) recognized by the artificial intelligence camera 500.
- the collaborative robot 300 may include an artificial neural network in the form of software or hardware learned to recognize at least one of the properties of space and properties of objects, including obstacles.
- the central control server 100 may include an artificial neural network in the form of software or hardware trained to recognize at least one of the properties of space and properties of objects, including obstacles.
- the collaborative robot 300 may include a deep neural network including CNN or DBN learned through deep learning.
- the central control server 100 may include a deep neural network including CNN or DBN learned through deep learning.
- the central control server 100 can communicate with the collaborative robot 300 to receive the current operating status of the collaborative robot in real time.
- the central control server 100 learns a deep neural network based on the data received from the collaborative robot 300 and data input through the communication unit 900, and then transmits the updated deep neural network structure data to the collaborative robot 300. You can.
- the collaborative robot 300 may have its artificial intelligence deep neural network structure updated by the updated deep neural network structure data of the central control server 100.
- the collaborative robot 300 may include: a first support 310 formed in the shape of a cylinder, located at the bottom, and installed in a fixed position; a second support 330 formed in the shape of ' ⁇ ', fixed to the side of the first support 310, whose driving is controlled by the first support 310; a third support 350 located at the top of the second support 330, formed in the shape of 'L', whose driving is controlled by the first support 310; a fourth support 370 connected to the upper end of the third support 350, whose driving is controlled by the first support 310; a cylindrical fifth support 380 connected to the lower end of the fourth support 370; and a gripper 390 located at the bottom of the fifth support 380 and driven to grip a coffee making tool in the store.
- the first support 310 communicates with the central control server 100 and can control the second support 330, the third support 350, the fourth support 370, the fifth support 380, and the gripper 390 in accordance with the control signal transmitted from the central control server 100.
- the collaborative robot 300 grips the portafilter G through the gripper 390, receives ground coffee bean powder from the grinder 200 into the portafilter G, and inserts the portafilter G containing the coffee bean powder into the coffee machine 600; it then holds the beverage cup A, receives water through the water purifier 400, and places the beverage cup A at the coffee machine 600 so that the coffee concentrate extracted from the coffee machine 600 is received in the beverage cup A. After extraction of the coffee concentrate is completed, the beverage cup A can be placed on the pickup stand 800.
- the coffee production process using a collaborative robot total control system based on an artificial intelligence camera can be divided into 'on-site kiosk ordering' and 'APP ordering'.
- APP ordering is the same as on-site kiosk ordering, but can be configured to allow delivery selection.
- the artificial intelligence camera, which is a part of the collaborative robot total control system based on an artificial intelligence camera according to an embodiment of the present invention, is mainly used when a person or other object approaches within the robot's safe distance: the camera recognizes the object and measures its distance, then, as a primary response, sounds a warning and, as a secondary response, stops the robot and transmits the information to the manager.
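The two-tier response described above (warning sound first, stop-and-notify second) maps naturally onto two distance thresholds. The following sketch illustrates the idea; the threshold values, function name, and string labels are hypothetical, not taken from the patent:

```python
def safety_response(distance_m, warn_radius=2.0, stop_radius=1.0):
    """Tiered response to a camera-measured object distance:
    outside the warn radius the robot continues; inside the warn
    radius a warning sound is the primary response; inside the
    stop radius the robot is stopped and the manager notified."""
    if distance_m <= stop_radius:
        return "stop_and_notify_manager"
    if distance_m <= warn_radius:
        return "warning_sound"
    return "continue"
```

The check is ordered from the most restrictive radius outward, so the stop response always wins when both thresholds are crossed.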
- the plurality of collaborative robots 300 and 301 and the central control server 100 can implement wireless communication using wireless communication technologies such as IEEE 802.11 WLAN, IEEE 802.15 WPAN, UWB, Wi-Fi, Zigbee, Z-Wave, and Bluetooth.
- the communication method of the collaborative robot 300 may vary depending on the communication method of the other devices, or of the central control server 100, with which it wishes to communicate.
- a plurality of collaborative robots 300 and 301 can implement wireless communication with other collaborative robots 300 and/or the central control server 100 through a 5G network.
- since the collaborative robot 300 communicates wirelessly through a 5G network, real-time response and real-time control are possible.
- the plurality of collaborative robots 300 and 301 and the central control server 100 can communicate using the Message Queuing Telemetry Transport (MQTT) method and the HyperText Transfer Protocol (HTTP) method.
- the plurality of collaborative robots 300 and 301 and the central control server 100 can communicate with a PC, mobile terminal, and other external servers through HTTP or MQTT.
- the plurality of collaborative robots 300 and 301 and the central control server 100 support two or more communication standards and can use the optimal communication standard depending on the type of communication data and the type of device participating in the communication.
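The patent names MQTT and HTTP but does not define message formats. As an illustration of what a robot-status message over MQTT might look like, the sketch below builds and parses a JSON payload; the topic layout `robots/<id>/status` and the field names are assumptions, and a real deployment would publish via an MQTT client library such as paho-mqtt:

```python
import json

def encode_status(robot_id, state, ts):
    """Build the (topic, payload) pair a robot might publish to
    report its current operating state to the central control
    server. Topic layout is a hypothetical convention."""
    topic = f"robots/{robot_id}/status"
    payload = json.dumps({"robot_id": robot_id, "state": state, "ts": ts})
    return topic, payload

def decode_status(payload):
    """Server side: parse a status payload back into a dict."""
    return json.loads(payload)
```

Keeping the payload as plain JSON lets the same message body travel over either MQTT or HTTP, matching the claim that the optimal standard is chosen per data type and device.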
- the store manager can check or control information about the collaborative robots 300 in the robot system through a manager terminal such as a PC or mobile terminal.
- the central control server 100 may be implemented as a cloud server, and the user can use the data stored in the central control server 100, with which the user terminal communicates, as well as the functions and services provided by the central control server 100.
- the cloud central control server 100 is linked to the collaborative robot 300 to monitor and control the collaborative robot 300 and provide various solutions and content remotely.
- the central control server 100 can store and manage information received from collaborative robots 300 and other devices.
- the central control server 100 may be a server provided by the manufacturer of the collaborative robots 300 or a company entrusted with the service by the manufacturer.
- the central control server 100 may be a control server that manages and controls the collaborative robots 300.
- the central control server 100 can control the collaborative robots 300 collectively or for each individual robot. Additionally, the central control server 100 can set at least some robots among the collaborative robots 300 into groups and then control them for each group.
- the central control server 100 may be configured with information and functions distributed across a plurality of servers, or may be configured as one integrated server.
- the store manager can transmit data related to space, objects, and usage to the central control server 100.
- the space- and object-related data may be data related to the recognition of spaces and objects by the collaborative robot 300, or image data about spaces and objects acquired by the artificial intelligence camera 500.
- the distance between an object and a robot can be measured, allowing the robot's position to be finely adjusted.
- an object may mean a person, an animal, or another moving object; as will be described later, the collaborative robot 300 and the central control server 100 can include deep neural networks (DNN) such as a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), and a Deep Belief Network (DBN), and therefore have the advantage of being able to learn and handle various situations, including the sudden appearance of objects.
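To make the CNN mentioned above concrete, the core operation of a convolutional layer — sliding a small kernel over an image — can be sketched in plain Python. This is a didactic illustration only, not the claimed recognition network:

```python
# Minimal 2D convolution (valid padding, stride 1) — the core operation of
# the CNN mentioned above. A didactic sketch, not the claimed network.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh)
                for b in range(kw)
            )
    return out

# A vertical-edge kernel applied to a tiny image whose right half is bright:
img = [[0, 0, 1, 1] for _ in range(4)]
kernel = [[-1, 1], [-1, 1]]  # responds where brightness rises left-to-right
print(conv2d(img, kernel))   # each row reads [0, 2, 0]: the edge is found
```

A real CNN stacks many such layers, with learned kernels, nonlinearities, and pooling, but the sliding-window product-and-sum above is the operation that gives the network its ability to localize visual features such as a suddenly appearing object.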
- the collaborative robot 300 and the central control server 100 may include artificial neural networks (ANN), in the form of software or hardware, trained to recognize at least one of the attributes of objects such as users, voices, spatial attributes, and obstacles.
- the collaborative robot 300 and the central control server 100 may include a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), or a Deep Belief Network (DBN) trained through deep learning.
- the first support 310 of the collaborative robot 300 may be equipped with a deep neural network (DNN) structure, such as a convolutional neural network (CNN).
- the central control server 100 learns a deep neural network (DNN) based on data received from the collaborative robot 300, data input by users, etc., and then transmits the updated deep neural network (DNN) structure data to the collaborative robot 300. Accordingly, the artificial-intelligence deep neural network (DNN) structure provided in the collaborative robot 300 can be updated.
- usage-related data is data acquired according to the use of the collaborative robot 300, and may include usage history data, detection signals obtained from the artificial intelligence camera 500, etc.
- the learned deep neural network structure can receive input data for recognition, recognize the attributes of people, objects, and spaces included in the input data, and output the results.
- the learned deep neural network structure can receive input data for recognition, analyze and learn data related to the usage of the collaborative robot 300, and recognize usage patterns, usage environments, etc.
- data related to space, objects, and usage may be transmitted to the central control server 100 through the communication unit 900.
- the central control server 100 can learn a deep neural network (DNN) based on the received data and then transmit the updated deep neural network (DNN) structure data to the artificial intelligence collaborative robots 300 and 301 to update them.
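The update cycle described above — robots collect usage data, the central control server retrains, and the updated network structure data is pushed back to every robot — can be sketched schematically. The class and field names, and the stand-in "training" step, are illustrative assumptions, not the specification's implementation:

```python
# Schematic sketch of the server-to-robot model update cycle described
# above. Names and the stand-in "training" step are assumptions.

class Robot:
    def __init__(self):
        self.model = {"version": 0, "weights": [0.0, 0.0]}
        self.usage_log = []

    def collect(self, sample):
        self.usage_log.append(sample)

    def apply_update(self, structure_data):
        # Replace the local DNN structure data with the server's update.
        self.model = structure_data

class CentralServer:
    def __init__(self):
        self.model = {"version": 0, "weights": [0.0, 0.0]}

    def retrain(self, all_samples):
        # Stand-in for deep learning: nudge weights by the mean sample value.
        flat = [x for sample in all_samples for x in sample]
        mean = sum(flat) / len(flat) if flat else 0.0
        self.model = {
            "version": self.model["version"] + 1,
            "weights": [w + mean for w in self.model["weights"]],
        }
        return self.model

robots = [Robot(), Robot()]
robots[0].collect([1.0, 3.0])
robots[1].collect([2.0])
update = CentralServer().retrain([s for r in robots for s in r.usage_log])
for r in robots:
    r.apply_update(update)
print(robots[0].model["version"])  # 1
```

The essential point the paragraph makes is the direction of flow: training happens centrally, and only the resulting structure data travels back to the robots.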
- One or more robots 100 may be provided to offer services in a designated place such as a home.
- the robot system may include a robot that interacts with a user at home or the like and provides various entertainment to the user.
- Artificial intelligence collaborative robots 300 and 301 can perform assigned tasks while traveling in a specific space.
- the artificial intelligence collaborative robots 300 and 301 can perform autonomous driving by creating a path to a predetermined destination and tracking driving by following a person or another robot.
- the artificial intelligence collaborative robots 300 and 301 can detect and avoid obstacles while moving, based on image data and sensing data acquired through the artificial intelligence camera 500.
- the central control server 100 may further include a motion detection sensor that detects the motion of the collaborative robot 300 and outputs motion information.
- a gyro sensor, a wheel sensor, an acceleration sensor, etc. can be used as a motion detection sensor.
- the collaborative robot 300 may include an obstacle detection sensor that detects an obstacle
- the obstacle detection sensor may include an infrared sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, a Position Sensitive Device (PSD) sensor, a cliff detection sensor that detects the presence of a cliff on the floor within the driving area, light detection and ranging (LiDAR), etc.
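As a hedged sketch of how readings from such sensors might gate the robot's driving, the fusion below combines obstacle distances with a cliff reading into a single drive command. The thresholds and sensor categories are assumed values for illustration, not specified by the patent:

```python
# Illustrative fusion of the obstacle/cliff sensors listed above into a
# simple drive decision. All thresholds are assumed values.

STOP_DISTANCE_M = 0.3   # halt if anything is closer than this
SLOW_DISTANCE_M = 1.0   # slow down inside this range
CLIFF_LIMIT_M = 0.05    # a floor drop beyond this reads as a cliff

def drive_command(obstacle_distances_m, floor_drop_m):
    """Return 'stop', 'slow', or 'go' from raw sensor readings."""
    if floor_drop_m > CLIFF_LIMIT_M:
        return "stop"                   # cliff detected: never proceed
    nearest = min(obstacle_distances_m)
    if nearest < STOP_DISTANCE_M:
        return "stop"                   # obstacle inside the safety zone
    if nearest < SLOW_DISTANCE_M:
        return "slow"                   # obstacle nearby: reduce speed
    return "go"

print(drive_command([2.5, 0.8, 1.9], 0.0))  # slow
print(drive_command([2.5, 2.2], 0.2))       # stop (cliff)
```

A deployed system would of course filter noisy readings and plan an avoidance path rather than only stopping, but the layered stop/slow/go gating is a common minimal safety envelope for sensors of the kinds listed.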
- the terminal carried by the store user, which transmits data related to space, objects, and usage through the communication unit 900, is a general term for devices each equipped with a memory and a processor and thus having computational processing capability, such as personal digital assistants (PDAs), mobile phones, smart devices, and tablets.
- the server may include a memory in which a plurality of modules are stored, a processor that is connected to the memory, responds to the plurality of modules, and processes service information provided to the terminal or action information that controls the service information, a communication means, and a UI (user interface) display means.
- Memory is a device that stores information and may include various types of memory, such as USB (Universal Serial Bus) memory, high-speed random access memory, and non-volatile memory such as magnetic disk storage devices, flash memory devices, and other non-volatile solid-state memory devices.
- the communication means transmits and receives service information or action information with the terminal in real time.
- the UI display means outputs service information or action information of the device in real time.
- the UI display means may be an independent device that directly or indirectly outputs or displays the UI, or may be a part of the device.
- a program for executing a method according to an embodiment of the present invention may be recorded on a computer-readable recording medium.
- Computer-readable media may include program instructions, data files, data structures, etc., singly or in combination.
- the media may be specially designed and constructed or may be known and available to those skilled in the art of computer software.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape, optical recording media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- the medium may be a transmission medium, such as an optical or metal line or a waveguide, carrying a carrier wave that transmits signals specifying program instructions, data structures, etc.
- Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter, etc.
- FIGS. 6 to 11 are diagrams showing the configuration of a collaborative robot, which is a part of a collaborative robot total control system based on an artificial intelligence camera according to another embodiment of the present invention.
- the collaborative robot 300 may further include a protection member 360 surrounding the third support 350.
- the protection member 360 may include a separation section 362, a fitting groove 364, a fitting frame 366, and a buffer portion 368.
- the separation section 362 refers to the two end faces that come into contact to surround the third support 350, and can be formed so that the protection member 360 can be detached from the third support 350.
- the separation section 362 is provided with an attachable material on both end faces, so that after the protection member is positioned to surround the third support 350, the two end faces can be brought into contact and attached, allowing the protection member to surround the third support 350.
- the fitting groove 364 has a predetermined width, is formed long in the longitudinal direction, and may have a plurality of square grooves formed at regular intervals.
- the fitting frame 366 is formed to a width that allows it to be force-fitted into the fitting groove 364, is formed in an angled 'U' shape, and can be fitted into the fitting groove 364.
- the fitting frame 366 may be provided with a separate attachment material so that the buffer part 368 can be positioned on, and fixed to, the inside of the fitting frame 366.
- the buffer portion 368 may be formed to fit inside the center formed by the shape of the fitting frame 366.
- the buffer portion 368 may be formed in a configuration capable of absorbing shocks generated from the outside.
- the buffer portion 368 may include a first absorbing part 3687, a second absorber 3681, a connecting air portion 3685, and an elastic part 367.
- the first absorbing part 3687 is provided at one end, its upper and lower edges at that end are formed in a round shape, and it can primarily absorb shock generated from the outside.
- the first absorbing part 3687 may be made of a shock-absorbing material so that it can primarily absorb external shocks; when a shock occurs, it contracts in the direction of the other end surface and thereby primarily absorbs the shock.
- the first absorbing part 3687 may be formed of a sponge material capable of primarily absorbing external shocks, but is not limited to this; any material capable of primarily absorbing external shocks may be used, regardless of its name.
- the second absorber 3681 is provided on the other end surface of the first absorbing part 3687 and secondarily absorbs external shocks; a plurality of air layers 3683 may be formed in the longitudinal direction to increase the efficiency of shock absorption.
- the second absorber 3681 may be made of a shock-absorbing material so as to absorb the shock transmitted through the first absorbing part 3687, and can secondarily absorb that shock by contracting in the direction of the other end surface.
- the second absorber 3681 may likewise be formed of a sponge material capable of absorbing external shocks, but is not limited to this; any material capable of absorbing such shocks may be used, regardless of its name.
- the width of the air layer 3683 may narrow as the second absorber 3681 absorbs shock and contracts in the direction of the other end surface.
- the air layer 3683 may have its other end open so that air flowing in through the connecting air portion 3685 is discharged.
- because the other end of the air layer 3683 is open, the air flowing in through the connecting air portion 3685, or the air expelled when the second absorber 3681 contracts, can be discharged through the open other end, increasing the buffering effect of the buffer portion 368.
- the connecting air portion 3685 may be connected so that the air expelled when the first absorbing part 3687 contracts toward the other end surface, in the process of primarily absorbing shock, escapes through the air layer 3683.
- the connecting air portion 3685 is hollow inside, so that as the first absorbing part 3687 contracts toward the other end surface, the air expelled by the first absorbing part 3687 can flow into the air layer 3683.
- the elastic part 367 is located inside the air layer 3683 and can absorb external shocks a third time.
- the elastic part 367 according to another embodiment of the present invention may include a fixing part 3671, a spring part 3677, a contraction limiting part 3672, a fixed end part 3676, a fitting fixing part 3671, and a protrusion 3675.
- the fixing part 3671 has a spiral groove 36711 formed on its outside, is formed in the shape of a circular plate of predetermined thickness with grooves of different diameters formed on one side and the other side, and may be positioned at the other end and fixed to one inner surface of the fitting frame 366.
- the fixing part 3671 can be rotationally fastened to the inside of the fixed end part 3676, described later, by the spiral groove 36711 formed on its outside.
- the spring part 3677 is fixed by fitting into a groove formed on one side of the fixing part, and can contract or relax in response to shock or vibration generated from the outside.
- the other end of the spring part 3677 is inserted into one end of the fixing part, and its one end is inserted into one side of the fitting fixing part, described later, so that the spring can contract or relax.
- the spring part 3677 may be formed in the shape of a coil spring that can contract or relax, and its other end may be formed with a diameter that allows it to be fixed to the groove 3678 formed on one side of the fixing part 3671.
- its one end may likewise be formed with a diameter that allows it to be fixed to the groove formed at the other end of the fitting fixing part 3671.
- the contraction limiting unit 3672 may limit the contracting length of the spring unit 3677.
- the distance over which the spring part 3677 can contract may be limited, as the movement of the protrusion 3675 and the fitting fixing part 3671 toward the other end is stopped by the contraction limiting part 3672 described above.
- one end of the spring part 3677 is fixed to the other end of the fitting fixing part 3671, described later; the fitting fixing part 3671 moves toward the other end under an external impact so that the spring contracts, but the fitting fixing part 3671 is caught by the contraction limiting part 3672, thereby limiting the length over which the spring part 3677 contracts.
- the contraction limiting part 3672 may be formed in the shape of a circular band with a diameter that allows it to be positioned at one end of the fixing part 3671.
- the fixed end part 3676 has a spiral protrusion 36741 formed on its inner surface to correspond to the spiral groove 36711 formed on the outer surface of the fixing part 3671, and may be positioned at one end of the fixing part so as to surround its outer surface.
- the fixed end part 3676 may be formed in the shape of a circular band with a diameter that allows the fixing part 3671 to be positioned on its inner surface.
- the fixed end part 3676 is provided at one end with a cover shape that covers certain portions of both sides, thereby preventing the protrusion 3675, described later, from escaping to the outside.
- the fixing part 3671, the fitting fixing part 3671, and the protrusion 3675 are located in the hollow interior of the fixed end part 3676, with an empty space formed between the fixing part 3671 and the fitting fixing part 3671.
- the protrusion 3675 and the fitting fixing part 3671 may move toward the other end due to shock or vibration generated when the first absorbing part 3687 contracts toward the other end.
- the fitting fixing part 3671 may be fitted and fixed inside the fixed end part 3676.
- the fitting fixing part 3671 may have a groove formed at the other end having a diameter equal to or larger than the diameter of one end of the spring part 3677 so that the spring part 3677 is inserted into it.
- the fitting fixing part 3671 is provided with a cover shape at one end that covers a certain portion of both sides, so that the protrusion 3675 can be located at one end.
- the protrusion 3675 is located at one end of the fitting fixing part 3671, within the hollow interior of the fixed end part 3676; it is formed in a round shape in one direction, and its rounded end may protrude in one direction beyond one end of the fixed end part 3676.
- the protrusion 3675 has first screw grooves 36755 formed at both ends, and a screw 36753 penetrating the first screw groove 36755 can be rotated through a second screw groove 36763 formed on the inside of the fitting fixing part 3671 and thereby fixed to one end of the fitting fixing part 3671.
- the above-mentioned elastic part 367 absorbs the residual shock or vibration not absorbed by the first absorbing part 3687 and the second absorber 3681, thereby minimizing the shock applied to the third support 350.
- the impact applied to the collaborative robot 300 can be minimized.
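The three-stage absorption described above — the first absorbing part, then the second absorber with its air layers, then the spring of the elastic part whose travel is capped by the contraction limiting part — can be illustrated numerically. Every coefficient below is an assumed value for illustration; the patent does not specify materials or dimensions:

```python
# Numerical sketch of the three-stage shock path described above.
# All coefficients are illustrative assumptions, not from the patent.

ABSORB_1 = 0.5        # fraction taken by the first absorbing part (sponge)
ABSORB_2 = 0.3        # fraction taken by the second absorber / air layers
SPRING_K = 2000.0     # N/m, the elastic part's coil spring (assumed)
MAX_TRAVEL_M = 0.01   # the contraction limiting part caps spring travel

def residual_energy(impact_j):
    """Energy (J) reaching the third support after all three stages."""
    after_first = impact_j * (1.0 - ABSORB_1)
    after_second = after_first * (1.0 - ABSORB_2)
    # The spring stores up to k * x_max^2 / 2; the limiter caps travel x.
    spring_capacity = 0.5 * SPRING_K * MAX_TRAVEL_M ** 2
    return max(0.0, after_second - spring_capacity)

print(residual_energy(1.0))  # ≈ 0.25 J reaches the support from a 1 J impact
```

The point the design makes is visible in the numbers: each stage removes a share of the impact, and the contraction limiter bounds how much the spring can take, so the residue reaching the third support stays small.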
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Biomedical Technology (AREA)
- Health & Medical Sciences (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Food Science & Technology (AREA)
- Manipulator (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
Abstract
The present invention relates to a collaborative robot total control system based on an artificial intelligence camera. The system may comprise: a collaborative robot; a central control server that communicates with and controls the collaborative robot; an artificial intelligence camera that communicates with the central control server and captures an image of the interior of the store in which the collaborative robot is located; a database unit that receives and stores a method by which the central control server controls the collaborative robot according to the position or movement of an object input to the artificial intelligence camera; and a communication unit that transmits data relating to space, physical objects, or usage by a store manager or the collaborative robot to the central control server.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2022-0121243 | 2022-09-25 | ||
KR1020220121243A KR102518302B1 (ko) | 2022-09-25 | 2022-09-25 | Collaborative robot total control system based on artificial intelligence camera |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024063625A1 true WO2024063625A1 (fr) | 2024-03-28 |
Family
ID=85918420
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2023/014615 WO2024063625A1 (fr) | 2022-09-25 | 2023-09-25 | Système de commande de robot collaboratif total basé sur un appareil de prise de vues à intelligence artificielle |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102518302B1 (fr) |
WO (1) | WO2024063625A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102518302B1 (ko) * | 2022-09-25 | 2023-04-06 | (주) 마가커피 | Collaborative robot total control system based on artificial intelligence camera |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR200265873Y1 (ko) * | 2001-10-26 | 2002-02-27 | 주식회사 삼우공간건축사사무소 | Shock-absorbing cover for protecting supports of road facilities |
JP2017140660A (ja) * | 2016-02-08 | 2017-08-17 | 川崎重工業株式会社 | Work robot |
KR20190003118A (ko) * | 2017-06-30 | 2019-01-09 | 엘지전자 주식회사 | Robot system including mobile robot and mobile terminal |
KR20210007771A (ko) * | 2019-07-12 | 2021-01-20 | 엘지전자 주식회사 | Coffee-making robot and control method therefor |
KR20210039635A (ko) * | 2019-10-02 | 2021-04-12 | 엘지전자 주식회사 | Robot system and control method therefor |
KR102322701B1 (ko) * | 2021-04-07 | 2021-11-09 | 주식회사 지아이에스21 | On-site intelligent underground facility safety management apparatus and method using a UF code pad unit and an AI robot module for measuring underground facilities |
KR102518302B1 (ko) * | 2022-09-25 | 2023-04-06 | (주) 마가커피 | Collaborative robot total control system based on artificial intelligence camera |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11557387B2 (en) | 2019-05-02 | 2023-01-17 | Lg Electronics Inc. | Artificial intelligence robot and method of controlling the same |
-
2022
- 2022-09-25 KR KR1020220121243A patent/KR102518302B1/ko active IP Right Grant
-
2023
- 2023-09-25 WO PCT/KR2023/014615 patent/WO2024063625A1/fr unknown
Also Published As
Publication number | Publication date |
---|---|
KR102518302B1 (ko) | 2023-04-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2024063625A1 (fr) | Total collaborative robot control system based on an artificial intelligence camera | |
WO2020256159A1 (fr) | Mobile robot and control method therefor | |
WO2019235743A1 (fr) | Robot for traveling via waypoint on the basis of obstacle avoidance, and method for traveling | |
WO2018124682A2 (fr) | Mobile robot and control method therefor | |
WO2021010502A1 (fr) | Robot and article management method using same | |
WO2021002499A1 (fr) | Method for tracking user location by using swarm robots, tag device, and robot implementing same | |
WO2020256195A1 (fr) | Building management robot, and method for providing service using same | |
WO2017034056A1 (fr) | Mobile robot and control method therefor | |
WO2020032412A1 (fr) | Mobile robot and method for setting up following thereof | |
WO2021002511A1 (fr) | Beacon, method for moving in beacon-following mode, and cart robot implementing same | |
EP3525992A1 (fr) | Mobile robot system and mobile robot | |
WO2020209426A1 (fr) | Charging module and transport robot equipped therewith | |
WO2020256188A1 (fr) | Image projection method and robot implementing same | |
WO2020218644A1 (fr) | Method and robot for redefining location of robot by using artificial intelligence | |
WO2020241950A1 (fr) | Mobile robot and control method therefor | |
WO2020209394A1 (fr) | Method for controlling movement of cart robot according to change in travel surface by using artificial intelligence, and cart robot | |
WO2019235667A1 (fr) | Indoor guide drone and method for controlling same | |
WO2019135437A1 (fr) | Guide robot and operation method thereof | |
WO2021177805A1 (fr) | Sound source localization and sound source cancellation device, method, and system | |
WO2020230931A1 (fr) | Robot generating map on the basis of multi-sensor and artificial intelligence, configuring correlation between nodes, and running by means of the map, and map generation method | |
WO2020256180A1 (fr) | Stroller robot based on user recognition, and control method therefor | |
WO2020242065A1 (fr) | Robot movement control method based on risk level determination, and mobile robot device using same | |
WO2019240374A1 (fr) | Mobile robot for recognizing queue, and operating method of mobile robot | |
WO2018074909A1 (fr) | Boarding information guiding system and method, and procedure management device and method | |
WO2020256179A1 (fr) | Marker for spatial recognition, method for aligning and moving cart robot by recognizing space, and cart robot | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23868676 Country of ref document: EP Kind code of ref document: A1 |