WO2022097775A1 - 5g-based production, logistics management, and cloud-oriented machine vision service providing method - Google Patents

5g-based production, logistics management, and cloud-oriented machine vision service providing method Download PDF

Info

Publication number
WO2022097775A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
present
service
cloud
network
Prior art date
Application number
PCT/KR2020/015384
Other languages
French (fr)
Korean (ko)
Inventor
박덕근
김유진
윤종필
Original Assignee
위즈코어 주식회사
에스케이텔레콤 주식회사
Priority date
Filing date
Publication date
Application filed by 위즈코어 주식회사, 에스케이텔레콤 주식회사
Priority to PCT/KR2020/015384
Publication of WO2022097775A1

Classifications

    • G06N 20/00 — Machine learning
    • G06Q 10/08 — Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q 10/10 — Office automation; Time management
    • G06T 19/00 — Manipulating 3D models or images for computer graphics
    • G06T 7/00 — Image analysis

Definitions

  • the present invention relates to a 5G-based production, logistics management, and cloud-oriented machine vision service providing method.
  • Accordingly, a method is required for providing 5G-based production, logistics management, and cloud-oriented machine vision services that can transmit and receive ultra-low-latency, ultra-high-capacity, and ultra-realistic data between devices using a 5G network, collect data from the manufacturing site, and analyze the collected data with artificial intelligence on a cloud infrastructure.
  • the problem to be solved by the present invention is to provide a method for providing a 5G-based production, logistics management, and cloud-oriented machine vision service.
  • specifically, the problem to be solved by the present invention is to provide a method for providing 5G-based production, logistics management, and cloud-oriented machine vision services that can transmit and receive ultra-low-latency, ultra-high-capacity, and ultra-realistic data between devices using a 5G network, collect data from the manufacturing site, and analyze the collected data with artificial intelligence on a cloud infrastructure.
  • a 5G-based production, logistics management, and cloud-oriented machine vision service providing method includes: specializing in control data, for which real-time performance is critical, and large-capacity data over an ultra-low-latency, ultra-high-speed, large-capacity network, and processing services such as quality inspection and robot control through machine learning; photographing a product with a high-resolution camera connected to the network and determining whether the product is defective using an algorithm learned through machine learning; providing a factory automation service using a multi-function robot in which the network, a robot vision system, a 6-axis collaborative robot arm, and an AMR are integrated; augmenting the facility status and sensor information in the factory in real time on an AR device and providing a manual support augmentation service; and controlling a 5G-based flexible production test bed.
  • interworking/compatibility can be secured in advance by applying the technology step by step in consideration of the timing of domestic and foreign 5G technology standardization and commercialization.
  • the present invention can minimize the technical limitations of service implementation by applying ultra-low delay technology for each network section.
  • the present invention can secure technical versatility by designing for a commercial network structure.
  • the present invention can provide communication and overall environment verification for cloud application.
  • the present invention can stably store and process various heterogeneous data.
  • the present invention can promote efficient service development through role sharing among consortium participating companies.
  • the present invention enables technology recycling and rapid system construction through machine vision service.
  • the present invention makes it possible to derive concrete results through close collaboration with the demonstration participating organizations.
  • the present invention enables selection of a collaborative robot for AMR installation and selection of a 3D vision sensor through a joint test.
  • the present invention enables the development of a dedicated AMR design for mounting a collaborative robot and a 3D vision sensor electronic unit.
  • the present invention enables sharing of real-time status and map information of a manufacturing robot.
  • the present invention can implement ultra-low network latency through connection with an edge computing device.
  • the present invention can utilize a glass-type HMD device in consideration of the operator's convenience of use in the field.
  • the present invention can provide the augmentation service of the status information and the manual of the manufacturing robot.
  • the present invention displays sensor data by grafting IoT technology, and it is also possible to inquire about logistics loading information.
  • an integrated test environment and equipment interlocking test are possible, and the VoC of the field can be reflected early by applying machine vision and a robot to an actual operation line.
  • the effect according to the present invention is not limited by the contents exemplified above, and more various effects are included in the present specification.
  • FIG. 1 is an exemplary diagram for explaining a 5G-based smart factory according to an embodiment of the present invention.
  • FIG. 2 is an exemplary diagram for explaining various 5G-based smart factory systems according to an embodiment of the present invention.
  • FIG 3 is an exemplary diagram for explaining a 5G network according to an embodiment of the present invention.
  • 4 to 6 are exemplary views for explaining a machine learning platform according to an embodiment of the present invention.
  • FIG. 7 and 8 are exemplary views for explaining a machine vision system according to an embodiment of the present invention.
  • FIG 9 is an exemplary view for explaining a multi-function robot system according to an embodiment of the present invention.
  • FIG 10 and 11 are exemplary views for explaining a manufacturing facility management AR system according to an embodiment of the present invention.
  • FIG. 12 is an exemplary view for explaining a flexible production line system according to an embodiment of the present invention.
  • FIG. 13 is a block diagram illustrating an apparatus for providing a 5G-based production, logistics management, and cloud-oriented machine vision service according to an embodiment of the present invention.
  • FIG. 14 is a flowchart illustrating a 5G-based production, logistics management, and cloud-oriented machine vision service providing method according to an embodiment of the present invention.
  • expressions such as “have,” “may have,” “include,” or “may include” indicate the presence of a corresponding characteristic (e.g., a numerical value, function, operation, or component such as a part) and do not exclude the presence of additional characteristics.
  • expressions such as “A or B,” “at least one of A or/and B,” or “one or more of A or/and B” may include all possible combinations of the items listed together.
  • “A or B,” “at least one of A and B,” or “at least one of A or B” may refer to all of the following cases: (1) including at least one A, (2) including at least one B, or (3) including both at least one A and at least one B.
  • expressions such as “first” and “second” may modify various components regardless of order and/or importance and are used only to distinguish one component from another; they do not limit the components.
  • for example, a first user equipment and a second user equipment may represent different user equipment regardless of order or importance.
  • a first component may be renamed as a second component, and similarly, a second component may also be renamed as a first component.
  • when a component (e.g., a first component) is referred to as being “coupled with/to (operatively or communicatively)” or “connected to” another component (e.g., a second component), the component may be directly connected to the other component or may be connected through yet another component (e.g., a third component).
  • the expression “configured to” may mean, depending on the situation, that a device is “capable of” doing something together with other devices or parts.
  • for example, “a processor configured (or set) to perform A, B, and C” may refer to a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic-purpose processor (e.g., a CPU or an application processor) that can perform the corresponding operations by executing one or more software programs stored in a memory device.
  • FIG. 1 is an exemplary diagram for explaining a 5G-based smart factory according to an embodiment of the present invention.
  • the 5G-based smart factory builds a manufacturing-specialized network infrastructure that combines 5G (Fifth Generation) and TSN (Time Sensitive Network) technologies and, based on this, builds and demonstrates a cloud-oriented machine learning platform together with Machine Vision, Multi-Function Robot, and AR services for technology verification.
  • Industry 5G is a factory-specific industrial 5G network for ultra-low latency, ultra-high speed, and hyper-connectivity.
  • the ML (Machine Learning) Cloud analyzes numerous data required for the manufacturing process based on machine learning.
  • Demo site enables education, joint research, and demonstration, and Machine Vision enables automatic quality analysis through image analysis.
  • Multi-Function Robot enables work/logistics collaboration within a factory, and AR service enables work order, information delivery, and manual confirmation.
  • FIG. 2 is an exemplary diagram for explaining various 5G-based smart factory systems according to an embodiment of the present invention.
  • the 5G-based smart factory system includes an ultra-low latency/large-capacity network, a cloud machine learning platform (Cloud ML Platform), and three types of demonstration services.
  • Ultra-low latency/high-capacity networks include Pre-5G and Multi-access Edge Computing.
  • Pre-5G includes Pre-5G application, commercial 5G verification, and 5G facility production.
  • Multi access Edge Computing includes TSN development, 5G MEC interworking and 5G standardization facility interworking.
  • Cloud machine learning platforms include Edge Cloud and 5G Industry Cloud.
  • 5G Industry Cloud includes GPU infrastructure construction, UI/UX development, sensor data interworking and integrated cloud.
  • the three types of demonstration services include Vision, Robot, and AR.
  • Vision includes vision construction, 3D vision development, and general-purpose core development.
  • Robot includes AMR production, vision + AMR integration and 5G interworking facilities.
  • AR includes primary prototyping, machine location identification, and AR + AMR.
  • FIG 3 is an exemplary diagram for explaining a 5G network according to an embodiment of the present invention.
  • the 5G network combines wireless 5G and wired TSN to form an ultra-low-latency, ultra-high-capacity data transmission network infrastructure.
  • the present invention can be utilized for control data, for which real-time performance is important, and for an image information transmission service for vision inspection that requires large-capacity data transmission.
  • the 5G network includes a Pre 5G (60GHz band) transmitting terminal that interworks with production/logistics equipment such as AMR, Bin Picking Robot, and Machine Vision, and a SW interface for interworking between Pre 5G receivers and MEC facilities is provided.
  • an industrial TSN prototype is provided.
  • the 5G network provides external interfaces for AMR, Bin Picking Robot, Machine Vision facilities, and commercial 5G modems, and a computing node for MEC implementation of the service end in the 5G base station (concentrator) is built.
  • TSN-based industrial profiles such as improved forwarding, time synchronization, and queue control functions are provided.
  • the 5G network includes 5G wireless communication-based production/logistics facilities (AMR, Bin Picking Robot, Machine Vision), a commercial 5G modem is embedded in each production/logistics facility, and interworking verification between TSN equipment and commercial 5G standard equipment of telecommunication carriers is provided.
  • a low-latency packet forwarding technology, a low-latency transmission technology for supporting mission-critical services, and a time synchronization function between TSN devices can be provided.
  • a reliable packet forwarding function for services sensitive to Packet Loss Ratio can be implemented, and a transmission quality management solution for low-delay service support can be built.
  • service characteristics are discovered through quality measurement of Control Packet and Data Packet for each Smart Factory Application, and a commercial Network Switching Chipset-based TSN switch prototype can be manufactured.
  • the present invention enables the introduction of high-speed/ultra-low latency network infrastructure in small and medium-sized factories by applying commercial network-based 5G wireless communication and TSN-based wired communication technology, and various smart factory services based thereon.
  • the present invention can develop and demonstrate an End-to-End Real-Time Network in the form of a commercial service by combining 5G wireless infrastructure with wired TSN technology and developing an Ethernet-based industrial TSN Profile to respond to remote control services such as AMR control.
  • the present invention makes it easy to secure a high-speed/ultra-low-delay network base for small and medium-sized factories by utilizing commercial communication infrastructure, minimizes dependence on foreign solutions by replacing the wired domestic manufacturing facility control network dominated by global companies with a 5G wireless network, and is easy to spread and apply by interworking with highly versatile global standard communication technology.
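  • As a purely illustrative sketch of the queue-control idea in the network bullets above (latency-critical control packets served ahead of large-capacity vision data), and not the patent's actual TSN switch implementation, the following Python fragment (names and priority classes are assumptions) shows strict-priority forwarding; it needs Python 3.10+ for the type hints.

```python
import heapq
import itertools
from dataclasses import dataclass, field

# Hypothetical priority classes: lower value = served first, mimicking
# TSN-style strict-priority queue control for latency-critical traffic.
CONTROL = 0   # e.g., AMR / robot-arm control packets (real-time critical)
BULK = 1      # e.g., high-resolution inspection images (large capacity)

@dataclass(order=True)
class Packet:
    priority: int
    seq: int
    payload: bytes = field(compare=False)

class StrictPriorityForwarder:
    """Minimal strict-priority egress queue (illustrative only)."""
    def __init__(self) -> None:
        self._queue: list[Packet] = []
        self._counter = itertools.count()

    def enqueue(self, payload: bytes, priority: int) -> None:
        heapq.heappush(self._queue, Packet(priority, next(self._counter), payload))

    def dequeue(self) -> bytes | None:
        return heapq.heappop(self._queue).payload if self._queue else None

if __name__ == "__main__":
    fwd = StrictPriorityForwarder()
    fwd.enqueue(b"image-chunk-0" * 1000, BULK)
    fwd.enqueue(b"stop-robot", CONTROL)
    # The control packet leaves first even though it arrived second.
    print(fwd.dequeue()[:10])  # b'stop-robot'
```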
  • 4 to 6 are exemplary views for explaining a machine learning platform according to an embodiment of the present invention.
  • the machine learning platform is specialized for industries where real-time performance is important and provides real-time processing through a 5G network-based Edge Cloud located in the factory and a machine learning cloud located in the central IDC in the form of SaaS. Through this, integrated process analysis of data generated within the plant is possible.
  • the machine learning platform provides a machine learning platform based on GPU development verification and enables cloud design suitable for manufacturing environments.
  • the machine learning platform can provide analysis of the manufacturing environment by major target customers.
  • the machine learning platform can design and develop a cloud portal, and it is possible to develop a portal UI/GUI that can check user data.
  • the machine learning platform can provide an integrated analysis monitoring platform based on standard OPC application and large data processing.
  • the machine learning platform can be linked with three additional types of interlocking equipment. Three types of additional interlocking data (vibration sensor, temperature sensor, thermal image) can be selected, and REST API linkage with solutions already owned by participating companies is possible.
  • the machine learning platform can apply a filtering algorithm by classifying meaningful data among additional input data. It is possible to analyze the structure and function of the existing S/W, derive improvements and priorities for the existing S/W, and review the UI scenario according to the addition of new functions.
  • the machine learning platform can optimize and verify cloud system performance, and perform verification comparisons for stability and performance optimization.
  • the machine learning platform can develop and verify failure analysis algorithms.
  • the machine learning platform enables failure analysis algorithm design and platform development.
  • the machine learning platform is capable of integrating and analyzing two or more types of factory data.
  • the machine learning platform enables linkage of two existing legacy systems (MES, ERP, etc.), development of correlation analysis algorithms, and linkage analysis of quality data and integrated process data.
  • the machine learning platform can provide a scale-out function of the platform infrastructure for large-scale data processing, a TensorFlow provisioning function, a GPU-based machine learning platform, a VM provisioning and monitoring function, and an SQS-compatible queue service.
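  • The text only states that the platform offers an SQS-compatible queue service; as a hedged sketch of how a factory-side producer and a GPU worker could use such a queue, the snippet below points the standard boto3 client at a hypothetical compatible endpoint (the endpoint URL, queue name, and message schema are assumptions).

```python
import json
import boto3

# Hypothetical endpoint of the platform's SQS-compatible queue service.
ENDPOINT_URL = "http://mlp-queue.example.local:9324"
QUEUE_NAME = "vision-inference-jobs"

sqs = boto3.client("sqs", endpoint_url=ENDPOINT_URL, region_name="ap-northeast-2",
                   aws_access_key_id="dummy", aws_secret_access_key="dummy")

queue_url = sqs.create_queue(QueueName=QUEUE_NAME)["QueueUrl"]

# Producer: a factory-side process enqueues a quality-inspection job.
sqs.send_message(QueueUrl=queue_url,
                 MessageBody=json.dumps({"image_id": "cam01-000123", "line": "A"}))

# Consumer: a GPU worker on the ML platform pulls jobs for processing.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=1)
for msg in resp.get("Messages", []):
    job = json.loads(msg["Body"])
    print("processing", job["image_id"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```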
  • the machine learning platform may build a cloud platform development prototype SMIC and provide a cloud portal configuration screen UI/UX.
  • the machine learning platform makes it possible to move away from existing wired-based data interworking, storage, and simple statistical prediction systems, and to process the large-scale real-time data required by the manufacturing process based on 5G, perform service quality inspection using machine learning results, and control robots.
  • the present invention can expand the types of data that can be collected through 5G interworking and improve the collection speed.
  • the present invention can expand the range of data that can be analyzed in the manufacturing field based on an infrastructure capable of processing large-capacity/real-time data.
  • the present invention can support the application of customized S/W package for each function for interworking with additional applications, and can lay the foundation for various algorithms and correlation analysis through integrated analysis of image data and production processes that can determine quality.
  • the present invention can establish interlocking standardization for general-purpose machine learning in the future, and reduce infrastructure costs for processing and building large-scale manufacturing data, so that the smart factory project aimed at the nation can be smoothly carried out.
  • the present invention blocks domestic technology leakage to a foreign cloud, and enables regular audits through the certification examination of related service companies that can be managed and supervised under the leadership of the government.
  • the present invention makes it possible to create domestic jobs and secure SW competitiveness for related businesses, secure domestic cloud technology suitable for the domestic environment and provide services, and secure source technology and competitiveness for smart factory solutions.
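  • The platform bullets above mention REST API linkage of three additional interlocking data types (vibration sensor, temperature sensor, thermal image). The following is a minimal, hedged sketch of pushing such readings to the platform portal; the URL, token, and payload fields are illustrative assumptions, not a documented API.

```python
import time
import requests

# Hypothetical portal endpoint; the text mentions REST API linkage with
# partner solutions but does not define the actual URL or schema.
PORTAL_URL = "https://ml-portal.example.com/api/v1/sensor-data"
API_TOKEN = "replace-with-real-token"

def push_reading(sensor_type: str, value, facility_id: str) -> None:
    """Send one reading of an interlocked sensor to the ML platform portal."""
    payload = {
        "facility_id": facility_id,
        "sensor_type": sensor_type,  # "vibration", "temperature", "thermal_image"
        "value": value,
        "timestamp": time.time(),
    }
    resp = requests.post(
        PORTAL_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=5,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    push_reading("vibration", 0.42, facility_id="press-07")
    push_reading("temperature", 68.5, facility_id="press-07")
```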
  • FIG. 7 and 8 are exemplary views for explaining a machine vision system according to an embodiment of the present invention.
  • the machine vision system can provide a quality inspection automation solution that photographs a product with a high-resolution camera connected to 5G, and determines in real time whether a mass-produced product is defective using an algorithm learned through machine learning.
  • the machine vision system can provide machine vision service based on 5G Machine Learning Cloud Platform.
  • the machine vision system can provide related services by installing the machine vision solution on the cloud industry platform.
  • This service may be a service for demonstrating automation of quality control by collecting large-capacity, high-quality quality control image data and applying it to the automobile parts production process.
  • the machine vision system can be applied to the automotive parts industry to perform performance verification.
  • Such a machine vision system can perform performance verification by identifying the required model and applying the As Is, To Be model to the sample at the model factory.
  • the machine vision system can consider the sensor and data collection status applied to the facility, environmental variables, etc., and prepare a verification basis.
  • Machine vision systems can provide high-volume 3D image inspection algorithms.
  • the machine vision system can provide high-speed pre-processing technology for large-capacity, high-quality 3D image data and high-accuracy machine vision 3D image quality inspection algorithms.
  • the machine vision system can be applied to the empirical model to perform large-capacity 3D image machine vision performance verification.
  • the machine vision system can identify the verification target that requires high-capacity image-based quality inspection and perform performance verification by applying the As Is, To Be model to the sample in the factory.
  • a machine vision system can provide a general-purpose Industry Machine Vision core platform.
  • the machine vision system can collect the relevant data step by step and advance the image detection algorithm.
  • the machine vision system may provide a general-purpose algorithm capable of determining the quality of defects other than the previously learned parts.
  • Machine vision systems provide a way to build an initial model with a small amount of data and then continuously learn from new (incremental) data, so that learning times are short while accuracy remains comparable.
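  • A minimal PyTorch sketch of this "small initial dataset, then incremental updates" idea follows; the backbone choice, hyperparameters, and data loaders are assumptions for illustration, not the patent's actual training pipeline.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import models  # torchvision >= 0.13 assumed for the weights API

def build_initial_model(num_classes: int = 2) -> nn.Module:
    """Start from an ImageNet-pretrained backbone so only a small labeled
    dataset is needed for the initial defect/normal classifier (transfer learning)."""
    model = models.resnet18(weights="DEFAULT")
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

def update_with_new_data(model: nn.Module, loader: DataLoader,
                         epochs: int = 1, lr: float = 1e-4) -> nn.Module:
    """Continue training on newly collected (incremental) images instead of
    retraining from scratch, which keeps learning time short."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device).train()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
    return model
```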
  • the machine vision system can provide a general-purpose Machine Vision Core that can be broadly applied to all areas of the manufacturing industry by standardizing and lightening expensive high-tech solutions such as vision inspection in the semiconductor area and developing it in the form of SaaS.
  • the present invention can provide an optimized deep learning analysis technique by applying ultra-expensive vision inspection, which was limited to semiconductors, to the automotive parts quality inspection process, which is the basis of the domestic manufacturing industry.
  • the present invention can improve the processing speed of a solution using a 5G network, and can reduce dependence on quality experts by automating quality control through Machine Vision.
  • the present invention makes it possible to learn with a small amount of data through transfer continuous learning, so it is easy to introduce technology for small and medium-sized enterprises (SMEs), and it is possible to provide Industry Machine Vision cloud service connected by 5G network.
  • the present invention lowers entry barriers for initial construction, enables rapid service use, enables basic analysis improvement without having an ICT professional manpower, and can be expected to improve productivity and quality competitiveness of small and medium-sized manufacturers.
  • the high-resolution image taken is transmitted to the edge cloud through 5G.
  • a high-speed, ultra-low latency 5G wireless network can be built inside the smart factory.
  • the machine learning-based vision cloud receives large-capacity product images from the edge cloud and uses them as input to train and advance the machine learning algorithm. In other words, real-time verification and real-time quality measurement are possible using the edge cloud.
  • the machine learning algorithm may be updated through learning using such product images as an algorithm for defect determination.
  • the machine learning-based vision cloud determines whether a product is defective through such an algorithm and delivers normal quantity/information data and defect cause data to the smart factory.
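  • To make the capture-transmit-judge round trip above concrete, here is a hedged sketch of the factory-side client: it grabs one high-resolution frame, sends it to a hypothetical edge-cloud inference endpoint over the in-factory 5G link, and receives the defect verdict. The endpoint URL and response fields are assumptions.

```python
import cv2          # pip install opencv-python
import requests

# Hypothetical edge-cloud inference endpoint; the text does not specify the API.
EDGE_URL = "http://edge-cloud.factory.local:8080/v1/vision/inspect"

def inspect_frame(camera_index: int = 0) -> dict:
    """Grab one frame, ship it to the edge cloud, and return the defect verdict."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("camera read failed")
    ok, jpeg = cv2.imencode(".jpg", frame)
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    resp = requests.post(
        EDGE_URL,
        files={"image": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
        timeout=2,  # tight timeout: the verdict is expected in near real time
    )
    resp.raise_for_status()
    return resp.json()  # e.g., {"defective": false, "cause": null} (assumed schema)

if __name__ == "__main__":
    print(inspect_frame())
```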
  • the machine vision system collects large-capacity, high-quality 3D image data of the production process and stores the collected data in a distributed storage environment.
  • the distributed storage supports GPU-based distributed machine learning, is designed for scale-out, and can perform computing resource scheduling.
  • using the above-mentioned algorithm, the machine vision system receives product images from the production line and can determine the total time required for the vision judgment, the classification accuracy (sensitivity, true positive rate), that is, the proportion of actually defective products judged as defective, and the classification accuracy (specificity, true negative rate), that is, the proportion of actually non-defective products judged as non-defective.
  • the total time required for determination may mean a total time required based on a difference between a time stamp at which a product photo is taken and a time at which a final determination is made through the machine vision service.
  • the sensitivity may mean the number of actual defective products compared to the total number of machine vision defective products.
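  • Using the definitions in the preceding bullets (total determination time, sensitivity, specificity), a short sketch of computing the three quantities from per-product records follows; the record format is an assumption for illustration.

```python
from dataclasses import dataclass

@dataclass
class InspectionRecord:
    captured_at: float      # time stamp when the product photo was taken (s)
    decided_at: float       # time when the machine vision service made its final decision (s)
    actual_defect: bool     # ground truth from manual inspection
    predicted_defect: bool  # verdict of the machine vision service

def evaluate(records: list[InspectionRecord]) -> dict:
    total_time = sum(r.decided_at - r.captured_at for r in records)
    tp = sum(r.actual_defect and r.predicted_defect for r in records)
    tn = sum((not r.actual_defect) and (not r.predicted_defect) for r in records)
    fn = sum(r.actual_defect and (not r.predicted_defect) for r in records)
    fp = sum((not r.actual_defect) and r.predicted_defect for r in records)
    return {
        "total_determination_time_s": total_time,
        # sensitivity (true positive rate): defective items correctly judged defective
        "sensitivity": tp / (tp + fn) if (tp + fn) else None,
        # specificity (true negative rate): non-defective items correctly judged non-defective
        "specificity": tn / (tn + fp) if (tn + fp) else None,
    }

if __name__ == "__main__":
    demo = [
        InspectionRecord(0.0, 0.8, True, True),
        InspectionRecord(1.0, 1.7, False, False),
        InspectionRecord(2.0, 2.9, False, True),
    ]
    print(evaluate(demo))
```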
  • the present invention collects large-capacity, high-quality quality control image data and applies a machine vision service using the data to the automobile parts production process to automate quality management.
  • the present invention can provide a general-purpose algorithm capable of quality determination for defects other than those of previously learned parts by collecting relevant data in stages and upgrading the image detection algorithm, and, by building an initial model with a small amount of data and then continuously learning from new data, it can provide an algorithm with a short learning time and high accuracy.
  • the present invention can provide an optimized deep learning analysis technique by applying ultra-expensive vision inspection, which was previously limited to semiconductors, to the automotive parts quality inspection process, which is the basis of the domestic manufacturing industry.
  • the present invention may improve the processing speed of a solution utilizing a 5G network.
  • the present invention may reduce reliance on quality experts by automating quality control through machine vision.
  • the present invention uses an algorithm capable of learning with a small amount of data through transfer/continuous learning, thereby facilitating the introduction of technology by small and medium-sized enterprises (SMEs).
  • the present invention uses a machine vision cloud connected by a 5G network to lower the barrier to entry for initial deployment and to use the service quickly.
  • basic analysis/improvement of the present invention is possible without having an ICT professional manpower.
  • the present invention can improve the productivity and quality competitiveness of small and medium-sized manufacturers.
  • the present invention makes it possible to easily introduce and use advanced facilities by demonstrating cloud-type machine vision services to small and medium-sized auto parts manufacturers who lack IT professional manpower.
  • the present invention can increase productivity and price competitiveness through automation of quality inspection, which is a simple repetitive task, allow manufacturers to focus on eco-friendly and autonomous vehicle technologies that are currently gaining momentum, and, through the flexible production demonstration, take the lead in the high-mix, low-volume production required by the future automotive industry.
  • the machine vision system performs problem definitions such as whether scratches, air bubbles, stamping/pressing, foreign substances/chips, etc., occur due to physical impact during the process, and learns a model for each segmented area.
  • the detection model is established by performing problem analysis, such as reducing the amount of information to be learned through a similar background, and increasing the data set through part division.
  • the machine vision system performs defect learning using a DenseNet-based CNN model.
  • the machine vision system performs defect determination using this pre-trained defect determination model.
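  • The two bullets above state that a DenseNet-based CNN is trained for defects and that a pre-trained model is used for defect determination. Below is a hedged inference sketch of that setup using torchvision (≥ 0.13); the checkpoint path, class names, and preprocessing values are assumptions, not the patent's actual model.

```python
import torch
from torch import nn
from torchvision import models, transforms
from PIL import Image

CLASSES = ["normal", "defective"]          # assumed two-class setup
CHECKPOINT = "defect_densenet121.pt"       # hypothetical trained weights

def load_defect_model() -> nn.Module:
    """DenseNet-based CNN with a two-class head, as described above."""
    model = models.densenet121(weights=None)
    model.classifier = nn.Linear(model.classifier.in_features, len(CLASSES))
    model.load_state_dict(torch.load(CHECKPOINT, map_location="cpu"))
    return model.eval()

_preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def judge(image_path: str, model: nn.Module) -> str:
    """Return the predicted class for one product image."""
    x = _preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    probs = torch.softmax(model(x), dim=1)[0]
    return CLASSES[int(probs.argmax())]

if __name__ == "__main__":
    model = load_defect_model()
    print(judge("sample_part.jpg", model))
```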
  • the machine vision system may provide an interface module for a 5G-based machine vision service.
  • a machine vision service can be built with a wire-based machine vision solution mounted on a cloud machine learning platform.
  • the present invention can verify the performance by designing a decision model and applying the to-be model to the demonstration plant.
  • the present invention can provide a basis for verification in consideration of the sensor and data collection status, surrounding environment variables applied to the facility, and the like.
  • the present invention can minimize variations in the existing production line, select a target centering on a line with a high automated quality inspection effect, and support improvement effect information by providing a performance report.
  • FIG 9 is an exemplary view for explaining a multi-function robot system according to an embodiment of the present invention.
  • the multi-function robot system uses a multi-function robot that integrates a 5G wireless communication 3D robot vision system, a 6-axis collaborative robot arm, and an AMR to recognize various atypical objects and perform bin picking and logistics transfer.
  • This multi-function robotic system includes a 3D vision sensor and 6-axis collaborative robotic arm integrated hardware.
  • the multi-function robot system enables detailed design and implementation of AMR for transport and operation for factory automation.
  • Multi-function robotic systems can provide Edge Computing module design and prototyping.
  • the multi-function robot system provides a 5G module, a 3D vision sensor, a 6-axis collaborative robot arm, and an AMR integrated system.
  • the multi-function robot system can be applied with transport and operation AMR testbeds for factory automation.
  • the multi-function robot system can upgrade the 5G module, 3D vision sensor, 6-axis collaborative robot arm and AMR integrated system, and can be applied to AMR mass production sites for transport and work for factory automation.
  • the multi-function robot system can expand the application area of 3D Vision solution by combining an advanced industrial 3D scanner with AMR and robot arm, and preoccupy the multifunctional robot market in the form of ‘AMR + robot arm + various solutions’.
  • the present invention can eliminate the physical scan operation by applying structured light irradiation to minimize the 3D image acquisition time.
  • the present invention can transmit large-capacity images through 5G and perform related image processing in the cloud.
  • the present invention develops a mobile solution that combines a 6-axis collaborative robot and an AMR, in contrast to existing fixed solutions, and can secure the flexibility to be applied to various field environments.
  • FIG 10 and 11 are exemplary views for explaining a manufacturing facility management AR system according to an embodiment of the present invention.
  • the manufacturing facility management AR system uses 5G to augment the facility status and IoT sensor information in the factory on an AR device in real time, and establishes a manual support augmentation service so that field personnel can quickly check information and receive remote augmentation support.
  • the manufacturing facility management AR system uses real-time information augmentation technology linked with on-site manufacturing robots. Manufacturing facility management AR system can augment the manufacturing robot remote control manual with mobile devices. The manufacturing facility management AR system can augment the status information of the edge server-linked real-time robot. The manufacturing facility management AR system is capable of analyzing the location of equipment based on the manufacturing robot map. The manufacturing facility management AR system can analyze and organize the management location map data in the manufacturing robot.
  • real-time information augmentation technology linked with on-site manufacturing robots may be used.
  • the manufacturing facility management AR system can augment the manufacturing robot remote control manual with a glass device.
  • the manufacturing facility management AR system can augment the vibration state information of the Edge Server-linked robot.
  • Manufacturing facility management AR system may use manufacturing robot map-based equipment location identification and augmentation technology.
  • the manufacturing facility management AR system may provide a location map and linkage augmentation service from the manufacturing robot.
  • Manufacturing facility management AR system can use on-site logistics information augmented inquiry and manufacturing robot remote maintenance support technology.
  • the manufacturing facility management AR system can provide a real-time logistics information inquiry augmentation function.
  • Manufacturing facility management AR system can provide 5G demonstration video call solution utilization and remote interworking support technology service.
  • the manufacturing facility management AR system enables recognition/tracking based on marker recognition by mounting a unique marker (AR Tag) on the manufacturing robot device through the manufacturing robot device identification recognition technology.
  • Manufacturing facility management AR system can provide real-time inquiry and augmentation of connected equipment status values of Edge Cloud server through manufacturing robot status information inquiry and augmentation/tracking technology.
  • Manufacturing facility management AR system can provide manufacturing robot augmented manual production and interaction to provide vision recognition, tracking-based manual augmentation and interface of manufacturing robot remote control.
  • Manufacturing facility management AR system enables Edge-linked inquiry and rendering of manufacturing robot vibration sensor data through manufacturing robot vibration sensor inquiry and augmentation technology.
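  • The bullets above only state that a unique marker (AR Tag) mounted on each manufacturing robot is recognized and tracked. As one hedged, concrete way to do such marker recognition, the sketch below uses OpenCV's ArUco module (API of OpenCV ≥ 4.7); the marker dictionary and the marker-id-to-robot mapping are illustrative assumptions.

```python
import cv2  # pip install opencv-contrib-python (ArUco API shown is OpenCV >= 4.7)

# Hypothetical mapping from marker id to manufacturing robot.
MARKER_TO_ROBOT = {7: "bin-picking-robot-01", 12: "amr-03"}

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def robots_in_view(frame) -> list[str]:
    """Detect AR tags in a camera frame and return the robots they identify."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is None:
        return []
    return [MARKER_TO_ROBOT.get(int(i), f"unknown-marker-{int(i)}") for i in ids.flatten()]

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if ok:
        print(robots_in_view(frame))
```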
  • the present invention can configure a service by utilizing marker-based image recognition and the latest Android device's ARCore face recognition technology, rather than spatial recognition-based augmented authoring and face recognition technology.
  • the main device utilizes a tablet equipped with the latest Google OS, not Google Tango, to secure universality, and UI charts necessary for augmentation can be produced according to the field and use.
  • the present invention facilitates market expansion by developing a highly versatile OS, and can be expanded to fields requiring remote equipment maintenance (elevators, tractors, etc.) and logistics loading information inquiry fields.
  • the present invention makes it possible to create new jobs using non-professional personnel according to the accumulation/transmission of Know How.
  • FIG. 12 is an exemplary view for explaining a flexible production line system according to an embodiment of the present invention.
  • the flexible production line system can establish a 5G network-based flexible production test bed by developing industrial interface linkages such as 5G, OPC UA, and industrial Ethernet, and developing an integrated process control system.
  • the flexible production line system uses Pre 5G and Industrial Ethernet linkage technology, and the server-client OPC UA structure can be improved to a PubSub structure.
  • the flexible production line system can provide process-integrated connection technology that delivers Pre 5G-based process data.
  • the flexible production line system provides a 5G interface for smart process equipment and can operate a demonstration line for smart process equipment 5G commercial network verification.
  • the flexible production line system can provide an interoperability test bed for flexible production of smart factories, and can provide a control integration system and control system for flexible production.
  • the flexible production line system can strengthen the domestic competitiveness of the modular factory facility occupied by global companies by developing a mobile module combination type field automation line technology by grafting 5G technology to the SBB (Smart Base Block) platform.
  • the present invention applies 5G to the SBB core technology to provide a mobile, module-combination field automation test bed line technology by supporting wireless-based low latency, and can provide OPC UA high-speed streaming connection technology to secure real-time performance of factory device control data by applying 5G technology.
  • the present invention makes it possible to manufacture intelligent equipment, devices, and solutions that meet the requirements of the Industrial Internet and Industry 4.0 by using OPC UA and 5G technology in equipment or device manufacturers, engineering companies, and system providing companies.
  • the present invention can secure compatibility of manufacturing equipment information by automating the tasks of mapping and symbol conversion for applying information control logic, symbols, etc. for each manufacturer of existing equipment/controllers to IT solutions.
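  • As a minimal sketch of the OPC UA interworking mentioned in the bullets above (a basic client-server read only, not the PubSub improvement), the snippet below uses the open-source asyncua library; the server URL and node id are illustrative, since the actual address space of the flexible-line equipment is not specified in the text.

```python
import asyncio
from asyncua import Client  # pip install asyncua

# Illustrative endpoint and node id for a smart process machine on the line.
OPC_URL = "opc.tcp://flex-line.factory.local:4840/freeopcua/server/"
NODE_ID = "ns=2;i=2"

async def read_process_value() -> None:
    """Connect to the line's OPC UA server and read one process variable."""
    async with Client(url=OPC_URL) as client:
        node = client.get_node(NODE_ID)
        value = await node.read_value()
        print("process value:", value)

if __name__ == "__main__":
    asyncio.run(read_process_value())
```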
  • FIG. 13 is a block diagram illustrating an apparatus for providing a 5G-based production, logistics management, and cloud-oriented machine vision service according to an embodiment of the present invention.
  • an apparatus 1300 for providing a 5G-based production, logistics management, and cloud-oriented machine vision service may include a communication unit 1310, a user input unit 1320, an output unit 1330, a memory 1340, an interface unit 1350, a control unit 1360, a power supply unit 1370, and the like. Since the components shown in FIG. 13 are not essential, an apparatus having more or fewer components may be implemented.
  • the communication unit 1310 may include one or more modules that enable wired/wireless communication between a device and a network in which the device is located.
  • the communication unit 1310 transmits/receives a signal to and from at least one of an external device and a server on a communication network such as the Internet.
  • the signal may include various types of data.
  • the communication unit 1310 may receive a product image file photographed by a high-resolution camera.
  • the user input unit 1320 generates input data for the user to control the operation of the device.
  • the user input unit 1320 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.
  • the output unit 1330 is for generating an output related to sight, hearing, or touch, and this may include a display unit 1331 , a sound output module 1332 , and the like.
  • the display unit 1331 displays (outputs) information processed by the device.
  • the device displays a user interface (UI) or graphic user interface (GUI) related to the system.
  • the display unit 1331 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.
  • the sound output module 1332 may output audio data received from the communication unit 1310 or stored in the memory 1340.
  • the sound output module 1332 also outputs a sound signal related to a function performed by the device.
  • the memory 1340 may store a program for processing and control of the controller 1360 and may perform a function of temporarily storing input/output data.
  • the memory 1340 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk.
  • the device may operate in relation to a web storage that performs the storage function of the memory 1340 on the Internet.
  • the interface unit 1350 serves as a passage with all external devices connected to the device.
  • the interface unit 1350 receives data from an external device, receives power and transmits it to each component inside the device, or allows data inside the device to be transmitted to an external device.
  • the interface unit 1350 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for connecting devices equipped with identification modules, audio input/output (I/O) ports, video input/output (I/O) ports, earphone ports, and the like.
  • a controller 1360 typically controls the overall operation of the device.
  • the controller 1360 may specialize in control data, for which real-time performance is important, and large-capacity data through an ultra-low latency, ultra-high-capacity network, and may process services such as quality inspection and robot control through machine learning.
  • the controller 1360 may photograph a product with a high-resolution camera connected to a network, and determine whether the product is defective by using an algorithm learned through machine learning.
  • the controller 1360 may provide a factory automation service using a multi-function robot in which a network, a robot vision system, a 6-axis collaborative robot arm, and an AMR are integrated.
  • the control unit 1360 may augment the facility state and sensor information in the factory in real time to the AR device, and may provide a manual support augmentation service.
  • the controller 1360 may control the 5G-based flexible production test bed.
  • controller 1360 may include a graphic module 1361 for parallel data processing.
  • the graphic module 1361 may be implemented within the control unit 1360 or may be implemented separately from the control unit 1360.
  • the power supply unit 1370 receives external power and internal power under the control of the control unit 1360 to supply power required for operation of each component.
  • Various embodiments described herein may be implemented in a computer-readable recording medium using, for example, software, hardware, or a combination thereof.
  • the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
  • embodiments such as the procedures and functions described in this specification may be implemented as separate software modules.
  • Each of the software modules may perform one or more functions and operations described herein.
  • the software code may be implemented as a software application written in a suitable programming language.
  • the software code may be stored in the memory 1340 and executed by the controller 1360 .
  • FIG. 14 is a flowchart illustrating a 5G-based production, logistics management, and cloud-oriented machine vision service providing method according to an embodiment of the present invention.
  • the 5G-based production, logistics management, and cloud-oriented machine vision service providing device specializes in control data, for which real-time performance is important, and large-capacity data through an ultra-low latency, ultra-high-speed, large-capacity network, and processes services such as quality inspection and robot control through machine learning (S1400).
  • the 5G-based production, logistics management and cloud-oriented machine vision service providing device shoots a product with a high-resolution camera connected to a network, and uses an algorithm learned through machine learning to determine whether the product is defective (S1410).
  • the 5G-based production, logistics management and cloud-oriented machine vision service providing device provides factory automation services using a multi-function robot that integrates a network, a robot vision system, a 6-axis collaborative robot arm, and AMR (S1420) .
  • the 5G-based production, logistics management, and cloud-oriented machine vision service providing device augments the facility status and sensor information in the factory in real time to the AR device, and provides a manual support augmentation service (S1430).
  • the 5G-based production, logistics management and cloud-oriented machine vision service providing device controls the 5G-based flexible production test bed (S1440).
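  • A hedged Python skeleton that mirrors the five steps S1400-S1440 of the flowchart above follows; the class, method names, and placeholder bodies are assumptions for illustration, not the actual implementation of the service providing device.

```python
class VisionServiceProvider:
    """Skeleton mirroring steps S1400-S1440 of FIG. 14 (placeholder bodies only)."""

    def process_realtime_services(self):          # S1400
        """Specialize in real-time control data and large-capacity data over the
        ultra-low-latency network and run ML-based quality inspection / robot control."""
        raise NotImplementedError

    def inspect_product(self, image):             # S1410
        """Judge defectiveness of a product photographed by a networked camera
        using a machine-learning-trained algorithm."""
        raise NotImplementedError

    def run_factory_automation(self):             # S1420
        """Drive the multi-function robot (robot vision + 6-axis arm + AMR)."""
        raise NotImplementedError

    def augment_ar_device(self, facility_state, sensor_info):  # S1430
        """Augment facility status, sensor data, and manuals on the AR device."""
        raise NotImplementedError

    def control_flexible_testbed(self):           # S1440
        """Control the 5G-based flexible production test bed."""
        raise NotImplementedError

def provide_service(provider: VisionServiceProvider, image) -> None:
    # The flowchart executes the steps in order S1400 -> S1440.
    provider.process_realtime_services()
    provider.inspect_product(image)
    provider.run_factory_automation()
    provider.augment_ar_device(facility_state=None, sensor_info=None)
    provider.control_flexible_testbed()
```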
  • the present invention can secure interworking/compatibility in advance by applying the technology step-by-step considering the time of domestic and overseas 5G technology standardization and commercialization.
  • the present invention can minimize the technical limitations of service implementation by applying ultra-low delay technology for each network section.
  • the present invention can secure technical versatility by designing for a commercial network structure.
  • the present invention can provide communication and overall environment verification for cloud application.
  • the present invention can stably store and process various heterogeneous data.
  • the present invention can promote efficient service development through role sharing among consortium participating companies.
  • the present invention enables technology recycling and rapid system construction through machine vision service.
  • the present invention makes it possible to derive concrete results through close collaboration with the demonstration participating organizations.
  • the present invention enables selection of a collaborative robot for AMR installation and selection of a 3D vision sensor through a joint test.
  • the present invention enables the development of a dedicated AMR design for mounting a collaborative robot and a 3D vision sensor electronic unit.
  • the present invention enables sharing of real-time status and map information of a manufacturing robot.
  • the present invention can implement ultra-low network latency through connection with an edge computing device.
  • the present invention can utilize a glass-type HMD device in consideration of the operator's convenience of use in the field.
  • the present invention can provide the augmentation service of the status information and the manual of the manufacturing robot.
  • the present invention displays sensor data by grafting IoT technology, and it is also possible to inquire about logistics loading information.
  • an integrated test environment and equipment interlocking test are possible, and the VoC of the field can be reflected early by applying machine vision and a robot to an actual operation line.
  • the apparatus and method according to an embodiment of the present invention may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium.
  • the computer-readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the computer readable medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software field.
  • Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical media such as CD-ROMs and DVDs, and magneto-optical media such as floptical disks.
  • Examples of program instructions include not only machine language codes such as those generated by a compiler, but also high-level language codes that can be executed by a computer using an interpreter or the like.
  • the hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

Abstract

A 5G-based production, logistics management, and cloud-oriented machine vision service providing method according to an embodiment of the present invention comprises the steps of: specializing, through an ultralow latency ultrahigh speed large capacity network, large volume data and control data, the real-time property of which matters, and processing a service such as a robot control and a quality check through machine learning; capturing an image of a product by a high-resolution camera connected to the network, and determining whether the product is defective, by using an algorithm trained through machine learning; providing a factory automation service by using a multi-function robot in which the network, a robot vision system, a 6-axis collaborative robot arm, and an AMR are integrated; augmenting, in real time, in-plant installation state and sensor information in an AR device, and providing a manual support augmentation service; and controlling a 5G-based flexible production testbed.

Description

5G-based production, logistics management, and cloud-oriented machine vision service providing method
The present invention relates to a 5G-based production, logistics management, and cloud-oriented machine vision service providing method.
In the era of the 4th industrial revolution, it is urgent to develop ICT-based smart production and logistics services and intelligent solutions to solve the labor shortage caused by an aging population and to improve production and labor efficiency. In this era, the survival of a company can be guaranteed only when the things necessary for production are converged with ICT technology and new value is created through intelligence.
Although manufacturing accounts for a high share of gross domestic product, the proportion of labor-intensive manufacturers is high, and only a small fraction of manufacturers can improve productivity and create high value-added products by introducing ICT technologies such as artificial intelligence.
In addition, since the response speed and stability of production facility control are directly related to production yield, manufacturers have relied on the industrial wired networks of foreign companies despite their high cost. Wireless networks have been introduced only in some facilities where wired installation is difficult, but their adoption has not expanded due to problems such as interference between wireless networks, response delays, and limits on connected terminals.
In addition, the application of artificial intelligence technology to industrial sites has been attempted mainly by large enterprises, and it is difficult for small and medium-sized manufacturers to introduce it because of excessive installation costs and learning periods.
Accordingly, a method is required for providing 5G-based production, logistics management, and cloud-oriented machine vision services that can transmit and receive ultra-low-latency, ultra-high-capacity, and ultra-realistic data between devices using a 5G network, collect data from the manufacturing site, and analyze the collected data with artificial intelligence on a cloud infrastructure.
본 발명이 해결하고자 하는 과제는 5G 기반 생산, 물류 관리 및 클라우드향 머신 비전 서비스 제공 방법을 제공하는 것이다. The problem to be solved by the present invention is to provide a method for providing a 5G-based production, logistics management, and cloud-oriented machine vision service.
구체적으로, 본 발명이 해결하고자 하는 과제는 5G 네트워크를 이용하여 장치 간의 초저지연, 초대용량, 초실감 데이터를 송수신할 수 있고, 이를 통해 제조 현장의 데이터를 수집하여 수집된 데이터를 클라우드 인프라 기반 하에서 인공지능으로 분석하는 5G 기반 생산, 물류 관리 및 클라우드향 머신 비전 서비스를 제공하기 위한 방법을 제공하는 것이다.Specifically, the problem to be solved by the present invention is to be able to transmit and receive ultra-low delay, ultra-capacity, and ultra-realistic data between devices using a 5G network, and collect data from the manufacturing site through this, and use the collected data under the cloud infrastructure. It is to provide a method for providing 5G-based production, logistics management, and cloud-oriented machine vision services analyzed by artificial intelligence.
본 발명의 과제들은 이상에서 언급한 과제들로 제한되지 않으며, 언급되지 않은 또 다른 과제들은 아래의 기재로부터 당업자에게 명확하게 이해될 수 있을 것이다.The problems of the present invention are not limited to the problems mentioned above, and other problems not mentioned will be clearly understood by those skilled in the art from the following description.
전술한 바와 같은 과제를 해결하기 위하여 본 발명의 실시예에 따른 5G 기반 생산, 물류 관리 및 클라우드향 머신 비전 서비스를 제공하기 위한 방법이 제공된다. In order to solve the above problems, a method for providing a 5G-based production, logistics management and cloud-oriented machine vision service according to an embodiment of the present invention is provided.
본 발명의 실시예에 따른 5G 기반 생산, 물류 관리 및 클라우드향 머신 비전 서비스 제공 방법은, 초저지연, 초고속 대용량 네트워크를 통해 실시간성이 중요한 제어 데이터, 및 대용량 데이터를 전공하고, 기계학습을 통한 품질 검사 및 로봇 제어 등의 서비스를 처리하는 단계; 상기 네트워크와 연결된 고해상도 카메라로 제품을 촬영하고, 머신 러닝을 통해 학습된 알고리즘을 이용하여 제품에 대한 불량 여부를 결정하는 단계; 상기 네트워크, 로봇 비전 시스템, 6축 협업 로봇암(robot arm) 및 AMR이 통합된 멀티 기능 로봇을 이용하여 공장 자동화 서비스를 제공하는 단계; 공장 내 설비 상태 및 센서 정보를 AR 디바이스에 실시간 증강하고, 매뉴얼 지원 증강 서비스를 제공하는 단계; 및 5G 기반 유연 생산 테스트 베드를 제어하는 단계를 포함한다.5G-based production, logistics management, and cloud-oriented machine vision service provision method according to an embodiment of the present invention majors in control data and large-capacity data in which real-time is important through an ultra-low latency, ultra-high-speed large-capacity network, and quality through machine learning processing services such as inspection and robot control; photographing a product with a high-resolution camera connected to the network, and determining whether the product is defective using an algorithm learned through machine learning; providing a factory automation service using a multi-function robot in which the network, the robot vision system, a 6-axis collaborative robot arm, and an AMR are integrated; Augmenting the facility status and sensor information in the factory in real time to the AR device, and providing a manual support augmentation service; and controlling the 5G-based flexible production test bed.
Specific details of other embodiments are included in the detailed description and drawings.
According to the present invention, interworking and compatibility can be secured in advance by applying technologies in stages in consideration of the timing of domestic and international 5G standardization and commercialization.
The present invention can minimize technical limitations on service implementation by applying ultra-low-latency technology to each network section.
The present invention can secure technological versatility through a design that fits commercial network structures.
The present invention can establish communication and overall environment verification for cloud adoption.
The present invention can stably store and process various heterogeneous data.
The present invention can promote efficient service development through division of roles among the consortium participants.
The present invention enables technology reuse and rapid system construction through the machine vision service.
The present invention makes it possible to derive concrete results through close collaboration with the organizations participating in the demonstration.
The present invention enables the selection of a collaborative robot for AMR mounting and the selection of a 3D vision sensor through joint testing.
The present invention enables the design of a dedicated AMR for mounting a collaborative robot and the development of an electronics unit for a 3D vision sensor.
The present invention enables sharing of the real-time status and map information of manufacturing robots.
The present invention can realize ultra-low network latency through connection with edge computing devices.
The present invention can utilize a glasses-type HMD device in consideration of worker convenience in the field.
The present invention can provide an augmentation service for manufacturing robot status information and manuals.
The present invention displays sensor data by incorporating IoT technology and also enables inquiry of logistics loading information.
The present invention enables an integrated test environment and equipment interworking tests, and can reflect field VoC early by applying machine vision, robots, and the like to actual operating lines.
Effects according to the present invention are not limited to those exemplified above, and more various effects are included in the present specification.
FIG. 1 is an exemplary diagram for explaining a 5G-based smart factory according to an embodiment of the present invention.
FIG. 2 is an exemplary diagram for explaining various 5G-based smart factory systems according to an embodiment of the present invention.
FIG. 3 is an exemplary diagram for explaining a 5G network according to an embodiment of the present invention.
FIGS. 4 to 6 are exemplary diagrams for explaining a machine learning platform according to an embodiment of the present invention.
FIGS. 7 and 8 are exemplary diagrams for explaining a machine vision system according to an embodiment of the present invention.
FIG. 9 is an exemplary diagram for explaining a multi-function robot system according to an embodiment of the present invention.
FIGS. 10 and 11 are exemplary diagrams for explaining a manufacturing facility management AR system according to an embodiment of the present invention.
FIG. 12 is an exemplary diagram for explaining a flexible production line system according to an embodiment of the present invention.
FIG. 13 is a block diagram for explaining an apparatus for providing 5G-based production, logistics management, and cloud-oriented machine vision services according to an embodiment of the present invention.
FIG. 14 is a flowchart for explaining a method for providing 5G-based production, logistics management, and cloud-oriented machine vision services according to an embodiment of the present invention.
Advantages and features of the present invention, and methods of achieving them, will become apparent with reference to the embodiments described below in detail in conjunction with the accompanying drawings. However, the present invention is not limited to the embodiments disclosed below and may be implemented in various different forms; these embodiments are provided only so that the disclosure of the present invention will be complete and will fully convey the scope of the invention to those of ordinary skill in the art to which the present invention pertains, and the present invention is defined only by the scope of the claims. In connection with the description of the drawings, like reference numerals may be used for like components.
In this document, expressions such as "have," "may have," "include," or "may include" indicate the presence of a corresponding feature (e.g., a numerical value, function, operation, or component such as a part) and do not exclude the presence of additional features.
In this document, expressions such as "A or B," "at least one of A and/or B," or "one or more of A and/or B" may include all possible combinations of the items listed together. For example, "A or B," "at least one of A and B," or "at least one of A or B" may refer to all of the cases of (1) including at least one A, (2) including at least one B, or (3) including both at least one A and at least one B.
As used in this document, expressions such as "first," "second," and the like may modify various components regardless of order and/or importance, and are used only to distinguish one component from another without limiting the components. For example, a first user device and a second user device may represent different user devices regardless of order or importance. For example, without departing from the scope of the rights described in this document, a first component may be named a second component, and similarly, a second component may also be renamed a first component.
When it is mentioned that a certain component (e.g., a first component) is "(operatively or communicatively) coupled with/to" or "connected to" another component (e.g., a second component), it should be understood that the certain component may be directly connected to the other component or may be connected through yet another component (e.g., a third component). On the other hand, when it is mentioned that a certain component (e.g., a first component) is "directly coupled to" or "directly connected to" another component (e.g., a second component), it may be understood that no other component (e.g., a third component) exists between the two components.
The expression "configured to" used in this document may be used interchangeably with, for example, "suitable for," "having the capacity to," "designed to," "adapted to," "made to," or "capable of," depending on the situation. The term "configured to" does not necessarily mean only "specifically designed to" in hardware. Instead, in some situations, the expression "a device configured to" may mean that the device is "capable of" doing so together with other devices or parts. For example, the phrase "a processor configured to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
The terms used in this document are used only to describe specific embodiments and may not be intended to limit the scope of other embodiments. Singular expressions may include plural expressions unless the context clearly indicates otherwise. Terms used herein, including technical or scientific terms, may have the same meanings as commonly understood by those of ordinary skill in the art to which this document pertains. Among the terms used in this document, terms defined in general dictionaries may be interpreted as having meanings identical or similar to those they have in the context of the related art, and are not to be interpreted in an ideal or excessively formal sense unless explicitly so defined in this document. In some cases, even terms defined in this document cannot be interpreted to exclude embodiments of this document.
The features of the various embodiments of the present invention may be partially or wholly combined with one another, and, as those skilled in the art will fully understand, various technical interworking and operation are possible; the embodiments may be practiced independently of each other or may be practiced together in association with one another.
Hereinafter, various embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is an exemplary diagram for explaining a 5G-based smart factory according to an embodiment of the present invention.
Referring to FIG. 1, the 5G-based smart factory builds a manufacturing-specific network infrastructure in which 5G (fifth generation) and TSN (Time Sensitive Network) technologies are converged, and, based on this, a cloud-oriented machine learning platform and three demonstration services for technology verification, namely machine vision, a multi-function robot, and an AR service, are built and demonstrated. Industry 5G is factory-specialized industrial 5G for ultra-low latency, ultra-high speed, and hyper-connectivity, and the ML (Machine Learning) Cloud analyzes the large amounts of data required for the manufacturing process based on machine learning. The demo site enables education, joint research, and demonstration, and Machine Vision enables automatic quality analysis through image analysis. The Multi-Function Robot enables work and logistics collaboration within the factory, and the AR service enables work orders, information delivery, and manual confirmation.
FIG. 2 is an exemplary diagram for explaining various 5G-based smart factory systems according to an embodiment of the present invention.
Referring to FIG. 2, the 5G-based smart factory system includes an ultra-low-latency/high-capacity network, a cloud machine learning platform (Cloud ML Platform), and three demonstration services.
The ultra-low-latency/high-capacity network includes Pre-5G and Multi-access Edge Computing. Pre-5G includes Pre-5G application, commercial 5G verification, and 5G equipment production. Multi-access Edge Computing includes TSN development, 5G MEC interworking, and interworking with standardized 5G equipment.
The cloud machine learning platform includes an Edge Cloud and a 5G Industry Cloud. The 5G Industry Cloud includes GPU infrastructure construction, UI/UX development, sensor data interworking, and an integrated cloud.
The three demonstration services include Vision, Robot, and AR. Vision includes vision construction, 3D vision development, and general-purpose core development. Robot includes AMR production, vision + AMR integration, and 5G-interworking equipment. AR includes first prototype fabrication, equipment location identification, and AR + AMR.
FIG. 3 is an exemplary diagram for explaining a 5G network according to an embodiment of the present invention.
Referring to FIG. 3, the 5G network combines wireless 5G and wired TSN to form an ultra-low-latency, ultra-high-speed, high-capacity data transmission network infrastructure. Through this, the present invention can be used for control data for which real-time performance is important and for image information transmission services for vision inspection that require large-capacity data transmission.
The 5G network includes a Pre-5G (60 GHz band) transmitting terminal that interworks with production/logistics equipment such as the AMR, bin picking robot, and machine vision, and a software interface is provided for interworking between the Pre-5G receiver and the MEC equipment. In addition, an industrial TSN prototype is provided.
The 5G network provides an external interface between the AMR, bin picking robot, and machine vision equipment and a commercial 5G modem, and a computing node is built for implementing service-side MEC in the 5G base station (central office). In addition, a TSN-based industrial profile is provided, including improved forwarding, time synchronization, and queue control functions.
The 5G network includes production/logistics equipment (AMR, bin picking robot, machine vision) based on 5G wireless communication, a commercial 5G modem is embedded in the production/logistics equipment, and interworking verification between the TSN equipment and the carrier's commercial 5G standard equipment is provided.
Through such a 5G network, low-latency packet forwarding technology, low-latency transmission technology for supporting mission-critical services, and a time synchronization function between TSN devices can be provided. In addition, a reliable packet delivery function can be implemented for services sensitive to the packet loss ratio, and a transmission quality management solution for supporting low-latency services can be built. Furthermore, service characteristics can be identified through quality measurement of the control packets and data packets of each smart factory application, and a TSN switch prototype based on a commercial network switching chipset can be manufactured.
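As an illustration of the kind of per-application quality measurement described above (control packets versus data packets), the minimal sketch below measures round-trip latency with small UDP probes. It assumes a UDP echo responder is available at the measured endpoint, which is not specified in the source.

```python
import socket
import statistics
import time

def measure_udp_rtt(host: str, port: int, payload: bytes = b"probe", samples: int = 100):
    """Send small UDP probes and report mean / p95 / max RTT in milliseconds.
    Assumes the target echoes each datagram back (hypothetical echo responder)."""
    rtts = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(1.0)
        for _ in range(samples):
            start = time.perf_counter()
            sock.sendto(payload, (host, port))
            sock.recvfrom(2048)  # wait for the echoed datagram
            rtts.append((time.perf_counter() - start) * 1000.0)
    rtts.sort()
    return {
        "mean_ms": statistics.mean(rtts),
        "p95_ms": rtts[int(len(rtts) * 0.95) - 1],
        "max_ms": rtts[-1],
    }

# Example: compare a small "control" packet profile with a larger "data" packet profile.
# print(measure_udp_rtt("192.168.0.50", 9000, payload=b"\x00" * 64))
# print(measure_udp_rtt("192.168.0.50", 9000, payload=b"\x00" * 1400))
```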
Through this, the present invention enables the introduction of ultra-high-speed/ultra-low-latency network infrastructure in small and medium-sized factories by applying commercial-network-based 5G wireless communication and TSN-based wired communication technology, and the application of various smart factory services based thereon.
The present invention enables the development and demonstration of an end-to-end real-time network in a form suitable for commercial service by combining 5G wireless infrastructure with wired TSN technology, and the development and application of an Ethernet-based industrial TSN profile to support remote control services such as AMR control.
The present invention makes it easy to secure an ultra-high-speed/ultra-low-latency network base for small and medium-sized factories by utilizing commercial communication infrastructure, minimizes dependence on foreign solutions by replacing the wire-based domestic manufacturing equipment control networks dominated by global companies with a 5G wireless network, and facilitates wide deployment through interworking with highly versatile global standard communication technologies.
FIGS. 4 to 6 are exemplary diagrams for explaining a machine learning platform according to an embodiment of the present invention.
Referring to FIG. 4, the machine learning platform is specialized for industries in which real-time performance is important, and provides real-time processing through a 5G-network-based edge cloud located in the factory together with a machine learning training cloud located in a central IDC, offered in SaaS form. This enables integrated process analysis of the data generated within the factory.
The machine learning platform provides a machine learning platform based on GPU development verification and enables a cloud design suitable for manufacturing environments. The machine learning platform can provide an analysis of the manufacturing environment for each major target customer.
The machine learning platform enables the design and development of a cloud portal and the development of a portal UI/GUI through which users can check their data.
The machine learning platform can provide an integrated analysis and monitoring platform based on standard OPC adoption and large-capacity data processing.
The machine learning platform can be linked with three additional types of interworking equipment. Three additional types of interworking data (vibration sensor, temperature sensor, thermal imaging) are selected, and REST API interworking with solutions already owned by the participating companies is possible.
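A minimal sketch of pushing one of the three additional sensor types to the platform over a REST API follows; the endpoint URL and JSON schema are assumptions, since the source does not define the interworking API.

```python
import time

import requests

# Hypothetical ingestion endpoint of the cloud machine learning platform.
INGEST_URL = "http://ml-platform.example.local/api/v1/sensor-readings"

def push_reading(sensor_id: str, sensor_type: str, value: float) -> None:
    """Post a single vibration/temperature/thermal reading as JSON."""
    payload = {
        "sensor_id": sensor_id,
        "type": sensor_type,          # "vibration" | "temperature" | "thermal"
        "value": value,
        "timestamp": time.time(),
    }
    response = requests.post(INGEST_URL, json=payload, timeout=2.0)
    response.raise_for_status()

# Example: forward a vibration RMS value measured at a press line.
# push_reading("press-01-vib", "vibration", 0.42)
```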
The machine learning platform can apply a filtering algorithm by distinguishing meaningful data among the additional input data. Analysis of the structure and functions of the existing software, derivation of improvements and priorities for the existing software, and review of UI scenarios according to the addition of new functions are possible.
The machine learning platform enables cloud system performance optimization and verification, and can perform verification and comparison work for stability and performance optimization.
The machine learning platform enables the development and verification of failure analysis algorithms, including failure analysis algorithm design and platform development.
The machine learning platform is capable of integrating two or more types of factory data and analyzing their correlations. The machine learning platform enables interworking with two existing legacy systems (MES, ERP, etc.), the development of correlation analysis algorithms, and linked analysis of quality data and integrated process data.
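The linked analysis of quality data and process data could look like the following pandas sketch; the join key and column names are invented for illustration, since the actual MES/ERP schemas are not given in the source.

```python
import pandas as pd

# Hypothetical extracts from legacy systems and from the vision inspection results.
process_df = pd.DataFrame({
    "lot_id": ["L001", "L002", "L003", "L004"],
    "line_speed": [102.0, 98.5, 110.2, 95.3],      # from MES
    "spindle_temp": [61.2, 58.9, 66.4, 57.1],      # from equipment sensors
})
quality_df = pd.DataFrame({
    "lot_id": ["L001", "L002", "L003", "L004"],
    "defect_rate": [0.012, 0.008, 0.021, 0.006],   # from machine vision inspection
})

# Link quality data with integrated process data and compute pairwise correlations.
merged = process_df.merge(quality_df, on="lot_id")
correlation = merged[["line_speed", "spindle_temp", "defect_rate"]].corr(method="pearson")
print(correlation)
```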
Referring to FIG. 5, the machine learning platform can provide a scale-out function of the platform infrastructure for large-capacity data processing, provision of TensorFlow as a GPU-based machine learning framework, VM provisioning and monitoring functions, and an SQS-compatible queue service.
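Because the queue service is described as SQS-compatible, an ordinary SQS client can in principle be pointed at it. The sketch below uses boto3 with a custom endpoint to enqueue and consume a training job; the endpoint URL, credentials, and message format are assumptions.

```python
import json

import boto3

# Point a standard SQS client at the platform's SQS-compatible endpoint (hypothetical URL).
sqs = boto3.client(
    "sqs",
    endpoint_url="http://mlcloud.example.local:9324",
    region_name="us-east-1",
    aws_access_key_id="dummy",
    aws_secret_access_key="dummy",
)

queue_url = sqs.create_queue(QueueName="vision-training-jobs")["QueueUrl"]

# Enqueue a training job that a GPU worker will pick up.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"dataset": "press_line_images_v3", "model": "densenet121"}),
)

# A worker polls the queue and starts training for each message it receives.
messages = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=5)
for message in messages.get("Messages", []):
    job = json.loads(message["Body"])
    print("starting training job:", job)
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```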
Referring to FIG. 6, the machine learning platform may build SMIC, a cloud platform development prototype, and provide a cloud portal configuration screen UI/UX.
Accordingly, the machine learning platform moves beyond the existing wire-based interworking and storage of data and simple statistical prediction systems, and can handle 5G-based transmission of the large-capacity real-time data required for the manufacturing process, quality inspection services utilizing machine learning results, robot control, and the like.
Through this, the present invention can expand the types of data that can be collected and improve the collection speed through 5G interworking.
The present invention can expand the range of data that can be analyzed in the manufacturing field based on an infrastructure capable of processing large-capacity, real-time data.
The present invention can support the application of function-specific customized software packages for interworking with additional applications, and can lay the foundation for various algorithms and correlation analyses through integrated analysis of image data from which quality can be determined, production process data, and the like.
The present invention can establish interworking standardization for general-purpose machine learning in the future, and can reduce the infrastructure costs of processing and building large-capacity manufacturing data so that the smart factory initiative pursued at the national level can proceed smoothly.
The present invention blocks the outflow of domestic technology to foreign clouds, and enables regular audits through certification examinations of related service companies that can be managed and supervised under government leadership.
The present invention enables the creation of domestic jobs and the securing of software competitiveness in related businesses, the securing of domestic cloud technology suitable for the domestic environment and the provision of services, and the securing of source technology and competitiveness in smart factory solutions.
FIGS. 7 and 8 are exemplary diagrams for explaining a machine vision system according to an embodiment of the present invention.
Referring to FIG. 7, the machine vision system can provide a quality inspection automation solution that photographs products with a high-resolution camera connected to 5G and determines in real time whether mass-produced products are defective using an algorithm trained through machine learning.
The machine vision system can provide a machine vision service based on the 5G machine learning cloud platform.
The machine vision system can provide related services by mounting the machine vision solution on the cloud industry platform. Such a service may be a service for collecting large-capacity, high-quality quality-control image data and applying it to an automobile parts production process to demonstrate quality control automation.
The machine vision system can perform performance verification through demonstration in the automobile parts industry. Such a machine vision system can identify the required model and perform performance verification by applying as-is and to-be models to samples at a model factory. The machine vision system can take into account the sensors and data collection status applied to the equipment, surrounding environmental variables, and the like, and establish a basis for verification.
The machine vision system can provide a large-capacity 3D image inspection algorithm. The machine vision system can provide high-speed preprocessing technology for large-capacity, high-quality 3D image data and a high-accuracy machine vision 3D image quality inspection algorithm.
The machine vision system can be applied to a demonstration model to verify the performance of large-capacity 3D image machine vision. The machine vision system can identify demonstration targets that require high-capacity image-based quality inspection and perform performance verification by applying as-is and to-be models to samples at the factory.
The machine vision system can provide a general-purpose industry machine vision core platform. The machine vision system can collect relevant data in stages and advance the image detection algorithm. The machine vision system can provide a general-purpose algorithm capable of determining the quality of defect types other than the previously learned parts. The machine vision system provides a method of building an initial model with a small amount of data and then continuously learning from new (incremental) data. The training time is short and the accuracy can be comparable.
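The patent does not specify the incremental-learning scheme, so the following sketch only illustrates the general idea, namely fitting an initial model on a small labeled set and then updating the same model as new data arrives, using scikit-learn's partial_fit on precomputed image feature vectors as a stand-in for the actual transfer/continual learning method.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

# Hypothetical feature vectors extracted from inspection images (e.g., by a CNN backbone).
X_initial = rng.normal(size=(50, 128))    # small initial dataset
y_initial = rng.integers(0, 2, size=50)   # 0 = OK, 1 = defect

model = SGDClassifier(random_state=0)
model.partial_fit(X_initial, y_initial, classes=[0, 1])  # build the initial model

# As new (incremental) inspection data arrives, keep updating the same model
# instead of retraining from scratch.
for _ in range(10):
    X_new = rng.normal(size=(16, 128))
    y_new = rng.integers(0, 2, size=16)
    model.partial_fit(X_new, y_new)

print("predicted labels:", model.predict(rng.normal(size=(3, 128))))
```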
Accordingly, the machine vision system can provide a general-purpose machine vision core that can be extended to all areas of the manufacturing industry by standardizing and lightening expensive high-tech solutions, such as vision inspection in the semiconductor field, and developing them in SaaS form.
Through this, the present invention can provide an optimized deep learning analysis technique by applying ultra-expensive vision inspection, which has been limited to semiconductors, to the automobile parts quality inspection process that is the backbone of the domestic manufacturing industry.
The present invention can improve solution processing speed by utilizing a 5G network, and can reduce dependence on quality experts by automating quality control through machine vision.
The present invention enables learning with a small amount of data through transfer and continual learning, making it easy for small and medium-sized enterprises to adopt the technology, and can provide an industry machine vision cloud service connected over a 5G network.
The present invention lowers the entry barrier for initial construction and enables rapid use of the service, enables basic analysis and improvement without dedicated ICT personnel, and can be expected to improve the productivity and quality competitiveness of mid-sized and small manufacturers.
Looking at this in more detail, when a product is photographed with a high-resolution camera connected to 5G in the smart factory, the captured high-resolution image is transmitted to the edge cloud over 5G. In this case, an ultra-high-speed, ultra-low-latency 5G wireless network may be built inside the smart factory.
The machine-learning-based vision cloud (ML Vision Cloud) receives very large volumes of product images from the edge cloud and uses them as input to train and advance the machine learning algorithm. In other words, real-time verification and real-time quality measurement are possible using the edge cloud. Here, the machine learning algorithm is an algorithm for defect determination and can be updated through training on these product images. Using this algorithm, the machine-learning-based vision cloud determines whether products are defective and delivers normal quantity/information data and defect cause data to the smart factory.
In other words, the machine vision system collects large-capacity, high-quality 3D image data from the production process and stores the collected data in a distributed storage environment. Here, the distributed storage is based on GPU-based distributed machine learning, serves as storage for scale-out, and can perform computing resource scheduling.
The machine vision system can use the above-described algorithm to evaluate the total vision judgment time, which is the time taken to receive a product image from the production line and determine whether it is defective; the classification accuracy (sensitivity, true positive rate), which is the percentage of cases in which items with actual defects among all judged products are judged as defective; and the classification accuracy (specificity, true negative rate), which is the rate at which items with no actual defects are predicted to have no defects. In other words, the total judgment time may mean the total elapsed time calculated based on the difference between the time stamp at which the product photograph was taken and the time at which the final judgment was made through the machine vision service. In addition, the sensitivity may mean the number of actual defective products relative to the total number of products judged defective by machine vision.
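Under the standard definitions named in the parentheses above (true positive rate and true negative rate), the three reported quantities could be computed as in this minimal sketch; the timestamps and labels are placeholder data.

```python
from sklearn.metrics import confusion_matrix

# Placeholder per-product records: capture timestamp, final-judgment timestamp (seconds),
# ground-truth label, and the machine vision judgment (1 = defect, 0 = OK).
capture_ts  = [0.00, 1.00, 2.00, 3.00, 4.00]
judgment_ts = [0.35, 1.32, 2.41, 3.30, 4.38]
y_true      = [1, 0, 1, 0, 0]
y_pred      = [1, 0, 0, 0, 1]

# Total judgment time: difference between photo capture and the final machine vision judgment.
total_judgment_time = sum(j - c for c, j in zip(capture_ts, judgment_ts))

tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
sensitivity = tp / (tp + fn)   # true positive rate: actual defects judged as defects
specificity = tn / (tn + fp)   # true negative rate: non-defects judged as non-defects

print(f"total judgment time: {total_judgment_time:.2f} s")
print(f"sensitivity (TPR): {sensitivity:.2f}, specificity (TNR): {specificity:.2f}")
```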
Through this, the present invention collects large-capacity, high-quality quality-control image data and applies a machine vision service using it to the automobile parts production process, enabling quality control automation. In addition, the present invention can provide a general-purpose algorithm capable of quality determination for defect types other than the previously learned parts by collecting relevant data in stages and advancing the image detection algorithm, and can provide an algorithm with a short training time and high accuracy by building an initial model with a small amount of data and then continuously learning from new data.
In various embodiments, the present invention can provide an optimized deep learning analysis technique by applying ultra-expensive vision inspection, which has been limited to semiconductors, to the automobile parts quality inspection process that is the backbone of the domestic manufacturing industry.
In various embodiments, the present invention can improve solution processing speed by utilizing a 5G network.
In various embodiments, the present invention can reduce dependence on quality experts by automating quality control through machine vision.
In various embodiments, the present invention facilitates technology adoption by small and medium-sized enterprises by using an algorithm capable of learning with a small amount of data through transfer/continual learning.
In various embodiments, the present invention uses a machine vision cloud connected over a 5G network to lower the entry barrier for initial construction and enable rapid use of the service.
In various embodiments, the present invention enables basic analysis and improvement without dedicated ICT personnel.
In various embodiments, the present invention can improve the productivity and quality competitiveness of mid-sized and small manufacturers.
In various embodiments, the present invention enables mid-sized and small automobile parts manufacturers that lack IT specialists to easily introduce and use advanced facilities through the demonstration of a cloud-based machine vision service.
In various embodiments, the present invention can increase productivity and price competitiveness through the automation of quality inspection, a simple repetitive task, allowing manufacturers to focus on currently active technologies such as eco-friendly vehicles and autonomous driving, and, through the flexible production demonstration, can respond proactively to a future automobile market that requires high-mix, low-volume production.
Referring to FIG. 8, the machine vision system establishes a detection model by defining the problem, for example whether defects such as scratches, bubbles, dents/press marks, and foreign matter/chips occur due to physical impact during the process, and by performing problem analysis such as training a model for each segmented region, reducing the amount of information to be learned by exploiting similar backgrounds, and enlarging the data set through part segmentation.
The machine vision system performs defect learning using a DenseNet-based CNN model.
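A minimal sketch of a DenseNet-based defect classifier follows; the source does not give the exact architecture or training setup, so this uses torchvision's densenet121 with a two-class head (OK/defect) and a single illustrative training step on dummy data (requires torchvision >= 0.13 for the weights argument, and an internet connection to download pretrained weights).

```python
import torch
import torch.nn as nn
from torchvision import models

# DenseNet backbone pretrained on ImageNet, with the classifier replaced by a 2-class head.
model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
model.classifier = nn.Linear(model.classifier.in_features, 2)  # 0 = OK, 1 = defect

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a dummy batch of 224x224 RGB inspection patches.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print("training loss:", float(loss))
```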
The machine vision system performs defect determination using the defect determination model trained in advance in this way.
In various embodiments, the machine vision system may provide an interface module for a 5G-based machine vision service. Such a machine vision service may be built by mounting a wire-based machine vision solution on the cloud machine learning platform.
Through this, the present invention can design a determination model and verify its performance by applying the to-be model to a demonstration factory. In addition, the present invention can establish a basis for verification in consideration of the sensors and data collection status applied to the equipment, surrounding environmental variables, and the like. Furthermore, the present invention can minimize changes to the existing production line, select targets centering on lines where automated quality inspection is highly effective, and support improvement effect information by providing performance reports.
FIG. 9 is an exemplary diagram for explaining a multi-function robot system according to an embodiment of the present invention.
Referring to FIG. 9, the multi-function robot system can recognize various atypical objects and then perform bin picking and logistics transfer operations using a multi-function robot that integrates a 3D robot vision system with 5G wireless communication, a 6-axis collaborative robot arm, and an AMR.
Such a multi-function robot system includes hardware integrating a 3D vision sensor and a 6-axis collaborative robot arm. The multi-function robot system enables the detailed design and implementation of an AMR for transfer and work for factory automation. The multi-function robot system can provide edge computing module design and prototyping.
The multi-function robot system provides an integrated system of a 5G module, a 3D vision sensor, a 6-axis collaborative robot arm, and an AMR. The multi-function robot system can be applied to a transfer and work AMR test bed for factory automation.
The multi-function robot system can advance the integrated system of the 5G module, 3D vision sensor, 6-axis collaborative robot arm, and AMR, and can be applied to AMR mass-production sites for transfer and work for factory automation.
Accordingly, the multi-function robot system can expand the application area of 3D vision solutions by combining an advanced industrial 3D scanner with an AMR and a robot arm, and can preempt the multi-function robot market in the form of 'AMR + robot arm + various solutions.'
Through this, the present invention can eliminate the physical scanning operation by applying structured light projection to minimize the 3D image acquisition time.
The present invention can transmit large-capacity images over 5G and perform the related image processing in the cloud.
The present invention develops a mobile solution that combines a 6-axis collaborative robot and an AMR, in contrast to existing fixed solutions, and can thereby secure the flexibility to be applied to various field environments.
FIGS. 10 and 11 are exemplary diagrams for explaining a manufacturing facility management AR system according to an embodiment of the present invention.
Referring to FIG. 10, the manufacturing facility management AR system uses 5G to augment in-factory facility status and IoT sensor information on an AR device in real time, and, by building a manual-support augmentation service, can provide rapid information confirmation by field personnel and a remote augmentation support service.
The manufacturing facility management AR system uses real-time information augmentation technology linked with on-site manufacturing robots. The manufacturing facility management AR system can augment the manufacturing robot remote controller manual on a mobile device. The manufacturing facility management AR system can augment real-time robot status information in connection with the edge server. The manufacturing facility management AR system enables equipment-location-linked analysis based on the manufacturing robot map. The manufacturing facility management AR system can analyze and organize the managed location map data from the manufacturing robot.
In the manufacturing facility management AR system, real-time information augmentation technology linked with on-site manufacturing robots may be used. The manufacturing facility management AR system can augment the manufacturing robot remote controller manual on a glasses-type device. The manufacturing facility management AR system can augment robot vibration status information in connection with the edge server. The manufacturing facility management AR system may use equipment location identification and augmentation technology based on the manufacturing robot map. The manufacturing facility management AR system can provide an augmentation service linked with the location map from the manufacturing robot.
In the manufacturing facility management AR system, technologies for augmented inquiry of on-site logistics information and remote maintenance support for manufacturing robots may be used. The manufacturing facility management AR system can provide a real-time logistics information inquiry augmentation function. The manufacturing facility management AR system can provide a technical service that utilizes a 5G demonstration video call solution and supports remote interworking.
Although attempts to utilize augmented reality technology for manufacturing maintenance at industrial sites continue at home and abroad, it has not yet become common; therefore, the manufacturing facility management AR system can preempt the market through the development of highly versatile OS-based technology and expand into the fields of remote equipment maintenance and logistics loading information inquiry.
More specifically, referring to FIG. 11, the manufacturing facility management AR system enables marker-recognition-based identification and tracking by attaching a unique marker (AR tag) to a manufacturing robot device through manufacturing robot device identification and recognition technology.
The manufacturing facility management AR system can provide real-time inquiry and augmentation of the status values of equipment linked to the edge cloud server through manufacturing robot status information inquiry and augmentation/tracking technology.
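Marker (AR tag) recognition plus status augmentation could be sketched as below with OpenCV's ArUco module (requires opencv-contrib-python; the detector API differs slightly in OpenCV >= 4.7) and a REST call to the edge server; the marker-to-robot mapping, the endpoint URL, and the status field names are assumptions, not part of the source.

```python
import cv2
import requests

# Hypothetical mapping from marker IDs attached to robots to robot identifiers.
ROBOT_BY_MARKER = {7: "robot-arm-03"}
EDGE_STATUS_URL = "http://edge.example.local/api/robots/{robot}/status"  # hypothetical

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
parameters = cv2.aruco.DetectorParameters_create()  # cv2.aruco.DetectorParameters() on >= 4.7

frame = cv2.imread("factory_view.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=parameters)

if ids is not None:
    for marker_id, corner in zip(ids.flatten(), corners):
        robot = ROBOT_BY_MARKER.get(int(marker_id))
        if robot is None:
            continue
        # Fetch the robot's real-time status from the edge cloud server and overlay it.
        status = requests.get(EDGE_STATUS_URL.format(robot=robot), timeout=1.0).json()
        x, y = corner[0][0]  # first corner point of the detected marker
        cv2.putText(frame, f"{robot}: {status.get('state', 'n/a')}",
                    (int(x), int(y) - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

cv2.imwrite("augmented_view.jpg", frame)
```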
The manufacturing facility management AR system can provide vision-recognition- and tracking-based manual augmentation and an interface for the manufacturing robot remote controller through manufacturing robot augmented manual production and interaction.
The manufacturing facility management AR system enables edge-linked inquiry and rendering of manufacturing robot vibration sensor data through manufacturing robot vibration sensor inquiry and augmentation technology.
Through this, the present invention can configure the service by utilizing marker-based image recognition and the ARCore plane detection technology of recent Android devices, rather than spatial-recognition-based augmented authoring and plane recognition technology.
In the present invention, the main device secures universality by utilizing a tablet equipped with a recent Google OS rather than Google Tango, and the UI charts required for augmentation can be produced to suit the site and purpose.
The present invention facilitates market expansion through highly versatile OS-based development, and can be expanded to fields requiring remote equipment maintenance (elevators, tractors, etc.) and to logistics loading information inquiry.
The present invention enables the creation of new jobs using non-specialist personnel through the accumulation and transfer of know-how.
FIG. 12 is an exemplary diagram for explaining a flexible production line system according to an embodiment of the present invention.
Referring to FIG. 12, the flexible production line system can build a 5G-network-based flexible production test bed by developing linkages between 5G and industrial interfaces such as OPC UA and industrial Ethernet, and by developing an integrated process control system.
The flexible production line system uses Pre-5G and industrial Ethernet linkage technology, and can improve server-client OPC UA into a PubSub structure. The flexible production line system can provide process-integrated connection technology that delivers Pre-5G-based process data.
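On the device-data side, reading a process value over OPC UA can be as simple as the following sketch using the open-source python-opcua library; the server endpoint and node identifier are placeholders, and the PubSub extension described above is not part of this minimal client example.

```python
from opcua import Client  # python-opcua (synchronous client)

# Placeholder endpoint of a line controller exposing process data over OPC UA.
ENDPOINT = "opc.tcp://192.168.10.20:4840/freeopcua/server/"

client = Client(ENDPOINT)
client.connect()
try:
    # Placeholder node identifier of a process variable (e.g., conveyor speed).
    node = client.get_node("ns=2;s=Line1.Conveyor.Speed")
    value = node.get_value()
    print("conveyor speed:", value)
finally:
    client.disconnect()
```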
The flexible production line system provides a 5G interface for smart process equipment and can operate a demo line for verifying smart process equipment on a commercial 5G network.
The flexible production line system can provide an interoperability test bed for flexible production in smart factories, and can provide an integrated control system and a supervisory system for flexible production.
Accordingly, the flexible production line system can strengthen domestic competitiveness in the modular factory equipment field occupied by global companies by grafting 5G technology onto the SBB (Smart Base Block) platform and developing mobile, module-combination field automation line technology.
Through this, the present invention applies 5G to the SBB core technology to provide mobile, module-combination field automation test bed line technology supported by wireless low latency, and can provide high-speed streaming linkage technology for OPC UA to secure real-time performance of factory device control data by applying 5G technology.
The present invention enables equipment and device manufacturers, engineering companies, system providers, and the like to produce intelligent equipment, devices, and solutions that meet the requirements of the Industrial Internet and Industry 4.0 by utilizing OPC UA and 5G technology.
The present invention can secure compatibility of manufacturing equipment information by automating the mapping and symbol conversion work required to apply the manufacturer-specific information, control logic, symbols, and the like of existing equipment and controllers to IT solutions.
도 13은 본 발명의 실시예에 따른 5G 기반 생산, 물류 관리 및 클라우드향 머신 비전 서비스 제공 장치를 설명하기 위한 블록도이다.13 is a block diagram illustrating an apparatus for providing a 5G-based production, logistics management, and cloud-oriented machine vision service according to an embodiment of the present invention.
도 13을 참조하면, 5G 기반 생산, 물류 관리 및 클라우드향 머신 비전 서비스 제공 장치(1300)는 통신부(1310), 사용자 입력부(1320), 출력부(1330), 메모리(1340), 인터페이스부(1350), 제어부(1360) 및 전원 공급부(1370) 등을 포함할 수 있다. 도 13에 도시된 구성요소들이 필수적인 것은 아니어서, 그보다 많은 구성요소들을 가지거나 그보다 적은 구성요소들을 갖는 장치가 구현될 수도 있다.Referring to FIG. 13 , an apparatus 1300 for providing a 5G-based production, logistics management and cloud-oriented machine vision service includes a communication unit 1310 , a user input unit 1320 , an output unit 1330 , a memory 1340 , and an interface unit 1350 . ), a control unit 1360 and a power supply unit 1370 , and the like. Since the components shown in FIG. 13 are not essential, an apparatus having more or fewer components may be implemented.
이하, 상기 구성요소들에 대해 차례로 살펴본다.Hereinafter, the components will be described in turn.
The communication unit 1310 may include one or more modules that enable wired/wireless communication between the apparatus and the network in which the apparatus is located. The communication unit 1310 transmits and receives signals to and from at least one of an external device and a server over a communication network such as the Internet. The signals may include various types of data. The communication unit 1310 may receive a product image file captured by a high-resolution camera.
The user input unit 1320 generates input data for the user to control the operation of the apparatus. The user input unit 1320 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.
The output unit 1330 generates output related to sight, hearing, or touch, and may include a display unit 1331, a sound output module 1332, and the like.
The display unit 1331 displays (outputs) information processed by the apparatus. For example, the apparatus displays a user interface (UI) or graphical user interface (GUI) related to the system.
The display unit 1331 may include at least one of a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.
The sound output module 1332 may output audio data received from the communication unit 1310 or stored in the memory 1340. The sound output module 1332 also outputs sound signals related to functions performed by the apparatus.
The memory 1340 may store programs for processing and control by the controller 1360 and may temporarily store input/output data. The memory 1340 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The apparatus may also operate in connection with web storage that performs the storage function of the memory 1340 over the Internet.
The interface unit 1350 serves as a passage to all external devices connected to the apparatus. The interface unit 1350 receives data or power from an external device and delivers it to each component inside the apparatus, or allows data inside the apparatus to be transmitted to an external device. For example, the interface unit 1350 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.
The controller 1360 typically controls the overall operation of the apparatus. For example, the controller 1360 may transmit control data for which real-time performance is important, as well as large-capacity data, over an ultra-low-latency, ultra-high-speed, large-capacity network, and may process services such as machine-learning-based quality inspection and robot control. The controller 1360 may photograph a product with a high-resolution camera connected to the network and determine whether the product is defective using an algorithm trained through machine learning. The controller 1360 may provide a factory automation service using a multi-function robot in which the network, a robot vision system, a 6-axis collaborative robot arm, and an AMR are integrated. The controller 1360 may augment facility status and sensor information in the factory onto an AR device in real time and provide a manual support augmentation service. The controller 1360 may control a 5G-based flexible production test bed.
The controller 1360 may also include a graphics module 1361 for parallel data processing. The graphics module 1361 may be implemented within the controller 1360 or separately from the controller 1360.
The power supply unit 1370 receives external power and internal power under the control of the controller 1360 and supplies the power required for the operation of each component.
The various embodiments described herein may be implemented in a computer-readable recording medium using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions. In some cases, the embodiments described herein may be implemented by the controller 1360 itself.
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code may be implemented as a software application written in a suitable programming language, stored in the memory 1340, and executed by the controller 1360.
FIG. 14 is a flowchart illustrating a method for providing 5G-based production, logistics management, and cloud-oriented machine vision services according to an embodiment of the present invention.
Referring to FIG. 14, the apparatus for providing 5G-based production, logistics management, and cloud-oriented machine vision services transmits control data for which real-time performance is important, as well as large-capacity data, over an ultra-low-latency, ultra-high-speed, large-capacity network, and processes services such as machine-learning-based quality inspection and robot control (S1400).
The apparatus for providing 5G-based production, logistics management, and cloud-oriented machine vision services photographs a product with a high-resolution camera connected to the network and determines whether the product is defective using an algorithm trained through machine learning (S1410).
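Step S1410 can be illustrated, purely as an assumption since no particular model is specified, by a trained image classifier applied to each captured frame; the PyTorch sketch below uses a generic ResNet-18 backbone with a hypothetical weight file, input size, and defect threshold.

```python
# Minimal sketch of ML-based defect determination (S1410); the weight file,
# input size, and decision threshold are hypothetical assumptions.
import torch
import torchvision.transforms as T
from torchvision.models import resnet18
from PIL import Image

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = resnet18(num_classes=2)                       # classes: {ok, defect}
model.load_state_dict(torch.load("defect_model.pt", map_location="cpu"))
model.eval()


def is_defective(image_path: str, threshold: float = 0.5) -> bool:
    """Return True when the defect-class probability exceeds the threshold."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)            # shape (1, 3, 224, 224)
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1)
    return probs[0, 1].item() > threshold


print(is_defective("captured_product.png"))
```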
The apparatus for providing 5G-based production, logistics management, and cloud-oriented machine vision services provides a factory automation service using a multi-function robot in which the network, a robot vision system, a 6-axis collaborative robot arm, and an AMR are integrated (S1420).
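Step S1420 may be sketched, again only as an assumed coordination pattern, as a loop in which the AMR drives to a station, the robot vision system localizes a part, and the 6-axis collaborative arm performs a pick-and-place; the Amr, VisionSystem, and RobotArm interfaces below are hypothetical and do not correspond to any real SDK.

```python
# Purely illustrative orchestration of the multi-function robot (S1420);
# Amr, VisionSystem, and RobotArm are hypothetical interfaces, not a real SDK.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float


class Amr:
    def drive_to(self, station: str) -> None:
        print(f"AMR moving to {station}")


class VisionSystem:
    def locate_part(self) -> Pose:
        # In practice the 3D vision sensor would return the measured pose.
        return Pose(0.42, -0.10, 0.05)


class RobotArm:
    def pick(self, pose: Pose) -> None:
        print(f"Arm picking at ({pose.x}, {pose.y}, {pose.z})")

    def place(self, station: str) -> None:
        print(f"Arm placing part at {station}")


def run_pick_and_place(stations: list[str]) -> None:
    amr, vision, arm = Amr(), VisionSystem(), RobotArm()
    for station in stations:
        amr.drive_to(station)            # mobility provided by the AMR base
        pose = vision.locate_part()      # robot vision localizes the part
        arm.pick(pose)                   # 6-axis collaborative arm picks it
        arm.place("outbound_tray")


run_pick_and_place(["inspection_cell", "packing_cell"])
```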
The apparatus for providing 5G-based production, logistics management, and cloud-oriented machine vision services augments facility status and sensor information in the factory onto an AR device in real time and provides a manual support augmentation service (S1430).
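For step S1430, one plausible (assumed) transport for pushing facility status and sensor readings toward an AR client is an MQTT topic published via the paho-mqtt library; the broker address, topic name, and payload fields below are hypothetical.

```python
# Minimal sketch of publishing facility/sensor state for AR overlay (S1430);
# broker address, topic, and payload fields are hypothetical assumptions.
import json
import time

import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("192.168.0.20", 1883)       # hypothetical on-premise MQTT broker
client.loop_start()

for _ in range(10):                        # publish ten sample updates in this sketch
    payload = {
        "equipment_id": "robot_cell_01",
        "state": "RUNNING",                # facility status shown in the AR overlay
        "spindle_temp_c": 41.7,            # sensor value augmented next to the machine
        "timestamp": time.time(),
    }
    client.publish("factory/ar/overlay", json.dumps(payload), qos=1)
    time.sleep(1.0)                        # refresh the overlay about once per second

client.loop_stop()
client.disconnect()
```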
The apparatus for providing 5G-based production, logistics management, and cloud-oriented machine vision services controls a 5G-based flexible production test bed (S1440).
Through this, the present invention can secure interworking and compatibility in advance by applying the technology in stages, taking into account the timing of domestic and international 5G technology standardization and commercialization.
The present invention can minimize technical limitations in service implementation by applying ultra-low-latency technology to each network section.
The present invention can secure technological versatility through a design that fits commercial network structures.
The present invention can establish communication and overall environment verification for cloud adoption.
The present invention can stably store and process various heterogeneous data.
The present invention can promote efficient service development through role sharing among the consortium participants.
The present invention enables technology reuse and rapid system construction through the machine vision service.
The present invention can produce concrete results through close collaboration with the organizations participating in the demonstration.
The present invention enables selection of a collaborative robot for AMR mounting and selection of a 3D vision sensor through joint testing.
The present invention enables the design of a dedicated AMR for mounting a collaborative robot and the development of an electronics unit for a 3D vision sensor.
The present invention enables sharing of the real-time status and map information of a manufacturing robot.
The present invention can implement ultra-low network latency through connection with an edge computing device.
The present invention can utilize a glasses-type HMD device in consideration of the operator's convenience in the field.
The present invention can provide an augmentation service for the status information and manuals of a manufacturing robot.
The present invention displays sensor data by incorporating IoT technology and also enables inquiry of logistics loading information.
The present invention enables an integrated test environment and equipment interworking tests, and can reflect the voice of the customer (VoC) from the field at an early stage by applying machine vision and robots to an actual production line.
The apparatus and method according to an embodiment of the present invention may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
The program instructions recorded on the computer-readable medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software field. Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, the present invention is not necessarily limited to these embodiments and may be variously modified without departing from the technical spirit of the present invention. Accordingly, the embodiments disclosed herein are intended to illustrate, not to limit, the technical spirit of the present invention, and the scope of the technical spirit of the present invention is not limited by these embodiments. Therefore, the embodiments described above should be understood as illustrative in all respects and not restrictive. The scope of protection of the present invention should be interpreted according to the following claims, and all technical ideas within a scope equivalent thereto should be construed as falling within the scope of the present invention.
[National R&D project that supported this invention]
[Project identification number] 1711119155
[Task number] GK20P0700
[Ministry] Ministry of Science and ICT
[Project management (specialized) agency] GiGA Korea Project Group
[Research program name] Pan-ministerial GigaKOREA project (R&D)
[Research task name] Development of a 5G-based production/logistics management service and a cloud-oriented manufacturing-specialized ML platform
[Contribution rate] 1/1
[Project performing organization] SK Telecom Co., Ltd.
[Research period] 2020-01-01 to 2020-12-31

Claims (1)

  1. A method for providing 5G-based production, logistics management, and cloud-oriented machine vision services, the method comprising:
    transmitting control data for which real-time performance is important, as well as large-capacity data, over an ultra-low-latency, ultra-high-speed, large-capacity network, and processing services such as machine-learning-based quality inspection and robot control;
    photographing a product with a high-resolution camera connected to the network, and determining whether the product is defective using an algorithm trained through machine learning;
    providing a factory automation service using a multi-function robot in which the network, a robot vision system, a 6-axis collaborative robot arm, and an AMR are integrated;
    augmenting facility status and sensor information in the factory onto an AR device in real time, and providing a manual support augmentation service; and
    controlling a 5G-based flexible production test bed.
PCT/KR2020/015384 2020-11-05 2020-11-05 5g-based production, logistics management, and cloud-oriented machine vision service providing method WO2022097775A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2020/015384 WO2022097775A1 (en) 2020-11-05 2020-11-05 5g-based production, logistics management, and cloud-oriented machine vision service providing method

Publications (1)

Publication Number Publication Date
WO2022097775A1 true WO2022097775A1 (en) 2022-05-12

Family

ID=81456743

Country Status (1)

Country Link
WO (1) WO2022097775A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8103085B1 (en) * 2007-09-25 2012-01-24 Cognex Corporation System and method for detecting flaws in objects using machine vision
KR20190063839A (en) * 2017-11-30 2019-06-10 전자부품연구원 Method and System for Machine Vision based Quality Inspection using Deep Learning in Manufacturing Process
KR20190118451A (en) * 2018-04-10 2019-10-18 (주)오엔에스커뮤니케이션 Apparel production monitoring system using image recognition
KR102102280B1 (en) * 2018-10-26 2020-04-21 주식회사 엠에스 오토텍 Material transfer robot
KR20200063340A (en) * 2018-11-22 2020-06-05 한국클라우드컴퓨팅연구조합 Method and system that machine learning-based quality inspection using the cloud

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20960877; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20960877; Country of ref document: EP; Kind code of ref document: A1)