KR20220121993A - System for recogniting virtual physical weight based on virtual technology using artificial intelligence and method for visualization - Google Patents

System for recogniting virtual physical weight based on virtual technology using artificial intelligence and method for visualization

Info

Publication number
KR20220121993A
Authority
KR
South Korea
Prior art keywords
information
measurement target
virtual
physical weight
present
Prior art date
Application number
KR1020210026064A
Other languages
Korean (ko)
Inventor
박성준
Original Assignee
(주)데이타리얼리티
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)데이타리얼리티 filed Critical (주)데이타리얼리티
Priority to KR1020210026064A priority Critical patent/KR20220121993A/en
Publication of KR20220121993A publication Critical patent/KR20220121993A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning

Abstract

The present invention relates to a system for recognizing a virtual physical weight based on virtual technology using artificial intelligence, and to a visualization method. According to the present invention, a system for recognizing a virtual physical weight based on virtual technology using artificial intelligence comprises: an information collection unit that collects acceleration information, rotation information, direction information, or time information of a measurement subject using a motion recognition sensor attached to the subject; a clustering data acquisition unit that applies the collected information to a machine learning algorithm to cluster the subject's movements and obtain clustering data; a calculation unit that applies the clustering data to a linear regression algorithm to calculate the direction of movement and the magnitude of force of the subject's arm; and a display unit that displays the calculated direction of movement and magnitude of force of the subject's arm. According to the present invention, visual results for various inputs can be accurately confirmed by using finger-based motion recognition and a pointer.

Description

Virtual physical weight recognition system and visualization method based on virtual technology using artificial intelligence {SYSTEM FOR RECOGNITING VIRTUAL PHYSICAL WEIGHT BASED ON VIRTUAL TECHNOLOGY USING ARTIFICIAL INTELLIGENCE AND METHOD FOR VISUALIZATION}

The present invention relates to a virtual physical weight recognition system and visualization method based on virtual technology using artificial intelligence, and more particularly, to a virtual physical weight recognition system and visualization method based on virtual technology using artificial intelligence for making a physical sense of weight perceptible by means of AR and VR.

Most VR devices are connected to a computer by wire. Because all visual content must be built from 3D objects, the device has to be tethered to a separate high-performance PC, which renders the 3D objects and sends the rendered result to the VR device. Most VR devices also provide a controller interface with a controller held in each hand.

Using the controllers, the user can work with a wide range of content. The controllers supplied with these VR devices are well suited to entertainment applications such as games.

With support for fast visualization of rendering results and an intuitive control scheme based on two-handed controllers, these devices can be regarded as well designed for popularizing VR.

However, there are many difficulties in using VR devices on industrial sites. In particular, because the devices are wired, the spatial constraints on work are so severe that realistic deployment is impossible. In addition, the VR devices released so far are ill suited to reproducing the full range of on-site work such as lifting, carrying, and installing.

Unlike VR devices, MR devices are the best fit for industrial sites. They capture the actual working environment with a camera and augment it to present a mixed-reality view.

However, fewer MR devices are currently on the market than VR devices, and their interaction methods are limited. Although they are mostly wire-based, a few MR devices can be used wirelessly.

Most MR device interaction methods execute point-and-click commands.

Accordingly, finger-based motion recognition and a pointer make it possible to view visual results for various inputs. This, however, is realistically insufficient for performing dynamic work on industrial sites, and dedicated interaction methods still need to be studied.

The technology forming the background of the present invention is disclosed in Korean Patent Application Publication No. 10-2017-0042187 (published April 18, 2017).

The technical object of the present invention is to provide a virtual physical weight recognition system and visualization method based on virtual technology using artificial intelligence for making a physical sense of weight perceptible by means of AR and VR.

According to an embodiment of the present invention for achieving this technical object, a virtual physical weight recognition system based on virtual technology using artificial intelligence comprises: an information collection unit that collects acceleration information, rotation information, direction information, or time information of a measurement subject using a motion recognition sensor attached to the subject; a clustering data acquisition unit that applies the collected information to a machine learning algorithm to cluster the subject's movements and obtain clustering data; a calculation unit that applies the clustering data to a linear regression algorithm to calculate the direction of movement and the magnitude of force of the subject's arm; and a display unit that displays the calculated direction of movement and magnitude of force of the subject's arm.

As described above, according to the present invention, visual results for various inputs can be accurately confirmed by using finger-based motion recognition and a pointer.

FIG. 1 is a diagram illustrating the configuration of a virtual physical weight recognition system according to an embodiment of the present invention.
FIG. 2 is a schematic diagram illustrating a virtual physical weight recognition system according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a physical weight recognition method using a virtual physical weight recognition system according to an embodiment of the present invention.
FIGS. 4A and 4B are diagrams illustrating a display unit according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that a person of ordinary skill in the art to which the present invention pertains can easily practice them. The present invention may, however, be embodied in many different forms and is not limited to the embodiments described herein. In the drawings, parts unrelated to the description are omitted for clarity, and like reference numerals denote like parts throughout the specification.

Throughout the specification, when a part is said to "include" a component, this means that it may further include other components, rather than excluding them, unless specifically stated otherwise.

Embodiments of the present invention will now be described in detail with reference to the accompanying drawings so that a person of ordinary skill in the art to which the present invention pertains can easily practice them.

FIG. 1 is a diagram illustrating the configuration of a virtual physical weight recognition system according to an embodiment of the present invention, FIG. 2 is a schematic diagram illustrating a virtual physical weight recognition system according to an embodiment of the present invention, FIG. 3 is a diagram illustrating a physical weight recognition method using a virtual physical weight recognition system according to an embodiment of the present invention, and FIGS. 4A and 4B are diagrams illustrating a display unit according to an embodiment of the present invention.

As shown in FIGS. 1 to 4, a virtual physical weight recognition system 100 according to an embodiment of the present invention includes an information collection unit 110, a clustering data acquisition unit 120, a calculation unit 130, and a display unit 140.

First, the information collection unit 110 collects acceleration information, rotation information, direction information, or time information of the measurement subject using a motion recognition sensor attached to the subject.
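As an illustration only (not part of the patent), the following is a minimal Python sketch of how the information collection unit 110 might package readings from the worn motion recognition sensor; the MotionSample fields and the read_sensor callable are assumptions made for this example.

```python
from dataclasses import dataclass
import time

@dataclass
class MotionSample:
    """One reading from the motion recognition sensor worn by the measurement subject."""
    accel: tuple      # (ax, ay, az) acceleration, m/s^2
    gyro: tuple       # (gx, gy, gz) rotation rate, rad/s
    heading: float    # direction/orientation estimate, degrees
    timestamp: float  # acquisition time, seconds

def collect_samples(read_sensor, n=200):
    """Poll a sensor-reading callable n times and return the collected samples."""
    samples = []
    for _ in range(n):
        ax, ay, az, gx, gy, gz, heading = read_sensor()
        samples.append(MotionSample((ax, ay, az), (gx, gy, gz), heading, time.time()))
    return samples
```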

Next, the clustering data acquisition unit 120 applies the collected information to a machine learning algorithm to cluster the subject's movements and obtain clustering data.

Here, a machine learning algorithm refers to the process of training a machine to carry out tasks that humans can do, or that are difficult for humans to do; in an embodiment of the present invention, it is used to cluster the movements of the measurement subject.

Specifically, the clustering data acquisition unit 120 clusters the subject's movements using a K-means clustering or DBSCAN clustering method.

K-means clustering is a distance-based clustering algorithm: K reference points are chosen at random from the input data, the distance between each data point and the reference points is treated as the error, each point is assigned to its nearest reference point, and each reference point is then repeatedly updated to the mean of all points assigned to it.
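The following is a minimal scikit-learn sketch of the distance-based clustering described above (not taken from the patent); the number of clusters, the seven-column feature layout, and the random placeholder data are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Feature matrix: one row per motion sample, columns = accel (3) + gyro (3) + heading (1).
X = np.random.rand(200, 7)  # placeholder for features built from the collected sensor data

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)      # cluster index assigned to each motion sample
centers = kmeans.cluster_centers_   # mean feature vector (reference point) of each cluster
```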

The DBSCAN clustering algorithm (density-based spatial clustering of applications with noise) is a method in which, once a training data point is identified as a core point, it forms a cluster carrying a single label together with its neighboring data points.
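For comparison, the sketch below applies DBSCAN to the same kind of feature matrix; the eps and min_samples values are illustrative assumptions and would need tuning on real motion data.

```python
import numpy as np
from sklearn.cluster import DBSCAN

X = np.random.rand(200, 7)  # placeholder motion features, as in the K-means sketch

# eps is the neighbourhood radius and min_samples the core-point threshold.
labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(X)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)  # label -1 marks noise points
```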

Next, the calculation unit 130 applies the clustering data to a linear regression algorithm to calculate the direction of movement and the magnitude of force of the subject's arm.
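A hedged sketch of how the calculation unit 130 might use linear regression is shown below; it assumes a calibration set pairing per-cluster feature summaries with known direction angles and force magnitudes, which the patent does not specify.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical calibration data: per-cluster feature summaries paired with labels
# (column 0: movement direction in degrees, column 1: force magnitude in newtons).
cluster_features = np.random.rand(50, 7)
targets = np.random.rand(50, 2)

model = LinearRegression().fit(cluster_features, targets)

new_cluster = np.random.rand(1, 7)   # features of a newly observed motion cluster
direction_deg, force_n = model.predict(new_cluster)[0]
print(f"direction: {direction_deg:.1f} deg, force: {force_n:.1f} N")
```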

Next, the display unit 140 displays the calculated direction of movement and magnitude of force of the subject's arm.

That is, as shown in FIG. 4, the display unit 140 uses AR or VR to display the direction of movement and the magnitude of force of the subject's arm.
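As a simple stand-in for the AR/VR output (assumed, not described in the patent), the sketch below draws the estimated arm movement as a 2D arrow whose length encodes the force magnitude; a real display unit would pass the same two values to an AR/VR engine to place a 3D overlay instead.

```python
import numpy as np
import matplotlib.pyplot as plt

def render_overlay(direction_deg, force_n):
    """Draw the estimated movement direction as an arrow scaled by the force magnitude."""
    theta = np.deg2rad(direction_deg)
    dx, dy = np.cos(theta) * force_n, np.sin(theta) * force_n
    plt.quiver(0, 0, dx, dy, angles='xy', scale_units='xy', scale=1)
    plt.xlim(-force_n - 1, force_n + 1)
    plt.ylim(-force_n - 1, force_n + 1)
    plt.title(f"direction {direction_deg:.0f} deg, force {force_n:.1f} N")
    plt.show()

render_overlay(35.0, 12.5)
```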

As described above, according to an embodiment of the present invention, visual results for various inputs can be accurately confirmed by using finger-based motion recognition and a pointer.

Although the present invention has been described with reference to the embodiments shown in the drawings, this is merely illustrative, and a person of ordinary skill in the art will understand that various modifications and equivalent other embodiments are possible. Therefore, the true scope of technical protection of the present invention should be determined by the technical spirit of the appended claims.

100: virtual physical weight recognition system
110: information collection unit
120: clustering data acquisition unit
130: calculation unit, 140: display unit

Claims (1)

A virtual physical weight recognition system based on virtual technology using artificial intelligence, the system comprising:
an information collection unit configured to collect acceleration information, rotation information, direction information, or time information of a measurement subject using a motion recognition sensor attached to the subject;
a clustering data acquisition unit configured to apply the collected information to a machine learning algorithm to cluster the subject's movements and obtain clustering data;
a calculation unit configured to apply the clustering data to a linear regression algorithm to calculate a direction of movement and a magnitude of force of the subject's arm; and
a display unit configured to display the calculated direction of movement and magnitude of force of the subject's arm.
KR1020210026064A 2021-02-26 2021-02-26 System for recogniting virtual physical weight based on virtual technology using artificial intelligence and method for visualization KR20220121993A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020210026064A KR20220121993A (en) 2021-02-26 2021-02-26 System for recogniting virtual physical weight based on virtual technology using artificial intelligence and method for visualization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020210026064A KR20220121993A (en) 2021-02-26 2021-02-26 System for recogniting virtual physical weight based on virtual technology using artificial intelligence and method for visualization

Publications (1)

Publication Number Publication Date
KR20220121993A true KR20220121993A (en) 2022-09-02

Family

ID=83281090

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020210026064A KR20220121993A (en) 2021-02-26 2021-02-26 System for recogniting virtual physical weight based on virtual technology using artificial intelligence and method for visualization

Country Status (1)

Country Link
KR (1) KR20220121993A (en)

Similar Documents

Publication Publication Date Title
Malý et al. Augmented reality experiments with industrial robot in industry 4.0 environment
US10179407B2 (en) Dynamic multi-sensor and multi-robot interface system
Kumar et al. Hand data glove: a wearable real-time device for human-computer interaction
US10620775B2 (en) Dynamic interactive objects
US8875041B1 (en) Methods and systems for providing feedback on an interface controlling a robotic device
US20170372139A1 (en) Augmented reality robotic system visualization
US9268410B2 (en) Image processing device, image processing method, and program
US10895950B2 (en) Method and system for generating a holographic image having simulated physical properties
WO2019199504A1 (en) System, method, computer readable medium, and viewer-interface for prioritized selection of mutually occluding objects in a virtual environment
CN111338287A (en) Robot motion control method, device and system, robot and storage medium
CN104182035A (en) Method and system for controlling television application program
Hu et al. Performance evaluation of optical motion capture sensors for assembly motion capturing
Nasim et al. Physics-based interactive virtual grasping
Zhang et al. Robot programming by demonstration: A novel system for robot trajectory programming based on robot operating system
Placidi et al. Data integration by two-sensors in a LEAP-based Virtual Glove for human-system interaction
Gutierrez et al. Trajectory planning in dynamics environment: application for haptic perception in safe human-robot interaction
Fahmi et al. Development of excavator training simulator using leap motion controller
KR20220121993A (en) System for recogniting virtual physical weight based on virtual technology using artificial intelligence and method for visualization
Khan ROS-based control of a manipulator arm for balancing a ball on a plate
Yu et al. A unified framework for remote collaboration using interactive ar authoring and hands tracking
Liu et al. Research on motion control system of 6-DoF robotic arm
Menezes et al. Touching is believing-Adding real objects to Virtual Reality
KR102469149B1 (en) Real object multi-directional recognition training system based on 3D design data and the method using it
Shruti et al. Arduino Based Hand Gesture Controlled Robot
Rupprecht et al. Signal-processing transformation from smartwatch to arm movement gestures