WO2020085537A1 - Motion recognition device and motion recognition method using the same - Google Patents

Motion recognition device and motion recognition method using the same

Info

Publication number
WO2020085537A1
WO2020085537A1 (PCT/KR2018/012734)
Authority
WO
WIPO (PCT)
Prior art keywords
arm
forearm
data
motion data
motion
Prior art date
Application number
PCT/KR2018/012734
Other languages
English (en)
Korean (ko)
Inventor
정연우
오복성
이윤호
Original Assignee
(주)리얼감
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)리얼감
Priority to PCT/KR2018/012734
Publication of WO2020085537A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to a motion recognition device and a motion recognition method using the same, and more particularly, to a motion recognition device for recognizing the motion of an arm using a gyro sensor and a laser sensor, and a motion recognition method using the same.
  • computer user interface technology has evolved beyond the earlier mouse, keyboard, and touch input methods: technologies that recognize voice input through a microphone and motion recognition technologies based on cameras or sensors are now commercially available. Recently, a technology has been developed that combines 3D imaging and spatial recognition so that users can directly touch or interact with content in a 3D image.
  • the most commonly used approach is a technology that recognizes a moving object (a part of the body or a device attached to the body) using a camera: a 3D camera and an RGB camera extract a skeleton for the moving (dynamic) object and recognize its 3D motion. While this method has the advantage that the user does not need to wear a separate device, it is difficult to accurately capture the movements of the body's joints, and there is a limit to the immersion it can provide because there is no direct feedback to virtual reality.
  • the method of recognizing the movement of an object through sensors requires attaching a sensor to each body part whose motion is to be recognized, which causes wearing discomfort and restricts the user's movement.
  • the technical problem to be achieved by the present invention is to provide a motion recognition device and a method of using the motion recognition device to recognize the motion of the arm using a gyro sensor and a laser sensor.
  • the motion recognition device includes a sensing unit that collects motion data of the user's forearm sensed by a gyro sensor and collects distance data sensed by a distance measuring sensor;
  • a control unit for calculating motion data of the upper arm based on the distance data and the motion data of the forearm, and calculating motion data of the arm based on the motion data of the forearm and the motion data of the upper arm;
  • and a communication unit for transmitting the motion data of the arm to a first external device through a network.
  • the distance measuring sensor may measure the distance from the distance measuring sensor to the upper arm.
  • the control unit may calculate the folding angle of the elbow based on the distance data, and then calculate the motion data of the upper arm based on the folding angle of the elbow and the motion data of the forearm.
  • the control unit may calculate the folding angle of the elbow based on the distance from the distance measuring sensor to the upper arm, the distance between the distance measuring sensor and the elbow, and the height from the center of the elbow to the distance measuring sensor.
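The geometry described above can be sketched numerically. The following is a minimal sketch under an assumed simplified 2-D model (elbow at the origin, forearm along the x-axis, sensor firing perpendicular to the forearm); the function name, the mounting distances, and the angle convention are hypothetical and not taken from the patent.

```python
import math

def elbow_fold_angle(r, d, h):
    """Estimate the elbow folding angle (degrees) from a laser range reading.

    Assumed 2-D geometry: the elbow is at the origin, the forearm lies along
    the +x axis, and the sensor sits on the forearm at distance d from the
    elbow and height h above the elbow center, firing perpendicular to the
    forearm.  The beam hits the upper arm after travelling r, so the hit
    point is (d, h + r), and the angle between the two segments follows
    from atan2.
    """
    opening = math.degrees(math.atan2(h + r, d))  # angle of the upper arm above the forearm axis
    return 180.0 - opening                         # interior (folding) angle of the elbow

# Example: sensor 5 cm from the elbow, 2 cm above its center, range reading 3 cm
angle = elbow_fold_angle(r=0.03, d=0.05, h=0.02)
print(round(angle, 1))  # 135.0
```

As the arm folds, the upper arm tilts toward the sensor, the measured range shrinks, and the computed interior angle decreases accordingly.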
  • the controller may calculate arm motion data based on the motion data of the forearm, the motion data of the upper arm, the length of the forearm, and the length of the upper arm.
  • the motion data of the arm may be coordinates representing the position of the shoulder, the position of the elbow, and the position of the wrist, respectively.
  • the motion data of the arm may use the position of the shoulder as a reference point, and calculate the position of the elbow and the wrist based on the reference point.
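The position calculation above amounts to chaining each segment's direction and length from the shoulder reference point. The sketch below illustrates this forward-kinematics idea under an assumed convention (each segment's direction derived from a pitch/yaw pair); the function names, the angle convention, and the segment lengths are illustrative, not the patent's.

```python
import math

def segment_dir(pitch_deg, yaw_deg):
    """Unit direction of an arm segment from its pitch/yaw angles
    (assumed convention: pitch up from horizontal, yaw about the vertical)."""
    p, y = math.radians(pitch_deg), math.radians(yaw_deg)
    return (math.cos(p) * math.cos(y), math.cos(p) * math.sin(y), math.sin(p))

def arm_positions(upper_angles, fore_angles, L_upper, L_fore):
    """Shoulder is the (0, 0, 0) reference point; the elbow and wrist follow
    by adding each segment's length along its direction."""
    shoulder = (0.0, 0.0, 0.0)
    du = segment_dir(*upper_angles)
    elbow = tuple(s + L_upper * c for s, c in zip(shoulder, du))
    df = segment_dir(*fore_angles)
    wrist = tuple(e + L_fore * c for e, c in zip(elbow, df))
    return shoulder, elbow, wrist

# Upper arm horizontal (pitch 0), forearm raised straight up (pitch 90)
s, e, w = arm_positions((0, 0), (90, 0), L_upper=0.30, L_fore=0.25)
print(e)  # ≈ (0.3, 0.0, 0.0)
print(w)  # ≈ (0.3, 0.0, 0.25)
```

Because every position is expressed relative to the shoulder, only the two segment orientations and lengths are needed to recover the elbow and wrist coordinates.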
  • the communication unit may receive wrist movement data from a device other than the first external device.
  • when the wrist movement data is received, the motion data of the arm may be calculated using it as well.
  • the control unit may calculate the twist angle of the forearm based on part of the wrist movement data and the twist angle of the upper arm.
  • the present invention includes a body attachment portion configured to be worn on the forearm; by being worn on the upper portion of the forearm using the body attachment portion, the device can keep the gyro sensor value from changing due to twisting of the wrist.
  • a motion recognition method includes collecting motion data of a user's forearm sensed by a gyro sensor and collecting distance data sensed by a distance measuring sensor; Calculating motion data of the upper arm based on the distance data and the motion data of the forearm; Calculating motion data of the arm based on the motion data of the forearm and the motion data of the upper arm; And transmitting the motion data of the arm to a first external device through a network.
  • the step of calculating the motion data of the upper arm may calculate the folding angle of the elbow based on the distance data, and then calculate the motion data of the upper arm based on the folding angle of the elbow and the motion data of the forearm.
  • the step of calculating the folding angle of the elbow may calculate the folding angle of the elbow based on the distance from the distance measuring sensor to the upper arm, the distance between the distance measuring sensor and the elbow, and the height from the center of the elbow to the distance measuring sensor.
  • the step of calculating the motion data of the arm may calculate the motion data of the arm based on the motion data of the forearm, the motion data of the upper arm, the length of the forearm, and the length of the upper arm.
  • the step of calculating the motion data of the arm may calculate the motion data of the arm based on the motion data of the forearm, the motion data of the upper arm, the motion data of the wrist, the length of the forearm, the length of the upper arm, and the length of the wrist.
  • the motion recognition device and method according to an embodiment of the present invention may be worn on the forearm of the arm, and may detect the motion of the forearm and the degree of folding of the elbow to measure the motion of the upper arm.
  • FIG. 1 is a block diagram illustrating a motion recognition device related to the present invention.
  • FIG. 2 is a view for explaining the degrees of freedom of the human arm related to the present invention.
  • FIG. 3 is a perspective view showing an example of a wear-type motion recognition device related to the present invention.
  • FIG. 4 is a view for explaining an operation method of a motion recognition device according to an embodiment of the present invention.
  • FIG. 5 is a view for explaining a method of obtaining motion data of the forearm according to an embodiment of the present invention.
  • FIG. 6 is a view for explaining a method of obtaining a folding angle of an elbow according to an embodiment of the present invention.
  • FIG. 7 is a view for explaining a method for obtaining position data of an arm according to an embodiment of the present invention.
  • FIG. 8 is a view showing an HMD device and a motion recognition device connected to a control device.
  • FIG. 9 is a view for explaining a method of obtaining position data of an arm and a hand according to an embodiment of the present invention.
  • FIG. 10 is a view for explaining a method of displaying the movement of a human arm according to an embodiment of the present invention.
  • FIG. 11 is a view for explaining a method of displaying hand and arm movements according to an embodiment of the present invention.
  • FIG. 12 is a view for explaining the movement of the arm displayed on the screen according to an embodiment of the present invention.
  • FIG. 1 is a block diagram for explaining a motion recognition device 10 related to the present invention.
  • the motion recognition device 10 includes a sensing unit 110, a control unit 120, a communication unit 130, a storage unit 140, a power supply unit 150, an output unit 160, a user input unit 170, and an interface unit 180.
  • the components shown in FIG. 1 are not essential for implementing the motion recognition device 10, so the motion recognition device 10 described herein may have more or fewer components than those listed above.
  • the sensing unit 110 may include one or more sensors for sensing at least one of ambient environment information and user information of the motion recognition device 10.
  • the sensing unit 110 may detect the movement of the user's arm with the motion recognition device 10 attached thereto, and transmit the sensed information to the controller 120.
  • the sensing unit 110 may include at least one of a gyro sensor 112, an acceleration sensor, a magnetic sensor, a distance measuring sensor 111 (for example, a laser sensor, an ultrasonic sensor, an infrared sensor, a LiDAR sensor, a radar (RADAR) sensor, a camera sensor, etc.), a proximity sensor, an illumination sensor, a touch sensor, a gravity sensor (G-sensor), a motion sensor, an RGB sensor, an infrared sensor (IR sensor), a fingerprint scan sensor, an optical sensor, a battery gauge, an environmental sensor, and a chemical sensor. Meanwhile, the motion recognition device 10 disclosed in this specification may combine and use information sensed by at least two of these sensors.
  • the gyro sensor 112 may measure the rotational force generated by the rotational motion of an object using a gyroscope and convert it into an electrical signal.
  • the gyro sensor 112 can express the sensed rotation as rotation vector values (roll, pitch, yaw) about three axes, and may be implemented as an optical, mechanical, or piezoelectric gyro sensor 112.
  • the rotation recognition method of the gyro sensor 112 may include a method using an Euler angle and a method using a quaternion.
  • the Euler angle method uses rotation vector values (roll, pitch, yaw), rotating each of the three axes (the x-axis, y-axis, and z-axis) by a specific angle in turn.
  • a quaternion is a vector in a four-dimensional complex space that uses four values (x, y, z, w), and is a method for solving the gimbal lock problem that occurs with Euler rotations.
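The gimbal lock problem mentioned above can be demonstrated directly: when the middle (pitch) angle reaches 90°, the roll and yaw rotations act about the same axis, so one degree of freedom is lost. The sketch below assumes an intrinsic Z-Y-X Euler convention; the convention choice and function name are illustrative, not from the patent.

```python
import math

def euler_zyx(roll, pitch, yaw):
    """Rotation matrix for intrinsic Z-Y-X Euler angles (radians):
    R = Rz(yaw) @ Ry(pitch) @ Rx(roll), written out explicitly."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,               cp * cr],
    ]

# At pitch = 90 degrees, a roll of +a and a yaw of -a give the SAME rotation:
a = math.radians(30)
R1 = euler_zyx(roll=a, pitch=math.pi / 2, yaw=0.0)
R2 = euler_zyx(roll=0.0, pitch=math.pi / 2, yaw=-a)
same = all(abs(R1[i][j] - R2[i][j]) < 1e-12 for i in range(3) for j in range(3))
print(same)  # True -> roll and yaw have collapsed onto one axis (gimbal lock)
```

A quaternion representation composes rotations without passing through such a degenerate angle set, which is why it is the common fix for this problem.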
  • the distance measuring sensor 111 emits intangible energy such as light, sound, or radio waves, measures the travel time until the energy reflects off an object and returns, and multiplies it by the speed of the medium (for example, light, sound, or radio waves) to obtain a value in distance units. Since this value is the sum of the outgoing and return distances, the distance to the target can be measured by dividing it by 2.
  • the laser sensor can obtain the distance by measuring the difference in time (frequency, phase, displacement) of a laser beam reflected back from the target, and can detect distance with very high resolution.
  • the principle of distance measurement methods using a laser includes pulsed modulation, frequency modulation, phase shift, triangulation, and interferometry.
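The pulsed time-of-flight principle above reduces to one line of arithmetic: the measured time covers the round trip, so the one-way distance is speed times time divided by two. A minimal sketch (the function name is illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, propagation speed for a laser pulse in air/vacuum

def tof_distance(round_trip_time_s, medium_speed=SPEED_OF_LIGHT):
    """Pulsed time-of-flight ranging: the measured time covers the path to
    the target and back, so the one-way distance is (speed * time) / 2."""
    return medium_speed * round_trip_time_s / 2.0

# A laser pulse returning after 2 nanoseconds indicates a target about 30 cm away
print(round(tof_distance(2e-9), 3))  # 0.3
```

The same formula applies to ultrasonic or radio ranging by substituting the medium speed (about 343 m/s for sound in air).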
  • the controller 120 controls the overall operation of the motion recognition device 10.
  • the control unit 120 executes the programs stored in the storage unit 140, and can thereby control the sensing unit 110, the communication unit 130, the storage unit 140, the power supply unit 150, the output unit 160, the user input unit 170, and the like.
  • the controller 120 controls the driving or operation of the motion recognition device 10 based on the sensing signal generated by the sensing unit 110, or processes data, performs functions, and executes operations related to an application program installed in the motion recognition device 10.
  • the controller 120 may calculate the position, movement speed, movement direction, and the like of the arm based on the information sensed by the sensing unit 110.
  • the communication unit 130 configures a connection by which the motion recognition device 10 can transmit or receive data, etc. with a server or other device. Also, the communication unit 130 may transmit output data to a server or another device or receive input data from a server or other device.
  • the communication unit 130 includes Bluetooth 131 and Near Field Communication (NFC) 132, and can support local area networks such as Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Zigbee, Infrared Data Association (IrDA), Wi-Fi Direct (WFD), ultra-wideband (UWB), Wireless Universal Serial Bus (Wireless USB), and ANT+, as well as mobile communication networks such as 3G, LTE, LTE-A, and 5G.
  • the storage unit 140 may store a program for the processing and control of the control unit 120, and may store input/output data (e.g., applications, content, time zone information of an external device, an address book, etc.).
  • the storage unit 140 may include at least one type of storage medium among flash memory, a hard disk, a multimedia card micro, card-type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), magnetic memory, a magnetic disk, and an optical disk. In addition, the device may operate with web storage or a cloud server that performs the storage function on the Internet.
  • the power supply unit 150 receives external power and internal power to supply power to each component included in the motion recognition device 10.
  • the power supply unit 150 includes a battery, and the battery may be a built-in battery or a replaceable battery.
  • the power supply unit 150 may include a connection port, and the connection port may be configured as an example of an interface in which an external charger that supplies power for charging the battery is electrically connected.
  • the power supply unit 150 may be configured to charge the battery wirelessly without using the connection port.
  • the power supply unit 150 may receive power from an external wireless power transmitter using one or more of an inductive coupling method based on the magnetic induction phenomenon and a magnetic resonance coupling method based on the electromagnetic resonance phenomenon.
  • the output unit 160 is for outputting an audio signal, a video signal, or a vibration signal, and may include a display unit 161, a sound output unit 162, and a haptic module 163.
  • the haptic module 163 generates various tactile effects that the user can feel.
  • a typical example of the tactile effect generated by the haptic module 163 may be vibration.
  • the intensity and pattern of vibration generated in the haptic module 163 may be controlled by a user's selection or setting of the controller 120.
  • the haptic module 163 may combine different vibrations and output them together, or output them sequentially.
  • in addition to vibration, the haptic module 163 can generate various tactile effects, such as the effect of a pin array moving vertically against the contacted skin surface, a jet or suction of air through a nozzle or inlet, grazing of the skin surface, contact of an electrode, stimulation by electrostatic force, and the effect of reproducing a cold or warm sensation using an element capable of absorbing or generating heat.
  • the haptic module 163 may not only deliver a tactile effect through direct contact, but also may be implemented so that a user can feel the tactile effect through muscle sensations such as fingers or arms.
  • the display unit 161 displays and outputs information processed by the motion recognition device 10.
  • the display unit 161 may display execution screen information of an application program driven by the motion recognition device 10 or user interface (UI) or graphical user interface (GUI) information according to the execution screen information.
  • the display unit 161 may be used as an input device in addition to an output device.
  • the display unit 161 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, and an electrophoretic display.
  • the motion recognition device 10 may include two or more display units 161.
  • the sound output unit 162 outputs audio data received from the communication unit 130 or stored in the storage unit 140. The sound output unit 162 also outputs sound signals related to functions performed by the motion recognition device 10 (for example, a call signal reception sound, a message reception sound, and a notification sound).
  • the sound output unit 162 may include a speaker, a buzzer, and the like.
  • the user input unit 170 is for receiving information from the user. When information is input through the user input unit 170, the controller 120 may control the operation of the motion recognition device 10 to correspond to the input information.
  • the user input unit 170 may include a mechanical input means (for example, a mechanical key such as a button located on the front, rear, or side of the motion recognition device 10, a dome switch, a jog wheel, or a jog switch) and a touch-type input means.
  • the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen.
  • the virtual key or visual key can be displayed on the touch screen in various forms, for example, as a graphic, text, an icon, a video, or a combination thereof.
  • the interface unit 180 serves as a passage with all external devices connected to the motion recognition device 10.
  • the interface unit 180 receives data from an external device, receives power, and transmits data to each component inside the motion recognition device 10, or allows data inside the motion recognition device 10 to be transmitted to an external device.
  • the interface unit 180 may include, for example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
  • FIG. 2 is a view for explaining the degrees of freedom of the human arm related to the present invention.
  • a robot arm is a robot with an arm similar to a human arm that can act in place of a human arm.
  • How freely the robot arm can move is expressed in the words 'degrees of freedom'.
  • if a joint of the robot arm moves only up and down, as when hammering or cutting in one direction, it has 1 degree of freedom; a drill that rotates while moving up and down has 2 degrees of freedom; and an arm that can move and work in six directions is said to have 6 degrees of freedom.
  • to reproduce the movement of a human arm, a device having at least 7 degrees of freedom is required.
  • the human arm can be modeled to represent 7 rotational axes at 7 degrees of freedom.
  • Each joint region may be represented by the shoulder position 210 A 0 , the elbow position 220 A 1 , the wrist position 230 A 2 , and the center position of the hand 240 A 3 .
  • the upper arm 211 which is a portion of the arm from the shoulder to the elbow, may be displayed in a form that connects the shoulder position 210 A 0 and the elbow position 220 A 1 .
  • the forearm portion 221 which is a portion of the arm from the elbow to the wrist, may be displayed in a form that connects the elbow position 220 A 1 and the wrist position 230 A 2 .
  • the movement of the shoulder joint can be represented using the shoulder up and down rotation angle θ 0x , the shoulder left and right rotation angle θ 0y , and the twist angle Φ 0 of the upper arm 211 about the shoulder position 210 A 0 .
  • the movement of the elbow joint may be represented using the elbow folding angle θ 1x about the axis at the elbow position 220 A 1 .
  • the movement of the wrist joint may be represented using the wrist up and down rotation angle θ 2x , the wrist left and right rotation angle θ 2y , and the twist angle Φ 1 of the forearm 221.
  • the absolute angle of each body part can be expressed by Roll, Pitch, and Yaw values.
  • the absolute angle of the upper arm 211 may be represented by the roll angle Φ H , the pitch angle θ H , and the yaw angle ψ H of the upper arm 211.
  • the absolute angle of the forearm 221 may be represented by the roll angle Φ L , the pitch angle θ L , and the yaw angle ψ L of the forearm 221.
  • the absolute angle of the wrist can be expressed as the wrist roll angle Φ W , the pitch angle θ W , and the yaw angle ψ W .
  • position coordinates (x, y, z) and angles (θ x , θ y , θ z ) at each joint part A 0 , A 1 , A 2 of the arm are obtained using a variety of sensors, such as the gyro sensor 112, the distance measuring sensor 111, and the acceleration sensor, and the movement of the human arm can be sensed from the changes in the position coordinate and angle data of each joint.
  • a device including a sensor may be mounted or carried on the upper arm 211, the forearm 221, and the hand, and the sensor value may be measured.
  • FIG. 3 is a perspective view showing an example of a wearable motion recognition device 10 related to the present invention.
  • the wearable motion recognition device 10 shown in FIG. 3 may include one or more of the components shown in FIG. 1, and the components included in the wearable motion recognition device 10 can perform both the functions described with reference to FIG. 1 and the functions of the components of the motion recognition device 10 described below.
  • the sensing unit 110 of the wearable motion recognition device 10 shown in FIG. 3 may have a rectangular parallelepiped shape, but is not limited thereto, and may have a circular, elliptical, or square shape.
  • the shape of the main body 310 of the present invention may give a good visual image to the user, and any shape that can help the user's motion recognition, especially the motion recognition of the arm, may be used.
  • the wearable motion recognition device 10 may include a body 310 having a sensing unit 110 that detects the movement of a human arm, a control unit 120 that processes the detected signal, and a communication unit 130 that transmits the processed detection signal to the outside, and a body attachment portion 320 connected to the body 310 and configured to be worn on the forearm.
  • the body 310 may include a case forming an appearance.
  • the case may include a case that provides an interior space for accommodating various electronic components.
  • the sensing unit 110 may detect the motion of the forearm 221 by using the gyro sensor 112 and obtain the sensed motion data.
  • the distance to the upper arm 211 may be measured using the distance measuring sensor 111 to obtain a measured value.
  • the motion data of the forearm 221 and the distance data from the distance measuring sensor 111 to the upper arm 211 may be transmitted to the controller 120, or to an external device such as a head mounted display (HMD) through the communication unit 130.
  • the sensing unit 110 includes one or more sensors, and may serve to detect various types of bio signals from a user.
  • the sensing unit 110 may acquire data by detecting movement of the forearm or various biological signals at regular intervals.
  • based on the motion data of the forearm 221 transmitted to the controller 120 and the distance data from the distance measuring sensor 111 to the upper arm 211, the controller 120 can perform the process of obtaining the elbow folding angle, the absolute angle of the forearm 221, and the absolute angle of the upper arm 211. Additionally, based on the motion data of the wrist received from an external device such as the control device 50, a process of obtaining the absolute angle of the wrist may also be performed. The absolute angle of the wrist may not be obtained by the control unit 120 itself, but may instead be received through the communication unit 130 from a device that can be carried in the hand, such as the control device 50.
  • the controller 120 can obtain the shoulder position 210, the elbow position 220, and the wrist position 230 of the human arm based on the absolute angle of the forearm 221, the absolute angle of the upper arm 211, the length of the upper arm 211, and the length of the forearm 221.
  • the shoulder position 210, the elbow position 220, and the wrist position 230 of the human arm may be transmitted to a device having a display unit 161 such as an HMD or TV through the communication unit 130.
  • the center position 240 of the hand may be obtained based on the absolute angle of the upper arm 211 and the absolute angle of the wrist.
  • the data of the central position of the hand 240 may be transmitted through the communication unit 130 to a device having a display unit 161 such as an HMD or TV.
  • the wearable motion recognition device 10 may include a communication unit 130 to enable wireless communication, and an antenna for wireless communication may be installed in the main body 310. Meanwhile, the antenna can expand its performance using a case.
  • a case including a conductive material may be configured to electrically connect to the antenna to expand the ground area or the radiation area.
  • the display unit 161 may be disposed on one surface of the main body 310 to output information, and a touch sensor may be provided on the display unit 161 to be implemented as a touch screen.
  • the main body 310 may further include a storage unit 140, a power supply unit 150, an output unit 160, a user input unit 170, an interface unit 180, and the like.
  • when the display unit 161 is implemented as a touch screen, it may function as the user input unit 170, in which case a separate key may not be provided on the main body 310.
  • the body attachment portion 320 is worn on the forearm to cover the forearm, and may be formed of a flexible material for easy wearing.
  • the body attachment portion 320 may be formed of leather, rubber, silicone, synthetic resin material, or the like.
  • the body attachment portion 320 may be configured to be detachable from the body 310, so that the user can replace it with various types of body attachment portions 320 according to taste.
  • the body attachment part 320 may be used to expand the performance of the antenna.
  • a ground extension area that is electrically connected to the antenna to expand the ground area may be built in the body attachment part 320.
  • a fastener (not shown) may be provided on the body attachment part 320.
  • the fastener may be implemented by a buckle, a hook structure capable of snap-fit, or a velcro, and may include a stretchable section or material.
  • FIG. 4 is a view for explaining an operation method of the motion recognition device 10 according to an embodiment of the present invention.
  • the operation method of the motion recognition device 10 includes: acquiring motion data of the forearm 221 (S410), and measuring a distance between the upper arm 211 and the forearm 221 ( S420), obtaining the folding angle of the elbow using the distance data between the upper arm 211 and the forearm 221 (S430), and obtaining motion data of the upper arm 211 using the motion data of the forearm 221 Step S440 and obtaining the position data of the arm using the motion data of the upper arm 211 and the forearm 221 and the elbow folding angle may be included (S450).
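As a rough illustration, steps S410 to S450 can be chained in code. This is a minimal sketch under strong assumptions: a simplified planar elbow geometry, a hypothetical planar relation between forearm pitch, fold angle, and upper-arm pitch, and made-up mounting distances and arm lengths; none of these names or conventions come from the patent itself.

```python
import math

def recognize_arm_motion(gyro_fore, laser_range, d_sensor=0.05, h_sensor=0.02,
                         L_upper=0.30, L_fore=0.25):
    """Hypothetical end-to-end sketch of steps S410-S450.

    gyro_fore:   forearm (roll, pitch, yaw) in degrees from the gyro (S410)
    laser_range: distance from the sensor to the upper arm in meters (S420)
    """
    # S430: elbow folding angle from the range reading (assumed 2-D geometry)
    fold = 180.0 - math.degrees(math.atan2(h_sensor + laser_range, d_sensor))

    # S440: upper-arm pitch inferred from the forearm pitch and the fold angle
    # (assumed planar relation: the segments differ by 180 - fold degrees)
    roll_f, pitch_f, yaw_f = gyro_fore
    pitch_u = pitch_f - (180.0 - fold)

    # S450: joint positions with the shoulder as the (0, 0, 0) reference point
    def dir3(pitch, yaw):
        p, y = math.radians(pitch), math.radians(yaw)
        return (math.cos(p) * math.cos(y), math.cos(p) * math.sin(y), math.sin(p))

    elbow = tuple(L_upper * c for c in dir3(pitch_u, yaw_f))
    wrist = tuple(e + L_fore * c for e, c in zip(elbow, dir3(pitch_f, yaw_f)))
    return {"fold_deg": fold, "shoulder": (0.0, 0.0, 0.0),
            "elbow": elbow, "wrist": wrist}

result = recognize_arm_motion(gyro_fore=(0.0, 45.0, 0.0), laser_range=0.03)
print(round(result["fold_deg"], 1))  # 135.0
```

The returned position dictionary corresponds to the data that would be transmitted to a first external device over the network in the final step.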
  • step S410 may be a process of detecting the movement of the forearm 221 through the sensing unit 110 of the motion recognition device 10 mounted on the forearm 221, the part of the human arm from the elbow to the wrist, and acquiring the detected motion data.
  • the motion data of the forearm 221 may be displayed as the three-axis angle values obtained from the gyro sensor 112 constituting the sensing unit 110, that is, values representing the angles of the forearm (Roll angle Φ L , Pitch angle θ L , Yaw angle ψ L ).
  • Step S420 may be a process of obtaining a measurement value by measuring a distance from the distance measurement sensor 111 of the forearm 221 to which the motion recognition device 10 is mounted to the upper arm 211.
  • the distance from the distance measuring sensor 111 to the upper arm 211 may be displayed in units of mm, cm, m, and the like.
  • step S430 may be a process of calculating an angle representing the degree of folding of the elbow using the distance data from the distance measuring sensor 111 to the upper arm 211 obtained in step S420.
  • the folding angle of the elbow may be represented by the angle θ 1x .
  • step S440 may be a process of calculating the motion data of the upper arm 211, which represents the movement of the upper arm 211 (the portion of the human arm from the shoulder to the elbow), using the motion data of the forearm 221 on which the motion recognition device 10 is mounted.
• The motion data of the upper arm 211 may be displayed as values representing the three-way angles (Roll angle φH, Pitch angle θH, Yaw angle ψH).
• Step S450 may be a process of calculating the shoulder position 210, elbow position 220, and wrist position 230 data of the arm using the motion data of the forearm 221 obtained from the sensing unit 110, together with the motion data of the upper arm 211 and the elbow folding angle θ1x calculated from the motion data of the forearm 221. Data about the shoulder position 210, the elbow position 220, and the wrist position 230 of the arm may be displayed as coordinate values on the x, y, and z axes.
• The shoulder position 210 is a reference point for displaying the elbow position 220, the wrist position 230, and the central position 240 of the hand, and is represented by the coordinate value (0, 0, 0) of the XYZ axes corresponding to the reference point, so it may not require separate extraction or calculation.
  • FIG. 5 is a diagram for explaining a method of obtaining motion data of the forearm 221 according to an embodiment of the present invention.
  • the motion recognition device 10 may be mounted on a human arm.
• When the motion recognition device 10 is mounted on a human arm, it may be mounted on the upper part of the forearm 221, close to the elbow, so as not to be affected by the bending of the wrist.
  • the motion recognition device 10 may be mounted in a portion close to the elbow position 220 among the elbow position 220 and the wrist position 230.
  • a control device 50 that can be carried in the hand may be additionally configured.
  • the control device 50 may be connected to a wireless network through the communication unit 130 of the motion recognition device 10 or wired through the interface unit 180 of the motion recognition device 10.
  • the motion recognition device 10 mounted on the forearm 221 may sense the movement of the forearm 221 through the sensing unit 110.
  • the movement of the forearm 221 may be detected through the gyro sensor 112 of the sensing unit 110.
• The movement of the forearm 221 sensed through the gyro sensor 112 may be obtained as the three-way angles (φg, θg, ψg) of the gyro sensor 112.
• Since the motion recognition device 10 is worn on the upper portion of the forearm 221, it does not detect the rotation of the forearm 221, that is, the Roll φL value belonging to the absolute angle of the forearm 221; if a control device 50 that can be carried in the hand is not connected, the rotation of the forearm 221 may not be detected. On the other hand, even if the forearm 221 rotates, the position of the wrist does not change, so the wrist position 230 can still be known.
• The absolute angles θL and ψL of the forearm 221 may be expressed by the following equation through the three-way angles of the gyro sensor 112.
  • FIG. 6 is a view for explaining a method of obtaining a folding angle of an elbow according to an embodiment of the present invention.
• The distance between the forearm 221 and the upper arm 211 can be measured through the sensing unit 110 of the motion recognition device 10.
• The measurement distance Ds of the distance measurement sensor 111 of the sensing unit 110 may be measured by emitting an intangible energy such as light, sound, or radio waves from the emitting unit of the distance measurement sensor 111, which is reflected by the upper arm 211 of the arm, and using the speed at which it returns to the receiving unit.
• The measurement distance Ds of the distance measurement sensor 111 may be the distance to the nearest point of the upper arm 211 located on the same line as the distance measurement sensor 111, in a direction parallel to the line connecting the center of the elbow position 220 A1 and the center of the wrist position 230 A2.
• The distance ds between the distance measuring sensor 111 and the elbow may be the distance from the point where the perpendicular dropped from the sensor onto the line connecting the center of the elbow position 220 A1 and the center of the wrist position 230 A2 meets that line, to the center of the elbow position 220 A1.
• Since the distance ds between the distance measurement sensor 111 and the elbow varies each time the motion recognition device 10 is worn on the forearm 221, it may be determined through a numerical adjustment (calibration) process when the motion recognition device 10 is connected to the content or when the content is about to be executed.
• Alternatively, the distance ds between the distance measurement sensor 111 and the elbow may be set to an arbitrary value by the user.
• The height hs from the center of the elbow to the distance measuring sensor 111 can be represented as the length of the perpendicular from the distance measuring sensor 111 to the line connecting the center of the elbow position 220 A1 and the center of the wrist position 230 A2. Rather than being measured for each user, the height hs may be obtained by adding the radius of the forearm 221, taken from statistics on the forearm thickness of a person of a specific gender and age, to the height from the body-attachment surface of the motion recognition device 10 to the distance measuring sensor 111. For example, the height hs from the center of the elbow to the distance measurement sensor 111 can be calculated from statistics on the forearm thickness of men in their 30s, who are the main game player demographic.
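The calibration of ds and the statistical default for hs described above can be sketched as follows. This is an illustrative sketch only: the right-angle calibration pose and the numeric defaults are assumptions, not values given in this text.

```python
def calibrate_sensor_to_elbow_distance(D_s_at_right_angle):
    # Assumption: with the elbow folded at a right angle, the sensor's beam
    # passes directly over the elbow center, so the measured distance D_s
    # equals the sensor-to-elbow distance d_s (cf. the statement that D_s
    # and d_s may be the same).
    return D_s_at_right_angle

def default_sensor_height(forearm_radius_cm=4.5, device_height_cm=1.0):
    # h_s = forearm radius (taken from population statistics, e.g. men in
    # their 30s) + height of the distance sensor above the device's
    # body-attachment surface.  The numeric defaults are placeholders.
    return forearm_radius_cm + device_height_cm
```

As the text notes, a user could also override ds with an arbitrary value instead of calibrating.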
• As the elbow is folded, the measurement distance Ds of the distance measurement sensor 111 decreases, and thus the folding angle θ1x of the elbow can be derived.
• When the measurement distance Ds of the distance measurement sensor 111 and the distance ds between the distance measurement sensor 111 and the elbow are the same, it may indicate a state in which the arm is bent at a right angle.
• When the measurement distance Ds of the distance measurement sensor 111 is greater than the distance ds between the distance measurement sensor 111 and the elbow, it may indicate a state in which the arm is extended.
• When the measurement distance Ds of the distance measurement sensor 111 becomes smaller than the distance ds between the distance measurement sensor 111 and the elbow, it may represent a state in which the arm is bent.
• The folding angle θ1x of the elbow can be derived through the tan function.
• It can be seen that the measurement distance Ds of the distance measuring sensor 111 of the sensing unit 110 obtained as above, the distance ds between the distance measuring sensor 111 and the elbow, and the height hs from the center of the elbow to the distance measuring sensor 111 are related to the elbow folding angle θ1x as follows.
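The equation itself is published only as an image and is not reproduced in this text; one plausible reconstruction of the tan-function relation, from the geometry described (sensor at height hs above the elbow-wrist line, offset ds from the elbow center, beam parallel to the forearm), is sketched below. The formula is an assumption, not the patent's verbatim equation.

```python
import math

def elbow_fold_angle_deg(D_s, d_s, h_s):
    """Plausible reconstruction: theta_1x = atan2(h_s, d_s - D_s).

    Consistent with the qualitative statements in the surrounding text:
      D_s == d_s -> 90 degrees
      D_s >  d_s -> more than 90 degrees (arm extended)
      D_s <  d_s -> less than 90 degrees (arm bent)
    """
    return math.degrees(math.atan2(h_s, d_s - D_s))
```

The `atan2` form keeps the angle well defined even when Ds equals ds, where a plain tangent ratio would divide by zero.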
• The process of obtaining the folding angle θ1x of the elbow may be performed in the motion recognition device 10, or it may be performed in an apparatus having a display unit 161, such as an HMD or TV, by receiving the measurement distance Ds value of the distance measuring sensor 111 from the motion recognition device 10.
• Among the three-way angles of the gyro sensor 112, the Roll φg value can be set as the twist angle φ0 of the upper arm 211, and the remaining two values, Pitch θg and Yaw ψg, can be used to obtain the shoulder up-down rotation angle θ0x and the shoulder left-right rotation angle ψ0y.
• Each angle must be calculated in consideration of the degree of folding of the elbow, so the folding angle θ1x of the elbow can be utilized.
• Since the rotation axis of the elbow folding angle θ1x changes under the influence of the twist angle φ0 of the upper arm 211, the twist of the upper arm 211 is reflected in the Pitch θg and Yaw ψg values measured by the gyro sensor 112.
  • the motion data of the upper arm 211 may be obtained by using the motion recognition device 10 attached to the forearm 221.
• The absolute angles (θL, ψL) of the forearm 221 can be known through the three-way angles of the gyro sensor 112 of the sensing unit 110, and the folding angle θ1x of the elbow can be obtained as described above.
• Since the rotation axes of the gyro sensor 112 angles Roll φg, Pitch θg, and Yaw ψg rotate according to the twist angle φ0 of the upper arm 211, the shoulder up-down rotation angle θ0x, the shoulder left-right rotation angle ψ0y, and the twist angle φ0 of the upper arm 211 have the following relationship.
• The shoulder up-down rotation angle θ0x, the shoulder left-right rotation angle ψ0y, and the twist angle φ0 of the upper arm 211 can be derived as follows.
• It can be seen that the Roll angle φH, Pitch angle θH, and Yaw angle ψH, which are the absolute angles of the upper arm 211, are equal to the twist angle φ0 of the upper arm 211, the shoulder up-down rotation angle θ0x, and the shoulder left-right rotation angle ψ0y, respectively. Therefore, the absolute angles φH, θH, and ψH of the upper arm 211 can be expressed by the following equation.
  • FIG. 7 is a view for explaining a method for obtaining position data of an arm according to an embodiment of the present invention.
• The method may include setting the shoulder position 210 A0 as a reference point (S710), obtaining the length L1 of the upper arm 211 and the length L2 of the forearm 221 (S720), obtaining the elbow position 220 A1 (S730), and obtaining the wrist position 230 A2 (S740).
• The shoulder position 210 is a reference point for displaying the elbow position 220, the wrist position 230, and the central position 240 of the hand, and is represented by the coordinate value (0, 0, 0) of the XYZ axes corresponding to the reference point, so it may not need to be separately extracted or calculated.
  • the length L 1 of the upper arm 211 and the length L 2 of the forearm 221 may be obtained by measuring the lengths of the user's actual upper arm 211 and forearm 221.
• Alternatively, the length L1 of the upper arm 211 and the length L2 of the forearm 221 may not be measured for each user, but may be statistical values for the upper arm and forearm lengths of a person of a particular sex and age.
• For example, the length L1 of the upper arm 211 and the length L2 of the forearm 221 may be calculated from statistics on the arm lengths of men in their 30s, who are the main game player demographic.
• In step S730, the elbow position 220 A1, which is a relative position from the coordinate value (0, 0, 0) of the reference shoulder position 210 A0, may be obtained.
• The elbow position 220 A1 can be obtained by the following equation through φH, θH, and ψH, the absolute angles of the upper arm 211, and the length L1 of the upper arm 211.
• In step S740, the wrist position 230 A2, which is a relative position from the coordinate value (0, 0, 0) of the reference shoulder position 210 A0, may be obtained.
• The wrist position 230 A2 can be obtained by the following formula through the elbow position 220 A1, the absolute angles θL and ψL of the forearm 221, and the length L2 of the forearm 221.
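Since the position equations appear only as images in this text, the following is a hedged sketch of the forward kinematics they describe: the elbow A1 from the upper-arm absolute angles and L1, then the wrist A2 from A1, the forearm absolute angles, and L2. The axis convention (x forward from the shoulder, pitch as downward elevation, yaw as azimuth) is an assumption, not the patent's stated convention.

```python
import math

def segment_direction(pitch_deg, yaw_deg):
    # Unit vector of a limb segment from its absolute pitch/yaw angles
    # (assumed convention: x forward, y lateral, z up).
    p, y = math.radians(pitch_deg), math.radians(yaw_deg)
    return (math.cos(p) * math.cos(y), math.cos(p) * math.sin(y), -math.sin(p))

def elbow_position(theta_H, psi_H, L1):
    # A1 relative to the shoulder reference point A0 = (0, 0, 0).
    return tuple(L1 * c for c in segment_direction(theta_H, psi_H))

def wrist_position(A1, theta_L, psi_L, L2):
    # A2 = A1 + L2 * unit direction of the forearm.
    return tuple(a + L2 * c for a, c in zip(A1, segment_direction(theta_L, psi_L)))
```

With the arm held straight forward (all angles 0) and L1 = 30, L2 = 25, this yields A1 = (30, 0, 0) and A2 = (55, 0, 0), matching the chained-segment description in the text.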
• Through this, the shoulder position 210 A0, the elbow position 220 A1, and the wrist position 230 A2 of the human arm can be obtained, and the movement of the arm can be detected through changes in the values of A1 and A2.
  • FIG. 8 is a view showing an operation recognition device 10 connected to the HMD device and the control device 50.
• The HMD device 80 is a display device that can be worn on the user's head, showing the world of virtual reality (VR) through a display placed in front of the user's eyes, or augmented reality (AR) in which virtual images are combined with the real world.
  • the control device 50 may detect movement of the control device 50 through various sensors capable of detecting the movement of the control device 50 and transmit motion data of the control device 50 to the display device.
  • the display device may include an HMD device 80.
• The control device 50 is manufactured in a form that can be carried in the hand, senses the movement of the wrist, generates motion data, and transmits it to the display device or the motion recognition device 10. Based on the transmitted movement data of the control device 50 or of the wrist, a control command may be given to an application running on the display device, and the movement of an object in that application may be controlled. The object may be a thing or a person in the application, or a tool needed in the application.
  • the motion recognition device 10 may output vibration or sound as feedback for a control command sent to an application of the display device or a motion control command of the object.
• The motion recognition device 10 may also output vibration or sound as feedback when the object is restricted in its movement or receives an external impact in the virtual reality environment.
• The control device 50 may, as an example, have a configuration like that of the motion recognition device 10 of FIG. 1, and may be manufactured in an easy-to-carry form.
• The sensing unit 510 of the control device 50 may detect the movement of the wrist of the user carrying the control device 50 in hand and transmit the detected movement information of the wrist to the communication unit 530 of the control device 50.
  • the detected motion data of the wrist may be transmitted to the motion recognition device 10 by the communication unit 530 of the control device 50.
  • the detected motion data of the wrist may be transmitted to a device having a display unit 810 such as an HMD device 80 or a TV.
• The control device 50 may detect the movement of the wrist through the gyro sensor 520 of the sensing unit 510.
• The motion data of the wrist detected through the gyro sensor 520 may be obtained as the three-way angles (φm, θm, ψm) of the gyro sensor 520.
• The HMD device 80 may receive the motion data of the forearm 221 and the measurement distance Ds of the distance measurement sensor 111 from the communication unit 130 of the motion recognition device 10 through the communication unit 830. Also, the HMD device 80 may receive movement data of the wrist from the communication unit 530 of the control device 50 through the communication unit 830. The control unit 820 of the HMD device 80 may calculate the motion data of the upper arm 211 using the received motion data of the forearm 221 and the measurement distance Ds of the distance measurement sensor 111, and may calculate the shoulder position 210, the elbow position 220, and the wrist position 230 using the motion data of the forearm 221, the motion data of the upper arm 211, and the motion data of the wrist.
• In this way, the control unit 820 of the HMD device 80 can accurately calculate the motion of the user's arm using the motion data of the forearm 221, the measurement distance Ds of the distance measurement sensor 111, and the motion data of the wrist. Based on the calculated shoulder position 210, elbow position 220, wrist position 230, and hand center position 240, model data of the arm may be generated through the control unit 820 of the HMD device 80 and output through the display unit 810.
  • the motion recognition device 10 or the HMD device 80 may receive wrist movement data from the communication unit 530 of the control device 50 through the respective communication units 130 and 830.
• The absolute angles of the wrist φW, θW, and ψW can be expressed by the following equation.
• The motion recognition device 10 may obtain the twist angle φ1 of the forearm 221 as follows, using the Roll value φm of the wrist movement data and the twist angle φ0 of the upper arm 211.
• The control device 50 may perform the function of showing the 3 degrees of freedom of the wrist, and the motion recognition device 10 may additionally provide the folding angle θ1x of the elbow and the 3 degrees of freedom of the forearm 221, thereby implementing the 7 degrees of freedom needed to show the movement of the entire arm.
  • FIG. 9 is a view for explaining a method of obtaining position data of an arm and a hand according to an embodiment of the present invention.
• The method may include setting the shoulder position 210 A0 as a reference point (S910), obtaining the length L1 of the upper arm 211, the length L2 of the forearm 221, and the length L3 of the hand (S920), obtaining the elbow position 220 A1 (S930), obtaining the wrist position 230 A2 (S940), and obtaining the center position of the hand 240 A3 (S950).
• The shoulder position 210 is a reference point for displaying the elbow position 220, the wrist position 230, and the central position 240 of the hand, and is represented by the coordinate value (0, 0, 0) of the XYZ axes corresponding to the reference point, so it may not need to be separately extracted or calculated.
• The length L1 of the upper arm 211, the length L2 of the forearm 221, and the length L3 of the hand can be obtained by measuring the lengths of the user's actual upper arm 211, forearm 221, and hand.
• Alternatively, the length L1 of the upper arm 211, the length L2 of the forearm 221, and the length L3 of the hand may not be measured for each user; instead, statistics on the upper arm, forearm, and hand lengths of a person of a specific gender and age may be used.
• For example, the length L1 of the upper arm 211, the length L2 of the forearm 221, and the length L3 of the hand may be calculated from statistics on the arm lengths of men in their 30s, who are the main game player demographic.
• In step S930, the elbow position 220 A1, which is a relative position from the coordinate value (0, 0, 0) of the reference shoulder position 210 A0, may be obtained.
• The elbow position 220 A1 can be obtained by the following equation through φH, θH, and ψH, the absolute angles of the upper arm 211, and the length L1 of the upper arm 211.
• In step S940, the wrist position 230 A2, which is a relative position from the coordinate value (0, 0, 0) of the reference shoulder position 210 A0, may be obtained.
• The wrist position 230 A2 can be obtained by the following formula through the elbow position 220 A1, the absolute angles θL and ψL of the forearm 221, and the length L2 of the forearm 221.
• In step S950, the center position of the hand 240 A3, which is a relative position from the coordinate value (0, 0, 0) of the reference shoulder position 210 A0, may be obtained.
• The central position of the hand 240 A3 can be obtained by the following formula through the wrist position 230 A2, the absolute angles of the wrist φW, θW, and ψW, and the length L3 of the hand.
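Continuing the same assumed axis convention as the arm case, the hand center A3 chains one more segment onto the wrist using the wrist absolute angles and the hand length L3. This is a minimal sketch under that assumed convention, not the patent's exact equation:

```python
import math

def hand_center(A2, theta_W, psi_W, L3):
    # A3 = A2 + L3 * unit direction given by the wrist's absolute
    # pitch/yaw angles (assumed convention: x forward, y lateral, z up).
    p, y = math.radians(theta_W), math.radians(psi_W)
    d = (math.cos(p) * math.cos(y), math.cos(p) * math.sin(y), -math.sin(p))
    return tuple(a + L3 * c for a, c in zip(A2, d))
```

For a wrist at (55, 0, 0) with the hand pitched 90 degrees downward and L3 = 10, A3 comes out near (55, 0, -10), i.e. the hand hangs straight down from the wrist.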
• Through this, the shoulder position 210 A0, the elbow position 220 A1, the wrist position 230 A2, and the center position of the hand 240 A3 of the human arm can be obtained, and the movement of the arm and hand can be detected through changes in the values of A1, A2, and A3.
  • FIG. 10 is a view for explaining a method of displaying the movement of a human arm according to an embodiment of the present invention.
  • a method of displaying movement of a human arm includes receiving arm movement data (S1010), obtaining arm location data (S1020), and generating an arm shape using the arm location data It may include a step (S1030) and outputting the arm shape (S1040).
  • Step S1010 may be a process of receiving the motion data of the arm transmitted from the motion recognition device 10 in an apparatus having a display unit 161 such as an HMD device 80 or a TV.
  • the motion data of the arm may include motion data of the forearm 221 of the arm generated by the motion recognition device 10 and a distance value between the forearm 221 and the upper arm 211.
• The forearm 221 motion data and the distance value between the forearm 221 and the upper arm 211 may be measured values of the sensing units 110 and 510 obtained by the motion recognition device 10 or the control device 50.
• The movement of the forearm 221 detected through the gyro sensor 112 of the motion recognition device 10 can be received by the HMD device as the three-way angles (φg, θg, ψg) of the gyro sensor 112.
  • the distance value between the forearm 221 and the upper arm 211 may be received by the HMD device as the measurement distance D s of the distance measurement sensor 111 of the sensing unit 110 described in FIG. 6.
• Alternatively, the motion data of the arm may be received by the HMD device as the shoulder position 210 A0, elbow position 220 A1, and wrist position 230 A2 values corresponding to the position data of the arm generated by the motion recognition device 10.
  • the process of acquiring the arm position data in step S1020 may be omitted.
  • the shoulder position 210 A 0 may be omitted as a reference value.
• Step S1020 may be a process of obtaining the values of the shoulder position 210 A0, the elbow position 220 A1, and the wrist position 230 A2 of the arm, which may indicate the position of the arm.
• The position data of the arm may be obtained, based on the motion data of the forearm 221 of the arm and the distance value between the forearm 221 and the upper arm 211, through the methods of obtaining the absolute angle of the forearm 221, the absolute angle of the upper arm 211, the folding angle of the elbow, and the position data of the arm described in FIGS. 5, 6, and 7.
• Through step S1020, the shoulder position 210 A0, the elbow position 220 A1, and the wrist position 230 A2 of the human arm can be obtained, and the movement of the arm can be detected through changes in the values of A1 and A2. It is also possible for an apparatus including a display unit 161, such as an HMD device 80 or a TV, to receive the position data of the arm directly from the motion recognition device 10 without performing the calculation of step S1020.
  • Step S1030 may be a process of generating the shape of the arm by using the position data of the arm to display the shape of the arm on the display.
  • the shoulder position 210 A 0 indicates the position of the center point of the shoulder
  • the elbow position 220 A 1 indicates the position of the center of the elbow
  • the wrist position 230 A 2 indicates the position of the center of the wrist joint.
• The shape of the entire arm may be shown using the values of the shoulder position 210 A0, the elbow position 220 A1, and the wrist position 230 A2, and the shape of the hand may be expressed as extended from the forearm 221 in a certain direction or as holding something.
  • Step S1040 may be a process of outputting the shape of the generated arm to the display unit 161.
  • the movement of the arm may be displayed on the display unit 161.
• The shoulder position 210 A0, the elbow position 220 A1, and the wrist position 230 A2 may be displayed in real time on the display unit 161 according to the movements of the user's shoulder joint and elbow joint.
  • FIG. 11 is a view for explaining a method of displaying hand and arm movements according to an embodiment of the present invention.
• The method for displaying hand and arm movement may include receiving hand and arm movement data (S1110), acquiring hand and arm position data (S1120), generating an arm shape using the hand and arm position data (S1130), and outputting the arm shape (S1140).
  • Step S1110 may be a process of receiving the motion data of the hand and arm transmitted from the motion recognition device 10 in an apparatus having a display unit 161 such as an HMD device 80 or a TV.
  • the motion data of the hand transmitted by the motion recognition device 10 may be data generated by the control device 50 and transmitted to the motion recognition device 10.
  • an apparatus having a display unit 810 such as an HMD device 80 or a TV may receive hand motion data from the control device 50 and arm motion data from the motion recognition device 10.
  • Hand movement data may include wrist movement data generated by the control device 50.
  • the wrist movement data may be a measurement value of the sensing unit 510 acquired by the control device 50.
  • the motion data of the arm may include motion data of the forearm 221 of the arm generated by the motion recognition device 10 and a distance value between the forearm 221 and the upper arm 211.
  • the forearm 221 motion data and the distance value between the forearm 221 and the upper arm 211 may be measured values of the sensing unit 510 obtained by the motion recognition device 10 or the control device 50.
• The movement of the wrist detected through the gyro sensor 520 of the control device 50 may be received directly by the HMD device as the three-way angles (φm, θm, ψm) of the gyro sensor 520, or may be received by the motion recognition device 10 and relayed to the HMD device.
• The motion of the forearm 221 detected through the gyro sensor 112 of the motion recognition device 10 can be received by the HMD device as the three-way angles (φg, θg, ψg) of the gyro sensor 112.
  • the distance value between the forearm 221 and the upper arm 211 may be received by the HMD device as the measurement distance D s of the distance measurement sensor 111 of the sensing unit 110 described in FIG. 6.
• Alternatively, the movement data of the hand and arm may be received by the HMD device as the shoulder position 210 A0, elbow position 220 A1, wrist position 230 A2, and hand center position 240 A3 values corresponding to the position data generated by the motion recognition device 10, and in this case the process of acquiring the position data in step S1120 may be omitted. Also, the shoulder position 210 A0 may be omitted as a reference value.
• Step S1120 may be a process of obtaining the shoulder position 210 A0, elbow position 220 A1, wrist position 230 A2, and hand center position 240 A3 values, which can indicate the position of the hand and arm.
• The position data of the hand and arm may be obtained, based on the motion data of the wrist, the motion data of the forearm 221, and the distance value between the forearm 221 and the upper arm 211, through the methods of obtaining the absolute angle of the wrist, the absolute angle of the forearm 221, the absolute angle of the upper arm 211, the folding angle of the elbow, and the position data of the arm described in FIGS. 5, 6, and 9.
• Through step S1120, the shoulder position 210 A0, the elbow position 220 A1, the wrist position 230 A2, and the center position of the hand 240 A3 of the human arm may be obtained, and the movements of the arm and hand can be detected through changes in the values of A1, A2, and A3. It is also possible for an apparatus including a display unit 161, such as an HMD device 80 or a TV, to receive the position data of the hand and arm directly from the motion recognition device 10 without performing the calculation of step S1120.
  • Step S1130 may be a process of generating the shape of the arm by using the position data of the hand and arm to display the shape of the arm on the display.
  • Shoulder position 210 A 0 indicates the position of the center of the shoulder
  • elbow position 220 A 1 indicates the position of the elbow center
  • wrist position 230 A 2 indicates the position of the center of the wrist joint
  • the central position 240 A 3 may indicate the position of the center of the hand. Using the values of the shoulder position 210 A 0 , the elbow position 220 A 1 , the wrist position 230 A 2 , and the center position of the hand 240 A 3, the shape of the entire arm, including the hand, may be represented.
  • Step S1140 may be a process of outputting the shape of the generated arm to the display unit 161.
  • the movement of the arm may be displayed on the display unit 161.
• The movements of the user's shoulder joint, elbow joint, and wrist joint may be displayed in real time on the display unit 161 through the shoulder position 210 A0, elbow position 220 A1, wrist position 230 A2, and hand center position 240 A3 values.
  • FIG. 12 is a view for explaining the movement of the arm displayed on the screen according to an embodiment of the present invention.
  • the movement of the arm may be displayed by various devices such as an HMD device 80 or a TV including the display unit 161.
• Through the shoulder position 210 A0, elbow position 220 A1, wrist position 230 A2, and hand center position 240 A3 values, the movements of the user's shoulder joint, elbow joint, and wrist joint may be displayed on the display unit 161.
• The center position of the hand 240 A3 may also be represented, without separately calculating its value, as a shape in which the hand is extended from the forearm 221 in a certain direction or holds something, without movement of the wrist joint.
  • the motion of the arm can be represented only by the data detected by the motion recognition device 10 without the control device 50 for detecting the motion data of the wrist.
• When the motion recognition device 10 is mounted on only one arm to detect its movement, only the shape of that arm may be displayed on the display device, or the other arm may be shown making the same movement so that the shapes of both arms are displayed on the display device.
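The option of showing the other arm making the same movement can be implemented by mirroring the computed joint positions across the body's midline; a minimal sketch, assuming the y-axis points laterally from the shoulder reference point (the axis choice is an assumption, not stated in this text):

```python
def mirror_arm(joint_positions):
    # Reflect each joint position (x, y, z) across the x-z plane so the
    # opposite arm appears to make the same movement.
    return [(x, -y, z) for (x, y, z) in joint_positions]
```

Applied to the list [A0, A1, A2, A3] of one arm's joints, this yields the corresponding joints of the mirrored arm.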
• When the motion recognition device 10 is mounted on both arms to sense their movement, the movements of the left arm and the right arm can be displayed independently on the display device.
  • the shape of the human body displayed on the display device may be displayed in various forms, such as the shape of the whole body, the shape of the upper body, and the shape of only the arm.
• The present invention relates to a motion recognition device, is applicable to apparatuses using a motion recognition device, and is reproducible, and thus has industrial applicability.

Abstract

The present invention relates to a motion recognition device and a motion recognition method using the same and, more particularly, to a motion recognition device and method for recognizing the movement of an arm using a gyro sensor and a laser sensor. A motion recognition device according to an embodiment of the present invention comprises: a sensing unit for collecting motion data of a user's forearm part detected by a gyro sensor and collecting distance data detected by a distance measurement sensor; a control unit for calculating motion data of an upper arm part on the basis of the distance data and the motion data of the forearm part, and for calculating motion data of an arm on the basis of the motion data of the forearm part and the motion data of the upper arm part; and a communication unit for transmitting the motion data of the arm to a first external device over a network.
PCT/KR2018/012734 2018-10-25 2018-10-25 Dispositif de reconnaissance de mouvement et procédé de reconnaissance de mouvement l'utilisant WO2020085537A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2018/012734 WO2020085537A1 (fr) 2018-10-25 2018-10-25 Dispositif de reconnaissance de mouvement et procédé de reconnaissance de mouvement l'utilisant

Publications (1)

Publication Number Publication Date
WO2020085537A1 (fr) 2020-04-30

Family

ID=70331173

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/012734 WO2020085537A1 (fr) 2018-10-25 2018-10-25 Motion recognition device and motion recognition method using same

Country Status (1)

Country Link
WO (1) WO2020085537A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113116319A (zh) * 2021-04-22 2021-07-16 Comely (Guangdong) Electric Co., Ltd. Smart home control system that senses emotion to drive scene changes

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3862348B2 (ja) * 1997-03-19 2006-12-27 Tokyo Electric Power Company Motion capture system
US20100110169A1 (en) * 2008-07-24 2010-05-06 Noah Zerkin System and method for motion capture
KR101483713B1 (ko) * 2008-06-30 2015-01-16 Samsung Electronics Co., Ltd. Motion capture device and motion capture method
JP5896240B2 (ja) * 2013-03-21 2016-03-30 Casio Computer Co., Ltd. Exercise support device, exercise support method, and exercise support program
JP2017120575A (ja) * 2015-12-28 2017-07-06 Tec Gihan Co., Ltd. Posture and motion detection device

Similar Documents

Publication Publication Date Title
WO2016117758A1 (fr) Hand rehabilitation exercise system and method therefor
WO2019156518A1 (fr) Method for tracking hand posture and electronic device therefor
WO2018066848A1 (fr) Skin care device
WO2015084111A1 (fr) User input processing device using a limited number of magnetic field sensors
WO2017086508A1 (fr) Mobile terminal and control method therefor
WO2019098511A1 (fr) Skin care device
WO2015083895A1 (fr) Smart watch and control method therefor
WO2016182181A1 (fr) Wearable device and method for providing feedback of the wearable device
WO2016060461A1 (fr) Wearable device
WO2016148486A1 (fr) Wearable device
WO2020050636A1 (fr) Method and apparatus for gesture recognition based on user intention
WO2016190477A1 (fr) Mobile terminal
WO2021075748A1 (fr) Watch-type electronic device comprising an antenna
WO2022035027A1 (fr) Electronic device for controlling a host device using a motion signal and a mouse signal
WO2022097862A1 (fr) Display control method and electronic device supporting said method
WO2020138602A1 (fr) Method for identifying a user's real hand and wearable device therefor
WO2020085537A1 (fr) Motion recognition device and motion recognition method using same
WO2015137629A1 (fr) Electromyography and motion detection system and control method thereof
WO2016163591A1 (fr) Watch-type mobile terminal
WO2020085538A1 (fr) Voice recognition system and voice recognition method using said system
US10948980B2 (en) Electronic device system with controllers
WO2018056572A1 (fr) Ultrasound probe, ultrasound imaging apparatus, ultrasound imaging system, and control method therefor
WO2021107200A1 (fr) Mobile terminal and mobile terminal control method
WO2022085940A1 (fr) Method and apparatus for controlling display of a plurality of objects on an electronic device
WO2020080733A1 (fr) Interface device for controlling virtual content

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18937592

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01/10/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18937592

Country of ref document: EP

Kind code of ref document: A1