KR101667607B1 - System for controlling lower body motion of avatar and method thereof - Google Patents

Info

Publication number
KR101667607B1
KR101667607B1 (application KR1020150063406A)
Authority
KR
South Korea
Prior art keywords
lower body
avatar
data
person
pressure sensor
Prior art date
Application number
KR1020150063406A
Other languages
Korean (ko)
Inventor
권정흠
이지용
염기원
한혜진
유범재
Original Assignee
재단법인 실감교류인체감응솔루션연구단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 재단법인 실감교류인체감응솔루션연구단
Priority to KR1020150063406A
Application granted
Publication of KR101667607B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06K 9/00362
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06T 13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Abstract

The present invention relates to a system for controlling lower body motion of an avatar and a method for the same. The system for controlling the lower body motion of the avatar includes a pressure sensor for sensing the pressure of a sole, an inertial measurement unit (IMU) sensor for sensing the position and rotation of a foot, and a control device. The control device includes a data collecting module for collecting data sensed by the pressure sensor and the IMU sensor, a data analyzing module for analyzing data collected by the data collecting module to confirm a lower body motion of a person, and an avatar control module for generating a control signal for controlling the lower body motion of the avatar according to the lower body motion of the person and transmitting the control signal to the avatar. According to the present invention, the lower body motion of the person is detected in real time, so that the lower body motion of the avatar realized using a virtual image or a robot can be controlled in real time.

Description

[0001] The present invention relates to a system and method for controlling the lower body motion of an avatar.

More particularly, the present invention relates to a control system and method capable of collecting data on the lower body motion of a person and controlling the lower body motion of an avatar according to the collected data.

Techniques for recognizing human motion are used in various fields such as rehabilitation, robotics, games, and sports training. In particular, various attempts have been made to recognize human motion in real time in order to generate virtual images or control robots in real time.

In general, two techniques are used for recognizing human motion: attaching markers to the joints of the human body and recognizing the attached markers with a camera, and attaching inertial measurement unit (IMU) sensors to the joints and measuring changes in the joint angles.

However, in the marker-based method, if another object lies between the marker and the camera, the camera cannot recognize the marker correctly, so an unobstructed space must be secured between the marker and the camera. The IMU-based method, in turn, requires many sensors to be attached to the human body and is prone to errors caused by sensor drift.

Accordingly, it is an object of the present invention to provide a system and method that collect data on the lower body motion of a person using a pressure sensor and an IMU sensor, analyze the collected data, and apply the result to an avatar realized as a virtual image or a robot, so that the lower body motion of the avatar can be controlled in real time.

According to an aspect of the present invention, a system for controlling the lower body motion of an avatar includes a pressure sensor for sensing the pressure of the sole, an IMU sensor for sensing the position and rotation of the foot, and a control device. The control device includes a data collection module for collecting the data sensed by the pressure sensor and the IMU sensor, a data analysis module for analyzing the collected data to identify the lower body motion of the person, and an avatar control module for generating a control signal that controls the lower body motion of the avatar according to the identified motion and transmitting the control signal to the avatar.

According to another aspect of the present invention, a method for controlling the lower body motion of an avatar includes sensing the pressure of the sole using a pressure sensor, sensing the position and rotation of the foot using an IMU sensor, collecting the sensed data, analyzing the collected data to identify the lower body motion of the person, and generating a control signal for controlling the lower body motion of the avatar according to the identified motion and transmitting the control signal to the avatar.

According to the present invention, the lower body motion of a person can be detected in real time, and the lower body motion of an avatar realized as a virtual image or a robot can be controlled in real time.

In addition, according to the present invention, the number of IMU sensors attached to the person can be reduced, and because the pressure sensor and the IMU sensor can be implemented as a single wearable device, the inconvenience commonly caused by motion-recognition techniques can be minimized.

Furthermore, since the present invention does not rely on a camera to recognize the person's lower body motion, a person wearing the pressure sensor and the IMU sensor can move freely without spatial restrictions.

Further, by analyzing the relative position of the body during lower body motion using the pressure sensor and the IMU sensor together, the present invention is effective for controlling the lower body motion of the avatar in real time.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a view showing a system for controlling the lower body motion of an avatar according to the present invention.
FIG. 2 is a diagram showing an analysis method according to the present invention.
FIG. 3 is a diagram illustrating an analysis method according to an embodiment of the present invention.
FIG. 4 is a view showing a method of controlling the lower body motion of an avatar according to the present invention.

The present invention relates to a method of collecting data on the lower body motion of a person using a pressure sensor and an IMU sensor, analyzing the collected data, and controlling in real time, according to the analysis, the lower body motion of an avatar realized as a virtual image or a robot.

Hereinafter, an embodiment of the present invention will be described in detail with reference to the drawings.

FIG. 1 is a view showing a system for controlling the lower body motion of an avatar according to the present invention. As shown in FIG. 1, the avatar lower body motion control system may include a sensing device 10, a control device 20, and an avatar 30. The sensing device 10, the control device 20, and the avatar 30 may exchange signals through wired or wireless communication, over a local network or the Internet.

The sensing device 10 is an apparatus that can be attached to or worn on a person's foot to sense the person's lower body motion, and may include a pressure sensor 11 and an IMU sensor 12.

The pressure sensor 11 is a device for sensing the pressure of the soles of the feet, and can detect the pressure distribution and its change pattern.

The IMU sensor 12 is a device for sensing the position and rotation of the foot, and can detect the direction of movement and the rotation of the foot.

The pressure sensor 11 and the IMU sensor 12 may each be attached to the foot, or may be integrated into a single wearable device such as a shoe; worn this way, they cause minimal discomfort. Because the pressure sensor 11 and the IMU sensor 12 transmit the sensed data to the control device 20, no open space is required between the sensing device 10 and the control device 20, unlike the camera-based method, and the physical distance between them is not limited.

The control device 20 collects the data sensed by the sensing device 10, analyzes it, and controls the lower body motion of the avatar based on the analysis. The control device 20 includes a data acquisition module 21, a data analysis module 22, and an avatar control module 23.

The data acquisition module 21 collects the data sensed by the pressure sensor 11 and the IMU sensor 12, receiving the sensed data from both sensors. The collected data is transferred to the data analysis module 22.

The data analysis module 22 analyzes the data sensed by the pressure sensor 11 and the IMU sensor 12 to determine the person's lower body motion in real time.

The data sensed by the pressure sensor 11 includes the pressure magnitude, the pressure distribution, and the change pattern of the pressure distribution. The data analysis module 22 can therefore estimate the person's height or weight from the magnitude of the pressure or the distance between the feet where pressure is sensed. It can also estimate the person's center of gravity from the pressure distributions of both feet, and calculate the distance the person has moved from the change in the center of gravity and the cumulative movement. In addition, it can estimate the person's direction of movement from the change pattern of the pressure distributions of both feet.
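The center-of-gravity estimate described above can be sketched as a pressure-weighted average of the two foot positions. The following Python sketch is illustrative only; the grid layout, foot spacing, and function name are assumptions, not part of the disclosure:

```python
def center_of_gravity_x(left_pressure, right_pressure, foot_spacing=0.3):
    """Estimate the lateral position of the centre of gravity from the
    total pressure under each foot.

    left_pressure / right_pressure: 2-D grids of insole cell pressures.
    foot_spacing: assumed distance between the foot centres in metres.
    Returns the offset in metres, negative toward the left foot.
    """
    left_total = sum(sum(row) for row in left_pressure)
    right_total = sum(sum(row) for row in right_pressure)
    total = left_total + right_total
    if total == 0:
        return 0.0  # both feet airborne: no pressure-based estimate
    # pressure-weighted average of the two foot positions (+/- spacing/2)
    return (right_total - left_total) / total * (foot_spacing / 2.0)
```

With equal pressure under both feet the estimate is zero; shifting all pressure to the right foot moves it to +foot_spacing/2.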

The data sensed by the IMU sensor 12 includes the position and rotation of the foot. The data analysis module 22 can therefore estimate the position and rotation of the foot in the air by calculating the foot's direction and acceleration.
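The position-and-rotation estimate from the IMU data amounts to numerically integrating the inertial samples during the swing phase. The sketch below dead-reckons along a single forward axis; the names, sampling interval, and the assumption of gravity-compensated accelerations are illustrative, not from the patent:

```python
def integrate_swing(forward_accels, yaw_rates, dt=0.01):
    """Dead-reckon the foot during the swing phase by integrating IMU
    samples along one forward axis (a full 3-D version would also
    rotate readings into the world frame before integrating).

    Returns (forward displacement in metres, heading change in radians).
    """
    velocity = 0.0
    position = 0.0
    heading = 0.0
    for accel, yaw_rate in zip(forward_accels, yaw_rates):
        velocity += accel * dt     # integrate acceleration -> velocity
        position += velocity * dt  # integrate velocity -> position
        heading += yaw_rate * dt   # integrate yaw rate -> heading
    return position, heading
```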

Accordingly, the data analysis module 22 can determine the person's center of gravity, distance and direction of movement, and foot position and direction from the data sensed by the pressure sensor 11 and the IMU sensor 12, and from these can confirm the state of the person's lower body motion.

In addition, the data analysis module 22 may accumulate and analyze the data sensed by the pressure sensor 11 and the IMU sensor 12, thereby correcting errors that may occur while sensing the person's lower body motion. For example, an error caused by drift in the IMU sensor 12 can be corrected using the data sensed by the pressure sensor 11.
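One plausible form of this correction is a zero-velocity update: when the pressure sensor shows the foot is planted, the foot's true velocity is zero, so any velocity accumulated from the IMU is drift and can be discarded. The sketch below is an illustration under assumed names and thresholds, not the patented procedure:

```python
def correct_drift(velocities, sole_pressures, contact_threshold=5.0):
    """Zero-velocity-update (ZUPT) style correction: whenever the insole
    pressure exceeds the contact threshold, the foot is treated as
    planted and its integrated IMU velocity is reset to zero.
    The threshold value is illustrative.
    """
    return [0.0 if pressure > contact_threshold else velocity
            for velocity, pressure in zip(velocities, sole_pressures)]
```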

The avatar control module 23 generates a signal for controlling the avatar 30 according to the data analyzed by the data analysis module 22 and transmits the signal to the avatar 30. By transmitting a signal that controls the avatar's lower body motion to resemble the person's current lower body motion, the module can change the avatar's posture or movement.

The avatar control module 23 generates a control signal for controlling the lower body motion of the avatar according to the person's center of gravity, distance and direction of movement, and foot position and direction as analyzed by the data analysis module 22. For example, when the person's reference position is known, the avatar control module 23 can calculate the angles of the avatar's leg joints through inverse kinematics. The reference position may be measured in advance, or estimated from the pressure values sensed by the pressure sensor 11 in a standing posture.
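For a planar leg with known thigh and shank lengths, the inverse-kinematics step can be sketched with the law of cosines. The link lengths, axes, and angle conventions below are assumed for illustration; the patent does not specify them:

```python
import math

def leg_ik(x, y, thigh=0.45, shank=0.45):
    """Planar two-link inverse kinematics for one leg: given the ankle
    position relative to the hip (x forward, y downward, in metres),
    return (hip_angle, knee_flexion) in radians. A straight leg
    hanging vertically gives (0, 0).
    """
    d = math.hypot(x, y)
    d = max(min(d, thigh + shank), 1e-9)  # clamp unreachable targets
    # knee flexion from the law of cosines
    cos_knee = (thigh**2 + shank**2 - d * d) / (2.0 * thigh * shank)
    knee = math.pi - math.acos(max(-1.0, min(1.0, cos_knee)))
    # hip angle = direction to the ankle plus the thigh's offset from it
    cos_gamma = (thigh**2 + d * d - shank**2) / (2.0 * thigh * d)
    gamma = math.acos(max(-1.0, min(1.0, cos_gamma)))
    hip = math.atan2(x, y) + gamma
    return hip, knee
```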

In addition, the avatar control module 23 can accumulate the information analyzed by the data analysis module 22 and generate the control signal for the avatar by learning from it with a probability-based model such as a hidden Markov model (HMM) or an artificial neural network.
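As an illustration of such a probability-based model, the following sketch implements the forward algorithm of a discrete HMM; hidden states would correspond to gait phases and observations to quantized sensor patterns, but the matrices here are placeholders, not parameters learned from real data:

```python
def hmm_forward(observations, start_prob, trans_prob, emit_prob):
    """Forward algorithm for a discrete HMM: returns the probability of
    an observation sequence given the model (start, transition, and
    emission probabilities as nested lists).
    """
    n = len(start_prob)
    # initialise with the first observation
    alpha = [start_prob[i] * emit_prob[i][observations[0]] for i in range(n)]
    # propagate through the remaining observations
    for obs in observations[1:]:
        alpha = [sum(alpha[i] * trans_prob[i][j] for i in range(n))
                 * emit_prob[j][obs] for j in range(n)]
    return sum(alpha)
```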

The avatar control module 23 transmits the generated control signal to the avatar 30. On receiving the control signal, the avatar 30 can adopt postures such as walking forward, walking backward, running, or standing in real time, in response to the lower body motion of the person detected by the sensing device 10.

The avatar 30 may be a virtual image or a robot, and changes in response to the control signal received from the control device 20. For example, when the avatar is a virtual image, the control signal may be transmitted to the apparatus displaying the avatar; when the avatar is a robot, the control signal may be transmitted directly to the robot. Here, the avatar includes any type of data or mechanical device capable of representing lower body motion, and the physical distance between the control device 20 and the avatar 30 is not limited.

FIG. 2 is a diagram showing an analysis method according to the present invention.

When the person is standing, the center of gravity of both feet analyzed by the data analysis module 22 lies midway between the feet. As shown in FIG. 2(a), when the person is walking forward, the change pattern of the pressure distribution sensed by the pressure sensor 11 shows the region of high pressure moving from the back of the foot toward the front. The data analysis module 22 can therefore determine the person's direction of movement from the change pattern of the sensed pressure distribution.
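The heel-to-toe progression described above can be detected by tracking where the pressure peak sits in successive frames. The sketch below assumes a 1-D heel-to-toe pressure profile per frame; the layout and labels are illustrative, not from the patent:

```python
def walking_direction(pressure_frames):
    """Classify the direction of movement from successive sole-pressure
    profiles. Each frame is a 1-D list ordered heel (index 0) to toe;
    if the pressure peak migrates toward the toe, the person is
    walking forward.
    """
    peaks = [max(range(len(frame)), key=lambda i: frame[i])
             for frame in pressure_frames]
    drift = peaks[-1] - peaks[0]
    if drift > 0:
        return "forward"
    if drift < 0:
        return "backward"
    return "standing"
```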

As shown in FIG. 2(b), the data analysis module 22 can also estimate the center of gravity. For example, when stepping on the left foot the center of gravity moves to the left, and when stepping on the right foot it moves to the right; while a person walks, the center of gravity therefore shifts repeatedly from side to side. The data analysis module 22 can calculate the distance the person has moved from the change in the center of gravity and the cumulative movement.
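The distance calculation from center-of-gravity changes can be sketched as step counting: each left/right sign change of the lateral offset is treated as one step and multiplied by an assumed step length. Names and the calibration value are illustrative:

```python
def distance_walked(lateral_cog_samples, step_length=0.7):
    """Estimate distance walked by counting the left/right oscillations
    of the lateral centre-of-gravity offset: each sign change counts
    as one step. step_length is an assumed per-step displacement;
    a real system would calibrate it per user.
    """
    steps = 0
    previous_side = 0
    for offset in lateral_cog_samples:
        side = 1 if offset > 0 else -1 if offset < 0 else 0
        if side != 0 and previous_side != 0 and side != previous_side:
            steps += 1  # the weight crossed from one foot to the other
        if side != 0:
            previous_side = side
    return steps * step_length
```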

In addition, by accumulating and analyzing the sensed data, the data analysis module 22 can correct errors caused by the pressure changes that occur when the foot strikes the ground.

FIG. 3 is a diagram illustrating an analysis method according to an embodiment of the present invention.

FIG. 3(a) shows the analysis method when the person is walking forward. The center of gravity moves to the right when the person steps on the right foot and to the left when stepping on the left foot, and while walking forward the foot moves forward and rotates. Accordingly, the data analysis module 22 analyzes the person's center of gravity, direction and distance of movement, and foot position and direction from the data sensed by the pressure sensor 11 and the IMU sensor 12, and can confirm that the person is walking forward. The avatar control module 23 can then generate and transmit a control signal that controls the lower body motion of the avatar in real time according to the analyzed motion.

FIG. 3(b) shows the analysis method when the center of gravity shifts while both feet remain planted. Since both feet are on the ground, pressure is sensed on both, and the shift of the center of gravity to the left or right of the body is detected; since the feet are fixed, no foot movement or rotation is detected. Accordingly, the data analysis module 22 can confirm from the sensed data that the current lower body motion is a posture in which only the center of gravity moves while both feet are stationary, and the avatar control module 23 can transmit a control signal that makes the avatar adopt a similar posture.

As described above, the data analysis module 22 can estimate the person's current lower body motion from the data sensed by the pressure sensor 11 and the IMU sensor 12, and the avatar control module 23 can generate a control signal that makes the lower body motion of the avatar follow the analyzed posture or motion in real time.

FIG. 4 is a view illustrating a method of controlling the lower body motion of the avatar according to the present invention.

As shown in FIG. 4, the pressure sensor 11 and the IMU sensor 12 are first attached or worn, and the person's lower body motion is sensed by the pressure sensor 11 and the IMU sensor 12 (S40).

Next, the data acquisition module 21 of the control device 20 collects the data sensed by the pressure sensor 11 and the IMU sensor 12 (S41). The data acquisition module 21 can collect the data via wired or wireless communication.

Next, the data analysis module 22 analyzes the collected data to determine the lower body motion sensed by the pressure sensor 11 and the IMU sensor 12 (S42). The data analysis module 22 analyzes the person's center of gravity, distance and direction of movement, and foot position and direction to confirm the lower body motion. For example, if the center of gravity oscillates, pressure is sensed on only one foot at a time, and the pressure distribution shifts from the back of the foot to the front, the module can confirm that the person is currently walking forward.

Next, a control signal for controlling the avatar 30 is generated according to the data analyzed by the data analysis module 22 and transmitted to the avatar (S43). The avatar receiving the control signal performs a posture or motion similar to the person's current lower body motion. Here, the avatar may be a virtual image or a physically implemented robot.

As described above, the present invention detects the lower body motion of a person using a pressure sensor that measures the pressure of the sole and an IMU sensor that detects the movement and rotation of the foot, analyzes the sensed data, and generates a control signal capable of controlling the avatar accordingly. Because the pressure sensor and the IMU sensor are used together, the relative position of the person is easy to determine; when they are implemented as a wearable device, the inconvenience of attaching the sensors is minimized; and since the data is collected over wired or wireless communication, the motion of the person wearing the sensors is not restricted.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the inventive concept as defined by the appended claims and their equivalents.

10: Sensing device
20: Control device
30: Avatar

Claims (9)

A pressure sensor for sensing the pressure of the sole;
An IMU sensor for sensing the position and rotation of the foot; And
A control device,
The control device includes:
A data collection module for collecting data sensed by the pressure sensor and the IMU sensor;
A data analysis module for analyzing data collected by the data collection module and confirming a lower body operation of a person; And
And an avatar control module for generating a control signal for controlling the lower body operation of the avatar according to a lower body operation of the identified person and transmitting the control signal to the avatar,
Wherein the data analysis module corrects an error by cumulatively analyzing data sensed by the pressure sensor and the IMU sensor.
The method according to claim 1,
Wherein the data analysis module analyzes a center of gravity, a moving distance, a moving direction, and a foot position and a direction of a person based on the data collected by the data collection module.
The method according to claim 1,
Wherein the data collection module receives data from the pressure sensor and the IMU sensor in a wired or wireless manner.
delete
The method according to claim 1,
Wherein the avatar control module accumulates information analyzed by the data analysis module and learns the accumulated information by using a probability-based model to generate the control signal.
Sensing pressure of the sole using a pressure sensor;
Sensing the position and rotation of the foot using an IMU sensor;
Collecting data sensed by the pressure sensor and the IMU sensor;
Analyzing the collected data and confirming a lower body motion of a person; And
And generating a control signal for controlling the lower body operation of the avatar according to the lower body operation of the identified person and transmitting the control signal to the avatar,
Wherein the step of verifying the lower body operation of the person corrects the error by cumulatively analyzing the data sensed by the pressure sensor and the IMU sensor.
The method according to claim 6,
Wherein the step of verifying the lower body operation of the person analyzes the center of gravity of the person, the movement distance, the movement direction, and the position and direction of the foot based on the collected data.
delete
The method according to claim 6,
Wherein the control signal is generated by a probability-based model based on information obtained by analyzing collected data.
KR1020150063406A 2015-05-06 2015-05-06 System for controlling lower body motion of avatar and method thereof KR101667607B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150063406A KR101667607B1 (en) 2015-05-06 2015-05-06 System for controlling lower body motion of avatar and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150063406A KR101667607B1 (en) 2015-05-06 2015-05-06 System for controlling lower body motion of avatar and method thereof

Publications (1)

Publication Number Publication Date
KR101667607B1 (en) 2016-10-19

Family

ID=57250465

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150063406A KR101667607B1 (en) 2015-05-06 2015-05-06 System for controlling lower body motion of avatar and method thereof

Country Status (1)

Country Link
KR (1) KR101667607B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102075389B1 (en) * 2018-09-13 2020-02-10 인천대학교 산학협력단 Electronic device for painting characters in animation and operating method thereof
US11694380B2 (en) 2020-11-13 2023-07-04 Zoltan GELENCSER System and method for immersive telecommunications

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070100592A (en) * 2006-04-07 2007-10-11 삼성전자주식회사 System for analyzing walking motion
KR20140011688A (en) * 2012-07-18 2014-01-29 주식회사 도담시스템스 Virtual reality simulation apparatus and method using motion capture technology and



Legal Events

Date Code Title Description
GRNT Written decision to grant