WO2020213786A1 - Virtual interactive content execution system using body movement recognition - Google Patents

Virtual interactive content execution system using body movement recognition

Info

Publication number
WO2020213786A1
WO2020213786A1 (PCT/KR2019/007315)
Authority
WO
WIPO (PCT)
Prior art keywords
movement
player
interactive content
event
digital camera
Prior art date
Application number
PCT/KR2019/007315
Other languages
English (en)
Korean (ko)
Inventor
고종필
Original Assignee
주식회사 지티온
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020190071560A external-priority patent/KR102275702B1/ko
Application filed by 주식회사 지티온
Publication of WO2020213786A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens

Definitions

  • the present invention relates to a system for executing virtual interactive content using body movement recognition, and more particularly to a system that executes virtual interactive content by recognizing a specific motion of a player's upper and/or lower body with a digital camera and generating a predetermined event.
  • a virtual interactive content execution technology that projects content such as a game onto a large screen such as a wall, recognizes and tracks a player's motion or a thrown object such as a ball thrown by the player, and links the result to the execution of the content has recently been in the spotlight.
  • virtual interactive content that can be enjoyed indoors regardless of environmental conditions such as outdoor temperature, fine dust concentration, rainfall, and snowfall is gradually being introduced.
  • One of the conventional virtual interactive content execution systems uses an infrared (IR) camera to track the movement of a player or a thrown object.
  • the IR camera module of such a system includes at least one infrared light irradiation module and at least one light sensor module, and measures distance for every pixel of the captured image from the lag or phase shift of the modulated optical signal, i.e., the Time-of-Flight (ToF) method.
  • Patent Document 0001 relates to an object-throwing game display system comprising an IR camera that recognizes infrared light reflected from an object thrown at the front of the display, and a computer that receives the infrared information recognized by the IR camera and obtains the object's location.
  • Since the technology of Patent Document 0001 identifies the position of the thrown object using infrared rays, the game space must not be exposed to daylight and must keep illumination below a predetermined level to obtain a recognition rate sufficient for normal play. The game is therefore limited to a closed room under low-illumination lighting, or to a room whose windows are covered with blackout curtains so as not to be exposed to daylight.
  • also, due to the nature of infrared rays, it is difficult to play a game smoothly above a predetermined temperature or humidity. For example, in a hot indoor game hall on a summer day, on an outdoor court in broad daylight, or on an indoor or outdoor court in fog or rain, the recognition rate of a thrown object drops significantly.
  • Patent Document 0001 concerns tracking an object thrown by the player, but the same problem occurs when tracking the player's motion with an infrared camera.
  • another conventional interactive content execution system installs a touch plate using piezoelectric sensors on the floor; when the player performs various actions on the touch plate while watching the interactive content projected on the wall screen, the piezoelectric sensors detect the player's foot movements and reflect them in the execution of the content.
  • the present invention has been proposed to solve the above-mentioned problems, and an object of the present invention is to provide a virtual interactive content execution system that is not affected by environmental factors of a play place such as illumination, temperature, and humidity.
  • another object of the present invention is to provide a virtual interactive content execution system that can dramatically improve the recognition rate by learning various features of a person in advance through repeated pre-analysis, in order to identify the player quickly and accurately in the play video.
  • an embodiment of the present invention for achieving the above object comprises: a digital camera for photographing the movement of the player; and an application driving device that executes a conversion engine including a recognition module, which identifies a movement pattern of a part of the player's body in the image captured by the digital camera, and an event module, which, when the movement pattern of the body part matches a preset pattern, transmits an event including the identifier of the pattern to the interactive content application. It relates to an interactive content execution system using body movement recognition.
  • the recognition module tracks the movement of a part of the body based on the distance to that body part, and when the movement distance and direction of the body part match a preset pattern, the event module generates an event including the identifier of the corresponding pattern and delivers it to the interactive content application.
  • the digital camera has at least two image sensors, and the recognition module estimates the distance between the digital camera and a body part of the player using the angular difference between the viewpoints of the image sensors.
  • the recognition module identifies movement of at least one of the player's left arm, right arm, left foot, and right foot.
  • the event module generates an event for at least one of a movement of the body part, a walk, a jump, and a movement in any one of a plurality of preset directions.
  • the digital camera includes at least one image sensor, and in this case the recognition module estimates the distance between the digital camera and a body part of the player based on the size of that body part in the image captured by the digital camera.
  • a stage installed on the floor may be further included in order to provide a visual guide to the player about the preset movement direction and movement range.
  • it may further include a machine learning server that analyzes a plurality of image data including a person and learns pattern information for identifying a person from a background in the image in advance.
  • virtual interactive content can be enjoyed without being affected by environmental factors such as illumination, temperature, and humidity.
  • content can be enjoyed comfortably in an indoor space with sufficiently bright lighting even on hot or cold days or days with a high concentration of fine dust, and can be enjoyed on an outdoor court in areas where a temperature and weather suitable for exercise are maintained.
  • since a touch sensor such as a piezoelectric element is not required to recognize the player's body motion, inconvenience in using the system due to sensor failure can be prevented in advance.
  • since the conversion engine generating the event and the virtual interactive content receiving the event are executed independently, there is no need to modify the virtual interactive content to maintain compatibility between the two programs. Therefore, the productivity of interactive content development increases while the universality of the conversion engine is guaranteed.
  • FIG. 1 is a conceptual diagram schematically showing the configuration of a virtual interactive content execution system according to a first embodiment.
  • FIG. 2 is a block diagram showing a detailed configuration of a system for executing virtual interactive content according to the first embodiment.
  • FIGS. 3 and 4 are block diagrams showing system configurations of modified versions of the first embodiment.
  • FIGS. 5A through 5C illustrate various embodiments of a stage.
  • "module" as used herein refers to a unit that processes a specific function or operation, and may mean hardware, software, or a combination of hardware and software.
  • interactive content refers to content that outputs or executes various results in response to a user's real-time actions, not content that is unilaterally played or executed according to a predetermined plot.
  • content is not executed using conventional input means such as a mouse or touch pad (hereinafter 'mouse, etc.'); instead, the actual content runs on a separate computer device while its execution image is projected directly onto a wall, floor, or ceiling (hereinafter 'wall surface') through a beam projector, projected onto a screen installed on the wall, or output through a display device (for example, a digital TV or digital monitor) installed on the wall. The player replaces the mouse through various movements, such as jumping, walking, moving the right or left arm, or moving the right or left leg on the directional bearing plate placed on the floor, while looking at the wall on which the content image is displayed.
  • virtual interactive content refers to interactive content that induces dynamic movement or movement of a player.
  • virtual interactive content can be understood as a concept including all kinds of content that can induce a player's kinetic action. Therefore, it is obvious to those skilled in the art that it may be implemented as media content such as a tap dance game using floor touches in nine directions, or a game that lets the player experience virtual historical relics using walking and arm movements.
  • Embodiment 1 relates to a virtual interactive content execution system that recognizes a player's body movement using a stereo camera.
  • FIG. 1 is a conceptual diagram schematically showing the configuration of a virtual interactive content execution system according to a first embodiment.
  • a digital camera 10 for photographing the user's actions is disposed on the wall opposite the wall on which the interactive content is projected, or on the ceiling or either side wall, and the interactive content is executed on a separate application driving device 20.
  • An image output device 30 that receives an image of interactive content from the application driving device 20 and outputs it to the wall surface is disposed on the wall or ceiling opposite the wall surface on which the content is projected.
  • a stage 50 is disposed on the floor to provide a visual guide to the player regarding a predetermined orientation and reach distance.
  • FIG. 2 is a block diagram showing a detailed configuration of a system for executing virtual interactive content according to the first embodiment.
  • the system of Embodiment 1 includes a digital camera 10, an application driving device 20, and an image output device 30, and may further include at least one of a machine learning server 40 and a stage 50.
  • the digital camera 10 photographs a motion scene of the player and transmits the photographed image data to the application driving device 20.
  • the digital camera 10 can be connected to the application driving device 20 through a wired communication interface such as USB or RJ-45, or through a short-range or broadband wireless communication interface or protocol such as Bluetooth, IEEE 802.11, or LTE.
  • the communication interface or communication protocol mentioned here is only an example, and any communication interface and protocol for smoothly transmitting image data can be used.
  • to estimate the distance, a stereo-type measurement algorithm can be used.
  • the same object is photographed by two camera modules (image sensors) separated from each other, and the distance to the object is estimated from the angular difference caused by the discrepancy between the viewpoints of the two camera modules.
  • the digital camera 10 of Example 1 includes at least two 2D image sensor modules (not shown).
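The stereo estimation described above can be sketched with the standard pinhole relation depth = focal length × baseline / disparity. The focal length, baseline, and pixel values below are illustrative assumptions, not parameters of the actual camera 10:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to a point from the pixel disparity between two
    horizontally separated image sensors (pinhole stereo model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A foot imaged at x=640 px by the left sensor and x=600 px by the
# right sensor (disparity 40 px), with an assumed 700 px focal
# length and a 12 cm baseline between the sensors:
distance_m = stereo_depth(focal_px=700.0, baseline_m=0.12, disparity_px=40.0)
# distance_m == 2.1 (metres)
```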
  • the application driving device 20 executes the conversion engine 21 and the interactive content application 22.
  • the application driving device 20 may install and execute the conversion engine 21 and the interactive content application 22 together on a single device such as a desktop PC, notebook computer, mobile tablet, smartphone, or server.
  • the application driving device 20 may install and execute the conversion engine 21 on a single device such as a desktop PC illustrated above, and install and execute the interactive content application 22 on a separate server 20-1.
  • FIG. 3 is a block diagram showing the system configuration of such a modified embodiment.
  • in another modified embodiment, the conversion engine 21 is installed and executed on the digital camera 10, only the interactive content application is executed on the application driving device 20, and the digital camera 10 and the application driving device 20 can be connected through a local area network or an LTE or 5G broadband network.
  • FIG. 4 is a block diagram showing the system configuration of this modified embodiment.
  • when it is detected that the player's arm or foot moves in a predetermined pattern, the conversion engine 21 generates an event corresponding to the pattern and transmits it to the interactive content application 22. To this end, the conversion engine 21 may include a recognition module 21-1 and an event module 21-2.
  • the recognition module 21-1 identifies the player by processing the image data sent from the camera 10, and estimates the distance between the camera 10 and the player's moving body part (for example, a moving right foot) using the stereo-type technique.
  • hereinafter, identifying the player and estimating the distance to the moving body part are collectively defined as tracking. Tracking may be performed on every frame of the image data sent from the camera 10, or intermittently on frames at preset intervals in consideration of the load that frequent tracking places on the conversion engine 21.
  • the recognition module 21-1 may be included in the conversion engine 21 or may be installed in the digital camera 10 as firmware.
  • in this case, the digital camera 10 provides tracking information, including the distance to the object and the object's coordinates, to the event module 21-2 of the conversion engine 21 instead of image data.
  • the event module 21-2 determines whether the player's body movement matches a predetermined pattern, generates an event including an identification flag of the movement pattern, and transmits the generated event to the interactive content application.
  • the principle by which the event module 21-2 determines whether the player's body movement matches a predetermined pattern may be implemented with various algorithms. For better understanding, an example of executing interactive content using the movement of the player's legs will be described.
  • an algorithm for recognizing a pattern when a player moves his or her feet in a specific direction may be implemented as follows.
  • the recognition module 21-1 first estimates the distances between the camera 10 and the right foot and between the camera 10 and the left foot from a captured image of the player standing and waiting at the center of the stage 50, and sets these as reference values for pattern recognition.
  • alternatively, the recognition module 21-1 first analyzes only the image of the stage 50: it estimates the distance to the center point using the division lines displayed on the upper surface of the stage 50, or separates and recognizes the stage object in the image to estimate the distance to the center point, which may then be regarded as the player's initial position and set as the reference value for pattern recognition.
  • the recognition module 21-1 continuously tracks the movements of the player's left and right feet and transmits them to the event module 21-2; when the movement distance and direction of the left foot match those of a preset pattern, the event module 21-2 determines that a movement of that pattern has occurred and generates the pattern's event.
  • for example, when the player's left foot moves from the reference point in the 10 o'clock direction by more than a predetermined distance (the distance to the upper-left area of the stage), the event module 21-2 recognizes this as a "left foot 10 o'clock pattern" and generates a "left foot 10 o'clock event" carrying a flag that points to the upper-left area of the stage.
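One way to implement this direction-plus-distance check is to map the foot's displacement from the reference point onto a clock-face sector and compare its magnitude against a threshold. The coordinate convention, threshold, and event-identifier format below are illustrative assumptions:

```python
import math

def classify_foot_move(ref, pos, min_dist):
    """Map a foot displacement from the stage reference point to a
    clock-face direction (12 = straight ahead), or return None if the
    move is shorter than the preset distance."""
    dx, dy = pos[0] - ref[0], pos[1] - ref[1]
    if math.hypot(dx, dy) < min_dist:
        return None                              # not a deliberate step
    theta = math.degrees(math.atan2(dy, dx))     # 0 deg = +x axis, CCW
    return round((90 - theta) / 30) % 12 or 12   # hour hand sector

# A left foot stepping about 0.34 m toward the upper-left of the stage:
hour = classify_foot_move((0.0, 0.0), (-0.3, 0.17), min_dist=0.2)
event_id = f"left_foot_{hour}_oclock"            # "left_foot_10_oclock"
```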
  • an algorithm for recognizing a pattern in which the player jumps in place may be implemented as follows.
  • the recognition module 21-1 continuously tracks the movements of the player's left and right feet and transmits them to the event module 21-2.
  • the event module 21-2 recognizes a "jumping pattern" and generates a "jumping event" when the movement direction of the left and right feet is vertical and the movement distance is greater than a preset height.
  • an algorithm for recognizing a pattern of a player walking in place may be implemented as follows.
  • the recognition module 21-1 continuously tracks the movements of the player's left and right feet and transmits them to the event module 21-2.
  • the event module 21-2 recognizes a "walking pattern" and generates a "walking event" when it determines that the movement direction of the left and right feet is vertical and the movement distance exceeds the preset height, with the left and right feet alternately moving up and down.
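The jump and walk checks above can be sketched as follows. The height threshold and the per-window "which foot is lifted" labels are simplifying assumptions; a real implementation would derive them from the tracked foot coordinates:

```python
def is_jump(left_rise_m: float, right_rise_m: float, min_height_m: float) -> bool:
    """Jump: both feet move vertically upward beyond the preset
    height in the same observation window."""
    return left_rise_m >= min_height_m and right_rise_m >= min_height_m

def is_walk_in_place(lift_sequence: str) -> bool:
    """Walk-in-place: per-window labels of the lifted foot ('L'/'R')
    alternate instead of both feet rising together."""
    return (len(lift_sequence) >= 2 and
            all(a != b for a, b in zip(lift_sequence, lift_sequence[1:])))

assert is_jump(0.25, 0.28, min_height_m=0.2)   # both feet up -> jump
assert is_walk_in_place("LRLR")                # alternating feet -> walk
assert not is_walk_in_place("LLRR")            # same foot twice -> not a walk
```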
  • the event module 21-2 generates an event including an identifier of the determined pattern and transmits it to the interactive content application 22.
  • for example, suppose interactive dancing content 22 is executed whose score is counted through a graphical user interface (GUI). The content 22 can be run with a plot in which, when a "left foot movement event" is received from the event module 21-2 while an inactive footprint image at the 10 o'clock position is displayed on the screen, the inactive footprint is converted into an active footprint image and a predetermined score is counted.
  • as another example, suppose educational interactive content 22 is executed that lets the player listen to commentary on a specific relic while visiting ancient relics one by one on the wall screen. While a "walking event" is received from the event module 21-2, the content 22 continuously outputs scenery advancing with the forward walking; when a "walking stop event" is received, the stationary scenery at that point is output. When a "right arm 10 o'clock event" is received, the commentary on the relic may be narrated.
  • if a virtual reality (VR) headset is used as the image output device 30, the enjoyment of the content may be doubled.
  • meanwhile, the term "event" may be understood as including any event for inputting a user's instruction to the interactive content application 22. Therefore, the event transmitted from the conversion engine 21 to the interactive content application 22 may be an event related to the aforementioned arm/leg movements, but may also be defined variously as a mouse left-click event, mouse right-click event, mouse movement event, mouse double-click event, or mouse wheel-click event.
  • the event generated by the conversion engine 21 is compatible with the operating system on which the interactive content application 22 runs. Alice, the developer of the interactive content application 22, does not need to discuss compatibility with Bob, the developer of the conversion engine 21, in advance; therefore the conversion engine 21 of the present invention has the advantage that it can be applied to any interactive content sold on the market without separate modification for interfacing.
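The decoupling between the conversion engine and the content application can be pictured as a minimal event-dispatch layer: the engine posts OS-style events by name, the application subscribes to the names it cares about, and neither program links against the other. The class and event names here are illustrative, not part of the patented system:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass(frozen=True)
class Event:
    kind: str          # e.g. "jump", "left_foot_10_oclock", "mouse_left_click"
    x: float = 0.0
    y: float = 0.0

class EventBus:
    """Minimal decoupling layer: the engine posts, the app subscribes."""
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[Event], None]]] = {}

    def subscribe(self, kind: str, handler: Callable[[Event], None]) -> None:
        self._handlers.setdefault(kind, []).append(handler)

    def post(self, event: Event) -> None:
        for handler in self._handlers.get(event.kind, []):
            handler(event)

# The content application scores a jump without knowing the engine:
bus = EventBus()
scores: List[int] = []
bus.subscribe("jump", lambda e: scores.append(10))
bus.post(Event("jump"))   # posted by the conversion engine
# scores == [10]
```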
  • the image output device 30 may be any type of device as long as it has a function of outputting a content image on a wall or the like.
  • for example, a beam projector, a display device such as a large TV or monitor mounted on the wall, or an augmented reality headset may be used as the image output device 30.
  • the image output device 30 is connected to the application driving device 20 through a cable or wireless communication.
  • the machine learning server 40 includes a machine learning engine (not shown) that learns various characteristics for identifying an object based on image data sent from the camera 10.
  • for example, the machine learning server 40 can find certain patterns for identifying the object based on at least one of a typical human shape, arm position, leg position, and the body silhouettes that distinguish men and women.
  • the machine learning server 40 may receive image data through an application driving device 20 connected to the digital camera 10 or may be directly connected to the digital camera 10 to receive image data.
  • the machine learning server 40 finds specific patterns for identifying a person more clearly by repeatedly analyzing dozens to hundreds of different images captured of a person (or of men and women).
  • the recognition module 21-1 of the conversion engine 21 can easily identify the player from the image data using the identification pattern information obtained in advance by the machine learning server 40.
  • the machine learning server 40 may learn only one object for one content, but if the content requires control with a plurality of objects, it may pre-learn to identify a plurality of different objects for one content.
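A deliberately simplified stand-in for this pre-learning step: average feature vectors extracted from many person images into one identification template, then accept a new observation if it lies within a distance threshold. The feature values and threshold are toy assumptions; the actual server would run a full machine learning engine:

```python
from math import dist  # Euclidean distance, Python 3.8+

def learn_person_template(samples):
    """Average many per-image feature vectors (e.g. limb positions,
    silhouette measurements) into a single identification template."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def matches_person(features, template, max_dist):
    """Accept an observation as a person if it lies near the template."""
    return dist(features, template) <= max_dist

# Toy two-feature samples (entirely illustrative numbers):
template = learn_person_template([[1.0, 2.0], [3.0, 4.0]])   # [2.0, 3.0]
assert matches_person([2.1, 3.1], template, max_dist=0.5)
```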
  • the stage 50 provides a reference point for the player's initial position, and provides a visual direction guide so that the player can easily point a specific direction using the lower body's feet or the upper body's arms.
  • FIGS. 5A to 5C illustrate various embodiments of the stage 50.
  • the stage 50 includes an external layout OL having a predetermined width, and includes a partition line IL for dividing and distinguishing an inner area of the external layout OL.
  • the player stands on the stage 50 and moves the left foot and/or the right foot of the lower body with reference to the division line IL to perform an action corresponding to a desired event action.
  • in the embodiment of FIG. 5A, the stage 50 is divided into nine sections by the division lines IL, and through the nine divisions the player can clearly perform front, rear, left, right, center, and diagonal motions, as well as mixtures thereof.
  • the embodiment of FIG. 5B is an example of displaying the external layout OL and the division line IL using the laser light source L.
  • the external layout (OL) and division lines (IL) displayed on the floor can guide the player's lower-body movements, while the external layout (OL) and division lines (IL) displayed on the wall can guide the player's upper-body motions.
  • FIG. 5C is an example of displaying the external layout OL and the partition line IL in a predetermined space using the hologram device H.
  • the hologram produced by the hologram device H is preferably projected into a space that the photographing device 10 can capture, and the player can take an action corresponding to the desired event by moving the left arm and/or right arm of the upper body with reference to the division lines IL on one side of the hologram.
  • the embodiments of FIGS. 5A to 5C may be used separately or in combination. That is, it is possible to selectively use the embodiment of FIG. 5A or 5B to guide lower-body motion, and to apply the embodiment of FIG. 5B to the wall surface or selectively use the embodiment of FIG. 5C to guide upper-body motion.
  • the paint of the partition line or the material of the stage does not need to be limited to any specific ones.
  • the partition line may be displayed using a general colored paint such as paint or ink, and the stage may be implemented as a mat made of artificial fiber.
  • the stage 50 need not be an essential component. That is, in another embodiment, no separate physical stage 50 is disposed on the floor. Instead, when the content is executed and the user stands at a location in the camera 10's shooting target area without moving for a preset time, the application driving device 20 recognizes that location as the player's initial position. Then, when the player moves an upper-body arm or lower-body foot in one of the nine directions based on experience and sense, the application driving device 20 tracks the movement of the arm and/or foot and generates an event corresponding to each movement.
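The "stand still for a preset time" rule for registering the initial position without a physical stage can be sketched as a sliding-window stillness check over the tracked positions. The window length and tolerance below are assumed values:

```python
def find_initial_position(track, window, eps):
    """Return the first tracked position around which the player stays
    within `eps` for `window` consecutive samples (the preset time),
    or None if the player never stands still long enough."""
    for i in range(len(track) - window + 1):
        x0, y0 = track[i]
        if all(abs(x - x0) <= eps and abs(y - y0) <= eps
               for x, y in track[i:i + window]):
            return (x0, y0)
    return None

# Player wanders, then stands still for four samples:
track = [(0.0, 0.0), (0.5, 0.1), (1.0, 0.2),
         (1.0, 0.2), (1.0, 0.21), (1.01, 0.2)]
origin = find_initial_position(track, window=4, eps=0.05)
# origin == (1.0, 0.2)
```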
  • the entire or partial functions of the virtual interactive content execution system described above may be provided as a program of instructions tangibly implemented on a computer-readable recording medium.
  • the computer-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the computer-readable recording medium may be specially designed and constructed for the present invention, or may be known and usable to those skilled in computer software.
  • examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, flash memory, and USB memory.
  • the computer-readable recording medium may also be a transmission medium, such as an optical or metal wire or a waveguide, including a carrier wave that transmits a signal specifying program instructions or data structures.
  • Examples of program instructions include high-level language codes that can be executed by a computer using an interpreter or the like, in addition to machine language codes such as those produced by a compiler.
  • the hardware device may be configured to operate as one or more software modules to perform the operation of the present invention and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a virtual interactive content execution system using body movement recognition and, in particular, to a system for executing virtual interactive content by recognizing a particular movement of the upper and/or lower body of a player by means of a digital camera and generating a predetermined event. The invention makes it possible to accurately recognize a player's movement without a physical touch sensor, thereby reducing the cost of establishing the system and eliminating the inconveniences that can be caused by frequent sensor failure.
PCT/KR2019/007315 2019-04-17 2019-06-18 System for executing virtual interactive content using body movement recognition WO2020213786A1

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20190045098 2019-04-17
KR10-2019-0045098 2019-04-17
KR1020190071560A KR102275702B1 2019-04-17 2019-06-17 System for executing virtual interactive content using body movement recognition
KR10-2019-0071560 2019-06-17

Publications (1)

Publication Number Publication Date
WO2020213786A1 true WO2020213786A1 (fr) 2020-10-22

Family

ID=72837405

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/007315 WO2020213786A1 System for executing virtual interactive content using body movement recognition

Country Status (1)

Country Link
WO (1) WO2020213786A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130071059A * 2011-12-20 2013-06-28 엘지전자 주식회사 Mobile terminal and control method thereof
KR20140046197A * 2012-10-10 2014-04-18 주식회사 씨씨 Motion recognition apparatus and method, and computer-readable recording medium storing a program
KR20150035854A * 2015-02-17 2015-04-07 주식회사 홍인터내셔날 Dart game apparatus capable of authentication using a throw line in remote multi-mode
US20180293442A1 * 2017-04-06 2018-10-11 Ants Technology (Hk) Limited Apparatus, methods and computer products for video analytics
KR101963682B1 * 2018-09-10 2019-03-29 주식회사 큐랩 Body measurement data management system based on augmented-reality sports content

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112379781A * 2020-12-10 2021-02-19 深圳华芯信息技术股份有限公司 Human-computer interaction method, system, and terminal based on foot information recognition
CN112379781B * 2020-12-10 2023-02-28 深圳华芯信息技术股份有限公司 Human-computer interaction method, system, and terminal based on foot information recognition

Similar Documents

Publication Publication Date Title
US9015638B2 (en) Binding users to a gesture based system and providing feedback to the users
CA2757057C Managing virtual ports
CN102257456B Correcting angle error in a tracking system
CN102414641B Changing the view perspective within a display environment
US20180150686A1 Systems and methods for operating a virtual reality environment using colored marker lights attached to game objects
US20060192852A1 System, method, software arrangement and computer-accessible medium for providing audio and/or visual information
CN102449641A Color calibration for object tracking
WO2017057806A1 Tennis match and practice simulation system using a screen
CN105073210A Extraction of user body angle, curvature, and average extremity position using depth images
WO2016208930A1 Automatic aiming system and method for mobile games
CN101919241A Dual-mode projection apparatus and method for locating a light spot on a projected image
WO2017105120A1 Baseball training apparatus, sensing device and sensing method used therein, and ball pitching control method
CN110559632A Intelligent skiing fitness simulator and control method thereof
CN107408003A Information processing device, information processing method, and program
KR102275702B1 System for executing virtual interactive content using body movement recognition
WO2020213786A1 System for executing virtual interactive content using body movement recognition
EP3621299A1 Information processing device, information processing method, and program
JP7315489B2 Peripheral device tracking system and method
WO2016204335A1 Device for providing an augmented virtual exercise space using an immersive interactive-content-based exercise system, and method therefor
KR101692267B1 Virtual reality content system enabling interaction between an HMD user and multiple ordinary people, and control method therefor
US20190151751A1 Multi-dimensional movement recording and analysis method for movement entrainment education and gaming
WO2011115364A2 Image data processing apparatus for tracking the position of a light source
WO2016072738A1 Apparatus for controlling lighting devices
JP2011092657A Game system operated using a plurality of light sources
Otsu et al. Enhanced concert experience using multimodal feedback from live performers

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19925006

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19925006

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 25/04/2022)
