WO2020213784A1 - Sports interactive content execution system for inducing exercise - Google Patents
- Publication number
- WO2020213784A1 (PCT/KR2019/006029)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- interactive content
- player
- sports
- image
- throwing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/363—Image reproducers using image projection screens
Definitions
- The present invention relates to a sports interactive content execution system for inducing exercise. When sports-related interactive content is projected onto a large surface such as a wall and at least one player throws an object such as a ball toward the content image, the touch point is recognized and the content image changes in response, so that the player obtains an exercise effect while interacting with the content.
- For such a system, a function that recognizes a thrown object such as a ball, which plays the role of a virtual mouse, tracks its movement, and finds its coordinates at the moment it touches the wall is essential.
- One of the conventional interactive content execution systems uses an infrared (IR) camera to recognize movement and touch coordinates of a thrown object.
- The IR camera module of this system includes at least one infrared light irradiation module and at least one light sensor module, and measures, for every pixel of the captured image, the lag or phase shift of a modulated optical signal, known as the Time-of-Flight (ToF) technique.
- Patent Document 0001 relates to an object-throwing game display system that includes an IR camera that recognizes infrared light reflected from an object thrown at the front of the display, and a computer that obtains position information by receiving the infrared light information recognized by the IR camera.
- Since the technology of Patent Document 0001 identifies the position of the thrown object using infrared rays, the game space must not be exposed to daylight and must maintain illumination below a predetermined level in order to obtain a recognition rate sufficient for normal play. Play is therefore limited to a closed room under low-illumination lighting, or to a room whose windows are covered with a blackout curtain so as not to be exposed to daylight.
- Another conventional interactive content execution system recognizes touch coordinates of a thrown object using a touch display wall.
- In a touch display wall, a display device having a large screen is mounted on the wall, and light emitting units such as infrared or laser emitters and corresponding light receiving units are densely disposed at the edge of the display device. When the object thrown by the user touches the display screen, the infrared or laser light at the touch point is blocked, and the touch point is recognized as coordinates.
- The above-described IR camera method and touch display wall method simply identify the position or touched coordinates of the thrown object (hereinafter, a ball is assumed as the thrown object), so they cannot tell whether the user threw the ball from behind a predetermined reference line or threw it after approaching the wall surface.
- Moreover, the IR camera method identifies the ball using only simple information such as its shape or size, so if a 2D circular object is displayed in the content image projected on the wall, it may be confused with the player's actual ball.
- the present invention has been proposed to solve the above-mentioned problems, and an object of the present invention is to provide a sports interactive content execution system that is not affected by environmental factors of a play place such as illumination, temperature, and humidity.
- Another object of the present invention is to provide a sports interactive content execution system capable of remarkably improving the recognition rate by learning in advance, through repetitive analysis, the various characteristics of the thrown object that plays the role of a mouse controlling the execution of the content.
- Another object of the present invention is to provide a sports interactive content execution system capable of implementing, based on the improved recognition rate, a rule for a baseline to be observed by a player, or a rule preventing a player from throwing another player's thrown object in multiplayer content.
- An embodiment of the present invention for achieving the above objects is a sports interactive content execution system for inducing exercise, comprising: a digital camera for photographing a sports interactive content image displayed on a wall; and an application driving device that executes a conversion engine including an object recognition module that identifies a thrown object in the photographed image of the sports interactive content and determines the distance and coordinates of the thrown object, an event module that delivers an event including the coordinates of the thrown object to the interactive content application when the thrown object hits the wall surface, and a norm module that performs exception processing when the throwing point of the thrown object exceeds a reference line.
- Here, the object recognition module identifies the player in the photographed image of the sports interactive content and provides the distance to the player to the norm module, and the norm module compares the distance to the player with the distance of the baseline to determine whether the throw was made beyond the baseline.
- Alternatively, the norm module compares the distance of the point at which the thrown object starts moving with the distance of the baseline to determine whether the throw was made beyond the baseline.
- The system of the present embodiment further includes a machine learning server that repeatedly analyzes a plurality of image data including the thrown object to learn a pattern related to at least one of a shape, a size, a surface pattern, and a color for identifying the thrown object.
- The digital camera may have at least two image sensors, and the object recognition module calculates the distance between the digital camera and the thrown object by using the angle difference between the viewpoints of the image sensors.
- Alternatively, the digital camera may have at least one image sensor, and the object recognition module calculates the distance between the digital camera and the thrown object based on the size of the thrown object in the image captured by the digital camera.
- Another embodiment of the present invention relates to a sports interactive content execution system for inducing exercise, comprising: a digital camera for capturing an image of first sports interactive content and an image of second sports interactive content displayed on a wall; and an application driving device that executes a conversion engine including an object recognition module that identifies first and second thrown objects in the captured images of the first and second sports interactive content and determines the distances and coordinates of the first and second thrown objects, and an event module that delivers to the interactive content application a first event including coordinates when the first thrown object hits the wall and a second event including coordinates when the second thrown object hits the wall.
- Here, the object recognition module further identifies a first player and a second player in the captured images of the first and second sports interactive content, and a norm module that performs exception processing when the first player throws the second thrown object or the second player throws the first thrown object may be further included.
- A norm module that performs exception processing when the first thrown object crosses into the play area of the second sports interactive content and hits the wall may also be further included.
- The digital camera may have at least two image sensors, and the object recognition module calculates the distance between the digital camera and the first or second thrown object by using the angle difference between the viewpoints of the image sensors.
- Alternatively, the digital camera may have at least one image sensor, and the object recognition module calculates the distance between the digital camera and the first or second thrown object based on the size of the first or second thrown object in the image captured by the digital camera.
- According to the present invention, it is possible to enjoy sports interactive content without being affected by environmental factors of the play place such as illumination, temperature, and humidity.
- Accordingly, content can be enjoyed comfortably in a sufficiently bright indoor space even on days that are hot, cold, or heavy with fine dust, and content can also be enjoyed on an outdoor court in an area where a temperature and weather suitable for exercise are maintained.
- In addition, the recognition rate can be remarkably improved by learning in advance, through repetitive analysis, the various characteristics of the thrown object that plays the role of a mouse controlling the execution of the content.
- Furthermore, a rule for a baseline to be observed by a player, or a rule preventing a player from throwing another player's thrown object in multiplayer content, may be implemented as an algorithm.
- Since the conversion engine generating events and the virtual interactive content receiving those events are executed independently, there is no need to modify the virtual interactive content to maintain compatibility between the programs. Therefore, the productivity of interactive content increases while the universality of the conversion engine is guaranteed.
- FIG. 1 is a conceptual diagram schematically showing the configuration of a sports interactive content execution system according to a first embodiment.
- FIG. 2 is a block diagram showing a detailed configuration of the system for executing sports interactive content according to the first embodiment.
- FIGS. 3 and 4 are block diagrams showing system configurations of modified versions of the first embodiment.
- FIGS. 5A to 5D illustrate examples of photographing a thrown object at various locations in order to learn identification information of the thrown object in advance by machine learning.
- FIG. 6 is a block diagram showing a detailed configuration of a sports interactive content execution system according to a second embodiment.
- FIG. 7 is a flowchart showing a method of executing content according to the third embodiment step by step.
- FIG. 8 is a flowchart illustrating a machine learning process step by step in the method of executing sports interactive content according to the third embodiment.
- FIG. 9 is a conceptual diagram schematically showing the configuration of a sports interactive content execution system for a multiplayer according to the fourth embodiment.
- FIG. 10 is a block diagram showing a detailed configuration of a sports interactive content execution system according to a fourth embodiment.
- The term "module" refers to a unit that processes a specific function or operation, and may mean hardware, software, or a combination of hardware and software.
- The term "thrown object" refers to an object that a player can set in motion using a part of his or her body or using equipment such as a racket or a club, and includes, for example, a basketball, a volleyball, a tennis ball, a shuttlecock, a beanbag (ojami), darts, and the like.
- However, the present invention is not limited thereto, and any object that maintains a certain shape and can be easily moved by a user may correspond to a thrown object.
- Such a thrown object may also be referred to as a "virtual mouse" or a "virtual pointer" in that it serves as an input means (e.g., a mouse or pointer) for executing or controlling the sports interactive content.
- The term "interactive content" refers to content that outputs or executes various results in response to a user's real-time actions, rather than content that is unilaterally played or executed according to a predetermined plot.
- In the present invention, content is not executed using conventional input means such as a mouse or a touch pad (hereinafter, 'mouse, etc.'). Instead, the content itself is executed on a separate computer device, and the execution image of the content is projected directly onto a wall, floor, or ceiling (hereinafter, 'wall surface') through a beam projector, projected onto a screen installed on the wall, or shown through a display device (for example, a digital TV or digital monitor) installed on the wall.
- sports interactive content refers to interactive content that induces dynamic movement or movement of a player.
- Examples include a basketball game in which a basketball hoop moving vertically and horizontally is displayed on the wall screen and the score increases when the player throws a basketball and hits the hoop; a soccer game in which the score increases when the ball strikes a designated area; and an image puzzle game in which, in a video composed of nine puzzle pieces, hitting a specific piece with a ball rotates that piece until the pieces fit together into the complete original image.
- In the present invention, sports content should be understood as a concept including any content that can induce a player's kinetic action. It is therefore obvious to those skilled in the art that it may also be implemented as media content such as an interactive movie, a digital book, or a digital frame.
- Embodiment 1 relates to a sports interactive content execution system that recognizes a throwing object and a baseline using a stereo camera.
- FIG. 1 is a conceptual diagram schematically showing the configuration of a sports interactive content execution system according to a first embodiment.
- In Embodiment 1, the player plays the content by throwing a ball, which corresponds to the virtual mouse, toward a specific point on the wall where the content is displayed.
- A digital camera that photographs the player's actions and the content scene is disposed on the wall opposite the wall on which the content is projected, on the ceiling, or on either side wall, and the interactive content runs on a separate application driving device (not shown in FIG. 1).
- a beam projector that receives an image of interactive content from an application driving device and outputs it to the wall is disposed on a wall or ceiling opposite the wall on which the content is projected.
- a baseline that the player should not cross may be displayed on the floor of the play area.
- One intent of the present invention is to induce the player's movement through the content. If the player crosses the baseline, approaches the wall, and then throws the ball, the exercise effect is halved. Therefore, if the player throws the ball from beyond the baseline, exception processing is performed according to a predetermined rule, such as not recognizing the score or guiding a re-throw.
- The baseline on the floor surface may be displayed in real or virtual form; for example, it may be physically drawn with ink or paint, or displayed as light by the interactive content, among various other ways.
- FIG. 2 is a block diagram showing a detailed configuration of the system for executing sports interactive content according to the first embodiment.
- The system of Embodiment 1 includes a digital camera 10, an application driving device 20, and an image output device 30, and may further include a machine learning server 40.
- the digital camera 10 photographs a content scene including a moving projectile, and transmits the photographed image data to the application driving device 20.
- The digital camera 10 can be connected to the application driving device 20 through a wired communication interface such as USB or RJ-45, or through a short-range or broadband wireless communication interface or protocol such as Bluetooth, IEEE 802.11, or LTE.
- the communication interface or communication protocol mentioned here is only an example, and any communication interface and protocol for smoothly transmitting image data can be used.
- A stereo-type measurement algorithm may be used to identify the thrown object from the image data and estimate the distance between the camera 10 and the thrown object.
- In this technique, the same object is photographed using two camera modules (image sensors) separated from each other, and the distance to the object is estimated by using the angle difference caused by the discrepancy between the viewpoints of the two camera modules.
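The stereo principle above can be sketched as follows. This is a minimal illustration under a pinhole-camera assumption, not the patent's implementation; the focal length, sensor spacing, and pixel positions are hypothetical values.

```python
# Sketch of stereo depth estimation (all values hypothetical).
# The same ball appears at slightly different horizontal pixel positions in
# the two image sensors; this difference (disparity) is inversely
# proportional to distance: Z = f * B / d.

def stereo_distance(x_left, x_right, focal_px, baseline_m):
    """Distance (m) to an object seen at pixel columns x_left / x_right."""
    disparity = x_left - x_right          # pixels; larger when object is closer
    if disparity <= 0:
        raise ValueError("object must be in front of both sensors")
    return focal_px * baseline_m / disparity

# Example: 700 px focal length, 12 cm sensor spacing, 21 px disparity
d = stereo_distance(x_left=640, x_right=619, focal_px=700, baseline_m=0.12)
print(round(d, 2))  # -> 4.0 (metres)
```

Because disparity shrinks as the ball flies away, the same relation lets the tracking loop observe the distance increasing frame by frame until the wall is reached.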
- To this end, the digital camera 10 of Embodiment 1 includes at least two 2D image sensor modules (not shown).
- the application driving device 20 executes the conversion engine 21 and the interactive content application 22.
- the application driving device 20 may install and execute the conversion engine 21 and the interactive content application 22 together in a single device such as a desktop PC, a notebook computer, a mobile tab, a smartphone, and a server.
- the application driving device 20 may install and execute the conversion engine 21 on a single device such as a desktop PC illustrated above, and install and execute the interactive content application 22 on a separate server 20-1.
- FIG. 3 is a block diagram showing the system configuration of such a modified embodiment.
- As another modification, the conversion engine 21 may be installed and executed on the digital camera 10 while only the interactive content application is executed on the application driving device 20, and the digital camera 10 and the application driving device 20 may be connected through a local area network or an LTE or 5G broadband network.
- FIG. 4 is a block diagram showing the system configuration of this modified embodiment.
- The conversion engine 21 generates an event corresponding to a mouse click when the thrown object touches the wall, and transmits the generated event to the interactive content application 22.
- The conversion engine 21 may include an object recognition module 21-1, an event module 21-2, and a norm module 21-3.
- The object recognition module 21-1 processes the image data sent from the camera 10 to identify the thrown object, and estimates the distance between the camera 10 and the thrown object using the stereo-type technique. Object identification and distance estimation are collectively defined herein as tracking. Tracking may be performed on every frame of the image data sent from the camera 10, or performed intermittently on frames at preset intervals in consideration of the load imposed on the conversion engine 21 by frequent tracking.
- the object recognition module 21-1 may be included in the conversion engine 21 or installed in the digital camera 10 as firmware.
- In the latter case, the digital camera 10 provides tracking information, including the distance to the object and the coordinates of the object, to the event module 21-2 of the conversion engine 21 instead of image data.
- The event module 21-2 determines whether the thrown object has collided with the wall, converts the coordinates of the collision point into coordinates on the execution screen of the interactive content application, generates an event including the converted coordinates, and transmits the event to the interactive content application.
- The principle by which the event module 21-2 determines whether the object has collided with the wall surface may be implemented by various algorithms.
- An example algorithm is as follows: the distance A between the camera 10 and the wall surface is measured in advance and stored in the conversion engine 21.
- The event module 21-2 compares the distance B to the object, continuously sent by the object recognition module 21-1, with the previously stored distance A, and when the two distances A and B become equal, the object is considered to have hit the wall.
- Another example algorithm is as follows: the event module 21-2 continuously monitors the change in the distance B to the object sent from the object recognition module 21-1, and the moment when the distance B stops increasing and starts to decrease is determined as the moment of collision.
- Yet another example algorithm is as follows: the event module 21-2 continuously monitors the change in the size of the object identified in the image data sent from the object recognition module 21-1. Since the apparent size gradually decreases as the distance from the camera 10 increases, the moment when the size of the object stops decreasing and starts to increase is determined as the moment of collision.
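The second algorithm (distance reversal) can be sketched as follows. This is a minimal illustration only; the class name and the per-frame distances are hypothetical, not part of the patent.

```python
# Sketch of collision detection by distance reversal (hypothetical values).
# The event module watches the camera-to-ball distance B reported per frame;
# the frame at which B stops increasing and starts decreasing (the bounce
# off the wall) is taken as the moment of collision.

class CollisionDetector:
    def __init__(self):
        self.prev = None
        self.increasing = False

    def update(self, distance):
        """Feed one tracked distance; return True at the collision frame."""
        hit = False
        if self.prev is not None:
            if distance > self.prev:
                self.increasing = True            # ball still flying toward wall
            elif distance < self.prev and self.increasing:
                hit = True                        # distance reversed: bounce
                self.increasing = False
        self.prev = distance
        return hit

det = CollisionDetector()
frames = [2.1, 2.8, 3.5, 4.0, 3.6]                # metres from camera, per frame
hits = [det.update(d) for d in frames]
print(hits.index(True))  # -> 4 (collision detected at the fifth frame)
```

The third algorithm has the same shape with the comparison reversed: the apparent pixel size shrinks while the ball recedes, so a size that stops decreasing and starts increasing marks the bounce.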
- The event module 21-2 has a mapping table in which the XY coordinates of the wall screen on which the content image is actually displayed are matched in advance with the xy coordinates on the execution screen of the content application.
- the event module 21-2 finds the XY coordinate of the collision point by processing the image data, and finds the xy coordinate matching the XY coordinate from the mapping table.
- the mapping table may be a database in which XY coordinates at predetermined intervals and xy coordinates at predetermined intervals are stored in advance, or an algorithm defining a correlation between the XY coordinates and the xy coordinates by an equation.
- the event module 21-2 generates an event including the converted xy coordinate and transmits it to the interactive content application.
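Of the two mapping-table forms mentioned above, the equation-based correlation can be sketched as a simple linear scaling. The wall and screen dimensions below are hypothetical, and a real installation would calibrate for projector offset and keystone distortion.

```python
# Sketch of mapping a wall collision point to application coordinates
# (all dimensions hypothetical). Wall-screen XY coordinates are converted
# to xy coordinates on the content application's execution screen by a
# linear correlation, as an alternative to a stored lookup table.

def wall_to_app(X, Y, wall_size=(4.0, 2.25), app_size=(1920, 1080)):
    """Map a collision point on the wall (metres) to app screen pixels."""
    sx = app_size[0] / wall_size[0]
    sy = app_size[1] / wall_size[1]
    return round(X * sx), round(Y * sy)

# A hit 2 m across and 1 m up on a 4 m x 2.25 m projected area:
print(wall_to_app(2.0, 1.0))  # -> (960, 480)
```

The event module would then wrap the resulting xy pair in a mouse click event for the interactive content application.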
- For example, in a graphical user interface (GUI) environment, by continuously generating mouse_move_Event(A1,B1), (A2,B2), (A3,B3), ..., the mouse cursor is moved and displayed along that path, and by generating mouse_left_Click(An,Bn) at the point where the mouse stops, the operating system or the activated application is notified that the left mouse button was clicked at the coordinates (An,Bn).
- In the present invention, an event should be understood as a concept including any event for inputting a user's instruction to the interactive content application 22. Accordingly, the events transmitted from the conversion engine 21 to the interactive content application 22 may be variously defined as a mouse left-click event, a mouse right-click event, a mouse movement event, a mouse double-click event, a mouse wheel-click event, and so on.
- Furthermore, when the object recognition module 21-1 identifies a plurality of objects, the event module 21-2 may generate a mouse left-click event when the first object is recognized, a mouse right-click event when the second object is recognized, and a mouse wheel-click event when the third object is recognized.
- In this case, since the player can control the virtual interactive content using three types of objects, content with a richer plot can be enjoyed.
- In this way, the conversion engine 21 generates an event and transmits it to the interactive content application 22 so that the thrown object operates like a mouse or a pointer.
- The event generated by the conversion engine 21 is compatible with the operating system on which the interactive content application 22 is executed. Alice, the developer of the interactive content application 22, does not need to discuss compatibility in advance with Bob, the developer of the conversion engine 21, so the conversion engine 21 of the present invention has the advantage that it can be applied to any interactive content sold on the market without separate modification for interfacing.
- The norm module 21-3 performs predetermined exception processing when the throwing point of the thrown object, that is, the point where the player throwing the object is located, exceeds the reference line.
- As described above, one intent of the present invention is to induce the player's movement through the content, and if the player crosses the baseline, approaches the wall, and then throws the ball, the exercise effect is halved. Therefore, if the player throws the ball from beyond the baseline, exception processing such as not recognizing the score according to a predetermined rule, or sounding a buzzer notifying a foul, is performed.
- An example implementation method is as follows.
- First, the object recognition module 21-1 identifies the player by processing the captured image of the sports interactive content, estimates the distance to the player using any one of the distance estimation algorithms described above, and provides it to the norm module 21-3.
- the norm module 21-3 determines whether the player has thrown beyond the baseline by comparing the distance to the player with the distance of the baseline measured in advance.
- In this method, the object recognition module 21-1 must further identify the player, in addition to the thrown object, in the captured image of the sports interactive content. Since players can have various heights, clothes, and genders, rather than identifying each individual player, an algorithm that recognizes a person when the shape approximately matches that of a typical person within a predetermined error range can be applied. Alternatively, the algorithm can be simplified by recognizing only the two legs rather than identifying the entire human body.
- Another example implementation method is as follows.
- In this method, the object recognition module 21-1 continuously tracks the movement of the thrown object while identifying it. The norm module 21-3 then determines whether the player has thrown from beyond the reference line by comparing the distance of the point at which the thrown object starts moving from a stationary state with the distance of the baseline measured in advance. Just before the player throws or kicks the object, the object is stationary, so the fact that it starts moving from a stationary state means that the player has thrown or kicked it. In this case, the object recognition module 21-1 only needs to identify the thrown object in the content image.
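This second baseline rule can be sketched as follows. The camera-to-baseline distance and the string returned for a foul are hypothetical stand-ins for the norm module's pre-measured value and its exception processing; the geometry assumes the camera faces the content wall from behind the player, so distances grow toward the wall.

```python
# Sketch of the baseline (norm module) check (hypothetical values).
# The distance at which the thrown object starts moving is compared with
# the pre-measured distance of the floor baseline from the camera; a throw
# that starts closer to the wall than the baseline is exception-handled.

BASELINE_DISTANCE = 3.0   # metres from camera to the floor baseline (pre-measured)

def check_throw(start_distance, baseline=BASELINE_DISTANCE):
    """Return 'valid' or 'foul' for a throw starting at start_distance."""
    # A start point farther from the camera than the baseline means the
    # player stepped over the line toward the wall before throwing.
    return "foul" if start_distance > baseline else "valid"

print(check_throw(2.4))   # -> valid (player behind the baseline)
print(check_throw(3.6))   # -> foul (player stepped past the baseline)
```

The first method differs only in which distance is compared: the tracked distance to the player's body rather than to the object's starting point.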
- the image output device 30 may be any type of device as long as it has a function of outputting a content image on a wall or the like.
- For example, a beam projector, a display device such as a large TV or monitor mounted on the wall, or an augmented reality headset may be used as the image output device 30.
- the image output device 30 is connected to the application driving device 20 through a cable or wireless communication.
- When the image is projected from in front of the wall, a problem may occur such as a shadow being cast on the image by the user or the moving object, whereas with a display device mounted on the wall, an image without a shadowed area caused by the user can be displayed.
- the machine learning server 40 includes a machine learning engine (not shown) that learns various characteristics for identifying an object based on the image data sent from the camera 10.
- For example, the machine learning server 40 can find specific patterns to identify the object based on at least one of the shape of the ball, the size of the ball, the surface pattern of the ball such as a honeycomb pattern, and the color of the ball.
- the machine learning server 40 may receive image data through an application driving device 20 connected to the digital camera 10 or may be directly connected to the digital camera 10 to receive image data.
- FIGS. 5A to 5D illustrate examples of photographing a thrown object at various locations in order to learn identification information of the thrown object in advance by machine learning.
- the user places a ball-like thrown object on his hand and, while changing its orientation front, rear, left, right, up, and down relative to the camera 10, takes tens to hundreds of pictures.
- FIGS. 5A to 5D illustrate a case in which the user directly holds the object and photographs it one shot at a time, but the method is not limited thereto; a scene of throwing the object (ball) into the shooting area of the camera 10, or of throwing it against the wall, may be recorded as a video, and machine learning may be performed on the image of each frame constituting the video.
- the machine learning server 40 finds a specific pattern to identify the thrown object more clearly by repeatedly analyzing the dozens to hundreds of different image data captured in this way.
- the machine learning server 40 can perform repetitive learning on the player in the same way.
- the object recognition module 21-1 of the conversion engine 21 can easily identify a thrown object and/or a player from the image data using the identification pattern information that the machine learning server 40 has learned in advance.
- thus, the object recognition module 21-1 of the conversion engine can accurately identify the thrown object from the images.
- the machine learning server 40 may learn only one object for one piece of content, but if the content must be controlled with a plurality of objects, the server may learn in advance to identify a plurality of different objects for that content.
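The patent does not disclose a concrete learning algorithm, so the following is only a stand-in sketch: a per-feature mean/standard-deviation "identification pattern" learned from dozens of feature vectors of the same ball (e.g. mean hue, apparent radius in pixels, circularity, all illustrative choices), plus a matching test for candidate regions.

```python
# Stand-in sketch for the machine learning server's pattern learning; the
# patent does not disclose its algorithm. Each sample is a feature vector
# extracted from one photo of the ball, e.g. (mean hue, radius in pixels,
# circularity); the "identification pattern" is a per-feature mean/stdev.

from statistics import mean, stdev

def learn_pattern(samples):
    """Derive a per-feature (mean, stdev) identification pattern from
    dozens to hundreds of feature vectors of the same object."""
    return [(mean(f), stdev(f)) for f in zip(*samples)]

def matches_pattern(candidate, pattern, k=3.0):
    """A candidate region matches when every feature lies within k
    standard deviations of the learned mean."""
    return all(abs(x - m) <= k * s for x, (m, s) in zip(candidate, pattern))
```

With this split, supporting a plurality of learned objects for one piece of content amounts to storing one pattern per object.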
- Embodiment 2 relates to a sports interactive content execution system that recognizes a throwing object using a mono camera.
- Example 2 assumes a case where a mono camera such as a closed-circuit camera (CCTV) is already installed for security purposes, or where a mono camera is used to build the sports interactive content execution system relatively inexpensively; however, it is not necessarily limited to these cases.
- FIG. 6 is a block diagram showing a detailed configuration of a sports interactive content execution system according to a second embodiment.
- the sports interactive content execution system of the second embodiment includes a digital camera 100, an application driving device 200, and an image output device 300, and may further include a machine learning server 400.
- the digital camera 100 photographs a content scene including a moving projection, and transmits the photographed image data to the application driving device 200.
- connection structure or communication protocol between the digital camera 100 and the application driving device 200 is the same as that of the digital camera 10 of the first embodiment.
- the digital camera 100 uses a structured pattern measurement algorithm to identify a projection in image data and estimate a distance between the camera 100 and the projection.
- the digital camera 100 of the structured pattern type includes at least one light projection module and at least one image sensor module; when the light projection module projects a structured set of light patterns onto an object, the image sensor performs optical 3D scanning by capturing the image reflected from the object, and the distance between the camera 100 and the thrown object is measured using the 3D scanning result.
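Besides the structured-pattern technique, the claims also allow a camera with a single image sensor that estimates distance from the thrown object's apparent size. A hedged sketch of that estimate under the pinhole camera model follows; the focal length and ball diameter are illustrative calibration values, not figures from the patent.

```python
# Hedged sketch of the size-based mono-camera estimate from the claims:
# under the pinhole model, apparent diameter shrinks in proportion to
# distance. focal_px and ball_diameter_m are illustrative calibration values.

def distance_from_apparent_size(diameter_px: float,
                                ball_diameter_m: float = 0.24,
                                focal_px: float = 800.0) -> float:
    """distance = focal_length * real_size / apparent_size."""
    return focal_px * ball_diameter_m / diameter_px
```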
- the application driving device 200 executes the conversion engine 210 and the interactive content application 220. It is the same as described in the first embodiment that the conversion engine 210 and the interactive content application 220 may be executed in one device 200 or separately executed in a separate device.
- the conversion engine 210 generates an event corresponding to a mouse click when the thrown object touches the wall, and transmits the event to the interactive content application 220.
- the conversion engine 210 may include an object recognition module 211, an event module 212, and a standard module 213.
- the object recognition module 211 processes the image data sent from the camera 100 to identify the projection, and estimates the distance between the camera 100 and the projection using a structured pattern technique.
- the event module 212 determines whether the thrown object has collided with the wall, converts the coordinates of the collision point into coordinates on the execution screen of the interactive content application, generates an event including the converted coordinates, and transmits the event to the interactive content application.
- the principle of the event module 212 transforming the coordinates is the same as described in the first embodiment.
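That coordinate conversion can be sketched as a linear mapping from the projected content area's rectangle in the camera image to the application's screen. This is illustrative only: a real implementation would likely warp by a homography to undo perspective distortion, and all rectangle values here are assumptions.

```python
# Hedged sketch of the event module's coordinate conversion (illustrative;
# a full implementation would use a homography for perspective distortion).
# cam_rect is the projected content area as seen in the camera image
# (x, y, width, height); app_size is the application screen (width, height).

def to_app_coords(hit_xy, cam_rect, app_size):
    """Map a wall-collision point from camera-image coordinates to
    coordinates on the interactive content application's screen."""
    (hx, hy), (cx, cy, cw, ch), (aw, ah) = hit_xy, cam_rect, app_size
    return ((hx - cx) / cw * aw, (hy - cy) / ch * ah)
```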
- the norm module 213 performs predetermined exception processing when the throwing point of the thrown object, that is, the point where the throwing player is located, is beyond the reference line.
- a detailed embodiment of the exception handling, and the method of determining whether the player has thrown the ball from beyond the baseline, are the same as those of the norm module 21-3 of the first embodiment.
- image output device 300 and the machine learning server 400 are also the same as the image output device 30 and the machine learning server 40 of the first embodiment.
- Embodiment 3 relates to a method of executing sports interactive content.
- FIG. 7 is a flowchart showing a method of executing content according to the third embodiment step by step.
- an image of an interactive basketball game is displayed by a beam projector.
- the basketball interactive game is assumed to be a game in which, while the basketball hoop moves left and right at a random speed, the score increases when the player throws a basketball toward the hoop on the wall and the ball touches a predetermined area close to the hoop.
- the digital camera installed on the ceiling captures an image displayed on the wall and a scene in which the player throws a basketball on the wall, and transmits the captured image data to the application driving device in real time (S101).
- the conversion engine running in the application driving device identifies a basketball ball learned in advance from the image data sent from the camera (S102), and tracks the movement of the basketball (S103).
- tracking refers to the process of determining the distance between the identified object and the camera and the coordinates on the wall screen where the object is located.
- the conversion engine converts the XY coordinates of the touch point into xy coordinates on the execution screen of the interactive content application (S106).
- a mouse event including the converted coordinates is generated, and the mouse event is transmitted to the interactive content application (S107).
- if the player has crossed the baseline, the conversion engine generates an event that outputs a warning message such as "Please enter the baseline" even before the basketball hits the wall, or an event that prevents a score from being recorded even if the basketball hits the wall, and transmits the generated event to the interactive content application (S108).
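The per-frame decision logic of steps S102 to S108 can be sketched as follows. This is an illustrative stand-in, not the patent's code; the boolean inputs represent the outcomes of the identification, wall-collision, and baseline checks described above.

```python
# Illustrative stand-in for the per-frame decision logic of steps S102-S108
# (not the patent's code). Inputs are the outcomes of the identification,
# wall-collision, and baseline checks performed by the conversion engine.

def frame_event(identified: bool, wall_hit: bool, crossed_baseline: bool,
                touch_xy=None):
    """Return the event the conversion engine would send for this frame."""
    if not identified:                      # S102 found no learned ball
        return None
    if crossed_baseline:                    # S108: baseline foul exception
        return ("warning", "Please enter the baseline")
    if wall_hit:                            # S104-S107: touch detected,
        return ("mouse_click", touch_xy)    # coordinates already converted
    return ("tracking", None)               # S103: keep tracking the ball
```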
- FIG. 8 is a flowchart illustrating a machine learning process step by step in the method of executing sports interactive content according to the third embodiment.
- for better understanding, the description refers to the situation of FIG. 5, in which the player enters the shooting range of the digital camera, holds a virtual mouse object such as a ball in one hand, and performs test shooting tens to hundreds of times.
- the machine learning server receives image data from the digital camera or from the application driving device connected to it (S201), processes the image data, and derives at least one characteristic among the shape, size, surface pattern, and color of the basketball and/or the player (S202).
- a certain pattern for identifying the object is defined based on the derived characteristics (S203). If the pattern is sufficiently defined, the machine learning process is terminated and the defined identification pattern is provided to the conversion engine (S204), where it can later be used as reference data for identifying the basketball and/or players. If the data are still insufficient to define a certain pattern, steps S201 to S203 are repeated.
- the machine learning server repeatedly analyzes dozens to hundreds of different image data captured in this way, thereby defining a specific pattern to more clearly identify an object.
- the termination of the machine learning process may be automatically executed when a preset criterion is satisfied, or may be executed arbitrarily at the discretion of an administrator.
- the pattern for identifying the basketball defined through the above steps is provided to the conversion engine, so that the basketball can be accurately identified from the still image of the moving basketball, no matter what kind of background is behind the basketball.
- Embodiment 4 relates to a system for multiplayer for playing sports interactive content in a manner in which two or more players cooperate or compete.
- FIG. 9 is a conceptual diagram schematically showing the configuration of a sports interactive content execution system for a multiplayer according to the fourth embodiment.
- FIG. 9 concerns a case in which basketball interactive game contents are displayed individually in a first area and a second area of the wall, and player 1 and player 2 play competitively using different first and second basketballs.
- FIG. 9 is only one embodiment; the content may also be implemented so that two players play competitively or collaboratively using different basketballs while viewing one wall screen together. It is likewise apparent to those skilled in the art that the system may be extended to a larger number of players.
- images of the basketball interactive game are individually displayed in the first area and the second area of the wall by a beam projector.
- a single beam projector is assumed, but two beam projectors may be required according to an execution method of interactive content.
- the game is assumed to be one in which the score increases when a player throws a basketball toward the basketball hoop on the wall and the ball touches a predetermined area near the goal.
- the first player and the second player play the content by throwing each basketball ball corresponding to the virtual mouse toward a specific point on the wall on which the content is displayed.
- a digital camera that photographs the actions of the first and second players and the content scene is disposed on the wall opposite the wall on which the content is projected, or on the ceiling, or on either side wall, and is connected to an application driving device (FIG. 9).
- a beam projector that receives an image of interactive content from an application driving device and outputs it to the wall is disposed on a wall or ceiling opposite the wall on which the content is projected.
- a reference line that the player should not cross may be displayed on the floor of the play area.
- FIG. 10 is a block diagram showing a detailed configuration of a sports interactive content execution system according to a fourth embodiment.
- the system of the fourth embodiment includes a digital camera 1000, an application driving device 2000, and an image output device 3000, and may further include a machine learning server 4000.
- the digital camera 1000 photographs a content scene including the moving first and second projections, and transmits the captured image data to the application driving apparatus 2000.
- connection structure or communication protocol between the digital camera 1000 and the application driving device 2000 is the same as that of the digital camera 10 of the first embodiment.
- the digital camera 1000 photographs a scene in which players play content while throwing the first and second projections, and transmits the captured image data to the application driving device 2000.
- the application driving device 2000 executes the conversion engine 2100 and the interactive content application 2200. It is the same as described in the first embodiment that the conversion engine 2100 and the interactive content application 2200 can be executed in one device 2000 or separately in separate devices.
- the conversion engine 2100 generates an event corresponding to a click of a mouse when the first or second projection is touched on the wall, and transmits the event to the interactive content application 2200.
- the conversion engine 2100 may include an object recognition module 2110 and an event module 2120, and may further include a reference module 2130 if necessary.
- the object recognition module 2110 processes the image data sent from the camera 1000 to identify the first and second thrown objects, and uses a stereo technique, a structured pattern technique, or another distance estimation algorithm serving the same purpose to measure the distance between the camera 1000 and the first thrown object and between the camera 1000 and the second thrown object. That is, the object recognition module 2110 identifies the first and second thrown objects from the content photographing images of the first and second regions, and determines the distances and coordinates of the first and second thrown objects.
- the event module 2120 determines whether a thrown object has collided with the wall, converts the coordinates of the collision point sent from the object recognition module 2110 into coordinates on the execution screen of the interactive content application, generates an event including the converted coordinates, and sends the event to the interactive content application. That is, the event module 2120 passes to the application a first event including the converted coordinates when the first thrown object hits the wall, and a second event including the converted coordinates when the second thrown object hits the wall.
- the principle of converting the coordinates by the event module 2120 is the same as described in the first embodiment.
- the norm module 2130 monitors whether a situation arises that should normally be treated as a foul or an error: when one player throws the other player's thrown object (for example, a ball) onto his or her own content image (that is, throws the other player's ball onto his or her own image), when one player throws the other player's thrown object onto the other player's content image (throws the other player's ball across), or when one player throws his or her own thrown object onto the other player's content image (throws his or her own ball across). If such a foul situation occurs, the norm module performs exception processing according to a predetermined rule, such as not recognizing the score or guiding a re-throw.
- the following solutions may be provided to identify a case where the first player throws the second player's second projection onto the content image.
- the object recognition module 2110 continuously tracks the first and second projections from the play image of the content.
- if the second thrown object starts moving in the play area of the first sports interactive content (hereinafter, the 'first content area') and touches the screen of the first content area, or the first thrown object starts moving in the play area of the second sports interactive content (hereinafter, the 'second content area') and touches the screen of the second content area, the norm module 2130 cannot know which player threw it, but it is in any case clearly a foul throw, so exception processing is performed.
- the object recognition module 2110 further identifies a first player and a second player in addition to the first and second projections from the play image of the content.
- if it is confirmed that the first player threw the second thrown object and the second thrown object touched the first content area, or that the second player threw the first thrown object and the first thrown object touched the second content area, the norm module 2130 performs exception handling for the foul situation.
- the following solutions may be provided to identify a case where the first player throws the second player's throwing object onto the second player's content image.
- the object recognition module 2110 continuously tracks the first and second projections from the play image of the content.
- if the second thrown object starts moving in the first content area and touches the screen of the second content area, or the first thrown object starts moving in the second content area and touches the screen of the first content area, it is a foul situation in which the other player's ball was thrown onto the other player's screen, so the norm module 2130 performs exception processing.
- the object recognition module 2110 further identifies a first player and a second player in addition to the first and second projections from the play image of the content.
- if the norm module 2130 confirms that a player threw the other player's ball onto the other player's screen, it is a foul situation, so exception handling is performed.
- the object recognition module 2110 continuously tracks the first and second projections from the play image of the content.
- if the first thrown object starts moving in the first content area and leaves the coordinate area of the first content, or the second thrown object starts moving in the second content area and leaves the coordinate area of the second content, it is a foul situation in which a player threw his or her own ball to the wrong place, so the norm module 2130 performs exception processing.
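The foul situations above can be classified from just two observations per throw, where the object started moving and where it touched, as in this hedged sketch. The area and ball numbering are ours (area i and ball i belong to player i), and deciding whether a ball that stays in its own area was thrown by its owner still requires the player identification the embodiment describes.

```python
# Hedged sketch classifying the multiplayer foul situations monitored by
# the norm module 2130, using only where a throw started moving and where
# it touched. Numbering is ours: area i and ball i belong to player i.

def classify_throw(ball_id: int, start_area: int, touch_area: int) -> str:
    """Return 'ok' for a legal throw, otherwise the detected foul type."""
    if start_area != touch_area:
        # Own ball thrown at the wrong screen, or the other player's ball
        # thrown across: a foul either way.
        return "foul_crossed_area"
    # Started and landed in one area: legal only if it is that player's
    # own ball (whose hand threw it may need player identification).
    return "ok" if ball_id == start_area else "foul_wrong_ball"
```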
- the object recognition module 2110 further identifies a first player and a second player in addition to the first and second projections from the play image of the content.
- the norm module 2130 of the present embodiment can also perform the same function as in the first and second embodiments when the throwing point of the thrown object, that is, the point where the throwing player is located, is beyond the reference line.
- image output device 3000 and the machine learning server 4000 are also the same as the image output device 30 and the machine learning server 40 of the first embodiment.
- those skilled in the art will readily understand that all or some of the functions of the sports interactive content execution systems and methods of the third and fourth embodiments described above may be tangibly implemented as a program of instructions and included in a computer-readable recording medium.
- the computer-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
- the program instructions recorded on the computer-readable recording medium may be specially designed and constructed for the present invention, or may be known to and usable by those skilled in computer software. Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, flash memory, and USB memory.
- the computer-readable recording medium may be a transmission medium such as an optical or metal wire or a waveguide including a carrier wave for transmitting a signal specifying a program command or a data structure.
- Examples of program instructions include not only machine language code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
- the hardware device may be configured to operate as one or more software modules to perform the operation of the present invention and vice versa.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Social Psychology (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
Claims (12)
- A sports interactive content execution system for inducing exercise, comprising: a digital camera that photographs an image of sports interactive content displayed on a wall; and an application driving device that executes a conversion engine including an object recognition module that identifies a thrown object in the photographed image of the sports interactive content and determines the distance and coordinates of the thrown object, an event module that delivers to an interactive content application an event including the coordinates of the thrown object when it hits the wall, and a norm module that performs exception processing when the throwing point of the thrown object is beyond a reference line.
- The sports interactive content execution system for inducing exercise according to claim 1, wherein the object recognition module identifies a player in the photographed image of the sports interactive content and provides the distance to the player to the norm module, and the norm module determines whether the throw was made from beyond the baseline by comparing the distance to the player with the distance of the baseline.
- The sports interactive content execution system for inducing exercise according to claim 1, wherein the norm module determines whether the throw was made from beyond the baseline by comparing the distance of the point at which the thrown object started moving with the distance of the baseline.
- The sports interactive content execution system for inducing exercise according to claim 1, further comprising a machine learning server that repeatedly analyzes a plurality of image data containing the thrown object to learn a pattern relating to at least one of the shape, size, surface pattern, and color for identifying the thrown object.
- The sports interactive content execution system for inducing exercise according to claim 1, wherein the digital camera has at least two image sensors, and the object recognition module calculates the distance between the digital camera and the thrown object using the difference in the angles of view of the image sensors.
- The sports interactive content execution system for inducing exercise according to claim 1, wherein the digital camera has at least one image sensor, and the object recognition module calculates the distance between the digital camera and the wall based on the size of the thrown object in the image captured by the digital camera.
- A sports interactive content execution system for inducing exercise, comprising: a digital camera that photographs an image of first sports interactive content and an image of second sports interactive content displayed on a wall; and an application driving device that executes a conversion engine including an object recognition module that identifies a first thrown object and a second thrown object in the photographed images of the first and second sports interactive content and determines the distances and coordinates of the first and second thrown objects, and an event module that delivers to an interactive content application a first event including the coordinates at which the first thrown object hit the wall and a second event including the coordinates at which the second thrown object hit the wall.
- The sports interactive content execution system for inducing exercise according to claim 7, wherein the object recognition module further identifies a first player and a second player in the photographed images of the first and second sports interactive content, the system further comprising a norm module that performs exception processing when the first player throws the second thrown object or the second player throws the first thrown object.
- The sports interactive content execution system for inducing exercise according to claim 7, further comprising a norm module that performs exception processing when the first thrown object moves from the play area of the second sports interactive content into the play area of the first sports interactive content and hits the wall.
- The sports interactive content execution system for inducing exercise according to claim 7, further comprising a machine learning server that repeatedly analyzes a plurality of image data containing the first thrown object or the second thrown object to learn a pattern relating to at least one of the shape, size, surface pattern, and color for identifying the first thrown object or the second thrown object.
- The sports interactive content execution system for inducing exercise according to claim 7, wherein the digital camera has at least two image sensors, and the object recognition module calculates the distance between the digital camera and the first thrown object or the second thrown object using the difference in the angles of view of the image sensors.
- The sports interactive content execution system for inducing exercise according to claim 7, wherein the digital camera has at least one image sensor, and the object recognition module calculates the distance between the digital camera and the first thrown object or the second thrown object based on the size of the first thrown object or the second thrown object in the image captured by the digital camera.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2019-0045098 | 2019-04-17 | ||
KR20190045098 | 2019-04-17 | ||
KR10-2019-0058258 | 2019-05-17 | ||
KR1020190058258A KR102054148B1 (ko) | 2019-04-17 | 2019-05-17 | 운동 유도를 위한 스포츠 인터렉티브 컨텐츠 실행 시스템 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020213784A1 true WO2020213784A1 (ko) | 2020-10-22 |
Family
ID=68729655
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/006029 WO2020213784A1 (ko) | 2019-04-17 | 2019-05-20 | 운동 유도를 위한 스포츠 인터렉티브 컨텐츠 실행 시스템 |
PCT/KR2019/006028 WO2020213783A1 (ko) | 2019-04-17 | 2019-05-20 | 가상 인터렉티브 컨텐츠의 사용자 인터페이스 제공 시스템, 방법 및 이를 위한 컴퓨터 프로그램이 저장된 기록매체 |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/006028 WO2020213783A1 (ko) | 2019-04-17 | 2019-05-20 | 가상 인터렉티브 컨텐츠의 사용자 인터페이스 제공 시스템, 방법 및 이를 위한 컴퓨터 프로그램이 저장된 기록매체 |
Country Status (2)
Country | Link |
---|---|
KR (3) | KR102054148B1 (ko) |
WO (2) | WO2020213784A1 (ko) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102454833B1 (ko) * | 2022-05-12 | 2022-10-14 | (주)이브이알스튜디오 | 가상의 아쿠아리움의 이미지를 표시하는 디스플레이 장치, 및 디스플레이 장치와 통신 가능한 사용자 단말의 제어 방법 |
KR20240078444A (ko) | 2022-11-23 | 2024-06-04 | 아주대학교산학협력단 | 학생의 신체 활동을 활성화하기 위한 멀티플레이 게임 관리 시스템 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120040818A (ko) * | 2010-10-20 | 2012-04-30 | 에스케이플래닛 주식회사 | 증강 현실 컨텐츠 재생 시스템 및 방법 |
KR20130071059A (ko) * | 2011-12-20 | 2013-06-28 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
KR20150035854A (ko) * | 2015-02-17 | 2015-04-07 | 주식회사 홍인터내셔날 | 원격 멀티 모드 시 스로우 라인을 이용한 인증이 가능한 다트 게임 장치 |
US20180293442A1 (en) * | 2017-04-06 | 2018-10-11 | Ants Technology (Hk) Limited | Apparatus, methods and computer products for video analytics |
KR101963682B1 (ko) * | 2018-09-10 | 2019-03-29 | 주식회사 큐랩 | 증강현실 기반의 스포츠 콘텐츠에 따른 신체 측정 데이터 관리 시스템 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20110013076A (ko) * | 2009-08-01 | 2011-02-09 | 강병수 | 카메라 시스템을 이용한 손짓 및 터치형 양손 반지 마우스 입력 장치 |
KR101357260B1 (ko) * | 2010-10-22 | 2014-02-03 | 주식회사 팬택 | 증강 현실 사용자 인터페이스 제공 장치 및 방법 |
KR20120114767A (ko) | 2011-04-08 | 2012-10-17 | 동서대학교산학협력단 | 사물 투척형 게임 디스플레이 시스템 및 그 방법 |
JP6074170B2 (ja) * | 2011-06-23 | 2017-02-01 | インテル・コーポレーション | 近距離動作のトラッキングのシステムおよび方法 |
KR101330531B1 (ko) | 2011-11-08 | 2013-11-18 | 재단법인대구경북과학기술원 | 3차원 카메라를 이용한 가상 터치 방법 및 장치 |
KR101572346B1 (ko) * | 2014-01-15 | 2015-11-26 | (주)디스트릭트홀딩스 | 증강현실 스테이지, 라이브 댄스 스테이지 및 라이브 오디션을 위한 서비스 시스템 및 서비스 방법 |
KR101860753B1 (ko) * | 2016-06-13 | 2018-05-24 | (주)블루클라우드 | 사용자 인식 컨텐츠 제공 시스템 및 그 동작방법 |
-
2019
- 2019-05-17 KR KR1020190058258A patent/KR102054148B1/ko active IP Right Grant
- 2019-05-17 KR KR1020190058257A patent/KR102041279B1/ko active IP Right Grant
- 2019-05-20 WO PCT/KR2019/006029 patent/WO2020213784A1/ko active Application Filing
- 2019-05-20 WO PCT/KR2019/006028 patent/WO2020213783A1/ko active Application Filing
- 2019-06-17 KR KR1020190071560A patent/KR102275702B1/ko active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120040818A (ko) * | 2010-10-20 | 2012-04-30 | 에스케이플래닛 주식회사 | 증강 현실 컨텐츠 재생 시스템 및 방법 |
KR20130071059A (ko) * | 2011-12-20 | 2013-06-28 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
KR20150035854A (ko) * | 2015-02-17 | 2015-04-07 | 주식회사 홍인터내셔날 | 원격 멀티 모드 시 스로우 라인을 이용한 인증이 가능한 다트 게임 장치 |
US20180293442A1 (en) * | 2017-04-06 | 2018-10-11 | Ants Technology (Hk) Limited | Apparatus, methods and computer products for video analytics |
KR101963682B1 (ko) * | 2018-09-10 | 2019-03-29 | 주식회사 큐랩 | 증강현실 기반의 스포츠 콘텐츠에 따른 신체 측정 데이터 관리 시스템 |
Also Published As
Publication number | Publication date |
---|---|
KR102054148B1 (ko) | 2019-12-12 |
KR20200122202A (ko) | 2020-10-27 |
KR102275702B1 (ko) | 2021-07-09 |
WO2020213783A1 (ko) | 2020-10-22 |
KR102041279B1 (ko) | 2019-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017123041A1 (ko) | 야구 연습 장치에 이용되는 센싱장치 및 센싱방법과, 이를 이용한 야구 연습 장치 및 이의 제어방법 | |
WO2022050792A1 (ko) | 테니스 자율훈련 시스템 | |
WO2019177364A1 (ko) | 가상 테니스 시뮬레이션 시스템 및 그 제어방법 | |
WO2017135632A1 (ko) | 플레이어 매칭 장치 및 플레이어 매칭 방법 | |
US11103783B2 (en) | Sports simulation system | |
WO2018131884A1 (en) | Moving robot and control method thereof | |
WO2020213784A1 (ko) | 운동 유도를 위한 스포츠 인터렉티브 컨텐츠 실행 시스템 | |
WO2016122217A1 (ko) | 다트 핀의 위치에 기초한 타격 면적에 따른 다트 게임을 제공하는 서버, 다트 게임 장치 및 컴퓨터 프로그램 | |
WO2017155343A1 (ko) | 다트 게임에 관련한 영상을 제공하기 위한 서버, 다트 게임 장치 및 컴퓨터 판독 가능 매체에 저장된 컴퓨터 프로그램 | |
WO2018030656A1 (ko) | 체험형 가상 야구 게임 장치 및 이에 의한 가상 야구 게임 제어방법 | |
WO2017135690A1 (ko) | 야구 연습 장치에 이용되는 센싱장치 및 센싱방법과, 이를 이용한 야구 연습 장치 및 이의 제어방법 | |
WO2013100239A1 (ko) | 스테레오 비전 시스템의 영상처리방법 및 그 장치 | |
US10942619B2 (en) | Interactive reality activity augmentation | |
WO2022092782A1 (ko) | 증강현실 인터랙티브 스포츠 장치를 이용한 개인별 운동량 측정 방법 | |
WO2018074709A1 (ko) | 실탄사격 시뮬레이션 게임 제공 방법 및 장치 | |
WO2019039747A1 (ko) | 가상 스포츠 시뮬레이션 장치 | |
Meško et al. | Laser spot detection | |
WO2016182330A1 (ko) | 이동부를 포함하는 다트 게임 장치 | |
WO2022080549A1 (ko) | 이중 라이다 센서 구조의 모션 트래킹 장치 | |
KR20090112538A (ko) | 조명 제어를 이용한 골프 영상 획득 장치, 및 그를 이용한 영상처리 기반의 골프 연습 시스템 | |
WO2019004531A1 (ko) | 사용자 신호 처리 방법 및 이러한 방법을 수행하는 장치 | |
WO2023054890A1 (ko) | 골프 코스 리스트 제공 장치 및 골프 코스 리스트 제공 방법 | |
WO2018038337A1 (ko) | Hmd 유저와 복수의 일반인들 간에 상호작용이 가능한 가상현실 컨텐츠 시스템 및 그 제어방법 | |
WO2022071660A1 (ko) | 대전 장치 및 대전 방법 | |
WO2020213786A1 (ko) | 신체 움직임 인식을 이용한 가상 인터렉티브 컨텐츠 실행 시스템 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19925177 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19925177 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25/04/2022) |
|