WO2017217725A1 - User recognition content providing system and operating method therefor - Google Patents

User recognition content providing system and operating method therefor

Info

Publication number
WO2017217725A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
interface screen
knee
foot
bent
Prior art date
Application number
PCT/KR2017/006112
Other languages
English (en)
Korean (ko)
Inventor
권찬영
Original Assignee
(주)블루클라우드
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)블루클라우드 filed Critical (주)블루클라우드
Publication of WO2017217725A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • the present invention relates to a user-recognized content providing system and a method of operating the same.
  • the system recognizes the user and generates a predetermined event according to a knee-bending motion: a projector projects an interface screen while a motion recognition sensor recognizes the user's motion.
  • an object of the present invention is to provide a user-recognized content providing system, and a method of operating the same, in which a projector unit projects an interface screen and a motion recognition sensor unit recognizes the user's motion, so that a predetermined event can be generated according to the bending motion of the knee.
  • to this end, the system comprises an interface screen providing unit that projects the interface screen toward the space where the user is located, a motion recognition sensor unit that recognizes the user's motion, and a control unit that, when the user places a foot on the interface screen and keeps the knee bent for a first selection time, generates a predetermined event by comparing the position of the foot corresponding to the user's bent knee with the interface screen.
  • in another aspect, the system comprises the interface screen providing unit that projects the interface screen toward the space in which the user is located, the motion recognition sensor unit that recognizes the user's motion, and a controller configured to generate a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent knee with the interface screen.
  • the user recognition content providing system may further include a display unit configured to display the interface screen or a content screen corresponding to the predetermined event under the control of the controller.
  • the control unit may determine whether the user's knee is bent according to the positions of the user's foot, knee, and pelvis measured by the motion recognition sensor unit.
  • in the corresponding method of operation, the interface screen providing unit projects the interface screen toward the space where the user is located, the motion recognition sensor unit recognizes the user's motion, and, when the user keeps a foot on the interface screen with the knee bent for the first selection time, the controller generates a predetermined event by comparing the position of the foot corresponding to the user's bent knee with the interface screen.
  • in another method of operation, the interface screen providing unit projects the interface screen toward the space in which the user is located, and the motion recognition sensor unit recognizes the user's motion; when the user places a foot on the interface screen, the control unit compares the position of the user's foot with the interface screen, and when the user's knee corresponding to the foot placed on the interface screen is bent and then stretched, the control unit generates a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent-and-stretched knee with the interface screen.
  • the method may further include the controller generating a predetermined event by comparing the position of the foot corresponding to the user's bent knee with the interface screen.
  • the method may further include the controller generating a predetermined event by comparing the position of the foot placed on the interface screen with the interface screen.
  • the control unit may determine whether the user's knee is bent according to the positions of the user's foot, knee, and pelvis measured by the motion recognition sensor unit.
  • the controller may compare the position of the foot corresponding to the user's bent knee with the interface screen in consideration of at least one of the position of the interface screen providing unit, the direction in which the interface screen providing unit projects the interface screen, the bottom surface of the space onto which the interface screen is projected, and the position of the motion recognition sensor unit.
  • a computer-readable recording medium according to the invention is characterized in that it stores a program for performing the operating method of the user recognition content providing system.
  • accordingly, a predetermined event can be generated according to the bending motion of the knee, with the projector projecting the interface screen and the motion recognition sensor unit recognizing the user's motion.
  • the user recognition content providing system and its operating method according to an embodiment of the present invention can thus be operated through simple and effective exercise motions, without the hassle of wearing additional tools on the user's body for motion recognition.
  • by determining the user's selection of a specific object through various combinations of the position of the user's foot and the knee-bending motion, the user recognition content providing system and its operating method according to an embodiment of the present invention can maximize the success rate of object selection.
  • FIG. 1 is a view showing a user recognition content providing system according to an embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating a method of operating a user recognition content providing system according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • FIGS. 4A to 4C are diagrams illustrating a use state of a user recognition content providing system according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a method of operating a user recognition content providing system according to another exemplary embodiment of the present invention.
  • FIG. 6 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • FIG. 10 is a diagram illustrating an interface screen projected by a user recognition content providing system according to an exemplary embodiment.
  • the user recognition content providing system 10 may include a motion recognition sensor unit 200, an interface screen providing unit 300, and a display unit 400.
  • the user recognition content providing system 10 may further include a controller (100 of FIG. 4A) that provides an exercise program by controlling the motion recognition sensor unit 200 and the interface screen providing unit 300.
  • the user-recognized content providing system 10 projects the interface screen of a desired exercise program onto the floor through the interface screen providing unit 300, so that a single content providing system can run the various exercise programs the user desires. That is, the user recognition content providing system 10 according to an embodiment of the present invention displays the interface screen of the exercise program, based on interactive content that guides the user's motions, through the interface screen providing unit 300.
  • the motion recognition sensor unit 200 may detect the user's movement and transfer the same to the control unit 100.
  • the controller 100 may recognize coordinate information of the user's main points (hands, feet, head, and joints) detected by the motion recognition sensor unit 200 to determine whether the user's motion matches the exercise program.
  • the controller 100 may provide a workout environment that feels real to the user by synchronizing the exercise program with the user's movement.
  • the exercise program projected into the user's exercise space by the interface screen providing unit 300 may be based on immersive interactive content that stimulates human sense and cognition using ICT (Information and Communication Technology) to provide experience and emotion close to reality, and in which objects, people, and virtual objects interact.
  • the realistic interactive content includes virtual objects that induce the actions the user should take according to the exercise program (for example, in the burpee, a whole-body exercise of crouching and jumping, the hands should touch the floor when the arms are bent), together with coordinate information and information on the order and interval of coordinate changes.
  • the coordinate information may be provided optimally for each user according to the user's age, gender, physical fitness, amount of exercise, exercise program in training, and the like.
  • the exercise program stores a sequence of the user's exercise motions, and the controller 100 may output the stored sequence through the interface screen providing unit 300.
  • the sequence may be stored in various forms, such as a table, a markup language in XML form, or another data structure, as in the sketch below.
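
A minimal sketch of how such a stored sequence might be represented and loaded, assuming Python; the XML element and attribute names are hypothetical illustrations, since the text only says the sequence may be kept as a table, XML, or another data structure.

    # Hypothetical exercise-sequence storage; the schema is an assumption.
    import xml.etree.ElementTree as ET

    SEQUENCE_XML = """
    <exercise name="squat-basic">
        <step order="1" action="stand_ready" duration_s="2.0"/>
        <step order="2" action="bend_knee" duration_s="1.5"/>
        <step order="3" action="stretch_knee" duration_s="1.5"/>
    </exercise>
    """

    def load_sequence(xml_text: str) -> list[dict]:
        """Parse the stored sequence into an ordered list of motion steps."""
        root = ET.fromstring(xml_text)
        steps = [
            {"order": int(s.get("order")),
             "action": s.get("action"),
             "duration_s": float(s.get("duration_s"))}
            for s in root.findall("step")
        ]
        return sorted(steps, key=lambda s: s["order"])

    for step in load_sequence(SEQUENCE_XML):
        print(step)
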
  • the motion recognition sensor unit 200 may recognize a user's motion in the front exercise space.
  • the motion recognition sensor unit 200 may include at least one camera, a depth camera, an infrared sensor, or a combination thereof to detect the user and the user's movement from the front.
  • a plurality of motion recognition sensor units 200 may be installed around the user (front, rear, left, and right) to generate multi-view images for recognizing the user's motion. That is, the motion recognition sensor unit 200 may detect the user's motion space in three dimensions or from multiple views.
  • the motion recognition sensor unit 200 transmits the collected information about the user, the user's motion, and the space to the control unit 100, enabling a more realistic three-dimensional motion interface optimized for the user and precise recognition of the user's motion.
  • the camera may capture a left image and a right image to generate a 3D image, and a multi-view may be implemented using a plurality of cameras.
  • data about the parallax (disparity) between the left and right images may be extracted from the depth value of the depth camera or from the 3D image and converted into the distance to the object: the disparity value is large when the object is close and small when it is far, as in the sketch below.
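
A small sketch of the stereo relationship described above, assuming Python; the focal length and baseline are illustrative values, not parameters from the patent.

    # Disparity between left and right images is inversely proportional to
    # distance: Z = f * B / d (focal length f in pixels, baseline B in metres).
    def disparity_to_distance(disparity_px: float,
                              focal_px: float = 580.0,
                              baseline_m: float = 0.075) -> float:
        """Convert a pixel disparity into a metric distance."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

    # A near object has a large disparity (small distance), and vice versa.
    print(disparity_to_distance(87.0))   # ~0.5 m
    print(disparity_to_distance(14.5))   # ~3.0 m
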
  • the motion recognition sensor unit 200 may transmit infrared rays toward a human body within its recognition range and detect the infrared signals reflected from the body, representing the body's important joints as a plurality of points (for example, 20 points); in this way, the movement, speed, and position of the human body can be recognized.
  • when several people are located within the recognition range, the motion recognition sensor unit 200 may isolate one person, for example by recognizing the person at the closest distance, or may recognize several people separately.
  • the motion recognition sensor unit 200 may be implemented by a "Kinect" device of Microsoft Corp. capable of motion capture and voice recognition; the motion recognition signal generated by the motion recognition sensor unit 200 is digital data indicating the motion and position of the detected human body, and may include position information and depth information for the plurality of points representing the body's important joints, as sketched below.
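
A sketch of the kind of motion-recognition signal this describes, assuming Python; the 20-joint, Kinect-style naming is an assumption, since the text only says "a plurality of points".

    from dataclasses import dataclass

    @dataclass
    class Joint:
        name: str    # e.g. "knee_left"
        x: float     # horizontal position (m)
        y: float     # vertical position (m)
        z: float     # depth from the sensor (m)

    @dataclass
    class SkeletonFrame:
        """One motion-recognition sample: positions of all tracked joints."""
        timestamp: float
        joints: dict[str, Joint]

        def joint(self, name: str) -> Joint:
            return self.joints[name]
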
  • the interface screen providing unit 300 may project the interface screen toward the space where the user is located.
  • the interface screen providing unit 300 may include a projector unit 310 and a reflecting unit 320.
  • the interface screen providing unit 300 may project a sufficiently enlarged image even in a place where a sufficient projection distance is not secured, such as under a low ceiling, and may project the image of the exercise program at the same position despite changes in height.
  • the projector unit 310 may project an interface screen for executing an exercise program based on sensory interactive content to the reflector 320.
  • the projector 310 may exchange data with the controller 100 through a wired or wireless interface of various methods such as HDMI, RGB, BNC, and Wireless HD.
  • the projector unit 310 may project an interface screen for an exercise program based on immersive interactive content to the reflector 320 through a connection with the controller 100.
  • the position and the projection angle of the projector 310 may be variously changed in order to adjust the size and position of the projected interface screen according to the type of exercise program. As an example, the projector 310 may be moved up, down, left and right and rotated 360 degrees through a rail or bracket.
  • the reflector 320 may reflect the interface screen projected from the projector 310 onto the floor of the user's exercise space.
  • the reflector 320 may be formed as a flat mirror, a convex mirror, a concave mirror, or a free-form mirror, and according to an embodiment the size of the exercise space can be adjusted by adjusting the projection distance to the floor onto which the interface screen is projected, according to the type and characteristics of the exercise program. For example, when the reflector 320 is implemented as a flat mirror, an effect can be obtained in which the projection distance is increased by twice the distance between the projector 310 and the flat mirror, compared with when the projector 310 projects directly onto the floor.
  • when the reflector 320 is implemented as a convex mirror, an effect of increasing the projection distance, according to the distance between the projector unit 310 and the convex mirror and the magnification of the convex mirror, can be obtained compared with direct projection onto the floor.
  • when the reflecting unit 320 is implemented as a concave mirror, the projection distance is shorter than when the projector unit 310 projects directly onto the floor, but the concave mirror concentrates the projected light, so an image that stays bright even in a bright place can be obtained.
  • in this way, the projection distance may be increased to secure an enlarged virtual exercise space, and distortion of the screen, which may arise because the angle reflected by the mirror is not perpendicular to the floor surface, can also be corrected; the folded-path geometry is sketched below.
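
A minimal sketch of the folded-path geometry, assuming Python and a projector aimed up at a flat mirror a distance d above it, so the throw grows by 2 x d over direct projection, matching the passage; all distances and the throw ratio are illustrative assumptions.

    def effective_throw(direct_to_floor_m: float,
                        projector_to_mirror_m: float) -> float:
        """Throw when the beam goes up to a flat mirror and back down again."""
        # Folding adds the up-leg plus the extra down-leg: an increase of 2*d.
        return direct_to_floor_m + 2.0 * projector_to_mirror_m

    def image_width(throw_m: float, throw_ratio: float = 1.5) -> float:
        """Approximate projected image width for a given throw ratio."""
        return throw_m / throw_ratio

    direct = 2.4                           # projector height over the floor
    folded = effective_throw(direct, 0.6)  # mirror mounted 0.6 m above it
    print(image_width(direct))   # ~1.6 m wide image without the mirror
    print(image_width(folded))   # ~2.4 m wide image with the folded path
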
  • the reflector 320 may be made of acrylic or plastic coated with a paint having a higher reflection coefficient than a general mirror, so that distortion of the image according to the projection distance does not occur.
  • the reflector 320 may move up, down, left, and right or rotate to adjust an angle reflected back and forth or left and right.
  • the reflector 320 may be moved up, down, left and right and rotated 360 degrees through a rail or bracket.
  • although FIG. 1 shows the reflecting unit 320 as a single reflective mirror, this is merely an example; the reflecting unit 320 may include a plurality of reflective mirrors arranged so that the interface screen is reflected between them, further extending the projection distance between the projector 310 and the floor.
  • for example, the projector unit 310 may be installed in the lower portion of the user recognition content providing system 10, with a few reflective mirrors zigzagging the light path in the middle so that the image is finally projected onto the floor through a reflective mirror installed on the ceiling.
  • the interface screen providing unit 300 may include a plurality of projectors 310; in this case, by outputting a plurality of screens through different reflecting units 320, image effects can be produced in which the screens overlap each other or are output adjacently.
  • the interface screen providing unit 300 may omit the reflection unit 320 so that the interface screen projected by the projector 310 may be directly projected onto the floor.
  • the display 400 may display the interface screen or display a content screen corresponding to a predetermined event generated by the controller 100 under the control of the controller 100.
  • the display unit 400 may output the result of the user's motion recognized by the motion recognition sensor unit 200, and may output information about the user's exercise motions and results as the user's position or orientation changes during exercise.
  • the display unit 400 displays information including exercise amount information, user information, exercise evaluation and results, calories consumed, personal health information, the user's own exercise history, or a combination thereof; by checking this during or after exercise, the user is provided with an environment for exercising more efficiently.
  • the controller 100 may recognize a user detected by the motion recognition sensor unit 200, extract registered user information, and prepare an exercise program to be provided to the user.
  • the control unit 100 may use the interface screen providing unit 300 to provide the user with an interface screen composed of the virtual objects of an exercise program generated from realistic interactive content, based on user information downloaded through a wired/wireless interface unit (not shown).
  • the controller 100 may recognize the user's motion through the motion recognition sensor unit 200 and follow it against the exercise program according to the sequence of exercise motions, including the start and end of the exercise program.
  • the controller 100 may provide a more realistic three-dimensional interface screen optimized for the user environment based on three-dimensional stereoscopic information on the motion space detected by the motion recognition sensor unit 200.
  • the controller 100 may project the user's exercise motion and body recognition state onto the floor through the interface screen providing unit 300 so that the user can immediately know.
  • the control unit 100 may accurately recognize the user's motion through the motion recognition sensor unit 200 and compare, in real time, the motion recognized by the motion recognition sensor unit 200 with the correct reference motion provided by the exercise program; if the user's motion does not match the reference motion, a corrected body image is displayed together with a posture correction message through the interface screen providing unit 300 or the display unit 400, leading the user to the accurate movement.
  • the controller 100 may notify the motion recognition time by providing motion evaluation information to the user using sound effects or screen effects at the moment of motion recognition.
  • when the user places a foot on the interface screen and keeps the knee bent for the first selection time, the controller 100 may generate a predetermined event by comparing the position of the foot corresponding to the user's bent knee with the interface screen. That is, when the motion recognition sensor unit 200 recognizes the user's motion and the user keeps the knee of one leg bent for the first selection time, the controller 100 may generate the event of the button corresponding, on the interface screen, to the position of the foot of the bent knee. As an example, the first selection time may be set to 2 seconds.
  • when the user places both feet on the interface screen and bends both knees, the control unit 100 may set the first selection time shorter than the first selection time used when the user places one foot on the interface screen and bends that knee. As an example, the first selection time in this case may be set to 1.7 seconds.
  • selecting a specific button on the interface screen by bending the knee in this way may be referred to by a term such as "quick selection", and the corresponding selection time by a term such as "quick selection time"; this dwell logic is sketched below.
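
A sketch of this "quick selection" dwell logic, assuming Python; the 2.0 s and 1.7 s values come from the examples above, while the class and method names are illustrative.

    class QuickSelector:
        """Fires a button event once a bent knee is held for the selection time."""

        def __init__(self, one_knee_s: float = 2.0, both_knees_s: float = 1.7):
            self.one_knee_s = one_knee_s
            self.both_knees_s = both_knees_s
            self._bend_start: float | None = None

        def update(self, now: float, knee_bent: bool, both_knees: bool) -> bool:
            """Feed one sample; True once the bend has been held long enough."""
            if not knee_bent:
                self._bend_start = None      # the knee straightened: reset
                return False
            if self._bend_start is None:
                self._bend_start = now       # the bend has just begun
            hold = self.both_knees_s if both_knees else self.one_knee_s
            return now - self._bend_start >= hold
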
  • alternatively, when the user places a foot on the interface screen, the controller 100 compares the position of the user's foot with the interface screen, and when the user's knee corresponding to the foot placed on the interface screen is bent and then stretched, a predetermined event may be generated according to a result of comparing the position of the foot corresponding to the user's bent-and-stretched knee with the interface screen. That is, when the motion recognition sensor unit 200 recognizes the user's motion and the user bends and straightens the knee of one leg, the controller 100 may generate the event of the button corresponding, on the interface screen, to the position of the foot of the bent-and-stretched knee. This will be further described with reference to FIGS. 3 and 4A to 4C.
  • the control unit 100 may determine whether the user's knee is bent according to the positions of the user's foot, knee, and pelvis measured by the motion recognition sensor unit 200. That is, as described above, since the motion recognition sensor unit 200 may represent the body's important joints as a plurality of points (for example, 20 points), whether the user's knee is bent can be determined by comparing the positions and angles of the user's foot, knee, and pelvis.
  • the control unit 100 may compare the position of the foot corresponding to the user's bent knee with the interface screen in consideration of at least one of the position of the interface screen providing unit 300, the direction in which the interface screen providing unit 300 projects the interface screen, the bottom surface of the space onto which the interface screen is projected, and the position of the motion recognition sensor unit 200. This is because, for the control unit 100 to accurately determine the button corresponding to the position of the foot of the user's bent knee on the interface screen, information such as the relative positions of the interface screen providing unit 300 and the motion recognition sensor unit 200 and the projection angle and direction of the interface screen must be taken into account. That is, the controller 100 may provide a synchronization process that synchronizes the motion recognition sensor unit 200 with the interface screen providing unit 300 so that the spatial coordinates of the interface screen providing unit 300, which projects the interface screen, and of the motion recognition sensor unit 200 are mutually recognized; one way to realize such a mapping is sketched below.
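One standard way to realize such a sensor-to-screen mapping is a planar homography estimated from a few calibration points; the sketch below assumes Python with OpenCV and NumPy, and the calibration values are illustrative, not data from the patent.

    import cv2
    import numpy as np

    # Foot positions reported by the sensor (metres on the floor plane) ...
    sensor_pts = np.float32([[0.0, 0.0], [2.0, 0.0], [2.0, 3.0], [0.0, 3.0]])
    # ... and where those same points land in interface-screen pixels.
    screen_pts = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])

    H, _ = cv2.findHomography(sensor_pts, screen_pts)

    def sensor_to_screen(x_m: float, y_m: float) -> tuple[float, float]:
        """Project a sensor-space foot position into screen pixel coordinates."""
        p = H @ np.array([x_m, y_m, 1.0])
        return (p[0] / p[2], p[1] / p[2])

    print(sensor_to_screen(1.0, 1.5))   # centre of the floor -> (640.0, 360.0)
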
  • the system may further include a posture detection unit (not shown) that detects posture information including the height or inclination of the user recognition content providing system 10, or a combination thereof, and transmits it to the controller 100.
  • the controller 100 may control the interface screen providing unit 300 based on position information, such as the height and inclination from the floor of the space, to automatically correct the size, distance, and position of the image projected into the virtual exercise space. That is, the controller 100 may keep the size, distance, and position of the projected image constant even if the height or inclination of the user recognition content providing system 10 changes.
  • the controller 100 may automatically adjust the exercise intensity according to the user's movement, gender, age, physical fitness, condition, and the like.
  • in response to the speed at which the user performs the exercise program, the controller 100 automatically increases the response speed of the exercise program when the user moves faster, and slows the response speed when the user's speed decreases, thereby providing an interface that optimizes the exercise.
  • the user-recognized content providing system 10 may further include a wired/wireless interface unit (not shown) that connects, by wire or wirelessly, to a remote management system (not shown) and provides a network interface for downloading user information, such as each user's age, gender, physical strength, amount of exercise, and exercise programs, as well as realistic interactive content.
  • the remote management system may store and manage each individual's exercise information in a database (not shown) connected to the remote management system, and may provide the user-recognized content providing system 10 according to an embodiment of the present invention with exercise programs, including exercise programs produced by the user.
  • the user recognition content providing system 10 may download an exercise program from the remote management system based on user information and information about the exercise space, or may use a realistic interactive content-based exercise program stored in the controller 100.
  • the user-recognized content providing system 10 recognizes the user in real time and transmits and receives the user's information and spatial information to and from the remote management system, so that the user's exercise information can be managed seamlessly in the remote management system.
  • the user recognition content providing system 10 according to an embodiment of the present invention and the remote management system are automatically and seamlessly linked to each other, making health care possible through the user's personal information and exercise information without any separate operation by the user.
  • the user-recognized content providing system 10 according to an embodiment of the present invention may continuously manage data such as average exercise amount, intensity, exercise period, etc. for each user to manage exercise history. Accordingly, the user can check and effectively manage athletic performance without time and space constraints.
  • the wired / wireless interface unit may communicate with an external device including a smart terminal, a remote controller, and a dedicated terminal through a wireless communication network including IrDA, Bluetooth, UWB and ZigBee.
  • the wired/wireless interface unit may receive an exercise program from the external device, or may control the operation of the interface screen providing unit 300 based on a control signal from it.
  • users can remotely adjust the size, distance, and position of the image projected into the virtual exercise space by using a smart terminal, such as a smartphone or tablet PC, or an external device, including a remote controller or a dedicated control terminal, over a wireless communication network; software control, such as changing the exercise program to be executed or increasing the exercise level within the program, may also be possible.
  • the user recognition content providing system 10 may further include a power supply (not shown).
  • the power supply unit may supply power to components such as the control unit 100, the motion recognition sensor unit 200, the interface screen providing unit 300, and the display unit 400.
  • the power supply unit may receive power from the outside using a power cable and supply the power to the component, or may be implemented as a rechargeable internal battery to supply power to the component.
  • FIG. 2 is a flowchart illustrating a method of operating a user recognition content providing system according to an embodiment of the present invention.
  • the operation method 20 of the user recognition content providing system illustrated in FIG. 2 may be performed by the user recognition content providing system 10 according to an embodiment of the present invention.
  • the method may include the interface screen providing unit 300 projecting an interface screen toward the user's exercise space (S21).
  • the method may include the motion recognition sensor unit 200 recognizing the user's motion (S22).
  • the controller 100 analyzes the positions of the user's foot, knee, and pelvis measured by the motion recognition sensor unit 200, and when the angle formed by the user's foot, knee, and pelvis is less than or equal to a first reference angle (for example, 90 degrees), it can determine that the user's knee is bent; this angle test is sketched below.
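
A minimal sketch of this angle test, assuming Python; the joint coordinates in the example are illustrative.

    import math

    def angle_deg(a, b, c) -> float:
        """Angle at vertex b (degrees) formed by 3D points a-b-c."""
        v1 = [a[i] - b[i] for i in range(3)]
        v2 = [c[i] - b[i] for i in range(3)]
        dot = sum(x * y for x, y in zip(v1, v2))
        n1 = math.sqrt(sum(x * x for x in v1))
        n2 = math.sqrt(sum(x * x for x in v2))
        return math.degrees(math.acos(dot / (n1 * n2)))

    def knee_is_bent(pelvis, knee, foot, first_reference_deg=90.0) -> bool:
        """Bent when the pelvis-knee-foot angle is at or below the reference."""
        return angle_deg(pelvis, knee, foot) <= first_reference_deg

    # Standing straight (~180 deg) vs. deeply bent (~76 deg):
    print(knee_is_bent((0, 1.0, 0), (0, 0.5, 0), (0, 0.0, 0)))     # False
    print(knee_is_bent((0, 1.0, 0), (0, 0.5, 0), (0.4, 0.6, 0)))   # True
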
  • when the motion recognition sensor 200 recognizes that the user has kept the knee of one leg bent for the first selection time, the controller 100 can generate the event of the button matching, on the interface screen, the position of the foot of the bent knee.
  • when the user places both feet on the interface screen and bends both knees, the control unit 100 can set the first selection time shorter than the first selection time used when the user places one foot on the interface screen and bends that knee.
  • a predetermined event may be generated by comparing the position of the foot corresponding to the user's bent knee with the interface screen. That is, the controller 100 may determine whether the user's knee is bent after the user places a foot on a specific button on the interface screen; when the user bends the knee while keeping the foot on the button, the controller 100 can generate a predetermined event by comparing the position of the foot corresponding to the bent knee with the interface screen.
  • as an example of such an event, the control unit 100 displays a screen corresponding to a push-up exercise program through the display unit 400 to allow the user to do push-ups.
  • the interface screen providing unit 300 may also project a screen associated with a push-up exercise to the floor.
  • FIG. 3 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • the operation method 30 of the user recognition content providing system illustrated in FIG. 3 may be performed by the user recognition content providing system 10 according to an embodiment of the present invention.
  • the method may include the interface screen providing unit 300 projecting an interface screen toward the user's exercise space (S31).
  • the operation method 30 of the user recognition content providing system may include the motion recognition sensor unit 200 recognizing the user's motion (S32).
  • the method may include, when the user places a foot on the interface screen, the control unit 100 comparing the position of the user's foot with the interface screen (S33).
  • the control unit 100 can know the position of the user's foot through the motion recognition sensor unit 200 and can determine which button the user has selected on the interface screen projected by the interface screen providing unit 300.
  • the method may include the control unit 100 generating a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent knee with the interface screen (S34).
  • the controller 100 analyzes the positions of the user's foot, knee, and pelvis measured by the motion recognition sensor unit 200: when the angle formed by the foot, knee, and pelvis is less than or equal to a first reference angle (for example, 90 degrees), it may determine that the user's knee is bent, and when that angle is greater than or equal to a second reference angle (for example, 150 degrees), it may determine that the user's knee is stretched. When the motion recognition sensor unit 200 recognizes that the user's bent knee has been extended, the controller 100 may generate the event of the button corresponding, on the interface screen, to the position of the foot of the bent knee.
  • the controller 100 may determine whether the user's knee is bent and then stretched after the user places a foot on a specific button on the interface screen; when the user bends and then stretches the knee while keeping the foot on the button, the controller 100 can generate the predetermined event by comparing the position of the foot corresponding to the bent-and-stretched knee with the interface screen. This two-threshold detection is sketched below.
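
A sketch of this bend-then-stretch detection, assuming Python; the two reference angles (90 and 150 degrees) come from the examples above, and using both gives hysteresis, so sensor jitter around a single threshold cannot fire spurious events.

    class BendStretchDetector:
        """Fires once the knee closes to 90 deg and then opens past 150 deg."""

        def __init__(self, bent_deg: float = 90.0, stretched_deg: float = 150.0):
            self.bent_deg = bent_deg
            self.stretched_deg = stretched_deg
            self._was_bent = False

        def update(self, knee_angle_deg: float) -> bool:
            """Feed one knee-angle sample; True on a completed bend-then-stretch."""
            if knee_angle_deg <= self.bent_deg:
                self._was_bent = True        # the bend phase has been seen
                return False
            if self._was_bent and knee_angle_deg >= self.stretched_deg:
                self._was_bent = False       # consume the gesture
                return True                  # event for the button under this foot
            return False

    detector = BendStretchDetector()
    for angle in (170, 120, 85, 100, 155):   # straight -> bend -> stretch
        fired = detector.update(angle)
    print(fired)   # True on the final sample
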
  • FIGS. 4A to 4C are diagrams illustrating a use state of a user recognition content providing system according to an embodiment of the present invention.
  • the interface screen providing unit 300 of the user recognition content providing system 10 may project the interface screen S on the floor where the user is located.
  • the interface screen S may include a plurality of buttons B.
  • the interface screen S may be changed in various forms. According to an embodiment, there may be a preparation position where a footprint is displayed on the interface screen S, and selectable exercise program buttons may be displayed on the interface screen S when the user moves to the preparation position.
  • the user may place a foot on any one of the buttons B on the interface screen S and bend the knee. That is, the user may move to the ready position, select any one of the buttons B in front, place a foot on it, and bend the knee.
  • the control unit 100 can detect the button B on which the foot is placed through the motion recognition sensor unit 200, and when the user bends the knee, the motion recognition sensor unit 200 can detect that as well.
  • when the user places a foot on any one button B on the interface screen S as shown in FIG. 4B, the button corresponding to the position of the foot of the user's bent knee is determined to be selected, and a predetermined event can be executed.
  • the user may extend the knee bent in FIG. 4B, with one foot remaining on the selection button, to clearly indicate his or her intention.
  • in the operation method 30 of the user-recognized content providing system according to another exemplary embodiment of the present invention, described above with reference to FIG. 3, when the user bends and then extends the knee, it is determined that the button corresponding to the position of the foot of the bent knee is selected, and a predetermined event can be executed.
  • the user may also place both feet on two buttons, or bend both knees, to select two buttons; accordingly, the user may issue a command by combining the two buttons.
  • FIG. 5 is a flowchart illustrating a method of operating a user recognition content providing system according to another exemplary embodiment of the present invention.
  • the operation method 40 of the user recognition content providing system illustrated in FIG. 5 may be performed by the user recognition content providing system 10 according to an embodiment of the present invention.
  • the interface screen providing unit 300 projects an interface screen toward a user's exercise space (S41).
  • the method includes the motion recognition sensor unit 200 recognizing the user's motion (S42) and, when the user places a foot on the interface screen, the control unit 100 comparing the position of the user's foot with the interface screen (S43). Since steps S41, S42, and S43 are similar to steps S31, S32, and S33 described above with reference to FIG. 3, a detailed description is omitted.
  • the method may include the control unit 100 generating a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent knee with the interface screen (S45). Since step S45 is similar to step S34 described above with reference to FIG. 3, a detailed description is omitted.
  • the method 40 of operating a user-recognized content providing system may further include, when the user's bent knee is not extended (S44 -> NO) but the user keeps the foot on the interface screen with the knee bent for the first selection time, the controller 100 generating a predetermined event by comparing the position of the foot corresponding to the user's bent knee with the interface screen (S46).
  • in this case, the controller 100 can determine that the button corresponding to the position of the foot of the user's bent knee is selected and generate the predetermined event.
  • when the user places both feet on the interface screen and bends both knees, the control unit 100 can set the first selection time shorter than the first selection time used when the user places one foot on the interface screen and bends that knee.
  • FIG. 6 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • the operation method 50 of the user recognition content providing system illustrated in FIG. 6 may be performed by the user recognition content providing system 10 according to an exemplary embodiment.
  • the interface screen providing unit 300 projects an interface screen toward a user's exercise space (S51).
  • the method includes the motion recognition sensor unit 200 recognizing the user's motion (S52) and, when the user places a foot on the interface screen, the control unit 100 comparing the position of the user's foot with the interface screen (S53).
  • since steps S51, S52, and S53 are similar to steps S31, S32, and S33 described above with reference to FIG. 3, and step S55 is similar to step S34 described above with reference to FIG. 3, detailed descriptions are omitted.
  • the operation method 50 of the user recognition content providing system may further include, when the user's bent knee is not extended (S54 -> NO) but the height of at least one of the user's bent knee and the foot corresponding to the bent knee rises, generating a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent knee with the interface screen (S56).
  • that is, even if the knee is not fully extended, when the height of the user's bent knee or of the corresponding foot rises, the controller 100 may acknowledge the knee as normally extended and perform step S56.
  • FIG. 7 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • the operation method 60 of the user recognition content providing system illustrated in FIG. 7 may be performed by the user recognition content providing system 10 according to an embodiment of the present invention.
  • since steps S61, S62, S63, S64, and S65 included in the method 60 of operating a user-recognized content providing system according to another embodiment of the present invention are similar to steps S51 through S55 described with reference to FIG. 6, detailed descriptions are omitted.
  • the method 60 of operating a user-recognized content providing system may further include, when the user's bent knee is not extended (S64 -> NO) but the height of at least one of the user's pelvis, spine, and shoulders rises, the control unit 100 generating a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent knee with the interface screen (S66).
  • that is, when the user has bent the knee but not fully stretched it, if the height of at least one of the user's pelvis, spine, and shoulders rises more than a certain distance, the knee may be acknowledged as normally extended even though it is not fully extended, and step S66 may be performed. As an example, the controller 100 may recognize the knee as normally extended when the heights of the user's pelvis and shoulders rise at the same time.
  • FIG. 8 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • the operation method 70 of the user recognition content providing system illustrated in FIG. 8 may be performed by the user recognition content providing system 10 according to an embodiment of the present invention.
  • since steps S71, S72, S73, S74, and S75 included in the method 70 of operating a user-recognized content providing system according to another embodiment of the present invention are similar to steps S51 through S55 described with reference to FIG. 6, detailed descriptions are omitted.
  • the operation method of the user recognition content providing system according to this embodiment may further include, when the user's bent knee is not extended (S74 -> NO), measuring the cumulative change of at least one of the position of the user's bent knee, the position of the user's pelvis, and the position of the user's shoulder, and, if the cumulative change exceeds a preset threshold, the controller 100 generating a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent knee with the interface screen (S76).
  • in this case, step S76 may be performed by acknowledging the knee as normally extended in order to compensate for the user's effort.
  • the controller 100 may measure the cumulative position change by analyzing the user's motion detected by the motion recognition sensor unit 200; as an example, the cumulative position change may be measured while the user keeps the foot on a specific button on the interface screen, as in the sketch below.
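
A sketch of this accumulation, assuming Python; the threshold value is an illustrative assumption.

    import math

    class EffortAccumulator:
        """Accumulates how far a joint moves while the foot stays on a button."""

        def __init__(self, threshold_m: float = 0.6):
            self.threshold_m = threshold_m
            self.total_m = 0.0
            self._last = None

        def update(self, joint_xyz: tuple[float, float, float]) -> bool:
            """Add this frame's displacement; True once the threshold is passed."""
            if self._last is not None:
                self.total_m += math.dist(self._last, joint_xyz)
            self._last = joint_xyz
            return self.total_m > self.threshold_m

        def reset(self) -> None:
            """Clear the tally, e.g. when the foot leaves the button."""
            self.total_m, self._last = 0.0, None
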
  • FIG. 9 is a flowchart illustrating a method of operating a user recognition content providing system according to another embodiment of the present invention.
  • the operation method 80 of the user recognition content providing system illustrated in FIG. 9 may be performed by the user recognition content providing system 10 according to an embodiment of the present invention.
  • the method 80 of operating a user recognition content providing system according to another embodiment of the present invention illustrated in FIG. 9 combines the operation methods of the user recognition content providing system described above with reference to FIGS. 3 and 6 to 8.
  • since steps S81, S82, and S83 included in the operation method 80 of the user-recognized content providing system according to another embodiment of the present invention are similar to steps S51, S52, and S53 described above with reference to FIG. 6, their descriptions are omitted here.
  • the method 80 of operating a user-recognized content providing system includes, when the user's knee corresponding to a foot placed on the interface screen is bent and then extended (S84 -> YES), the controller 100 generating a predetermined event according to a result of comparing the position of the foot corresponding to the user's bent knee with the interface screen (S88).
  • even when the bent knee is not extended, step S88 can be performed if the height of at least one of the user's bent knee and the foot corresponding to the bent knee rises (S85 -> YES); if it does not rise (S85 -> NO), step S88 can still be performed if the height of at least one of the user's pelvis, spine, and shoulders rises (S86 -> YES).
  • when the height of at least one of the user's pelvis, spine, and shoulders does not rise (S86 -> NO), the cumulative change of at least one of the position of the user's bent knee, the position of the user's pelvis, and the position of the user's shoulder is measured, and if the cumulative change exceeds a preset threshold (S87 -> YES), step S88 may be performed; when the cumulative change does not exceed the preset threshold (S87 -> NO), the method can return to step S84.
  • by combining the operation methods described above with reference to FIGS. 3 and 6 to 8, the method 80 illustrated in FIG. 9 makes the algorithm more sophisticated, so that an input intended by the user is not missed.
  • the order of steps S85, S86, and S87 shown in FIG. 9 is exemplary and may be variously changed according to the embodiment.
  • step S46 described with reference to FIG. 5 may also be combined with steps S85, S86, and S87.
  • in steps S84, S85, S86, and S87, the result may be determined within a predetermined time set in advance by the controller 100.
  • the operating method of the user-recognized content providing system according to the present invention may be performed by means of a computer-readable recording medium that stores a program for performing the operating method of the user-recognized content providing system according to an embodiment of the present invention described above with reference to FIGS. 2 to 9.
  • the operating method of the user recognition content providing system according to the present invention may be performed by a computer program stored in a medium in order to execute, in combination with hardware, the operating method described above with reference to FIGS. 2 to 9; it is also possible to implement the operating method as computer-readable code on a computer-readable recording medium.
  • computer-readable recording media include any type of recording device that stores data readable by a computer system; examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage. The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
  • FIG. 10 is a diagram illustrating an interface screen projected by a user recognition content providing system according to an exemplary embodiment.
  • the interface screen may include a ready position where a footprint is displayed, and buttons corresponding to various exercise programs may be displayed.
  • the screen illustrated in FIG. 10 is an example, and the layout, configuration, interface, and detailed operation of the screen may be variously changed.
  • when the user places a foot on the interface screen and keeps it there for a second selection time, the control unit 100 may generate a predetermined event by comparing the position of the foot placed on the interface screen with the interface screen. As an example, the second selection time may be set to 3 seconds, longer than the first selection time. That is, in addition to the knee-bending motion, the controller 100 may determine that a button on the interface screen is selected when the user keeps the foot on the specific button for the second selection time or more.
  • when the user places both feet on specific buttons on the interface screen, the control unit 100 can set the second selection time shorter than the second selection time used when the user places one foot on a specific button.
  • keeping a foot on a specific button on the interface screen for the second selection time to select it may be referred to by a term such as "slow selection", and the second selection time by a term such as "slow selection time"; this is sketched below.
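
A sketch of this "slow selection" path, assuming Python; the 3.0 s value comes from the example above, while the shorter both-feet time and all names are illustrative assumptions.

    class SlowSelector:
        """Selects a button once a foot rests on it for the second selection time."""

        def __init__(self, one_foot_s: float = 3.0, both_feet_s: float = 2.5):
            self.one_foot_s = one_foot_s
            self.both_feet_s = both_feet_s    # assumed shorter, per the text
            self._dwell_start: float | None = None
            self._button = None

        def update(self, now: float, button, both_feet: bool) -> bool:
            """Track the button under the foot; True when the dwell completes."""
            if button != self._button:        # the foot moved to another button
                self._button, self._dwell_start = button, now
                return False
            if button is None or self._dwell_start is None:
                return False
            hold = self.both_feet_s if both_feet else self.one_foot_s
            return now - self._dwell_start >= hold
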
  • in this way, the user's intention to select a specific object can be effectively understood through various combinations of the position of the user's foot, the bending and stretching motions of the knee, and the holding time of each motion; accordingly, the success rate of object selection by the user can be maximized.
  • the user-recognized content providing system has been described with reference to an exercise system that recognizes a user's motion, selects an exercise menu, and provides exercise content corresponding to the selected menu.
  • however, the present invention is not limited thereto; it can also be applied to an advertising device that displays an interface screen in the space where the user is located and provides various advertisements according to menus selected on the interface screen.
  • the present invention can likewise be applied to a rehabilitation medical device that provides various rehabilitation exercises by projecting an interface screen into the space where a user is located and having a user who needs rehabilitation select a menu projected on the interface screen.
  • various embodiments described herein can be implemented by hardware, middleware, microcode, software and / or combinations thereof.
  • various embodiments may be implemented within one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, other electronic units designed to perform the functions presented herein, or a combination thereof.
  • various embodiments may be embedded in or encoded on a computer-readable medium containing instructions. Instructions embedded in or encoded on a computer-readable medium may cause a programmable processor or other processor to perform a method, for example, when the instructions are executed.
  • Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • the storage medium may be any available medium that can be accessed by a computer.
  • such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage media, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures accessible by a computer.
  • Such hardware, software, firmware, etc. may be implemented within the same device or within separate devices to support the various operations and functions described herein.
  • the components, units, modules, and the like described herein as "parts" may be implemented together or separately as discrete but interoperable logic devices.
  • the depiction of different features for modules, units, etc. is intended to highlight different functional embodiments and does not necessarily mean that they must be realized by individual hardware or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware or software components or integrated into common or separate hardware or software components.
  • the user recognition content providing system and its operation method according to embodiments of the present invention can be used in the industry related to the user recognition content providing system.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a user recognition content providing system and an operating method thereof. The user recognition content providing system comprises: an interface screen providing unit that projects an interface screen toward a space where a user is located; a motion recognition sensor unit that recognizes the user's motions; and a control unit that, when the user places a foot on the interface screen, compares the interface screen with the location of the user's foot and, if the user's knee corresponding to the foot placed on the interface screen has been bent and then stretched, generates a specific event according to the result of comparing the interface screen with the location of the foot corresponding to the user's bent-and-stretched knee.
PCT/KR2017/006112 2016-06-13 2017-06-13 Système de fourniture de contenu de reconnaissance d'utilisateur et son procédé de fonctionnement WO2017217725A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160073414A KR101860753B1 (ko) 2016-06-13 2016-06-13 사용자 인식 컨텐츠 제공 시스템 및 그 동작방법
KR10-2016-0073414 2016-06-13

Publications (1)

Publication Number Publication Date
WO2017217725A1 true WO2017217725A1 (fr) 2017-12-21

Family

ID=60663647

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/006112 WO2017217725A1 (fr) 2016-06-13 2017-06-13 Système de fourniture de contenu de reconnaissance d'utilisateur et son procédé de fonctionnement

Country Status (2)

Country Link
KR (1) KR101860753B1 (fr)
WO (1) WO2017217725A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220280834A1 (en) * 2021-03-05 2022-09-08 John S. Keeney Agility ladder projection systems, devices, and methods
US12017112B2 (en) * 2021-03-05 2024-06-25 John S. Keeney Agility ladder projection systems, devices, and methods

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102125254B1 (ko) * 2018-05-15 2020-06-22 (주)블루클라우드 사용자 인식 보행 동작 측정 시스템 및 이를 이용한 보행 동작 측정 방법
KR102049096B1 (ko) * 2019-03-27 2019-11-26 주식회사 마이베네핏 혼합 현실 기반의 운동 시스템
KR102041279B1 (ko) * 2019-04-17 2019-11-27 주식회사 지티온 가상 인터렉티브 컨텐츠의 사용자 인터페이스 제공 시스템, 방법 및 이를 위한 컴퓨터 프로그램이 저장된 기록매체
KR102352366B1 (ko) 2019-04-30 2022-01-18 부산대학교 산학협력단 스마트 거울, 스마트 거울 재활 시스템 및 스마트 거울을 이용한 재활 훈련 방법
KR102051004B1 (ko) * 2019-05-03 2019-12-03 주식회사 마이베네핏 최적화된 음향을 제공하는 혼합 현실 기반 운동 시스템
KR102366102B1 (ko) * 2021-08-20 2022-02-24 주식회사 조이펀 3차원 캐릭터 기반 실감형 인터렉티브 운동 컨텐츠 제공 시스템
KR102354559B1 (ko) 2021-08-24 2022-01-21 한국기술교육대학교 산학협력단 콘텐츠 제어용 다종 인터페이스 장치
KR102510412B1 (ko) 2022-01-21 2023-03-16 서정협 양방향 증강현실 콘텐츠 생성 시스템
KR102434017B1 (ko) 2022-03-30 2022-08-22 유디포엠(주) 증강현실 콘텐츠 출력 장치

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR960006880A (ko) * 1994-08-01 1996-03-22 유재원 영양평가 및 운동처방에 의한 종합 건강 관리 시스템
KR20110101328A (ko) * 2010-03-08 2011-09-16 에스케이텔레콤 주식회사 제스처 인식을 이용한 콘텐츠 전송 방법 및 그 단말기
JP2014529420A (ja) * 2011-08-08 2014-11-13 ギャリー・アンド・マリー・ウエスト・ヘルス・インスティテュート 患者に施される身体のリハビリテーションを増強するための非侵襲性動作追跡システムと、装置と、方法
KR20160061557A (ko) * 2014-11-21 2016-06-01 대한민국(국립재활원장) 투사 영상을 이용한 보행 재활 치료 보조 시스템 및 방법

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LEE, CHANG HUI: "Fitness Program Development using BlueCloude, and Mixed Reality", MAEIL BUSINESS NEWSPAPER, 9 February 2015 (2015-02-09), XP055463193, Retrieved from the Internet <URL:http://news.mk.co.kr/newsRead.php?year=2015&no=131280> *

Also Published As

Publication number Publication date
KR20170140726A (ko) 2017-12-21
KR101860753B1 (ko) 2018-05-24

Similar Documents

Publication Publication Date Title
WO2017217725A1 Système de fourniture de contenu de reconnaissance d'utilisateur et son procédé de fonctionnement
WO2018155973A2 Appareil et procédé d'apprentissage de radiologie basé sur la réalité virtuelle
JP5256561B2 (ja) 情報入力装置、情報出力装置および方法
WO2018054056A1 Procédé d'exercice interactif et dispositif intelligent à porter sur la tête
CN108572728B (zh) 信息处理设备、信息处理方法和程序
WO2016024829A1 (fr) Système de guidage de correction de démarche, et son procédé de commande
JPWO2009072435A1 (ja) 画像認識装置および画像認識方法
WO2020040363A1 Procédé et dispositif de guidage de mouvement à l'aide d'un avatar 4d
WO2020130689A1 (fr) Dispositif électronique pour recommander un contenu de jeu et son procédé de fonctionnement
WO2015008935A1 (fr) Système de simulation et de formation à la réanimation cardiopulmonaire (cpr), et procédé de commande correspondant
WO2017065535A1 (fr) Dispositif électronique et son procédé de commande
US20210004132A1 (en) Information processing device, information processing method, and program
JP6651086B1 (ja) 画像分析プログラム、情報処理端末、及び画像分析システム
US20210287330A1 (en) Information processing system, method of information processing, and program
US11112961B2 (en) Information processing system, information processing method, and program for object transfer between devices
WO2016204335A1 Dispositif pour fournir un espace d'exercice virtuel augmenté par un système d'exercice basé sur du contenu interactif immersif et procédé associé
WO2012026681A9 (fr) Système de pratique des arts martiaux en réalité virtuelle utilisant un réseau et son procédé de commande
JP2017068468A (ja) 情報処理装置、情報処理方法及びプログラム
KR20160150215A (ko) 실감형 인터랙티브 콘텐츠에 기반한 운동시스템 및 그 방법
KR20190130761A (ko) 사용자 인식 보행 동작 측정 시스템 및 이를 이용한 보행 동작 측정 방법
TWI533238B (zh) 動作偵測與判斷裝置及方法
KR101268640B1 (ko) 대형 스크린을 위한 디스플레이 시스템 및 그 시스템의 동작 방법
WO2024058439A1 Procédé et appareil de détermination de persona d'objet d'avatar agencé dans un espace virtuel
WO2023224250A1 (fr) Appareil électronique et son procédé de commande
WO2024019374A1 Procédé et dispositif de fourniture d'une image médicale en coupe transversale

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17813549

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 1.4.19)

122 Ep: pct application non-entry in european phase

Ref document number: 17813549

Country of ref document: EP

Kind code of ref document: A1