CN211180839U - Motion teaching equipment and motion teaching system - Google Patents

Motion teaching equipment and motion teaching system

Info

Publication number
CN211180839U
CN211180839U (application CN201922042073.8U)
Authority
CN
China
Prior art keywords
image
unit
motion
moving body
motion teaching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201922042073.8U
Other languages
Chinese (zh)
Inventor
乔宇
邹静
王亚立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201922042073.8U priority Critical patent/CN211180839U/en
Application granted granted Critical
Publication of CN211180839U publication Critical patent/CN211180839U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The present application provides a motion teaching device and a motion teaching system. The motion teaching device includes an image display unit, an image acquisition unit, a moving body wearing unit, and an image output unit, the image output unit being connected to the image display unit, the image acquisition unit, and the moving body wearing unit respectively. The image display unit displays a motion teaching image; the image acquisition unit captures a real-time posture image; and the moving body wearing unit collects spatial position information of the moving joints. Based on the spatial position information, the positions of the moving joints of the moving body can be determined in the real-time posture image, the degree of similarity or difference between the moving body's motion and the content of the motion teaching image can be determined, and a corresponding target posture image can be generated. No complex image processing scheme is required, and no hardware with strong computing power is needed, thereby reducing cost.

Description

Motion teaching equipment and motion teaching system
Technical Field
The application belongs to the technical field of computers, and particularly relates to a motion teaching device and a motion teaching system.
Background
As living standards rise, more and more smart household appliances are enjoyed by consumers, such as smart speakers, smart televisions, and virtual reality game machines. An existing virtual reality game machine can also serve as a motion teaching device, guiding a user to exercise by playing a preset motion teaching video, such as a yoga training video or a series of videos containing fitness actions, such as a bodyweight training video.
However, an existing motion teaching device guides the user's indoor exercise by acquiring images of the user during motion, performing a series of complicated steps on those images such as image preprocessing, image segmentation, and pixel-matrix calculation, and finally identifying whether the user's posture during the exercise matches the preset motion teaching content. Such a device therefore inevitably has to apply various image processing models and algorithms to perform complex image processing and computation on the user's motion images, and must rely on hardware with strong computing power, so its cost is excessively high.
SUMMARY OF THE UTILITY MODEL
The embodiments of the present application provide a motion teaching device and a motion teaching system that can reduce the cost of motion teaching equipment.
An object of the present application is to provide a motion teaching apparatus, including:
the image display unit, used for displaying a preset motion teaching image and a target posture image, wherein the image content of the motion teaching image is intended to be imitated by a moving body;
the image acquisition unit, used for acquiring a real-time posture image while the moving body imitates the image content of the motion teaching image;
the moving body wearing unit, used for being worn on a moving joint of the moving body and for collecting spatial position information of the moving joint;
and the image output unit, connected to the image display unit, the image acquisition unit, and the moving body wearing unit respectively, and used for generating the target posture image according to the real-time posture image, the spatial position information, and the motion teaching image.
Further, the image output unit is specifically configured to determine a real-time position of a moving joint in the real-time posture image according to the spatial position information, compare the real-time position with a reference position of the moving joint preset in the motion teaching image to obtain posture difference information, and generate the target posture image according to the posture difference information.
Further, the image output unit is specifically configured to generate a target correction image when the posture difference information is equal to or greater than a preset threshold.
Further, the image output unit is specifically configured to generate a target completion image when the posture difference information is smaller than a preset threshold.
Further, the motion teaching apparatus further includes:
and the information transceiving unit is connected with the image output unit and is used for wirelessly receiving the spatial position information sent by the moving body wearing unit and transmitting the spatial position information to the image output unit.
Further, the image output unit is also used for generating a control instruction according to the target correction image and sending the control instruction to the moving body wearing unit through the information transceiving unit; wherein the control instruction is used for controlling the moving body wearing unit to vibrate or contract.
Further, the motion teaching apparatus further includes:
the image rendering unit is connected with the information transceiving unit;
the image output unit is further configured to send the target correction image or the target completion image to the image rendering unit through the information transceiving unit; the image rendering unit is used for performing augmented reality image rendering according to the target correction image or the target completion image.
Further, the information transceiving unit comprises at least one of a Bluetooth unit, a WiFi unit, an infrared unit, a zigbee unit and a 5G communication unit.
Further, the moving body wearing unit includes at least one of gloves, a foot cover, a knee pad, a bracelet, an ankle band, a helmet, a collar, and a waist support band.
It is another object of the present application to provide a motion teaching system including a motion teaching apparatus as described above.
The present application provides a motion teaching device and a motion teaching system. The motion teaching device includes an image display unit, an image acquisition unit, a moving body wearing unit, and an image output unit, the image output unit being connected to each of the other three units. The image display unit displays a preset motion teaching image; the image acquisition unit acquires a real-time posture image of the moving body as it imitates the image content of the motion teaching image; and the moving body wearing unit, worn on a moving joint of the moving body, collects spatial position information of that joint. Since the spatial position information represents the pixel range occupied by the moving joint in the real-time posture image, the position of each moving joint can be determined directly in the real-time posture image. The image output unit can therefore determine, from the real-time posture image and the spatial position information, how closely the moving body imitates the content of the motion teaching image, generate a target posture image accordingly, and send it to the image display unit for display, completing the motion teaching. No complex image processing scheme is required, and no hardware with strong computing power is needed, so cost is reduced.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a motion teaching apparatus provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a motion teaching apparatus according to another embodiment of the present application;
fig. 3 is a schematic structural diagram of a motion teaching system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments derived by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a motion teaching apparatus provided in an embodiment of the present application, and for convenience of description, only a portion related to the embodiment of the present application is shown.
As shown in fig. 1, a motion teaching apparatus 100 includes: an image display unit 10, an image acquisition unit 20, a moving body wearing unit 30, and an image output unit 40. Specifically:
The image display unit 10 is used for displaying a preset motion teaching image and a target posture image, wherein the image content of the motion teaching image is intended to be imitated by the moving body.
The image acquisition unit 20 is used for acquiring a real-time posture image while the moving body imitates the image content of the motion teaching image.
The moving body wearing unit 30 is worn on a moving joint of the moving body and collects spatial position information of the moving joint.
The image output unit 40 is connected to the image display unit 10, the image acquisition unit 20, and the moving body wearing unit 30 respectively, and generates a target posture image from the real-time posture image, the spatial position information, and the motion teaching image.
In this embodiment, the image display unit 10, the image acquisition unit 20, and the moving body wearing unit 30 are each connected to the image output unit 40: the image display unit 10 and the image acquisition unit 20 through wired electrical connections, and the moving body wearing unit 30 likewise through a wired connection. In other embodiments, the moving body wearing unit 30 may instead be connected to the image output unit 40 wirelessly.
The image display unit 10 first displays the motion teaching image; it displays the target posture image only after the image output unit 40 has generated the target posture image and transmitted it to the image display unit 10. The image acquisition unit 20 acquires a real-time posture image while the moving body imitates the image content of the motion teaching image and transmits it to the image output unit 40. Because the moving body wearing unit 30 is worn on a moving joint of the moving body, it can collect the spatial position information of that joint during the imitation and transmit this information to the image output unit 40. The image output unit 40 then generates the target posture image from the real-time posture image, the spatial position information, and the motion teaching image.
It should be noted that the motion teaching image displayed by the image display unit 10 is intended to be imitated by the moving body, that is, it contains a posture or motion for the moving body to reproduce. In all embodiments of the present application, the moving body may be a human being, or a robot capable of imitating the motion content of the motion teaching image; this is not limited here. While the image display unit 10 displays the motion teaching image, the moving body imitates it by observing its content, and the image acquisition unit 20 captures the imitated motion, so the real-time posture image corresponds in time sequence to the image content of the motion teaching image. The moving body wearing unit 30, worn on a moving joint, collects the spatial position information of that joint during the imitation, and the image output unit 40 can determine the moving-joint region in the real-time posture image from this information.
To identify how closely the moving body imitates the image content of the motion teaching image, the image output unit 40 combines the spatial position information with the real-time posture image: it determines the moving-joint regions of the moving body in the real-time posture image, distinguishes them from other regions, and then compares them with the motion teaching image to determine how the imitated posture differs from the taught content. In all embodiments of the present application, the spatial position information includes moving-joint identity information and joint spatial information. The joint identity information distinguishes which joint of the moving body is tracked, for example the wrist, elbow, or shoulder; the joint spatial information gives the position of each moving joint within the real-time posture image, for example the wrist position, elbow position, or shoulder position. From the spatial position information, both the identity of each moving joint and its specific position in the real-time posture image can be determined. By comparing these joint positions with the corresponding reference positions in the motion teaching image, the degree of similarity or difference between the moving body's motion and the teaching content can be determined, and the target posture image generated.
Take the case where the moving body is a human body and the moving body wearing unit 30 is a bracelet worn on each wrist. The image acquisition unit 20 captures images of the human body imitating the motion, yielding real-time posture images. Because the bracelets are worn on the left and right wrists, their positions represent the spatial positions of the wrists, so the spatial position information collected by the moving body wearing unit 30 locates the wrists within the real-time posture image, i.e. it gives the pixel range of each wrist. Thus, without any image processing or feature recognition on the real-time posture image, the image output unit 40 can determine the position (pixel range) of each moving joint directly from the real-time posture image and compare it with the preset reference position of that joint in the motion teaching image, thereby determining how closely the human body imitates the teaching content and generating the target posture image.
It is understood that all units in the motion teaching apparatus provided in the present application are hardware entities. For example, the image display unit 10 may be a display screen, a projector, AR glasses, or VR glasses; the image acquisition unit 20 may be a camera or a computer with a camera function; the moving body wearing unit 30 may be any existing wearable device in any wearable form; and the image output unit 40 may be an existing image processing chip, image processor, or image processing server.
In practical applications, the moving body wearing unit 30 may collect the spatial position information of a moving joint via a sensor pair: for example, a signal receiver and a signal transmitter built into the moving body wearing unit 30 and the image output unit 40, respectively, with the spatial position information collected through the interaction between them. The spatial position information represents a set of pixel coordinates of the moving joint in the real-time posture image, so the image output unit 40 can determine the joint's pixel range in the real-time posture image directly from this coordinate set, without performing image processing or other operations on the image.
In this embodiment, the image output unit 40 compares the position of each moving joint with the corresponding preset reference position in the motion teaching image. Specifically, the pixel coordinates of the joint position are compared against the reference pixel coordinates: the more coordinates that deviate between the two, the larger the value of the posture difference information.
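The coordinate comparison described above can be sketched as follows. This is an illustrative example only, not the patented implementation: the function name, the dictionary format, and the tolerance value are assumptions. The posture difference is taken as the number of tracked joints whose pixel coordinates deviate from the preset reference positions by more than a small tolerance.

```python
def posture_difference(joint_pixels, reference_pixels, tolerance=10):
    """Count joints whose pixel position deviates beyond `tolerance`.

    joint_pixels / reference_pixels: dicts mapping a joint name to its
    (x, y) pixel coordinates in the real-time posture image.
    """
    deviations = 0
    for joint, (x, y) in joint_pixels.items():
        rx, ry = reference_pixels[joint]
        # Per-joint coordinate arithmetic only -- no image processing,
        # matching the low-cost comparison the description emphasizes.
        if max(abs(x - rx), abs(y - ry)) > tolerance:
            deviations += 1
    return deviations
```

For example, a wrist at pixel (100, 100) against a reference of (130, 100) deviates by 30 pixels and counts as one deviation, while a 5-pixel offset is ignored.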
It should be noted that, in all embodiments of the present application, the image output unit 40, as the hardware unit that outputs images, may be an existing image processor; distinguishing, comparing, and outputting pixel data are all standard functions of existing image processors. Based on the disclosure of the present application, those skilled in the art know how to select an existing image processing chip or hardware device as the image output unit. Since the present application is not an improvement in image processing itself, image processing details are not repeated here.
As a possible implementation of this embodiment, the image output unit 40 is specifically configured to generate a target correction image when the posture difference information is equal to or greater than a preset threshold, and a target completion image when the posture difference information is smaller than the preset threshold.
In this embodiment, posture difference information equal to or greater than the preset threshold indicates a large difference between the positions of the moving body's joints and the preset reference positions, that is, the moving body's posture differs substantially from the teaching video content; posture difference information below the threshold indicates that the posture is similar or identical to the teaching content. The target correction image serves as a reference for the moving body to correct its motion, while the target completion image may contain an image of the moving body or a customized animation, for example one expressing encouragement or cheering.
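The threshold behaviour above can be summarized in a minimal sketch. The names, the default threshold, and the use of plain strings to stand in for rendered images are all assumptions for illustration, not part of the utility model.

```python
def select_target_image(posture_difference, threshold=3):
    """Return a correction image when the posture difference reaches the
    preset threshold, otherwise a completion (encouragement) image."""
    if posture_difference >= threshold:
        return "target_correction_image"
    return "target_completion_image"
```

Note the boundary case: a difference exactly equal to the threshold yields the correction image, matching the "equal to or greater than" wording in the description.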
In summary, the motion teaching device provided by this embodiment determines the positions of the moving body's joints in the real-time posture image directly from the spatial position information collected by the moving body wearing unit, so the image output unit can measure how closely the moving body imitates the motion teaching image, generate a target posture image, and display it, all without a complex image processing scheme or hardware with strong computing power, thereby reducing cost.
Fig. 2 shows a schematic structural diagram of a motion teaching apparatus according to another embodiment of the present application. The present embodiment is another embodiment based on the above embodiment.
As shown in fig. 2, the difference from the previous embodiment is that the motion teaching apparatus in this embodiment further includes: an information transceiving unit 50 connected to the image output unit 40, and an image rendering unit 60 connected to the information transceiving unit 50. Specifically:
and an information transceiver unit 50 for wirelessly receiving the spatial position information transmitted from the moving body wearing unit 30 and transmitting the spatial position information to the image output unit 40.
In the present embodiment, in order to realize wireless connection between the moving body wearing unit 30 and the image output unit 40, an information transmitting and receiving unit 50 is further added to the motion teaching apparatus, and the information transmitting and receiving unit 50 serves as a wireless communication unit of the image output unit 40, wirelessly receives all information transmitted to the image output unit 40, and wirelessly transmits all information transmitted by the image output unit 40.
As a possible implementation manner of the embodiment, the information transceiver unit 50 includes at least one of a bluetooth unit, a WiFi unit, an infrared unit, a zigbee unit, and a 5G communication unit.
It can be understood that, in practical applications, these wireless communication units may be used in combination or separately to implement the wireless transceiving function of the information transceiving unit 50; those skilled in the art can select at least one of them according to actual design requirements.
As a possible implementation manner of this embodiment, the image output unit 40 is further configured to generate a control instruction according to the target correction image, and send the control instruction to the moving body wearing unit through the information transceiving unit; wherein, the control command is used for controlling the moving body wearing unit to vibrate or contract.
In this embodiment, only when the image output unit 40 generates a target correction image does it simultaneously generate the corresponding control instruction and send it to the moving body wearing unit through the information transceiving unit 50, controlling the wearing unit to vibrate or contract and thus guiding the moving body, through touch, to correct its posture.
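A rough sketch of the control-instruction generation described above follows. The instruction format (a list of per-joint dictionaries) and all names are hypothetical; the utility model does not specify a wire format.

```python
def make_control_instruction(deviating_joints, mode="vibrate"):
    """Build one instruction per deviating joint, telling the
    corresponding wearing unit to vibrate or contract."""
    assert mode in ("vibrate", "contract")
    return [{"joint": joint, "action": mode} for joint in deviating_joints]
```

In use, such instructions would be produced only alongside a target correction image and relayed to the wearing units over the information transceiving unit.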
As a possible implementation manner of this embodiment, the image output unit 40 is further configured to send the target correction image or the target completion image to the image rendering unit 60 through the information transceiving unit; the image rendering unit 60 is configured to perform augmented reality image rendering according to the target correction image or the target completion image.
In this embodiment, the image rendering unit 60 performs augmented reality image rendering according to the target correction image or the target completion image to obtain a rendering file, sends the rendering file to the image display unit 10, and loads the rendering file by the image display unit 10.
It should be noted that, since augmented reality image rendering and the loading of rendered files are already disclosed in the prior art, those skilled in the art can, based on the content of the present application, select an appropriate image processing or rendering device as the image rendering unit 60 to implement this embodiment. How to perform augmented reality rendering and load the rendering file is therefore not described here again.
As a possible implementation of this embodiment, the moving body wearing unit 30 includes at least one of gloves, foot covers, knee pads, a bracelet, an ankle band, a helmet, a collar, and a waist support band.
In practical applications, the moving body wearing unit 30 may be chosen according to the image content of the motion teaching image. For example, if the content involves only upper-body motion, at least one of gloves, a bracelet, a helmet, a collar, and a waist support band may be used; if it involves only lower-body motion, at least one of foot covers, knee pads, and ankle bands may be used.
In practical applications, the information transceiver unit 50 may be selected according to the actual application environment of the motion teaching apparatus. For example, when the environment is indoors and a WiFi hotspot or zigbee network is available, a WiFi unit and/or a zigbee unit may be chosen; when the environment is outdoors, at least one of the Bluetooth unit, the infrared unit, and the 5G communication unit may be chosen.
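The environment-based selection rule above can be expressed as a small, purely illustrative dispatch; the unit names and the two boolean inputs are assumptions made for the sketch.

```python
def choose_transceivers(indoor, has_local_network):
    """Pick candidate transceiver units for the information
    transceiving unit 50 based on the deployment environment."""
    if indoor and has_local_network:
        # Indoor with WiFi hotspot / zigbee infrastructure available.
        return ["wifi", "zigbee"]
    # Outdoor, or indoor without local network infrastructure.
    return ["bluetooth", "infrared", "5g"]
```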
As in the previous embodiment, the motion teaching device provided here determines the positions of the moving body's joints in the real-time posture image directly from the collected spatial position information, generates and displays the target posture image without a complex image processing scheme or computationally powerful hardware, and thereby reduces cost.
In addition, the motion teaching device in this embodiment further includes an information transceiver unit, where the information transceiver unit is connected to the image output unit, and serves as a wireless communication unit of the image output unit, and wirelessly receives all information sent to the image output unit, and wirelessly sends all information sent by the image output unit, thereby improving the overall expandability of the motion teaching device.
In addition, the motion teaching device in this embodiment further includes an image rendering unit, where the image rendering unit is connected to the image output unit, performs augmented reality image rendering according to the target correction image or the target completion image to obtain a rendering file, sends the rendering file to the image display unit, and loads the rendering file by the image display unit, thereby improving the overall display effect of the motion teaching device.
It is another object of the present application to provide a motion teaching system 200, as shown in fig. 3, including the motion teaching apparatus 100 in the above-described embodiment.
Since the detailed implementation and working principle of the motion teaching system provided in this embodiment have already been described in the above embodiment of the motion teaching device, they are not repeated here.
The units in the terminal of the embodiments of the application may be combined, divided, or deleted according to actual needs.
While the application has been described with reference to specific embodiments, its scope is not limited thereto; those skilled in the art can readily conceive of various equivalent modifications or substitutions within the technical scope disclosed herein, and such modifications or substitutions are intended to fall within the scope of this application. The protection scope of the present application shall therefore be subject to the protection scope of the claims.

Claims (5)

1. An exercise teaching device, the exercise teaching device comprising:
the image display unit is used for displaying a preset motion teaching image and a target posture image; the image display unit is a display screen, a projector, AR glasses or VR glasses;
the image acquisition unit is used for acquiring a real-time posture image when the moving body imitates the image content of the motion teaching image;
the moving body wearing unit is used for being worn on a moving joint of the moving body and collecting spatial position information of the moving joint;
and the image output unit is respectively connected with the image display unit, the image acquisition unit and the moving body wearing unit, and is used for generating the target posture image according to the real-time posture image, the spatial position information and the motion teaching image.
2. The motion teaching apparatus of claim 1 wherein the motion teaching apparatus further comprises:
and the information transceiving unit is connected with the image output unit and is used for wirelessly receiving the spatial position information sent by the moving body wearing unit and transmitting the spatial position information to the image output unit.
3. The motion teaching apparatus of claim 2, wherein the information transceiving unit comprises at least one of a Bluetooth unit, a Wi-Fi unit, an infrared unit, a ZigBee unit, and a 5G communication unit.
4. Motion teaching apparatus according to any one of claims 1 to 3, wherein the moving body wearing unit includes at least one of a glove, a foot cover, a knee pad, a bracelet, a foot ring, a helmet, a collar, and a waist support band.
5. An exercise teaching system comprising an exercise teaching apparatus according to any one of claims 1 to 4.
CN201922042073.8U 2019-11-21 2019-11-21 Motion teaching equipment and motion teaching system Active CN211180839U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201922042073.8U CN211180839U (en) 2019-11-21 2019-11-21 Motion teaching equipment and motion teaching system

Publications (1)

Publication Number Publication Date
CN211180839U true CN211180839U (en) 2020-08-04

Family

ID=71808228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201922042073.8U Active CN211180839U (en) 2019-11-21 2019-11-21 Motion teaching equipment and motion teaching system

Country Status (1)

Country Link
CN (1) CN211180839U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112631423A (en) * 2020-12-18 2021-04-09 广东湾区智能终端工业设计研究院有限公司 Projection display method of foot ring and foot ring


Legal Events

Date Code Title Description
GR01 Patent grant