CN111111143A - Motion assisting system based on augmented reality technology - Google Patents

Motion assisting system based on augmented reality technology

Info

Publication number
CN111111143A
CN111111143A (application CN201811281792.9A)
Authority
CN
China
Prior art keywords: image, motion, module, data, user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811281792.9A
Other languages
Chinese (zh)
Inventor
刘朔一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Intelligent Simulation Technology Research Institute Co Ltd
Original Assignee
Nanjing Intelligent Simulation Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Intelligent Simulation Technology Research Institute Co Ltd filed Critical Nanjing Intelligent Simulation Technology Research Institute Co Ltd
Priority to CN201811281792.9A priority Critical patent/CN111111143A/en
Publication of CN111111143A publication Critical patent/CN111111143A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638 Displaying moving images of recorded environment, e.g. virtual environment
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/05 Image processing for measuring physical parameters
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/806 Video cameras
    • A63B2230/00 Measuring physiological parameters of the user
    • A63B2230/08 Measuring physiological parameters of the user other bio-electrical signals
    • A63B2230/085 Measuring physiological parameters of the user other bio-electrical signals used as a control parameter for the apparatus

Abstract

The invention discloses an exercise assisting system based on augmented reality technology, relating to the technical field of exercise assistance. The system comprises a central control module, a motion capture module, an extraction module and an output module. The invention also discloses an exercise assisting method using this system. To address the problems that people's sensory experience during exercise is low and that they cannot observe their own motion state in order to correct it in time, the central control module inputs and stores the motion data transmitted by the motion capture module, extracts image data and scene data from the extraction module according to the stored data, and fuses and compares them with the motion data to form a fused image and a contrast image, which are transmitted to the output module. Viewing the fused image improves the user's sensory experience, and viewing the contrast image allows the user to correct the action; the system therefore has broad market prospects.

Description

Motion assisting system based on augmented reality technology
Technical Field
The invention relates to the technical field of exercise assistance, in particular to an exercise assistance system based on an augmented reality technology.
Background
With the continuous development of society, more and more people pay attention to their health and exercise to keep fit. During exercise, however, practical constraints often prevent people from exercising in varied environments, which lowers their sensory experience; at the same time, they cannot observe their own motion state and correct it in time, which lowers exercise efficiency.
Augmented Reality (AR) is a technology that calculates the position and angle of a camera image in real time and superimposes corresponding virtual imagery. Owing to its good real-time interactivity, it is widely applied in fields such as the military, medicine, sports, engineering and entertainment. Augmented reality can overlay the real environment and virtual objects onto the same picture or space in real time, bringing people a sensory experience beyond reality. Designing an exercise assisting system based on augmented reality technology is therefore an urgent problem.
Disclosure of Invention
The present invention is directed to an exercise assisting system based on augmented reality technology, so as to solve the problems in the background art mentioned above.
In order to achieve the purpose, the invention provides the following technical scheme:
an augmented reality technology-based motion assistance system comprising: the device comprises a central control module, an output module, a motion capture module and an extraction module, wherein the motion capture module and the extraction module are respectively connected with the central control module; the motion capture module is used for capturing motion electromechanical signals and motion image signals of a user, processing the motion electromechanical signals and the motion image signals to form motion data and transmitting the motion data to the central control module; the extraction module is used for transmitting image data and scene data to the central control module as required;
the central control module is used for inputting and storing the motion data transmitted by the motion capture module, extracting image data and scene data from the extraction module according to the stored data and performing fusion and detection on the image data, the scene data and the motion data as required to form a fused image and a contrast image and transmitting the fused image and the contrast image to the output module;
the fusion combines the real-scene image in the motion data with the scene data extracted by the extraction module to obtain a fused image for the user to watch, which helps improve the user's sensory experience;
the detection compares the motion data with the image data extracted by the extraction module to determine whether the user's action is standard; when it is not, the corresponding positions are identified and a contrast image is formed for the user to watch and correct the action;
the output module is used for outputting the fusion image and the comparison image transmitted by the central processing unit for watching by a user.
As a further scheme of the invention: the motion capture module comprises a data processing module, and an electromechanical signal capture module and an image capture module which are respectively connected with the data processing module.
As a still further scheme of the invention: the data processing module is used for carrying out data processing on the motion electromechanical signals captured by the electromechanical signal capturing module and the motion image signals captured by the image capturing module to form motion data and transmitting the motion data to the central control module;
the electromechanical signal capturing module is used for extracting current signals of relevant muscles or muscle groups completing specific actions by utilizing the surface electrodes, filtering and expanding the current signals, fitting the current signals according to time to form an intensity change curve, determining appointed action data according to the intensity change curve of the movement actions through simulation modeling, and forming movement electromechanical signals;
the image capturing module is used for capturing the motion image of the user for preprocessing to form a motion image signal.
As a still further scheme of the invention: the preprocessing is to cut the moving image by artificial center to remove the redundant background, and convert the cut image from RGB image to gray image, and at the same time, the resolution is changed from 120 × 160 to 60 × 80, thereby reducing the occupied space of the memory and improving the algorithm efficiency.
As a still further scheme of the invention: the central control module comprises an input module, a storage module and a central processing unit which are connected in sequence.
As a still further scheme of the invention: the input module is used for inputting the motion data formed by the motion capture module and the image data and the scene data extracted by the extraction module; a plurality of SDRAM storages are integrated on the input module, and a plurality of channels are formed by integrating the SDRAM storages, so that the workload is reduced;
the storage module is used for storing the motion data, the image data and the scene data input by the input module;
the central processing unit is used for fusing and detecting the image data, the scene data and the motion data, forming a fused image and a contrast image and transmitting the fused image and the contrast image to the output module.
As a still further scheme of the invention: the output module comprises a touch panel connected with the central processing unit; the touch panel includes a plurality of touch sensors to sense touch, slide, and gestures on the touch panel; the touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
An exercise assisting method using the above exercise assisting system based on augmented reality technology includes the following steps:
1) capturing a motion electromechanical signal and a motion image signal of a user through a motion capture module, performing data processing to form motion data, and transmitting the motion data to a central control module;
2) the motion data transmitted by the motion capture module is input and stored through the central control module, and the image data, the scene data and the motion data are extracted from the extraction module according to the stored data and the requirements to be fused and detected, so that a fused image and a contrast image are formed and transmitted to the output module;
3) the fused image and the contrast image transmitted by the central processing unit are output through the output module for the user to watch: viewing the fused image improves the user's sensory experience, and viewing the contrast image allows the user to correct the action.
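The comparison in step 2) is not algorithmically specified. As one hypothetical sketch (joint positions, the tolerance value, and the function name are all assumptions, not the patent's method), flagging non-standard action positions could amount to thresholding per-joint deviations from a stored standard:

```python
import numpy as np

def detect_nonstandard(user_pose: np.ndarray, standard_pose: np.ndarray,
                       tol: float = 10.0) -> np.ndarray:
    """Return indices of joints whose position deviates from the stored
    standard by more than `tol` -- the positions that would be identified
    and highlighted in the contrast image."""
    deviation = np.linalg.norm(user_pose - standard_pose, axis=1)
    return np.flatnonzero(deviation > tol)

standard = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])   # reference joint positions
user = np.array([[0.0, 1.0], [10.0, 15.0], [20.0, 0.5]])      # captured joint positions
flagged = detect_nonstandard(user, standard)
print(flagged)  # [1]
```

Only the second joint exceeds the tolerance, so only that position would be marked in the contrast image shown to the user.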
Compared with the prior art, the invention has the beneficial effects that:
the invention is provided with a central control module, and a motion capture module, an extraction module and an output module which are respectively connected with the central control module, wherein the motion data transmitted by the motion capture module is input and stored by the central control module, and image data, scene data and motion data are extracted from the extraction module according to the stored data and fused and detected to form a fused image and a comparison image which are transmitted to the output module.
Drawings
Fig. 1 is a block diagram of an augmented reality technology-based motion assistance system.
Fig. 2 is a block diagram of a motion capture module in an augmented reality technology-based motion assistance system.
Fig. 3 is a block diagram of a central control module in an augmented reality technology-based exercise assisting system.
Detailed Description
The invention is described in further detail below with reference to the figures and specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit it in any way. It should be noted that persons skilled in the art can make variations and modifications without departing from the spirit of the invention; all such variations and modifications fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Example 1
Referring to fig. 1-3, an exercise assisting system based on augmented reality technology includes: the device comprises a central control module, an output module, a motion capture module and an extraction module, wherein the motion capture module and the extraction module are respectively connected with the central control module;
in order to solve the problems that people's sensory experience during exercise is low and that they cannot observe their own motion state, the motion capture module is used for capturing motion electromechanical signals and motion image signals of a user, processing them to form motion data and transmitting the motion data to the central control module; the extraction module is used for transmitting image data and scene data to the central control module as required;
the central control module is used for inputting and storing the motion data transmitted by the motion capture module, extracting image data and scene data from the extraction module according to the stored data and performing fusion and detection on the image data, the scene data and the motion data as required to form a fused image and a contrast image and transmitting the fused image and the contrast image to the output module;
the fusion is to fuse the real scene image of the motion data and the scene data extracted by the extraction module to obtain a fused image for the user to watch, which is beneficial to improving the sensory experience of the user; the detection is to compare the motion data with the image data extracted by the extraction module to detect whether the action of the user is standard or not, and identify the corresponding position and form a comparison image for the user to watch to correct the action when the action of the user is not standard;
the output module is used for outputting the fused image and the comparison image transmitted by the central processing unit for a user to watch;
further, the motion capture module comprises a data processing module, and an electromechanical signal capturing module and an image capturing module which are respectively connected with the data processing module; the data processing module processes the motion electromechanical signals captured by the electromechanical signal capturing module and the motion image signals captured by the image capturing module to form motion data and transmits the motion data to the central control module; the electromechanical signal capturing module uses surface electrodes to extract the current signals of the muscles or muscle groups involved in completing a specific action, filters and amplifies them, fits them against time to form an intensity-change curve, and determines the specified action data from the intensity-change curve of the movement through simulation modeling, forming the motion electromechanical signal; the image capturing module comprises a smart device with a camera function, such as a mobile phone, a notebook or a tablet computer, and is used for capturing a moving image of the user and preprocessing it to form a moving image signal; the preprocessing crops the moving image around the human body to remove the redundant background, converts the cropped image from an RGB image to a grayscale image, and reduces the resolution from 120 × 160 to 60 × 80, thereby reducing memory usage and improving algorithm efficiency;
further, the central control module comprises an input module, a storage module and a central processing unit which are connected in sequence; the input module is used for inputting the motion data formed by the motion capture module and the image data and the scene data extracted by the extraction module; in this embodiment, 16 MB SDRAM is selected to meet the design requirement of 1 MB per channel, and multiple channels are formed by integrating the SDRAM memories, thereby reducing the workload; the storage module is used for storing the motion data, the image data and the scene data input by the input module; the storage module may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk; in this embodiment, the storage module is preferably a combination of EPROM and magnetic memory;
further, the central processing unit is used for fusing and detecting the image data, the scene data and the motion data, forming a fused image and a contrast image and transmitting the fused image and the contrast image to the output module; the output module comprises a touch panel connected with the central processing unit; the touch panel includes a plurality of touch sensors to sense touch, slide, and gestures on the touch panel; the touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In this embodiment, an exercise assisting method using the above exercise assisting system based on augmented reality technology includes the following steps:
1) capturing a motion electromechanical signal and a motion image signal of a user through a motion capture module, performing data processing to form motion data, and transmitting the motion data to a central control module;
2) the motion data transmitted by the motion capture module is input and stored through the central control module, and the image data, the scene data and the motion data are extracted from the extraction module according to the stored data and the requirements to be fused and detected, so that a fused image and a contrast image are formed and transmitted to the output module;
3) the fused image and the contrast image transmitted by the central processing unit are output through the output module for the user to watch: viewing the fused image improves the user's sensory experience, and viewing the contrast image allows the user to correct the action.
Example 2
Referring to fig. 1-2, an exercise assisting system based on augmented reality technology includes: the device comprises a central control module, an output module, a motion capture module and an extraction module, wherein the motion capture module and the extraction module are respectively connected with the central control module;
in order to solve the problems that people's sensory experience during exercise is low and that they cannot observe their own motion state, the motion capture module is used for capturing motion electromechanical signals and motion image signals of a user, processing them to form motion data and transmitting the motion data to the central control module; the extraction module is used for transmitting image data and scene data to the central control module as required;
the central control module is used for inputting and storing the motion data transmitted by the motion capture module, extracting image data and scene data from the extraction module according to the stored data and performing fusion and detection on the image data, the scene data and the motion data as required to form a fused image and a contrast image and transmitting the fused image and the contrast image to the output module;
the fusion is to fuse the real scene image of the motion data and the scene data extracted by the extraction module to obtain a fused image for the user to watch, which is beneficial to improving the sensory experience of the user; the detection is to compare the motion data with the image data extracted by the extraction module to detect whether the action of the user is standard or not, and identify the corresponding position and form a comparison image for the user to watch to correct the action when the action of the user is not standard;
the output module is used for outputting the fused image and the comparison image transmitted by the central processing unit for a user to watch;
further, the motion capture module comprises a data processing module, and an electromechanical signal capturing module and an image capturing module which are respectively connected with the data processing module; the data processing module processes the motion electromechanical signals captured by the electromechanical signal capturing module and the motion image signals captured by the image capturing module to form motion data and transmits the motion data to the central control module; the electromechanical signal capturing module uses surface electrodes to extract the current signals of the muscles or muscle groups involved in completing a specific action, filters and amplifies them, fits them against time to form an intensity-change curve, and determines the specified action data from the intensity-change curve of the movement through simulation modeling, forming the motion electromechanical signal; the image capturing module comprises a smart device with a camera function, such as a mobile phone, a notebook or a tablet computer, and is used for capturing a moving image of the user and preprocessing it to form a moving image signal; the preprocessing crops the moving image around the human body to remove the redundant background, converts the cropped image from an RGB image to a grayscale image, and reduces the resolution from 120 × 160 to 60 × 80, thereby reducing memory usage and improving algorithm efficiency;
furthermore, the central control module is used for fusing and detecting the image data, the scene data and the motion data, forming a fused image and a contrast image and transmitting the fused image and the contrast image to the output module; the output module comprises a touch panel connected with the central control module; the touch panel includes a plurality of touch sensors to sense touch, slide, and gestures on the touch panel; the touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In this embodiment, an exercise assisting method using the above exercise assisting system based on augmented reality technology includes the following steps:
1) capturing a motion electromechanical signal and a motion image signal of a user through a motion capture module, performing data processing to form motion data, and transmitting the motion data to a central control module;
2) the motion data transmitted by the motion capture module is input and stored through the central control module, and the image data, the scene data and the motion data are extracted from the extraction module according to the stored data and the requirements to be fused and detected, so that a fused image and a contrast image are formed and transmitted to the output module;
3) the fused image and the contrast image transmitted by the central control module are output through the output module for the user to watch: viewing the fused image improves the user's sensory experience, and viewing the contrast image allows the user to correct the action.
Example 3
Referring to fig. 1-3, an exercise assisting system based on augmented reality technology includes: the device comprises a central control module, an output module, a motion capture module and an extraction module, wherein the motion capture module and the extraction module are respectively connected with the central control module;
in order to solve the problems that people's sensory experience during exercise is low and that they cannot observe their own motion state, the motion capture module is used for capturing motion electromechanical signals and motion image signals of a user, processing them to form motion data and transmitting the motion data to the central control module; the extraction module is used for transmitting image data and scene data to the central control module as required; the central control module is used for inputting and storing the motion data transmitted by the motion capture module, extracting image data and scene data from the extraction module according to the stored data, and fusing and detecting the image data, the scene data and the motion data as required to form a fused image and a contrast image which are transmitted to the output module; the fusion combines the real-scene image in the motion data with the scene data extracted by the extraction module to obtain a fused image for the user to watch, which helps improve the user's sensory experience; the detection compares the motion data with the image data extracted by the extraction module to determine whether the user's action is standard, and when it is not, the corresponding positions are identified and a contrast image is formed for the user to watch and correct the action; the output module is used for outputting the fused image and the contrast image transmitted by the central processing unit for the user to watch;
further, the motion capture module comprises a data processing module, and an electromechanical signal capturing module and an image capturing module which are respectively connected with the data processing module; the data processing module processes the motion electromechanical signals captured by the electromechanical signal capturing module and the motion image signals captured by the image capturing module to form motion data and transmits the motion data to the central control module; the electromechanical signal capturing module uses surface electrodes to extract the current signals of the muscles or muscle groups involved in completing a specific action, filters and amplifies them, fits them against time to form an intensity-change curve, and determines the specified action data from the intensity-change curve of the movement through simulation modeling, forming the motion electromechanical signal; the image capturing module is used for capturing a moving image of the user and preprocessing it to form a moving image signal; the preprocessing crops the moving image around the human body to remove the redundant background, converts the cropped image from an RGB image to a grayscale image, and reduces the resolution from 120 × 160 to 60 × 80;
further, the central control module comprises an input module, a storage module and a central processing unit which are connected in sequence; the input module is used for inputting the motion data formed by the motion capture module and the image data and the scene data extracted by the extraction module; the storage module is used for storing the motion data, the image data and the scene data input by the input module; the storage module may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk; in this embodiment, the storage module is preferably a combination of EPROM and SRAM;
further, the central processing unit is used for fusing and detecting the image data, the scene data and the motion data, forming a fused image and a contrast image and transmitting the fused image and the contrast image to the output module; the output module comprises a touch panel connected with the central processing unit; the touch panel includes a plurality of touch sensors to sense touch, slide, and gestures on the touch panel; the touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In this embodiment, an exercise assisting method using the above exercise assisting system based on augmented reality technology includes the following steps:
1) capturing a motion electromechanical signal and a motion image signal of a user through a motion capture module, performing data processing to form motion data, and transmitting the motion data to a central control module;
2) the motion data transmitted by the motion capture module is input and stored through the central control module, and the image data and the scene data are extracted from the extraction module according to the stored data as required and fused and detected together with the motion data, so that a fused image and a comparison image are formed and transmitted to the output module;
3) the fused image and the comparison image transmitted by the central processing unit are output through the output module for the user to watch: the fused image improves the user's sensory experience, while the comparison image is watched by the user in order to correct the action.
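Step 1's filter-and-fit of the surface-electrode signal into an intensity change curve can be approximated by a standard EMG envelope extraction: rectify the signal and smooth it with a moving-average window. The sampling rate and window length below are assumed values for illustration; the patent does not specify its filter or fitting procedure.

```python
import numpy as np

def emg_intensity_curve(raw, fs=1000.0, win_ms=100):
    """Approximate the intensity change curve of a surface-electrode
    current signal: remove the DC offset, full-wave rectify, and smooth
    with a moving-average window of win_ms milliseconds at fs Hz."""
    rectified = np.abs(raw - np.mean(raw))        # full-wave rectification
    win = max(1, int(fs * win_ms / 1000.0))
    kernel = np.ones(win) / win                   # moving-average kernel
    return np.convolve(rectified, kernel, mode="same")
```

The resulting curve rises during muscle activation and falls back toward zero at rest, which is the shape the simulation-modeling step would match against specified actions.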
The invention has the beneficial effects that: the invention is provided with a central control module, and a motion capture module, an extraction module and an output module which are respectively connected with the central control module. The central control module inputs and stores the motion data transmitted by the motion capture module, extracts image data and scene data from the extraction module according to the stored data as required, and fuses and detects them to form a fused image and a comparison image, which are transmitted to the output module.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory or a random access memory.
While the preferred embodiments of the present invention have been described in detail, the present invention is not limited to the above embodiments, and various changes can be made within the knowledge of those skilled in the art without departing from the spirit of the present invention. It is neither necessary nor possible to exhaustively enumerate all embodiments here, and obvious variations or modifications derived therefrom still fall within the scope of the invention.

Claims (8)

1. An augmented reality technology-based motion assistance system comprising: the device comprises a central control module, an output module, a motion capture module and an extraction module, wherein the motion capture module and the extraction module are respectively connected with the central control module;
the motion capture module is used for capturing motion electromechanical signals and motion image signals of a user, processing the motion electromechanical signals and the motion image signals to form motion data and transmitting the motion data to the central control module;
the extraction module is used for transmitting image data and scene data to the central control module as required;
the central control module is used for inputting and storing the motion data transmitted by the motion capture module, extracting image data and scene data from the extraction module according to the stored data as required, and fusing and detecting the image data, the scene data and the motion data to form a fused image and a comparison image which are transmitted to the output module;
the fusion is to fuse the real scene image of the motion data with the scene data extracted by the extraction module to obtain a fused image for the user to watch; the detection is to compare the motion data with the image data extracted by the extraction module and detect and form a comparison image for the user to watch and correct;
the output module is used for outputting the fusion image and the comparison image transmitted by the central processing unit for watching by a user.
2. The augmented reality technology-based motion assistance system of claim 1, wherein the motion capture module comprises a data processing module and an electromechanical signal capture module and an image capture module respectively connected to the data processing module.
3. The augmented reality technology-based motion assistance system of claim 2, wherein the data processing module is configured to process the motion electromechanical signals captured by the electromechanical signal capture module and the motion image signals captured by the image capture module to form motion data, and to transmit the motion data to the central control module; the electromechanical signal capture module is configured to extract, via surface electrodes, the current signals of the relevant muscles or muscle groups that complete a specific action, filter and amplify the current signals, fit them over time to form an intensity change curve, and determine the specified action data from the intensity change curve of the movement action through simulation modeling, thereby forming the motion electromechanical signal; the image capture module is configured to capture the moving image of the user and preprocess it to form a moving image signal.
4. The augmented reality technology-based motion assistance system of claim 3, wherein the preprocessing is to crop the moving image around the human center to remove redundant background, convert the cropped image from an RGB image into a grayscale image, and reduce the resolution from 120 × 160 to 60 × 80.
5. An augmented reality technology-based exercise assisting system as claimed in any one of claims 1 to 4, wherein the central control module comprises an input module, a storage module and a central processing unit which are connected in sequence.
6. The augmented reality technology-based motion assistance system of claim 5, wherein the input module is configured to input the motion data formed by the motion capture module and the image data and the scene data extracted by the extraction module; the storage module is used for storing the motion data, the image data and the scene data input by the input module; the central processing unit is used for fusing and detecting the image data, the scene data and the motion data, forming a fused image and a contrast image and transmitting the fused image and the contrast image to the output module.
7. The augmented reality technology-based motion assistance system of claim 6, wherein the output module comprises a touch panel connected to a central processor.
8. An exercise assisting method using the augmented reality technology-based exercise assisting system according to any one of claims 1 to 7, characterized by the steps of:
1) capturing a motion electromechanical signal and a motion image signal of a user through a motion capture module, performing data processing to form motion data, and transmitting the motion data to a central control module;
2) the motion data transmitted by the motion capture module is input and stored through the central control module, and the image data and the scene data are extracted from the extraction module according to the stored data as required and fused and detected together with the motion data, so that a fused image and a comparison image are formed and transmitted to the output module;
3) the fused image and the comparison image transmitted by the central processing unit are output through the output module to be watched by the user, the fused image is used for being watched by the user, the sensory experience of the user is improved, and the comparison image is used for being watched by the user to correct the action.
CN201811281792.9A 2018-10-31 2018-10-31 Motion assisting system based on augmented reality technology Pending CN111111143A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811281792.9A CN111111143A (en) 2018-10-31 2018-10-31 Motion assisting system based on augmented reality technology


Publications (1)

Publication Number Publication Date
CN111111143A true CN111111143A (en) 2020-05-08

Family

ID=70484907

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811281792.9A Pending CN111111143A (en) 2018-10-31 2018-10-31 Motion assisting system based on augmented reality technology

Country Status (1)

Country Link
CN (1) CN111111143A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106582016A (en) * 2016-12-05 2017-04-26 湖南简成信息技术有限公司 Augmented reality-based motion game control method and control apparatus
CN106648118A (en) * 2017-01-25 2017-05-10 宇龙计算机通信科技(深圳)有限公司 Virtual teaching method based on augmented reality, and terminal equipment
CN106730814A (en) * 2016-11-22 2017-05-31 深圳维京人网络科技有限公司 Marine fishing class game based on AR and face recognition technology
EP3258445A1 (en) * 2016-06-17 2017-12-20 Imagination Technologies Limited Augmented reality occlusion
CN108159698A (en) * 2017-12-29 2018-06-15 武汉艺术先生数码科技有限公司 Indoor cool run game simulation system based on AR
CN108434664A (en) * 2018-04-08 2018-08-24 上海应用技术大学 A kind of treadmill intelligent safe protector and guard method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200508