CN109243575B - Virtual acupuncture method and system based on mobile interaction and augmented reality - Google Patents

Virtual acupuncture method and system based on mobile interaction and augmented reality

Info

Publication number
CN109243575B
CN109243575B (application CN201811083904.XA)
Authority
CN
China
Prior art keywords
virtual
acupuncture
coordinate system
controller
glasses
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201811083904.XA
Other languages
Chinese (zh)
Other versions
CN109243575A (en)
Inventor
杜广龙 (Du Guanglong)
陈子南 (Chen Zinan)
李方 (Li Fang)
张平 (Zhang Ping)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201811083904.XA priority Critical patent/CN109243575B/en
Publication of CN109243575A publication Critical patent/CN109243575A/en
Application granted granted Critical
Publication of CN109243575B publication Critical patent/CN109243575B/en

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40: ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H39/00: Devices for locating or stimulating specific reflex points of the body for physical therapy, e.g. acupuncture
    • A61H39/08: Devices for applying needles to such points, i.e. for acupuncture; Acupuncture needles or accessories therefor
    • A61H39/086: Acupuncture needles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining for computer-aided diagnosis, e.g. based on medical expert systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Rehabilitation Therapy (AREA)
  • Primary Health Care (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Urology & Nephrology (AREA)
  • Pathology (AREA)
  • Pain & Pain Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Finger-Pressure Massage (AREA)

Abstract

The invention provides a virtual acupuncture method and a virtual acupuncture system based on mobile interaction and augmented reality. The method comprises five modules: augmented reality, a diagnosis subsystem, a coordinate system and its transformations, capture of hand positions and orientations, and collision detection with visual feedback. The augmented reality module projects the experimental human model into the real world through a wearable AR device. A coordinate system and its transformations are required so that operations performed in the virtual environment can be carried out in the real environment. The diagnosis subsystem builds an IFPN model with a back-propagation algorithm to learn the correspondence between diseases and the acupuncture points used to treat them. Gestures are captured by a Leap Motion somatosensory controller fixed to the AR glasses, and the motion and position of the hand are computed from the captured data. The contact point of the acupuncture needle is obtained by collision detection, and visual feedback then guides participants so that they know the accuracy of their actions.

Description

Virtual acupuncture method and system based on mobile interaction and augmented reality
Technical Field
The invention relates to a virtual acupuncture technology, in particular to a virtual acupuncture method and a virtual acupuncture system based on mobile interaction and augmented reality.
Background
Acupuncture is a medical treatment that treats disease or maintains human health by stimulating relevant acupuncture points. Traditionally it can be practiced only on real patients or artificial manikins, but a manikin offers a limited viewing angle and unrealistic force feedback. A system that provides rich information and can identify different acupuncture points is therefore of great significance.
Many virtual acupuncture systems exist, but several problems remain open or deserve further research. The first class is contact methods: acupuncture is performed by remotely manipulating a virtual needle with a joystick or through a touch screen, but the manipulation may be inaccurate because vision and interaction are limited. The second class is non-contact methods: acupuncture is accomplished by placing physical markers on the body to track the human hand for remote control, but these methods fail if a marker is occluded. One solution uses a Kinect to track the operator's hands together with an interval Kalman filter and a particle filter to control a robotic manipulator, yet it is still imperfect, since people naturally adopt the most comfortable posture in real life.
Disclosure of Invention
To solve the above problems, the present invention provides a virtual acupuncture method and system based on mobile interaction and augmented reality. The invention connects the real world with the virtual interface through an AR wearable device, which removes the restrictions on the user's motion and lets the operator focus on what happens in the real world without reasoning about the mapping relationship. The method also provides a more natural mode of human-computer interaction: the operator can examine the simulated human body freely from any direction and perform acupuncture more accurately.
The invention is realized by the following technical scheme.
A virtual acupuncture method based on mobile interaction and augmented reality comprises the following steps:
S1, augmented reality: projecting the experimental human body model into the experimental environment;
S2, constructing an Intelligent Fuzzy Petri Net (IFPN) model with a back-propagation algorithm to realize a diagnosis subsystem;
S3, obtaining hand coordinates with a Leap Motion somatosensory controller;
S4, combining the virtual scene with the real scene through a coordinate system and its transformations;
S5, estimating and capturing the position and orientation of the hand;
S6, collision detection and visual feedback.
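The steps above can be sketched as one per-frame loop. The class below is a minimal illustration, not the patent's implementation: all names are invented, the S4 calibration result is reduced to a translation offset, and the mannequin skin is simplified to a plane.

```python
class VirtualAcupunctureSession:
    """Minimal per-frame sketch of steps S3-S6 (illustrative names only)."""

    def __init__(self, world_offset, skin_y=0.0):
        self.world_offset = world_offset  # controller origin in the world frame (stand-in for the S4 calibration)
        self.skin_y = skin_y              # mannequin surface, simplified to the plane y = skin_y

    def to_world(self, p):
        # S4: translate a controller-frame point into the shared world frame
        return tuple(a + b for a, b in zip(p, self.world_offset))

    def detect_collision(self, p):
        # S6: the needle tip "touches the skin" when it reaches the plane
        return p[1] <= self.skin_y

    def run_frame(self, palm_local):
        # S3: `palm_local` stands in for a raw Leap Motion palm position
        palm_world = self.to_world(palm_local)
        return palm_world, self.detect_collision(palm_world)
```

For example, with an offset of (0, -0.5, 0), a palm at local height 0.3 maps to world height -0.2 and registers a contact.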
Further, the augmented reality of step S1 projects the virtual human body model into the real environment through the worn AR device: the operator, wearing AR glasses, projects the simulated human body into the real world and performs acupuncture with a virtual needle.
Further, in step S2, the computer combines the digital description of acupuncture with a digitized anatomical atlas to locate the 3D acupuncture points on the simulated human body. Intelligent fuzzy Petri nets are widely used in fields such as reasoning and evaluation; an intelligent fuzzy Petri net has output places and input places. The IFPN model combines intuitionistic fuzzy set theory with Petri net theory. First, a set of disease-symptom data is trained with a back-propagation algorithm; after training, the weight of each input place with respect to each output place in the IFPN model is obtained. A dynamic certainty factor (DCF) is then introduced into the IFPN model, and the model is converted into a backward tree. In the backward tree, the importance factor of each symptom for a disease is obtained, which identifies the disease.
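As a sketch of the back-propagation step, the fragment below learns the weight of each input place (symptom) for one output place (disease) by gradient descent on a sigmoid firing degree. The function name, training scheme and hyperparameters are illustrative assumptions, not taken from the patent.

```python
import math
import random

def train_ifpn_weights(samples, n_inputs, lr=0.5, epochs=2000, seed=0):
    """Learn importance weights of input places (symptoms) for one output
    place (disease).  `samples` is a list of (symptom_vector, target) pairs
    with targets in [0, 1]; returns the learned weights and bias."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            y = 1.0 / (1.0 + math.exp(-z))       # firing degree in [0, 1]
            g = (y - target) * y * (1.0 - y)     # dE/dz for squared error
            for i in range(n_inputs):
                w[i] -= lr * g * x[i]            # back-propagate to each input place
            b -= lr * g
    return w, b
```

With toy data in which only the first symptom determines the disease, the learned weight of that symptom dominates the other, mirroring how the trained weights express an input place's importance for an output place.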
Further, in step S3, the presence of the palm and fingers is detected by the Leap Motion somatosensory controller and its cameras. The Leap Motion controller computes gestures and motions by itself. It uses a right-handed Cartesian coordinate system with the origin at the center of the controller: the X axis is parallel to the long edge of the device and the Z axis is parallel to the other edge, so that both lie in the plane of the Leap Motion, while the Y axis is perpendicular to that plane and points upward. The frame attached to the controller is defined as the XlYlZl coordinate system. The visual range of the controller is an inverted cone; once the palm and fingers are detected, an invisible coordinate system XhYhZh, with its origin at the palm center, is constructed to digitize the position and orientation of the hand. Hand motion defined relative to the origin of the XhYhZh coordinate system must be translated into the relative position and orientation of the hand in XlYlZl. When these data reach the server, the server interprets their meaning and translates them into the global coordinate system.
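The change of frame from XhYhZh to XlYlZl amounts to one rotation plus the palm position. The helper below is a generic sketch in NumPy; the basis vectors are assumed to be supplied by the tracking data, and no Leap Motion API call is implied.

```python
import numpy as np

def hand_to_controller(p_hand, palm_pos, palm_basis):
    """Map a point expressed in the palm-centered frame XhYhZh into the
    controller frame XlYlZl.  `palm_basis` holds the hand frame's axis
    vectors as columns, expressed in controller coordinates; `palm_pos`
    is the palm center (hand-frame origin) in controller coordinates."""
    R = np.asarray(palm_basis, dtype=float)   # 3x3 rotation, columns = hand axes
    return np.asarray(palm_pos, float) + R @ np.asarray(p_hand, float)
```

With an identity basis the mapping is a pure translation by the palm position; with a rotated basis the point is rotated into the controller frame first.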
Furthermore, a least-squares solution of the transformation parameter vector is obtained by setting up a similarity transformation and converting it into a residual formula, which yields the relationship between the two coordinate systems and realizes the virtual-real conversion. The network also needs to compute how the action is performed in the real world: it obtains the height, orientation and position of the hand from the Microsoft HoloLens glasses, calculates the position of the palm in the real world, and combines these with the data from the Leap Motion controller to obtain the real-world coordinates. To transmit real-world images for display on the virtual scene rendered by the HoloLens glasses, the real-world coordinates and the AR glasses coordinates must be calibrated. An interactive scene-structure modeling algorithm using a single image registers the simulated human body in the real scene to obtain the relationship between the virtual object and the real scene; to let the user interact with the virtual object, the relationship between the Leap Motion and the HoloLens must be configured further, since the Leap Motion can capture every joint point of the human hand. To relate the HoloLens and Leap Motion coordinate systems, at least three non-collinear common points are required. First, the Microsoft HoloLens observes the calibration frame, and the positions of the four corner points (x1, x2, x3, x4) of its top surface in the HoloLens coordinate system are calculated. Then, by touching the four corner points with the tip of the thumb, the corresponding positions (x1', x2', x3', x4') in the Leap Motion coordinate system are calculated by the Leap Motion controller, and the relationship between (x1, x2, x3, x4) and (x1', x2', x3', x4') is obtained.
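A least-squares similarity transformation of this kind can be computed in closed form from point correspondences such as the four corner pairs. The sketch below follows the standard SVD-based (Umeyama-style) solution; it is an assumption about the underlying math, not the patent's exact derivation.

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    such that dst_i ~ s * R @ src_i + t, from corresponding points given as
    rows of `src` and `dst` (Umeyama-style closed form)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    X, Y = src - mu_s, dst - mu_d                 # centered point sets
    U, S, Vt = np.linalg.svd(Y.T @ X / len(src))  # cross-covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))            # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    var_s = (X ** 2).sum() / len(src)
    s = (S * np.diag(D)).sum() / var_s
    t = mu_d - s * R @ mu_s
    return s, R, t
```

Fitting four non-degenerate corner correspondences recovers the scale, rotation and translation exactly, which is what the calibration between the two coordinate systems needs.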
Further, step S5 uses a Kalman filter to estimate the position of the palm, and rewrites the interval Kalman filter with the Kalman filter algorithm to realize denoising. The particle filter (PF) is a sub-optimal solution that estimates the true posterior with a finite number of random state samples and their corresponding normalized weights. After time $t_k$ the density is approximated as

$$p(x_k \mid z_{1:k}) \approx \sum_{i=1}^{N} \tilde{w}_k^i \, \delta(x_k - x_k^i)$$

where $\delta(\cdot)$ is the unit impulse function, $N$ is the number of samples, $\tilde{w}_k^i$ is the normalized weight of the $i$-th particle, and $x_k^i$ is the $i$-th particle. In this case, however, the accuracy of the PF is not high, because the interval between position iterations is too long. To improve it, the weight of a particle is obtained from the sum of the position differences over the interval between two position iterations, instead of the instantaneous position difference at time $t_k$. The cumulative difference between the estimated and computed positions of the $i$-th particle is then used in the probability calculation; the probability that a position is correct is highest when its cumulative error is smallest. Finally, the normalized weights are obtained from the computed likelihood values.
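The cumulative-error weighting described above can be sketched as follows; the Gaussian likelihood shape and the fixed window of measurements are assumptions made for the sake of a runnable example.

```python
import math

def cumulative_weights(particle_tracks, measurements, sigma=1.0):
    """Weight each particle by its position error accumulated over the whole
    window, rather than by the single latest difference, then normalize.
    `particle_tracks[i]` is the i-th particle's estimated positions over the
    window; `measurements` are the computed positions for the same steps."""
    raw = []
    for track in particle_tracks:
        # cumulative position difference between estimates and measurements
        err = sum(abs(p - z) for p, z in zip(track, measurements))
        raw.append(math.exp(-err ** 2 / (2.0 * sigma ** 2)))  # likelihood
    total = sum(raw)
    return [r / total for r in raw]  # normalized weights
```

The particle whose cumulative error is smallest receives the largest normalized weight, matching the rule that the probability of a correct position is highest when the cumulative error is smallest.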
Further, in step S6, the needle-insertion process is modeled as a mass-spring system, so the collision of a mass point with a spring is a good approximation of the acupuncture process. When the needle is inserted into the skin, the internal forces include the spring force, i.e. the pull of the surrounding mass points, and the virtual spring force, i.e. the force produced when the virtual spring of rest length 0 attached to the control point is deformed. The spring force F is determined by the springs grouped in the M-neighborhood around the mass point. Because the acupuncture needle must be rotated to achieve its therapeutic purpose, it experiences, while rotating in one direction, a friction force tangential to the skin whose magnitude depends on the physiological parameters of the skin and on the external rotation force applied to the needle. Knowing these forces, the operator can judge the accuracy of the operation from the real-time change of the virtual needle's color and adjust the insertion speed and rotation speed, thereby improving the accuracy of the operation.
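The force terms of this mass-spring approximation can be sketched as below. The function name, the linear friction law and all constants are illustrative assumptions; the patent only states that friction depends on skin parameters and the applied rotation force.

```python
import math

def needle_forces(tip, neighbors, rest_len, k_spring, k_virtual,
                  control_pt, mu, omega):
    """Sum the Hooke pulls of the M neighboring mass points, the
    zero-rest-length virtual spring attached to the control point, and a
    tangential friction term proportional to the rotation speed."""
    fx = fy = fz = 0.0
    for n in neighbors:                       # pull from each neighbor spring
        dx = [n[i] - tip[i] for i in range(3)]
        dist = math.sqrt(sum(d * d for d in dx))
        if dist > 0.0:
            f = k_spring * (dist - rest_len) / dist
            fx += f * dx[0]; fy += f * dx[1]; fz += f * dx[2]
    # virtual spring with rest length 0 attached to the control point
    fx += k_virtual * (control_pt[0] - tip[0])
    fy += k_virtual * (control_pt[1] - tip[1])
    fz += k_virtual * (control_pt[2] - tip[2])
    friction = mu * omega                     # tangential, grows with rotation
    return (fx, fy, fz), friction
```

In a visual-feedback loop the magnitudes returned here would drive the color of the virtual needle, so the operator sees at once when insertion or rotation is too forceful.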
A system for realizing the above virtual acupuncture method based on mobile interaction and augmented reality comprises an AR device, a Leap Motion somatosensory controller and a computer. The AR device comprises Microsoft HoloLens holographic glasses and is used to project the experimental human body model into the real world; the operator wears the AR glasses and performs acupuncture with a virtual needle. The Leap Motion somatosensory controller detects the presence, position and motion of the palm and fingers. The Microsoft HoloLens holographic glasses observe the calibration frame and calculate the positions of the four corner points (x1, x2, x3, x4) of its top surface in the HoloLens coordinate system. The computer constructs an Intelligent Fuzzy Petri Net (IFPN) model with a back-propagation algorithm to realize the diagnosis subsystem.
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. The invention uses augmented reality to connect the real world with the virtual interface, giving the operator good immersion.
2. The diagnosis subsystem feeds rich information and prompts back to the operator; no prior knowledge is required, and acupuncture can be learned through practice.
3. The invention provides a more natural mode of human-computer interaction; the operator can observe the virtual human body naturally and accurately from any direction and operate more precisely.
Drawings
Fig. 1 is a flow chart of a virtual acupuncture system.
FIG. 2 is an IFPN model for diagnostics.
Fig. 3 is a generated backward tree.
Fig. 4a and 4b are schematic diagrams of a hand-to-world coordinate system.
Detailed Description
For better understanding of the objects, technical solutions and advantages of the present invention, the technical solutions of the present invention will be described in detail below by way of examples with reference to the accompanying drawings, and it should be noted that processes or symbols that are not described in detail below can be understood or implemented by those skilled in the art with reference to the prior art.
The virtual acupuncture method of the embodiment adopts a system comprising AR glasses, a Leap Motion somatosensory controller and a computer; the flow of the method is shown in fig. 1, and is realized by the following steps:
S1: this step is realized by wearing AR glasses. The virtual human body model is projected into the real environment; wearing the AR glasses, the operator projects the experimental human body model into the real world and performs acupuncture with a virtual needle.
S2: this step is realized on the computer. Intelligent fuzzy Petri nets (IFPNs) are widely used in fields such as reasoning and evaluation. An IFPN has output places and input places, and the model combines intuitionistic fuzzy set theory with Petri net theory. First, a set of disease-symptom data is trained with a back-propagation algorithm, yielding the IFPN model shown in fig. 2, in which each weight relates an input place to an output place. The dynamic certainty factor (DCF) is then introduced to convert the IFPN into a backward tree, as shown in fig. 3. In the backward tree, the importance factor of each symptom for a disease is obtained to identify the disease.
In the following example, the IFPN is used to identify three common diseases: spasm, dysentery and diarrhea. Fig. 2 shows the IFPN model, and fig. 3 the backward tree generated from it. Learning with the back-propagation algorithm yields weights that describe how decisively each input place determines each output place, denoted w1, w2, ....
The weights are learned from aggregated case studies. Each weight indicates the importance factor of an input place for an output place in the IFPN. For example, w1 = 0.617237 means that, relative to the other input places p2 (frequent diarrhea), p6 (red tongue with yellow coating), p8 (rapid pulse) and p9 (weak pulse), the importance factor of place p1 (abdominal pain) for the output place p16 (dysentery) is 0.617237; w15 = 0.09821 means that, relative to the other input places p1 (abdominal pain), p3 (dry stool), p7 (white tongue with white coating) and p9 (weak pulse), the importance factor of p10 (pale face) for the output place p14 (cold constipation) is 0.09821.
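Using the two weights quoted above, the backward-tree lookup can be sketched as a weighted symptom score per disease; the remaining weights in the example are made-up placeholders, and the function name is illustrative.

```python
def diagnose(observed, disease_weights):
    """Score each candidate disease by summing the importance factors of its
    symptoms that were actually observed, and return the best-scoring one.
    `disease_weights` maps disease -> {symptom place: importance factor}."""
    best, best_score = None, -1.0
    for disease, weights in disease_weights.items():
        score = sum(w for symptom, w in weights.items() if symptom in observed)
        if score > best_score:
            best, best_score = disease, score
    return best, best_score
```

Observing p1 (abdominal pain) and p2 (frequent diarrhea) then points to dysentery, since those places carry the larger importance factors for that output place.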
S3: after receiving the data, the Leap Motion controller detects the palm and fingers and computes the gesture position and motion by itself, determining how the hand moves. The hand data are then converted into the global coordinate system, as shown in the hand-to-world coordinate-system diagrams of fig. 4a and 4b (the IMU is a device that measures an object's three-axis attitude angles, or angular velocities, and accelerations).
S4: a least-squares solution of the transformation parameter vector is obtained by setting up a similarity transformation and converting it into a residual formula, which gives the relationship between the two coordinate systems and realizes the conversion. The network also needs to compute how the action is performed in the real world: it obtains the height, orientation and position of the hand from the Microsoft HoloLens glasses, calculates the position of the palm in the real world, and combines it with the Leap Motion data to obtain the real-world coordinates. To transmit real-world images for display on the virtual scene rendered by the Microsoft HoloLens glasses, the real-world coordinates and the AR glasses coordinates must be calibrated. The relationship between the Leap Motion and the Microsoft HoloLens glasses is configured further so that the user can interact with the virtual object, since the Leap Motion can capture every joint point of the human hand. To relate the coordinates of the Microsoft HoloLens glasses and the Leap Motion, at least three non-collinear common points are required. First, the calibration frame is observed with the Microsoft HoloLens, and the positions of the four corner points (x1, x2, x3, x4) of its top surface in the Microsoft HoloLens coordinate system are calculated. Then, by touching the four corner points with the tip of the thumb, the corresponding positions (x1', x2', x3', x4') in the Leap Motion coordinate system are calculated by the Leap Motion, and the relationship between (x1, x2, x3, x4) and (x1', x2', x3', x4') is obtained.
S5: the position is estimated with the interval Kalman filter (IKF), rewritten with the Kalman filter algorithm to realize denoising. The orientation is estimated with the improved particle filter (IPF): over each time period, the sum of the position differences is used to obtain the weights.
S6: the needle-insertion process is modeled as a mass-spring system, so the collision of a mass point with a spring is a good approximation of the acupuncture process. When the needle is inserted into the skin, the internal forces include the spring force, i.e. the pull of the surrounding mass points, and the virtual spring force, i.e. the force produced when the virtual spring of rest length 0 attached to the control point is deformed. The spring force F is determined by the springs grouped in the M-neighborhood around the mass point. Because the acupuncture needle must be rotated to achieve its therapeutic purpose, it experiences, while rotating in one direction, a friction force tangential to the skin whose magnitude depends on the physiological parameters of the skin and on the external rotation force applied to the needle. Knowing these forces, the operator can judge the accuracy of the operation from the real-time change of the virtual needle's color and adjust the insertion speed and rotation speed, thereby improving the accuracy of the operation.
In summary, the invention has been described above with reference to the specification and drawings; the above are only specific steps of the invention and do not limit its scope.

Claims (6)

1. A virtual acupuncture method based on mobile interaction and augmented reality is characterized by comprising the following steps:
S1, augmented reality: projecting the experimental human body model into an experimental environment;
s2, constructing an IFPN model by using a back propagation algorithm to realize a diagnosis subsystem;
s3, obtaining hand coordinates by using a Leap Motion somatosensory controller;
s4, combining virtual scenes with real scenes through a coordinate system and conversion thereof;
S5, estimating and capturing the position and orientation of the hand: the position of the palm is estimated with a Kalman filter, and the interval Kalman filter is rewritten with the Kalman filter algorithm to realize denoising; after time $t_k$ the density is approximated as

$$p(x_k \mid z_{1:k}) \approx \sum_{i=1}^{N} \tilde{w}_k^i \, \delta(x_k - x_k^i)$$

where $\delta(\cdot)$ is the unit impulse function, $N$ is the number of samples, $\tilde{w}_k^i$ is the normalized weight of the $i$-th particle, and $x_k^i$ is the $i$-th particle; then, the cumulative position difference between the estimated and computed values of the $i$-th particle is used in the likelihood calculation, and finally the normalized weights are obtained from the computed likelihood values;
S6, collision detection and visual feedback: the needle-insertion process is treated as a mass-spring model, and the collision of a mass point with a spring represents the acupuncture process; when the needle is inserted into the skin, the internal forces include the spring force, i.e. the pull of the surrounding mass points on the needle, and the virtual spring force, i.e. the force produced when the virtual spring of rest length 0 attached to the control point is deformed; the spring force F is determined by the springs grouped in the M-neighborhood around the mass point; as the acupuncture needle rotates in one direction it experiences a friction force tangential to the skin, whose magnitude depends on the physiological parameters of the skin and on the external rotation force applied to the needle; knowing these forces and using the real-time change of the virtual needle's color, the operator can judge the accuracy of the operation and adjust the insertion speed and rotation speed, improving the accuracy of the operation.
2. The virtual acupuncture method based on mobile interaction and augmented reality of claim 1, wherein the augmented reality in step S1 projects the experimental human body model into the real world by wearing AR equipment, and the operator uses a virtual needle to perform acupuncture by wearing AR glasses.
3. The virtual acupuncture method based on mobile interaction and augmented reality of claim 1, wherein in step S2, the computer combines the digital description of acupuncture with a digitized anatomical atlas to locate 3D acupuncture points on the simulated human body; the IFPN has output places and input places; the IFPN model combines intuitionistic fuzzy set theory with Petri net theory; first, a set of disease-symptom data is trained with a back-propagation algorithm, and after training the weight of each input place with respect to each output place in the IFPN model is obtained; a dynamic certainty factor (DCF) is introduced into the IFPN model, which is then converted into a backward tree; in the backward tree, the importance factor of each symptom for a disease is obtained to identify the disease.
4. The virtual acupuncture method based on mobile interaction and augmented reality of claim 1, wherein in step S3, the palm and fingers are detected by the Leap Motion somatosensory controller; the somatosensory controller computes gestures and motions by itself; it uses a right-handed Cartesian coordinate system with the center of the controller as origin, the X axis parallel to the long edge of the device and the Z axis parallel to the other edge, forming a plane coincident with the surface of the controller, while the Y axis is perpendicular to that plane and points upward; the frame attached to the controller is defined as the XlYlZl coordinate system; the visual range of the controller is an inverted cone, and once the palm and fingers are detected, a coordinate system XhYhZh with its origin at the palm center is constructed to digitize the position and orientation of the hand; hand motion defined relative to the origin of the XhYhZh coordinate system is translated into the relative position and orientation of the hand in XlYlZl coordinates; the data are transmitted to a server, which interprets their meaning and converts them into the global coordinate system.
5. The method according to claim 1, wherein in step S4, a least-squares solution of the transformation parameter vector is obtained by setting up a similarity transformation and converting it into a residual formula, yielding the spatial relationship between the coordinate system used by the Leap Motion controller and the global coordinate system of the real world, thereby realizing the conversion between virtual and real; the network obtains the height, orientation and position of the hand from the Google HoloLens glasses and calculates the position of the palm in the real world; combining these with the data from the controller gives the real-world coordinates; the real-world coordinates and the AR glasses coordinates are calibrated so that images of the real world can be transmitted and displayed on the virtual scene rendered by the Google HoloLens glasses; an interactive scene-structure modeling algorithm using a single image registers the simulated human body in the real scene to obtain the relationship between the virtual object and the real scene, and the relationship between the Leap Motion somatosensory controller and the Google HoloLens glasses is further configured so that the user can interact with the virtual object; meanwhile, the controller captures every joint point of the human hand; the coordinates of the Google HoloLens glasses and the Leap Motion are related through at least three non-collinear common points: the calibration frame is observed with the Microsoft HoloLens holographic glasses, and the positions of the four corner points (x1, x2, x3, x4) of its top surface in the HoloLens coordinate system are calculated; then, by touching the four corner points with the tip of the thumb, the corresponding positions (x1', x2', x3', x4') in the Leap Motion coordinate system are calculated by the Leap Motion somatosensory controller, and the relationship between (x1, x2, x3, x4) and (x1', x2', x3', x4') is obtained.
6. A system for realizing the virtual acupuncture method based on mobile interaction and augmented reality of claim 1, characterized by comprising an AR device, a Leap Motion somatosensory controller and a computer; the AR device comprises Microsoft HoloLens holographic glasses; the AR device is used for projecting the experimental human body model into the real world, and an operator wearing the AR glasses performs acupuncture with a virtual needle; the Leap Motion somatosensory controller is used for detecting the presence, position and gesture actions of the palm and fingers; the Microsoft HoloLens holographic glasses are used for observing the calibration frame and calculating the positions of the four corner points (x1, x2, x3, x4) of its upper surface in the HoloLens coordinate system; the computer is used to implement a diagnostic subsystem by constructing an IFPN model using a back-propagation algorithm.
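For illustration only, the diagnostic subsystem's reasoning step can be sketched as forward inference in a fuzzy Petri net, the model family the IFPN belongs to. The symptom places, certainty factors `mu` and thresholds `lam` below are invented values; in the claimed system these parameters would be tuned with the back-propagation algorithm rather than set by hand:

```python
def fpn_infer(marking, inputs, outputs, mu, lam):
    """One forward-reasoning pass over a fuzzy Petri net: a transition fires
    when the minimum truth degree of its input places reaches its threshold,
    and its output place takes the maximum of its current degree and the
    input degree scaled by the transition's certainty factor."""
    m = dict(marking)
    for ins, out, u, l in zip(inputs, outputs, mu, lam):
        a = min(m[i] for i in ins)
        if a >= l:
            m[out] = max(m[out], a * u)
    return m

# Hypothetical net: places 0-2 are observed symptoms, place 3 is an
# intermediate conclusion, place 4 is the final diagnosis degree.
marking = {0: 0.9, 1: 0.8, 2: 0.7, 3: 0.0, 4: 0.0}
inputs  = [(0, 1), (2, 3)]   # input places of the two transitions
outputs = [3, 4]             # output place of each transition
mu  = [0.95, 0.9]            # certainty factors (assumed)
lam = [0.3, 0.3]             # firing thresholds (assumed)

m1 = fpn_infer(marking, inputs, outputs, mu, lam)
m2 = fpn_infer(m1, inputs, outputs, mu, lam)  # iterate until the marking is stable
```

Because the transition rule is built from min/max and a multiplicative certainty factor, the parameters mu and lam can be treated as weights and trained by gradient-style (back-propagation) updates, which is the role the computer plays in the claimed system.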
CN201811083904.XA 2018-09-17 2018-09-17 Virtual acupuncture method and system based on mobile interaction and augmented reality Expired - Fee Related CN109243575B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811083904.XA CN109243575B (en) 2018-09-17 2018-09-17 Virtual acupuncture method and system based on mobile interaction and augmented reality

Publications (2)

Publication Number Publication Date
CN109243575A CN109243575A (en) 2019-01-18
CN109243575B true CN109243575B (en) 2022-04-22

Family

ID=65059560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811083904.XA Expired - Fee Related CN109243575B (en) 2018-09-17 2018-09-17 Virtual acupuncture method and system based on mobile interaction and augmented reality

Country Status (1)

Country Link
CN (1) CN109243575B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917911B (en) * 2019-02-20 2021-12-28 西北工业大学 Information physical interaction-based vibration tactile feedback device design method
CN111429569B (en) * 2020-03-30 2022-07-26 华南理工大学 Human anatomy teaching method based on 5G + augmented reality
CN112532801A (en) * 2020-12-04 2021-03-19 上海影创信息科技有限公司 Safety protection method and system of VR equipment based on heat distribution detection
CN114446442B (en) * 2022-02-24 2022-08-05 湖南省万卓医疗器械有限公司 Meridian flow acupuncture physiotherapy instrument based on intelligent control
CN114971219A (en) * 2022-05-05 2022-08-30 北京理工大学 Multi-view-angle human factor dynamic evaluation method and system based on augmented reality
CN116646052B (en) * 2023-06-28 2024-02-09 西安交通大学医学院第二附属医院 Auxiliary acupuncture positioning system and method based on three-dimensional human body model
CN117434571B (en) * 2023-12-21 2024-03-15 绘见科技(深圳)有限公司 Method for determining absolute pose of equipment based on single antenna, MR equipment and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101799717A (en) * 2010-03-05 2010-08-11 天津大学 Man-machine interaction method based on hand action catch
CN106327983A (en) * 2016-09-06 2017-01-11 成都华域天府数字科技有限公司 Acupuncture acupoint determination auxiliary teaching system
CN106523393A (en) * 2016-11-18 2017-03-22 山东科技大学 Fault diagnosis method used for downhole draining system
CN106650251A (en) * 2016-12-14 2017-05-10 南京信息工程大学 Modeling method of acupuncture force feedback deformable model
CN107221000A (en) * 2017-04-11 2017-09-29 天津大学 Acupoint Visualization Platform and its image processing method based on augmented reality
CN107993545A (en) * 2017-12-15 2018-05-04 天津大学 Children's acupuncture training simulation system and emulation mode based on virtual reality technology

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A Fuzzy Petri Nets System for Heart Disease Diagnosis; Hussin Attya Lafta et al.; Journal of Babylon University; 2017; vol. 25, no. 2; pp. 317-328 *
IFPN parameter optimization method based on the BP algorithm; Zheng Kouquan et al.; Control and Decision; 2013; vol. 28, no. 12; pp. 1779-1785 *
Research on a 3D visualization interaction method based on dual Leap Motion; Sun Guodao et al.; Journal of Computer-Aided Design & Computer Graphics; Jul. 2018; vol. 30, no. 7; pp. 1268-1275 *

Also Published As

Publication number Publication date
CN109243575A (en) 2019-01-18

Similar Documents

Publication Publication Date Title
CN109243575B (en) Virtual acupuncture method and system based on mobile interaction and augmented reality
JP7273880B2 (en) Virtual object driving method, device, electronic device and readable storage medium
CN106346485B (en) The Non-contact control method of bionic mechanical hand based on the study of human hand movement posture
CN101579238B (en) Human motion capture three dimensional playback system and method thereof
Ueda et al. A hand-pose estimation for vision-based human interfaces
CN105252532A (en) Method of cooperative flexible attitude control for motion capture robot
CN201431466Y (en) Human motion capture and thee-dimensional representation system
Cordella et al. Patient performance evaluation using Kinect and Monte Carlo-based finger tracking
CN106873787A (en) A kind of gesture interaction system and method for virtual teach-in teaching
CN103529944A (en) Human body movement identification method based on Kinect
CN103112007A (en) Human-machine interaction method based on mixing sensor
TW202026846A (en) Action capture method for presenting an image similar to the motion of a user and displaying the image on a display module
KR20200051938A (en) Method for controlling interaction in virtual reality by tracking fingertips and VR system using it
WO2024094227A1 (en) Gesture pose estimation method based on kalman filtering and deep learning
Silva et al. Sensor data fusion for full arm tracking using myo armband and leap motion
JP2023507241A (en) A proxy controller suit with arbitrary dual-range kinematics
Maycock et al. Robust tracking of human hand postures for robot teaching
CN113496168B (en) Sign language data acquisition method, device and storage medium
Ángel-López et al. Kinematic hand analysis using motion capture technology
CN111369626A (en) Markless point upper limb movement analysis method and system based on deep learning
WO2019152566A1 (en) Systems and methods for subject specific kinematic mapping
White et al. A virtual reality application for stroke patient rehabilitation
TW201619754A (en) Medical image object-oriented interface auxiliary explanation control system and method thereof
CN115390739A (en) Remote interactive input method and device based on palm touch and electronic equipment
CN114926613A (en) Method and system for enhancing reality of human body data and space positioning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220422