CN115202471A - Whole body posture tracking and touch equipment and virtual reality system - Google Patents


Info

Publication number
CN115202471A
CN115202471A (application CN202210704377.XA)
Authority
CN
China
Prior art keywords
user
signal
module
control unit
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210704377.XA
Other languages
Chinese (zh)
Inventor
杜伟华
张浩
陈丽莉
韩鹏
何惠东
石娟娟
秦瑞峰
姜倩文
赵砚秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Display Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202210704377.XA priority Critical patent/CN115202471A/en
Publication of CN115202471A publication Critical patent/CN115202471A/en
Priority to PCT/CN2023/091411 priority patent/WO2023246305A1/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015: Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a whole-body posture tracking and haptic device and a virtual reality system, belonging to the technical field of virtual and augmented reality displays. The whole-body posture tracking and haptic device of the present disclosure includes a main body structure, at least one motion detection unit, a control unit, and at least one haptic control unit. The main body structure is configured to be worn on a user's limbs. The motion detection unit is mounted on the main body structure and is configured to detect the user's limb movements and generate the user's motion posture information. The control unit is configured to process the received motion posture information, generate a first control signal, and transmit it to the VR device so that the VR device can control the limb movements of the virtual user; after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, it generates a second control signal from the touch signal. The haptic control unit is configured to feed back a real tactile sensation to the user's limb through the main body structure according to the second control signal.

Description

Whole body posture tracking and touch equipment and virtual reality system
Technical Field
The present disclosure belongs to the technical field of virtual and augmented reality displays, and particularly relates to a whole-body posture tracking and haptic device and a virtual reality system.
Background
Virtual reality (VR) is a practical technology that emerged in the 20th century. It draws on computing, electronic information, and simulation technology; its basic implementation is a computer-simulated virtual environment that gives the user a sense of immersion. With the continuous development of productivity and science and technology, demand for VR technology keeps growing across industries.
Most mobile VR headsets provide only rotational tracking (3DoF): the user can look up or down and tilt the head left or right, but tilting the body or moving the head's position is not tracked. At present, virtual reality can engage only our eyes and ears.
Disclosure of Invention
The present disclosure is directed to solving at least one of the problems of the prior art, and provides a whole-body posture tracking and haptic device and a virtual reality system.
In a first aspect, embodiments of the present disclosure provide a whole-body posture tracking and haptic device, which includes a main body structure, at least one motion detection unit, a control unit, and at least one haptic control unit; wherein,
the main body structure is configured to be worn by a user on a limb;
the motion detection unit is mounted on the main body structure and is configured to detect the user's limb movements and generate the user's motion posture information;
the control unit is configured to process the received motion posture information, generate a first control signal, and transmit it to the VR device so that the VR device can control the limb movements of the virtual user; after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, it generates a second control signal from the touch signal;
the haptic control unit is configured to feed back a real tactile sensation to the user's limb through the main body structure according to the second control signal.
In some examples, the motion posture information includes rotation angles of the motion joints; the motion detection unit includes a first detection module and a second detection module;
the first detection module is configured to detect and process an electromyographic signal generated by a skeletal muscle of a user;
the second detection module is configured to detect and process a skin surface tension signal of the user;
the control unit is configured to obtain the rotation angles of the user's motion joints through a first preset algorithm from the processed electromyographic signal and skin surface tension signal, and then to obtain the user's motion posture information through a second preset algorithm from those rotation angles, so as to generate the first control signal.
In some examples, the first detection module comprises an electromyographic signal electrode and a first bandpass amplifier; the second detection module comprises a skin surface tension strain gauge and a second band-pass amplifier;
the electromyographic signal electrode is configured to detect an electromyographic signal generated by skeletal muscle of a user;
the first band-pass amplifier is configured to amplify the electromyographic signals and transmit the signals to the control unit;
the skin surface tension strain gauge is configured to detect a skin surface tension signal of a user;
the second band-pass amplifier is configured to amplify the skin surface tension signal and transmit the amplified signal to the control unit.
In some examples, the control unit includes an analog-to-digital conversion module, a first calculation module, and a first control module;
the analog-to-digital conversion module is configured to convert the processed electromyographic signals and the processed skin surface tension signals into first digital signals and second digital signals respectively;
the first calculation module is configured to obtain a rotation angle of a movement joint of the user according to the first digital signal and the second digital signal through a first preset algorithm, and obtain movement posture information of the user through a second preset algorithm;
the first control module is configured to generate the first control signal from the user's motion posture information and transmit it to the VR device so that the VR device can control the limb movements of the virtual user, and, after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, to generate the second control signal from the touch signal.
In some examples, the first calculation module is specifically configured to obtain, from the first digital signal and the second digital signal through a first preset algorithm, the rotation angles of the user's motion joints, and to calculate, from a pre-established kinematic model, the spatial pose coordinates of each motion joint's current position relative to its initial point, thereby obtaining the user's motion posture information; the initial point is the angle information of each motion joint in the user's initial posture.
In some examples, the main body structure is woven from a plurality of hollow tubes; the haptic control unit includes a first switch module, a second switch module, and a touch module;
the first switch module and the second switch module are connected to the hollow tube and are used to control gas entering and leaving the hollow tube according to the second control signal; the touch module is arranged on the segment of hollow tube bounded by the first switch module and the second switch module, and when the main body structure is worn by a user the touch module lies on the side of the hollow tube close to the human body.
In some examples, the control unit includes a second control module and a second calculation module;
the second calculation module is configured to calculate the position information of where the user is actually touched from the touch signal fed back by the VR device indicating that the virtual user has been touched;
the second control module is configured to generate the second control signal from the position information of where the user is actually touched.
In some examples, the material of the hollow tube includes a fiber material.
In some examples, the first and second switch modules include valves.
In some examples, the touch module includes a striking hammer.
In some examples, the haptic control unit further includes a suction assembly configured to inflate the hollow tube with gas and to exhaust the gas from the hollow tube.
In some examples, the suction assembly is an air pump.
In a second aspect, an embodiment of the present disclosure further provides a virtual reality system, which includes the whole-body posture tracking and haptic device and a VR device; the VR device is communicatively coupled with the whole-body posture tracking and haptic device.
In some examples, the whole-body posture tracking and haptic device and the VR device are connected via Wi-Fi or Bluetooth.
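The data exchanged over this link, pose updates in one direction and touch feedback in the other, could be framed as small messages. The JSON payload format and the field names below are illustrative assumptions; the disclosure does not specify a wire protocol:

```python
import json

def pose_message(joint_angles):
    """Encode motion posture information for transmission to the VR
    device. The {"type", "angles"} layout is a hypothetical format."""
    return json.dumps({"type": "pose", "angles": joint_angles})

def parse_touch(payload):
    """Decode a touch signal fed back by the VR device; returns the
    touched body part, or None if the message is not a touch signal."""
    msg = json.loads(payload)
    return msg["body_part"] if msg.get("type") == "touch" else None
```

A pose update would be sent with `pose_message({"elbow": 30})`, and a feedback payload such as `{"type": "touch", "body_part": "forearm"}` would be routed to the haptic control unit via `parse_touch`.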
In some examples, the VR device includes a VR headset.
Drawings
FIG. 1 is a schematic diagram of a human kinematics model;
FIG. 2a is an effect diagram of the whole-body posture tracking and haptic device;
FIG. 2b is a schematic view of a detail of the weave of section I of FIG. 2 a;
FIG. 3 is a schematic diagram of virtual reaction forces in a virtual sense of touch;
FIG. 4 is a schematic diagram of virtual pain sensation in a virtual sense of touch;
FIG. 5 is a flowchart of whole body pose tracking;
FIG. 6 is a flow chart of virtual haptics;
FIG. 7 is a schematic diagram of posture tracking of an elbow joint of a human body;
FIG. 8 is a schematic block diagram of motion detection;
fig. 9 is a schematic view of a virtual reality system.
Detailed Description
In order to make the technical solutions of the present invention better understood, the present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The use of "first," "second," and similar terms in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. Also, the use of the terms "a," "an," or "the" and similar referents do not denote a limitation of quantity, but rather denote the presence of at least one. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used only to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
Existing virtual reality (VR) technologies generally use VR devices to isolate a person visually and aurally from the outside world, guiding the user into the sensation of being in a virtual environment. VR devices 2 currently common on the market include VR glasses, VR helmets, and the like.
A VR helmet generally includes lenses, a display screen, and cables. The lens is one of the most critical components: it not only links the user's eyes to the VR content but also largely determines the visual effect.
In addition, a high-performance display gives the VR helmet enough pixel density to show clear images and keeps moving pictures in VR smooth. High-end VR helmets also use a dual-screen display to provide a stereoscopic 3D effect: each screen shows a slightly offset image to each eye, and the brain automatically "glues" them together into one image, creating an illusion of depth in the process.
In addition, the VR helmet has built-in sensors so that the virtual character changes along with the user's head, yielding a more accurate display picture. However, most VR devices 2 in the prior art still track only the head and hands, which makes interaction in the virtual environment limited and the realism of virtual reality poor. Even if the body can be tracked with a tracker, or by placing constellations of specific shapes on the tracked object, such systems are cumbersome and have difficulty tracking detailed poses, and are therefore impractical. Meanwhile, how to let the user feel touch on the different body parts of the virtual user, blending virtual touch into virtual reality, remains an important problem in the field for improving the realism of the virtual reality experience.
In view of this, embodiments of the present disclosure provide a whole-body posture tracking and haptic device 1 that detects the user's limb movements and generates user motion posture information through a motion detection unit 20, thereby tracking the user's whole-body posture. Virtual touch is realized through the main body structure 10 and the haptic control unit 40, while the VR device 2 remains responsible for tracking the position and posture of the head. This greatly improves the realism of the virtual reality experience and promotes the development of truly immersive virtual reality systems.
The whole-body posture tracking and haptic device according to embodiments of the present disclosure is described below with reference to the accompanying drawings and specific embodiments.
In a first aspect, the disclosed embodiments provide a whole-body gesture tracking and haptic device 1, which can be used for information interaction with a VR device 2 worn on the head of a user. The whole-body posture tracking and haptic device 1 includes: a body structure 10, at least one motion detection unit 20, a control unit 30, at least one haptic control unit 40. Wherein the body structure 10 is configured to be worn by a user on a limb; a motion detection unit 20 is mounted on the main body structure 10, the motion detection unit 20 being configured to detect a limb motion of the user and generate user motion posture information; the control unit 30 is configured to process the received motion posture information, generate a first control signal, transmit the first control signal to the VR device 2, allow the VR device 2 to control the limb movement of the virtual user, and generate a second control signal according to the touch signal after receiving the touch signal fed back by the VR device 2 that the virtual user is touched; the haptic control unit 40 is configured to feed back a true haptic sensation to the limb of the user through the body structure 10 according to the second control signal.
It should be noted that the main body structure 10 in the embodiments of the present disclosure may be a garment; when the user wears it while using the whole-body posture tracking and haptic device 1, the main body structure 10 can remain relatively fixed on the user's limbs (e.g., a tight-fitting garment) so as to fit the skin. The motion detection unit 20 is fixed on the main body structure 10; when the main body structure 10 is worn by a user, the position of the motion detection unit 20 generally corresponds to the user's skeletal muscles and lies on the side of the main body structure 10 away from the user's skin.
In the embodiments of the present disclosure, the whole-body posture tracking and haptic device 1 includes the main body structure 10, at least one motion detection unit 20, the control unit 30, and at least one haptic control unit 40. The control unit 30 processes the user's real motion posture information detected by the motion detection unit 20, generates the first control signal, and transmits it to the VR device 2 so that the VR device 2 can control the limb movements of the virtual user. After receiving a touch signal fed back by the VR device 2 indicating that the virtual user has been touched, the control unit 30 generates the second control signal from that signal, and the haptic control unit 40 feeds back a real tactile sensation to the user's limb through the main body structure 10 accordingly. The device 1 can therefore track the user's whole-body posture and give the user virtual tactile sensations matched to what the VR device 2 displays, greatly improving the realism of the virtual reality experience.
In some examples, the motion pose information may be a rotation angle of the motion joint. The motion detection unit 20 includes a first detection module 21 and a second detection module 22. Specifically, the first detection module 21 is configured to detect and process an electromyographic signal generated by a skeletal muscle of a user; the second detection module 22 is configured to detect and process a skin surface tension signal of the user; the control unit 30 is configured to obtain a rotation angle of a moving joint of the user through a first preset algorithm according to the processed electromyogram signal and the skin surface tension signal, and obtain movement posture information of the user through a second preset algorithm according to the rotation angle of the moving joint of the user to generate a first control signal.
Further, the first detection module 21 includes an electromyographic signal electrode 211 and a first band pass amplifier 212; the second detection module 22 comprises a skin surface tension strain gauge 221 and a second band-pass amplifier 222; an electromyographic signal electrode 211 configured to detect an electromyographic signal generated by a skeletal muscle of a user; a first band pass amplifier 212 configured to amplify the electromyographic signals and transmit to the control unit 30; a skin surface tension strain gauge 221 configured to detect a skin surface tension signal of a user; a second band pass amplifier 222 configured to amplify the skin surface tension signal and transmit it to the control unit 30.
It should be noted that the motion detection unit 20 is mounted on the main body structure 10, and its projection on the main body structure 10 covers the skeletal muscles of the whole body; the numbers of electromyographic signal electrodes 211, first band-pass amplifiers 212, skin surface tension strain gauges 221, and second band-pass amplifiers 222 in the motion detection unit 20 can be adjusted for different body parts.
Specifically, when a muscle contracts, the electromyographic signal electrode 211 detects a myoelectric current; the stronger the contraction, the larger the current. The degree of contraction can thus be inferred from the magnitude of the myoelectric current, from which the joint rotation angle is calculated; the action angles of each joint of the human body can then control the actions of the virtual user, completing whole-body posture tracking. The skin surface tension strain gauge 221 detects skin stress changes during muscle contraction and relaxation: when a muscle contracts, the skin is squeezed and the strain gauge senses pressure; when the muscle relaxes, the skin is under surface tension and the strain gauge senses a pulling force. The skin-stress-change current can be combined with the electromyographic signal current to calculate the joint rotation angle more accurately.
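The fusion of the two measurements into a joint angle might look like the following sketch. The linear gains and the complementary weight are invented for illustration; the concrete "first preset algorithm" is not disclosed here:

```python
def joint_angle_from_signals(emg_current, tension_current,
                             emg_gain=90.0, tension_gain=1.2,
                             emg_weight=0.6):
    """Estimate a joint rotation angle (degrees) by fusing an EMG
    amplitude with a skin surface tension strain reading.

    emg_current     -- normalized EMG amplitude in [0, 1]
    tension_current -- strain-gauge reading (arbitrary units)
    All gains and the complementary weight are hypothetical values.
    """
    angle_from_emg = emg_gain * emg_current            # stronger contraction -> larger angle
    angle_from_tension = tension_gain * tension_current
    # Complementary fusion: EMG dominates, the strain signal refines it.
    return emg_weight * angle_from_emg + (1.0 - emg_weight) * angle_from_tension
```

In a real device the mapping from current to angle would come from per-user calibration rather than fixed constants.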
The first band-pass amplifier 212 and the second band-pass amplifier 222 filter and amplify the electromyographic signal and the skin surface tension signal, respectively. The signal amplifiers are not limited to band-pass amplifiers; any type that effectively amplifies the detected signals may be used.
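Although the device uses analog band-pass amplifiers, the effect of band-limiting a raw signal can be illustrated in software. The filter below (a fast low-pass minus a slow low-pass) and its coefficients are purely illustrative:

```python
def bandpass(samples, alpha_lo=0.1, alpha_hi=0.6):
    """Crude software band-pass: difference of two exponential moving
    averages. Surface EMG is typically band-limited to roughly
    20-450 Hz in practice; the alpha values here are illustrative and
    not tied to any sampling rate.
    """
    out, slow, fast = [], 0.0, 0.0
    for x in samples:
        slow += alpha_lo * (x - slow)   # tracks DC offset / drift
        fast += alpha_hi * (x - fast)   # tracks signal plus drift
        out.append(fast - slow)         # drift removed -> band-passed
    return out
```

On a constant (pure DC) input the output decays toward zero, which is the defining property of rejecting the low-frequency band.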
Further, the control unit 30 includes an analog-to-digital conversion module 31, a first calculation module and a first control module; the analog-to-digital conversion module 31 is configured to convert the processed electromyographic signals and skin surface tension signals into first digital signals and second digital signals, respectively. The first calculation module is configured to obtain a rotation angle of a kinematic joint of the user according to the first digital signal and the second digital signal through a first preset algorithm, and obtain motion posture information of the user through a second preset algorithm. The first control module is configured to generate a first control signal according to the motion posture information of the user, transmit the first control signal to the VR device 2, allow the VR device 2 to control the limb movement of the virtual user, and generate a second control signal according to the touch signal after receiving the touch signal fed back by the VR device 2 that the virtual user is touched.
In some examples, the first calculation module is specifically configured to obtain, according to the first digital signal and the second digital signal, a rotation angle of a kinematic joint of the user through a first preset algorithm, and calculate, according to a pre-established kinematic model, a spatial pose coordinate of a current position point of each kinematic joint with respect to an initial point of each kinematic joint, respectively, so as to obtain motion posture information of the user; the initial point is angle information of each motion joint of the initial posture of the user.
Further, the first calculation module includes a multimedia application processor (MAP) and an inertial measurement unit (IMU), and is configured to first process the first digital signal and the second digital signal through the MAP to obtain the rotation angles of the user's motion joints, and then to calculate the user's motion posture information through kinematic formulas from those rotation angles.
Specifically, when the system is used for the first time, virtual-real body calibration is performed: the user's body dimensions, initial posture, and the spatial coordinates of the hand and foot extremities are matched to the virtual user. The body dimensions are obtained by taking a full-body picture with the VR device 2. The initial posture is the posture in which the user starts to use the system; after startup, whole-body posture tracking records this posture, and in subsequent tracking the joint angles of the initial posture serve as the initial points. The spatial coordinates of the hand and foot extremities can then be calculated kinematically.
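The role of the initial posture as a set of reference points can be sketched as follows; the dictionary-of-joint-angles data model is an assumption made for illustration:

```python
def calibrate(initial_joint_angles):
    """Store the joint angles of the user's starting posture; later
    readings are then reported relative to these initial points."""
    reference = dict(initial_joint_angles)  # snapshot of the initial posture

    def relative_pose(current_angles):
        # Each joint's tracked angle is expressed as a delta from its
        # calibrated initial point.
        return {j: current_angles[j] - reference[j] for j in reference}

    return relative_pose
```

After `rel = calibrate({"elbow": 10.0})`, a later reading of 35 degrees at the elbow is reported as a 25-degree rotation from the initial posture.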
A fixed point O1 is established together with a coordinate system X1O1Y1; the position of O1 relative to the VR device 2 is fixed, and its relative attitude is obtained and calculated by the VR device 2 and the IMU. The point on the user to be calculated is taken as O2, with a coordinate system X2O2Y2. The pose of X2O2Y2 relative to X1O1Y1 is computed, and by calculating the coordinates of the user's limbs relative to O1 the user's posture is finally obtained.
FIG. 1 is a schematic diagram of a human kinematics model; the process that the user controls the virtual user through the limb action comprises the following steps: the user makes corresponding action according to the virtual picture, the whole body posture is tracked, the tactile device 1 tracks the posture, the control unit 30 sends the body posture coordinate to the VR device 2, and the VR device 2 controls the action of the virtual user through the user posture. Specifically, as shown in fig. 1, taking the left arm of the human body as an example in the figure, a kinematic model is established, and O is calculated 2 Point relative to O 1 Relative spatial pose coordinates of the points; o is 1 The relative position of the point to the VR device 2 is fixed, the relative posture can be obtained and calculated by the VR device 2 and the IMU of the control unit 30, the information of the angle of each joint of the left arm of the human body in the figure can be tracked by the posture of the whole body, and then O is calculated 2 Point relative to O 1 Pose coordinates of the points:
[Equation images: homogeneous transformation matrices giving the pose of O2 relative to O1]
In the above formulas:
1) the transformation matrix gives the pose coordinates of O2 relative to O1;
2) cθ2 denotes cos θ2;
3) sθ2 denotes sin θ2.
In the above calculation, the transformation matrix is that of the hand coordinate system X2O2Y2 relative to the central coordinate system X1O1Y1 of the control unit 30; by calculating the coordinates of the body's limbs relative to O1, the whole-body posture information of the human body is obtained. In this process, the user controls the virtual user through limb actions as follows: the user performs actions in response to the virtual picture, the whole-body posture tracking and haptic device 1 tracks the posture, and the control unit 30 sends the user's pose coordinates to the VR device 2, whereby the user controls the virtual user's actions.
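The chaining of per-joint transforms described above can be sketched in code. This is a minimal planar (2D) sketch using 3x3 homogeneous matrices; the two-joint example, function names, and link lengths are illustrative assumptions, not the patent's actual algorithm:

```python
import math

def planar_transform(theta, dx, dy):
    """3x3 homogeneous transform: rotation by theta plus translation."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, dx],
            [s,  c, dy],
            [0,  0,  1]]

def matmul3(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def end_effector(joint_angles, link_lengths):
    """Chain the per-joint transforms to obtain the position of O2
    (e.g. the hand) in the base frame O1. Link lengths are assumed
    known from the body-size calibration step."""
    T = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]        # identity: start at O1
    for theta, length in zip(joint_angles, link_lengths):
        T = matmul3(T, planar_transform(theta, 0, 0))   # rotate at the joint
        T = matmul3(T, planar_transform(0, length, 0))  # advance along the link
    return T[0][2], T[1][2]   # (x, y) of the end point in frame O1
```

With both joints at zero the arm points along x and the hand lies at (2, 0); rotating the first joint by 90 degrees moves it to (0, 2), matching the cθ/sθ entries of the transformation matrices above.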
It will be appreciated that in the above calculation, the placement of the hand coordinate system X2O2Y2 and the central coordinate system X1O1Y1 of the control unit 30 is not unique. The control unit 30 is usually located at the human body's center of gravity, which makes calculating the whole-body posture information more convenient, but it may be placed elsewhere on the body without affecting posture tracking and calculation.
In some examples, the control unit 30 generates a second control signal according to the touch signal after receiving the touch signal that the virtual user is touched, which is fed back by the VR device 2, and the haptic control unit 40 can feed back the real haptic sensation to the limb of the user through the body structure 10 according to the second control signal.
In some examples, the control unit 30 includes a second control module and a second calculation module; the second calculating module is configured to calculate to obtain position information that the user is actually touched according to a touch signal fed back by the VR device 2 and the virtual user is touched; the second control module is configured to generate a second control signal according to the position information that the user is actually touched, and the haptic control unit 40 may feed back a real haptic sensation to the limb of the user through the body structure 10 according to the second control signal.
It should be noted that the haptic control unit 40 may exist independently from the control unit 30, i.e. the control unit 30 does not directly control the implementation of the virtual sense of touch, but the haptic control unit 40 may also be controlled by the control unit 30, and the two are communicatively connected to implement the virtual sense of touch function.
Further, the haptic control unit 40 includes a first switch module 41, a second switch module 42, and a touch module 43. The first switch module 41 and the second switch module 42 are connected to the hollow tube and control the gas entering and leaving the hollow tube according to the second control signal. The touch module 43 is arranged on the segment of hollow tube bounded by the first switch module 41 and the second switch module 42; when the user wears the main body structure 10, the touch module 43 lies on the side of the hollow tube close to the human body.
Specifically, the main body structure 10 is woven from a plurality of hollow tubes. FIG. 2b is a detailed schematic view of part I in FIG. 2a; as shown there, the hollow tubes form a mesh-like interlaced structure with adjustable internal air pressure. The flexibility of a hollow tube changes with its internal pressure, so the overall flexibility can be changed by filling or discharging air; the local air pressure of a hollow tube is likewise controllable, and virtual touch is realized by adjusting it.
In some examples, the material of the hollow tube comprises a fiber material, i.e., the hollow tube is a hollow fiber tube. Fiber material has high strength, light weight, and good air permeability, so a main body structure 10 woven from hollow fiber tubes is well suited to being worn on the human body. However, the material of the hollow tube is not limited to fiber; any material that meets the functional requirements of the main body structure 10 may be used, with lightweight, high-performance materials being preferred.
In some examples, the first switch module 41 and the second switch module 42 include, but are not limited to, valves. Since the first switch module 41 and the second switch module 42 serve to control the local air pressure of the hollow tube, their form is not limited.
In some examples, the touch module 43 includes a striking hammer. After the haptic control unit 40 receives the second control signal, the first switch module 41 and the second switch module 42 on the two sides of the striking hammer are controlled to close, so that the local air pressure in the hollow tube decreases; under the resulting pressure difference, the striking hammer strikes a contact point on the surface of the user's skin, producing a virtual pain sensation.
It should be noted that, instead of closing the first switch module 41 and the second switch module 42 on both sides of the striking hammer to make it strike the skin surface, the striking hammer may also be directly controlled to rise and fall to realize the virtual touch.
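The valve-and-hammer sequence described above can be sketched as a simple control step. All names, the pressure-halving model, and the strike threshold below are assumptions for illustration; the patent specifies only that closing the two switch modules lowers the local pressure and drives the hammer onto the skin:

```python
# Hypothetical sketch of the strike sequence: closing the two switch modules
# around the hammer lowers the local tube pressure; when the pressure drops
# below a threshold, the hammer is pressed against the skin contact point.

STRIKE_THRESHOLD = 80.0  # kPa, assumed value

def update_hammer(valve_a_open, valve_b_open, local_pressure):
    """Advance one control step; return (strikes, new_pressure)."""
    if not valve_a_open and not valve_b_open:
        # Both switch modules closed: gas is no longer fed in, pressure falls.
        local_pressure *= 0.5
    strikes = local_pressure < STRIKE_THRESHOLD
    return strikes, local_pressure

striking, p = update_hammer(False, False, 120.0)  # valves closed: pressure halves
idle, _ = update_hammer(True, True, 120.0)        # valves open: no strike
```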
In some examples, the haptic control unit 40 further includes a suction assembly 44 configured to fill the hollow tube with gas and to discharge gas from the hollow tube. Virtual touch can thus be realized by adjusting the air pressure inside the hollow tube.
In some examples, the suction assembly 44 is an air pump that fills the hollow tube with gas and discharges the gas from it. Specifically, the pump feeds gas through an air duct into a gas storage cylinder, which in turn supplies the hollow tube; at the same time, the gas storage cylinder feeds its gas through an air duct into a pressure regulating valve fixed on the air pump, so that the air pressure in the cylinder is controlled. While the pressure in the cylinder is below the value set on the regulating valve, the gas entering the valve from the cylinder cannot lift the valve. Once the pressure reaches the set value, the gas lifts the regulating valve, enters the air passage in the pump that communicates with the valve, and holds the pump's air inlet open through this passage, so that the pump runs unloaded. When the pump stops, it exhausts automatically.
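The unloader behaviour just described — the regulating valve stays shut and the pump charges the cylinder until the set pressure is reached, after which the pump runs unloaded — can be sketched as follows. The function name, charge rate, and set pressure are illustrative values, not taken from the disclosure:

```python
# Hypothetical sketch of the air-pump unloader logic: below the set pressure
# the regulating valve stays closed and the pump charges the cylinder; at or
# above it, the valve opens, the inlet is held open, and the pump unloads.

SET_PRESSURE = 100.0  # kPa, assumed regulating-valve setting

def pump_step(cylinder_pressure, charge_rate=10.0):
    """Advance one step; return (new_pressure, unloaded)."""
    if cylinder_pressure < SET_PRESSURE:
        return cylinder_pressure + charge_rate, False  # valve closed, charging
    return cylinder_pressure, True                     # valve open, pump unloaded

# Run until the regulating valve opens.
p, unloaded = 0.0, False
while not unloaded:
    p, unloaded = pump_step(p)
```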
It should be noted that the form and number of the suction assembly 44 are not limited here, as long as it can fill the hollow tube with gas and discharge the gas from it; a blower, for example, may also serve as the suction assembly 44. Of course, since the whole-body posture tracking and haptic device 1 provided by the present disclosure is worn on the user's body, the weight and size of the suction assembly 44 should be considered: provided its function is realized, the lighter and smaller the suction assembly 44, the better.
To make the working principle of the whole-body posture tracking and haptic device 1 provided by the embodiment of the present disclosure clearer, a specific example is described below.
Fig. 2a is an effect diagram of the whole-body posture tracking and haptic device; fig. 3 is a schematic diagram of virtual reaction force in virtual touch; fig. 4 is a schematic diagram of virtual pain in virtual touch; fig. 5 is a flowchart of whole-body posture tracking; fig. 6 is a flowchart of virtual touch. Referring to figs. 2a, 3, 4, 5, and 6, the motion detection unit 20 of the whole-body posture tracking and haptic device 1 is mounted on the main body structure 10 and configured to detect the user's limb motions. When the user moves, the motion detection unit 20 detects the limb motions, generates user motion posture information, and transmits it to the control unit 30. The control unit 30 processes the received motion posture information, generates a first control signal, and transmits it to the VR device 2, so that the VR device 2 controls the limb movement of the virtual user, thereby realizing whole-body posture tracking. When the user moves in the VR environment and the virtual user is hit by a virtual character, a touch signal is generated in the VR device 2; after the VR device 2 feeds back this touch signal, a second control signal is generated according to it, and the haptic control unit 40 feeds back a real tactile sensation to the user's limb through the main body structure 10 according to the second control signal, thereby realizing virtual touch.
Specifically, as shown in fig. 5, when the system is used for the first time, virtual-real body calibration is performed: the physical dimensions of the user's body, the initial posture, and the spatial coordinates of the hand and foot ends are mapped to the virtual user. The physical dimensions are obtained by taking a full-body picture with the VR device 2. The initial posture is the posture in which the user starts using the system; after the system is started, it is recorded by whole-body posture tracking, and in subsequent tracking the joint angles of this initial posture serve as the starting points. The spatial coordinates of the hand and foot ends can be calculated by kinematics.
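The disclosure does not give the kinematic model used to compute the hand and foot end coordinates. A minimal planar two-link sketch — shoulder and elbow angles to hand position, with link lengths and angle conventions assumed — illustrates the calculation (a real model would be three-dimensional and cover all joints):

```python
import math

# Hypothetical planar forward kinematics: shoulder at the origin, two links
# (upper arm, forearm), angles measured from the x-axis in radians.

def hand_position(shoulder_angle, elbow_angle, upper=0.30, fore=0.25):
    """Return (x, y) of the hand end for the given joint angles."""
    ex = upper * math.cos(shoulder_angle)                 # elbow position
    ey = upper * math.sin(shoulder_angle)
    hx = ex + fore * math.cos(shoulder_angle + elbow_angle)  # hand position
    hy = ey + fore * math.sin(shoulder_angle + elbow_angle)
    return hx, hy

# Arm held straight along x: the hand sits at the sum of the link lengths.
x, y = hand_position(0.0, 0.0)
```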
After virtual-real body calibration, whole-body posture tracking begins. Fig. 7 is a schematic diagram of posture tracking of the human elbow joint; as shown in fig. 7, taking the elbow joint as an example, the motion detection units 20 are attached to the outer sides of the triceps and biceps. Fig. 8 is a schematic block diagram of motion detection; as shown in fig. 8, each motion detection unit 20 includes a first detection module 21 and a second detection module 22, where the first detection module 21 includes an electromyographic signal electrode 211 and a first band-pass amplifier 212, and the second detection module 22 includes a skin surface tension strain gauge 221 and a second band-pass amplifier 222. When a muscle contracts, the electromyographic signal electrode 211 detects a myoelectric current; the greater the contraction, the larger the current, so the degree of contraction can be inferred from the detected current and the joint rotation angle calculated from it. The joint angles of the human body then drive the motions of the virtual user, completing whole-body posture tracking. The skin surface tension strain gauge 221 detects changes in skin stress during muscle contraction and relaxation: when a muscle contracts, the skin presses on the strain gauge, which registers compression; when the muscle relaxes, the skin is under surface tension and the strain gauge registers tension. Combining the skin-stress current with the myoelectric signal current allows the joint rotation angle to be calculated more accurately, completing whole-body posture tracking.
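How the two currents are combined into a joint angle is not specified in the disclosure. One plausible sketch is a weighted fusion of two independently calibrated angle estimates; the linear calibration curves, the 0–90° range, and the weighting are all assumptions for illustration:

```python
# Hypothetical fusion of the two sensor channels. Each channel is mapped to an
# angle estimate through its own calibration curve (linear here for
# simplicity), then the two estimates are blended with fixed weights.

def emg_to_angle(emg_current, gain=90.0):
    """Assumed linear calibration: 0..1 mA of myoelectric current -> 0..90 deg."""
    return max(0.0, min(90.0, emg_current * gain))

def strain_to_angle(strain_current, gain=90.0):
    """Assumed linear calibration for the skin-tension channel."""
    return max(0.0, min(90.0, strain_current * gain))

def fused_joint_angle(emg_current, strain_current, w_emg=0.6):
    """Blend the two estimates; the weighting is illustrative, not from the patent."""
    a_emg = emg_to_angle(emg_current)
    a_strain = strain_to_angle(strain_current)
    return w_emg * a_emg + (1.0 - w_emg) * a_strain

angle = fused_joint_angle(0.5, 0.6)  # 0.6*45 + 0.4*54 degrees
```

In practice the calibration curves would be non-linear and obtained per user during the virtual-real calibration step, and the blending could be adaptive (e.g., a Kalman filter) rather than a fixed weight.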
As shown in fig. 3, when the user tries to push an object with the arm of the virtual user in the virtual scene, the user swings an arm in the real environment and the virtual user pushes the object accordingly. In a real scene, a person pushing an object senses the reaction force of the object on the hand or arm; the present disclosure reproduces this effect in the virtual scene. When the user swings an arm to push an object against the motion shown in fig. 3, the air pressure in the hollow tube on the outside of the upper arm is reduced, generating a force F1 contracting toward the inside of the tube; as the forearm swings, the arm feels a reaction force F2 under the action of F1. The user thus also feels the reaction force experienced by the virtual user, realizing virtual touch.
As shown in fig. 6, when the virtual user is hit, the user feels pain as follows: after the virtual user is hit, the VR device 2 records the coordinate position on the virtual user's body and transmits this position information to the haptic control unit 40, which adjusts the air pressure at the corresponding position on the user's body to produce a slight pain sensation.
Specifically, as shown in fig. 4, when the user plays a fighting game, the virtual opponent may strike the virtual user, and to bring the virtual experience closer to a real scene, the user should feel pain when the virtual user is hit. When a part of the virtual user's body is hit, it corresponds to a contact point on the user's skin in reality. At that moment the air pressure in the hollow tubes on both sides of the first switch module 41 and the second switch module 42 at the contact point is reduced; under this pressure, the first switch module 41 and the second switch module 42 experience the tensile forces F1 and F2, the touch module 43 strikes the contact point on the skin surface under the pressure F3, and the user feels a slight pain, realizing virtual pain.
In a second aspect, an embodiment of the present disclosure further provides a virtual reality system. Fig. 9 is a schematic view of the virtual reality system; as shown in fig. 9, it includes the whole-body posture tracking and haptic device 1 described above and a VR device 2, with the VR device 2 communicatively connected to the whole-body posture tracking and haptic device 1.
To make the working principle of the virtual reality system provided by the embodiment of the present disclosure clearer, a specific example is described below.
In one example, referring to fig. 9, the virtual reality system provided by the present disclosure includes a whole-body posture tracking and haptic device 1 and a VR device 2; the whole-body posture tracking and haptic device 1 mainly performs whole-body posture tracking and virtual touch, while the VR device 2 mainly performs six-degree-of-freedom head tracking. Whole-body posture tracking is mainly completed by the motion detection units 20 arranged on the outer surfaces of the skeletal muscles of the whole body; each unit mainly comprises an electromyographic signal electrode 211 and a skin surface tension strain gauge 221. When the human body moves, the electromyographic signal electrode 211 and the skin surface tension strain gauge 221 generate weak currents, which are filtered and amplified by the band-pass amplifiers and converted into digital signals by the analog-to-digital conversion module 31; the current values are then mapped to the corresponding joint angles, realizing whole-body posture tracking. Virtual touch is realized by the tight-fitting garment woven from hollow fiber tubes: local pressure within the garment is controllable, each local pressure is governed by a separate switch module, the suction assembly 44 adjusts local pressure by admitting or exhausting gas, and the touch module 43 then delivers the virtual touch. Meanwhile, an IMU records the posture information of the main control box.
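The step in which the digitized current value is mapped to a joint angle can be sketched as an ADC quantization followed by a table lookup. The ADC resolution, reference voltage, and calibration table below are assumed values; the disclosure says only that the amplified signal is digitized and converted to an angle:

```python
# Hypothetical sketch of the signal path after amplification: an n-bit ADC
# quantizes the amplified signal, and a calibration table maps the digital
# code to a joint angle by piecewise-linear interpolation.

def adc(voltage, v_ref=3.3, bits=12):
    """Quantize a 0..v_ref voltage to an n-bit code."""
    code = int(voltage / v_ref * ((1 << bits) - 1))
    return max(0, min((1 << bits) - 1, code))

# Assumed per-user calibration: ADC code -> elbow angle in degrees.
CAL_CODES = [0, 1000, 2000, 3000, 4095]
CAL_ANGLES = [0.0, 20.0, 45.0, 75.0, 120.0]

def code_to_angle(code):
    """Piecewise-linear interpolation over the calibration table."""
    for i in range(1, len(CAL_CODES)):
        if code <= CAL_CODES[i]:
            span = CAL_CODES[i] - CAL_CODES[i - 1]
            frac = (code - CAL_CODES[i - 1]) / span
            return CAL_ANGLES[i - 1] + frac * (CAL_ANGLES[i] - CAL_ANGLES[i - 1])
    return CAL_ANGLES[-1]

angle = code_to_angle(adc(1.65))  # mid-scale input voltage
```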
The whole-body posture tracking and haptic device 1 communicates with the VR device 2 through the control unit 30: the control unit 30 sends the user's body posture information to the VR device 2 for processing, and the VR device 2 sends touch-point coordinates to the haptic control unit 40, realizing the virtual touch function. It should be noted that in this example the first switch module 41 and the second switch module 42 are valves, the touch module 43 is a striking hammer, and the suction assembly 44 is an air pump.
In some examples, the whole-body posture tracking and haptic device 1 and the VR device 2 are connected through WIFI or Bluetooth to enable information interaction.
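The disclosure does not define the message format carried over the WIFI or Bluetooth link. A minimal JSON framing sketch — posture information upstream, touch-point coordinates downstream — is shown below; all field names ("type", "joint_angles", "touch_point") are assumptions for illustration:

```python
import json

# Hypothetical message framing between the haptic device 1 and the VR device 2.

def encode_posture(joint_angles):
    """Device -> VR: whole-body posture information as a JSON string."""
    return json.dumps({"type": "posture", "joint_angles": joint_angles})

def decode_touch(raw):
    """VR -> device: coordinates of the touched point, or None for other messages."""
    msg = json.loads(raw)
    return tuple(msg["touch_point"]) if msg.get("type") == "touch" else None

up = encode_posture({"elbow_l": 42.0, "elbow_r": 15.5})
down = decode_touch('{"type": "touch", "touch_point": [0.3, 1.2, 0.0]}')
```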
In some examples, the VR device 2 comprises a VR headset; of course, the VR device 2 may also be any other device that is worn on the user's head and has a VR display, such as VR glasses.
It will be understood that the above embodiments are merely exemplary embodiments taken to illustrate the principles of the present invention, which is not limited thereto. It will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the spirit and substance of the invention, and these modifications and improvements are also considered to be within the scope of the invention.

Claims (15)

1. A whole-body pose tracking and haptic device, comprising: a main body structure, at least one motion detection unit, a control unit, and at least one tactile control unit; wherein,
the body structure configured to be worn by a user on a limb;
the motion detection unit is arranged on the main body structure and is configured to detect the limb motion of the user and generate the motion posture information of the user;
the control unit is configured to process the received motion posture information, generate a first control signal, and transmit the first control signal to a VR device, so that the VR device controls the limb movement of a virtual user; and, after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, to generate a second control signal according to the touch signal;
the tactile control unit is configured to feed back a real tactile sensation to the user limb through the body structure according to the second control signal.
2. The whole-body pose tracking and haptic device of claim 1, wherein the motion pose information comprises a rotation angle of a motion joint; the action detection unit comprises a first detection module and a second detection module;
the first detection module is configured to detect and process electromyographic signals generated by skeletal muscles of a user;
the second detection module is configured to detect and process a skin surface tension signal of the user;
the control unit is configured to obtain a rotation angle of a movement joint of the user through a first preset algorithm according to the processed electromyographic signal and the skin surface tension signal, and obtain movement posture information of the user through a second preset algorithm according to the rotation angle of the movement joint of the user to generate a first control signal.
3. The whole body pose tracking and haptic device of claim 2, wherein said first detection module comprises a myoelectric signal electrode and a first band pass amplifier; the second detection module comprises a skin surface tension strain gauge and a second band-pass amplifier;
the electromyographic signal electrode is configured to detect an electromyographic signal generated by skeletal muscle of a user;
the first band-pass amplifier is configured to amplify the electromyographic signals and transmit the signals to the control unit;
the skin surface tension strain gauge is configured to detect a skin surface tension signal of a user;
the second band-pass amplifier is configured to amplify the skin surface tension signal and transmit the amplified signal to the control unit.
4. The whole-body pose tracking and haptic device of claim 2 or 3, wherein the control unit comprises an analog-to-digital conversion module, a first calculation module and a first control module;
the analog-to-digital conversion module is configured to convert the processed electromyographic signals and the processed skin surface tension signals into first digital signals and second digital signals respectively;
the first calculation module is configured to obtain a rotation angle of a movement joint of the user according to the first digital signal and the second digital signal through a first preset algorithm, and obtain movement posture information of the user through a second preset algorithm;
the first control module is configured to generate a first control signal according to the motion posture information of the user and transmit it to the VR device, so that the VR device controls the limb movement of the virtual user; and, after receiving the touch signal fed back by the VR device indicating that the virtual user has been touched, to generate a second control signal according to the touch signal.
5. The whole-body posture tracking and haptic device according to claim 4, wherein the first computing module is specifically configured to obtain, according to the first digital signal and the second digital signal and through a first preset algorithm, a rotation angle of a kinematic joint of a user, and obtain, according to a pre-established kinematic model, a spatial pose coordinate of a current position point of each kinematic joint with respect to an initial point of each kinematic joint, respectively, so as to obtain the kinematic posture information of the user; the initial point is angle information of each motion joint of the initial posture of the user.
6. The whole body pose tracking and haptic device of claim 1, wherein the main body structure is woven from a plurality of hollow tubes; the tactile control unit comprises a first switch module, a second switch module, and a touch module;
the first switch module and the second switch module are connected to the hollow tube and are used to control gas entering and leaving the hollow tube according to the second control signal; the touch module is disposed on the segment of the hollow tube defined by the first switch module and the second switch module, and when the main body structure is worn by a user, the touch module is located on the side of the hollow tube close to the human body.
7. The whole-body pose tracking and haptic device of claim 6, wherein the control unit comprises a second control module and a second calculation module;
the second calculating module is configured to calculate the position at which the user is actually touched, according to the touch signal fed back by the VR device indicating that the virtual user has been touched;
the second control module is configured to generate the second control signal according to the position at which the user is actually touched.
8. The whole body pose tracking and haptic device of claim 6, wherein the material of the hollow tube comprises a fiber material.
9. The whole body pose tracking and haptic device of claim 6, wherein the first and second switch modules comprise valves.
10. The whole body pose tracking and haptic apparatus of claim 6, wherein the touch module comprises a hammer.
11. The whole body pose tracking and haptic device of claim 6, wherein the tactile control unit further comprises a suction assembly configured to fill the hollow tube with gas and to discharge gas from the hollow tube.
12. The whole body pose tracking and haptic device of claim 11, wherein the suction assembly is an air pump.
13. A virtual reality system comprising the whole-body pose tracking and haptic device of any one of claims 1-12 and a VR device; the VR device is communicatively coupled with the whole-body pose tracking and haptic device.
14. The virtual reality system of claim 13, wherein the whole-body pose tracking and haptic device and the VR device are connected via WIFI or bluetooth.
15. The virtual reality system of claim 13, wherein the VR device comprises a VR headset.
CN202210704377.XA 2022-06-21 2022-06-21 Whole body posture tracking and touch equipment and virtual reality system Pending CN115202471A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210704377.XA CN115202471A (en) 2022-06-21 2022-06-21 Whole body posture tracking and touch equipment and virtual reality system
PCT/CN2023/091411 WO2023246305A1 (en) 2022-06-21 2023-04-28 Whole-body posture tracking and haptic device and virtual reality system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210704377.XA CN115202471A (en) 2022-06-21 2022-06-21 Whole body posture tracking and touch equipment and virtual reality system

Publications (1)

Publication Number Publication Date
CN115202471A true CN115202471A (en) 2022-10-18

Family

ID=83576875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210704377.XA Pending CN115202471A (en) 2022-06-21 2022-06-21 Whole body posture tracking and touch equipment and virtual reality system

Country Status (2)

Country Link
CN (1) CN115202471A (en)
WO (1) WO2023246305A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023246305A1 (en) * 2022-06-21 2023-12-28 京东方科技集团股份有限公司 Whole-body posture tracking and haptic device and virtual reality system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10296086B2 (en) * 2015-03-20 2019-05-21 Sony Interactive Entertainment Inc. Dynamic gloves to convey sense of touch and movement for virtual objects in HMD rendered environments
CN106227339A (en) * 2016-08-16 2016-12-14 西安中科比奇创新科技有限责任公司 wearable device, virtual reality human-computer interaction system and method
CN107632699B (en) * 2017-08-01 2019-10-11 东南大学 Natural human-machine interaction system based on the fusion of more perception datas
CN107943282A (en) * 2017-11-06 2018-04-20 上海念通智能科技有限公司 A kind of man-machine interactive system and method based on augmented reality and wearable device
CN113190114B (en) * 2021-04-14 2022-05-20 三峡大学 Virtual scene experience system and method with haptic simulation and emotional perception
CN115202471A (en) * 2022-06-21 2022-10-18 京东方科技集团股份有限公司 Whole body posture tracking and touch equipment and virtual reality system


Also Published As

Publication number Publication date
WO2023246305A1 (en) 2023-12-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination