CN114265503B - Texture rendering method applied to pen-type vibration touch feedback device - Google Patents

Texture rendering method applied to pen-type vibration touch feedback device

Publication number: CN114265503B (application CN202111592676.0A; earlier publication CN114265503A)
Authority: CN (China)
Legal status: Active (application filed by and granted to Jilin University)
Inventors: Yan Xuezhi (燕学智), Sun Chongyang (孙重阳), Sun Xiaoying (孙晓颖), Zhao Yu (赵昱), Nai Weizhi (佴威至), Wang Qinglong (王庆龙), Zhang Miao (张淼)
Assignee (original and current): Jilin University
Classification: Y02D 10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management
Landscape: User Interface Of Digital Computer (AREA)
Abstract

The invention relates to a texture rendering method applied to a pen-type vibration touch feedback device, and belongs to the field of virtual reality and human-machine interaction. The method collects haptic data during interaction with a real texture, divides the acceleration into equal-length haptic data segments, calculates their novelty, and extracts acceleration data sub-segments. During haptic rendering, haptic rendering data such as speed, normal force, direction and inclination angle are collected in real time, a mapping model between the haptic rendering data and the acceleration sub-segments is established, and a driving signal is generated, presenting the haptic effect of the texture to the operator. Advantages: the acceleration data is modeled as a time-domain signal, sub-segments are divided and extracted directly from the collected acceleration data, and the driving signal is synthesized by splicing these sub-segments, preserving the fidelity of the original acceleration data; the rendering effect is therefore more real and natural, and the realism of haptic texture reproduction is effectively improved.

Description

Texture rendering method applied to pen-type vibration touch feedback device
Technical Field
The invention belongs to the field of virtual reality and human-machine interaction, and specifically relates to a texture rendering method applied to a pen-type vibration touch feedback device.
Background
Haptic feedback is an important technology in human-machine interaction: it lets a person perceive and manipulate virtual objects through touch and, combined with audiovisual feedback, helps the person obtain richer information and sensations, improving the quality of interaction with the virtual world. It has broad application prospects in education and teaching, healthcare, entertainment and games, among other fields.
The texture of an object's surface plays an important role in how humans perceive and recognize the object, and texture haptic rendering algorithms have long been a hot research topic in the haptic feedback field. Among existing texture haptic rendering algorithms, graphics-based force-model rendering methods use computer graphics to model the microstructure of the texture surface and compute the corresponding haptic rendering signal from the contact state during rendering. Minsky et al., in the 1990 paper "Feeling and Seeing: Issues in Force Display", designed a tangential force-gradient algorithm in a two-dimensional plane using the Sandpaper system and rendered force feedback of a textured surface with a two-degree-of-freedom force joystick, the rendered force being derived from the texture surface height profile. However, because texture surface structures are complex, an accurate micro-model is difficult to build, and a complex micro-model increases the computational load, causing excessive delay of the output signal during rendering and degrading the user experience.
Image-feature-extraction rendering methods use image processing to extract one or more features of a real texture image and establish a mapping between those features and the haptic rendering signal. Vasudevan et al., in the 2008 paper "Tangible Images: Runtime Generation of Haptic Textures from Images", applied an edge detection algorithm to gray-scale images and proposed a haptic rendering mask design, whereby a user perceives the texture and contours of an image through a haptic feedback device. Li et al., in the 2010 paper "Image-based Haptic Texture Rendering", proposed a color-image texture haptic reproduction method that models the normal-force and tangential-force haptics from the color temperature and local color variation of the image, respectively. However, the signals output by these rendering methods are unstable, and the color information of an image is easily affected by other parameters, which disturbs the feature extraction, so the generated signals cannot accurately render the feel of the texture.
Data-driven rendering methods build a haptic model from information collected during interaction with real textures, and the haptic feedback device outputs haptic rendering signals according to the real-time interaction information and the model. Nai et al., in the 2021 paper "Vibrotactile Feedback Rendering of Patterned Textures Using a Waveform Segment Table Method", collect acceleration signals under interaction conditions with different normal forces and speeds, select and store the most representative signal segments using dynamic time warping, and, during haptic rendering, use the normal force and speed collected in real time to determine and generate the signal segments output to the haptic feedback device. Their haptic feedback device consists of a tablet panel, a stylus, a power amplifier and a vibration actuator; the panel reads the force and position of the hand-held stylus in real time as it slides. Culbertson et al., in the 2014 paper "Modeling and Rendering Realistic Textures from Unconstrained Tool-Surface Interaction", use a haptic recording device consisting of a six-axis force/torque sensor, a two-axis accelerometer, an electromagnetic motion tracking sensor and a pen body to collect acceleration signals of isotropic textures under arbitrary normal force and speed interaction conditions, model the amplitude spectrum of the collected acceleration signals with an LPC model, and, during rendering, generate acceleration signals by interpolation from the speed and normal force collected in real time, which are output to a pen-type haptic feedback device to produce haptic feedback. That device comprises a Wacom Cintiq 12WX tablet, an interactive pen and a vibration actuator; the Wacom device measures the force and position of the pen tip as the hand-held pen slides. Shin et al., in the 2015 paper "Data-Driven Modeling of Isotropic Haptic Textures Using Frequency-Decomposed Neural Networks", proposed modeling texture acceleration signals acquired under specific normal force and speed interaction conditions with a frequency-decomposed neural network, applicable to both isotropic and anisotropic textures. Abdulali et al., in the 2016 paper "Data-Driven Modeling of Anisotropic Haptic Textures: Data Segmentation and Interpolation", propose a rendering method for anisotropic textures that expands the two interaction conditions of normal force and speed during data acquisition into two-dimensional vectors and interpolates the input data with a radial basis function network; during data acquisition, an accelerometer and a force sensor are mounted on a Phantom Premium 1.0, and the acceleration, force and position data available from the Phantom Premium 1.0 are collected.
These methods extract amplitude-spectrum features of the acceleration signal, represent them with statistical parameters, predict those parameters by interpolation during rendering, and then recover the amplitude-spectrum features from the predicted parameters. However, they recover only the amplitude spectrum and ignore the time-domain characteristics of the acceleration signal: the generated acceleration signal does not match the acquired signal in the time domain, and interpolation cannot accurately predict the parameters of the synthesized signal, so the rendering effect is not ideal.
Disclosure of Invention
The invention provides a texture haptic rendering method applied to a pen-type vibration haptic feedback device: acceleration data and other haptic data are collected during interaction with a real texture, the acceleration data is divided into sub-segments, a mapping model between the haptic rendering data collected during real-time interaction and these sub-segments is established, and the sub-segments are output in real time, realizing the haptic rendering of the texture.
The technical scheme adopted by the invention comprises the following steps:
(1) Collecting haptic data while interacting with a real texture, comprising: acceleration $a$, first velocity $v_1$, first normal force $F_1$, first direction $D_1$ and first inclination angle $\theta_1$;
(2) Determining a segmentation time length $L$, and dividing the haptic data acceleration $a$ into $N'$ equal-length haptic data segments according to $L$;
(3) Retaining $N''$ haptic data segments from the $N'$ equal-length haptic data segments, and determining the novelty of each of the $N''$ haptic data segments;
(4) Re-dividing the $N''$ haptic data segments into $M$ candidate sub-segments according to the novelty;
(5) Retaining $M_1$ candidate sub-segments from the $M$ candidate sub-segments, and determining the transition probability between every two of the $M_1$ candidate sub-segments;
(6) Collecting the haptic rendering data at the current moment in real time during the rendering interaction, comprising: second velocity $v_2$, second normal force $F_2$, second direction $D_2$ and second inclination angle $\theta_2$;
(7) Determining the rendering segment at the current moment from the $M_1$ candidate sub-segments according to the transition probabilities and the current haptic rendering data, and outputting the rendering segment in real time to perform haptic rendering of the texture.
The step (1) of the invention, collecting the haptic data during interaction with the real texture, specifically comprises:
the acceleration $a$ is acquired in real time by a multi-axis accelerometer fixed on the texture haptic data acquisition device;
the first velocity $v_1$ is acquired in real time by a positioning device fixed on the texture haptic data acquisition device;
the first normal force $F_1$ is acquired in real time by a force sensor fixed on the texture haptic data acquisition device;
the first direction $D_1$ is acquired in real time by a positioning device fixed on the texture haptic data acquisition device;
the first inclination angle $\theta_1$ is acquired in real time by an attitude sensor fixed on the texture haptic data acquisition device.
The step (2) of the invention, determining the segmentation time length $L$, specifically comprises:
performing a Fourier transform on the haptic data acceleration $a$, selecting the frequency point $P$ with the largest spectral amplitude, taking its reciprocal as the segmentation time length $L = 1/P$, and dividing the haptic data acceleration $a$ into the $N'$ equal-length haptic data segments according to $L$.
The step (3) of the invention, determining the novelty of each of the $N''$ haptic data segments, specifically comprises:
determining the $N'$ feature vectors corresponding to the $N'$ equal-length haptic data segments, and determining the first similarity between every two of the $N'$ equal-length haptic data segments from these feature vectors;
taking each of the $N'$ equal-length haptic data segments as a center and selecting the $m$ adjacent haptic data segments before and after it to form a haptic data segment group; retaining the $N''$ haptic data segments for which such a group exists, and determining the second similarity between every two of the $N''$ haptic data segments from the first similarities between the members of their groups;
determining the novelty of each of the $N''$ equal-length haptic data segments according to the second similarity.
The step (4) of the invention, re-dividing the $N''$ haptic data segments into $M$ candidate sub-segments, specifically comprises:
connecting the novelty values of the $N''$ equal-length haptic data segments into a novelty curve; taking the maximum points of the curve above a threshold $T_0$ as breakpoints, taking the sub-signal between two adjacent breakpoints as a candidate sub-segment, and thereby re-dividing the haptic data acceleration $a$ into $M$ candidate sub-segments.
The step (5) of the invention, determining the transition probability between every two of the $M_1$ candidate sub-segments, specifically comprises:
determining a third similarity between every two of the $M$ candidate sub-segments according to the second similarity;
taking each of the $M$ candidate sub-segments as a center and selecting the $n$ adjacent candidate sub-segments before and after it; retaining the $M_1$ candidate sub-segments for which this is possible, and determining a fourth similarity between every two of the $M_1$ candidate sub-segments according to the third similarity;
determining the transition probability between every two of the $M_1$ candidate sub-segments according to the fourth similarity.
The step (6) of the invention, collecting the haptic rendering data at the current moment in real time during the rendering interaction, specifically comprises:
the second velocity $v_2$ is acquired by a positioning device fixed on the texture haptic rendering device;
the second normal force $F_2$ is acquired in real time by a force sensor fixed on the texture haptic rendering device;
the second direction $D_2$ is acquired by a positioning device fixed on the texture haptic rendering device;
the second inclination angle $\theta_2$ is acquired by an attitude sensor fixed on the texture haptic rendering device;
and the haptic rendering vector at the current moment is constructed from the haptic rendering data at the current moment.
The step (7) of the invention, outputting the rendering segment in real time to perform the haptic rendering of the texture, specifically comprises:
determining, for each of the $M_1$ candidate sub-segments, the average velocity $\bar{v}$, average normal force $\bar{F}$, average direction $\bar{D}$ and average inclination angle $\bar{\theta}$ of the texture haptic data acquisition device over the segment's duration $t$, and constructing the haptic vector of the candidate sub-segment;
determining the Euclidean distance between the haptic vector of each candidate sub-segment and the haptic rendering vector at the current moment;
selecting the candidate sub-segments whose Euclidean distances are larger than a threshold Euclidean distance as the candidate rendering segment set;
determining the rendering segment at the current moment according to the transition probability between the rendering segment at the previous moment and the candidate rendering segments in the candidate rendering segment set, which specifically comprises:
1) calculating the first maximum transition probability

$$P_{\max} = \max_{C_i \in C} P_{e \to C_i}$$

wherein $C_i$ is a candidate rendering segment in the candidate rendering segment set $C$, $P_{e \to C_i}$ is the transition probability from the rendering segment $e$ at the previous moment to the candidate rendering segment $C_i$, and $P_{\max}$ is the maximum transition probability between the rendering segment at the previous moment and the candidate rendering segments in the set; and selecting the candidate rendering segment corresponding to the first maximum transition probability as the rendering segment at the current moment;
2) alternatively, calculating a second maximum transition probability $P'_{\max}$ from the transition probabilities $P_{e \to C_i}$, wherein $\mu$ is the number of rendering segments adjacent to the rendering segment $e$ at the previous moment in the haptic data acceleration $a$; and selecting the candidate rendering segment corresponding to the second maximum transition probability as the rendering segment;
3) alternatively, calculating a third maximum transition probability from the transition probabilities $P_{e \to C_i}$, the maximum value $P_{\max}$, a transition probability threshold $P_0$, and the number $\mu$ of rendering segments adjacent to $e$ in the haptic data acceleration $a$; and selecting the candidate rendering segment corresponding to the third maximum transition probability as the rendering segment.
The pen-type vibrotactile feedback device of the invention comprises a pen-type texture haptic data acquisition part and a pen-type vibrotactile feedback part, wherein:
1) The pen-type texture haptic data acquisition part includes:
a pen body, an accelerometer, a position sensor, a force sensor and an attitude sensor, used to collect the haptic data during interaction with the real texture, including: acceleration $a$, first velocity $v_1$, first normal force $F_1$, first direction $D_1$ and first inclination angle $\theta_1$;
a processing module, used to determine the segmentation time length $L$ and divide the acceleration into $N'$ equal-length haptic data segments according to it; to determine the novelty of each haptic data segment; to re-divide the acceleration into $M_1$ candidate sub-segments according to the novelty; and to determine the transition probability between every two candidate sub-segments.
2) The pen-type vibrotactile feedback part includes:
a pen body, a position sensor, a force sensor and an attitude sensor, used to collect the haptic rendering data at the current moment in real time during the rendering interaction, including: second velocity $v_2$, second normal force $F_2$, second direction $D_2$ and second inclination angle $\theta_2$;
a processing module, used to determine the rendering segment from the candidate sub-segments according to the transition probabilities and the current haptic rendering data, and to convert the rendering segment into a driving signal output in real time;
a power amplifier, used to amplify the driving signal generated by the processing module and output it to the vibration actuator;
a vibration actuator, used to receive the driving signal output by the power amplifier in real time and provide the corresponding haptic feedback, performing the haptic rendering of the texture;
and an interaction panel, on which the user holds the pen body and performs interactions such as sliding.
The invention has the following beneficial effects:
1. The texture haptic rendering method directly screens acceleration data sub-segments for rendering according to the current interaction conditions, without interpolation; the driving signal can therefore be predicted more accurately, effectively improving the realism of texture haptic reproduction.
2. The texture haptic rendering method models the acceleration data as a time-domain signal: the collected acceleration data is divided into sub-segments, which are spliced into the driving signal during haptic rendering, preserving the fidelity of the collected acceleration data and making the rendering effect more natural.
3. The texture haptic rendering method is highly general and can be extended to other planar haptic feedback devices, such as electrostatic-force haptic feedback devices and pneumatic-film haptic feedback devices.
Drawings
FIG. 1 is a schematic diagram of a pen-type vibrotactile feedback device of the present invention;
FIG. 2 is a flow chart of the present invention;
FIG. 3 is a schematic diagram of the collection of haptic data while interacting with a real texture;
FIG. 4 is a schematic diagram of the amplitude spectrum after Fourier transforming the haptic data acceleration a;
FIG. 5 is a schematic diagram of the division of haptic data acceleration a into haptic data segments;
FIG. 6 is a feature vector calculation flow diagram of a haptic data segment;
FIG. 7 is a schematic diagram of candidate sub-fragments;
FIG. 8 is a schematic diagram of collecting the haptic rendering data at the current moment in real time during the rendering interaction.
Detailed Description
The present invention will be further described with reference to the drawings and specific examples, and it should be noted that these are not intended to limit the scope of the present disclosure.
FIG. 1 is a schematic diagram of the pen-type vibrotactile feedback device of the present invention, comprising: an attitude sensor 101, a position sensor 102, a pen body 103, a vibration actuator 104, a force sensor 105, a processing module 106, a power amplifier 107 and an interaction panel 108. On the basis of the haptic feedback device of Nai et al. in the 2021 paper "Vibrotactile Feedback Rendering of Patterned Textures Using a Waveform Segment Table Method", the invention adds the attitude sensor 101. The specific workflow is as follows:
the user holds the pen body 103 and performs interactions such as sliding on the interaction panel 108; the attitude sensor 101, position sensor 102 and force sensor 105 acquire in real time the second inclination angle $\theta_2$, second velocity $v_2$, second direction $D_2$ and second normal force $F_2$, i.e. the haptic rendering data at the current moment, and transmit them to the processing module; the processing module determines the rendering segment from the candidate sub-segments according to the transition probabilities and the current haptic rendering data, converts it into a driving signal and outputs it to the power amplifier in real time; the power amplifier amplifies the driving signal and outputs it to the vibration actuator; the vibration actuator receives the driving signal in real time and provides the corresponding haptic feedback, performing the haptic rendering of the texture.
FIG. 2 is a flow chart of a rendering method of the present invention, which includes steps 1 to 7;
step 1, collecting touch data when interacting with a real texture, comprising: acceleration a, first velocity v 1 First normal force F 1 First direction D 1 And a first inclination angle theta 1 As shown in fig. 3.
In this step, the acceleration $a$ is acquired in real time by a multi-axis accelerometer fixed on the texture haptic data acquisition device 302, at a sampling rate $S_0$.
The first velocity $v_1$ is acquired in real time by a positioning device fixed on the texture haptic data acquisition device 302, at a sampling rate $S_1$. Specifically, from the position 303 $(x_1, y_1)$, the position 304 $(x'_1, y'_1)$ and the time interval $t_1$ between them, the first velocity across the real texture surface 301 is

$$v_1 = \frac{\sqrt{(x'_1 - x_1)^2 + (y'_1 - y_1)^2}}{t_1}$$

The first normal force $F_1$ is acquired in real time by a force sensor fixed on the texture haptic data acquisition device 302, at a sampling rate $S_2$.
The first direction $D_1$ is acquired in real time by a positioning device fixed on the texture haptic data acquisition device 302, at a sampling rate $S_3$. Specifically, from the position 303 $(x_1, y_1)$ and the position 304 $(x'_1, y'_1)$, the first direction is

$$D_1 = \arctan\frac{y'_1 - y_1}{x'_1 - x_1}$$

The first inclination angle $\theta_1$ is acquired in real time by an attitude sensor fixed on the texture haptic data acquisition device 302, at a sampling rate $S_4$.
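By way of illustration only, the kinematics of this step can be sketched as follows (a minimal sketch assuming two successive stylus positions sampled $t_1$ seconds apart; the function name and the use of a four-quadrant arctangent for $D_1$ are choices of this sketch, not prescribed by the invention):

```python
import math

def velocity_and_direction(x1, y1, x1p, y1p, t1):
    """First velocity v1 and first direction D1 from positions (x1, y1) and
    (x1p, y1p) recorded t1 seconds apart."""
    v1 = math.hypot(x1p - x1, y1p - y1) / t1   # sliding speed over the texture
    D1 = math.atan2(y1p - y1, x1p - x1)        # sliding direction, in radians
    return v1, D1
```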
step 2, determining a segmentation time length L, and dividing the haptic data acceleration a in the step 1 into haptic data segments with N' equal length according to the segmentation time length L;
in this step, the haptic data acceleration a is fourier transformed, and as shown in fig. 4, a frequency point P401 having the largest spectral amplitude value is selected, and the inverse of the frequency point is taken as the segmentation time length L. As shown in fig. 5, from the time starting point t 0 501, the haptic data acceleration a is divided into N haptic data segments in accordance with the segment time length L502. In the segmentation process, if the length of the haptic data segment does not reach L, discarding the haptic data segment, and finally reserving N' haptic data segments with equal length. If no haptic data segment is discarded during the segmentation, then N' =n; with haptic data segments discarded, N' <N。
Step 3, retaining $N''$ haptic data segments from the $N'$ haptic data segments obtained in step 2, and determining the novelty of each of the $N''$ haptic data segments.
In an alternative embodiment, step 3 specifically includes:
Step 301, determining the first similarity between every two of the $N'$ equal-length haptic data segments. Specifically, each of the $N'$ equal-length haptic data segments is fast-Fourier transformed, and mel cepstrum coefficients computed from the FFT result serve as its feature vector $X$, yielding the $N'$ feature vectors corresponding to the $N'$ equal-length haptic data segments; the calculation process is shown in FIG. 6. For any two haptic data segments $a1_i$ and $a1_j$, the correlation between their feature vectors $X_i$ and $X_j$ is taken as the first similarity $b_{i,j}$ between them:

$$b_{i,j} = \frac{X_i \cdot X_j}{\|X_i\|\,\|X_j\|}$$

Step 302, from the first similarity $b_{i,j}$ between two equal-length haptic data segments $a1_i$ and $a1_j$ determined in step 301, the second similarity between the two segments is determined, taking the neighborhoods of both into account: with the $i$-th haptic data segment as center, a haptic data segment group $A_i$ is formed from the $m$ preceding and the $m$ following adjacent haptic data segments, and likewise a group $A_j$ centered on the $j$-th haptic data segment. If the $i$-th haptic data segment $a1_i$ or the $j$-th haptic data segment $a1_j$ has fewer than $m$ haptic data segments before or after it, it is discarded, so that $N''$ equal-length haptic data segments are finally retained: $N'' = N'$ if no segment is discarded, and $N'' < N'$ otherwise. The first similarities $b_{i+k,j+k}$ between the haptic data segments $a2_{i+k}$ of group $A_i$ and $a2_{j+k}$ of group $A_j$ are weighted and summed as the second similarity $b'_{i,j}$ between the $i$-th haptic data segment $a1_i$ and the $j$-th haptic data segment $a1_j$, with $k$ running over all integers in $[-m, m]$:

$$b'_{i,j} = \sum_{k=-m}^{m} \omega_k\, b_{i+k,\,j+k}$$

wherein $m$ is an integer between 1 and 10 and $[\omega_{-m}, \ldots, \omega_m]$ is a weighting window giving the weights of the $2m+1$ haptic data segments within one group; the weighting window may be a rectangular, Hann, Hamming, triangular or Gaussian window.
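A minimal sketch of steps 301 and 302 under stated assumptions: `X` already holds one mel-cepstrum feature vector per haptic data segment, the correlation is taken as the normalized inner product given above, and `w` is a length-$(2m+1)$ weighting window (e.g. a Hann window):

```python
import numpy as np

def first_similarity(X):
    """b[i, j]: correlation between the feature vectors of segments i and j.
    X: (N', d) array holding one feature vector per haptic data segment."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def second_similarity(b, m, w):
    """b'[i, j]: weighted sum of b[i+k, j+k] for k in [-m, m]; segments with
    fewer than m neighbors are discarded, leaving N'' segments."""
    n = b.shape[0]
    ks = np.arange(-m, m + 1)
    b2 = np.zeros((n - 2 * m, n - 2 * m))
    for i in range(m, n - m):
        for j in range(m, n - m):
            b2[i - m, j - m] = np.sum(w * b[i + ks, j + ks])
    return b2
```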
Step 303, from the second similarities $b'_{i,j}$ between haptic data segments $a1_i$ and $a1_j$ determined in step 302, an $N'' \times N''$ similarity matrix $b = (b'_{i,j})$ is obtained, and the novelty $N(i)$ of the haptic data segment $a1_i$ is determined as

$$N(i) = \sum_{k_1=-h/2}^{h/2}\;\sum_{k_2=-h/2}^{h/2} K_{k_1 k_2}\, b'_{i+k_1,\, i+k_2}$$

wherein $K_{k_1 k_2}$ is an $h \times h$ kernel matrix, $h$ is an integer between 1 and 10, $b'_{i+k_1, i+k_2}$ is the second similarity between haptic data segments $a1_{i+k_1}$ and $a1_{i+k_2}$, and $k_1$ and $k_2$ each run over all integers in $[-h/2, h/2]$.
Specifically, the kernel matrix $K_{k_1 k_2}$ may be a two-dimensional window function, or a checkerboard-type block matrix of the form

$$K = \begin{pmatrix} A & -A \\ -A & A \end{pmatrix}$$

wherein $A$ is an $h/2 \times h/2$ matrix (for example, a matrix of ones).
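A minimal sketch of the novelty computation of step 303, assuming the checkerboard kernel above with $A$ a block of ones; an $h \times h$ window is used (a slight simplification of the $[-h/2, h/2]$ index range), and the final normalization anticipates step 4:

```python
import numpy as np

def checkerboard_kernel(h):
    """h x h kernel: +1 blocks on the diagonal, -1 blocks off the diagonal."""
    A = np.ones((h // 2, h // 2))
    return np.block([[A, -A], [-A, A]])

def novelty_curve(b2, h):
    """Correlate the kernel along the main diagonal of the N'' x N''
    second-similarity matrix b2, then normalize the curve to [0, 1]."""
    K = checkerboard_kernel(h)
    half = h // 2
    n = b2.shape[0]
    N = np.zeros(n)
    for i in range(half, n - half):
        N[i] = np.sum(K * b2[i - half:i + half, i - half:i + half])
    N -= N.min()
    if N.max() > 0:
        N /= N.max()
    return N
```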
step 4, according to the novelty obtained in step 303, the N "segment haptic data segment obtained in step 302 is re-segmented into M Duan Houxuan sub-segments.
In this step, the haptic data segment a1 obtained according to step 303 i And (3) the novelty value is connected with the novelty value of the N' section of the tactile data section to obtain a novelty curve, and the novelty curve is normalized. In the novelty curve, will be above the threshold T 0 Is taken as a breakpoint, T 0 The value range of (2) is [0.4,0.8 ]]The sub-segment between the two break points is used as a candidate sub-segment to obtain a candidate sub-segment sequence lambda consisting of M segments of candidate sub-segments.
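A minimal sketch of this re-division, assuming `segments` is the $(N'', \mathrm{seg\_len})$ array of haptic data segments aligned with the novelty curve `N`, and taking $T_0$ from the middle of the stated range:

```python
import numpy as np

def candidate_subsegments(segments, N, T0=0.6):
    """Local maxima of the normalized novelty curve above T0 become
    breakpoints; the signal between adjacent breakpoints becomes one
    candidate sub-segment (the sequence lambda)."""
    peaks = [i for i in range(1, len(N) - 1)
             if N[i] > T0 and N[i] >= N[i - 1] and N[i] >= N[i + 1]]
    bounds = [0] + peaks + [len(segments)]
    return [np.concatenate(segments[s:e])
            for s, e in zip(bounds[:-1], bounds[1:]) if e > s]
```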
Step 5, retaining $M_1$ candidate sub-segments from the $M$ candidate sub-segments obtained in step 4, and determining the transition probability between every two of the $M_1$ candidate sub-segments.
In an alternative embodiment, step 5 specifically includes:
Step 501, determining the third similarity between any two candidate sub-segments $\alpha$ and $\beta$ of the $M$ candidate sub-segments. Specifically, as shown in FIG. 7, suppose that candidate sub-segment $\alpha$ 702 contains $Q$ haptic data segments, beginning with the $q$-th haptic data segment 701 and ending with the $(q+Q)$-th haptic data segment 703, and that candidate sub-segment $\beta$ 705 contains $R$ haptic data segments, beginning with the $r$-th haptic data segment 704 and ending with the $(r+R)$-th haptic data segment 706, with $Q < R$. The third similarity $B'_{\alpha,\beta}$ between candidate sub-segment $\alpha$ 702 and candidate sub-segment $\beta$ 705 is

$$B'_{\alpha,\beta} = \sum_{k=1}^{Q} \omega'_k\, b'_{q+k,\; r+[kR/Q]}$$

wherein $b'_{q+k,\,r+[kR/Q]}$ is the second similarity between haptic data segments $a1_{q+k}$ and $a1_{r+[kR/Q]}$, $\omega'_k$ is a weighting coefficient, $k$ runs over all integers in $[1, Q]$, and $[\omega'_1, \ldots, \omega'_Q]$ is a weighting window giving the weight of each of the $Q$ haptic data segments; the weighting window may be a rectangular, Hann, Hamming, triangular or Gaussian window.
Step 502, the $n$ candidate sub-segments before and after candidate sub-segments $\alpha$ and $\beta$ in the $M$ candidate sub-segments are taken into account; any candidate sub-segment with fewer than $n$ sub-segments before or after it is discarded, yielding a candidate sub-segment sequence $\lambda_1$ consisting of $M_1$ candidate sub-segments. From the third similarity of step 501, the fourth similarity $B''_{\alpha_1,\beta_1}$ between candidate sub-segments $\alpha_1$ and $\beta_1$ of the $M_1$ candidate sub-segments is determined as

$$B''_{\alpha_1,\beta_1} = \sum_{k=-n}^{n} \omega''_k\, B'_{\alpha_1+k,\;\beta_1+k}$$

wherein $\omega''_k$ is a weighting coefficient, $k$ runs over all integers in $[-n, n]$, and $[\omega''_{-n}, \ldots, \omega''_n]$ is a weighting window giving the weight of each of the $2n+1$ candidate sub-segments; the weighting window may be a rectangular, Hann, Hamming, triangular or Gaussian window, and $n$ is an integer between 1 and 10.
Step 503, determining the transition probability $P_{\alpha_1 \to \beta_1}$ from candidate sub-segment $\alpha_1$ to candidate sub-segment $\beta_1$ of the $M_1$ candidate sub-segments. The transition probability is related to the fourth similarity $B''_{\alpha_1+1,\beta_1}$ between candidate sub-segment $\alpha_1 + 1$ and candidate sub-segment $\beta_1$ through an exponential relationship:

$$P_{\alpha_1 \to \beta_1} = A\, \exp\!\left(\frac{B''_{\alpha_1+1,\;\beta_1}}{\tau}\right)$$

where $A$ is a normalization constant and $\tau$ is a scaling parameter.
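A minimal sketch of step 503, assuming the $M_1 \times M_1$ fourth-similarity matrix is already computed and choosing the normalization constant $A$ so that each row of the transition matrix sums to one (the normalization is otherwise left open above):

```python
import numpy as np

def transition_probabilities(B4, tau=1.0):
    """P[alpha, beta] proportional to exp(B''(alpha + 1, beta) / tau),
    normalized per row. B4: (M1, M1) fourth-similarity matrix."""
    M1 = B4.shape[0]
    P = np.zeros((M1, M1))
    for alpha in range(M1):
        succ = min(alpha + 1, M1 - 1)  # fourth similarity is taken at alpha + 1
        P[alpha] = np.exp(B4[succ] / tau)
        P[alpha] /= P[alpha].sum()     # row normalization fixes the constant A
    return P
```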
Step 6, collecting the haptic rendering data at the current moment in real time during the rendering interaction, including: second velocity $v_2$, second normal force $F_2$, second direction $D_2$ and second inclination angle $\theta_2$, as shown in FIG. 8.
In this step, the second velocity $v_2$ is acquired by a positioning device on the texture haptic rendering device 802, at a sampling rate $S_1$. Specifically, from the position 803 $(x_2, y_2)$, the position 804 $(x'_2, y'_2)$ and the time interval $t_2$ between them, the second velocity across the surface of the interaction panel 801 is

$$v_2 = \frac{\sqrt{(x'_2 - x_2)^2 + (y'_2 - y_2)^2}}{t_2}$$

The second normal force $F_2$ is acquired in real time by a force sensor on the texture haptic rendering device 802, at a sampling rate $S_2$.
The second direction $D_2$ is acquired by a positioning device on the texture haptic rendering device 802, at a sampling rate $S_3$. Specifically, from the position 803 $(x_2, y_2)$ and the position 804 $(x'_2, y'_2)$, the second direction is

$$D_2 = \arctan\frac{y'_2 - y_2}{x'_2 - x_2}$$

The second inclination angle $\theta_2$ is acquired by an attitude sensor on the texture haptic rendering device 802, at a sampling rate $S_4$.
The acquired haptic rendering data form the haptic rendering vector $(v_2, F_2, D_2, \theta_2)$.
Step 7, determining the rendering segment at the current moment from the $M_1$ candidate sub-segments according to the transition probabilities of step 5 and the haptic rendering data at the current moment of step 6, and outputting the rendering segment in real time to perform haptic rendering of the texture.
In an alternative embodiment, step 7 may specifically include:
step 701, determining M 1 Average speed of texture haptic data acquisition device for time t for each of segment candidate sub-segmentsSpecifically, the calculation formula is:
wherein v is 11 Representing a first velocity v for a time t for which each candidate sub-segment lasts 1 V 1S1×t Representing a first velocity v for a time t for which each candidate sub-segment lasts 1 S1 x t sample point values of (c).
Determination of M 1 Average normal force of texture haptic data acquisition device for time t for each of segment candidate sub-segmentsSpecifically, the calculation formula is:
wherein F is 11 Representing the first normal force F for the duration t of each candidate sub-segment 1 Is the first sample point value of F 1S2×t Representing the first normal force F for the duration t of each candidate sub-segment 1 S2 x t sample point values of (c).
Determination of M 1 Average direction of texture haptic data acquisition device for time t for each of segment candidate sub-segmentsSpecifically, the calculation formula is:
wherein D is 11 Representing a first direction D for a time t for which each candidate sub-segment lasts 1 D is the first sample point value of (D) 1S3×t Representing a first direction D for a time t for which each candidate sub-segment lasts 1 S3 x t sample point values of (c).
Determination of M 1 Average tilt angle of texture haptic data acquisition device for time t for each candidate sub-segment of segment candidate sub-segmentsSpecifically, the calculation formula is:
wherein θ 11 Representing a first tilt angle theta for a time t for which each candidate sub-segment lasts 1 And θ 1S4×t Representing a first tilt angle theta for a time t for which each candidate sub-segment lasts 1 S4 x t sample point values;
rendering vectors that make up the candidate sub-segment
Step 702, sequentially calculating the Euclidean distance $d$ between the current haptic rendering vector $(v_2, F_2, D_2, \theta_2)$ and the rendering vector $(\bar{v}, \bar{F}, \bar{D}, \bar{\theta})$ of each of the $M_1$ candidate sub-segments:

$$d = \sqrt{(v_2 - \bar{v})^2 + (F_2 - \bar{F})^2 + (D_2 - \bar{D})^2 + (\theta_2 - \bar{\theta})^2}$$

Let $d_{\max}$ be the maximum Euclidean distance over the $M_1$ candidate sub-segments, and let the threshold $d_0$ range over $[0.4\,d_{\max}, 0.8\,d_{\max}]$. The candidate sub-segments among the $M_1$ whose Euclidean distance $d$ is higher than the threshold $d_0$ are selected to form the candidate rendering segment set $C$, whose elements are the candidate rendering segments $C_i$.
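A minimal sketch of steps 701 and 702 under stated assumptions: each candidate sub-segment's rendering vector has been assembled from the sample averages above, and the above-threshold selection follows the text as written:

```python
import numpy as np

def candidate_rendering_set(render_vectors, current, frac=0.6):
    """render_vectors: (M1, 4) array of per-sub-segment averages
    (v_bar, F_bar, D_bar, theta_bar); current: the haptic rendering vector
    (v2, F2, D2, theta2). Returns the indices of the candidate rendering
    segment set C, selected with threshold d0 = frac * d_max."""
    d = np.linalg.norm(render_vectors - np.asarray(current), axis=1)
    d0 = frac * d.max()
    return np.nonzero(d > d0)[0]
```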
Step 703, determining the rendering segment $g$ at the current moment according to the transition probability between the rendering segment $e$ at the previous moment and the candidate rendering segments $C_i$ in the candidate rendering segment set $C$.
Specifically, the first maximum transition probability may be calculated as

$$P_{\max} = \max_{C_i \in C} P_{e \to C_i}$$

wherein $P_{e \to C_i}$ is the transition probability from the rendering segment $e$ at the previous moment to the candidate rendering segment $C_i$, and $P_{\max}$ is the maximum transition probability between the rendering segment $e$ at the previous moment and the candidate rendering segments; the candidate rendering segment corresponding to the first maximum transition probability is selected as the rendering segment $g$ at the current moment.
Alternatively, a second maximum transition probability $P'_{\max}$ may be calculated from the transition probabilities $P_{e \to C_i}$ from the rendering segment $e$ at the previous moment to the candidate rendering segments $C_i$, wherein $\mu$ is the number of rendering segments adjacent to the rendering segment $e$ in the haptic data acceleration $a$, with values in $[1, 10]$; the candidate rendering segment corresponding to the second maximum transition probability is selected as the rendering segment $g$ at the current moment.
Alternatively, a third maximum transition probability may be calculated from the transition probabilities $P_{e \to C_i}$, the maximum value $P'_{\max}$, a transition probability threshold $P_0$ with values in $[0.4, 0.8]$, and the number $\mu$ of rendering segments adjacent to the rendering segment $e$ in the acceleration data $a$, with $\mu$ in $[1, 10]$; the candidate rendering segment corresponding to the third maximum transition probability is selected as the rendering segment $g$ at the current moment.
The rendering segment $g$ at the current moment is output in real time to perform the haptic rendering of the texture.
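A minimal sketch of strategy 1) of step 703 (the first maximum transition probability; strategies 2) and 3), which additionally involve $\mu$ and $P_0$, are not sketched here):

```python
import numpy as np

def next_rendering_segment(P, e, C):
    """Pick from the candidate rendering set C the segment with the largest
    transition probability from the previous rendering segment e.
    P: (M1, M1) transition matrix; C: array of candidate indices."""
    g = C[int(np.argmax(P[e, C]))]     # first maximum transition probability
    return g
```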
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and that such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (4)

1. A texture rendering method applied to a pen-type vibrotactile feedback device, comprising:
(1) Collecting haptic data while interacting with a real texture, comprising: acceleration a, first velocity v 1 First normal force F 1 First direction D 1 And a first inclination angle theta 1
(2) Determining a segmentation time length L, and dividing the haptic data acceleration a into haptic data segments with N' equal length according to the segmentation time length L;
(3) Reserving N ' haptic data segments according to the N ' haptic data segments with equal length respectively, and determining the novelty of each haptic data segment in the N ' haptic data segments;
(4) Re-dividing the N "segment haptic data segment into M Duan Houxuan sub-segments, respectively, according to the novelty;
(5) Reserving M according to the M segment candidate sub-fragments 1 Segment candidate sub-segments, determining the M 1 The transition probability between every two candidate sub-segments in the segment candidate sub-segments;
(6) The method for collecting touch rendering data at the current moment in real time during rendering interaction comprises the following steps: second speed v 2 Second normal force F 2 Second direction D 2 And a second inclination angle theta 2
(7) Haptic rendering data from the M according to the transition probability and the current time 1 Determining a rendering segment at the current moment in the segment candidate sub-segments, and outputting the rendering segment in real time to perform haptic rendering of textures;
the step (2) of determining the segmentation time length L specifically includes:
performing Fourier transformation on the haptic data acceleration a, selecting a frequency point P with the largest frequency spectrum amplitude value, taking the reciprocal of the frequency point as the segmentation time length L, and dividing the haptic data acceleration a into haptic data segments with the same length as the N' segments according to the segmentation time length L;
The step (3) of determining the novelty of each of the N "haptic data segments specifically includes:
respectively determining N 'eigenvectors corresponding to the N' equivalent haptic data segments, and determining a first similarity between every two haptic data segments in the N 'equivalent haptic data segments according to the N' eigenvectors;
taking each of the N' equal-length haptic data segments as a center and selecting the m adjacent haptic data segments before and after it to form extended haptic data segments; selecting the N'' haptic data segments from the N' equal-length haptic data segments, and determining a second similarity between every two of the N'' haptic data segments according to the first similarity between every two haptic data segments;
determining the novelty of each of the N'' haptic data segments according to the second similarity;
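A minimal sketch of one way to realise this novelty computation; the choices here are assumptions of the sketch, not the patent text: the feature vector of a segment is taken as its magnitude spectrum, the first similarity as cosine similarity, and the second similarity as the first similarity averaged over a context of m neighbouring segments.

```python
import numpy as np

def novelty_scores(segments, m=2):
    """Novelty of each equal-length haptic data segment.

    segments : list of equal-length 1-D arrays.
    m        : number of neighbouring segments on each side of the center.
    """
    feats = [np.abs(np.fft.rfft(s)) for s in segments]           # feature vectors
    feats = np.array([f / (np.linalg.norm(f) + 1e-12) for f in feats])
    S1 = feats @ feats.T                                          # first similarity (cosine)
    n = len(segments)
    novelty = np.zeros(n)
    for i in range(1, n):
        lo, hi = max(0, i - m), min(n, i + m + 1)
        # Second similarity: average first similarity between the context
        # around segment i and the segments that precede it; novelty is
        # high where this similarity drops.
        novelty[i] = 1.0 - S1[lo:hi, max(0, i - 1 - m):i].mean()
    return novelty
```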
The step (4) of re-dividing the N'' haptic data segments into M candidate sub-segments specifically includes:
connecting the novelty values of the N'' haptic data segments to obtain a novelty curve; taking the maximum points of the novelty curve that are higher than a threshold T_0 as breakpoints and the sub-segment between two breakpoints as a candidate sub-segment, thereby re-dividing the haptic data acceleration a into M candidate sub-segments;
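A minimal sketch of this peak-picking split, assuming seg_len is the segment length in samples and T0 a hand-chosen threshold (both assumed inputs of the sketch):

```python
import numpy as np

def split_at_novelty_peaks(a, novelty, seg_len, T0=0.3):
    """Re-divide the acceleration data a into candidate sub-segments.

    Local maxima of the novelty curve above the threshold T0 become
    breakpoints; the samples between consecutive breakpoints form one
    candidate sub-segment.
    """
    peaks = [i for i in range(1, len(novelty) - 1)
             if novelty[i] > T0
             and novelty[i] >= novelty[i - 1]
             and novelty[i] >= novelty[i + 1]]
    bounds = [0] + [p * seg_len for p in peaks] + [len(a)]
    return [a[bounds[k]:bounds[k + 1]] for k in range(len(bounds) - 1)]
```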
The step (5) of determining the transition probability between every two of the M_1 candidate sub-segments specifically includes:
determining a third similarity between every two candidate sub-segments in the M candidate sub-segments according to the second similarity;
taking each of the M candidate sub-segments as a center and selecting the n adjacent candidate sub-segments before and after it; selecting the M_1 candidate sub-segments from the M candidate sub-segments, and determining a fourth similarity between every two of the M_1 candidate sub-segments according to the third similarity;
determining the transition probability between every two of the M_1 candidate sub-segments according to the fourth similarity;
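As an illustration, a minimal sketch of turning a similarity matrix into transition probabilities; the row-wise softmax normalisation is an assumption of this sketch, as the patent does not spell out the mapping:

```python
import numpy as np

def transition_probabilities(S4):
    """Map the fourth-similarity matrix S4 (M1 x M1) to transition probabilities.

    Each row is normalised with a softmax so the probabilities of leaving
    a given segment sum to one; more similar segments are more likely to
    follow one another.
    """
    logits = S4 - S4.max(axis=1, keepdims=True)   # shift rows for numerical stability
    expd = np.exp(logits)
    return expd / expd.sum(axis=1, keepdims=True)
```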
In the step (7), outputting the rendering segment in real time to perform the haptic rendering of the texture specifically includes:
determining, for each of the M_1 candidate sub-segments, the average speed v̄, average normal force F̄, average direction D̄, and average inclination angle θ̄ of the texture haptic data acquisition device over the duration t of the candidate sub-segment, and constructing the haptic vector of the candidate sub-segment;
determining the Euclidean distance between the haptic vector of each candidate sub-segment and the haptic rendering vector at the current moment;
selecting the candidate sub-segments whose Euclidean distances are larger than a threshold Euclidean distance as the candidate rendering segment set;
determining the rendering segment at the current moment according to the transition probabilities between the rendering segment at the previous moment and the candidate rendering segments in the candidate rendering segment set;
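A minimal sketch of this matching step, assuming haptic_vectors stacks the per-segment averages (speed, normal force, direction, inclination) row-wise and query is the haptic rendering vector at the current moment; the comparison direction follows the claim's wording, though a nearest-neighbour variant would flip it:

```python
import numpy as np

def candidate_rendering_set(haptic_vectors, query, d0):
    """Select the candidate rendering segment set for the current moment.

    haptic_vectors : (M1 x 4) array, one haptic vector per candidate sub-segment.
    query          : haptic rendering vector built from the current sensor readings.
    d0             : threshold Euclidean distance.
    Returns the indices of the retained candidate sub-segments.
    """
    d = np.linalg.norm(haptic_vectors - query, axis=1)   # Euclidean distances
    return np.where(d > d0)[0]   # kept as in the claim: distances above the threshold
```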
Determining the rendering segment at the current moment according to the transition probabilities between the rendering segment at the previous moment and the candidate rendering segments in the candidate rendering segment set specifically includes:
1) The first maximum transition probability is calculated according to the following formula: P_max = max over C_i of P'(e→C_i),
wherein C_i is a candidate rendering segment in the candidate rendering segment set, P'(e→C_i) is the transition probability from the rendering segment e at the previous moment to the candidate rendering segment C_i, and P_max is the maximum of the transition probabilities between the rendering segment at the previous moment and the candidate rendering segments in the candidate rendering segment set;
selecting the candidate rendering segment corresponding to the first maximum transition probability as the rendering segment at the current moment;
2) Further comprising: calculating the second maximum transition probability according to the following formula:
wherein P'_max is the maximum of the transition probabilities between the rendering segment at the previous moment and the candidate rendering segments in the candidate rendering segment set, and μ is the number of rendering segments adjacent to the rendering segment e at the previous moment in the haptic data acceleration a;
selecting the candidate rendering segment corresponding to the second maximum transition probability as the rendering segment;
3) Further comprising: calculating the third maximum transition probability according to the following formula:
wherein P_max is the maximum of the transition probabilities between the rendering segment at the previous moment and the several candidate rendering segments, P_0 is the transition-probability threshold, and μ is the number of rendering segments adjacent to the rendering segment e at the previous moment in the haptic data acceleration a;
and selecting the candidate rendering segment corresponding to the third maximum transition probability as the rendering segment.
2. The texture rendering method applied to the pen-type vibrotactile feedback device according to claim 1, wherein the step (1) of collecting haptic data while interacting with a real texture specifically includes:
the acceleration a is acquired in real time through a multi-axis accelerometer fixed on the pen-type texture haptic data acquisition device;
the first speed v_1 is acquired in real time through a positioning device fixed on the pen-type texture haptic data acquisition device;
the first normal force F_1 is acquired in real time through a force sensor fixed on the pen-type texture haptic data acquisition device;
the first direction D_1 is acquired in real time through a positioning device fixed on the pen-type texture haptic data acquisition device;
the first inclination angle θ_1 is acquired in real time through an attitude sensor fixed on the pen-type texture haptic data acquisition device.
3. The texture rendering method applied to the pen-type vibrotactile feedback device according to claim 1, wherein the step (6) of collecting in real time the haptic rendering data at the current moment during the rendering interaction specifically includes:
the second speed v_2 is acquired through a positioning device fixed on the pen-type vibrotactile feedback device;
the second normal force F_2 is acquired in real time through a force sensor fixed on the pen-type vibrotactile feedback device;
the second direction D_2 is acquired through a positioning device fixed on the pen-type vibrotactile feedback device;
the second inclination angle θ_2 is acquired through an attitude sensor fixed on the pen-type vibrotactile feedback device;
and the haptic rendering vector at the current moment is formed from the haptic rendering data at the current moment.
4. The texture rendering method applied to a pen-type vibrotactile feedback device according to claim 1, wherein the pen-type vibrotactile feedback device comprises: a pen-type texture haptic data acquisition device and a pen-type vibrotactile feedback device, wherein:
1) The pen-type texture haptic data acquisition device comprises:
a pen body, an accelerometer, a positioning device, a force sensor, and an attitude sensor, for collecting the haptic data while interacting with a real texture, including: acceleration a, first velocity v_1, first normal force F_1, first direction D_1, and first inclination angle θ_1;
a processing module, for determining the segmentation time length L and dividing the acceleration into the N' equal-length haptic data segments according to the segmentation time length; and
determining the novelty of each of the haptic data segments, and re-dividing the acceleration into the M_1 candidate sub-segments according to the novelty; and
determining the transition probability between every two of the candidate sub-segments;
2) The pen-type vibrotactile feedback device includes:
a pen body, a positioning device, a force sensor, and an attitude sensor, for collecting in real time the haptic rendering data at the current moment during the rendering interaction, including: second speed v_2, second normal force F_2, second direction D_2, and second inclination angle θ_2;
a processing module, for determining the rendering segment from the candidate sub-segments according to the transition probabilities and the haptic rendering data at the current moment, and converting the rendering segment into a driving signal output in real time;
a power amplifier, for amplifying the driving signal generated by the processing module and outputting the amplified driving signal to the vibration actuator;
a vibration actuator, for receiving in real time the driving signal output by the power amplifier and providing the corresponding tactile feedback to perform the haptic rendering of the texture;
and an interaction panel, on which the hand-held pen body performs interaction actions such as sliding.