CN112214111A - Ultrasonic array interaction method and system integrating visual touch perception


Info

Publication number
CN112214111A
CN112214111A (Application CN202011105839.3A)
Authority
CN
China
Prior art keywords
array
palm
ultrasonic
platform
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011105839.3A
Other languages
Chinese (zh)
Other versions
CN112214111B (en)
Inventor
李晓玲
郑子明
赖嘉芮
张丽霞
朱慧进
施乐洋
陈漳沂
王龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University
Priority to CN202011105839.3A
Publication of CN112214111A
Application granted
Publication of CN112214111B
Legal status: Active
Anticipated expiration

Classifications

    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality (G Physics; G06 Computing, Calculating or Counting; G06F Electric Digital Data Processing; G06F 3/00 Input/output arrangements; G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer)
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 2203/012 — Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment (indexing scheme relating to G06F 3/01)

Abstract

The invention discloses an ultrasonic array interaction method and system integrating visual and tactile perception. Hand pose information is collected; for the fingers, fingertip coordinates are calculated to form a position array, and for the palm, palm-center coordinates and palm normal-vector parameters are calculated to form a vector array. If only fingers are detected, an ultrasonic array is selected and activated based on distance according to the position array; if the palm is detected, a motion platform is selected and started based on directivity according to the finger position array and the palm vector array, the corresponding ultrasonic array is activated once the platform is parallel to the palm, and angle information is returned. The focus information after homogeneous coordinate transformation drives the transducers to emit ultrasonic waves and generate a focal point, forming tactile feedback, while the focal point, the virtual hand, and the surrounding scene information are displayed in real time to complete fused visual-tactile perception. The invention provides multi-angle, multi-form tactile feedback for fingers and palm, making the sensation more precise and vivid; the fused visual perception enriches the interaction experience and improves interaction accuracy.

Description

Ultrasonic array interaction method and system integrating visual touch perception
Technical Field
The invention belongs to the technical field of visual and tactile fusion perception, and particularly relates to an ultrasonic array interaction method and system integrating visual and tactile perception.
Background
Recognition and processing technologies for images, speech, and other information have matured over the years, and visual perception and voice interaction are now widely applied. Human-computer interaction therefore focuses mainly on the visual and auditory channels, while the capability for tactile interaction remains very limited. Yet touch is second only to vision in acquiring environmental information, and it is the only perception channel with bidirectional information-transmission capability.
Compared with traditional contact-based tactile interaction, non-contact tactile interaction produces touch sensations without contacting an object, so it has important applications in fields such as public interactive exhibitions and AR/VR. Integrating a visual display further enhances the interaction experience and improves intuitiveness and accuracy during interaction. Mid-air tactile interaction based on an ultrasonic phased array uses the acoustic radiation pressure principle to form focused tactile points in three-dimensional space, providing an unconstrained, natural interaction mode: people can perceive tactile stimuli in space without wearing any equipment, removing the dependence on physical contact for tactile perception.
With the global outbreak of the COVID-19 pandemic, people attach particular importance to the hygiene and safety of publicly touched objects. Non-contact tactile interaction based on an ultrasonic phased array provides an effective solution, avoiding cross-contact infection and being cleaner and more sanitary.
In one existing method, a focal point is determined from the scanned hand position, ultrasonic emission delays are calculated, and the focused acoustic radiation force acts on the skin to form a tactile sensation; the ultrasonic resonance frequency is also modulated so that the hand skin is more sensitive. This system provides only tactile feedback, does not distinguish between finger and palm perception, and cannot deliver finer, more diverse tactile sensations. A second approach combines an independently operating hand-position tracking module with an ultrasonic-array tactile generation circuit to produce non-contact tactile feedback in real time in three-dimensional space; the structure is simple and easy to realize. However, because that system is intended to help blind users perceive Braille, it provides no visual interaction channel, and since its ultrasonic array is fixed, large hand movements prevent accurate perception of the Braille tactile points. A third approach determines the time sequence of focus data by phase retrieval and uses spatial modulation to control the sound-pressure distribution inside a cavity enclosed by the ultrasonic phased array, forming perceivable Braille tactile points with hand-position recognition and rapid information updates. As a system for assisting the blind, it likewise offers no visual perception, and its fixed spatial phased array is difficult to modularize, assemble, and disassemble.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an ultrasonic array interaction method and system integrating visual and tactile perception, which uses a hand pose acquisition module and a variable-angle follow-up device to treat hand data separately as fingers and palm, selects distance-based and directivity-based modes respectively to activate the corresponding ultrasonic arrays, and realizes focus movement across arrays following the hand. Meanwhile, visual perception is fused to enhance the interaction experience and improve interaction accuracy.
The invention adopts the following technical scheme:
An ultrasonic array interaction method integrating visual-tactile perception is characterized by collecting hand pose information and, for the fingers, calculating fingertip coordinates to form a position array; for the palm, calculating palm-center coordinates and palm normal-vector parameters to form a vector array; if only fingers are detected, selecting and activating an ultrasonic array based on distance according to the position array; if the palm is detected, selecting and starting the motion platform based on directivity according to the finger position array and the palm vector array, activating the corresponding ultrasonic array once the platform is parallel to the palm, and returning angle information; and driving the transducers with the focus information after homogeneous coordinate transformation to emit ultrasonic waves and generate a focal point, forming tactile feedback, while displaying the focal point, the virtual hand, and the surrounding scene information in real time to complete fused visual-tactile perception.
Specifically, the method comprises the following steps:
S1, powering on the interactive system, resetting the variable-angle follow-up device, and taking the normal vector of the initial horizontal position as n_0;
S2, starting a hand pose acquisition module, detecting hand information, judging whether a hand is detected, and if not, jumping to the step S9 to finish; if yes, collecting hand pose data;
S3, judging finger or palm by the main controller according to the hand pose data; if only finger i is detected, calculating the fingertip coordinates F_i = (X_Fi, Y_Fi, Z_Fi) to form a position array, and selecting and activating the ultrasonic arrays in the multi-array ultrasonic tactile perception module nearby based on the distance from the fingertips to the center of each ultrasonic array (see the selection sketch after this list);
if the palm is detected, calculating the palm center coordinate P = (Xp, Yp, Zp) and the palm-surface normal vector n_p to form a vector array, and selecting and starting a motion platform in the variable-angle follow-up device based on directivity by combining the finger position array and the palm vector array; activating the corresponding ultrasonic array after the platform moves parallel to the palm;
S4, if the platform does not move, the returned angle value is 0; if the platform has finished moving, the returned angle value is β;
S5, performing homogeneous coordinate transformation according to the positional relation between the hand pose acquisition module and the ultrasonic arrays and the focus information;
S6, driving the transducers according to the focus information to emit ultrasonic waves and generate a focal point, forming tactile feedback;
S7, displaying the focal point, the virtual hand, and the surrounding scene information in real time by the display device;
S8, completing fused visual-tactile perception, and returning to step S2;
S9, ending.
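As an illustration of the distance-based selection in step S3, the following sketch picks the array nearest to the detected fingertips. The array-center coordinates and the mean-distance criterion are assumptions for illustration; the patent specifies only that arrays are selected and activated nearby based on fingertip-to-center distance.

```python
import numpy as np

# Hypothetical array-center coordinates (metres) in the acquisition module's
# frame; the patent gives no concrete values, so these are placeholders.
ARRAY_CENTERS = np.array([
    [0.00, 0.00, 0.0],   # central platform
    [0.12, 0.00, 0.0],   # side platform 1
    [-0.12, 0.00, 0.0],  # side platform 2
])

def select_array(fingertips: np.ndarray) -> int:
    """Index of the ultrasonic array nearest to the fingertip position array.

    fingertips: (n, 3) array of fingertip coordinates F_i = (X_Fi, Y_Fi, Z_Fi).
    The array whose center minimises the mean fingertip distance is activated.
    """
    dists = np.linalg.norm(
        fingertips[:, None, :] - ARRAY_CENTERS[None, :, :], axis=-1
    )
    return int(np.argmin(dists.mean(axis=0)))
```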
Further, in step S3, when the platform is at the initial horizontal position the length of the linear driver is c, the movement distance of the telescopic rod is s, the initial s(0) = 0, s(k) > 0 indicates extension, s(k) < 0 indicates shortening, and k is the calculation count; specifically:

$$ s(k) = \sqrt{a^2 + b^2 - 2ab\cos(\alpha + \beta)} - c, \qquad k \ge 1 $$

wherein β is the rotation angle of the motion platform, a is the distance from the hinge between the linear driver's telescopic rod and the platform to the platform's rotation axis, b is the distance from the hinge of the linear driver's base to the platform's rotation axis, and α is the angle between the horizontal and the line from the base hinge to the platform's rotation point.
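A direct evaluation of this stroke, under the law-of-cosines reading of the Fig. 4 geometry given above (the reconstruction of the original, garbled formula is itself an inference from that geometry):

```python
import math

def actuator_stroke(beta: float, a: float, b: float, alpha: float, c: float) -> float:
    """Telescopic-rod stroke s(k) that tilts the platform by beta (radians).

    a: platform-hinge-to-rotation-axis distance
    b: base-hinge-to-rotation-axis distance
    alpha: angle between the horizontal and the base-hinge/rotation-point line
    c: actuator length at the initial horizontal position
    Positive result means extension, negative means shortening, as in the text.
    """
    return math.sqrt(a * a + b * b - 2 * a * b * math.cos(alpha + beta)) - c
```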
Further, in step S5, the homogeneous coordinate transformation is specifically:

$$ \begin{bmatrix} x_j^F \\ y_j^F \\ z_j^F \\ 1 \end{bmatrix} = A \begin{bmatrix} X^F \\ Y^F \\ Z^F \\ 1 \end{bmatrix}, \qquad A = \begin{bmatrix} A_1 & A_2 \\ \mathbf{0} & 1 \end{bmatrix} $$

wherein A is the matrix formed by the initial positional relation between the hand pose acquisition module and the multi-array ultrasonic tactile perception module; the block matrix A1 relates to the establishment of the local coordinate systems and their unit systems; the block matrix A2 relates to the distance between the two coordinate origins; θj, ωj, and φj are the rotation angles about the X, Y, and Z axes between the single ultrasonic array j and the hand pose acquisition module; and (x_j^F, y_j^F, z_j^F) are the focal-point coordinates in the ultrasonic array's local coordinate system.
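A minimal sketch of this transformation, assuming A1 composes the rotations about the X, Y, and Z axes (the Rz·Ry·Rx order is an assumption not fixed by the patent) and A2 holds the offset between the two coordinate origins:

```python
import numpy as np

def rot_xyz(theta: float, omega: float, phi: float) -> np.ndarray:
    """Rotation block A1 from the angles theta_j, omega_j, phi_j about X, Y, Z."""
    cx, sx = np.cos(theta), np.sin(theta)
    cy, sy = np.cos(omega), np.sin(omega)
    cz, sz = np.cos(phi), np.sin(phi)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def to_array_frame(focus_global, theta, omega, phi, origin_offset):
    """Map a focus point from the acquisition-module frame into array j's frame."""
    A = np.eye(4)
    A[:3, :3] = rot_xyz(theta, omega, phi)  # block A1: relative orientation
    A[:3, 3] = origin_offset                # block A2: origin-to-origin offset
    p = np.append(np.asarray(focus_global, dtype=float), 1.0)
    return (A @ p)[:3]                      # (x_j^F, y_j^F, z_j^F)
```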
Another technical solution of the present invention is an ultrasonic array interactive system integrating visual and tactile perception, comprising:
the hand pose acquisition module, which acquires hand image information based on binocular vision imaging, locates the key parts of the fingers and palm, calculates the finger coordinate positions and palm-surface pose, sends the data to the main controller, and collects and updates hand data in real time;
the variable-angle follow-up device, which calculates the movement distance of the linear driver from the palm-surface normal-vector parameters output by the main controller; its microcontroller controls the linear driver so that the motion platform stays parallel to the palm and returns angle information to the main controller;
the main controller, which receives the hand pose data, obtains the palm-surface normal-vector parameters, and sends them to the microcontroller; it also performs homogeneous coordinate transformation using the returned platform angle and the focal-point position, and transmits the transformed focus information to the FPGA controller;
the multi-array ultrasonic tactile perception module, which emits ultrasonic waves to form a tactile focal point; the phase delays of the ultrasonic array are calculated from the focus information, and after the amplifying circuit the ultrasonic waves focus at the focal point, which moves with the palm or finger (a phase-delay sketch follows this list);
and the display device, which displays the focal point, the virtual hand, and the surrounding scene information.
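The patent states only that the phase delays are calculated from the focus information; the sketch below shows the standard arrival-time-alignment scheme for focusing a transducer array, as an assumed reading rather than the patent's exact formula.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
F_CARRIER = 40e3        # 40 kHz transducer frequency, as stated in the patent

def phase_delays(transducer_pos: np.ndarray, focus: np.ndarray) -> np.ndarray:
    """Per-transducer phase delays so all emissions arrive in phase at the focus.

    transducer_pos: (n, 3) transducer positions in the array's local frame.
    focus: (3,) focal-point coordinates in the same frame.
    """
    dists = np.linalg.norm(transducer_pos - focus, axis=1)
    # The farthest transducer fires first; the rest are delayed so that all
    # wavefronts reach the focal point simultaneously.
    time_delays = (dists.max() - dists) / SPEED_OF_SOUND
    return (2 * np.pi * F_CARRIER * time_delays) % (2 * np.pi)
```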
The hand pose acquisition module comprises a binocular camera connected with an optical sensor and an infrared emitter through a communication control circuit; the infrared emitter detects whether a hand is approaching, and the binocular camera and optical sensor identify and collect hand information.
Specifically, the variable-angle follow-up device comprises a motion platform, the motion platform is connected with a linear driver through a microcontroller, and the microcontroller controls the linear driver to move to drive the motion platform to move parallel to the palm.
Further, the motion platform includes a central platform with an n-sided polygonal outline, n being a positive integer; each side is hinged to a motion platform whose outline is also an n-sided polygon, each such platform can rotate about its hinge, and an ultrasonic array is arranged on each motion platform.
Furthermore, the rotation angle of the moving platform is 0-90 degrees.
Furthermore, the ultrasonic array is formed by arranging a plurality of ultrasonic transducers, the transmitting frequency of the ultrasonic transducers is 40kHz, and the array is fixed on the moving platform.
Compared with the prior art, the invention has at least the following beneficial effects:
the invention relates to an ultrasonic array interaction method integrating visual and tactile perception, which separately considers fingers and palms according to data of a hand pose acquisition module, respectively selects corresponding ultrasonic arrays in a distance and directivity based mode, and realizes two parts of different tactile feedback by adopting different modulation technologies. This not only allows for following hand movements across the array focus, but also provides a more subtle and realistic haptic experience.
Furthermore, by constructing a visual scene, the motion synchronization between the virtual hand and the real hand is established, the contact position of the virtual hand and the object is highlighted, and meanwhile, the tactile feedback is provided during contact, so that the aims of visual and tactile fusion perception, rich interaction experience and improvement of perception accuracy are fulfilled.
Furthermore, the distance from the finger to the center of each ultrasonic array is calculated according to the fingertip coordinates, the arrays are activated nearby, the inclination angle is calculated according to the palm vector, the movement distance of the linear driver is further obtained, the movement platform is parallel to the palm, the ultrasonic waves are focused on the hand more, energy loss is reduced, and touch feeling is enhanced.
Further, through homogeneous coordinate transformation, hand position information is transformed from a coordinate system where the hand pose acquisition module is located to a coordinate system where each ultrasonic array is located, so that focus information under a local coordinate system is obtained, and a touch point is formed through focusing.
According to the ultrasonic array interaction system integrating visual and tactile perception, continuous tracking of hands is realized by combining the hand pose acquisition module and the variable angle follow-up device, and even if a wrist rotates greatly, the ultrasonic array can be kept parallel to a palm in real time, so that the maximum-strength tactile stimulation is provided. Adopt hinged joint simultaneously, compare in fixed space phased array, more nimble freedom makes things convenient for modularization equipment and dismantlement, practices thrift the space.
Furthermore, the hand pose acquisition module detects whether a hand is present and identifies and acquires hand information in real time, providing the basis for the ultrasonic arrays to emit ultrasonic waves and form a focal point, and realizing focus tracking of the hand across arrays.
Furthermore, the variable-angle follow-up device adapts to arbitrary finger movement and large wrist rotation, combining the advantages of fixed planar and spatial arrays while being more flexible and free.
Furthermore, hinging the side platforms to the central platform lets the ultrasonic arrays fixed on them rotate with the hand, providing real-time omnidirectional tactile sensation; the hinged arrangement also facilitates quick modular mounting and dismounting.
Furthermore, setting the rotation angle to 0-90° suits scenarios where the palm is horizontal, vertical, or inclined at an angle, expanding the range of application occasions.
Furthermore, to obtain a focal point with higher spatial resolution and to reduce the energy loss of ultrasonic focusing, ultrasonic transducers with stable, mature performance and a transmitting frequency of 40 kHz are adopted.
In conclusion, the invention provides multi-angle, multi-form tactile feedback for fingers and palm, making the sensation more delicate and vivid, while the fused visual perception enriches the interaction experience and improves interaction accuracy.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a schematic diagram of an interactive system of the present invention;
FIG. 2 is an expanded view of a hexagonal ultrasonic multi-array in accordance with the present invention;
FIG. 3 is a flowchart of the interactive system operation of the present invention;
FIG. 4 is a schematic diagram of the variable angle servo device calculation in the present invention;
fig. 5 is a schematic view of an application scenario of the interactive system in the present invention.
Wherein: 1. a hand pose acquisition module; 2. a variable angle follow-up device; 3. a main controller; 4. a multi-array ultrasonic tactile perception module; 5. a display device.
Detailed Description
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "one side", "one end", and the like indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and thus they are not to be construed as limiting the present invention. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted", "connected", and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical or an electrical connection; or as a direct connection, an indirect connection through intervening media, or internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art on a case-by-case basis.
Various structural schematics according to the disclosed embodiments of the invention are shown in the drawings. The figures are not drawn to scale, wherein certain details are exaggerated and possibly omitted for clarity of presentation. The shapes of various regions, layers and their relative sizes and positional relationships shown in the drawings are merely exemplary, and deviations may occur in practice due to manufacturing tolerances or technical limitations, and a person skilled in the art may additionally design regions/layers having different shapes, sizes, relative positions, according to actual needs.
The invention provides an ultrasonic array interaction method and system integrating visual touch perception. The invention enables the ultrasonic focusing point to move along with the hand part in multiple angles, simultaneously realizes linkage according to touch sense, improves the accuracy of continuous non-contact sensing, and is particularly suitable for public touch places with sanitary and safety requirements.
Referring to fig. 1, the ultrasonic array interactive system integrating visual and tactile perception of the invention includes a hand pose acquisition module 1, a variable-angle follow-up device 2, a main controller 3, a multi-array ultrasonic tactile perception module 4, and a display device 5. The main controller 3 is respectively connected with the hand pose acquisition module 1, the variable-angle follow-up device 2, the multi-array ultrasonic tactile perception module 4, and the display device 5, realizing information transmission among all parts.
The hand pose acquisition module 1 comprises a binocular camera, an infrared emitter, an optical sensor and a communication control circuit, acquires hand image information based on binocular vision imaging, positions key parts such as fingers and palms, calculates the coordinate positions of the fingers and the palm face pose, sends hand pose data to the main controller 3, and acquires and updates hand data in real time.
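A sketch of how the palm-center coordinate P and the palm-surface normal could be derived from the located key parts; the patent does not name the keypoints, so the wrist/MCP choice below is a hypothetical example.

```python
import numpy as np

def palm_pose(wrist, index_mcp, pinky_mcp):
    """Palm center and unit palm-surface normal from three palm keypoints.

    Each argument is a 3D point in the acquisition module's frame; the normal
    is the normalised cross product of two in-palm direction vectors.
    """
    wrist, index_mcp, pinky_mcp = (np.asarray(p, dtype=float)
                                   for p in (wrist, index_mcp, pinky_mcp))
    p = (wrist + index_mcp + pinky_mcp) / 3.0      # palm center P
    n = np.cross(index_mcp - wrist, pinky_mcp - wrist)
    return p, n / np.linalg.norm(n)
```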
The variable angle follow-up device 2 comprises a microcontroller, a linear driver and a motion platform, the motion distance of the linear driver is calculated according to the hand coordinates and the vector array output by the main controller 3, the microcontroller controls the linear driver to move, the motion platform is enabled to be parallel to the palm, and the angle information of the motion platform is returned to the main controller 3; after each power-on, the moving platform is reset to the initial horizontal position.
Referring to fig. 2, in the motion platform of the variable-angle follow-up device 2, the central platform is horizontally fixed and has a hexagonal outline; the remaining motion platforms are also hexagonal, are hinged to the central platform, can rotate about the hinges, and move through a range of 0-90°. The ultrasonic arrays fixed on the platforms are likewise hexagonal, forming a multi-array expandable structure (a platform-selection sketch follows).
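The patent does not spell out how a side platform is chosen from the palm vector array; the sketch below assumes the side whose outward azimuth best matches the palm normal is selected, and takes β as the normal's inclination from the vertical, clipped to the 0-90° hinge range.

```python
import numpy as np

def platform_command(palm_normal):
    """Choose a hexagonal side platform and its tilt angle beta (degrees)."""
    n = np.asarray(palm_normal, dtype=float)
    n /= np.linalg.norm(n)
    beta = np.degrees(np.arccos(abs(n[2])))       # 0 deg: palm horizontal
    azimuth = np.degrees(np.arctan2(n[1], n[0]))  # which side faces the palm
    side = int(round(azimuth / 60.0)) % 6         # six hinged side platforms
    return side, float(np.clip(beta, 0.0, 90.0))
```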
The main controller 3 receives the hand pose data from the hand pose acquisition module 1, the motion-platform angle from the variable-angle follow-up device 2, and the tactile information from the multi-array ultrasonic tactile perception module 4; after analysis, calculation, and processing, it outputs the hand coordinate and vector arrays to the variable-angle follow-up device 2, the focus data to the multi-array ultrasonic tactile perception module 4, and the fused visual-tactile information to the display device 5.
The multi-array ultrasonic tactile perception module 4 comprises several ultrasonic arrays, an amplifying circuit, and an FPGA controller. The FPGA controller calculates the phase delays of the ultrasonic arrays from the focus data received from the main controller 3; combined with the modulation frequency and amplitude information, the control signals drive the ultrasonic transducers through the amplifying circuit, generating a focal point that can move with the palm or fingers, and the tactile information is transmitted back to the main controller 3. Each ultrasonic array is formed by an arrangement of multiple ultrasonic transducers with a transmitting frequency of 40 kHz, and is fixed on a motion platform, moving together with it.
The display device 5 receives the visual and tactile fusion information from the main controller 3, can display a focus point, a virtual hand and surrounding scenes, enriches user interaction experience, and improves perception accuracy.
Referring to fig. 3, in the ultrasonic array interaction method integrating visual and tactile perception, hand information is detected by the hand pose acquisition module 1; if no hand is detected, the process ends, and if a hand is detected, hand pose data are collected. The main controller 3 receives and processes the data, calculating fingertip coordinates for the fingers to form a position array and, for the palm, the palm center coordinates and palm normal-vector parameters to form a vector array. If the hand pose acquisition module 1 detects only fingers (for example, a clenched fist with the forefinger extended), the ultrasonic arrays in the multi-array ultrasonic tactile perception module 4 are selected and activated based on distance according to the position array; if the palm is detected, the motion platform in the variable-angle follow-up device 2 is selected and started based on directivity according to the finger position array and the palm vector array, and once the platform is parallel to the palm the corresponding ultrasonic array is activated and angle information is returned. The focus information after homogeneous coordinate transformation is sent to the FPGA controller in the multi-array ultrasonic tactile perception module 4, driving the transducers to emit ultrasonic waves and generate focal points that form tactile feedback; meanwhile, the display device 5 displays the focal point, the virtual hand, and the surrounding scene information in real time, completing fused visual-tactile perception, enriching the interaction experience, and improving perception accuracy. The specific steps follow (a compact control-loop sketch is given after them):
S1, powering on the interactive system, resetting the variable-angle follow-up device 2, and taking the normal vector of the initial horizontal position as n_0;
S2, starting the hand pose acquisition module 1 and detecting hand information; judging whether a hand is detected, if not, jumping to the step S9 and ending; if yes, hand pose data are collected;
S3, the main controller 3 judges finger or palm according to the hand pose data; if only finger i is detected, the fingertip coordinates F_i = (X_Fi, Y_Fi, Z_Fi) are calculated to form a position array, and the ultrasonic arrays in the multi-array ultrasonic tactile perception module 4 are selected and activated nearby based on the distance from the fingertips to the center of each array; if the palm is detected, the palm center coordinate P = (Xp, Yp, Zp) and the palm-surface normal vector n_p are calculated to form a vector array, and a motion platform in the variable-angle follow-up device 2 is selected and started based on directivity by combining the finger position array and the palm vector array;
referring to fig. 4, let the rotation angle of the motion platform be β, the distance between the hinge point of the telescopic rod of the linear actuator and the platform and the rotation axis of the platform be a, the distance between the hinge point of the base of the linear actuator and the rotation axis of the platform be b, the included angle between the horizontal line and the straight line from the hinge point of the base to the rotation point of the platform be α, the length of the linear actuator when the platform is at the initial horizontal position be c, the motion distance of the telescopic rod be s, the initial s (0) is 0, s (k) is greater than 0 to indicate extension, s (k) is less than 0 to indicate shortening, and k is the number.
Figure BDA0002726927500000111
Figure BDA0002726927500000112
And k is more than or equal to 1, and the corresponding ultrasonic array is activated after the platform moves to be parallel to the palm.
S4, if the platform does not move, the return angle value is 0; if the platform finishes moving, the return angle value is beta;
S5, performing homogeneous coordinate transformation according to the positional relation between the hand pose acquisition module 1 and the ultrasonic arrays and the focus information;

Let A be the matrix formed by the initial positional relation between the hand pose acquisition module 1 and the multi-array ultrasonic tactile perception module 4, where the block matrix A1 relates to the establishment of the local coordinate systems and their unit systems and the block matrix A2 relates to the distance between the two coordinate origins. The rotation angles of the single ultrasonic array j relative to the hand pose acquisition module 1 about the X, Y, and Z axes are θj, ωj, and φj respectively; by the shape and arrangement structure of each ultrasonic array, β is related at least to θj, ωj, and φj.

The focal-point coordinates in the local coordinate system of the ultrasonic array are (x_j^F, y_j^F, z_j^F):

$$ \begin{bmatrix} x_j^F \\ y_j^F \\ z_j^F \\ 1 \end{bmatrix} = A \begin{bmatrix} X^F \\ Y^F \\ Z^F \\ 1 \end{bmatrix}, \qquad A = \begin{bmatrix} A_1 & A_2 \\ \mathbf{0} & 1 \end{bmatrix} $$
S6, driving the transducer to emit ultrasonic waves according to the focus information to generate a focus point to form tactile feedback;
s7, displaying information of the focus point, the virtual hand and the surrounding scene in real time by the display device 5;
s8, finishing visual sense and tactile sense fusion perception, and returning to the step S2;
and S9, ending.
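Pulling steps S1-S9 together, a compact control-loop sketch; the `system` object and every method on it are hypothetical aggregations of the five modules, not interfaces defined by the patent.

```python
def interaction_loop(system) -> None:
    """Compact sketch of steps S1-S9 of the interaction method."""
    system.follow_up.reset_horizontal()                        # S1
    while True:
        hand = system.pose.detect()                            # S2
        if hand is None:
            break                                              # -> S9: end
        if hand.palm is None:                                  # S3: fingers only
            array_id = system.controller.nearest_array(hand.fingertips)
            beta = 0.0                                         # S4: no motion
        else:                                                  # S3: palm present
            side, beta = system.controller.platform_command(hand.palm_normal)
            system.follow_up.tilt(side, beta)                  # parallel to palm
            array_id = side                                    # S4: returns beta
        focus = system.controller.to_array_frame(hand.focus, array_id)  # S5
        system.ultrasound.emit(array_id, focus)                # S6: tactile focus
        system.display.render(focus, hand)                     # S7: visual scene
        # S8: fused visual-tactile perception complete; loop back to S2
```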
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 5, the ultrasonic array interactive system integrating visual and tactile perception of the invention utilizes the hand pose acquisition module 1 and the variable-angle follow-up device 2, with the main controller 3 treating the hand data separately as fingers and palm, and selects distance-based and directivity-based modes to activate the corresponding multi-array ultrasonic tactile perception module 4, realizing focus movement across arrays following the hand. Meanwhile, visual perception is fused through the display device 5, enhancing the interaction experience and improving interaction accuracy.
An ultrasonic-array sound-field force-feedback test shows that the pressure decreases monotonically as the focal point moves away from the array center, and that perception gradually weakens as the hand's height above the array increases from 9 cm to 25 cm. The position and distance of the hand relative to the array therefore strongly influence the tactile sensation, so keeping the array parallel to the hand and shortening the hand-to-array distance is important for enhancing touch. In addition, a psychophysical experiment with ten subjects (three female, seven male) found that integrating visual perception plays a positive role in ultrasonic tactile feedback, enriching the interaction experience; combining the virtual hand with surrounding scene information let subjects perceive the tactile points more quickly, and the different tactile experiences of fingers and palm enhanced their sense of immersion.
In summary, the ultrasonic array interaction method and system integrating visual and tactile perception use the hand pose acquisition module and the variable-angle follow-up device to form tactile feedback for fingers and palm in different modes, refining fine-grained hand tactile perception, realizing cross-array tracking motion of the focus, and enlarging the system's application occasions; meanwhile, the display device fuses visual and tactile perception, improving the immersion and accuracy of the interactive system.
The above-mentioned contents are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereby, and any modification made on the basis of the technical idea of the present invention falls within the protection scope of the claims of the present invention.

Claims (10)

1. An ultrasonic array interaction method integrating visual touch perception is characterized in that hand pose information is collected, and for fingers, fingertip coordinates are calculated to form a position array; for the palm, calculating palm center coordinates and palm normal vector parameters to form a vector array; if only the finger is detected, selecting and activating the ultrasonic array based on the distance according to the position array; if the palm is detected, selecting and starting the motion platform based on the directivity according to the finger position array and the palm vector array, activating the corresponding ultrasonic array when the platform is parallel to the palm, and returning angle information; and driving the transducer to emit ultrasonic waves to generate a focus point by adopting the focus information after the homogeneous coordinate transformation to form tactile feedback, and displaying the focus point, the virtual hand and the surrounding scene information in real time to finish visual and tactile fusion perception.
2. The ultrasonic array interaction method integrating visual touch perception according to claim 1, comprising the following specific steps:
S1, powering on the interactive system, resetting the variable-angle follow-up device, and taking the normal vector of the initial horizontal position as n_0;
S2, starting a hand pose acquisition module, detecting hand information, judging whether a hand is detected, and if not, jumping to the step S9 to finish; if yes, collecting hand pose data;
S3, judging finger or palm by the main controller according to the hand pose data; if only finger i is detected, calculating the fingertip coordinates F_i = (X_Fi, Y_Fi, Z_Fi) to form a position array, and selecting and activating the ultrasonic arrays in the multi-array ultrasonic tactile perception module nearby based on the distance from the fingertips to the center of each ultrasonic array;
if the palm is detected, calculating the palm center coordinate P = (Xp, Yp, Zp) and the palm-surface normal vector n_p to form a vector array, and selecting and starting a motion platform in the variable-angle follow-up device based on directivity by combining the finger position array and the palm vector array; activating the corresponding ultrasonic array after the platform moves parallel to the palm;
S4, if the platform does not move, the returned angle value is 0; if the platform has finished moving, the returned angle value is β;
S5, performing homogeneous coordinate transformation according to the positional relation between the hand pose acquisition module and the ultrasonic arrays and the focus information;
S6, driving the transducers according to the focus information to emit ultrasonic waves and generate a focal point, forming tactile feedback;
S7, displaying the focal point, the virtual hand, and the surrounding scene information in real time by the display device;
S8, completing fused visual-tactile perception, and returning to step S2;
S9, ending.
3. The ultrasonic array interaction method integrating visual touch perception according to claim 2, wherein in step S3, when the platform is at the initial horizontal position the length of the linear driver is c, the movement distance of the telescopic rod is s, the initial s(0) = 0, s(k) > 0 indicates extension, s(k) < 0 indicates shortening, and k is the calculation count, specifically:

$$ s(k) = \sqrt{a^2 + b^2 - 2ab\cos(\alpha + \beta)} - c, \qquad k \ge 1 $$

wherein β is the rotation angle of the motion platform, a is the distance from the hinge between the linear driver's telescopic rod and the platform to the platform's rotation axis, b is the distance from the hinge of the linear driver's base to the platform's rotation axis, and α is the angle between the horizontal and the line from the base hinge to the platform's rotation point.
4. The ultrasonic array interaction method integrating visual touch perception according to claim 2, wherein in step S5 the homogeneous coordinate transformation is specifically:

$$ \begin{bmatrix} x_j^F \\ y_j^F \\ z_j^F \\ 1 \end{bmatrix} = A \begin{bmatrix} X^F \\ Y^F \\ Z^F \\ 1 \end{bmatrix}, \qquad A = \begin{bmatrix} A_1 & A_2 \\ \mathbf{0} & 1 \end{bmatrix} $$

wherein A is the matrix formed by the initial positional relation between the hand pose acquisition module and the multi-array ultrasonic tactile perception module; the block matrix A1 relates to the establishment of the local coordinate systems and their unit systems; the block matrix A2 relates to the distance between the two coordinate origins; θj, ωj, and φj are the rotation angles about the X, Y, and Z axes between the single ultrasonic array j and the hand pose acquisition module; and (x_j^F, y_j^F, z_j^F) are the focal-point coordinates in the ultrasonic array's local coordinate system.
5. An ultrasonic array interactive system integrating visual touch perception, characterized by utilizing the interaction method of claim 1 and comprising:
the hand pose acquisition module (1), which acquires hand image information based on binocular vision imaging, locates the key parts of the fingers and palm, calculates the finger coordinate positions and palm-surface pose, sends the data to the main controller, and collects and updates hand data in real time;
the variable angle follow-up device (2) calculates the motion distance of the linear driver according to the normal vector parameter of the palm face output by the main controller, and the microcontroller controls the linear driver to move, so that the motion platform is parallel to the palm and returns angle information to the main controller;
the main controller (3) receives the hand pose data to obtain a palm face normal vector parameter and sends the palm face normal vector parameter to the microcontroller; meanwhile, the returned angle information of the motion platform and the position of the focus point are utilized to carry out homogeneous coordinate transformation, and the transformed focus point information is transmitted to the FPGA controller;
a multi-array ultrasonic tactile perception module (4) for transmitting ultrasonic waves to form a tactile focus point; calculating the phase delay of the ultrasonic array according to the focus information, and focusing ultrasonic waves at a focus after passing through an amplifying circuit, wherein the focus moves along with a palm or a finger;
and a display device (5) for displaying the focus point, the virtual hand, and the surrounding scene information.
6. The system according to claim 5, characterized in that the hand pose acquisition module (1) comprises a binocular camera connected with an optical sensor and an infrared emitter through a communication control circuit; the infrared emitter is used for detecting whether a hand is approaching, and the binocular camera and the optical sensor are used for identifying and collecting hand information.
7. The system according to claim 5, wherein the variable angle follow-up device (2) comprises a motion platform, the motion platform is connected with a linear driver through a microcontroller, and the microcontroller controls the linear driver to move so as to drive the motion platform to move parallel to the palm.
8. The system of claim 7, wherein the motion platform comprises a central platform, the central platform is an n-sided polygon structure, n is a positive integer, each side is hinged to a motion platform with an n-sided polygon outline, each motion platform can rotate around the hinged joint, and each motion platform is provided with an ultrasonic array.
9. The system of claim 8, wherein the rotation angle of the motion platform is 0-90 °.
10. The system of claim 8, wherein the ultrasonic array is formed by arranging a plurality of ultrasonic transducers, the transmitting frequency of the ultrasonic transducers is 40kHz, and the array is fixed on the moving platform.
CN202011105839.3A 2020-10-15 2020-10-15 Ultrasonic array interaction method and system integrating visual touch perception Active CN112214111B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011105839.3A CN112214111B (en) 2020-10-15 2020-10-15 Ultrasonic array interaction method and system integrating visual touch perception


Publications (2)

Publication Number Publication Date
CN112214111A 2021-01-12
CN112214111B 2021-11-19

Family

ID=74055363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011105839.3A Active CN112214111B (en) 2020-10-15 2020-10-15 Ultrasonic array interaction method and system integrating visual touch perception

Country Status (1)

Country Link
CN (1) CN112214111B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112764592A (en) * 2021-01-15 2021-05-07 安徽省东超科技有限公司 Touch feedback system, terminal device, touch feedback control method and storage medium
CN112764593A (en) * 2021-01-15 2021-05-07 安徽省东超科技有限公司 Touch feedback control method, storage medium, touch feedback system and terminal device
CN113030947A (en) * 2021-02-26 2021-06-25 北京京东方技术开发有限公司 Non-contact control device and electronic apparatus
CN113672097A (en) * 2021-10-22 2021-11-19 华中师范大学 Teacher hand perception interaction method in three-dimensional comprehensive teaching field
WO2023095141A1 (en) * 2021-11-25 2023-06-01 Amir Amedi Tactile representation of location characteristics and content in 3d

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105426024A (en) * 2015-11-25 2016-03-23 吉林大学 Ultrasonic focus based haptic feedback system and method
US20160364960A1 (en) * 2015-06-09 2016-12-15 Elwha Llc Systems and methods for ultrasonically induced tactile stimuli in an entertainment system
CN109634300A (en) * 2018-11-23 2019-04-16 中国运载火箭技术研究院 Based on the multiple no-manned plane control system and method every empty-handed gesture and ultrasonic wave touch feedback
CN109782916A (en) * 2019-01-21 2019-05-21 哈尔滨工业大学(深圳) Braille interactive system and its exchange method based on ultrasonic tactile
US20190196591A1 (en) * 2017-12-22 2019-06-27 Ultrahaptics Ip Ltd Human Interactions with Mid-Air Haptic Systems
CN110515459A (en) * 2019-08-21 2019-11-29 西安交通大学 A kind of the ultrasonic wave haptic feedback system and its method of assisting blind perception
KR20190136572A (en) * 2018-05-31 2019-12-10 한국교통대학교산학협력단 Transducers for non contact mid air tactile sensing feedback and Operating method thereof
CN111175763A (en) * 2020-02-21 2020-05-19 吉林大学 Local ultrasonic sensor array focusing method based on multi-array synthetic aperture
CN111340961A (en) * 2020-02-16 2020-06-26 吉林大学 Ultrasonic aerial touch rendering method based on three-dimensional grid model


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHUANG, CHENG-HSIN et al.: "Ultrasonic tactile sensor integrated with TFT array for force feedback and shape recognition", Sensors and Actuators A: Physical *
YANG, PENG: "Mid-air tactile feedback technology based on ultrasonic radiation pressure", China Master's Theses Full-text Database, Information Science and Technology *


Also Published As

Publication number Publication date
CN112214111B (en) 2021-11-19

Similar Documents

Publication Publication Date Title
CN112214111B (en) Ultrasonic array interaction method and system integrating visual touch perception
EP2755194B1 (en) 3d virtual training system and method
US9141254B2 (en) Navigation system and user interface for directing a control action
JP6155448B2 (en) Wireless wrist computing and controlling device and method for 3D imaging, mapping, networking and interfacing
JP4278979B2 (en) Single camera system for gesture-based input and target indication
US20010014781A1 (en) Human touch massager
JP5186723B2 (en) Communication robot system and communication robot gaze control method
US20110118877A1 (en) Robot system and method and computer-readable medium controlling the same
CN103744518B (en) Stereo interaction method and display device thereof and system
CN107789803B (en) Cerebral stroke upper limb rehabilitation training method and system
WO2012106978A1 (en) Method for controlling man-machine interaction and application thereof
KR101518727B1 (en) A stereoscopic interaction system and stereoscopic interaction method
JP2018180840A (en) Head-mount display control device, operation method and operation program thereof, and image display system
CN105653038A (en) Intelligent glove
CN106468917B (en) A kind of long-range presentation exchange method and system of tangible live real-time video image
US11331805B2 (en) Motion restriction system and method
JPH11513157A (en) Interactive navigation device for virtual environment
US20160364960A1 (en) Systems and methods for ultrasonically induced tactile stimuli in an entertainment system
Li et al. An egocentric computer vision based co-robot wheelchair
CN113110734A (en) System for generating virtual shape perception based on focused ultrasonic waves
Velazquez et al. Walking using touch: Design and preliminary prototype of a non-invasive ETA for the visually impaired
Wei et al. Object localization assistive system based on CV and vibrotactile encoding
Alcañiz et al. Technological background of VR
CN106293012A (en) A kind of three-dimensional body-sensing bi-direction interactive system and method
WO2018074054A1 (en) Display control device, display control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant