CN110850985A - Acoustic tweezers control device and method based on virtual reality technology - Google Patents

Acoustic tweezers control device and method based on virtual reality technology

Info

Publication number
CN110850985A
Authority
CN
China
Prior art keywords
virtual reality
particles
acoustic
target
operating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201911119071.2A
Other languages
Chinese (zh)
Inventor
吕舒晗
杜依诺
张宝军
王泉森
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201911119071.2A
Publication of CN110850985A
Legal status: Withdrawn


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The application discloses an acoustic tweezers control device and method based on virtual reality technology. The device comprises a virtual reality device, used for acquiring user state information, converting the user state information into an operation instruction and sending the operation instruction; a computer device, used for receiving the operation instruction and processing the operation instruction; and an acoustic tweezers device, used for receiving the processed operation instruction, which controls the acoustic tweezers device to generate a vortex acoustic beam, the vortex acoustic beam driving the particles to move to a specified position to complete the manipulation of the particles. Through interactive communication between the virtual reality device and the acoustic tweezers device, the acoustic tweezers are operated by operating the virtual reality device, so that fine manipulation of the particles is accomplished.

Description

Acoustic tweezers control device and method based on virtual reality technology
Technical Field
The invention relates to the technical field of micro-particle manipulation, and in particular to an acoustic tweezers control device and method based on virtual reality technology.
Background
Since American scientists were awarded the Nobel Prize in Physics for optical tweezers technology, using the orbital angular momentum of acoustic vortices to achieve remote, non-contact capture and non-destructive, precise manipulation of living samples has become a research focus at home and abroad. Acoustic tweezers have several advantages over optical tweezers for operations within human tissue: they are not limited by the constraint that laser light can only penetrate transparent media, which favors application in biological tissue; the energy required for acoustic manipulation is small, greatly reducing the risk of killing cells or burning normal tissue; and although sound waves, unlike light waves, carry no spin angular momentum, helical sound waves do carry orbital angular momentum like light waves, and the resulting torque can drive the motion of particles. In other words, acoustic vortices can manipulate particles inside an object non-invasively and without contact. Acoustic systems based on acoustic vortices are therefore regarded as an ideal cell-manipulation tool.
Virtual reality technology has been applied in medical rehabilitation training, virtual surgery, telemedicine, and similar fields, and has made significant progress in both immersive display and virtual manipulation. However, no scheme currently exists for controlling acoustic tweezers with virtual reality technology.
Disclosure of Invention
In view of the above drawbacks and deficiencies of the prior art, it is desirable to provide an acoustic tweezers control device and method based on virtual reality technology, to remedy the current absence of any scheme for operating acoustic tweezers, and thereby finely manipulating particles, by operating a virtual reality device.
In a first aspect, the present application provides an acoustic tweezers control device based on virtual reality technology, comprising:
a virtual reality device, used for acquiring user state information, converting the user state information into an operation instruction, and sending the operation instruction;
a computer device, used for receiving the operation instruction and processing the operation instruction; and
an acoustic tweezers device, used for receiving the processed operation instruction, the operation instruction controlling the acoustic tweezers device to generate a vortex acoustic beam, and the vortex acoustic beam moving the target particles to a specified position to complete the manipulation of the target particles.
In a second aspect, the present application provides an acoustic tweezers control method based on virtual reality technology, comprising:
receiving an operation instruction sent by virtual reality equipment, wherein the operation instruction is obtained by the virtual reality equipment according to user state information;
processing the operation instruction, and sending the processed operation instruction to the acoustic tweezers equipment;
the operating instruction is used for controlling the acoustic tweezers device to generate a vortex acoustic beam, and controlling the target particles to move to a specified position through the vortex acoustic beam so as to complete the manipulation of the target particles.
In a third aspect, the present application provides another acoustic tweezers control method based on virtual reality technology, comprising:
displaying a virtual reality image of the operating handle or the operating glove in real time;
calculating the pointing direction and the steering direction of a virtual reality image of an operating handle or an operating glove through the hand motion posture of the operating handle or the operating glove, selecting a first particle to be controlled, which is intersected with the pointing direction, as a target particle, and acquiring the initial position of the target particle;
determining a designated position of the target particle;
calculating moving path information of the target particles through the initial positions of the target particles and the designated positions of the target particles;
controlling the acoustic tweezers equipment to generate vortex acoustic beams according to the moving path information, and controlling the target particles to move from the initial position to the specified position through the vortex acoustic beams;
and receiving and displaying a feedback image of the target particles at the designated position.
In summary, in the acoustic tweezers control device and method based on virtual reality technology provided by the embodiments of the present application, user state information is acquired by a virtual reality device, which sends an operation instruction to a computer device; after processing by the computer device, the instruction is sent on to the acoustic tweezers device, where the processed instruction controls generation of a vortex acoustic beam that moves the particles, completing their manipulation. Through the interactive communication between the virtual reality device and the acoustic tweezers device, the acoustic tweezers are operated by operating the virtual reality device, accomplishing fine manipulation of the particles.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is a basic flowchart of an acoustic tweezers control device based on virtual reality technology according to an embodiment of the present application;
fig. 2 is a communication and control flowchart of an acoustic tweezers control device based on virtual reality technology according to an embodiment of the present application;
FIG. 3 is a simplified flow chart of manipulating particles via a virtual reality device according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating the optimization of the movement path of a target particle based on a VFH algorithm according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of acquiring a microparticle heatmap based on FCN provided in an embodiment of the present application;
fig. 6 is a basic flowchart of an acoustic tweezers control method based on virtual reality technology according to an embodiment of the present application;
fig. 7 is a basic flowchart of another acoustic tweezers control method based on virtual reality technology according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a computer system according to the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
For convenience of understanding and explanation, the virtual reality technology-based acoustic tweezers control apparatus and method provided by the embodiments of the present application are described in detail below with reference to fig. 1 to 7.
Please refer to fig. 1 and fig. 2, which provide, respectively, a basic flow diagram and a communication-and-control flow diagram of an acoustic tweezers control device based on virtual reality technology according to an embodiment of the present application. The acoustic tweezers control device includes:
and the virtual reality device 101 is used for acquiring the user state information, converting the user state information into an operation instruction and sending the operation instruction.
And the computer device 102 is used for receiving the operation instruction and processing the operation instruction.
And the acoustic tweezers device 103 is used for receiving the processed operation instruction, controlling the acoustic tweezers device to generate a vortex acoustic beam, and controlling the target particles to move to a specified position through the vortex acoustic beam so as to complete the manipulation of the target particles.
In this embodiment, the virtual reality device 101 is configured to obtain the user status information, convert the user status information into an operation instruction, and send the operation instruction.
Further, the user state information comprises head position and/or head turning data of the user and operation actions of hands of the user;
the virtual reality device comprises a virtual reality helmet which acquires head position and/or head steering data of a user in real time;
the virtual reality equipment further comprises an operating handle or an operating glove, and the operating handle or the operating glove is used for acquiring the operating action of the hand of the user in real time.
Here, the virtual reality device is commonly called a VR (Virtual Reality) device. Besides a VR helmet and a VR handle or VR glove, the VR device in this embodiment may be another VR interaction device, such as a VR all-in-one headset or other VR head-mounted apparatus.
Specifically, in addition to displaying visual feedback images to the user, the VR helmet acquires the user's head position and/or head-turning data in real time. The acquisition process is as follows: the helmet collects six-degree-of-freedom data of the user's head in real time, comprising translational degrees of freedom along the three rectangular coordinate axes x, y and z and rotational degrees of freedom about those same axes. This six-degree-of-freedom data reflects the user's position and orientation and determines the real-time position and turning of the user's head in the virtual space. From the real-time position and turning data, a rotation-and-translation matrix for the virtual scene is calculated, and this matrix is adjusted and updated in real time as the user's head actually moves, giving the user a strong sense of immersion.
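As a minimal sketch of the pose computation above, the following Python function (using NumPy) assembles a rotation-and-translation matrix from six-degree-of-freedom head data. The Z-Y-X (yaw-pitch-roll) composition order is an assumption for illustration; real VR SDKs each fix their own convention.

```python
import numpy as np

def head_pose_matrix(x, y, z, roll, pitch, yaw):
    """Assemble a 4x4 rotation-and-translation matrix from 6-DoF head
    data (angles in radians). The Z-Y-X (yaw-pitch-roll) composition
    order is an illustrative assumption, not specified in the patent."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx    # combined rotation
    T[:3, 3] = [x, y, z]        # translation (head position)
    return T
```

Such a matrix would be recomputed each frame from the helmet's tracking data to update the virtual scene.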
Specifically, the VR handle or VR glove is used to acquire the operation actions of the user's hand in real time, including hand movements, key presses, bending, and similar actions.
Preferably, the virtual reality device converts the acquired operation actions of the user's hand into operation instructions and sends them. It can be understood that the virtual reality device must convert a series of hand operation actions into a series of operation instructions that can be recognized and transmitted; these instructions include at least the position and rotation information of both hands, handle-button operation information, and the finger-bending data fed back by the glove. Sending here means that the operation instructions are transmitted by the virtual reality device to the computer device.
The generation of a series of hand operation actions by the user corresponds to the process of the user manipulating particles in three-dimensional space through the VR device. As shown in fig. 3, a simplified flow chart of manipulating particles through the virtual reality device provided in an embodiment of the present application, the specific process is as follows:
and S1, selecting target particles from the particles to be controlled by the virtual reality equipment through an operating handle or an operating glove, and acquiring the initial positions of the target particles.
Preferably, the virtual reality device selects the target particle, via the operating handle or operating glove, from the virtual three-dimensional space of the particles to be manipulated, where that virtual three-dimensional space is the VR image, displayed by the VR helmet, of a three-dimensional model established in advance for the particles to be manipulated and their surroundings.
It should be noted that the three-dimensional model created in advance for the particles to be manipulated and their surroundings is used as a basis for a user to perform a series of manipulations in a virtual three-dimensional space through an operation handle or an operation glove.
Preferably, the process of specifically selecting the target particle and obtaining the initial position of the target particle is as follows:
The virtual reality device, i.e. the VR helmet, displays a virtual reality image of the operating handle or operating glove in real time. From the hand motion posture of the handle or glove, the virtual reality device calculates the pointing direction and orientation of the handle or glove in the virtual reality image, and selects the first particle to be manipulated that intersects the pointing direction as the target particle.
Specifically, the pointing direction and orientation of the operating handle or glove in the virtual reality image can be understood as a virtual light ray projected along an extension line of the handle or glove; the first particle to be manipulated that intersects this virtual ray is selected as the target particle, and the three-dimensional position of that particle is the initial position of the target particle. It should be noted that whenever the hand motion posture of the handle or glove changes, the pointing direction and orientation in the virtual reality image change accordingly; by the same token, different target particles can be selected by changing the hand motion posture of the handle or glove.
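The ray-based selection described above can be sketched as a ray-sphere intersection test: cast a ray from the controller and return the nearest particle it hits. The function name and the sphere representation of particles below are illustrative assumptions, not from the patent.

```python
import numpy as np

def pick_target_particle(origin, direction, particles):
    """Return the index of the first (nearest) particle hit by the ray
    cast from the controller, or None if nothing is hit.
    `particles` is a list of (center, radius) pairs (illustrative)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)               # unit pointing direction
    o = np.asarray(origin, float)
    best_t, best_i = np.inf, None
    for i, (c, r) in enumerate(particles):
        oc = np.asarray(c, float) - o
        t_mid = oc @ d                      # projection of center onto ray
        if t_mid < 0:
            continue                        # particle is behind the controller
        miss2 = oc @ oc - t_mid ** 2        # squared ray-to-center distance
        if miss2 > r * r:
            continue                        # ray passes outside the sphere
        t_hit = t_mid - np.sqrt(r * r - miss2)  # nearer intersection point
        if t_hit < best_t:
            best_t, best_i = t_hit, i
    return best_i
```

The returned particle's center would then serve as the initial position of the target particle.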
It should be noted that the virtual reality image is the VR image, displayed by the VR helmet, of the three-dimensional model created in advance for the particles to be manipulated and their surroundings; when the operating handle or glove is operated within the virtual reality image, the helmet displays the real-time hand motion posture of the handle or glove in that image.
And S2, determining the designated position of the target particles by moving the operating handle or the operating glove.
Specifically, after the target particle has been selected, the user determines its designated position by moving the operating handle or glove to a position in the virtual reality image. It will be appreciated that the designated position may be any position indicated with the operating handle or glove, depending on the actual experimental purpose for the particles to be manipulated.
And S3, calculating the moving path information of the target particles through the initial positions of the target particles and the appointed positions of the target particles.
Specifically, the virtual reality device calculates the moving-path information of the target particle from the initial position and the designated position. It should be noted that the initial position of the target particle, its designated position, and its moving-path information, all converted from the series of operation actions performed with the handle or glove in the virtual reality image, are likewise sent as operation instructions from the virtual reality device to the computer device. The moving-path information of the target particle is used to control the vortex acoustic beam to pick up the target particle and move it from the initial position to the designated position.
It should be noted that, when determining the moving path of the target particle, in order to avoid the target particle colliding with other particles during moving and causing unexpected movement of other particles, an obstacle avoidance algorithm needs to be applied to optimize the selection of the moving path of the target particle.
Specifically, this embodiment adopts the Vector Field Histogram (VFH) algorithm, a classic algorithm in robot navigation used to steer a robot's direction of travel so that it avoids obstacles while approaching a target. Fig. 4 illustrates optimization of the target particle's moving path based on the VFH algorithm. First, the VFH algorithm calculates a traveling cost for each direction, as shown in fig. 4(a): the more obstacles lie in a direction, the higher that direction's cost, with obstacles at different distances in the direction accumulated under distance-dependent weights. The per-direction traveling costs can be visualized as a bar graph, as shown in fig. 4(b), with directions from -180 to 180 degrees on the abscissa and the traveling cost at the corresponding angle on the ordinate; the taller the bar, the higher the cost of traveling in that direction and the less suitable that direction is. In principle the low regions of the histogram are convenient for travel but may deviate from the target direction, so this embodiment balances traveling cost against the target direction through a balance function, and finally selects a relatively optimal direction of travel, as shown in fig. 4(c).
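A toy version of the VFH step described above might look like the following; the bin count and the cost weights `a`, `b`, `mu` are arbitrary illustrative choices, not values from the patent.

```python
import math

def vfh_direction(obstacles, target_angle, n_bins=36, a=1.0, b=0.25, mu=0.05):
    """Toy vector-field-histogram step: accumulate a polar cost
    histogram from obstacles given as (angle_deg, distance) pairs,
    then pick the heading minimizing obstacle cost plus deviation
    from the target direction. Weights are illustrative assumptions."""
    width = 360 / n_bins
    hist = [0.0] * n_bins
    for ang, dist in obstacles:
        k = int(((ang + 180) % 360) // width)   # bin index for [-180, 180)
        hist[k] += max(a - b * dist, 0.0)       # nearer obstacle -> higher cost
    best, best_cost = None, math.inf
    for k in range(n_bins):
        heading = -180 + (k + 0.5) * width      # bin-center direction
        dev = abs((heading - target_angle + 180) % 360 - 180)
        cost = hist[k] + mu * dev               # the "balance function"
        if cost < best_cost:
            best, best_cost = heading, cost
    return best
```

With no obstacles the chosen heading hugs the target direction; an obstacle straight ahead pushes the choice into a neighboring low-cost bin.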
In this embodiment, the computer device 102 is configured to receive the operation instruction and process the operation instruction.
Specifically, within the overall acoustic tweezers control device the computer device is mainly used to receive the operation instructions sent by the virtual reality device, process them, and then send the processed instructions to the acoustic tweezers device. Processing here can be understood as converting the operation instructions sent by the virtual reality device into operation instructions for the acoustic tweezers device; specifically, the instruction conversion includes conversion of operating position and of scale ratio.
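The position-and-scale conversion can be as simple as an affine mapping from VR-space coordinates to tweezers-stage coordinates. The 1 m of hand travel to 100 µm of particle travel scale factor below is purely illustrative; the patent does not specify the ratio.

```python
def vr_to_stage(p_vr, scale=1e-4, offset=(0.0, 0.0, 0.0)):
    """Map a controller position in VR space (metres) to acoustic-
    tweezers stage coordinates. scale=1e-4 (1 m of hand motion ->
    100 um of particle motion) is an illustrative assumption."""
    return tuple(scale * c + o for c, o in zip(p_vr, offset))
```

A real conversion would also remap rotations and clamp positions to the reachable workspace of the sound field.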
In addition, as shown in fig. 2, the acoustic tweezers control device in this embodiment further includes a microscope; the microscope and the virtual reality device exchange real-time data through the computer device, so the computer device also performs the following operations:
First, before the virtual reality device acquires the user state information, an initial three-dimensional space model must be established in advance, through the computer device, for the particles to be manipulated and their surroundings. The specific process is as follows:
the microscope sends the initial image data of the particles to be controlled before moving to the computer equipment, the computer equipment extracts image characteristic data of the initial image data of the particles to be controlled before moving, establishes an initial three-dimensional model of the particles to be controlled before moving according to the extracted image characteristic data, and sends the initial three-dimensional model to the virtual reality equipment for displaying.
Here, the initial image data before movement refers to the image data of the particles to be manipulated before any operation has been performed on them.
Second, after the target particle has been moved to the designated position, a current three-dimensional space model must be established, through the computer device, for the particles to be manipulated and their surroundings. The specific process is as follows:
and the microscope sends the current image data of the particles to be controlled to the computer equipment, the computer equipment performs feature extraction on the current image data of the particles to be controlled, establishes a current three-dimensional model of the particles to be controlled according to the extracted feature data, and sends the current three-dimensional model to the virtual reality equipment for displaying.
The virtual reality image is displayed to the user in the virtual reality device, e.g. in the VR helmet.
It is understood that, as shown in fig. 2, the physical particles to be manipulated are placed in the field of view of the microscope, which monitors the particles and their environment in real time, specifically the real-time changes in particle position and form; because the microscope is in communication connection with the computer device, it can transmit the real-time particle image data to the computer device.
The computer device extracts image features from the image data to obtain the position and size information of the particles, implemented as follows. The computer device of this embodiment performs semantic segmentation with a Fully Convolutional Network (FCN) to determine the pixels belonging to particles in an image: after an image is input, a per-pixel prediction, i.e. the class to which each pixel belongs, is obtained directly at the output, so semantic segmentation of the image is achieved end to end. Fig. 5 is a schematic diagram of acquiring a particle heat map based on the FCN provided in an embodiment of the present application. As shown in fig. 5, the fully connected layers of a convolutional neural network are first recast as convolutional layers whose template size equals the size of the input feature map, i.e. the fully connected network is treated as a convolution over the entire input map; the converted layers consist of 4096 convolution kernels of size 7 x 7, 4096 kernels of size 1 x 1, and 1000 kernels of size 1 x 1. This finally yields the image semantic-segmentation result, from which the particle positions and particle sizes, i.e. the extracted image feature data, are obtained.
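Once the FCN has produced a per-pixel particle mask, positions and sizes can be read off by connected-component analysis. A dependency-free sketch (4-connectivity assumed; the function name is illustrative):

```python
from collections import deque

def particle_stats(mask):
    """Given a binary segmentation mask (rows of 0/1, e.g. the FCN's
    per-pixel 'particle' class), return (centroid_row, centroid_col,
    pixel_area) for each 4-connected region: the particle position
    and size features described above."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    stats = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                q, pix = deque([(r, c)]), []
                seen[r][c] = True
                while q:                      # BFS flood fill of one region
                    y, x = q.popleft()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                area = len(pix)
                cy = sum(p[0] for p in pix) / area
                cx = sum(p[1] for p in pix) / area
                stats.append((cy, cx, area))
    return stats
```

In practice a library routine such as `scipy.ndimage.label` would replace the hand-rolled flood fill.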
In this embodiment, the acoustic tweezers device 103 is configured to receive a processed operation instruction, where the operation instruction controls the acoustic tweezers device to generate a vortex acoustic beam, and the vortex acoustic beam controls the target particles to move to a specified position to complete manipulation of the target particles.
Specifically, the operation instruction controls the acoustic tweezers device to generate the vortex acoustic beam; while the beam drives the target particles, it applies a torque to them, so that the position of the target particles changes and their form changes as well.
Specifically, under the operation instructions processed by the computer device, the target particles are moved by the vortex acoustic beam from their initial positions to their designated positions, following the moving-path information of the target particles.
The acoustic tweezers device here refers to the acoustic-tweezers generating apparatus; this embodiment uses a vortex-acoustic-beam experimental setup consisting mainly of a three-dimensional sound-field scanning water tank and a vortex sound-field emission array. The setup is used to generate vortex acoustic beams, to test their performance experimentally, and to verify their sound-field distribution, phase characteristics, orbital-angular-momentum transfer, and the torque they exert on objects.
Specifically, the theory by which the experimental setup generates the vortex acoustic beam is as follows.
in the optical field, for the research of optical waves carrying Orbital Angular Momentum (OAM), a laguerre gaussian beam model is generally used as an object, and the cylindrical coordinates thereof are expressed as follows:
Figure BDA0002274915080000091
wherein p is the number of radial nodes numbered from 0,
Figure BDA0002274915080000092
for a related Laguerre polynomial, k is the wave number, zR=kw2(0) The/2 is the Rayleigh range,is the local beam width, w (0) is the beam waist,
Figure BDA0002274915080000094
is a gouy phase shift.
Inspired by the optical case, the Laguerre-Gaussian model is introduced into acoustics; neglecting higher-order radial nodes (p = 0) and the spreading of the beam waist along the axial direction, the vortex acoustic beam model is constructed as

\[ p_l(r,\phi) = p_0 \left(\frac{\sqrt{2}\,r}{w}\right)^{|l|} \exp\!\left(-\frac{r^2}{w^2}\right) \exp(\mathrm{i}l\phi) \]

The intensity distribution is thus obtained:

\[ I(r) \propto \left|p_l\right|^2 = p_0^2 \left(\frac{2r^2}{w^2}\right)^{|l|} \exp\!\left(-\frac{2r^2}{w^2}\right) \]
since the excitation sound field of an actual sound source can be regarded as gaussian, the radius of the circular array of discrete sound sources can be calculated from the intensity distribution of the above equation. The following equation is satisfied:
Figure BDA0002274915080000102
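As a quick numerical check of the ring-radius relation implied by the intensity distribution (assuming, as above, that the array radius is placed at the intensity maximum of the p = 0 beam):

```python
import math

def vortex_intensity(r, w, l):
    """Radial intensity profile of the p = 0 vortex beam model:
    I(r) ~ (2 r^2 / w^2)^|l| * exp(-2 r^2 / w^2)."""
    u = 2 * r * r / (w * w)
    return u ** abs(l) * math.exp(-u)

def ring_radius(w, l):
    """Radius of maximum intensity, r = w * sqrt(|l| / 2), used here
    as the radius of the discrete circular source array."""
    return w * math.sqrt(abs(l) / 2)
```

Evaluating the intensity slightly inside or outside the returned radius confirms it is the maximum of the ring profile.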
and dynamic focusing of the sound field is realized by adopting a phase control method, and vortex sound beams are generated preliminarily. And discretizing the circumference 2 pi to obtain n point sound sources. The phased array drive waveform when generating an l-order OAM topology should obviously satisfy the following equation:
Figure BDA0002274915080000103
wherein:
Figure BDA0002274915080000104
where θ is the azimuth, and i is the sequential number of the point sound source.
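The per-element phases and drive samples for an order-l vortex follow directly from these relations. A sketch in Python; the 1 MHz carrier frequency is an illustrative assumption, not a value from the patent.

```python
import math

def oam_phases(n, l):
    """Per-element phase offsets phi_i = l * theta_i = 2*pi*l*i/n for
    an n-element circular array generating an order-l OAM beam."""
    return [2 * math.pi * l * i / n for i in range(n)]

def drive_sample(t, i, n, l, freq=1e6, amp=1.0):
    """Drive waveform s_i(t) = A * sin(2*pi*f*t + phi_i) for element i.
    The 1 MHz carrier is an illustrative assumption."""
    return amp * math.sin(2 * math.pi * freq * t + 2 * math.pi * l * i / n)
```

Going once around the array, the phase advances by a total of \(2\pi l\), which is what gives the beam its l-order helical wavefront.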
This method controls the focal position. Besides particle capture and dynamic transport, when the method is used for orbital-angular-momentum multiplexing, the information-bearing OAM drive signals of each order are summed and used to drive the sound-source array, thereby exciting a multiplexed OAM sound field.
The drive signal of the multiplexed sound field is

\[ s_i(t) = \sum_{l} A_l \sin\!\left(\omega t + l\,\theta_i\right) \]
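Multiplexing several OAM orders then reduces to summing the corresponding drive signals per element; the `{order: amplitude}` mapping below is an illustrative interface, not from the patent.

```python
import math

def multiplexed_drive(t, i, n, orders, freq=1e6):
    """Summed drive signal for element i of an n-element array when
    several OAM orders are multiplexed:
    s_i(t) = sum over l of A_l * sin(2*pi*f*t + l*theta_i),
    with orders given as {l: amplitude}. freq is an assumption."""
    theta_i = 2 * math.pi * i / n   # azimuth of element i
    return sum(a * math.sin(2 * math.pi * freq * t + l * theta_i)
               for l, a in orders.items())
```

With a single order and unit amplitude this reduces to the per-order drive waveform above.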
in addition, due to the action of the rotation torque on the object, orbital angular momentum carried by the non-zero topological acoustic vortex can be transmitted to the acoustic absorption object to generate rotation torque. And (3) respectively starting from the Mie particle, Rayleigh condition and intermediate condition, and quantitatively calculating the force borne by the particle by adopting different methods to give out force analysis of the particle. In addition to particle size, particle shape, number, and impedance ratio are also strongly dependent on force, which can affect its rotation or clustering behavior.
In summary, in the acoustic tweezers control device based on virtual reality technology provided by this embodiment of the present application, the virtual reality device acquires the user state information and sends an operation instruction to the computer device; after processing by the computer device, the instruction is sent to the acoustic tweezers device, where the processed instruction controls generation of a vortex acoustic beam that moves the particles, completing their manipulation. Through the interactive communication between the virtual reality device and the acoustic tweezers device, the acoustic tweezers are operated by operating the virtual reality device, accomplishing fine manipulation of the particles.
Based on the first embodiment, the present application further provides an acoustic tweezers control method based on virtual reality technology, and the method is applied to a terminal. It should be noted that the terminal referred to in the embodiments of the present application may include, but is not limited to, a personal computer (PC), a personal digital assistant (PDA), a tablet computer, a wireless handheld device, a mobile phone, and the like.
The acoustic tweezers control device of the first embodiment may be used with the acoustic tweezers control method based on virtual reality technology provided in this embodiment. Fig. 6 is a schematic flowchart of an acoustic tweezers control method based on virtual reality technology provided in an embodiment of the present application; as shown in fig. 6, the acoustic tweezers control method includes:
s201, receiving an operation instruction sent by virtual reality equipment, wherein the operation instruction is obtained by the virtual reality equipment according to user state information.
Specifically, the user state information includes head position and/or head turning data of the user, and an operation action of a hand of the user;
the virtual reality device comprises a virtual reality helmet which acquires head position and/or head steering data of a user in real time;
the virtual reality device further comprises an operating handle or an operating glove, and the operating handle or the operating glove is used for acquiring the operating actions of the user's hand in real time, wherein the operating actions include hand movement, key pressing, finger bending, and the like.
Preferably, the operation instruction is obtained by the virtual reality device converting the series of acquired operation actions of the user's hand into a series of operation information that can be recognized and transmitted, wherein the operation instruction at least comprises position and rotation information of both hands, handle key operation information, and finger bending data fed back by the glove.
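As an illustrative sketch only (field names and the serialization format are assumptions, not specified by the patent), such an operation instruction could be packaged as a serializable record:

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class OperationInstruction:
    # Pose of each hand: position (x, y, z) and rotation quaternion (w, x, y, z)
    left_pos: tuple
    left_rot: tuple
    right_pos: tuple
    right_rot: tuple
    buttons: dict = field(default_factory=dict)      # handle key states
    finger_bend: list = field(default_factory=list)  # glove feedback, 0..1 per finger

    def to_message(self) -> str:
        """Serialize to JSON for transmission to the computer device."""
        return json.dumps(asdict(self))

msg = OperationInstruction(
    left_pos=(0.0, 1.2, 0.3), left_rot=(1.0, 0.0, 0.0, 0.0),
    right_pos=(0.1, 1.1, 0.4), right_rot=(1.0, 0.0, 0.0, 0.0),
    buttons={"trigger": True}, finger_bend=[0.2, 0.8, 0.9, 0.9, 0.7],
).to_message()
```

Any structured, self-describing format would serve; the essential point is that both-hand poses, key states, and glove bend data travel together in one instruction.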
The specific process of obtaining the operation actions corresponds to the process in which a user manipulates the particles in three-dimensional space through the VR device. As shown in fig. 3, the specific manipulation steps are as follows:
and S1, selecting target particles from the particles to be controlled through the operating handle or the operating glove and acquiring the initial positions of the target particles.
Preferably, the target particles are selected from the virtual three-dimensional space of the particles to be manipulated by means of the operating handle or the operating glove, wherein the virtual three-dimensional space of the particles to be manipulated is the VR image, displayed through the VR headset, of a three-dimensional model established in advance for the particles to be manipulated and their surrounding environment.
Preferably, the process of specifically selecting the target particle and obtaining the initial position of the target particle is as follows:
The virtual reality device, namely the VR helmet, displays a virtual reality image of the operating handle or the operating glove in real time. From the hand motion gesture of the operating handle or the operating glove, the virtual reality device calculates the pointing direction and turning of the operating handle or the operating glove in the virtual reality image, and the first particle to be manipulated that intersects the pointing direction is selected as the target particle.
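Selecting the first intersected particle can be sketched as a ray-sphere intersection test. The following minimal example treats particles as spheres; this geometry and the function name are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def pick_target(origin, direction, centers, radii):
    """Return the index of the first particle hit by the pointing ray, or None.

    origin, direction: 3-vectors for the handle/glove ray (direction need
    not be normalized); centers: (N, 3) particle positions; radii: (N,)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    o = np.asarray(origin, float)
    best_t, best_i = np.inf, None
    for i, (c, r) in enumerate(zip(np.asarray(centers, float), radii)):
        oc = o - c
        b = np.dot(oc, d)
        disc = b * b - (np.dot(oc, oc) - r * r)
        if disc < 0:
            continue  # ray misses this sphere
        t = -b - np.sqrt(disc)  # nearer intersection along the ray
        if 0 < t < best_t:
            best_t, best_i = t, i
    return best_i
```

The particle with the smallest positive hit distance along the ray is "the first particle to be manipulated intersecting the pointing direction".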
And S2, determining the designated position of the target particles by moving the operating handle or the operating glove.
Specifically, after the target particles are selected, the user can determine the designated position of the target particles by moving the position of the operating handle or the operating glove in the virtual reality image. It will be appreciated that the designated position of the target particles may be any position designated with the operating handle or the operating glove, according to the actual experimental purpose for the particles to be manipulated.
And S3, calculating the moving path information of the target particles through the initial positions of the target particles and the appointed positions of the target particles.
Specifically, the operation instruction converted from the series of operation actions generated by the operating handle or the operating glove in the virtual reality image, such as the initial position of the target particles, the designated position of the target particles, and the moving path information of the target particles, is also sent as an operation instruction by the virtual reality device to the computer device. The moving path information of the target particles is used to control the vortex acoustic beam to pick up the target particles and move them from the initial position to the designated position.
It should be noted that, when determining the moving path of the target particle, in order to avoid the target particle colliding with other particles during moving and causing unexpected movement of other particles, an obstacle avoidance algorithm needs to be applied to optimize the selection of the moving path of the target particle.
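The patent does not specify the obstacle avoidance algorithm. As one hedged example, a grid-based breadth-first search that routes the target particle around cells occupied by other particles could look as follows (a 2D sketch; the 3D case only adds two neighbor offsets):

```python
from collections import deque

def plan_path(start, goal, obstacles, size):
    """Shortest 4-connected grid path from start to goal that avoids
    obstacle cells (other particles); returns None if unreachable."""
    blocked = set(obstacles)
    prev = {start: None}
    q = deque([start])
    while q:
        cur = q.popleft()
        if cur == goal:
            path = []
            while cur is not None:  # walk parent links back to start
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        x, y = cur
        for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= n[0] < size and 0 <= n[1] < size
                    and n not in blocked and n not in prev):
                prev[n] = cur
                q.append(n)
    return None

# Detour around an obstacle sitting directly between start and goal.
path = plan_path((0, 0), (2, 0), obstacles={(1, 0)}, size=5)
```

Any planner with the same contract (waypoints that never pass through an occupied cell) would serve; BFS is shown only because it is the simplest complete one.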
S202, processing the operation instruction, and sending the processed operation instruction to the acoustic tweezers device; the operating instruction is used for controlling the acoustic tweezers device to generate a vortex acoustic beam, and controlling the target particles to move to a specified position through the vortex acoustic beam so as to complete the manipulation of the target particles.
The processing of the operation instruction here may be understood as converting the operation instruction sent by the virtual reality device into an operation instruction for the acoustic tweezers device; specifically, the instruction conversion includes conversion of the operation position and of the scale ratio between the virtual space and the workspace.
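A minimal sketch of such a position and scale conversion, assuming a simple linear mapping between VR coordinates and tweezer-workspace coordinates (the actual transform is not specified in the patent):

```python
import numpy as np

def vr_to_workspace(p_vr, vr_origin, ws_origin, scale):
    """Map a VR-space position to tweezer-workspace coordinates.

    The VR scene shows the microscopic scene magnified; scale is the ratio
    of workspace units to VR units (e.g. 1 VR metre == 100 um -> 1e-4)."""
    return (np.asarray(p_vr, float) - vr_origin) * scale + ws_origin

p = vr_to_workspace((1.0, 2.0, 0.0), vr_origin=np.zeros(3),
                    ws_origin=np.zeros(3), scale=1e-4)
```

A full implementation might also need a rotation between the two frames; the subtraction/scale/offset shown here is the minimal case.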
Specifically, the operation instruction controls the acoustic tweezers device to generate the vortex acoustic beam, and an acting torque is applied to the target particles while the vortex acoustic beam controls their movement, so that both the position and the orientation of the target particles are changed.
Specifically, the target particles are moved from the initial positions of the target particles to the specified positions of the target particles according to the moving path information of the target particles under the action of the vortex sound beams according to the processed operation instructions.
In addition, in the first aspect, before the step of obtaining the operation instruction by the virtual reality device according to the user state information, an initial three-dimensional space model needs to be established in advance for the particles to be controlled and the surrounding environment thereof, and the specific process is as follows:
s1, receiving initial image data which are sent by a microscope and are used before the movement of the particles to be controlled;
s2, extracting image characteristic data of the initial image data before the movement of the particles to be controlled, and establishing an initial three-dimensional model before the movement of the particles to be controlled according to the extracted image characteristic data;
and S3, sending the initial three-dimensional model to virtual reality equipment.
The initial image data before the movement of the particle to be manipulated herein refers to the initial image data of the particle to be manipulated which is not operated at first.
In a second aspect, after the target particle is moved to the designated position, a current three-dimensional space model needs to be established for the particle to be manipulated and the surrounding environment, and the specific process is as follows:
s1, receiving current image data of the particles to be controlled sent by the microscope;
s2, extracting the characteristics of the current image data of the particles to be controlled, and establishing a current three-dimensional model of the particles to be controlled according to the extracted characteristic data;
and S3, sending the current three-dimensional model of the particle to be controlled to virtual reality equipment.
The virtual reality image is displayed in the virtual reality device, for example presented to the user in the VR helmet.
Here, image feature extraction is performed on the image data to obtain the position information and size information of the particles. The specific implementation of this process is as follows: this embodiment uses a fully convolutional network for semantic segmentation to determine the pixel information of the particles in the image. After an image is input, a pixel-wise prediction, namely the category to which each pixel belongs, is obtained directly at the output, so that image semantic segmentation is realized in an end-to-end manner. From the resulting semantic segmentation, the positions and sizes of the particles, i.e., the extracted image feature data, can be obtained. The three-dimensional model is then rebuilt from the extracted image feature data, and the current three-dimensional model is sent to the virtual reality device for display, for example as a VR image in the VR helmet.
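The patent's segmentation itself is a fully convolutional network; as a lightweight stand-in for the downstream step, the following sketch extracts particle positions and equivalent sizes from a binary segmentation mask by connected-component flood fill (an illustrative assumption, not the patent's network):

```python
import numpy as np

def particle_features(mask):
    """Centroids and equivalent radii of 4-connected blobs in a binary
    mask (stand-in for the pixel-wise output of the segmentation net)."""
    mask = np.asarray(mask, bool)
    seen = np.zeros_like(mask)
    feats = []
    h, w = mask.shape
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        stack, pixels = [(sy, sx)], []
        seen[sy, sx] = True
        while stack:  # flood fill one blob
            y, x = stack.pop()
            pixels.append((y, x))
            for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        ys, xs = zip(*pixels)
        area = len(pixels)
        feats.append({
            "centroid": (sum(ys) / area, sum(xs) / area),  # particle position
            "radius": float(np.sqrt(area / np.pi)),        # equivalent radius
        })
    return feats

mask = np.zeros((5, 5), dtype=bool)
mask[0:2, 0:2] = True  # one 2x2 particle
mask[4, 4] = True      # one single-pixel particle
feats = particle_features(mask)
```

In a real pipeline the mask would come from the network's per-pixel class prediction; the centroid and equivalent radius are exactly the "position and size" feature data the text describes.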
In summary, in the acoustic tweezers control method based on virtual reality technology provided by the embodiment of the present application, the operation instruction sent by the virtual reality device is received and processed, the processed operation instruction is sent to the acoustic tweezers device, the processed operation instruction controls the acoustic tweezers device to generate a vortex acoustic beam, and finally the vortex acoustic beam controls the particles to move to the designated position. Through this interactive communication between the virtual reality device and the acoustic tweezers device, the acoustic tweezers are operated indirectly by operating the virtual reality device, thereby achieving fine manipulation of the particles.
It should be noted that, for the descriptions of the same steps and the same contents in this embodiment as those in other embodiments, reference may be made to the descriptions in other embodiments, which are not described herein again.
Based on the first embodiment and the second embodiment, the embodiment of the present application further provides another acoustic tweezers control method based on virtual reality technology, and the method is applied to a virtual reality device. It should be noted that the virtual reality device referred to in the embodiments of the present application may include, but is not limited to, a VR helmet, a VR handle, or a VR glove.
The acoustic tweezers control device of the first embodiment may be used with the acoustic tweezers control method based on virtual reality technology provided in this embodiment. Fig. 7 is a schematic flowchart of another acoustic tweezers control method based on virtual reality technology according to an embodiment of the present application; as shown in fig. 7, the acoustic tweezers control method includes:
and S301, displaying the virtual reality image of the operating handle or the operating glove in real time.
In this embodiment, the VR helmet displays the virtual reality image of the operating handle or the operating glove in real time. Specifically, when the operating handle or the operating glove is operated in the virtual three-dimensional space, the virtual reality image can display the real-time hand motion gesture of the operating handle or the operating glove, where the virtual three-dimensional space is the VR image, displayed by the VR helmet, of the three-dimensional model established in advance for the particles to be manipulated and their surrounding environment.
S302, calculating the pointing direction and turning of the operating handle or the operating glove in the virtual reality image from the hand motion gesture of the operating handle or the operating glove, selecting the first particle to be manipulated that intersects the pointing direction as the target particle, and acquiring the initial position of the target particle.
Specifically, the pointing direction and turning of the operating handle or the operating glove in the virtual reality image can be understood as those of a virtual ray projected along an extension line of the handle or the glove in the virtual reality image, and the first particle to be manipulated that intersects this virtual ray is selected as the target particle, whose three-dimensional position is the initial position of the target particle. It should be noted that once the hand motion gesture of the handle or the glove changes, the pointing direction and turning of the handle or the glove in the virtual reality image change accordingly; thus, different target particles can be selected by changing the hand motion gesture of the handle or the glove.
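The pointing direction derived from the hand motion gesture can be sketched by rotating the controller's local forward axis by its orientation quaternion; the (w, x, y, z) convention and the choice of forward axis below are assumptions, not specified by the patent:

```python
import numpy as np

def pointing_direction(q, forward=(0.0, 0.0, -1.0)):
    """Rotate the controller's local forward axis by its orientation
    quaternion q = (w, x, y, z) to get the world-space pointing ray."""
    w, x, y, z = q
    v = np.asarray(forward, float)
    u = np.array([x, y, z])
    # Quaternion rotation without building a matrix: v' = v + 2*u x (u x v + w*v)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)
```

Feeding this direction, together with the hand position, into the ray-picking step gives the "virtual ray" described above.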
S303, determining the designated position of the target particle.
Specifically, after the target particles are selected, the user can determine the designated position of the target particles by moving the position of the operating handle or the operating glove in the virtual reality image. It will be appreciated that the designated position of the target particles may be any position designated with the operating handle or the operating glove, according to the actual experimental purpose for the particles to be manipulated.
S304, calculating the moving path information of the target particles through the initial positions of the target particles and the appointed positions of the target particles.
Specifically, the virtual reality device calculates the moving path information of the target particles according to the initial position and the designated position. It should be noted that the operation command converted from a series of operation actions generated by the operation handle or the operation glove of the virtual reality device in the virtual reality image, such as the initial position of the target microparticle, the specified position of the target microparticle, and the moving path information of the target microparticle, is also sent as the operation command by the virtual reality device to the computer device.
It should be noted that, when determining the moving path of the target particle, in order to avoid the target particle colliding with other particles during moving and causing unexpected movement of other particles, an obstacle avoidance algorithm needs to be applied to optimize the selection of the moving path of the target particle.
And S305, controlling the acoustic tweezers device to generate a vortex acoustic beam according to the moving path information, and controlling the target particles to move from the initial position to the specified position through the vortex acoustic beam.
Specifically, the operation instruction controls the acoustic tweezers device to generate the vortex acoustic beam, and an acting torque is applied to the target particles while the vortex acoustic beam controls their movement, so that both the position and the orientation of the target particles are changed. Under the action of the vortex acoustic beam, the target particles are moved from their initial position to their designated position according to the moving path information of the target particles, following the operation instruction processed by the computer device.
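One hedged way to sketch driving the beam along the path is a focused vortex phase profile recomputed at each waypoint, combining a focusing delay with the spiral phase term of order l; the element layout and names are illustrative, not the patent's hardware:

```python
import numpy as np

def focused_vortex_phases(elements, focus, l, k):
    """Per-element drive phases steering an order-l vortex focus to `focus`:
    a focusing delay -k*|r_i - focus| plus the spiral term l*theta_i.
    elements: (N, 3) transducer positions; k: acoustic wavenumber."""
    r = np.asarray(elements, float)
    f = np.asarray(focus, float)
    theta = np.arctan2(r[:, 1], r[:, 0])        # azimuth of element i
    delay = -k * np.linalg.norm(r - f, axis=1)  # focusing term
    return np.mod(delay + l * theta, 2 * np.pi)

def sweep_path(elements, path, l, k):
    """Phase patterns that carry the trapped particle along `path` waypoints."""
    return [focused_vortex_phases(elements, p, l, k) for p in path]

# Four elements on a unit ring, focus at the array centre.
elems = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
ph0 = focused_vortex_phases(elems, (0, 0, 0), 0, 2 * np.pi)
ph1 = focused_vortex_phases(elems, (0, 0, 0), 1, 2 * np.pi)
```

Stepping the focus through the planned waypoints and re-emitting the corresponding phase pattern is what moves the trapped particle from the initial position to the designated position.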
S306, receiving and displaying a feedback image of the target particles at the designated position.
Specifically, the VR headset of the virtual reality device displays a VR image of a current three-dimensional model of the particle to be manipulated, and the current three-dimensional model of the particle to be manipulated is established in advance through, for example, a computer device.
In summary, in the acoustic tweezers control method based on virtual reality technology provided by the embodiment of the present application, the operating handle or the operating glove is used to select the target particles in the virtual three-dimensional space, determine the initial position and the designated position of the target particles, and calculate the moving path information of the target particles. According to the moving path information, the acoustic tweezers device can be controlled to generate a vortex acoustic beam, and finally the vortex acoustic beam controls the target particles to move from the initial position to the designated position, thereby completing the fine manipulation of the particles.
It should be noted that, for the descriptions of the same steps and the same contents in this embodiment as those in other embodiments, reference may be made to the descriptions in other embodiments, which are not described herein again.
Based on the second and third embodiments, the present application provides a computer system. Referring to fig. 8, the computer system 400 includes a central processing unit (CPU) 401 that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 402 or a program loaded from a storage section into a random access memory (RAM) 403. The RAM 403 also stores various programs and data necessary for system operation. The CPU 401, ROM 402, and RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
The following components are connected to the I/O interface 405: an input section 406 including a keyboard, a mouse, and the like; an output section 407 including a display device such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 408 including a hard disk and the like; and a communication section 409 including a network interface card such as a LAN card, a modem, or the like. The communication section 409 performs communication processing via a network such as the internet. A driver 410 is also connected to the I/O interface 405 as needed. A removable medium 411 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 410 as necessary, so that a computer program read out therefrom is mounted into the storage section 408 as necessary.
In particular, according to embodiments of the present application, the process described above with reference to the flowchart of fig. 6 may be implemented as a computer software program. For example, a second embodiment of the present application includes a computer program product comprising a computer program carried on a computer readable medium, the computer program being executable by the CPU401 to perform the steps of:
receiving an operation instruction sent by virtual reality equipment, wherein the operation instruction is obtained by the virtual reality equipment according to user state information;
processing the operation instruction, and sending the processed operation instruction to the acoustic tweezers equipment; the operating instruction is used for controlling the acoustic tweezers device to generate a vortex acoustic beam, and controlling the target particles to move to a specified position through the vortex acoustic beam so as to complete the manipulation of the target particles.
Another example is: the process described above with reference to the flowchart of fig. 7 may be implemented as a computer software program. For example, a third embodiment of the present application includes a computer program product comprising a computer program carried on a computer readable medium, the computer program being executable by the CPU401 to perform the steps of:
displaying a virtual reality image of the operating handle or the operating glove in real time;
calculating the pointing direction and the steering direction of a virtual reality image of an operating handle or an operating glove through the hand motion posture of the operating handle or the operating glove, selecting a first particle to be controlled, which is intersected with the pointing direction, as a target particle, and acquiring the initial position of the target particle;
determining a designated position of the target particle;
calculating moving path information of the target particles through the initial positions of the target particles and the designated positions of the target particles;
controlling the acoustic tweezers equipment to generate vortex acoustic beams according to the moving path information, and controlling the target particles to move from the initial position to the specified position through the vortex acoustic beams;
and receiving and displaying a feedback image of the target particles at the designated position.
In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 409, and/or installed from the removable medium 411.
It should be noted that the computer readable medium shown in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts in the figures illustrate the architecture, functionality, and operation of possible implementations of virtual reality technology based acoustic tweezer control apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware, and the described units or modules may also be disposed in a processor, which may be described as: a processor comprising a virtual reality device module, a computer device module, and an acoustic tweezers device module. The name of a unit or module does not in any way constitute a limitation on the unit or module itself.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the terminal described in the above embodiments; or may exist separately and not be assembled into the terminal. The computer readable medium carries one or more programs, and when the one or more programs are executed by a terminal, the terminal is enabled to implement the method for controlling acoustic tweezers based on virtual reality technology as in the second or third embodiment.
For example, the terminal may implement the following as shown in fig. 6:
s201, receiving an operation instruction sent by virtual reality equipment, wherein the operation instruction is obtained by the virtual reality equipment according to user state information;
s202, processing the operation instruction, and sending the processed operation instruction to the acoustic tweezers device; the operating instruction is used for controlling the acoustic tweezers device to generate a vortex acoustic beam, and controlling the target particles to move to a specified position through the vortex acoustic beam so as to complete the manipulation of the target particles.
For another example, the terminal may implement the following as shown in fig. 7:
s301, displaying a virtual reality image of the operating handle or the operating glove in real time;
s302, calculating the direction and the steering of a virtual reality image of an operating handle or an operating glove through the hand motion posture of the operating handle or the operating glove, selecting a first particle to be controlled intersecting with the direction of the direction as a target particle, and acquiring the initial position of the target particle;
s303, determining the designated position of the target particle;
s304, calculating the moving path information of the target particles through the initial positions of the target particles and the appointed positions of the target particles;
s305, controlling the acoustic tweezers device to generate a vortex acoustic beam according to the moving path information, and controlling the target particles to move from the initial position to the specified position through the vortex acoustic beam;
s306, receiving and displaying a feedback image of the target particles at the designated position.
It should be noted that although in the above detailed description several modules or units of the terminal for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. An acoustic tweezers control device based on virtual reality technology, characterized by comprising:
the virtual reality equipment is used for acquiring user state information, converting the user state information into an operation instruction and sending the operation instruction;
the computer equipment is used for receiving the operation instruction and processing the operation instruction; and
and the acoustic tweezers device is used for receiving the processed operation instruction, the operation instruction controls the acoustic tweezers device to generate a vortex acoustic beam, and the vortex acoustic beam controls the target particles to move to a specified position so as to complete the control of the target particles.
2. The virtual reality technology-based acoustic tweezers control device according to claim 1, wherein:
the user state information comprises head position and/or head turning data of a user and operation actions of hands of the user;
the virtual reality device comprises a virtual reality helmet which acquires head position and/or head steering data of a user in real time;
the virtual reality equipment further comprises an operating handle or an operating glove, and the operating handle or the operating glove is used for acquiring the operating action of the hand of the user in real time.
3. The virtual reality technology-based acoustic tweezers control device according to claim 1, wherein the virtual reality device selects target particles from the particles to be manipulated by operating a handle or an operating glove and obtains an initial position of the target particles; determining the designated position of the target particles by moving the operating handle or the operating glove; and calculating the moving path information of the target particles through the initial positions of the target particles and the designated positions of the target particles.
4. The virtual reality technology-based acoustic tweezers control device according to claim 3, wherein the virtual reality device displays a virtual reality image of the operating handle or the operating glove in real time, the virtual reality device calculates the pointing direction and the turning direction of the virtual reality image of the operating handle or the operating glove through the hand motion gesture of the operating handle or the operating glove, and selects the first particle to be controlled intersecting the pointing direction as the target particle.
5. The virtual reality technology-based acoustic tweezers control device according to claim 1, further comprising a microscope, wherein the microscope sends initial image data of the particles to be manipulated, captured before movement, to the computer device; the computer device extracts image feature data from the initial image data, establishes an initial three-dimensional model of the particles to be manipulated from the extracted feature data, and sends the initial three-dimensional model to the virtual reality device for display.
6. The virtual reality technology-based acoustic tweezer control apparatus of claim 5, wherein the microscope is further configured to: after the target particles have moved to the designated position, send current image data of the particles to be manipulated to the computer device; the computer device performs feature extraction on the current image data, establishes a current three-dimensional model of the particles to be manipulated from the extracted feature data, and sends the current three-dimensional model to the virtual reality device for display.
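Claims 5 and 6 leave the "feature extraction" step unspecified. One plausible minimal reading, sketched below purely for illustration (a real system would use a computer-vision library and add depth information to place the particles in a 3-D scene), is to threshold the greyscale microscope frame and take the centroid of each connected bright blob as a particle position:

```python
import numpy as np
from collections import deque

def particle_centroids(image, threshold=0.5):
    """Hypothetical feature-extraction step: threshold a greyscale
    microscope image and return the (row, col) centroid of every
    4-connected bright blob, in row-major discovery order."""
    mask = np.asarray(image, dtype=float) > threshold
    seen = np.zeros_like(mask, dtype=bool)
    rows, cols = mask.shape
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                # flood-fill one connected blob with a BFS
                queue, pixels = deque([(r, c)]), []
                seen[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```

The centroids extracted before and after the move would seed the "initial" and "current" three-dimensional models that claims 5–6 send to the headset.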
7. An acoustic tweezers control method based on virtual reality technology, characterized by comprising:
receiving an operation instruction sent by a virtual reality device, wherein the operation instruction is obtained by the virtual reality device from user state information;
processing the operation instruction, and sending the processed operation instruction to the acoustic tweezers device;
wherein the operation instruction is used to control the acoustic tweezers device to generate a vortex acoustic beam, and the vortex acoustic beam moves the target particles to a designated position, thereby completing the manipulation of the target particles.
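Claim 7 describes a receive → process → forward pipeline without saying what "processing" entails. As one hedged illustration (the instruction payload, field names, and the choice of clamping to a workspace as the processing step are all assumptions), the computer device might validate the requested destination before forwarding it to the tweezers hardware:

```python
from dataclasses import dataclass

@dataclass
class OperationInstruction:
    """Illustrative payload for the VR -> computer -> tweezers pipeline;
    the patent does not specify a wire format."""
    target_id: int
    destination: tuple  # requested (x, y, z) designated position

def process_instruction(raw, workspace_bounds):
    """Hypothetical 'processing' step: clamp each coordinate of the
    requested destination into the reachable workspace interval before
    the instruction is sent on to the acoustic tweezers device."""
    lo, hi = workspace_bounds
    clamp = lambda v: min(max(v, lo), hi)
    x, y, z = raw.destination
    return OperationInstruction(raw.target_id, (clamp(x), clamp(y), clamp(z)))
```

An out-of-range request such as `(2.0, -5.0, 0.5)` with bounds `(-1.0, 1.0)` would be forwarded as `(1.0, -1.0, 0.5)`.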
8. The virtual reality technology-based acoustic tweezers control method according to claim 7, further comprising:
receiving initial image data of the particles to be manipulated, captured before movement, sent by a microscope;
extracting image feature data from the initial image data, and establishing an initial three-dimensional model of the particles to be manipulated from the extracted feature data;
and sending the initial three-dimensional model to the virtual reality device.
9. The virtual reality technology-based acoustic tweezers control method according to claim 7, further comprising:
after the target particles have moved to the designated position, receiving current image data of the particles to be manipulated sent by the microscope;
performing feature extraction on the current image data, and establishing a current three-dimensional model of the particles to be manipulated from the extracted feature data;
and sending the current three-dimensional model of the particles to be manipulated to the virtual reality device.
10. An acoustic tweezers control method based on virtual reality technology, characterized by comprising the following steps:
displaying a virtual reality image of the operating handle or the operating glove in real time;
calculating the pointing direction and orientation of a virtual reality image of the operating handle or operating glove from the hand motion posture of the operating handle or operating glove, selecting the first particle to be manipulated intersected by the pointing direction as the target particle, and obtaining the initial position of the target particle;
determining the designated position of the target particle;
calculating the moving-path information of the target particle from the initial position and the designated position of the target particle;
controlling the acoustic tweezers device to generate a vortex acoustic beam according to the moving-path information, and moving the target particle from the initial position to the designated position with the vortex acoustic beam;
and receiving and displaying a feedback image of the target particle at the designated position.
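The moving-path computation in claims 3 and 10 (initial position plus designated position → path information) is left abstract. A minimal sketch, assuming straight-line transport discretised into waypoints (the step size and waypoint representation are illustrative; a real controller would convert each waypoint into a successive vortex-beam trap position):

```python
import numpy as np

def move_path(initial, designated, step=0.1):
    """Discretise the straight line from the initial position to the
    designated position into waypoints no farther apart than `step`.
    Returns the list of (x, y, z) waypoints, endpoints included."""
    p0 = np.asarray(initial, dtype=float)
    p1 = np.asarray(designated, dtype=float)
    dist = np.linalg.norm(p1 - p0)
    n = max(int(np.ceil(dist / step)), 1)  # at least one segment
    return [tuple(p0 + (p1 - p0) * k / n) for k in range(n + 1)]
```

For a 1-unit move with a 0.5-unit step this yields three waypoints, the last being the designated position, which would then trigger the feedback image of the final step above.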
CN201911119071.2A 2019-11-15 2019-11-15 Sound tweezers control device and method based on virtual reality technology Withdrawn CN110850985A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911119071.2A CN110850985A (en) 2019-11-15 2019-11-15 Sound tweezers control device and method based on virtual reality technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911119071.2A CN110850985A (en) 2019-11-15 2019-11-15 Sound tweezers control device and method based on virtual reality technology

Publications (1)

Publication Number Publication Date
CN110850985A true CN110850985A (en) 2020-02-28

Family

ID=69601857

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911119071.2A Withdrawn CN110850985A (en) 2019-11-15 2019-11-15 Sound tweezers control device and method based on virtual reality technology

Country Status (1)

Country Link
CN (1) CN110850985A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112450995A (en) * 2020-10-28 2021-03-09 杭州无创光电有限公司 Situation simulation endoscope system
CN114280983A (en) * 2021-12-08 2022-04-05 深圳先进技术研究院 Sound control method and system based on man-machine interaction
WO2023102774A1 (en) * 2021-12-08 2023-06-15 深圳先进技术研究院 Acoustic control method and system based on human-machine interaction

Similar Documents

Publication Publication Date Title
CN110850985A (en) Sound tweezers control device and method based on virtual reality technology
US11279022B2 (en) Robot control, training and collaboration in an immersive virtual reality environment
US20230168787A1 (en) Systems, apparatuses, interfaces and implementing methods for displaying and manipulating temporal or sequential objects
Poupyrev et al. Egocentric object manipulation in virtual environments: empirical evaluation of interaction techniques
Jayaram et al. Assessment of VR technology and its applications to engineering problems
US7536655B2 (en) Three-dimensional-model processing apparatus, three-dimensional-model processing method, and computer program
Lucente et al. Visualization space: A testbed for deviceless multimodal user interface
Manoharan et al. Precision improvement and delay reduction in surgical telerobotics
Kihonge et al. Spatial mechanism design in virtual reality with networking
JP2014203463A (en) Creating ergonomic manikin postures and controlling computer-aided design environments using natural user interfaces
Nikolakis et al. Cybergrasp and phantom integration: Enhanced haptic access for visually impaired users
Tsamis et al. Intuitive and safe interaction in multi-user human robot collaboration environments through augmented reality displays
WO2018156087A1 (en) Finite-element analysis augmented reality system and method
Schkolne et al. Immersive design of DNA molecules with a tangible interface
Kensek et al. Augmented reality: An application for architecture
Aladin et al. Designing user interaction using gesture and speech for mixed reality interface
Su et al. Effective manipulation for industrial robot manipulators based on tablet PC
Dani et al. COVIRDS: a conceptual virtual design system
Mohanty et al. Kinesthetic metaphors for precise spatial manipulation: a study of object rotation
Liu et al. COMTIS: Customizable touchless interaction system for large screen visualization
Köse et al. Dynamic predictive modeling approach of user behavior in virtual reality based application
CN106681516B (en) Natural man-machine interaction system based on virtual reality
Guan et al. A novel robot teaching system based on augmented reality
Zhijiang et al. Virtual reality-based telesurgery via teleprogramming scheme combined with semi-autonomous control
Yu et al. Google glass-based remote control of a mobile robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication
Application publication date: 20200228