CN116149474A - Control method and device of near-eye display equipment and near-eye display equipment - Google Patents


Publication number
CN116149474A
CN116149474A (application number CN202310067681.2A)
Authority
CN
China
Prior art keywords
user
hand
virtual object
target virtual
eye display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310067681.2A
Other languages
Chinese (zh)
Inventor
殷红杰
孙洪涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Inc
Original Assignee
Goertek Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Inc filed Critical Goertek Inc
Priority to CN202310067681.2A
Publication of CN116149474A
Legal status: Pending

Classifications

    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

The disclosure provides a control method and device for a near-eye display device, and the near-eye display device itself. The method comprises the following steps: acquiring hand pose information of a user; determining the relationship between the user's hand and a target virtual object according to the hand pose information and the position information of the target virtual object, where the relationship includes whether the user's hand touches the target virtual object; and, when the user's hand touches the target virtual object, controlling a haptic feedback device worn on the user's hand to apply a tactile sensation to the hand.

Description

Control method and device of near-eye display equipment and near-eye display equipment
Technical Field
The disclosure relates to the technical field of near-eye display, and more particularly, to a control method and device of near-eye display equipment and the near-eye display equipment.
Background
When using a near-eye display device such as AR (Augmented Reality) glasses, VR (Virtual Reality) glasses, or MR (Mixed Reality) glasses, the user interacts with virtual objects in the scene that the device presents, for example by touching a virtual object with a hand and picking it up.
At present, when a user operates a virtual object, the user cannot obtain a realistic operating experience; for example, no tactile sensation is felt after the hand touches the virtual object, so the user experience is poor.
Therefore, how to give the user a realistic experience when operating a virtual object is a problem to be solved by those skilled in the art.
Disclosure of Invention
It is an object of the present disclosure to provide a new solution for near-eye display device control.
According to a first aspect of the present disclosure, there is provided a control method of a near-eye display device, including:
acquiring hand pose information of a user;
determining the relationship between the hand of the user and the target virtual object according to the hand pose information and the position information of the target virtual object, wherein the relationship between the hand of the user and the target virtual object comprises whether the hand of the user touches the target virtual object or not;
when the hand of the user touches the target virtual object, the tactile feedback device worn by the hand of the user is controlled to apply tactile sensation to the hand of the user.
Optionally, controlling the haptic feedback device worn on the user's hand to apply a tactile sensation to the user's hand when the hand touches the target virtual object includes:
determining the operation parameters of the haptic feedback device according to the attribute parameters of the target virtual object under the condition that the hand of the user touches the target virtual object;
controlling a haptic feedback device to apply a tactile sensation to a user's hand according to the operating parameter;
the attribute parameters of the target virtual object include at least one of a material, strength, hardness, elasticity, surface roughness and temperature corresponding to the target virtual object.
Optionally, the relationship between the user's hand and the target virtual object further includes whether the hand holds the target virtual object;
and controlling the haptic feedback device worn on the user's hand to apply a tactile sensation to the user's hand when the hand touches the target virtual object includes:
under the condition that the hand of a user touches a target virtual object, if the hand of the user holds the target virtual object, determining the operation parameters of the haptic feedback device according to the holding tightness and the attribute parameters of the target virtual object;
controlling a haptic feedback device to apply a tactile sensation to a user's hand according to the operating parameter;
wherein, the attribute parameters of the target virtual object comprise at least one of materials, strength, hardness, elasticity, surface roughness and temperature corresponding to the target virtual object;
and the holding tightness is determined according to the hand pose information and the corresponding size and weight of the target virtual object.
Optionally, the haptic feedback device includes at least one of a vibration feedback module, a pressure feedback module, and a temperature feedback module;
in the case where the haptic feedback device includes a vibration feedback module, the operating parameters include the vibration intensity and vibration frequency output by the vibration feedback module; or,
in the case where the haptic feedback device includes a pressure feedback module, the operating parameter includes the pressure output by the pressure feedback module; or,
in the case where the haptic feedback device includes a temperature feedback module, the operating parameter includes the temperature output by the temperature feedback module.
Optionally, the method further comprises:
when the user's hand touches the target virtual object, responding to a moving operation of the user on the target virtual object, and adjusting the operating parameters according to the speed at which the user moves the target virtual object.
Optionally, the acquiring the hand pose information of the user includes:
and obtaining hand pose information of the user according to the image acquired by the camera of the near-eye display device and data output by a target sensor worn by the hand of the user, wherein the target sensor comprises an acceleration sensor and an angular velocity sensor.
According to a second aspect of the present disclosure, there is provided a control apparatus of a near-eye display device, comprising:
a receiving module, used for receiving the hand pose information of the user sent by a haptic glove on the user's hand;
a first determining module, used for determining whether the user's hand successfully touches the target virtual object according to the hand pose information of the user and the position information of the target virtual object;
and a control module, used for controlling the haptic glove to transmit vibration to the user's hand when the hand successfully touches the target virtual object.
Optionally, the apparatus further comprises a second determination module;
the second determining module is configured to determine at least one of intensity and frequency of vibration of the vibration device according to a target parameter of the target virtual object when the hand of the user successfully touches the target virtual object, where the target parameter includes at least one of intensity, hardness, and elasticity of the target virtual object.
According to a third aspect of the present disclosure, there is provided a near-eye display device comprising an apparatus as described in the second aspect; or,
a memory for storing computer instructions and a processor for invoking the computer instructions from the memory to perform the method of any of the first aspects.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any of the first aspects.
According to the control method of the near-eye display device provided by the embodiments, when the user's hand touches the target virtual object, the haptic feedback device worn on the user's hand is controlled to apply a tactile sensation to the hand, so that the user obtains a realistic experience of operating the virtual objects that the near-eye display device presents.
Other features of the present disclosure and its advantages will become apparent from the following detailed description of exemplary embodiments of the disclosure, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a block diagram of a hardware configuration of a near-eye display device implementing a control method of the near-eye display device according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a control method of a near-eye display device provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a control device of the near-eye display apparatus provided in the embodiment of the present application;
fig. 4 is a schematic structural diagram of a near-eye display device provided in an embodiment of the present application.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
< hardware configuration embodiment >
Fig. 1 is a block diagram of a hardware configuration of a near-eye display device implementing a control method of the near-eye display device according to an embodiment of the present disclosure.
The near-eye display device 1000 may include a processor 1100, a memory 1200, an interface apparatus 1300, a communication apparatus 1400, a display apparatus 1500, an input apparatus 1600, a speaker 1700, a microphone 1800, and a camera 1900, among others. The processor 1100 may be a central processing unit CPU, a microprocessor MCU, or the like. The memory 1200 includes, for example, ROM (read only memory), RAM (random access memory), nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 can perform wired or wireless communication, for example. The display device 1500 is, for example, a liquid crystal display, a touch display, or the like. The input device 1600 may include, for example, a touch screen, keyboard, etc. A user may input/output voice information through the speaker 1700 and the microphone 1800, and the camera 1900 is used to capture images.
Although a plurality of devices are shown for the near-eye display apparatus 1000 in fig. 1, the present disclosure may relate to only some of the devices therein, for example, the near-eye display apparatus 1000 relates to only the memory 1200 and the processor 1100.
In an embodiment applied to the present disclosure, the memory 1200 of the near-eye display device 1000 is used to store instructions for controlling the processor 1100 to perform the control method of the near-eye display device provided by the embodiment of the present disclosure.
In the above description, a skilled person may design instructions according to the disclosed aspects of the present disclosure. How the instructions control the processor to operate is well known in the art and will not be described in detail here.
< method example >
The disclosed embodiments provide a control method of a near-eye display device, which is performed by a near-eye display device 1000 as shown in fig. 1.
As shown in fig. 2, the method includes the following S2100-S2300:
s2100: and acquiring hand pose information of the user.
Acquiring hand pose information of a user, including: and obtaining hand pose information of the user according to the image acquired by the camera of the near-eye display device and data output by the target sensor worn by the hand of the user, wherein the target sensor comprises an acceleration sensor and an angular velocity sensor.
Specifically, the hand pose information can be determined from the positional relationship among different key points of the user's hand in the image acquired by the camera of the near-eye display device; alternatively, a target sensor can be worn on the user's hand and the pose information determined effectively from the data it outputs, where the target sensor comprises an acceleration sensor and an angular velocity sensor.
For example, a plurality of IMUs (Inertial Measurement Units) may be disposed on the user's hand. An IMU is a device for measuring the three-axis attitude angles (or angular rates) and accelerations of an object; typically, one IMU comprises three single-axis accelerometers and three single-axis gyroscopes. The accelerometers detect the object's acceleration along the three independent axes of the carrier coordinate system, and the gyroscopes detect the carrier's angular velocity relative to the navigation coordinate system. From the angular velocities and accelerations measured at different positions of the user's hand in three-dimensional space, the pose of the hand can be determined in a timely and effective manner, so that the positional relationship between the hand and the virtual object can then be established.
The target sensor and the near-eye display device can exchange data. After the near-eye display device obtains the data output by the target sensor worn on the user's hand, the hand pose information can be computed by a processor built into the near-eye display device, or the sensor data can be sent to a server and the hand pose information computed by the server received back.
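As an illustrative sketch only (not part of the patented method), one common way to fuse an accelerometer and a gyroscope into a single attitude angle is a complementary filter: the gyro integral is smooth but drifts, while the accelerometer's gravity-based estimate is noisy but drift-free. The function name and blend factor below are assumptions.

```python
import math

def estimate_pitch(accel, gyro_rate, prev_pitch, dt, alpha=0.98):
    """Complementary filter for one attitude angle (pitch).

    accel     -- (ax, ay, az) accelerometer reading in m/s^2
    gyro_rate -- pitch angular rate in rad/s
    prev_pitch, dt -- previous estimate (rad) and time step (s)
    alpha     -- blend factor (illustrative choice): weight of the gyro path
    """
    ax, ay, az = accel
    # Drift-free but noisy estimate from the gravity vector
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Smooth but drifting estimate from integrating the gyro
    gyro_pitch = prev_pitch + gyro_rate * dt
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

In practice one filter per axis and per IMU would run at the sensor rate, and the resulting joint angles would feed the hand pose used in the following steps.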
S2200: and determining the relationship between the hand of the user and the target virtual object according to the hand pose information and the position information of the target virtual object, wherein the relationship between the hand of the user and the target virtual object comprises whether the hand of the user touches the target virtual object or not.
The near-eye display device in the embodiments of the present application may be AR glasses, VR glasses, MR glasses, and the like. AR (Augmented Reality) technology enhances the user's perception of the real world with information provided by a computer system, superimposing computer-generated virtual objects, scenes, or system prompts onto the real scene to achieve an "augmentation" of reality.
VR technology uses computer simulation to generate a virtual three-dimensional world, providing the user with simulated visual, auditory, tactile, and other sensory input, so that the user has a sufficient sense of immersion and presence and can observe and operate things in that three-dimensional space in real time and without restriction.
MR spans augmented reality and augmented virtuality, referring to a new visual environment created by merging the real and virtual worlds, in which physical and digital objects coexist and can interact in real time. A scene presented to the user by the near-eye display device may contain virtual objects that the user can manipulate: for example, while playing a game with the near-eye display device, the user may touch, lift, move, or throw virtual objects, such as grabbing virtual bullets after touching them and loading them into the clip of a virtual pistol.
According to the position information of the target virtual object and the pose information of the user's hand, the relationship between the hand and the target virtual object can be determined by coordinate transformation, and in particular whether the hand touches the target virtual object. This determines whether to control the haptic feedback device to apply a tactile sensation to the user's hand; when the hand does not touch the target virtual object, unnecessary stimulation of the hand is avoided and the power consumption of the haptic feedback device is reduced.
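Once the hand key points and the object are expressed in the same world frame, the touch determination can be sketched as a simple geometric test. The axis-aligned bounding-box check, function name, and tolerance below are illustrative assumptions, not the patent's specific method.

```python
def hand_touches_object(fingertips, obj_center, obj_half_extents, tolerance=0.01):
    """Return True if any fingertip lies within the object's axis-aligned
    bounding box, expanded by `tolerance` metres.

    fingertips       -- iterable of (x, y, z) points in the world frame
    obj_center       -- (x, y, z) centre of the virtual object
    obj_half_extents -- (hx, hy, hz) half-sizes of its bounding box
    """
    cx, cy, cz = obj_center
    hx, hy, hz = obj_half_extents
    for px, py, pz in fingertips:
        if (abs(px - cx) <= hx + tolerance and
                abs(py - cy) <= hy + tolerance and
                abs(pz - cz) <= hz + tolerance):
            return True
    return False
```

A real system would likely use the object's actual collision mesh rather than a box, but the principle of comparing transformed hand key points against the object's extent is the same.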
In the embodiments of the present application, the virtual object closest to the user's hand may be taken as the target virtual object; a subset of the virtual objects in the scene may be designated as targets; or a virtual object located in the direction of movement of the user's hand may be taken as the target virtual object.
S2300: when the hand of the user touches the target virtual object, the tactile feedback device worn by the hand of the user is controlled to apply tactile sensation to the hand of the user.
In one embodiment, in a case that a hand of a user touches a target virtual object, in order to make an operation experience of the user on the target virtual object more realistic, a haptic feedback device worn by the hand of the user may be controlled to apply a touch feeling to the hand of the user, which specifically includes:
under the condition that the hand of a user touches a target virtual object, according to the attribute parameters of the target virtual object, the operation parameters of the tactile feedback device can be determined; the haptic feedback device may be controlled to apply a tactile sensation to the user's hand according to the operating parameters; the attribute parameters of the target virtual object comprise at least one of materials, strength, hardness, elasticity, surface roughness and temperature corresponding to the target virtual object.
In one embodiment, the operating parameters of the haptic feedback device may be constrained to a target interval, so that excessive stimulation is never applied to the user's hand and injury is avoided; for example, the temperature applied to the user's hand can be limited to between 0 and 40 degrees Celsius.
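A sketch of mapping the object's attribute parameters to operating parameters, with the safety interval applied, might look like the following. All mapping constants, dictionary keys, and ranges are illustrative assumptions; only the 0 to 40 degree temperature limit comes from the text above.

```python
SAFE_TEMP_RANGE = (0.0, 40.0)  # degrees Celsius, per the safety interval above

def clamp(value, lo, hi):
    """Constrain a value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

def operating_parameters(attrs):
    """Map a virtual object's attribute parameters to haptic operating
    parameters. Attribute keys and scaling constants are illustrative."""
    hardness = attrs.get("hardness", 0.5)             # normalised 0..1
    roughness = attrs.get("surface_roughness", 0.5)   # normalised 0..1
    temperature = attrs.get("temperature", 20.0)      # degrees Celsius
    return {
        # harder objects: stronger, sharper vibration on contact
        "vibration_intensity": clamp(hardness, 0.0, 1.0),
        # rougher surfaces: higher vibration frequency
        "vibration_frequency_hz": clamp(50.0 + 250.0 * roughness, 50.0, 300.0),
        # harder objects: more contact pressure
        "pressure_n": clamp(2.0 * hardness, 0.0, 5.0),
        # object temperature, limited to the safe interval
        "temperature_c": clamp(temperature, *SAFE_TEMP_RANGE),
    }
```

The clamp on every channel is the point of the example: whatever the object's attributes say, the output never leaves the safe operating interval.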
In the real world, the hands of users feel different to objects of different materials, for example, they feel soft when touching soft cotton and feel hard when touching rock.
The near-eye display device likewise assigns corresponding attribute parameters, such as material, strength, hardness, elasticity, surface roughness, and temperature, to the virtual objects in the scene it presents. When the user touches the target virtual object, the haptic feedback device worn on the user's hand can therefore deliver a tactile sensation corresponding to the attribute parameters of that object, making the user's operating experience more realistic.
When the target virtual object touched by the user changes, the attribute parameters of the changed target virtual object can be obtained, so that the attribute parameters of the virtual object touched by the user can be timely obtained.
In one embodiment, the haptic feedback device includes at least one of a vibration feedback module, a pressure feedback module, and a temperature feedback module.
In particular, in the case where the haptic feedback device includes a vibration feedback module, the operating parameters include the intensity of vibration and the frequency of vibration output by the vibration feedback module; alternatively, where the haptic feedback device includes a pressure feedback module, the operating parameter includes a pressure output by the pressure feedback module; alternatively, where the haptic feedback device includes a temperature feedback module, the operating parameter includes a temperature output by the temperature feedback module.
Therefore, when the haptic feedback device comprises several feedback modules, for example both the vibration feedback module and the temperature feedback module, the user's hand can receive the vibration and temperature corresponding to the target virtual object at the same time. This enriches the variety of tactile sensations the hand receives when touching the target virtual object and makes the interaction more engaging.
For example, by providing the vibration feedback module, the pressure feedback module, and the temperature feedback module in the haptic feedback device at the same time, a user whose hand touches a virtual stone can simultaneously feel the stone's temperature, the pressure feedback of the contact, and the vibration feedback of the contact.
In one embodiment, since the user's operation on the target virtual object may not be limited to touching, the relationship between the user's hand and the target virtual object further includes whether the hand holds the target virtual object. The user may hold the whole object, grip part of it to lift or drag it, or hold part or all of it in the hand, for example gripping a virtual pistol; this application does not limit the form of holding.
Whether the user's hand holds the target virtual object can be determined from the hand pose information and the position information of the target virtual object; this is common knowledge to those skilled in the art and is not repeated here.
Under the condition that the hand of the user touches the target virtual object, if the hand of the user holds the target virtual object, the operation parameters of the tactile feedback device can be determined according to the holding tightness and the attribute parameters of the target virtual object; according to the operation parameters, the touch feedback device is controlled to apply touch to the hands of the user, so that the touch corresponding to the holding tightness can be transferred to the hands of the user, and the operation experience of the user is more real.
The attribute parameters of the target virtual object comprise at least one of materials, strength, hardness, elasticity, surface roughness and temperature corresponding to the target virtual object; the holding tightness can be determined according to the hand pose information and the corresponding size and weight of the target virtual object. It should be noted that, in the embodiment of the present application, the degree of tightness of gripping may describe the strength of gripping of the same virtual object by the hand of the user in the same gripping manner.
For example, a user holds a certain virtual stone in a hand, and the holding tightness degree of the user can be determined according to the hand pose information of the user and the corresponding size and weight of the virtual stone; the user can correspondingly grip the virtual stone with different grip gestures according to different grip tightness.
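A heuristic for the holding tightness described above, combining hand closure with the object's size and weight, might be sketched as follows. The formula, saturation constants, and function name are assumptions for illustration only, not the patented computation.

```python
def grip_tightness(finger_distances, obj_size, obj_weight):
    """Estimate holding tightness in [0, 1].

    finger_distances -- fingertip-to-palm distances (metres), from hand pose
    obj_size         -- characteristic size of the object (metres)
    obj_weight       -- weight of the object (kg)
    """
    avg_closure = sum(finger_distances) / len(finger_distances)
    # Closure: fingers squeezed tighter than the object's size => higher value
    closure = max(0.0, 1.0 - avg_closure / obj_size)
    # Heavier objects imply a firmer grip; saturate at an assumed 5 kg
    weight_factor = min(1.0, obj_weight / 5.0)
    return min(1.0, closure * (0.5 + 0.5 * weight_factor))
```

Recomputing this whenever the hand pose or the object's size or weight changes matches the update behaviour described in the next paragraph.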
In the embodiment of the application, when the gesture of the user or the size or weight of the virtual object changes, the holding tightness degree of the user can be redetermined, so that the holding tightness degree can be updated in time.
In one embodiment, when the user's hand touches the target virtual object, the user may also move it. In response to such a moving operation, the operating parameters can be adjusted according to the speed at which the user moves the target virtual object, so that a tactile sensation corresponding to that speed is delivered to the user and the operating experience is more realistic.
For example, suppose the user pushes a virtual block in the scene presented by the near-eye display device so that it moves horizontally, and the pushing speed changes; the user's hand is then not merely in contact with the virtual block but is pushing against it, so the applied tactile sensation should change with the pushing speed.
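Adjusting the operating parameters with the movement speed could be sketched as below. The linear scaling law, the speed cap, and the parameter names are illustrative assumptions.

```python
def adjust_for_motion(params, speed_m_s, max_speed=2.0):
    """Scale vibration with the speed at which the hand moves the object,
    so that faster pushes feel stronger.

    params    -- dict with "vibration_intensity" and "vibration_frequency_hz"
    speed_m_s -- current speed of the hand relative to the object (m/s)
    max_speed -- assumed speed at which the scaling saturates
    """
    scale = 1.0 + min(speed_m_s, max_speed) / max_speed  # ranges 1.0 .. 2.0
    adjusted = dict(params)
    adjusted["vibration_intensity"] = min(1.0, params["vibration_intensity"] * scale)
    adjusted["vibration_frequency_hz"] = min(300.0, params["vibration_frequency_hz"] * scale)
    return adjusted
```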
When the user's hand touches the target virtual object, the position information of the target virtual object can be bound to the pose information of the hand, and the object's position updated accordingly as the hand pose changes. For example, when it is determined that the user holds the target virtual object in the palm, the movement speed of the hand can be taken as the speed at which the user moves the target virtual object.
When the user's hand gesture changes significantly, for example when the hand changes from gripping to open, the binding between the hand pose information and the position information of the virtual object can be released, and whether the hand is still in contact with the virtual object can be re-determined.
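This bind-and-release behaviour can be sketched as a small state holder. The class name, the closed-hand flag, and the update protocol are assumptions for illustration.

```python
class HeldObject:
    """Bind a virtual object's position to the hand while it is held;
    release the binding when the hand opens."""

    def __init__(self, position):
        self.position = list(position)
        self.bound = False

    def update(self, hand_position, hand_is_closed):
        """Call once per frame with the latest hand pose. Returns True
        while the object is bound to the hand."""
        if hand_is_closed:
            self.bound = True
            self.position = list(hand_position)  # object follows the hand
        elif self.bound:
            self.bound = False  # hand opened: release, object stays in place
        return self.bound
```

After release, contact would be re-evaluated from scratch with the touch test, as described above.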
According to the control method of the near-eye display device provided by the embodiments of the present application, when the user's hand touches the target virtual object, the haptic feedback device worn on the hand is controlled to apply a tactile sensation to it, so that the user obtains a realistic experience of operating the virtual object.
< device example >
The disclosed embodiments provide a control device 40 of a near-eye display apparatus, the device 40 comprising: a receiving module 41, a first determining module 42, and a control module 43. Wherein:
the receiving module is used for receiving the hand pose information of the user sent by a haptic glove on the user's hand;
the first determining module is used for determining whether the user's hand successfully touches the target virtual object according to the hand pose information of the user and the position information of the target virtual object;
and the control module is used for controlling the haptic glove to transmit vibration to the user's hand when the hand successfully touches the target virtual object.
In one embodiment, the control device of the near-eye display apparatus further comprises a second determination module; the second determining module is used for determining at least one of the intensity and the frequency of vibration of the vibration device according to target parameters of the target virtual object under the condition that the hand of the user successfully touches the target virtual object, wherein the target parameters comprise at least one of the intensity, the hardness and the elasticity of the target virtual object.
According to the control device of the near-eye display device provided by the embodiments of the present application, when the user's hand touches the target virtual object, the haptic feedback device worn on the hand is controlled to apply a tactile sensation to it, so that the user obtains a realistic experience of operating the virtual object.
< device embodiment >
The disclosed embodiments provide a near-eye display device 50, the device 50 comprising any of the apparatus 40 as provided in the apparatus embodiments described above; or alternatively,
comprising a memory 51 and a processor 52, the memory 51 being for storing computer instructions, the processor 52 being for invoking the computer instructions from the memory 51 to perform the method according to any of the method embodiments described above.
< storage medium embodiment >
The disclosed embodiments provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method according to any of the above-described method embodiments.
The present disclosure may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions stored thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to respective computing/processing devices, or to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network and/or a wireless network. The network may include copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer program instructions for performing the operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to implement aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement of the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A control method of a near-eye display device, comprising:
acquiring hand pose information of a user;
determining the relationship between the hand of the user and the target virtual object according to the hand pose information and the position information of the target virtual object, wherein the relationship between the hand of the user and the target virtual object comprises whether the hand of the user touches the target virtual object or not;
controlling, in the case that the hand of the user touches the target virtual object, a haptic feedback device worn on the hand of the user to apply a tactile sensation to the hand of the user.
2. The method of claim 1, wherein the controlling the haptic feedback device worn by the user's hand to apply the haptic sensation to the user's hand in the event that the user's hand touches the target virtual object comprises:
determining the operation parameters of the haptic feedback device according to the attribute parameters of the target virtual object under the condition that the hand of the user touches the target virtual object;
controlling a haptic feedback device to apply a tactile sensation to a user's hand according to the operating parameter;
the attribute parameters of the target virtual object include at least one of a material, strength, hardness, elasticity, surface roughness and temperature corresponding to the target virtual object.
3. The method of claim 1, wherein the relationship of the user's hand to the target virtual object further comprises whether the user's hand is holding the target virtual object;
the touch feedback device for controlling the hand wearing of the user to apply touch to the hand of the user when the hand of the user touches the target virtual object comprises:
under the condition that the hand of a user touches a target virtual object, if the hand of the user holds the target virtual object, determining the operation parameters of the haptic feedback device according to the holding tightness and the attribute parameters of the target virtual object;
controlling a haptic feedback device to apply a tactile sensation to a user's hand according to the operating parameter;
wherein, the attribute parameters of the target virtual object comprise at least one of materials, strength, hardness, elasticity, surface roughness and temperature corresponding to the target virtual object;
and the holding tightness is determined according to the hand pose information and the corresponding size and weight of the target virtual object.
4. The method of claim 2 or 3, wherein the haptic feedback device comprises at least one of a vibration feedback module, a pressure feedback module, a temperature feedback module;
in the case where the haptic feedback device includes a vibration feedback module, the operating parameters include a vibration intensity and a vibration frequency output by the vibration feedback module; or
in the case where the haptic feedback device includes a pressure feedback module, the operating parameter includes a pressure output by the pressure feedback module; or
in the case where the haptic feedback device includes a temperature feedback module, the operating parameter includes a temperature output by the temperature feedback module.
5. The method of any one of claims 2-4, further comprising:
and under the condition that the hand of the user touches the target virtual object, responding to the moving operation of the user on the target virtual object, and adjusting the operation parameters according to the moving speed of the user on the target virtual object.
6. The method according to any one of claims 1-5, wherein the acquiring hand pose information of the user comprises:
and obtaining hand pose information of the user according to the image acquired by the camera of the near-eye display device and data output by a target sensor worn by the hand of the user, wherein the target sensor comprises an acceleration sensor and an angular velocity sensor.
7. A control apparatus for a near-eye display device, comprising:
the receiving module is used for receiving hand pose information of the user, which is sent by the touch glove of the hand of the user;
the first determining module is used for determining whether the hand of the user successfully touches the target virtual object according to the hand pose information of the user and the position information of the target virtual object;
and the control module is used for controlling the touch glove to transmit vibration to the hand of the user under the condition that the hand of the user successfully touches the target virtual object.
8. The control apparatus of a near-eye display device of claim 7, wherein the apparatus further comprises a second determination module;
the second determining module is configured to determine at least one of an intensity and a frequency of vibration of the vibration device according to a target parameter of the target virtual object in the case that the hand of the user successfully touches the target virtual object, where the target parameter includes at least one of a strength, a hardness and an elasticity of the target virtual object.
9. A near-eye display device comprising a memory for storing computer instructions and a processor for invoking the computer instructions from the memory to perform the method of any of claims 1-7.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, implements the method according to any of claims 1-7.
CN202310067681.2A 2023-01-16 2023-01-16 Control method and device of near-eye display equipment and near-eye display equipment Pending CN116149474A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310067681.2A CN116149474A (en) 2023-01-16 2023-01-16 Control method and device of near-eye display equipment and near-eye display equipment


Publications (1)

Publication Number Publication Date
CN116149474A 2023-05-23

Family

ID=86352194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310067681.2A Pending CN116149474A (en) 2023-01-16 2023-01-16 Control method and device of near-eye display equipment and near-eye display equipment

Country Status (1)

Country Link
CN (1) CN116149474A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination