WO2023246159A1 - A virtual reality interaction method, related apparatus, device, and storage medium - Google Patents

A virtual reality interaction method, related apparatus, device, and storage medium

Info

Publication number
WO2023246159A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
virtual
virtual reality
response
trigger
Prior art date
Application number
PCT/CN2023/078921
Other languages
English (en)
French (fr)
Inventor
崔兰
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Publication of WO2023246159A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25: Output arrangements for video game devices
    • A63F13/28: Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285: Generating tactile feedback signals via the game input device, e.g. force feedback
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/577: Simulating properties, behaviour or motion of objects in the game world, using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions, specially adapted for executing a specific type of game
    • A63F2300/8082: Virtual reality

Definitions

  • This application relates to the field of virtual reality technology, and in particular to the interactive technology of virtual reality.
  • VR (Virtual Reality) technology is a computer simulation system that can create, and let users experience, a virtual world. It uses a computer to generate a simulated environment and lets users immerse themselves in that environment. More specifically, VR technology takes data from real life, generates electronic signals through computer technology, and combines them with various output devices to transform them into phenomena that users can perceive.
  • VR tactile feedback is mainly realized through VR handheld controllers.
  • The more common approach is vibration feedback based on, for example, linear motors: a force feedback device is set at the trigger of the handle, which applies a preset reaction force when the user presses it.
  • A special touch-rendering device can also be installed on the grip to simulate touch based on vibrations with specified waveforms.
  • However, the existing tactile feedback modes are relatively simple, mainly focused on human-computer interaction scenarios, and lack a sense of interaction between users.
  • Embodiments of the present application provide a virtual reality interaction method, related devices, equipment, and storage media.
  • This application supports interactive operations between virtual objects in a virtual reality scene where at least two people interact, and can provide corresponding somatosensory feedback to real users through a virtual reality kit, thereby improving the sense of interaction between users and making the virtual reality interactive experience more immersive and realistic.
  • this application provides a virtual reality interaction method, which is executed by a terminal device, including:
  • At least two virtual objects are displayed in the virtual reality scene, wherein the at least two virtual objects include a first virtual object and a second virtual object.
  • The first virtual object is a virtual object controlled by the first object, and the second virtual object is a virtual object controlled by the second object.
  • In response to an interactive operation triggered by the first virtual object for the second virtual object, the virtual reality kit is controlled to trigger somatosensory feedback, wherein the virtual reality kit includes at least one virtual reality device worn by the second object.
  • this application provides a virtual reality interactive device, including:
  • a display module configured to display at least two virtual objects in a virtual reality scene, where the at least two virtual objects include a first virtual object and a second virtual object, the first virtual object is a virtual object controlled by the first object, and the second virtual object is a virtual object controlled by the second object;
  • the control module is configured to control the virtual reality kit to trigger somatosensory feedback in response to the interactive operation triggered by the first virtual object for the second virtual object, wherein the virtual reality kit includes at least one virtual reality device worn by the second object.
  • Another aspect of the present application provides a computer device, including a memory and a processor.
  • The memory stores a computer program. When the processor executes the computer program, the methods of the above aspects are implemented.
  • Another aspect of the present application provides a computer-readable storage medium on which a computer program is stored.
  • When the computer program is executed by a processor, the methods of the above aspects are implemented.
  • Another aspect of the present application provides a computer program product, including a computer program, which implements the methods of the above aspects when executed by a processor.
  • Embodiments of the present application provide a virtual reality interaction method, that is, displaying at least two virtual objects in a virtual reality scene.
  • the at least two virtual objects include a first virtual object and a second virtual object, wherein the first virtual object is a virtual object controlled by the first object, and the second virtual object is a virtual object controlled by the second object.
  • In response to an interactive operation triggered by the first virtual object for the second virtual object, the virtual reality kit is controlled to trigger somatosensory feedback, and the virtual reality kit includes at least one virtual reality device worn by the second object.
  • The virtual reality kit provides corresponding somatosensory feedback to real users, thereby enhancing the sense of interaction between users and making the virtual reality interactive experience more immersive and realistic. Here, the sense of interaction between users means that, in a virtual reality scene in which at least two people participate, interactive operations between the virtual objects controlled by each user cause corresponding somatosensory feedback to be delivered to the real users, so that each real user feels the corresponding interaction.
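As a rough illustrative sketch only (no code appears in the application, and all class and function names below are hypothetical), the claimed flow of displaying two controlled virtual objects and routing an interactive operation into somatosensory feedback on the target user's kit could be modeled as follows:

```python
from dataclasses import dataclass, field

@dataclass
class VRDevice:
    """One wearable device in a user's virtual reality kit."""
    kind: str                                  # e.g. "headset", "ring"
    feedback_log: list = field(default_factory=list)

    def trigger_feedback(self, pattern: str) -> None:
        # A real device would drive a motor or ultrasonic module here;
        # this sketch merely records what was requested.
        self.feedback_log.append(pattern)

@dataclass
class VirtualObject:
    """A virtual object controlled by a real user wearing a VR kit."""
    name: str
    kit: list                                  # the controlling user's devices

def handle_interaction(source: VirtualObject, target: VirtualObject,
                       pattern: str = "vibrate") -> None:
    """Respond to an interactive operation from `source` toward `target`
    by triggering somatosensory feedback on the target user's kit."""
    for device in target.kit:
        device.trigger_feedback(pattern)

# First and second virtual objects displayed in the same scene:
first = VirtualObject("A", kit=[VRDevice("headset")])
second = VirtualObject("B", kit=[VRDevice("headset"), VRDevice("ring")])

handle_interaction(first, second)  # A's object interacts with B's object
```

Note that only the target user's kit receives feedback here; which devices in the kit respond, and with what pattern, is a design choice the application leaves to later embodiments.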
  • Figure 1 is a schematic diagram of the physical architecture of the virtual reality interactive system in the embodiment of the present application.
  • Figure 2 is a schematic diagram of the logical architecture of the virtual reality interactive system in the embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of an ultrasonic sensing module in an embodiment of the present application.
  • Figure 4 is a schematic flow chart of a virtual reality interaction method in an embodiment of the present application.
  • Figure 5 is a schematic diagram of responding to session messages based on a virtual reality scene in an embodiment of the present application
  • Figure 6 is a schematic diagram of responding to a contactless interactive message based on a virtual reality scene in an embodiment of the present application
  • Figure 7 is a schematic diagram of receiving emails based on a virtual reality scene in an embodiment of the present application.
  • Figure 8 is a schematic diagram of responding to file transmission based on a virtual reality scene in an embodiment of the present application.
  • Figure 9 is a schematic diagram of responding to a team invitation message based on a virtual reality scene in an embodiment of the present application.
  • Figure 10 is a schematic diagram of responding to a roll call prompt message based on a virtual reality scene in an embodiment of the present application
  • Figure 11 is a schematic diagram of the upper body limbs of the virtual object in the embodiment of the present application.
  • Figure 12 is a schematic diagram of responding to a touch operation on the shoulder based on a virtual reality scene in an embodiment of the present application
  • Figure 13 is a schematic diagram of responding to an arm touch operation based on a virtual reality scene in an embodiment of the present application
  • Figure 14 is a schematic diagram of responding to a back touch operation based on a virtual reality scene in an embodiment of the present application
  • Figure 15 is a schematic diagram of responding to a waist touch operation based on a virtual reality scene in an embodiment of the present application
  • Figure 16 is a schematic diagram of the head of the virtual object in the embodiment of the present application.
  • Figure 17 is a schematic diagram of responding to a head accessory wearing operation based on a virtual reality scene in an embodiment of the present application
  • Figure 18 is a schematic diagram of responding to a head accessory removal operation based on a virtual reality scene in an embodiment of the present application
  • Figure 19 is a schematic diagram of responding to a facial touch operation based on a virtual reality scene in an embodiment of the present application
  • Figure 20 is a schematic diagram of responding to a neck touch operation based on a virtual reality scene in an embodiment of the present application
  • Figure 21 is a schematic diagram of the virtual object's hand in the embodiment of the present application.
  • Figure 22 is a schematic diagram of responding to a handshake operation based on a virtual reality scene in an embodiment of the present application
  • Figure 23 is a schematic diagram of responding to a high-five operation based on a virtual reality scene in an embodiment of the present application.
  • Figure 24 is a schematic diagram of responding to a hand touch operation based on a virtual reality scene in an embodiment of the present application
  • Figure 25 is a schematic diagram of responding to a first object transfer operation based on a virtual reality scene in an embodiment of the present application
  • Figure 26 is a schematic diagram of responding to a second object transfer operation based on a virtual reality scene in an embodiment of the present application
  • Figure 27 is a schematic diagram of responding to a whisper operation based on a virtual reality scene in an embodiment of the present application.
  • Figure 28 is a schematic diagram of responding to a hug operation based on a virtual reality scene in an embodiment of the present application
  • Figure 29 is a schematic diagram of responding to a passing operation based on a virtual reality scene in an embodiment of the present application.
  • Figure 30 is a schematic diagram of a virtual reality interaction device in an embodiment of the present application.
  • Figure 31 is a schematic structural diagram of a virtual reality device in an embodiment of the present application.
  • Virtual reality technology has been recognized by more and more people. Users can experience highly realistic sensations in the virtual reality world: the fidelity of the simulated environment makes it difficult to distinguish from the real world, giving users a strong sense of being there. At the same time, virtual reality provides the perceptual functions that humans have, such as hearing, vision, touch, taste and smell. Finally, its powerful simulation system realizes true human-computer interaction, allowing people to operate at will and receive realistic feedback from the environment as they do so. It is these characteristics of virtual reality technology, namely immersion, interactivity, multi-perception, imagination and autonomy, that make it loved by many people. These characteristics are introduced below.
  • Immersion is the most important feature of virtual reality technology; it allows users to become, and feel that they are, part of the environment created by the computer system.
  • The immersion of virtual reality technology depends on the user's perceptual system. When the user is stimulated by the virtual world (for example, through touch, taste, smell, or motion perception), a resonance of thinking occurs, causing psychological immersion and the feeling of entering the real world.
  • Interactivity refers to the degree to which the user can operate objects in the simulated environment, and how natural the feedback from the environment is.
  • The corresponding technology allows the user to interact with the environment, and the surrounding environment reacts in some way. If the user touches an object in the virtual space, the user should be able to feel it in his hands; if the user acts on the object, the position and state of the object should also change.
  • Multi-perception means that most virtual reality technologies have perceptual functions including vision, hearing, touch, motion, etc.
  • Imagination is also called conceptuality. Users can interact with surrounding objects in the virtual space, broaden their cognitive scope, and create scenes that do not exist in the objective world or environments that could not occur.
  • Autonomy refers to the degree to which objects in a virtual environment act according to the laws of physics. For example, when pushed by a force, an object will move in the direction of the force, tip over, or fall from the table to the ground.
  • the virtual reality interactive system includes a server and at least two sets of virtual reality suites.
  • the virtual reality kit 120 is a virtual reality kit worn by a real user
  • the virtual reality kit 130 is a virtual reality kit worn by another real user.
  • the virtual reality kit includes at least one virtual reality device, and a client is installed on the virtual reality device.
  • The client can run on the virtual reality device through a browser, or as a standalone application (APP) installed on the virtual reality device.
  • The server involved in this application can be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, Content Delivery Network (CDN), and big data and artificial intelligence platforms.
  • the virtual reality device can be a head-mounted display device, a ring device, a glove device, a belt device, a shoe device, a handle device, clothing, a wearable device, an exoskeleton, etc., but is not limited thereto.
  • the virtual reality device and the server can be connected directly or indirectly through wired or wireless communication methods, which is not limited in this application. There is no limit on the number of servers and virtual reality devices.
  • virtual object A controlled by user A and virtual object B controlled by user B are in the same virtual reality scene.
  • user A wears a virtual reality kit 120 and user B wears a virtual reality kit 130.
  • the virtual reality kit 120 and the virtual reality kit 130 establish a communication connection through the server 110.
  • When user A controls virtual object A to trigger an action on virtual object B, user B receives corresponding somatosensory feedback through the virtual reality kit 130 he wears.
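The relay described above (kit 120 and kit 130 communicating through server 110) can be sketched as a minimal message router. The `Server` class and its callbacks below are illustrative assumptions, not the application's actual architecture:

```python
class Server:
    """Minimal interaction relay between VR kits (illustrative only;
    the application does not specify the server's internals)."""

    def __init__(self):
        self.kits = {}  # user id -> feedback callback of that user's kit

    def connect(self, user_id, on_interaction):
        """Register a user's kit so it can receive relayed interactions."""
        self.kits[user_id] = on_interaction

    def send_interaction(self, from_user, to_user, action):
        """Relay an interactive operation to the target user's kit."""
        if to_user in self.kits:
            self.kits[to_user](from_user, action)

received = []
server = Server()
server.connect("A", lambda sender, action: received.append(("A", sender, action)))
server.connect("B", lambda sender, action: received.append(("B", sender, action)))

# User A controls virtual object A to trigger an action on virtual object B;
# only user B's kit receives the event and can render somatosensory feedback.
server.send_interaction("A", "B", "tap-shoulder")
```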
  • Figure 2 is a schematic diagram of the logical architecture of the virtual reality interaction system in the embodiment of the present application.
  • When the virtual object controlled by the user receives a prompt or triggers an interaction in the virtual reality scene, the background matches the corresponding tactile feedback mechanism and provides the user with tactile feedback through the ultrasonic sensing module built into the virtual reality device (for example, a head-mounted display device or a ring device). Specifically:
  • (1) Identify the user: identification is based on the APP identity document (ID) used when the user logs in to the virtual reality application.
  • (2) Determine the interaction category: new message reminders are mainly classified as prompt interactions, while interactive actions triggered in specific scenarios are interactive interactions.
  • (3) Identify the hand position: the position of the hand is mainly identified through a virtual reality device (for example, a ring device or a glove device).
  • (4) Match the feedback mechanism: for example, a head interaction can feed back one vibration through the head-mounted display device worn by the user, and a hand interaction can feed back one vibration through the ring device worn by the user. That is, the user is provided with corresponding tactile vibration feedback through the built-in ultrasonic sensor of the virtual reality device worn by the user.
  • the virtual reality device in this application can also provide other types of feedback.
  • Feedback types include but are not limited to gas feedback, liquid feedback, pressure-sensitive feedback, etc., and are not limited here.
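The matching step above amounts to a lookup from interaction category and body part to a feedback device and feedback type. A minimal sketch, assuming a hypothetical pairing table (the text only gives the head and hand examples; everything else here is an assumption):

```python
# Hypothetical pairing of (interaction category, body part) with the
# device that renders the feedback and the feedback type.
FEEDBACK_TABLE = {
    ("prompt", "head"): ("head-mounted display", "vibration"),
    ("interactive", "head"): ("head-mounted display", "vibration"),
    ("interactive", "hand"): ("ring device", "vibration"),
}

def match_feedback(category: str, body_part: str) -> tuple:
    """Pick a feedback mechanism for a classified interaction, falling
    back to a headset vibration when no specific rule applies."""
    return FEEDBACK_TABLE.get((category, body_part),
                              ("head-mounted display", "vibration"))
```

In a real system the table would presumably also encode feedback intensity and the other feedback types mentioned above (gas, liquid, pressure-sensitive).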
  • FIG. 3 is a schematic structural diagram of the ultrasonic sensing module in an embodiment of the present application.
  • The ultrasonic sensing module usually includes an ultrasonic sensor array, a control circuit and a drive circuit.
  • the ultrasonic sensor array is mainly divided into a sending part and a receiving part.
  • the transmitting part is mainly composed of a transmitter and a transducer.
  • the transducer can convert the energy generated when the piezoelectric chip is vibrated by voltage excitation into ultrasonic waves.
  • the transmitter emits the generated ultrasonic waves.
  • the receiving part is mainly composed of a transducer and an amplification circuit.
  • The transducer receives the reflected ultrasonic wave. Since mechanical vibration is generated when receiving the ultrasonic wave, the transducer can convert the mechanical energy into an electrical signal, which is then amplified by the amplification circuit.
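Ultrasonic haptic arrays of this kind are commonly driven with per-element firing delays so that the emitted wavefronts converge at a chosen focal point on the skin. The application gives no formulas, so the following is a generic illustration of that delay calculation, not the patented method:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def focus_delays(emitters, focal_point):
    """Firing delay (seconds) per emitter so that all wavefronts arrive
    at `focal_point` at the same time: the farthest emitter fires first
    (delay 0) and nearer emitters wait correspondingly longer."""
    dists = [math.dist(e, focal_point) for e in emitters]
    t_max = max(dists) / SPEED_OF_SOUND
    return [t_max - d / SPEED_OF_SOUND for d in dists]

# A three-element line array focusing 10 cm above its centre emitter:
emitters = [(-0.01, 0.0, 0.0), (0.0, 0.0, 0.0), (0.01, 0.0, 0.0)]
delays = focus_delays(emitters, (0.0, 0.0, 0.10))
```

The control circuit described above would translate such delays into drive signals for the transmitting transducers.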
  • the virtual reality interaction method in this application can be executed by a terminal device.
  • the terminal device can be a virtual reality device.
  • the virtual reality interaction method provided by this application includes:
  • the at least two virtual objects include a first virtual object and a second virtual object.
  • the first virtual object is a virtual object controlled by the first object
  • the second virtual object is a virtual object controlled by the second object;
  • In a virtual reality scene, at least two virtual objects are displayed.
  • This application takes at least two virtual objects, including a first virtual object and a second virtual object, as an example for illustration, but this should not be understood as limiting this application.
  • the virtual object can be a character image, a cartoon image, etc., which is not limited here.
  • the first object wears the virtual reality kit and enters the virtual reality scene as the first virtual object.
  • the second object also wears the virtual reality kit and enters the virtual reality scene as the second virtual object.
  • the first object can control the first virtual object to interact with the second virtual object through the virtual reality suite.
  • the second object can also control the second virtual object to interact with the first virtual object through the virtual reality suite.
  • Virtual reality scenes involved in this application include but are not limited to game scenes, industrial manufacturing scenes, medical scenes, educational scenes, shopping scenes, office meeting scenes, training scenes, safety drill scenes, live broadcast scenes, home decoration and architectural design scenes, etc.
  • the virtual reality kit involved in this application includes one or more virtual reality devices.
  • In response to the interactive operation triggered by the first virtual object for the second virtual object, the virtual reality kit is controlled to trigger somatosensory feedback, wherein the virtual reality kit includes at least one virtual reality device worn by the second object.
  • In a virtual reality scene, interactive operations between different virtual objects are supported, and based on the type and intensity of the interactive operation, the virtual reality kit can be controlled to trigger somatosensory feedback.
  • For example, when the first object (i.e., user A) controls the first virtual object to trigger an interactive operation on the second virtual object, the virtual reality kit worn by the second object (i.e., user B) responds to the interactive operation and triggers corresponding somatosensory feedback. In some embodiments, the virtual reality kit worn by the first object (i.e., user A) can also trigger corresponding somatosensory feedback.
  • The embodiment of the present application provides a virtual reality interaction method.
  • Through the above method, interactive operations between virtual objects are supported in a virtual reality scene where at least two people interact.
  • The virtual reality kit provides corresponding somatosensory feedback to real users, thereby enhancing the sense of interaction between users and making the virtual reality interactive experience more immersive and realistic.
  • In some embodiments, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include:
  • in response to a session message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
  • a feedback method for conversation messages in a social virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to perform activities in the virtual reality scene.
  • the following will introduce the social virtual reality scenario as an example.
  • Figure 5 is a schematic diagram of responding to a session message based on a virtual reality scene in an embodiment of the present application.
  • the first object controls the first virtual object.
  • the second object controls the second virtual object.
  • the first object controls the first virtual object to send a conversation message to the second virtual object.
  • the conversation message is "Hello, I am Jerry, nice to meet you.”
  • the second virtual object receives the conversation message, the head-mounted display device worn by the second object will trigger vibration feedback.
  • the second object can control the second virtual object to reply to the conversation message sent by the first object through the first virtual object.
  • the second object controls the second virtual object to send a conversation message to the first virtual object.
  • the conversation message is "Hi, Jerry, nice to meet you, my name is Mary.” Based on this, when the first virtual object receives the conversation message, the head-mounted display device worn by the first object will trigger vibration feedback.
  • the vibration feedback provided by the head-mounted display device can be one vibration or several consecutive vibrations.
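The "one vibration or several consecutive vibrations" behaviour can be represented as a simple pulse train. The durations below are invented for illustration; the application does not specify timings:

```python
def vibration_pattern(kind: str) -> list:
    """Return a pulse train as (on_ms, pause_ms) pairs. The specific
    timings are assumptions for illustration only."""
    if kind == "single":
        return [(80, 0)]                       # one short vibration
    if kind == "consecutive":
        return [(80, 40), (80, 40), (80, 0)]   # several consecutive vibrations
    raise ValueError(f"unknown pattern kind: {kind!r}")

# e.g. a session message arriving at the head-mounted display:
pattern = vibration_pattern("consecutive")
```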
  • embodiments of the present application provide a feedback method for conversation messages in a social virtual reality scene.
  • In the virtual reality scene, real users can perceive the conversation messages sent by other users through the virtual reality kit.
  • On the one hand, this enriches the diversity of virtual reality tactile feedback and the functionality of the interaction carrier; on the other hand, combined with social virtual reality scenes, it provides more diverse interaction methods, allowing users to feel a more realistic sense of social interaction, which makes the interactive experience more interesting.
  • In some embodiments, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include:
  • in response to a contactless interaction message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
  • a feedback method for non-contact interactive messages in a social virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to perform activities in the virtual reality scene.
  • the following will introduce the social virtual reality scenario as an example.
  • Figure 6 is a schematic diagram of responding to a non-contact interactive message based on a virtual reality scene in an embodiment of the present application.
  • The first object controls the first virtual object, and the second object controls the second virtual object.
  • The first object controls the first virtual object to send a non-contact interactive message to the second virtual object.
  • the non-contact interactive message is a "show of love" message, which can be expressed by sending love or blowing kisses. Based on this, when the second virtual object receives the "show of love" message, the head-mounted display device worn by the second object will trigger vibration feedback.
  • the second object can control the second virtual object to reply to the "love" message sent by the first object through the first virtual object.
  • the second object may control the second virtual object to send a non-contact interaction message to the first virtual object.
  • the non-contact interaction message is a "wink” message. Based on this, when the first virtual object receives the "wink” message, the head-mounted display device worn by the first object will trigger vibration feedback.
  • the vibration feedback provided by the head-mounted display device can be one vibration or several consecutive vibrations.
  • Non-contact interactive messages include but are not limited to "showing love”, “holding fists”, “likes”, “winking”, etc.
  • embodiments of this application provide a feedback method for non-contact interactive messages in a social virtual reality scene.
  • Through the above method, in the virtual reality scene, real users can perceive the non-contact interactive messages sent by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality tactile feedback and the functionality of the interaction carrier; on the other hand, combined with social virtual reality scenes, it provides more diverse interaction methods, allowing users to feel a more realistic sense of social interaction, which improves the fun of the interactive experience.
  • In some embodiments, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include:
  • in response to an email sent by the first virtual object whose recipients include the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
  • a feedback method for emails in an office virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to perform activities in the virtual reality scene.
  • the following will take the office virtual reality scene as an example to introduce.
  • Figure 7 is a schematic diagram of receiving emails based on a virtual reality scene in an embodiment of the present application.
  • the first object controls the first virtual object
  • the second object controls the second virtual object.
  • the first object controls the first virtual object to send an email
  • the recipient of the email includes the second virtual object, that is, the email address of the recipient of the email includes the email address of the second virtual object (for example, mary@qq.com).
  • the second virtual object receives the email, the head-mounted display device worn by the second object will trigger vibration feedback.
  • the vibration feedback provided by the head-mounted display device can be one vibration or several consecutive vibrations.
  • embodiments of this application provide a feedback method for emails in an office virtual reality scene.
  • the virtual reality scene real users can perceive emails sent by other users through the virtual reality suite.
  • it enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier.
  • it can be combined with the office Virtual reality scenes provide more diverse interaction methods, making users immersed in the scene, which is conducive to making the interactive experience more interesting.
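As an illustrative sketch only, the message-triggered feedback described in these embodiments can be summarized as a small dispatch table from message type to vibration command. Every identifier below (`VibrationCommand`, `dispatch_feedback`, the message-type strings) is hypothetical and not part of the application's disclosure; the sketch simply assumes, as the embodiments state, that each non-contact message vibrates the head-mounted display once or several consecutive times.

```python
from dataclasses import dataclass


@dataclass
class VibrationCommand:
    device: str   # which device in the VR kit should vibrate
    pulses: int   # one vibration or several consecutive vibrations


# Each non-contact interactive message type named in the embodiments
# (email, electronic file, team invitation, roll call prompt) vibrates
# the head-mounted display device.
MESSAGE_FEEDBACK = {
    "email": "head_mounted_display",
    "electronic_file": "head_mounted_display",
    "team_invitation": "head_mounted_display",
    "roll_call": "head_mounted_display",
}


def dispatch_feedback(message_type: str, pulses: int = 1) -> VibrationCommand:
    """Return the vibration command for a received interactive message."""
    device = MESSAGE_FEEDBACK.get(message_type)
    if device is None:
        raise ValueError(f"unsupported message type: {message_type}")
    return VibrationCommand(device=device, pulses=pulses)
```

The same dispatch shape covers the electronic file, team invitation, and roll call embodiments that follow, differing only in the message type received.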
Optionally, on the basis of the foregoing embodiments, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include: in response to an electronic file transmitted by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In this embodiment, a feedback method for electronic files in an office virtual reality scene is introduced. The virtual reality scene supports real users in creating their own virtual objects and controlling those virtual objects to perform activities in the virtual reality scene. The following takes the office virtual reality scene as an example.

Figure 8 is a schematic diagram of responding to file transmission based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to send an electronic file to the second virtual object. When the second virtual object receives the electronic file, the head-mounted display device worn by the second object triggers vibration feedback. The vibration feedback provided by the head-mounted display device can be one vibration or several consecutive vibrations. Electronic files include but are not limited to text files, image files, graphics files, video files, sound files, hypermedia link files, program files, data files, and the like.

The embodiments of the present application thus provide a feedback method for electronic files in an office virtual reality scene. Through the above method, in the virtual reality scene, real users can perceive the electronic files transmitted by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier; on the other hand, combined with office virtual reality scenes, it provides more diverse interaction methods, immersing users in the scene, which is conducive to making the interactive experience more interesting.
Optionally, on the basis of the foregoing embodiments, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include: in response to a team invitation message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In this embodiment, a feedback method for team invitation messages in a game virtual reality scene is introduced. The virtual reality scene supports real users in creating their own virtual objects and controlling those virtual objects to perform activities in the virtual reality scene. The following takes the game virtual reality scene as an example.

Figure 9 is a schematic diagram of responding to a team invitation message based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to create "Team A" in the game, and then controls the first virtual object to send a team invitation message for "Team A" to the second virtual object, that is, to invite the second virtual object to join "Team A" to play the game. When the second virtual object receives the team invitation message, the head-mounted display device worn by the second object triggers vibration feedback. The vibration feedback provided by the head-mounted display device can be one vibration or several consecutive vibrations. Game types include but are not limited to multiplayer online battle arena (MOBA) games, real-time strategy (RTS) games, role-playing games (RPG), first-person shooter (FPS) games, and the like.

The embodiments of the present application thus provide a feedback method for team invitation messages in a game virtual reality scene. Through the above method, in the virtual reality scene, real users can perceive team invitation messages initiated by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier; on the other hand, combined with game virtual reality scenes, it provides more diverse interaction methods, immersing users in the scene, which is conducive to improving the fun of the interactive experience.
Optionally, on the basis of the foregoing embodiments, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include: in response to a roll call prompt message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In this embodiment, a feedback method for roll call prompt messages in a distance education virtual reality scenario is introduced. The virtual reality scene supports real users in creating their own virtual objects and controlling those virtual objects to perform activities in the virtual reality scene. The following takes the distance education virtual reality scenario as an example.

Figure 10 is a schematic diagram of responding to a roll call prompt message based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to teach, and during the teaching process, the first object initiates a roll call for the second virtual object, that is, sends a roll call prompt message to the second virtual object. For example, if the name of the second virtual object is "Mary", the first virtual object calling out "Mary" means that a roll call prompt message for the second virtual object is sent. When the second virtual object receives the roll call prompt message, the head-mounted display device worn by the second object triggers vibration feedback. The vibration feedback provided by the head-mounted display device can be one vibration or several consecutive vibrations.

The embodiments of the present application thus provide a feedback method for roll call prompt messages in distance education virtual reality scenarios. Through the above method, in the virtual reality scene, real users can perceive the roll call prompt messages initiated by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier; on the other hand, combined with distance education virtual reality scenes, it provides more diverse interaction methods, immersing users in the scene, which is conducive to improving the teaching effect.
Optionally, on the basis of the foregoing embodiments, the virtual reality kit may include a head-mounted display device and a ring device. Accordingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include: in response to an upper body limb contact triggered by the first virtual object for the second virtual object, controlling the head-mounted display device and the ring device to trigger vibration feedback.
In this embodiment, a way to trigger head and hand feedback based on upper body limb contact is introduced. In the virtual reality scene, triggering interactive actions between virtual objects is also supported. For example, the first virtual object can touch the upper body limbs of the second virtual object, that is, trigger an upper body limb contact for the second virtual object. Based on this, the head-mounted display device and the ring device worn by the second object trigger vibration feedback at the same time.

Figure 11 is a schematic diagram of the upper body limbs of a virtual object in an embodiment of the present application. A1 is used to indicate the upper body limbs of the virtual object, where the upper body limbs can be understood as the upper body excluding the head and hands. Taking the case where the first virtual object and the second virtual object are both characters as an example, if the first virtual object touches the upper body limbs of the second virtual object, this can be determined to be an upper body interaction, that is, an upper body limb contact.

The embodiments of the present application thus provide a way to trigger head and hand feedback based on upper body limb contact. Through the above method, in the virtual reality scene, upper body physical interaction between virtual objects is supported, thereby enriching the diversity of virtual reality tactile feedback and the functionality of the interactive carrier.
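The routing described across these embodiments, from the body region that was contacted to the set of devices that vibrate, can be sketched as a simple lookup. This is an illustrative, non-limiting sketch; the region names and the function `devices_for_contact` are hypothetical, while the region-to-device pairings themselves follow the embodiments (upper body limb contact vibrates the headset and ring together, head contact the headset only, hand contact the ring only).

```python
# Contacted body region -> devices in the VR kit that should vibrate,
# as described in the embodiments of this application.
CONTACT_FEEDBACK_DEVICES = {
    "upper_body_limb": {"head_mounted_display", "ring"},
    "head": {"head_mounted_display"},
    "hand": {"ring"},
}


def devices_for_contact(region: str) -> set:
    """Return the set of devices to vibrate for a given contact region."""
    return CONTACT_FEEDBACK_DEVICES.get(region, set())
```

A region with no entry simply produces no haptic feedback, which keeps the table easy to extend to further wearable devices.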
In this embodiment, a way of responding to upper body limb contact in a virtual reality scene is introduced. The virtual reality scene supports real users in creating their own virtual objects and controlling those virtual objects to interact in the virtual reality scene. The following introduction is based on different interactive scenarios.

Figure 12 is a schematic diagram of responding to a shoulder touch operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to move toward the second virtual object and then pat the shoulder of the second virtual object, that is, the first virtual object triggers a touch operation on the shoulder of the second virtual object. Based on this, the head-mounted display device and the ring device worn by the second object trigger vibration feedback at the same time.

Figure 13 is a schematic diagram of responding to an arm touch operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to move toward the second virtual object and then pat the arm of the second virtual object, that is, the first virtual object triggers a touch operation on the arm of the second virtual object. Based on this, the head-mounted display device and the ring device worn by the second object trigger vibration feedback at the same time.

Figure 14 is a schematic diagram of responding to a back touch operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to move toward the second virtual object and then pat the back of the second virtual object, that is, the first virtual object triggers a touch operation on the back of the second virtual object. Based on this, the head-mounted display device and the ring device worn by the second object trigger vibration feedback at the same time.

Figure 15 is a schematic diagram of responding to a waist touch operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to move toward the second virtual object and then pat the waist of the second virtual object, that is, the first virtual object triggers a touch operation on the waist of the second virtual object. Based on this, the head-mounted display device and the ring device worn by the second object trigger vibration feedback at the same time.

The vibration feedback provided by the head-mounted display device and the ring device can be one vibration or several consecutive vibrations; the number of vibrations of the head-mounted display device and the ring device can be the same or different.

The embodiments of the present application thus provide a way to respond to upper body limb contact in a virtual reality scene. Through the above method, in the virtual reality scene, real users can perceive the upper body limb contact operations triggered by other users through the virtual reality kit, which enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, enhances the realism of the interaction process, and immerses users in the scene.
Optionally, on the basis of the foregoing embodiments, controlling the head-mounted display device and the ring device to trigger vibration feedback may specifically include: determining vibration intensity information based on the movement amplitude information of the upper body limb contact, where the vibration intensity information is positively correlated with the movement amplitude information; and, based on the vibration intensity information, controlling the head-mounted display device and the ring device to trigger vibration feedback.
In this embodiment, a method of providing corresponding force feedback according to the movement amplitude is introduced. The virtual reality device can have a built-in inertial measurement unit (IMU), which includes a gyroscope and an accelerometer. The accelerometer is used to detect the acceleration signals of the object on three independent axes of the carrier coordinate system, and the velocity in a direction can be obtained by integrating the acceleration in that direction.

Take the first virtual object triggering an upper body limb contact with the second virtual object as an example, in which the first object (i.e., a real user) wears a ring device (i.e., a virtual reality ring device) and controls, through the ring device, the first virtual object to pat the upper body of the second virtual object. The hand movement speed of the first object can therefore be measured through the IMU built into the ring device, and different movement speeds correspond to different movement amplitude information. For ease of understanding, please refer to Table 1, which illustrates the correspondence between movement speed, movement amplitude information and vibration intensity information. In Table 1, v represents the hand movement speed of the action-triggering party. It can be seen that the greater the hand movement speed of the first object, the greater the corresponding movement amplitude, and accordingly, the greater the vibration intensity that the action receiver (i.e., the second object) can perceive. For example, if the first object controls the first virtual object to touch the shoulder of the second virtual object at a speed of 3 meters per second, the second object can feel a slight vibration through the head-mounted display device and the ring device.
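Since Table 1 itself is not reproduced in this text, the mapping it describes can only be sketched with hypothetical thresholds. The values below are assumptions chosen to be consistent with the worked examples in this description (3 m/s feels slight, 6 to 7 m/s feels moderate); only the positive correlation between movement amplitude and vibration intensity is the disclosed idea.

```python
def vibration_intensity(hand_speed_mps: float) -> str:
    """Map the trigger's hand speed (m/s) to a vibration intensity tier.

    The tier boundaries (5 and 10 m/s) are hypothetical placeholders for
    the ranges in Table 1, which is not reproduced here.
    """
    if hand_speed_mps < 0:
        raise ValueError("speed must be non-negative")
    if hand_speed_mps < 5:
        return "slight"
    if hand_speed_mps < 10:
        return "moderate"
    return "strong"
```

A larger hand speed maps monotonically to a stronger vibration, matching the stated positive correlation.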
It should be noted that the movement speed in Table 1 can also be the movement speed of other parts, such as the elbow; the above examples take the hand movement speed as an example, but this should not be understood as a limitation of the present application. In addition, distance information in the picture can also be sensed through a camera (for example, a depth camera or a binocular camera), and a corresponding algorithm can then be used to recognize the movement amplitude of the real user.

The embodiments of the present application thus provide a method of providing corresponding force feedback according to the movement amplitude. Through the above method, users can control virtual objects to perform corresponding actions in the virtual scene through their own actions. Based on this, the movement amplitude information of the upper body limb contact can be determined by detecting the movement speed of the real user, and the virtual reality kit is then controlled to trigger vibration feedback based on the corresponding vibration intensity information. This better simulates the real state between users, making the virtual reality interactive experience more immersive and realistic.
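The velocity estimate mentioned above, integrating the accelerometer signal along one axis to obtain the speed along that direction, can be sketched as a plain cumulative sum over fixed-interval samples. This is a minimal illustration only; a real IMU pipeline would additionally correct for gravity, sensor bias and drift.

```python
def integrate_acceleration(samples, dt):
    """Integrate acceleration readings (m/s^2) taken at fixed interval dt (s)
    to obtain the velocity change along that axis (m/s)."""
    v = 0.0
    for a in samples:
        v += a * dt  # simple rectangle-rule integration
    return v
```

For example, a constant 2 m/s^2 acceleration sampled every 0.1 s for one second yields a velocity of about 2 m/s.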
Optionally, on the basis of the foregoing embodiments, the virtual reality kit includes a head-mounted display device. Accordingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include: in response to a head contact triggered by the first virtual object for the second virtual object, controlling the head-mounted display device to trigger vibration feedback.
In this embodiment, a way to trigger head feedback based on head contact is introduced. In the virtual reality scene, triggering interactive actions between virtual objects is also supported. For example, the first virtual object may touch the head of the second virtual object, that is, trigger a head contact for the second virtual object. Based on this, the head-mounted display device worn by the second object triggers vibration feedback.

Figure 16 is a schematic diagram of the head of a virtual object in an embodiment of the present application. B1 is used to indicate the head of the virtual object, where the head can be understood as the part above the neck. Taking the case where the first virtual object and the second virtual object are both characters as an example, if the first virtual object touches the head of the second virtual object, this can be determined to be a head interaction, that is, a head contact.

The embodiments of the present application thus provide a way to trigger head feedback based on head contact. Through the above method, in the virtual reality scene, head contact interaction between virtual objects is supported, thereby enriching the diversity of virtual reality tactile feedback and the functionality of the interactive carrier.
In this embodiment, a way of responding to head contact in a virtual reality scene is introduced. The virtual reality scene supports real users in creating their own virtual objects and controlling those virtual objects to interact in the virtual reality scene. The following introduction is based on different interactive scenarios.

Figure 17 is a schematic diagram of responding to a head accessory wearing operation based on a virtual reality scene in an embodiment of the present application. As shown in (A) of Figure 17, the first object controls the first virtual object, and the second object controls the second virtual object. The first virtual object holds earphones and walks toward the second virtual object, preparing to put the earphones on the second virtual object. Then, the first virtual object puts the earphones on the second virtual object and, in the process, touches the head of the second virtual object. Based on this, the head-mounted display device worn by the second object triggers vibration feedback.

Figure 18 is a schematic diagram of responding to a head accessory removal operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The second virtual object wears glasses, and the first virtual object prepares to take the glasses off the second virtual object. Then, the first virtual object removes the glasses for the second virtual object and, in the process, touches the head of the second virtual object. Based on this, the head-mounted display device worn by the second object triggers vibration feedback.

Figure 19 is a schematic diagram of responding to a facial touch operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to move toward the second virtual object and then touch the cheek of the second virtual object, that is, the first virtual object triggers a touch operation on the face of the second virtual object. Based on this, the head-mounted display device worn by the second object triggers vibration feedback.

Figure 20 is a schematic diagram of responding to a neck touch operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to move toward the second virtual object and then touch the neck of the second virtual object, that is, the first virtual object triggers a touch operation on the neck of the second virtual object. Based on this, the head-mounted display device worn by the second object triggers vibration feedback.

The vibration feedback provided by the head-mounted display device can be one vibration or several consecutive vibrations. Head accessories in this application include but are not limited to earphones, glasses, hats, hair accessories, and the like.

The embodiments of the present application thus provide a way to respond to head contact in a virtual reality scene. Through the above method, in the virtual reality scene, real users can perceive the head contact operations triggered by other users through the virtual reality kit, which enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, enhances the realism of the interaction process, and immerses users in the scene.
Optionally, on the basis of the foregoing embodiments, controlling the head-mounted display device to trigger vibration feedback may specifically include: determining vibration intensity information based on the movement amplitude information of the head contact, where the vibration intensity information is positively correlated with the movement amplitude information; and, based on the vibration intensity information, controlling the head-mounted display device to trigger vibration feedback.
In this embodiment, a method of providing corresponding force feedback according to the movement amplitude is introduced. As described in the foregoing embodiments, an IMU can be built into the virtual reality device, and the movement speed of the virtual reality device can be detected through the IMU. Alternatively, a camera can be used to sense distance information in the picture, and a corresponding algorithm can be used to recognize the movement amplitude of the real user. Of course, the movement amplitude of the real user can also be detected through other methods, which is not limited here.

Take the first virtual object triggering a head contact with the second virtual object as an example, in which the first object (i.e., the real user) wears the ring device (i.e., the virtual reality ring device) and controls, through the ring device, the first virtual object to pat the head of the second virtual object. The hand movement speed of the first object can therefore be measured through the IMU built into the ring device, and different movement speeds correspond to different movement amplitude information. For ease of understanding, please refer to Table 1 again. If the first object controls the first virtual object to touch the head of the second virtual object at a speed of 6 meters per second, the second object can feel a moderate vibration through the head-mounted display device.

The embodiments of the present application thus provide a method of providing corresponding force feedback according to the movement amplitude. Through the above method, the user can control the virtual object to perform corresponding actions in the virtual scene (for example, a fighting virtual reality game) through his or her own actions. Based on this, the movement speed of the real user can be detected to determine the movement amplitude information of the head contact, and the virtual reality kit is then controlled to trigger vibration feedback based on the corresponding vibration intensity information. This better simulates the real state between users, making the virtual reality interactive experience more immersive and realistic.
Optionally, on the basis of the foregoing embodiments, the virtual reality kit includes a ring device. Accordingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include: in response to a hand contact triggered by the first virtual object for the second virtual object, controlling the ring device to trigger vibration feedback.
In this embodiment, a way to trigger hand feedback based on hand contact is introduced. In the virtual reality scene, triggering interactive actions between virtual objects is also supported. For example, the first virtual object may touch the hand of the second virtual object, that is, trigger a hand contact for the second virtual object. Based on this, the ring device worn by the second object triggers vibration feedback.

Figure 21 is a schematic diagram of the hand of a virtual object in an embodiment of the present application. C1 is used to indicate the hand of the virtual object, where the hand can be understood as the fingers, the back of the hand and the palm. If the hand of the first virtual object touches the hand of the second virtual object, this can be determined to be a hand interaction, that is, a hand contact.

The embodiments of the present application thus provide a way to trigger hand feedback based on hand contact. Through the above method, in the virtual reality scene, physical interaction between virtual objects is supported, thereby enriching the diversity of virtual reality tactile feedback and the functionality of the interactive carrier.
Optionally, on the basis of the foregoing embodiments, responding to the hand contact triggered by the first virtual object for the second virtual object may specifically include: responding to a handshake operation, a high-five operation or a hand touch operation triggered by the first virtual object for the second virtual object.

In this embodiment, a way of responding to hand contact in a virtual reality scene is introduced. The virtual reality scene supports real users in creating their own virtual objects and controlling those virtual objects to interact in the virtual reality scene. The following introduction is based on different interactive scenarios.
Figure 22 is a schematic diagram of responding to a handshake operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to shake hands with the second virtual object, that is, the first virtual object triggers a handshake operation for the second virtual object. Based on this, the ring device worn by the second object triggers vibration feedback. Optionally, the ring device worn by the first object can also trigger vibration feedback.

Figure 23 is a schematic diagram of responding to a high-five operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the first virtual object to high-five the second virtual object, that is, the first virtual object triggers a high-five operation for the second virtual object. Based on this, the ring device worn by the second object triggers vibration feedback. Optionally, the ring device worn by the first object can also trigger vibration feedback.

Figure 24 is a schematic diagram of responding to a hand touch operation based on a virtual reality scene in an embodiment of the present application. The first object controls the first virtual object, and the second object controls the second virtual object. The first object controls the hand of the first virtual object to touch the second virtual object, that is, the first virtual object triggers a hand touch operation on the second virtual object. Based on this, the ring device worn by the second object triggers vibration feedback. Optionally, the ring device worn by the first object can also trigger vibration feedback.

The vibration feedback provided by the ring device can be one vibration or several consecutive vibrations. Hand touch in this application includes but is not limited to touching the palm, touching the back of the hand, touching the fingers, and the like.

The embodiments of the present application thus provide a way to respond to hand contact in a virtual reality scene. Through the above method, in the virtual reality scene, real users can perceive the hand contact operations triggered by other users through the virtual reality kit, which enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, enhances the realism of the interaction process, and immerses users in the scene.
Optionally, on the basis of the foregoing embodiments, controlling the ring device to trigger vibration feedback may specifically include: determining vibration intensity information based on the movement amplitude information of the hand contact, where the vibration intensity information is positively correlated with the movement amplitude information; and, based on the vibration intensity information, controlling the ring device to trigger vibration feedback.
In this embodiment, a method of providing corresponding force feedback according to the movement amplitude is introduced. As described in the foregoing embodiments, an IMU can be built into the virtual reality device, and the movement speed of the virtual reality device can be detected through the IMU. Alternatively, a camera can be used to sense distance information in the picture, and a corresponding algorithm can be used to recognize the movement amplitude of the real user. Of course, the movement amplitude of the real user can also be detected through other methods, which is not limited here.

Take the first virtual object triggering a hand contact with the second virtual object as an example, in which the first object (i.e., the real user) wears the ring device (i.e., the virtual reality ring device) and controls, through the ring device, the first virtual object to pat the back of the second virtual object's hand. The hand movement speed of the first object can therefore be measured through the IMU built into the ring device, and different movement speeds correspond to different movement amplitude information. For ease of understanding, please refer to Table 1 again. If the first object controls the first virtual object to touch the hand of the second virtual object at a speed of 7 meters per second, the second object can feel a moderate vibration through the ring device.

The embodiments of the present application thus provide a method of providing corresponding force feedback according to the movement amplitude. Through the above method, users can control virtual objects to perform corresponding actions in the virtual scene through their own actions. Based on this, the movement speed of the real user can be detected to determine the movement amplitude information of the hand contact, and the virtual reality kit is then controlled to trigger vibration feedback based on the corresponding vibration intensity information. This better simulates the real state between users, making the virtual reality interactive experience more immersive and realistic.
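The camera-based alternative mentioned in these embodiments senses distance information and recognizes the user's movement amplitude with "a corresponding algorithm" that is not specified. The sketch below is therefore only one plausible, hypothetical approach: approximating the tracked hand's speed from successive per-frame depth measurements, which could then be fed into the same speed-to-intensity mapping as the IMU path.

```python
def speed_from_depth(depths_m, frame_dt_s):
    """Approximate hand speed (m/s) from per-frame camera distances.

    depths_m:   depth readings of the tracked hand, one per frame (meters)
    frame_dt_s: time between consecutive frames (seconds)
    """
    if len(depths_m) < 2:
        return 0.0
    # Sum of frame-to-frame displacement, divided by the elapsed time.
    total = sum(abs(b - a) for a, b in zip(depths_m, depths_m[1:]))
    return total / (frame_dt_s * (len(depths_m) - 1))
```

With a depth or binocular camera running at a known frame rate, this gives a rough along-the-axis speed; a real recognizer would track the hand in 3D rather than depth alone.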
Optionally, on the basis of the foregoing embodiments, the virtual reality kit may include a glove device. Accordingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include: in response to a transfer operation, triggered by the first virtual object for the second virtual object, of the first item (a hot item), controlling the glove device to trigger heating feedback.
  • a feedback method for delivering hot objects in a virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to interact in the virtual reality scene.
  • the following is an example where the first object is a hot water cup.
  • Figure 25 shows a virtual-based A schematic diagram of a virtual reality scene responding to a first object transfer operation.
  • the first object controls the first virtual object
  • the second object controls the second virtual object.
  • the first virtual object holds a cup of hot water and walks towards the second virtual object, preparing to hand the cup of hot water to the second virtual object.
  • the first virtual object hands the hot water cup to the second virtual object.
  • the glove device worn by the second object will trigger heating feedback, for example, the glove device will rise by 5 degrees Celsius.
  • the glove device can not only provide thermal feedback, but also provide vibration feedback, etc.
  • the first object includes but is not limited to hot water, hot towels, hot soup, etc.
  • embodiments of the present application provide a feedback method for transferring hot objects in a virtual reality scene.
  • real users can perceive other users passing hot items through the virtual reality kit, thereby receiving heating feedback.
  • This enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, making the user immersed in the scene, which is conducive to improving the fun of the interactive experience.
  • the virtual reality kit may include a glove device; correspondingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include:
  • when the first virtual object transfers the second item to the second virtual object, the glove device is controlled to trigger cooling feedback in response to a contact operation on the second item.
  • a feedback method for transferring cold items in a virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to interact in the virtual reality scene.
  • the following introduction will take the case where the second item is an ice cream as an example.
  • Figure 26 is a schematic diagram of responding to a second object transfer operation based on a virtual reality scene in an embodiment of the present application.
  • the first object controls the first virtual object, and the second object controls the second virtual object.
  • the first virtual object holds an ice cream and walks towards the second virtual object, preparing to hand the ice cream to the second virtual object.
  • the first virtual object delivers ice cream to the second virtual object.
  • the glove device worn by the second object will trigger cooling feedback, for example, the glove device's temperature decreases by 5 degrees Celsius.
  • the glove device can not only provide cooling feedback, but also vibration feedback, etc.
  • the second item includes but is not limited to ice cream, ice cubes, etc.
  • embodiments of the present application provide a feedback method for transferring cold items in virtual reality scenes.
  • real users can perceive other users passing cold items through the virtual reality kit, thereby receiving cooling feedback.
  • This enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, making the user immersed in the scene, which is conducive to improving the fun of the interactive experience.
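The hot and cold item transfer feedback described above can be sketched as a small dispatch routine. This is a minimal illustration rather than the patent's implementation; the names `ItemThermalType`, `GloveDevice`, and `on_item_contact` are hypothetical, and only the ±5 °C step comes from the examples in the text.

```python
# Sketch of hot/cold item transfer feedback on the receiving user's glove.
# All class and function names are illustrative, not from the patent.
from dataclasses import dataclass, field
from enum import Enum

class ItemThermalType(Enum):
    HOT = "hot"    # e.g. a cup of hot water, a hot towel, hot soup
    COLD = "cold"  # e.g. an ice cream, ice cubes

@dataclass
class GloveDevice:
    """Stand-in for the glove device worn by the receiving user."""
    temperature_delta_log: list = field(default_factory=list)

    def trigger_thermal_feedback(self, delta_celsius: float) -> None:
        # A real device would adjust a heating/cooling element; here we log it.
        self.temperature_delta_log.append(delta_celsius)

def on_item_contact(glove: GloveDevice, item_type: ItemThermalType) -> None:
    """When the receiving virtual object touches the transferred item,
    heat or cool the glove; the 5 degree step mirrors the text's example."""
    if item_type is ItemThermalType.HOT:
        glove.trigger_thermal_feedback(+5.0)
    else:
        glove.trigger_thermal_feedback(-5.0)
```

Receiving a hot water cup and then an ice cream would log a +5 °C and then a −5 °C adjustment on the glove.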
  • the virtual reality kit may include a head-mounted display device; correspondingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include:
  • in response to a whisper operation triggered by the first virtual object for the second virtual object, the head-mounted display device is controlled to trigger airflow feedback.
  • a way to respond to a whisper in a virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to interact in the virtual reality scene. The following will be introduced based on interactive scenarios.
  • Figure 27 is a schematic diagram of responding to a whisper operation based on a virtual reality scene in an embodiment of the present application.
  • the first object controls the first virtual object
  • the second object controls the second virtual object.
  • the first object controls the first virtual object to whisper to the second virtual object, that is, the first virtual object triggers a whisper operation for the second virtual object.
  • the head-mounted display device worn by the second object will trigger airflow feedback to simulate the effect of someone speaking into the ear.
  • the airflow feedback provided by the head-mounted display device can be once or several times continuously.
  • embodiments of the present application provide a way to respond to whispers in a virtual reality scene.
  • real users can perceive the whisper operations triggered by other users through the virtual reality kit, which enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, improves the realism of the interaction process, and immerses users in the experience.
  • the virtual reality kit may include a belt device; correspondingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include:
  • in response to a hug operation triggered by the first virtual object for the second virtual object, the belt device is controlled to trigger contraction feedback.
  • a way of responding to a hug in a virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to interact in the virtual reality scene. The following will be introduced based on interactive scenarios.
  • Figure 28 is a schematic diagram of responding to a hug operation based on a virtual reality scene in an embodiment of the present application.
  • the first object controls the first virtual object
  • the second object controls the second virtual object.
  • the first object controls the first virtual object to hug the second virtual object, that is, the first virtual object triggers the hug operation for the second virtual object.
  • the belt device worn by the second object triggers inward contraction feedback.
  • the belt device worn by the first object also triggers inward contraction feedback.
  • the embodiment of the present application provides a way to respond to a hug in a virtual reality scene.
  • real users can perceive the hug operations triggered by other users through the virtual reality kit, which enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, improves the realism of the interaction process, and immerses users in the experience.
  • the virtual reality kit may include a shoe device; correspondingly, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may include:
  • in response to a passing operation triggered by the first virtual object for the second virtual object, when the target ball contacts the lower body limbs of the second virtual object, the shoe device is controlled to trigger vibration feedback.
  • a feedback method for simulating passing in a virtual reality scene is introduced.
  • the virtual reality scene supports real users to create their own virtual objects and control the virtual objects to perform activities in the virtual reality scene. The following will be introduced based on interactive scenarios.
  • Figure 29 is a schematic diagram of responding to a passing operation based on a virtual reality scene in an embodiment of the present application.
  • the first object controls the first virtual object
  • the second object controls the second virtual object.
  • the first virtual object is preparing to pass the football under its feet to the second virtual object.
  • the shoe device worn by the second object will trigger vibration feedback.
  • the vibration feedback provided by the shoe device can be one vibration or several consecutive vibrations.
  • embodiments of the present application provide a feedback method for simulating passing in a virtual reality scene.
  • real users can perceive the passing operations triggered by other users through the virtual reality kit.
  • this enriches the diversity of virtual reality tactile feedback and the functionality of the interactive carrier, and, combined with the virtual reality scene, provides more diverse interaction methods that immerse users in the scene.
  • the virtual reality interaction apparatus in this application is described in detail below; please refer to Figure 30, which is a schematic diagram of an embodiment of the virtual reality interaction apparatus in the embodiment of the present application.
  • the virtual reality interaction device 30 includes:
  • the display module 310 is used to display at least two virtual objects in a virtual reality scene, where the at least two virtual objects include a first virtual object and a second virtual object.
  • the first virtual object is a virtual object controlled by the first object.
  • the second virtual object is a virtual object controlled by the second object;
  • the control module 320 is configured to control the virtual reality kit to trigger somatosensory feedback in response to the interactive operation triggered by the first virtual object for the second virtual object, wherein the virtual reality kit includes at least one virtual reality device worn by the second object.
  • the virtual reality kit includes a head-mounted display device.
  • the control module 320 is specifically configured to control the virtual reality kit to trigger somatosensory feedback in at least one of the following ways:
  • in response to the session message sent by the first virtual object to the second virtual object, control the head-mounted display device to trigger vibration feedback;
  • in response to the contactless interaction message sent by the first virtual object to the second virtual object, control the head-mounted display device to trigger vibration feedback;
  • in response to an email sent by the first virtual object, control the head-mounted display device to trigger vibration feedback, wherein the recipients of the email include the second virtual object;
  • in response to the team invitation message sent by the first virtual object to the second virtual object, control the head-mounted display device to trigger vibration feedback;
  • in response to the roll call prompt message sent by the first virtual object to the second virtual object, control the head-mounted display device to trigger vibration feedback.
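The five message-driven vibration triggers listed above share one shape: a message event whose recipients include the second virtual object causes the recipient's head-mounted display device to vibrate. A minimal sketch of that dispatch follows; the event-name constants and function are illustrative assumptions, not names from the patent.

```python
# Sketch: decide whether a user's head-mounted display (HMD) should vibrate
# for an incoming message-type event. Event names are illustrative.
SESSION_MESSAGE = "session_message"
CONTACTLESS_INTERACTION = "contactless_interaction"
EMAIL = "email"
TEAM_INVITATION = "team_invitation"
ROLL_CALL_PROMPT = "roll_call_prompt"

# The five message kinds that trigger HMD vibration per the list above.
HMD_VIBRATION_EVENTS = {
    SESSION_MESSAGE, CONTACTLESS_INTERACTION, EMAIL,
    TEAM_INVITATION, ROLL_CALL_PROMPT,
}

def should_vibrate_hmd(event_type: str, recipients: list, user: str) -> bool:
    """True if the event is a vibration-triggering message kind and the
    given user is among its recipients (e.g. an email's recipient list)."""
    return event_type in HMD_VIBRATION_EVENTS and user in recipients
```

A "hug" or other body-contact event would not match here; in the patent those are routed to ring, belt, or shoe devices instead.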
  • the virtual reality kit includes a head-mounted display device and a ring device;
  • the control module 320 is specifically used to control the head-mounted display device and the ring device to trigger vibration feedback in response to upper body limb contact triggered by the first virtual object with respect to the second virtual object.
  • the control module 320 is specifically configured to respond to a touch operation triggered by the first virtual object on the shoulder of the second virtual object, or a touch operation triggered on the arm, back, or waist of the second virtual object;
  • the control module 320 is specifically used to obtain motion amplitude information corresponding to the upper body limb contact, where the motion amplitude information is used to describe the motion amplitude of the upper body limb contact;
  • the vibration intensity information is determined based on the motion amplitude information, where the vibration intensity information is positively correlated with the motion amplitude information;
  • based on the vibration intensity information, control the head-mounted display device and the ring device to trigger vibration feedback.
  • the virtual reality kit includes a head-mounted display device;
  • the control module 320 is specifically configured to control the head-mounted display device to trigger vibration feedback in response to the head contact triggered by the first virtual object with respect to the second virtual object.
  • the control module 320 is specifically configured to respond to a head accessory wearing operation triggered by the first virtual object for the second virtual object, or a head accessory removal operation, a face touch operation, or a neck touch operation triggered for the second virtual object;
  • the control module 320 is specifically used to obtain the motion amplitude information corresponding to the head contact, where the motion amplitude information is used to describe the motion amplitude of the head contact;
  • the vibration intensity information is determined based on the motion amplitude information, where the vibration intensity information is positively correlated with the motion amplitude information;
  • based on the vibration intensity information, the head-mounted display device is controlled to trigger vibration feedback.
  • the virtual reality kit includes a ring device;
  • the control module 320 is specifically configured to control the ring device to trigger vibration feedback in response to the hand contact triggered by the first virtual object against the second virtual object.
  • the control module 320 is specifically configured to respond to a handshake operation triggered by the first virtual object for the second virtual object, or a high-five operation or a hand touch operation triggered for the second virtual object;
  • the control module 320 is specifically used to obtain the motion amplitude information corresponding to the hand contact, where the motion amplitude information is used to describe the motion amplitude of the hand contact;
  • the vibration intensity information is determined based on the motion amplitude information, where the vibration intensity information is positively correlated with the motion amplitude information;
  • based on the vibration intensity information, the ring device is controlled to trigger vibration feedback.
  • the virtual reality kit includes a glove device;
  • the control module 320 is specifically configured to, when the first virtual object transfers the first item to the second virtual object, control the glove device to trigger heating feedback in response to a contact operation on the first item.
  • the virtual reality kit includes a glove device;
  • the control module 320 is specifically configured to control the glove device to trigger cooling feedback in response to a contact operation on the second item when the first virtual object transfers the second item to the second virtual object.
  • the virtual reality kit includes a head-mounted display device;
  • the control module 320 is specifically configured to control the head-mounted display device to trigger airflow feedback in response to the whisper operation triggered by the first virtual object for the second virtual object.
  • the virtual reality kit includes a belt device;
  • the control module 320 is specifically configured to control the belt device to trigger contraction feedback in response to the hug operation triggered by the first virtual object for the second virtual object.
  • the virtual reality kit includes a shoe device;
  • the control module 320 is specifically configured to, in response to a passing operation triggered by the first virtual object for the second virtual object, control the shoe device to trigger vibration feedback when the target ball contacts the lower body limbs of the second virtual object.
  • the embodiment of the present application also provides a terminal device, as shown in Figure 31. For convenience of explanation, only the parts related to the embodiment of the present application are shown; for specific technical details that are not disclosed, please refer to the method part of the embodiments of the present application.
  • the following description takes the case where the terminal device is a virtual reality device as an example:
  • Figure 31 shows a block diagram of a partial structure of the virtual reality device related to the terminal device provided by the embodiment of the present application.
  • the virtual reality device includes: a radio frequency (RF) circuit 410, a memory 420, an input unit 430 (which includes a touch panel 431 and other input devices 432), a display unit 440 (which includes a display panel 441), a sensor 450, an audio circuit 460 (which is connected to a speaker 461 and a microphone 462), a wireless fidelity (WiFi) module 470, a processor 480, a power supply 490, and other components.
  • the structure of the virtual reality device shown in Figure 31 does not constitute a limitation on the virtual reality device, which may include more or fewer components than shown, combine some components, or use a different component layout.
  • the memory 420 may be used to store software programs and modules, and the processor 480 executes the software programs and modules stored in the memory 420 to perform various functional applications and data processing of the virtual reality device.
  • the memory 420 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like; the storage data area may store data created based on the use of the virtual reality device (such as audio data, a phonebook, etc.), and the like.
  • the memory 420 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 480 is the control center of the virtual reality device; it uses various interfaces and lines to connect the various parts of the entire virtual reality device, and performs the various functions of the virtual reality device and processes data by running or executing the software programs and/or modules stored in the memory 420 and invoking the data stored in the memory 420.
  • the processor 480 may include one or more processing units; optionally, the processor 480 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the above modem processor may alternatively not be integrated into the processor 480.
  • the virtual reality device may also include a camera, a Bluetooth module, etc., which will not be described again here.
  • the steps performed by the terminal device in the above embodiment may be based on the terminal device structure shown in FIG. 31 .
  • Embodiments of the present application also provide a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the steps of the method described in each of the foregoing embodiments are implemented.
  • the embodiments of the present application also provide a computer program product, which includes a computer program; when the computer program is executed by a processor, the steps of the method described in each of the foregoing embodiments are implemented.
  • it should be noted that relevant data such as user operation information is involved in the embodiments of the present application. When the above embodiments are applied to specific products or technologies, user permission or consent needs to be obtained, and the collection, use, and processing of relevant data need to comply with the relevant laws, regulations, and standards of the relevant countries and regions.


Abstract

This application discloses a virtual reality interaction method, whose application scenarios involve at least various types of virtual reality devices, such as head-mounted display devices, ring devices, and glove devices. The method of this application includes: displaying at least two virtual objects in a virtual reality scene, where the at least two virtual objects include a first virtual object and a second virtual object, the first virtual object being a virtual object controlled by a first object and the second virtual object being a virtual object controlled by a second object; and, in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling a virtual reality kit to trigger somatosensory feedback, where the virtual reality kit includes at least one virtual reality device worn by the second object. This application also provides a related apparatus, device, and storage medium. In a virtual reality scene where at least two people interact, this application supports interactive operations between virtual objects, and provides real users with corresponding somatosensory feedback through the virtual reality kit, thereby enhancing the sense of interaction between users.

Description

A virtual reality interaction method, related apparatus, device, and storage medium
This application claims priority to Chinese patent application No. 2022107059424, entitled "A virtual reality interaction method, related apparatus, device, and storage medium", filed with the China National Intellectual Property Administration on June 21, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of virtual reality technology, and in particular to virtual reality interaction technology.
Background
Virtual reality (VR), as the name suggests, is the combination of the virtual and the real. VR technology is a computer simulation system that can create, and let users experience, a virtual world. It uses a computer to generate a simulated environment and allows users to immerse themselves in that environment. More specifically, VR technology generates electronic signals from real-life data through computer technology and combines them with various output devices to transform them into phenomena that users can perceive.
The immersive experience of VR cannot be delivered by devices such as mobile phones, computers, and tablets. At present, VR haptic feedback is mainly implemented through VR controllers, most commonly as vibration feedback based on linear motors and the like: a force feedback device is arranged at the trigger of the controller, which can apply a preset reaction force when the user presses it. A special tactile device can also be arranged at the grip to simulate touch by vibrating according to a specified waveform.
However, existing solutions have at least the following problem: the existing haptic feedback modes are rather limited, focusing mainly on human-computer interaction scenarios and lacking a sense of interaction between users.
Summary
Embodiments of this application provide a virtual reality interaction method, related apparatus, device, and storage medium. In a virtual reality scene where at least two people interact, this application supports interactive operations between virtual objects and can provide real users with corresponding somatosensory feedback through a virtual reality kit, thereby enhancing the sense of interaction between users and making the virtual reality interactive experience more immersive and realistic.
In view of this, one aspect of this application provides a virtual reality interaction method, performed by a terminal device, including:
displaying at least two virtual objects in a virtual reality scene, where the at least two virtual objects include a first virtual object and a second virtual object, the first virtual object is a virtual object controlled by a first object, and the second virtual object is a virtual object controlled by a second object; and
in response to an interactive operation triggered by the first virtual object for the second virtual object, controlling a virtual reality kit to trigger somatosensory feedback, where the virtual reality kit includes at least one virtual reality device worn by the second object.
Another aspect of this application provides a virtual reality interaction apparatus, including:
a display module, configured to display at least two virtual objects in a virtual reality scene, where the at least two virtual objects include a first virtual object and a second virtual object, the first virtual object is a virtual object controlled by a first object, and the second virtual object is a virtual object controlled by a second object; and
a control module, configured to, in response to an interactive operation triggered by the first virtual object for the second virtual object, control a virtual reality kit to trigger somatosensory feedback, where the virtual reality kit includes at least one virtual reality device worn by the second object.
Another aspect of this application provides a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the methods of the above aspects when executing the computer program.
Another aspect of this application provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the methods of the above aspects.
Another aspect of this application provides a computer program product, including a computer program, where the computer program, when executed by a processor, implements the methods of the above aspects.
As can be seen from the above technical solutions, the embodiments of this application have the following advantages:
The embodiments of this application provide a virtual reality interaction method: at least two virtual objects are displayed in a virtual reality scene, the at least two virtual objects including a first virtual object and a second virtual object, where the first virtual object is a virtual object controlled by a first object and the second virtual object is a virtual object controlled by a second object. When an interactive operation triggered by the first virtual object for the second virtual object is responded to, a virtual reality kit is controlled to trigger somatosensory feedback, the virtual reality kit including at least one virtual reality device worn by the second object. In this way, in a virtual reality scene in which at least two people participate, interactive operations between the virtual objects respectively controlled by at least two users are supported. On this basis, in combination with the interactive operations in the virtual scene, corresponding somatosensory feedback is provided to real users through the virtual reality kit, thereby enhancing the sense of interaction between users and making the virtual reality interactive experience more immersive and realistic. Here, the sense of interaction between users means that, in a virtual reality scene in which at least two people participate, when an interactive operation occurs between the virtual objects controlled by the users, somatosensory feedback corresponding to that interactive operation is provided to the real users, so that the real users feel the corresponding interaction.
Brief Description of the Drawings
Figure 1 is a schematic diagram of a physical architecture of the virtual reality interaction system in an embodiment of this application;
Figure 2 is a schematic diagram of a logical architecture of the virtual reality interaction system in an embodiment of this application;
Figure 3 is a schematic structural diagram of the ultrasonic sensing module in an embodiment of this application;
Figure 4 is a schematic flowchart of the virtual reality interaction method in an embodiment of this application;
Figure 5 is a schematic diagram of responding to a session message based on a virtual reality scene in an embodiment of this application;
Figure 6 is a schematic diagram of responding to a non-contact interactive message based on a virtual reality scene in an embodiment of this application;
Figure 7 is a schematic diagram of receiving an email based on a virtual reality scene in an embodiment of this application;
Figure 8 is a schematic diagram of responding to a file transfer based on a virtual reality scene in an embodiment of this application;
Figure 9 is a schematic diagram of responding to a team invitation message based on a virtual reality scene in an embodiment of this application;
Figure 10 is a schematic diagram of responding to a roll-call prompt message based on a virtual reality scene in an embodiment of this application;
Figure 11 is a schematic diagram of the upper body limbs of a virtual object in an embodiment of this application;
Figure 12 is a schematic diagram of responding to a shoulder touch operation based on a virtual reality scene in an embodiment of this application;
Figure 13 is a schematic diagram of responding to an arm touch operation based on a virtual reality scene in an embodiment of this application;
Figure 14 is a schematic diagram of responding to a back touch operation based on a virtual reality scene in an embodiment of this application;
Figure 15 is a schematic diagram of responding to a waist touch operation based on a virtual reality scene in an embodiment of this application;
Figure 16 is a schematic diagram of the head of a virtual object in an embodiment of this application;
Figure 17 is a schematic diagram of responding to a head accessory wearing operation based on a virtual reality scene in an embodiment of this application;
Figure 18 is a schematic diagram of responding to a head accessory removal operation based on a virtual reality scene in an embodiment of this application;
Figure 19 is a schematic diagram of responding to a face touch operation based on a virtual reality scene in an embodiment of this application;
Figure 20 is a schematic diagram of responding to a neck touch operation based on a virtual reality scene in an embodiment of this application;
Figure 21 is a schematic diagram of the hand of a virtual object in an embodiment of this application;
Figure 22 is a schematic diagram of responding to a handshake operation based on a virtual reality scene in an embodiment of this application;
Figure 23 is a schematic diagram of responding to a high-five operation based on a virtual reality scene in an embodiment of this application;
Figure 24 is a schematic diagram of responding to a hand touch operation based on a virtual reality scene in an embodiment of this application;
Figure 25 is a schematic diagram of responding to a first item transfer operation based on a virtual reality scene in an embodiment of this application;
Figure 26 is a schematic diagram of responding to a second item transfer operation based on a virtual reality scene in an embodiment of this application;
Figure 27 is a schematic diagram of responding to a whisper operation based on a virtual reality scene in an embodiment of this application;
Figure 28 is a schematic diagram of responding to a hug operation based on a virtual reality scene in an embodiment of this application;
Figure 29 is a schematic diagram of responding to a passing operation based on a virtual reality scene in an embodiment of this application;
Figure 30 is a schematic diagram of the virtual reality interaction apparatus in an embodiment of this application;
Figure 31 is a schematic structural diagram of the virtual reality device in an embodiment of this application.
Detailed Description
Virtual reality technology has been recognized by more and more people. Users can experience the most realistic sensations in a virtual reality world, whose simulated environment is so lifelike that it is hard to distinguish from the real world, giving a feeling of being personally on the scene. At the same time, virtual reality covers the perceptual capabilities that humans possess, such as hearing, vision, touch, taste, and smell. Finally, it has a powerful simulation system that truly realizes human-computer interaction, so that during operation a person can act at will and receive the most realistic feedback from the environment. It is precisely the characteristics of virtual reality technology, namely immersion, interactivity, multi-sensory perception, imagination, and autonomy, that have made it popular with many people. These characteristics are introduced below.
(1) Immersion
Immersion is the most important characteristic of virtual reality technology: it makes users become, and feel that they are, part of the environment created by the computer system. The immersion of virtual reality technology depends on the user's perceptual system. When users perceive the stimuli of the virtual world (for example, touch, taste, smell, motion perception, etc.), a resonance of thought is produced, causing psychological immersion and the feeling of entering the real world.
(2) Interactivity
Interactivity refers to the degree to which users can operate objects in the simulated environment and the naturalness of the feedback they receive from the environment. When a user enters the virtual space, the relevant technology lets the user interact with the environment; when the user performs some operation, the surrounding environment also reacts in some way. If the user touches an object in the virtual space, the user should be able to feel it in the hand; if the user acts on the object, the position and state of the object should also change.
(3) Multi-sensory perception
Multi-sensory perception means that computer technology should support many modes of perception, such as hearing, touch, smell, and so on. The perception capabilities of most current virtual reality technology include vision, hearing, touch, motion, etc.
(4) Imagination
Imagination, also called conceivability, means that in the virtual space users can interact with surrounding objects, broaden the scope of cognition, and create scenes that do not exist in the objective world or environments that could not occur.
(5) Autonomy
Autonomy refers to the degree to which objects in the virtual environment act according to the laws of physics. For example, when pushed by a force, an object will move in the direction of the force, topple over, or fall from a table to the ground.
In order to achieve better interactivity and a better immersion effect in virtual scenes, this application proposes a virtual reality interaction method, which is applied to the virtual reality interaction system shown in Figure 1. As shown in the figure, the virtual reality interaction system includes a server and at least two virtual reality kits. Taking the server 110, the virtual reality kit 120, and the virtual reality kit 130 shown in the figure as an example, the virtual reality kit 120 is worn by one real user and the virtual reality kit 130 is worn by another real user. A virtual reality kit includes at least one virtual reality device, and a client is installed on the virtual reality device, where the client may run on the virtual reality device in the form of a browser, in the form of an independent application (APP), and so on; the specific form of the client is not limited here. The server involved in this application may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (CDN), and big data and artificial intelligence platforms. The virtual reality device may be, but is not limited to, a head-mounted display device, a ring device, a glove device, a belt device, a shoe device, a controller device, clothing, a wearable device, an exoskeleton, etc. The virtual reality device and the server may be connected directly or indirectly through wired or wireless communication, which is not limited in this application. The numbers of servers and virtual reality devices are also not limited.
For example, a virtual object A controlled by user A and a virtual object B controlled by user B are in the same virtual reality scene, where user A wears the virtual reality kit 120, user B wears the virtual reality kit 130, and the virtual reality kit 120 and the virtual reality kit 130 establish a communication connection through the server 110. When user A controls virtual object A to trigger an action toward virtual object B, user B receives corresponding somatosensory feedback through the virtual reality kit 130 that they wear.
Based on the physical architecture of the virtual reality interaction system shown in Figure 1, the logical architecture of the system is introduced below with reference to Figure 2. Please refer to Figure 2, which is a schematic diagram of a logical architecture of the virtual reality interaction system in an embodiment of this application. As shown in the figure, when a virtual object controlled by a user receives a prompt or triggers an interaction in the virtual reality scene, the backend matches a corresponding haptic feedback mechanism and provides the user with corresponding haptic feedback through the ultrasonic sensing module built into the virtual reality device (for example, a head-mounted display device or a ring device). Specifically:
First, user identification is performed, that is, identification based on the APP identity document (ID) with which the user logs in to the virtual reality application. Next, the interaction category is determined: new-message reminders are mainly classified as prompt-type interactions, while triggering an interactive action in a specific scene belongs to interactive-type interactions. When determining whether an interactive action has been triggered, the position of the hand is mainly identified through the virtual reality device (for example, a ring device or a glove device). Finally, a feedback mechanism is matched; for example, a head interaction can be fed back as one vibration through the head-mounted display device worn by the user, and a hand interaction can be fed back as one vibration through the ring device worn by the user. That is, corresponding haptic vibration feedback is provided to the user through the ultrasonic sensor built into the virtual reality device worn by the user.
It should be noted that, in addition to vibration feedback, the virtual reality device in this application may also provide other types of feedback, including but not limited to gas feedback, liquid feedback, pressure feedback, etc., which is not limited here.
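The matching flow described above — identify the user by app ID, classify the event as a prompt-type or interactive-type interaction, then select the device(s) that provide feedback — can be sketched as follows. The event names and the device routing table are illustrative assumptions; only the three-stage structure comes from the text.

```python
# Sketch of the backend matching pipeline: app-ID identification,
# prompt/interactive classification, and device routing. Illustrative only.
PROMPT_EVENTS = {"new_message", "email", "team_invitation"}  # prompt-type

# Interactive-type events mapped to the device(s) that vibrate,
# mirroring the examples: head interaction -> HMD, hand interaction -> ring.
INTERACTIVE_EVENTS = {
    "head_touch": ["hmd"],
    "hand_touch": ["ring"],
    "shoulder_touch": ["hmd", "ring"],
}

def match_feedback(app_id: str, event: str):
    """Return (identified user, devices to actuate) for a recognized event."""
    if event in PROMPT_EVENTS:
        return app_id, ["hmd"]            # prompts go to the head-mounted display
    if event in INTERACTIVE_EVENTS:
        return app_id, INTERACTIVE_EVENTS[event]
    return app_id, []                     # unrecognized event: no haptic feedback
```

A real system would then hand the device list to the ultrasonic sensing module drivers of the matching worn devices.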
Since an ultrasonic sensing module is built into the virtual reality device, for ease of understanding please refer to Figure 3, which is a schematic structural diagram of the ultrasonic sensing module in an embodiment of this application. As shown in the figure, the ultrasonic sensing module usually includes an ultrasonic sensor array, a control circuit, and a drive circuit, and the ultrasonic sensor array is mainly divided into a transmitting part and a receiving part.
The transmitting part is mainly composed of a transmitter and a transducer. The transducer converts the energy generated when the piezoelectric wafer vibrates under voltage excitation into ultrasonic waves, and the transmitter emits the generated ultrasonic waves.
The receiving part is mainly composed of a transducer and an amplifier circuit. The transducer receives the reflected ultrasonic waves; since mechanical vibration is produced when receiving ultrasonic waves, the transducer can convert the mechanical energy into an electrical signal, which is then amplified by the amplifier circuit.
With reference to the above introduction, the virtual reality interaction method in this application is introduced below. Please refer to Figure 4. The virtual reality interaction method in the embodiments of this application may be performed by a terminal device, which may specifically be a virtual reality device. The virtual reality interaction method provided by this application includes:
210. Display at least two virtual objects in a virtual reality scene, where the at least two virtual objects include a first virtual object and a second virtual object, the first virtual object is a virtual object controlled by a first object, and the second virtual object is a virtual object controlled by a second object.
In one or more embodiments, at least two virtual objects are displayed in the virtual reality scene. This application takes the case where the at least two virtual objects include a first virtual object and a second virtual object as an example, which should not be understood as a limitation on this application. A virtual object may be a human figure or a cartoon figure, etc., which is not limited here.
Specifically, the first object (i.e., user A) wears a virtual reality kit and enters the virtual reality scene as the first virtual object. Similarly, the second object (i.e., user B) also wears a virtual reality kit and enters the virtual reality scene as the second virtual object. On this basis, the first object can control the first virtual object to interact with the second virtual object through the virtual reality kit; similarly, the second object can also control the second virtual object to interact with the first virtual object through the virtual reality kit.
It should be noted that the virtual reality scenes involved in this application include, but are not limited to, gaming scenes, industrial manufacturing scenes, medical scenes, educational scenes, shopping scenes, office meeting scenes, training scenes, safety drill scenes, livestreaming scenes, home decoration and architectural design scenes, etc. The virtual reality kit involved in this application includes one or more virtual reality devices.
220. In response to an interactive operation triggered by the first virtual object for the second virtual object, control the virtual reality kit to trigger somatosensory feedback, where the virtual reality kit includes at least one virtual reality device worn by the second object.
In one or more embodiments, interactive operations between different virtual objects are supported in the virtual reality scene, and the virtual reality kit can be controlled to trigger somatosensory feedback based on the type and intensity of the interactive operation.
Specifically, the first object (i.e., user A) controls the first virtual object to trigger an interactive operation toward the second virtual object controlled by the second object (i.e., user B); the virtual reality kit worn by the second object (i.e., user B) then responds to the interactive operation and triggers corresponding somatosensory feedback. Optionally, the virtual reality kit worn by the first object (i.e., user A) may also respond to the interactive operation and trigger corresponding somatosensory feedback.
The embodiments of this application provide a virtual reality interaction method. In the above manner, interactive operations between virtual objects are supported in a virtual reality scene in which at least two people interact. On this basis, in combination with the interactive operations in the virtual reality scene, corresponding somatosensory feedback is provided to real users through the virtual reality kit, thereby enhancing the sense of interaction between users and making the virtual reality interactive experience more immersive and realistic.
Optionally, on the basis of the embodiments corresponding to Figure 4 above, in another optional embodiment provided by the embodiments of this application, in response to the interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may specifically include:
in response to a session message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In one or more embodiments, a feedback method for session messages in a social virtual reality scene is introduced. As can be seen from the foregoing embodiments, the virtual reality scene supports real users in creating their own virtual objects and controlling the virtual objects to perform activities in the virtual reality scene. The following takes a social virtual reality scene as an example.
Specifically, for ease of understanding, please refer to Figure 5, which is a schematic diagram of responding to a session message based on a virtual reality scene in an embodiment of this application. As shown in diagram (A) of Figure 5, the first object controls the first virtual object, and the second object controls the second virtual object. For example, the first object controls the first virtual object to send a session message to the second virtual object, e.g., "Hello, I'm Jerry, nice to meet you". On this basis, when the second virtual object receives the session message, the head-mounted display device worn by the second object triggers vibration feedback.
Similarly, as shown in diagram (B) of Figure 5, the second object can control the second virtual object to reply to the session message sent by the first object through the first virtual object. For example, the second object controls the second virtual object to send a session message to the first virtual object, e.g., "Hi Jerry, nice to meet you, my name is Mary". On this basis, when the first virtual object receives the session message, the head-mounted display device worn by the first object triggers vibration feedback.
It should be noted that the number of virtual objects, the appearance of the virtual objects, the number of session messages, and the content of the session messages shown in Figure 5 are all illustrative and should not be understood as limitations on this application.
It can be understood that the vibration feedback provided by the head-mounted display device may be one vibration or several consecutive vibrations.
Furthermore, the embodiments of this application provide a feedback method for session messages in a social virtual reality scene. In the above manner, in a virtual reality scene, real users can perceive session messages sent by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality haptic feedback and the functionality of the interactive carrier; on the other hand, combined with the social virtual reality scene, it provides more diverse interaction methods, giving users a more realistic sense of social presence and helping to make the interactive experience more enjoyable.
Optionally, on the basis of the embodiments corresponding to Figure 4 above, in another optional embodiment provided by the embodiments of this application, in response to the interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may specifically include:
in response to a non-contact interactive message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In one or more embodiments, a feedback method for non-contact interactive messages in a social virtual reality scene is introduced. As can be seen from the foregoing embodiments, the virtual reality scene supports real users in creating their own virtual objects and controlling the virtual objects to perform activities in the virtual reality scene. The following takes a social virtual reality scene as an example.
Specifically, for ease of understanding, please refer to Figure 6, which is a schematic diagram of responding to a non-contact interactive message based on a virtual reality scene in an embodiment of this application. As shown in diagram (A) of Figure 6, the first object controls the first virtual object, and the second object controls the second virtual object. For example, the first object controls the first virtual object to send a non-contact interactive message to the second virtual object, e.g., a "show affection" message, which can be expressed by launching a heart or blowing a kiss. On this basis, when the second virtual object receives the "show affection" message, the head-mounted display device worn by the second object triggers vibration feedback.
Similarly, as shown in diagram (B) of Figure 6, the second object can control the second virtual object to reply to the "show affection" message sent by the first object through the first virtual object. For example, the second object can control the second virtual object to send a non-contact interactive message to the first virtual object, e.g., a "wink" message. On this basis, when the first virtual object receives the "wink" message, the head-mounted display device worn by the first object triggers vibration feedback.
It should be noted that the number of virtual objects, the appearance of the virtual objects, the number of non-contact interactive messages, and the content of the non-contact interactive messages shown in Figure 6 are all illustrative and should not be understood as limitations on this application.
It can be understood that the vibration feedback provided by the head-mounted display device may be one vibration or several consecutive vibrations. Non-contact interactive messages include, but are not limited to, "show affection", "fist salute", "thumbs up", "wink", etc.
Furthermore, the embodiments of this application provide a feedback method for non-contact interactive messages in a social virtual reality scene. In the above manner, in a virtual reality scene, real users can perceive non-contact interactive messages sent by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality haptic feedback and the functionality of the interactive carrier; on the other hand, combined with the social virtual reality scene, it provides more diverse interaction methods, giving users a more realistic sense of social presence and helping to make the interactive experience more enjoyable.
Optionally, on the basis of the embodiments corresponding to Figure 4 above, in another optional embodiment provided by the embodiments of this application, in response to the interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may specifically include:
in response to an email sent by the first virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback, where the recipients of the email include the second virtual object.
In one or more embodiments, a feedback method for emails in an office virtual reality scene is introduced. As can be seen from the foregoing embodiments, the virtual reality scene supports real users in creating their own virtual objects and controlling the virtual objects to perform activities in the virtual reality scene. The following takes an office virtual reality scene as an example.
Specifically, for ease of understanding, please refer to Figure 7, which is a schematic diagram of receiving an email based on a virtual reality scene in an embodiment of this application. As shown in the figure, the first object controls the first virtual object, and the second object controls the second virtual object. For example, the first object controls the first virtual object to send an email whose recipients include the second virtual object, i.e., the recipient email addresses include the email address of the second virtual object (for example, mary@qq.com). On this basis, when the second virtual object receives the email, the head-mounted display device worn by the second object triggers vibration feedback.
It should be noted that the number of virtual objects, the appearance of the virtual objects, the number of emails, and the content of the emails shown in Figure 7 are all illustrative and should not be understood as limitations on this application.
It can be understood that the vibration feedback provided by the head-mounted display device may be one vibration or several consecutive vibrations.
Furthermore, the embodiments of this application provide a feedback method for emails in an office virtual reality scene. In the above manner, in a virtual reality scene, real users can perceive emails sent by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality haptic feedback and the functionality of the interactive carrier; on the other hand, combined with the office virtual reality scene, it provides more diverse interaction methods, immersing users in the scene and helping to make the interactive experience more enjoyable.
Optionally, on the basis of the embodiments corresponding to Figure 4 above, in another optional embodiment provided by the embodiments of this application, in response to the interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may specifically include:
in response to an electronic file sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In one or more embodiments, a feedback method for electronic files in an office virtual reality scene is introduced. As can be seen from the foregoing embodiments, the virtual reality scene supports real users in creating their own virtual objects and controlling the virtual objects to perform activities in the virtual reality scene. The following takes an office virtual reality scene as an example.
Specifically, for ease of understanding, please refer to Figure 8, which is a schematic diagram of responding to a file transfer based on a virtual reality scene in an embodiment of this application. As shown in the figure, the first object controls the first virtual object, and the second object controls the second virtual object. For example, the first object controls the first virtual object to send an electronic file to the second virtual object. On this basis, when the second virtual object receives the electronic file, the head-mounted display device worn by the second object triggers vibration feedback.
It should be noted that the number of virtual objects, the appearance of the virtual objects, the number of electronic files, and the content of the electronic files shown in Figure 8 are all illustrative and should not be understood as limitations on this application.
It can be understood that the vibration feedback provided by the head-mounted display device may be one vibration or several consecutive vibrations. Electronic files include, but are not limited to, text files, image files, graphics files, video files, sound files, hypermedia link files, program files, and data files.
Furthermore, the embodiments of this application provide a feedback method for electronic files in an office virtual reality scene. In the above manner, in a virtual reality scene, real users can perceive electronic files transferred by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality haptic feedback and the functionality of the interactive carrier; on the other hand, combined with the office virtual reality scene, it provides more diverse interaction methods, immersing users in the scene and helping to make the interactive experience more enjoyable.
Optionally, on the basis of the embodiments corresponding to Figure 4 above, in another optional embodiment provided by the embodiments of this application, in response to the interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may specifically include:
in response to a team invitation message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In one or more embodiments, a feedback method for team invitation messages in a gaming virtual reality scene is introduced. As can be seen from the foregoing embodiments, the virtual reality scene supports real users in creating their own virtual objects and controlling the virtual objects to perform activities in the virtual reality scene. The following takes a gaming virtual reality scene as an example.
Specifically, for ease of understanding, please refer to Figure 9, which is a schematic diagram of responding to a team invitation message based on a virtual reality scene in an embodiment of this application. As shown in the figure, the first object controls the first virtual object, and the second object controls the second virtual object. For example, the first object controls the first virtual object to create "Team A" in a game, and then controls the first virtual object to send a team invitation message for "Team A" to the second virtual object, i.e., inviting the second virtual object to join "Team A" for the game. On this basis, when the second virtual object receives the team invitation message, the head-mounted display device worn by the second object triggers vibration feedback.
It should be noted that the number of virtual objects, the appearance of the virtual objects, and the game type shown in Figure 9 are all illustrative and should not be understood as limitations on this application.
It can be understood that the vibration feedback provided by the head-mounted display device may be one vibration or several consecutive vibrations. Game types include, but are not limited to, multiplayer online battle arena (MOBA) games, real-time strategy (RTS) games, role-playing games (RPG), and first-person shooting (FPS) games.
Furthermore, the embodiments of this application provide a feedback method for team invitation messages in a gaming virtual reality scene. In the above manner, in a virtual reality scene, real users can perceive team invitation messages initiated by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality haptic feedback and the functionality of the interactive carrier; on the other hand, combined with the gaming virtual reality scene, it provides more diverse interaction methods, immersing users in the scene and helping to make the interactive experience more enjoyable.
Optionally, on the basis of the embodiments corresponding to Figure 4 above, in another optional embodiment provided by the embodiments of this application, in response to the interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may specifically include:
in response to a roll-call prompt message sent by the first virtual object to the second virtual object, controlling the head-mounted display device in the virtual reality kit to trigger vibration feedback.
In one or more embodiments, a feedback method for roll-call prompt messages in a distance-education virtual reality scene is introduced. As can be seen from the foregoing embodiments, the virtual reality scene supports real users in creating their own virtual objects and controlling the virtual objects to perform activities in the virtual reality scene. The following takes a distance-education virtual reality scene as an example.
Specifically, for ease of understanding, please refer to Figure 10, which is a schematic diagram of responding to a roll-call prompt message based on a virtual reality scene in an embodiment of this application. As shown in the figure, the first object controls the first virtual object, and the second object controls the second virtual object. For example, the first object controls the first virtual object to give a lecture, and during the lecture calls on the second virtual object, i.e., sends a roll-call prompt message to the second virtual object. For example, if the name of the second virtual object is "Mary", then when the first virtual object calls on "Mary", a roll-call prompt message for the second virtual object is sent. On this basis, when the second virtual object receives the roll-call prompt message, the head-mounted display device worn by the second object triggers vibration feedback.
It should be noted that the number of virtual objects and the appearance of the virtual objects shown in Figure 10 are illustrative and should not be understood as limitations on this application.
It can be understood that the vibration feedback provided by the head-mounted display device may be one vibration or several consecutive vibrations.
Furthermore, the embodiments of this application provide a feedback method for roll-call prompt messages in a distance-education virtual reality scene. In the above manner, in a virtual reality scene, real users can perceive roll-call prompt messages initiated by other users through the virtual reality kit. On the one hand, this enriches the diversity of virtual reality haptic feedback and the functionality of the interactive carrier; on the other hand, combined with the distance-education virtual reality scene, it provides more diverse interaction methods, immersing users in the scene and helping to improve teaching effectiveness.
Optionally, on the basis of the embodiments corresponding to Figure 4 above, in another optional embodiment provided by the embodiments of this application, the virtual reality kit may include a head-mounted display device and a ring device; correspondingly, in response to the interactive operation triggered by the first virtual object for the second virtual object, controlling the virtual reality kit to trigger somatosensory feedback may specifically include:
in response to upper body limb contact triggered by the first virtual object for the second virtual object, controlling the head-mounted display device and the ring device to trigger vibration feedback.
In one or more embodiments, a method of triggering head and hand feedback based on upper body limb contact is introduced. As can be seen from the foregoing embodiments, the virtual reality scene also supports triggering interactive actions between virtual objects. Specifically, the first virtual object can touch the upper body limbs of the second virtual object, i.e., trigger upper body limb contact for the second virtual object. On this basis, the head-mounted display device and the ring device worn by the second object will trigger vibration feedback at the same time.
Specifically, for ease of understanding, please refer to Figure 11, which is a schematic diagram of the upper body limbs of a virtual object in an embodiment of this application. As shown in the figure, A1 is used to indicate the upper body limbs of the virtual object, where the upper body limbs can be understood as the upper body excluding the head and hands. Taking the case where the first virtual object and the second virtual object are both human figures as an example, if the first virtual object touches the upper body limbs of the second virtual object, this can be determined as an upper-body interaction, i.e., upper body limb contact.
Furthermore, the embodiments of this application provide a method of triggering head and hand feedback based on upper body limb contact. In the above manner, upper body limb interaction between virtual objects is supported in the virtual reality scene, which enriches the diversity of virtual reality haptic feedback and the functionality of the interactive carrier.
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,响应第一虚拟对象针对第二虚拟对象触发的上身肢体接触,具体可以包括:
响应第一虚拟对象针对第二虚拟对象的肩膀部位触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的手臂部位触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的背部部位触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的腰部部位触发的触碰操作。
在一个或多个实施例中,介绍了一种在虚拟现实场景中响应上身肢体接触的方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行互动。下面将结合不同的互动场景进行介绍。
示例性地,为了便于理解,请参阅图12,图12为本申请实施例中基于虚拟现实场景响应肩膀部位触碰操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象走向第二虚拟对象,然后,拍了一下第二虚拟对象的肩膀部位,即,第一虚拟对象针对第二虚拟对象的肩膀部位触发了触碰操作。基于此,第二对象佩戴的头显设备和指环设备会同时触发震动反馈。
示例性地,为了便于理解,请参阅图13,图13为本申请实施例中基于虚拟现实场景响应手臂部位触碰操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象走向第二虚拟对象,然后,拍了一下第二虚拟对象的手臂部位,即,第一虚拟对象针对第二虚拟对象的手臂部位触发了触碰操作。基于此,第二对象佩戴的头显设备和指环设备会同时触发震动反馈。
示例性地,为了便于理解,请参阅图14,图14为本申请实施例中基于虚拟现实场景响应背部部位触碰操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象走向第二虚拟对象,然后,拍了一下第二虚拟对象的背部部位,即,第一虚拟对象针对第二虚拟对象的背部部位触发了触碰操作。基于此,第二对象佩戴的头显设备和指环设备会同时触发震动反馈。
示例性地,为了便于理解,请参阅图15,图15为本申请实施例中基于虚拟现实场景响应腰部部位触碰操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象走向第二虚拟对象,然后,拍了一下第二虚拟对象的腰部部位,即,第一虚拟对象针对第二虚拟对象的腰部部位触发了触碰操作。基于此,第二对象佩戴的头显设备和指环设备会同时触发震动反馈。
需要说明的是,图12至图15示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
可以理解的是,头显设备和指环设备提供的震动反馈可以是震动1次,也可以是连续震动若干次;头显设备的震动次数与指环设备的震动次数可以相同,也可以不同。
再次,本申请实施例提供了一种在虚拟现实场景中响应上身肢体接触的方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户触发的上身肢体接触操作,丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,提升交互过程的真实感,使得用户身临其境。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,控制头显设备以及指环设备触发震动反馈,具体可以包括:
获取上身肢体接触对应的动作幅度信息,其中,动作幅度信息用于描述上身肢体接触的动作幅度大小;
基于动作幅度信息确定震动力度信息,其中,震动力度信息与动作幅度信息正相关;
基于震动力度信息,控制头显设备以及指环设备触发震动反馈。
在一个或多个实施例中,介绍了一种按照动作幅度提供相应力度反馈的方式。由前述实施例可知,虚拟现实设备中还可以内置惯性测量装置(inertial measurement units,IMU),通常情况下,IMU包括陀螺仪及加速度计。其中,加速度计用于检测物体在载体坐标系统中独立三轴的加速度信号,对单方向加速度信号积分即可得到该方向的速度。
具体地,以第一虚拟对象向第二虚拟对象触发上身肢体接触为例,其中,第一对象(即,真实的用户)佩戴指环设备(即,虚拟现实指环设备),由此,通过指环设备控制第一虚拟对象拍一下第二虚拟对象的上身肢体。于是,可通过指环设备中内置的IMU,测量出第一对象的手部运动速度。不同的运动速度对应于不同的动作幅度信息,为了便于理解,请参阅表1,表1为运动速度、动作幅度信息以及震动力度信息之间对应关系的一个示意。
表1
其中,“v”表示动作触发方的手部运动速度。可见,第一对象的手部运动速度越大,其对应的动作幅度也越大,相应地,动作接收方(即,第二对象)能感知到的震动强度也越大。如果第一对象以3米/秒的速度控制第一虚拟对象触碰第二虚拟对象的肩膀,则第二对象可通过头显设备以及指环设备感受到轻微震动。
可以理解的是,表1中的运动速度也可以是其他部位的运动速度,例如,肘部。上述示例以手部运动速度为例进行介绍,然而不应理解为对本申请的限定。
需要说明的是,在实际应用中,也可以通过摄像头(例如,深度摄像头或者双目摄像头)感知画面的距离信息,进而采用相应的算法识别出真实用户的动作幅度。或者,还可以通过其他方式检测真实用户的动作幅度,此处不做限定。
再次,本申请实施例提供了一种按照动作幅度提供相应力度反馈的方式。通过上述方式,用户可通过自己的动作,来控制虚拟对象在虚拟场景执行相应的动作。基于此,可通过检测真实用户的动作速度,确定上身肢体接触的动作幅度信息,由此,控制虚拟现实套件基于相应的震动力度信息触发震动反馈。从而更好地模拟用户之间的真实状态,使得虚拟现实交互体验更加沉浸拟真。
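上述“检测运动速度→确定动作幅度→确定震动力度”的流程可用如下示意性代码表达。其中对加速度采样的积分对应前文所述IMU测速原理;速度区间的阈值为示例性假设(与上文“3米/秒对应轻微震动、6至7米/秒对应中度震动”的举例相容),表1的具体数值未在本文中复现:

```python
# 示意:由IMU加速度采样积分得到速度,再映射为震动力度(阈值为假设)
def integrate_velocity(accels, dt):
    """对单方向加速度采样做简单数值积分,得到该方向的速度(米/秒)。"""
    v = 0.0
    for a in accels:
        v += a * dt
    return v

def vibration_level(speed):
    """运动速度越大,动作幅度越大,对应的震动力度越强(区间划分为示例)。"""
    if speed < 5.0:
        return "轻微震动"
    if speed < 10.0:
        return "中度震动"
    return "强烈震动"
```

例如,以0.1秒为采样间隔、连续30个1米/秒²的加速度采样,积分得到约3米/秒的速度,按上述示例阈值对应“轻微震动”。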
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,虚拟现实套件包括头显设备,响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,具体可以包括:
响应第一虚拟对象针对第二虚拟对象触发的头部接触,控制头显设备触发震动反馈。
在一个或多个实施例中,介绍了一种基于头部接触触发头部反馈的方式。由前述实施例可知,在虚拟现实场景中,还支持虚拟对象之间触发互动动作。具体的,第一虚拟对象可以触碰第二虚拟对象的头部,即,针对第二虚拟对象触发头部接触。基于此,第二对象佩戴的头显设备会触发震动反馈。
具体地,为了便于理解,请参阅图16,图16为本申请实施例中虚拟对象头部的一个示意图,如图所示,B1用于指示虚拟对象的头部,其中,头部可以理解为脖子以上的部分。以第一虚拟对象和第二虚拟对象均为人物形象作为示例,如果第一虚拟对象触碰到了第二虚拟对象的头部,那么可判别为头部互动,即头部接触。
其次,本申请实施例提供了一种基于头部接触触发头部反馈的方式。通过上述方式,在虚拟现实场景中,支持虚拟对象之间进行头部接触互动,从而丰富了虚拟现实触觉反馈的多样性和交互载体的功能性。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,响应第一虚拟对象针对第二虚拟对象触发的头部接触,具体可以包括:
响应第一虚拟对象针对第二虚拟对象触发的头部配件佩戴操作;
或者,响应第一虚拟对象针对第二虚拟对象触发的头部配件卸下操作;
或者,响应第一虚拟对象针对第二虚拟对象的面部触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的颈部触发的触碰操作。
在一个或多个实施例中,介绍了一种在虚拟现实场景中响应头部接触的方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行互动。下面将结合不同的互动场景进行介绍。
示例性地,为了便于理解,请参阅图17,图17为本申请实施例中基于虚拟现实场景响应头部配件佩戴操作的一个示意图,如图17中(A)图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。其中,第一虚拟对象拿着耳机走向第二虚拟对象,准备为第二虚拟对象戴上耳机。如图17中(B)图所示,第一虚拟对象为第二虚拟对象戴上耳机,在此过程中,碰到了第二虚拟对象的头部。基于此,第二对象佩戴的头显设备会触发震动反馈。
示例性地,为了便于理解,请参阅图18,图18为本申请实施例中基于虚拟现实场景响应头部配件卸下操作的一个示意图,如图18中(A)图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。其中,第二虚拟对象戴着眼镜,第一虚拟对象准备为第二虚拟对象取下眼镜。如图18中(B)图所示,第一虚拟对象为第二虚拟对象卸下眼镜,在此过程中,碰到了第二虚拟对象的头部。基于此,第二对象佩戴的头显设备会触发震动反馈。
示例性地,为了便于理解,请参阅图19,图19为本申请实施例中基于虚拟现实场景响应面部触碰操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象走向第二虚拟对象,然后,碰了一下第二虚拟对象的脸颊部位,即,第一虚拟对象针对第二虚拟对象的面部部位触发了触碰操作。基于此,第二对象佩戴的头显设备会触发震动反馈。
示例性地,为了便于理解,请参阅图20,图20为本申请实施例中基于虚拟现实场景响应颈部触碰操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象走向第二虚拟对象,然后,碰了一下第二虚拟对象的颈部部位,即,第一虚拟对象针对第二虚拟对象的颈部部位触发了触碰操作。基于此,第二对象佩戴的头显设备会触发震动反馈。
需要说明的是,图17至图20示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
可以理解的是,头显设备提供的震动反馈可以是震动1次,也可以是连续震动若干次。本申请中的头部配件包含但不仅限于耳机、眼镜、帽子、发饰等。
再次,本申请实施例提供了一种在虚拟现实场景中响应头部接触的方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户触发的头部接触操作,丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,提升交互过程的真实感,使得用户身临其境。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,控制头显设备触发震动反馈,具体可以包括:
获取头部接触对应的动作幅度信息,其中,动作幅度信息用于描述头部接触的动作幅度大小;
基于动作幅度信息确定震动力度信息,其中,震动力度信息与动作幅度信息正相关;
基于震动力度信息,控制头显设备触发震动反馈。
在一个或多个实施例中,介绍了一种按照动作幅度提供相应力度反馈的方式。由前述实施例可知,虚拟现实设备中可以内置IMU,通过IMU可以进一步探测虚拟现实设备的运动速度。或,通过摄像头感知画面的距离信息,采用相应的算法识别真实用户的动作幅度。或者,还可以通过其他方式检测真实用户的动作幅度,此处不做限定。
具体地,以第一虚拟对象向第二虚拟对象触发头部接触为例,其中,第一对象(即,真实的用户)佩戴指环设备(即,虚拟现实指环设备),由此,通过指环设备控制第一虚拟对象拍一下第二虚拟对象的头部。于是,可通过指环设备中内置的IMU,测量出第一对象的手部运动速度。不同的运动速度对应于不同的动作幅度信息,为了便于理解,请再次参阅表1。如果第一对象以6米/秒的速度控制第一虚拟对象触碰第二虚拟对象的头部,则第二对象可通过头显设备感受到中度震动。
再次,本申请实施例提供了一种按照动作幅度提供相应力度反馈的方式。通过上述方式,用户可通过自己的动作来控制虚拟对象在虚拟场景(例如,格斗类的虚拟现实游戏)执行相应的动作。基于此,可通过检测真实用户的动作速度,确定头部接触的动作幅度信息,由此,控制虚拟现实套件基于相应的震动力度信息触发震动反馈。从而更好地模拟用户之间的真实状态,使得虚拟现实交互体验更加沉浸拟真。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,虚拟现实套件包括指环设备,相应地,响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,具体可以包括:
响应第一虚拟对象针对第二虚拟对象触发的手部接触,控制指环设备触发震动反馈。
在一个或多个实施例中,介绍了一种基于手部接触触发手部反馈的方式。由前述实施例可知,在虚拟现实场景中,还支持虚拟对象之间触发互动动作。具体的,第一虚拟对象可以触碰第二虚拟对象的手部,即,针对第二虚拟对象触发手部接触。基于此,第二对象佩戴的指环设备会触发震动反馈。
具体地,为了便于理解,请参阅图21,图21为本申请实施例中虚拟对象手部的一个示意图,如图所示,C1用于指示虚拟对象的手部,其中,手部可以理解为手指、手背和手掌的部分。以第一虚拟对象和第二虚拟对象均为人物形象作为示例,如果第一虚拟对象的手触碰到了第二虚拟对象的手部,那么可判别为手部互动,即手部接触。
其次,本申请实施例提供了一种基于手部接触触发手部反馈的方式。通过上述方式,在虚拟现实场景中,支持虚拟对象之间进行肢体互动,从而丰富了虚拟现实触觉反馈的多样性和交互载体的功能性。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,响应第一虚拟对象针对第二虚拟对象触发的手部接触,具体可以包括:
响应第一虚拟对象针对第二虚拟对象触发的握手操作;
或者,响应第一虚拟对象针对第二虚拟对象触发的击掌操作;
或者,响应第一虚拟对象针对第二虚拟对象触发的手部触碰操作。
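握手、击掌等手部互动可对双方分别触发指环震动反馈,其分发逻辑可用如下示意性代码勾勒。其中的事件结构(initiator、receiver字段)与设备接口均为说明用途的假设:

```python
# 示意:手部接触触发指环震动,可选地对发起方也反馈(接口为假设)
def on_hand_contact(event, rings, feedback_to_initiator=False):
    """event 含发起方与接收方;接收方的指环必然震动,发起方可选。

    返回本次被触发震动反馈的对象列表。
    """
    vibrated = []
    receiver = event["receiver"]
    rings[receiver].append(event["type"])   # 第二对象的指环设备触发震动
    vibrated.append(receiver)
    if feedback_to_initiator:
        # 可选:第一对象佩戴的指环设备也可以触发震动反馈
        initiator = event["initiator"]
        rings[initiator].append(event["type"])
        vibrated.append(initiator)
    return vibrated

rings = {"甲": [], "乙": []}
result = on_hand_contact({"type": "握手", "initiator": "甲", "receiver": "乙"},
                         rings, feedback_to_initiator=True)
```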
在一个或多个实施例中,介绍了一种在虚拟现实场景中响应手部接触的方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行互动。下面将结合不同的互动场景进行介绍。
示例性地,为了便于理解,请参阅图22,图22为本申请实施例中基于虚拟现实场景响应握手操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象与第二虚拟对象握手,即,第一虚拟对象针对第二虚拟对象触发握手操作。基于此,第二对象佩戴的指环设备会触发震动反馈。可选地,第一对象佩戴的指环设备也可以触发震动反馈。
示例性地,为了便于理解,请参阅图23,图23为本申请实施例中基于虚拟现实场景响应击掌操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象与第二虚拟对象击掌,即,第一虚拟对象针对第二虚拟对象触发击掌操作。基于此,第二对象佩戴的指环设备会触发震动反馈。可选地,第一对象佩戴的指环设备也可以触发震动反馈。
示例性地,为了便于理解,请参阅图24,图24为本申请实施例中基于虚拟现实场景响应手部触碰操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象触碰第二虚拟对象的手,即,第一虚拟对象针对第二虚拟对象触发了手部触碰操作。基于此,第二对象佩戴的指环设备会触发震动反馈。可选地,第一对象佩戴的指环设备也可以触发震动反馈。
需要说明的是,图22至图24示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
可以理解的是,指环设备提供的震动反馈可以是震动1次,也可以是连续震动若干次。本申请中的手部触碰包含但不仅限于触碰手掌、触碰手背、触碰手指等。
再次,本申请实施例提供了一种在虚拟现实场景中响应手部接触的方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户触发的手部接触操作,丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,提升交互过程的真实感,使得用户身临其境。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,控制指环设备触发震动反馈,具体可以包括:
获取手部接触对应的动作幅度信息,其中,动作幅度信息用于描述手部接触的动作幅度大小;
基于动作幅度信息确定震动力度信息,其中,震动力度信息与动作幅度信息正相关;
基于震动力度信息,控制指环设备触发震动反馈。
在一个或多个实施例中,介绍了一种按照动作幅度提供相应力度反馈的方式。由前述实施例可知,虚拟现实设备中可以内置IMU,通过IMU可以进一步探测虚拟现实设备的运动速度。或,通过摄像头感知画面的距离信息,采用相应的算法识别出真实用户的动作幅度。或者,还可以通过其他方式检测真实用户的动作幅度,此处不做限定。
具体地,以第一虚拟对象向第二虚拟对象触发手部接触为例,其中,第一对象(即,真实的用户)佩戴指环设备(即,虚拟现实指环设备),由此,通过指环设备控制第一虚拟对象拍一下第二虚拟对象的手背。于是,可通过指环设备中内置的IMU,测量出第一对象的手部运动速度。不同的运动速度对应于不同的动作幅度信息,为了便于理解,请再次参阅表1。如果第一对象以7米/秒的速度控制第一虚拟对象触碰第二虚拟对象的手部,则第二对象可通过指环设备感受到中度震动。
再次,本申请实施例提供了一种按照动作幅度提供相应力度反馈的方式。通过上述方式,用户可通过自己的动作来控制虚拟对象在虚拟场景执行相应的动作。基于此,可通过检测真实用户的动作速度,确定手部接触的动作幅度信息,由此,控制虚拟现实套件基于相应的震动力度信息触发震动反馈。从而更好地模拟用户之间的真实状态,使得虚拟现实交互体验更加沉浸拟真。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,虚拟现实套件可以包括手套设备,相应地,响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,具体可以包括:
当第一虚拟对象向第二虚拟对象传递第一物件时,响应针对第一物件的接触操作,控制手套设备触发发热反馈。
在一个或多个实施例中,介绍了一种在虚拟现实场景中传递热物件的反馈方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行互动。下面将以第一物件为热水杯为例进行介绍。
具体地,为了便于理解,请参阅图25,图25为本申请实施例中基于虚拟现实场景响应第一物件传递操作的一个示意图,如图25中(A)图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。其中,第一虚拟对象拿着一杯热水走向第二虚拟对象,准备为第二虚拟对象递上热水杯。如图25中(B)图所示,第一虚拟对象为第二虚拟对象递上热水杯。当热水杯碰到第二虚拟对象的手部时,第二对象佩戴的手套设备会触发发热反馈,例如,手套设备升高5摄氏度。
需要说明的是,图25示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
可以理解的是,手套设备不仅可提供热度反馈,还可以提供震动反馈等。第一物件包含但不限于热水,热毛巾,热汤等。
其次,本申请实施例提供了一种在虚拟现实场景中传递热物件的反馈方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户传递热物件,以此触发热度反馈。从而丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,使得用户身临其境,有利于提升交互体验的趣味性。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,虚拟现实套件可以包括手套设备,相应地,响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,具体可以包括:
当第一虚拟对象向第二虚拟对象传递第二物件时,响应针对第二物件的接触操作,控制手套设备触发降温反馈。
在一个或多个实施例中,介绍了一种在虚拟现实场景中传递冰物件的反馈方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行互动。下面将以第二物件为冰淇淋为例进行介绍。
具体地,为了便于理解,请参阅图26,图26为本申请实施例中基于虚拟现实场景响应第二物件传递操作的一个示意图,如图26中(A)图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。其中,第一虚拟对象拿着一个冰淇淋走向第二虚拟对象,准备为第二虚拟对象递上冰淇淋。如图26中(B)图所示,第一虚拟对象为第二虚拟对象递上冰淇淋。当冰淇淋碰到了第二虚拟对象的手部时,第二对象佩戴的手套设备会触发降温反馈,例如,手套设备降低5摄氏度。
需要说明的是,图26示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
可以理解的是,手套设备不仅可提供降温反馈,还可以提供震动反馈等。第二物件包含但不限于冰淇淋,冰块等。
其次,本申请实施例提供了一种在虚拟现实场景中传递冰物件的反馈方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户传递冰物件,以此触发冰感反馈。从而丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,使得用户身临其境,有利于提升交互体验的趣味性。
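热物件与冰物件两类传递场景的手套温度反馈可合并用如下示意性代码表达。其中±5摄氏度的调整量取自上文示例,物件冷热属性的集合划分为说明用途的假设:

```python
# 示意:依据传递物件的冷热属性,确定手套设备的温度调整量(划分为假设)
def glove_temperature_delta(item):
    """热物件触发发热反馈(+5℃),冰物件触发降温反馈(-5℃),其余不调整。"""
    hot_items = {"热水杯", "热毛巾", "热汤"}    # 对应第一物件的举例
    cold_items = {"冰淇淋", "冰块"}             # 对应第二物件的举例
    if item in hot_items:
        return 5
    if item in cold_items:
        return -5
    return 0
```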
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,虚拟现实套件可以包括头显设备,相应地,响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,具体可以包括:
响应第一虚拟对象针对第二虚拟对象触发的耳语操作,控制头显设备触发气流反馈。
在一个或多个实施例中,介绍了一种在虚拟现实场景中响应耳语的方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行互动。下面将结合互动场景进行介绍。
示例性地,为了便于理解,请参阅图27,图27为本申请实施例中基于虚拟现实场景响应耳语操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象向第二虚拟对象说悄悄话,即,第一虚拟对象针对第二虚拟对象触发了耳语操作。基于此,第二对象佩戴的头显设备会触发气流反馈,以此模拟在耳边说话的效果。
需要说明的是,图27示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
可以理解的是,头显设备提供的气流反馈可以是1次,也可以是连续若干次。
再次,本申请实施例提供了一种在虚拟现实场景中响应耳语的方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户触发的耳语操作,丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,提升交互过程的真实感,使得用户身临其境。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,虚拟现实套件可以包括腰带设备,相应地,响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,具体可以包括:
响应第一虚拟对象针对第二虚拟对象触发的拥抱操作,控制腰带设备触发收缩反馈。
在一个或多个实施例中,介绍了一种在虚拟现实场景中响应拥抱的方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行互动。下面将结合互动场景进行介绍。
示例性地,为了便于理解,请参阅图28,图28为本申请实施例中基于虚拟现实场景响应拥抱操作的一个示意图,如图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。第一对象控制第一虚拟对象与第二虚拟对象拥抱,即,第一虚拟对象针对第二虚拟对象触发拥抱操作。基于此,第二对象佩戴的腰带设备会触发向内收缩的反馈。可选地,第一对象佩戴的腰带设备也会触发向内收缩的反馈。
需要说明的是,图28示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
再次,本申请实施例提供了一种在虚拟现实场景中响应拥抱的方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户触发的拥抱操作,丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,提升交互过程的真实感,使得用户身临其境。
可选地,在上述图4对应的各个实施例的基础上,本申请实施例提供的另一个可选实施例中,虚拟现实套件可以包括鞋子设备,相应地,响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,具体可以包括:
响应第一虚拟对象向第二虚拟对象触发的传球操作;
当目标球体接触到第二虚拟对象的下身肢体时,控制鞋子设备触发震动反馈。
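上述两步(响应传球操作、球体接触下身肢体时触发鞋子震动)可用如下示意性代码勾勒。其中下身肢体的部位集合与碰撞判定方式均为说明用途的假设:

```python
# 示意:目标球体接触到下身肢体时,触发鞋子设备震动(判定为假设)
def on_ball_contact(contact_part, shoe_log):
    """contact_part 为球体接触到的身体部位;仅下身肢体触发鞋子震动。"""
    lower_body = {"脚部", "小腿", "膝盖", "大腿"}
    if contact_part in lower_body:
        shoe_log.append("震动")   # 鞋子设备触发震动反馈
        return True
    return False

shoe_log = []
on_ball_contact("脚部", shoe_log)
```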
在一个或多个实施例中,介绍了一种在虚拟现实场景中模拟传球的反馈方式。由前述实施例可知,虚拟现实场景支持真实用户创建属于自己的虚拟对象,并控制虚拟对象在虚拟现实场景中进行活动。下面将结合互动场景进行介绍。
具体地,为了便于理解,请参阅图29,图29为本申请实施例中基于虚拟现实场景响应传球操作的一个示意图,如图29中(A)图所示,第一对象控制第一虚拟对象,且,第二对象控制第二虚拟对象。其中,第一虚拟对象准备将脚下的足球传向第二虚拟对象。如图29中(B)图所示,当足球碰到第二虚拟对象的下身肢体时,第二对象佩戴的鞋子设备会触发震动反馈。
需要说明的是,图29示出的虚拟对象的数量以及虚拟对象的形象等均为示意,不应理解为对本申请的限定。
可以理解的是,鞋子设备提供的震动反馈可以是震动1次,也可以是连续震动若干次。
其次,本申请实施例提供了一种在虚拟现实场景中模拟传球的反馈方式。通过上述方式,在虚拟现实场景中,真实用户可通过虚拟现实套件感知到其他用户触发的传球操作,一方面丰富了虚拟现实触觉反馈的多样性和交互载体的功能性,另一方面,结合虚拟现实场景提供更多元的交互方式,使得用户身临其境。
下面对本申请中的虚拟现实交互装置进行详细描述,请参阅图30,图30为本申请实施例中虚拟现实交互装置的一个实施例示意图,虚拟现实交互装置30包括:
显示模块310,用于在虚拟现实场景中显示至少两个虚拟对象,其中,至少两个虚拟对象包括第一虚拟对象以及第二虚拟对象,第一虚拟对象为第一对象控制的虚拟对象,第二虚拟对象为第二对象控制的虚拟对象;
控制模块320,用于响应第一虚拟对象针对第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,其中,虚拟现实套件包括第二对象佩戴的至少一个虚拟现实设备。
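图30所示“显示模块+控制模块”的职责划分可粗略勾勒为如下结构。其中类名、方法名与数据结构均为说明用途的假设,并非本申请限定的实现:

```python
# 示意:虚拟现实交互装置的模块划分(接口命名为假设)
class DisplayModule:
    """对应显示模块310:在虚拟现实场景中显示至少两个虚拟对象。"""
    def show(self, objects):
        return list(objects)

class ControlModule:
    """对应控制模块320:响应互动操作,控制虚拟现实套件触发体感反馈。"""
    def on_interaction(self, interaction, kit):
        kit.append(interaction["feedback"])
        return kit

class VRInteractionDevice:
    """对应虚拟现实交互装置30,由两个模块组成。"""
    def __init__(self):
        self.display = DisplayModule()
        self.control = ControlModule()

dev = VRInteractionDevice()
shown = dev.display.show(["第一虚拟对象", "第二虚拟对象"])
kit = dev.control.on_interaction({"feedback": "震动反馈"}, [])
```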
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括头显设备。控制模块320具体用于通过以下至少一种方式,控制虚拟现实套件触发体感反馈:
响应第一虚拟对象针对第二虚拟对象发送的会话消息,控制头显设备触发震动反馈;
响应第一虚拟对象针对第二虚拟对象发送的非接触式互动消息,控制头显设备触发震动反馈;
响应第一虚拟对象发送的电子邮件,控制头显设备触发震动反馈,其中,电子邮件的收件方包括第二虚拟对象;
响应第一虚拟对象针对第二虚拟对象发送的电子文件,控制头显设备触发震动反馈;
响应第一虚拟对象针对第二虚拟对象发送的组队邀请消息,控制头显设备触发震动反馈;
响应第一虚拟对象针对第二虚拟对象发送的点名提示消息,控制头显设备触发震动反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括头显设备和指环设备;
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象触发的上身肢体接触,控制头显设备以及指环设备触发震动反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象的肩膀部位触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的手臂部位触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的背部部位触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的腰部部位触发的触碰操作。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,
控制模块320,具体用于获取上身肢体接触对应的动作幅度信息,其中,动作幅度信息用于描述上身肢体接触的动作幅度大小;
基于动作幅度信息确定震动力度信息,其中,震动力度信息与动作幅度信息正相关;
基于震动力度信息,控制头显设备以及指环设备触发震动反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括头显设备;
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象触发的头部接触,控制头显设备触发震动反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象触发的头部配件佩戴操作;
或者,响应第一虚拟对象针对第二虚拟对象触发的头部配件卸下操作;
或者,响应第一虚拟对象针对第二虚拟对象的面部触发的触碰操作;
或者,响应第一虚拟对象针对第二虚拟对象的颈部触发的触碰操作。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,
控制模块320,具体用于获取头部接触对应的动作幅度信息,其中,动作幅度信息用于描述头部接触的动作幅度大小;
基于动作幅度信息确定震动力度信息,其中,震动力度信息与动作幅度信息正相关;
基于震动力度信息,控制头显设备触发震动反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括指环设备;
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象触发的手部接触,控制指环设备触发震动反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象触发的握手操作;
或者,响应第一虚拟对象针对第二虚拟对象触发的击掌操作;
或者,响应第一虚拟对象针对第二虚拟对象触发的手部触碰操作。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,
控制模块320,具体用于获取手部接触对应的动作幅度信息,其中,动作幅度信息用于描述手部接触的动作幅度大小;
基于动作幅度信息确定震动力度信息,其中,震动力度信息与动作幅度信息正相关;
基于震动力度信息,控制指环设备触发震动反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括手套设备;
控制模块320,具体用于当第一虚拟对象向第二虚拟对象传递第一物件时,
响应针对第一物件的接触操作,控制手套设备触发发热反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括手套设备;
控制模块320,具体用于当第一虚拟对象向第二虚拟对象传递第二物件时,响应针对第二物件的接触操作,控制手套设备触发降温反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括头显设备;
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象触发的耳语操作,控制头显设备触发气流反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括腰带设备;
控制模块320,具体用于响应第一虚拟对象针对第二虚拟对象触发的拥抱操作,控制腰带设备触发收缩反馈。
可选地,在上述图30所对应的实施例的基础上,本申请实施例提供的虚拟现实交互装置30的另一实施例中,虚拟现实套件包括鞋子设备;
控制模块320,具体用于响应第一虚拟对象向第二虚拟对象触发的传球操作;当目标球体接触到第二虚拟对象的下身肢体时,控制鞋子设备触发震动反馈。
本申请实施例还提供了一种终端设备,如图31所示,为了便于说明,仅示出了与本申请实施例相关的部分,具体技术细节未揭示的,请参照本申请实施例方法部分。在本申请实施例中,以终端设备为虚拟现实设备为例进行说明:
图31示出的是与本申请实施例提供的终端设备相关的虚拟现实设备的部分结构的框图。参考图31,虚拟现实设备包括:射频(radio frequency,RF)电路410、存储器420、输入单元430(其中包括触控面板431和其他输入设备432)、显示单元440(其中包括显示面板441)、传感器450、音频电路460(其连接有扬声器461和传声器462)、无线保真(wireless fidelity,WiFi)模块470、处理器480、以及电源490等部件。本领域技术人员可以理解,图31中示出的虚拟现实设备结构并不构成对虚拟现实设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
存储器420可用于存储软件程序以及模块,处理器480通过运行存储在存储器420的软件程序以及模块,从而执行虚拟现实设备的各种功能应用以及数据处理。存储器420可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据虚拟现实设备的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器420可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。
处理器480是虚拟现实设备的控制中心,利用各种接口和线路连接整个虚拟现实设备的各个部分,通过运行或执行存储在存储器420内的软件程序和/或模块,以及调用存储在存储器420内的数据,执行虚拟现实设备的各种功能和处理数据。可选的,处理器480可包括一个或多个处理单元;可选的,处理器480可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器480中。
尽管未示出,虚拟现实设备还可以包括摄像头、蓝牙模块等,在此不再赘述。
上述实施例中由终端设备所执行的步骤可以基于该图31所示的终端设备结构。
本申请实施例中还提供一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被处理器执行时,实现前述各个实施例描述方法的步骤。
本申请实施例中还提供一种计算机程序产品,包括计算机程序,该计算机程序被处理器执行时,实现前述各个实施例描述方法的步骤。
可以理解的是,在本申请的具体实施方式中,涉及到用户操作信息等相关的数据,当本申请以上实施例运用到具体产品或技术中时,需要获得用户许可或者同意,且相关数据的收集、使用和处理需要遵守相关国家和地区的相关法律法规和标准。
以上所述,以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围。

Claims (20)

  1. 一种虚拟现实的交互方法,由终端设备执行,包括:
    在虚拟现实场景中显示至少两个虚拟对象,其中,所述至少两个虚拟对象包括第一虚拟对象以及第二虚拟对象,所述第一虚拟对象为第一对象控制的虚拟对象,所述第二虚拟对象为第二对象控制的虚拟对象;
    响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,其中,所述虚拟现实套件包括所述第二对象佩戴的至少一个虚拟现实设备。
  2. 根据权利要求1所述的交互方法,所述虚拟现实套件包括头显设备,所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括以下至少一种:
    响应所述第一虚拟对象针对所述第二虚拟对象发送的会话消息,控制所述头显设备触发震动反馈;
    响应所述第一虚拟对象针对所述第二虚拟对象发送的非接触式互动消息,控制所述头显设备触发震动反馈;
    响应所述第一虚拟对象发送的电子邮件,控制所述头显设备触发震动反馈,其中,所述电子邮件的收件方包括所述第二虚拟对象;
    响应所述第一虚拟对象针对所述第二虚拟对象发送的电子文件,控制所述头显设备触发震动反馈;
    响应所述第一虚拟对象针对所述第二虚拟对象发送的组队邀请消息,控制所述头显设备触发震动反馈;
    响应所述第一虚拟对象针对所述第二虚拟对象发送的点名提示消息,控制所述头显设备触发震动反馈。
  3. 根据权利要求1所述的交互方法,所述虚拟现实套件包括头显设备和指环设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象触发的上身肢体接触,控制所述头显设备以及所述指环设备触发震动反馈。
  4. 根据权利要求3所述的交互方法,所述响应所述第一虚拟对象针对所述第二虚拟对象触发的上身肢体接触,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象的肩膀部位触发的触碰操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象的手臂部位触发的触碰操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象的背部部位触发的触碰操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象的腰部部位触发的触碰操作。
  5. 根据权利要求3或4所述的交互方法,所述控制头显设备以及指环设备触发震动反馈,包括:
    获取所述上身肢体接触对应的动作幅度信息,其中,所述动作幅度信息用于描述所述上身肢体接触的动作幅度大小;
    基于所述动作幅度信息确定震动力度信息,其中,所述震动力度信息与所述动作幅度信息正相关;
    基于所述震动力度信息,控制所述头显设备以及所述指环设备触发震动反馈。
  6. 根据权利要求1所述的交互方法,所述虚拟现实套件包括头显设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象触发的头部接触,控制所述头显设备触发震动反馈。
  7. 根据权利要求6所述的交互方法,所述响应所述第一虚拟对象针对所述第二虚拟对象触发的头部接触,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象触发的头部配件佩戴操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象触发的头部配件卸下操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象的面部触发的触碰操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象的颈部触发的触碰操作。
  8. 根据权利要求6或7所述的交互方法,所述控制所述头显设备触发震动反馈,包括:
    获取所述头部接触对应的动作幅度信息,其中,所述动作幅度信息用于描述所述头部接触的动作幅度大小;
    基于所述动作幅度信息确定震动力度信息,其中,所述震动力度信息与所述动作幅度信息正相关;
    基于所述震动力度信息,控制所述头显设备触发震动反馈。
  9. 根据权利要求1所述的交互方法,所述虚拟现实套件包括指环设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象触发的手部接触,控制所述指环设备触发震动反馈。
  10. 根据权利要求9所述的交互方法,所述响应所述第一虚拟对象针对所述第二虚拟对象触发的手部接触,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象触发的握手操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象触发的击掌操作;
    或者,响应所述第一虚拟对象针对所述第二虚拟对象触发的手部触碰操作。
  11. 根据权利要求9或10所述的交互方法,所述控制所述指环设备触发震动反馈,包括:
    获取所述手部接触对应的动作幅度信息,其中,所述动作幅度信息用于描述所述手部接触的动作幅度大小;
    基于所述动作幅度信息确定震动力度信息,其中,所述震动力度信息与所述动作幅度信息正相关;
    基于所述震动力度信息,控制所述指环设备触发震动反馈。
  12. 根据权利要求1所述的交互方法,所述虚拟现实套件包括手套设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    当所述第一虚拟对象向所述第二虚拟对象传递第一物件时,响应针对所述第一物件的接触操作,控制所述手套设备触发发热反馈。
  13. 根据权利要求1所述的交互方法,所述虚拟现实套件包括手套设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    当所述第一虚拟对象向所述第二虚拟对象传递第二物件时,响应针对所述第二物件的接触操作,控制所述手套设备触发降温反馈。
  14. 根据权利要求1所述的交互方法,所述虚拟现实套件包括头显设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象触发的耳语操作,控制所述头显设备触发气流反馈。
  15. 根据权利要求1所述的交互方法,所述虚拟现实套件包括腰带设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    响应所述第一虚拟对象针对所述第二虚拟对象触发的拥抱操作,控制所述腰带设备触发收缩反馈。
  16. 根据权利要求1所述的交互方法,所述虚拟现实套件包括鞋子设备;所述响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,包括:
    响应所述第一虚拟对象向所述第二虚拟对象触发的传球操作;
    当目标球体接触到第二虚拟对象的下身肢体时,控制所述鞋子设备触发震动反馈。
  17. 一种虚拟现实交互装置,包括:
    显示模块,用于在虚拟现实场景中显示至少两个虚拟对象,其中,所述至少两个虚拟对象包括第一虚拟对象以及第二虚拟对象,所述第一虚拟对象为第一对象控制的虚拟对象,所述第二虚拟对象为第二对象控制的虚拟对象;
    控制模块,用于响应所述第一虚拟对象针对所述第二虚拟对象触发的互动操作,控制虚拟现实套件触发体感反馈,其中,所述虚拟现实套件包括所述第二对象佩戴的至少一个虚拟现实设备。
  18. 一种终端设备,包括存储器和处理器,所述存储器存储有计算机程序,所述处理器执行所述计算机程序时实现权利要求1至16中任一项所述的交互方法的步骤。
  19. 一种计算机可读存储介质,其上存储有计算机程序,所述计算机程序被处理器执行时实现权利要求1至16中任一项所述的交互方法的步骤。
  20. 一种计算机程序产品,包括计算机程序,该计算机程序被处理器执行时实现权利要求1至16中任一项所述的交互方法的步骤。
PCT/CN2023/078921 2022-06-21 2023-03-01 一种虚拟现实的交互方法、相关装置、设备以及存储介质 WO2023246159A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210705942.4A CN117298590A (zh) 2022-06-21 2022-06-21 一种虚拟现实的交互方法、相关装置、设备以及存储介质
CN202210705942.4 2022-06-21

Publications (1)

Publication Number Publication Date
WO2023246159A1 true WO2023246159A1 (zh) 2023-12-28

Family

ID=89279858

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/078921 WO2023246159A1 (zh) 2022-06-21 2023-03-01 一种虚拟现实的交互方法、相关装置、设备以及存储介质

Country Status (2)

Country Link
CN (1) CN117298590A (zh)
WO (1) WO2023246159A1 (zh)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150293592A1 (en) * 2014-04-15 2015-10-15 Samsung Electronics Co., Ltd. Haptic information management method and electronic device supporting the same
CN106227339A (zh) * 2016-08-16 2016-12-14 西安中科比奇创新科技有限责任公司 可穿戴设备、虚拟现实人机交互系统及方法
CN106681490A (zh) * 2016-11-29 2017-05-17 维沃移动通信有限公司 一种虚拟现实终端的数据处理方法和虚拟现实终端
CN106873775A (zh) * 2017-01-16 2017-06-20 深圳中科呼图电子商务有限公司 一种虚拟现实交互的实现方法、系统及mr手套
CN107636605A (zh) * 2015-03-20 2018-01-26 索尼互动娱乐股份有限公司 传达在头戴式显示器渲染的环境中的虚拟对象的触感和移动的动态手套
CN108073285A (zh) * 2018-01-02 2018-05-25 联想(北京)有限公司 一种电子设备及控制方法
CN108874123A (zh) * 2018-05-07 2018-11-23 北京理工大学 一种通用的模块化的虚拟现实被动力触觉反馈系统
CN111667560A (zh) * 2020-06-04 2020-09-15 成都飞机工业(集团)有限责任公司 一种基于vr虚拟现实角色的交互结构及交互方法
CN112203114A (zh) * 2020-09-07 2021-01-08 佛山创视嘉科技有限公司 协同演奏方法、系统、终端设备及存储介质
CN113296605A (zh) * 2021-05-24 2021-08-24 中国科学院深圳先进技术研究院 力反馈方法、力反馈装置及电子设备
CN113946211A (zh) * 2021-10-14 2022-01-18 网易有道信息技术(江苏)有限公司 基于元宇宙的多个对象的交互方法及相关设备

Also Published As

Publication number Publication date
CN117298590A (zh) 2023-12-29

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23825817

Country of ref document: EP

Kind code of ref document: A1