WO2019072064A1 - Control method, device, and storage medium in a virtual reality environment - Google Patents

Control method, device, and storage medium in a virtual reality environment

Info

Publication number
WO2019072064A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
interactive object
movable
dimensional interactive
computer device
Application number
PCT/CN2018/105054
Other languages
English (en)
French (fr)
Inventor
Shen Chao (沈超)
Original Assignee
Tencent Technology (Shenzhen) Co., Ltd. (腾讯科技(深圳)有限公司)
Application filed by Tencent Technology (Shenzhen) Co., Ltd. (腾讯科技(深圳)有限公司)
Publication of WO2019072064A1


Classifications

    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G06F2203/04802 3D-info-object: information is displayed on the internal or external surface of a three-dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user

Definitions

  • the present application relates to the field of computer technology, and in particular, to a control method, device, and storage medium in a virtual reality environment.
  • VR (Virtual Reality)
  • In the related art, interactive control in a virtual reality environment works by displaying a two-dimensional interactive menu interface in the three-dimensional space of the virtual reality environment and selecting a menu item by emitting a ray from the handle: the intersection of the ray with the two-dimensional interactive menu interface indicates the desired menu item, and a button on the handle is then used to click the selected item and complete the interactive control.
  • The above control method lacks a real sense of interaction in a virtual reality environment; it is essentially the same as mouse-click interaction on a two-dimensional computer screen. The user still has to aim through a mouse-like operation and then press a button to click, and such multi-step operation makes control inefficient.
  • a control method, a computer device, and a storage medium in a virtual reality environment are provided.
  • a control method in a virtual reality environment comprising:
  • the computer device displays a three-dimensional interactive object including a movable part and a fixed part in a virtual reality environment
  • the computer device monitors movement of the virtual operating body in the virtual reality environment
  • the computer device controls the movable component to move relative to the fixed component and follow the virtual operating body when the virtual operating body is moved into contact with the movable component;
  • the computer device outputs a control instruction corresponding to the three-dimensional interactive object according to a relative position of the movable component relative to the fixed component.
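As a non-normative illustration, the four claimed steps can be sketched as a minimal one-dimensional simulation. The `Button3D` class, the contact threshold, and the instruction string below are all hypothetical names invented for the example, not part of the disclosure:

```python
# Hypothetical 1-D sketch of the claimed steps: a button whose movable
# component follows the virtual operating body while touched, and which
# outputs a control instruction from the movable component's position
# relative to the fixed component. All names and values are illustrative.

class Button3D:
    def __init__(self):
        self.fixed_pos = 0.0      # fixed component position in local space
        self.movable_pos = 1.0    # movable component position in local space
        self.touch_radius = 0.25  # contact threshold for the operating body

    def touching(self, body_pos):
        # Monitoring reduces here to a contact test against the movable part.
        return abs(body_pos - self.movable_pos) <= self.touch_radius

    def update(self, body_pos):
        # While in contact, the movable component follows the operating body.
        if self.touching(body_pos):
            self.movable_pos = body_pos
        # Output an instruction from the relative position of the movable
        # component with respect to the fixed component.
        relative = self.movable_pos - self.fixed_pos
        return "press" if relative <= 0.1 else None
```

Sliding the operating body from the movable component's rest position toward the fixed component drives the component along with it until the "press" instruction is emitted; moving the body without contact leaves the component unchanged.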
  • A computer device comprising a memory and one or more processors, the memory having stored therein computer readable instructions that, when executed by the one or more processors, cause the one or more processors to perform the following steps:
  • displaying a three-dimensional interactive object including a movable component and a fixed component in a virtual reality environment
  • One or more storage media storing computer readable instructions that, when executed by one or more processors, cause the one or more processors to perform the following steps:
  • displaying a three-dimensional interactive object including a movable component and a fixed component in a virtual reality environment
  • FIG. 1 is an application environment diagram of a control method in a virtual reality environment in an embodiment
  • Figure 2 is a block diagram of a computer device in one embodiment
  • FIG. 3 is a schematic flow chart of a control method in a virtual reality environment in an embodiment
  • FIG. 4 is a schematic diagram of a three-dimensional interactive object in one embodiment
  • FIGS. 5A and 5B are schematic views showing the movement of a movable component in one embodiment;
  • FIG. 6 is a schematic flow chart of a step of displaying a three-dimensional interactive object in an embodiment;
  • FIG. 7 is a schematic flow chart of a control step for a movable component in one embodiment;
  • FIG. 8 is a flow chart showing the steps of moving a movable component in one embodiment;
  • FIG. 9 is a schematic illustration of the movement of a movable component in one embodiment;
  • FIG. 10 is a schematic flow chart of a control method in a virtual reality environment in another embodiment
  • FIG. 11 is a block diagram of a control device in a virtual reality environment in one embodiment.
  • Figure 12 is a block diagram of a control device in a virtual reality environment in another embodiment.
  • FIG. 1 is an application environment diagram of a control method in a virtual reality environment in an embodiment.
  • The application environment includes a real operating device 110 and a computer device 120 that communicate with each other over a network.
  • A virtual reality application (which may be referred to simply as a VR application) is installed in the computer device 120, and the virtual reality application may implement a virtual reality scene.
  • The real operating device 110 is a device used in the real environment to operate and control the virtual reality application in the computer device 120.
  • the real operating device 110 may include an operation handle or other input device having an input operation function, such as a sensor for capturing a human hand motion.
  • the computer device 120 may be a terminal, and the terminal may be a desktop computer or a mobile terminal, and the mobile terminal may include at least one of a mobile phone, a tablet computer, a personal digital assistant, and a wearable device.
  • The computer device 120 can implement a virtual reality scene through the virtual reality application running on it, and display in the virtual reality scene a three-dimensional interactive object 120a including a movable component and a fixed component, wherein the three-dimensional interactive object 120a is a three-dimensional model obtained by visualizing each function menu in a two-dimensional function menu interface.
  • the three-dimensional interactive object 120a has an interactive function of the corresponding function menu, that is, the function corresponding to the corresponding function menu can be realized by operating the three-dimensional interactive object 120a.
  • the computer device 120 can generate a virtual operating body 120b through a running virtual reality application, wherein the virtual operating body 120b is a virtual execution body for performing operational control on the three-dimensional interactive object.
  • the global spatial location of the virtual operating entity 120b in the virtual reality environment may be derived from the physical spatial location mapping of the real operating device 110.
  • the user can move the corresponding virtual operating body 120b in the virtual reality environment by moving the physical space position of the real operating device 110 in a real environment.
  • When the virtual operating body 120b moves into contact with the movable component, the computer device 120 can control the movable component to move relative to the fixed component and follow the virtual operating body.
  • the computer device 120 can output a control command corresponding to the three-dimensional interactive object 120a in accordance with the relative position of the movable member with respect to the fixed member.
  • The computer device can be the computer device 120 of FIG. 1.
  • the computer device includes a processor, memory, network interface, and display screen connected by a system bus.
  • the memory comprises a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium of the computer device can store an operating system and computer readable instructions that, when executed, can cause the processor to perform a control method in a virtual reality environment.
  • the processor of the computer device is used to provide computing and control capabilities to support the operation of the entire computer device.
  • Computer readable instructions may be stored in the internal memory of the computer device, the computer readable instructions being executable by the processor to cause the processor to perform a control method in a virtual reality environment.
  • the network interface of the computer device is used for network communication.
  • the display of the computer device can be a liquid crystal display or an electronic ink display.
  • FIG. 2 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation of the computer device to which the solution of the present application is applied.
  • The specific computer device may include more or fewer components than shown in the figure, or combine some components, or have a different arrangement of components.
  • FIG. 3 is a schematic flow chart of a control method in a virtual reality environment in an embodiment. This embodiment is mainly illustrated by applying the control method in the virtual reality environment to the computer device in FIG. 2 described above. Referring to FIG. 3, the method specifically includes the following steps:
  • the three-dimensional interactive object is a three-dimensional model obtained by visualizing each function menu in the two-dimensional function menu interface.
  • the three-dimensional interactive object has the interactive function of the corresponding function menu, that is, the function corresponding to the corresponding function menu can be realized by operating the three-dimensional interactive object.
  • a three-dimensional interactive object can have various interactive functions such as volume adjustment, progress adjustment, position movement, or direction adjustment.
  • The representation of the three-dimensional interactive object can be a three-dimensional button, a slide bar, a pull rod, and the like.
  • Interactive functions such as volume adjustment, progress adjustment, and position movement can be achieved by moving the slider on the slide bar.
  • The movable component is a component of the three-dimensional interactive object that can move within the local space of the three-dimensional interactive object.
  • The fixed component is a component of the three-dimensional interactive object that is fixed within the local space of the three-dimensional interactive object and does not move.
  • The local space of the three-dimensional interactive object is the space of the three-dimensional interactive object itself, defined by the local coordinate system whose origin is the center of the three-dimensional interactive object. It can be understood that each object has its own local coordinate system and local space, and the relative position between each object and its own local coordinate system remains unchanged.
  • the three-dimensional interactive object will be exemplified in conjunction with FIG. 4.
  • The button, the pull rod, and the slide bar in FIG. 4 are three different three-dimensional interactive objects, wherein 402a in the button, 404a in the pull rod, and 406a in the slide bar are movable components, while 402b in the button, 404b in the pull rod, and 406b in the slide bar are fixed components.
  • the virtual operating body is a virtual execution body for performing operation control on the three-dimensional interactive object.
  • the global spatial location of the virtual operating entity in the virtual reality environment may be derived from the physical spatial location mapping of the real operating device.
  • the mapping position of the physical space position of the operation handle in the real environment in the virtual reality environment is the global spatial position of the virtual operation body in the virtual reality environment.
  • the global space refers to the overall three-dimensional space in the virtual reality environment, and the real operation device may include an operation handle or other input device having an input operation function (for example, a sensor for capturing a human hand motion, etc.).
  • the computer device can receive the physical space location of the real operating device, and obtain a global spatial location of the virtual operating body in the virtual reality environment according to the physical spatial location mapping.
  • the user can move the global space position of the virtual operating body in the virtual reality environment by moving the physical space position of the real operating device.
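As a hedged illustration of this physical-to-global mapping (the scale and origin values below are invented for the example, not taken from the disclosure), the physical position of the real operating device could be mapped linearly into the global space of the virtual reality environment:

```python
# Hypothetical linear mapping from the real operating device's physical
# position to the virtual operating body's global position. SCALE and
# ORIGIN are illustrative values, not part of the disclosure.

SCALE = 2.0                # virtual metres per physical metre
ORIGIN = (0.0, 1.0, 0.0)   # global position mapped from the physical origin

def physical_to_global(physical_pos):
    """Map a physical (x, y, z) position into global VR coordinates."""
    return tuple(o + SCALE * p for o, p in zip(ORIGIN, physical_pos))
```

Under such a mapping, moving the real device moves the virtual operating body by a proportional amount in the global space.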
  • the computer device can monitor the movement of the virtual operating body in the virtual reality environment.
  • the computer device can control the moving component to move relative to the fixed component in the three-dimensional interactive object and follow the motion of the virtual operating body.
  • The fixed component of the three-dimensional interactive object is a component that is fixed in the local space of the three-dimensional interactive object and does not move.
  • Movement of the movable component relative to the fixed component means that the movable component moves only within the local space of the three-dimensional interactive object; the global spatial position of the entire three-dimensional interactive object in the virtual reality environment does not change.
  • The movable component following the virtual operating body means that, while the virtual operating body is in contact with the movable component, the movable component moves along with any movement of the virtual operating body, so that the virtual operating body acts directly on the movable component to drive the movement.
  • FIGS. 5A and 5B are schematic views of controlling the movement of a movable component in one embodiment.
  • In FIG. 5A, the virtual hand model 502 is the virtual operating body and the button is the three-dimensional interactive object, wherein 504a is the movable component and 504b is the fixed component; after contacting the movable component 504a of the button, the virtual hand model 502 moves in the direction of the dotted arrow.
  • FIG. 5B shows the state in which the movable component 504a has moved in the direction of the dotted arrow relative to the fixed component 504b, following the movement of the virtual hand model 502.
  • The relative position of the movable component relative to the fixed component refers to the relative relationship, within the local space of the three-dimensional interactive object to which they belong, between the position of the movable component and the position of the fixed component.
  • the control instruction corresponding to the three-dimensional interactive object is a control instruction for realizing the function of the three-dimensional interactive object.
  • For example, where the function of the three-dimensional interactive object is volume adjustment, the computer device can output a control instruction for adjusting the volume according to the relative position of the movable component relative to the fixed component.
  • the computer device may output a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component during the movement following the virtual operating body.
  • the computer device may also determine a relative position of the movable component relative to the fixed component according to a position of the component where the movable component is located when the virtual operating body leaves the movable component, and output a control instruction corresponding to the three-dimensional interactive object.
  • the relative positions of the movable members relative to the fixed members are different, and the output control commands corresponding to the three-dimensional interactive objects are different.
  • The function of the three-dimensional interactive object includes at least one sub-function, and all of the sub-functions together form the function of the three-dimensional interactive object. Each sub-function corresponds to a different control instruction, and a sub-function is implemented when its corresponding control instruction is triggered.
  • the computer device may preset a correspondence relationship between a relative position of the movable member with respect to the fixed member (hereinafter referred to as a relative position) and each sub-function (hereinafter referred to as a sub-function) of the three-dimensional interactive object.
  • The relative positions and sub-functions may be set in one-to-one correspondence, with different relative positions corresponding to different sub-functions. Alternatively, the sub-function may be determined by the preset range to which the relative position belongs: relative positions in different preset ranges correspond to different sub-functions, while relative positions within the same preset range correspond to the same sub-function.
  • the computer device can output control instructions for implementing respective sub-functions in the three-dimensional interactive object in accordance with the relative position of the movable component relative to the stationary component.
  • the function of the three-dimensional interactive object is positional movement, which includes sub-functions of advancing, retreating or turning, and the computer device can output control commands such as advancing, retreating or turning according to the relative position of the movable part relative to the fixed part.
  • the relative position of the movable member relative to the fixed member is the position of the movable member in the local space.
  • The computer device may obtain the position value at which the movable component is located, and generate the control instruction corresponding to the three-dimensional interactive object according to the position value.
  • The computer device may obtain the corresponding control instruction from a preset mapping relationship between position values and control instructions. For example, when the position value is 0 the corresponding volume is 0, and when the position value becomes 1, a control instruction that increases the volume to 50 is output.
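A minimal sketch of such a mapping, assuming a normalized position value in [0, 1] and the volume example above (position 0 maps to volume 0, position 1 to volume 50); the instruction dictionary format is invented for illustration:

```python
# Hypothetical mapping from the movable component's normalized position
# value to a volume-control instruction. The dict format is illustrative.

def volume_instruction(position_value):
    """Linearly map a position value in [0, 1] to a target volume in [0, 50]."""
    position_value = max(0.0, min(1.0, position_value))  # clamp to valid range
    return {"command": "set_volume", "value": round(50 * position_value)}
```

A position value outside the valid range is clamped, so the output instruction always stays within the volume limits.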
  • The above control method in the virtual reality environment visualizes the interactive function menu interface as a three-dimensional interactive object. In the virtual reality environment, the user can directly exert control by moving the virtual operating body to the movable component of the three-dimensional interactive object: the movable component follows the movement of the virtual operating body, and the control instruction corresponding to the three-dimensional interactive object is output directly according to the relative position of the movable component relative to the fixed component. Interactive control is thus achieved simply by moving the virtual operating body, without the cumbersome aim-and-click steps of a two-dimensional menu interface, improving control efficiency in the virtual reality environment.
  • step S302 (referred to as a three-dimensional interactive object display step) specifically includes the following steps:
  • S602. Determine a center for constructing a three-dimensional interactive object in a virtual reality environment.
  • the center used to construct the three-dimensional interactive object is the center point of the three-dimensional interactive object to be constructed.
  • the position of the center point of the three-dimensional interactive object is preset, and the computer device can acquire the position of the preset center point.
  • the coordinate system of the local space of the three-dimensional interactive object to be constructed is established with the center point of the three-dimensional interactive object as the origin. It can be understood that each three-dimensional interactive object has its own local space coordinate system. The relative position of the coordinate system of the local space of each three-dimensional interactive object and the three-dimensional interactive object remains unchanged.
  • The three-dimensional interactive object comprises a movable component and a fixed component. It can be understood that the three-dimensional interactive object is not limited to exactly these two components.
  • The computer device may acquire the preset coordinates, in the coordinate system of the local space of the three-dimensional interactive object, of the movable component and the fixed component that constitute the three-dimensional interactive object.
  • the computer device can find the position corresponding to the determined coordinate in the coordinate system of the established local space according to the determined coordinates, and then construct the movable component and the fixed component according to the determined position, thereby obtaining a three-dimensional interactive object.
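The construction described above can be sketched as follows; assuming a local coordinate system that is only translated (not rotated) relative to the global one, placing each component amounts to offsetting its preset local coordinates by the object's center. The function name and data layout are illustrative:

```python
# Hypothetical construction of a 3D interactive object: each component's
# preset local-space coordinates are offset by the object's center to get
# its position in the global space (rotation ignored for simplicity).

def build_components(center, local_coords):
    """Place components whose local coordinates are given relative to `center`."""
    return {name: tuple(c + l for c, l in zip(center, local))
            for name, local in local_coords.items()}
```

For example, a movable component preset one unit above the object's center ends up one unit above that center in the global space.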
  • The three-dimensional interactive object is a three-dimensional model capable of realizing the functions of a two-dimensional function menu, so that in a virtual reality environment the control operation is more convenient and intuitive, which makes control more efficient.
  • step S306 (referred to as a moving component control step) specifically includes the following steps:
  • The component initial position is the position of the movable component in the local space of the three-dimensional interactive object before the movable component has followed the movement of the virtual operating body.
  • S704 Determine a motion initial position of the virtual operating body relative to a local space of the three-dimensional interactive object when the virtual operating body moves into contact with the movable component.
  • The motion initial position is the position of the virtual operating body in the local space of the three-dimensional interactive object when the virtual operating body comes into contact with the movable component.
  • The computer device may acquire the position of the virtual operating body in the global space of the virtual reality environment when the virtual operating body comes into contact with the movable component, and convert that global-space position into a position in the local space of the three-dimensional interactive object to obtain the motion initial position.
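Under the simplifying assumption of a local coordinate system that is only translated (not rotated or scaled) relative to the global one, this conversion is just a subtraction of the object's center; the helper below is illustrative, not from the disclosure:

```python
# Hypothetical global-to-local conversion used to obtain the motion initial
# position: subtract the object's center (rotation and scale ignored).

def global_to_local(global_pos, object_center):
    """Express a global-space (x, y, z) position in the object's local space."""
    return tuple(g - c for g, c in zip(global_pos, object_center))
```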
  • S706 Acquire a current moving position of the virtual operating body in the local space when the virtual operating body contacts and moves with the moving component.
  • the virtual operating body contacts and moves with the movable component, which means that the virtual operating body moves when it is in contact with the movable component.
  • The current motion position is the position of the virtual operating body in the local space of the three-dimensional interactive object while it is in contact with and moving with the movable component.
  • The computer device may acquire the current position of the virtual operating body in the global space while the virtual operating body is in contact with and moving with the movable component, and convert that global-space position into a position in the local space of the three-dimensional interactive object to obtain the current motion position.
  • S708 Control the movable component to move relative to the fixed component from the initial position of the component according to the change of the initial position of the motion to the current motion position.
  • the computer device may determine the motion path of the movable component according to the change of the motion initial position of the virtual operating body to the current motion position, and control the movable component to move relative to the fixed component according to the determined motion path from the component initial position.
  • The computer device may use the path of the virtual operating body from the motion initial position to the current motion position as the motion path of the movable component. In another embodiment, the computer device may instead determine the projections of the motion initial position and the current motion position of the virtual operating body onto a target direction, and use the path from the projection of the motion initial position to the projection of the current motion position in the target direction as the motion path of the movable component.
  • The computer device may determine the current motion position of the virtual operating body in the local space once every preset number of frames, using the preset number of frames as a statistical period. Specifically, with one frame as the statistical period, the computer device can determine the current motion position of the virtual operating body in the local space each time a frame of the image is generated, thereby obtaining, frame by frame, the motion trajectory of the virtual operating body from the motion initial position to the current motion position, and controlling the movable component to move relative to the fixed component from the component initial position according to that trajectory.
  • The computer device may further monitor whether the change from the motion initial position to the current motion position is within a preset range; if it is, step S708 is performed, and if not, the movable component is stopped at the component position corresponding to the boundary value of the preset range.
  • The component position corresponding to the boundary value of the preset range refers to the position of the movable component at the moment the virtual operating body has just moved to the boundary value of the preset range.
  • For example, the preset range from the motion initial position to the current motion position is 5 meters. While the virtual operating body moves within 5 meters of the motion initial position, the computer device controls the movable component to move relative to the fixed component from the component initial position. If the virtual operating body moves more than 5 meters, the movable component stays at the position it occupied when the virtual operating body reached exactly 5 meters, and no longer follows the virtual operating body.
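The 5-meter example above can be sketched as a simple clamp on the followed displacement; the one-dimensional simplification and the function name are assumptions for illustration:

```python
# Hypothetical clamp implementing the preset-range behaviour: the movable
# component follows the operating body's 1-D displacement only up to the
# range boundary, then stays at the boundary position.

PRESET_RANGE = 5.0  # metres, as in the example above

def follow_with_limit(component_initial_pos, displacement):
    """Return the movable component's position after a clamped displacement."""
    if displacement > PRESET_RANGE:
        displacement = PRESET_RANGE       # stop at the positive boundary
    elif displacement < -PRESET_RANGE:
        displacement = -PRESET_RANGE      # stop at the negative boundary
    return component_initial_pos + displacement
```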
  • By controlling the movable component to move relative to the fixed component from the component initial position according to the change of the virtual operating body from the motion initial position to the current motion position in the local space, the movable component can move relative to the fixed component while following the virtual operating body, and the movement of the movable component triggers the control instruction corresponding to the three-dimensional interactive object. That is, the virtual operating body directly operates the movable component to realize interactive control, improving control efficiency in the virtual reality environment.
  • step 708 (referred to as a moving part moving step) specifically includes the following steps:
  • the target coordinate axis in the local space refers to the target coordinate axis in the coordinate system of the local space of the three-dimensional interactive object. It can be understood that the direction in which the target coordinate axis is located is the direction in which the movable member can move relative to the fixed member, that is, the movable member can move in the direction of the target coordinate axis with respect to the fixed member.
• S804 Determine a first length and a direction in which the movable component needs to move according to the difference between the coordinate values of the current motion position and the motion initial position on the target coordinate axis.
• the direction in which the movable component needs to move refers to the direction of movement along the target coordinate axis, and includes the direction along the target coordinate axis away from the origin of the coordinate system of the local space and the direction along the target coordinate axis toward the origin of the coordinate system of the local space.
• specifically, the computer device may subtract the coordinate value of the motion initial position on the target coordinate axis from the coordinate value of the current motion position on the target coordinate axis to obtain the difference between the two, and determine, according to the obtained difference, the first length and the direction in which the movable component needs to move. It can be understood that the computer device may instead subtract the coordinate value of the current motion position on the target coordinate axis from the coordinate value of the motion initial position on the target coordinate axis to obtain the difference between the two.
• the relative position of the coordinate system of the local space and the three-dimensional interactive object is fixed, and the fixed component of the three-dimensional interactive object is also fixed in the local space of the three-dimensional interactive object, so the target coordinate axis is fixed relative to the fixed component.
• for example, the three-dimensional interactive object is a sliding bar, in which the slider is the movable component b and the bar is the fixed component, and the sliding bar has a coordinate system of its local space in which the target coordinate axis is the X axis. The coordinate value of the current motion position of the virtual operating body a on the target coordinate axis X is x 2 , and the coordinate value of the motion initial position of the virtual operating body a on the target coordinate axis X is x 1 . The difference between the two coordinate values is x 2 -x 1 ; it can then be determined that the first length the movable component b needs to move is |x 2 -x 1 |, and the direction is given by the sign of the difference.
• in the above embodiment, the first length and the direction in which the movable component needs to move are determined according to the difference between the coordinate values of the current motion position and the motion initial position on the target coordinate axis, and the movable component is controlled to move the first length in that direction, from the component initial position, along the target coordinate axis that is fixed relative to the fixed component.
• in this way the movable component moves relative to the fixed component while following the virtual operating body, and the movement of the movable component triggers the control instruction corresponding to the three-dimensional interactive object to realize interactive control.
• constraining the motion to the target coordinate axis makes the motion of the movable component more accurate and eliminates interference motion, so that the triggered control instruction is more accurate.
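The length-and-direction computation described above can be sketched as follows (hypothetical helpers, names assumed; the sign convention for the direction is one possible choice):

```python
def first_length_and_direction(x_initial, x_current):
    """Given the coordinates of the motion initial position and the current
    motion position on the target coordinate axis, return the first length
    the movable component needs to move and the direction along the axis."""
    diff = x_current - x_initial
    length = abs(diff)                  # first length
    direction = 1 if diff >= 0 else -1  # +1 / -1: the two directions along the axis
    return length, direction

def move_along_axis(component_initial, x_initial, x_current):
    """Move the component from its initial position by the first length in
    the determined direction along the target coordinate axis."""
    length, direction = first_length_and_direction(x_initial, x_current)
    return component_initial + direction * length
```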
• in one embodiment, the method further comprises: when the virtual operating body leaves the movable component, determining a component current position of the movable component in the local space; determining, according to the component current position, a component stop position of the movable component in the local space; and stopping the movable component at the component stop position.
• the component current position is the real-time position of the movable component in the local space at the moment the virtual operating body leaves the movable component.
• the component stop position is the position in the local space at which the movable component needs to stay when it stops moving.
• the computer device can monitor the moment when the virtual operating body leaves the movable component based on an event-triggering mechanism of the game collision system.
• in one implementation, the computer device can determine the component current position of the movable component in the local space by monitoring the coordinate position of the movable component in the coordinate system of the local space.
• in another implementation, the computer device may also obtain, from the relationship between the positions of the movable component and the virtual operating body while the movable component moves with the virtual operating body, a function describing the relationship between the position of the virtual operating body and the position of the movable component.
• the computer device can then obtain the position at which the virtual operating body last touched the movable component before leaving it.
• the computer device takes the position at which the virtual operating body last touched the movable component as the parameter of the above function, and thereby obtains the component current position of the movable component in the local space.
• the component current position and the component stop position may be the same position or different positions. When they are different positions, the computer device needs to control the movable component to move from the component current position to the component stop position.
• in the above embodiment, when the virtual operating body leaves the movable component, the component stop position of the movable component in the local space is determined, and the movable component is stopped at that position, so that the control instruction corresponding to the three-dimensional interactive object is output according to the relative position of the stopped movable component relative to the fixed component.
• this avoids the excessive system pressure caused by frequently generating control instructions while the movable component is moving, and thus saves processing resources.
• in one embodiment, determining the component stop position of the movable component in the local space according to the component current position comprises: determining the type of the three-dimensional interactive object; and when the type of the three-dimensional interactive object is a stateless type, taking the component current position as the component stop position of the movable component in the local space.
  • the types of three-dimensional interactive objects include stateless types and stateful types.
• a stateless type of three-dimensional interactive object refers to a three-dimensional interactive object whose movable component stops moving immediately after the virtual operating body leaves.
• a three-dimensional interactive object of a stateful type refers to a three-dimensional interactive object whose movable component is moved, after the virtual operating body leaves, to a target stop position matched according to position matching logic.
• for example, a three-dimensional interactive object is a button. For a button belonging to a stateless type, after the virtual operating body leaves the movable component of the button, the movable component of the button stops moving where it is.
• for a button belonging to a stateful type, suppose the function of the button is a switch and the button needs to be pressed at least halfway to turn the switch on. After the virtual operating body leaves the movable component of the button, if the movable component has not been pressed at least halfway, the switch cannot be turned on, and the movable component of the button automatically moves back to the position it was in before being pressed.
• for a stateless type, the component stop position corresponding to the local-space position of the movable component may be a continuous floating-point value, for example, a floating-point value within 0-1.
• for a stateful type, the component stop position corresponding to the local-space position of the movable component is a discrete preset position value. For example, if the preset position values are 0, 0.5, and 1, the component current position of the movable component when the virtual operating body leaves needs to be matched against these three preset position values, and the target stop position is selected from the three values according to the matching result.
• specifically, the computer device determines the type of the three-dimensional interactive object.
• when the type of the three-dimensional interactive object is a stateless type, the movable component included in the three-dimensional interactive object needs to stop moving immediately after the virtual operating body leaves, and the computer device can directly use the component current position as the component stop position of the movable component in the local space.
• when the type of the three-dimensional interactive object is a stateful type, the computer device can match a target stop position for the movable component based on the component current position, and use it as the final component stop position of the movable component.
  • the relative position of the movable member relative to the fixed member is the position of the movable member in the local space.
• specifically, the computer device may output the position value of the component stop position at which the movable component stays, and generate the control instruction corresponding to the three-dimensional interactive object according to that component stop position value.
• the computer device may obtain the corresponding control instruction by mapping the component stop position value according to a preset mapping relationship between position values and control instructions. For example, when the component position value is 0, the corresponding volume is 0, and when the component stop position value becomes 1, a control instruction that increases the volume to 50 is output.
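The preset mapping from component stop position values to control instructions might be represented as a simple lookup table; the sketch below uses the volume example from the text (the table shape and command name are assumptions):

```python
# Hypothetical preset mapping between position values and control instructions,
# following the volume example: position 0 -> volume 0, position 1 -> volume 50.
POSITION_TO_INSTRUCTION = {
    0.0: ("set_volume", 0),
    1.0: ("set_volume", 50),
}

def output_control_instruction(stop_position_value):
    """Map a component stop position value to its preset control instruction."""
    try:
        return POSITION_TO_INSTRUCTION[stop_position_value]
    except KeyError:
        raise KeyError(f"no instruction preset for position {stop_position_value}")
```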
• in the above embodiment, the component stop position of the movable component in the local space is determined according to the type of the three-dimensional interactive object.
• when the type of the three-dimensional interactive object is a stateless type, the component current position of the movable component is used as the component stop position of the movable component in the local space, making the determined component stop position more accurate.
• in this case, the virtual operating body has absolute control over the movement of the movable component, thereby making the control of the movable component more flexible.
• in one embodiment, determining the component stop position of the movable component in the local space according to the component current position comprises: determining the type of the three-dimensional interactive object; when the type of the three-dimensional interactive object is a stateful type, acquiring at least one preset stop position corresponding to the movable component in the local space; and selecting the preset stop position closest to the component current position as the component stop position of the movable component in the local space.
• a three-dimensional interactive object of a stateful type refers to a three-dimensional interactive object whose movable component is moved, after the virtual operating body leaves, to a target stop position matched according to position matching logic.
• for the movable component of a stateful three-dimensional interactive object, the computer device presets at least one preset stop position in the local space.
• specifically, the computer device determines the type of the three-dimensional interactive object.
• when the type of the three-dimensional interactive object is a stateful type, the computer device can acquire the at least one preset stop position corresponding to the movable component in the local space.
• the computer device can then match the component current position against the acquired preset stop positions, and select the preset stop position closest to the component current position as the component stop position of the movable component in the local space.
• in the above embodiment, the preset stop position closest to the component current position is selected as the component stop position of the movable component in the local space; that is, even when the virtual operating body does not move the movable component exactly to a preset stop position, the target stop position of the movable component can still be accurately determined, so that the output control instruction is more accurate.
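The nearest-preset-stop matching for stateful objects can be sketched as follows (helper name assumed; the preset stop values follow the 0 / 0.5 / 1 example used earlier):

```python
def nearest_preset_stop(component_current, preset_stops):
    """Select the preset stop position closest to the component's current
    position in the local space; preset_stops must be non-empty."""
    return min(preset_stops, key=lambda stop: abs(stop - component_current))
```

For instance, a slider released at position 0.4 with preset stops [0, 0.5, 1] snaps to 0.5.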
• in one embodiment, stopping the movable component at the component stop position comprises: acquiring coordinate values of each of the component current position and the component stop position on the target coordinate axis in the local space; determining, according to the difference between the coordinate values of the component current position and the component stop position on the target coordinate axis, a second length that the movable component needs to move; and controlling the movable component to move the second length along the target coordinate axis from the component current position to reach the component stop position.
  • the target coordinate axis in the local space refers to the target coordinate axis in the coordinate system of the local space of the three-dimensional interactive object. It can be understood that the direction in which the target coordinate axis is located is the direction in which the movable member can move relative to the fixed member, that is, the movable member can move in the direction of the target coordinate axis with respect to the fixed member.
• specifically, the computer device can subtract the coordinate value of the component current position on the target coordinate axis from the coordinate value of the component stop position on the target coordinate axis to obtain the difference between the two.
• the computer device can determine, based on the obtained difference, the second length that the movable component needs to move. It can be understood that the computer device can instead subtract the coordinate value of the component stop position on the target coordinate axis from the coordinate value of the component current position on the target coordinate axis to obtain the difference between the two. Further, the computer device can control the movable component to move the second length along the target coordinate axis from the component current position to reach the component stop position.
• in the above embodiment, the movable component is controlled to move the second length along the target coordinate axis from the component current position to reach the component stop position, so that the movable component reaches the component stop position by the shortest path, and control efficiency is improved.
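The second-length computation can be sketched in the same style (helper name assumed):

```python
def move_to_stop(component_current_on_axis, stop_on_axis):
    """Return the second length to travel and the resulting position: the
    component moves along the target axis from its current position straight
    to the stop position, i.e. by the shortest path."""
    second_length = abs(stop_on_axis - component_current_on_axis)
    return second_length, stop_on_axis
```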
• in another embodiment, a control method in a virtual reality environment is provided, and the method specifically includes the following steps:
• S1002 Display, in a virtual reality environment, a three-dimensional interactive object including a movable component and a fixed component.
• in one embodiment, step S1002 includes: in a virtual reality environment, determining a center for constructing the three-dimensional interactive object; establishing, according to the center, a coordinate system of the local space of the three-dimensional interactive object to be constructed; determining the coordinates, in the coordinate system, of the movable component and the fixed component that constitute the three-dimensional interactive object; and constructing the movable component and the fixed component according to the determined coordinates to obtain the three-dimensional interactive object.
  • S1004 Acquire an initial position of the component in the local space of the three-dimensional interactive object.
  • S1006 Determine a motion initial position of the virtual operating body relative to a local space of the three-dimensional interactive object when the virtual operating body is in contact with the movable component.
  • the method further comprises: monitoring movement of the virtual operating body in the virtual reality environment.
• S1008 Acquire a current motion position of the virtual operating body in the local space when the virtual operating body contacts and moves with the movable component.
  • S1010 Acquire coordinate values of the current motion position and the motion initial position on the target coordinate axis in the local space.
  • S1012 Determine a first length and direction of movement of the movable component according to a difference between coordinate values of the current motion position and the motion initial position on the target coordinate axis.
• S1014 Control the movable component to move the first length in the direction, from the component initial position, along the target coordinate axis fixed relative to the fixed component.
• S1016 When the virtual operating body leaves the movable component, determine the component current position of the movable component in the local space.
• S1018 Determine the type of the three-dimensional interactive object.
• when the type of the three-dimensional interactive object is a stateful type, the process proceeds to step S1020; when the type of the three-dimensional interactive object is a stateless type, the process proceeds to step S1030.
• S1020 Acquire at least one preset stop position corresponding to the movable component in the local space.
• S1022 Select the preset stop position closest to the component current position as the component stop position of the movable component in the local space.
  • S1024 Acquire coordinate values of the current position of the component and the stop position of the component on the target coordinate axis in the local space.
• S1026 Determine a second length that the movable component needs to move according to the difference between the coordinate values of the component current position and the component stop position on the target coordinate axis.
• S1028 Control the movable component to move the second length along the target coordinate axis from the component current position to reach the component stop position.
• S1030 Use the component current position as the component stop position of the movable component in the local space.
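The release-handling branch (S1018 through S1030) can be summarized in one sketch (stateless objects stop where they are; stateful objects snap to the nearest preset stop; all names are assumptions):

```python
def resolve_stop_position(object_type, component_current, preset_stops=None):
    """Decide where the movable component stops once the virtual operating
    body leaves it, according to the type of the three-dimensional object."""
    if object_type == "stateless":
        # The component stops exactly at its current position.
        return component_current
    if object_type == "stateful":
        # Snap to the closest of the preset stop positions.
        return min(preset_stops, key=lambda s: abs(s - component_current))
    raise ValueError(f"unknown object type: {object_type}")
```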
• the above control method in the virtual reality environment virtualizes the interactive function menu interface into a three-dimensional interactive object. In the virtual reality environment, by moving the virtual operating body to the movable component of the three-dimensional interactive object, the movable component can be directly controlled to follow the motion of the virtual operating body, and the control instruction corresponding to the three-dimensional interactive object can be output directly according to the relative position of the moved movable component relative to the fixed component. Interactive control is thus realized by moving the virtual operating body, without performing cumbersome steps such as aiming selection and button clicks on a two-dimensional menu interface, thereby improving the control efficiency in the virtual reality environment.
• moreover, according to the change of the motion initial position to the current motion position, the movable component is controlled to move relative to the fixed component from the component initial position, so that the movable component moves relative to the fixed component while following the virtual operating body; the movement of the movable component then triggers the control instruction corresponding to the three-dimensional interactive object to achieve interactive control.
• moving the first length in the determined direction along the target coordinate axis fixed relative to the fixed component makes the motion of the movable component more accurate and eliminates interference motion, thereby making the triggered control instruction more accurate.
• in addition, the preset stop position closest to the component current position is selected as the component stop position of the movable component in the local space; that is, even when the virtual operating body does not move the movable component exactly to a preset stop position, the target stop position of the movable component can still be accurately determined, so that the output control instruction is more accurate.
  • a computer device is provided.
  • the internal structure of the computer device can be as shown in FIG. 2, and the computer device includes a control device in a virtual reality environment, and the control device in the virtual reality environment includes various modules.
  • Each module may be implemented in whole or in part by software, hardware, or a combination thereof.
• a control device 1100 in a virtual reality environment includes a display module 1102, a movement monitoring module 1104, a control module 1106, and an instruction output module 1108, where:
  • the display module 1102 is configured to display a three-dimensional interactive object including a movable component and a fixed component in a virtual reality environment.
• the movement monitoring module 1104 is configured to monitor movement of the virtual operating body in the virtual reality environment.
  • the control module 1106 is configured to control the movable component to move relative to the fixed component and follow the virtual operating body after the virtual operating body moves into contact with the movable component.
  • the instruction output module 1108 is configured to output a control instruction corresponding to the three-dimensional interactive object according to a relative position of the movable component relative to the fixed component.
• in one embodiment, the display module 1102 is further configured to: in a virtual reality environment, determine a center for constructing the three-dimensional interactive object; establish, according to the center, a coordinate system of the local space of the three-dimensional interactive object to be constructed; determine the coordinates of the movable component and the fixed component of the three-dimensional interactive object in the coordinate system; and construct the movable component and the fixed component according to the determined coordinates to obtain the three-dimensional interactive object.
  • control module 1106 includes:
• a position determining module 1106a, configured to acquire a component initial position of the movable component in the local space of the three-dimensional interactive object; when the virtual operating body moves into contact with the movable component, determine a motion initial position of the virtual operating body relative to the local space of the three-dimensional interactive object; and when the virtual operating body contacts and moves with the movable component, acquire a current motion position of the virtual operating body in the local space.
  • the motion control module 1106b is configured to control the movable component to move relative to the fixed component from the initial position of the component according to the change of the motion initial position to the current motion position.
  • the motion control module 1106b is further configured to acquire coordinate values of the current motion position and the motion initial position on a target coordinate axis in the local space; according to the current motion position and location Determining a difference between coordinate values of the motion initial position on the target coordinate axis, determining a first length and direction of movement of the movable component; controlling the movable component from the initial position of the component, along with respect to the The target coordinate axis fixed by the fixing member and moving the first length in the direction.
• in one embodiment, the control module 1106 is further configured to: when the virtual operating body leaves the movable component, determine a component current position of the movable component in the local space; determine, according to the component current position, a component stop position of the movable component in the local space; and stop the movable component at the component stop position.
• in one embodiment, the control module 1106 is further configured to determine the type of the three-dimensional interactive object; when the type of the three-dimensional interactive object is a stateless type, the component current position is used as the component stop position of the movable component in the local space.
• in one embodiment, the control module 1106 is further configured to determine the type of the three-dimensional interactive object; when the type of the three-dimensional interactive object is a stateful type, acquire at least one preset stop position corresponding to the movable component in the local space; and select the preset stop position closest to the component current position as the component stop position of the movable component in the local space.
• in one embodiment, the control module 1106 is further configured to acquire coordinate values of the component current position and the component stop position on the target coordinate axis in the local space; determine, according to the difference between the coordinate values of the component current position and the component stop position on the target coordinate axis, a second length that the movable component needs to move; and control the movable component to move the second length along the target coordinate axis from the component current position to reach the component stop position.
• the control device in the virtual reality environment provided by the present application may be implemented in the form of computer readable instructions executable on a computer device as shown in FIG. 2.
• the non-volatile storage medium of the computer device can store the program modules constituting the control device in the virtual reality environment, for example, the display module 1102, the movement monitoring module 1104, the control module 1106, and the instruction output module 1108 described above.
• each of the program modules includes computer readable instructions for causing the computer device to perform the steps in the control method in a virtual reality environment of the various embodiments of the present application described in this specification. For example,
• the computer device can display, through the display module 1102 in the control device 1100 in the virtual reality environment, the three-dimensional interactive object including the movable component and the fixed component in the virtual reality environment.
• the computer device may output, through the instruction output module 1108, the control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component.
• in one embodiment, a computer apparatus is provided, comprising a memory and a processor, the memory having stored therein computer readable instructions that, when executed by the processor, cause the processor to perform the following steps: displaying, in a virtual reality environment, a three-dimensional interactive object including a movable component and a fixed component; monitoring movement of a virtual operating body in the virtual reality environment; after the virtual operating body moves into contact with the movable component, controlling the movable component to move relative to the fixed component and follow the virtual operating body; and outputting a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component.
• in one embodiment, displaying the three-dimensional interactive object including the movable component and the fixed component comprises: determining, in the virtual reality environment, a center for constructing the three-dimensional interactive object; establishing, according to the center, a coordinate system of the local space of the three-dimensional interactive object to be constructed; determining coordinates, in the coordinate system, of each of the movable component and the fixed component constituting the three-dimensional interactive object; and constructing the movable component and the fixed component according to the determined coordinates to obtain the three-dimensional interactive object.
• in one embodiment, controlling the movable component to move relative to the fixed component and follow the virtual operating body comprises: acquiring a component initial position of the movable component in the local space of the three-dimensional interactive object; when the virtual operating body moves into contact with the movable component, determining a motion initial position of the virtual operating body relative to the local space of the three-dimensional interactive object; when the virtual operating body contacts and moves with the movable component, acquiring a current motion position of the virtual operating body in the local space; and controlling, according to the change of the motion initial position to the current motion position, the movable component to move relative to the fixed component from the component initial position.
• in one embodiment, controlling the movable component to move relative to the fixed component from the component initial position according to the change of the motion initial position to the current motion position comprises: acquiring coordinate values of each of the current motion position and the motion initial position on the target coordinate axis in the local space; determining, according to the difference between the coordinate values of the current motion position and the motion initial position on the target coordinate axis, a first length and a direction in which the movable component needs to move; and controlling the movable component to move the first length in the direction, from the component initial position, along the target coordinate axis fixed relative to the fixed component.
• in one embodiment, before the outputting of the control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component, the computer readable instructions further cause the processor to perform the following steps: when the virtual operating body leaves the movable component, determining a component current position of the movable component in the local space; determining, according to the component current position, a component stop position of the movable component in the local space; and stopping the movable component at the component stop position.
• in one embodiment, determining the component stop position of the movable component in the local space according to the component current position comprises: determining the type of the three-dimensional interactive object; and when the type of the three-dimensional interactive object is a stateless type, taking the component current position as the component stop position of the movable component in the local space.
• in one embodiment, determining the component stop position of the movable component in the local space according to the component current position comprises: determining the type of the three-dimensional interactive object; when the type of the three-dimensional interactive object is a stateful type, acquiring at least one preset stop position corresponding to the movable component in the local space; and selecting the preset stop position closest to the component current position as the component stop position of the movable component in the local space.
• in one embodiment, stopping the movable component at the component stop position comprises: acquiring coordinate values of each of the component current position and the component stop position on the target coordinate axis in the local space; determining, according to the difference between the coordinate values of the component current position and the component stop position on the target coordinate axis, a second length that the movable component needs to move; and controlling the movable component to move the second length along the target coordinate axis from the component current position to reach the component stop position.
• in one embodiment, a storage medium storing computer readable instructions is provided; the computer readable instructions, when executed by one or more processors, cause the one or more processors to perform the following steps: displaying, in a virtual reality environment, a three-dimensional interactive object including a movable component and a fixed component; monitoring movement of a virtual operating body in the virtual reality environment; after the virtual operating body moves into contact with the movable component, controlling the movable component to move relative to the fixed component and follow the virtual operating body; and outputting a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component.
  • displaying a three-dimensional interactive object comprising a movable component and a fixed component comprises: determining, in the virtual reality environment, a center for constructing the three-dimensional interactive object; establishing, according to the center, a coordinate system of a local space of the three-dimensional interactive object; determining coordinates, in the coordinate system, of the movable component and the fixed component that make up the three-dimensional interactive object; and constructing the movable component and the fixed component according to the determined coordinates to obtain the three-dimensional interactive object.
  • controlling the movable component to move relative to the fixed component while following the virtual operating body comprises: acquiring a component initial position of the movable component in the local space of the three-dimensional interactive object; determining, when the virtual operating body moves into contact with the movable component, a motion initial position of the virtual operating body relative to the local space of the three-dimensional interactive object; acquiring, when the virtual operating body moves while in contact with the movable component, a current motion position of the virtual operating body in the local space; and controlling, according to the change from the motion initial position to the current motion position, the movable component to move relative to the fixed component from the component initial position.
  • controlling the movable component to move relative to the fixed component from the component initial position according to the change from the motion initial position to the current motion position comprises: acquiring coordinate values of the current motion position and the motion initial position on a target coordinate axis in the local space; determining, according to the difference between the coordinate values of the current motion position and the motion initial position on the target coordinate axis, a first length and a direction the movable component needs to move; and controlling the movable component to move, from the component initial position, the first length in the direction along the target coordinate axis, which is fixed relative to the fixed component.
  • before the outputting of the control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component, the computer readable instructions further cause the processor to perform the following steps: determining, when the virtual operating body leaves the movable component, a current component position of the movable component in the local space; determining a component stop position of the movable component in the local space according to the current component position; and keeping the movable component at the component stop position.
  • determining the component stop position of the movable component in the local space according to the current component position comprises: determining a type of the three-dimensional interactive object; and when the type of the three-dimensional interactive object is a stateless type, taking the current component position as the component stop position of the movable component in the local space.
  • determining the component stop position of the movable component in the local space according to the current component position comprises: determining a type of the three-dimensional interactive object; when the type of the three-dimensional interactive object is a stateful type, acquiring at least one preset stop position corresponding to the movable component in the local space; and selecting the preset stop position closest to the current component position as the component stop position of the movable component in the local space.
  • keeping the movable component at the component stop position comprises: acquiring coordinate values of the current component position and the component stop position on a target coordinate axis in the local space; determining, according to the difference between the coordinate values of the current component position and the component stop position on the target coordinate axis, a second length the movable component needs to move; and controlling the movable component to move, from the current component position, the second length along the target coordinate axis to reach the component stop position.
  • the storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A control method in a virtual reality environment, comprising: in a virtual reality environment, displaying a three-dimensional interactive object comprising a movable component and a fixed component (S302); monitoring movement of a virtual operating body in the virtual reality environment (S304); after the virtual operating body moves into contact with the movable component, controlling the movable component to move relative to the fixed component while following the virtual operating body (S306); and outputting a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component (S308).

Description

Control method, device and storage medium in a virtual reality environment
This application claims priority to Chinese Patent Application No. 2017109361329, entitled "Control method, apparatus, device and storage medium in a virtual reality environment", filed with the Chinese Patent Office on October 10, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of computer technology, and in particular to a control method, device and storage medium in a virtual reality environment.
Background
With the rapid development of science and technology, virtual reality (VR) technology has become increasingly popular among users thanks to its realistic experience and good interactivity.
At present, interactive control in a virtual reality environment works by displaying a two-dimensional interactive menu interface in the three-dimensional space of the virtual reality environment and then emitting a ray from a handle to select menus in that interface: the intersection of the ray with the two-dimensional interactive menu interface marks the menu to be interacted with, and the selected menu is then clicked with a button on the handle to complete the interaction.
This control approach offers no genuine sense of interaction in a virtual reality environment; it is essentially the same as mouse-click interaction on a two-dimensional computer screen. The user still has to aim and select with mouse-like operations and then perform multi-step operations such as further button clicks, so control efficiency is low.
Summary
According to various embodiments provided in this application, a control method in a virtual reality environment, a computer device and a storage medium are provided.
A control method in a virtual reality environment comprises:
a computer device displaying, in a virtual reality environment, a three-dimensional interactive object comprising a movable component and a fixed component;
the computer device monitoring movement of a virtual operating body in the virtual reality environment;
the computer device, after the virtual operating body moves into contact with the movable component, controlling the movable component to move relative to the fixed component while following the virtual operating body; and
the computer device outputting a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component.
A computer device comprises a memory and one or more processors, the memory storing computer readable instructions which, when executed by the one or more processors, cause the one or more processors to perform the following steps:
in a virtual reality environment, displaying a three-dimensional interactive object comprising a movable component and a fixed component;
monitoring movement of a virtual operating body in the virtual reality environment;
after the virtual operating body moves into contact with the movable component, controlling the movable component to move relative to the fixed component while following the virtual operating body; and
outputting a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component.
One or more storage media storing computer readable instructions which, when executed by one or more processors, cause the one or more processors to perform the following steps:
in a virtual reality environment, displaying a three-dimensional interactive object comprising a movable component and a fixed component;
monitoring movement of a virtual operating body in the virtual reality environment;
after the virtual operating body moves into contact with the movable component, controlling the movable component to move relative to the fixed component while following the virtual operating body; and
outputting a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component.
Details of one or more embodiments of this application are set forth in the drawings and description below. Other features, objects and advantages of this application will become more apparent from the specification, drawings and claims.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the drawings required for describing the embodiments are briefly introduced below. Apparently, the drawings in the following description show only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a diagram of an application environment of a control method in a virtual reality environment according to an embodiment;
FIG. 2 is a block diagram of a computer device according to an embodiment;
FIG. 3 is a schematic flowchart of a control method in a virtual reality environment according to an embodiment;
FIG. 4 is a schematic diagram of three-dimensional interactive objects according to an embodiment;
FIG. 5A and FIG. 5B are schematic diagrams of controlling the movement of a movable component according to an embodiment;
FIG. 6 is a schematic flowchart of the three-dimensional interactive object display step according to an embodiment;
FIG. 7 is a schematic flowchart of the movable component control step according to an embodiment;
FIG. 8 is a schematic flowchart of the movable component movement step according to an embodiment;
FIG. 9 is a schematic diagram of the movement of a movable component according to an embodiment;
FIG. 10 is a schematic flowchart of a control method in a virtual reality environment according to another embodiment;
FIG. 11 is a block diagram of a control apparatus in a virtual reality environment according to an embodiment; and
FIG. 12 is a block diagram of a control apparatus in a virtual reality environment according to another embodiment.
Detailed Description
To make the objectives, technical solutions and advantages of this application clearer, this application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain this application and not to limit it.
图1为一个实施例中虚拟现实环境下的控制方法的应用环境图。参照图1,该应用环境包括应用环境包括通过网络进行连接通信的真实操作装置110和计算机设备120。其中,计算机设备120中安装有虚拟现实应用程序(可简称为虚拟现实应用),虚拟现实应用可以实现虚拟现实场景。真实操作装置 110,是位于真实环境中对计算机设备110中的虚拟现实应用进行操作控制的装置,真实操作装置110可以包括操作手柄或其它具有输入操作功能的输入装置,比如捕捉人手动作的传感器等。计算机设备120可以是终端,终端可以为台式计算机或移动终端,移动终端可以包括手机、平板电脑、个人数字助理和穿戴式设备等中的至少一种。
计算机设备120可以通过计算机设备120上运行的虚拟现实应用实现虚拟现实场景,并在虚拟现实场景中显示包括活动部件和固定部件的三维交互物件120a,其中,三维交互物件120a是将二维功能菜单界面中的各个功能菜单进行可视觉化处理得到的三维模型,三维交互物件120a具备相应功能菜单的交互功能,即通过对三维交互物件120a进行操作可以实现相应功能菜单所对应的功能。计算机设备120可以通过运行的虚拟现实应用生成虚拟操作体120b,其中,虚拟操作体120b是用于对三维交互物件进行操作控制的虚拟执行主体。虚拟操作体120b在虚拟现实环境中的全局空间位置可以是由真实操作装置110的物理空间位置映射得到。用户可以在真实环境中通过移动真实操作装置110的物理空间位置,来实现虚拟现实环境中相应的虚拟操作体120b的移动。当虚拟操作体120b与三维交互物件120a中的活动部件接触后,计算机设备120可以控制活动部件相对于固定部件并跟随虚拟操作体运动。计算机设备120可以按照活动部件相对于固定部件的相对位置,输出与三维交互物件120a对应的控制指令。
图2为一个实施例中计算机设备的内部结构示意图。该计算机设备可以是图1中的计算机设备120。参照图2,该计算机设备包括通过系统总线连接的处理器、存储器、网络接口和显示屏。其中,存储器包括非易失性存储介质和内存储器。该计算机设备的非易失性存储介质可存储操作系统和计算机可读指令,该计算机可读指令被执行时,可使得处理器执行一种虚拟现实环境下的控制方法。该计算机设备的处理器用于提供计算和控制能力,支撑整个计算机设备的运行。该计算机设备的内存储器中可储存有计算机可读指令,该计算机可读指令被处理器执行时,可使得处理器执行一种虚拟现实环境下 的控制方法。计算机设备的网络接口用于进行网络通信。计算机设备的显示屏可以是液晶显示屏或者电子墨水显示屏。
本领域技术人员可以理解,图2中示出的结构,仅仅是与本申请方案相关的部分结构的框图,并不构成对本申请方案所应用于其上的计算机设备的限定,具体的计算机设备可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
图3为一个实施例中虚拟现实环境下的控制方法的流程示意图。本实施例主要以该虚拟现实环境下的控制方法应用于上述图2中的计算机设备来举例说明。参照图3,该方法具体包括如下步骤:
S302: In the virtual reality environment, display a three-dimensional interactive object comprising a movable component and a fixed component.
其中,三维交互物件,是将二维功能菜单界面中的各个功能菜单进行可视化处理得到的三维模型。三维交互物件具备相应功能菜单的交互功能,即通过对三维交互物件进行操作可以实现相应功能菜单所对应的功能。比如,三维交互物件可以具备实现音量调节、进度调节、位置移动或方向调节等各种交互功能。
在一个实施例中,三维交互物件的表现形式可以是三维的按钮、滑动条和拉杆等。比如,可以通过移动滑动条上的滑块来实现音量调节、进度调节和位置移动等交互功能。
活动部件,是三维交互物件中的、且在三维交互物件的局部空间中能够产生运动的部件。固定部件,是三维交互物件中的、且在三维交互物件的局部空间中固定不产生运动的部件。其中,三维交互物件的局部空间,是以三维交互物件的中心为原点建立的局部空间坐标系和三维交互物件的该中心组合起来,所构成的属于三维交互物件自身的空间。可以理解,每个物体都拥有自身的局部空间坐标系和局部空间,每个物体自身的局部空间坐标系与该物体的相对位置始终保持不变。
现结合图4对三维交互物件进行举例说明,如图4所示,按钮、拉杆和 滑动条分别为三个不同的三维交互物件,其中,按钮中的402a、拉杆中的404a以及滑动条中的406a就属于活动部件,按钮中的402b、拉杆中的404b以及滑动条中的406b就属于固定部件。
S304: Monitor movement of the virtual operating body in the virtual reality environment.
其中,虚拟操作体,是用于对三维交互物件进行操作控制的虚拟执行主体。虚拟操作体在虚拟现实环境中的全局空间位置可以是由真实操作装置的物理空间位置映射得到。比如,真实操作装置为操作手柄时,操作手柄于真实环境中所处的物理空间位置在虚拟现实环境中的映射位置即为该虚拟操作体在虚拟现实环境中的全局空间位置。其中,全局空间,是指该虚拟现实环境下的整体三维空间,真实操作装置可以包括操作手柄或其它具有输入操作功能的输入装置(例如捕捉人手动作的传感器等)。
具体地,计算机设备可以接收真实操作装置的物理空间位置,并根据该物理空间位置映射得到虚拟现实环境中的虚拟操作体的全局空间位置。用户可以通过移动真实操作装置的物理空间位置,来实现对虚拟操作体在虚拟现实环境中的全局空间位置的移动。计算机设备可以监测虚拟操作体在虚拟现实环境中的移动。
S306: After the virtual operating body moves into contact with the movable component, control the movable component to move relative to the fixed component while following the virtual operating body.
当监测到虚拟操作体与虚拟现实环境中的三维交互物体中的活动部件接触后,计算机设备可以控制活动部件相对于三维交互物体中的固定部件、且跟随虚拟操作体的运动而运动。可以理解,三维交互物体的固定部件是三维交互物体的局部空间中处于固定不产生运动的部件,活动部件相对于固定部件运动则是指活动部件仅在三维交互物体的局部空间内运动,而并不是整个三维交互物体在虚拟现实环境中的全局空间位置发生变化。
此外,活动部件跟随虚拟操作体运动,是指虚拟操作体与活动部件接触时,若虚拟操作体运动,活动部件也随之运动,以实现虚拟操作体直接作用于固定部件上进行运动。
图5A至图5B为一个实施例中控制活动部件运动的示意图。图5A中的虚拟手模型502为虚拟操作体,按钮为三维交互物件,其中,504a为活动部件,504b为固定部件,虚拟手模型502接触按钮的活动部件504a后,往虚线箭头方向运动,图5B则表示按钮504a相对于固定部件504b、并且跟随虚拟手模型502的运动而向虚线箭头方向运动后的状态。
S308: Output a control instruction corresponding to the three-dimensional interactive object according to the relative position of the movable component relative to the fixed component.
其中,活动部件相对于固定部件的相对位置,是指在所属三维交互物件的局部空间中,活动部件所处的位置与固定部件所处的位置之间的相对关系。与三维交互物件对应的控制指令,是用于实现三维交互物件所具备的功能的控制指令。比如,三维交互物件所具备的功能是调节音量,则计算机设备可以按照活动部件相对于固定部件的相对位置,输出调节音量的控制指令。
具体地,计算机设备可以按照活动部件在跟随虚拟操作体运动的过程中相对于固定部件的相对位置,输出与三维交互物件对应的控制指令。计算机设备还可以根据虚拟操作体在离开活动部件时,活动部件所处的部件位置确定该活动部件相对于固定部件的相对位置,输出与三维交互物件对应的控制指令。
在一个实施例中,活动部件相对于固定部件的相对位置不同,所输出的与所述三维交互物件对应的控制指令不同。
具体地,三维交互物件所具备的功能包括至少一个的子功能,所有子功能组合起来构成三维交互物件所具备的功能,各子功能分别对应不同的控制指令,子功能所对应的控制指令在被触发时可以实现该子功能。
计算机设备可以预先设置活动部件相对于固定部件的相对位置(以下简称相对位置)与三维交互物件所具备的各子功能(以下简称子功能)之间的对应关系。其中,可以将相对位置与子功能一一对应设置,相对位置不同所对应的子功能可以不同。也可以是根据相对位置所属的预设范围设置对应的子功能,属于不同预设范围内的相对位置所对应的子功能不同,属于相同预 设范围内的相对位置所对应的子功能相同。
计算机设备可以按照活动部件相对于固定部件的相对位置,输出用于实现三维交互物件中相应子功能的控制指令。比如,三维交互物件所具备的功能是位置移动,其中就包括前进、后退或转身的子功能,则计算机设备可以按照活动部件相对于固定部件的相对位置,输出前进、后退或转身等控制指令。
In an embodiment, the relative position of the movable component relative to the fixed component is the position of the movable component in the local space. In this embodiment, the computer device may output the position value of the movable component and generate the control instruction corresponding to the three-dimensional interactive object according to that position value.
Specifically, the computer device may map the position value to the corresponding control instruction according to a preset mapping between position values and control instructions. For example, when the position value is 0 the corresponding volume is 0, and when the position value changes to 1, a control instruction to raise the volume to 50 is output.
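The mapping from a position value to a control instruction described above can be sketched as a small lookup function. This is an illustrative sketch only: the `set_volume` command name and the linear scale factor of 50 are assumptions drawn from the volume example, not part of the claimed method.

```python
def position_to_instruction(position_value: float) -> dict:
    """Map a movable component's position value in [0, 1] to a control
    instruction, following the example where position 0 means volume 0
    and position 1 means volume 50 (linear scaling is an assumption)."""
    volume = round(position_value * 50)
    return {"command": "set_volume", "value": volume}

# Sliding the movable component to the end of its travel raises the volume to 50.
instruction = position_to_instruction(1.0)
```

In practice the mapping could equally be a table of discrete position values, as the preset-mapping wording suggests; a linear scale is just the simplest choice consistent with the example.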
上述虚拟现实环境下的控制方法,通过将进行交互的功能菜单界面虚拟为三维交互物件,在虚拟现实环境中,通过虚拟操作体与三维交互物件的活动部件接触,移动虚拟操作体就可以直接控制活动部件跟随虚拟操作体运动,根据运动的活动部件相对于固定部件的相对位置,就可以直接输出与三维交互物件对应的控制命令,通过移动虚拟操作体就可以实现交互控制,不需要基于二维菜单界面进行瞄准选择并按键点击等繁琐的步骤来进行交互,提高了虚拟现实环境下的控制效率。
As shown in FIG. 6, in an embodiment, step S302 (the three-dimensional interactive object display step for short) specifically comprises the following steps:
S602: In the virtual reality environment, determine a center for constructing the three-dimensional interactive object.
Here, the center for constructing the three-dimensional interactive object is the center point of the three-dimensional interactive object to be constructed.
It can be understood that the position of this center point is set in advance in the virtual reality environment, and the computer device can acquire the preset position of the center point.
S604: Establish, according to the center, the coordinate system of the local space of the three-dimensional interactive object to be constructed.
Specifically, the coordinate system of the local space of the three-dimensional interactive object to be constructed is established with the center point of the three-dimensional interactive object as the origin. It can be understood that each three-dimensional interactive object has its own local-space coordinate system, and the relative position between a three-dimensional interactive object and its local-space coordinate system always remains unchanged.
S606: Determine the coordinates, in the coordinate system, of the movable component and the fixed component that make up the three-dimensional interactive object.
Here, the three-dimensional interactive object comprises the movable component and the fixed component. It can be understood that this does not limit the three-dimensional interactive object to consisting of only these two parts.
Specifically, the computer device may acquire the preset coordinates of the movable component and the fixed component that make up the three-dimensional interactive object in the coordinate system of its local space.
S608: Construct the movable component and the fixed component according to the determined coordinates to obtain the three-dimensional interactive object.
Specifically, the computer device may find, in the established local-space coordinate system, the positions corresponding to the determined coordinates, and then construct the movable component and the fixed component at those positions, thereby obtaining the three-dimensional interactive object.
In the above embodiment, a three-dimensional interactive object comprising a movable component and a fixed component is constructed in the virtual reality environment. The three-dimensional interactive object is a three-dimensional model capable of implementing the functions of a two-dimensional function menu, which makes control operations in the virtual reality environment more convenient and intuitive, thereby improving control efficiency.
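As a rough sketch of steps S602 to S608, the object can be represented by its center (the local-space origin) plus the local-space coordinates of its two parts. The class and field names are illustrative assumptions, and the local frame is assumed to be axis-aligned with the world frame; a real engine would carry a full transform with rotation and scale.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class InteractiveObject3D:
    """A three-dimensional interactive object built around a center point.

    Part coordinates are given in the object's own local space, whose
    origin is the object's center."""
    center: Vec3          # local-space origin, expressed in world space
    movable_local: Vec3   # movable component's coordinates in local space
    fixed_local: Vec3     # fixed component's coordinates in local space

    def world_position(self, local: Vec3) -> Vec3:
        """Place a local-space coordinate into world space (local axes
        assumed aligned with the world axes, no rotation or scale)."""
        return tuple(c + l for c, l in zip(self.center, local))

# A button centered at (1, 2, 3) whose movable cap sits 0.1 above the base.
button = InteractiveObject3D(center=(1.0, 2.0, 3.0),
                             movable_local=(0.0, 0.1, 0.0),
                             fixed_local=(0.0, 0.0, 0.0))
```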
As shown in FIG. 7, in an embodiment, step S306 (the movable component control step for short) specifically comprises the following steps:
S702: Acquire the component initial position of the movable component in the local space of the three-dimensional interactive object.
Here, the component initial position is the position of the movable component in the local space of the three-dimensional interactive object before it has followed the operating body's movement.
S704: When the virtual operating body moves into contact with the movable component, determine the motion initial position of the virtual operating body relative to the local space of the three-dimensional interactive object.
Here, the motion initial position is the position of the virtual operating body in the local space of the three-dimensional interactive object at the moment it just comes into contact with the movable component.
Specifically, when the virtual operating body comes into contact with the movable component, the computer device may acquire the position of the virtual operating body in the global space of the virtual reality environment and convert that global-space position into a position in the local space of the three-dimensional interactive object, obtaining the motion initial position.
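The global-to-local conversion mentioned here can be sketched as follows. Under the simplifying assumption of an axis-aligned, unscaled local frame, the conversion reduces to a translation by the object's center; with rotation or scale, the object's inverse world transform would be applied instead. The function name is an illustrative assumption.

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]

def global_to_local(global_pos: Vec3, object_center: Vec3) -> Vec3:
    """Express a global (world-space) position in an object's local space,
    assuming the local axes are aligned with the world axes and unscaled."""
    return tuple(g - c for g, c in zip(global_pos, object_center))

# The operating body touches the movable component at world (1.5, 2.0, 3.0);
# for an object centered at (1.0, 2.0, 3.0), the motion initial position
# in local space is (0.5, 0.0, 0.0).
motion_initial = global_to_local((1.5, 2.0, 3.0), (1.0, 2.0, 3.0))
```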
S706: When the virtual operating body moves while in contact with the movable component, acquire the current motion position of the virtual operating body in the local space.
Here, the virtual operating body moving while in contact with the movable component means the virtual operating body moves while remaining in contact with the movable component; the current motion position is the position of the virtual operating body in the local space of the three-dimensional interactive object during that contact movement.
Specifically, when the virtual operating body moves while in contact with the movable component, the computer device may acquire the current position of the virtual operating body in the global space and convert it into a position in the local space of the three-dimensional interactive object, obtaining the current motion position.
S708: Control the movable component to move relative to the fixed component from the component initial position according to the change from the motion initial position to the current motion position.
Specifically, the computer device may determine the movement path of the movable component according to the change of the virtual operating body from the motion initial position to the current motion position, and control the movable component to move relative to the fixed component from the component initial position along the determined path.
In an embodiment, the computer device may take the path of the virtual operating body from the motion initial position to the current motion position as the movement path of the movable component. In another embodiment, the computer device may instead determine the positions of the motion initial position and the current motion position in a target direction, and take the path from the motion initial position's projection along that direction to the current motion position's projection as the movement path of the movable component.
In an embodiment, the computer device may take a preset number of frames as one statistics period and determine the current motion position of the virtual operating body in the local space every preset number of frames. Specifically, with one frame as a period, the computer device may determine the current motion position of the virtual operating body in the local space each time a frame of the image is generated, thereby obtaining the motion trajectory of the virtual operating body from the motion initial position through the per-frame current motion positions, and control the movable component to move relative to the fixed component from the component initial position along that trajectory.
In an embodiment, before step S708, the computer device may further monitor whether the change from the motion initial position to the current motion position is within a preset range; if so, step S708 is performed; if not, the movable component is kept at the component position corresponding to the boundary value of the preset range, that is, the position the movable component occupies at the moment the virtual operating body moves exactly to the boundary value of the preset range.
For example, if the preset range from the motion initial position to the current motion position is 5 meters, then for current motion positions within 5 meters of the motion initial position, the computer device may control the movable component to move relative to the fixed component from the component initial position; if the virtual operating body moves beyond 5 meters, the movable component stays at the component position it occupied when the virtual operating body had moved exactly 5 meters, and no longer follows the virtual operating body.
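The preset-range check in the 5-meter example can be sketched as a one-dimensional clamp along the target axis: within the range the component follows the operating body's displacement, and beyond the boundary it stays at the boundary. The function name and the treatment of the range as symmetric around the initial position are illustrative assumptions.

```python
def follow_with_limit(motion_initial: float, motion_current: float,
                      limit: float) -> float:
    """Return the displacement the movable component follows along the
    target axis, clamped to the preset range `limit` (e.g. 5 meters)."""
    delta = motion_current - motion_initial
    if delta > limit:
        return limit      # operating body moved past the boundary
    if delta < -limit:
        return -limit     # same in the opposite direction
    return delta          # within range: follow exactly

# Moving 7 m against a 5 m limit leaves the component at the 5 m boundary.
displacement = follow_with_limit(0.0, 7.0, 5.0)
```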
In the above embodiment, the movable component is controlled to move relative to the fixed component from the component initial position according to the change of the virtual operating body from the motion initial position to the current motion position in the local space. This allows the movable component to move relative to the fixed component while following the virtual operating body, so that the control instruction corresponding to the three-dimensional interactive object is triggered by controlling the movement of the movable component; that is, interactive control is achieved by the virtual operating body directly manipulating the movable component, which improves control efficiency in the virtual reality environment.
As shown in FIG. 8, in an embodiment, step S708 (the movable component movement step for short) specifically comprises the following steps:
S802: Acquire the coordinate values of the current motion position and the motion initial position on the target coordinate axis in the local space.
Here, the target coordinate axis in the local space refers to the target coordinate axis in the coordinate system of the local space of the three-dimensional interactive object. It can be understood that the direction of the target coordinate axis is the direction in which the movable component can move relative to the fixed component.
S804: Determine, according to the difference between the coordinate values of the current motion position and the motion initial position on the target coordinate axis, the first length and direction the movable component needs to move.
Here, the direction in which the movable component needs to move is the direction in which it needs to move along the target coordinate axis, which may be along the target coordinate axis away from the origin of the local-space coordinate system, along the target coordinate axis toward the origin, or along the target coordinate axis first toward and then away from the origin.
Specifically, the computer device may subtract the coordinate value of the motion initial position on the target coordinate axis from that of the current motion position to obtain the difference between the two, and determine from that difference the first length and direction the movable component needs to move. It can be understood that the computer device may instead subtract the coordinate value of the current motion position from that of the motion initial position to obtain the difference.
S806: Control the movable component to move, from the component initial position, the first length in that direction along the target coordinate axis, which is fixed relative to the fixed component.
It can be understood that the relative position between the local-space coordinate system and the three-dimensional interactive object is fixed, and the position of the fixed component in the local space of the three-dimensional interactive object is also fixed; therefore, the target coordinate axis is fixed relative to the fixed component.
FIG. 9 is a schematic diagram of the movement of the movable component in an embodiment. As shown in FIG. 9, the three-dimensional interactive object is a slider bar, the slider block is the movable component b, and the slide rail is the fixed component. The target coordinate axis in the coordinate system of the slider bar's local space is the X axis. The coordinate value of the current motion position of the virtual operating body a on the target coordinate axis X is x2, and the coordinate value of the motion initial position of the virtual operating body a on the X axis is x1. The difference between the two coordinate values is x2 - x1, so the first length the movable component b needs to move is |x2 - x1| and the direction is away from the origin O of the three-dimensional interactive object; the computer device can then control the movable component b to move, from the component initial position, the first length |x2 - x1| along the target coordinate axis X in the direction away from the origin O.
In the above embodiment, the first length and direction the movable component needs to move are determined according to the difference between the coordinate values of the current motion position and the motion initial position on the target coordinate axis, and the movable component is controlled to move, from the component initial position, the first length in that direction along the target coordinate axis, which is fixed relative to the fixed component. This allows the movable component to move relative to the fixed component while following the virtual operating body, so that controlling the movement of the movable component triggers the control instruction corresponding to the three-dimensional interactive object and achieves interactive control. In addition, moving the first length along the target coordinate axis fixed relative to the fixed component makes the movement of the movable component more accurate and filters out interfering motion, so that the triggered control instruction is more accurate.
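Steps S802 to S806 can be sketched in one dimension along the target axis: the first length is |x2 - x1| and the sign of (x2 - x1) gives the direction, so the component ends at its initial coordinate plus the signed displacement of the operating body on that axis. The function name is an illustrative assumption.

```python
def move_along_axis(part_initial: float, motion_initial: float,
                    motion_current: float) -> float:
    """Move the movable component along the target coordinate axis.

    The first length is abs(x2 - x1); the sign of (x2 - x1) gives the
    direction (away from or toward the local-space origin)."""
    first_length = abs(motion_current - motion_initial)
    direction = 1.0 if motion_current >= motion_initial else -1.0
    return part_initial + direction * first_length

# The slider example: the operating body moves from x1 = 0.25 to x2 = 1.0,
# so the movable component b moves 0.75 away from the origin O.
new_position = move_along_axis(0.0, 0.25, 1.0)
```

Projecting the operating body's motion onto a single axis is what "filters out interfering motion": displacement on the other two axes simply never reaches the component.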
在一个实施例中,在步骤S308之前,该方法还包括:当虚拟操作体离开活动部件时,确定活动部件在局部空间中的部件当前位置;根据部件当前位置,确定活动部件在局部空间中的部件停止位置;将活动部件停留于部件停止位置。
其中,部件当前位置,是虚拟操作体离开活动部件的瞬间,活动部件在局部空间中所处的实时位置。部件停止位置,是活动部件停止运动时需要在局部空间中处于的位置。
在一个实施例中,计算机设备可以根据游戏碰撞系统的事件触发机制来监测虚拟操作体离开活动部件的瞬间。当虚拟操作体离开活动部件时,计算机设备可以通过监测活动部件在局部空间的坐标系中所处的坐标位置,确定活动部件在局部空间中的部件当前位置。
在另一个实施例中,计算机设备也可以根据控制活动部件随虚拟操作体运动时两者间的运动位置间的关系,得到表示虚拟操作体的运动位置和活动部件的运动位置之间关系的函数。计算机设备可以获取虚拟操作体在离开活动部件前最后接触活动部件时所处的位置。计算机设备将虚拟操作体最后接触活动部件时所处的位置作为入参代入上述得到的函数,得到活动部件的在局部空间中的部件当前位置。
需要说明的是,部件当前位置与部件停止位置可以是相同的位置,也可以是不同的位置,当两者是不同的位置时,则计算机设备需要控制活动部件从部件当前位置起运动到部件停止位置。
本实施例中,当虚拟操作体离开活动部件时,确定活动部件在局部空间中的部件停止位置;并将活动部件停留于部件停止位置,以根据停留于部件停止位置的活动部件相对于固定部件的相对位置,来输出与三维交互物件对应的控制指令,避免了在活动部件运动过程中就频繁的生成控制指令造成系统压力过大的问题,同时节省了处理资源。
在一个实施例中,根据部件当前位置,确定活动部件在局部空间中的部件停止位置包括:确定三维交互物件的类型;当三维交互物件的类型为无状态类型时,则将部件当前位置作为活动部件在局部空间中的部件停止位置。
其中,三维交互物件的类型包括无状态类型和有状态类型。无状态类型的三维交互物件,是指当虚拟操作体离开后所包括的活动部件就立即停止运动的三维交互物件。有状态类型的三维交互物件,是指在虚拟操作体离开后,所包括的活动部件仍然会运动至根据位置匹配逻辑所匹配出的目标停止位置的三维交互物件。
比如,三维交互物件为按钮,针对属于无状态类型的按钮,则虚拟操作体离开该按钮的活动部件后,该按钮的活动部件就会停止运动。而针对属于有状态类型的按钮,该按钮所具备的功能为开关。假如需要将按钮按压至少一半才能够打开开关,则虚拟操作体离开该按钮的活动部件后,如果该按钮的活动部件未被按照至少一半,则不能够打开开关,那么该按钮的活动部件就会自动运动返回至未被按压前的位置。
在一个实施例中,针对无状态的类型的三维交互物件,活动部件在局部空间位置所对应的部件停止位置是连续型的浮点数值。比如,0-1内的浮点数值。针对有状态的类型的三维交互物件,活动部件在局部空间位置所对应的部件停止位置为离散的预设位置数值。比如,预设位置数值为0、0.5和1,则需要将在虚拟操作体离开后,所包括的活动部件的位置与这3个预设位置数值匹配,根据匹配结果从这3个数值中选取目标停止位置。
具体地,计算机设备会确定三维交互物件的类型,当三维交互物件的类型为无状态类型时,则说明该三维交互物件所包括的活动部件在虚拟操作体离开后需要立即停止运动,计算机设备则可以直接将部件当前位置作为该活动部件在局部空间中的部件停止位置。
在一个实施例中,当三维交互物件的类型为有状态类型时,则计算机设备可以根据活动部件的部件当前位置,为活动部件匹配出目标停止位置,作为该活动部件最终的部件停止位置。
在一个实施例中,活动部件相对于固定部件的相对位置,是活动部件在局部空间所处的位置。本实施例中,计算机设备可以输出活动部件所处的部件停止位置数值,根据该部件停止位置数值生成与三维交互物件对应的控制指令。
具体地,计算机设备可以根据位置数值与控制指令间的预设映射关系,根据该部件停止位置数值映射得到相应的控制指令。比如,部件初始位置数值为0时,所对应的音量为0,部件停止位置数值变为1时,则输出将音量调大至50的控制指令。
上述实施例中,通过对三维交互物件的类型的判断,来确定活动部件在局部空间中的部件停止位置,当三维交互物件的类型为无状态类型时,则将部件当前位置作为活动部件在局部空间中的部件停止位置,使得所确定的部件停止位置更准确。而且使得虚拟操作体对活动部件的运动具有绝对控制权,从而使得对活动部件的控制更加的灵活。
在一个实施例中,根据部件当前位置,确定活动部件在局部空间中的部件停止位置包括:确定三维交互物件的类型;当三维交互物件的类型为有状态类型时,则获取活动部件在局部空间中所对应的至少一个预设停止位置;选取距离部件当前位置最近的预设停止位置,作为活动部件在局部空间中的部件停止位置。
其中,有状态类型的三维交互物件,是指在虚拟操作体离开后,所包括的活动部件仍然会运动至根据位置匹配逻辑所匹配出的目标停止位置的三维交互物件。
Specifically, for the movable component of a stateful three-dimensional interactive object, at least one preset stop position is set in advance in the local space in the computer device. The computer device determines the type of the three-dimensional interactive object. When the type is the stateful type, the computer device acquires the at least one preset stop position corresponding to the movable component in the local space, matches the current component position against the acquired preset stop position(s), and selects the preset stop position closest to the current component position as the component stop position of the movable component in the local space.
In the above embodiment, when the type of the three-dimensional interactive object is the stateful type, the preset stop position closest to the current component position is selected as the component stop position of the movable component in the local space, so that even when there is some error in the virtual operating body's control of the movable component's movement, the target stop position of the movable component can still be determined accurately, making the output control instruction more accurate.
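The stop-position logic for the two object types can be sketched as follows; the preset stop values 0, 0.5 and 1 follow the earlier example, and the function signature is an illustrative assumption.

```python
def component_stop_position(current: float, stateful: bool,
                            preset_stops=(0.0, 0.5, 1.0)) -> float:
    """Determine where the movable component comes to rest after the
    virtual operating body leaves it.

    Stateless objects stop exactly where they are; stateful objects snap
    to the nearest of their preset stop positions."""
    if not stateful:
        return current
    return min(preset_stops, key=lambda stop: abs(stop - current))

# A stateful slider released at 0.37 snaps to the nearest preset stop, 0.5.
stop = component_stop_position(0.37, stateful=True)
```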
在一个实施例中,将活动部件停留于部件停止位置包括:获取部件当前位置和部件停止位置各在局部空间中目标坐标轴上的坐标值;根据部件当前位置和部件停止位置在目标坐标轴上的坐标值之间的差值,确定活动部件需运动的第二长度;控制活动部件由部件当前位置起,沿着目标坐标轴运动第二长度,以到达部件停止位置。
其中,局部空间中的目标坐标轴,是指三维交互物件的局部空间的坐标系中的目标坐标轴。可以理解,目标坐标轴所处的方向,是活动部件相对于固定部件能够运动的方向,即活动部件可以相对于固定部件在目标坐标轴方向上运动。
具体地,计算机设备可以将部件当前位置在目标坐标轴上的坐标值减去部件停止位置在目标坐标轴上的坐标值,得到两者之间的差值。计算机设备可以根据得到的差值确定活动部件需运动的第二长度。可以理解,计算机设备也可以将部件停止位置在目标坐标轴上的坐标值减去部件当前位置在目标坐标轴上的坐标值,得到两者之间的差值。进一步地,计算机设备可以控制活动部件由部件当前位置起,沿着目标坐标轴运动第二长度,以到达部件停止位置。
上述实施例中,通过控制活动部件沿着目标坐标轴运动第二长度以从部件当前位置到达部件停止位置,使得活动部件以最短路径到达部件停止位置,提高了控制效率。
如图10所示,在一个实施例中,提供了另一种虚拟现实环境下的控制方法,该方法具体包括以下步骤:
S1002,在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物 件。
在一个实施例中,步骤S1002包括:在虚拟现实环境中,确定用于构建三维交互物件的中心,根据中心建立需构建的三维交互物件的局部空间的坐标系,确定组成三维交互物件的活动部件和固定部件各在坐标系中的坐标,根据确定的坐标,构建活动部件和固定部件得到三维交互物件。
S1004,获取活动部件在三维交互物件的局部空间中的部件初始位置。
S1006,当虚拟操作体与活动部件接触时,确定虚拟操作体相对于三维交互物件的局部空间的运动初始位置。
在一个实施例中,步骤S1006之前,该方法还包括:监测虚拟操作体在虚拟现实环境中的移动。
S1008,当虚拟操作体与活动部件接触并运动时,获取虚拟操作体在局部空间中的当前运动位置。
S1010,获取当前运动位置和运动初始位置各在局部空间中目标坐标轴上的坐标值。
S1012,根据当前运动位置和运动初始位置在目标坐标轴上的坐标值之间的差值,确定活动部件需运动的第一长度和方向。
S1014,控制活动部件从部件初始位置起,沿着相对于固定部件固定的目标坐标轴,并朝该方向移动第一长度。
S1016,当虚拟操作体离开活动部件时,确定活动部件在局部空间中的部件当前位置。
S1018,确定三维交互物件的类型。当三维交互物件的类型为有状态类型时,则进入步骤S1020,当三维交互物件的类型为无状态类型时,则进入步骤S1030。
S1020,获取活动部件在局部空间中所对应的至少一个预设停止位置。
S1022,选取距离部件当前位置最近的预设停止位置,作为活动部件在局部空间中的部件停止位置。
S1024,获取部件当前位置和部件停止位置各在局部空间中目标坐标轴上 的坐标值。
S1026,根据部件当前位置和部件停止位置在目标坐标轴上的坐标值之间的差值,确定活动部件需运动的第二长度。
S1028,控制活动部件由部件当前位置起,沿着目标坐标轴运动第二长度,以到达部件停止位置。
S1030,将部件当前位置作为活动部件在局部空间中的部件停止位置。
S1032,按照活动部件相对于固定部件的相对位置,输出与三维交互物件对应的控制指令。
上述虚拟现实环境下的控制方法,通过将进行交互的功能菜单界面虚拟为三维交互物件,在虚拟现实环境中,通过虚拟操作体与三维交互物件的活动部件接触,移动虚拟操作体就可以直接控制活动部件跟随虚拟操作体运动,根据运动的活动部件相对于固定部件的相对位置,就可以直接输出与三维交互物件对应的控制命令,通过移动虚拟操作体就可以实现交互控制,不需要基于二维菜单界面进行瞄准选择并按键点击等繁琐的步骤来进行交互,提高了虚拟现实环境下的控制效率。
其次,通过虚拟操作体在局部空间的运动初始位置到当前运动位置的变化,控制活动部件从部件初始位置起相对于固定部件运动,能够实现活动部件相对于固定部件并跟随虚拟操作体的运动而运动,从而通过控制活动部件运动,来触发与三维交互物件对应的控制指令,以实现交互控制。
然后,沿着相对于固定部件固定的目标坐标轴和方向移动第一长度,可以使得活动部件的运动更加的准确,剔除了一些干扰运动,从而使得触发生成的控制指令更加的准确。
此外,当三维交互物件的类型为有状态类型时,选取距离部件当前位置最近的预设停止位置,作为活动部件在局部空间中的部件停止位置,即当虚拟操作体对活动部件的运动控制存在一些误差时,也能够准确地确定出活动部件的目标停止位置,从而使得输出的控制指令更准确。
在一个实施例中,提供了一种计算机设备,该计算机设备的内部结构可 如图2所示,该计算机设备包括虚拟现实环境下的控制装置,虚拟现实环境下的控制装置中包括各个模块,每个模块可全部或部分通过软件、硬件或其组合来实现。
如图11所示,在一个实施例中,提供了一种虚拟现实环境下的控制装置1100,该装置1100包括显示模块1102、移动监测模块1104、控制模块1106以及指令输出模块1108,其中:
显示模块1102,用于在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件。
移动监测模块1104,用于监测虚拟操作体在虚拟现实环境中的移动。
控制模块1106,用于当虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动。
指令输出模块1108,用于按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令。
在一个实施例中,显示模块1102还用于在虚拟现实环境中,确定用于构建三维交互物件的中心;根据所述中心建立需构建的三维交互物件的局部空间的坐标系;确定组成三维交互物件的活动部件和固定部件各在所述坐标系中的坐标;根据确定的坐标,构建所述活动部件和固定部件得到三维交互物件。
如图12所示,在一个实施例中,所述控制模块1106包括:
位置确定模块1106a,用于获取所述活动部件在所述三维交互物件的局部空间中的部件初始位置;当虚拟操作体移动至与所述活动部件接触时,确定所述虚拟操作体相对于所述三维交互物件的局部空间的运动初始位置;当所述虚拟操作体与所述活动部件接触并运动时,获取所述虚拟操作体在所述局部空间中的当前运动位置。
运动控制模块1106b,用于根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动。
在一个实施例中,所述运动控制模块1106b还用于获取所述当前运动位 置和所述运动初始位置各在所述局部空间中目标坐标轴上的坐标值;根据所述当前运动位置和所述运动初始位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第一长度和方向;控制所述活动部件从所述部件初始位置起,沿着相对于所述固定部件固定的所述目标坐标轴,并朝所述方向移动所述第一长度。
在一个实施例中,所述控制模块1106还用于当所述虚拟操作体离开所述活动部件时,确定所述活动部件在所述局部空间中的部件当前位置;根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置;将所述活动部件停留于所述部件停止位置。
在一个实施例中,所述控制模块1106还用于确定所述三维交互物件的类型;当所述三维交互物件的类型为无状态类型时,则将所述部件当前位置作为所述活动部件在所述局部空间中的部件停止位置。
在一个实施例中,所述控制模块1106还用于确定所述三维交互物件的类型;当所述三维交互物件的类型为有状态类型时,则获取所述活动部件在所述局部空间中所对应的至少一个预设停止位置;选取距离所述部件当前位置最近的预设停止位置,作为所述活动部件在所述局部空间中的部件停止位置。
在一个实施例中,所述控制模块1106还用于获取所述部件当前位置和所述部件停止位置各在所述局部空间中目标坐标轴上的坐标值;根据所述部件当前位置和所述部件停止位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第二长度;控制所述活动部件由所述部件当前位置起,沿着所述目标坐标轴运动所述第二长度,以到达所述部件停止位置。
在一个实施例中,本申请提供的虚拟现实环境下的控制装置可以实现为一种计算机可读指令的形式,所述计算机可读指令可在如图2所示的计算机设备上运行,所述计算机设备的非易失性存储介质可存储组成该虚拟现实环境下的控制装置的各个程序模块,比如,图11所示的显示模块1102、移动监测模块1104、控制模块1106、及指令输出模块1108。各个程序模块中包括计算机可读指令,所述计算机可读指令用于使所述计算机设备执行本说明书 中描述的本申请各个实施例的虚拟现实环境下的控制方法中的步骤,例如,所述计算机设备可以通过如图11所示的交互数据处理装置1100中的显示模块1102在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件,通过移动监测模块1104监测虚拟操作体在虚拟现实环境中的移动,并通过控制模块1106当虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动。计算机设备可以通过指令输出模块1108按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令。
在一个实施例中,提供了一种计算机设备,包括存储器和处理器,所述存储器中存储有计算机可读指令,所述计算机可读指令被处理器执行时,使得所述处理器执行如下步骤:在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件;监测虚拟操作体在所述虚拟现实环境中的移动;当虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动;按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令。
在一个实施例中,所述在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件,包括:在虚拟现实环境中,确定用于构建三维交互物件的中心;根据所述中心建立需构建的三维交互物件的局部空间的坐标系;确定组成三维交互物件的活动部件和固定部件各在所述坐标系中的坐标;根据确定的坐标,构建所述活动部件和固定部件得到三维交互物件。
在一个实施例中,所述当虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动包括:获取所述活动部件在所述三维交互物件的局部空间中的部件初始位置;当虚拟操作体移动至与所述活动部件接触时,确定所述虚拟操作体相对于所述三维交互物件的局部空间的运动初始位置;当所述虚拟操作体与所述活动部件接触并运动时,获取所述虚拟操作体在所述局部空间中的当前运动位置;根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件 初始位置起相对于所述固定部件运动。
在一个实施例中,所述根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动,包括:获取所述当前运动位置和所述运动初始位置各在所述局部空间中目标坐标轴上的坐标值;根据所述当前运动位置和所述运动初始位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第一长度和方向;控制所述活动部件从所述部件初始位置起,沿着相对于所述固定部件固定的所述目标坐标轴,并朝所述方向移动所述第一长度。
在一个实施例中,在所述按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令之前,计算机可读指令还使得处理器执行以下步骤:当所述虚拟操作体离开所述活动部件时,确定所述活动部件在所述局部空间中的部件当前位置;根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置;将所述活动部件停留于所述部件停止位置。
在一个实施例中,所述根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:确定所述三维交互物件的类型;当所述三维交互物件的类型为无状态类型时,则将所述部件当前位置作为所述活动部件在所述局部空间中的部件停止位置。
在一个实施例中,所述根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:确定所述三维交互物件的类型;当所述三维交互物件的类型为有状态类型时,则获取所述活动部件在所述局部空间中所对应的至少一个预设停止位置;选取距离所述部件当前位置最近的预设停止位置,作为所述活动部件在所述局部空间中的部件停止位置。
在一个实施例中,所述将所述活动部件停留于所述部件停止位置包括:获取所述部件当前位置和所述部件停止位置各在所述局部空间中目标坐标轴上的坐标值;根据所述部件当前位置和所述部件停止位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第二长度;控制所述活动部 件由所述部件当前位置起,沿着所述目标坐标轴运动所述第二长度,以到达所述部件停止位置。
在一个实施例中,提供了一种存储有计算机可读指令的存储介质,所述计算机可读指令被一个或多个处理器执行时,使得一个或多个处理器执行如下步骤:在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件;监测虚拟操作体在所述虚拟现实环境中的移动;当虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动;按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令。
在一个实施例中,所述在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件,包括:在虚拟现实环境中,确定用于构建三维交互物件的中心;根据所述中心建立需构建的三维交互物件的局部空间的坐标系;确定组成三维交互物件的活动部件和固定部件各在所述坐标系中的坐标;根据确定的坐标,构建所述活动部件和固定部件得到三维交互物件。
在一个实施例中,所述当虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动包括:获取所述活动部件在所述三维交互物件的局部空间中的部件初始位置;当虚拟操作体移动至与所述活动部件接触时,确定所述虚拟操作体相对于所述三维交互物件的局部空间的运动初始位置;当所述虚拟操作体与所述活动部件接触并运动时,获取所述虚拟操作体在所述局部空间中的当前运动位置;根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动。
在一个实施例中,所述根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动,包括:获取所述当前运动位置和所述运动初始位置各在所述局部空间中目标坐标轴上的坐标值;根据所述当前运动位置和所述运动初始位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第一长度和方向;控制 所述活动部件从所述部件初始位置起,沿着相对于所述固定部件固定的所述目标坐标轴,并朝所述方向移动所述第一长度。
在一个实施例中,在所述按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令之前,计算机可读指令还使得处理器执行以下步骤:当所述虚拟操作体离开所述活动部件时,确定所述活动部件在所述局部空间中的部件当前位置;根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置;将所述活动部件停留于部件停止位置。
在一个实施例中,所述根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:确定所述三维交互物件的类型;当所述三维交互物件的类型为无状态类型时,则将所述部件当前位置作为所述活动部件在所述局部空间中的部件停止位置。
在一个实施例中,所述根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:确定所述三维交互物件的类型;当所述三维交互物件的类型为有状态类型时,则获取所述活动部件在所述局部空间中所对应的至少一个预设停止位置;选取距离所述部件当前位置最近的预设停止位置,作为所述活动部件在所述局部空间中的部件停止位置。
在一个实施例中,所述将所述活动部件停留于所述部件停止位置包括:获取所述部件当前位置和所述部件停止位置各在所述局部空间中目标坐标轴上的坐标值;根据所述部件当前位置和所述部件停止位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第二长度;控制所述活动部件由所述部件当前位置起,沿着所述目标坐标轴运动所述第二长度,以到达所述部件停止位置。
可以理解,本申请说明书中的“计算机设备”即为本申请发明名称中的“设备”。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机可读指令来指令相关的硬件来完成,该计算机可读指 令可存储于一计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,前述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)等非易失性存储介质,或随机存储记忆体(Random Access Memory,RAM)等。
以上实施例的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本说明书记载的范围。
以上实施例仅表达了本申请的几种实施方式,其描述较为具体和详细,但并不能因此而理解为对发明专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本申请构思的前提下,还可以做出若干变形和改进,这些都属于本申请的保护范围。因此,本申请专利的保护范围应以所附权利要求为准。

Claims (20)

  1. 一种虚拟现实环境下的控制方法,包括:
    计算机设备在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件;
    所述计算机设备监测虚拟操作体在所述虚拟现实环境中的移动;
    当所述虚拟操作体移动至与所述活动部件接触后,所述计算机设备控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动;及
    所述计算机设备按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令。
  2. 根据权利要求1所述的方法,其特征在于,所述计算机设备在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件包括:
    计算机设备在虚拟现实环境中,确定用于构建三维交互物件的中心;
    所述计算机设备根据所述中心建立需构建的三维交互物件的局部空间的坐标系;
    所述计算机设备确定组成三维交互物件的活动部件和固定部件各在所述坐标系中的坐标;及
    所述计算机设备根据确定的坐标,构建所述活动部件和固定部件得到三维交互物件。
  3. 根据权利要求1所述的方法,其特征在于,所述当所述虚拟操作体移动至与所述活动部件接触后,所述计算机设备控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动包括:
    所述计算机设备获取所述活动部件在所述三维交互物件的局部空间中的部件初始位置;
    当所述虚拟操作体移动至与所述活动部件接触时,所述计算机设备确定所述虚拟操作体相对于所述三维交互物件的局部空间的运动初始位置;
    当所述虚拟操作体与所述活动部件接触并运动时,所述计算机设备获取所述虚拟操作体在所述局部空间中的当前运动位置;及
    所述计算机设备根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动。
  4. 根据权利要求3所述的方法,其特征在于,所述计算机设备根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动包括:
    所述计算机设备获取所述当前运动位置和所述运动初始位置各在所述局部空间中目标坐标轴上的坐标值;
    所述计算机设备根据所述当前运动位置和所述运动初始位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第一长度和方向;及
    所述计算机设备控制所述活动部件从所述部件初始位置起,沿着相对于所述固定部件固定的所述目标坐标轴,并朝所述方向移动所述第一长度。
  5. 根据权利要求1所述的方法,其特征在于,还包括:
    当所述虚拟操作体离开所述活动部件时,所述计算机设备确定所述活动部件在所述局部空间中的部件当前位置;
    所述计算机设备根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置;及
    所述计算机设备将所述活动部件停留于所述部件停止位置。
  6. 根据权利要求5所述的方法,其特征在于,所述计算机设备根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:
    所述计算机设备确定所述三维交互物件的类型;及
    当所述三维交互物件的类型为无状态类型时,所述计算机设备则将所述部件当前位置作为所述活动部件在所述局部空间中的部件停止位置。
  7. 根据权利要求5所述的方法,其特征在于,所述计算机设备根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:
    所述计算机设备确定所述三维交互物件的类型;
    当所述三维交互物件的类型为有状态类型时,所述计算机设备则获取所述活动部件在所述局部空间中所对应的至少一个预设停止位置;及
    所述计算机设备选取距离所述部件当前位置最近的预设停止位置,作为所述活动部件在所述局部空间中的部件停止位置。
  8. 根据权利要求7所述的方法,其特征在于,所述计算机设备将所述活动部件停留于所述部件停止位置包括:
    所述计算机设备获取所述部件当前位置和所述部件停止位置各在所述局部空间中目标坐标轴上的坐标值;
    所述计算机设备根据所述部件当前位置和所述部件停止位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第二长度;及
    所述计算机设备控制所述活动部件由所述部件当前位置起,沿着所述目标坐标轴运动所述第二长度,以到达所述部件停止位置。
  9. 一种计算机设备,包括存储器和一个或多个处理器,所述存储器中存储有计算机可读指令,所述计算机可读指令被处理器执行时,使得所述处理器执行如下步骤:
    在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件;
    监测虚拟操作体在所述虚拟现实环境中的移动;
    当所述虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动;及
    按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令。
  10. 根据权利要求9所述的计算机设备,其特征在于,所述在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件包括:
    在虚拟现实环境中,确定用于构建三维交互物件的中心;
    根据所述中心建立需构建的三维交互物件的局部空间的坐标系;
    确定组成三维交互物件的活动部件和固定部件各在所述坐标系中的坐标;及
    根据确定的坐标,构建所述活动部件和固定部件得到三维交互物件。
  11. 根据权利要求9所述的计算机设备,其特征在于,所述当所述虚拟 操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动包括:
    获取所述活动部件在所述三维交互物件的局部空间中的部件初始位置;
    当所述虚拟操作体移动至与所述活动部件接触时,确定所述虚拟操作体相对于所述三维交互物件的局部空间的运动初始位置;
    当所述虚拟操作体与所述活动部件接触并运动时,获取所述虚拟操作体在所述局部空间中的当前运动位置;及
    根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动。
  12. 根据权利要求11所述的计算机设备,其特征在于,所述根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动包括:
    获取所述当前运动位置和所述运动初始位置各在所述局部空间中目标坐标轴上的坐标值;
    根据所述当前运动位置和所述运动初始位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第一长度和方向;及
    控制所述活动部件从所述部件初始位置起,沿着相对于所述固定部件固定的所述目标坐标轴,并朝所述方向移动所述第一长度。
  13. 根据权利要求9所述的计算机设备,其特征在于,所述计算机可读指令被一个或多个处理器执行时,还使得所述一个或多个处理器执行如下步骤:
    当所述虚拟操作体离开所述活动部件时,确定所述活动部件在所述局部空间中的部件当前位置;
    根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置;及
    将所述活动部件停留于所述部件停止位置。
  14. 根据权利要求13所述的计算机设备,其特征在于,所述根据所述部 件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:
    确定所述三维交互物件的类型;及
    当所述三维交互物件的类型为无状态类型时,则
    将所述部件当前位置作为所述活动部件在所述局部空间中的部件停止位置。
  15. 根据权利要求13所述的计算机设备,其特征在于,所述根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置包括:
    确定所述三维交互物件的类型;
    当所述三维交互物件的类型为有状态类型时,则
    获取所述活动部件在所述局部空间中所对应的至少一个预设停止位置;及
    选取距离所述部件当前位置最近的预设停止位置,作为所述活动部件在所述局部空间中的部件停止位置。
  16. 根据权利要求15所述的计算机设备,其特征在于,所述将所述活动部件停留于所述部件停止位置包括:
    获取所述部件当前位置和所述部件停止位置各在所述局部空间中目标坐标轴上的坐标值;
    根据所述部件当前位置和所述部件停止位置在目标坐标轴上的坐标值之间的差值,确定所述活动部件需运动的第二长度;及
    控制所述活动部件由所述部件当前位置起,沿着所述目标坐标轴运动所述第二长度,以到达所述部件停止位置。
  17. 一种存储有计算机可读指令的存储介质,所述计算机可读指令被一个或多个处理器执行时,使得一个或多个处理器执行如下步骤:
    在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件;
    监测虚拟操作体在所述虚拟现实环境中的移动;
    当所述虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动;及
    按照所述活动部件相对于所述固定部件的相对位置,输出与所述三维交互物件对应的控制指令。
  18. 根据权利要求17所述的存储介质,其特征在于,所述在虚拟现实环境中,显示包括活动部件和固定部件的三维交互物件包括:
    在虚拟现实环境中,确定用于构建三维交互物件的中心;
    根据所述中心建立需构建的三维交互物件的局部空间的坐标系;
    确定组成三维交互物件的活动部件和固定部件各在所述坐标系中的坐标;及
    根据确定的坐标,构建所述活动部件和固定部件得到三维交互物件。
  19. 根据权利要求17所述的存储介质,其特征在于,所述当所述虚拟操作体移动至与所述活动部件接触后,控制所述活动部件相对于所述固定部件并跟随所述虚拟操作体运动包括:
    获取所述活动部件在所述三维交互物件的局部空间中的部件初始位置;
    当所述虚拟操作体移动至与所述活动部件接触时,确定所述虚拟操作体相对于所述三维交互物件的局部空间的运动初始位置;
    当所述虚拟操作体与所述活动部件接触并运动时,获取所述虚拟操作体在所述局部空间中的当前运动位置;及
    根据所述运动初始位置到所述当前运动位置的变化,控制所述活动部件从所述部件初始位置起相对于所述固定部件运动。
  20. 根据权利要求17所述的存储介质,其特征在于,所述计算机可读指令被一个或多个处理器执行时,还使得所述一个或多个处理器执行如下步骤:
    当所述虚拟操作体离开所述活动部件时,确定所述活动部件在所述局部空间中的部件当前位置;
    根据所述部件当前位置,确定所述活动部件在所述局部空间中的部件停止位置;及
    将所述活动部件停留于所述部件停止位置。
PCT/CN2018/105054 2017-10-10 2018-09-11 虚拟现实环境下的控制方法、设备和存储介质 WO2019072064A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710936132.9A CN109656432B (zh) 2017-10-10 2017-10-10 虚拟现实环境下的控制方法、装置、设备和存储介质
CN201710936132.9 2017-10-10

Publications (1)

Publication Number Publication Date
WO2019072064A1 true WO2019072064A1 (zh) 2019-04-18

Family

ID=66101216

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/105054 WO2019072064A1 (zh) 2017-10-10 2018-09-11 虚拟现实环境下的控制方法、设备和存储介质

Country Status (2)

Country Link
CN (1) CN109656432B (zh)
WO (1) WO2019072064A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106371573A (zh) * 2015-12-04 2017-02-01 北京智谷睿拓技术服务有限公司 触觉反馈的方法、装置和虚拟现实交互系统
CN106502401A (zh) * 2016-10-31 2017-03-15 宇龙计算机通信科技(深圳)有限公司 一种图像控制方法及装置
CN107045389A (zh) * 2017-04-14 2017-08-15 腾讯科技(深圳)有限公司 一种实现控制固定被控物的方法及装置
CN107198879A (zh) * 2017-04-20 2017-09-26 网易(杭州)网络有限公司 虚拟现实场景中的移动控制方法、装置及终端设备

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722249B (zh) * 2012-06-05 2016-03-30 上海鼎为电子科技(集团)有限公司 操控方法、操控装置及电子装置
CN105308536A (zh) * 2013-01-15 2016-02-03 厉动公司 用于显示器控制和定制姿势解释的动态用户交互
US10839572B2 (en) * 2016-03-10 2020-11-17 FlyInside, Inc. Contextual virtual reality interaction
CN106980362A (zh) * 2016-10-09 2017-07-25 阿里巴巴集团控股有限公司 基于虚拟现实场景的输入方法及装置
CN106909219B (zh) * 2017-02-15 2022-09-16 腾讯科技(深圳)有限公司 基于三维空间的交互控制方法和装置、智能终端
US20170178260A1 (en) * 2017-03-01 2017-06-22 Navitaire Llc Systems and methods for improved data integration in virtual reality architectures
CN107168541A (zh) * 2017-04-07 2017-09-15 北京小鸟看看科技有限公司 一种输入的实现方法和装置
CN107096223B (zh) * 2017-04-20 2020-09-25 网易(杭州)网络有限公司 虚拟现实场景中的移动控制方法、装置及终端设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106371573A (zh) * 2015-12-04 2017-02-01 北京智谷睿拓技术服务有限公司 触觉反馈的方法、装置和虚拟现实交互系统
CN106502401A (zh) * 2016-10-31 2017-03-15 宇龙计算机通信科技(深圳)有限公司 一种图像控制方法及装置
CN107045389A (zh) * 2017-04-14 2017-08-15 腾讯科技(深圳)有限公司 一种实现控制固定被控物的方法及装置
CN107198879A (zh) * 2017-04-20 2017-09-26 网易(杭州)网络有限公司 虚拟现实场景中的移动控制方法、装置及终端设备

Also Published As

Publication number Publication date
CN109656432B (zh) 2022-09-13
CN109656432A (zh) 2019-04-19

Similar Documents

Publication Publication Date Title
CN107913520B (zh) 信息处理方法、装置、电子设备及存储介质
JP6659644B2 (ja) 応用素子の代替的グラフィック表示の事前の生成による入力に対する低レイテンシの視覚的応答およびグラフィック処理ユニットの入力処理
US10332563B2 (en) Method and program for generating responsive image
JP5147933B2 (ja) 人−機械インターフェース装置システム及び方法
US20200218356A1 (en) Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments
US20130346858A1 (en) Remote Control of Audio Application and Associated Sub-Windows
WO2021218486A1 (en) Method and device for adjusting the control-display gain of a gesture controlled electronic device
WO2022237268A1 (zh) 头戴式显示设备的信息输入方法、装置及头戴式显示设备
JP7395070B1 (ja) ビデオ処理方法及び装置、電子設備及びコンピュータ読み取り可能な記憶媒体
CN108776544B (zh) 增强现实中的交互方法及装置、存储介质、电子设备
WO2020131592A1 (en) Mode-changeable augmented reality interface
WO2018019256A1 (zh) 一种虚拟现实系统及其视角调节方法和装置
WO2014194148A2 (en) Systems and methods involving gesture based user interaction, user interface and/or other features
CN111190826B (zh) 一种虚拟现实沉浸式追踪环境的测试方法、装置、存储介质及设备
CN106681506B (zh) 一种终端设备中非vr应用的交互方法及终端设备
US20140317549A1 (en) Method for Controlling Touchscreen by Using Virtual Trackball
US10073586B2 (en) Method and system for mouse pointer to automatically follow cursor
US20170249784A1 (en) Computer-readable non-transitory storage medium having stored therein information processing program, information processing system, information processing method, and information processing apparatus
JP5876600B1 (ja) 情報処理プログラム、及び情報処理方法
CN111124156A (zh) 移动终端的交互控制方法和移动终端
CN112987924A (zh) 用于设备交互的方法、装置、设备和存储介质
WO2019072064A1 (zh) 虚拟现实环境下的控制方法、设备和存储介质
Darbar et al. Text selection in ar-hmd using a smartphone as an input device
CN113457117B (zh) 游戏中的虚拟单位选取方法及装置、存储介质及电子设备
TWI737175B (zh) 物件操控方法、主機裝置及電腦可讀儲存媒體

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18866357

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18866357

Country of ref document: EP

Kind code of ref document: A1