WO2024077873A1 - Virtual scene scaling method and apparatus, storage medium, and electronic device - Google Patents

Virtual scene scaling method and apparatus, storage medium, and electronic device

Info

Publication number
WO2024077873A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
control wheel
scene
zoom control
virtual scene
Prior art date
Application number
PCT/CN2023/082287
Other languages
English (en)
French (fr)
Inventor
钟杰
Original Assignee
网易(杭州)网络有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 网易(杭州)网络有限公司
Publication of WO2024077873A1 publication Critical patent/WO2024077873A1/zh

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/63 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present disclosure relates to the field of virtual reality technology, and in particular to a virtual scene scaling method, a virtual scene scaling device, a computer-readable storage medium, and an electronic device.
  • Virtual reality technology can provide users with an immersive virtual scene experience.
  • the user can only move within a fixed-size space, and the virtual scene is usually presented to the user in a fixed size.
  • the virtual scene is scaled based on the user's gestures by collecting the user's gestures, but the scaling ratio is fixed each time, the operation is cumbersome and the degree of freedom is low.
  • the present disclosure provides a virtual scene scaling method, a virtual scene scaling device, a computer-readable storage medium and an electronic device, thereby at least to a certain extent solving the problem that the virtual scene scaling operation in the related art is cumbersome and has a low degree of freedom.
  • a virtual scene zooming method comprising: in response to a triggering operation of a virtual scene zooming event, displaying a virtual zoom control wheel in a user's field of view of the virtual scene; mapping a real position of a virtual reality handle on the virtual zoom control wheel; determining a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel; and zooming the virtual scene according to the target zoom ratio.
  • a virtual scene zooming device comprising: an event triggering module, for displaying a virtual zoom control wheel in a user's field of view of a virtual scene in response to a triggering operation of a virtual scene zooming event; a position mapping module, for mapping a real position of a virtual reality handle on the virtual zoom control wheel; a ratio determination module, for determining a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel; and a scene zooming module, for zooming the virtual scene according to the target zoom ratio.
  • a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the virtual scene scaling method described above is implemented.
  • an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the above-mentioned virtual scene scaling method by executing the executable instructions.
  • a virtual zoom control wheel is displayed in the user's field of view of the virtual scene; the real position of the virtual reality handle is mapped on the virtual zoom control wheel; according to the position movement operation of the virtual reality handle on the virtual zoom control wheel, the target zoom ratio of the virtual scene is determined; according to the target zoom ratio, the virtual scene is zoomed.
  • the virtual scene zooming of any ratio can be achieved within a certain range without jumping out of the virtual scene interface, which greatly improves the convenience and freedom of virtual scene zooming.
  • using the position movement operation of the virtual reality handle on the virtual zoom control wheel as a reference for determining the target zoom ratio can avoid the problem of recognition errors caused by irregular user operation gestures, improve the sensitivity of zoom ratio discrimination, and due to the unique certainty of the position of the virtual reality handle and the reference of the virtual zoom control wheel, the accuracy of virtual scene zooming can also be ensured.
  • FIG1 is a flowchart showing a virtual scene scaling method according to an exemplary embodiment of the present disclosure
  • FIG2A shows a display example of a virtual zoom control wheel according to an exemplary embodiment of the present disclosure
  • FIG2B shows a display example of a virtual zoom control wheel operation prompt mark in one exemplary embodiment of the present disclosure
  • FIG3 shows a flowchart of displaying a virtual zoom control wheel in one exemplary embodiment of the present disclosure
  • FIG4 shows a flowchart of determining a target scaling ratio of a virtual scene according to a position movement operation of a virtual reality handle in one of the exemplary embodiments of the present disclosure
  • FIG5 shows a flowchart of determining a target scaling ratio of a virtual scene according to a second position of a virtual reality handle in one exemplary embodiment of the present disclosure
  • FIG6 shows a flowchart of a preferred virtual scene zooming implementation in one of the exemplary embodiments of the present disclosure
  • FIG7 shows a structural block diagram of a virtual scene scaling device according to an exemplary embodiment of the present disclosure
  • FIG. 8 shows an electronic device for implementing the virtual scene scaling method according to one of the exemplary embodiments of the present disclosure.
  • jumping out of the virtual scene interface to adjust the virtual scene zoom will cause the virtual scene display to be interrupted, which in turn affects the user's immersive experience.
  • the entire virtual scene zoom process is relatively cumbersome.
  • the zoom ratio is fixed each time, resulting in a low degree of freedom in virtual scene zooming.
  • when the user performs gesture operations, a specific sensing device is required to recognize them. Since users' gestures vary from person to person in actual operation, non-standard movements may cause recognition errors, resulting in low zoom accuracy of the virtual scene.
  • an exemplary embodiment of the present disclosure provides a virtual scene scaling method, which can be executed by a virtual reality display terminal and can be applied to VR (Virtual Reality) interactive scenes such as SLG (Simulation Game, strategy game) that require zooming in and out of game maps.
  • VR Virtual Reality
  • SLG Simulation Game, strategy game
  • An exemplary embodiment of the present disclosure provides a virtual scene scaling method, as shown in FIG1 , which specifically includes the following steps S110 to S140:
  • Step S110 in response to a triggering operation of a virtual scene zooming event, displaying a virtual zoom control wheel in the user's field of view of the virtual scene;
  • Step S120 mapping the real position of the virtual reality handle onto the virtual zoom control wheel
  • Step S130 determining a target zoom ratio of the virtual scene according to the position movement operation of the virtual reality handle on the virtual zoom control wheel;
  • Step S140 scaling the virtual scene according to the target scaling ratio.
  • the virtual scene zooming of any proportion can be achieved within a certain range without jumping out of the virtual scene interface, which greatly improves the convenience and freedom of virtual scene zooming.
  • using the position movement operation of the virtual reality handle on the virtual zoom control wheel as a reference for determining the target zoom ratio can avoid the problem of recognition errors caused by irregular user operation gestures, improve the sensitivity of zoom ratio discrimination, and due to the unique certainty of the virtual reality handle position and the reference of the virtual zoom control wheel, it can also ensure the accuracy of virtual scene zooming.
  • Step S110 in response to a triggering operation of a virtual scene zooming event, a virtual zoom control wheel is displayed in the user's field of view of the virtual scene.
  • a virtual reality controller refers to a physical device that controls the interaction between a virtual scene and a real scene, such as a VR game controller.
  • a virtual scene scaling event refers to an event that enlarges or reduces the presentation ratio of a virtual scene.
  • the triggering operation of the virtual scene zooming event may be a user clicking a designated button on a virtual reality controller.
  • the triggering operation of the virtual scene zooming event may also be a long press operation of a user on a designated button on a virtual reality controller.
  • the triggering operation of the virtual scene zoom event can be specifically set by the developer and is not specifically limited here.
  • the user field of view of the virtual scene refers to the range of the virtual scene that the current user can view.
  • it can be the virtual scene picture that the user can view through VR glasses, or it can be the virtual scene picture displayed on a VR external display screen, as shown in the virtual scene picture 201 in Figure 2A.
  • the virtual zoom control wheel is used to assist the user in controlling the zooming of the virtual scene; it may be shown as the wheel logo 202 in FIG. 2A and may be displayed in the virtual scene picture 201.
  • a virtual zoom control wheel is displayed in the user's field of view of the virtual scene, as shown in FIG3 , which can be specifically implemented by the following steps S310 to S320:
  • Step S310 in response to the triggering operation of the virtual reality handle on the virtual scene zooming event, obtaining the first position of the virtual reality handle, the first position being the mapping position of the virtual reality handle in the virtual scene when the virtual scene zooming event is triggered;
  • Step S320 determining a wheel display position of a virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position.
  • the first position refers to the mapping position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered, and can be used as the initial position of the subsequent movement of the virtual reality handle, that is, the starting position of the zoom adjustment.
  • the position of the virtual reality handle in the real scene can be obtained based on the position sensor in the virtual reality handle, and the position of the virtual reality handle in the real scene is mapped to the virtual scene to obtain the first position of the virtual reality handle.
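  • For illustration only (not part of the disclosure), a minimal sketch of such a mapping, assuming the handle's position sensor reports a 3D tracking-space point and that tracking space relates to the virtual scene by a uniform scale and a translation; the names `Vec3` and `tracking_to_scene` and the transform itself are assumptions:

```python
from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float


def tracking_to_scene(real_pos: Vec3, scene_origin: Vec3, scene_scale: float) -> Vec3:
    """Map the handle's real (tracking-space) position into the virtual scene.

    Assumes the virtual scene is related to tracking space by a uniform scale
    followed by a translation; a real system may use a full pose matrix instead.
    """
    return Vec3(
        scene_origin.x + real_pos.x * scene_scale,
        scene_origin.y + real_pos.y * scene_scale,
        scene_origin.z + real_pos.z * scene_scale,
    )


# When the zoom event is triggered, the mapped point is recorded as the first position.
first_position = tracking_to_scene(Vec3(0.12, 1.35, -0.40), Vec3(0.0, 0.0, 0.0), 1.0)
```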
  • the wheel display position of the virtual zoom control wheel can be determined according to the first position of the virtual reality handle.
  • the wheel display position of the virtual zoom control wheel can be associated with the first position of the virtual reality handle.
  • the preset length can be used as the deviation distance of the center position of the virtual zoom control wheel relative to the first position; the preset orientation can be used as the deviation orientation of the center position of the virtual zoom control wheel relative to the first position; based on the deviation distance and deviation orientation of the center position of the virtual zoom control wheel relative to the first position, the wheel display position of the virtual zoom control wheel can be determined in the user's field of view of the virtual scene, and the virtual zoom control wheel can be displayed at the wheel display position.
  • the display position of the virtual zoom control wheel is determined according to the first position of the virtual reality handle, so that the user can more intuitively observe the starting position of the zoom adjustment, which can improve the convenience of user operation.
  • the above-mentioned determining the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position can be specifically achieved through the following steps: determining the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, a preset deviation distance of the center position of the virtual zoom control wheel relative to the first position, and a preset deviation orientation of the center position of the virtual zoom control wheel relative to the first position; and displaying the virtual zoom control wheel according to the center position of the virtual zoom control wheel and the preset wheel radius.
  • the preset wheel radius refers to a predetermined display radius of the virtual zoom control wheel.
  • the preset wheel radius can be used as a measure of the deviation distance of the center position of the virtual zoom control wheel relative to the first position, and the deviation distance does not exceed the preset wheel radius.
  • half of the preset wheel radius can be used as the preset deviation distance of the center position of the virtual zoom control wheel relative to the first position, that is, the distance between the first position and the center position of the virtual zoom control wheel is half of the preset wheel radius, so as to better meet the user's two different operation requirements of zooming in and zooming out.
  • the deviation direction of the center position of the virtual zoom control wheel relative to the first position can be pre-configured, and the center position of the virtual zoom control wheel can be located at any direction of the first position, which is not specifically limited here.
  • the lower right side of the first position may be used as the preset deviation direction of the center position of the virtual zoom control wheel relative to the first position, that is, the first position is located at the upper left side of the center position of the virtual zoom control wheel.
  • the position of the initial handle position marker 203 can be used as the first position.
  • the initial handle position marker 203 is located at the upper left of the center position of the virtual zoom control wheel, and the distance between the initial handle position marker 203 and the center position of the virtual zoom control wheel is half the radius of the wheel.
  • the virtual zoom control wheel is displayed according to the preset deviation distance and preset deviation direction of its center position relative to the first position, so that the size of the virtual zoom control wheel is fixed each time it is displayed and its position is fixed relative to the first position; as a result, each zoom changes within a certain range based on the previous zoom ratio, which improves the convenience of user operation (a sketch of this placement is given below).
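  • A minimal sketch of this wheel placement, for illustration only, assuming 2D coordinates in the user's view plane, a lower-right deviation direction as in the example above, and a deviation distance of half the preset wheel radius; the function and parameter names are hypothetical:

```python
import math
from dataclasses import dataclass


@dataclass
class Vec2:
    x: float
    y: float


def wheel_center_from_first_position(first_pos: Vec2, wheel_radius: float,
                                     deviation_dir: Vec2 = Vec2(1.0, -1.0)) -> Vec2:
    """Place the wheel center at half the preset wheel radius away from the
    first position, toward the preset deviation direction (lower-right by
    default), so that the first position lies at the upper-left of the center."""
    deviation_distance = wheel_radius / 2.0  # preset: half the wheel radius
    norm = math.hypot(deviation_dir.x, deviation_dir.y)
    return Vec2(first_pos.x + deviation_dir.x / norm * deviation_distance,
                first_pos.y + deviation_dir.y / norm * deviation_distance)


# The wheel is then drawn at this center with the preset wheel radius.
center = wheel_center_from_first_position(Vec2(0.0, 0.0), wheel_radius=0.2)
```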
  • Step S120 mapping the real position of the virtual reality handle onto the virtual zoom control wheel.
  • the real position of the virtual reality handle can be mapped on the virtual zoom control wheel, and the first position can be marked on the virtual zoom control wheel to facilitate the user's reference to this position when performing position movement operations.
  • the actual position of the virtual reality handle can also be mapped and marked on the virtual zoom control wheel in real time, as shown in the current handle position mark 204 shown in FIG. 2A , so that the user can observe the relative position state of the virtual reality handle in real time.
  • Step S130 determining a target zoom ratio of the virtual scene according to the position movement operation of the virtual reality handle on the virtual zoom control wheel.
  • the operation of moving the position of the virtual reality handle on the virtual zoom control wheel can be achieved by the user controlling the virtual reality handle to move in the real scene.
  • the target zoom ratio refers to the zoom ratio of the virtual scene to be adjusted.
  • when it is detected that the virtual reality handle moves toward the inside of the virtual zoom control wheel, the virtual scene can be zoomed out, and the greater the inward movement distance, the greater the zoom-out ratio; when it is detected that the virtual reality handle moves toward the outside of the virtual zoom control wheel, the virtual scene can be zoomed in, and the greater the outward movement distance, the greater the zoom-in ratio.
  • the above-mentioned determination of the target zoom ratio of the virtual scene according to the position movement operation of the virtual reality handle on the virtual zoom control wheel can be specifically implemented by the following steps S410 to S420 as shown in FIG4 :
  • Step S410 in response to the movement end operation of the virtual reality handle, determining a second position of the virtual reality handle, where the second position is a mapping position of the virtual reality handle in the virtual scene when the movement end operation is triggered by the virtual reality handle;
  • Step S420 determining a target scaling ratio of the virtual scene according to the second position of the virtual reality handle.
  • the second position is the mapping position of the virtual reality handle in the virtual scene when the virtual reality handle triggers the move end operation.
  • the move end operation of the virtual reality handle can lock the current zoom ratio, that is, the zoom ratio corresponding to the virtual scene when the virtual reality handle triggers the move end operation is used as the target zoom ratio.
  • the movement end operation of the virtual reality controller may be a click operation of the user on a designated button on the virtual reality controller again.
  • the movement end operation of the virtual reality controller may also be an operation in which the user releases a designated button on the virtual reality controller.
  • the target zoom ratio may be determined based on the relative position change between the first position and the second position of the virtual reality handle.
  • the wheel zoom scale corresponding to the second position of the virtual reality handle can be used as the target zoom ratio. It should be noted that, when the virtual zoom control wheel displays a zoom scale, the wheel zoom scale corresponding to the first position of the virtual reality handle can be used as the reference zoom scale: each time the virtual reality handle moves a specific distance toward the center of the wheel, the virtual scene can be reduced by a specific ratio, and each time it moves a specific distance away from the center, the virtual scene can be enlarged by a specific ratio.
  • in this way, the target zoom ratio of the virtual scene is determined from position information; since the position information is unique and certain, the problem of irregular user gestures can be avoided, thereby improving the accuracy of scene zooming.
  • a moving end operation prompt mark 205 may be displayed next to the virtual zoom control wheel to guide the user's operation.
  • the above-mentioned determination of the target scaling ratio of the virtual scene according to the second position of the virtual reality handle, as shown in FIG5 can be specifically implemented by the following steps S510 to S520:
  • Step S510 determining a relative position change of the virtual reality handle according to the second position and the first position of the virtual reality handle, wherein the first position is a mapping position of the virtual reality handle in the virtual scene when the virtual scene event is triggered;
  • Step S520 determining a target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene when the virtual scene zoom event is triggered.
  • the relative position change of the virtual reality handle refers to the position change of the second position relative to the first position.
  • the relative position change of the virtual reality handle can be represented by the difference between the second distance and the first distance, wherein the first distance is the distance between the first position and the center position of the virtual zoom control wheel, and the second distance is the distance between the second position and the center position of the virtual zoom control wheel; the relative position change of the virtual reality handle can also be represented by the distance between the second position and the first position.
  • the target scaling ratio can be freely controlled, so that the virtual scene can be scaled at any ratio within a certain range, which improves the freedom of virtual scene scaling to a certain extent.
  • the following further illustrates the process of determining the target scaling ratio by using the difference between the second distance and the first distance to represent the relative position change of the virtual reality handle.
  • the determining of the relative position change of the virtual reality handle based on the second position and the first position of the virtual reality handle can be specifically implemented in the following manner: determining the relative position change of the virtual reality handle based on a difference between a second distance and the first distance, the first distance being the distance between the first position and the center position of the virtual zoom control wheel, and the second distance being the distance between the second position and the center position of the virtual zoom control wheel.
  • S2-S1 can be used as the relative position change of the virtual reality handle, where S2 represents the distance between the second position and the center of the virtual zoom control wheel, and S1 represents the distance between the first position and the center of the virtual zoom control wheel.
  • if S2-S1 is a positive value, the virtual scene can be enlarged; if S2-S1 is a negative value, the virtual scene can be reduced; if S2 is equal to S1, the virtual scene is not scaled, that is, the target zoom ratio is the same as the zoom ratio of the virtual scene when the virtual scene zoom event is triggered.
  • the larger the absolute value of S2-S1, the larger the change in the target zoom ratio; the smaller the absolute value of S2-S1, the smaller the change.
  • the relative position change is determined based on the second distance and the first distance, which provides a parameter basis for determining the target zoom ratio, so that the user can freely and accurately adjust the zoom ratio of the virtual scene.
  • the relative change rate k*(S2-S1)*100% of the virtual reality handle can be calculated based on the relative position change S2-S1 of the virtual reality handle and the preset constant k.
  • the preset constant k can be determined by the developer based on a large number of tests, and can be used to adjust the scaling ratio corresponding to each unit of movement to meet the comfort of the user's visual perception.
  • one can be added to the relative change rate of the virtual reality handle, and the result multiplied by the scaling ratio of the virtual scene when the virtual scene scaling event was triggered, to obtain the target scaling ratio.
  • that is, denoting the scaling ratio at the time the zoom event was triggered by N1, the target scaling ratio N2 can be obtained by calculating N2 = (1+k*(S2-S1)*100%)*N1.
  • the maximum zoom ratio value allowed for zooming can also be set.
  • a maximum zoom-in prompt circle 206 or a maximum zoom-out prompt circle 207 as shown in FIG. 2B can be displayed in the virtual zoom control wheel to prompt the user that the maximum zoom ratio has been reached.
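  • A minimal sketch of the ratio computation described in the preceding paragraphs, for illustration only: S1 and S2 are the distances of the first and second positions from the wheel center, the relative change rate is k*(S2-S1), and the target ratio is (1+k*(S2-S1))*N1, clamped here to optional bounds standing in for the maximum zoom ratios; the constant k and the bounds are tuning values, not values given in the disclosure:

```python
import math


def target_zoom_ratio(first_pos, second_pos, wheel_center, current_ratio, k,
                      min_ratio=None, max_ratio=None):
    """Derive the target zoom ratio from the handle's movement on the wheel.

    S2 - S1 > 0 (handle moved away from the wheel center) -> zoom in;
    S2 - S1 < 0 (handle moved toward the wheel center)    -> zoom out;
    S2 == S1                                              -> ratio unchanged.
    """
    s1 = math.dist(first_pos, wheel_center)   # S1: first position to wheel center
    s2 = math.dist(second_pos, wheel_center)  # S2: second position to wheel center
    relative_change_rate = k * (s2 - s1)      # k * (S2 - S1) * 100%, as a fraction
    ratio = (1.0 + relative_change_rate) * current_ratio
    if max_ratio is not None:
        ratio = min(ratio, max_ratio)         # cap at the maximum zoom-in ratio
    if min_ratio is not None:
        ratio = max(ratio, min_ratio)         # cap at the maximum zoom-out ratio
    return ratio


# Example: the handle moved 0.05 away from the wheel center, k = 5, previous ratio 1.0
new_ratio = target_zoom_ratio((0.0, 0.10), (0.0, 0.15), (0.0, 0.0),
                              current_ratio=1.0, k=5.0, min_ratio=0.25, max_ratio=4.0)
print(new_ratio)  # 1.25
```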
  • the process may continue to refer to FIG. 1 and execute step S140 .
  • Step S140 scaling the virtual scene according to the target scaling ratio.
  • After determining the target scaling ratio, the virtual scene can be adaptively scaled according to the target scaling ratio, thereby achieving free adjustment of the presentation size of the virtual scene, improving the flexibility of VR interaction, and getting rid of the limitations of the real scene.
  • a zoom operation prompt mark 208 can also be added to the virtual zoom control wheel to assist in guiding the user to perform the zoom operation.
  • a preferred virtual scene scaling implementation flow chart is also provided, which may specifically include the following steps:
  • Step S601 in response to a triggering operation of a virtual scene zooming event, obtaining a first position of a virtual reality handle
  • Step S602 determining the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, the preset deviation distance of the center position of the virtual zoom control wheel relative to the first position, and the preset deviation direction of the center position of the virtual zoom control wheel relative to the first position;
  • Step S603 displaying a virtual zoom control wheel according to the center position of the virtual zoom control wheel and a preset wheel radius;
  • Step S604 mapping the real position of the virtual reality handle onto the virtual zoom control wheel
  • Step S605 in response to the movement end operation of the virtual reality handle, determining a second position of the virtual reality handle;
  • Step S606 determining a relative position change of the virtual reality handle according to a difference between the second distance and the first distance
  • Step S607 determining the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene when the virtual scene zoom event is triggered.
  • Step S608 scaling the virtual scene according to the target scaling ratio.
  • the first distance is the distance between the first position and the center position of the virtual zoom control wheel
  • the second distance is the distance between the second position and the center position of the virtual zoom control wheel
  • the virtual scene is zoomed based on the position of the virtual reality handle, so that the user can complete the interaction of virtual scene zooming with one hand, which greatly improves the freedom and convenience of virtual scene zooming.
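  • Read end to end, steps S601 to S608 form a single pass from trigger to scaling; a compact, self-contained sketch under the same assumptions as the snippets above (2D positions, hypothetical names, k and the wheel radius as tuning parameters) is:

```python
import math


def zoom_flow(first_pos, second_pos, current_ratio,
              wheel_radius=0.2, k=5.0, deviation_dir=(1.0, -1.0)):
    """One pass of the flow: place the wheel relative to the first position
    (S602-S603), measure the handle's movement relative to the wheel center
    (S605-S606), and derive the target zoom ratio (S607) to be applied (S608)."""
    # S602: the wheel center lies half a radius from the first position,
    # offset toward the preset deviation direction.
    norm = math.hypot(*deviation_dir)
    center = (first_pos[0] + deviation_dir[0] / norm * wheel_radius / 2.0,
              first_pos[1] + deviation_dir[1] / norm * wheel_radius / 2.0)
    # S606: relative position change = difference of the distances to the center.
    s1 = math.dist(first_pos, center)
    s2 = math.dist(second_pos, center)
    # S607: target ratio from the relative change rate and the current ratio.
    return (1.0 + k * (s2 - s1)) * current_ratio


print(zoom_flow(first_pos=(0.0, 0.0), second_pos=(0.08, -0.03), current_ratio=1.0))
```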
  • the exemplary embodiment of the present disclosure further provides a virtual scene scaling device.
  • the virtual scene scaling device 700 may include:
  • An event triggering module 710 for displaying a virtual zoom control wheel in the user's field of view of the virtual scene in response to a triggering operation of a virtual scene zooming event
  • a position mapping module 720 for mapping the real position of the virtual reality handle onto the virtual zoom control wheel
  • a ratio determination module 730 determines a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel;
  • the scene scaling module 740 is used to scale the virtual scene according to the target scaling ratio.
  • the event triggering module 710 may include: a first position acquisition module, used to respond to a triggering operation of a virtual scene zooming event, to acquire a first position of the virtual reality handle, the first position being a mapping position of the virtual reality handle in the virtual scene when the virtual scene zooming event is triggered; and a wheel display module, used to determine a wheel display position of a virtual zoom control wheel in a user's field of view of the virtual scene according to the first position of the virtual reality handle, and display the virtual zoom control wheel at the wheel display position.
  • the wheel display module may be configured to: determine the center position of the virtual zoom control wheel in the user's field of view of the virtual scene based on the first position of the virtual reality handle, a preset deviation distance of the center position of the virtual zoom control wheel relative to the first position, and a preset deviation orientation of the center position of the virtual zoom control wheel relative to the first position; and display the virtual zoom control wheel based on the center position of the virtual zoom control wheel and the preset wheel radius.
  • a preset deviation distance of the center position of the virtual zoom control wheel relative to the first position is half of a preset wheel radius.
  • the ratio determination module 730 may include: a second position determination module, used to determine a second position of the virtual reality handle in response to a move end operation of the virtual reality handle, the second position being a mapping position of the virtual reality handle in the virtual scene when the virtual reality handle triggers the move end operation; and a first ratio determination submodule, used to determine a target scaling ratio of the virtual scene according to the second position of the virtual reality handle.
  • the first ratio determination submodule may further include: a position change determination module, used to determine the relative position change of the virtual reality handle according to the second position and the first position of the virtual reality handle, the first position being the mapping position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; and a second ratio determination submodule, used to determine the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene when the virtual scene zoom event is triggered.
  • the position change determination module may be configured to determine the relative position change of the virtual reality handle based on a difference between a second distance and a first distance, wherein the first distance is the distance between the first position and the center position of the virtual zoom control wheel, and the second distance is the distance between the second position and the center position of the virtual zoom control wheel.
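  • For illustration only, a structural skeleton of the apparatus with the four modules as methods of one class; the method names are assumptions and the bodies are placeholders, since the disclosure specifies only the modules' responsibilities:

```python
class VirtualSceneScalingDevice:
    """Structural skeleton mirroring modules 710-740 of the apparatus."""

    def trigger_zoom_event(self, handle_real_pos):
        """Event triggering module (710): display the virtual zoom control wheel
        in the user's field of view in response to the triggering operation."""
        raise NotImplementedError

    def map_handle_position(self, handle_real_pos):
        """Position mapping module (720): map the handle's real position onto
        the virtual zoom control wheel."""
        raise NotImplementedError

    def determine_target_ratio(self, move_end_pos):
        """Ratio determination module (730): determine the target zoom ratio from
        the handle's position movement operation on the wheel."""
        raise NotImplementedError

    def scale_scene(self, target_ratio):
        """Scene scaling module (740): scale the virtual scene to the target ratio."""
        raise NotImplementedError
```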
  • the exemplary embodiments of the present disclosure also provide a computer-readable storage medium on which a program product capable of implementing the virtual scene scaling method described above in this specification is stored.
  • various aspects of the present disclosure may also be implemented in the form of a program product, which includes a program code, and when the program product is run on an electronic device, the program code is used to enable the electronic device to perform the steps described in the above "Exemplary Method" section of this specification according to various exemplary embodiments of the present disclosure.
  • the program product stored on the computer-readable storage medium can enable the electronic device to perform the following steps:
  • a virtual zoom control wheel is displayed in a user's field of view of the virtual scene
  • the above-mentioned display of a virtual zoom control wheel in the user's field of view of the virtual scene in response to the triggering operation of the virtual scene zoom event can be achieved through the following steps: in response to the triggering operation of the virtual scene zoom event, obtaining a first position of the virtual reality handle, the first position being a mapping position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; determining a wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position.
  • the above-mentioned determining the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position can be achieved by the following steps: determining the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, a preset deviation distance of the center position of the virtual zoom control wheel relative to the first position, and a preset deviation orientation of the center position of the virtual zoom control wheel relative to the first position; and displaying the virtual zoom control wheel according to the center position of the virtual zoom control wheel and the preset wheel radius.
  • the preset deviation distance of the center position of the virtual zoom control wheel relative to the first position is half of the preset wheel radius.
  • the above-mentioned determination of the target zoom ratio of the virtual scene based on the position movement operation of the virtual reality handle on the virtual zoom control wheel can be achieved by the following steps: in response to the movement end operation of the virtual reality handle, determining the second position of the virtual reality handle, the second position being the mapping position of the virtual reality handle in the virtual scene when the virtual reality handle triggers the movement end operation; determining the target zoom ratio of the virtual scene according to the second position of the virtual reality handle.
  • the above-mentioned determination of the target scaling ratio of the virtual scene based on the second position of the virtual reality handle can be achieved by the following steps: determining the relative position change of the virtual reality handle based on the second position and the first position of the virtual reality handle, the first position being the mapping position of the virtual reality handle in the virtual scene when the virtual scene event is triggered; determining the target scaling ratio of the virtual scene based on the relative position change of the virtual reality handle and the scaling ratio of the virtual scene when the virtual scene scaling event is triggered.
  • the above-mentioned determination of the relative position change of the virtual reality handle based on the second position and the first position of the virtual reality handle can be achieved by the following steps: determining the relative position change of the virtual reality handle based on the difference between the second distance and the first distance, the first distance being the distance between the first position and the center position of the virtual zoom control wheel, and the second distance being the distance between the second position and the center position of the virtual zoom control wheel.
  • the virtual scene can be zoomed in any proportion within a certain range without jumping out of the virtual scene interface, which greatly improves the convenience and freedom of virtual scene zooming.
  • using the position movement operation of the virtual reality handle on the virtual zoom control wheel as a reference for determining the target zoom ratio can avoid recognition errors caused by non-standard user operation gestures, thereby improving the sensitivity of zoom ratio discrimination, and due to the unique certainty of the virtual reality handle position and the referenceability of the virtual zoom control wheel, the accuracy of virtual scene zooming can also be ensured.
  • the program product may be in the form of a portable compact disk read-only memory (CD-ROM) and include program code, and may be run on an electronic device, such as a personal computer.
  • CD-ROM portable compact disk read-only memory
  • the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium containing or storing a program that may be used by or in conjunction with an instruction execution system, apparatus, or device.
  • the program product may use any combination of one or more readable media.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • the readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device or device, or any combination of the above. More specific examples of readable storage media (a non-exhaustive list) include: an electrical connection with one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • Computer readable signal media may include data signals propagated in baseband or as part of a carrier wave, which carry readable program code. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium, which can send, propagate, or transmit a program for use by or in conjunction with an instruction execution system, apparatus, or device.
  • the program code contained on the readable medium can be transmitted using any appropriate medium, including but not limited to wireless, wired, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
  • Program code for performing the operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, C++, etc., and conventional procedural programming languages such as "C" or similar programming languages.
  • the program code may be executed entirely on the user computing device, partially on the user device, as a separate software package, partially on the user computing device and partially on a remote computing device, or entirely on a remote computing device or server.
  • the remote computing device may be connected to the user computing device through any type of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
  • LAN local area network
  • WAN wide area network
  • Internet service provider e.g., AT&T, MCI, Sprint, EarthLink, etc.
  • the exemplary embodiment of the present disclosure also provides an electronic device capable of implementing the above-mentioned virtual scene scaling method.
  • the electronic device 800 according to this exemplary embodiment of the present disclosure is described below with reference to FIG8.
  • the electronic device 800 shown in FIG8 is only an example and should not bring any limitation to the functions and scope of use of the embodiments of the present disclosure.
  • electronic device 800 may be in the form of a general-purpose computing device.
  • Components of electronic device 800 may include, but are not limited to: at least one processing unit 810, at least one storage unit 820, a bus 830 connecting different system components (including storage unit 820 and processing unit 810), and a display unit 840.
  • the storage unit 820 stores program codes, which can be executed by the processing unit 810, so that the processing unit 810 executes the steps according to various exemplary embodiments of the present disclosure described in the above “Exemplary Method” section of this specification.
  • processing unit 810 may perform the following steps:
  • a virtual zoom control wheel is displayed in a user's field of view of the virtual scene
  • the above-mentioned display of a virtual zoom control wheel in the user's field of view of the virtual scene in response to the triggering operation of the virtual scene zoom event can be achieved through the following steps: in response to the triggering operation of the virtual scene zoom event, obtaining a first position of the virtual reality handle, the first position being a mapping position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; determining a wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position.
  • the above-mentioned determining the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position, can be achieved by the following steps: determining the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, a preset deviation distance of the center position of the virtual zoom control wheel relative to the first position, and a preset deviation orientation of the center position of the virtual zoom control wheel relative to the first position; and displaying the virtual zoom control wheel according to the center position of the virtual zoom control wheel and the preset wheel radius.
  • a preset deviation distance of the center position of the virtual zoom control wheel relative to the first position is half of a preset wheel radius.
  • the above-mentioned determination of the target zoom ratio of the virtual scene based on the position movement operation of the virtual reality handle on the virtual zoom control wheel can be achieved by the following steps: in response to the movement end operation of the virtual reality handle, determining the second position of the virtual reality handle, the second position being the mapping position of the virtual reality handle in the virtual scene when the virtual reality handle triggers the movement end operation; determining the target zoom ratio of the virtual scene according to the second position of the virtual reality handle.
  • the above-mentioned determination of the target scaling ratio of the virtual scene based on the second position of the virtual reality handle can be achieved by the following steps: determining the relative position change of the virtual reality handle based on the second position and the first position of the virtual reality handle, the first position being the mapping position of the virtual reality handle in the virtual scene when the virtual scene event is triggered; determining the target scaling ratio of the virtual scene based on the relative position change of the virtual reality handle and the scaling ratio of the virtual scene when the virtual scene scaling event is triggered.
  • the above-mentioned determination of the relative position change of the virtual reality handle based on the second position and the first position of the virtual reality handle can be achieved by the following steps: determining the relative position change of the virtual reality handle based on the difference between the second distance and the first distance, the first distance being the distance between the first position and the center position of the virtual zoom control wheel, and the second distance being the distance between the second position and the center position of the virtual zoom control wheel.
  • the virtual scene can be zoomed in any proportion within a certain range without jumping out of the virtual scene interface, which greatly improves the convenience and freedom of virtual scene zooming.
  • using the position movement operation of the virtual reality handle on the virtual zoom control wheel as a reference for determining the target zoom ratio can avoid recognition errors caused by non-standard user operation gestures, thereby improving the sensitivity of zoom ratio discrimination, and due to the unique certainty of the virtual reality handle position and the referenceability of the virtual zoom control wheel, the accuracy of virtual scene zooming can also be ensured.
  • the storage unit 820 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 821 and/or a cache memory unit 822 , and may further include a read-only memory unit (ROM) 823 .
  • RAM random access memory
  • ROM read-only memory
  • the storage unit 820 may also include a program/utility 824 having a set (at least one) of program modules 825, such program modules 825 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which or some combination may include an implementation of a network environment.
  • Bus 830 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
  • the electronic device 800 may also communicate with one or more external devices 900 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), and with one or more devices that enable a user to interact with the electronic device 800.
  • the electronic device 800 can also communicate with any device (e.g., a router, a modem, etc.) that enables it to communicate with one or more other computing devices. Such communication can be performed through an input/output (I/O) interface 850.
  • I/O input/output
  • the electronic device 800 can also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network, such as the Internet) through a network adapter 860.
  • the network adapter 860 communicates with other modules of the electronic device 800 through a bus 830.
  • other hardware and/or software modules can be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID (Redundant Arrays of Independent Disks, Redundant Array of Independent Disks) systems, tape drives, and data backup storage systems, etc.
  • the technical solution according to the implementation of the present disclosure can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a mobile hard disk, etc.) or on a network, including several instructions to enable a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the exemplary implementation of the present disclosure.
  • a non-volatile storage medium which can be a CD-ROM, a USB flash drive, a mobile hard disk, etc.
  • a computing device which can be a personal computer, a server, a terminal device, or a network device, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A virtual scene scaling method and apparatus, a storage medium, and an electronic device. The virtual scene scaling method comprises: in response to a triggering operation of a virtual scene scaling event, displaying a virtual zoom control wheel in the user's field of view of the virtual scene (S110); mapping the real position of a virtual reality handle onto the virtual zoom control wheel (S120); determining a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel (S130); and scaling the virtual scene according to the target zoom ratio (S140). Through the interaction between the virtual reality handle and the virtual zoom control wheel, the present disclosure achieves scaling of the virtual scene and improves the freedom and convenience of virtual scene scaling. (FIG. 1)

Description

Virtual scene scaling method and apparatus, storage medium, and electronic device
Cross-Reference to Related Applications
The present disclosure claims priority to the Chinese patent application No. 202211229133.7, entitled "Virtual scene scaling method and apparatus, storage medium, and electronic device", filed on October 9, 2022, the entire contents of which are incorporated into the present disclosure by reference.
Technical Field
The present disclosure relates to the field of virtual reality technology, and in particular to a virtual scene scaling method, a virtual scene scaling apparatus, a computer-readable storage medium, and an electronic device.
Background
Virtual reality technology can provide users with an immersive virtual scene experience. During a virtual reality experience, constrained by the real environment, the user can only move within a space of fixed size, and the virtual scene is usually also presented to the user at a fixed size. To adapt to virtual scenes covering a larger spatial range, the virtual scene usually needs to be scaled so as to break free from the constraints imposed by the real environment.
In the related art, scaling a virtual scene usually requires first leaving the virtual scene interface and then opening a system settings interface to configure scene scaling, so as to adjust the presentation size of the virtual scene. For virtual reality games, this way of scaling the virtual scene is too cumbersome and strongly interrupts the game. In addition, in the related art, the virtual scene is also scaled based on the user's gestures by collecting those gestures, but the scaling ratio is fixed each time, the operation is cumbersome, and the degree of freedom is low.
It should be noted that the information disclosed in the Background section above is only intended to enhance understanding of the background of the present disclosure, and may therefore include information that does not constitute prior art already known to a person of ordinary skill in the art.
Summary
The present disclosure provides a virtual scene scaling method, a virtual scene scaling apparatus, a computer-readable storage medium, and an electronic device, so as to solve, at least to a certain extent, the problem in the related art that the virtual scene scaling operation is cumbersome and has a low degree of freedom.
Other features and advantages of the present disclosure will become apparent from the following detailed description, or will be learned in part through practice of the present disclosure.
According to a first aspect of the present disclosure, a virtual scene scaling method is provided, the method comprising: in response to a triggering operation of a virtual scene scaling event, displaying a virtual zoom control wheel in the user's field of view of the virtual scene; mapping the real position of a virtual reality handle onto the virtual zoom control wheel; determining a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel; and scaling the virtual scene according to the target zoom ratio.
According to a second aspect of the present disclosure, a virtual scene scaling apparatus is provided, the apparatus comprising: an event triggering module, configured to display a virtual zoom control wheel in the user's field of view of a virtual scene in response to a triggering operation of a virtual scene scaling event; a position mapping module, configured to map the real position of a virtual reality handle onto the virtual zoom control wheel; a ratio determination module, configured to determine a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel; and a scene scaling module, configured to scale the virtual scene according to the target zoom ratio.
According to a third aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored, and the computer program, when executed by a processor, implements the virtual scene scaling method described above.
According to a fourth aspect of the present disclosure, an electronic device is provided, comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the virtual scene scaling method described above by executing the executable instructions.
The technical solutions of the present disclosure have the following beneficial effects:
In the above virtual scene scaling process, in response to a triggering operation of a virtual scene scaling event, a virtual zoom control wheel is displayed in the user's field of view of the virtual scene; the real position of the virtual reality handle is mapped onto the virtual zoom control wheel; the target zoom ratio of the virtual scene is determined according to the position movement operation of the virtual reality handle on the virtual zoom control wheel; and the virtual scene is scaled according to the target zoom ratio. On the one hand, based on the position movement operation of the virtual reality handle on the virtual zoom control wheel, the virtual scene can be scaled by any ratio within a certain range without leaving the virtual scene interface, which greatly improves the convenience and freedom of virtual scene scaling. On the other hand, using the position movement operation of the virtual reality handle on the virtual zoom control wheel as the basis for determining the target zoom ratio avoids recognition errors caused by non-standard user gestures and improves the sensitivity of zoom-ratio discrimination; moreover, because the position of the virtual reality handle is uniquely determined and the virtual zoom control wheel provides a reference, the accuracy of virtual scene scaling can also be ensured.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings herein are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure, and together with the specification serve to explain the principles of the present disclosure. Obviously, the drawings in the following description are only some embodiments of the present disclosure, and other drawings can be obtained from them by a person of ordinary skill in the art without creative effort.
FIG. 1 shows a flowchart of a virtual scene zooming method according to one of the exemplary embodiments of the present disclosure;
FIG. 2A shows an example display of a virtual zoom control wheel according to one of the exemplary embodiments of the present disclosure;
FIG. 2B shows an example display of operation prompt indicators for the virtual zoom control wheel according to one of the exemplary embodiments of the present disclosure;
FIG. 3 shows a flowchart of displaying a virtual zoom control wheel according to one of the exemplary embodiments of the present disclosure;
FIG. 4 shows a flowchart of determining a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle according to one of the exemplary embodiments of the present disclosure;
FIG. 5 shows a flowchart of determining a target zoom ratio of the virtual scene according to a second position of the virtual reality handle according to one of the exemplary embodiments of the present disclosure;
FIG. 6 shows a flowchart of a preferred implementation of virtual scene zooming according to one of the exemplary embodiments of the present disclosure;
FIG. 7 shows a structural block diagram of a virtual scene zooming apparatus according to one of the exemplary embodiments of the present disclosure;
FIG. 8 shows an electronic device for implementing the above virtual scene zooming method according to one of the exemplary embodiments of the present disclosure.
DETAILED DESCRIPTION
Example embodiments will now be described more fully with reference to the accompanying drawings. However, example embodiments can be implemented in various forms and should not be construed as being limited to the examples set forth herein; rather, these embodiments are provided so that the present disclosure will be thorough and complete, and will fully convey the concepts of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a full understanding of the embodiments of the present disclosure. However, those skilled in the art will appreciate that the technical solutions of the present disclosure may be practiced while omitting one or more of the specific details, or that other methods, components, devices, steps, etc. may be employed. In other cases, well-known technical solutions are not shown or described in detail in order to avoid obscuring aspects of the present disclosure.
The terms "a", "an", "the", and "said" are used in this specification to indicate the presence of one or more elements/components/etc.; the terms "include" and "have" are used in an open-ended, inclusive sense and mean that additional elements/components/etc. may exist in addition to those listed; and the terms "first", "second", etc. are used only as labels, not as limitations on the number of their objects.
In addition, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, so repeated descriptions of them will be omitted. Some of the block diagrams shown in the drawings are functional entities and do not necessarily correspond to physically or logically independent entities. These functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the related art, exiting the virtual scene interface to adjust virtual scene zoom interrupts the display of the virtual scene and thus affects the user's immersive experience, making the whole zooming process rather cumbersome. In addition, when the virtual scene is adjusted by capturing user gestures, the ratio of each zoom step is fixed, so the freedom of virtual scene zooming is low; moreover, gesture operations must be recognized by a dedicated sensing device, and since gestures vary from person to person in practice, non-standard movements may lead to recognition errors, resulting in low zooming accuracy.
In view of one or more of the above problems, an exemplary embodiment of the present disclosure provides a virtual scene zooming method that can be executed by a virtual reality display terminal and is applicable to VR (Virtual Reality) interaction scenarios that require zooming a game map in and out, such as SLG (Simulation Game) strategy games.
An exemplary embodiment of the present disclosure provides a virtual scene zooming method which, as shown in FIG. 1, specifically includes the following steps S110 to S140:
Step S110: in response to a trigger operation of a virtual scene zoom event, display a virtual zoom control wheel in the user's field of view of the virtual scene;
Step S120: map the real-world position of the virtual reality handle onto the virtual zoom control wheel;
Step S130: determine a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel;
Step S140: zoom the virtual scene according to the target zoom ratio.
In the above virtual scene zooming process, on the one hand, based on the position movement operation of the virtual reality handle on the virtual zoom control wheel, the virtual scene can be zoomed by an arbitrary ratio within a certain range without leaving the virtual scene interface, which greatly improves the convenience and freedom of virtual scene zooming. On the other hand, using the position movement operation of the virtual reality handle on the virtual zoom control wheel as the basis for determining the target zoom ratio avoids recognition errors caused by non-standard user gestures and improves the sensitivity of zoom ratio discrimination; and because the position of the virtual reality handle is uniquely determined and the virtual zoom control wheel provides a visual reference, the accuracy of virtual scene zooming is also ensured.
Each of the steps in FIG. 1 is described in detail below.
Step S110: in response to a trigger operation of a virtual scene zoom event, display a virtual zoom control wheel in the user's field of view of the virtual scene.
The virtual reality handle refers to a physical device that controls the interaction between the virtual scene and the real scene, for example a VR game controller. A virtual scene zoom event refers to an event that enlarges or reduces the presentation ratio of the virtual scene.
For example, the trigger operation of the virtual scene zoom event may be a click operation by the user on a designated button of the virtual reality handle.
For example, the trigger operation of the virtual scene zoom event may also be a long-press operation by the user on a designated button of the virtual reality handle.
It should be noted that, in practical applications, the trigger operation of the virtual scene zoom event can be set by developers and is not specifically limited here.
The user's field of view of the virtual scene refers to the range of the virtual scene that the current user can see. For example, it may be the virtual scene picture that the user views through VR glasses, or the virtual scene picture shown on an external VR display, as shown by the virtual scene picture 201 in FIG. 2A.
The virtual zoom control wheel is used to assist the user in controlling the zooming of the virtual scene; it may be shown as the wheel indicator 202 in FIG. 2A and may be displayed in the virtual scene picture 201.
In an optional embodiment, displaying a virtual zoom control wheel in the user's field of view of the virtual scene in response to the trigger operation of the virtual scene zoom event by the virtual reality handle may, as shown in FIG. 3, be implemented by the following steps S310 to S320:
Step S310: in response to the trigger operation of the virtual scene zoom event by the virtual reality handle, obtain a first position of the virtual reality handle, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered;
Step S320: determine, according to the first position of the virtual reality handle, a wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene, and display the virtual zoom control wheel at the wheel display position.
The first position refers to the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered, and can serve as the initial position for subsequent handle movement, that is, the starting position of the zoom adjustment.
When the virtual scene zoom event is triggered by the virtual reality handle, the position of the virtual reality handle in the real scene can be obtained from the position sensor in the handle, and that real-scene position can be mapped into the virtual scene to obtain the first position of the virtual reality handle.
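As an illustration of this mapping step, the following is a minimal sketch (not taken from the original disclosure) of how a handle position reported by a position sensor in tracking space might be transformed into the virtual scene; the tracking_origin and world_scale parameters are assumptions introduced for the example, and a real runtime would typically also apply a rotation.

```python
import numpy as np

def map_handle_to_scene(handle_pos_real, tracking_origin, world_scale=1.0):
    """Map a handle position from real (tracking) space into the virtual scene.

    handle_pos_real : (3,) position reported by the handle's position sensor.
    tracking_origin : (3,) position of the tracking-space origin in the scene.
    world_scale     : current zoom ratio of the virtual scene (assumed uniform).
    """
    handle_pos_real = np.asarray(handle_pos_real, dtype=float)
    tracking_origin = np.asarray(tracking_origin, dtype=float)
    # Uniform scale plus translation; rotation is omitted to keep the sketch short.
    return tracking_origin + world_scale * handle_pos_real

# The first position is simply the mapped position captured at the moment
# the zoom event is triggered.
first_position = map_handle_to_scene([0.10, 1.20, -0.30], tracking_origin=[0.0, 0.0, 0.0])
print(first_position)
```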
After the first position of the virtual reality handle is obtained, the wheel display position of the virtual zoom control wheel can be determined according to the first position of the virtual reality handle.
An association may be established between the wheel display position of the virtual zoom control wheel and the first position of the virtual reality handle.
When performing step S320, a preset length may be used as the offset distance of the center position of the virtual zoom control wheel relative to the first position, and a preset direction may be used as the offset direction of the center position of the virtual zoom control wheel relative to the first position; the wheel display position of the virtual zoom control wheel can then be determined in the user's field of view of the virtual scene according to this offset distance and offset direction of the wheel center relative to the first position, and the virtual zoom control wheel is displayed at the wheel display position.
In the steps shown in FIG. 3, the display position of the virtual zoom control wheel is determined according to the first position of the virtual reality handle, so that the user can observe the starting position of the zoom adjustment quite intuitively, which improves the convenience of user operation.
In an optional embodiment, determining the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position, may be implemented by the following steps: determine the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, a preset offset distance of the wheel center relative to the first position, and a preset offset direction of the wheel center relative to the first position; and display the virtual zoom control wheel according to its center position and a preset wheel radius.
The preset wheel radius refers to a predetermined display radius of the virtual zoom control wheel. The preset wheel radius can serve as the yardstick for the offset distance of the wheel center relative to the first position, and this offset distance does not exceed the preset wheel radius.
For example, half of the preset wheel radius may be used as the preset offset distance of the wheel center relative to the first position, that is, the distance between the first position and the center of the virtual zoom control wheel is half the preset wheel radius, so as to better accommodate both the zoom-out and the zoom-in operation.
The offset direction of the wheel center relative to the first position can be configured in advance; the wheel center may be located in any direction from the first position, which is not specifically limited here.
For example, the lower right of the first position may be used as the preset offset direction of the wheel center relative to the first position, that is, the first position is located to the upper left of the wheel center.
As shown in FIG. 2A, the position of the initial handle position indicator 203 can be used as the first position; the initial handle position indicator 203 is located to the upper left of the wheel center, and the distance between the indicator 203 and the wheel center is half the wheel radius.
It should be noted that the above parameter configuration of the offset distance and offset direction is merely an example; in practical applications, they can be set according to actual needs or user habits and are not specifically limited here.
In the above process, the virtual zoom control wheel is displayed according to the preset offset distance and preset offset direction of its center relative to the first position, so that the size of the wheel determined each time is absolutely fixed and its position is fixed relative to the first position; each zoom is therefore a variation within a certain range based on the previous zoom ratio, which improves the convenience of user operation.
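The wheel placement just described can be sketched as follows. This is an illustrative reading of the preset offsets (half the wheel radius, toward the lower right of the first position), not code from the disclosure; the concrete axis convention is an assumption made for the example.

```python
import numpy as np

def place_zoom_wheel(first_position, wheel_radius, offset_direction=None):
    """Return the wheel center position given the first position of the handle.

    The offset distance is fixed to half the preset wheel radius, and the default
    offset direction points to the lower right of the first position (assumed axis
    convention: +x right, +y up).
    """
    first_position = np.asarray(first_position, dtype=float)
    if offset_direction is None:
        offset_direction = np.array([1.0, -1.0, 0.0])  # lower right, assumed axes
    offset_direction = offset_direction / np.linalg.norm(offset_direction)
    return first_position + (wheel_radius / 2.0) * offset_direction

center = place_zoom_wheel(first_position=[0.10, 1.20, -0.30], wheel_radius=0.20)
print(center)  # the first position ends up to the upper left of this center
```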
Step S120: map the real-world position of the virtual reality handle onto the virtual zoom control wheel.
As shown by the initial handle position indicator 203 in FIG. 2A, when the virtual scene zoom event is triggered, the real-world position of the virtual reality handle can be mapped onto the virtual zoom control wheel, and the first position can be marked on the virtual zoom control wheel, so that the user has a position reference when performing the position movement operation.
The real-world position of the virtual reality handle can also be mapped and marked on the virtual zoom control wheel in real time, as shown by the current handle position indicator 204 in FIG. 2A, so that the user can observe the relative position of the handle in real time.
Step S130: determine a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel.
The position movement operation of the virtual reality handle on the virtual zoom control wheel can be performed by the user moving the virtual reality handle in the real scene. The target zoom ratio refers to the zoom ratio to which the virtual scene is to be adjusted.
For example, when the virtual reality handle is detected moving toward the interior of the virtual zoom control wheel, the virtual scene can be zoomed out, and the greater the inward movement distance, the greater the zoom-out ratio; when the virtual reality handle is detected moving toward the exterior of the virtual zoom control wheel, the virtual scene can be zoomed in, and the greater the outward movement distance, the greater the zoom-in ratio.
In an optional embodiment, determining the target zoom ratio of the virtual scene according to the position movement operation of the virtual reality handle on the virtual zoom control wheel may, as shown in FIG. 4, be implemented by the following steps S410 to S420:
Step S410: in response to a movement end operation of the virtual reality handle, determine a second position of the virtual reality handle, the second position being the mapped position of the virtual reality handle in the virtual scene when the virtual reality handle triggers the movement end operation;
Step S420: determine the target zoom ratio of the virtual scene according to the second position of the virtual reality handle.
The second position is the mapped position of the virtual reality handle in the virtual scene when the virtual reality handle triggers the movement end operation. The movement end operation of the virtual reality handle locks in the current zoom ratio, that is, the zoom ratio corresponding to the virtual scene at the moment the movement end operation is triggered can be taken as the target zoom ratio.
For example, the movement end operation of the virtual reality handle may be the user clicking the designated button on the virtual reality handle again.
For example, the movement end operation of the virtual reality handle may also be the user releasing the designated button on the virtual reality handle.
If the virtual zoom control wheel does not display zoom scale marks, the target zoom ratio can be determined based on the relative position change between the first position and the second position of the virtual reality handle.
If the virtual zoom control wheel displays zoom scale marks, the wheel scale mark corresponding to the second position of the virtual reality handle can be taken as the target zoom ratio. It should be noted that, when the virtual zoom control wheel displays zoom scale marks, the wheel scale mark corresponding to the first position of the virtual reality handle can be taken as the reference scale mark; each time the handle moves a specific distance toward the wheel center, the virtual scene is zoomed out by a specific ratio, and each time it moves a specific distance away from the wheel center, the virtual scene is zoomed in by a specific ratio.
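For the scale-mark variant just described, a simple sketch of snapping the handle's radial distance to a tick and reading off a ratio might look like the following; the tick spacing and per-tick step are assumptions chosen only for illustration.

```python
def zoom_from_scale_marks(dist_to_center, base_dist, base_ratio,
                          tick_spacing=0.02, step_per_tick=0.1):
    """Discrete variant: each tick moved toward the wheel center shrinks the scene,
    each tick moved away from the center enlarges it.

    dist_to_center : current distance of the handle from the wheel center.
    base_dist      : distance of the first position from the wheel center.
    base_ratio     : zoom ratio when the zoom event was triggered.
    """
    ticks_moved = round((dist_to_center - base_dist) / tick_spacing)
    return base_ratio * (1.0 + step_per_tick * ticks_moved)

# Moving two ticks outward from the reference mark enlarges the scene by about 20%.
print(zoom_from_scale_marks(dist_to_center=0.14, base_dist=0.10, base_ratio=1.0))
```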
In the above process, the target zoom ratio of the virtual scene is further determined based on the second position of the virtual reality handle. Since position information is unique and deterministic, the problem of non-standard user gestures is avoided, thereby improving the accuracy of scene zooming.
In addition, as shown in FIG. 2B, a movement end operation prompt indicator 205 can also be displayed next to the virtual zoom control wheel to help guide the user's operation.
In an optional embodiment, determining the target zoom ratio of the virtual scene according to the second position of the virtual reality handle may, as shown in FIG. 5, be implemented by the following steps S510 to S520:
Step S510: determine a relative position change of the virtual reality handle according to the second position and the first position of the virtual reality handle, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered;
Step S520: determine the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene at the time the virtual scene zoom event was triggered.
The relative position change of the virtual reality handle refers to the change of the second position relative to the first position. For example, the relative position change of the virtual reality handle may be characterized by the difference between a second distance and a first distance, where the first distance is the distance between the first position and the center position of the virtual zoom control wheel and the second distance is the distance between the second position and the center position of the virtual zoom control wheel; the relative position change may also be characterized by the distance between the second position and the first position.
In the above process, based on the relative position change of the virtual reality handle, the target zoom ratio can be controlled freely, so that the virtual scene can be zoomed by an arbitrary ratio within a certain range, which improves the freedom of virtual scene zooming to a certain extent.
The process of determining the target zoom ratio is further described below, using the difference between the second distance and the first distance to characterize the relative position change of the virtual reality handle.
In an optional embodiment, determining the relative position change of the virtual reality handle according to its second position and first position may be implemented as follows: determine the relative position change of the virtual reality handle according to the difference between the second distance and the first distance, where the first distance is the distance between the first position and the center position of the virtual zoom control wheel and the second distance is the distance between the second position and the center position of the virtual zoom control wheel.
For example, S2 - S1 can be taken as the relative position change of the virtual reality handle, where S2 denotes the distance between the second position and the center position of the virtual zoom control wheel and S1 denotes the distance between the first position and the center position of the virtual zoom control wheel. It should be noted that if S2 - S1 is positive, the virtual scene can be zoomed in; if S2 - S1 is negative, the virtual scene can be zoomed out; and if S2 equals S1, the virtual scene is not zoomed, that is, the target zoom ratio is the same as the zoom ratio of the virtual scene when the virtual scene zoom event was triggered. In addition, the larger the absolute value |S2 - S1|, the greater the change in the target zoom ratio; the smaller |S2 - S1|, the smaller the change.
In the above process, the relative position change is determined based on the second distance and the first distance, which provides the parameter basis for determining the target zoom ratio, so that the user can adjust the zoom ratio of the virtual scene freely and accurately.
After the relative position change of the virtual reality handle is determined, the relative change rate of the handle, k*(S2-S1)*100%, can be calculated from the relative position change S2-S1 and a preset constant k. The preset constant k can be determined by developers through extensive testing and can be used to adjust the zoom ratio corresponding to each unit of movement so as to match the comfort of the user's visual perception. The relative change rate of the handle plus 1 is then multiplied by the zoom ratio of the virtual scene at the time the virtual scene zoom event was triggered to obtain the target zoom ratio. For example, the target zoom ratio N2 can be obtained by computing (1 + k*(S2-S1)*100%)*N1, where N1 denotes the zoom ratio of the virtual scene when the virtual scene zoom event was triggered.
It should be noted that, in the above process, adding 1 to the relative change rate of the virtual reality handle reduces the direct influence of the handle's relative position change rate on the zoom change rate of the virtual scene, giving the user's visual perception a reasonable range of change rates and preventing the zoom ratio value from continually decreasing without being able to increase again.
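Putting the continuous formula together, a minimal sketch of the target-ratio computation might read as follows. The value of the tuning constant k is an assumption (the disclosure leaves it to developer testing), and the clamp to minimum and maximum ratios anticipates the limit discussed next.

```python
def target_zoom_ratio(s1, s2, n1, k=2.0, min_ratio=0.25, max_ratio=4.0):
    """Continuous variant: N2 = (1 + k * (S2 - S1)) * N1.

    s1, s2 : distances of the first and second positions from the wheel center.
    n1     : zoom ratio of the scene when the zoom event was triggered.
    k      : tuning constant mapping handle travel to zoom change (assumed value).
    """
    relative_change = k * (s2 - s1)      # positive -> zoom in, negative -> zoom out
    n2 = (1.0 + relative_change) * n1
    return max(min_ratio, min(max_ratio, n2))  # optional clamp to the allowed range

# Moving 5 cm outward from the starting mark enlarges a 1.0x scene to about 1.1x.
print(target_zoom_ratio(s1=0.10, s2=0.15, n1=1.0))
```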
In addition, a maximum allowed zoom ratio can also be set. During the movement of the virtual reality handle, if the zoom ratio corresponding to the handle's position on the virtual zoom control wheel exceeds the maximum zoom ratio, a maximum zoom-in prompt ring 206 or a maximum zoom-out prompt ring 207 as shown in FIG. 2B can be displayed on the virtual zoom control wheel to prompt the user that the maximum zoom ratio has been reached.
After the target zoom ratio is determined, step S140 can be performed with continued reference to FIG. 1.
Step S140: zoom the virtual scene according to the target zoom ratio.
After the target zoom ratio is determined, the virtual scene can be zoomed adaptively according to the target zoom ratio, thereby allowing the presentation size of the virtual scene to be adjusted freely, improving the flexibility of VR interaction and breaking free of the limitations of the real scene.
For example, when the second position is closer to the center position of the virtual zoom control wheel than the first position, the virtual scene can be zoomed out; when the second position is farther from the center position of the virtual zoom control wheel than the first position, the virtual scene can be zoomed in. As shown in FIG. 2B, when generating the virtual zoom control wheel, a zoom operation prompt indicator 208 can also be added to the wheel to help guide the user through the zoom operation.
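How the target ratio is finally applied is engine-specific; the sketch below assumes one common approach (rescaling scene-object positions about the user's viewpoint so the zoom feels centered on the player), which is an illustrative choice rather than the application method stated in the disclosure.

```python
import numpy as np

def apply_zoom(scene_positions, target_ratio, current_ratio, pivot):
    """Rescale scene-object positions so the overall scene zoom becomes target_ratio.

    scene_positions : (N, 3) array of object positions in the virtual scene.
    pivot           : (3,) point kept fixed during zooming (e.g., the user's viewpoint).
    """
    scene_positions = np.asarray(scene_positions, dtype=float)
    pivot = np.asarray(pivot, dtype=float)
    factor = target_ratio / current_ratio
    return pivot + (scene_positions - pivot) * factor

objs = [[1.0, 0.0, 2.0], [-3.0, 0.5, 4.0]]
print(apply_zoom(objs, target_ratio=1.2, current_ratio=1.0, pivot=[0.0, 1.6, 0.0]))
```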
As shown in FIG. 6, a preferred implementation flow of virtual scene zooming is also provided, which may specifically include the following steps:
Step S601: in response to a trigger operation of the virtual scene zoom event, obtain the first position of the virtual reality handle;
Step S602: determine the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, the preset offset distance of the wheel center relative to the first position, and the preset offset direction of the wheel center relative to the first position;
Step S603: display the virtual zoom control wheel according to its center position and the preset wheel radius;
Step S604: map the real-world position of the virtual reality handle onto the virtual zoom control wheel;
Step S605: in response to a movement end operation of the virtual reality handle, determine the second position of the virtual reality handle;
Step S606: determine the relative position change of the virtual reality handle according to the difference between the second distance and the first distance;
Step S607: determine the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene at the time the virtual scene zoom event was triggered;
Step S608: zoom the virtual scene according to the target zoom ratio.
Here, the first distance is the distance between the first position and the center position of the virtual zoom control wheel, and the second distance is the distance between the second position and the center position of the virtual zoom control wheel.
In the steps shown in FIG. 6, virtual scene zooming is performed based on the position of the virtual reality handle, so the user can complete the zoom interaction with a single hand, which greatly improves the freedom and convenience of virtual scene zooming.
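A minimal end-to-end sketch of the flow in FIG. 6 is given below. It is written self-contained rather than reusing the earlier helper functions, and the axis convention, event hooks, and numeric values are assumptions introduced for illustration only.

```python
import math

def distance(a, b):
    return math.dist(a, b)

def zoom_flow(first_pos, second_pos, wheel_radius, current_ratio, k=2.0):
    """Sketch of steps S601-S608: place the wheel, measure S1 and S2,
    and derive the target zoom ratio from their difference."""
    # S602-S603: wheel center offset by half the radius toward the lower right (assumed axes).
    dx = wheel_radius / 2.0 / math.sqrt(2.0)
    center = (first_pos[0] + dx, first_pos[1] - dx)

    # S605-S606: distances of the first and second positions from the wheel center.
    s1 = distance(first_pos, center)
    s2 = distance(second_pos, center)

    # S607: target ratio from the relative position change.
    target_ratio = (1.0 + k * (s2 - s1)) * current_ratio

    # S608: the caller applies target_ratio to the virtual scene.
    return target_ratio

print(zoom_flow(first_pos=(0.0, 0.0), second_pos=(0.15, -0.15),
                wheel_radius=0.2, current_ratio=1.0))
```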
An exemplary embodiment of the present disclosure also provides a virtual scene zooming apparatus. As shown in FIG. 7, the virtual scene zooming apparatus 700 may include:
an event trigger module 710, configured to display, in response to a trigger operation of a virtual scene zoom event, a virtual zoom control wheel in the user's field of view of the virtual scene;
a position mapping module 720, configured to map the real-world position of the virtual reality handle onto the virtual zoom control wheel;
a ratio determination module 730, configured to determine the target zoom ratio of the virtual scene according to the position movement operation of the virtual reality handle on the virtual zoom control wheel;
a scene zoom module 740, configured to zoom the virtual scene according to the target zoom ratio.
In an optional embodiment, the event trigger module 710 may include: a first position obtaining module, configured to obtain, in response to the trigger operation of the virtual scene zoom event, the first position of the virtual reality handle, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; and a wheel display module, configured to determine, according to the first position of the virtual reality handle, the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene and to display the virtual zoom control wheel at the wheel display position.
In an optional embodiment, the wheel display module may be configured to: determine the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, the preset offset distance of the wheel center relative to the first position, and the preset offset direction of the wheel center relative to the first position; and display the virtual zoom control wheel according to its center position and the preset wheel radius.
In an optional embodiment, the preset offset distance of the center position of the virtual zoom control wheel relative to the first position is half the preset wheel radius.
In an optional embodiment, the ratio determination module 730 may include: a second position determination module, configured to determine, in response to a movement end operation of the virtual reality handle, the second position of the virtual reality handle, the second position being the mapped position of the virtual reality handle in the virtual scene when the handle triggers the movement end operation; and a first ratio determination submodule, configured to determine the target zoom ratio of the virtual scene according to the second position of the virtual reality handle.
In an optional embodiment, the first ratio determination submodule may further include: a position change determination module, configured to determine the relative position change of the virtual reality handle according to its second position and first position, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; and a second ratio determination submodule, configured to determine the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene when the virtual scene zoom event was triggered.
In an optional embodiment, the position change determination module may be configured to: determine the relative position change of the virtual reality handle according to the difference between the second distance and the first distance, where the first distance is the distance between the first position and the center position of the virtual zoom control wheel and the second distance is the distance between the second position and the center position of the virtual zoom control wheel.
The specific details of the components of the above virtual scene zooming apparatus 700 have already been described in detail in the method embodiments; for details not disclosed here, reference can be made to the method embodiments, and they are not repeated.
An exemplary embodiment of the present disclosure also provides a computer-readable storage medium on which a program product capable of implementing the above virtual scene zooming method of this specification is stored. In some possible embodiments, aspects of the present disclosure may also be implemented in the form of a program product including program code; when the program product runs on an electronic device, the program code causes the electronic device to perform the steps according to the various exemplary embodiments of the present disclosure described in the "Exemplary Method" section above of this specification.
Specifically, the program product stored on the computer-readable storage medium can cause the electronic device to perform the following steps:
in response to a trigger operation of a virtual scene zoom event, display a virtual zoom control wheel in the user's field of view of the virtual scene;
map the real-world position of the virtual reality handle onto the virtual zoom control wheel;
determine a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel;
zoom the virtual scene according to the target zoom ratio.
In an optional embodiment, displaying a virtual zoom control wheel in the user's field of view of the virtual scene in response to the trigger operation of the virtual scene zoom event may be implemented by the following steps: in response to the trigger operation of the virtual scene zoom event, obtain the first position of the virtual reality handle, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; determine, according to the first position of the virtual reality handle, the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene, and display the virtual zoom control wheel at the wheel display position.
In an optional embodiment, determining the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position, may be implemented by the following steps: determine the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, the preset offset distance of the wheel center relative to the first position, and the preset offset direction of the wheel center relative to the first position; display the virtual zoom control wheel according to its center position and the preset wheel radius.
In an optional embodiment, the preset offset distance of the center position of the virtual zoom control wheel relative to the first position is half the preset wheel radius.
In an optional embodiment, determining the target zoom ratio of the virtual scene according to the position movement operation of the virtual reality handle on the virtual zoom control wheel may be implemented by the following steps: in response to a movement end operation of the virtual reality handle, determine the second position of the virtual reality handle, the second position being the mapped position of the virtual reality handle in the virtual scene when the handle triggers the movement end operation; determine the target zoom ratio of the virtual scene according to the second position of the virtual reality handle.
In an optional embodiment, determining the target zoom ratio of the virtual scene according to the second position of the virtual reality handle may be implemented by the following steps: determine the relative position change of the virtual reality handle according to its second position and first position, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; determine the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene when the virtual scene zoom event was triggered.
In an optional embodiment, determining the relative position change of the virtual reality handle according to its second position and first position may be implemented by the following steps: determine the relative position change of the virtual reality handle according to the difference between the second distance and the first distance, where the first distance is the distance between the first position and the center position of the virtual zoom control wheel and the second distance is the distance between the second position and the center position of the virtual zoom control wheel.
On the one hand, based on the position movement operation of the virtual reality handle on the virtual zoom control wheel, the virtual scene can be zoomed by an arbitrary ratio within a certain range without leaving the virtual scene interface, which greatly improves the convenience and freedom of virtual scene zooming. On the other hand, using the position movement operation of the virtual reality handle on the virtual zoom control wheel as the basis for determining the target zoom ratio avoids the problem of user gestures failing to be recognized, thereby improving the sensitivity of zoom ratio discrimination; and because the position of the virtual reality handle is uniquely determined and the virtual zoom control wheel provides a visual reference, the accuracy of virtual scene zooming is also ensured.
The program product may take the form of a portable compact disc read-only memory (CD-ROM), include program code, and be run on an electronic device such as a personal computer. However, the program product of the present disclosure is not limited thereto; in this document, a readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
The program product may take the form of any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium that can send, propagate, or transport a program for use by or in conjunction with an instruction execution system, apparatus, or device.
Program code contained on a readable medium may be transmitted using any suitable medium, including but not limited to wireless, wired, optical fiber cable, RF (Radio Frequency), etc., or any suitable combination of the above.
Program code for carrying out the operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
An exemplary embodiment of the present disclosure also provides an electronic device capable of implementing the above virtual scene zooming method. An electronic device 800 according to such an exemplary embodiment of the present disclosure is described below with reference to FIG. 8. The electronic device 800 shown in FIG. 8 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 8, the electronic device 800 may take the form of a general-purpose computing device. The components of the electronic device 800 may include, but are not limited to: at least one processing unit 810, at least one storage unit 820, a bus 830 connecting different system components (including the storage unit 820 and the processing unit 810), and a display unit 840.
The storage unit 820 stores program code, which can be executed by the processing unit 810 so that the processing unit 810 performs the steps according to the various exemplary embodiments of the present disclosure described in the "Exemplary Method" section above of this specification.
Specifically, the processing unit 810 may perform the following steps:
in response to a trigger operation of a virtual scene zoom event, display a virtual zoom control wheel in the user's field of view of the virtual scene;
map the real-world position of the virtual reality handle onto the virtual zoom control wheel;
determine a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel;
zoom the virtual scene according to the target zoom ratio.
In an optional embodiment, displaying a virtual zoom control wheel in the user's field of view of the virtual scene in response to the trigger operation of the virtual scene zoom event may be implemented by the following steps: in response to the trigger operation of the virtual scene zoom event, obtain the first position of the virtual reality handle, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; determine, according to the first position of the virtual reality handle, the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene, and display the virtual zoom control wheel at the wheel display position.
In an optional embodiment, determining the wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, and displaying the virtual zoom control wheel at the wheel display position, may be implemented by the following steps: determine the center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, the preset offset distance of the wheel center relative to the first position, and the preset offset direction of the wheel center relative to the first position; display the virtual zoom control wheel according to its center position and the preset wheel radius.
In an optional embodiment, the preset offset distance of the center position of the virtual zoom control wheel relative to the first position is half the preset wheel radius.
In an optional embodiment, determining the target zoom ratio of the virtual scene according to the position movement operation of the virtual reality handle on the virtual zoom control wheel may be implemented by the following steps: in response to a movement end operation of the virtual reality handle, determine the second position of the virtual reality handle, the second position being the mapped position of the virtual reality handle in the virtual scene when the handle triggers the movement end operation; determine the target zoom ratio of the virtual scene according to the second position of the virtual reality handle.
In an optional embodiment, determining the target zoom ratio of the virtual scene according to the second position of the virtual reality handle may be implemented by the following steps: determine the relative position change of the virtual reality handle according to its second position and first position, the first position being the mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered; determine the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and the zoom ratio of the virtual scene when the virtual scene zoom event was triggered.
In an optional embodiment, determining the relative position change of the virtual reality handle according to its second position and first position may be implemented by the following steps: determine the relative position change of the virtual reality handle according to the difference between the second distance and the first distance, where the first distance is the distance between the first position and the center position of the virtual zoom control wheel and the second distance is the distance between the second position and the center position of the virtual zoom control wheel.
On the one hand, based on the position movement operation of the virtual reality handle on the virtual zoom control wheel, the virtual scene can be zoomed by an arbitrary ratio within a certain range without leaving the virtual scene interface, which greatly improves the convenience and freedom of virtual scene zooming. On the other hand, using the position movement operation of the virtual reality handle on the virtual zoom control wheel as the basis for determining the target zoom ratio avoids the problem of user gestures failing to be recognized, thereby improving the sensitivity of zoom ratio discrimination; and because the position of the virtual reality handle is uniquely determined and the virtual zoom control wheel provides a visual reference, the accuracy of virtual scene zooming is also ensured.
The storage unit 820 may include a readable medium in the form of a volatile storage unit, such as a random access storage unit (RAM) 821 and/or a cache storage unit 822, and may further include a read-only storage unit (ROM) 823.
The storage unit 820 may also include a program/utility 824 having a set (at least one) of program modules 825, such program modules 825 including but not limited to: an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a network environment.
The bus 830 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus structures.
The electronic device 800 may also communicate with one or more external devices 900 (such as a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 800, and/or with any device (such as a router, a modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 850. Moreover, the electronic device 800 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 860. As shown in the figure, the network adapter 860 communicates with the other modules of the electronic device 800 through the bus 830. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID (Redundant Array of Independent Disks) systems, tape drives, data backup storage systems, and the like.
Through the description of the above embodiments, those skilled in the art will readily understand that the example embodiments described herein may be implemented in software, or in software combined with the necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to cause a computing device (which may be a personal computer, a server, a terminal apparatus, a network device, etc.) to perform the method according to the exemplary embodiments of the present disclosure.
In addition, the above drawings are merely schematic illustrations of the processing included in the methods according to the exemplary embodiments of the present disclosure and are not intended for limitation. It is easy to understand that the processing shown in the above drawings does not indicate or limit the chronological order of such processing. It is also easy to understand that such processing may be performed, for example, synchronously or asynchronously in multiple modules.
It should be noted that although several modules or units of a device for action execution are mentioned in the above detailed description, such division is not mandatory. In fact, according to exemplary embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided into multiple modules or units to be embodied.
Those skilled in the art will understand that aspects of the present disclosure may be implemented as a system, a method, or a program product. Therefore, aspects of the present disclosure may be embodied in the following forms: a complete hardware embodiment, a complete software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may collectively be referred to herein as a "circuit", "module", or "system". Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein. The present disclosure is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the technical field not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

  1. A virtual scene zooming method, the method comprising:
    in response to a trigger operation of a virtual scene zoom event, displaying a virtual zoom control wheel in a user's field of view of a virtual scene;
    mapping a real-world position of a virtual reality handle onto the virtual zoom control wheel;
    determining a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel;
    zooming the virtual scene according to the target zoom ratio.
  2. The method according to claim 1, wherein the displaying, in response to a trigger operation of a virtual scene zoom event, a virtual zoom control wheel in a user's field of view of a virtual scene comprises:
    in response to the trigger operation of the virtual scene zoom event, obtaining a first position of the virtual reality handle, the first position being a mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered;
    determining, according to the first position of the virtual reality handle, a wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene, and displaying the virtual zoom control wheel at the wheel display position.
  3. The method according to claim 2, wherein the determining, according to the first position of the virtual reality handle, a wheel display position of the virtual zoom control wheel in the user's field of view of the virtual scene, and displaying the virtual zoom control wheel at the wheel display position comprises:
    determining a center position of the virtual zoom control wheel in the user's field of view of the virtual scene according to the first position of the virtual reality handle, a preset offset distance of the center position of the virtual zoom control wheel relative to the first position, and a preset offset direction of the center position of the virtual zoom control wheel relative to the first position;
    displaying the virtual zoom control wheel according to the center position of the virtual zoom control wheel and a preset wheel radius.
  4. The method according to claim 3, wherein the preset offset distance of the center position of the virtual zoom control wheel relative to the first position is half the preset wheel radius.
  5. The method according to claim 1, wherein the determining a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel comprises:
    in response to a movement end operation of the virtual reality handle, determining a second position of the virtual reality handle, the second position being a mapped position of the virtual reality handle in the virtual scene when the virtual reality handle triggers the movement end operation;
    determining the target zoom ratio of the virtual scene according to the second position of the virtual reality handle.
  6. The method according to claim 5, wherein the determining the target zoom ratio of the virtual scene according to the second position of the virtual reality handle comprises:
    determining a relative position change of the virtual reality handle according to the second position and the first position of the virtual reality handle, the first position being a mapped position of the virtual reality handle in the virtual scene when the virtual scene zoom event is triggered;
    determining the target zoom ratio of the virtual scene according to the relative position change of the virtual reality handle and a zoom ratio of the virtual scene at the time the virtual scene zoom event is triggered.
  7. The method according to claim 6, wherein the determining a relative position change of the virtual reality handle according to the second position and the first position of the virtual reality handle comprises:
    determining the relative position change of the virtual reality handle according to a difference between a second distance and a first distance, the first distance being a distance between the first position and the center position of the virtual zoom control wheel, and the second distance being a distance between the second position and the center position of the virtual zoom control wheel.
  8. A virtual scene zooming apparatus, the apparatus comprising:
    an event trigger module, configured to display, in response to a trigger operation of a virtual scene zoom event, a virtual zoom control wheel in a user's field of view of a virtual scene;
    a position mapping module, configured to map a real-world position of a virtual reality handle onto the virtual zoom control wheel;
    a ratio determination module, configured to determine a target zoom ratio of the virtual scene according to a position movement operation of the virtual reality handle on the virtual zoom control wheel;
    a scene zoom module, configured to zoom the virtual scene according to the target zoom ratio.
  9. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 7.
  10. An electronic device, comprising:
    a processor; and
    a memory for storing executable instructions of the processor;
    wherein the processor is configured to perform the method according to any one of claims 1 to 7 by executing the executable instructions.
PCT/CN2023/082287 2022-10-09 2023-03-17 Virtual scene zooming method and apparatus, storage medium, and electronic device WO2024077873A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211229133.7A CN115531875A (zh) 2022-10-09 2022-10-09 Virtual scene zooming method and apparatus, storage medium, and electronic device
CN202211229133.7 2022-10-09

Publications (1)

Publication Number Publication Date
WO2024077873A1 true WO2024077873A1 (zh) 2024-04-18

Family

ID=84733903

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/082287 WO2024077873A1 (zh) 2022-10-09 2023-03-17 Virtual scene zooming method and apparatus, storage medium, and electronic device

Country Status (2)

Country Link
CN (1) CN115531875A (zh)
WO (1) WO2024077873A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115531875A (zh) * 2022-10-09 2022-12-30 网易(杭州)网络有限公司 虚拟场景缩放方法、装置、存储介质与电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279357A (zh) * 2013-06-05 2013-09-04 四目信息科技(上海)有限公司 计算机图形界面中实现滑动轮盘控件功能的系统及其方法
US20170287214A1 (en) * 2016-03-31 2017-10-05 Glen J. Anderson Path navigation in virtual environment
CN108579083A (zh) * 2018-04-27 2018-09-28 腾讯科技(深圳)有限公司 虚拟场景显示方法、装置、电子装置及存储介质
US20200126301A1 (en) * 2018-10-23 2020-04-23 Microsoft Technology Licensing, Llc Efficiency Enhancements To Construction Of Virtual Reality Environments
CN113413597A (zh) * 2021-06-21 2021-09-21 网易(杭州)网络有限公司 虚拟道具的装配方法、装置、计算机设备和存储介质
CN115531875A (zh) * 2022-10-09 2022-12-30 网易(杭州)网络有限公司 虚拟场景缩放方法、装置、存储介质与电子设备


Also Published As

Publication number Publication date
CN115531875A (zh) 2022-12-30

Similar Documents

Publication Publication Date Title
JP3847641B2 (ja) 情報処理装置、情報処理プログラム、情報処理プログラムを記録したコンピュータ読み取り可能な記録媒体、及び情報処理方法
KR20180015533A (ko) 디스플레이 제어 방법, 저장 매체 및 전자 장치
US20140096084A1 (en) Apparatus and method for controlling user interface to select object within image and image input device
JP2005509973A (ja) ジェスチャに基づくユーザインタフェース用の方法及び装置
EP2885695A1 (en) User interface element focus based on user's gaze
US10228762B2 (en) Analysis of user interface interactions within a virtual reality environment
CN108776544B (zh) 增强现实中的交互方法及装置、存储介质、电子设备
US11367258B2 (en) Display device, user terminal device, display system including the same and control method thereof
WO2024077873A1 (zh) 虚拟场景缩放方法、装置、存储介质与电子设备
JP7245223B2 (ja) ボタン設置方法、装置、プログラムおよび記録媒体
WO2023160697A1 (zh) 一种鼠标的模型映射方法、装置、设备和存储介质
JP2021531589A (ja) 目標対象の動作認識方法、装置及び電子機器
WO2019166005A1 (zh) 智能终端及其感控方法、具有存储功能的装置
WO2022017421A1 (zh) 交互方法、显示装置、发射装置、交互系统及存储介质
WO2024077872A1 (zh) 显示位置调整方法、装置、存储介质与电子设备
CN113559501A (zh) 游戏中的虚拟单位选取方法及装置、存储介质及电子设备
WO2014082417A1 (zh) 一种空中输入设备的运行方法、运行控制方法及相关装置
WO2024007675A1 (zh) 虚拟对象的切换方法、装置、存储介质及电子装置
CN113457117B (zh) 游戏中的虚拟单位选取方法及装置、存储介质及电子设备
CN112534390A (zh) 用于提供虚拟输入工具的电子装置及其方法
CN112637653B (zh) 一种电视机触摸遥控方法及系统
TWI768407B (zh) 預測控制方法、輸入系統及電腦可讀取記錄媒體
CN110908509B (zh) 多增强现实设备的协作方法及装置、电子设备、存储介质
CN110162251B (zh) 图像缩放方法及装置、存储介质、电子设备
CN109857314B (zh) 有屏设备的手势控制方法和装置

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23876086

Country of ref document: EP

Kind code of ref document: A1