CN107479692B - Virtual reality scene control method and device and virtual reality device - Google Patents
Classifications
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality (under G06F3/01, input arrangements or combined input and output arrangements for interaction between user and computer; G06F3/00, input arrangements for transferring data to be processed into a form capable of being handled by the computer; G06F, electric digital data processing)
- G06F2203/012 — Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment (under G06F2203/01, indexing scheme relating to G06F3/01)
Abstract
The invention discloses a method and device for controlling a virtual reality scene, and a virtual reality device. The method comprises the following steps: providing an interactive reference object in a virtual reality scene, the interactive reference object providing a visual reference for the user in the scene; in response to a user operation that moves the interactive reference object, moving the interactive reference object in the virtual reality scene and acquiring its moving speed; and, when the moving speed falls within a preset speed range, adjusting the moving speed so that it deviates from that range. The invention can reduce the vertigo a user experiences when watching moving scenery presented by high-speed movement of the scene picture in a virtual reality scene, thereby improving the user's virtual reality experience.
Description
Technical Field
The invention relates to the technical field of virtual reality, and in particular to a method and device for controlling a virtual reality scene and to a virtual reality device.
Background
In recent years, virtual reality technology has developed rapidly; it can present virtual scenes that give users a nearly real sense of immersion. Accordingly, virtual reality devices such as virtual reality helmets and virtual reality glasses have begun to attract a growing number of users.
However, a user of a virtual reality device is usually physically stationary, while the virtual reality scene presented by the device is stereoscopic and its picture moves, sometimes at high speed. When the user watches the scene with both eyes through a virtual reality helmet or virtual reality glasses, the user perceives the moving scenery presented by the high-speed movement of the scene picture while taking his or her own stationary eye position as the frame of reference. Visual discomfort can result, and in severe cases dizziness and vomiting can occur, seriously affecting the user's virtual reality experience.
Accordingly, the inventors have determined that this problem in the prior art needs to be addressed.
Disclosure of Invention
An object of the present invention is to provide a new technical solution for controlling a virtual reality scene.
According to a first aspect of the present invention, there is provided a method for controlling a virtual reality scene, including:
providing an interactive reference object in a virtual reality scene, the interactive reference object providing a visual reference for a user in the virtual reality scene;
in response to a user operation that moves the interactive reference object, moving the interactive reference object in the virtual reality scene and acquiring the moving speed of the interactive reference object;
when the moving speed falls within a preset speed range, adjusting the moving speed so that it deviates from the speed range.
Optionally, the step of adjusting the moving speed to deviate from the speed range comprises:
when the difference between the moving speed and the lower limit value of the speed range is not greater than a preset down-regulation threshold, adjusting the moving speed to be smaller than the lower limit value of the speed range;
when the difference between the moving speed and the lower limit value of the speed range is greater than the preset down-regulation threshold, adjusting the moving speed to be greater than the upper limit value of the speed range.
Optionally, when the difference between the moving speed and the lower limit value of the speed range is not greater than the preset down-regulation threshold, the moving speed is multiplied by a preset down-regulation factor to obtain an updated moving speed, thereby adjusting the moving speed to be smaller than the lower limit value of the speed range;
when the difference between the moving speed and the lower limit value of the speed range is greater than the preset down-regulation threshold, the moving speed is multiplied by a preset up-regulation factor to obtain an updated moving speed, thereby adjusting the moving speed to be greater than the upper limit value of the speed range.
Optionally, the method further comprises:
and setting an upper limit value and a lower limit value of the speed range according to the acquired visual parameters.
Optionally, the visual parameters include a visual focus capture speed, a binocular focusing speed, and a stereo synthesis speed;
the step of setting the upper and lower limits of the speed range comprises:
setting the upper limit value of the speed range as the sum of the visual focus capturing speed, the binocular focusing speed, the stereo synthesis speed and a preset deviation threshold;
the lower limit value of the speed range is set as the sum of the visual focus capturing speed, the binocular focusing speed, and the stereoscopic synthesis speed.
Optionally, the interactive reference is a vehicle for providing a user with an experience of being carried for movement in the virtual reality scene.
According to a second aspect of the present invention, there is provided a control apparatus for a virtual reality scene, comprising:
a providing unit, for providing an interactive reference object in a virtual reality scene, the interactive reference object providing a visual reference for a user in the virtual reality scene;
a moving unit, for moving the interactive reference object in the virtual reality scene in response to a user operation that moves it, and for acquiring the moving speed of the interactive reference object;
an adjusting unit, for adjusting the moving speed so that it deviates from a preset speed range when the moving speed falls within that range.
Optionally, the adjusting unit is further configured to:
when the difference between the moving speed and the lower limit value of the speed range is not greater than a preset down-regulation threshold, adjust the moving speed to be smaller than the lower limit value of the speed range;
when the difference between the moving speed and the lower limit value of the speed range is greater than the preset down-regulation threshold, adjust the moving speed to be greater than the upper limit value of the speed range.
Optionally, the apparatus further comprises:
and the setting unit is used for setting the upper limit value and the lower limit value of the speed range according to the acquired visual parameters.
According to a third aspect of the invention, there is provided a virtual reality device comprising the control device for a virtual reality scene according to any implementation of the second aspect of the invention.
The inventors have found that the prior art contains no control method, control device, or virtual reality device for a virtual reality scene that can reduce the vertigo a user experiences when watching moving scenery presented by high-speed movement of the scene picture, and thereby improve the user's virtual reality experience. The technical task addressed and the technical problem solved by the present invention were therefore neither contemplated nor anticipated by those skilled in the art, and the present invention constitutes a new technical solution.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a block diagram showing an example of a hardware configuration of a computing system that may be used to implement an embodiment of the invention;
fig. 2 is a flowchart of a control method for a virtual reality scene according to an embodiment of the present invention;
fig. 3 is a schematic diagram showing an example of a virtual scene in the control method of the virtual reality scene according to the embodiment of the present invention;
fig. 4 is a schematic block diagram of a control device for a virtual reality scene according to an embodiment of the present invention;
fig. 5 is a schematic block diagram of a virtual reality device provided in an embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram showing a hardware configuration of an electronic apparatus 1000 that can implement an embodiment of the present invention.
The electronic device 1000 may be, for example, a virtual reality helmet or virtual reality glasses. As shown in fig. 1, the electronic device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and the like. The processor 1100 may be, for example, a central processing unit (CPU) or a microcontroller (MCU). The memory 1200 includes, for example, ROM (read-only memory), RAM (random access memory), and nonvolatile memory such as a hard disk. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication, including, for example, Wi-Fi, Bluetooth, and 2G/3G/4G/5G communication. The display device 1500 is, for example, a liquid crystal display panel or a touch panel. The input device 1600 may include, for example, a touch screen, a keyboard, or a somatosensory input. The user can input and output voice information through the microphone 1800 and the speaker 1700.
The electronic device shown in fig. 1 is merely illustrative and is in no way meant to limit the invention, its application, or uses. In an embodiment of the present invention, the memory 1200 of the electronic device 1000 stores instructions, and the instructions control the processor 1100 to execute a control method for a virtual reality scene as provided in an embodiment of the present invention. It will be appreciated by those skilled in the art that although fig. 1 shows several components of the electronic device 1000, the present invention may involve only some of them; for example, the electronic device 1000 may involve only the processor 1100 and the memory 1200. The skilled person can design the instructions according to the disclosed solution. How instructions control the operation of a processor is well known in the art and will not be described in detail here.
< method examples >
Fig. 2 is a flowchart of a control method for a virtual reality scene according to an embodiment of the present invention. As shown in fig. 2, the method for controlling a virtual reality scene provided in this embodiment may include:
Step 210, providing an interactive reference object in a virtual reality scene, the interactive reference object providing a visual reference for a user in the virtual reality scene.
Specifically, the interactive reference object is a scene object, such as a vehicle, that is disposed in the virtual reality scene, serves as the user's visual reference, and responds to user operations to provide a human-computer interaction experience; it gives the user the experience of being carried along as it moves through the virtual reality scene. More specifically, the vehicle may be a cable car, a rail car, a boat, or the like, which are not enumerated here. In one example, as shown in fig. 3, the vehicle is a cable car, which moves in the virtual reality scene in response to the user's movement operation and gives the user the experience of being carried through the scene.
Step 220, in response to a user operation that moves the interactive reference object, moving the interactive reference object in the virtual reality scene and acquiring the moving speed of the interactive reference object.
The operation of moving the interactive reference object can be performed with a handheld controller. For example, in the virtual reality scene shown in fig. 3, the interactive reference object is a cable car. A user who wants to move the cable car can hold the controller and press a control key, such as a Fire key, to perform the moving operation. This operation is mapped into the virtual reality scene as the user pulling the cable of the cable car backwards, so that the cable is dragged and the cable car moves forwards in the scene.
It should be noted that the moving speed of the interactive reference object is the moving speed of the virtual reality scene picture. The moving speed can be calculated by sampling the moving distance and the moving time of the interactive reference object in the virtual reality scene; specifically, the moving speed of the interactive reference object is the ratio of the sampled moving distance to the sampled moving time.
For example, when the interactive reference object is a cable car, its moving speed may be obtained as follows: record the starting position of the cable car in the virtual reality scene when the user presses the Fire key, then sample the current position of the cable car and the elapsed moving time after the user pulls the controller backwards. The moving distance of the cable car is determined from the difference between the current position and the starting position, and the moving speed of the cable car in the virtual reality scene is the ratio of that moving distance to the moving time.
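The sampling described above can be sketched in a few lines of Python. `Vec3`, `moving_speed`, and the sampled values are illustrative assumptions, not code from the patent:

```python
import math
from dataclasses import dataclass


@dataclass
class Vec3:
    """A position in the virtual scene (scene units)."""
    x: float
    y: float
    z: float

    def distance_to(self, other: "Vec3") -> float:
        return math.sqrt((self.x - other.x) ** 2
                         + (self.y - other.y) ** 2
                         + (self.z - other.z) ** 2)


def moving_speed(start: Vec3, current: Vec3, elapsed_s: float) -> float:
    """Speed of the interactive reference object: sampled moving
    distance divided by sampled moving time."""
    if elapsed_s <= 0:
        return 0.0  # no time has elapsed yet; avoid division by zero
    return start.distance_to(current) / elapsed_s
```

With a start position recorded when the Fire key is pressed and a later sampled position, `moving_speed(Vec3(0, 0, 0), Vec3(3, 4, 0), 2.0)` gives `2.5` scene units per second.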
Step 230, when the moving speed falls within a preset speed range, adjusting the moving speed so that it deviates from the speed range.
The preset speed range is the range of speeds at which the user is likely to experience vertigo: when the moving speed of the interactive reference object in the virtual reality scene falls within this range, the user is prone to dizziness. In this embodiment, when the acquired moving speed of the interactive reference object falls within the preset speed range, the moving speed is adjusted so that it deviates from the range, thereby reducing the user's vertigo.
It will be understood that when the picture moves slowly, the user feels little vertigo. When the picture moves very fast, the user's visual focus leaves the fast-moving picture and settles on a slow or static object, with the fast-moving picture perceived only in peripheral vision. In other words, at high picture speeds the user's eyes unconsciously abandon tracking the fast-moving object and focus on a slow or static one, which avoids dizziness.
Therefore, in this embodiment, the step of adjusting the moving speed to deviate from the speed range may specifically include:
and when the difference value between the moving speed and the lower limit value of the speed range is not greater than a preset down-regulation threshold value, adjusting the moving speed to be smaller than the lower limit value of the speed range.
And when the difference value between the moving speed and the lower limit value of the speed range is larger than a preset down-regulation threshold value, adjusting the moving speed to be larger than the upper limit value of the speed range.
Optionally, when the difference between the moving speed and the lower limit value of the speed range is not greater than a preset down-regulation threshold, multiplying the moving speed by a preset down-regulation factor to obtain an updated moving speed, so as to adjust the moving speed to be smaller than the lower limit value of the speed range; and when the difference value between the moving speed and the lower limit value of the speed range is larger than a preset down-regulation threshold value, multiplying the moving speed by a preset up-regulation factor to obtain an updated moving speed, and realizing the adjustment of the moving speed to be smaller than the lower limit value of the speed range.
It should be noted that the up-regulation and down-regulation factors are set so that the adjusted speed matches what the user's eyes can comfortably follow; they can be adapted to the tolerance of different users so as to reduce vertigo as far as possible.
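As a concrete sketch of this adjustment rule: the factor values below (0.5 down, 3.0 up) and the clamping safeguards are assumptions chosen for illustration, since the patent leaves both factors as tunable parameters.

```python
def adjust_speed(speed: float, lower: float, upper: float,
                 down_threshold: float,
                 down_factor: float = 0.5, up_factor: float = 3.0) -> float:
    """Push a moving speed that falls inside the vertigo-inducing
    range [lower, upper] back out of that range."""
    if not (lower <= speed <= upper):
        return speed  # already outside the range; nothing to do
    if speed - lower <= down_threshold:
        # Near the lower limit: slow the picture down below the range.
        return min(speed * down_factor, lower * 0.99)
    # Otherwise: speed the picture up past the upper limit, so the
    # user's visual focus abandons the fast-moving picture.
    return max(speed * up_factor, upper * 1.01)
```

For a range of [9, 20] with a down-regulation threshold of 2, a speed of 10 is pushed below 9, while a speed of 15 is pushed above 20; speeds already outside the range pass through unchanged.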
In one example, the user's visual parameters can be obtained and used to set the upper and lower limit values of the speed range, taking into account that different users tolerate vertigo differently and thereby further improving the user's virtual reality experience. Correspondingly, the control method for a virtual reality scene provided in this embodiment further includes:
and setting an upper limit value and a lower limit value of the speed range according to the acquired visual parameters.
When a user observes a picture visually, that is, with the human eyes, the eyes track the movement of the picture. For example, when a user observes a virtual reality scene through a virtual reality device, the user's eyes track the movement of the scene picture. The visual parameters are parameters reflecting how fast the human eyes can track the picture as it moves.
In one example, the visual parameters may include a visual focus capture speed, a binocular focusing speed, and a stereo synthesis speed. Specifically, when a user observes a virtual reality scene, the user's vision is attracted by objects appearing in the scene and the visual focus settles on the object of most interest; if no object in the scene interests the user, the user's vision is in a fast scanning state and the visual focus changes rapidly. The visual focus capture speed is the tracking speed at which the user's eyes move from this scanning state to resting on a point of an object of interest. The binocular focusing speed is the tracking speed at which the two eyes adjust to a consistent focus after both have settled on that point. The stereo synthesis speed is the tracking speed at which the user, having seen the point clearly with both eyes, synthesizes a stereoscopic image of it.
The values of the acquired visual parameters differ between virtual reality devices and between the users of those devices, and the upper and lower limit values of the speed range set from them therefore differ accordingly.
More specifically, the step of setting the upper limit value and the lower limit value of the speed range includes:
setting the upper limit value of the speed range as the sum of the visual focus capturing speed, the binocular focusing speed, the stereo synthesis speed and a preset deviation threshold;
the lower limit value of the speed range is set as the sum of the visual focus capturing speed, the binocular focusing speed, and the stereoscopic synthesis speed.
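The two formulas above translate directly into a small helper. The parameter names are illustrative, and all speeds are assumed to share one unit (scene units per second):

```python
def speed_range(focus_capture: float, binocular_focusing: float,
                stereo_synthesis: float,
                deviation_threshold: float) -> tuple[float, float]:
    """Return (lower, upper) limit values of the vertigo speed range:
    lower = sum of the three visual-parameter speeds,
    upper = that sum plus a preset deviation threshold."""
    lower = focus_capture + binocular_focusing + stereo_synthesis
    upper = lower + deviation_threshold
    return lower, upper
```

For example, `speed_range(1.0, 2.0, 3.0, 4.0)` returns `(6.0, 10.0)`.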
It will be understood that the lower limit value of the speed range is set as the sum of the visual focus capture speed, the binocular focusing speed, and the stereo synthesis speed. After the moving speed of the interactive reference object falls within the preset speed range, if the difference between the moving speed and the lower limit value is not greater than the preset down-regulation threshold, the moving speed is adjusted to be smaller than the lower limit value; since the moving speed decreases, the user's vertigo is naturally reduced.
A preset deviation threshold is added to form the upper limit value of the speed range. After the moving speed of the interactive reference object falls within the preset speed range, if the difference between the moving speed and the lower limit value is greater than the preset down-regulation threshold, the moving speed is adjusted to be greater than the upper limit value, making it far exceed the speed the user's eyes can follow; the user's attention then leaves the fast-moving picture and dizziness is avoided.
Therefore, the control method for the virtual reality scene provided by the embodiment can reduce the dizzy feeling generated when the user watches the moving scenery displayed by the high-speed movement of the scene picture in the virtual reality scene, and improve the virtual reality experience of the user.
< apparatus embodiment >
< control apparatus >
Fig. 4 is a schematic block diagram of a control device for a virtual reality scene according to an embodiment of the present invention. As shown in fig. 4, the control apparatus 4000 for a virtual reality scene provided in this embodiment includes:
a providing unit 4100 for providing an interactive reference in a virtual reality scene, the interactive reference being used for providing a visual reference for a user in the virtual reality scene;
a moving unit 4200, configured to, in response to an operation of a user moving the interactive reference, move the interactive reference in the virtual reality scene and obtain a moving speed of the interactive reference;
an adjusting unit 4300 for adjusting the moving speed to deviate from the speed range when the moving speed falls within a preset speed range.
Optionally, the adjusting unit 4300 is further configured to:
when the difference value between the moving speed and the lower limit value of the speed range is not greater than a preset down-regulation threshold value, adjusting the moving speed to be smaller than the lower limit value of the speed range;
and when the difference value between the moving speed and the lower limit value of the speed range is larger than a preset down-regulation threshold value, adjusting the moving speed to be larger than the upper limit value of the speed range.
Optionally, the control device 4000 for a virtual reality scene further includes a setting unit 4400, configured to set an upper limit value and a lower limit value of the speed range according to the obtained visual parameter.
The control device of the virtual reality scene provided in this embodiment may be used to execute the technical solutions of the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
It will be appreciated by those skilled in the art that the control device 4000 of the virtual reality scenario may be implemented in various ways. For example, the control device 4000 of the virtual reality scene may be implemented by an instruction configuration processor. For example, instructions may be stored in ROM and read from ROM into a programmable device when the device is started to implement the control device 4000 for a virtual reality scene. For example, the control device 4000 of the virtual reality scene may be solidified into a dedicated device (e.g., ASIC). The control apparatus 4000 of the virtual reality scene may be divided into units independent of each other, or they may be implemented by being combined together. The control apparatus 4000 of the virtual reality scene may be implemented by one of the various implementations described above, or may be implemented by a combination of two or more of the various implementations described above.
< virtual reality device >
Fig. 5 is a schematic block diagram of a virtual reality device provided in an embodiment of the present invention. As shown in fig. 5, the virtual reality apparatus 5000 provided in the present embodiment includes the control apparatus 4000 for a virtual reality scene provided in the above-described embodiment.
The virtual reality device provided in this embodiment may be used to implement the technical solution of the above method embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
It will be appreciated by those skilled in the art that the virtual reality device 5000 may be implemented in various ways. For example, the virtual reality device 5000 may be implemented by configuring a processor with instructions; the instructions may be stored in ROM and, when the device starts, read from ROM into a programmable device. For example, the virtual reality device 5000 may be solidified into a dedicated device (e.g., an ASIC). The virtual reality device 5000 may be divided into mutually independent units, or these units may be combined in a single implementation. The virtual reality device 5000 may be implemented by one of the above implementations, or by a combination of two or more of them.
It is well known to those skilled in the art that, with the development of electronic information technologies such as large-scale integrated circuits and the trend toward implementing functions interchangeably in software or hardware, it has become difficult to draw a clear boundary between the software and the hardware of a computer system: any operation may be implemented in software or in hardware, and any instruction may be executed by hardware or by software. Whether a hardware or software implementation is adopted for a given machine function depends on non-technical factors such as price, speed, reliability, storage capacity, and product life cycle. For a person skilled in electronic information technology, the most direct and clear way to describe an embodiment is therefore to describe the operations within it; knowing the operations to be performed, the skilled person can design the desired product directly on the basis of those non-technical considerations.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), with state information of the computer-readable program instructions, so that the electronic circuit can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementations by hardware, by software, and by a combination of software and hardware are all equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.
Claims (7)
1. A control method for a virtual reality scene, characterized by comprising the following steps:
providing an interactive reference in a virtual reality scene, the interactive reference for providing a visual reference to a user in the virtual reality scene;
in response to an operation of a user moving the interactive reference object, moving the interactive reference object in the virtual reality scene and acquiring a moving speed of the interactive reference object;
adjusting the moving speed to deviate from a preset speed range when the moving speed falls within the speed range;
wherein the method further comprises: setting an upper limit value and a lower limit value of the speed range according to the acquired visual parameters;
the visual parameters comprise a visual focus capturing speed, a binocular focusing speed and a stereo synthesis speed;
the step of setting the upper and lower limits of the speed range comprises:
setting the upper limit value of the speed range as the sum of the visual focus capturing speed, the binocular focusing speed, the stereo synthesis speed and a preset deviation threshold;
setting the lower limit value of the speed range as the sum of the visual focus capturing speed, the binocular focusing speed, and the stereo synthesis speed.
2. The method of claim 1, wherein the step of adjusting the movement speed to deviate from the speed range comprises:
when the difference between the moving speed and the lower limit value of the speed range is not greater than a preset down-regulation threshold, adjusting the moving speed to be smaller than the lower limit value of the speed range;
and when the difference between the moving speed and the lower limit value of the speed range is greater than the preset down-regulation threshold, adjusting the moving speed to be larger than the upper limit value of the speed range.
3. The method of claim 2,
when the difference between the moving speed and the lower limit value of the speed range is not greater than the preset down-regulation threshold, the moving speed is multiplied by a preset down-regulation factor to obtain an updated moving speed, thereby adjusting the moving speed to be smaller than the lower limit value of the speed range;
and when the difference between the moving speed and the lower limit value of the speed range is greater than the preset down-regulation threshold, the moving speed is multiplied by a preset up-regulation factor to obtain an updated moving speed, thereby adjusting the moving speed to be larger than the upper limit value of the speed range.
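The speed-range logic of claims 1 through 3 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names, parameter names, and all numeric values (speeds, thresholds, factors) are hypothetical, chosen only to show the control flow.

```python
def speed_range(focus_capture, binocular_focus, stereo_synthesis, deviation_threshold):
    """Claim 1: the lower limit is the sum of the three visual-parameter speeds;
    the upper limit adds a preset deviation threshold on top of that sum."""
    lower = focus_capture + binocular_focus + stereo_synthesis
    upper = lower + deviation_threshold
    return lower, upper

def adjust_speed(speed, lower, upper, down_threshold, down_factor, up_factor):
    """Claims 2-3: if the moving speed falls within [lower, upper], push it out of
    the vertigo-inducing range by a multiplicative factor; otherwise leave it alone."""
    if not (lower <= speed <= upper):
        return speed  # outside the preset range: no adjustment needed
    if speed - lower <= down_threshold:
        # close to the lower limit: scale down below the range
        return speed * down_factor
    # otherwise: scale up past the upper limit
    return speed * up_factor
```

For example, with hypothetical bounds `speed_range(2.0, 1.5, 1.0, 3.0)` giving the range [4.5, 7.5], a speed of 5.0 (within 1.0 of the lower limit) would be scaled down by a factor such as 0.8, while a speed of 7.0 would be scaled up by a factor such as 2.0, in both cases leaving the range.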
4. The method according to any one of claims 1 to 3,
the interactive reference object is a vehicle for providing the user with an experience of being carried and moved in the virtual reality scene.
5. A control device for a virtual reality scene, comprising:
a providing unit for providing an interactive reference in a virtual reality scene, the interactive reference being for providing a visual reference for a user in the virtual reality scene;
a moving unit for moving the interactive reference object in the virtual reality scene in response to an operation of a user moving the interactive reference object, and acquiring a moving speed of the interactive reference object;
an adjusting unit for adjusting the moving speed to deviate from a preset speed range when the moving speed falls within the speed range;
wherein the control apparatus further includes:
the setting unit is used for setting an upper limit value and a lower limit value of the speed range according to the acquired visual parameters;
the visual parameters comprise a visual focus capturing speed, a binocular focusing speed and a stereo synthesis speed;
wherein the setting unit sets the upper and lower limit values of the speed range by:
setting the upper limit value of the speed range as the sum of the visual focus capturing speed, the binocular focusing speed, the stereo synthesis speed and a preset deviation threshold;
setting the lower limit value of the speed range as the sum of the visual focus capturing speed, the binocular focusing speed, and the stereo synthesis speed.
6. The apparatus of claim 5, wherein the adjustment unit is further configured to:
when the difference between the moving speed and the lower limit value of the speed range is not greater than a preset down-regulation threshold, adjusting the moving speed to be smaller than the lower limit value of the speed range;
and when the difference between the moving speed and the lower limit value of the speed range is greater than the preset down-regulation threshold, adjusting the moving speed to be larger than the upper limit value of the speed range.
7. A virtual reality apparatus, characterized by comprising the control device for a virtual reality scene according to any one of claims 5 and 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710547961.8A CN107479692B (en) | 2017-07-06 | 2017-07-06 | Virtual reality scene control method and device and virtual reality device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107479692A CN107479692A (en) | 2017-12-15 |
CN107479692B (en) | 2020-08-28
Family
ID=60595650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710547961.8A Active CN107479692B (en) | 2017-07-06 | 2017-07-06 | Virtual reality scene control method and device and virtual reality device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107479692B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108096834A (en) * | 2017-12-29 | 2018-06-01 | 深圳奇境森林科技有限公司 | A kind of virtual reality anti-dazzle method |
CN109701272B (en) * | 2018-12-26 | 2023-05-09 | 努比亚技术有限公司 | Game picture processing method and device, mobile terminal and storage medium |
US11314327B2 (en) * | 2020-04-22 | 2022-04-26 | Htc Corporation | Head mounted display and control method thereof |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105976424A (en) * | 2015-12-04 | 2016-09-28 | 乐视致新电子科技(天津)有限公司 | Image rendering processing method and device |
DE102016001313A1 (en) * | 2016-02-05 | 2017-08-10 | Audi Ag | Method for operating a virtual reality system and virtual reality system |
CN105955454A (en) * | 2016-04-15 | 2016-09-21 | 北京小鸟看看科技有限公司 | Anti-vertigo method and device for virtual reality system |
CN106331681A (en) * | 2016-08-22 | 2017-01-11 | 石惠卿 | Image display method of head-mounted display |
CN106354256B (en) * | 2016-08-28 | 2019-05-17 | 杭州勺子网络科技有限公司 | A kind of control method for movement of virtual reality |
- 2017-07-06: CN application CN201710547961.8A granted as patent CN107479692B (legal status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN107479692A (en) | 2017-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11523103B2 (en) | Providing a three-dimensional preview of a three-dimensional reality video | |
US11024083B2 (en) | Server, user terminal device, and control method therefor | |
EP2972559B1 (en) | Methods and apparatus for displaying images on a head mounted display | |
CN106598229B (en) | Virtual reality scene generation method and device and virtual reality system | |
EP3148184B1 (en) | Video telephony system, image display apparatus, driving method of image display apparatus, method for generating realistic image, and non-transitory computer readable recording medium | |
US10681341B2 (en) | Using a sphere to reorient a location of a user in a three-dimensional virtual reality video | |
CN108604175B (en) | Apparatus and associated methods | |
CN114207559A (en) | Computing device and augmented reality integration | |
US20140146178A1 (en) | Image processing apparatus and method using smart glass | |
CA2943449A1 (en) | Non-visual feedback of visual change in a gaze tracking method and device | |
CN107908447B (en) | Application switching method and device and virtual reality device | |
CN107479692B (en) | Virtual reality scene control method and device and virtual reality device | |
US11032535B2 (en) | Generating a three-dimensional preview of a three-dimensional video | |
CN106527938A (en) | Method and device for operating application program | |
US10831266B2 (en) | Personalized adaptation of virtual reality content based on eye strain context | |
CN107408186A (en) | The display of privacy content | |
CN106598247B (en) | Response control method and device based on virtual reality | |
CN110196629A (en) | Virtual reality interface shows control method and device | |
US20180160133A1 (en) | Realtime recording of gestures and/or voice to modify animations | |
AU2015209516A1 (en) | Universal capture | |
CN113473105A (en) | Image synchronization method, image display and processing device and image synchronization system | |
CN110800308B (en) | Methods, systems, and media for presenting a user interface in a wearable device | |
CN106648757B (en) | Data processing method of virtual reality terminal and virtual reality terminal | |
CN116225237B (en) | Interaction control method, device, equipment and storage medium in augmented reality space | |
KR102286518B1 (en) | Control method of screen motion dependiong on controller input and head-mounted display using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||