WO2024060959A1 - Method and apparatus for adjusting a viewing image in a virtual environment, storage medium, and device
- Publication number
- WO2024060959A1 (PCT/CN2023/116228)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- virtual environment
- virtual
- viewing area
- drag
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Description
- Embodiments of the present disclosure provide a method, device, storage medium, and equipment for adjusting viewing images in a virtual environment.
- In the embodiments of the present disclosure, a three-dimensional-space drag method is designed for 2D video, VR180 video, and VR360 video, allowing users to experience the charm of VR space and different viewing perspectives in the video field, and enhancing the immersive experience of watching movies in virtual reality space.
- embodiments of the present disclosure provide a method for adjusting a viewing image in a virtual environment.
- the method includes: displaying a virtual environment, wherein a ray cursor and a video viewing area are presented in the virtual environment, the ray cursor pointing in the direction of the video viewing area and forming a first included angle with the video viewing area; dragging the video viewing area in response to drag control information and based on the initial position of the ray cursor and the first included angle; and determining, based on the current drag position of the video viewing area in the virtual environment, the video picture displayed in the video viewing area.
- dragging the video viewing area in response to the drag control information and based on the initial position of the ray cursor and the first angle includes:
- taking the initial position of the ray cursor as the origin of a spherical space, fixing the first included angle, and dragging the video viewing area along the spherical surface of the spherical space.
- In some embodiments, the viewing mode of the virtual environment is a two-dimensional video viewing mode, the video viewing area is a virtual screen, and the virtual screen is in a displayed state before the drag control information is responded to.
- the dragging of the video viewing area along the spherical surface of the spherical space includes: dragging the virtual screen along the spherical surface of the spherical space in at least one of the x-axis direction and the y-axis direction.
- the method further includes: if the virtual screen is dragged along the spherical surface of the spherical space in the y-axis direction to the top or bottom of the y-axis direction, controlling the virtual screen to flip 180 degrees around the center of the virtual screen.
- the method further includes: if the virtual screen is dragged along the spherical surface of the spherical space to the virtual ground in the virtual environment and the virtual screen clips through the virtual ground, hiding the virtual ground.
- the method further includes: when the drag ends, controlling the border of the virtual screen to return to normal display.
- determining the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment includes:
- the video picture displayed in the video viewing area is a two-dimensional video played in full screen, wherein the video picture displayed by the virtual screen at the current drag position and the video picture displayed by the virtual screen at the drag starting position are both two-dimensional videos played in full screen.
- the video viewing area is a viewfinder frame of a preset ratio; before the drag control information is responded to, the viewfinder frame is in a hidden state; in response to the drag control information, the viewfinder frame is controlled to be displayed in the virtual environment.
- the method further includes: during dragging, controlling an area of the virtual environment other than the viewfinder frame to display a mask layer.
- the method further includes: in response to a reset field of view control instruction, resetting the position of the viewfinder frame to a default position in the field of view.
- before responding to the reset field of view control instruction, the method further includes: displaying reset field of view prompt information in the viewfinder frame, the reset field of view prompt information being used to prompt the object to input the reset field of view control instruction.
- determining the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment includes:
- the video picture displayed by the viewfinder frame is the video map corresponding to the current drag position in the panoramic video, wherein the video picture displayed by the viewfinder frame at the current drag position is different from the video picture displayed at the drag starting position of the viewfinder frame.
- the method further includes: during the dragging process, hiding the ray cursor and displaying the cursor focus of the ray cursor located on the video viewing area.
- the method further includes: if the drag control information is generated based on the object's manipulation of keys of an interactive device, sending vibration prompt information to the interactive device in response to the drag control information, the vibration prompt information being used to instruct the interactive device to vibrate to indicate that the drag operation has been triggered.
- the method further includes: if the drag control information is generated based on the object's bare-hand gesture, issuing voice prompt information in response to the drag control information, the voice prompt information being used to prompt that the drag operation has been triggered.
- Embodiments of the present disclosure further provide a device for adjusting viewing images in a virtual environment.
- the device includes:
- a display unit, used to display a virtual environment, wherein a ray cursor and a video viewing area are presented in the virtual environment, the ray cursor pointing in the direction of the video viewing area and forming a first included angle with the video viewing area;
- a control unit, configured to drag the video viewing area in response to the drag control information and based on the initial position of the ray cursor and the first included angle;
- a determining unit configured to determine the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment.
- embodiments of the present disclosure provide a computer-readable storage medium that stores a computer program, and the computer program is suitable for loading by a processor to execute the steps described in any of the above embodiments.
- Embodiments of the present disclosure provide a virtual reality device, which includes a processor and a memory; a computer program is stored in the memory, and the processor invokes the computer program stored in the memory to perform the method for adjusting a viewing image in a virtual environment described in any of the above embodiments.
- Embodiments of the present disclosure provide a computer program product including a computer program; when the computer program is executed by a processor, the method for adjusting a viewing image in a virtual environment described in any of the above embodiments is implemented.
- FIG. 1 is a schematic flowchart of a method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 2 is a schematic diagram of a first application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 3 is a schematic diagram of a second application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 4 is a schematic diagram of a third application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 5 is a schematic diagram of a fourth application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 6 is a schematic diagram of a fifth application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 7 is a schematic diagram of a sixth application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 8 is a schematic diagram of a seventh application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 9 is a schematic diagram of an eighth application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 10 is a schematic diagram of a ninth application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 11 is a schematic diagram of a tenth application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 12 is a schematic diagram of an eleventh application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 13 is a schematic diagram of a twelfth application scenario of the method for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 14 is a schematic structural diagram of a device for adjusting a viewing picture in a virtual environment provided by an embodiment of the present disclosure;
- FIG. 15 is a first structural schematic diagram of a virtual reality device provided by an embodiment of the present disclosure;
- FIG. 16 is a second structural schematic diagram of a virtual reality device provided by an embodiment of the present disclosure.
- Embodiments of the present disclosure provide a method, device, computer-readable storage medium, virtual reality device, server, and computer program product for adjusting a viewing image in a virtual environment.
- the viewing picture adjustment method in the virtual environment of the embodiment of the present disclosure can be executed by a virtual reality device or a server.
- the disclosed embodiments can be applied to various application scenarios such as Extended Reality (XR), Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).
- Extended Reality (XR) is a concept encompassing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR): an environment in which the virtual world is connected with the real world, together with the technology that enables users to interact with that environment in real time.
- Virtual Reality (VR) is a technology for creating and experiencing virtual worlds. It computes and generates a virtual environment, which is a simulation that fuses multi-source information (the virtual reality referred to herein includes at least visual perception, and may also include auditory perception, tactile perception, motion perception, and even taste perception and smell perception) with interactive three-dimensional dynamic vision and simulation of entity behavior, immersing the user in the simulated virtual environment. It can be applied in various scenarios such as maps, games, videos, education, medical care, simulation, collaborative training, sales, assisted manufacturing, and maintenance and repair.
- Augmented Reality (AR) is a technology that calculates, in real time during camera image capture, the camera's pose parameters in the real world (or three-dimensional world), and uses those parameters to add virtual elements to the images captured by the camera.
- Virtual elements include, but are not limited to: images, videos, and three-dimensional models.
- the goal of AR technology is to connect the virtual world to the real world on the screen for interaction.
- Mixed Reality (MR): in an MR setting, computer-created sensory input can adapt to changes in the sensory input from the physical setting.
- Some electronic systems used to render MR settings may monitor orientation and/or position relative to the physical setting to enable virtual objects to interact with real objects (i.e., physical elements from the physical setting or representations thereof). For example, a system may monitor motion so that a virtual plant appears stationary relative to a physical building.
- Augmented Virtuality (AV) refers to a computer-created or virtual setting that incorporates at least one sensory input from the physical setting.
- The one or more sensory inputs from the physical setting may be representations of at least one feature of the physical setting.
- For example, a virtual object may take on the color of a physical element captured by one or more imaging sensors.
- As another example, virtual objects may exhibit characteristics consistent with actual weather conditions in the physical setting, as identified via weather-related imaging sensors and/or online weather data.
- In another example, an augmented virtuality forest may have virtual trees and structures, but the animals may have features accurately recreated from images taken of physical animals.
- Virtual field of view: the area in the virtual environment that the user can perceive through the lenses of the virtual reality device, expressed as the field of view (Field Of View, FOV) of the virtual field of view.
- Virtual reality device: a terminal that realizes virtual reality effects; it can usually be provided in the form of glasses, a head-mounted display (HMD), or contact lenses to achieve visual perception and other forms of perception. The form of the virtual reality device is not limited to these, and it can be further miniaturized or enlarged as needed.
- The virtual reality devices described in the embodiments of the present disclosure may include, but are not limited to, the following types:
- PC-based virtual reality (PCVR) devices: an external PC performs the calculations related to virtual reality functions and outputs data, and the PCVR device uses the data output by the PC to achieve virtual reality effects.
- Mobile virtual reality devices: support setting up a mobile terminal (such as a smartphone) in various ways (such as a head-mounted display with a dedicated card slot); through a wired or wireless connection with the mobile terminal, the mobile terminal performs the calculations related to virtual reality functions and outputs data to the mobile virtual reality device, for example viewing virtual reality videos through an APP on the mobile terminal.
- All-in-one virtual reality devices: have a processor for performing the calculations related to virtual reality functions, so they have independent virtual reality input and output functions; they do not need to be connected to a PC or mobile terminal, and offer a high degree of freedom of use.
- Each embodiment of the present disclosure provides a method for adjusting the viewing image in a virtual environment.
- the method can be executed by a terminal or a server, or can be executed jointly by a terminal and a server.
- The embodiments of the present disclosure explain the method for adjusting the viewing image in a virtual environment by taking execution by a terminal (virtual reality device) as an example.
- Figure 1 is a schematic flowchart of a viewing picture adjustment method in a virtual environment provided by an embodiment of the present disclosure.
- Figures 2 to 13 are schematic diagrams of relevant application scenarios provided by embodiments of the present disclosure, wherein the blank background in Figure 2 can be a three-dimensional virtual space layer.
- the method includes:
- Step 110: Display a virtual environment, with a ray cursor and a video viewing area presented in the virtual environment.
- the ray cursor points in the direction of the video viewing area and forms a first included angle with the video viewing area.
- a virtual environment 10 is displayed.
- a ray cursor 11 and a video viewing area 12 are presented in the virtual environment 10 .
- the ray cursor 11 points in the direction of the video viewing area 12 and forms a first included angle θ with the video viewing area 12.
- For example, in order to enrich the presentation of the virtual environment 10, a virtual handle 13 can be presented at the starting position 111 of the ray cursor 11, with the ray cursor 11 emitted directly in front of the virtual handle 13 in the direction of the video viewing area 12; the first included angle θ is formed between the ray cursor 11 and the video viewing area 12.
- the first included angle θ is the included angle formed by the ray cursor 11 and the plane where the video viewing area 12 is located.
- In the 2D video viewing mode, the viewing area is the video panel (hereinafter referred to as the virtual screen); in the VR180 and VR360 modes, the viewing area is the visual focus area, which corresponds to the viewfinder frame during subsequent dragging.
- the ray cursor 11 and the video viewing area 12 presented in the virtual environment 10 can be displayed normally or hidden.
- the viewing mode of the virtual environment 10 is a two-dimensional video viewing mode
- the video viewing area 12 can be a virtual screen.
- For example, before the virtual screen is dragged and during the dragging process, the virtual screen can be in a displayed state, to simulate that the object (the object can be a user) is in a movie viewing scene such as a cinema; the ray cursor 11 can be in a displayed state before the virtual screen is dragged, and can be in a hidden state while the virtual screen is being dragged.
- For example, if a virtual handle 13 is also presented at the starting position 111 of the ray cursor 11, the virtual handle 13 can be in a displayed state before the virtual screen is dragged, and can be hidden while the virtual screen is being dragged.
- the video viewing area 12 can be a viewfinder with a preset ratio (equivalent to the user's visual focus area in the virtual environment).
- For example, before the viewfinder frame is dragged, the viewfinder frame can be hidden to simulate that the user is in a panoramic video scene, with the entire virtual environment displaying the panoramic video; while the viewfinder frame is being dragged, the viewfinder frame can be in a displayed state. The ray cursor 11 can be in a displayed state before the viewfinder frame is dragged, and in a hidden state while the viewfinder frame is being dragged. For example, if the virtual handle 13 is also presented at the starting position 111 of the ray cursor 11, the virtual handle 13 can be in a displayed state before the viewfinder frame is dragged, and in a hidden state while the viewfinder frame is being dragged.
- Step 120 Drag the video viewing area in response to the drag control information and based on the initial position of the ray cursor and the first angle.
- For example, the drag control information can be generated based on the object's manipulation of the keys of an interactive device.
- For example, a virtual environment can be displayed on the display of a virtual reality device; the virtual reality device can be connected to an external interactive device, and the ray cursor in the virtual environment can be driven by operating the buttons of the interactive device, so that in response to the drag control information the ray cursor drives the drag operation on the video viewing area.
- the interactive device may include a handle, a digital glove, a specific interactive device, etc.
- Alternatively, the drag control information can be generated based on the object's bare-hand gesture; in response to this drag control information, the ray cursor is likewise controlled to drive the drag operation on the video viewing area.
- dragging the video viewing area in response to the drag control information and based on the initial position of the ray cursor and the first angle includes:
- taking the initial position of the ray cursor as the origin of the spherical space, fixing the first included angle, and dragging the video viewing area along the spherical surface of the spherical space.
- a spherical space 20 can be provided, and the spherical space 20 serves as a viewing container.
- the movie can be viewed in the viewing container of the spherical space 20 .
- In response to the drag control information, the starting position 111 of the ray cursor 11 is taken as the origin A of the spherical space 20, the first included angle θ between the ray cursor 11 and the plane where the video viewing area 12 is located is fixed, and the video viewing area 12 is dragged along the spherical surface of the spherical space 20.
- a cubic space can also be set up as a viewing container.
- the viewing can be performed in the viewing container of the cubic space.
- the initial position of the ray cursor is taken as the center point of the cube space, the first included angle between the ray cursor and the plane where the video viewing area is located is fixed, and the video viewing area is dragged along the surface of the cube space.
- That is, the cursor focus of the ray cursor is controlled to stay within the draggable video viewing area and move along the x-axis and/or y-axis direction, so that human-computer interaction in the cubic space realizes dragging of the viewing picture.
- In some embodiments, the viewing mode of the virtual environment is a two-dimensional (2D) video viewing mode, the video viewing area is a virtual screen, and the virtual screen is in a displayed state before the drag control information is responded to.
- In this case, dragging the video viewing area along the spherical surface of the spherical space includes: dragging the virtual screen along the spherical surface of the spherical space in at least one of the x-axis direction and the y-axis direction.
- the 2D video viewing mode can refer to the virtual environment 10 shown in Figure 2 or Figure 4.
- the virtual environment 10 presents a ray cursor 11 and a video viewing area 12 (i.e., a virtual screen 121), where the ray cursor 11 points in the direction of the video viewing area 12 and forms a first included angle θ with the video viewing area 12.
- For example, a virtual handle 13 can be presented at the starting position 111 of the ray cursor 11, with the ray cursor 11 emitted directly in front of the virtual handle 13 toward the video viewing area 12 (i.e., the virtual screen 121); the first included angle θ is formed between the ray cursor 11 and the video viewing area 12 (i.e., the virtual screen 121).
- For example, the drag trigger conditions include: the ray cursor 11 points to the video viewing area 12 (i.e., the virtual screen 121), and the cursor focus 112 of the ray cursor 11 is within the video viewing area 12 (i.e., the virtual screen 121), which can be understood as the video hot zone; the object generates drag control information by long-pressing a preset key on the interactive device (such as a handle); and, upon obtaining the drag control information, the virtual reality device detects that the displacement of the ray cursor 11 in the x/y-axis direction is greater than or equal to x dp, whereupon the video viewing area 12 (i.e., the virtual screen 121) is dragged in response to the drag control information.
- Here, x in "x dp" is a dynamic value; its size can be set according to the strength of the user's drag, and different values can be assigned in different scenarios.
- the preset keys may include a Grip key, a Trigger key, an A/X key (with the same function as the Trigger key), etc.
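- As a sketch only (the threshold value and all names here are illustrative assumptions; the text states only that x is dynamic and scenario-dependent), the trigger conditions above might be combined as follows:

```python
DRAG_THRESHOLD_DP = 8.0  # the "x dp" threshold; dynamic and scenario-dependent

def drag_triggered(cursor_in_hot_zone: bool,
                   key_long_pressed: bool,
                   dx_dp: float, dy_dp: float) -> bool:
    """Combine the trigger conditions from the text: the cursor focus lies
    inside the video hot zone, a preset key (Grip / Trigger / A/X) is
    long-pressed, and the ray cursor's displacement on the x or y axis is
    greater than or equal to x dp."""
    moved_enough = abs(dx_dp) >= DRAG_THRESHOLD_DP or abs(dy_dp) >= DRAG_THRESHOLD_DP
    return cursor_in_hot_zone and key_long_pressed and moved_enough
```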
- In some embodiments, an interaction avoidance condition may also be set.
- For example, the interaction avoidance condition includes either of the following: dragging occurs while the "settings panel" is popped up; or dragging occurs after the "settings panel" disappears. One of the two is selected for interaction avoidance.
- The interaction avoidance conditions can also include: when a drag is triggered, "video viewing area dragging" is executed first and "web page interaction" is not executed.
- the drag object is the video viewing area 12 (i.e., the virtual screen 121), and the video picture displayed in the dragged video viewing area 12 (i.e., the virtual screen 121) is a 2D video played in full screen.
- the first included angle θ between the ray cursor 11 and the video viewing area 12 (i.e., the virtual screen 121) is fixed, and the video viewing area 12 (i.e., the virtual screen 121) is dragged by moving the ray cursor 11 along the spherical surface of the spherical space 20 in at least one of the x-axis direction and the y-axis direction; that is, the video viewing area 12 (i.e., the virtual screen 121) moves with full x-y degrees of freedom over the sphere.
- the starting position 111 of the ray cursor 11 is the origin A of the spherical space 20, the ray cursor 11 is the radius of the spherical space 20, and the first included angle θ between the ray cursor 11 and the video viewing area 12 (i.e., the virtual screen 121) determines the position (x-axis and y-axis coordinates) of the spherical moving landing point B. That is, the content seen by the object on the display screen of the virtual reality device is always at the point B where the picture is tangent to the spherical surface of the spherical space 20, and the first included angle θ remains unchanged during dragging.
- the line segment AB in the figure represents the ray cursor 11 emitted from the virtual handle 13 to the video viewing area 12 (i.e., the virtual screen 121).
- during dragging, the RotationZ (lateral roll) value of the video viewing area is always 0, and the center point of the virtual screen 121 moves along the spherical surface of the spherical space 20 in at least one of the x-axis direction and the y-axis direction.
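- To make this spherical-container geometry concrete, the following is a minimal sketch, not code from the patent: the yaw/pitch parameterization, the coordinate convention, and all names are illustrative assumptions. It places the viewing area at the landing point B on the sphere centered at the ray origin A, with the lateral roll pinned to 0:

```python
import math

def drag_on_sphere(origin, radius, yaw, pitch):
    """Landing point B for the current ray direction: origin is the ray
    cursor's starting position A (the sphere's center), radius is the ray
    length, and yaw/pitch are the ray's horizontal/vertical angles in
    radians. Only the ray direction changes during the drag; the first
    included angle between the ray and the screen plane stays fixed."""
    x = origin[0] + radius * math.cos(pitch) * math.sin(yaw)
    y = origin[1] + radius * math.sin(pitch)
    z = origin[2] + radius * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

def screen_pose(yaw, pitch):
    """Keep the screen tangent to the sphere at B and facing the origin,
    with the lateral roll (RotationZ) pinned to 0 so the picture never
    tilts sideways."""
    return {"rot_x_deg": -math.degrees(pitch),
            "rot_y_deg": math.degrees(yaw),
            "rot_z_deg": 0.0}
```

- On each frame of the drag, the controller's new yaw/pitch would be sampled and the screen re-placed at drag_on_sphere(...) with the orientation from screen_pose(...); because the first included angle is fixed, the picture the object sees is always tangent to the sphere at B.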
- In some embodiments, the method further includes: if the virtual screen is dragged along the spherical surface of the spherical space in the y-axis direction to the top or bottom of the y-axis direction, controlling the virtual screen to flip 180 degrees around the center of the virtual screen.
- For example, when the top margin of the virtual screen is dragged to the top of the y-axis direction, or the bottom margin of the virtual screen is dragged to the bottom, the virtual screen keeps responding to the drag and at the same time performs a 180° flip around the center point of the virtual screen.
- In some embodiments, the method further includes: if the virtual screen is dragged along the spherical surface of the spherical space to the virtual ground in the virtual environment and the virtual screen clips through the virtual ground, hiding the virtual ground.
- For example, the disappearance processing condition can be: when the virtual ground 14 is penetrated by the virtual screen 121, the virtual ground 14 is controlled to fade out gradually based on the dragging speed until the virtual ground 14 is completely hidden.
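- One way to read "fade out gradually based on the dragging speed" is a per-frame alpha decrement proportional to the drag speed; below is a sketch under that assumption (the function name and the fade_rate constant are illustrative):

```python
def fade_ground(ground_alpha: float, drag_speed: float, dt: float,
                fade_rate: float = 1.5) -> float:
    """Once the virtual screen clips through the virtual ground, fade the
    ground out at a rate tied to the drag speed; an alpha of 0.0 means the
    ground is completely hidden."""
    return max(0.0, ground_alpha - fade_rate * drag_speed * dt)
```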
- In some embodiments, the method further includes: when the drag ends, controlling the border of the virtual screen to return to normal display.
- For example, during dragging, the border 1211 of the virtual screen 121 is highlighted; when the drag ends, the border of the virtual screen returns to its normal state.
- For example, during dragging, the handle connected to the virtual reality device vibrates; when the drag ends, the handle stops vibrating.
- For example, a reset field of view instruction can be generated by long-pressing the "Home" key; in response to the reset field of view instruction, the field of view is reset and the position of the virtual screen is restored to a default position in the field of view.
- In some embodiments, the video viewing area is a viewfinder frame of a preset ratio; before the drag control information is responded to, the viewfinder frame is in a hidden state.
- In this case, in response to the drag control information, taking the initial position of the ray cursor as the origin of the spherical space, fixing the first included angle, and dragging the video viewing area along the spherical surface of the spherical space includes: in response to the drag control information, controlling the viewfinder frame to be displayed in the virtual environment, and dragging the displayed viewfinder frame along the spherical surface of the spherical space.
- For example, the viewfinder frame of a preset ratio may be a 16:9 viewfinder frame, such as an area of 1280×720.
- For example, the drag trigger conditions include: the ray cursor points to the video viewing area (i.e., the viewfinder frame), and the cursor focus of the ray cursor is within the video viewing area (i.e., the viewfinder frame), which can be understood as the video hot zone, or the visual focus area in the immersive experience; the object generates drag control information by long-pressing a preset key on the interactive device (such as a handle); and, upon obtaining the drag control information, the virtual reality device detects that the displacement of the ray cursor in the x/y-axis direction is greater than or equal to x dp, whereupon the video viewing area (i.e., the viewfinder frame) is dragged in response to the drag control information.
- the preset keys may include a Grip key, a Trigger key, an A/X key (with the same function as the Trigger key), etc.
- For example, the interaction avoidance conditions include any of the following: dragging occurs while the "settings panel" is popped up; dragging occurs after the "settings panel" disappears; dragging occurs while the "immersion bar" is displayed; dragging occurs after the "immersion bar" disappears. One of the above four is selected for interaction avoidance.
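- Since exactly one of the four avoidance options is chosen, a natural encoding is a configuration enum checked before a drag starts; a sketch under that assumption (all names are illustrative):

```python
from enum import Enum

class AvoidanceRule(Enum):
    DRAG_WHEN_SETTINGS_PANEL_SHOWN = 1
    DRAG_WHEN_SETTINGS_PANEL_HIDDEN = 2
    DRAG_WHEN_IMMERSION_BAR_SHOWN = 3
    DRAG_WHEN_IMMERSION_BAR_HIDDEN = 4

def drag_allowed(rule: AvoidanceRule,
                 settings_panel_shown: bool,
                 immersion_bar_shown: bool) -> bool:
    """Return True when the configured rule permits dragging in the
    current UI state."""
    return {
        AvoidanceRule.DRAG_WHEN_SETTINGS_PANEL_SHOWN: settings_panel_shown,
        AvoidanceRule.DRAG_WHEN_SETTINGS_PANEL_HIDDEN: not settings_panel_shown,
        AvoidanceRule.DRAG_WHEN_IMMERSION_BAR_SHOWN: immersion_bar_shown,
        AvoidanceRule.DRAG_WHEN_IMMERSION_BAR_HIDDEN: not immersion_bar_shown,
    }[rule]
```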
- For example, the video viewing area 12 in FIG. 3 may be a viewfinder frame 122.
- the drag object is the video viewing area 12 (i.e., the viewfinder frame 122), and the video picture displayed in the dragged video viewing area 12 (i.e., the viewfinder frame 122) is the video map corresponding to the current drag position in the panoramic video.
- the first included angle θ between the ray cursor 11 and the video viewing area 12 (i.e., the viewfinder frame 122) is fixed, and the video viewing area 12 (i.e., the viewfinder frame 122) is dragged by moving the ray cursor 11.
- the starting position 111 of the ray cursor 11 is the origin A of the spherical space 20, and the ray cursor 11 is the radius of the spherical space 20; with the first included angle θ between the ray cursor 11 and the video viewing area 12 (i.e., the viewfinder frame 122) fixed, the video viewing area 12 (i.e., the viewfinder frame 122) is dragged horizontally along the spherical surface of the spherical space 20.
- In some embodiments, the method further includes: during dragging, controlling the area in the virtual environment 10 other than the viewfinder frame 122 to display a mask layer 17; for example, the mask layer 17 can be a black mask layer, or a mask layer of another color.
- When the drag ends, the viewfinder frame 122 and the mask layer 17 are hidden, and during the immersive experience the entire immersive video picture of the virtual environment 10 is visible.
- For example, during dragging, the handle connected to the virtual reality device vibrates; when the drag ends, the handle stops vibrating.
- For example, a reset field of view instruction may be generated by long-pressing the "Home" key; in response to the reset field of view instruction, the field of view is reset and the position of the viewfinder frame 122 is restored to a default position in the field of view.
- In some embodiments, the method further includes: in response to a reset field of view control instruction, resetting the position of the viewfinder frame.
- Before responding to the reset field of view control instruction, the method further includes: displaying reset field of view prompt information in the viewfinder frame, the reset field of view prompt information being used to prompt the object to input the reset field of view control instruction.
- For example, the reset field of view prompt information can be displayed in the viewfinder frame in the form of a toast message prompt box 1221.
- For example, the content of the reset field of view prompt message is "Long press the Home key on the handle to reset the field of view".
- Guided by the content of the prompt information, the object long-presses the Home key (or a corresponding preset key) to trigger the reset field of view control instruction, so that the virtual reality device performs a field-of-view reset operation in response, such as moving the left border of the 180-degree panoramic video to the left border of the viewfinder frame, or moving the right border of the 180-degree panoramic video to the right border of the viewfinder frame.
- the toast message prompt box 1221 is used to display the reset field of view prompt message in the viewfinder frame 122; it has no control buttons, never gains focus, and disappears automatically after a period of time.
- For example, when the viewfinder frame is dragged beyond the boundary of the video by more than a preset distance and the drag ends, the video boundary bounces back to the viewfinder boundary: the left boundary of the video bounces back to the left boundary of the viewfinder frame, or the right boundary of the video bounces back to the right boundary of the viewfinder frame. If the viewfinder frame is not dragged beyond the boundary of the video, the toast message prompt box does not appear.
- the preset distance value is x dp, where x is a dynamic value; the size of the rebound can be set according to the user's drag strength, and different values can be assigned in different scenarios.
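- A sketch of the rebound check (coordinates are treated as horizontal angles; all names are illustrative assumptions):

```python
def rebound_shift(view_left: float, view_right: float,
                  video_left: float, video_right: float) -> float:
    """Horizontal shift that snaps the video boundary back to the
    viewfinder boundary when the drag ended beyond the video's displayable
    range. A return value of 0.0 means the viewfinder stayed inside the
    video, so there is no rebound and no toast prompt."""
    if video_left > view_left:            # dragged past the video's left edge
        return view_left - video_left     # move the video's left edge back
    if video_right < view_right:          # dragged past the video's right edge
        return view_right - video_right
    return 0.0
```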
- In some embodiments, the method further includes: if the drag control information is generated based on the object's manipulation of keys of an interactive device, sending vibration prompt information to the interactive device in response to the drag control information, the vibration prompt information being used to instruct the interactive device to vibrate to indicate that the drag operation has been triggered.
- For example, taking a handle as the interactive device: when the drag control information is obtained, vibration prompt information is sent to the handle, and the handle produces an instantaneous vibration in response to the vibration prompt information to prompt the object that the drag operation has been triggered.
- For example, the vibration duration is x seconds, say 3 seconds.
- In some embodiments, the method further includes: if the drag control information is generated based on the object's bare-hand gesture, issuing voice prompt information in response to the drag control information, the voice prompt information being used to prompt that the drag operation has been triggered.
- the method further includes: during the dragging process, hiding the ray cursor and displaying the cursor focus of the ray cursor located on the video viewing area.
- the ray cursor 11 can be hidden and only the cursor focus 112 is displayed.
- For example, during dragging, the play bar 15 can also be hidden, and the full-screen video can be displayed in a non-normal (non-standard) state; the play/pause state of the video displayed on the virtual screen 121 remains unchanged.
- the virtual environment is displayed as shown in Figure 7, and the video picture displayed in the video viewing area 12 (ie, the virtual screen 121) is a 2D full-screen video.
- the ray cursor 11 can be hidden, and only the cursor focus 112 is displayed.
- the video play/pause state displayed in the viewfinder 122 remains unchanged.
- the virtual environment is displayed as shown in FIG. 9, and the video picture displayed in the video viewing area 12 (i.e., the viewfinder frame 122) is the video map corresponding to the current drag position.
- Step 130 Determine the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment.
- In some embodiments, determining the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment includes:
- the video picture displayed in the video viewing area is a two-dimensional video played in full screen, wherein the video picture displayed by the virtual screen at the current drag position and the video picture displayed by the virtual screen at the drag starting position are both two-dimensional videos played in full screen.
- For example, when the drag ends, the virtual reality device responds to the drag end command and cancels the drag, and the virtual screen stays at the current drag position.
- the "UI display" of the virtual screen's full-screen mode keeps the relative positions of the "full-screen bar", the "full-screen video", and the "settings panel" unchanged.
- the full-screen video is a two-dimensional video played in full screen; that is, the video pictures displayed in the video viewing area in the 2D video viewing mode are all full-screen videos.
- the play or pause state of the video remains unchanged.
- When the object next wears the virtual reality device, the display shows the default state, which represents the initial state of the display system.
- For example, if the video picture had been dragged so that it is displayed tilted at 45° (the picture after dragging), the next time the user wears the virtual reality device the displayed video picture is a normal 90° picture (the default state).
- the position drag information of the virtual screen may not be recorded, but the zoom information of the ray cursor may be recorded.
- the scaling information includes the scaling size of the ray cursor, through which the virtual screen can be zoomed in or out based on the screen center point.
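- A sketch of how the recorded zoom, but not the drag position, might be re-applied on the next session (the function and parameter names are illustrative assumptions):

```python
def restore_screen(default_width: float, default_height: float,
                   recorded_scale: float) -> tuple:
    """The next time the device is worn, the screen returns to its default
    pose (the drag position was not recorded), but the recorded ray-cursor
    zoom is re-applied about the screen's center point."""
    return (default_width * recorded_scale, default_height * recorded_scale)
```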
- In some embodiments, determining the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment includes:
- the video picture displayed by the viewfinder frame is the video map corresponding to the current drag position in the panoramic video, wherein the video picture displayed by the viewfinder frame at the current drag position is different from the video picture displayed at the drag starting position of the viewfinder frame.
- For example, when the drag ends, a drag end command is generated, so that the virtual reality device responds to the drag end command and cancels the drag.
- the viewfinder frame stays at the current drag position and disappears by way of an alpha change, while the video picture displayed in the viewfinder frame remains at the current drag position for playback.
- the video picture displayed in the viewfinder frame is the video map corresponding to the current drag position in the panoramic video, and the video picture displayed at the current drag position is different from the video picture displayed at the drag starting position of the viewfinder frame.
- Taking the VR180° immersive viewing mode as an example, the video is a 180° panorama, which is equivalent to a hemisphere; when the object wears the virtual reality device, he or she can watch the panoramic video within a 180° range.
- If the video content displayed in the viewfinder frame is dragged out of the 180° displayable range as the viewfinder frame moves, a black mask or an alpha gradient blends with the viewing picture.
- For example, the viewing picture is in color; when the video content is dragged out of the 180° range, there is a transition from the color viewing picture to a pure black scene. This transition state is the alpha change.
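- A sketch of the alpha change toward pure black when the viewfinder frame is dragged past the VR180 hemisphere (the 30° fade band is an assumed constant; the text specifies only that there is a gradient):

```python
def out_of_range_alpha(view_center_deg: float,
                       half_range_deg: float = 90.0,
                       fade_band_deg: float = 30.0) -> float:
    """Blend factor toward pure black as the viewfinder's center passes the
    180° displayable range: 0.0 keeps the full-color picture, 1.0 is pure
    black."""
    overshoot = max(0.0, abs(view_center_deg) - half_range_deg)
    return min(1.0, overshoot / fade_band_deg)
```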
- In some embodiments, the drag position is not recorded; the next time the object wears the device, the display shows the default state.
- Embodiments of the present disclosure display a virtual environment in which a ray cursor and a video viewing area are presented, where the ray cursor points in the direction of the video viewing area and forms a first included angle with the video viewing area; drag the video viewing area in response to drag control information and based on the initial position of the ray cursor and the first included angle; and determine the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment.
- The disclosed embodiments design a three-dimensional-space drag method for 2D video, VR180 video, and VR360 video, allowing users to experience the charm of VR space and different viewing perspectives in the video field, and improving the immersive experience of watching videos in virtual reality space.
- the embodiment of the present disclosure also provides a device for adjusting the viewing image in the virtual environment.
- FIG. 14 is a schematic structural diagram of a viewing image adjustment device in a virtual environment provided by an embodiment of the present disclosure.
- the viewing image adjustment device 200 in the virtual environment may include:
- the display unit 210 is used to display a virtual environment, wherein a ray cursor and a video viewing area are presented in the virtual environment, and the ray cursor points in the direction of the video viewing area and forms a first included angle with the video viewing area;
- the control unit 220 is configured to drag the video viewing area in response to the drag control information and based on the initial position of the ray cursor and the first angle;
- the determining unit 230 is configured to determine the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment.
- control unit 220 is specifically used to:
- take the initial position of the ray cursor as the origin of the spherical space, fix the first included angle, and drag the video viewing area along the spherical surface of the spherical space.
- In some embodiments, the viewing mode of the virtual environment is a two-dimensional video viewing mode, the video viewing area is a virtual screen, and the virtual screen is in a displayed state before the drag control information is responded to.
- When dragging the video viewing area along the spherical surface of the spherical space, the control unit 220 is specifically configured to: drag the virtual screen along the spherical surface of the spherical space in at least one of the x-axis direction and the y-axis direction.
- In some embodiments, the control unit 220 is also configured to: if the virtual screen is dragged along the spherical surface of the spherical space in the y-axis direction to the top or bottom of the y-axis direction, control the virtual screen to flip 180 degrees around the center of the virtual screen.
- In some embodiments, the control unit 220 is also configured to: if the virtual screen is dragged along the spherical surface of the spherical space to the virtual ground in the virtual environment and the virtual screen clips through the virtual ground, hide the virtual ground.
- In some embodiments, the control unit 220 is also configured to: when the drag ends, control the border of the virtual screen to return to normal display.
- the determining unit 230 is specifically used to:
- determine that the video picture displayed in the video viewing area is a two-dimensional video played in full screen, wherein the video picture displayed by the virtual screen at the current drag position and the video picture displayed by the virtual screen at the drag starting position are both two-dimensional videos played in full screen.
- In some embodiments, the video viewing area is a viewfinder frame of a preset ratio; before the drag control information is responded to, the viewfinder frame is in a hidden state;
- the control unit 220 is configured to: in response to the drag control information, control the viewfinder frame to be displayed in the virtual environment, and drag the displayed viewfinder frame along the spherical surface of the spherical space.
- In some embodiments, the control unit 220 is further configured to: during dragging, control the area of the virtual environment other than the viewfinder frame to display a mask layer.
- In some embodiments, the control unit 220 is further configured to: in response to a reset field of view control instruction, reset the position of the viewfinder frame.
- Before responding to the reset field of view control instruction, the control unit 220 is further configured to: display reset field of view prompt information in the viewfinder frame, the reset field of view prompt information being used to prompt the object to input the reset field of view control instruction.
- In some embodiments, the determining unit 230 is specifically used to:
- determine that the video picture displayed by the viewfinder frame is the video map corresponding to the current drag position in the panoramic video, wherein the video picture displayed by the viewfinder frame at the current drag position is different from the video picture displayed at the drag starting position of the viewfinder frame.
- control unit 220 is further configured to hide the ray cursor and display the cursor focus of the ray cursor located on the video viewing area during the dragging process.
- In some embodiments, the control unit 220 is further configured to: if the drag control information is generated based on the object's manipulation of keys of an interactive device, send vibration prompt information to the interactive device in response to the drag control information, the vibration prompt information being used to instruct the interactive device to vibrate to indicate that the drag operation has been triggered.
- In some embodiments, the control unit 220 is also configured to: if the drag control information is generated based on the object's bare-hand gesture, issue voice prompt information when responding to the drag control information, the voice prompt information being used to prompt that the drag operation has been triggered.
- Each unit in the above-mentioned viewing image adjustment device 200 in the virtual environment can be implemented in whole or in part by software, hardware, and combinations thereof.
- Each of the above-mentioned units may be embedded in or independent of the processor in the virtual reality device in the form of hardware, or may be stored in the memory of the virtual reality device in the form of software, so that the processor can call and execute the operations corresponding to the above-mentioned units.
- the viewing image adjustment device 200 in the virtual environment can be integrated in a terminal or server that has a storage device and a processor and has computing capabilities, or the viewing image adjustment device 200 in the virtual environment is the terminal or server.
- The present disclosure also provides a virtual reality device, including a memory and a processor; a computer program is stored in the memory, and when the processor executes the computer program, the steps in the above method embodiments are implemented.
- FIG 15 is a schematic structural diagram of a virtual reality device provided by an embodiment of the present disclosure.
- the virtual reality device 300 can usually be provided in the form of glasses, a helmet-mounted display (HMD), or a contact lens. It is used to realize visual perception and other forms of perception.
- the form of virtual reality equipment is not limited to this, and can be further miniaturized or enlarged as needed.
- the virtual reality device 300 may include but is not limited to the following components:
- Detection module 301: uses various sensors to detect the user's operation commands and applies them to the virtual environment, such as continuously updating the image displayed on the display screen to follow the user's line of sight, thereby achieving the user's interaction with the virtual scene, for example continuously updating the displayed content based on the detected rotation direction of the user's head.
- Feedback module 302: receives data from the sensors and provides real-time feedback to the user. The feedback module 302 may be used to display a graphical user interface, such as displaying the virtual environment on the graphical user interface, and may include a display screen or the like.
- Sensor 303: on the one hand, accepts operation commands from the user and applies them to the virtual environment; on the other hand, provides the results of operations to the user in the form of various feedback.
- Control module 304: controls the sensors and various input/output devices, including obtaining user data (such as actions and voice) and outputting sensing data (such as images, vibrations, temperature, and sound) to act on the user, the virtual environment, and the real world.
- Modeling module 305: constructs a three-dimensional model of the virtual environment, and can also include various feedback mechanisms such as sound and touch in the three-dimensional model.
- a virtual environment can be constructed through the modeling module 305, and the virtual environment can be displayed through the feedback module 302.
- a ray cursor and a video viewing area are presented in the virtual environment, wherein the ray cursor points in the direction of the video viewing area and forms a first included angle with the video viewing area; the control module 304 then drags the video viewing area in response to the drag control information and based on the initial position of the ray cursor and the first included angle; the control module 304 then determines the video picture displayed in the video viewing area based on the current drag position of the video viewing area in the virtual environment.
- the virtual reality device 300 also includes a processor 310 with one or more processing cores, a memory 320 comprising one or more computer-readable storage media, and a computer program stored on the memory 320 and executable on the processor. The processor 310 is electrically connected to the memory 320.
- the structure of the virtual reality device shown in the figures does not constitute a limitation on the virtual reality device; it may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
- the processor 310 is the control center of the virtual reality device 300. It connects the various parts of the entire virtual reality device 300 using various interfaces and lines, and performs the various functions of the virtual reality device 300 and processes data by running or loading the software programs and/or modules stored in the memory 320 and calling the data stored in the memory 320, thereby monitoring the virtual reality device 300 as a whole.
- In the embodiment of the present disclosure, the processor 310 in the virtual reality device 300 loads instructions corresponding to the processes of one or more application programs into the memory 320, and runs the application programs stored in the memory 320 to implement the various functions.
- processor 310 may include detection module 301, control module 304, and modeling module 305.
- the virtual reality device 300 further includes: a radio frequency circuit 306, an audio circuit 307, and a power supply 308.
- the processor 310 is electrically connected to the memory 320, the feedback module 302, the sensor 303, the radio frequency circuit 306, the audio circuit 307, and the power supply 308, respectively.
- the structure of the virtual reality device shown in FIG. 15 or FIG. 16 does not constitute a limitation on the virtual reality device; it may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
- the radio frequency circuit 306 can be used to send and receive radio frequency signals, so as to establish wireless communication with network equipment or other virtual reality devices and to exchange signals with them.
- the audio circuit 307 can be used to provide an audio interface between the user and the virtual reality device through speakers and microphones.
- the audio circuit 307 can convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts collected sound signals into electrical signals, which the audio circuit 307 receives and converts into audio data. After being processed by the processor 310, the audio data is sent, for example, to another virtual reality device via the radio frequency circuit 306, or output to the memory 320 for further processing.
- the audio circuit 307 may also include an earphone jack to enable communication between peripheral headphones and the virtual reality device.
- the power supply 308 is used to power the various components of the virtual reality device 300.
- the virtual reality device 300 may further include a camera, a wireless fidelity module, a Bluetooth module, an input module, etc., which will not be described in detail here.
- the present disclosure also provides a computer-readable storage medium for storing a computer program.
- the computer-readable storage medium can be applied to a virtual reality device or a server, and the computer program causes the virtual reality device or the server to execute the corresponding process in the viewing picture adjustment method in a virtual environment in the embodiments of the present disclosure. For the sake of brevity, details will not be repeated here.
- the present disclosure also provides a computer program product including a computer program stored in a computer-readable storage medium.
- the processor of the virtual reality device reads the computer program from the computer-readable storage medium and executes it, so that the virtual reality device executes the corresponding process in the viewing picture adjustment method in a virtual environment in the embodiments of the present disclosure. For the sake of brevity, details will not be repeated here.
- the present disclosure also provides a computer program, which is stored in a computer-readable storage medium.
- the processor of the virtual reality device reads the computer program from the computer-readable storage medium and executes it, so that the virtual reality device executes the corresponding process in the viewing picture adjustment method in a virtual environment in the embodiments of the present disclosure. For the sake of brevity, no further details will be given here.
- the processor in the embodiment of the present disclosure may be an integrated circuit chip with signal processing capabilities.
- each step of the above method embodiments can be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software.
- the above-mentioned processor may be a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field Programmable Gate Array, FPGA), or another available processor.
- a general-purpose processor may be a microprocessor, or it may be any conventional processor.
- the steps of the method disclosed in conjunction with the embodiments of the present disclosure may be directly embodied as being executed and completed by a hardware decoding processor, or executed and completed by a combination of the hardware and software modules in a decoding processor.
- the software module may be located in a storage medium well established in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers.
- the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
- the memory may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
- the non-volatile memory can be read-only memory (Read-Only Memory, ROM), programmable read-only memory (Programmable ROM, PROM), erasable programmable read-only memory (Erasable PROM, EPROM), or electrically erasable programmable read-only memory (Electrically EPROM, EEPROM).
- Volatile memory may be Random Access Memory (RAM), which is used as an external cache.
- many forms of RAM are available, for example: static random access memory (Static RAM, SRAM), dynamic random access memory (Dynamic RAM, DRAM), synchronous dynamic random access memory (Synchronous DRAM, SDRAM), double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDR SDRAM), enhanced synchronous dynamic random access memory (Enhanced SDRAM, ESDRAM), synchlink dynamic random access memory (Synchlink DRAM, SLDRAM), and direct rambus random access memory (Direct Rambus RAM, DR RAM).
- the disclosed systems, devices and methods can be implemented in other ways.
- the device embodiments described above are only illustrative.
- the division of the units is only a logical function division. In actual implementation, there may be other division methods.
- multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
- the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
- the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
- each functional unit in the embodiment of the present disclosure may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
- if the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
- the technical solution of the present disclosure, in essence or in part, can be embodied in the form of a software product.
- the computer software product is stored in a storage medium and includes a number of instructions that enable a virtual reality device (which may be a personal computer or a server) to execute all or part of the steps of the methods described in the various embodiments of the present disclosure.
- the aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The present disclosure relates to a method and apparatus for adjusting a viewing picture in a virtual environment, as well as a storage medium and a device. The method comprises: displaying a virtual environment, a ray cursor and a video viewing area being presented in the virtual environment, the ray cursor pointing in the direction of the video viewing area and forming a first included angle with the video viewing area; in response to drag control information, dragging the video viewing area based on the initial position of the ray cursor and the first included angle; and determining, based on the current drag position of the video viewing area in the virtual environment, a video picture that is displayed in the video viewing area. In the present disclosure, drag modes for three-dimensional spaces are designed for 2D video, VR180 video, and VR360 video, so that a user can experience the appeal of virtual reality (VR) space and different viewing angles in the field of video, thereby improving the immersive viewing experience in VR space.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211146244.1 | 2022-09-20 | ||
CN202211146244.1A CN117784915A (zh) | 2022-09-20 | 2022-09-20 | 虚拟环境中的观影画面调整方法、装置、存储介质及设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024060959A1 (fr) | 2024-03-28 |
Family
ID=90387539
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2023/116228 WO2024060959A1 (fr) | 2022-09-20 | 2023-08-31 | Procédé et appareil pour ajuster une image de visualisation dans un environnement virtuel, support de stockage et dispositif |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN117784915A (fr) |
WO (1) | WO2024060959A1 (fr) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170221180A1 (en) * | 2016-01-29 | 2017-08-03 | Colopl, Inc. | Method and system for providing a virtual reality space |
CN107037876A (zh) * | 2015-10-26 | 2017-08-11 | Lg电子株式会社 | 系统及控制其的方法 |
CN107045389A (zh) * | 2017-04-14 | 2017-08-15 | 腾讯科技(深圳)有限公司 | 一种实现控制固定被控物的方法及装置 |
CN107396077A (zh) * | 2017-08-23 | 2017-11-24 | 深圳看到科技有限公司 | 虚拟现实全景视频流投影方法和设备 |
CN107977083A (zh) * | 2017-12-20 | 2018-05-01 | 北京小米移动软件有限公司 | 基于vr系统的操作执行方法及装置 |
US20180131920A1 (en) * | 2016-11-08 | 2018-05-10 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
US20200225830A1 (en) * | 2019-01-11 | 2020-07-16 | Microsoft Technology Licensing, Llc | Near interaction mode for far virtual object |
WO2020204594A1 (fr) * | 2019-04-04 | 2020-10-08 | 주식회사 코믹스브이 | Dispositif de réalité virtuelle et son procédé de commande |
CN113286138A (zh) * | 2021-05-17 | 2021-08-20 | 聚好看科技股份有限公司 | 一种全景视频显示方法及显示设备 |
US20220150464A1 (en) * | 2019-03-08 | 2022-05-12 | Sony Group Corporation | Image processing apparatus, image processing method, and image processing program |
- 2022-09-20 CN CN202211146244.1A patent/CN117784915A/zh active Pending
- 2023-08-31 WO PCT/CN2023/116228 patent/WO2024060959A1/fr unknown
Also Published As
Publication number | Publication date |
---|---|
CN117784915A (zh) | 2024-03-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11079999B2 (en) | Display screen front panel of HMD for viewing by users viewing the HMD player | |
US10078917B1 (en) | Augmented reality simulation | |
US12022357B1 (en) | Content presentation and layering across multiple devices | |
WO2018086224A1 (fr) | Procédé et appareil permettant de générer une scène de réalité virtuelle, et système de réalité virtuelle | |
US11806615B2 (en) | Asynchronous virtual reality interactions | |
US10712900B2 (en) | VR comfort zones used to inform an In-VR GUI editor | |
JP2021002288A (ja) | 画像処理装置、コンテンツ処理システム、および画像処理方法 | |
CN111937045B (zh) | 信息处理装置、信息处理方法和记录介质 | |
TW201301892A (zh) | 體積式視訊呈現 | |
US11900520B1 (en) | Specifying effects for entering or exiting a computer-generated reality environment | |
US11694658B2 (en) | Transferring a virtual object | |
US12039659B2 (en) | Method and device for tailoring a synthesized reality experience to a physical setting | |
CN114327700A (zh) | 一种虚拟现实设备及截屏图片播放方法 | |
WO2020206647A1 (fr) | Procédé et appareil pour commander, au moyen du suivi du mouvement d'utilisateur, la lecture d'un contenu vidéo | |
KR20210056414A (ko) | 혼합 현실 환경들에서 오디오-가능 접속된 디바이스들을 제어하기 위한 시스템 | |
WO2024060959A1 (fr) | Procédé et appareil pour ajuster une image de visualisation dans un environnement virtuel, support de stockage et dispositif | |
CN111149356A (zh) | 用于投影沉浸式视听内容的方法 | |
JP7365132B2 (ja) | 情報処理装置、表示方法およびコンピュータプログラム | |
WO2024193568A1 (fr) | Procédé et appareil d'interaction, et dispositif, support et programme | |
JP7418498B2 (ja) | プログラム、情報処理装置、および方法 | |
WO2024193544A1 (fr) | Procédé et appareil d'édition d'objet virtuel, dispositif, support et programme | |
WO2024193569A1 (fr) | Procédé et appareil d'interaction, dispositif, support et programme | |
CN117115237A (zh) | 虚拟现实的位置切换方法、装置、存储介质及设备 | |
CN117671201A (zh) | 信息刷新方法、装置、存储介质及设备 | |
JP2024107935A (ja) | プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23867250; Country of ref document: EP; Kind code of ref document: A1 |