CN113850921A - VR projection method, device, storage medium and terminal - Google Patents

VR projection method, device, storage medium and terminal

Info

Publication number
CN113850921A
CN113850921A
Authority
CN
China
Prior art keywords
mobile terminal
desktop interface
coordinate
screen
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111145271.2A
Other languages
Chinese (zh)
Inventor
杨梓龙
Current Assignee
Huizhou TCL Mobile Communication Co Ltd
Original Assignee
Huizhou TCL Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Huizhou TCL Mobile Communication Co Ltd filed Critical Huizhou TCL Mobile Communication Co Ltd
Priority to CN202111145271.2A priority Critical patent/CN113850921A/en
Publication of CN113850921A publication Critical patent/CN113850921A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

An embodiment of the present application provides a VR projection method, a VR projection apparatus, a storage medium, and a mobile terminal. The VR projection method includes: hiding a desktop interface of the mobile terminal when it is detected that the VR mode of the mobile terminal is turned on; recording the screen of the desktop interface to obtain a target video, where the desktop interface includes the content displayed on it and the sub-content generated by the user operating that content; and displaying the target video in VR on the screen of the mobile terminal to generate a VR picture. With this embodiment, all content on the mobile terminal can be projected into VR glasses, so that the user can experience resource content from the entire network rather than being limited to the content of particular software.

Description

VR projection method, device, storage medium and terminal
Technical Field
The present application relates to the field of VR technologies, and in particular, to a VR projection method, apparatus, storage medium, and mobile terminal.
Background
At present, VR resources are limited: a user who wants to watch VR videos is confined to the corresponding apps. This makes for a poor user experience, and the lack of abundant resources reduces both how often users use VR glasses and their interest in them.
Therefore, the prior art has defects and needs to be improved and developed.
Disclosure of Invention
The embodiments of the present application provide a VR projection method, a VR projection apparatus, a storage medium, and a mobile terminal that can project all content on the mobile terminal into VR glasses, so that a user can experience resource content from the entire network rather than being limited to the content of particular software.
An embodiment of the present application provides a VR projection method, including:
when the VR mode of the mobile terminal is detected to be started, hiding a desktop interface of the mobile terminal;
recording a screen of the desktop interface to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by operating the content by a user;
and performing VR display on the target video on a screen of the mobile terminal to generate a VR picture.
In the VR projection method according to the embodiment of the application, after the screen recording is performed on the desktop interface to obtain the target video, the VR projection method further includes:
acquiring a first operation of a user on the VR picture;
converting the first operation into a second operation for operating the desktop interface;
and the mobile terminal performs corresponding processing according to the second operation and updates the VR picture according to a processing result.
In an embodiment of the VR projection method, the converting the first operation into a second operation for operating the desktop interface includes:
obtaining the size ratio of the VR picture to the desktop interface before hiding;
establishing a corresponding relation between the coordinates on the VR picture and the coordinates on the desktop interface before hiding based on the size ratio;
and converting the first operation into a second operation for operating the desktop interface based on the corresponding relation of the coordinates.
In the VR projection method according to this embodiment of the present application, the first operation is a first click operation and the second operation is a second click operation; obtaining the first operation of the user on the VR screen and converting it into the second operation for operating the desktop interface includes:
acquiring a first coordinate of a click position corresponding to the first click operation;
and acquiring a second coordinate corresponding to the first coordinate based on the corresponding relation of the coordinates, wherein the second coordinate is a coordinate corresponding to a click position corresponding to the second click operation.
In the VR projection method according to this embodiment of the present application, the first operation is a first sliding operation and the second operation is a second sliding operation; obtaining the first operation of the user on the VR screen and converting it into the second operation for operating the desktop interface includes:
acquiring a first initial coordinate of the first sliding operation, wherein the first initial coordinate is a coordinate corresponding to an initial sliding point of the first sliding operation;
acquiring a second initial coordinate corresponding to the first initial coordinate based on the corresponding relation of the coordinates, wherein the second initial coordinate is a coordinate corresponding to an initial sliding point of the second sliding operation;
acquiring a first termination coordinate of the first sliding operation, wherein the first termination coordinate is a coordinate corresponding to a termination sliding point of the first sliding operation;
acquiring a second termination coordinate corresponding to the first termination coordinate based on the corresponding relation of the coordinates, wherein the second termination coordinate is a coordinate corresponding to a termination sliding point of the second sliding operation;
and obtaining the second sliding operation based on the second starting coordinate and the second ending coordinate.
In the VR projection method according to the embodiment of the present application, hiding a desktop interface of the mobile terminal when it is detected that the mobile terminal starts the VR mode includes:
and when the VR switch of the mobile terminal is detected to be turned on, controlling the mobile terminal to start the VR mode and hiding the desktop interface of the mobile terminal.
In the VR projection method according to the embodiment of the application, the content displayed on the desktop interface is software, the sub-content generated by the user operating the software is a video on the software, and the recording of the desktop interface to obtain a target video includes:
and recording a screen of a screen for playing the video on the software to obtain a target video.
Embodiments of the present application further provide a VR projection apparatus, the apparatus includes:
the hidden module is used for hiding the desktop interface of the mobile terminal when the VR mode of the mobile terminal is detected to be started;
the screen recording module is used for recording a screen of the desktop interface to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by the operation of a user on the content;
and the display module is used for performing VR display on the target video on the screen of the mobile terminal so as to generate a VR picture.
An embodiment of the present application further provides a computer storage medium, where a computer program is stored in the computer storage medium, and when the computer program runs on a computer, the computer is caused to execute the VR projection method according to any embodiment.
The embodiment of the present application further provides a mobile terminal, where the mobile terminal includes a processor and a memory, where the memory stores a computer program, and the processor is configured to execute the VR projection method according to any embodiment by calling the computer program stored in the memory.
According to the embodiments of the present application, when it is detected that the VR mode of the mobile terminal is turned on, the desktop interface of the mobile terminal is hidden; the screen of the desktop interface is recorded to obtain a target video, where the desktop interface includes the content displayed on it and the sub-content generated by the user operating that content; and the target video is displayed in VR on the screen of the mobile terminal to generate a VR picture. With these embodiments, the user does not need to enter a video APP, open a VR video, and only then open the VR mode corresponding to that video; instead, the VR mode can be opened directly on the desktop interface of the mobile terminal. Once the VR mode is opened, the desktop interface is hidden so that it is no longer displayed on the screen, the hidden desktop interface is screen-recorded, and the recorded video is displayed in VR on the screen. In this way, all content on the mobile terminal's interface can be projected into VR glasses, and the user can experience resource content from the entire network rather than being limited to the content of particular software.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic flowchart of a VR projection method according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a VR projection apparatus according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of another VR projection apparatus according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without inventive step, are within the scope of the present application.
The embodiment of the application provides a VR projection method which can be applied to a mobile terminal. The mobile terminal can be a smart phone, a tablet computer and other devices.
Referring to fig. 1, fig. 1 is a schematic flow chart of a VR projection method according to an embodiment of the present disclosure. The VR projection method is applied to a mobile terminal, and the method can comprise the following steps:
step 101, when the VR mode of the mobile terminal is detected to be started, hiding a desktop interface of the mobile terminal.
In this embodiment, the VR mode is opened by directly tapping the VR switch on the desktop interface of the mobile terminal, rather than by entering a particular video APP, opening a VR video, and then opening the VR mode corresponding to that video.
The desktop interface may be hidden by, for example, shrinking it on the screen of the mobile terminal until it is invisible to the human eye.
In some embodiments, hiding the desktop interface of the mobile terminal when it is detected that the VR mode is turned on by the mobile terminal includes:
and when the VR switch of the mobile terminal is detected to be turned on, controlling the mobile terminal to start the VR mode and hiding the desktop interface of the mobile terminal.
The VR switch may be placed on the desktop interface of the mobile terminal, or in a pull-down panel like that of a mobile phone: the user pulls the interface down to reveal a switch labeled "VR mode". Tapping the VR switch turns the VR mode on, and tapping it again turns the VR mode off.
After the VR mode is turned on, the user can watch the VR video displayed on the screen of the mobile terminal by wearing VR glasses.
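The switch behavior described above can be sketched as follows; the class and method names are illustrative assumptions, not identifiers from the patent:

```python
class VRModeController:
    """Minimal sketch of the VR switch described above: tapping the
    switch toggles the VR mode, and the desktop interface is hidden
    exactly while the VR mode is on."""

    def __init__(self):
        self.vr_mode_on = False
        self.desktop_hidden = False

    def tap_vr_switch(self):
        # The first tap turns the VR mode on and hides the desktop;
        # a second tap turns it off and restores the desktop.
        self.vr_mode_on = not self.vr_mode_on
        self.desktop_hidden = self.vr_mode_on
        return self.vr_mode_on
```

Coupling the hidden state to the mode flag keeps the desktop interface hidden only for as long as the VR mode stays on, matching the toggle behavior described for the switch.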
And 102, recording a screen of the desktop interface to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by operating the content by a user.
The content displayed on the desktop interface may include all the software on the desktop (e.g., Mango TV, iQiyi, and Youku), the various function switches set on the desktop (e.g., the VR switch, the airplane-mode switch, and the shake-mode switch), and so on.
As an example of sub-content generated by the user operating the content: if the content displayed on the desktop interface is the iQiyi APP, the sub-content generated by the user operating it may be the content on the iQiyi APP, including everything displayed on its interface, such as the interface content itself or videos of TV series, movies, variety shows, and the like.
In some embodiments, the content displayed on the desktop interface is software, the sub-content generated by the user operating the software is a video on the software, and the recording the desktop interface to obtain the target video includes:
and recording a screen of a screen for playing the video on the software to obtain a target video.
For example, if the content displayed on the desktop interface is the iQiyi APP, the sub-content generated by the user operating it is a video on the iQiyi APP, and the screen playing that video is recorded to obtain the target video.
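The recording step above can be sketched abstractly; `capture_frame` and `record_screen` are hypothetical names, and a list of frames stands in for the encoded target video:

```python
def record_screen(capture_frame, n_frames):
    """Sketch of the screen-recording step: capture_frame() is assumed
    to return one frame of the (hidden) interface playing the video;
    the returned list of frames stands in for the recorded target
    video."""
    return [capture_frame() for _ in range(n_frames)]
```

In a real mobile implementation this loop would be replaced by the platform's screen-capture facility encoding frames into a video stream.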
In some embodiments, after the screen of the desktop interface is recorded to obtain the target video, the method further includes:
acquiring a first operation of a user on the VR picture;
converting the first operation into a second operation for operating the desktop interface;
and the mobile terminal performs corresponding processing according to the second operation and updates the VR picture according to a processing result.
The user indirectly operates the hidden desktop interface by operating on the VR screen (for example, opening the iQiyi APP on the desktop interface and then selecting a VR movie on it); that is, a first operation of the user on the VR screen is converted into a second operation that operates the desktop interface.
For example, suppose a user starts watching a video on the iQiyi APP and later wants to switch to Mango TV. The user can exit the iQiyi APP on the VR screen and select Mango TV, thereby indirectly operating the hidden desktop interface, that is, exiting the iQiyi APP and selecting Mango TV on the desktop interface as well. The mobile terminal responds to these exit and select operations and updates the VR picture accordingly.
In some embodiments, the translating the first operation into a second operation to operate the desktop interface includes:
obtaining the size ratio of the VR picture to the desktop interface before hiding;
establishing a corresponding relation between the coordinates on the VR picture and the coordinates on the desktop interface before hiding based on the size ratio;
and converting the first operation into a second operation for operating the desktop interface based on the corresponding relation of the coordinates.
By obtaining the size ratio of the VR picture to the pre-hiding desktop interface, a correspondence between coordinates on the VR picture and coordinates on the desktop interface is established in advance, so that a first operation of the user on the VR picture can be converted into a second operation on the desktop interface according to this coordinate correspondence.
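The size-ratio correspondence can be sketched as a simple linear scaling; the function names and the assumption that both surfaces share an origin at the same corner are illustrative, not stated in the patent:

```python
def coordinate_map(vr_size, desktop_size):
    """Build the coordinate correspondence described above from the
    size ratio between the VR picture (width, height) and the
    pre-hiding desktop interface, returning a function that maps a
    point on the VR picture to the matching desktop point."""
    rx = desktop_size[0] / vr_size[0]
    ry = desktop_size[1] / vr_size[1]

    def to_desktop(point):
        x, y = point
        return (x * rx, y * ry)

    return to_desktop
```

For instance, with a 1920x1080 VR picture and a 960x540 desktop interface, the VR point (100, 50) maps to the desktop point (50, 25).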
In some embodiments, the first operation is a first click operation, the second operation is a second click operation, the obtaining a first operation of the user on the VR screen, and the converting the first operation into a second operation of operating the desktop interface includes:
acquiring a first coordinate of a click position corresponding to the first click operation;
and acquiring a second coordinate corresponding to the first coordinate based on the corresponding relation of the coordinates, wherein the second coordinate is a coordinate corresponding to a click position corresponding to the second click operation.
For example, the user clicks a position A on the VR screen whose coordinates (the first coordinates) are (a, b). Since the correspondence between coordinates on the VR screen and coordinates on the pre-hiding desktop interface was established in advance, the corresponding second coordinates on the desktop interface can be obtained from (a, b). If those coordinates are (c, d), then (c, d) gives the position to be clicked on the desktop interface.
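The (a, b) to (c, d) conversion in this example can be written out directly; `convert_click` is a hypothetical name, and the mapping assumes the same size-ratio correspondence described above:

```python
def convert_click(first_coordinate, vr_size, desktop_size):
    """Convert the first click operation at (a, b) on the VR picture
    into the second click operation at (c, d) on the desktop
    interface, using the size-ratio coordinate correspondence."""
    a, b = first_coordinate
    c = a * desktop_size[0] / vr_size[0]
    d = b * desktop_size[1] / vr_size[1]
    return (c, d)
```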
In some embodiments, the first operation is a first sliding operation and the second operation is a second sliding operation; obtaining the first operation of the user on the VR screen and converting it into the second operation of operating the desktop interface includes:
acquiring a first initial coordinate of the first sliding operation, wherein the first initial coordinate is a coordinate corresponding to an initial sliding point of the first sliding operation;
acquiring a second initial coordinate corresponding to the first initial coordinate based on the corresponding relation of the coordinates, wherein the second initial coordinate is a coordinate corresponding to an initial sliding point of the second sliding operation;
acquiring a first termination coordinate of the first sliding operation, wherein the first termination coordinate is a coordinate corresponding to a termination sliding point of the first sliding operation;
acquiring a second termination coordinate corresponding to the first termination coordinate based on the corresponding relation of the coordinates, wherein the second termination coordinate is a coordinate corresponding to a termination sliding point of the second sliding operation;
and obtaining the second sliding operation based on the second starting coordinate and the second ending coordinate.
Since the user performs a sliding operation on the VR screen, it has a sliding start point and a sliding end point. Based on the preset coordinate correspondence, the start point of the first sliding operation corresponds to the start point of the second sliding operation, and the end point of the first corresponds to the end point of the second. Thus the first start coordinate (the start-point coordinate of the first sliding operation) maps to the second start coordinate, and the first end coordinate maps to the second end coordinate, so the second operation on the desktop interface can be obtained simply by acquiring the user's first sliding operation on the VR screen.
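The slide conversion maps both endpoints through the same correspondence; `convert_slide` is an illustrative name, and the endpoint-only representation of a slide is an assumption consistent with the steps above:

```python
def convert_slide(first_start, first_end, vr_size, desktop_size):
    """Convert the first sliding operation on the VR picture into the
    second sliding operation on the desktop interface by mapping its
    start and end points through the size-ratio correspondence."""
    rx = desktop_size[0] / vr_size[0]
    ry = desktop_size[1] / vr_size[1]

    def to_desktop(point):
        return (point[0] * rx, point[1] * ry)

    second_start = to_desktop(first_start)  # second start coordinate
    second_end = to_desktop(first_end)      # second end coordinate
    return second_start, second_end
```

A gesture injector on the desktop side could then replay a swipe from the second start coordinate to the second end coordinate.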
And 103, performing VR display on the target video on a screen of the mobile terminal to generate a VR picture.
After the target video is obtained, it is displayed in VR on the screen of the mobile terminal, so that a user wearing VR glasses can watch the VR video.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
In particular implementation, the present application is not limited by the execution sequence of the described steps, and some steps may be performed in other sequences or simultaneously without conflict.
As can be seen from the above, in the VR projection method provided in the embodiments of the present application, when it is detected that the VR mode of the mobile terminal is turned on, the desktop interface of the mobile terminal is hidden; the screen of the desktop interface is recorded to obtain a target video, where the desktop interface includes the content displayed on it and the sub-content generated by the user operating that content; and the target video is displayed in VR on the screen of the mobile terminal to generate a VR picture. The user therefore does not need to enter a video APP, open a VR video, and only then open the corresponding VR mode; the VR mode can be opened directly on the desktop interface. Once it is opened, the desktop interface is hidden and no longer displayed on the screen, the hidden desktop interface is screen-recorded, and the recorded video is displayed in VR on the screen. All content on the mobile terminal's interface can thus be projected into VR glasses, and the user can experience resource content from the entire network rather than being limited to the content of particular software.
The embodiment of the application also provides a VR projection device, which can be integrated in a mobile terminal. The mobile terminal can be a smart phone, a tablet computer and other devices.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a VR projection apparatus according to an embodiment of the present application. The VR projection apparatus 30 may include:
the hiding module 31 is configured to hide a desktop interface of the mobile terminal when it is detected that the VR mode of the mobile terminal is started;
the screen recording module 32 is configured to record a screen of the desktop interface to obtain a target video, where the desktop interface includes content displayed on the desktop interface and sub-content generated by a user operating the content;
and the display module 33 is configured to perform VR display on the target video on the screen of the mobile terminal to generate a VR picture.
In some embodiments, the hiding module 31 is configured to, when it is detected that a VR switch of the mobile terminal is turned on, control the mobile terminal to turn on a VR mode and hide a desktop interface of the mobile terminal.
In some embodiments, the screen recording module 32 is configured to record a screen of a screen on which a video on the software is played to obtain a target video when the content displayed on the desktop interface is the software and the sub-content generated by the user operating the software is the video on the software.
In specific implementation, the modules may be implemented as independent entities, or may be combined arbitrarily and implemented as one or several entities.
As can be seen from the above, in the VR projection apparatus 30 provided in this embodiment of the application, when it is detected that the VR mode of the mobile terminal is started, the desktop interface of the mobile terminal is hidden by the hiding module 31; recording the screen of the desktop interface through a screen recording module 32 to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by operating the content by a user; and performing VR display on the target video on a screen of the mobile terminal through a display module 33 to generate a VR picture. According to the embodiment of the application, all contents on the interface of the mobile terminal can be projected into the VR glasses, so that a user can experience resource contents of the whole network, and the user is not limited to the contents on certain software.
Referring to fig. 3, fig. 3 is another schematic structural diagram of a VR projection apparatus according to an embodiment of the present disclosure, in which the VR projection apparatus 30 includes a memory 120, one or more processors 180, and one or more applications, where the one or more applications are stored in the memory 120 and configured to be executed by the processor 180; the processor 180 may include the concealment module 31, the screen recording module 32, and the display module 33. For example, the structures and connection relationships of the above components may be as follows:
the memory 120 may be used to store applications and data. The memory 120 stores applications containing executable code. The application programs may constitute various functional modules. The processor 180 executes various functional applications and data processing by running the application programs stored in the memory 120. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 120 may also include a memory controller to provide the processor 180 with access to the memory 120.
The processor 180 is a control center of the device, connects various parts of the entire terminal using various interfaces and lines, performs various functions of the device and processes data by running or executing an application program stored in the memory 120 and calling data stored in the memory 120, thereby monitoring the entire device. Optionally, processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor and a modem processor, wherein the application processor mainly processes an operating system, a user interface, an application program, and the like.
Specifically, in this embodiment, the processor 180 loads the executable code corresponding to the process of one or more application programs into the memory 120 according to the following instructions, and the processor 180 runs the application programs stored in the memory 120, thereby implementing various functions:
the hiding module 31 is configured to hide a desktop interface of the mobile terminal when it is detected that the VR mode of the mobile terminal is started;
the screen recording module 32 is configured to record a screen of the desktop interface to obtain a target video, where the desktop interface includes content displayed on the desktop interface and sub-content generated by a user operating the content;
and the display module 33 is configured to perform VR display on the target video on the screen of the mobile terminal to generate a VR picture.
In some embodiments, the hiding module 31 is configured to, when it is detected that a VR switch of the mobile terminal is turned on, control the mobile terminal to turn on a VR mode and hide a desktop interface of the mobile terminal.
In some embodiments, the screen recording module 32 is configured to record a screen of a screen on which a video on the software is played to obtain a target video when the content displayed on the desktop interface is the software and the sub-content generated by the user operating the software is the video on the software.
The embodiment of the present application also provides a mobile terminal. Referring to fig. 4, fig. 4 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure; the mobile terminal may be used to implement the VR projection method provided in the foregoing embodiments. The mobile terminal 1200 may be, for example, a smart phone or a tablet computer.
As shown in fig. 4, the mobile terminal 1200 may include an RF (Radio Frequency) circuit 110, a memory 120 including one or more computer-readable storage media (only one shown), an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a transmission module 170, a processor 180 including one or more processing cores (only one shown), and a power supply 190. Those skilled in the art will appreciate that the configuration illustrated in fig. 4 does not limit the mobile terminal 1200, which may include more or fewer components than those illustrated, combine some components, or use a different arrangement of components. Wherein:
the RF circuit 110 is used for receiving and transmitting electromagnetic waves, and performs interconversion between the electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF circuitry 110 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF circuitry 110 may communicate with various networks such as the internet, an intranet, a wireless network, or with other devices over a wireless network.
The memory 120 may be configured to store software programs and modules, such as the program instructions/modules corresponding to the VR projection method in the foregoing embodiments. By running the software programs and modules stored in the memory 120, the processor 180 executes various functional applications and data processing, so that all content on the interface of the mobile terminal can be projected into the VR glasses and the user can experience resource content of the whole network rather than content limited to certain software. The memory 120 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 120 may further include memory located remotely from the processor 180, which may be connected to the mobile terminal 1200 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 130 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In particular, the input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near it (e.g., operations performed with a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a predetermined program. Optionally, the touch-sensitive surface 131 may comprise two parts: a touch detection device and a touch controller. The touch detection device detects the position and direction of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 180, and can also receive and execute commands sent by the processor 180. The touch-sensitive surface 131 may be implemented using resistive, capacitive, infrared, or surface acoustic wave technologies. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to the user and various graphic user interfaces of the mobile terminal 1200, which may be configured by graphics, text, icons, video, and any combination thereof. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141, and when a touch operation is detected on or near the touch-sensitive surface 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 4, touch-sensitive surface 131 and display panel 141 are shown as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 131 may be integrated with display panel 141 to implement input and output functions.
The mobile terminal 1200 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor, which may adjust the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor, which may turn off the display panel 141 and/or the backlight when the mobile terminal 1200 is moved to the ear. As one type of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when the mobile phone is stationary; it can be used for applications that recognize the posture of the mobile phone (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors, such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, may also be configured in the mobile terminal 1200; detailed descriptions thereof are omitted here.
The audio circuit 160, speaker 161, and microphone 162 may provide an audio interface between the user and the mobile terminal 1200. The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output. Conversely, the microphone 162 converts collected sound signals into electrical signals, which the audio circuit 160 receives and converts into audio data; the audio data is then output to the processor 180 for processing and either sent via the RF circuit 110 to, for example, another terminal, or output to the memory 120 for further processing. The audio circuit 160 may also include an earbud jack to allow peripheral headphones to communicate with the mobile terminal 1200.
Through the transmission module 170 (e.g., a Wi-Fi module), the mobile terminal 1200 can assist the user in sending and receiving e-mail, browsing web pages, and accessing streaming media, providing the user with wireless broadband internet access. Although fig. 4 shows the transmission module 170, it is understood that it is not an essential component of the mobile terminal 1200 and may be omitted as needed without changing the essence of the invention.
The processor 180 is a control center of the mobile terminal 1200, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the mobile terminal 1200 and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby integrally monitoring the mobile phone. Optionally, processor 180 may include one or more processing cores; in some embodiments, the processor 180 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The mobile terminal 1200 also includes a power supply 190 (e.g., a battery) that powers the various components and, in some embodiments, may be logically coupled to the processor 180 via a power management system that may be used to manage charging, discharging, and power consumption management functions. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the mobile terminal 1200 may further include a camera (e.g., a front camera, a rear camera), a bluetooth module, and the like, which are not described in detail herein. Specifically, in the present embodiment, the display unit 140 of the mobile terminal 1200 is a touch screen display, and the mobile terminal 1200 further includes a memory 120 and one or more programs, wherein the one or more programs are stored in the memory 120, and the one or more programs configured to be executed by the one or more processors 180 include instructions for:
when the VR mode of the mobile terminal is detected to be started, hiding a desktop interface of the mobile terminal;
recording a screen of the desktop interface to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by operating the content by a user;
and performing VR display on the target video on a screen of the mobile terminal to generate a VR picture.
In some embodiments, the processor 180 is configured to obtain a first operation of a user on the VR picture;
converting the first operation into a second operation for operating the desktop interface;
and the mobile terminal performs corresponding processing according to the second operation and updates the VR picture according to a processing result.
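The sequence above — capture a first operation on the VR picture, convert it into a second operation on the hidden desktop interface, process it, and refresh the VR picture from the result — can be sketched as a small pipeline. This is a hypothetical illustration only; the `convert`, `process`, and `update_vr` callables stand in for terminal internals the patent does not specify.

```python
def handle_vr_operation(first_operation, convert, process, update_vr):
    """Convert a first operation on the VR picture into a second
    operation on the desktop interface, let the terminal process it,
    and update the VR picture from the processing result."""
    second_operation = convert(first_operation)
    result = process(second_operation)
    update_vr(result)
    return result
```

For example, `convert` could scale touch coordinates, `process` could dispatch the event to the hidden desktop interface, and `update_vr` could re-render the VR picture from the new screen recording.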
In some embodiments, the processor 180 is configured to obtain a size ratio of the VR screen to the desktop interface before hiding;
establishing a corresponding relation between the coordinates on the VR picture and the coordinates on the desktop interface before hiding based on the size ratio;
and converting the first operation into a second operation for operating the desktop interface based on the corresponding relation of the coordinates.
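Concretely, the coordinate correspondence can be built from nothing more than the two sizes. The sketch below uses illustrative names and assumes simple proportional scaling with a shared origin, which the patent does not spell out; it returns a mapper from VR-picture coordinates to desktop-interface coordinates.

```python
def build_coordinate_mapper(vr_size, desktop_size):
    """Map a point on the VR picture to the corresponding point on the
    desktop interface before hiding, using the size ratio of the two."""
    vr_w, vr_h = vr_size
    desk_w, desk_h = desktop_size
    sx = desk_w / vr_w  # horizontal size ratio
    sy = desk_h / vr_h  # vertical size ratio

    def to_desktop(x, y):
        return (x * sx, y * sy)

    return to_desktop
```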
In some embodiments, when the first operation is a first click operation and the second operation is a second click operation, the processor 180 is configured to:
acquiring a first coordinate of a click position corresponding to the first click operation;
and acquiring a second coordinate corresponding to the first coordinate based on the corresponding relation of the coordinates, wherein the second coordinate is a coordinate corresponding to a click position corresponding to the second click operation.
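For the click case, only a single point needs to be mapped: the first coordinate of the click on the VR picture becomes the second coordinate of the click on the desktop interface. A self-contained sketch, with hypothetical names and the same proportional-scaling assumption as above:

```python
def convert_click(first_coordinate, vr_size, desktop_size):
    """Turn the click position of the first click operation (on the
    VR picture) into the click position of the second click operation
    (on the desktop interface) via the size ratio."""
    x, y = first_coordinate
    return (x * desktop_size[0] / vr_size[0],
            y * desktop_size[1] / vr_size[1])
```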
In some embodiments, the processor 180 is configured to obtain a first start coordinate of the first sliding operation when the first operation is a first sliding operation and the second operation is a second sliding operation, where the first start coordinate is a coordinate corresponding to a start sliding point of the first sliding operation;
acquiring a second initial coordinate corresponding to the first initial coordinate based on the corresponding relation of the coordinates, wherein the second initial coordinate is a coordinate corresponding to an initial sliding point of the second sliding operation;
acquiring a first termination coordinate of the first sliding operation, wherein the first termination coordinate is a coordinate corresponding to a termination sliding point of the first sliding operation;
acquiring a second termination coordinate corresponding to the first termination coordinate based on the corresponding relation of the coordinates, wherein the second termination coordinate is a coordinate corresponding to a termination sliding point of the second sliding operation;
and obtaining the second sliding operation based on the second starting coordinate and the second ending coordinate.
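A sliding gesture is just two such mappings: the start sliding point and the end sliding point are each converted through the coordinate correspondence, and the second sliding operation is rebuilt from the mapped pair. An illustrative sketch under the same proportional-scaling assumption:

```python
def convert_slide(first_start, first_end, vr_size, desktop_size):
    """Derive the second sliding operation (on the desktop interface)
    from the first sliding operation (on the VR picture) by mapping
    its start and end sliding points through the size ratio."""
    sx = desktop_size[0] / vr_size[0]
    sy = desktop_size[1] / vr_size[1]

    def scale(point):
        return (point[0] * sx, point[1] * sy)

    # Second start coordinate and second end coordinate of the gesture.
    return scale(first_start), scale(first_end)
```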
In some embodiments, the processor 180 is configured to control the mobile terminal to turn on the VR mode and hide a desktop interface of the mobile terminal when it is detected that the VR switch of the mobile terminal is turned on.
In some embodiments, the processor 180 is configured to, when the content displayed on the desktop interface is software and the sub-content generated by the user operating the software is a video on the software, record the screen on which the video is played to obtain the target video.
As can be seen from the above, an embodiment of the present application provides a mobile terminal 1200, where the mobile terminal 1200 performs the following steps: when the VR mode of the mobile terminal is detected to be started, hiding a desktop interface of the mobile terminal; recording a screen of the desktop interface to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by operating the content by a user; and performing VR display on the target video on a screen of the mobile terminal to generate a VR picture. In the embodiment of the present application, the user does not need to enter a video APP and open a VR video before the VR mode corresponding to that video can be enabled. Instead, the VR mode can be enabled directly on the desktop interface of the mobile terminal. After the VR mode is enabled, the desktop interface is hidden so that it is no longer displayed on the screen; the hidden desktop interface is then screen-recorded, and the recorded video is displayed in VR on the screen of the mobile terminal. In this way, all content on the interface of the mobile terminal can be projected into the VR glasses, allowing the user to experience resource content from the whole network rather than content limited to certain software.
An embodiment of the present application further provides a storage medium, where a computer program is stored in the storage medium, and when the computer program runs on a computer, the computer executes the VR projection method described in any of the above embodiments.
It should be noted that, as those skilled in the art will understand, all or part of the process of implementing the VR projection method described in this application may be realized by controlling the related hardware through a computer program. The computer program may be stored in a computer-readable storage medium, such as the memory of a mobile terminal, and executed by at least one processor in the mobile terminal; its execution may include the processes of the embodiments of the VR projection method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the VR projection apparatus of the embodiments of the present application, the functional modules may be integrated into one processing chip, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The VR projection method, apparatus, storage medium, and mobile terminal provided in the embodiments of the present application are described in detail above. The principle and the implementation of the present application are explained herein by applying specific examples, and the above description of the embodiments is only used to help understand the method and the core idea of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A VR projection method, comprising:
when the VR mode of the mobile terminal is detected to be started, hiding a desktop interface of the mobile terminal;
recording a screen of the desktop interface to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by operating the content by a user;
and performing VR display on the target video on a screen of the mobile terminal to generate a VR picture.
2. The VR projection method of claim 1, wherein after capturing the desktop interface to obtain the target video, further comprising:
acquiring a first operation of a user on the VR picture;
converting the first operation into a second operation for operating the desktop interface;
and the mobile terminal performs corresponding processing according to the second operation and updates the VR picture according to a processing result.
3. The VR projection method of claim 2, wherein converting the first operation into a second operation for operating the desktop interface comprises:
obtaining the size ratio of the VR picture to the desktop interface before hiding;
establishing a corresponding relation between the coordinates on the VR picture and the coordinates on the desktop interface before hiding based on the size ratio;
and converting the first operation into a second operation for operating the desktop interface based on the corresponding relation of the coordinates.
4. The VR projection method of claim 3, wherein the first operation is a first click operation and the second operation is a second click operation, and the obtaining a first operation of a user on the VR picture and the converting the first operation into a second operation for operating the desktop interface comprise:
acquiring a first coordinate of a click position corresponding to the first click operation;
and acquiring a second coordinate corresponding to the first coordinate based on the corresponding relation of the coordinates, wherein the second coordinate is a coordinate corresponding to a click position corresponding to the second click operation.
5. The VR projection method of claim 3, wherein the first operation is a first sliding operation and the second operation is a second sliding operation, and the obtaining a first operation of a user on the VR picture and the converting the first operation into a second operation for operating the desktop interface comprise:
acquiring a first initial coordinate of the first sliding operation, wherein the first initial coordinate is a coordinate corresponding to an initial sliding point of the first sliding operation;
acquiring a second initial coordinate corresponding to the first initial coordinate based on the corresponding relation of the coordinates, wherein the second initial coordinate is a coordinate corresponding to an initial sliding point of the second sliding operation;
acquiring a first termination coordinate of the first sliding operation, wherein the first termination coordinate is a coordinate corresponding to a termination sliding point of the first sliding operation;
acquiring a second termination coordinate corresponding to the first termination coordinate based on the corresponding relation of the coordinates, wherein the second termination coordinate is a coordinate corresponding to a termination sliding point of the second sliding operation;
and obtaining the second sliding operation based on the second starting coordinate and the second ending coordinate.
6. The VR projection method of claim 1, wherein hiding a desktop interface of a mobile terminal when it is detected that the VR mode is turned on comprises:
and when the VR switch of the mobile terminal is detected to be turned on, controlling the mobile terminal to start the VR mode and hiding the desktop interface of the mobile terminal.
7. The VR projection method of claim 1, wherein the content displayed on the desktop interface is software, the sub-content generated by the user operating the software is a video on the software, and the recording of the desktop interface to obtain the target video comprises:
and recording a screen of a screen for playing the video on the software to obtain a target video.
8. A VR projection apparatus, comprising:
the hidden module is used for hiding the desktop interface of the mobile terminal when the VR mode of the mobile terminal is detected to be started;
the screen recording module is used for recording a screen of the desktop interface to obtain a target video, wherein the desktop interface comprises content displayed on the desktop interface and sub-content generated by the operation of a user on the content;
and the display module is used for performing VR display on the target video on the screen of the mobile terminal so as to generate a VR picture.
9. A computer storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the VR projection method of any of claims 1 to 7.
10. A mobile terminal, characterized in that the mobile terminal comprises a processor and a memory, the memory having stored therein a computer program, the processor being configured to execute the VR projection method of any of claims 1 to 7 by invoking the computer program stored in the memory.
CN202111145271.2A 2021-09-28 2021-09-28 VR projection method, device, storage medium and terminal Pending CN113850921A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111145271.2A CN113850921A (en) 2021-09-28 2021-09-28 VR projection method, device, storage medium and terminal


Publications (1)

Publication Number Publication Date
CN113850921A true CN113850921A (en) 2021-12-28

Family

ID=78980447

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111145271.2A Pending CN113850921A (en) 2021-09-28 2021-09-28 VR projection method, device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN113850921A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106293095A (en) * 2016-08-17 2017-01-04 惠州Tcl移动通信有限公司 A kind of method and system based on intelligent watch operation VR
CN111078108A (en) * 2019-12-13 2020-04-28 惠州Tcl移动通信有限公司 Screen display method and device, storage medium and mobile terminal
CN112527174A (en) * 2019-09-19 2021-03-19 华为技术有限公司 Information processing method and electronic equipment


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
PICO-VR official: "Pico Neo 3 | How to take screenshots and record the screen", pages 2, Retrieved from the Internet <URL:https://www.bilibili.com/read/cv13219683/> *
小熊668: "N ways to watch videos with the Pico Neo2", pages 9, Retrieved from the Internet <URL:https://www.bilibili.com/video/BV1Yp4y1s7SY/?spm_id_from=333.337.search-card.all.click> *
ZHANG Qiushi; WANG Linong; FANG Yaqi; HU Jianxun: "Immersive operation training platform for transmission line operation and maintenance based on gesture interaction", Journal of Electric Power Science and Technology, no. 04, 28 December 2017 (2017-12-28) *
TIAN Honggang; XIAO Tianfei: "Applied research on micro-course resource construction based on vocational college informatization teaching competitions", Caizhi, no. 29, 15 October 2018 (2018-10-15) *
GUO Liyuan: "Application of VR technology in coal mine safety production", Science and Technology Innovation, no. 27, 25 September 2018 (2018-09-25) *

Similar Documents

Publication Publication Date Title
CN108255378B (en) Display control method and mobile terminal
JP7062092B2 (en) Display control method and terminal
CN109407921B (en) Application processing method and terminal device
CN109240577B (en) Screen capturing method and terminal
CN107742072B (en) Face recognition method and mobile terminal
US20220300302A1 (en) Application sharing method and electronic device
CN108334272B (en) Control method and mobile terminal
CN107870674B (en) Program starting method and mobile terminal
CN109710349B (en) Screen capturing method and mobile terminal
CN109451141B (en) Operation control method and related terminal
US11165950B2 (en) Method and apparatus for shooting video, and storage medium
CN111010523B (en) Video recording method and electronic equipment
CN109407948B (en) Interface display method and mobile terminal
CN109407949B (en) Display control method and terminal
CN108124059B (en) Recording method and mobile terminal
CN107948429B (en) Content demonstration method, terminal equipment and computer readable storage medium
CN110096203B (en) Screenshot method and mobile terminal
CN109618218B (en) Video processing method and mobile terminal
CN109144393B (en) Image display method and mobile terminal
CN110866465A (en) Control method of electronic equipment and electronic equipment
CN108073405B (en) Application program unloading method and mobile terminal
CN108132749B (en) Image editing method and mobile terminal
CN108052258B (en) Terminal task processing method, task processing device and mobile terminal
CN111078186A (en) Playing method and electronic equipment
CN109164908B (en) Interface control method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination