WO2022135195A1 - Method and apparatus for displaying a virtual reality interface, device, and readable storage medium

Info

Publication number
WO2022135195A1
Authority
WO
WIPO (PCT)
Prior art keywords
app
display content
interface
carrier object
terminal device
Application number
PCT/CN2021/137278
Other languages
English (en)
Chinese (zh)
Inventor
吕冯麟
许琪羚
夏沛
李龙华
黄炳洁
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022135195A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/006 - Mixed reality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions

Definitions

  • the present application relates to the field of VR technology, and in particular, to a method, apparatus, device and readable storage medium for displaying a VR interface.
  • In the related art, the software development kit (SDK) of the third-party process is used to draw the content to be displayed; that is, the content to be displayed is drawn through the VR SDK, the drawn result is then post-processed to form a VR image, and the image is sent to the VR device for display. If the system interface needs to display some content at this time, such as a pop-up window, the display content of the pop-up window must be sent to the VR SDK of the current third-party process for drawing.
  • the present application provides a method, apparatus, device and readable storage medium for displaying a VR interface, which can avoid display interruption during application switching.
  • In a first aspect, a method for displaying a VR interface is provided, including: acquiring first display content by calling a first carrier object, where the first display content is the display content of a first application (APP); acquiring second display content by calling a second carrier object, where the second display content is the display content of a second APP; generating a VR interface according to the first display content and the second display content, where the VR interface includes the first display content and the second display content; and displaying the VR interface.
  • The above-mentioned first APP and second APP may be different APPs, and the APPs may include but are not limited to video playback APPs, game programs, system short message programs, camera programs, and the like.
  • The above-mentioned first display content is the display content of the first APP and may include pictures, text, colors, and the like; the above-mentioned second display content is the display content of the second APP and may likewise include pictures, text, colors, and the like. The first display content and the second display content are, for example, the shooting interface of the camera or the text of a short message.
  • A new application may be created in the system of the terminal device, that is, an intermediate layer for VR interface rendering; this new application may be called "VrRenderer". VrRenderer obtains the first display content of the first APP through the first carrier object and obtains the second display content of the second APP through the second carrier object.
  • With this method, the second display content of the second APP does not need to be sent to the process of the first APP to draw the VR interface; instead, the display content of the two APPs is obtained through the two carrier objects respectively, and the VR interface is drawn according to that display content.
  • Because the first display content and the second display content are obtained independently of each other, switching the first APP will not cause the display content of the second APP to be lost. This avoids interruption of the displayed content during application switching, so the display interface remains continuous and stable and the user experience is improved.
  • In addition, since the first APP cannot obtain the second display content of the second APP, the security of the second APP's data is enhanced.
  • In some implementations, acquiring the first display content by calling the first carrier object includes: receiving a first carrier object creation request from the first APP; creating the first carrier object according to the first carrier object creation request, where the first carrier object is used by the first APP to fill in the content to be displayed; receiving a first notification message from the first APP, where the first notification message indicates that the first carrier object has been filled; and calling the first carrier object according to the first notification message to obtain the first display content.
  • the first APP can send a first carrier object creation request to VrRenderer through a specific interface or channel.
  • After VrRenderer receives the first carrier object creation request, it creates an empty first carrier object based on the request and returns the created first carrier object to the first APP, so that the first APP fills the first display content into the first carrier object. Each time the first APP fills the first carrier object, it notifies VrRenderer through the first notification message, and the first display content is acquired, thereby realizing continuous display of the VR interface.
  • the VrRenderer obtains the display content of the first APP through the first carrier object, and the first APP does not need to send the display content to a third-party APP, so the leakage of the display content can be avoided, and data security is improved.
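  • As an illustration only (not part of the claims): on Android, the request/fill/notify flow above could look roughly as follows, where Surface and Canvas are real Android classes and the VrRendererClient proxy with its requestCarrier/notifyFilled methods is a hypothetical stand-in for the Binder channel described later.

    // Hypothetical sketch of the first APP's side of the flow: request a
    // carrier object, fill it frame by frame, and notify VrRenderer.
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.view.Surface;

    interface VrRendererClient {      // assumed cross-process proxy, not the patent's API
        Surface requestCarrier();     // first carrier object creation request
        void notifyFilled();          // first notification message
    }

    public class FirstAppRenderer {
        private final VrRendererClient vrRenderer;
        private Surface carrier;      // the first carrier object

        public FirstAppRenderer(VrRendererClient vrRenderer) {
            this.vrRenderer = vrRenderer;
        }

        public void start() {
            carrier = vrRenderer.requestCarrier();
        }

        // Fill one frame of display content into the carrier object.
        public void drawFrame() {
            Canvas canvas = carrier.lockCanvas(null); // open the carrier for drawing
            try {
                canvas.drawColor(Color.BLACK);
                // ...draw this APP's pictures, text, colors...
            } finally {
                carrier.unlockCanvasAndPost(canvas);  // commit the filled frame
            }
            vrRenderer.notifyFilled();                // tell VrRenderer to read it
        }
    }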
  • In some implementations, creating the first carrier object according to the first carrier object creation request includes: creating a first structure object according to the first carrier object creation request; and creating the first carrier object according to the first structure object.
  • In some implementations, acquiring the second display content by calling the second carrier object includes: receiving a second carrier object creation request from the second APP; creating the second carrier object according to the second carrier object creation request, where the second carrier object is used by the second APP to fill in the content to be displayed; receiving a second notification message from the second APP, where the second notification message indicates that the second carrier object has been filled; and calling the second carrier object according to the second notification message to obtain the second display content.
  • the second APP can send a second carrier object creation request to VrRenderer through a specific interface or channel.
  • After VrRenderer receives the second carrier object creation request, it creates an empty second carrier object based on the request and returns the created second carrier object to the second APP, so that the second APP fills the second display content into the second carrier object. Each time the second APP fills the second carrier object, it notifies VrRenderer through a second notification message, and the second display content is acquired, thereby realizing continuous display of the VR interface.
  • the VrRenderer obtains the display content of the second APP through the second carrier object, and the second APP does not need to send the display content to the third-party APP, so the leakage of the display content can be avoided, and the data security is improved.
  • In some implementations, creating the second carrier object according to the second carrier object creation request includes: creating a second structure object according to the second carrier object creation request; and creating the second carrier object according to the second structure object.
  • In some implementations, the first APP is a VR APP and the second APP is a system APP.
  • For example, the first APP is a VR APP such as a video playback APP, and the second APP is a system APP such as the short message program. Suppose the pop-up window of the system APP is already displayed on the interface of the video playback APP when the user switches the VR application, for example to a game APP. With this method, the pop-up window of the short message can still be displayed on the interface of the game APP, and the content of the short message is not lost because of the switch to the game APP's interface.
  • In other words, the method can realize continuous display of the system APP's interface together with the VR APP's interface, preventing the user from missing information from the system; the user does not need to close the current VR APP's interface and reopen the system APP, which facilitates the user's operation of the system APP.
  • In some implementations, the first carrier object and the second carrier object are texture-map objects.
  • The above-mentioned first carrier object and second carrier object are texture-map objects, such as surface objects. Since a surface object has a fixed resolution, different APPs can fill the content to be displayed at that fixed resolution, which makes calling the display content easy to implement.
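  • A minimal sketch of pinning such a carrier to a fixed resolution, assuming the carrier is built on Android's SurfaceTexture (the 1440x1600 size is an illustrative value, not taken from the application):

    import android.graphics.SurfaceTexture;
    import android.view.Surface;

    public final class FixedSizeCarrier {
        // textureId: the name of an already-created OpenGL texture.
        public static Surface create(int textureId) {
            SurfaceTexture texture = new SurfaceTexture(textureId);
            texture.setDefaultBufferSize(1440, 1600); // fixed fill resolution
            return new Surface(texture);              // APPs fill at this fixed size
        }
    }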
  • In another aspect, an apparatus for displaying a VR interface is provided, including units composed of software and/or hardware, where the units are configured to execute any one of the methods in the technical solutions described in the first aspect.
  • In another aspect, an electronic device is provided, comprising a processor, a memory, and an interface that cooperate with each other, where the memory is used to store a computer program and the processor is used to call and run the computer program from the memory, so that the electronic device executes any one of the methods in the technical solutions described in the first aspect.
  • In another aspect, a computer-readable storage medium is provided, in which a computer program is stored; when the computer program is executed by a processor, the processor is caused to execute any one of the methods in the technical solutions described in the first aspect.
  • In another aspect, a computer program product is provided, including computer program code; when the computer program code is run on a terminal device, the terminal device is caused to execute any one of the methods in the technical solutions described in the first aspect.
  • FIG. 1 is a schematic structural diagram of an example of a terminal device 100 provided by an embodiment of the present application.
  • FIG. 2 is a software structural block diagram of a terminal device 100 provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an example of a VR interface provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an example of a VR interface provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an example of a VR interface provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an example of a VR interface drawn by an SDK of a third-party APP provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an example of generating a VR interface by an intermediate layer provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the software architecture of an example of an intermediate layer provided by an embodiment of the present application.
  • FIG. 9 is a schematic flowchart of an example of a method for acquiring first display content of a first APP provided by an embodiment of the present application.
  • FIG. 10 is a schematic flowchart of an example of a method for acquiring second display content of a second APP provided by an embodiment of the present application.
  • FIG. 11 is a schematic flowchart of an example of a method for displaying a VR interface provided by an embodiment of the present application.
  • FIG. 12 is an example of a sequence diagram for displaying a VR interface provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of an example of an apparatus for displaying a VR interface provided by an embodiment of the present application.
  • The terms "first" and "second" are used only for descriptive purposes and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • The method provided by the embodiments of the present application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, in-vehicle devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of the present application do not impose any restrictions on the specific type of the terminal device.
  • FIG. 1 is a schematic structural diagram of an example of a terminal device 100 provided by an embodiment of the present application.
  • The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • The terminal device 100 may include more or fewer components than shown in the drawings, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the terminal device 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • The memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby increasing system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the terminal device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • The PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface, so as to realize the shooting function of the terminal device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the terminal device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the terminal device 100, and can also be used to transmit data between the terminal device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones. This interface can also be used to connect other terminal devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the terminal device 100 .
  • the terminal device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the terminal device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the terminal device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the structures of the antenna 1 and the antenna 2 in FIG. 1 are only an example.
  • Each antenna in terminal device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the terminal device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied on the terminal device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • The wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the terminal device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the terminal device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the terminal device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the terminal device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the terminal device 100 can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used to process the data fed back by the camera 193 .
  • When shooting, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • An object is projected through the lens to generate an optical image on the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the terminal device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the terminal device 100 may support one or more video codecs.
  • the terminal device 100 can play or record videos in various encoding formats, for example, moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the terminal device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the terminal device 100 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the terminal device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the terminal device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • The terminal device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the terminal device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • When the terminal device 100 answers a call or plays a voice message, the receiver 170B can be placed close to the human ear to hear the voice.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the terminal device 100 may be provided with at least one microphone 170C.
  • the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals.
  • the terminal device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • The earphone interface 170D can be the USB interface 130, or can be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • The capacitive pressure sensor may be composed of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the terminal device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
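  • As a rough illustration of such intensity-dependent dispatch, the sketch below uses Android's real MotionEvent.getPressure() reading; the threshold value and the two handlers are illustrative assumptions, not taken from the application.

    import android.view.MotionEvent;

    public final class PressureDispatcher {
        private static final float FIRST_PRESSURE_THRESHOLD = 0.5f; // illustrative value

        // Same touch position, different instruction depending on intensity.
        public void onTouch(MotionEvent event) {
            if (event.getPressure() < FIRST_PRESSURE_THRESHOLD) {
                viewShortMessage();   // light press: view the short message
            } else {
                createShortMessage(); // firm press: create a new short message
            }
        }

        private void viewShortMessage() { /* ... */ }
        private void createShortMessage() { /* ... */ }
    }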
  • the gyro sensor 180B may be used to determine the motion attitude of the terminal device 100 .
  • The angular velocities of the terminal device 100 about three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the terminal device 100, calculates the distance to be compensated by the lens module according to the angle, and allows the lens to offset the shaking of the terminal device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the terminal device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the terminal device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the terminal device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D. Further, according to the detected opening and closing state of the leather case or the opening and closing state of the flip cover, characteristics such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the terminal device 100 is stationary. It can also be used to identify the posture of terminal devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • The distance sensor 180F is used to measure distance; the terminal device 100 can measure distance through infrared or laser. In some embodiments, when shooting a scene, the terminal device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the terminal device 100 emits infrared light to the outside through the light emitting diode.
  • the terminal device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100 . When insufficient reflected light is detected, the terminal device 100 may determine that there is no object near the terminal device 100 .
  • the terminal device 100 can use the proximity light sensor 180G to detect that the user holds the terminal device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the terminal device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the terminal device 100 uses the temperature detected by the temperature sensor 180J to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the terminal device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the terminal device 100 when the temperature is lower than another threshold, the terminal device 100 heats the battery 142 to avoid abnormal shutdown of the terminal device 100 caused by the low temperature.
  • the terminal device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the terminal device 100 , which is different from the position where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the terminal device 100 may receive key input and generate key signal input related to user settings and function control of the terminal device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the terminal device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the terminal device 100 interacts with the network through the SIM card to realize functions such as calls and data communication.
  • the terminal device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the terminal device 100 and cannot be separated from the terminal device 100 .
  • the software system of the terminal device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the terminal device 100 .
  • FIG. 2 is a block diagram of a software structure of a terminal device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the telephony manager is used to provide the communication function of the terminal device 100 .
  • For example, management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the terminal device vibrates, and the indicator light flashes.
  • the Android runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • The virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • The system library can include multiple functional modules, for example, a surface manager, a media library, a 3D graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • The following takes the terminal device having the structure shown in FIG. 1 and FIG. 2 as an example and describes in detail, with reference to the drawings and application scenarios, the method for displaying a VR interface provided by the embodiments of the present application.
  • a VR device can be connected to a terminal device through a USB cable or wirelessly, wherein the VR device can be a VR helmet, VR glasses and other devices, and the terminal device can be a mobile phone, a tablet and other devices.
  • The terminal device converts the VR application interface into the side-by-side stereo image shown in Figure 3 through the SDK that comes with each VR application, where the left side is the view for the left eye and the right side is the view for the right eye. The picture is sent to the VR device, and the human eye can then see a clear, undistorted picture through the VR device.
  • When a pop-up window or the like appears on the interface of the terminal device, if the image generated by the terminal device is as shown in FIG. 4, the terminal device needs to process the content of the pop-up window into an image as shown in FIG. 5; that is, the content of the pop-up window is generated in both the left-eye view and the right-eye view of the stereo image, and the generated image is processed accordingly. Anti-distortion allows the user to clearly see the content of the pop-up window.
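  • A minimal sketch of producing such a side-by-side left-eye/right-eye frame with OpenGL ES, rendering the interface once per eye into the two halves of the frame; the scene-drawing step is left abstract, and a lens anti-distortion pass would follow before the frame is sent to the VR device.

    import android.opengl.GLES20;

    public final class StereoRenderer {
        // Render the interface once per eye into the left/right halves.
        public void drawStereoFrame(int frameWidth, int frameHeight) {
            int eyeWidth = frameWidth / 2;

            GLES20.glViewport(0, 0, eyeWidth, frameHeight);        // left-eye view
            drawScene(0);

            GLES20.glViewport(eyeWidth, 0, eyeWidth, frameHeight); // right-eye view
            drawScene(1);
        }

        private void drawScene(int eye) {
            // ...draw the interface with the eye-specific view matrix...
        }
    }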
  • In the related art, the content of the pop-up window needs to be sent to the currently used third-party APP, and the SDK of the third-party APP draws its own interface and the system pop-up window interface together. That is, the third-party APP renders the display content of the system interface along with its own display content and performs the underlying display to complete the display of the VR interface.
  • a new application program may be created in the system of the terminal device, that is, an intermediate layer for VR interface rendering.
  • This new application program may be called "VrRenderer", and VrRenderer provides services by creating a VrRendererService service.
  • the VrRenderer can provide Binder objects externally through specific interfaces, such as the onBind interface.
  • The Binder interface provided by VrRenderer is defined in an AIDL file, and the key interface is requestRenderSurface.
  • This interface can provide an interaction channel between VrRenderer and a third-party APP.
  • The third-party APP can send a request to create a carrier object by calling the requestRenderSurface interface.
  • The system APP can also call the corresponding requestRenderSurface interface in the same way to send a request to create a carrier object.
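  • A rough sketch of how such a Binder entry point could be declared and exposed, based on the onBind and requestRenderSurface names above; the AIDL file name and the service skeleton are assumptions, and IVrRenderer.Stub is the class generated by the AIDL compiler.

    // IVrRenderer.aidl (assumed declaration of the interaction channel):
    //   interface IVrRenderer {
    //       Surface requestRenderSurface(); // request creation of a carrier object
    //   }

    import android.app.Service;
    import android.content.Intent;
    import android.os.IBinder;
    import android.view.Surface;

    public class VrRendererService extends Service {
        private final IVrRenderer.Stub binder = new IVrRenderer.Stub() {
            @Override
            public Surface requestRenderSurface() {
                // Create and return an empty carrier object for the calling APP.
                return createCarrierForCaller();
            }
        };

        @Override
        public IBinder onBind(Intent intent) {
            return binder; // the Binder object VrRenderer provides externally
        }

        private Surface createCarrierForCaller() {
            // ...create a texture object, then a surface object from it...
            return null; // placeholder in this sketch
        }
    }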
  • The middle layer (i.e., VrRenderer) renders the display content of the system interface together with the display content of the third-party APP and sends the result to the bottom layer for display. This avoids losing the display content of the system interface when the third-party APP is switched, and the display content of the system interface does not need to be sent to a third-party APP, which ensures data security.
  • Different third-party APPs start corresponding activity components, such as activity component 1 (activity1), activity component 2 (activity2), ..., activity component N (activityN), and fill their respective display content (app content) into the corresponding carrier objects (such as surfaces). At the same time, the display content of the system interface (system UI content), the camera display content (camera content), and other display content (other content) are filled into their corresponding carrier objects. VrRenderer, as the middle layer, calls these multiple display contents and performs a first rendering, that is, synthesizes the multiple display contents; the synthesized result is then sent to the system's VR SDK for a second rendering, and the rendered VR interface is displayed.
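  • A compact sketch of that first rendering pass, assuming each carrier object is backed by an Android SurfaceTexture: VrRenderer latches the latest frame from each carrier and composites them into one image before handing the result to the system VR SDK. The drawTexturedLayer step and the VR SDK submission are placeholders.

    import android.graphics.SurfaceTexture;
    import java.util.List;

    public final class MiddleLayerCompositor {
        private final List<SurfaceTexture> carriers; // one per APP, system UI, camera, ...

        public MiddleLayerCompositor(List<SurfaceTexture> carriers) {
            this.carriers = carriers;
        }

        // First rendering pass: synthesize all display contents into one frame.
        // Must run on the thread that owns the GL context of these textures.
        public void composeFrame() {
            for (SurfaceTexture texture : carriers) {
                texture.updateTexImage();   // latch the latest filled content
                drawTexturedLayer(texture); // draw this content into the composite
            }
            // ...send the composite to the system VR SDK for the second
            // rendering (stereo layout, anti-distortion) and display...
        }

        private void drawTexturedLayer(SurfaceTexture texture) {
            // ...draw a textured quad from the GL_TEXTURE_EXTERNAL_OES texture...
        }
    }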
  • the method includes:
  • the above-mentioned first APP may be a third-party APP, such as a video playing APP, a game APP, and the like.
  • the third-party APP can send a first carrier object request to VrRenderer, as the middle layer, through a pre-established channel or by calling the requestRenderSurface interface, and VrRenderer receives the first carrier object request sent by the third-party APP.
  • the VrRenderer creates a first carrier object corresponding to the third-party APP based on the received first carrier object creation request.
  • the first carrier object may be a texture object, such as a surface object.
  • since the surface object is an object with a fixed resolution, different APPs can fill the content to be displayed according to that fixed resolution, so that the display content can be displayed and the call is easy to implement (a one-line sketch follows).
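  • A one-line sketch, assuming Android's SurfaceTexture API, of how such a fixed resolution might be pinned on the underlying buffer; the width and height values are illustrative.

      // Fix the buffer resolution the APP must fill (values are examples);
      // surfaceTexture is the SurfaceTexture backing the carrier object.
      surfaceTexture.setDefaultBufferSize(1920, 1080);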
  • a manner of acquiring the first carrier object may include: creating a first structure object according to a first carrier object creation request; and creating a first carrier object according to the first structure object.
  • when VrRenderer receives a first carrier object request sent by a third-party APP, VrRenderer uses OpenGL technology to create a corresponding first structure object, such as a texture object, for the third-party APP, and then further creates a first carrier object based on the first structure object, for example creating a surface object from the texture object. There is a binding relationship between the texture object and the surface object, as sketched below.
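  • Below is a minimal sketch, assuming Android's standard OpenGL ES and SurfaceTexture APIs, of how the structure object (an OpenGL texture wrapped in a SurfaceTexture) and the carrier object (a Surface) bound to it might be created. The Carrier and CarrierFactory names are assumptions added for illustration.

      import android.graphics.SurfaceTexture;
      import android.opengl.GLES11Ext;
      import android.opengl.GLES20;
      import android.view.Surface;

      final class Carrier {
          final int texName;               // OpenGL texture name
          final SurfaceTexture texture;    // the "structure object"
          final Surface surface;           // the empty "carrier object"

          Carrier(int texName, SurfaceTexture texture, Surface surface) {
              this.texName = texName;
              this.texture = texture;
              this.surface = surface;
          }
      }

      final class CarrierFactory {
          // Must be called on a thread with a current OpenGL ES context.
          static Carrier createCarrier() {
              // Generate a texture name for an external (streamed) texture.
              int[] tex = new int[1];
              GLES20.glGenTextures(1, tex, 0);
              GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

              // Wrap the texture in a SurfaceTexture and create the bound
              // Surface; the Surface is returned to the APP and filled
              // frame by frame.
              SurfaceTexture texture = new SurfaceTexture(tex[0]);
              Surface surface = new Surface(texture);
              return new Carrier(tex[0], texture, surface);
          }
      }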
  • the first carrier object is an empty object, which can be used as the carrier of the display content of the third-party APP.
  • VrRenderer returns the created first carrier object to the third-party APP, and the third-party APP fills the empty first carrier object with its own display content frame by frame.
  • each time the third-party APP fills the first carrier object, it sends a first notification message to VrRenderer to notify VrRenderer to obtain the display content of the third-party APP.
  • whenever VrRenderer receives the first notification message, it can be determined that the first carrier object has been filled with the display content of the corresponding APP for that frame.
  • based on the first notification messages sent by the third-party APP, VrRenderer continuously calls the updateTexImage function and continuously obtains, from the filled first carrier object, the first display content filled in each frame by the third-party APP, thereby realizing continuous display of the VR interface; a sketch of this per-frame handshake follows.
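  • A minimal sketch of this per-frame handshake, assuming Android's SurfaceTexture API and the hypothetical Carrier type above; only updateTexImage is named in this application.

      // On the VrRenderer side: whenever the APP has filled one frame into its
      // carrier, latch that frame into the OpenGL texture for drawing.
      static void watchCarrier(Carrier carrier, android.os.Handler glThreadHandler) {
          carrier.texture.setOnFrameAvailableListener(surfaceTexture -> {
              // updateTexImage() must run on the thread that owns the GL context,
              // which is why the callback is delivered on the GL thread's handler.
              surfaceTexture.updateTexImage();
              // The texture now holds this frame's display content, ready to be
              // composed into the VR interface.
          }, glThreadHandler);
      }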
  • by calling the requestRenderSurface service of VrRenderer, the terminal device creates a corresponding carrier object based on the carrier creation request of a third-party APP, and can create carrier objects for different APPs in a targeted manner, which facilitates the management of the carrier objects of multiple APPs.
  • the terminal device, by calling VrRenderer, returns the built first carrier object to the first APP, so that the first APP fills the first display content into the first carrier object; each time the first APP fills the first carrier object, the first notification message notifies VrRenderer to obtain the first display content, thereby realizing continuous display of the VR interface and improving user experience.
  • since VrRenderer obtains the display content of the first APP through the first carrier object, the display content does not need to be sent to another APP, so leakage of the display content can be avoided and data security is improved.
  • the specific process of the intermediate layer VrRenderer acquiring the second display content of the system APP as the second APP may be as shown in FIG. 10, including:
  • the above-mentioned second APP may be a system APP, such as a short message APP, a camera APP, and the like.
  • the system APP can send the second carrier object request to VrRenderer, as the middle layer, through the pre-established channel or by calling the requestRenderSurface interface, and VrRenderer receives the second carrier object request sent by the system APP.
  • the VrRenderer creates a second carrier object corresponding to the system APP based on the received second carrier object creation request.
  • the second carrier object may be a texture object, such as a surface object; since the surface object is an object with a fixed resolution, different APPs can fill the content to be displayed according to that fixed resolution, so that the display content can be displayed and the call is easy to implement.
  • the acquiring manner of the second carrier object may include: creating a second structure object according to the second carrier object creation request; and creating the second carrier object according to the second structure object.
  • when VrRenderer receives the second carrier object request sent by the system APP, VrRenderer uses OpenGL technology to create a corresponding second structure object, such as a texture object, for the system APP, and then further creates a second carrier object based on the second structure object, for example creating a surface object from the texture object. There is a binding relationship between the texture object and the surface object.
  • the second structure object can be used to fill the display content in the second carrier object.
  • the second carrier object is an empty object, which can be used as the carrier of the display content of the system APP.
  • VrRenderer returns the created second carrier object to the system APP, and the system APP fills its own display content into the empty second carrier object frame by frame. Each time the system APP fills the second carrier object, it sends a second notification message to notify VrRenderer to obtain the display content of the system APP. Whenever VrRenderer receives the second notification message from the system APP, it can be determined that the second carrier object has been filled with the display content of the corresponding APP in this frame.
  • based on the received second notification messages sent by the system APP, VrRenderer continuously calls the updateTexImage function and continuously obtains, from the filled second carrier object, the second display content filled in each frame by the system APP, thereby realizing continuous display of the VR interface.
  • by calling the requestRenderSurface interface of VrRenderer, the terminal device creates a corresponding carrier object based on the carrier creation request of the system APP, and can create carrier objects for different APPs in a targeted manner, which facilitates the management of the carrier objects of multiple APPs.
  • VrRenderer returns the built second carrier object to the second APP, so that the second APP fills the second display content into the second carrier object; each time the second APP fills the second carrier object, it notifies VrRenderer through the second notification message to acquire the second display content, thereby realizing continuous display of the VR interface and improving user experience.
  • VrRenderer obtains the display content of the second APP through the second carrier object, and the second APP does not need to send its display content to the third-party APP, so leakage of the display content can be avoided and data security is improved.
  • the above-mentioned first APP and second APP may be different APPs, which may include, but are not limited to, a video playing APP, a game program, a short message program of the system, a camera program, and the like.
  • the above-mentioned first display content is the display content of the first APP, and may include pictures, text, colors, etc.
  • the above-mentioned second display content is the display content of the second APP, and may include pictures, text, colors, etc., such as the camera's shooting interface or the text content of a short message.
  • the terminal device can call the first carrier object through the intermediate layer VrRenderer to obtain the first display content of the first APP, and call the second carrier object to obtain the second display content of the second APP.
  • the terminal device invokes VrRenderer to draw the VR interface according to the display content of the two APPs, and obtains a VR interface including the first display content and the second display content. For example, a first rendering is performed, that is, the first display content and the second display content are superimposed and synthesized, so that the rendered image includes both the first display content and the second display content; then a second rendering is performed to generate the data used by the VR device, as sketched below.
  • the VR interface can then be sent to the VR device for display.
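  • A minimal sketch of one frame of this two-pass drawing, reusing the hypothetical Carrier type above; drawTexturedQuad() and VrSdk are placeholders, since this application does not name the VR SDK's API, only that a second rendering produces the picture for the VR device.

      // One composed frame: latch both contents, superimpose them (first
      // rendering), then hand the result to the VR SDK (second rendering).
      void renderFrame(Carrier thirdPartyApp, Carrier systemUi) {
          thirdPartyApp.texture.updateTexImage();  // newest first display content
          systemUi.texture.updateTexImage();       // newest second display content

          // First rendering: synthesize both contents into one image.
          drawTexturedQuad(thirdPartyApp.texName); // third-party APP layer
          drawTexturedQuad(systemUi.texName);      // system interface layer on top

          // Second rendering: the VR SDK turns the composed image into the
          // side-by-side, anti-distorted stereo picture sent to the VR device.
          VrSdk.submitComposedFrame();
      }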
  • the terminal device does not need to send the second display content of the second APP to the first APP and draw the VR interface in the process of the first APP; instead, it obtains the display content of the two APPs through two carrier objects respectively, draws the VR interface according to the display content of the two APPs, obtains a VR interface including the first display content and the second display content, and displays that VR interface. Since this method does not send the second display content of the second APP to the first APP for drawing, even during application switching, switching the first APP will not cause the display content of the second APP to be lost.
  • the interruption of the display content during application switching is thus avoided, and transmission of the display interface across processes is realized, so the display interface remains continuous and stable during process switching, and user experience is improved.
  • since the first APP cannot obtain the second display content of the second APP, the security of the data of the second APP is enhanced.
  • for example, suppose the first APP is a VR APP, that is, a third-party APP adapted to the VR device, such as a video playback APP or a game APP, and the second APP is a system APP, such as a short message APP.
  • if a short message pop-up window is already displayed on the interface of the video playing APP and the user switches the VR application, for example switching the first APP from the video playing APP to the game APP, the pop-up window of the short message can still be displayed on the interface of the game APP; the short message content will not fail to be displayed because of the switch to the game APP's interface.
  • this method can realize continuous display of the interface of the system APP together with the interface of the VR APP, prevents the user from missing information from the system, and does not require the user to close the interface of the current VR APP and reopen the system APP, which facilitates the user's operation of the system APP.
  • the sequence diagram among the third-party APP as the first APP, VrRenderer as the middle layer, and the system APP as the second APP may also be as shown in FIG. 12.
  • first, VrRenderer starts the service, and the third-party APP and the system APP respectively start their respective activity components.
  • the third-party APP and the system APP then each send a request for the binding service (binder service) to VrRenderer, and VrRenderer returns the corresponding interface to each APP to provide services.
  • in this way, the interaction channels between VrRenderer and the third-party APP and between VrRenderer and the system APP are established respectively.
  • next, the third-party APP and the system APP send requests to obtain surface objects through their respective interaction channels, that is, by accessing the requestRenderSurface interface; based on each request, VrRenderer creates a texture object for the requesting APP, creates the corresponding surface object according to the texture object, and then returns the created surface object to the corresponding APP.
  • afterwards, the third-party APP and the system APP each fill their corresponding surface in every frame and send a notification message (notify) to VrRenderer; on each notification, VrRenderer calls the update function (updateTexImage) to obtain the display content of that frame.
  • finally, VrRenderer draws the acquired display content to obtain the final VR interface; a client-side sketch of this sequence follows.
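  • A sketch, from the APP side, of the binding and request steps in this timing diagram, assuming the hypothetical IVrRenderer interface above and Android's standard bindService() mechanism; the class names are illustrative.

      // Inside an APP's Activity: bind to VrRendererService, then request a
      // carrier (Surface) over the returned Binder interface.
      ServiceConnection connection = new ServiceConnection() {
          @Override
          public void onServiceConnected(ComponentName name, IBinder service) {
              IVrRenderer renderer = IVrRenderer.Stub.asInterface(service);
              try {
                  Surface carrier = renderer.requestRenderSurface(getPackageName());
                  // Fill 'carrier' frame by frame, calling
                  // renderer.notifyFrameFilled(...) after each filled frame.
              } catch (RemoteException e) {
                  // The Binder call failed; the service may have died.
              }
          }

          @Override
          public void onServiceDisconnected(ComponentName name) { }
      };

      bindService(new Intent(this, VrRendererService.class),
                  connection, Context.BIND_AUTO_CREATE);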
  • in FIG. 12, only one third-party APP and one system APP are used as examples for illustration; in fact, VrRenderer can render the display content of more APPs, thereby making the VR interface richer.
  • the present application can divide the functional modules of the device displaying the VR interface according to the above method examples.
  • each function can be divided into each functional module, or two or more functions can be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules. It should be noted that the division of modules in this application is schematic, and is only a logical function division, and other division methods may be used in actual implementation.
  • FIG. 13 shows a schematic structural diagram of a device 1300 for displaying a VR interface provided by the present application.
  • Apparatus 1300 includes:
  • the first obtaining module 1301 is configured to obtain the first display content by calling the first carrier object, where the first display content is the display content of the first application program APP.
  • the second obtaining module 1302 is configured to obtain second display content by calling the second carrier object, where the second display content is the display content of the second APP.
  • the generating module 1303 is configured to generate a VR interface according to the first display content and the second display content, where the VR interface includes the first display content and the second display content.
  • the display module 1304 is used to display the VR interface.
  • the first obtaining module 1301 is specifically configured to: receive a first carrier object creation request from the first APP; create the first carrier object according to the first carrier object creation request, where the first carrier object is used for the first APP to fill the content to be displayed; receive a first notification message from the first APP, where the first notification message is used to indicate that the first carrier object has been filled; and call the first carrier object according to the first notification message to acquire the first display content.
  • the first obtaining module 1301 is specifically configured to create a first structure object according to the first carrier object creation request; and create the first carrier object according to the first structure object.
  • the second obtaining module 1302 is specifically configured to: receive a second carrier object creation request from the second APP; create the second carrier object according to the second carrier object creation request, where the second carrier object is used for the second APP to fill the content to be displayed; receive a second notification message from the second APP, where the second notification message is used to indicate that the second carrier object has been filled; and call the second carrier object according to the second notification message to obtain the second display content.
  • the second obtaining module 1302 is specifically configured to create a second structure object according to the second carrier object creation request; and create the second carrier object according to the second structure object.
  • the first APP is a VR APP
  • the second APP is a system APP
  • the first carrier object and the second carrier object are map objects.
  • An embodiment of the present application further provides an electronic device, including the above-mentioned processor.
  • the electronic device provided in this embodiment may be the terminal device 100 shown in FIG. 1 , and is configured to execute the above method for displaying a VR interface.
  • the terminal device may include a processor, memory and an interface.
  • the processing module may be used to control and manage the actions of the terminal device, for example, may be used to support the terminal device to perform steps performed by the display unit, the detection unit, and the processing unit.
  • the storage module can be used to support the terminal device to execute stored program codes and data.
  • the communication module can be used to support the communication between the terminal device and other devices.
  • the processing module may be a processor or a controller. It may implement or execute the various exemplary logical blocks, modules and circuits described in connection with this disclosure.
  • the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of digital signal processing (DSP) and a microprocessor, and the like.
  • the storage module may be a memory.
  • the communication module may specifically be a device that interacts with other terminal devices, such as a radio frequency circuit, a Bluetooth chip, and a Wi-Fi chip.
  • the terminal device involved in this embodiment may be a device having the structure shown in FIG. 1 .
  • Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium; when the computer program is executed by a processor, the processor is made to execute the method of displaying a VR interface described in any of the foregoing embodiments.
  • Embodiments of the present application further provide a computer program product, which, when the computer program product runs on a computer, causes the computer to execute the above-mentioned relevant steps, so as to realize the method for displaying a VR interface in the above-mentioned embodiment.
  • the electronic device, computer-readable storage medium, computer program product or chip provided in this embodiment are all used to execute the corresponding method provided above; therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the corresponding method provided above, which will not be repeated here.
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division, and in actual implementation there may be other division manners; multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling, direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • Units described as separate components may or may not be physically separated, and components shown as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium.
  • the software product is stored in a readable storage medium and includes several instructions to make a device (which may be a single-chip microcomputer, a chip, or the like) or a processor execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to the technical field of virtual reality (VR) and concerns a method and an apparatus for displaying a VR interface, a device, and a readable storage medium. The device may be a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, a VR headset, VR glasses, and the like. The method comprises: obtaining first display content by calling a first carrier object, the first display content being display content of a first application (APP); obtaining second display content by calling a second carrier object, the second display content being display content of a second APP; generating a VR interface according to the first display content and the second display content, the VR interface comprising the first display content and the second display content; and displaying the VR interface. The method can avoid display interruption when the application is switched.
PCT/CN2021/137278 2020-12-25 2021-12-11 Procédé et appareil permettant d'afficher une interface de réalité virtuelle, dispositif, et support de stockage lisible WO2022135195A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011568857.5A CN114691248B (zh) 2020-12-25 2020-12-25 显示虚拟现实界面的方法、装置、设备和可读存储介质
CN202011568857.5 2020-12-25

Publications (1)

Publication Number Publication Date
WO2022135195A1 (fr)

Family

ID=82129925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137278 WO2022135195A1 (fr) 2020-12-25 2021-12-11 Procédé et appareil permettant d'afficher une interface de réalité virtuelle, dispositif, et support de stockage lisible

Country Status (2)

Country Link
CN (1) CN114691248B (fr)
WO (1) WO2022135195A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116684517A (zh) * 2022-09-29 2023-09-01 荣耀终端有限公司 发送响应消息的方法和装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110114746A (zh) * 2017-01-04 2019-08-09 武汉六为科技有限公司 显示虚拟现实画面的方法和虚拟现实设备
CN110347305A (zh) * 2019-05-30 2019-10-18 华为技术有限公司 一种vr多屏显示方法及电子设备
US20200066043A1 (en) * 2018-08-21 2020-02-27 Disney Enterprises, Inc. Multi-screen interactions in virtual and augmented reality

Also Published As

Publication number Publication date
CN114691248A (zh) 2022-07-01
CN114691248B (zh) 2024-04-12

Similar Documents

Publication Publication Date Title
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
WO2020253719A1 (fr) Procédé de d'enregistrement d'écran et dispositif électronique
CN112399390B (zh) 一种蓝牙回连的方法及相关装置
CN114546190A (zh) 一种应用显示方法及电子设备
WO2020093988A1 (fr) Procédé de traitement d'image et dispositif électronique
WO2021190344A1 (fr) Dispositif électronique à affichage multi-écran, et procédé d'affichage multi-écran pour dispositif électronique
CN113722058B (zh) 一种资源调用方法及电子设备
WO2022033320A1 (fr) Procédé de communication bluetooth, équipement terminal et support d'enregistrement lisible par ordinateur
WO2021052204A1 (fr) Procédé de découverte de dispositif basé sur un carnet d'adresses, procédé de communication audio et vidéo, et dispositif électronique
WO2022042770A1 (fr) Procédé de commande d'état de service de communication, dispositif terminal et support de stockage lisible
WO2022001258A1 (fr) Procédé et appareil d'affichage à écrans multiples, dispositif terminal et support de stockage
WO2022095744A1 (fr) Procédé de commande d'affichage vr, dispositif électronique et support de stockage lisible par ordinateur
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
CN113641271A (zh) 应用窗口的管理方法、终端设备及计算机可读存储介质
WO2022170856A1 (fr) Procédé d'établissement de connexion et dispositif électronique
WO2022022674A1 (fr) Procédé de disposition d'icône d'application et appareil associé
WO2022135195A1 (fr) Procédé et appareil permettant d'afficher une interface de réalité virtuelle, dispositif, et support de stockage lisible
CN112437341B (zh) 一种视频流处理方法及电子设备
EP4395290A1 (fr) Procédé de lecture audio bluetooth, dispositif électronique, et support de stockage
CN116048831B (zh) 一种目标信号处理方法和电子设备
WO2022170854A1 (fr) Procédé d'appel vidéo et dispositif associé
WO2022161006A1 (fr) Procédé et appareil de synthèse de photographie, et dispositif électronique et support de stockage lisible
WO2022078116A1 (fr) Procédé de génération d'image à effet de pinceau, procédé et dispositif d'édition d'image et support de stockage
WO2021129453A1 (fr) Procédé de capture d'écran et dispositif associé
CN115482143A (zh) 应用的图像数据调用方法、系统、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21909192

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21909192

Country of ref document: EP

Kind code of ref document: A1