WO2023083218A1 - Method for smoothly displaying images in screen projection, related device, and system - Google Patents

Method for smoothly displaying images in screen projection, related device, and system

Info

Publication number: WO2023083218A1
Authority: WO (WIPO, PCT)
Prior art keywords: image frame, side device, screen, display, area
Application number: PCT/CN2022/130904
Other languages: English (en), French (fr)
Inventors: 王永德 (Wang Yongde), 段潇潇 (Duan Xiaoxiao)
Applicant / original assignee: Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2023083218A1

Classifications

    • H: ELECTRICITY → H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION → H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD] → H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]
      • H04N 21/41 — Structure of client; Structure of client peripherals
      • H04N 21/4126 — The peripheral being portable, e.g. PDAs or mobile phones
      • H04N 21/4402 — Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
      • H04N 21/440281 — Reformatting by altering the temporal resolution, e.g. by frame skipping
    • H04W 76/14 — Connection management; Connection setup; Direct-mode setup
    • G06F 3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units

Definitions

  • the present application relates to the field of terminal technology, and in particular to a method, related device and system for smoothly displaying images in screen projection.
  • Screen projection is one of the widely used functions of electronic devices, including mirror projection and online projection.
  • In mirror projection, the screen-casting device captures the content displayed on its own display screen and sends it to the screen-projected device, so that the screen-projected device displays the same content as the screen-casting device.
  • The fluency of the content displayed on the screen-projected device is the main factor affecting the user's visual experience; how to make the screen-projected device display the projected content smoothly is therefore an important research direction for improving user experience.
  • The present application provides a method for smoothly displaying pictures in screen projection, a related device, and a system, which enable the screen-projected device to display smooth, continuous pictures and avoid any feeling of freezing or jumping.
  • In a first aspect, a method for smoothly displaying pictures in screen projection is provided, which is applied to a communication system including a first device and a second device.
  • The method includes: the first device and the second device establish a communication connection; the first device captures the content displayed on its display screen to obtain the first image frame, and sends the first image frame to the second device; the second device receives the first image frame; the second device displays, in turn, the first image frame and the first predicted image frame; the first predicted image frame is obtained by the second device according to the first image frame.
  • In this way, the second device, that is, the screen-projected device side, displays both the received image frame and the predicted image frame, which keeps the picture displayed continuously, maintains its visual continuity, and avoids stuttering.
  • In addition, image prediction is performed based on the latest received image frame, which ensures the smoothness and stability of the picture and avoids visual jumps. From the user's point of view, the user sees a smooth, continuous picture without any feeling of stuttering or jumping, and obtains a good screen projection experience.
  • the communication connection established between the first device and the second device may be a Wi-Fi connection, a Bluetooth connection, an NFC connection, or a remote connection.
  • The connection between the first device and the second device can be established based on protocols such as Miracast, the digital living network alliance (DLNA) protocol, and AirPlay.
  • In some embodiments, before obtaining the first image frame, the first device may capture the content displayed on the display screen to obtain the second image frame, and send the second image frame to the second device; the second device receives the second image frame, and also displays the second image frame before displaying the first image frame; the first predicted image frame is obtained by the second device based on the first image frame and the second image frame.
  • In this way, the second device, that is, the screen-projected device side, can obtain the predicted image frame from the latest two received image frames.
  • In some embodiments, the time point at which the second device receives the second image frame and the time point at which the first device obtains the second image frame are within the first duration. That is to say, only image frames received by the second device with a small delay are sent for display, which keeps the latency of mirror projection low and gives the user the feeling that the second device and the first device display images synchronously, providing a better mirroring experience.
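The freshness check described above fits in a few lines. Below is a minimal sketch, assuming the source stamps each frame with its capture time and the two device clocks are synchronized; the class name and FIRST_DURATION_MS are illustrative assumptions standing in for the unspecified "first duration", not values from the patent.

```java
// Minimal sketch of the "first duration" freshness gate (hypothetical names).
public final class FrameGate {
    private static final long FIRST_DURATION_MS = 100; // assumed value of the "first duration"

    /** Returns true if the frame arrived soon enough after capture to be sent for display. */
    public static boolean isFreshEnough(long captureTimeMs, long receiveTimeMs) {
        return (receiveTimeMs - captureTimeMs) <= FIRST_DURATION_MS;
    }
}
```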
  • In some embodiments, after the first device obtains the first image frame, it may capture the content displayed on the display screen again to obtain the third image frame, and send the third image frame to the second device; the second device does not receive the third image frame, or does not receive it within the first duration after the first device captures it.
  • In this way, the second device can display the predicted image frame when a frame is lost, for example when an image frame is not received at all or not received within the specified time, so that the second device keeps displaying pictures, maintains visual continuity, and avoids stuttering.
  • The second device may obtain the first predicted image frame according to the first image frame under any of the following circumstances (a condition-check sketch follows this list):
  • Case 1: the second device obtains the first predicted image frame according to the first image frame (prediction is performed without an additional condition).
  • Case 2: the second device obtains the first predicted image frame according to the first image frame when the communication quality corresponding to the communication connection is lower than a threshold.
  • When the communication quality between the first device and the second device is lower than a certain value, the communication quality between the two parties is poor, and image frames sent by the first device to the second device are likely to be lost in transit because of it. In other words, case 2 is very likely to lead to frame loss. Therefore, obtaining the first predicted image frame in case 2 can effectively avoid the bad experience that possible frame loss would bring to the user.
  • Case 3: the first device runs the first application in the foreground and scrolls the content on the display screen at a speed greater than the first value.
  • In some embodiments, a user operation received by the first device may trigger the above case 3.
  • Correspondingly, the method of the first aspect may further include: before the second device displays the first predicted image frame, the first device runs the first application and receives the first operation; in response to the first operation, the first device scrolls the content on the display screen at a speed greater than the first value; the first device sends the application information of the first application and the operation information of the first operation to the second device; the second device receives the application information and the operation information, and obtains the first predicted image frame according to the first image frame.
  • In some embodiments, a user operation received by the second device may also trigger the above case 3.
  • Correspondingly, the method of the first aspect may further include: before the second device displays the first predicted image frame, the first device runs the first application and sends the application information of the first application to the second device; the second device receives the second operation and sends the operation information of the second operation to the first device, triggering the first device to scroll the content on the display screen at a speed greater than the first value; the second device obtains the first predicted image frame according to the first image frame.
  • the first application is an application that can slide or scroll to display content in a user interface in response to a user operation, for example, a browser, a social application, a reading application, and the like.
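Taken together, cases 1-3 above amount to a small predicate deciding when the sink should start predicting frames. The following is a hedged sketch; the class name, thresholds, and inputs (link quality, scroll speed) are illustrative assumptions, since the patent fixes no concrete values.

```java
// Hedged sketch of the prediction-trigger conditions (cases 1-3 above).
public final class PredictionTrigger {
    private static final double QUALITY_THRESHOLD = 0.5;   // case 2: assumed link-quality threshold
    private static final float FIRST_VALUE_PX_PER_MS = 2f; // case 3: assumed "first value" scroll speed

    public static boolean shouldPredict(boolean alwaysPredict,       // case 1: predict unconditionally
                                        double linkQuality,          // case 2 input
                                        float scrollSpeedPxPerMs) {  // case 3 input
        if (alwaysPredict) return true;                              // case 1
        if (linkQuality < QUALITY_THRESHOLD) return true;            // case 2
        return scrollSpeedPxPerMs > FIRST_VALUE_PX_PER_MS;           // case 3
    }
}
```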
  • The closer the display frame rate of the second device is to the screen projection frame rate of the first device, the better the mirror projection effect seen by the user.
  • In this way, the predicted image frame can be obtained when a more serious frame loss problem begins to occur, so as to avoid the bad experience that subsequent continuous frame loss would bring to the user.
  • In some embodiments, the time point at which the second device receives the first image frame and the time point at which the first device obtains the first image frame are within the first duration. That is to say, only image frames received by the second device with a small delay are sent for display, which keeps the latency of mirror projection low and gives the user the feeling that the second device and the first device display images synchronously, providing a better mirroring experience.
  • In some embodiments, the first predicted image frame is the image obtained by taking the first image frame, moving the content displayed in the motion area according to the motion vector, and then filling the resulting free area with prediction data.
  • The motion area is the area in which the first image frame and the fourth image frame display different content;
  • The motion vector is the vector from the position of the target content in the fourth image frame to the position of the target content in the first image frame;
  • The free area is the part of the motion area in which no content is displayed after the content displayed in the motion area has been moved.
  • In some embodiments, the second device may determine the above motion area and motion vector based on the first image frame, the application information of the application running on the first device, the operation information of the operation that triggered the first device to scroll its display interface, and so on.
  • the second device may determine the motion region and the motion vector by comparing the first image frame and the second image frame.
  • The second device performs image prediction based on the latest received image frame, which ensures the smoothness and stability of the projected picture and avoids visual jumps. This enables the user to see a smooth picture without any feeling of jumping.
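To make the construction above concrete, here is a sketch of the predict step: copy the last received frame, shift the motion-area content by the motion vector, and fill the vacated free area. BufferedImage and all class/method names are illustrative stand-ins (a real sink would operate on decoded video surfaces), and only vertical scrolling is handled for brevity.

```java
import java.awt.Graphics;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;

// Sketch of motion-compensated frame prediction, assuming a vertical scroll.
public final class FramePredictor {

    public static BufferedImage predict(BufferedImage lastFrame, Rectangle motionArea,
                                        int dx, int dy) {
        BufferedImage out = new BufferedImage(lastFrame.getWidth(), lastFrame.getHeight(),
                BufferedImage.TYPE_INT_ARGB);
        Graphics g = out.getGraphics();
        // 1. Start from the last received frame: content outside the motion area is unchanged.
        g.drawImage(lastFrame, 0, 0, null);
        // 2. Move the motion-area content along the motion vector (dx, dy).
        BufferedImage moving = lastFrame.getSubimage(motionArea.x, motionArea.y,
                motionArea.width, motionArea.height);
        g.drawImage(moving, motionArea.x + dx, motionArea.y + dy, null);
        // 3. Fill the area vacated by the move (the "free area") with prediction data.
        Rectangle freeArea = computeFreeArea(motionArea, dy);
        fillFreeArea(g, lastFrame, freeArea);
        g.dispose();
        return out;
    }

    // Vacated strip for a vertical scroll; horizontal movement would be handled analogously.
    private static Rectangle computeFreeArea(Rectangle area, int dy) {
        if (dy < 0) { // content moved up: the bottom strip of the motion area is vacated
            return new Rectangle(area.x, area.y + area.height + dy, area.width, -dy);
        }
        return new Rectangle(area.x, area.y, area.width, dy); // content moved down: top strip
    }

    private static void fillFreeArea(Graphics g, BufferedImage lastFrame, Rectangle free) {
        if (free.isEmpty()) return;
        // Simplest strategy from the list that follows: reuse the old free-area content as-is.
        g.drawImage(lastFrame.getSubimage(free.x, free.y, free.width, free.height),
                free.x, free.y, null);
    }
}
```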
  • the second device may obtain prediction data according to the content displayed in the free area of the first image frame in any one or more of the following ways:
  • the second device performs blurring processing on the content displayed in the free area of the first image frame.
  • The blurring processing may include, for example, image processing methods such as mean blur, median blur, Gaussian blur, bilateral blur, surface blur, box blur, dual blur, bokeh blur, tilt-shift blur, aperture blur, grainy blur, radial blur, and directional blur.
  • the second device directly uses the content displayed in the free area of the first image frame as prediction data.
  • the second device uses a neural network algorithm to perform image prediction on the content displayed in the free area of the first image frame to obtain prediction data.
  • the second device obtains prediction data from previously cached image frames.
  • the second device can obtain the prediction data according to the content displayed in the free area of the first image frame, so as to obtain the first predicted image frame.
  • The above methods of obtaining the first predicted image frame based on the first image frame ensure the smoothness and stability of the projected picture, avoid visual jumps, and allow users to see smooth images without any feeling of jumping.
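Two of the four filling strategies above are simple enough to sketch directly: blurring the old free-area content, and reusing it as-is. The 3x3 box blur below stands in for the many blur variants named earlier; the class and method names are assumptions, and the neural-network and cached-frame strategies are omitted.

```java
import java.awt.image.BufferedImage;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;
import java.util.Arrays;

// Sketch of two free-area filling strategies (hypothetical names).
public final class FreeAreaFill {

    /** Strategy 1: blur the old content of the free area (box blur as a stand-in). */
    public static BufferedImage blurFill(BufferedImage freeAreaContent) {
        float[] box = new float[9];
        Arrays.fill(box, 1f / 9f); // uniform 3x3 kernel
        ConvolveOp blur = new ConvolveOp(new Kernel(3, 3, box), ConvolveOp.EDGE_NO_OP, null);
        return blur.filter(freeAreaContent, null);
    }

    /** Strategy 2: use the old content of the free area as the prediction data directly. */
    public static BufferedImage directFill(BufferedImage freeAreaContent) {
        return freeAreaContent;
    }
}
```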
  • In some embodiments, the display screen of the second device includes a screen projection area, which is used to sequentially display the first image frame and the first predicted image frame; the screen projection area occupies part or all of the display screen of the second device.
  • When the projection area occupies the entire display screen, the user obtains an immersive screen projection experience; when it occupies part of the display screen, the second device can use the remaining area of the display screen to display its own user interface, without affecting the user's operation of the second device.
  • the second device may also adjust the position, size, shape, etc. of the screen projection area in response to user operations.
  • In some embodiments, after the second device displays the first predicted image frame, it displays the second predicted image frame; the second predicted image frame is obtained by the second device based on the first predicted image frame.
  • In this way, the second device can predict multiple image frames in succession and display them one after another, which ensures that the second device keeps displaying pictures for a period of time, maintains visual continuity, and avoids freezing. It also keeps the picture on the second device smooth and stable over that period and avoids visual jumps.
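Chained prediction can be sketched as a loop in which each predicted frame feeds the next prediction, bounded by a maximum so that prediction does not substitute for real frames indefinitely. This reuses the hypothetical FramePredictor from the earlier sketch; the queue loosely corresponds to the predictive image queue of FIG. 6.

```java
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of chained multi-frame prediction (illustrative names and bound).
public final class PredictionChain {

    public static Deque<BufferedImage> predictChain(BufferedImage lastReceived,
                                                    Rectangle motionArea,
                                                    int dx, int dy, int maxFrames) {
        Deque<BufferedImage> queue = new ArrayDeque<>();
        BufferedImage current = lastReceived;
        for (int i = 0; i < maxFrames; i++) {
            // Each predicted frame becomes the input for the next prediction.
            current = FramePredictor.predict(current, motionArea, dx, dy);
            queue.addLast(current);
        }
        return queue;
    }
}
```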
  • The faster the content on the display screen of the first device changes, the higher the screen projection frame rate at which the first device captures the displayed content. This helps the first device capture the change process on its display screen in fine detail, so that the second device can reproduce the corresponding process and abrupt changes on the second device are avoided.
  • In some embodiments, the display frame rate at which the second device sequentially displays the first image frame and the first predicted image frame is equal to the screen projection frame rate at which the first device captures the content displayed on its screen.
  • When the display frame rate is consistent with the screen projection frame rate, users see the optimal mirror projection effect.
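A display loop matching this behavior might look as follows: the sink paces itself at the projection frame rate reported by the source, shows a received frame when one is available, and falls back to a predicted frame otherwise. The queues and showFrame are hypothetical placeholders for the platform's render path; Thread.sleep is a crude stand-in for vsync-driven pacing.

```java
import java.awt.image.BufferedImage;
import java.util.Queue;

// Sketch of a sink-side display loop paced at the projection frame rate.
public final class PacedDisplay {

    public static void run(Queue<BufferedImage> received, Queue<BufferedImage> predicted,
                           int projectionFps) throws InterruptedException {
        long periodMs = 1000L / projectionFps; // display frame rate == projection frame rate
        while (true) {
            BufferedImage frame = received.poll();
            if (frame == null) {
                frame = predicted.poll(); // fall back to a predicted frame on a miss
            }
            if (frame != null) {
                showFrame(frame);
            }
            Thread.sleep(periodMs);
        }
    }

    private static void showFrame(BufferedImage frame) { /* hand off to the render pipeline */ }
}
```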
  • In a second aspect, a method for smoothly displaying images in screen projection is provided, which is applied to a second device. The method includes: the second device establishes a communication connection with the first device; the second device receives the first image frame sent by the first device, the first image frame being obtained by the first device capturing the content displayed on its display screen; the second device displays, in sequence, the first image frame and the first predicted image frame; the first predicted image frame is obtained by the second device according to the first image frame.
  • An electronic device is provided, including a memory and one or more processors; the memory is coupled to the one or more processors and is used to store computer program code, where the computer program code includes computer instructions; the one or more processors invoke the computer instructions to cause the electronic device to execute the method according to the second aspect or any implementation of the second aspect.
  • the embodiment of the present application provides a communication system, including a first device and a second device, and the second device is configured to execute the method according to the second aspect or any implementation manner of the second aspect.
  • An embodiment of the present application provides a computer-readable storage medium, including instructions that, when run on an electronic device, cause the electronic device to execute the method according to the second aspect or any implementation of the second aspect.
  • an embodiment of the present application provides a computer program product, which, when running on a computer, causes the computer to execute the method of the second aspect or any one of the implementation manners of the second aspect.
  • Through the above aspects, the screen-projected device side can perform image prediction based on the latest received image frame and display the predicted image frame when a frame is lost.
  • the projected device can continuously display images, maintain the visual continuity of the images, ensure a high frame rate on the projected device, and avoid freezing problems.
  • image prediction is performed according to the latest received image frame, which can ensure the smoothness and stability of the image, avoid visual jumps, and bring a good screen projection experience to users.
  • Fig. 1 is the architecture of the communication system provided by the embodiment of the present application.
  • FIG. 2A is a hardware structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 2B is a software structure diagram of the electronic device provided by the embodiment of the present application.
  • FIG. 3 is a flowchart of a method for smoothly displaying images in screen projection provided by an embodiment of the present application
  • FIGS. 4A-4D are the user interfaces involved when the source-side device 100 enables the screen mirroring function;
  • FIG. 4E is a user interface involved when the end-side device 200 starts the screen mirroring function
  • FIGS. 5A-5E are the user interfaces involved in the mirroring process of the source-side device 100;
  • FIG. 6 is a schematic diagram of a display queue and a predictive image queue provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a predicted image provided by an embodiment of the present application.
  • FIGS. 8A-8D are the user interfaces involved in the mirroring process of the device-side device 200.
  • The terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly specifying the number of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of this application, unless otherwise specified, "multiple" means two or more.
  • the term "user interface (UI)” in the following embodiments of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user, and it realizes the difference between the internal form of information and the form acceptable to the user. conversion between.
  • A user interface is commonly defined by source code written in specific computer languages such as Java and extensible markup language (XML).
  • the source code of the interface is parsed and rendered on the electronic device, and finally presented as content that can be recognized by the user.
  • A commonly used form of user interface is the graphical user interface (GUI), which refers to a user interface, displayed in a graphical way, that is related to computer operations. It may include text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and other visible interface elements displayed on the display screen of the electronic device.
  • During screen casting, there may be scenes with a high frame rate, that is, scenes in which the screen projection frame rate of the content captured on the screen-casting device side is relatively high. In such scenes, frames may be dropped while the screen-casting device sends the captured content to the screen-projected device, so the display frame rate on the screen-projected device side is low, the displayed picture is not smooth, and there are obvious visual freezes.
  • the following embodiments of the present application provide a method, a related device, and a system for smoothly displaying images in screen projection.
  • In this method, during mirror projection between the screen-casting device and the screen-projected device, if the screen-casting frame rate on the screen-casting device side is higher than a threshold, the screen-projected device side can perform image prediction based on the latest received image frame, so as to display predicted image frames in case of frame drops.
  • In this way, the screen-projected device side displays both the received image frames and the predicted image frames, which keeps the picture displayed continuously, maintains visual continuity, ensures a high frame rate on the screen-projected device side, and avoids freezing.
  • image prediction is performed based on the latest received image frame, which can ensure the smoothness and stability of the image and avoid visual jumps. From the user's point of view, the user can see a smooth and continuous picture without the feeling of stuttering or jumping, and can obtain a good screen projection experience.
  • Mirror projection means the process in which the screen-casting device captures the content displayed on its own display screen and sends it to the screen-projected device, so that the screen-projected device displays the same content as the screen-casting device.
  • The technology used for screen mirroring may include, but is not limited to, wireless fidelity (Wi-Fi), Bluetooth, near field communication (NFC), mobile communication technology, wired technology, and so on.
  • The protocol used for mirror projection may include, but is not limited to, Miracast, the digital living network alliance (DLNA) protocol, AirPlay, and so on.
  • Mirror projection may also be called by other names, such as collaborative projection, mirror casting, wireless projection, and so on.
  • The screen-casting device may also be called the source-side device, and the screen-projected device may also be called the device-side device.
  • the source-side device and the device-side device will be used as examples for description.
  • the source-side device may also be called a first device, and the device-side device may also be called a second device.
  • the screen projection frame rate refers to the number of frames of screen content captured by the source-side device within a unit of time.
  • the unit of the projection frame rate can be frames per second (frames per second, FPS) or Hertz (Hz).
  • The screen projection frame rate is related to how fast the picture on the display screen of the source-side device changes: the faster the picture changes, the higher the frame rate at which the source-side device captures the screen content.
  • Scenarios with a high screen projection frame rate on the source side include those in which the content displayed on the source-side device changes rapidly, for example when the user slides quickly on its display screen. For example, when the source-side device is running an application such as a browser or social software, if the user quickly slides the user interface on the display screen, the picture on the display screen changes rapidly, and the screen projection frame rate of the source-side device is accordingly high.
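The relationship between content-change speed and capture rate can be sketched as a simple policy function. The thresholds and frame rates below are purely illustrative assumptions; the patent does not specify concrete values or units.

```java
// Hedged sketch: faster content change => higher capture (projection) frame rate.
public final class CaptureRatePolicy {

    /** contentChangeSpeed is an assumed normalized measure of how fast the screen changes. */
    public static int projectionFps(float contentChangeSpeed) {
        if (contentChangeSpeed > 2.0f) return 120; // fast fling: capture densely
        if (contentChangeSpeed > 0.5f) return 60;  // moderate change
        return 30;                                 // mostly static screen
    }
}
```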
  • frame loss refers to the loss of screen projection content captured by the source-side device (that is, the image frame) during transmission to the device-side device, or being actively discarded by the device-side device.
  • Reasons for frame loss may include, but are not limited to: poor connection conditions between the source side and the device side, such as poor network quality (for example, a low network speed) or low wired bandwidth; low image encoding efficiency on the source side; low image decoding efficiency on the device side; and so on.
  • FIG. 1 exemplarily shows the architecture of a communication system 10 .
  • a communication system 10 includes: a source-side device 100 and a device-side device 200 .
  • the source-side device 100 may establish a communication connection with the device-side device 200 .
  • the communication connection may be a Wi-Fi connection, a Bluetooth connection, an NFC connection, or a remote connection, etc., or may be a wired connection such as a connection based on a data line, which is not limited in this embodiment of the present application.
  • the source-side device 100 may include, but not limited to, a smart phone, a tablet computer, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR) device, a virtual reality (virtual reality, VR) device, an artificial intelligence (artificial intelligence, AI) devices, wearable devices (such as smart watches, smart glasses), etc.
  • Exemplary embodiments of the electronic device include, but are not limited to, portable electronic devices running Linux or other operating systems.
  • The aforementioned electronic device may also be another portable electronic device, such as a laptop computer (Laptop). It should also be understood that, in some other embodiments, the above-mentioned electronic device may not be a portable electronic device but a desktop computer.
  • the source-side device 100 has a display screen, and the display screen can display the local content of the source-side device 100 or the content from the network.
  • the display screen can also be used to receive various types of gestures input by the user, such as sliding gestures, clicking gestures, dragging gestures, pinching gestures and so on.
  • the source-side device 100 can change the content displayed on the display screen in response to various gestures input by the user.
  • the end-side device 200 may be a tablet computer, a TV, a smart screen, a vehicle-mounted device, or an electronic billboard.
  • the end-side device 200 may have a larger display screen than the source-side device 100 .
  • when the end-side device 200 is a TV it can be used with a TV box, and the TV box is used to convert the received digital signal into an analog signal and send it to the TV for display.
  • the end-side device 200 may be a TV set with a digital-to-analog conversion function, or a TV set equipped with a TV box.
  • the end-side device 200 when the end-side device 200 is a TV or a smart screen, it can also be used in conjunction with a remote control.
  • the remote controller and the end-side device 200 may communicate through infrared signals.
  • After the source-side device 100 and the device-side device 200 establish a communication connection, they can perform the screen mirroring process.
  • The source-side device 100 can display corresponding content on the display screen according to the user operations input by the user, determine the screen projection frame rate according to how fast the content on the display screen changes, capture the content displayed on the screen at that frame rate, and send it to the end-side device 200 through the communication connection.
  • the source-side device 100 may notify the device-side device 200 of the screen-casting frame rate determined by itself based on the communication connection with the device-side device 200 .
  • the source-side device 100 may also send its own operating status to the device-side device 200 based on the communication connection with the device-side device 200 , such as information about running applications.
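The source-side behavior in the last few bullets (capture at the projection frame rate, send frames, and share the frame rate and running-application information) can be sketched as a loop. Connection, captureScreen, and all field names are hypothetical placeholders rather than platform APIs, and CaptureRatePolicy is the earlier illustrative sketch.

```java
import java.awt.image.BufferedImage;

// Sketch of the source-side mirroring loop (all names are illustrative).
public final class SourceMirrorLoop {

    interface Connection {
        void sendFrame(BufferedImage frame, long captureTimeMs);
        void sendFrameRate(int fps);
        void sendAppInfo(String foregroundApp);
    }

    public static void mirror(Connection conn, String foregroundApp, float changeSpeed)
            throws InterruptedException {
        int fps = CaptureRatePolicy.projectionFps(changeSpeed); // from the earlier sketch
        conn.sendFrameRate(fps);          // notify the sink of the projection frame rate
        conn.sendAppInfo(foregroundApp);  // e.g. so the sink can reason about scrolling
        long periodMs = 1000L / fps;
        while (true) {
            BufferedImage frame = captureScreen();
            conn.sendFrame(frame, System.currentTimeMillis());
            Thread.sleep(periodMs);
        }
    }

    private static BufferedImage captureScreen() { return null; /* platform screen capture */ }
}
```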
  • the end-side device 200 may perform image prediction according to the latest received image frame to obtain a predicted image frame.
  • For the timing or scenarios in which the device-side device 200 performs image prediction to obtain predicted image frames, reference may be made to the detailed introduction in the subsequent method embodiments, which is not repeated here.
  • FIG. 2A is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device may be the source-side device 100 in the communication system shown in FIG. 1 , or may be the device-side device 200 .
  • the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, Antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, A display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure shown in the embodiment of the present application does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the illustrations, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • the wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • The mobile communication module 150 can provide wireless communication solutions applied to the electronic device, including 2G/3G/4G/5G.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves and radiate them through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , demodulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD).
  • the display panel can also use organic light-emitting diodes (organic light-emitting diodes, OLEDs), active-matrix organic light-emitting diodes or active-matrix organic light-emitting diodes (active-matrix organic light emitting diodes, AMOLEDs), flexible light-emitting diodes ( flex light-emitting diode, FLED), miniled, microled, micro-oled, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device can realize the shooting function through ISP, camera 193 , video codec, GPU, display screen 194 and application processor.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when an electronic device selects a frequency point, a digital signal processor is used to perform Fourier transform on the frequency point energy, etc.
  • Video codecs are used to compress or decompress digital video.
  • An electronic device may support one or more video codecs.
  • the electronic device can play or record video in multiple encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (non-volatile memory, NVM).
  • Random access memory may include static random-access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM; for example, fifth-generation DDR SDRAM is generally called DDR5 SDRAM), and so on. Non-volatile memory may include disk storage devices and flash memory.
  • According to the operating principle, flash memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc.; according to the level of the storage cell, it may include single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc.; according to the storage specification, it may include universal flash storage (UFS), embedded multimedia card (embedded multi media Card, eMMC), etc.
  • the random access memory can be directly read and written by the processor 110, and can be used to store executable programs (such as machine instructions) of an operating system or other running programs, and can also be used to store data of users and application programs.
  • the non-volatile memory can also store executable programs and data of users and application programs, etc., and can be loaded into the random access memory in advance for the processor 110 to directly read and write.
  • the external memory interface 120 can be used to connect an external non-volatile memory, so as to expand the storage capacity of the electronic device.
  • the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external non-volatile memory.
  • the electronic device can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • pressure sensors 180A such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be comprised of at least two parallel plates with conductive material.
  • the electronic device detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon of the short message application, the instruction of creating a new short message is executed.
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device, which is different from the position of the display screen 194 .
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device can receive key input and generate key signal input related to user settings and function control of the electronic device.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • When the electronic device is the source-side device 100, the display screen 194 is used to display the local content of the source-side device 100 or content from the network.
  • the display screen 194 can also receive various gesture operations input by the user, and display different content in response to the gesture operations.
  • the wireless communication module 160 is used to establish a communication connection with the end-side device 200, and the communication connection may be a wireless communication connection such as a Wi-Fi connection or a Bluetooth connection.
  • the processor 110 is configured to determine the screen projection frame rate according to the change speed of the content displayed on the display screen 194, and intercept the content displayed on the display screen 194 at the screen projection frame rate to obtain image frames.
  • the wireless communication module 160 is configured to send the screen projection frame rate and image frames determined by the processor 110 to the end-side device 200 based on the communication connection with the end-side device 200 . In some embodiments, the wireless communication module 160 is further configured to send information about applications run by the source-side device 100 to the device-side device 200 .
  • When the electronic device is the device-side device 200, the wireless communication module 160 is used to establish a communication connection with the source-side device 100; the communication connection may be a wireless communication connection such as a Wi-Fi connection or a Bluetooth connection.
  • the wireless communication module 160 is also configured to receive the screen projection frame rate, the captured image frame, etc. sent by the source-side device 100 based on the communication connection with the source-side device 100 . In some embodiments, the wireless communication module 160 may also receive the information of the application run by the source-side device 100 sent by the source-side device 100 .
  • the processor 110 is configured to perform image prediction according to the latest received image frame to obtain a predicted image frame.
  • The processor 110 is further configured to determine, according to the reception of the image frames sent by the source-side device 100 and in combination with the screen projection frame rate, whether frame loss is currently occurring. If frame loss occurs, the predicted image frame is displayed at the time when the lost frame would have been displayed.
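The frame-loss check described above can be sketched as a small detector: knowing the projection frame rate, the sink expects a frame roughly every 1000/fps milliseconds and treats a frame as lost when none arrives within that interval plus a jitter tolerance. The class name and tolerance value are illustrative assumptions.

```java
// Sketch of sink-side frame-loss detection based on the projection frame rate.
public final class FrameLossDetector {
    private static final long TOLERANCE_MS = 10; // assumed jitter allowance

    private final long expectedIntervalMs;
    private long lastFrameArrivalMs;

    public FrameLossDetector(int projectionFps, long nowMs) {
        this.expectedIntervalMs = 1000L / projectionFps;
        this.lastFrameArrivalMs = nowMs;
    }

    public void onFrameReceived(long nowMs) {
        lastFrameArrivalMs = nowMs;
    }

    /** True when the next frame is overdue, i.e. a predicted frame should be shown. */
    public boolean frameLost(long nowMs) {
        return nowMs - lastFrameArrivalMs > expectedIntervalMs + TOLERANCE_MS;
    }
}
```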
  • the software system of the electronic device in the embodiment of the present application shown in FIG. 2A may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the Android system with layered architecture is taken as an example to illustrate the software structure of the electronic device.
  • FIG. 2B is a block diagram of the software structure of the electronic device according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are application program layer, application program framework layer, Android runtime and system library, and kernel layer from top to bottom.
  • the application layer can consist of a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application package may include a screen-casting application.
  • The screen projection application in the source-side device 100 is used to support the operations performed by the source-side device 100 in the subsequent method embodiments for smoothly displaying images in screen projection; the screen projection application in the device-side device 200 is used to support the operations performed by the device-side device 200 in those method embodiments.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of electronic devices. For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
  • The notification manager may also prompt with text information in the status bar, issue a prompt sound, vibrate the electronic device, flash the indicator light, and so on.
  • the Android Runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • FIG. 3 exemplarily shows the flow of the method for smoothly displaying images in screen projection.
  • the method may include the following steps:
  • the source-side device 100 and the device-side device 200 establish a communication connection.
• specifically, the source-side device 100 may detect a user operation input by the user and, in response to that operation, turn on one or more of WLAN, Bluetooth, NFC, or the mobile network in the wireless communication module 160, and then discover, through one or more of the Wi-Fi, Bluetooth, NFC, and mobile network wireless communication technologies, other devices with which a communication connection for screen mirroring can be established.
• after discovering multiple devices, the source device 100 may display the identifiers of these devices in the user interface for the user to select one or more of them to establish a communication connection.
  • FIG. 4A and FIG. 4B exemplarily show the process of establishing a communication connection between the source-side device 100 and the device-side device 200 .
  • FIG. 4A exemplarily shows an exemplary user interface 41 on the source-side device 100 for displaying installed application programs.
  • the user interface 41 displays: a status bar, a calendar and time indicator, a weather indicator, a page indicator, a tray with frequently used application icons, other application icons, and the like. Not limited thereto, in some embodiments, the user interface 41 shown in FIG. 4A may further include a navigation bar, a side bar, and the like. In some embodiments, the user interface 41 exemplarily shown in FIG. 4A may be called a main interface (home screen).
• the source-side device 100 can display the window 111 shown in FIG. 4B on the user interface 41.
  • a control 111A can be displayed in the window 111 , and the control 111A can accept user operations (such as touch operations and click operations) to enable/disable the mirroring function of the source device 100 .
  • the expression form of the control 111A may include an icon and/or text (for example, the text "screen projection", “wireless screen projection”, “multi-screen interaction", “big screen projection”, etc.).
  • Window 111 may also display switch controls for other functions such as Wi-Fi, Bluetooth, flashlight and the like.
  • the source-side device 100 may detect a click operation on the control 111A, and enable the screen mirroring function. In some embodiments, after detecting the user operation on the control 111A, the source-side device 100 may change the display form of the control 111A, such as adding a shadow when the control 111A is displayed.
  • the user may also input a slide-down operation on other interfaces of the setting application or user interfaces of other applications to trigger the source-side device 100 to display the window 111 .
  • the user operation performed on the control 111A in the window 111 shown in FIGS. 4A and 4B is not limited.
  • the user operation for enabling the screen mirroring function may also be an operation for enabling a function option in the setting application.
  • the user can also put the source-side device 100 close to the NFC tag of the end-side device 200 to trigger the source-side device 100 to start the screen mirroring function.
  • the embodiment of the present application does not limit the user operation of enabling the screen mirroring function.
• the source-side device 100, in response to the user operation of enabling the mirror projection function, turns on one or more of WLAN, Bluetooth, or NFC in the wireless communication module 160, and can discover, through one or more of the Wi-Fi, Bluetooth, and NFC wireless communication technologies, screen-casting devices with which a screen-casting communication connection can be established.
• after the source-side device 100 discovers screen-casting devices with which a communication connection can be established, it may display the identifiers of these screen-casting devices, for example in the window 112 shown in FIG. 4C.
  • window 112 may display: identifications and connection options of one or more screen-casting devices.
  • the source-side device 100 may detect a user operation on the connection option, and establish a communication connection with the screen-casting device indicated by the device identifier corresponding to the option.
  • the identifiers and connection options of the one or more screen-casting devices include an identifier 112A and a connection option 112B.
• when the source-side device 100 detects a user operation on the connection option 112B, in response to that operation the source-side device 100 can send a communication connection request to the device whose identifier "HUAWEI 20" is displayed in the identifier 112A.
  • the connection option 112B can be updated to an option 112C as shown in FIG. 4D , which is used to prompt the user that the source device 100 is searching for a device that can be used for screen mirroring.
  • this embodiment of the present application does not limit the user operation of the source-side device 100 to select a device to establish a communication connection.
• the source-side device 100 can display other information besides the identifiers of the devices capable of screen projection, such as the device types of those devices, which is not limited in this embodiment of the present application.
  • the device identified as "HUAWEI 20" After the device identified as "HUAWEI 20" receives the communication connection request sent by the source device 100, it can display a user interface 42 as shown in Figure 4E, which includes a window 201, which is used to prompt Whether the user agrees to establish a communication connection.
  • the window 201 may include: a confirm control 201A and a cancel control 201B.
  • the confirmation control 201A can establish a communication connection with the source-side device 100 in response to user operations.
  • the device identified as "HUAWEI20" is the end-side device 200 that establishes a communication connection with the source-side device 100.
• thereafter, the end-side device 200 may display the user interface provided by the source-side device 100; for details, refer to the user interfaces displayed on the end-side device 200 in the subsequent description.
  • the cancel control 201B may refuse to establish a communication connection with the source-side device 100 in response to a user operation.
  • the end-side device 200 may not display the prompt information, that is, the window 201 shown in FIG. 4E may not be displayed, and directly establish a communication connection with the source-side device 100 .
  • both the source-side device 100 and the device-side device 200 can run a screen projection application, so as to support the establishment of a communication connection between the two devices and the subsequent mirroring and screen projection process.
  • the communication connection established between the source-side device 100 and the end-side device 200 can be a Wi-Fi connection, a Bluetooth connection, an NFC connection, or a remote connection, etc., or a wired connection such as a data line-based connection, which is not limited in this embodiment of the application.
• S102: the source-side device 100 captures the screen content according to the screen projection frame rate, obtains image frames, and sends the image frames to the end-side device 200.
  • the screen-casting frame rate of the source-side device 100 is determined by the source-side device 100 according to the changing speed of the picture displayed on its own display screen.
  • the screen displayed on the display screen of the source-side device 100 and the changes of the screen are controlled by the user independently.
  • the user can manipulate the source-side device 100 to perform any type of operation or function, and the source-side device 100 can display different content on the display screen in response to the operation input by the user.
  • the source-side device 100 may start a social application and display social content in response to a user operation.
  • the source-side device 100 may, in response to a user operation, start a reading application to display novel text.
• for another example, the source-side device 100 may start other applications in response to a user operation, and so on.
  • the user operation input by the user on the source-side device 100 for changing the displayed content of the source-side device 100 determines the screen-casting frame rate of the source-side device 100 .
  • the screen projection frame rate of the source-side device 100 does not exceed the full frame rate, that is, the maximum frame rate.
  • the full frame rate may be preset by the source-side device 100, for example, it may be 60 FPS, etc., which is not limited here.
• the source-side device 100 can continuously capture the screen content displayed on its display screen according to the determined screen projection frame rate, obtain the corresponding image frames, and, based on the communication connection with the device-side device 200, send the image frames to the device-side device 200.
• the source-side device 100 may send each image frame to the device-side device 200 as soon as it is captured, that is, continuously send the image data stream to the device-side device 200 during the screen mirroring process. Equivalently, S102 is performed multiple times throughout the mirroring process.
  • the source-side device 100 may stamp each image frame with a time stamp according to the capture time, or stamp a sequence number according to the sequence of capture.
• the source-side device 100 may encode the captured image frames and send them to the device-side device 200. Due to insufficient computing power or other reasons, the source-side device 100 may spend considerable time in the encoding stage.
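• To make the flow of S102 concrete, the following is a minimal sketch of a capture-tag-encode-send loop; captureScreen(), encode(), and send() are hypothetical placeholders, and pacing by sleeping is a simplification of real frame-rate control:

```java
import java.util.concurrent.TimeUnit;

public class CaptureLoop {
    /** One captured frame, tagged with a capture time stamp and a sequence number. */
    static class TaggedFrame {
        final byte[] pixels;
        final long captureTimeMs;
        final long seq;
        TaggedFrame(byte[] pixels, long captureTimeMs, long seq) {
            this.pixels = pixels;
            this.captureTimeMs = captureTimeMs;
            this.seq = seq;
        }
    }

    private long sequence = 0;

    void run(int projectionFrameRate) throws InterruptedException {
        long framePeriodMs = 1000L / projectionFrameRate;
        while (true) {
            byte[] pixels = captureScreen();                    // hypothetical screen grab
            TaggedFrame frame =
                    new TaggedFrame(pixels, System.currentTimeMillis(), sequence++);
            send(encode(frame));                                // encoding may dominate latency
            TimeUnit.MILLISECONDS.sleep(framePeriodMs);         // pace to the projection frame rate
        }
    }

    private byte[] captureScreen() { return new byte[0]; }      // placeholder
    private byte[] encode(TaggedFrame f) { return f.pixels; }   // placeholder
    private void send(byte[] data) { /* transmit over the communication connection */ }
}
```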
• S103: the source-side device 100 synchronizes the screen projection frame rate to the device-side device 200.
• specifically, the source-side device 100 can periodically synchronize the screen projection frame rate to the device-side device 200 based on the communication connection between them, or synchronize the new screen projection frame rate to the end-side device 200 whenever the screen projection frame rate changes.
  • S103 will be executed multiple times during the screen mirroring process.
  • the source-side device 100 may send the screen projection frame rate and the image frame intercepted in S102 to the device-side device 200 together, or separately, which is not limited in this embodiment of the present application.
• S104: the source-side device 100 starts the first application, and displays the first user interface on the display screen.
  • the embodiment of the present application does not limit the order of S101 and S104.
  • the source device 100 may first execute S101 and then execute S104 during the execution of S102, or may execute S104 first and then execute S101 and S102.
  • the flow chart shown in FIG. 3 is described as an example of the former execution sequence.
  • the first application is an application that can slide or scroll to display content in the user interface in response to user operations, such as a browser, a social networking application, a reading application, and the like.
  • FIG. 5A and FIG. 5B exemplarily show the process of starting the first application by the source-side device 100 .
  • FIG. 5A is a user interface 41 displayed by the source-side device 100 , and the user interface 41 may be a main interface provided by the source-side device 100 .
  • the source-side device 100 may detect a user operation (such as a click operation, a touch operation, etc.) on the social application icon 501 in the main interface.
  • the source-side device 100 may display the user interface 51 provided by the social application as shown in FIG. 5B .
  • the user interface 51 is an example of the first user interface.
  • the method for starting the first application is not limited to that shown in FIG. 5A , and in some other embodiments, the source-side device 100 may also start the first application in other ways.
  • the source-side device 100 may also start the first application in response to the voice instruction.
• for another example, the source-side device 100 can be used with a mouse; after the mouse cursor is positioned on the icon of the first application, the source-side device 100 can start the first application in response to a double-click operation received on the mouse, and so on.
  • the first user interface may include system content and page content.
  • system content refers to the content provided by the system application program when the source device 100 runs the system application program, such as a status bar, a system navigation bar, and the like.
  • System content is usually displayed in a fixed area on the display screen, and the electronic device will not change the position where the system content is displayed according to user operations.
  • the page content refers to the content provided by the first application currently running in the foreground on the source-side device 100 , and may include, for example, an application title bar, an application menu bar, an application internal navigation bar, application content, and the like.
  • the source-side device 100 can load the page content provided by the first application through the network or locally.
  • the first user interface only displays part of the page content, and when the source-side device 100 receives the user's operation of sliding up and down on the screen, other parts of the page content may be displayed in the first user interface. That is to say, the content of a page provided by the first application is relatively long, and usually only a part of the content of the page is displayed on the display screen.
  • the source-side device 100 may detect a user operation of sliding up or down in the page content by the user, and in response to the operation, the source-side device 100 scrolls the page content in the scrollable area in the first user interface. In this way, users can browse more page content in detail according to their needs.
  • the page content may be the main page or other pages of the first application provided by the first application, which is not limited here.
  • the content of the page may come from the source device 100 locally, or from the network.
• in the user interface 51 shown in FIG. 5B, the status bar is system content, and the content other than the status bar is page content provided by the social application.
  • the status bar is located in area 502 of the display screen, and other content is located in area 503 of the display screen.
  • the first user interface may also only include page content.
  • the user interface 51 shown in FIG. 5B may also only include other content other than the status bar, that is, only include the content in the area 503 .
  • the display area where the first user interface is located can be divided into two parts: a scrollable area and a non-scrollable area.
  • the scrollable area refers to an area that can change and scroll to display different content in response to a user's operation (for example, a gesture of sliding up and down).
  • the content displayed in the scrollable area can be scrolled and displayed in response to a user's operation (for example, a gesture of sliding up and down).
  • the content displayed in the scrollable area may include social content, news, fiction text, pictures, and the like.
  • a partial area 503a in the area 503 is a scrollable area, and multiple pieces of content are displayed in the scrollable area.
  • the non-scrollable area refers to an area that does not scroll to display different content in response to a user's operation (for example, a gesture of sliding up and down).
  • the content displayed in the non-scrollable area will not be scrolled in response to user operations (for example, gestures of sliding up and down).
  • the content displayed in the non-scrollable area may include: system content, part of page content such as menu bar, search bar, application navigation bar and so on.
  • the area 502 where the status bar is located, and the partial area 503b in the area 503 are non-scrollable areas.
  • a menu bar is displayed in the non-scrollable area 503b.
  • different contents may be displayed in the scrollable area 503a.
  • the display area where the first user interface is located may also only include a scrollable area and not include a non-scrollable area.
  • the user interface 51 shown in FIG. 5B may only display the page content displayed in the scrollable area 503a.
  • the content of the first user interface depends on the display mechanism of the source-side device 100 and the content provided by the first application.
• the positions of the scrollable area and the non-scrollable area on the display screen depend on the position of each item of content in the first user interface on the display screen.
• Step S105: the source-side device 100 synchronizes the application information of the first application to the device-side device 200.
  • the source-side device 100 may periodically synchronize the application information of the application running on the source-side device 100 to the device-side device 200 based on the communication connection with the device-side device 200, or When the application changes, the application information of the newly running application is synchronized to the end-side device 200 .
  • the application may be the first application or other applications.
  • the source-side device 100 may send the application information of the application and the image frame intercepted in S102 to the device-side device 200 together, or separately, which is not limited in this embodiment of the present application.
  • the application information of an application may include any one or more of the following: the identification of the application (such as name, code), the type of application to which the application belongs (such as browser, social application, reading application, etc.), the user interface provided by the application Information.
  • the information of the user interface may include, for example, the type of the user interface, the position and size of the scrollable area in the user interface, the position and size of the non-scrollable area, and the like.
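• For illustration only, the application information synchronized in S105 might be represented as a structure like the following; every field name here is an assumption based on the list above, not the embodiment's actual format:

```java
/** Hypothetical representation of the application information synchronized in S105. */
public class AppInfo {
    String appId;              // identification of the application (name or code)
    String appType;            // type of application, e.g. "browser", "social", "reading"
    String uiType;             // type of the user interface currently provided
    int[] scrollableArea;      // {x, y, width, height} of the scrollable area
    int[] nonScrollableArea;   // {x, y, width, height} of the non-scrollable area
}
```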
• S106: the source-side device 100 scrolls and displays the page content located in the scrollable area of the first user interface at a speed greater than a first value.
  • the source-side device 100 may execute S106 during the execution of S102 after executing S101.
  • the source-side device 100 may receive a first operation input by the user on the source-side device 100, and execute S106.
• the first operation may be a sliding operation detected by the source-side device 100 acting on the scrollable area of the display screen (for example, a sliding gesture in any direction, such as an upward or downward sliding gesture); when the source-side device 100 is used with a mouse, it may be a sliding operation received on the mouse after the mouse cursor is located in the scrollable area; it may also be a voice command; and so on.
  • the source-side device 100 scrolling to display the page content in the scrollable area means that the page content in the scrollable area scrolls or moves in a certain direction at a certain speed. During this process, some page content is moved out of the scrollable area and is no longer displayed, some page content changes its position in the scrollable area, and new page content appears in the scrollable area.
• the direction in which the source-side device 100 scrolls and displays page content is related to the first operation. If the first operation is a sliding operation acting on the scrollable area, the direction in which the source-side device 100 scrolls and displays the page content may be the same as the direction of the sliding operation. For example, if the first operation is a sliding operation in any direction, the source-side device 100 may scroll and display the page content in that direction. In some other embodiments, the source-side device 100 presets that the page content can only be scrolled in a fixed direction (such as upward or downward); in that case, upon receiving a sliding operation, the source-side device 100 scrolls the page content in this fixed direction.
  • the direction in which the source-side device 100 scrolls the page content in the scrollable area may refer to upward, downward, or the like.
  • scrolling up refers to that the page content displayed on the display screen moves in a direction from the bottom of the display screen to the top.
  • scrolling down means that the page content displayed on the display screen moves in a direction from the top of the display screen to the bottom end.
  • the speed at which the source-side device 100 scrolls and displays page content is related to the first operation. If the first operation is a sliding operation acting on the scrollable area, the speed at which the source-side device 100 scrolls and displays page content is related to the speed of the sliding operation. Specifically, while receiving the sliding operation acting on the scrollable area, the source-side device 100 scrolls the page content in the scrollable area in the scrolling direction at the speed of the sliding operation in the scrolling direction. At this time, from the user's point of view, the page content in the scrollable area is moving following the sliding operation of the hand. After the user finishes inputting the sliding operation, the source-side device 100 will continue to scroll and display the page content in the scrollable area in the scrolling direction according to the inertia.
  • the source-side device 100 may slowly reduce the speed of scrolling the page content until the scrolling stops after the user finishes inputting the sliding operation. In some other implementation manners, the source-side device 100 may first increase the speed of scrolling the content of the page after the user finishes inputting the sliding operation, and then slowly decrease the speed of scrolling the content of the page until the scrolling stops. It can be seen that during the process of the source-side device 100 scrolling and displaying the page content located in the scrollable area in the first user interface, the scrolling speed will change according to the first operation.
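• A minimal sketch of the inertial deceleration just described, in which the scroll speed decays frame by frame until scrolling stops; the friction factor and stop threshold are assumed values:

```java
public class InertialScroll {
    static final double FRICTION = 0.95;       // assumed per-frame decay factor
    static final double STOP_THRESHOLD = 0.5;  // assumed minimum speed, px per frame

    /** Continues scrolling after the sliding operation ends, slowly reducing the speed. */
    static void fling(double initialSpeed) {
        double speed = initialSpeed;           // px per frame, taken from the sliding operation
        while (speed > STOP_THRESHOLD) {
            scrollBy(speed);                   // hypothetical: move the page content by `speed`
            speed *= FRICTION;                 // slowly reduce the scrolling speed
        }
    }

    static void scrollBy(double px) { /* move the page content in the scrolling direction */ }
}
```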
• the motion vector V of the sliding operation on the display screen can be calculated by the following formula:

  $V = \sqrt{X_v^2 + Y_v^2}$

• $X_v$ and $Y_v$ are the components of the motion vector of the sliding operation in the X direction and the Y direction, respectively.
• the X direction and the Y direction may respectively be the direction from the left side of the display screen to the right side, and the direction from the bottom of the display screen to the top.
• when the first operation meets a certain condition, for example when the speed of the sliding operation is greater than a certain value, the source-side device 100 scrolls and displays the page content in the scrollable area of the first user interface at a faster speed (for example, a speed greater than the first value).
  • the first value can be set in advance, which is not limited here.
• when the source-side device 100 scrolls and displays the page content in the scrollable area of the first user interface at a faster speed, the picture displayed on the display screen changes faster, and the screen-casting frame rate determined by the source-side device 100 is correspondingly higher. Therefore, during the execution of S106, the source-side device 100 captures the screen content at this larger screen projection frame rate, obtains multiple image frames, and sends the image frames to the device-side device 200.
  • FIG. 5B-FIG. 5E exemplarily show the scene when the source-side device 100 scrolls and displays the page content located in the scrollable area in the first user interface.
  • the source-side device 100 starts to detect the upward sliding operation on the scrollable area 503a in FIG. 5B , and detects that the user ends the above-mentioned upward sliding operation in FIG.
• following the sliding operation, the page content in the scrollable area 503a is scrolled upward.
  • the page content in the scrollable area 503a can follow the user's hand to scroll, and the area where the user's hand touches the display screen during the movement displays the same content, such as the bottom of the animal picture in FIG. 5B and FIG. 5C .
  • the source-side device 100 continues to scroll upwards to display the page content in the scrollable area 503a according to the inertia.
  • FIGS. 5B-5E only exemplarily show the content displayed on the display screen when the source-side device 100 scrolls to display page content.
  • the source-side device 100 can display more images during the scrolling process.
  • the image frames obtained by the source-side device 100 by using the determined screen projection frame rate to capture the screen content may include four image frames in the user interface shown in FIGS. 5B-5E .
  • the four image frames shown in FIGS. 5B-5E intercepted by the source-side device 100 are called image frame 1, image frame 2, image frame 3, and image frame 4, respectively.
  • the end-side device 200 may receive the second operation input by the user on the end-side device 200 during the process of displaying the screen projection content, and send the operation information of the second operation to the source-side device 100, to trigger the source-side device 100 to execute S106.
  • the user can manipulate the content displayed by the source-side device 100 on the device-side device 200 .
• the second operation may be a sliding operation detected by the terminal-side device 200 acting on the mirror projection area of the display screen used for displaying the projected content (for example, a sliding gesture in any direction, such as an upward or downward sliding gesture); when the end-side device 200 is used with a mouse, a sliding operation received on the mouse after the mouse cursor is located in the scrollable area; when the end-side device 200 is used with a remote control, a click operation received on the remote control after the remote control selects the scrollable area; it may also be a voice command; and so on.
• for the manner (such as speed, direction, etc.) in which the source-side device 100 scrolls and displays the page content in the scrollable area when triggered by the second operation detected by the end-side device 200, refer to the manner in which the first operation detected by the source-side device 100 triggers the scrolling display of the page content in the scrollable area; details are not repeated here.
• Step S107: the source-side device 100 synchronizes the operation information of the first operation to the device-side device 200.
  • the source-side device 100 may synchronize the detected operation information to the device-side device 200 based on the communication connection with the device-side device 200 .
  • the operation may include the first operation or other operations.
  • the source-side device 100 may send the operation information and the image frame intercepted in S102 to the device-side device 200 together, or separately, which is not limited in this embodiment of the present application.
  • the operation information may include any one or more of the following: type of operation (such as sliding operation type), direction of operation, speed of operation, duration of operation, trajectory of operation, motion vector of operation.
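• Similarly, for illustration, the operation information synchronized in S107 might be represented as follows; the field names are assumptions matching the list above:

```java
/** Hypothetical representation of the operation information synchronized in S107. */
public class OperationInfo {
    String type;           // type of operation, e.g. a sliding operation
    double direction;      // direction of the operation, e.g. an angle on the screen
    double speed;          // speed of the operation
    long durationMs;       // duration of the operation
    float[][] trajectory;  // sampled (x, y) points of the operation's trajectory
    float[] motionVector;  // motion vector of the operation, e.g. {Xv, Yv}
}
```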
• S108: the end-side device 200 receives the image frames sent by the source-side device 100, and sends the received image frames into the display queue.
  • the image frame sent by the source-side device 100 to the device-side device 200 may be lost during the communication process.
  • part of the image frames sent by the source-side device 100 may not be received by the device-side device 200 .
• for example, the device-side device 200 may receive only image frame 1 and image frame 2, while image frame 3 and image frame 4 are lost during communication.
• the end-side device 200 may receive encoded image frames and perform a decoding operation on them to obtain the image frames. Due to insufficient computing power or other reasons, the end-side device 200 may spend considerable time decoding to acquire the image frames.
• the device-side device 200 can use any of the following strategies to send received image frames into the display queue. Strategy 1: the device-side device 200 sends the image frames into the display queue sequentially, in the order in which they are received.
• Strategy 2: if the source-side device 100 stamps the image frames with time stamps, the end-side device 200 can send the image frames into the display queue sequentially, in the order of the times indicated by the time stamps.
• Strategy 3: if the source-side device 100 stamps the image frames with serial numbers, the device-side device 200 can send the image frames into the display queue sequentially, in the order of the serial numbers.
• the end-side device 200 can discard a received outdated image frame to avoid sending it into the display queue, or remove it after it has been sent into the display queue.
• an outdated image frame refers to an image frame for which, when it is received or decoded by the terminal-side device 200, the time elapsed since it was captured by the source-side device 100 exceeds a first duration.
  • the end-side device 200 may discard a part of image frames (for example, 2 frames) in the front of the display queue when the number of image frames in the display queue exceeds a certain number (for example, 2).
• the device-side device 200 may also, according to the time stamp carried in an image frame, directly discard image frames for which the interval between the capture time and the receiving time or decoding time is greater than the first duration. In this way, by discarding outdated image frames, the end-side device 200 can keep the latency of mirror projection low, give the user the feeling that the end-side device 200 and the source-side device 100 display images synchronously, and provide a better experience.
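• The queueing and discard policies above can be sketched as follows, assuming frames arrive roughly in order (so that appending preserves Strategies 1-3); the maximum age and queue size are illustrative values, not the embodiment's:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class DisplayQueue {
    static class Frame {
        final long captureTimeMs;   // time stamp stamped by the source-side device
        final long seq;             // or a serial number, for Strategy 3
        Frame(long captureTimeMs, long seq) { this.captureTimeMs = captureTimeMs; this.seq = seq; }
    }

    private final Deque<Frame> queue = new ArrayDeque<>();
    private static final int MAX_SIZE = 4;        // predetermined queue size
    private static final long MAX_AGE_MS = 200;   // assumed "first duration"

    synchronized void offer(Frame f) {
        // Discard outdated frames instead of enqueuing them (low-latency policy).
        if (System.currentTimeMillis() - f.captureTimeMs > MAX_AGE_MS) return;
        queue.addLast(f);
        // If the queue grows beyond its size, drop the oldest frames at the front.
        while (queue.size() > MAX_SIZE) queue.pollFirst();
    }

    /** First-in-first-out: the frame that entered earliest is displayed first. */
    synchronized Frame poll() { return queue.pollFirst(); }
}
```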
• the display queue (send-to-display queue) is a queue in the end-side device 200 used to provide images to the display screen for display.
  • the images in the display queue are sequentially provided to the display screen for display according to the sequence in which they are sent into the queue.
  • the image frames that have been provided to the display screen are no longer stored in the display sending queue.
  • the display queue can have a predetermined size, for example, it can store up to 4 image frames. It can be seen that the image frames contained in the display queue will be updated in real time, and the image frames contained in the display queue can be different at different times.
• a cache may also be provided in the end-side device 200, and the image frames that the display queue has already provided to the display screen for display may be stored in the cache.
  • the end-side device 200 follows the first-in-first-out rule, and takes out the image frame that is first sent to the display queue from the display queue for display at a fixed frequency.
  • the end-side device 200 may continuously generate a synchronization signal at the fixed frequency, and when the synchronization signal arrives, take an image frame from the display sending queue for display.
  • the time point at which the synchronization signal is generated will be referred to as a synchronization time point in subsequent embodiments.
  • FIG. 6 exemplarily shows the image frames included in the display queue of the end-side device 200 at different times within a period of time.
  • the abscissa is the time axis.
  • the synchronization signal is generated at the projection frame rate.
  • the synchronization time point for generating the synchronization signal is the time point when the end-side device 200 fetches the image frame from the display queue for display.
  • the current time point is located between synchronization time points 2 and 3, and the future time that has not yet arrived is after the current time point.
• FIG. 6 shows the display queue in four different time periods; for the time period corresponding to each display queue, refer to the time axis below it.
• in the first time period, the display queue includes image frame a, and image frame a is provided to the display screen of the end-side device 200 for display when synchronization signal 1 arrives.
• in the second time period, a new image frame b is added to the display queue, and image frame b is provided to the display screen of the end-side device 200 for display when synchronization signal 2 arrives.
• in the third time period, the end-side device 200 receives image frame c and image frame d successively, and the display queue receives image frame c and image frame d in order; image frame c is provided to the display screen of the end-side device 200 for display when synchronization signal 3 arrives, and image frame d is provided to the display screen of the end-side device 200 for display when synchronization signal 4 arrives.
• in the fourth time period, the end-side device 200 successively receives image frames e-g, and the display queue receives image frames e-g in sequence; image frame e and image frame f are eliminated from the display queue because they are outdated image frames.
  • the image frame g is provided to the display screen of the end-side device 200 at the arrival time of the synchronization signal 7 for display.
• S109: the end-side device 200 predicts, according to the image frame(s) that most recently entered the display queue, the predicted image frame corresponding to a first time point.
  • the end-side device 200 executes S109 in any of the following scenarios:
• Scenario 2 concerns the communication quality between the end-side device 200 and the source-side device 100, which can be measured by parameters such as communication signal strength, communication delay, and signal-to-noise ratio.
• when the communication quality between the end-side device 200 and the source-side device 100 is lower than a certain value, the communication quality between the two parties is poor, and image frames sent by the source-side device 100 to the end-side device 200 are very likely to be lost during communication. That is to say, frame loss is very likely in Scenario 2, and executing S109 in Scenario 2 can effectively avoid the bad experience that possible frame loss would bring to the user.
• Scenario 3: after the terminal-side device 200 establishes a communication connection for screen mirroring with the source-side device 100, if the source-side device 100 runs a specified type of application in the foreground and executes S106, that is, scrolls and displays the page content located in the scrollable area of the first user interface at a speed greater than the first value, then the end-side device 200 executes S109.
  • the end-side device 200 may determine whether the source-side device 100 is running a specified type of application in the foreground according to the application information synchronized by the source-side device 100 in step S105 above.
  • the specified type of application is an application that slides or scrolls to display content in a user interface in response to user operations, and may include, for example, a browser, a social networking application, a reading application, and the like.
  • the end-side device 200 may also determine whether the source-side device 100 has received a specified type of operation according to the operation information synchronized by the source-side device 100 in step S107 above. Alternatively, the end-side device 200 may also determine whether it has received an operation of a specified type.
  • the specified type of operation refers to an operation used to trigger the source device 100 to scroll and display the page content in the scrollable area in the first user interface at a speed greater than a first value, such as a sliding operation with a speed greater than a certain value, and so on.
• when the source-side device 100 runs an application of the specified type in the foreground and executes S106, it performs screen mirroring at a higher screen-casting frame rate. If frames are lost at a high screen projection frame rate, the user perceives very obvious visual stutter. Therefore, executing S109 in Scenario 3 can effectively avoid the bad experience that frame loss at a high screen projection frame rate would bring to the user.
• Scenario 4: during the screen mirroring process, the display frame rate of the end-side device 200 is lower than the screen projection frame rate; the end side may then execute S109.
• the display frame rate of the end-side device 200 is the number of image frames displayed on the display screen per unit time during the mirroring process. For example, if the end-side device 200 displays the projected screen content according to the display queue shown in FIG. 6, the display frame rate during the generation period of synchronization signals 4-6 is 0 FPS.
• executing S109 in Scenario 4 means that S109 can be executed as soon as a relatively serious frame-loss problem begins to occur, avoiding the bad experience that subsequent continuous frame loss would bring to the user.
  • the end-side device 200 may also perform S109 in combination with any of the above-mentioned scenarios.
  • the end-side device 200 may execute S109 under the condition that Scenario 3 and Scenario 4 are satisfied at the same time.
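• For illustration, the trigger scenarios above can be folded into a single check; the threshold value and the choice to require Scenario 3 and Scenario 4 together are assumptions, since the embodiments leave the combination open:

```java
public class PredictionTrigger {
    static final double LINK_QUALITY_THRESHOLD = 0.5;  // assumed normalized quality threshold

    /** Returns true if the end-side device should execute S109. */
    static boolean shouldPredict(double linkQuality,
                                 boolean foregroundAppScrollable,
                                 boolean fastScrollOngoing,
                                 double displayFps, double castFps) {
        boolean scenario2 = linkQuality < LINK_QUALITY_THRESHOLD;   // frame loss is likely
        boolean scenario3 = foregroundAppScrollable && fastScrollOngoing;
        boolean scenario4 = displayFps < castFps;                   // frames already missing
        return scenario2 || (scenario3 && scenario4);               // e.g. Scenarios 3 and 4 together
    }
}
```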
  • the end-side device 200 executes S109 each time a new image frame is received.
  • S109 will be executed multiple times.
  • the first time point is after the time point when S109 is currently executed.
• the first time point may include: one or more synchronization time points at which, if the device-side device 200 displayed images only according to the image frames currently in the display queue, the display frame rate of the device-side device 200 would be lower than the screen projection frame rate. That is to say, assuming the end-side device 200 displays according to the image frames in the current display queue, the first time point includes one or more synchronization time points after all the image frames in the current display queue have been displayed.
  • the number of the multiple synchronization time points may be a fixed number, which may be preset by the end-side device 200 or independently set by the user, for example, may be two.
• for example, in FIG. 6, the synchronization time points after synchronization time point 4 are first time points.
• for instance, synchronization time points 5 and 6 may be the first time points.
  • the following describes how the end-side device 200 predicts the predicted image frame corresponding to the first time point using the latest image frame entering the display queue.
• for the latest image frames entering the display queue, refer to the related description of S108.
  • the latest image frame entering the display queue is the latest image frame received by the end-side device 200 .
  • the end-side device 200 can obtain the latest image frame entering the display queue from the display queue and/or the cache.
  • the end-side device 200 executes S109 at the current time point, based on the current time point, the latest two image frames sent to the display queue are image frame c and image frame d, and the end-side device 200 may acquire image frame c and image frame d from the current display queue.
  • the end-side device 200 may use the latest two image frames entering the display queue to predict the first predicted image frame corresponding to the first time point.
• the two image frames that most recently entered the display queue may be two adjacent image frames captured by the source-side device 100, or image frames captured by the source-side device 100 at two non-adjacent times. That is to say, the image frame most recently captured by the source-side device 100 before capturing image frame 2 may be image frame 1 or another image frame.
• the following takes the two image frames that most recently entered the display queue in FIG. 6, namely image frame c and image frame d, as an example.
• assume that image frame c is specifically image frame 1 shown in FIG. 5B,
• and that image frame d is specifically image frame 2 shown in FIG. 5C.
  • the process of the end-side device 200 predicting the first predicted image frame corresponding to the first time point according to the two latest image frames entering the display queue may include the following steps:
• S1091: the end-side device 200 compares image frame 1 and image frame 2, and determines a motion area in image frame 2.
• the motion area in image frame d refers to the area in image frame d whose displayed content has changed compared with image frame c; it is equivalent to the area in image frame d where the content of the scrollable area of the user interface provided by the source device 100 is displayed.
  • S1091 performed by the end-side device 200 specifically includes the following steps:
• Step 1: traverse all pixels of image frame 2, determine the change value of each pixel of image frame 2 compared with the pixel at the same position in image frame 1, then determine the pixels whose pixel-value change exceeds a threshold T, and obtain area 1 composed of these pixels.
• in other words, the end-side device 200 binarizes the difference between image frame 1 and image frame 2 against the threshold to find out the motion area.
• the change value of each pixel in image frame 2 can be calculated in the following way:

  $D_k(x, y) = \lvert f_k(x, y) - f_{k-1}(x, y) \rvert$

• $f_{k-1}(x, y)$ and $f_k(x, y)$ represent the pixel values of the pixel with coordinates $(x, y)$ in image frame 1 and image frame 2, respectively, and $D_k(x, y)$ represents the change value of the pixel with coordinates $(x, y)$ in image frame 2 compared to image frame 1.
• the threshold T can be preset; for example, it can be calculated in advance and acquired dynamically according to the maximum between-class variance method, or T can be an empirical value.
• the area 1 determined in step 1 may not have a standard shape (such as a rectangle), and it may be a discrete area.
• the scrolling area, however, is usually an area with a standard shape and a concentrated distribution.
• for example, the actual scrolling area should be the scrollable area 503a in FIG. 5B, while area 1 may be only a non-standard, discrete part of the actual scrollable area 503a.
  • the end-side device 200 may further perform a correction operation on the basis of the area 1.
• Step 2: correct area 1 to obtain area 2.
• the end-side device 200 can standardize the shape of area 1; for example, a standardized rectangle can be obtained from the shape of area 1. Specifically, the end-side device 200 can obtain the maximum abscissa $x_{max}$, the minimum abscissa $x_{min}$, the maximum ordinate $y_{max}$, and the minimum ordinate $y_{min}$ over the pixels in area 1, determine the four coordinate points $(x_{min}, y_{min})$, $(x_{min}, y_{max})$, $(x_{max}, y_{min})$, $(x_{max}, y_{max})$, and take the area formed by these four coordinate points as the standard area.
  • the end-side device 200 can de-discretize the shape of the area 1, for example, a concentrated area can be obtained according to the shape of the area 1.
• for example, the end-side device 200 may take the union of area 1 and the motion regions determined during image-frame prediction within a recent period of time to obtain area 2. If the end-side device 200 has not predicted image frames in the preceding period, it can use two or more image frames that entered the display queue before image frame 1 and image frame 2 to perform one or more operations similar to step 1 above, obtain one or more areas, and then take the union of these areas and area 1. Since the user operations input by the user on the source-side device 100 are generally the same within a period of time, and the scrollable area in the source-side device 100 does not change, a relatively precise motion area can be obtained in this way.
  • the end-side device 200 can combine the user's operation habit of sliding the source-side device 100 to obtain a concentrated area according to the shape of the area 1 .
  • the device-side device 200 can directly expand the width of the region 1 to the width of the image frame 2 .
  • the device-side device 200 may directly extend the length of the area 1 to the length of the image frame 2 on the basis of the area 1 . This method takes into account the user's operation and the manner in which the source-side device 100 responds to the operation, and can obtain a relatively accurate motion area.
  • Region 2 is the motion region in image frame 2.
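• The following is a minimal sketch of S1091 under the assumption of grayscale frames: per-pixel differencing against the threshold T (step 1), followed by bounding-box standardization into a rectangular area 2 (step 2):

```java
public class MotionRegion {
    /** Returns {xMin, yMin, xMax, yMax} of the pixels whose change exceeds t, or null if none. */
    static int[] detect(int[][] prev, int[][] cur, int t) {
        int xMin = Integer.MAX_VALUE, yMin = Integer.MAX_VALUE, xMax = -1, yMax = -1;
        for (int y = 0; y < cur.length; y++) {
            for (int x = 0; x < cur[0].length; x++) {
                int dk = Math.abs(cur[y][x] - prev[y][x]);  // D_k(x, y)
                if (dk > t) {                               // pixel belongs to area 1
                    xMin = Math.min(xMin, x); xMax = Math.max(xMax, x);
                    yMin = Math.min(yMin, y); yMax = Math.max(yMax, y);
                }
            }
        }
        if (xMax < 0) return null;                          // no motion detected
        // Step 2: the bounding rectangle standardizes area 1 into area 2.
        return new int[]{xMin, yMin, xMax, yMax};
    }
}
```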
• S1092: the end-side device 200 compares image frame 1 and image frame 2, and determines the motion vector $\vec{v}$ in image frame 2.
• the motion vector $\vec{v}$ indicates the distance and direction of the movement. That is to say, comparing image frame 1 and image frame 2, a given piece of displayed content or target point in image frame 1 is displayed in the motion area of image frame 2 after moving according to $\vec{v}$.
• in other words, the vector by which the same content moves from its position in image frame 1 to its position in image frame 2 is the motion vector $\vec{v}$.
• specifically, let image $f_1(x, y)$ represent image frame 1 and image $f_2(x, y)$ represent image frame 2, and let $F_1(u, v)$ and $F_2(u, v)$ be their Fourier transforms. The normalized cross-power spectrum is

  $R(u, v) = \dfrac{F_1(u, v)\, F_2^{*}(u, v)}{\left| F_1(u, v)\, F_2^{*}(u, v) \right|}$

  the impulse function $r(x, y)$ is obtained by the inverse Fourier transform of $R(u, v)$, and the offset vector $\vec{v}$ can be obtained by taking the peak value of the impulse function:

  $\vec{v} = \arg\max_{(x, y)} r(x, y)$
• the end-side device 200 can also use other methods to calculate the motion vector $\vec{v}$.
• for example, the end-side device 200 can calibrate marker pixels that display the same content in image frame 1 and image frame 2, and calculate the vector by which a marker pixel moves from its position in image frame 1 to its position in image frame 2, taking it as the motion vector $\vec{v}$.
• S1093: the end-side device 200 moves the content displayed in the motion area according to the motion vector $\vec{v}$, and after the move, fills the vacated part of the motion area with prediction data to obtain a predicted image frame; the prediction data is obtained from the content that image frame 2 displays in the vacated area.
• after the end-side device 200 moves the displayed content in the motion area according to $\vec{v}$, part of the content originally in the motion area is moved out of the motion area and is no longer displayed, and a vacated area appears in the original motion area.
• the vacated area and the area occupied by the removed content within the motion area may be the same size or different sizes.
• the end-side device 200 obtains prediction data from the content originally displayed in the vacated area of image frame 2, and fills the prediction data into the vacated area.
• the end-side device 200 can use image processing methods such as mean blur, median blur, Gaussian blur, bilateral blur, surface blur, box blur, double blur, bokeh blur, tilt-shift blur, iris blur, grain blur, radial blur, and directional blur to process the content originally displayed in the vacated area of image frame 2 to obtain the prediction data. Compared with the content originally displayed in the vacated area of image frame 2, prediction data obtained by such blur-based image processing has lower definition.
  • the end-side device 200 may also directly regard the content originally displayed in the vacated area of the image frame 2 as prediction data.
  • the end-side device 200 may also use a neural network algorithm to perform image prediction based on the content originally displayed in the vacated area of the image frame 2 to obtain prediction data.
  • the neural network algorithm can be trained by taking the content displayed by a large number of electronic devices in the user interface as input, and the content displayed by the electronic device after scrolling the content in the user interface as output. For example, if the content originally displayed in the vacated area of the image frame 2 is part of a typical pattern, the end-side device 200 may predict another part of the typical pattern through an algorithm.
  • the device-side device 200 can also store the image frames sent by the source-side device 100 within a period of time, for example, it can be stored in a cache, and the device-side device 200 can obtain prediction data based on the stored image frames .
• for example, if the user inputs multiple user operations (such as repeated up-and-down sliding operations) within a period of time, triggering the source-side device 100 to repeatedly display the same image frames, the device-side device 200 can obtain prediction data based on the previously displayed image frames.
• the above methods for obtaining prediction data may also be implemented in any combination.
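• A minimal sketch of the move-and-fill step above, assuming grayscale frames, a purely vertical motion vector (upward scrolling, dy < 0), and a simple horizontal box blur standing in for the blur methods listed; it is an illustration, not the embodiment's implementation:

```java
public class ShiftAndFill {
    /**
     * Shifts the rows of the motion area [top, bottom) of frame2 up by |dy| pixels,
     * then fills the vacated rows with a blurred copy of frame2's original content there.
     */
    static int[][] predict(int[][] frame2, int top, int bottom, int dy) {
        int h = frame2.length, w = frame2[0].length;
        int[][] out = new int[h][];
        for (int y = 0; y < h; y++) out[y] = frame2[y].clone();
        // Move the motion area's content according to the motion vector (dy < 0: upward).
        for (int y = top; y < bottom + dy; y++) out[y] = frame2[y - dy].clone();
        // The vacated rows are filled with prediction data: a blurred version of what
        // frame2 originally displayed there, which lowers the definition.
        for (int y = bottom + dy; y < bottom; y++) {
            for (int x = 0; x < w; x++) {
                int l = frame2[y][Math.max(0, x - 1)];
                int r = frame2[y][Math.min(w - 1, x + 1)];
                out[y][x] = (l + frame2[y][x] + r) / 3;  // simple box blur
            }
        }
        return out;
    }
}
```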
• in some other embodiments, the end-side device 200 may use the single image frame that most recently entered the display queue to predict the predicted image frame corresponding to the first of the first time points.
• the following takes the image frame d that most recently entered the display queue in FIG. 6 as an example for illustration, and assumes that image frame d is specifically image frame 2 shown in FIG. 5C.
  • the process of the end-side device 200 predicting the first predicted image frame corresponding to the first time point according to the latest image frame entering the display queue may include the following steps:
• S1091': the end-side device 200 determines the motion area in image frame 2 according to the application information of the first application.
• the motion area in image frame 2 refers to the area in image frame 2 whose displayed content has changed compared with the image frame most recently captured by the source device 100 before image frame 2 (for example, image frame 1); it is equivalent to the area in image frame 2 where the content of the scrollable area of the user interface provided by the source-side device 100 is displayed.
• specifically, the device-side device 200 can determine the position and size of the scrollable area in the first user interface according to the information of the first user interface contained in the application information of the first application sent by the source-side device 100, and determine the area where the content of the scrollable area is located when image frame 2 is displayed in the first user interface as the motion area of image frame 2.
• S1092': the end-side device 200 determines the motion vector $\vec{v}$ in image frame 2 according to the operation information of the first operation or the second operation.
• that is, when the source-side device 100 displayed the image frame it captured most recently before image frame 2 (for example, image frame 1), the page content in the motion area, after moving according to the motion vector $\vec{v}$, yields what image frame 2 displays.
• the motion vector $\vec{v}$ indicates the distance and direction of the movement. That is to say, comparing image frame 1 and image frame 2, a given piece of displayed content or target point in image frame 1 is displayed in the motion area of image frame 2 after moving according to $\vec{v}$.
• in other words, the vector by which the same content moves from its position in image frame 1 to its position in image frame 2 is the motion vector $\vec{v}$.
• specifically, the device-side device 200 can determine the motion vector $\vec{v}$ in image frame 2 according to the operation information of the first operation sent by the source-side device 100. Alternatively, if the end-side device 200 received the second operation and thereby triggered the source-side device 100 to execute S106, the end-side device 200 may determine the motion vector $\vec{v}$ of image frame 2 according to the operation information of the second operation it received.
• in some embodiments, the motion vector when the source-side device 100 scrolls and displays the content in the scrollable area is related only to the corresponding operation, so the device-side device 200 can determine the motion vector $\vec{v}$ of image frame 2 directly from the operation information of the first operation or the second operation.
• for example, the moving direction of the first operation or the second operation is taken as the moving direction of image frame 2, and the product of the speed of the first operation or the second operation in that moving direction and the duration between two adjacent synchronization time points is taken as the displacement length of image frame 2.
• in some other embodiments, the motion vector when the source-side device 100 scrolls and displays the content in the scrollable area is related both to the corresponding operation and to the sliding parameters of the source-side device 100 itself; the device-side device 200 then determines the motion vector $\vec{v}$ of image frame 2 from the operation information of the first operation or the second operation together with the sliding parameters of the source-side device 100. For example, different devices may present different page-sliding effects in response to the same first operation or second operation, and this embodiment of the present application can take this into account to accurately calculate the motion vector $\vec{v}$ of image frame 2.
• in still other embodiments, the motion vector when the source-side device 100 scrolls and displays the content in the scrollable area is related both to the corresponding operation and to the application running in the foreground of the source device 100; the device-side device 200 then determines the motion vector $\vec{v}$ of image frame 2 from the operation information of the first operation or the second operation together with the foreground application. For example, when the source-side device 100 runs different applications, it may present different page-sliding effects in response to the same first operation or second operation, and this embodiment of the present application can take this into account to accurately calculate the motion vector $\vec{v}$ of image frame 2.
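• As a compact restatement of the simplest case above, where the motion vector depends only on the operation, the per-synchronization-interval displacement can be written as follows; the symbols are assumptions introduced here for illustration:

```latex
% \hat{u}_{op}: unit vector of the operation's moving direction
% s_{op}: speed of the operation in that direction
% \Delta t_{sync}: duration between two adjacent synchronization time points
\vec{v} = \hat{u}_{op} \cdot s_{op} \cdot \Delta t_{sync}
```

• when the sliding parameters of the source-side device or the foreground application also matter, they can be modeled as an additional device- or application-specific scale factor applied to $\vec{v}$.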
  • the end-side device 200 may predict the predicted image frame corresponding to the first time point according to one or two image frames newly entered into the display queue.
  • FIG. 7 exemplarily shows a process in which the end-side device 200 acquires a predicted image frame based on the image frame 2 .
• the end-side device 200 determines the motion region 701 and the motion vector $\vec{v}$ in image frame 2 through the method shown in any of the above embodiments.
• the end-side device 200 moves the content originally displayed in the motion area 701 according to the motion vector $\vec{v}$. After the move, part of the content originally displayed in the motion area 701, such as the content 701a, has been moved out of the motion area, and part of the original motion area 701 is vacated, such as the area 701b in FIG. 7.
  • the end-side device 200 obtains prediction data according to the content originally displayed in the area 701 b of the image frame 2 , and fills it into the area 701 b to obtain the predicted image frame 5 .
• after the end-side device 200 predicts the predicted image frame corresponding to the first of the first time points, it can continue to predict the predicted image frames corresponding to the subsequent first time points.
• for the method by which the device-side device 200 predicts the predicted image frames corresponding to the subsequent first time points, reference may also be made to either of the methods in the above two embodiments. Specifically, the end-side device 200 can predict the predicted image frame corresponding to the second first time point according to the image frame that most recently entered the display queue and the predicted image frame corresponding to the first first time point; predict the predicted image frame corresponding to the third first time point according to the predicted image frames corresponding to the first and second first time points; and so on.
• alternatively, the end-side device 200 may predict the predicted image frame corresponding to the second first time point according to the predicted image frame corresponding to the first first time point; predict the predicted image frame corresponding to the third first time point according to the predicted image frame corresponding to the second first time point; and so on.
  • the step of the end-side device 200 predicting a new predicted image frame based on an image frame refer to operations similar to S1091'-S1093'.
  • the end-side device 200 obtains the predicted image frame 6 according to the image frame 2 and the predicted image frame 5 .
  • the definition of the content displayed in the area 701b of the predicted image frame 6 is lower than the definition of the content originally displayed in the area 701b of the predicted image frame 5 .
  • the end-side device 200 Since the image frames in the display queue of the end-side device 200 change in real time, the end-side device 200 will execute S109 multiple times, therefore, the first time point and the predicted image frame determined each time the end-side device 200 executes S109 can be different. The first time point and predicted image frame determined when the end-side device 200 executes S109 last time will overwrite the result of the previous execution of S109, and the previously determined first time point and predicted image frame become invalid.
  • the device-side device 200 may send the predicted image frames into the predicted image queue sequentially, and the predicted image frame corresponding to the first previous time point is first sent into the predicted image queue. Since S109 will be executed multiple times, every time S109 is executed, the image frames in the predictive image queue are updated once.
  • the end-side device 200 determined the predicted image frame 5 and the predicted image frame 6 according to the image frame c and the image frame d (that is, the image frame 1 and the image frame 2) when executing S109 last time, then The predicted image frame 5 and the predicted image frame 6 are sequentially sent to the predicted image queue.
  • the end-side device 200 can continuously generate the synchronization signal at a fixed frequency and, at each synchronization time point when a synchronization signal arrives, take an image frame out of the current display queue for display on a first-in-first-out basis. If there is no image frame in the current display queue, a predicted image frame is taken out of the current predicted image queue for display, also first-in-first-out.
  • the image frame c is provided to the display screen of the end-side device 200 for display when the synchronization signal 3 arrives, and the image frame d when the synchronization signal 4 arrives.
  • when the synchronization signal 5 arrives, the end-side device 200 takes the predicted image frame 5 out of the predicted image queue for display, and when the synchronization signal 6 arrives, the predicted image frame 6. As with the display queue, predicted image frames that have been provided to the display screen are no longer stored in the predicted image queue, but may be stored in the cache of the end-side device 200.
  • the end-side device 200 may display the image frames in the display queue or predicted image queue in full screen, or in a partial area of the display screen.
  • the area of the display screen of the end-side device 200 used to display the image frames in the display queue or predicted image queue is called the mirror projection area or screen projection area. The mirror projection area may be the entire display screen or a part of it.
  • the aspect ratio of the mirror projection area and that of the display screen of the source-side device 100 may be the same or different. If the same, the end-side device 200 may display the corresponding image frame in the mirror projection area proportionally; if different, it may stretch or scale the image frame to the size of the mirror projection area so that it adaptively fits.
  • the position, size, shape, and the like of the mirror projection area on the display screen of the end-side device 200 may be set by default by the end-side device 200, or set or adjusted by the user, which is not limited in this embodiment of the present application.
  • if the mirror projection area occupies only part of the display screen, the end-side device 200 can display the image frames in the display queue or predicted image queue in the mirror projection area while displaying content provided by the end-side device 200 itself in the areas outside the mirror projection area.
  • the content displayed outside the mirror projection area depends on the applications and interfaces currently open on the end-side device 200, such as the desktop or interfaces provided by other applications, which is not limited here.
  • FIGS. 8A-8D exemplarily show user interfaces displayed by the end-side device 200 during screen mirroring.
  • the mirror projection area 801 occupies a part of the display screen of the end-side device 200, and the remaining area of the display screen displays the desktop of the end-side device 200.
  • after the source-side device 100 and the end-side device 200 establish a communication connection, assume that the source-side device 100 captures the four image frames shown in FIGS. 5B-5E, namely the image frames 1-4, and sends them to the end-side device 200. Due to reasons such as communication quality or delay, the image frames 1-2 are sent into the display queue of the end-side device 200, while the image frames 3-4 are lost during communication or discarded as outdated.
  • FIGS. 8A-8D show the user interfaces displayed by the end-side device 200 at several successive synchronization time points.
  • the end-side device 200 displays the image frame 1 in the mirror projection area 801.
  • the end-side device 200 displays the image frame 2 in the mirror projection area 801.
  • the end-side device 200 displays the predicted image frame 5 in the mirror projection area 801.
  • the end-side device 200 displays the predicted image frame 6 in the mirror projection area 801.
  • the predicted image frame 5 and the predicted image frame 6 are the predicted image frames predicted by the end-side device 200 in S109.
  • by displaying predicted image frames, the end-side device 200 can keep its display frame rate close or equal to the screen projection frame rate of the source-side device 100 and display images continuously, which maintains the visual continuity of the images and avoids freezing. Moreover, because image prediction is performed on the most recently received image frames, the smoothness and stability of the image are ensured and visual jumps are avoided. From the user's point of view, the user sees a smooth and continuous picture without stuttering or jumping, and obtains a good screen projection experience.
  • the image frame that most recently entered the display queue of the end-side device 200 may be referred to as the first image frame, and the second most recent one may be referred to as the second image frame.
  • for example, the image frame 2 mentioned in S109 is the first image frame,
  • and the image frame 1 is the second image frame.
  • an image frame that the source-side device 100 captures and sends to the end-side device 200 after capturing the first image frame, but that suffers frame loss, may be referred to as a third image frame.
  • for example, if the image frame 3 and the image frame 4 in FIGS. 5D-5E captured by the source-side device 100 are lost for communication reasons, they are third image frames.
  • the predicted image frame corresponding to the first first time point predicted by the end-side device 200 may be referred to as the first predicted image frame, and the one corresponding to the second first time point as the second predicted image frame.
  • for example, the predicted image frame 5 may be the first predicted image frame, and the predicted image frame 6 may be called the second predicted image frame.
  • when the end-side device 200 predicts the predicted image frame corresponding to the first first time point: if the prediction is made from the two image frames most recently sent into the display queue, the image frame preceding the most recent one may be called the fourth image frame; if the prediction is made from the single most recent image frame, the image frame most recently captured by the source-side device 100 before it may also be called the fourth image frame. For example, referring to the foregoing S109, the image frame 1 may be the fourth image frame.
  • each step in the foregoing method embodiments may be implemented by an integrated logic circuit of hardware in a processor or instructions in the form of software.
  • the method steps disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor.
  • the present application also provides an electronic device, which may include: a memory and a processor.
  • the memory can be used to store computer programs; the processor can be used to invoke the computer programs in the memory, so that the electronic device executes the method performed by either the source-side device 100 or the end-side device 200 in any one of the above-mentioned embodiments.
  • the present application also provides a chip system, the chip system including at least one processor, configured to implement the functions involved in the method executed by either the source-side device 100 or the end-side device 200 in any one of the above embodiments.
  • the chip system further includes a memory, the memory is used to store program instructions and data, and the memory is located inside or outside the processor.
  • the system-on-a-chip may consist of chips, or may include chips and other discrete devices.
  • there may be one or more processors in the chip system.
  • the processor can be realized by hardware or by software.
  • the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor may be a general-purpose processor implemented by reading software codes stored in a memory.
  • the memory may be integrated with the processor, or may be configured separately from the processor, which is not limited in this embodiment of the present application.
  • the memory may be a non-transitory memory, such as a read-only memory ROM, which may be integrated with the processor on the same chip or arranged on a different chip.
  • the arrangement of the memory and the processor is not specifically limited.
  • the chip system may be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a micro controller unit (MCU), a programmable logic device (PLD), or another integrated chip.
  • the present application also provides a computer program product, the computer program product including: a computer program (also called code, or instructions), which, when executed, causes the computer to execute the method performed by either the source-side device 100 or the end-side device 200 in any one of the above-mentioned embodiments.
  • the present application also provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program (also called code, or instructions).
  • when the computer program is executed, the computer is made to execute the method performed by either the source-side device 100 or the end-side device 200 in any of the above-mentioned embodiments.
  • the processor in the embodiment of the present application may be an integrated circuit chip that has a signal processing capability.
  • each step of the above-mentioned method embodiments may be completed by an integrated logic circuit of hardware in a processor or instructions in the form of software.
  • the above-mentioned processor can be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • Various methods, steps, and logic block diagrams disclosed in the embodiments of the present application may be implemented or executed.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a storage medium mature in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or register.
  • the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
  • the embodiment of the present application further provides an apparatus.
  • the apparatus may specifically be a component or a module, and the apparatus may include one or more processors and memory connected to them.
  • the memory is used to store computer programs.
  • when the computer programs are executed by the one or more processors, the apparatus is caused to execute the methods in the above method embodiments.
  • the apparatus, computer-readable storage medium, computer program product, and chip provided in the embodiments of the present application are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
  • the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the present application are generated in whole or in part.
  • the computer can be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (SSD)), etc.
  • all or part of the processes in the above method embodiments can be completed by a computer program instructing related hardware.
  • the program can be stored in a computer-readable storage medium.
  • when the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The present application discloses a method for smoothly displaying pictures during screen projection, and a related apparatus and system. In the method, during mirror screen projection between a projecting device and a projected-to device, the projected-to side can perform image prediction from the most recently received image frames and display a predicted image frame when frame loss occurs. By implementing the solution of the present application, the projected-to side can display pictures continuously, maintain visual continuity, guarantee a high frame rate on the projected-to side, and avoid freezing. Moreover, performing image prediction from the most recently received image frames ensures the smoothness and stability of the picture, avoids visual jumps, and gives the user a good screen projection experience.

Description

Method for smoothly displaying pictures during screen projection, related apparatus, and system
This application claims priority to the Chinese patent application No. 202111333283.8, filed with the China National Intellectual Property Administration on November 11, 2021 and entitled "Picture display method and electronic device", and to the Chinese patent application No. 202210238051.2, filed on March 10, 2022 and entitled "Method for smoothly displaying pictures during screen projection, related apparatus and system", both of which are incorporated herein by reference in their entireties.
Technical Field
This application relates to the field of terminal technologies, and in particular, to a method for smoothly displaying pictures during screen projection, and a related apparatus and system.
Background
Screen projection is one of the widely used functions of electronic devices, and includes mirror screen projection and online screen projection. During mirror screen projection, the projecting device captures the content displayed on its own display screen and sends it to the projected-to device, so that the projected-to device displays the same content as the projecting device. During mirror screen projection, the smoothness with which the projected-to device displays the projected content is the main factor affecting the user's visual experience. How to enable the projected-to device to display the projected content smoothly is an important research direction for improving user experience.
Summary
This application provides a method for smoothly displaying pictures during screen projection, and a related apparatus and system, so that the projected-to device displays a smooth and continuous picture without a feeling of stuttering or jumping.
According to a first aspect, a method for smoothly displaying pictures during screen projection is provided, applied to a communication system including a first device and a second device. The method includes: the first device and the second device establish a communication connection; the first device captures the content displayed on its display screen to obtain a first image frame, and sends the first image frame to the second device; the second device receives the first image frame; the second device displays in sequence: the first image frame, then a first predicted image frame; the first predicted image frame is obtained by the second device from the first image frame.
By implementing the method provided in the first aspect, during mirror screen projection the second device, that is, the projected-to device, displays the received image frames and predicted image frames, so that it can display pictures continuously, maintain visual continuity, and avoid freezing. Moreover, performing image prediction from the most recently received image frames ensures the smoothness and stability of the picture and avoids visual jumps. From the user's point of view, the user sees a smooth and continuous picture without stuttering or jumping, and obtains a good screen projection experience.
With reference to the first aspect, in some implementations, the communication connection established between the first device and the second device may be a Wi-Fi connection, a Bluetooth connection, an NFC connection, a remote connection, or the like. The connection between the first device and the second device may be established based on protocols such as Miracast, the digital living network alliance (DLNA) protocol, or AirPlay.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, before obtaining the first image frame, the first device may capture the content displayed on the display screen to obtain a second image frame, and send the second image frame to the second device; the second device receives the second image frame; before displaying the first image frame, the second device also displays the second image frame; here, the first predicted image frame is obtained by the second device from the first image frame and the second image frame.
Through the foregoing implementation, the second device, that is, the projected-to device, can obtain the predicted image frame from the two most recently received image frames.
With reference to the foregoing implementation, the time point at which the second device receives the second image frame and the time point at which the first device obtains the second image frame are within a first duration. That is, only image frames received by the second device with a small delay can be sent for display, which guarantees the low latency of mirror screen projection, gives the user the feeling that the second device and the first device display images synchronously, and provides a better mirror screen projection experience.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, after obtaining the first image frame, the first device may again capture the content displayed on the display screen to obtain a third image frame, and send the third image frame to the second device; the second device does not receive the third image frame, or does not receive it within the first duration after the first device captures it. Through this implementation, the second device can display a predicted image frame when frame loss occurs, that is, when an image frame is not received or not received within the specified time, so that the second device can display pictures continuously, maintain visual continuity, and avoid freezing.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, the second device may obtain the first predicted image frame from the first image frame in any one of the following cases:
Case 1: after the first device and the second device establish the communication connection, the second device obtains the first predicted image frame from the first image frame.
Case 2: the second device obtains the first predicted image frame from the first image frame when the communication quality corresponding to the communication connection is below a threshold.
If the communication quality between the first device and the second device is below a certain value, the communication quality between the two is poor, and image frames sent by the first device to the second device are very likely to be lost during communication. That is, frame loss is very likely in case 2; obtaining the first predicted image frame in case 2 can therefore effectively avoid the poor experience that possible frame loss would bring to the user.
Case 3: the first device runs a first application in the foreground and scrolls the content on the display screen at a speed greater than a first value.
In case 3, the content on the display screen of the first device changes rapidly, and the first device captures the screen content at a high screen projection frame rate. If frame loss occurs at a high screen projection frame rate, the user perceives very obvious visual stuttering; therefore, obtaining the first predicted image frame in case 3 can effectively avoid the poor experience that frame loss at a high screen projection frame rate would bring to the user.
In some implementations, a user operation received by the first device may trigger the above case 3. Specifically, the method of the first aspect may further include: before the second device displays the first predicted image frame, the first device runs the first application and receives a first operation, and in response to the first operation scrolls the content on the display screen at a speed greater than the first value; the first device sends the application information of the first application and the operation information of the first operation to the second device; the second device receives the application information of the first application and the operation information of the first operation, and obtains the first predicted image frame from the first image frame.
In some implementations, a user operation received by the second device may trigger the above case 3. Specifically, the method of the first aspect may further include: before the second device displays the first predicted image frame, the first device runs the first application and sends the application information of the first application to the second device; the second device receives a second operation and sends the operation information of the second operation to the first device, triggering the first device to scroll the content on the display screen at a speed greater than the first value; the second device obtains the first predicted image frame from the first image frame.
The first application is an application that can slide or scroll the content in a user interface in response to a user operation, for example a browser, a social application, or a reading application.
Case 4: when the difference between the display frame rate at which the second device displays the image frames sent by the first device and the screen projection frame rate at which the first device captures the content displayed on its display screen is greater than a second value, the second device obtains the first predicted image frame from the first image frame.
The closer the display frame rate of the second device is to the screen projection frame rate of the first device, the better the mirror screen projection effect seen by the user. The greater the difference between the two, the more severe the frame loss during mirror screen projection. In case 4, a predicted image frame can be obtained when a relatively severe frame loss problem begins to appear, avoiding the poor experience that subsequent continuous frame loss would bring to the user.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, the time point at which the second device receives the first image frame and the time point at which the first device obtains the first image frame are within the first duration. That is, only image frames received by the second device with a small delay can be sent for display, which guarantees the low latency of mirror screen projection, gives the user the feeling that the second device and the first device display images synchronously, and provides a better mirror screen projection experience.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, the first predicted image frame is: on the basis of the first image frame, the image obtained by moving the content displayed in a motion region within the motion region according to a movement vector, and then filling the vacated region with prediction data.
Here, the motion region is the region in which the first image frame and a fourth image frame display different content; the movement vector is the vector by which a target content moves from its position in the fourth image frame to its position in the first image frame; the vacated region is the region of the motion region in which no content is displayed after the content displayed in the motion region is moved. If the second device obtains the first predicted image frame from the first image frame, the fourth image frame is the image frame obtained by the first device the last time it captured the content displayed on the display screen before the first image frame; if the second device obtains the first predicted image frame from the first image frame and the second image frame, the fourth image frame is the second image frame most recently displayed by the second device before displaying the first image frame. The prediction data is obtained by the second device from the content displayed by the first image frame in the vacated region.
If the second device obtains the first predicted image frame from the first image frame, the second device may determine the motion region and the movement vector from the first image frame together with information such as the application information of the application running on the first device and the operation information of the operation that triggered the first device to slide the displayed interface.
If the second device obtains the first predicted image frame from the first image frame and the second image frame, the second device may determine the motion region and the movement vector by comparing the first image frame with the second image frame.
Through the foregoing implementation, the second device performs image prediction from the most recently received image frames, which ensures the smoothness and stability of the projected picture and avoids visual jumps, so the user sees a smooth picture without a feeling of jumping.
With reference to the foregoing implementation, in some implementations, the second device may obtain the prediction data from the content displayed by the first image frame in the vacated region in any one or more of the following ways:
1. The second device blurs the content displayed by the first image frame in the vacated region; the blurring may include image processing such as mean blur, median blur, Gaussian blur, bilateral blur, surface blur, box blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grain blur, radial blur, or directional blur.
2. The second device directly uses the content displayed by the first image frame in the vacated region as the prediction data.
3. The second device uses a neural network algorithm to perform image prediction on the content displayed by the first image frame in the vacated region to obtain the prediction data.
4. The second device obtains the prediction data from previously cached image frames.
Through the foregoing implementations, the second device can obtain the prediction data from the content displayed by the first image frame in the vacated region and thereby obtain the first predicted image frame. Obtaining the first predicted image frame from the first image frame in this way ensures the smoothness and stability of the projected picture, avoids visual jumps, and lets the user see a smooth picture without a feeling of jumping.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, the display screen of the second device includes a projection area used to display the first image frame and the first predicted image frame in sequence; the projection area occupies part or all of the display screen of the second device. When the projection area occupies the whole display screen, the user obtains an immersive projection experience; when it occupies part of the display screen, the second device can use the other areas of the display screen to display user interfaces provided by the second device itself without affecting the user's operation of the second device.
In some implementations, when the projection area occupies part of the display screen, the second device may also adjust the position, size, shape, and the like of the projection area in response to a user operation.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, after displaying the first predicted image frame the second device displays a second predicted image frame, the second predicted image frame being obtained by the second device from the first predicted image frame. In effect, the second device can predict multiple frames consecutively and display multiple predicted image frames consecutively, which guarantees that the second device displays pictures continuously for a period of time, maintains visual continuity, avoids freezing, and ensures the smoothness and stability of the picture on the second device for that period, avoiding visual jumps.
With reference to the first aspect and any one of the foregoing implementations, in some implementations, the faster the content displayed on the display screen of the first device changes, the higher the screen projection frame rate at which the first device captures the content displayed on the display screen. This helps the first device keenly capture the change process of the picture on the display screen, so that the second device can also present the corresponding process and abrupt changes on the second device are avoided.
With reference to the first aspect, in some implementations, the display frame rate at which the second device displays the first image frame and the first predicted image frame in sequence is equal to the screen projection frame rate at which the first device captures the content displayed on the display screen. When the display frame rate is consistent with the screen projection frame rate, the user sees the best mirror screen projection effect.
According to a second aspect, a method for smoothly displaying pictures during screen projection is provided, applied to a second device. The method includes: the second device establishes a communication connection with a first device; the second device receives a first image frame sent by the first device, the first image frame being obtained by the first device by capturing the content displayed on its display screen; the second device displays in sequence: the first image frame, then a first predicted image frame, the first predicted image frame being obtained by the second device from the first image frame.
For the method of the second aspect, refer to the operations performed by the second device in the first aspect or any implementation of the first aspect. For the technical effects that the method of the second aspect can achieve, also refer to the technical effects of the operations performed by the second device in the first aspect or any implementation of the first aspect.
According to a third aspect, an electronic device is provided, including a memory and one or more processors; the memory is coupled to the one or more processors and is used to store computer program code including computer instructions; the one or more processors invoke the computer instructions so that the electronic device performs the method of the second aspect or any implementation of the second aspect.
According to a fourth aspect, an embodiment of this application provides a communication system including a first device and a second device, the second device being configured to perform the method of the second aspect or any implementation of the second aspect.
According to a fifth aspect, an embodiment of this application provides a computer-readable storage medium including instructions which, when run on an electronic device, cause the electronic device to perform the method of the second aspect or any implementation of the second aspect.
According to a sixth aspect, an embodiment of this application provides a computer program product which, when run on a computer, causes the computer to perform the method of the second aspect or any implementation of the second aspect.
By implementing the technical solutions provided in this application, during mirror screen projection between the projecting device and the projected-to device, the projected-to side can perform image prediction from the most recently received image frames and display a predicted image frame when frame loss occurs. With this solution, the projected-to side can display pictures continuously, maintain visual continuity, guarantee a high frame rate on the projected-to side, and avoid freezing. Moreover, performing image prediction from the most recently received image frames ensures the smoothness and stability of the picture, avoids visual jumps, and gives the user a good screen projection experience.
Brief Description of Drawings
FIG. 1 is the architecture of a communication system provided by an embodiment of this application;
FIG. 2A is a hardware structure diagram of an electronic device provided by an embodiment of this application;
FIG. 2B is a software structure diagram of an electronic device provided by an embodiment of this application;
FIG. 3 is a flowchart of a method for smoothly displaying pictures during screen projection provided by an embodiment of this application;
FIGS. 4A-4D are user interfaces involved when the source-side device 100 starts the mirror screen projection function;
FIG. 4E is a user interface involved when the end-side device 200 starts the mirror screen projection function;
FIGS. 5A-5E are user interfaces involved on the source-side device 100 during mirror screen projection;
FIG. 6 is a schematic diagram of a display queue and a predicted image queue provided by an embodiment of this application;
FIG. 7 is a schematic diagram of a predicted image provided by an embodiment of this application;
FIGS. 8A-8D are user interfaces involved on the end-side device 200 during mirror screen projection.
Detailed Description
The technical solutions in the embodiments of this application are described below clearly and thoroughly with reference to the accompanying drawings. In the description of the embodiments of this application, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" in the text merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of this application, "multiple" means two or more.
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as implying relative importance or implicitly indicating the number of the indicated technical features. Features defined with "first" or "second" may explicitly or implicitly include one or more of those features. In the description of the embodiments of this application, unless otherwise specified, "multiple" means two or more.
The term "user interface (UI)" in the following embodiments of this application is a medium interface for interaction and information exchange between an application or the operating system and the user; it converts between the internal form of information and a form acceptable to the user. A user interface is source code written in a specific computer language such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content the user can recognize. A common form of user interface is the graphical user interface (GUI), which is a user interface related to computer operations displayed graphically. It may include visual interface elements displayed on the display screen of the electronic device, such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
During screen projection, scenarios with a high screen projection frame rate may occur. For example, during mirror screen projection, when the content displayed on the display screen of the projecting device changes rapidly, the frame rate at which the projecting device captures screen content is high. In this case, problems such as network delay or a low encoding/decoding speed of the devices may cause frame loss while the projecting device sends the projected content to the projected-to device, resulting in a low display frame rate on the projected-to side, an unsmooth picture, and obvious visual stuttering.
To solve the above problem, the following embodiments of this application provide a method for smoothly displaying pictures during screen projection, and a related apparatus and system. In the method, during mirror screen projection between the projecting device and the projected-to device, if the screen projection frame rate on the projecting side is above a threshold, the projected-to side can perform image prediction from the most recently received image frames and display a predicted image frame when frame loss occurs.
By implementing the above method, the projected-to side displays the received image frames and the predicted image frames, so that it can display pictures continuously, maintain visual continuity, guarantee a high frame rate on the projected-to side, and avoid freezing. Moreover, performing image prediction from the most recently received image frames ensures the smoothness and stability of the picture and avoids visual jumps. From the user's point of view, the user sees a smooth and continuous picture without stuttering or jumping, and obtains a good screen projection experience.
In the methods provided in the following embodiments of this application, mirror screen projection refers to the process in which the projecting device captures the content displayed on its own display screen and sends it to the projected-to device, so that the projected-to device displays the same content as the projecting device. Technologies used for mirror screen projection may include, but are not limited to, wireless fidelity (Wi-Fi), Bluetooth, near field communication (NFC), mobile communication technologies, or wired technologies. Protocols used for mirror screen projection may include, but are not limited to, Miracast, the digital living network alliance (DLNA) protocol, AirPlay, and so on.
"Mirror screen projection" is merely a term used in the embodiments of this application; its meaning has been described in this embodiment, and its name does not constitute any limitation on this embodiment. For example, mirror screen projection may also be called by other names, such as collaborative projection, mirror projection, or wireless projection.
During mirror screen projection, the projecting device may also be called the source-side device, and the projected-to device may also be called the end-side device. Subsequent embodiments are described using the source-side device and the end-side device as examples. In the embodiments of this application, the source-side device may also be called the first device, and the end-side device may also be called the second device.
In the methods provided in the following embodiments of this application, the screen projection frame rate refers to the number of frames of screen content captured by the source-side device per unit time. Its unit may be frames per second (FPS) or hertz (Hz). The screen projection frame rate is related to how quickly the picture on the display screen of the source-side device changes: the faster the picture changes, the higher the frame rate at which the source-side device captures projected content.
Scenarios with a high screen projection frame rate on the source-side device include, for example, the user sliding quickly on the display screen of the source-side device and thereby triggering rapid changes in the displayed content. For example, when the source-side device runs an application such as a browser or social software, if the user slides the user interface on the display screen quickly, the picture on the display screen changes rapidly, and the screen projection frame rate of the source-side device is high.
In the following embodiments of this application, frame loss means that projected content (that is, image frames) captured by the source-side device is lost while being transmitted to the end-side device, or is actively discarded by the end-side device. Causes of frame loss may include, but are not limited to: poor connection conditions between the source side and the end side, such as poor network quality (e.g., a low network speed) or low wired bandwidth; low image encoding efficiency on the source side; and low image decoding efficiency on the end side.
The following first describes a communication system 10 for screen projection provided by an embodiment of this application.
FIG. 1 exemplarily shows the architecture of the communication system 10.
As shown in FIG. 1, the communication system 10 includes a source-side device 100 and an end-side device 200.
In this embodiment of this application, the source-side device 100 can establish a communication connection with the end-side device 200. The communication connection may be a Wi-Fi connection, a Bluetooth connection, an NFC connection, a remote connection, or the like, or a wired connection such as a connection based on a data cable; this is not limited in this embodiment of this application.
The source-side device 100 may include, but is not limited to, a smartphone, a tablet computer, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device (such as a smart watch or smart glasses), and the like. Exemplary embodiments of the electronic device include, but are not limited to, portable electronic devices running Linux or another operating system. The electronic device may also be another portable electronic device, such as a laptop computer. It should also be understood that in some other embodiments, the electronic device may not be a portable electronic device but a desktop computer.
The source-side device 100 has a display screen, which can display content local to the source-side device 100 or content from the network. The display screen can also receive various types of gestures input by the user, such as slide gestures, tap gestures, drag gestures, and pinch gestures. The source-side device 100 can change the content displayed on the display screen in response to the various gestures input by the user.
The end-side device 200 may be a tablet computer, a television, a smart screen, an in-vehicle device, an electronic billboard, or the like. The end-side device 200 may have a larger display screen than the source-side device 100. In some embodiments, when the end-side device 200 is a television, it may be used with a TV box, which converts received digital signals into analog signals and sends them to the television for display. In some embodiments, the end-side device 200 may be a television that itself has a digital-to-analog conversion function, or a television configured with a TV box. In some embodiments, when the end-side device 200 is a television or a smart screen, it may also be used with a remote control, which can communicate with the end-side device 200 through infrared signals.
In this embodiment of this application, after the source-side device 100 and the end-side device 200 establish a communication connection, the two can carry out the mirror screen projection process. The source-side device 100 can display corresponding content on its display screen according to user operations, determine the screen projection frame rate according to the speed at which the content on the display screen changes, capture the content displayed on the display screen at that frame rate, and send the content to the end-side device 200 through the communication connection.
During the above mirror screen projection, the source-side device 100 can notify the end-side device 200 of the screen projection frame rate it has determined, based on the communication connection between them. The source-side device 100 can also send its running status to the end-side device 200 based on the communication connection, for example information about the applications it is running.
The end-side device 200 can perform image prediction from the most recently received image frames to obtain predicted image frames. For the timing or scenarios in which the end-side device 200 predicts image frames, refer to the detailed description in the subsequent method embodiments, which is not repeated here.
FIG. 2A is a schematic structural diagram of an electronic device provided by an embodiment of this application. The electronic device may be the source-side device 100 in the communication system shown in FIG. 1, or the end-side device 200.
As shown in FIG. 2A, the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device. In other embodiments of this application, the electronic device may include more or fewer components than shown, combine some components, split some components, or arrange components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, avoiding repeated access, reducing the waiting time of the processor 110, and thus improving system efficiency.
The wireless communication function of the electronic device can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization; for example, the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antennas may be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied to the electronic device, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. It can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110, or in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator modulates the low-frequency baseband signal to be sent into a medium/high-frequency signal; the demodulator demodulates the received electromagnetic wave signal into a low-frequency baseband signal and transmits it to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs sound signals through audio devices (not limited to the speaker 170A and the receiver 170B) or displays images or video through the display screen 194. In some embodiments, the modem processor may be an independent device; in other embodiments, it may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The wireless communication module 160 may be one or more devices integrating at least one communication processing module. It receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. It can also receive signals to be sent from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves for radiation through the antenna 2.
In some embodiments, the antenna 1 of the electronic device is coupled to the mobile communication module 150 and the antenna 2 to the wireless communication module 160, so that the electronic device can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device implements the display function through the GPU, the display screen 194, the application processor, and so on. The GPU is a microprocessor for image processing that connects the display screen 194 and the application processor, performing mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, video, and so on. It includes a display panel, which may be a liquid crystal display (LCD), or may be manufactured as an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light emitting diode (QLED), and so on. In some embodiments, the electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and so on.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform Fourier transform and the like on the frequency point energy.
The video codec is used to compress or decompress digital video. The electronic device may support one or more video codecs, so that it can play or record video in multiple encoding formats, for example moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The internal memory 121 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
The random access memory may include static random-access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM; for example, the fifth generation DDR SDRAM is generally called DDR5 SDRAM), and so on; the non-volatile memory may include magnetic disk storage devices and flash memory.
By operating principle, flash memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, and so on; by the number of potential levels per storage cell, it may include single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), and so on; by storage specification, it may include universal flash storage (UFS), embedded multimedia card (eMMC), and so on.
The random access memory can be read and written directly by the processor 110, can store executable programs (such as machine instructions) of the operating system or other running programs, and can also store data of users and applications.
The non-volatile memory can also store executable programs and data of users and applications, which can be loaded into the random access memory in advance for the processor 110 to read and write directly.
The external memory interface 120 can be used to connect an external non-volatile memory to expand the storage capability of the electronic device. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example saving files such as music and video in the external non-volatile memory.
The electronic device can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and so on.
The pressure sensor 180A is used to sense pressure signals and can convert them into electrical signals. In some embodiments, the pressure sensor 180A may be provided on the display screen 194. There are many kinds of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material; when a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device detects the strength of the touch operation according to the pressure sensor 180A, and can also calculate the touch position from its detection signal. In some embodiments, touch operations acting on the same touch position but with different strengths may correspond to different operation instructions. For example, when a touch operation whose strength is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose strength is greater than or equal to the first pressure threshold acts on the icon, an instruction to create a new short message is executed.
The touch sensor 180K is also called a "touch device". It may be provided on the display screen 194; the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect touch operations acting on or near it and can pass the detected touch operations to the application processor to determine the touch event type. Visual output related to the touch operation can be provided through the display screen 194. In other embodiments, the touch sensor 180K may be provided on the surface of the electronic device at a position different from the display screen 194.
The buttons 190 include a power button, volume buttons, and so on. The buttons 190 may be mechanical or touch-type. The electronic device can receive button inputs and generate key signal inputs related to user settings and function control of the electronic device.
The indicator 192 may be an indicator light, used to indicate charging status and battery changes, or to indicate messages, missed calls, notifications, and so on.
When the source-side device 100 is implemented as the electronic device shown in FIG. 2A:
The display screen 194 is used to display content local to the source-side device 100 or content from the network. The display screen 194 can also receive various gesture operations input by the user and display different content in response.
The wireless communication module 160 is used to establish a communication connection with the end-side device 200; the connection may be a wireless connection such as Wi-Fi or Bluetooth.
The processor 110 is used to determine the screen projection frame rate according to the speed at which the content displayed on the display screen 194 changes, and to capture the content displayed on the display screen 194 at that frame rate to obtain image frames.
The wireless communication module 160 is used to send the screen projection frame rate determined by the processor 110 and the image frames to the end-side device 200 based on the communication connection. In some embodiments, the wireless communication module 160 can also send information about the applications running on the source-side device 100 to the end-side device 200.
When the end-side device 200 is implemented as the electronic device shown in FIG. 2A:
The wireless communication module 160 is used to establish a communication connection with the source-side device 100; the connection may be a wireless connection such as Wi-Fi or Bluetooth.
The wireless communication module 160 is also used to receive, based on the communication connection, the screen projection frame rate and the captured image frames sent by the source-side device 100. In some embodiments, it can also receive information about the applications running on the source-side device 100.
The processor 110 is used to perform image prediction from the most recently received image frames to obtain predicted image frames.
The processor 110 is also used to determine whether frame loss is currently occurring, according to how the image frames sent by the source-side device 100 are received, in combination with the screen projection frame rate. If frame loss occurs, the predicted image frames are displayed when it occurs.
The software system of the electronic device in the embodiment shown in FIG. 2A may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. This embodiment takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device.
FIG. 2B is a block diagram of the software structure of the electronic device according to an embodiment of this application.
The layered architecture divides the software into several layers, each with a clear role and division of labor, communicating through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in FIG. 2B, the application packages may include applications such as Camera, Gallery, Calendar, Phone, Map, Navigation, WLAN, Bluetooth, Music, Video, and Messages.
In this embodiment of this application, the application packages may include a screen projection application. The screen projection application in the source-side device 100 supports the operations performed by the source-side device 100 in the subsequent method embodiments for smoothly displaying pictures during screen projection, and the screen projection application in the end-side device 200 supports the operations performed by the end-side device 200 in those embodiments.
The application framework layer provides application programming interfaces (API) and a programming framework for the applications in the application layer, and includes some predefined functions.
As shown in FIG. 2B, the application framework layer may include a window manager, content providers, a view system, a telephony manager, a resource manager, a notification manager, and so on.
The window manager is used to manage window programs. It can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, and so on.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, the phone book, and so on.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and can be used to build applications. A display interface may be composed of one or more views; for example, a display interface including a message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device, for example call state management (including connected, hung up, and so on).
The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables applications to display notification information in the status bar; it can be used to convey informational messages that disappear automatically after a short stay without user interaction, for example notifying that a download is complete or reminding about a message. The notification manager may also present notifications in the status bar at the top of the system in the form of charts or scroll bar text, such as notifications of applications running in the background, or notifications appearing on the screen in the form of dialog windows, for example prompting text in the status bar, emitting a prompt sound, vibrating the electronic device, or flashing the indicator light.
The Android Runtime includes core libraries and a virtual machine, and is responsible for the scheduling and management of the Android system.
The core libraries consist of two parts: the function functions that the Java language needs to call, and the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example the surface manager, the media libraries, the three-dimensional graphics processing library (e.g., OpenGL ES), and the 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of many common audio and video formats, as well as still image files, and can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and so on.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software and contains at least the display driver, camera driver, audio driver, and sensor driver.
Based on the communication system 10 shown in FIG. 1 and the electronic devices shown in FIGS. 2A and 2B, the following describes the method for smoothly displaying pictures during screen projection provided by an embodiment of this application.
FIG. 3 exemplarily shows the flow of the method for smoothly displaying pictures during screen projection.
As shown in FIG. 3, the method may include the following steps:
S101: the source-side device 100 and the end-side device 200 establish a communication connection.
In this embodiment, the source-side device 100 can detect a user operation input by the user and, in response, enable one or more of WLAN, Bluetooth, NFC, or the mobile network in the wireless communication module 160, and discover, through one or more of the Wi-Fi, Bluetooth, NFC, or mobile-network wireless communication technologies, other devices with which a communication connection for mirror screen projection can be established.
In some embodiments, after discovering multiple devices with which a communication connection for mirror screen projection can be established, the source-side device 100 may display the identifiers of these devices in a user interface for the user to select one or more devices with which to establish a communication connection.
FIGS. 4A and 4B exemplarily show the process in which the source-side device 100 and the end-side device 200 establish a communication connection.
FIG. 4A exemplarily shows an exemplary user interface 41 on the source-side device 100 for showing the installed applications.
The user interface 41 displays: a status bar, a calendar and time indicator, a weather indicator, a page indicator, a tray with commonly used application icons, other application icons, and so on. Without limitation, in some embodiments the user interface 41 shown in FIG. 4A may further include a navigation bar, a sidebar, and so on. In some embodiments, the user interface 41 exemplarily shown in FIG. 4A may be called the home screen.
As shown in FIGS. 4A and 4B, after the source-side device 100 detects a downward slide operation from the top of the display screen, in response to the slide operation the source-side device 100 can display the window 111 shown in FIG. 4B on the user interface 41. As shown in FIG. 4B, the window 111 may display a control 111A, which can accept a user operation (such as a touch or tap) to enable/disable the mirror screen projection function of the source-side device 100. The control 111A may be presented as an icon and/or text (for example the text "Projection", "Wireless projection", "Multi-screen interaction", or "Cast to large screen"). The window 111 may also display switch controls for other functions such as Wi-Fi, Bluetooth, and flashlight.
As shown in FIG. 4B, the source-side device 100 can detect a tap operation on the control 111A and enable the mirror screen projection function. In some embodiments, after detecting the user operation on the control 111A, the source-side device 100 may change the display form of the control 111A, for example adding a shadow when displaying it.
Not limited to the user interface 41 shown in FIG. 4A, the user may also input the downward slide operation on another interface of the Settings application or the user interface of another application to trigger the source-side device 100 to display the window 111.
Not limited to the user operation on the control 111A in the window 111 shown in FIGS. 4A and 4B, the user operation enabling the mirror screen projection function may alternatively be an enabling operation on a function option in the Settings application. As another example, the user may bring the source-side device 100 close to the NFC tag of the end-side device 200 to trigger the source-side device 100 to enable the mirror screen projection function. This embodiment places no limitation on the user operation for enabling the mirror screen projection function.
In response to the user operation enabling the mirror screen projection function, the source-side device 100 enables one or more of WLAN, Bluetooth, or NFC in the wireless communication module 160 and can discover, through one or more of the Wi-Fi, Bluetooth, or NFC wireless communication technologies, projectable devices with which a projection communication connection can be established.
After finding projectable devices with which a communication connection can be established, the source-side device 100 can display the identifiers of these projectable devices, for example displaying the window 112 shown in FIG. 4C.
As shown in FIG. 4C, the window 112 may display: the identifiers of one or more projectable devices and connection options. The source-side device 100 can detect a user operation on a connection option and establish a communication connection with the projectable device indicated by the device identifier corresponding to that option. The identifiers and connection options include an identifier 112A and a connection option 112B. When the source-side device 100 detects a user operation on the connection option 112B, in response the source-side device 100 can send a communication connection request to the device identified as "HUAWEI 20" shown in the identifier 112A. The connection option 112B may then be updated to the option 112C shown in FIG. 4D, prompting the user that the source-side device 100 is searching for devices available for mirror screen projection.
It can be understood that this embodiment does not limit the user operation by which the source-side device 100 selects the device with which to establish a communication connection; besides the identifiers of the projectable devices, the source-side device 100 may also display other information, such as the device types of the projectable devices, which is not limited in this embodiment.
After the device identified as "HUAWEI 20" receives the communication connection request sent by the source-side device 100, it may display the user interface 42 shown in FIG. 4E, which includes a window 201 prompting the user whether to agree to establish the communication connection. The window 201 may include a confirm control 201A and a cancel control 201B. The confirm control 201A can, in response to a user operation, establish the communication connection with the source-side device 100; at this time, the device identified as "HUAWEI 20" becomes the end-side device 200 connected to the source-side device 100, and the end-side device 200 can display the user interfaces provided by the source-side device 100 (see the user interfaces displayed on the end-side device 200 later). The cancel control 201B can, in response to a user operation, refuse to establish the communication connection with the source-side device 100.
Optionally, in some implementations, after receiving the communication connection request, the end-side device 200 may directly establish the communication connection with the source-side device 100 without displaying prompt information, that is, without displaying the window 201 shown in FIG. 4E.
In some embodiments, both the source-side device 100 and the end-side device 200 may run a screen projection application to support the establishment of the communication connection between the two devices and the subsequent mirror screen projection process.
In this embodiment, the communication connection established between the source-side device 100 and the end-side device 200 may be a Wi-Fi connection, a Bluetooth connection, an NFC connection, a remote connection, or the like, or a wired connection such as a connection based on a data cable; this is not limited in this embodiment of this application.
S102-S110: the mirror screen projection process.
S102: after establishing the communication connection with the end-side device 200, the source-side device 100 captures the screen content at the screen projection frame rate, obtains image frames, and sends the image frames to the end-side device 200.
After the source-side device 100 and the end-side device 200 establish the communication connection, during mirror screen projection the screen projection frame rate of the source-side device 100 is decided by the source-side device 100 itself according to the speed at which the picture displayed on its display screen changes. The faster the picture on the display screen of the source-side device 100 changes, the higher the frame rate the source-side device 100 determines. That is, when its picture changes faster, the source-side device 100 captures more screen content at a higher speed and sends it to the end-side device 200. This guarantees that changes in the picture of the source-side device 100 are captured, so that the image frames received by the end-side device 200 can reflect the change process, without abrupt changes in the picture that would give the user a feeling of stuttering.
After the connection is established, the picture displayed on the display screen of the source-side device 100 and its changes are controlled entirely by the user. The user can operate the source-side device 100 to perform any type of operation or function, and the source-side device 100 can display different content on the display screen in response to the operations input by the user. For example, the source-side device 100 may, in response to a user operation, start a social application and display social content; or start a reading application and display novel text; and so on.
It can be said that the user operations input on the source-side device 100 to change the content displayed by the source-side device 100 decide the screen projection frame rate of the source-side device 100.
In some embodiments, the screen projection frame rate of the source-side device 100 does not exceed the full frame rate, that is, the maximum frame rate. The full frame rate may be preset by the source-side device 100, for example 60 FPS, which is not limited here.
The source-side device 100 can continuously capture the screen content displayed on the display screen at the determined screen projection frame rate to obtain corresponding image frames, and send the image frames to the end-side device 200 based on the communication connection. In some embodiments, the source-side device 100 may send one image frame to the end-side device 200 for each frame it captures, that is, continuously send an image data stream to the end-side device 200 during mirror screen projection. In effect, S102 is performed continuously and repeatedly during mirror screen projection.
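A minimal Python sketch of how a source side might derive the capture frame rate from how quickly the on-screen content changes, capped at a full frame rate, follows. The speed thresholds and the 60 FPS cap are illustrative assumptions, not values mandated by this application.

```python
# Hypothetical sketch: map on-screen scroll speed to a capture (screen
# projection) frame rate, capped at a preset full frame rate.

FULL_FRAME_RATE = 60  # assumed maximum frame rate, in FPS

def projection_frame_rate(scroll_speed_px_per_s: float) -> int:
    """Faster on-screen changes -> higher capture frame rate."""
    if scroll_speed_px_per_s <= 0:       # static screen
        return 15
    if scroll_speed_px_per_s < 500:      # slow scrolling
        return 30
    return FULL_FRAME_RATE               # fast scrolling: capture at full rate

# Example: a fast upward swipe of ~1200 px/s drives capture at the full rate.
assert projection_frame_rate(1200.0) == FULL_FRAME_RATE
```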
In some implementations, the source-side device 100 may stamp each image frame with a timestamp according to its capture time, or with a sequence number according to the capture order.
In some implementations, the source-side device 100 may encode the captured image frames before sending them to the end-side device 200. If the source-side device 100 has insufficient computing power, or for other reasons, the encoding stage may take a long time.
S103: while performing S102, the source-side device 100 synchronizes the screen projection frame rate to the end-side device 200.
In this embodiment, the source-side device 100 may periodically synchronize the screen projection frame rate to the end-side device 200 based on the communication connection between them, or synchronize the new screen projection frame rate to the end-side device 200 when the frame rate changes.
That is, S103 is executed multiple times during mirror screen projection.
The source-side device 100 may send the screen projection frame rate together with the image frames captured in S102, or separately; this is not limited in this embodiment.
S104: the source-side device 100 starts a first application and displays a first user interface on the display screen.
This embodiment does not limit the order of S101 and S104: the source-side device 100 may first perform S101 and then perform S104 while performing S102, or first perform S104 and then perform S101 and S102. The flowchart in FIG. 3 takes the former order as an example.
In this embodiment, the first application is an application that can slide or scroll the content in a user interface in response to a user operation, for example a browser, a social application, or a reading application.
FIGS. 5A and 5B exemplarily show the process in which the source-side device 100 starts the first application.
FIG. 5A shows the user interface 41 displayed by the source-side device 100, which may be the home screen provided by the source-side device 100. As shown in FIG. 5A, the source-side device 100 can detect a user operation (such as a tap or touch) on the social application icon 501 on the home screen. In response to the user operation, the source-side device 100 can display the user interface 51 provided by the social application shown in FIG. 5B. The user interface 51 is an example of the first user interface.
Not limited to the way of starting the first application shown in FIG. 5A, in some other embodiments the source-side device 100 may start the first application in other ways. For example, the source-side device 100 may start the first application in response to a voice instruction. As another example, when used with a mouse, the source-side device 100 may start the first application in response to a double-click received on the mouse after the cursor is positioned on the first application icon, and so on.
In some embodiments, the first user interface may include system content and page content.
System content refers to the content provided by system applications when the source-side device 100 runs them, for example the status bar and the system navigation bar. System content is usually displayed in a fixed area of the display screen, and the electronic device does not change the position of the system content according to user operations.
Page content refers to the content provided by the first application currently running in the foreground of the source-side device 100, and may include, for example, the application title bar, the application menu bar, the in-application navigation bar, and the application content. When the source-side device 100 opens the first application, it can load the page content provided by the first application over the network or locally. Moreover, the first user interface displays only part of the page content; when the source-side device 100 receives an operation of sliding up or down on the screen, other parts of the page content can be displayed in the first user interface. That is, a page provided by the first application is long, and the display screen usually shows only part of it. The source-side device 100 can detect a user operation of sliding up or down on the page content and, in response to the operation, scroll the page content in the scrollable area of the first user interface. In this way, the user can browse more page content in detail as needed.
The page content may be the main page of the first application or another page provided by it, which is not limited here. The page content may come from the source-side device 100 locally or from the network.
Exemplarily, referring to the user interface 51 shown in FIG. 5B, the status bar is system content, and the other content apart from the status bar is the page content provided by the social application. The status bar is located in the area 502 of the display screen, and the other content in the area 503.
In other embodiments, the first user interface may include only page content. For example, the user interface 51 shown in FIG. 5B may include only the content other than the status bar, that is, only the content in the area 503.
According to whether content can be scrolled in response to a user operation, the display region of the first user interface, that is, the display screen of the source-side device 100, can be divided into two parts: the scrollable area and the non-scrollable area.
1. Scrollable area
The scrollable area is the area that scrolls to display different content in response to a user operation (for example, an up/down slide gesture). The content displayed in the scrollable area can be scrolled in response to such operations and may include social content, news, novel text, pictures, and so on.
Exemplarily, referring to the user interface 51 shown in FIG. 5B, the partial area 503a of the area 503 is the scrollable area, in which multiple items of content are displayed.
2. Non-scrollable area
The non-scrollable area is the area that does not scroll to display different content in response to a user operation (for example, an up/down slide gesture). The content displayed in the non-scrollable area does not scroll in response to such operations and may include: system content, and parts of the page content such as the menu bar, the search bar, and the application navigation bar.
Exemplarily, referring to the user interface 51 shown in FIG. 5B, the area 502 where the status bar is located and the partial area 503b of the area 503 are both non-scrollable areas. The non-scrollable area 503b displays a menu bar. When different options of the menu bar in FIG. 5B are selected, different content can be displayed in the scrollable area 503a.
In some other embodiments of this application, the display region of the first user interface may include only the scrollable area and no non-scrollable area. For example, the user interface 51 shown in FIG. 5B may display only the page content displayed in the scrollable area 503a.
The content of the first user interface depends on the display mechanism of the source-side device 100 and the content provided by the first application; the positions of the scrollable area and the non-scrollable area on the display screen depend on where the items of the first user interface are located on the display screen.
Optional step S105: the source-side device 100 synchronizes the application information of the first application to the end-side device 200.
In this embodiment, the source-side device 100 may periodically synchronize the application information of the application it is running to the end-side device 200 based on the communication connection between them, or synchronize the application information of the newly running application to the end-side device 200 when the running application changes. The application may be the first application or another application.
The source-side device 100 may send the application information together with the image frames captured in S102, or separately; this is not limited in this embodiment.
The application information may include any one or more of the following: the identifier of the application (such as its name or code), the application type it belongs to (such as browser, social application, or reading application), and information about the user interfaces it provides. The user interface information may include, for example, the type of the user interface, the position and size of the scrollable area in the user interface, and the position and size of the non-scrollable area.
S106: while performing S102, the source-side device 100 scrolls the page content located in the scrollable area of the first user interface at a speed greater than a first value.
The source-side device 100 may perform S106 after performing S101 and while performing S102.
In some embodiments, the source-side device 100 may perform S106 upon receiving a first operation input by the user on the source-side device 100. The first operation may be a slide operation on the scrollable area of the display screen detected by the source-side device 100 (for example a slide gesture in any direction, such as an upward or downward slide gesture); or, when the source-side device 100 is used with a mouse, a slide operation received on the mouse after the cursor is positioned in the scrollable area; or a voice instruction; and so on.
That the source-side device 100 scrolls the page content of the scrollable area means that the page content in the scrollable area scrolls or moves in a certain direction at a certain speed. During this process, part of the page content is moved out of the scrollable area and no longer displayed, part of the page content changes its position within the scrollable area, and new page content appears in the scrollable area.
The direction in which the source-side device 100 scrolls the page content is related to the first operation. If the first operation is a slide operation on the scrollable area, the direction in which the source-side device 100 scrolls the page content may be the same as the direction of the slide operation. For example, if the first operation is a slide operation in any direction, the source-side device 100 may also scroll the page content in any direction. In some other embodiments, the source-side device 100 is preset to scroll the page content only in a fixed direction (such as upward or downward); in that case, upon receiving a movement operation containing a motion vector along that fixed direction, it can scroll the page content in that fixed direction.
For example, the direction in which the source-side device 100 scrolls the page content in the scrollable area may be upward, downward, and so on. Scrolling upward means the page content displayed on the display screen moves from the bottom of the display screen toward the top; similarly, scrolling downward means the page content displayed on the display screen moves from the top of the display screen toward the bottom.
The speed at which the source-side device 100 scrolls the page content is related to the first operation. If the first operation is a slide operation on the scrollable area, the scrolling speed is related to the speed of the slide operation. Specifically, while receiving the slide operation on the scrollable area, the source-side device 100 scrolls the page content in the scrollable area in the scroll direction at the speed of the slide operation along that direction. At this time, from the user's point of view, the page content in the scrollable area moves following the hand's slide. After the user finishes inputting the slide operation, the source-side device 100 continues to scroll the page content in the scrollable area in the scroll direction with inertia. In some implementations, after the user finishes the slide operation, the source-side device 100 may slowly reduce the speed of scrolling the page content until scrolling stops. In other implementations, the source-side device 100 may first increase the scrolling speed and then slowly reduce it until scrolling stops. It can be seen that while the source-side device 100 scrolls the page content located in the scrollable area of the first user interface, the scrolling speed changes with the first operation.
In some implementations, when the first operation is implemented as a slide operation on the scrollable area, the motion vector V of the slide operation on the display screen can be calculated by the following formula:
V = √(Xv² + Yv²)
where Xv and Yv are the motion vectors of the slide operation in the X direction and the Y direction respectively. The X direction and the Y direction may be, respectively, the direction from the left side of the display screen to the right side, and the direction from the bottom of the display screen to the top.
When the first operation meets a certain condition, for example when the speed of the slide operation is greater than a certain value, the source-side device 100 scrolls the page content located in the scrollable area of the first user interface at a relatively fast speed (that is, a speed greater than the first value). The first value may be preset and is not limited here.
According to the description in S102 above, when the source-side device 100 scrolls the page content located in the scrollable area of the first user interface at a fast speed, the picture displayed on the display screen changes quickly, and the screen projection frame rate determined by the source-side device 100 is also high. Therefore, while performing S106, the source-side device 100 captures the screen content at that high screen projection frame rate, obtains multiple image frames, and sends the image frames to the end-side device 200.
FIGS. 5B-5E exemplarily show the scenario in which the source-side device 100 scrolls the page content located in the scrollable area of the first user interface.
As shown in FIGS. 5B-5C, the source-side device 100 begins detecting an upward slide operation on the scrollable area 503a in FIG. 5B and detects the end of the upward slide operation in FIG. 5C; in response to the slide operation, the source-side device 100 scrolls the page content in the scrollable area 503a upward. During this process, the page content in the scrollable area 503a can follow the user's hand, and the areas touched by the hand as it moves display the same content, for example the bottom of the animal picture in FIGS. 5B and 5C.
As shown in FIGS. 5C-5E, after the user finishes the upward slide operation, the source-side device 100 continues to scroll the page content in the scrollable area 503a upward with inertia.
FIGS. 5B-5E only exemplarily show the content displayed on the display screen while the source-side device 100 scrolls the page content; in a specific implementation, the source-side device 100 may display more images during scrolling.
The image frames obtained by the source-side device 100 capturing the screen content at the determined screen projection frame rate may include the four image frames of the user interfaces shown in FIGS. 5B-5E. In subsequent embodiments, the four image frames shown in FIGS. 5B-5E captured by the source-side device 100 are called the image frame 1, the image frame 2, the image frame 3, and the image frame 4 respectively.
In other embodiments, while displaying the projected content, the end-side device 200 may receive a second operation input by the user on the end-side device 200 and send the operation information of the second operation to the source-side device 100 to trigger the source-side device 100 to perform S106. In effect, during mirror screen projection, the user can control the content displayed by the source-side device 100 from the end-side device 200.
The second operation may be a slide operation detected by the end-side device 200 on the mirror projection area of the display screen used to display the projected content (for example a slide gesture in any direction, such as an upward or downward slide gesture); or, when the end-side device 200 is used with a mouse, a slide operation received on the mouse after the cursor is positioned in the scrollable area; or, when the end-side device 200 is used with a remote control, a click operation received on the remote control after the scrollable area is selected; or a voice instruction; and so on.
For the way (such as the speed and direction) in which the second operation detected by the end-side device 200 triggers the source-side device 100 to scroll the page content of the scrollable area, refer to the way described above in which the first operation detected by the source-side device 100 triggers the scrolling of the page content of the scrollable area, which is not repeated here.
Optional step S107: the source-side device 100 synchronizes the operation information of the first operation to the end-side device 200.
In this embodiment, the source-side device 100 may synchronize the information of detected operations to the end-side device 200 based on the communication connection between them. The operations may include the first operation and other operations.
The source-side device 100 may send the operation information together with the image frames captured in S102, or separately; this is not limited in this embodiment.
The operation information may include any one or more of the following: the type of the operation (such as a slide operation), the direction of the operation, the speed of the operation, the duration of the operation, the trajectory of the operation, and the motion vector of the operation.
S108: the end-side device 200 receives the image frames sent by the source-side device 100 and sends the received image frames into the display queue.
During mirror screen projection, high-quality communication between the source-side device 100 and the end-side device 200 cannot always be guaranteed; therefore, some of the image frames sent by the source-side device 100 to the end-side device 200 may be lost during communication, and some of them may not be received by the end-side device 200. For example, of the four image frames captured by the source-side device 100 in FIGS. 5B-5E, the end-side device 200 may receive only the image frame 1 and the image frame 2, while the image frame 3 and the image frame 4 are lost for communication reasons.
In some embodiments, the end-side device 200 may receive encoded image frames and decode them to obtain the image frames. If the end-side device 200 has insufficient computing power, or for other reasons, decoding to obtain the image frames may take a long time.
In this embodiment, the end-side device 200 may send received image frames into the display queue using any one of the following strategies. Strategy 1: the end-side device 200 sends the image frames into the display queue in the order in which they are received. Strategy 2: if the source-side device 100 stamped the image frames with timestamps, the end-side device 200 may send the image frames into the display queue in the chronological order indicated by the timestamps. Strategy 3: if the source-side device 100 stamped the image frames with sequence numbers, the end-side device 200 may send the image frames into the display queue in sequence-number order.
In some implementations, to guarantee a small delay during mirror screen projection, the end-side device 200 may discard outdated image frames it receives, avoiding sending them into the display queue, or taking them out after they have been sent into the display queue. An outdated image frame is an image frame received or decoded by the end-side device 200 whose capture time on the source-side device 100 is more than a first duration earlier. When the number of image frames in the display queue exceeds a certain number (for example 2), the end-side device 200 may discard some image frames (for example 2) at the front of the display queue. The end-side device 200 may also, according to the timestamp carried in an image frame, directly discard image frames whose capture time is more than the first duration earlier than the reception time or the decoding time. In this way, by discarding outdated image frames, the end-side device 200 guarantees the low latency of mirror screen projection, gives the user the feeling that the end-side device 200 and the source-side device 100 display images synchronously, and provides the user a better experience.
In this embodiment, the display queue is the queue in the end-side device 200 used to provide display images to the display screen. The images in the display queue are provided to the display screen for display in the order in which they were sent into the queue; image frames that have been provided to the display screen are no longer stored in the display queue. The display queue may have a predetermined size, for example a maximum of 4 image frames. It can be seen that the image frames contained in the display queue are updated in real time, and the queue may contain different image frames at different moments. Meanwhile, a cache may also be provided in the end-side device 200; image frames that the display queue has provided to the display screen for display can be stored in that cache.
Normally, the end-side device 200 follows the first-in-first-out rule and, at a fixed frequency, takes out of the display queue the image frame that was sent into it first for display. The fixed frequency is related to the screen projection frame rate of the source-side device 100; for example, the corresponding fixed period = 1 / screen projection frame rate. For example, if the screen projection frame rate is 60 frames per second, the fixed period may be 16.66 milliseconds. In a specific implementation, the end-side device 200 may continuously generate synchronization signals at that fixed frequency and take an image frame out of the display queue for display when a synchronization signal arrives. The time point at which a synchronization signal is generated is called the synchronization time point in subsequent embodiments.
Referring to FIG. 6, FIG. 6 exemplarily shows the image frames contained in the display queue of the end-side device 200 at different times over a period.
As shown in FIG. 6, the horizontal axis is the time axis; synchronization signals are generated on the time axis at the screen projection frame rate, and the synchronization time points at which the synchronization signals are generated are the time points at which the end-side device 200 takes image frames out of the display queue for display. The current time point lies between the synchronization time points 2 and 3; after the current time point is future time that has not yet arrived.
FIG. 6 shows the display queue in four different periods; the period corresponding to each display queue can be found from the corresponding time on the time axis below it.
Initially, the display queue contains one image frame a, which is provided to the display screen of the end-side device 200 for display at the arrival time of the synchronization signal 1.
Afterwards, a new image frame b enters the display queue and is provided to the display screen of the end-side device 200 for display at the arrival time of the synchronization signal 2.
Then, the end-side device 200 receives the image frame c and the image frame d in succession, and the display queue receives the image frame c and the image frame d in order; the image frame c is provided to the display screen of the end-side device 200 for display at the arrival time of the synchronization signal 3, and the image frame d at the arrival time of the synchronization signal 4.
Later, the end-side device 200 receives the image frames e-g in succession, and the display queue receives them in order; the image frame e and the image frame f, being outdated image frames, are removed from the display queue. The image frame g is provided to the display screen of the end-side device 200 for display at the arrival time of the synchronization signal 7.
During the generation times of the synchronization signals 5-6, there is no image frame in the display queue.
S109: the end-side device 200 predicts, from the image frames that most recently entered the display queue, the predicted image frames corresponding to the first time points.
The end-side device 200 performs S109 in any one of the following scenarios:
Scenario 1: the end-side device 200 performs S109 as soon as it has established the communication connection for mirror screen projection with the source-side device 100.
Scenario 2: after establishing the communication connection for mirror screen projection with the source-side device 100, the end-side device 200 performs S109 if the communication quality between the two is below a threshold.
The communication quality between the end-side device 200 and the source-side device 100 can be measured by parameters such as communication signal strength, communication delay, and signal-to-noise ratio.
If the communication quality between the end-side device 200 and the source-side device 100 is below a certain value, the communication quality between the two is poor, and the image frames sent by the source-side device 100 to the end-side device 200 are very likely to be lost during communication. That is, scenario 2 is very likely to cause frame loss; performing S109 in scenario 2 can effectively avoid the poor experience that possible frame loss would bring to the user.
Scenario 3: after establishing the communication connection for mirror screen projection with the source-side device 100, the end-side device 200 performs S109 if the source-side device 100 runs an application of a specified type in the foreground and has performed S106, that is, scrolls the page content located in the scrollable area of the first user interface at a speed greater than the first value.
The end-side device 200 can determine whether the source-side device 100 is running an application of the specified type in the foreground according to the application information synchronized by the source-side device 100 in step S105 above. An application of the specified type is an application that slides or scrolls the content in a user interface in response to a user operation, and may include, for example, a browser, a social application, or a reading application.
The end-side device 200 can also determine whether the source-side device 100 has received an operation of a specified type according to the operation information synchronized by the source-side device 100 in step S107 above; or the end-side device 200 can determine whether it has itself received an operation of the specified type. An operation of the specified type is an operation used to trigger the source-side device 100 to scroll the page content located in the scrollable area of the first user interface at a speed greater than the first value, for example a slide operation whose speed is greater than a certain value.
If the source-side device 100 runs an application of the specified type in the foreground and has performed S106, the source-side device 100 performs mirror screen projection at a high screen projection frame rate. If frame loss occurs at a high screen projection frame rate, it brings very obvious visual stuttering to the user; therefore, performing S109 in scenario 3 can effectively avoid the poor experience that frame loss at a high screen projection frame rate would bring to the user.
Based on the foregoing steps, after the source-side device 100 performs S104 and S106, that is, after the source-side device 100 starts the first application and scrolls the page content located in the scrollable area of the first user interface at a speed greater than the first value, the end-side device 200 can perform S109 during mirror screen projection.
Scenario 4: the end-side device 200 performs S109 when the difference between the display frame rate of the end-side device 200 and the screen projection frame rate of the source-side device 100 is greater than a certain value, for example the second value.
The display frame rate of the end-side device 200 is the number of image frames the display screen displays per unit time during mirror screen projection. For example, if the end-side device 200 displays the projected content according to the display queue shown in FIG. 6, and assuming a synchronization signal is generated every 16.66 milliseconds, the display frame rate is 60 FPS during the generation times of the synchronization signals 1-4 and 0 FPS during the generation times of the synchronization signals 4-6.
The closer the display frame rate of the end-side device 200 is to the screen projection frame rate of the source-side device 100, the better the mirror screen projection effect seen by the user. The greater the difference between the display frame rate of the end-side device 200 and the screen projection frame rate of the source-side device 100, the more severe the frame loss during mirror screen projection. Performing S109 in scenario 4 means performing it when a relatively severe frame loss problem begins to appear, avoiding the poor experience that subsequent continuous frame loss would bring to the user.
Not limited to the scenarios listed above individually, in some other embodiments of this application the end-side device 200 may also perform S109 based on any combination of the above scenarios. For example, the end-side device 200 may perform S109 when the conditions of scenario 3 and scenario 4 are both met.
When performing S109, the end-side device 200 may specifically use the following strategies:
Strategy 1: the end-side device 200 performs S109 once each time it receives a new image frame.
Strategy 2: the end-side device 200 performs S109 once each time a new image frame is added to its display queue.
Strategy 3: the end-side device 200 performs S109 periodically.
That is, in this embodiment of this application, S109 is performed multiple times.
The first time points are after the time point at which S109 is currently performed. The first time points may include: one or more synchronization time points at which the display frame rate of the end-side device 200 would be lower than the screen projection frame rate if the end-side device 200 displayed images only according to the image frames currently in the display queue. That is, the first time points include one or more synchronization time points after the image frames currently in the display queue have all been displayed, assuming the end-side device 200 displays according to the image frames currently in the display queue. The number of these synchronization time points may be a fixed number, preset by the end-side device 200 or set by the user, for example 2.
For example, referring to FIG. 6, assume the end-side device 200 performs S109 at the current time point. Displaying images according to the image frames in the display queue at the current time point, all the image frames in the display queue are displayed by the synchronization time point 4, and after the synchronization time point 4 the display frame rate of the end-side device 200 is 0. Therefore, one or more synchronization time points after the synchronization time point 4 are the first time points; for example, the synchronization time points 5 and 6 may be the first time points.
The following describes how the end-side device 200 uses the image frames that most recently entered the display queue to predict the predicted image frames corresponding to the first time points.
For the strategy by which the end-side device 200 sends received image frames into the display queue, and for what the image frames that most recently entered the display queue are, refer to the description in S108. Obviously, the image frames that most recently entered the display queue are the image frames most recently received by the end-side device 200.
When S109 is performed, some or all of the image frames that most recently entered the display queue may already have been sent to the display screen for display and removed from the display queue. Therefore the end-side device 200 can obtain the image frames that most recently entered the display queue from the display queue and/or the cache.
For example, referring to FIG. 6, assume the end-side device 200 performs S109 at the current time point. Relative to the current time point, the two image frames most recently sent into the display queue are the image frame c and the image frame d, and the end-side device 200 can obtain the image frame c and the image frame d from the current display queue.
In some embodiments, the end-side device 200 may use the two image frames that most recently entered the display queue to predict the predicted image frame corresponding to the first first time point. According to the strategy in S108 by which the end-side device 200 sends received image frames into the display queue, the two image frames that most recently entered the display queue may be two image frames captured adjacently by the source-side device 100, or two image frames captured non-adjacently. That is, the image frame most recently captured by the source-side device 100 before capturing the image frame 2 may be the image frame 1 or another image frame.
The following description takes the case in FIG. 6 in which the two image frames that most recently entered the display queue of the end-side device 200 are the image frame c and the image frame d, the image frame c entering the display queue before the image frame d. Assume the image frame c is specifically the image frame 1 shown in FIG. 5B, and the image frame d is specifically the image frame 2 shown in FIG. 5C.
The process by which the end-side device 200 predicts, from the two image frames that most recently entered the display queue, the predicted image frame corresponding to the first first time point may include the following steps:
S1091: the end-side device 200 compares the image frame 1 and the image frame 2 and determines the motion region in the image frame 2.
The motion region in the image frame d is the region of the image frame d whose displayed content has changed relative to the image frame c; it corresponds to the region where the image frame d displays the content of the scrollable area of the user interface provided by the source-side device 100.
In some embodiments, S1091 performed by the end-side device 200 specifically includes the following steps:
Step 1: traverse all pixels of the image frame 2, determine the change value of each pixel of the image frame 2 relative to the image frame 1 at the same position, then determine the pixels whose pixel-value change exceeds a threshold T, and obtain the region 1 composed of these pixels.
In effect, the end-side device 200 binarizes the difference between the image frame 1 and the image frame 2 during motion to find the moving region.
The degree of change of each pixel in the image frame 2 can be calculated as follows:
D_k(x, y) = |f_k(x, y) − f_{k−1}(x, y)|
where f_{k−1}(x, y) and f_k(x, y) are the pixel values of the pixel at coordinates (x, y) in the image frame 1 and the image frame 2 respectively, and D_k(x, y) is the change value of the pixel at coordinates (x, y) in the image frame 2 relative to the image frame 1.
The threshold T may be preset; for example, it may be obtained dynamically in advance by the maximum between-class variance (Otsu) method. T may be an empirical value.
The region 1 determined in step 1 may not be a region of standard shape (such as a rectangle) and may even be a discrete region. However, judging from the motion when the source-side device 100 captured the image frame 1 and the image frame 2, when the source-side device 100 scrolls page content in a user interface, the scrolled region is usually a standard-shaped, concentrated region.
For example, comparing the image frame 1 and the image frame 2, the actual scrolled region should be the scrollable area 503a in FIG. 5C; but because the image frame 1 and the image frame 2 contain some blank or identical content, the region 1 determined in step 1 may be only a non-standard, discrete part of the actual scrollable area 503a.
To obtain a more accurate motion region of the image frame 2, after step 1 the end-side device 200 may perform a further correction on the basis of the region 1.
Step 2: correct the region 1 to obtain the region 2.
In some implementations, the end-side device 200 may standardize the shape of the region 1, for example obtaining a standardized rectangle from the shape of the region 1. Specifically, the end-side device 200 may obtain the maximum abscissa x_max, minimum abscissa x_min, maximum ordinate y_max, and minimum ordinate y_min among the pixels of the region 1, then determine the four coordinate points (x_min, y_min), (x_min, y_max), (x_max, y_min), and (x_max, y_max), and determine the region formed by these four coordinate points as a standard region.
In some implementations, the end-side device 200 may de-discretize the shape of the region 1, for example obtaining a concentrated region from the shape of the region 1.
The end-side device 200 may take the union of the motion regions determined when predicting image frames over a previous period of time and the region 1 to obtain the region 2. If the end-side device 200 has not predicted image frames in the previous period, it may use two or more image frames that entered the display queue before the image frame 1 and the image frame 2, perform one or more operations similar to step 1 above to obtain one or more regions, and take the union of these regions with the region 1. Because the user operations input on the source-side device 100 over a period of time are usually the same and the scrollable area of the source-side device 100 does not change, this union approach yields a relatively accurate motion region.
The end-side device 200 may also obtain a concentrated region from the shape of the region 1 by considering the user's habit of sliding the source-side device 100. For example, if the source-side device 100 receives an up/down slide operation, the end-side device 200 may, on the basis of the region 1, directly extend its width to the width of the image frame 2. As another example, if the source-side device 100 receives a left/right slide operation, the end-side device 200 may, on the basis of the region 1, directly extend its length to the length of the image frame 2. This approach takes into account the user's operation and the way the source-side device 100 responds to it, and yields a relatively accurate motion region.
The region obtained by any one or a combination of the above three approaches may be called the region 2. The region 2 is the motion region in the image frame 2.
S1092: the end-side device 200 compares the image frame 1 and the image frame 2 and determines the movement vector v in the image frame 2.
While displaying the image frame 1, the source-side device 100 displays the image frame 2 after the page content in the motion region is moved according to the movement vector v. The movement vector v expresses the distance and direction of the movement. That is, comparing the image frame 1 and the image frame 2, a certain displayed content or target point in the image frame 1 is displayed in the motion region of the image frame 2 after being moved according to the movement vector v. In other words, when the source-side device 100 displays the image frame 1 and the image frame 2 on its display screen, the vector by which the same content moves from its position in the image frame 1 to its position in the image frame 2 is the movement vector v.
That is, assume the image f1(x, y) is translated by the vector (dx, dy) to obtain the image f2(x, y); what needs to be obtained is (dx, dy):
f2(x, y) = f1(x − dx, y − dy)
Mapped to the frequency domain, this is:
F2(u, v) = F1(u, v) · e^(−i·2π·(u·dx + v·dy))
The cross-power spectrum is:
(F1(u, v) · F2*(u, v)) / |F1(u, v) · F2*(u, v)| = e^(i·2π·(u·dx + v·dy))
The inverse Fourier transform yields an impulse function; by taking the peak of the impulse function, the offset vector (dx, dy) can be obtained.
The image f1(x, y) represents the image frame 1, and the image f2(x, y) represents the image frame 2.
In some other embodiments, the end-side device 200 may use another method to calculate the movement vector v. For example, the end-side device 200 may mark landmark pixels displaying the same content in the image frame 1 and the image frame 2, and calculate the vector by which a landmark pixel moves from its position in the image frame 1 to its position in the image frame 2, taking this as the movement vector v.
It can be seen that the above S1091-S1092 determine the motion region and the movement vector v of the image frame 2 relative to the previous image frame in the display queue (that is, the image frame 1) by comparing the image frame 2 and the image frame 1 most recently sent into the display queue.
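A minimal numpy sketch of phase correlation as derived above may look as follows. It is a sketch under the assumption of pure translation within the motion region; it uses conj(F1)·F2 rather than F1·conj(F2) so that, under numpy's FFT sign convention, the impulse peaks directly at the (dx, dy) offset.

```python
# Sketch of S1092: estimate the movement vector by phase correlation.
import numpy as np

def movement_vector(region1: np.ndarray, region2: np.ndarray):
    """Estimate (dx, dy) such that region2 ~= region1 shifted by (dx, dy).
    Inputs are same-shaped grayscale crops of the motion region."""
    F1 = np.fft.fft2(region1)
    F2 = np.fft.fft2(region2)
    cross_power = np.conj(F1) * F2
    cross_power /= np.abs(cross_power) + 1e-12   # normalized cross-power spectrum
    impulse = np.abs(np.fft.ifft2(cross_power))  # impulse peaks at the offset
    dy, dx = np.unravel_index(np.argmax(impulse), impulse.shape)
    h, w = impulse.shape
    if dy > h // 2:                              # unwrap the circular shift
        dy -= h
    if dx > w // 2:
        dx -= w
    return dx, dy
```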
S1093: on the basis of the image frame 2, the end-side device 200 moves the content displayed in the motion region according to the movement vector v; the region of the motion region vacated after the move is filled with prediction data, and the predicted image frame is obtained. The prediction data is obtained from the content displayed by the image frame 2 in the vacated region.
Specifically, on the basis of the image frame 2, the end-side device 200 moves the content displayed in the motion region according to the movement vector v. After the move, part of the content originally in the motion region is moved out of the motion region and no longer displayed, and there is a vacated region within the original motion region. The vacated region and the region originally occupied in the motion region by the moved-out content may be the same size or different sizes.
The end-side device 200 then obtains the prediction data from the content originally displayed by the image frame 2 in the vacated region and fills the prediction data into the vacated region.
In some embodiments, the end-side device 200 may process the content originally displayed by the image frame 2 in the vacated region with image processing methods such as mean blur, median blur, Gaussian blur, bilateral blur, surface blur, box blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grain blur, radial blur, or directional blur, thereby obtaining the prediction data. Prediction data obtained by such blur image processing methods has a lower definition than the content originally displayed by the image frame 2 in the vacated region.
In other embodiments, the end-side device 200 may directly regard the content originally displayed by the image frame 2 in the vacated region as the prediction data.
In still other embodiments, the end-side device 200 may use a neural network algorithm to perform image prediction from the content originally displayed by the image frame 2 in the vacated region to obtain the prediction data. The neural network algorithm may be trained with, as input, a large amount of content displayed by electronic devices in user interfaces and, as output, the content displayed after the electronic device scrolls that content in the user interface. For example, if the content originally displayed by the image frame 2 in the vacated region is part of a typical pattern, the end-side device 200 may predict the other part of the typical pattern through the algorithm.
In still other embodiments, the end-side device 200 may store the image frames sent by the source-side device 100 over a period of time, for example in the cache, and the end-side device 200 may obtain the prediction data from the stored image frames. For example, if the user inputs multiple user operations over a period of time (for example repeated up/down slide operations) that trigger the source-side device 100 to display the same image frames repeatedly, the end-side device 200 can obtain the prediction data from image frames it has already displayed.
The above methods of obtaining the prediction data may also be implemented in any combination.
In other embodiments, the end-side device 200 may use the single image frame that most recently entered the display queue to predict the predicted image frame corresponding to the first first time point. The following description takes the case in FIG. 6 in which the image frame that most recently entered the display queue of the end-side device 200 is the image frame d, assuming the image frame d is specifically the image frame 2 shown in FIG. 5C.
The process by which the end-side device 200 predicts, from the single image frame that most recently entered the display queue, the predicted image frame corresponding to the first first time point may include the following steps:
S1091': the end-side device 200 determines the motion region in the image frame 2 according to the application information of the first application.
Here, the motion region in the image frame 2 is the region of the image frame 2 whose displayed content has changed relative to the image frame most recently captured by the source-side device 100 before the image frame 2 (for example, the image frame 1); it corresponds to the region where the image frame 2 displays the content of the scrollable area of the user interface provided by the source-side device 100.
If the source-side device 100 has performed S105, the end-side device 200 can determine the position and size of the scrollable area of the first user interface from the first user interface information contained in the application information of the first application sent by the source-side device 100, and determine the region where the content located in that scrollable area lies when the image frame 2 is displayed in the first user interface as the motion region of the image frame 2.
S1092': the end-side device 200 determines the movement vector v in the image frame 2 according to the operation information of the first operation or the second operation.
Here, while displaying the image frame most recently captured before capturing the image frame 2 (for example, the image frame 1), the source-side device 100 displays the image frame 2 after the page content in the motion region is moved according to the movement vector v. The movement vector v expresses the distance and direction of the movement. That is, comparing the image frame 1 and the image frame 2, a certain displayed content or target point in the image frame 1 is displayed in the motion region of the image frame 2 after being moved according to the movement vector v. In other words, when the source-side device 100 displays the image frame 1 and the image frame 2 on its display screen, the vector by which the same content moves from its position in the image frame 1 to its position in the image frame 2 is the movement vector v.
If the source-side device 100 has performed S107, the end-side device 200 can determine the movement vector v of the image frame 2 from the operation information of the first operation sent by the source-side device 100. Alternatively, if the end-side device 200 received the second operation and triggered the source-side device 100 to perform S106, the end-side device 200 can determine the movement vector v of the image frame 2 from the operation information of the received second operation.
In some implementations, the movement vector with which the source-side device 100 scrolls the content of the scrollable area is related only to the corresponding operation, so the end-side device 200 can determine the movement vector v of the image frame 2 directly from the operation information of the first operation or the second operation. For example, the moving direction of the first operation or the second operation is taken as the moving direction of the image frame 2, and the product of the speed of the first operation or the second operation along the moving direction and the duration between two adjacent synchronization time points is taken as the moving length of the image frame 2.
In some implementations, the movement vector with which the source-side device 100 scrolls the content of the scrollable area is related to the corresponding operation and to the sliding parameters of the source-side device 100 itself, so the end-side device 200 can determine the movement vector v of the image frame 2 from the operation information of the first operation or the second operation together with the sliding parameters of the source-side device 100 itself. For example, different devices may present different page-sliding effects in response to the same first operation or second operation, and this embodiment of this application can take this into account to calculate the movement vector v of the image frame 2 accurately.
In some implementations, the movement vector with which the source-side device 100 scrolls the content of the scrollable area is related to the corresponding operation and to the application currently running in the foreground of the source-side device 100, so the end-side device 200 can determine the movement vector v of the image frame 2 from the operation information of the first operation or the second operation together with the application running in the foreground of the source-side device 100. For example, when running different applications, the source-side device 100 may present different page-sliding effects in response to the same first operation or second operation, and this embodiment of this application can take this into account to calculate the movement vector v of the image frame 2 accurately.
It can be seen that the above S1091'-S1092' determine, by inference, the motion region and the movement vector v of the image frame 2 relative to the previous image frame captured by the source-side device 100.
S1093': refer to S1093.
That is, in this embodiment of this application, the end-side device 200 can predict the predicted image frames corresponding to the first time points from the one or two image frames that most recently entered the display queue.
Referring to FIG. 7, FIG. 7 exemplarily shows the process in which the end-side device 200 obtains a predicted image frame on the basis of the image frame 2.
As shown in FIG. 7, the end-side device 200 determines the motion region 701 and the movement vector v in the image frame 2 by the method shown in any one of the above embodiments.
The end-side device 200 moves the content originally displayed in the motion region 701 within the motion region 701 according to the movement vector v. After the move, it can be seen that part of the content originally displayed in the motion region 701, such as the content 701a, is moved out of the motion region, and part of the original motion region 701 is vacated, such as the area 701b in FIG. 7.
As shown in FIG. 7, the end-side device 200 obtains the prediction data from the content originally displayed by the image frame 2 in the area 701b; after filling it into the area 701b, the predicted image frame 5 is obtained. Exemplarily, the definition of the content displayed by the predicted image frame 5 in the area 701b is lower than that of the content originally displayed by the image frame 2 in the area 701b.
After predicting the predicted image frame corresponding to the first first time point, the end-side device 200 can continue to predict the predicted image frames corresponding to the subsequent first time points, and the method for doing so may also refer to either of the two methods in the above embodiments. Specifically, the end-side device 200 can predict the predicted image frame corresponding to the second first time point from the image frame that most recently entered the display queue and the predicted image frame corresponding to the first first time point; predict the predicted image frame corresponding to the third first time point from the predicted image frames corresponding to the first and second first time points; and so on. For the step in which the end-side device 200 predicts a new predicted image frame from two image frames, refer to operations similar to S1091-S1093. Alternatively, the end-side device 200 can predict the predicted image frame corresponding to the second first time point from the predicted image frame corresponding to the first first time point; predict the predicted image frame corresponding to the third first time point from the predicted image frame corresponding to the second first time point; and so on. For the step in which the end-side device 200 predicts a new predicted image frame from one image frame, refer to operations similar to S1091'-S1093'.
Referring to FIG. 7, assume the end-side device 200 obtains the predicted image frame 6 from the image frame 2 and the predicted image frame 5. Exemplarily, the definition of the content displayed by the predicted image frame 6 in the area 701b is lower than that of the content originally displayed by the predicted image frame 5 in the area 701b.
Since the image frames in the display queue of the end-side device 200 change in real time, the end-side device 200 performs S109 multiple times; therefore, the first time points and predicted image frames determined by each execution of S109 may differ. The first time points and predicted image frames determined by the most recent execution of S109 overwrite the results of the previous execution of S109, and the previously determined first time points and predicted image frames become invalid.
In some implementations, the end-side device 200 may send the predicted image frames into a predicted image queue in order, the predicted image frame corresponding to the earlier first time point entering the predicted image queue first. Since S109 is performed multiple times, the image frames in the predicted image queue are updated on each execution of S109.
Exemplarily, referring to FIG. 6, assume that when the end-side device 200 last performed S109 it determined the predicted image frame 5 and the predicted image frame 6 from the image frame c and the image frame d (that is, the image frame 1 and the image frame 2); the predicted image frame 5 and the predicted image frame 6 are then sent into the predicted image queue in order.
S110: when a synchronization time point arrives, if there is an image frame in the display queue, the end-side device 200 displays the image frame in the display queue; if there is no image frame in the display queue, the end-side device 200 displays an image frame in the predicted image queue.
Specifically, the end-side device 200 can continuously generate synchronization signals at the fixed frequency and, at the synchronization time point when a synchronization signal arrives, take an image frame out of the current display queue for display on a first-in-first-out basis. If there is no image frame in the current display queue, it takes a predicted image frame out of the current predicted image queue for display, also on a first-in-first-out basis.
Referring to FIG. 6, assume the display queue received the image frame c and the image frame d in order; the image frame c is provided to the display screen of the end-side device 200 for display at the arrival time of the synchronization signal 3, and the image frame d at the arrival time of the synchronization signal 4. After the synchronization signal 4, there is no image frame in the display queue. Then, when the synchronization signal 5 arrives, the end-side device 200 takes the predicted image frame 5 out of the predicted image queue for display, and when the synchronization signal 6 arrives, it takes the predicted image frame 6 out of the predicted image queue for display. As with the display queue, predicted image frames that have been provided to the display screen are no longer stored in the predicted image queue and may be stored in the cache of the end-side device 200.
In this embodiment of this application, the end-side device 200 may display the image frames in the display queue or predicted image queue in full screen, or in a partial area of the display screen. In this embodiment, the area of the display screen of the end-side device 200 used to display the image frames in the display queue or predicted image queue is called the mirror projection area or projection area. The mirror projection area may be the entire display screen or a part of it.
In some embodiments, the aspect ratio of the mirror projection area and that of the display screen of the source-side device 100 may be the same or different. If the two are the same, the end-side device 200 can display the corresponding image frames in the mirror projection area proportionally; if different, it can stretch or scale the image frames according to the size of the mirror projection area before display, so that they adaptively fit the mirror projection area.
The position, size, shape, and the like of the mirror projection area on the display screen of the end-side device 200 may all be set by default by the end-side device 200, or set or adjusted by the user, which is not limited in this embodiment of this application.
If the mirror projection area occupies only part of the display screen of the end-side device 200, the end-side device 200 can, in addition to displaying the image frames in the display queue or predicted image queue in the mirror projection area, display content provided by the end-side device 200 itself in the areas outside the mirror projection area. The content displayed outside the mirror projection area depends on the applications running and interfaces open on the end-side device 200, for example the desktop or interfaces provided by other applications, which is not limited here.
FIGS. 8A-8D exemplarily show the user interfaces displayed by the end-side device 200 during mirror screen projection.
As shown in FIGS. 8A-8D, the mirror projection area 801 occupies part of the display screen of the end-side device 200, and the remaining area of the display screen displays the desktop of the end-side device 200.
After the source-side device 100 and the end-side device 200 establish the communication connection, assume the source-side device 100 captured the four image frames shown in FIGS. 5B-5E, that is, the image frames 1-4, and sent them to the end-side device 200. Due to reasons such as communication quality or delay, the image frames 1-2 are sent into the display queue of the end-side device 200, while the image frames 3-4 are lost during communication or discarded as outdated.
FIGS. 8A-8D show the user interfaces displayed by the end-side device 200 at several successive synchronization time points.
As shown in FIG. 8A, first, the end-side device 200 displays the image frame 1 in the mirror projection area 801.
As shown in FIG. 8B, next, the end-side device 200 displays the image frame 2 in the mirror projection area 801.
As shown in FIG. 8C, then, the end-side device 200 displays the predicted image frame 5 in the mirror projection area 801.
As shown in FIG. 8D, finally, the end-side device 200 displays the predicted image frame 6 in the mirror projection area 801.
Here, the predicted image frame 5 and the predicted image frame 6 are the predicted image frames obtained by the end-side device 200 in S109.
Comparing FIGS. 8A-8D with FIGS. 5B-5E, it can be seen that the mirror projection area 801 in FIG. 8A displays the image frame 1 captured by the source-side device 100 in FIG. 5B, and the mirror projection area 801 in FIG. 8B displays the image frame 2 captured by the source-side device 100 in FIG. 5C. After that, because the image frame 3 captured in FIG. 5D and the image frame 4 captured in FIG. 5E suffer frame loss, the end-side device 200 displays the predicted image frame 5 in the mirror projection area 801 in FIG. 8C, and the predicted image frame 6 in the mirror projection area 801 in FIG. 8D.
Through the method flow shown in FIG. 3, when the image frames in the display queue of the end-side device 200 have all been sent for display and a new image frame has not yet arrived, the end-side device 200 can display predicted image frames, keeping the display frame rate of the end-side device 200 close or equal to the screen projection frame rate of the source-side device 100 and displaying pictures continuously, which maintains the visual continuity of the picture and avoids freezing. Moreover, performing image prediction from the most recently received image frames ensures the smoothness and stability of the picture and avoids visual jumps. From the user's point of view, the user sees a smooth and continuous picture without stuttering or jumping, and obtains a good screen projection experience.
In the method flow shown in FIG. 3 and the other embodiments:
The image frame that most recently entered the display queue of the end-side device 200 may be called the first image frame, and the second most recent image frame to enter the display queue may be called the second image frame. For example, the image frame 2 mentioned in S109 is the first image frame, and the image frame 1 is the second image frame.
An image frame captured by the source-side device 100 after capturing the first image frame and sent to the end-side device 200, but that suffers frame loss, may be called a third image frame. For example, the image frame 3 and the image frame 4 in FIGS. 5D-5E captured by the source-side device 100, lost for communication reasons, are third image frames.
The predicted image frame predicted by the end-side device 200 for the first first time point may be called the first predicted image frame, and the predicted image frame corresponding to the second first time point may be called the second predicted image frame. For example, the predicted image frame 5 may be the first predicted image frame, and the predicted image frame 6 may be called the second predicted image frame.
When the end-side device 200 predicts the predicted image frame corresponding to the first first time point: if the prediction is made from the two image frames most recently sent into the display queue, the image frame preceding the most recent image frame sent into the display queue may be called the fourth image frame; if the prediction is made from the single image frame most recently sent into the display queue, the image frame most recently captured by the source-side device 100 before that image frame may also be called the fourth image frame. For example, referring to S109 above, the image frame 1 may be the fourth image frame.
It should be understood that the steps in the foregoing method embodiments may be completed by an integrated logic circuit of hardware in a processor or by instructions in the form of software. The method steps disclosed with reference to the embodiments of this application may be directly performed by a hardware processor, or performed by a combination of hardware in the processor and software modules.
This application further provides an electronic device. The electronic device may include a memory and a processor. The memory may be configured to store a computer program; the processor may be configured to invoke the computer program in the memory, so that the electronic device performs the method performed by either the source-side device 100 or the sink-side device 200 in any one of the foregoing embodiments.
This application further provides a chip system. The chip system includes at least one processor, configured to implement the functions involved in the method performed by either the source-side device 100 or the sink-side device 200 in any one of the foregoing embodiments.
In a possible design, the chip system further includes a memory. The memory is configured to store program instructions and data, and may be located inside or outside the processor.
The chip system may consist of chips, or may include a chip and other discrete components.
Optionally, there may be one or more processors in the chip system. The processor may be implemented by hardware or by software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented by software, the processor may be a general-purpose processor that is implemented by reading software code stored in a memory.
Optionally, there may also be one or more memories in the chip system. The memory may be integrated with the processor, or may be disposed separately from the processor; this is not limited in the embodiments of this application. For example, the memory may be a non-transitory memory, such as a read-only memory (ROM); it may be integrated with the processor on the same chip, or the two may be disposed on different chips. The type of the memory and the manner in which the memory and the processor are arranged are not specifically limited in the embodiments of this application.
For example, the chip system may be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a micro controller unit (MCU), a programmable logic device (PLD), or another integrated chip.
This application further provides a computer program product. The computer program product includes a computer program (which may also be referred to as code or instructions). When the computer program is run, a computer is caused to perform the method performed by either the source-side device 100 or the sink-side device 200 in any one of the foregoing embodiments.
This application further provides a computer-readable storage medium. The computer-readable storage medium stores a computer program (which may also be referred to as code or instructions). When the computer program is run, a computer is caused to perform the method performed by either the source-side device 100 or the sink-side device 200 in any one of the foregoing embodiments.
It should be understood that the processor in the embodiments of this application may be an integrated circuit chip having a signal processing capability. In an implementation process, the steps of the foregoing method embodiments may be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software. The foregoing processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logical block diagrams disclosed in the embodiments of this application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed with reference to the embodiments of this application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory, and the processor reads information from the memory and completes the steps of the foregoing methods in combination with its hardware.
In addition, an embodiment of this application further provides an apparatus. The apparatus may specifically be a component or a module, and may include one or more processors and a memory that are connected. The memory is configured to store a computer program. When the computer program is executed by the one or more processors, the apparatus is caused to perform the methods in the foregoing method embodiments.
The apparatus, computer-readable storage medium, computer program product, and chip provided in the embodiments of this application are all configured to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above; details are not described here again.
The implementations of this application may be combined arbitrarily to achieve different technical effects.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used for implementation, all or some of the embodiments may be implemented in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to this application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, over a coaxial cable, an optical fiber, or a digital subscriber line) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
A person of ordinary skill in the art may understand that all or some of the procedures in the methods of the foregoing embodiments may be completed by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the procedures of the foregoing method embodiments. The foregoing storage medium includes any medium capable of storing program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc.
In summary, the foregoing descriptions are merely embodiments of the technical solutions of this application and are not intended to limit the protection scope of this application. Any modification, equivalent replacement, improvement, or the like made in accordance with the disclosure of this application shall fall within the protection scope of this application.

Claims (25)

  1. A method for smoothly displaying images during screen projection, wherein the method is applied to a communication system comprising a first device and a second device, and the method comprises:
    establishing, by the first device and the second device, a communication connection;
    capturing, by the first device, content displayed on a display to obtain a first image frame, and sending the first image frame to the second device;
    receiving, by the second device, the first image frame; and
    displaying, by the second device in sequence, the first image frame and a first predicted image frame, wherein the first predicted image frame is obtained by the second device according to the first image frame.
  2. The method according to claim 1, wherein the method further comprises:
    before obtaining the first image frame, capturing, by the first device, content displayed on the display to obtain a second image frame, and sending the second image frame to the second device;
    receiving, by the second device, the second image frame; and
    further displaying, by the second device, the second image frame before displaying the first image frame;
    wherein the first predicted image frame is obtained by the second device according to the first image frame and the second image frame.
  3. The method according to claim 1 or 2, wherein before the second device displays the first predicted image frame, the method further comprises:
    after obtaining the first image frame, capturing, by the first device, the content displayed on the display again to obtain a third image frame, and sending the third image frame to the second device; and
    the second device not receiving the third image frame, or the second device not receiving the third image frame within a first duration after the first device captures the third image frame.
  4. The method according to any one of claims 1 to 3, wherein before the second device displays the first predicted image frame, the method further comprises:
    running, by the first device, a first application and receiving a first operation; scrolling, by the first device in response to the first operation, content on the display at a speed greater than a first value; sending, by the first device, application information of the first application and operation information of the first operation to the second device; and receiving, by the second device, the application information of the first application and the operation information of the first operation, and obtaining the first predicted image frame according to the first image frame;
    or,
    running, by the first device, a first application, and sending application information of the first application to the second device; receiving, by the second device, a second operation, and sending operation information of the second operation to the first device, to trigger the first device to scroll content on the display at a speed greater than the first value; and obtaining, by the second device, the first predicted image frame according to the first image frame.
  5. The method according to any one of claims 1 to 3, wherein before the second device displays the first predicted image frame, the method further comprises:
    after the first device and the second device establish the communication connection, obtaining, by the second device, the first predicted image frame according to the first image frame;
    or, in a case in which communication quality corresponding to the communication connection is lower than a threshold, obtaining, by the second device, the first predicted image frame according to the first image frame;
    or, in a case in which a difference between a display frame rate at which the second device displays image frames sent by the first device and a projection frame rate at which the first device captures content displayed on the display is greater than a second value, obtaining, by the second device, the first predicted image frame according to the first image frame.
  6. The method according to any one of claims 1 to 5, wherein
    a time point at which the second device receives the first image frame and a time point at which the first device obtains the first image frame are within a first duration of each other.
  7. The method according to claim 2, wherein
    a time point at which the second device receives the second image frame and a time point at which the first device obtains the second image frame are within the first duration of each other.
  8. The method according to any one of claims 1 to 7, wherein
    the first predicted image frame is an image obtained by: on the basis of the first image frame, moving content displayed in a motion area within the motion area according to a motion vector, and then filling an idle area with prediction data;
    wherein the motion area is an area in which different content is displayed in the first image frame and in a fourth image frame; the motion vector is a vector by which a position of target content in the fourth image frame moves to a position of the target content in the first image frame; the idle area is an area of the motion area in which no content is displayed after the content displayed in the motion area is moved; and the fourth image frame is an image frame obtained by the first device by capturing content displayed on the display most recently before the first image frame, or is the second image frame displayed by the second device most recently before displaying the first image frame; and
    the prediction data is obtained by the second device according to content displayed by the first image frame in the idle area.
  9. The method according to any one of claims 1 to 8, wherein
    the display of the second device comprises a projection area, the projection area is used to display the first image frame and the first predicted image frame in sequence, and the projection area occupies part or all of the display of the second device.
  10. The method according to any one of claims 1 to 9, wherein the method further comprises:
    after displaying the first predicted image frame, displaying, by the second device, a second predicted image frame, wherein the second predicted image frame is obtained by the second device according to the first predicted image frame.
  11. The method according to any one of claims 1 to 10, wherein
    the faster the content displayed on the display of the first device changes, the higher the projection frame rate at which the first device captures the content displayed on the display.
  12. The method according to any one of claims 1 to 3, wherein a display frame rate at which the second device displays the first image frame and the first predicted image frame in sequence is equal to a projection frame rate at which the first device captures the content displayed on the display.
  13. A method for smoothly displaying images during screen projection, wherein the method is applied to a second device, and the method comprises:
    establishing, by the second device, a communication connection with a first device;
    receiving, by the second device, a first image frame sent by the first device, wherein the first image frame is obtained by the first device by capturing content displayed on a display; and
    displaying, by the second device in sequence, the first image frame and a first predicted image frame, wherein the first predicted image frame is obtained by the second device according to the first image frame.
  14. The method according to claim 13, wherein the method further comprises:
    receiving, by the second device, a second image frame sent by the first device, wherein the second image frame is obtained by the first device by capturing content displayed on the display before obtaining the first image frame; and
    further displaying, by the second device, the second image frame before displaying the first image frame;
    wherein the first predicted image frame is obtained by the second device according to the first image frame and the second image frame.
  15. The method according to claim 13 or 14, wherein before the second device displays the first predicted image frame, the method further comprises:
    the second device not receiving a third image frame, or the second device not receiving the third image frame within a first duration after the first device captures the third image frame;
    wherein the third image frame is obtained by the first device by capturing the content displayed on the display again after obtaining the first image frame, and is sent by the first device to the second device.
  16. The method according to any one of claims 13 to 15, wherein before the second device displays the first predicted image frame, the method further comprises:
    receiving, by the second device, application information of a first application and operation information of a first operation that are sent by the first device, and obtaining the first predicted image frame according to the first image frame, wherein the first application is an application run by the first device, the first operation is an operation received by the first device while running the first application, and the first operation is used to trigger the first device to scroll, at a speed greater than a first value, content displayed on the display while running the first application;
    or,
    receiving, by the second device, application information of a first application sent by the first device, wherein the first application is an application run by the first device; receiving, by the second device, a second operation, and sending operation information of the second operation to the first device, to trigger the first device to scroll, at a speed greater than the first value, content displayed on the display while running the first application; and obtaining, by the second device, the first predicted image frame according to the first image frame.
  17. The method according to any one of claims 13 to 15, wherein before the second device displays the first predicted image frame, the method further comprises:
    after the first device establishes the communication connection with the second device, obtaining, by the second device, the first predicted image frame according to the first image frame;
    or, in a case in which communication quality corresponding to the communication connection is lower than a threshold, obtaining, by the second device, the first predicted image frame according to the first image frame;
    or, in a case in which a difference between a display frame rate at which the second device displays image frames sent by the first device and a projection frame rate at which the first device captures content displayed on the display is greater than a second value, obtaining, by the second device, the first predicted image frame according to the first image frame.
  18. The method according to any one of claims 13 to 17, wherein
    a time point at which the second device receives the first image frame and a time point at which the first device obtains the first image frame are within a first duration of each other.
  19. The method according to claim 14, wherein
    a time point at which the second device receives the second image frame and a time point at which the first device obtains the second image frame are within the first duration of each other.
  20. The method according to any one of claims 13 to 19, wherein
    the first predicted image frame is an image obtained by: on the basis of the first image frame, moving content displayed in a motion area within the motion area according to a motion vector, and then filling an idle area with prediction data;
    wherein the motion area is an area in which different content is displayed in the first image frame and in a fourth image frame; the motion vector is a vector by which a position of target content in the fourth image frame moves to a position of the target content in the first image frame; the idle area is an area of the motion area in which no content is displayed after the content displayed in the motion area is moved; and the fourth image frame is an image frame obtained by the first device by capturing content displayed on the display most recently before the first image frame, or is the second image frame displayed by the second device most recently before displaying the first image frame; and
    the prediction data is obtained by the second device according to content displayed by the first image frame in the idle area.
  21. The method according to any one of claims 13 to 20, wherein
    the display of the second device comprises a projection area, the projection area is used to display the first image frame and the first predicted image frame in sequence, and the projection area occupies part or all of the display of the second device.
  22. The method according to any one of claims 13 to 21, wherein the method further comprises:
    after displaying the first predicted image frame, displaying, by the second device, a second predicted image frame, wherein the second predicted image frame is obtained by the second device according to the first predicted image frame.
  23. The method according to any one of claims 13 to 22, wherein a display frame rate at which the second device displays the first image frame and the first predicted image frame in sequence is equal to a projection frame rate at which the first device captures the content displayed on the display.
  24. An electronic device, comprising a memory and one or more processors, wherein the memory is coupled to the one or more processors, the memory is configured to store computer program code, the computer program code comprises computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform the method according to any one of claims 13 to 23.
  25. A computer-readable storage medium comprising instructions, wherein when the instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 13 to 23.
PCT/CN2022/130904 2021-11-11 2022-11-09 Method for smoothly displaying images during screen projection, related apparatus, and system WO2023083218A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202111333283 2021-11-11
CN202111333283.8 2021-11-11
CN202210238051.2 2022-03-10
CN202210238051.2A CN116112747A (zh) 2021-11-11 2022-03-10 Method for smoothly displaying images during screen projection, related apparatus, and system

Publications (1)

Publication Number Publication Date
WO2023083218A1 true WO2023083218A1 (zh) 2023-05-19

Family

ID=86266118

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/130904 WO2023083218A1 (zh) 2021-11-11 2022-11-09 Method for smoothly displaying images during screen projection, related apparatus, and system

Country Status (2)

Country Link
CN (1) CN116112747A (zh)
WO (1) WO2023083218A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160306422A1 (en) * 2010-02-23 2016-10-20 Muv Interactive Ltd. Virtual reality system with a finger-wearable control
WO2018069215A1 (en) * 2016-10-12 2018-04-19 Thomson Licensing Method, apparatus and stream for coding transparency and shadow information of immersive video format
CN110049361A * 2019-03-05 2019-07-23 北京奇艺世纪科技有限公司 Display control method and apparatus, screen projection device, and computer-readable medium
CN111417006A * 2020-04-28 2020-07-14 广州酷狗计算机科技有限公司 Video screen projection method and apparatus, terminal, and storage medium
CN113596231A * 2021-07-28 2021-11-02 努比亚技术有限公司 Screen projection display control method, device, and computer-readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116828242A * 2023-08-30 2023-09-29 亿咖通(湖北)技术有限公司 Long-link screen projection method, system, and storage medium
CN116828242B * 2023-08-30 2023-12-05 亿咖通(湖北)技术有限公司 Long-link screen projection method, system, and storage medium

Also Published As

Publication number Publication date
CN116112747A (zh) 2023-05-12

Similar Documents

Publication Publication Date Title
CN113553014B (zh) Application interface display method in multi-window screen projection scenario and electronic device
WO2021027747A1 (zh) Interface display method and device
WO2021139768A1 (zh) Interaction method for cross-device task processing, electronic device, and storage medium
US20220398059A1 (en) Multi-window display method, electronic device, and system
WO2022052773A1 (zh) Multi-window screen projection method and electronic device
US10726604B2 (en) Controlling display performance using display statistics and feedback
CN113553130B (zh) Method for an application to perform drawing operations and electronic device
WO2022105445A1 (zh) Browser-based application screen projection method and related apparatus
WO2022083465A1 (zh) Screen projection method for electronic device, medium thereof, and electronic device
WO2023093776A1 (zh) Interface generation method and electronic device
WO2023005900A1 (zh) Screen projection method, electronic device, and system
WO2023083218A1 (zh) Method for smoothly displaying images during screen projection, related apparatus, and system
WO2023016014A1 (zh) Video editing method and electronic device
US20240012534A1 (en) Navigation Bar Display Method, Display Method, and First Electronic Device
CN116048933A (zh) Fluency detection method
WO2023066165A1 (zh) Animation effect display method and electronic device
CN116708753B (zh) Method for determining cause of preview stutter, device, and storage medium
CN116700601B (zh) Memory optimization method, device, and storage medium
WO2023093779A1 (zh) Interface generation method and electronic device
CN115098449B (zh) File cleaning method and electronic device
CN114281440B (zh) Method for displaying user interface in dual-system device and electronic device
WO2023066177A1 (zh) Animation effect display method and electronic device
WO2023030276A1 (zh) Display method, apparatus, device, and storage medium
WO2024104094A1 (zh) Screenshot sharing method and electronic device
CN114079654B (zh) Data retransmission method, system, and related apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22892012

Country of ref document: EP

Kind code of ref document: A1