CN116033158A - Screen projection method and electronic equipment - Google Patents

Screen projection method and electronic equipment

Info

Publication number: CN116033158A
Application number: CN202210901655.0A
Authority: CN (China)
Other languages: Chinese (zh)
Other versions: CN116033158B (granted publication)
Legal status: Granted; active
Inventors: 李鹏飞, 张伟, 李上红
Original and current assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Prior art keywords: electronic device, frame rate, mobile phone, screen, encoder

Landscapes

  • Telephone Function (AREA)
Abstract

An embodiment of the present application provides a screen projection method and an electronic device. The method is executed by a first electronic device and comprises: obtaining a target encoding frame rate, where the target encoding frame rate is determined according to the number of pixels that an encoder in the first electronic device can encode within a preset duration and the resolution used by the first electronic device when encoding the image data to be screen-cast; and encoding the image data to be screen-cast at the target encoding frame rate and resolution to obtain encoded data, and sending the encoded data to a second electronic device. The method enables the encoder to encode at an optimal encoding frame rate, makes reasonable use of the encoder's capacity, and improves the success rate of the encoding process, thereby improving the success rate of screen projection.

Description

Screen projection method and electronic equipment
The present application claims priority to Chinese patent application No. 202210600336.6, entitled "Screen projection method and electronic device," filed with the China National Intellectual Property Administration in 2022, the entire contents of which are incorporated herein by reference.
Technical Field
The application relates to the technical field of screen projection, in particular to a screen projection method and electronic equipment.
Background
Currently, more and more electronic devices support wireless screen projection, that is, the display interface of an electronic device A is projected onto the screen of another electronic device B, so that a user can watch the displayed content on electronic device B. For example, a mobile phone may project its display interface onto a tablet computer.
When a mobile phone projects its screen to a tablet computer, the phone usually needs to encode the screen-projection data first and then send the encoded data to the tablet computer, so that the tablet computer can decode the data and display the screen-projection interface.
However, as the variety of mobile phones grows, the encoding performance of different phones differs. Therefore, before encoding the data to be screen-cast, the encoding parameters suitable for the current phone need to be determined in order to improve the success rate of screen projection.
Disclosure of Invention
The application provides a screen projection method and electronic equipment, which can improve the success rate of screen projection.
In a first aspect, the present application provides a screen projection method, executed by a first electronic device, comprising: obtaining a target encoding frame rate, where the target encoding frame rate is determined according to the number of pixels that an encoder in the first electronic device can encode within a preset duration and the resolution used by the first electronic device when encoding the image data to be screen-cast; and encoding the image data to be screen-cast at the target encoding frame rate and resolution to obtain encoded data, and sending the encoded data to a second electronic device.
The first electronic device may be a mobile phone and the second electronic device may be a tablet computer. When the mobile phone projects its screen to the tablet computer, a screen-projection resolution (referred to simply as resolution) and an encoding frame rate must be used to encode the data to be screen-cast. To improve the phone's encoding success rate, the method determines the target encoding frame rate from the number of pixels the phone's encoder can encode within a preset duration (for example, 1 second) and the screen-projection resolution, so that encoding stays within the encoder's capacity. In some implementations, the target encoding frame rate may also be called the optimal encoding frame rate.
It can be appreciated that, in one implementation, the target encoding frame rate is determined by the first electronic device according to the number of pixels its encoder can encode within the preset duration and the screen-projection resolution; in this case, the first electronic device can use the target encoding frame rate directly once it has been determined. In another implementation, the target encoding frame rate is determined by the second electronic device according to the same quantities; in this case, the second electronic device needs to obtain from the first electronic device the number of pixels the first device's encoder can encode (the screen-projection resolution may be negotiated between the two devices), and after determining the target encoding frame rate it must return it to the first electronic device so that the first electronic device can encode with it.
In one implementation, the number of pixels that the encoder can encode within the preset duration may be obtained through a preset interface.
In this implementation, when the first electronic device projects its display interface to the second electronic device, an optimal encoding frame rate can be determined from the number of pixels the current encoder can encode and the screen-projection resolution, so that the encoder encodes at the optimal frame rate. This makes reasonable use of the encoder's capacity and improves the success rate of the encoding process, thereby improving the screen-projection success rate.
With reference to the first aspect, in some implementations of the first aspect, in the case where the target encoding frame rate is determined by the first electronic device according to the number of pixels that its encoder can encode within a preset duration and the screen-projection resolution, the method further includes: determining the number of pixels that the encoder can encode within the preset duration.
In one implementation, the encoder contains macroblocks (also called encoding blocks) with encoding capability, each macroblock encoding a certain number of pixels. Determining the number of pixels that the encoder can encode within the preset duration then includes: determining this number according to the number of available macroblocks in the encoder and the number of pixels each available macroblock can encode within the preset duration.
In the case where every available macroblock can encode the same number of pixels, the first electronic device may take the product of the number of available macroblocks and the number of pixels each can encode as the number of encoder-encodable pixels. In the case where the available macroblocks can encode different numbers of pixels, the first electronic device may add up the per-macroblock counts to obtain the number of encoder-encodable pixels.
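The two cases above (uniform and mixed macroblock capacities) can be sketched as follows. This is an illustrative sketch; `EncoderCapacity` and its method names are assumed names, not an actual platform API:

```java
import java.util.List;

public class EncoderCapacity {
    // Total pixels the encoder can encode within the preset duration when
    // every available macroblock encodes the same number of pixels: N * n.
    static long encodablePixelsUniform(long availableMacroblocks, long pixelsPerMacroblock) {
        return availableMacroblocks * pixelsPerMacroblock;
    }

    // When the available macroblocks differ in capacity, sum the
    // per-macroblock pixel counts instead of multiplying.
    static long encodablePixelsMixed(List<Long> pixelsPerMacroblock) {
        long total = 0;
        for (long p : pixelsPerMacroblock) {
            total += p;
        }
        return total;
    }
}
```

Either method yields the encoder's pixel budget for the preset duration, which is then compared against the screen-projection resolution to pick a frame rate.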
After determining the number of pixels that the encoder can encode within the preset duration, the first electronic device may determine the target encoding frame rate according to the relation N×n ≥ H×L×M, where N×n is the number of pixels the encoder can encode within the preset duration, N is the number of available macroblocks in the encoder, n is the number of pixels each available macroblock can encode within the preset duration, H×L is the resolution, and M is the target encoding frame rate.
In one implementation, the M obtained from this relation may not be an integer; the target encoding frame rate may then be taken as the largest integer that satisfies the relation, and using this maximum encoding frame rate improves subsequent encoding efficiency.
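As a minimal sketch of the relation N×n ≥ H×L×M, the largest integer M can be obtained by integer (floor) division; the class and method names here are illustrative, not from the source:

```java
public class TargetFrameRate {
    // Largest integer M satisfying encodablePixels >= width * height * M,
    // i.e. the optimal encoding frame rate for the encoder's pixel budget.
    static int targetFrameRate(long encodablePixels, int width, int height) {
        long pixelsPerFrame = (long) width * height; // H * L
        return (int) (encodablePixels / pixelsPerFrame); // floor division
    }
}
```

For example, a budget of 124,416,000 pixels per second at 1920×1080 (2,073,600 pixels per frame) yields M = 60.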
With reference to the first aspect, in some implementations of the first aspect, the method further includes: the number of available macroblocks in the encoder is obtained.
In one implementation, the CPU may provide an interface for obtaining the number of available macroblocks in the encoder, which may be: hnMediaCodec.getMediaMacroblockInfo().getAvailableNum(), where HnMediaCodecManager hnMediaCodec = HnMediaCodecManager.getInstance().
In one implementation, in the case where the target encoding frame rate is determined by the second electronic device according to the number of pixels that the encoder of the first electronic device can encode within the preset duration and the screen-projection resolution, the second electronic device may execute the technical solution of the second aspect below.
In a second aspect, the present application provides a screen projection method, executed by a second electronic device, comprising: sending a target encoding frame rate to a first electronic device, where the target encoding frame rate is determined by the second electronic device according to the number of pixels that an encoder in the first electronic device can encode within a preset duration and the resolution used by the first electronic device when encoding the image data to be screen-cast; and receiving encoded data from the first electronic device and displaying the image corresponding to the encoded data, where the encoded data is obtained by the first electronic device encoding the image data to be screen-cast at the target encoding frame rate and resolution.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: the number of pixels that the encoder can encode within the preset duration is determined according to the number of available macroblocks in the encoder and the number of pixels that each of the available macroblocks can encode within the preset duration.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: according to the relation: and determining a target coding frame rate, wherein NxNxn is the number of pixels which can be coded in a preset time period by the encoder, N is the number of available macro blocks in the encoder, N is the number of pixels which can be coded in a preset time period by each available macro block, hxL is the resolution, and M is the target coding frame rate.
With reference to the second aspect, in some implementations of the second aspect, the target encoding frame rate is the maximum integer that satisfies the relation.
With reference to the second aspect, in some implementations of the second aspect, before determining the number of pixels that the encoder can encode within the preset duration, the method further includes: the number of available macroblocks in the encoder is obtained.
With reference to the second aspect, in some implementations of the second aspect, obtaining the number of available macroblocks in the encoder includes: obtaining it through the interface hnMediaCodec.getMediaMacroblockInfo().getAvailableNum(), where HnMediaCodecManager hnMediaCodec = HnMediaCodecManager.getInstance().
For the implementation principles and technical effects of the second aspect and its implementations, refer to the description of the first aspect; they are not repeated here.
In a third aspect, the present application provides a screen projection system, including a first electronic device and a second electronic device, where the first electronic device executes any one of the methods in the first aspect and the second electronic device executes any one of the methods in the second aspect.
In a fourth aspect, the present application provides an apparatus included in an electronic device, which has the function of implementing the electronic-device behavior in the first aspect and its possible implementations, or in the second aspect and its possible implementations. The function may be realized by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the function, such as a receiving module or unit, a processing module or unit, etc.
In a fifth aspect, the present application provides an electronic device, the electronic device comprising: a processor, a memory, and an interface; the processor, the memory and the interface cooperate with each other to enable the electronic device to execute any one of the methods in the technical solutions of the first aspect, or execute any one of the methods in the technical solutions of the second aspect.
In a sixth aspect, the present application provides a chip comprising a processor. The processor is configured to read and execute a computer program stored in the memory to perform the method of the first aspect and any possible implementation thereof, or to perform the method of the second aspect and any possible implementation thereof.
Optionally, the chip further comprises a memory, and the memory is connected with the processor through a circuit or a wire.
Further optionally, the chip further comprises a communication interface.
In a seventh aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, which when executed by a processor causes the processor to perform any one of the methods of the first aspect or to perform any one of the methods of the second aspect.
In an eighth aspect, the present application provides a computer program product comprising: computer program code which, when run on an electronic device, causes the electronic device to perform any one of the methods of the first aspect or to perform any one of the methods of the second aspect.
Drawings
Fig. 1 is a schematic structural diagram of an example of an electronic device according to an embodiment of the present application;
FIG. 2 is a block diagram of a software architecture of an example electronic device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a desktop interface of a mobile phone according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of an opening interface of a mobile phone assistant APP according to an embodiment of the present application;
fig. 5 (a) is a schematic diagram of a connection interface of a mobile phone assistant APP according to an embodiment of the present application;
fig. 5 (b) is a schematic diagram of a connection interface of a tablet computer according to an embodiment of the present application;
fig. 6 is a schematic diagram of an example of a wireless screen projection interface of a mobile phone according to an embodiment of the present application;
fig. 7 is a schematic diagram of a comparative display of an example of a mobile phone screen to a tablet computer according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a system architecture of a mobile phone and a tablet computer according to an embodiment of the present disclosure;
fig. 9 is a schematic diagram of a process of performing data interaction between each module in a mobile phone and each module in a tablet computer in a process of transmitting data between the mobile phone and the tablet computer provided in an embodiment of the present application;
fig. 10 is a schematic diagram of a process of performing data interaction between each module in a mobile phone and each module in a tablet computer in a process of establishing connection between the mobile phone and the tablet computer according to an embodiment of the present disclosure;
FIG. 11 is a flowchart of an exemplary screen projection method according to an embodiment of the present disclosure;
fig. 12 is a schematic diagram of a disconnection interface of a mobile phone assistant APP according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "plurality" means two or more.
The terms "first," "second," "third," and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature qualified by "first," "second," or "third" may explicitly or implicitly include one or more such features.
In the embodiments of the present application, "screen projection" refers to transmitting the data of the display interface on one electronic device to another electronic device, so that the other electronic device displays the same interface. For ease of understanding, the "other electronic device" above is referred to as the projection device. The embodiments of the present application can project a mobile phone's screen to a tablet computer, a tablet computer's screen to a personal computer (PC), a mobile phone's screen to a PC, and so on. The description below takes a mobile phone as the electronic device and a tablet computer as the projection device.
Currently, when a mobile phone projects its screen to a tablet computer, the phone generally needs to encode the data to be screen-cast first and then transmit the encoded data to the tablet. In some video screen-projection scenarios, for example when a phone playing a video projects it to a tablet, if the video is high-definition and high-frame-rate (that is, its resolution and encoding frame rate are relatively high) while the phone's encoding performance is relatively low (that is, its encoder is relatively weak), the phone may fail to encode the video data, and the screen projection fails.
Illustratively, when a mobile phone encodes video data, it typically does so with an encoder that contains encoding blocks (macroblocks). Suppose the current encoder has a available encoding blocks, and the resolution and encoding frame rate used for the video (the frame rate is usually preset) are relatively high. If it is calculated that b blocks are needed for encoding to succeed, and b is greater than a, the available blocks cannot complete the encoding of the video data: encoding fails and no encoded data is produced. In that case the tablet computer receives no encoded data and cannot display the screen-projection interface, i.e., the screen projection fails.
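The failure condition described above, required blocks b exceeding available blocks a, can be sketched as follows. The 16×16-pixel block size is an assumption typical of H.264-style macroblocks and is not stated in the source; the class and method names are illustrative:

```java
public class EncodeFeasibility {
    static final int MB_PIXELS = 16 * 16; // assumed macroblock size (H.264-style)

    // Macroblocks needed per second (b) to encode at the given
    // resolution and frame rate.
    static long requiredBlocks(int width, int height, int frameRate) {
        long pixelsPerSecond = (long) width * height * frameRate;
        // Round up so a partially filled block still counts.
        return (pixelsPerSecond + MB_PIXELS - 1) / MB_PIXELS;
    }

    // Encoding can succeed only if the available blocks (a) cover the
    // requirement (b), i.e. b <= a.
    static boolean canEncode(long availableBlocks, int width, int height, int frameRate) {
        return availableBlocks >= requiredBlocks(width, height, frameRate);
    }
}
```

For 1920×1080 at 60 fps, for instance, 486,000 blocks per second are required under this assumed block size; an encoder with fewer available blocks would fail as described.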
In view of this, an embodiment of the present application provides a screen projection method in which an electronic device determines an optimal encoding frame rate from the encoding blocks currently available in the encoder and the resolution of the screen-projection data, so that the encoder encodes at the optimal frame rate and makes reasonable use of the available blocks, improving the success rate of the encoding process and hence the success rate of screen projection. It should be noted that the screen projection method provided in the embodiments of the present application may be applied to electronic devices with a screen projection function, such as tablet computers, PCs, ultra-mobile personal computers (UMPC), vehicle-mounted devices, netbooks, personal digital assistants (PDA), etc.; the embodiments of the present application do not limit the specific type of electronic device.
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. Taking the example of the electronic device 100 being a mobile phone, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it may be called directly from memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
It should be understood that the structure of the electronic device 100 is not particularly limited in the embodiments of the present application, except for the various components or modules listed in fig. 1. In other embodiments of the present application, electronic device 100 may also include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a software structure block diagram of the electronic device 100 according to the embodiment of the present application. Taking the example of the electronic device 100 being a mobile phone, the layered architecture divides the software into several layers, each of which has a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, the Android runtime (Android runtime) and system libraries, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, phone assistants, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without requiring user interaction; for example, the notification manager is used to notify that a download is complete, to give message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, a text message is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core libraries consist of two parts: one part is the function libraries that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises display drive, camera drive, audio drive, sensor drive and the like.
It should be understood that the software architecture of the electronic device 100 is not specifically limited in this embodiment, except for the various layers listed in fig. 2 and the modules included in the various layers. In other embodiments of the present application, the electronic device 100 may also include more or fewer layers than shown, or certain layers may be combined, or certain layers may be split, or different arrangements of modules.
First, the process of casting the screen of a mobile phone to a tablet computer is described. In one implementation, an Application (APP) for the user to perform a screen-casting operation may be installed on the mobile phone; for example, this APP may be a mobile phone assistant APP, and as shown in fig. 3, the mobile phone assistant APP may be displayed on the desktop of the mobile phone in the form of a desktop icon. The user can start the mobile phone assistant APP by clicking the desktop icon, and the start interface of the mobile phone assistant APP can be seen in fig. 3. In the interface shown in fig. 3, the start interface of the mobile phone assistant APP includes a "connect immediately" control and a description of the screen-casting modes supported by the mobile phone, for example, the mirror mode currently supported by the mobile phone. On this interface, after the user clicks the "connect immediately" control, the display jumps to the interface shown in fig. 4, and the mobile phone starts searching for available screen-casting devices nearby. These available screen-casting devices can perform short-distance communication with the mobile phone, such as Bluetooth communication (it should be noted that at this time both the mobile phone and the searched available screen-casting devices have turned on their Bluetooth switches). Assuming that the currently available screen-casting devices are device 1 and device 2, the mobile phone may display the basic information (e.g., device names) of device 1 and device 2 in the available device list shown in fig. 4. Then, the user may click to select one device in the available device list (assuming device 2, i.e., the tablet computer, is selected), and the connection interface shown in diagram (a) in fig. 5 is displayed on the mobile phone; at the same time, the mobile phone sends a connection request to device 2, and device 2 displays a confirmation interface as shown in diagram (b) in fig. 5.
After the user clicks the "agree" control on the interface shown in diagram (b) in fig. 5, the screen-casting connection process between the mobile phone and the tablet computer is complete; the mobile phone can then send the data of its current display interface to the tablet computer, and the current display interface of the mobile phone is displayed on the tablet computer. When the screen-casting connection between the mobile phone and the tablet computer is established, the interface currently displayed by the mobile phone is still the interface of the mobile phone assistant APP, so that interface is displayed on the tablet computer; after a video playing interface is opened on the mobile phone, the video playing interface is displayed on the tablet computer. For example, a schematic diagram of the effect of casting the mobile phone to the tablet computer may be shown in fig. 6.
It can be understood that an APP for the user to perform the screen-casting operation can also be installed on the tablet computer, for example a tablet assistant APP, through which the user can cast the tablet computer to an electronic device such as a PC; the specific operation steps are similar to those of mobile phone screen casting and are not described herein.
In another implementation, the mobile phone may have a wireless screen-casting function. For example, as shown in fig. 7, the user may click the "wireless screen casting" control in the pull-down system menu of the mobile phone; in response to the user's click operation, the mobile phone starts searching for available screen-casting devices nearby and displays a list of available devices in a blank area of the screen-casting interface of the mobile phone. When available screen-casting devices are found, the mobile phone may display the basic information (such as the device names) of device 1 and device 2 in the available device list. Then, the user clicks to select one device in the available device list (assuming device 2, i.e., the tablet computer, is selected), the mobile phone sends a connection request to device 2, and after the user confirms the connection on the tablet computer, the screen-casting connection process between the mobile phone and the tablet computer is complete.
Based on the above process of executing the screen projection operation by the user, the screen projection method provided in the embodiment of the present application is described below with reference to a system architecture of a mobile phone and a tablet computer and a data interaction process between each module in the system architecture. Fig. 8 is a schematic diagram of a system architecture of a mobile phone and a tablet computer, and fig. 9 is a schematic diagram of a process of performing data interaction between each module in the mobile phone and each module in the tablet computer in the screen projection method provided in the embodiment of the present application.
As shown in fig. 8, the mobile phone at least includes: the mobile phone assistant APP of the application layer; the first screen projection module, the first transmission module, and the first basic capability module of the capability layer; the multimedia framework of the framework layer; the first Bluetooth driver, the first Wi-Fi driver, and the first USB driver of the kernel layer; and the CPU, graphics card (GPU), encoder, first Bluetooth chip, and first Wi-Fi chip of the hardware layer. The tablet computer at least includes: the tablet assistant APP of the application layer; the second transmission module, the second screen projection module, and the second basic capability module of the capability layer; the second Bluetooth driver, the second Wi-Fi driver, the second USB driver, and the display driver of the kernel layer; and the graphics card (GPU), decoder, second Bluetooth chip, second Wi-Fi chip, and display screen of the hardware layer.
Wherein the mobile phone assistant APP and the tablet assistant APP may be used to interact with the user, who may perform various operations on them. Optionally, the mobile phone assistant APP and the tablet assistant APP may further include a service management module and a service setting module, which are configured to manage and set up the APP. The mobile phone assistant APP of the mobile phone and the tablet assistant APP of the tablet computer can negotiate the screen-casting resolution. The first screen projection module of the mobile phone may include a screen capturing module, an encoding logic module, and a virtualization service module; the screen capturing module can be used to capture image data of the display interface on the mobile phone, the encoding logic module can call the encoder to encode the image data, and the first transmission module can be used to send the encoded data to the second transmission module of the tablet computer. The virtualization service module can provide service instructions, such as instructing the mobile phone assistant APP to calculate the optimal coding frame rate. The first basic capability module may provide the first screen projection module with the corresponding capabilities it needs to implement various functions. The first Bluetooth driver can invoke the capability of the first Bluetooth chip, and the first Wi-Fi driver can invoke the capability of the first Wi-Fi chip, so that the mobile phone and the tablet computer connect in a Bluetooth or Wi-Fi manner. The second screen projection module of the tablet computer may include a decoding logic module, a display module, and a virtualization service module; the decoding logic module can call the decoder to decode the received encoded data, and the display module can call the display driver to display the decoded image data on the display screen.
It should be noted that the functions of other modules on the tablet computer are similar to the functions of corresponding modules in the mobile phone, and are not described herein.
Based on the system architecture shown in fig. 8, as shown in fig. 9, a process of the screen projection method provided in the embodiment of the present application may include:
S1, the mobile phone assistant APP of the mobile phone and the tablet assistant APP of the tablet computer negotiate a screen-casting resolution.
The mobile phone and the tablet computer can negotiate the screen resolution after establishing the connection, and the process can be described in the embodiment shown in fig. 10 below. It is understood that the projection resolution may be the resolution of one frame of image, that is, the number of pixels of one frame of image.
S2, the mobile phone assistant APP stores the negotiated screen-throwing resolution.
Optionally, the mobile phone assistant APP can store the negotiated screen resolution in a data file for reading when required by the subsequent calculation.
S3, the mobile phone assistant APP sends a message of successful connection establishment to the first screen projection module.
S4, the first screen projection module sends a first message to the mobile phone assistant APP.
In S3, the mobile phone assistant APP sends a message of successful connection establishment to the first screen projection module, that is, it feeds back to the first screen projection module that the mobile phone assistant APP has completed its data preparation and the next operation can be executed. Then, the first screen projection module may send a first message to the mobile phone assistant APP to instruct the mobile phone assistant APP to perform the next operation. Optionally, the mobile phone assistant APP may send the message of successful connection establishment to the virtualization service module in the first screen projection module, and correspondingly, the virtualization service module sends the first message to the mobile phone assistant APP.
S5, the mobile phone assistant APP sends a second message to the multimedia framework, wherein the second message is used for obtaining the number of available coding blocks.
S6, the multimedia framework sends the second message to the CPU.
Alternatively, the messages sent between the modules in the handset may be inter-process communication (IPC) messages.
S7, the CPU obtains the number of available coding blocks.
Wherein the CPU may provide an interface for obtaining the number of available coding blocks in the encoder. Alternatively, the interface may be invoked as follows:
HnMediaCodecManager hnMediaCodec = HnMediaCodecManager.getInstance();
int N = hnMediaCodec.getMediaMacroblockInfo().getAvailableNum();
where N is the number of available coding blocks acquired.
S8, the CPU sends the number of available coding blocks to the multimedia framework.
S9, the multimedia framework sends the number of available coding blocks to the mobile phone assistant APP.
S10, the mobile phone assistant APP calculates the optimal coding frame rate according to the number of available coding blocks and the screen-throwing resolution.
In this step, the mobile phone assistant APP may first read the negotiated screen-projection resolution from the data file, and then calculate an optimal coding frame rate according to the number of available coding blocks and the screen-projection resolution, where the coding frame rate may be a number of frames coded in a preset time (for example, within 1 second).
In one implementation, the process by which the mobile phone assistant APP calculates the optimal coding frame rate may be as follows: assuming that the number of available coding blocks is N, the number of pixels that one coding block can encode in a preset time (for example, in 1 second) is n×n, the screen-casting resolution is H×L, and the optimal coding frame rate is M, the following relation needs to be satisfied: H×L×M ≤ N×n×n.
That is, the number of pixels actually encoded by the encoder in 1 second during encoding must be less than or equal to the maximum number of pixels that the available coding blocks can encode, which ensures that the encoder can encode successfully.
Illustratively, assuming that H×L = 1920×1080, N = 10000, and n×n = 16×16, then 1920×1080×M ≤ 10000×16×16 must hold with M the largest such integer, that is, M equals 1, so the calculated optimal coding frame rate of the encoder is 1 frame/second.
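The relation above reduces to a one-line integer calculation. The following is an illustrative sketch, not code from the patent; the function name and the numeric values are assumptions chosen so that the arithmetic is easy to check:

```python
def optimal_frame_rate(available_blocks: int, block_pixels: int,
                       width: int, height: int) -> int:
    """Largest integer M with width * height * M <= available_blocks *
    block_pixels, i.e. the pixels encoded per second must not exceed
    what the available coding blocks can encode per second."""
    pixel_budget = available_blocks * block_pixels  # pixels encodable per second
    return pixel_budget // (width * height)

# Example: 10000 available 16x16 blocks at 1920x1080 gives M = 1.
m = optimal_frame_rate(10000, 16 * 16, 1920, 1080)
```

Note that if the budget is smaller than one frame's worth of pixels, the result is 0, which signals that the encoder cannot sustain even one frame per second at the negotiated resolution.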
S11, the mobile phone assistant APP sends the optimal coding frame rate to the first screen projection module.
S12, the first screen projection module stores the optimal coding frame rate.
Optionally, the mobile phone assistant APP may send the optimal coding frame rate to the virtualization service module in the first screen projection module, and then the virtualization service module stores the optimal coding frame rate.
S13, the first screen projection module sets encoder parameters according to the optimal encoding frame rate.
Optionally, in the case that the above-mentioned virtualization service module stores the optimal coding frame rate, the coding logic module in the first screen projection module may set the encoder parameters, and this process may be: the virtualized service module initializes the coding logic module and sends an optimal coding frame rate to the coding logic module, and the coding logic module sets the encoder parameters according to the optimal coding frame rate. That is, by setting the encoder parameters, the encoder can be made to encode at the optimal encoding frame rate at the time of encoding.
While the mobile phone sets the encoder parameters, the tablet computer also needs to set up its decoder. Specifically:
S14, the tablet assistant APP sends a message of successful connection establishment to the second screen projection module.
S15, initializing a decoder by the second screen projection module.
Optionally, the tablet assistant APP may send a message that the connection is established successfully to the decoding logic module in the second screen projection module, and the decoding logic module initializes the decoder.
The mobile phone can then acquire the image data of the current display interface for screen projection, and the process can include:
S16, the first screen projection module acquires image data.
Optionally, the image data may be acquired by the screen capturing module in the first screen projection module; for example, the screen capturing module acquires the image data from the graphics card (GPU) through an API interface.
The graphics card (GPU) may maintain an image queue containing the image data to be displayed on the display screen of the mobile phone; after the GPU draws and renders the data to be displayed, the image data can be sent for display on the display screen of the mobile phone.
S17, the first screen projection module calls an encoder to encode the acquired image data.
The first screen projection module can call the encoder through the video encoding capability in the first basic capability module to encode the image data. Optionally, the screen capturing module may send the image data to an encoding logic module in the first screen projection module, and then the encoding logic module invokes an encoder to encode the acquired image data. It will be appreciated that the encoding parameters used in this step for encoding the image data are the negotiated screen resolution and the calculated optimal encoding frame rate.
S18, the first screen projection module sends the encoded image data to a second transmission module of the tablet personal computer through the first transmission module.
Because the mobile phone and the tablet computer established a connection in S1, the first transmission module may send the image data to the second transmission module of the tablet computer along the data channel (e.g., a socket channel) corresponding to the connection.
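A socket channel is a byte stream, so encoded frames are typically sent with some framing that lets the receiver find frame boundaries. The following is a minimal sketch of one common approach (length-prefixed framing); the function names and the scheme itself are illustrative assumptions, not details specified by the patent:

```python
import struct

def pack_frame(encoded: bytes) -> bytes:
    """Prefix an encoded frame with its 4-byte big-endian length."""
    return struct.pack(">I", len(encoded)) + encoded

def unpack_frames(buffer: bytes):
    """Split a received byte stream into complete frames; returns
    (frames, remaining_bytes) so a partial tail can be kept for the
    next receive call."""
    frames = []
    offset = 0
    while offset + 4 <= len(buffer):
        (length,) = struct.unpack_from(">I", buffer, offset)
        if offset + 4 + length > len(buffer):
            break  # this frame has not been fully received yet
        frames.append(buffer[offset + 4 : offset + 4 + length])
        offset += 4 + length
    return frames, buffer[offset:]
```

The receiver feeds every chunk read from the socket into `unpack_frames` and passes each complete frame to the decoder, carrying any leftover bytes into the next read.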
And S19, the second transmission module of the tablet personal computer sends the encoded image data to the second screen projection module.
S20, the second screen projection module calls a decoder to decode the image data.
The second screen projection module can call the decoder through the video decoding capability in the second basic capability module to decode the image data. Optionally, the second transmission module may send the encoded image data to a decoding logic module in the second projection module, and the decoding logic module invokes a decoder to decode the image data.
S21, the second screen projection module sends the decoded image data to a display driver, and then the display screen is used for displaying images.
Alternatively, the decoded image data may be sent to the display module by the decoding logic module, and then sent to the display driver by the display module.
After the screen is successfully cast, the second screen projection module can also send a message of successful screen casting to the second transmission module; the second transmission module then sends the message to the first screen projection module of the mobile phone, which forwards it to the mobile phone assistant APP. Optionally, the mobile phone assistant APP can also display a message of successful screen casting to prompt the user that screen casting is currently successful.
It should be noted that, the format of encoding by the encoder and the format of decoding by the decoder in the embodiments of the present application are not limited, as long as the encoding/decoding and displaying process can be implemented. For example, the NV12 format image data may be encoded into the H264 format image data.
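As background for the NV12 example: NV12 is a 4:2:0 format in which each pixel averages 12 bits (a full-resolution luma plane plus an interleaved half-size chroma plane), so the size of one raw frame handed to the encoder can be computed as below. This is general format knowledge, not a detail from the patent:

```python
def nv12_frame_size(width: int, height: int) -> int:
    """Bytes in one NV12 frame: a full-resolution Y plane followed by
    an interleaved UV plane with 4:2:0 chroma subsampling."""
    y_plane = width * height          # 1 byte of luma per pixel
    uv_plane = width * height // 2    # interleaved U/V at quarter resolution
    return y_plane + uv_plane

# A 1920x1080 NV12 frame occupies 3110400 bytes (about 3 MB),
# which illustrates why H.264 compression is needed before transmission.
```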
It should be further noted that, since the mobile phone determines the optimal coding frame rate according to the number of available coding blocks, that number may change; for example, during screen casting the mobile phone may perform other operations that occupy coding blocks. The mobile phone may therefore re-acquire the latest number of available coding blocks at fixed time intervals and calculate a new optimal coding frame rate from it. Alternatively, the mobile phone may monitor for change events in the number of available coding blocks; when the number changes, it acquires the latest number of available coding blocks and calculates a new optimal coding frame rate accordingly.
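The event-driven variant can be sketched as a small controller that recomputes the frame rate whenever the block count changes. The class and method names here are illustrative assumptions, not from the patent:

```python
class FrameRateController:
    """Recomputes the coding frame rate when the number of available
    coding blocks changes (the event-driven variant described above)."""

    def __init__(self, block_pixels: int, width: int, height: int):
        self.block_pixels = block_pixels  # pixels one block encodes per second
        self.width = width                # screen-casting resolution
        self.height = height
        self.frame_rate = 0

    def on_available_blocks_changed(self, available_blocks: int) -> int:
        """Event handler: derive the new optimal frame rate from the
        latest block count and remember it."""
        pixel_budget = available_blocks * self.block_pixels
        new_rate = pixel_budget // (self.width * self.height)
        if new_rate != self.frame_rate:
            self.frame_rate = new_rate  # a real implementation would reconfigure the encoder here
        return self.frame_rate
```

The periodic-polling variant differs only in that the handler is invoked from a timer with a freshly queried block count instead of from a change notification.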
In the above embodiment, when the mobile phone throws the screen of the display interface to the tablet computer, an optimal coding frame rate can be determined according to the number of available coding blocks in the current encoder and the resolution of the screen throwing data, so that the encoder encodes at the optimal coding frame rate, the available coding blocks are reasonably utilized, and the success rate of the encoding process is improved, thereby improving the success rate of screen throwing.
For the process of negotiating the screen resolution of the mobile phone assistant APP of the mobile phone and the tablet assistant APP of the tablet computer in S1, as shown in fig. 10, the process of performing data interaction between each module in the mobile phone and each module in the tablet computer may include:
S30, the user clicks the "connect immediately" control in the mobile phone assistant APP.
S31, the mobile phone assistant APP sends a message indicating searching of the screen throwing device to the first screen throwing module.
After receiving the "connect immediately" operation input by the user, the mobile phone assistant APP may send an IPC message to the first screen projection module. Optionally, the IPC message may carry a search instruction, which is used to instruct the first screen projection module to invoke the capability of the first Bluetooth chip to search for available screen-casting devices nearby.
S32, the first screen projection module invokes the capability of the first Bluetooth chip to search for available screen-casting devices nearby.
In this step, the first screen-throwing module invokes the capability of the first bluetooth chip, that is, searches for bluetooth signals of available screen-throwing devices by using the first bluetooth chip.
S33, the Bluetooth chip of the available screen throwing device receives a search signal of the mobile phone and sends the search signal to the second screen throwing module.
Fig. 10 shows only one available screen-casting device, the tablet computer, whose Bluetooth chip is the second Bluetooth chip; other available screen-casting devices work on a similar principle and are not shown in fig. 10.
S34, the second screen projection module of the tablet personal computer feeds back the basic information of the second screen projection module to the first screen projection module of the mobile phone.
Wherein the basic information of the device includes a device identifier, such as a device name, a media access control address (media access control address, MAC address), etc., and the second screen projection module may store the basic information of the device.
S35, the first screen projection module of the mobile phone sends the received basic information (such as the equipment name and the MAC address) of the screen projection equipment to the mobile phone assistant APP.
S36, the mobile phone assistant APP displays basic information of the available screen-throwing devices in an available device list.
S37, the user selects the device 2 (i.e. tablet computer) in the available device list.
S38, the mobile phone assistant APP receives the selected operation of the user and sends a message indicating connection establishment to the first screen projection module.
The message may also be an IPC message, configured to instruct the first screen-throwing module to invoke the first bluetooth chip to establish bluetooth connection (BLE connection) with the tablet computer; optionally, the IPC message may carry an identification of the tablet.
S39, the first screen projection module invokes the capability of the first Bluetooth chip to be connected with the second Bluetooth chip of the tablet personal computer in a Bluetooth mode.
S40, the first Bluetooth chip and the second Bluetooth chip establish Bluetooth connection.
S41, after the Bluetooth connection is established between the mobile phone and the tablet personal computer (namely the device 2), the first Bluetooth chip sends a message of successful establishment of the Bluetooth connection to the first screen projection module.
S42, the first screen projection module generates an SSID and a password.
S43, the first screen projection module calls the first Wi-Fi chip to establish a softAP.
S44, the first Wi-Fi chip creates a softAP.
The SoftAP corresponds to the SSID and the password created above.
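The patent does not specify how the SSID and password of S42 are generated; the following is a purely illustrative sketch of one plausible scheme (the "DIRECT-" prefix and the chosen lengths are assumptions, not from the patent):

```python
import secrets
import string

def generate_softap_credentials(prefix: str = "DIRECT-"):
    """Generate a random SSID and passphrase for the temporary softAP;
    the generation scheme here is purely illustrative."""
    alphabet = string.ascii_letters + string.digits
    ssid = prefix + "".join(secrets.choice(alphabet) for _ in range(8))
    password = "".join(secrets.choice(alphabet) for _ in range(12))  # >= 8 chars for WPA2
    return ssid, password
```

Whatever the actual scheme, the point of S45 is that these credentials travel over the already-secured Bluetooth channel, so the tablet can join the softAP without user-visible pairing.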
S45, the first screen projection module sends the SSID and the password to a second screen projection module of the tablet personal computer through the established Bluetooth connection channel.
S46, the second screen projection module calls the second Wi-Fi chip to establish a Wi-Fi P2P connection with the mobile phone according to the received SSID and password. Meanwhile, a socket data channel is also established between the mobile phone and the tablet computer so that the two can subsequently transmit data to each other. It should be noted that the connection types established between the mobile phone and the tablet computer are not limited to the Bluetooth connection and the Wi-Fi P2P connection; other connection modes may be used, as long as data transmission between the two can be realized.
S47, the first Wi-Fi chip sends a message of successful connection establishment to the first screen projection module, and the second Wi-Fi chip sends a message of successful connection establishment to the second screen projection module.
Alternatively, the process of establishing connection between the mobile phone and the tablet computer in S30 to S47 may be performed by interaction between the first connection discovery module in the first screen projection module and the second connection discovery module in the second screen projection module, which are not shown in fig. 8.
S48, the first screen projection module acquires the resolution of the display image supported by the mobile phone from the display card.
S49, the first screen projection module sends the resolution of the display image supported by the mobile phone to the second screen projection module of the tablet computer.
S50, the second screen projection module selects one resolution from the received resolutions as the screen projection resolution.
S51, the second screen projection module sends the screen projection resolution to the first screen projection module of the mobile phone.
For steps S48-S51, as one implementation, the first screen projection module of the mobile phone may send the resolutions of the display images supported by the mobile phone to the second screen projection module of the tablet computer, for example, the resolutions 1920×1080 and 2520×1680. After receiving the two resolutions, the tablet computer may select one according to its own resolution and return it to the first screen projection module of the mobile phone through the second screen projection module. For example, if the tablet computer supports the 2520×1680 resolution, it returns that resolution to the mobile phone; if none of the resolutions sent by the mobile phone is supported by the tablet computer, the tablet computer may select the resolution closest to its own. As another implementation, the tablet computer may actively send its own resolution to the first screen projection module of the mobile phone through the second screen projection module, for example, the second screen projection module directly sends the 2520×1680 resolution to the first screen projection module, which eliminates the selection process of S48-S51.
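The tablet's selection step can be sketched as follows. The patent only says the tablet picks a supported resolution or one "closer" to its own; the concrete rule below (exact match, else nearest pixel count) is an illustrative assumption:

```python
def select_projection_resolution(offered, native):
    """Pick the screen-casting resolution from the (width, height)
    pairs offered by the phone: an exact match with the tablet's
    native resolution if present, otherwise the offered resolution
    whose pixel count is closest to the native one."""
    if native in offered:
        return native
    native_pixels = native[0] * native[1]
    return min(offered, key=lambda r: abs(r[0] * r[1] - native_pixels))

# e.g. select_projection_resolution([(1920, 1080), (2520, 1680)], (2520, 1680))
```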
S52, the first screen projection module of the mobile phone sends the screen projection resolution (e.g., 2520×1680) to the mobile phone assistant APP, and the second screen projection module of the tablet computer sends the screen projection resolution to the tablet computer's assistant APP.
Thus, the mobile phone and the tablet computer complete the negotiation of the screen projection resolution.
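The resolution negotiation in S48-S51 can be sketched as follows. This is an illustrative example only: the function name `select_projection_resolution` and the use of pixel-count distance for the fallback case are assumptions, not an API or rule defined in the present application.

```python
# Hypothetical sketch of the S48-S51 negotiation: the phone offers its
# supported resolutions, the tablet picks one and returns it.

def select_projection_resolution(offered, supported):
    """Pick a projection resolution from the resolutions offered by the
    source device (the mobile phone).

    If an offered resolution is also supported by the sink device (the
    tablet), use it directly; otherwise fall back to the offered resolution
    whose pixel count is closest to one the sink supports.
    """
    # Direct match: the tablet supports one of the offered resolutions.
    for res in offered:
        if res in supported:
            return res
    # No exact match: pick the offered resolution closest in pixel count
    # to the sink's largest supported resolution.
    target = max(supported, key=lambda r: r[0] * r[1])
    return min(offered, key=lambda r: abs(r[0] * r[1] - target[0] * target[1]))

offered = [(1920, 1080), (2520, 1680)]   # sent by the phone in S49
supported = [(2520, 1680)]               # the tablet's own resolution
print(select_projection_resolution(offered, supported))  # -> (2520, 1680)
```

In the direct-match case this reproduces the example in the text: the tablet supports 2520×1680, so that resolution is returned to the phone in S51.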
The screen projection method of the present application will now be described in more detail with reference to the following embodiments. These embodiments may be combined with the embodiments above, and concepts or processes that are the same or similar will not be described again.
Fig. 11 is a schematic flow chart of a screen projection method according to an embodiment of the present application; the method may be applied to the mobile phone and tablet computer shown in fig. 8. As shown in fig. 11, the flow of the screen projection method provided in this embodiment may include:
S101, the mobile phone receives a screen projection operation input by a user.
The screen projection operation may be input through the mobile phone assistant APP, for example by clicking the "immediate connection" control in fig. 3 or the "wireless screen projection" control in fig. 7. It should be noted that the embodiments of the present application take the name "mobile phone assistant APP" as an example, but other APPs with the same or similar functions, such as a "mobile phone screen projection APP" or a "device interconnection APP", are equally applicable. Likewise, the name "immediate connection" is only an example, and other names with the same or similar function, such as "one-touch screen projection" or "start screen projection", are equally applicable.
S102, in response to the screen projection operation, the mobile phone searches for and displays available screen projection devices.
After receiving the screen projection operation input by the user, the mobile phone assistant APP can invoke the Bluetooth chip to search for nearby available screen projection devices. In this embodiment, after an available screen projection device is found, the first screen projection module may obtain information such as the internet protocol (IP) address, MAC address, universally unique identifier (UUID), device identifier, and device name of the device. Optionally, the first screen projection module may send this information to the mobile phone assistant APP for display to the user.
In some embodiments, given the limited display size of the mobile phone screen, the mobile phone assistant APP may not be able to show all of the above information for each available screen projection device in the available-device list. In that case, only the device identifier (e.g., the device name) may be displayed. If the user wants to view other information about a screen projection device, the user can long-press or double-click the device name, and upon receiving that operation the mobile phone assistant APP displays the remaining information of that device.
In some embodiments, the available-device list displayed by the mobile phone assistant APP includes all nearby devices with Bluetooth enabled, such as tablet computers, mobile phones, or wearable devices. In other embodiments, after receiving the information of all devices with Bluetooth enabled, the mobile phone assistant APP may display only those devices that support screen projection; for example, if a wearable device does not support screen projection, the mobile phone assistant APP does not show it in the available-device list.
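The filtering described above can be sketched as a minimal example. The dictionary fields and the function name are hypothetical, since the embodiments do not specify a data structure for discovered devices.

```python
# Illustrative sketch of filtering discovered Bluetooth devices down to
# those that can act as a screen projection sink. The "supports_projection"
# field is an assumption for this sketch, not an API from the application.

def filter_projection_devices(discovered):
    """Keep only devices that support screen projection, e.g. drop
    wearables that cannot display a projected screen."""
    return [d for d in discovered if d.get("supports_projection", False)]

discovered = [
    {"name": "Tablet", "supports_projection": True},
    {"name": "Phone", "supports_projection": True},
    {"name": "Watch", "supports_projection": False},  # wearable, filtered out
]
for device in filter_projection_devices(discovered):
    print(device["name"])
```

With this filter, the available-device list shown to the user contains only the tablet and the phone, matching the behavior described for the wearable-device example above.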
S103, the mobile phone receives the user's selection of the tablet computer from the available screen projection devices.
The user can determine, from the available-device list displayed by the mobile phone assistant APP, the device onto which the mobile phone is to be projected. As shown in fig. 4, the available screen projection devices include device 1 and device 2; if the user wants to project the mobile phone onto device 2 (i.e., the tablet computer), the user can click the area where that device name is located, and the mobile phone assistant APP receives the user's selection of the tablet computer.
In some embodiments, the user may also select the tablet computer by voice input; for example, the user says "device 2" into the microphone of the mobile phone, and after receiving the sound signal, the microphone converts it into an electrical signal and sends it to the mobile phone assistant APP.
S104, in response to the selection, the mobile phone establishes a communication connection with the tablet computer and negotiates the screen projection resolution with the tablet computer.
S105, the mobile phone obtains the number of available coding blocks in its encoder and calculates the optimal coding frame rate according to the number of available coding blocks and the screen projection resolution.
The process of negotiating the screen projection resolution and the process of calculating the optimal coding frame rate are described in the above embodiments and are not repeated here.
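Following the relation H×L×M ≤ N×n×n used in the embodiments, the optimal coding frame rate is the largest integer M satisfying the relation. The sketch below assumes 16×16 macroblocks and an illustrative available-macroblock count; neither number is taken from the present application.

```python
# Sketch of the optimal-coding-frame-rate calculation: N available
# macroblocks, each able to encode n*n pixels within the preset duration,
# give an encodable-pixel budget of N*n*n; dividing by the projection
# resolution H*L yields the largest frame rate M with H*L*M <= N*n*n.

def optimal_frame_rate(available_macroblocks, macroblock_side, width, height):
    """Return the largest integer M such that
    width * height * M <= available_macroblocks * macroblock_side**2."""
    encodable_pixels = available_macroblocks * macroblock_side * macroblock_side
    return encodable_pixels // (width * height)

# Assumed numbers for illustration: one million 16x16 macroblocks available
# per preset duration, and the negotiated 2520x1680 projection resolution.
m = optimal_frame_rate(available_macroblocks=1_000_000,
                       macroblock_side=16, width=2520, height=1680)
print(m)  # prints 60
```

Because the integer division takes the floor, the result is the maximum integer satisfying the relation, consistent with using the encoder's capacity without exceeding it.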
S106, the mobile phone encodes the image data using the optimal coding frame rate and the screen projection resolution, and sends the encoded image data to the tablet computer, so that the tablet computer displays the image corresponding to the image data.
In some scenarios, if the user no longer needs to project the mobile phone onto the tablet computer, as shown in fig. 12, the user may click the "disconnect" control on the interface of the mobile phone assistant APP. After receiving the disconnection operation, the mobile phone assistant APP sends a disconnection message to the first screen projection module, and the first screen projection module invokes the Wi-Fi chip to disconnect the Wi-Fi P2P connection with the tablet computer.
As described above, the optimal coding frame rate is calculated by the mobile phone. In other embodiments, the optimal coding frame rate may instead be calculated by the tablet computer; that process may include the following steps:
A: The mobile phone receives the screen projection operation input by the user.
B: In response to the screen projection operation, the mobile phone searches for and displays available screen projection devices.
C: The mobile phone receives the user's selection of the tablet computer from the available screen projection devices.
D: In response to the selection, the mobile phone establishes a communication connection with the tablet computer and negotiates the screen projection resolution with the tablet computer.
E: The tablet computer obtains the number of available coding blocks in the encoder of the mobile phone and calculates the optimal coding frame rate according to the number of available coding blocks and the screen projection resolution.
F: The tablet computer sends the optimal coding frame rate to the mobile phone.
G: The mobile phone encodes the image data of its display interface using the optimal coding frame rate and the screen projection resolution, and sends the encoded image data to the tablet computer, so that the tablet computer displays the corresponding image.
Compared with this embodiment, in the former embodiment the mobile phone determines the optimal coding frame rate itself and can encode with it directly, without the tablet computer having to determine and transmit it, which reduces transmission power consumption.
In the above screen projection method, when the electronic device projects its display interface onto the screen projection device, it can determine an optimal coding frame rate according to the number of available coding blocks currently in the encoder and the resolution of the screen projection data. The encoder can thus encode at the optimal coding frame rate, the available coding blocks are used reasonably, the success rate of the encoding process is improved, and the screen projection success rate is improved accordingly.
Examples of the screen projection method provided by the embodiment of the application are described in detail above. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
The embodiment of the present application may divide the functional modules of the electronic device according to the above method examples, for example, may divide each function into each functional module corresponding to each function, for example, a detection unit, a processing unit, a display unit, or the like, or may integrate two or more functions into one module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
It should be noted that, all relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein.
The electronic device provided in this embodiment is configured to execute the screen projection method, so that the same effect as that of the implementation method can be achieved.
In the case where an integrated unit is employed, the electronic device may further include a processing module, a storage module, and a communication module. The processing module may be used to control and manage the actions of the electronic device. The storage module may be used to support the electronic device in storing program code, data, and the like. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that performs computing functions, for example, a combination including one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic equipment.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 1.
The embodiment of the application also provides a computer readable storage medium, in which a computer program is stored, which when executed by a processor, causes the processor to execute the screen projection method of any of the above embodiments.
The present application also provides a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the screen projection method in the above-mentioned embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer-executed instructions, and when the device is operated, the processor can execute the computer-executed instructions stored in the memory, so that the chip executes the screen projection method in each method embodiment.
The electronic device, the computer readable storage medium, the computer program product or the chip provided in this embodiment are used to execute the corresponding method provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding method provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A screen projection method, performed by a first electronic device, the method comprising:
obtaining a target coding frame rate, wherein the target coding frame rate is determined according to the number of pixels which can be coded by an encoder in the first electronic equipment in a preset duration and the resolution adopted by the first electronic equipment when the image data to be projected is coded;
and encoding the image data to be projected by adopting the target encoding frame rate and the resolution to obtain encoded data, and transmitting the encoded data to a second electronic device.
2. The method of claim 1, wherein the target encoding frame rate is determined by the first electronic device based on the number of pixels the encoder can encode within a preset duration and the resolution.
3. The method according to claim 2, wherein the method further comprises:
and determining the number of the pixels which can be coded in the preset duration according to the number of the available macro blocks in the coder and the number of the pixels which can be coded in the preset duration of each available macro block.
4. A method according to claim 3, characterized in that the method further comprises:
according to the relation: and determining the target coding frame rate, wherein the H×L×M is less than or equal to N×n×n, N is the number of pixels which can be coded in a preset duration of the encoder, N is the number of available macro blocks in the encoder, n×n is the number of pixels which can be coded in the preset duration of each available macro block, H×L is the resolution, and M is the target coding frame rate.
5. The method of claim 4, wherein the target encoded frame rate is a maximum integer that satisfies the relationship.
6. The method of any one of claims 3 to 5, wherein prior to determining the number of pixels the encoder can encode within a preset time period, the method further comprises:
the number of available macroblocks in the encoder is obtained.
7. The method of claim 6, wherein the obtaining the number of available macroblocks in the encoder comprises:
according to the interface hnMediaCodec.getMediaMacroBlockInfo().getAvailableNum(), acquiring the number of available macroblocks in the encoder;
wherein hnMediaCodec is an object of the HnMediaCodecManager class.
8. The method of claim 1, wherein the target encoding frame rate is determined by the second electronic device according to the number of pixels that the encoder can encode within a preset duration and the resolution, and the obtaining the target encoding frame rate includes:
a target encoded frame rate is received from the second electronic device.
9. A screen projection method, performed by a second electronic device, the method comprising:
transmitting a target coding frame rate to a first electronic device, wherein the target coding frame rate is determined by the second electronic device according to the number of pixels which can be coded by an encoder in the first electronic device in a preset time period and the resolution adopted by the first electronic device when the first electronic device codes image data to be projected;
and receiving encoded data from the first electronic equipment, and displaying an image corresponding to the encoded data, wherein the encoded data is obtained by encoding the image data to be projected by the first electronic equipment by adopting the target encoding frame rate and the resolution.
10. A screen projection system comprising a first electronic device performing the method of any one of claims 1 to 8 and a second electronic device performing the method of claim 9.
11. An electronic device, comprising:
one or more processors;
one or more memories;
wherein the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the method of any one of claims 1 to 8 or the method of claim 9.
12. A computer readable storage medium, in which a computer program is stored which, when executed by a processor, causes the processor to perform the method of any one of claims 1 to 8 or to perform the method of claim 9.
CN202210901655.0A 2022-05-30 2022-07-28 Screen projection method and electronic equipment Active CN116033158B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210600336 2022-05-30
CN2022106003366 2022-05-30

Publications (2)

Publication Number Publication Date
CN116033158A true CN116033158A (en) 2023-04-28
CN116033158B CN116033158B (en) 2024-04-16

Family

ID=86090103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210901655.0A Active CN116033158B (en) 2022-05-30 2022-07-28 Screen projection method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116033158B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110392047A (en) * 2019-07-02 2019-10-29 华为技术有限公司 Data transmission method, device and equipment
CN110865782A (en) * 2019-09-29 2020-03-06 华为终端有限公司 Data transmission method, device and equipment
CN112558825A (en) * 2019-09-26 2021-03-26 华为技术有限公司 Information processing method and electronic equipment
CN113407142A (en) * 2021-07-13 2021-09-17 海信视像科技股份有限公司 Display device and screen projection method
CN113691850A (en) * 2021-08-25 2021-11-23 深圳康佳电子科技有限公司 Screen projection control method and device, intelligent terminal and computer readable storage medium
CN113722058A (en) * 2021-06-16 2021-11-30 荣耀终端有限公司 Resource calling method and electronic equipment
WO2022052773A1 (en) * 2020-09-10 2022-03-17 华为技术有限公司 Multi-window screen projection method and electronic device


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117294690A (en) * 2023-11-22 2023-12-26 荣耀终端有限公司 QoE evaluation method and electronic equipment
CN117294690B (en) * 2023-11-22 2024-04-12 荣耀终端有限公司 QoE evaluation method and electronic equipment

Also Published As

Publication number Publication date
CN116033158B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
CN112291764B (en) Content connection system
WO2022052773A1 (en) Multi-window screen projection method and electronic device
WO2022257977A1 (en) Screen projection method for electronic device, and electronic device
KR20150082940A (en) Apparatas and method for controlling a rotation of screen in an electronic device
WO2022083465A1 (en) Electronic device screen projection method, medium thereof, and electronic device
WO2022100304A1 (en) Method and apparatus for transferring application content across devices, and electronic device
US20240073978A1 (en) Method for monitoring link and terminal device
WO2022222924A1 (en) Method for adjusting screen projection display parameters
CN116033158B (en) Screen projection method and electronic equipment
CN110825402B (en) Method and device for downloading data packet
CN115119048B (en) Video stream processing method and electronic equipment
CN116056053B (en) Screen projection method, electronic device, system and computer readable storage medium
CN114928898B (en) Method and device for establishing session based on WiFi direct connection
CN116033157B (en) Screen projection method and electronic equipment
CN114915996A (en) Communication exception handling method and related device
EP4283464A1 (en) Distributed device capability virtualization method, medium, and electronic device
CN117135729B (en) Multi-device cooperation method, system and terminal device
WO2022206600A1 (en) Screen projection method and system, and related apparatus
WO2023045392A1 (en) Cloud mobile phone implementation method and apparatus
CN116055715B (en) Scheduling method of coder and decoder and electronic equipment
CN116744106B (en) Control method of camera application and terminal equipment
CN115460445B (en) Screen projection method of electronic equipment and electronic equipment
WO2024109443A1 (en) Device connection method, device and system
CN116137720A (en) Method for reducing power consumption and electronic equipment
CN116939739A (en) Bluetooth gateway switching method, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant