CN117008860A - Application screen projection method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN117008860A (application number CN202310982461.2A)
- Authority
- CN
- China
- Prior art keywords
- screen
- display
- throwing
- target application
- application
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F3/1454 — Digital output to display device; cooperation and interconnection of the display device with other functional units, involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
- G06F3/04845 — Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/04883 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F9/544 — Buffers; Shared memory; Pipes
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an application screen projection method and device, belonging to the technical field of display control and intended to solve the problem of black borders displayed during screen projection. The method is applied to a screen projection sending end and includes the following steps: in response to a screen projection operation being performed on a target application displayed on the first physical screen of the sending end, starting the screen projection service; creating, through the screen projection service, a virtual screen corresponding to the first physical screen; registering a listener function that monitors the first display attribute of the target application; and sending the buffer data corresponding to the virtual screen, together with the current value of the first display attribute observed by the listener function, to the screen projection receiving end, so that the receiving end composes the frame according to the buffer data and the current value of the first display attribute and displays the interface of the target application. The method eliminates the black borders shown when the receiving end displays the interface of the target application and improves the screen projection display effect.
Description
Technical Field
The present application relates to the field of display control technologies, and in particular to an application screen projection method, an application screen projection device, an electronic device, and a computer-readable storage medium.
Background
As in-vehicle systems grow more complex, cross-device communication has followed, and with it more screen projection scenarios. Application screen projection is one such scenario: the layer of an application displayed on a first physical screen (e.g. the display of a first in-vehicle device) is projected onto a second physical screen (e.g. the display of another in-vehicle device, or another display of the first device). The application's display state on the first physical screen may be full-screen or non-full-screen; in the non-full-screen state, the content shown on the first physical screen also includes a navigation bar and/or a status bar, and in the prior art the projection result differs with the display state. For example, when the application is displayed non-full-screen, the presence of a status bar and/or navigation bar in the displayed layer causes black borders on the second physical screen after projection, degrading its display effect.
In summary, prior-art application screen projection methods need improvement.
Disclosure of Invention
The embodiments of the present application provide an application screen projection method, an application screen projection device, and an electronic device, which can eliminate the black borders shown on a second physical screen and improve the screen projection display effect when an application in a non-full-screen state is projected onto the second physical screen.
In a first aspect, an embodiment of the present application discloses an application screen projection method, applied to a screen projection sending end, the method comprising:
in response to a screen projection operation being performed on a target application displayed on a first physical screen of the sending end, starting a screen projection service;
creating, through the screen projection service, a virtual screen corresponding to the first physical screen;
registering a listener function that monitors a first display attribute of the target application; and
sending buffer data corresponding to the virtual screen, together with the current value of the first display attribute observed by the listener function, to a screen projection receiving end, so that the receiving end composes the frame according to the buffer data and the current value of the first display attribute and displays the interface of the target application;
wherein the first display attribute includes the display size and display position of the target application on the first physical screen, and the buffer data is the display data of the projection layer in which the interface of the target application resides.
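As a rough illustration of the sender-side steps above, the following plain-Java sketch models the listener registration and the forwarding of the first display attribute to the receiver. All class and method names here are invented for illustration and are not the patent's actual API; the Android-specific plumbing (virtual display creation, buffer transport over the soft bus) is omitted.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal model of the sender-side flow: register a listener for the first
// display attribute and forward its current value toward the receiving end.
public class CastSenderSketch {
    // The first display attribute: position and size of the target
    // application's window on the first physical screen.
    static class DisplayAttribute {
        final int x, y, width, height;
        DisplayAttribute(int x, int y, int w, int h) {
            this.x = x; this.y = y; this.width = w; this.height = h;
        }
    }

    static class CastService {
        private final List<Consumer<DisplayAttribute>> listeners = new ArrayList<>();

        // Step: register a listener that monitors the first display attribute.
        void registerListener(Consumer<DisplayAttribute> l) { listeners.add(l); }

        // Invoked whenever the target window's size or position changes.
        void onAttributeChanged(DisplayAttribute attr) {
            for (Consumer<DisplayAttribute> l : listeners) l.accept(attr);
        }
    }

    public static void main(String[] args) {
        CastService service = new CastService();
        // Step: the listener forwards the current value to the receiving end
        // alongside the virtual screen's buffered frame (transport omitted).
        final DisplayAttribute[] receiverSide = new DisplayAttribute[1];
        service.registerListener(attr -> receiverSide[0] = attr);

        // Navigation bar appears: window shrinks from a full-screen 1920x720.
        service.onAttributeChanged(new DisplayAttribute(0, 0, 1920, 640));
        System.out.println(receiverSide[0].width + "x" + receiverSide[0].height);
    }
}
```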
In a second aspect, an embodiment of the present application discloses an application screen projection device, applied to a screen projection sending end, the device comprising:
a screen projection service starting module, configured to start a screen projection service in response to a screen projection operation being performed on a target application displayed on a first physical screen of the sending end;
a virtual screen creation module, configured to create, through the screen projection service, a virtual screen corresponding to the first physical screen;
a listener registration module, configured to register a listener function that monitors a first display attribute of the target application; and
a projection data sending module, configured to send buffer data corresponding to the virtual screen, together with the current value of the first display attribute observed by the listener function, to a screen projection receiving end, so that the receiving end composes the frame according to the buffer data and the current value of the first display attribute and displays the interface of the target application;
wherein the first display attribute includes the display size and display position of the target application on the first physical screen, and the buffer data is the display data of the projection layer in which the interface of the target application resides.
In a third aspect, an embodiment of the present application discloses an application screen projection method, applied to a screen projection receiving end, the method comprising:
in response to receiving, from the screen projection sending end, the buffer data corresponding to the virtual screen and the current value of the first display attribute of the target application, extracting the buffer data corresponding to the interface of the target application as data to be displayed; and
displaying the interface of the target application on a second physical screen according to the first display attribute and the data to be displayed;
wherein the first display attribute includes the display size and display position of the target application on the first physical screen, and the buffer data is the display data of the projection layer in which the interface of the target application resides.
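The receiver-side composition step can be illustrated with a small plain-Java sketch: the full virtual-screen frame arrives together with the first display attribute, and only the region the attribute describes is kept for display. The frame is modeled as a 2D int array; all names and dimensions are hypothetical.

```java
// Sketch of the receiver-side composition: crop the incoming virtual-screen
// frame to the rectangle given by the first display attribute before display.
public class ReceiverCompose {
    static int[][] crop(int[][] frame, int x, int y, int w, int h) {
        int[][] out = new int[h][w];
        for (int r = 0; r < h; r++)
            for (int c = 0; c < w; c++)
                out[r][c] = frame[y + r][x + c];
        return out;
    }

    public static void main(String[] args) {
        // 4x6 virtual-screen frame; row 0 is a status bar (black = 0),
        // the app layer (pixel value 7) occupies the region below it.
        int[][] frame = new int[4][6];
        for (int r = 1; r < 4; r++)
            for (int c = 0; c < 6; c++) frame[r][c] = 7;

        // First display attribute: position (0, 1), size 6x3.
        int[][] shown = crop(frame, 0, 1, 6, 3);
        System.out.println(shown.length + "x" + shown[0].length
                + " all=" + (shown[0][0] == 7));
    }
}
```

The black status-bar row never reaches the second physical screen, which is the effect the method aims for.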
In a fourth aspect, an embodiment of the present application discloses an application screen projection device, applied to a screen projection receiving end, the device comprising:
a to-be-displayed data acquisition module, configured to, in response to receiving, from the screen projection sending end, the buffer data corresponding to the virtual screen and the current value of the first display attribute of the target application, extract the buffer data corresponding to the interface of the target application as data to be displayed; and
a projection display module, configured to display the interface of the target application on a second physical screen according to the first display attribute and the data to be displayed;
wherein the first display attribute includes the display size and display position of the target application on the first physical screen, and the buffer data is the display data of the projection layer in which the interface of the target application resides.
In a fifth aspect, an embodiment of the present application further discloses an electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the application screen projection method disclosed in the embodiments of the present application.
In a sixth aspect, an embodiment of the present application discloses a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the application screen projection method disclosed in the embodiments of the present application.
The application screen projection method disclosed in the embodiments of the present application is applied to a screen projection sending end: in response to a screen projection operation being performed on a target application displayed on the first physical screen of the sending end, a screen projection service is started; a virtual screen corresponding to the first physical screen is created through the screen projection service; a listener function that monitors the first display attribute of the target application is registered; and the buffer data corresponding to the virtual screen, together with the current value of the first display attribute observed by the listener function, is sent to a screen projection receiving end, so that the receiving end composes the frame according to the buffer data and the current value of the first display attribute and displays the interface of the target application. The method eliminates the black borders shown when the receiving end displays the interface of the target application and improves the screen projection display effect.
The foregoing is merely an overview of the technical solution of the present application. In order that the technical means of the application may be more clearly understood and implemented in accordance with the content of the specification, and that the above and other objects, features and advantages of the application may be more readily apparent, specific embodiments are set forth below.
Drawings
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
FIG. 1 is one of the flowcharts of the application screen projection method disclosed in an embodiment of the present application;
FIG. 2 is one of the schematic diagrams of an application display state in the application screen projection method disclosed in an embodiment of the present application;
FIG. 3 is a second schematic diagram of an application display state in the application screen projection method disclosed in an embodiment of the present application;
FIG. 4 is a third schematic diagram of an application display state in the application screen projection method disclosed in an embodiment of the present application;
FIG. 5 is a fourth schematic diagram of an application display state in the application screen projection method disclosed in an embodiment of the present application;
FIG. 6 is a second flowchart of the application screen projection method disclosed in an embodiment of the present application;
FIG. 7 is a schematic diagram of reverse control in the application screen projection method disclosed in an embodiment of the present application;
FIG. 8 is a third flowchart of the application screen projection method disclosed in an embodiment of the present application;
FIG. 9 is one of the schematic diagrams of the application screen projection device disclosed in an embodiment of the present application;
FIG. 10 is a second schematic diagram of the application screen projection device disclosed in an embodiment of the present application;
FIG. 11 schematically shows a block diagram of an electronic device for performing the method according to the application; and
FIG. 12 schematically shows a memory unit for holding or carrying program code for implementing the method according to the application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The application screen projection method disclosed in the embodiments of the present application requires the support of: a screen projection sending end, a screen projection receiving end, a distributed soft bus (distributed framework softbus, abbreviated "DFS"), and the screen projection service. In some embodiments of the present application, the sending end and the receiving end are different in-vehicle component modules, each with its own display screen. In the embodiments of the present application, the physical screen of the sending end is denoted the first physical screen and the physical screen of the receiving end the second physical screen. The distributed soft bus DFS transmits display content and reverse-control data between the sending end and the receiving end. The screen projection service processes the display content and reverse-control data and controls the screen projection process.
To help the reader understand the effect of the screen projection method disclosed in the embodiments of the present application, the cause of the black borders displayed in prior-art projection scenarios, as identified by the inventors, is analyzed first.
In the application screen projection process on the Android system, when the user initiates projection through a gesture or another operation on an application displayed on the first physical screen (denoted the target application in the embodiments of the present application), a related service built into the system layer of the sending end receives a screen projection instruction (denoted the first instruction herein) and performs the projection operation according to it. During that operation, the service passes display attribute information such as the position and size of the projection layer to the data consumer SurfaceFlinger. SurfaceFlinger is the Android system service that composes Surfaces, referred to in the embodiments of the present application as the interface composition service; in Android, a Surface represents a drawable image buffer.
The Android graphics architecture uses a producer-consumer model. A Surface can act as the producer side of a buffer queue, and the most common consumer of image streams is the SurfaceFlinger service, which receives data buffers (e.g., multiple Surfaces) from multiple sources, combines them, and sends them to the display device for display.
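The producer-consumer arrangement described above can be modeled in a few lines of plain Java; this toy queue stands in for Android's real BufferQueue/SurfaceFlinger machinery and is illustrative only.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model of the producer-consumer graphics pipeline: each Surface queues
// frames into its own buffer queue, and a compositor (standing in for
// SurfaceFlinger) acquires one buffer from each source and combines them.
public class BufferQueueModel {
    static class BufferQueue {
        private final Deque<String> buffers = new ArrayDeque<>();
        void queueBuffer(String frame) { buffers.addLast(frame); }   // producer side
        String acquireBuffer() { return buffers.pollFirst(); }       // consumer side
    }

    public static void main(String[] args) {
        BufferQueue appSurface = new BufferQueue();
        BufferQueue statusBarSurface = new BufferQueue();
        appSurface.queueBuffer("app-frame");
        statusBarSurface.queueBuffer("statusbar-frame");

        // The compositor consumes one buffer per source and composes them.
        String composed = statusBarSurface.acquireBuffer()
                + "+" + appSurface.acquireBuffer();
        System.out.println(composed);
    }
}
```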
When an Android application is created, the layer of the target application starts at screen coordinate (0, 0), and the application's display memory likewise starts at the (0, 0) coordinate of the screen buffer. When a virtual screen is created, it is created with the same size as the first physical screen. If the target application is in the full-screen display state shown in FIG. 2, the application layer fills the virtual screen. If it is in one of the non-full-screen states shown in FIGS. 3, 4 and 5, that is, the first screen displays not only the application layer but also the content of a status bar and/or navigation bar, then the application layer is smaller than the first physical screen and does not fill the virtual screen, so the target application shown on the second physical screen exhibits black borders after projection. The cause of the black borders is that the regions of the virtual screen occupied by the status bar and navigation bar default to black, and the SurfaceFlinger service includes the full virtual-screen content in its final composition; this is a shortcoming of the native Android system in screen projection scenarios.
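The black-border arithmetic implied above can be made concrete with a small sketch: since the virtual screen matches the physical screen's size, any area the application layer does not cover defaults to black and survives composition. Screen and bar dimensions below are example values, not taken from the patent.

```java
// Sketch of why black borders appear: the virtual screen is created with the
// same size as the physical screen, so any area not covered by the app layer
// (status bar / navigation bar regions) stays at the black default and is
// included when the full virtual screen is composed.
public class BlackBorderSketch {
    static int blackPixels(int screenW, int screenH, int appW, int appH) {
        return screenW * screenH - appW * appH; // uncovered area defaults to black
    }

    public static void main(String[] args) {
        int full = blackPixels(1920, 720, 1920, 720);     // full-screen app: no border
        int withBars = blackPixels(1920, 720, 1920, 600); // status + nav bars shown
        System.out.println(full + " " + withBars);
    }
}
```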
To solve these problems, an embodiment of the present application discloses a screen projection method involving logic from the system layer down to the native layer. For example, when an application is opened, an attribute is set that records the size of the application at opening and its display position on the first physical screen. Because that size and position can change as the status bar and navigation bar change, the embodiments of the present application design a mechanism by which changes to the target application's display size and position are promptly reported to the native layer. Once the native layer obtains this information, it passes the display size, display position and related information to the SurfaceFlinger service, which can then use the correct size during composition, ensuring that the target application is displayed correctly at the receiving end regardless of the application's display state.
The following describes a specific implementation of the screen projection method disclosed in the embodiment of the present application with reference to a specific example.
The application screen projection method disclosed by the embodiment of the application is applied to a screen projection transmitting end, and as shown in fig. 1, the method comprises the following steps: steps 110 to 140.
Step 110: in response to a screen projection operation being performed on the target application displayed on the first physical screen of the sending end, start the screen projection service.
After the vehicle is started, a user operation can open an application installed in the head unit from the desktop; a window corresponding to the application is created on the first physical screen, and the application's interface is displayed in that window. The in-vehicle system can then obtain information such as the application identifier, display position and display size of the current application.
The user may project an application displayed on the first physical screen (denoted the "target application" in the embodiments of the present application) onto a designated device (denoted the "screen projection receiving end"). In the current projection scenario, the in-vehicle terminal that originally displays the target application is denoted the "screen projection sending end". Optionally, the receiving end may be another in-vehicle component module with a display function, or an independent device with a data connection to the vehicle.
After detecting the screen projection operation performed on the target application displayed on the first physical screen, the sending end starts the screen projection service.
In the embodiments of the present application, display content and reverse-control data are transmitted between the sending end and the receiving end over the distributed soft bus DFS. Accordingly, before the screen projection service is started, it must be determined whether the distributed soft bus is connected.
In some embodiments of the present application, starting the screen projection service in response to the screen projection operation on the target application displayed on the first physical screen further includes: in response to the screen projection operation, detecting whether the distributed soft bus is connected; starting the screen projection service if it is connected; and ending the projection if it is not.
Step 120: create, through the screen projection service, a virtual screen corresponding to the first physical screen.
The virtual screen is used to hold, invisibly to the user, the interface data of the layer in which the target application is displayed on the first physical screen.
In the prior art, after the screen projection service is started, it creates at the sending end a virtual screen corresponding to the first physical screen. During projection, the target application sends the interface data to be displayed both to the first physical screen of the sending end and to the virtual screen, whose size matches that of the first physical screen. The user sees the application interface on the first physical screen, while the virtual screen is never shown to the user. The screen projection service records the content of the virtual screen and transmits the recorded video stream to the receiving end, which displays it on the second physical screen, thereby realizing application screen projection.
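A minimal plain-Java model of this mirroring step, assuming the virtual screen is simply a second buffer of the same size that receives every frame the physical screen receives (the real Android virtual-display plumbing is far more involved, and the names below are invented):

```java
import java.util.Arrays;

// Model of the mirroring step: the target application writes each frame both
// to the physical screen's buffer and to the same-sized virtual screen's
// buffer, which the projection service then records and streams.
public class VirtualScreenMirror {
    final int[] physical;
    final int[] virtualScreen;

    VirtualScreenMirror(int pixels) {
        physical = new int[pixels];
        virtualScreen = new int[pixels]; // created with the physical screen's size
    }

    void postFrame(int[] frame) {
        System.arraycopy(frame, 0, physical, 0, frame.length);      // visible to user
        System.arraycopy(frame, 0, virtualScreen, 0, frame.length); // recorded for casting
    }

    public static void main(String[] args) {
        VirtualScreenMirror m = new VirtualScreenMirror(8);
        m.postFrame(new int[]{1, 2, 3, 4, 5, 6, 7, 8});
        System.out.println(Arrays.equals(m.physical, m.virtualScreen));
    }
}
```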
Optionally, the Android system creates the virtual screen as a buffer: a buffer is allocated and used as the virtual screen to hold the display data of the projection layer in which the interface of the target application resides (i.e. the layer displaying the target application on the first physical screen). The target application sending display data to the virtual screen is thus the target application sending display data to the buffer corresponding to the virtual screen.
For the specific implementation of creating a virtual screen, reference is made to the prior art; the embodiments of the present application do not describe it in detail.
Step 130: register a listener function that monitors the first display attribute of the target application.
The first display attribute includes the display size and display position of the target application on the first physical screen.
The actual size of each application's window changes as the status bar and navigation bar change. For example, when an application is displayed full screen, its window (i.e. the screen area displaying the interface of the target application) matches the size of the first physical screen and occupies all of it, as shown in FIG. 2. When the navigation bar must be shown on the first physical screen, the application is displayed non-full-screen: its window is smaller than the first physical screen and is shown at a first designated position and size, as shown in FIG. 3. When the status bar must be shown, the application is displayed non-full-screen and its window is shown at a second designated position and size, as shown in FIG. 4. When the navigation bar and status bar must both be shown, the application is displayed non-full-screen and its window is shown at a third designated position and size, as shown in FIG. 5.
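The four states in FIGS. 2-5 reduce to simple rectangle arithmetic; the sketch below computes the window's vertical position and height from which bars are shown. Bar heights and the screen height are invented example values, not from the patent.

```java
// Sketch of the four display states: the app window's position and size
// depend on whether the status bar (top) and/or navigation bar (bottom)
// are shown on the first physical screen.
public class WindowRectSketch {
    static int[] appRect(int screenH, int statusH, int navH,
                         boolean statusShown, boolean navShown) {
        int top = statusShown ? statusH : 0;
        int bottom = navShown ? screenH - navH : screenH;
        return new int[]{top, bottom - top}; // {y position, window height}
    }

    public static void main(String[] args) {
        int H = 720, status = 40, nav = 80;          // example dimensions
        int[] full = appRect(H, status, nav, false, false);   // FIG. 2
        int[] navOnly = appRect(H, status, nav, false, true); // FIG. 3
        int[] both = appRect(H, status, nav, true, true);     // FIG. 5
        System.out.println(full[1] + " " + navOnly[1] + " "
                + both[0] + "," + both[1]);
    }
}
```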
As described above, when the target application being projected is not displayed full screen in the first physical screen and the screen projection method of the prior art is adopted, the areas of the virtual screen corresponding to the status bar or the navigation bar are displayed as black after projection onto the second physical screen at the screen-throwing receiving end.
In the embodiment of the application, in order to eliminate the black edge in the above situation, it is necessary to acquire the actual display size and position of the target application in the first physical screen, and to set the display position and size of the target application in the second physical screen accordingly, thereby eliminating the display of the black edge.
Optionally, as shown in fig. 6, before starting the screen-throwing service in response to the screen-throwing operation being executed on the target application displayed in the first physical screen of the screen-throwing sending end, the method further includes: step 100 and step 102.
Step 100, in response to starting to display the target application on the first physical screen, defining an object for storing a first display attribute of the target application in the local system framework of the screen-casting sending end;
step 102, based on a preset display state change event, synchronously updating the current value of the first display attribute of the target application stored in the object.
When the screen-throwing sending end starts displaying the target application, an object can be defined in the native layer for storing the first display attribute of the target application, such as the display size and position of the target application in the first physical screen.
Further, when the display position or the display size of the target application in the first physical screen changes, the screen projection transmitting end can calculate the real-time display size and the display position of the interface of the target application in real time, and update the calculated display size and display position into the object. For example, the first display attribute may be updated by a system attribute setting method of the android system.
Optionally, the screen-throwing sending end may monitor changes in the display size or display position of the target application by registering and listening for preset display state change events. The preset display state change events include, but are not limited to, any of the following: the status bar being hidden or called up, the navigation bar being hidden or called up, full screen display, and the like. Taking a first physical screen of width width and height height as an example, the display area of the first physical screen is (0, 0, width, height). If the navigation bar exists when the target application is opened, the actual display area of the target application is (0, 0, width, height - navigation bar height); if both the navigation bar and the status bar are hidden, the actual display area of the target application is (0, 0, width, height).
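The area arithmetic above can be sketched as a small stand-alone helper. The `Rect` type, parameter names, and bar heights below are illustrative assumptions, not Android framework types:

```cpp
#include <cassert>

// Illustrative sketch of computing the actual display area of the target
// application on the first physical screen, following the formulas above.
// Not an Android framework type; a plain rectangle (left, top, right, bottom).
struct Rect {
    int left, top, right, bottom;
};

Rect computeAppRect(int width, int height,
                    bool statusBarVisible, int statusBarHeight,
                    bool navBarVisible, int navBarHeight) {
    Rect r{0, 0, width, height};                       // full-screen case
    if (statusBarVisible) r.top += statusBarHeight;    // status bar occupies the top
    if (navBarVisible)    r.bottom -= navBarHeight;    // navigation bar occupies the bottom
    return r;
}
```

With a hypothetical 1080 x 2400 screen and a 126-pixel navigation bar, the result is (0, 0, 1080, 2274), matching the "height - navigation bar height" formula; with both bars hidden it is the full (0, 0, width, height).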
Optionally, the registering of a monitoring function for monitoring the first display attribute of the target application includes: creating a function callback, wherein the function callback is executed when the first display attribute of the target application changes, so as to acquire the current value of the first display attribute through the object. For example, the screen-throwing service may create a function callback at initialization, e.g., named add_sysprop_change_callback.
This function callback is executed when the first display attribute changes. When the display size or display position of the target application changes, the function callback is executed; in the callback function, the current value of the first display attribute stored in the object is read and notified to the native layer (namely, the local system framework), which further notifies the screen-throwing service.
After the native layer obtains the first display attribute of the target application, the SurfaceFlinger service may obtain the first display attribute of the target application through the native layer.
Optionally, the method further comprises: in response to a change in the display state of the navigation bar and/or status bar on the first physical screen, the local framework recalculates the display size and display position of the target application on the first physical screen; the recalculated display size and display position are then used to update the first display attribute of the target application stored in the object, triggering execution of the function callback so as to acquire the current value of the first display attribute through the object.
For example, when the target application is switched from non-full-screen display to full-screen display on the first physical screen, the navigation bar and the status bar are hidden, and the local framework recalculates the display size and display position of the target application on the first physical screen. For another example, when the navigation bar or status bar displayed together with the target application on the first physical screen is hidden, or when the target application is in full-screen display and the navigation bar or status bar is called up from the hidden state, the local framework is likewise triggered to recalculate the display size and display position of the target application on the first physical screen. After the local framework recalculates the current display size and display position of the target application, the display size and display position stored in the object created for the first display attribute are updated. This update of the object's value further triggers execution of the preset function callback. During execution, the function callback acquires the current value of the first display attribute stored in the object, so that the screen-throwing service can obtain the current value of the first display attribute in time.
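The update-and-notify pattern described above (an object holding the first display attribute whose update triggers registered callbacks) can be modeled in a few lines. This is a stand-alone sketch of the pattern only, not the android system's add_sysprop_change_callback API:

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Display size and position of the target application on the first physical
// screen (the "first display attribute"). Field names are illustrative.
struct DisplayAttr { int x = 0, y = 0, w = 0, h = 0; };

// Object storing the first display attribute: updating it executes every
// registered callback, so the screen-throwing service sees the new value.
class AttrStore {
public:
    void addChangeCallback(std::function<void(const DisplayAttr&)> cb) {
        callbacks_.push_back(std::move(cb));
    }
    // Called after the local framework recalculates size and position.
    void update(const DisplayAttr& attr) {
        attr_ = attr;
        for (auto& cb : callbacks_) cb(attr_);   // trigger the function callbacks
    }
    const DisplayAttr& current() const { return attr_; }
private:
    DisplayAttr attr_;
    std::vector<std::function<void(const DisplayAttr&)>> callbacks_;
};
```

A listener registered once at initialization then receives every subsequent change of the display size or position without polling.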
And step 140, sending the cache data corresponding to the virtual screen and the current value of the first display attribute monitored by the monitoring function to the screen-throwing receiving end, so that the screen-throwing receiving end performs picture synthesis according to the cache data and the current value of the first display attribute and displays the interface of the target application at the screen-throwing receiving end.
As described above, the first display attribute includes: the display size and the display position of the target application on the first physical screen; the cache data is the display data of the screen projection layer where the interface of the target application is located.
The screen-throwing service then records the virtual screen, and sends the cached data corresponding to the virtual screen and the most recently monitored first display attribute of the target application to the screen-throwing receiving end for display on the second physical screen.
Optionally, the sending, to the screen-casting receiving end, the cached data corresponding to the virtual screen and the current value of the first display attribute monitored by the monitoring function includes: and sending the cache data corresponding to the virtual screen and the current value of the first display attribute monitored by the monitoring function to a screen throwing receiving end through a distributed soft bus.
In other embodiments of the present application, when the screen-throwing sending end and the screen-throwing receiving end are located in the same vehicle system, the screen-throwing sending end and the screen-throwing receiving end may share the same screen-throwing service, so that the screen-throwing sending end and the screen-throwing receiving end may transmit the current value of the first display attribute through the screen-throwing service, and transmit the buffered data through the distributed soft bus.
First, the screen-throwing service creates a buffer queue BufferQueue by calling createBufferQueue, registers the instance pointer mFrameListener for consumer callback notification, obtains the returned IGraphicBufferProducer object (an interface of the cross-process communication mechanism Binder), and packages the IGraphicBufferProducer object into a Surface object to be passed to the producer. The Surface object corresponds to the interface of the target application. In the application scenario disclosed in the embodiment of the present application, the producer may be understood as the target application, and the consumer as the screen-throwing service.
Here, a Surface object is an important class for displaying view contents in the android system. By creating an instance in an application program, it is responsible for managing the display of a rectangular area on the screen, and various contents such as graphics, video, and text can be rendered into this area. In actual use, the IGraphicBufferProducer object of the inter-process communication mechanism Binder is packaged into methods or interfaces called within the Surface object, and the Surface object can acquire the data to be displayed through the inter-process communication mechanism.
When the producer queues a buffer into the producer queue via queueBuffer, the consumer is notified through the consumer callback.
The consumer (namely, the screen-throwing service) asynchronously acquires a Buffer storing the data to be displayed, and then sends the data in the Buffer to the opposite terminal through the DFS. For example, the screen-throwing service asynchronously acquires a Buffer, reads the data in it, and sends the data to the opposite terminal through the DFS as follows:
a) Call CpuConsumer::lockNextBuffer() to obtain a Buffer that has already been pushed into the buffer queue (the instance pointer mFrameListener for consumer callback notification was registered earlier);
b) Call the encapsulated DFS interface sendBinaryData to send the data to the opposite terminal;
c) Call CpuConsumer::unlockBuffer() to release the Buffer.
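Steps a) to c) amount to a lock / send / release loop. The following stand-alone mock only mirrors that control flow; MockCpuConsumer and sendBinaryData are simplified stand-ins for the real Android CpuConsumer class and the encapsulated DFS interface, not their actual implementations:

```cpp
#include <cassert>
#include <cstddef>
#include <deque>
#include <optional>
#include <vector>

using Buffer = std::vector<unsigned char>;

// Simplified consumer model: buffers pushed by the producer are locked one
// at a time, forwarded, and then released.
class MockCpuConsumer {
public:
    void push(Buffer b) { queue_.push_back(std::move(b)); }   // producer side
    // step a): lock the next buffer already pushed into the queue
    std::optional<Buffer> lockNextBuffer() {
        if (queue_.empty()) return std::nullopt;
        Buffer b = std::move(queue_.front());
        queue_.pop_front();
        return b;
    }
    // step c): release the buffer (no-op in this mock)
    void unlockBuffer(const Buffer&) {}
private:
    std::deque<Buffer> queue_;
};

// step b): stand-in for the encapsulated DFS send interface; returns the
// number of bytes "sent" to the opposite terminal.
size_t sendBinaryData(const Buffer& b) { return b.size(); }
```

In the real flow the locked buffer holds the virtual-screen frame data, and release returns it to the BufferQueue for reuse by the producer.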
Next, the consumer obtains a Buffer. The consumer can perform the following steps:
e) Call the method chain CpuConsumer::lockNextBuffer() -> acquireBufferLocked() -> BufferQueueConsumer::acquireBuffer() to acquire the Buffer;
f) Lock the Buffer by calling CpuConsumer::lockBufferItem() -> GraphicBuffer::lockAsync().
Finally, the consumer releases the acquired Buffer, completing the transmission of the cached data in the cache corresponding to the current virtual screen.
The above methods and interfaces are interfaces and methods defined in the android system, and specific functions of each method or interface in the embodiments of the present application are not described in detail.
The specific implementation manner of sending the cache data to the opposite terminal by the screen-drop service based on the producer-consumer mode is referred to in the prior art, and will not be described in detail in the embodiment of the present application.
Similarly, the screen-throwing service can also send the first display attribute to the screen-throwing receiving end by calling the encapsulated DFS interface sendBinaryData.
After the screen-throwing receiving end receives the cached data (namely, the data stream corresponding to the display interface in the virtual screen) sent by the screen-throwing sending end, it displays the interface of the target application according to the first display attribute, without displaying a black area.
Optionally, the screen-throwing receiving end performing picture synthesis according to the cached data and the current value of the first display attribute, and displaying the interface of the target application at the screen-throwing receiving end, includes: the screen-throwing receiving end obtains, from the cached data and according to the current value of the first display attribute, the cached data corresponding to the interface of the target application as the data to be displayed; and the screen-throwing receiving end displays the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed.
As described above, when the display size or the display position of the target application changes, the screen-casting service obtains the first display attribute of the target application through the function callback created in advance. Meanwhile, the screen projection service can inform the screen projection receiving end of the acquired first display attribute. For example, the screen-throwing service may transmit the acquired first display attribute to the screen-throwing receiving end through the DFS, and then the screen-throwing service running on the screen-throwing receiving end acquires the first display attribute.
In a specific implementation process, in order to realize screen projection, the screen-throwing receiving end also needs to start a screen-throwing service, which calls the DFS interface to acquire the display data in the virtual screen and the first display attribute transmitted by the screen-throwing sending end. The cached data includes: the interface data of the target application, and the data of the status bar and/or navigation bar regions filled with black pixel data (i.e., the black border regions).
In the prior art, after receiving the cached data sent by the screen-throwing sending end, the screen-throwing receiving end invokes the SurfaceFlinger service to create a window of the target application on the second physical screen for displaying the interface of the target application, and the size of the created window and its corresponding cache are equal to the size of the second physical screen. Then, the screen-throwing service composes the data transmitted by the screen-throwing sending end (namely, the data containing the black area) starting at position (0, 0) in the window through the SurfaceFlinger service.
In the embodiment of the application, in order to eliminate the black edge displayed in the second physical screen, the screen projection service can determine the data corresponding to the interface of the target application in the cache data as the data to be displayed according to the display size and the display position in the received first display attribute of the target application, and the data is used for displaying on the second physical screen.
As described above, the cache size of the virtual screen is consistent with the size of the first physical screen, and the first display attribute gives the display size and display position of the target application on the first physical screen. Display positions on the first physical screen correspond one-to-one with display positions on the virtual screen, and cache positions in the cache corresponding to the virtual screen correspond one-to-one with display positions of the virtual screen; therefore, the cached data within the region of the designated position and size can be determined from these two correspondences.
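Because of these one-to-one correspondences, locating the target application's bytes in the virtual-screen cache is pure arithmetic. A sketch, assuming a tightly packed RGBA8888 cache (the pixel format and 4-bytes-per-pixel stride are assumptions; the patent does not specify a format):

```cpp
#include <cassert>
#include <cstddef>

// Byte region of the virtual-screen cache that holds the target
// application's interface, given its display position (x, y) and size
// (w, h) on the first physical screen. Illustrative helper, not an
// Android framework function.
struct CropRegion {
    size_t firstByte;     // offset of the first pixel of the interface
    size_t bytesPerRow;   // bytes to copy from each cached row
    size_t rows;          // number of rows to copy
};

CropRegion locateAppData(int screenWidth, int x, int y, int w, int h) {
    const int bytesPerPixel = 4;   // assumed RGBA8888
    CropRegion c;
    c.firstByte   = (static_cast<size_t>(y) * screenWidth + x) * bytesPerPixel;
    c.bytesPerRow = static_cast<size_t>(w) * bytesPerPixel;
    c.rows        = static_cast<size_t>(h);
    return c;
}
```

Copying `bytesPerRow` bytes from each of the `rows` rows (stepping by the full screen stride between rows) yields exactly the interface data, leaving the black-filled bar regions behind.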
Optionally, the screen-throwing receiving end displaying the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed includes: the screen-throwing receiving end modifies, according to the first display attribute, the display size used by the interface synthesis service SurfaceFlinger for composing the interface of the target application; and the screen-throwing receiving end displays the interface of the target application on the second physical screen based on the set display size and the data to be displayed.
The screen-throwing service may then modify the size of the cache corresponding to the interface created by the SurfaceFlinger service on the second physical screen according to the display size in the received first display attribute of the target application (i.e., the display size of the interface of the target application in the first physical screen). For example, it modifies the cache size and window size (i.e., display size) of the interface (e.g., Surface2) created for the target application by the SurfaceFlinger service on the second physical screen to be equal to the display size in the first display attribute, and sets the display position of that interface (e.g., Surface2) on the second physical screen to (0, 0). The screen-throwing service then composes the display content of the target application on the second physical screen according to the data in the resized cache and the display size, for rendering and display.
Specifically, taking as an example that the display size of both the first physical screen at the screen-throwing sending end and the second physical screen at the screen-throwing receiving end is X, and the size of the interface of the target application in the first physical screen is Y (X > Y): the data in the virtual screen includes filling data, that is, black pixel data, and the display position of the interface of size Y can be moved so as to avoid displaying black borders on the second physical screen. For example, the display position of the interface of size Y is moved to the (0, 0) position of the first physical screen.
In the specific implementation process, after screen projection is started, the screen-throwing service of the screen-throwing receiving end starts. After the screen-throwing service of the receiving end receives the data transmitted over the DFS, it determines the cached data corresponding to the interface of the target application by the method described above, takes it as the data to be displayed, and copies the data to be displayed to Surface2 for display. Surface2 corresponds to the interface of the target application created on the second physical screen by the screen-throwing application through the SurfaceFlinger service, and the cache size, display size, and display position of Surface2 are adjusted according to the interface of the target application.
During data copying and display, the screen-throwing service first applies for an ANativeWindow_Buffer object (the graphic buffer used internally by the local framework classes) through the mSurface->lock(&outBuffer) interface, then copies the data to be displayed into the ANativeWindow_Buffer, and then calls the mSurface->unlockAndPost() interface to post it for display.
Next, the Surface::lock() method is called to apply for a Buffer. The specific steps are as follows:
a) Call the Surface::connect(NATIVE_WINDOW_API_CPU) method to connect to the server;
b) Call the Surface::dequeueBuffer() method to apply for a Buffer;
c) Call the GraphicBuffer::lockAsync() interface to lock the Buffer.
Within Surface::dequeueBuffer(), applying for the Buffer proceeds as follows:
e) Apply for a Buffer through the IGraphicBufferProducer interface;
f) When the Buffer or its GraphicBuffer is empty, call IGraphicBufferProducer::requestBuffer() to acquire the GraphicBuffer from the buffer queue BufferQueue.
Finally, Surface::unlockAndPost() is called to push the Buffer into the BufferQueue, where it waits for composition and display. The specific steps are as follows:
g) Call GraphicBuffer::unlockAsync() to unlock the Buffer;
h) Push the Buffer into the buffer queue BufferQueue through the Surface::queueBuffer() interface, that is, call IGraphicBufferProducer::queueBuffer().
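The producer-side flow above (lock a free buffer, fill it, queue it for composition) can be mirrored by a minimal stand-alone mock. MockSurface only imitates the control flow of Surface::lock() and Surface::unlockAndPost(); it is not the real Android Surface class:

```cpp
#include <cassert>
#include <cstddef>
#include <deque>
#include <vector>

using Buffer = std::vector<unsigned char>;

// Toy producer model: lock() hands out a free buffer for the caller to fill
// (dequeueBuffer + lockAsync in the real flow); unlockAndPost() queues it
// back for composition (unlockAsync + queueBuffer).
class MockSurface {
public:
    explicit MockSurface(size_t bufSize) : free_(2, Buffer(bufSize)) {}
    Buffer* lock() {
        if (free_.empty()) return nullptr;   // no free buffer available
        locked_ = std::move(free_.front());
        free_.pop_front();
        return &locked_;
    }
    void unlockAndPost() {
        queued_.push_back(std::move(locked_));   // awaits composition/display
    }
    size_t queuedCount() const { return queued_.size(); }
private:
    std::deque<Buffer> free_;
    Buffer locked_;
    std::deque<Buffer> queued_;
};
```

The real BufferQueue additionally recycles consumed buffers back into the free list, which keeps the producer and consumer running concurrently without new allocations.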
The above methods and interfaces are interfaces and methods defined in the android system, and specific functions of each method or interface in the embodiments of the present application are not described in detail.
The process of copying and displaying the data in the Surface cache refers to the prior art, and the description of the embodiment of the present application is omitted.
In the android system, each opened application interface has a window, and the android system controls the display of application interfaces by calculating the visible area of each window (i.e., the screen area visible to the user, called visibleRegionScreen in the android system). In the android system, each interface is commonly referred to as a Surface. Each Surface has its own position, size, and other properties on the screen, and the content displayed within each Surface may change. The Surface records information such as the size, display position, and display content of the corresponding application program.
The SurfaceFlinger service is responsible for composing these Surfaces into a single main Surface, i.e., the composite interface, and the content of this single main Surface is finally sent for display. The SurfaceFlinger service executes the instructions of the WindowManager and performs window composition according to the information recorded in each Surface.
For example, when an application window changes state, the proxy object SurfaceComposerClient packages the state change information such as the window size, position, and display order of the application and sends it to the SurfaceFlinger service. After the SurfaceFlinger service records these state changes, it wakes up the waiting listener and sets a flag to tell the listening thread that the window state has changed and needs to be processed.
In the android system, a Surface is created by the SurfaceFlinger service for each window (such as each application interface). To manage multi-window interfaces, the SurfaceFlinger service creates a Client instance for each Surface (i.e., each application interface); after the control structure created by the Client is packaged, it is returned to the proxy object SurfaceComposerClient, and the Client is then assigned a Surface. The proxy object SurfaceComposerClient is the interface between the application and SurfaceFlinger.
The SurfaceFlinger service starts a listening thread at creation time, which is responsible for processing each window update. Therefore, each time the screen-throwing service writes target data into the Surface corresponding to the interface of the target application, or each time the size and position of the interface window of the target application change, the SurfaceFlinger service is triggered to compose the interface and send it for display, so as to update the interface of the target application on the second physical screen.
In some embodiments of the present application, after the sending, to the screen-casting receiving end, of the cached data corresponding to the virtual screen and the current value of the first display attribute monitored by the monitoring function, the method further includes: in response to execution of the function callback, acquiring the current value of the first display attribute through the object and sending the first display attribute to the screen-throwing receiving end. For example, when the status bar displayed on the first physical screen is switched from the called-up state to the hidden state, the display position and display size of the interface of the target application change; at this time, the function callback is executed, and the recalculated current value of the first display attribute is notified to the screen-throwing service. The screen-throwing service then sends the real-time value of the first display attribute to the screen-throwing receiving end.
In some embodiments of the present application, after the second physical screen displays the interface of the target application according to the first display attribute and the data to be displayed, the method further includes: the screen-throwing receiving end, in response to a touch event applied to the target application displayed in the second physical screen, acquires the touch type of the touch event and the touch coordinates of the touch point; the screen-throwing receiving end obtains the display coordinates on the second physical screen corresponding to the touch coordinates; the screen-throwing receiving end obtains, according to the display coordinates and the display position of the target application on the second physical screen, the relative display coordinates of the touch point with respect to the target application displayed on the second physical screen; and the screen-throwing receiving end packages the touch type and the relative display coordinates into touch event information and returns the touch event information to the screen-throwing sending end, so as to trigger the screen-throwing sending end to interactively control the target application displayed in the first physical screen according to the touch event information.
In some embodiments of the present application, after the sending, to the screen-casting receiving end, of the cached data corresponding to the virtual screen and the current value of the first display attribute monitored by the monitoring function, the method further includes: receiving touch event information sent by the screen-throwing receiving end, wherein the touch event information includes the touch type and the relative display coordinates of the touch point with respect to the target application displayed on the second physical screen; performing coordinate transformation on the relative display coordinates according to the first display attribute and the display size of the target application on the second physical screen, to obtain the corresponding display coordinates on the first physical screen; acquiring the touch coordinates corresponding to those display coordinates on the first physical screen; and generating, according to the touch coordinates and the touch type, a touch event applied to the target application displayed on the first physical screen.
For example, when the user clicks an icon of a target application displayed on the second physical screen, the screen-casting receiving end may detect a click event, and touch screen coordinates of the click position. And then, according to the corresponding relation between the touch screen coordinates of the clicking position and the display coordinates of the second physical screen, obtaining the display coordinates of the clicking position. And then, according to the display position and the display size of the target application on the second physical screen, calculating to obtain the relative display coordinates of the click position relative to the target application displayed on the second physical screen. And generating touch event information according to the relative display coordinates. The touch event information can be transmitted back to the screen throwing sending end through the DFS.
Specifically, for example, the user clicks a position p2 in the target application displayed on the second physical screen in fig. 7. After the screen-throwing receiving end detects the click event at point p2, it obtains the touch screen coordinates corresponding to the click event and, according to the pre-established mapping between touch screen coordinates and screen coordinates, generates the display coordinates (x22, y22) of point p2 relative to the second physical screen. Further, the relative display coordinates (x21, y21) of point p2 with respect to the target application displayed on the second physical screen are calculated from the display position (x20, y20) of the target application on the second physical screen and the display coordinates (x22, y22) of point p2 relative to the second physical screen, where x21 = x22 - x20 and y21 = y22 - y20. The screen-throwing receiving end then packages the event type corresponding to the click event and the relative display coordinates (x21, y21) into touch event information and returns it to the screen-throwing sending end.
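The receiver-side arithmetic (x21 = x22 - x20, y21 = y22 - y20) is a plain translation by the application's display position; a stand-alone check with illustrative coordinates:

```cpp
#include <cassert>

struct Point { int x, y; };

// Coordinates of a touch point relative to the projected application window,
// obtained by subtracting the window's display position on the second
// physical screen (the (x20, y20) of the example above).
Point toRelative(Point display, Point appOrigin) {
    return { display.x - appOrigin.x, display.y - appOrigin.y };
}
```

With a display coordinate of (500, 800) and the application drawn at (100, 200), the point lands at (400, 600) inside the application window.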
After the screen-throwing sending end receives the touch event information, it parses the touch event information to obtain the touch event type and the relative display coordinates (x21, y21) of the touch point with respect to the target application displayed on the second physical screen, that is, the relative display coordinates of point p2 described above. Then, the scaling ratio S of the target application after projection onto the second physical screen is calculated from the display size of the target application on the first physical screen and its display size on the second physical screen. The parsed relative display coordinates (x21, y21) are then scaled by S, yielding the relative display coordinates (x11, y11) of the point p1 that corresponds to p2 on the target application displayed on the first physical screen. Next, from the display position (x10, y10) of the target application on the first physical screen and the relative display coordinates (x11, y11) of point p1, the display coordinates (x12, y12) of point p1 relative to the first physical screen are calculated, where x12 = x11 + x10 and y12 = y11 + y10. Finally, according to the mapping between the display coordinates of the first physical screen and the touch screen coordinates, the display coordinates (x12, y12) of point p1 on the first physical screen are converted into touch screen coordinates of the screen-throwing sending end, and a touch event applied to the target application displayed on the first physical screen is generated according to the converted touch screen coordinates and the parsed touch event type.
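The sending-end reverse mapping (scale the relative coordinates by S, then add the application's display position on the first physical screen) can likewise be checked numerically. Using floating-point per-axis scale factors is an assumption; the patent only names a scaling ratio S:

```cpp
#include <cassert>
#include <cmath>

struct PointF { double x, y; };

// Map relative coordinates (x21, y21) from the receiving end back to display
// coordinates (x12, y12) on the first physical screen:
//   (x11, y11) = (x21 * S, y21 * S), then (x12, y12) = (x11 + x10, y11 + y10).
PointF mapBackToFirstScreen(PointF rel2, double scaleX, double scaleY,
                            PointF origin1) {
    double x11 = rel2.x * scaleX;   // relative coordinate on the first screen
    double y11 = rel2.y * scaleY;
    return { x11 + origin1.x, y11 + origin1.y };
}
```

For example, if the projected window is twice the size of the original (S = 0.5 when mapping back) and the application sits at (0, 126) on the first physical screen, a relative point (400, 600) maps back to (200, 426).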
The specific method for converting the coordinates of the touch screen into the display coordinates of the screen refers to the prior art, and is not repeated in the embodiment of the present application.
The application screen-throwing method disclosed by the embodiment of the application is applied to a screen-throwing sending end: starting a screen-throwing service in response to a screen-throwing operation being executed on a target application displayed in a first physical screen of the screen-throwing sending end; creating a virtual screen corresponding to the first physical screen through the screen-throwing service; registering a listening function for listening to a first display attribute of the target application, wherein the first display attribute includes the display size and the display position of the target application on the first physical screen; and sending the cache data corresponding to the virtual screen and the current value of the first display attribute monitored by the monitoring function to a screen-throwing receiving end, so that the screen-throwing receiving end performs picture synthesis according to the cache data and the current value of the first display attribute and displays the interface of the target application. This eliminates the black borders otherwise displayed when the interface of the target application is shown at the screen-throwing receiving end, and improves the screen-throwing display effect.
Furthermore, according to the application screen projection method disclosed by the embodiment of the application, the screen projection application reverse control is performed by accurately displaying the screen projection application interface and according to the corresponding relation of the display position and the size between the interface displayed by the screen projection receiving end and the interface displayed by the screen projection sending end, so that the reverse control accuracy is improved.
Correspondingly, the embodiments of the present application also disclose an application screen projection method. As shown in fig. 8, the method includes step 810 and step 820.
Step 810: in response to receiving the cache data corresponding to the virtual screen and the current value of the first display attribute of the target application sent by the screen projection sending end, obtain the cache data corresponding to the interface of the target application as data to be displayed.
The cache data and the current value of the first display attribute of the target application are sent by the screen projection sending end as follows: the screen projection sending end starts a screen projection service in response to a screen projection operation being performed on the target application displayed in a first physical screen of the screen projection sending end; the screen projection sending end creates a virtual screen corresponding to the first physical screen through the screen projection service; the screen projection sending end registers a listening function for monitoring the first display attribute of the target application; and the screen projection sending end sends the cache data corresponding to the virtual screen and the current value of the first display attribute monitored by the listening function to the screen projection receiving end.
The first display attribute includes the display size and display position of the target application on the first physical screen. The cache data is the display data of the screen projection layer where the interface of the target application is located, the screen projection layer being the layer of the interface of the target application on the first physical screen.
Optionally, registering the listening function for monitoring the first display attribute of the target application includes:
creating a function callback, where the function callback is executed when the first display attribute of the target application changes, so as to obtain the current value of the first display attribute through the object.
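The object-plus-callback mechanism described here can be illustrated with a minimal observer sketch. The class name `DisplayAttrs` and its fields are hypothetical; on a real Android sender the object would live in the native system framework and be updated by display-state change events.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class DisplayAttrs:
    """Hypothetical object storing the first display attribute."""
    size: Tuple[int, int]      # display size on the first physical screen
    position: Tuple[int, int]  # display position on the first physical screen
    _callbacks: List[Callable] = field(default_factory=list)

    def register_callback(self, cb: Callable) -> None:
        # Registers the listening function's callback.
        self._callbacks.append(cb)

    def update(self, size: Tuple[int, int], position: Tuple[int, int]) -> None:
        # Invoked on a display state change event; stores the new values,
        # then executes each callback so it can read the current value
        # through the object.
        self.size, self.position = size, position
        for cb in self._callbacks:
            cb(self)
```

A consumer registers once and is then notified with the current attribute values whenever the application's size or position changes.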
For the specific implementation of creating the virtual screen and obtaining the first display attribute, reference is made to the foregoing description, which is not repeated herein.
Step 820: display the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed.
The specific implementation in which the screen projection receiving end, in response to the cache data corresponding to the virtual screen and the current value of the first display attribute of the target application sent by the screen projection sending end, obtains the cache data corresponding to the interface of the target application as the data to be displayed is described above and is not repeated herein.
Optionally, displaying the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed includes: the screen projection receiving end modifying, according to the first display attribute, the display size used by the interface synthesis service SurfaceFlinger to compose the target application; and the screen projection receiving end displaying the interface of the target application on the second physical screen based on the set display size and the data to be displayed.
For the specific implementation in which the screen projection receiving end displays the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed, reference is made to the foregoing description, which is not repeated herein.
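The receiving-end composition step can be illustrated with a small geometric sketch. The patent achieves this by modifying SurfaceFlinger's composition size; only the geometry is shown here, and the aspect-preserving centered fit is an assumption. The key point is that scaling the application's own bounds — rather than mirroring the entire first screen, non-application regions included — is what removes the black edges.

```python
def receiver_display_rect(app_size, screen2_size):
    """Compute where the target application's layer is composed on the
    second physical screen (sketch; the real work happens inside
    SurfaceFlinger and is not shown).

    app_size:     (w, h) of the application on the first physical screen
    screen2_size: (w, h) of the second physical screen
    Returns (x, y, w, h) of the destination rectangle."""
    aw, ah = app_size
    sw, sh = screen2_size
    scale = min(sw / aw, sh / ah)        # aspect-preserving fit (assumed)
    w, h = round(aw * scale), round(ah * scale)
    x, y = (sw - w) // 2, (sh - h) // 2  # centered on the second screen
    return (x, y, w, h)
```

When the application's aspect ratio matches the second screen, the rectangle fills the screen exactly and no black edges appear.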
Optionally, after the interface of the target application is displayed on the second physical screen according to the first display attribute and the data to be displayed, the method further includes: in response to a touch event on the target application displayed in the second physical screen, obtaining the touch type of the touch event and the touch coordinates of the touch point; obtaining the display coordinates on the second physical screen corresponding to the touch coordinates; obtaining, according to the display coordinates and the display position of the target application on the second physical screen, the relative display coordinates of the touch point with respect to the target application displayed on the second physical screen; and packaging the touch type and the relative display coordinates into touch event information and transmitting the touch event information back to the screen projection sending end, so as to trigger the screen projection sending end to interactively control the target application displayed in the first physical screen according to the touch event information.
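The receiver-side packaging step can be sketched as below. The dictionary field names are illustrative, and the display coordinate on the second screen is assumed to equal the touch coordinate there for simplicity (the patent allows them to differ via a mapping).

```python
def package_touch_event(touch_type, touch_xy, app_pos_screen2):
    """Build the touch event information returned to the cast sender
    (illustrative sketch).

    touch_type:      e.g. "ACTION_DOWN" (hypothetical label)
    touch_xy:        display coordinates of the touch on the second screen
    app_pos_screen2: display position of the target application there"""
    dx, dy = touch_xy
    ax, ay = app_pos_screen2
    # Relative display coordinates of the touch point with respect to the
    # target application displayed on the second physical screen.
    rel = (dx - ax, dy - ay)
    return {"type": touch_type, "relative": rel}
```

The sender then applies the inverse scaling and translation described earlier to turn these relative coordinates into a touch event on the first physical screen.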
The specific implementation in which the screen projection receiving end, in response to a touch event on the target application displayed in the second physical screen, obtains the touch type of the touch event and the touch coordinates of the touch point is described above and is not repeated herein.
Likewise, the specific implementation in which the screen projection receiving end obtains the display coordinates on the second physical screen corresponding to the touch coordinates, and then obtains the relative display coordinates of the touch point with respect to the target application displayed on the second physical screen according to those display coordinates and the display position of the target application on the second physical screen, is described above and is not repeated herein.
The specific implementation in which the screen projection receiving end packages the touch type and the relative display coordinates into touch event information and returns the touch event information to the screen projection sending end is likewise described above and is not repeated herein.
Optionally, the screen projection sending end interactively controlling the target application displayed in the first physical screen according to the touch event information includes: the screen projection sending end receiving the touch event information sent by the screen projection receiving end, where the touch event information includes the touch type and the relative display coordinates of the touch point with respect to the target application displayed on the second physical screen; the screen projection sending end performing coordinate conversion on the relative display coordinates according to the first display attribute and the display size of the target application on the second physical screen, to obtain the corresponding display coordinates on the first physical screen; the screen projection sending end obtaining the touch coordinates corresponding to those display coordinates on the first physical screen; and the screen projection sending end generating, from the touch coordinates and the touch type, a touch event for the target application displayed on the first physical screen, and interactively controlling the target application displayed in the first physical screen based on the touch event.
The specific implementation in which the screen projection sending end interactively controls the target application displayed in the first physical screen according to the touch event information is described above and is not repeated herein.
The embodiments of the present application also disclose an application screen projection method applied to a screen projection receiving end. In the method, in response to the cache data corresponding to the virtual screen and the current value of the first display attribute of the target application sent by the screen projection sending end, the cache data corresponding to the interface of the target application is obtained as data to be displayed; and the interface of the target application is displayed on the second physical screen according to the first display attribute and the data to be displayed. The method eliminates the black edges displayed when the screen projection receiving end shows the interface of the target application, and improves the screen projection display effect.
Furthermore, the application screen projection method disclosed in the embodiments of the present application accurately displays the projected application interface and performs reverse control of the projected application according to the correspondence of display position and size between the interface displayed at the screen projection receiving end and the interface displayed at the screen projection sending end, thereby improving the accuracy of reverse control.
Correspondingly, the embodiments of the present application also disclose an application screen projection apparatus applied to a screen projection sending end. As shown in fig. 9, the apparatus includes:
a screen projection service starting module 910, configured to start a screen projection service in response to a screen projection operation being performed on a target application displayed in a first physical screen of the screen projection sending end;
a virtual screen creation module 920, configured to create a virtual screen corresponding to the first physical screen through the screen projection service;
a listener registration module 930, configured to register a listening function for monitoring a first display attribute of the target application; and
a screen projection data sending module 940, configured to send the cache data corresponding to the virtual screen and the current value of the first display attribute monitored by the listening function to a screen projection receiving end, so that the screen projection receiving end performs picture synthesis according to the cache data and the current value of the first display attribute and displays the interface of the target application at the screen projection receiving end.
The first display attribute includes the display size and display position of the target application on the first physical screen, and the cache data is the display data of the screen projection layer where the interface of the target application is located.
Optionally, the apparatus further includes:
a first display attribute definition module, configured to define, in response to the target application starting to be displayed on the first physical screen, an object for storing the first display attribute of the target application in the native system framework of the screen projection sending end; and
a first display attribute updating module, configured to synchronously update the current value of the first display attribute of the target application stored in the object based on a preset display state change event.
Optionally, the listener registration module 930 is further configured to:
create a function callback, where the function callback is executed when the first display attribute of the target application changes, so as to obtain the current value of the first display attribute through the object.
Optionally, the screen projection receiving end performing picture synthesis according to the cache data and the current value of the first display attribute, and displaying the interface of the target application at the screen projection receiving end, includes:
the screen projection receiving end obtaining the cache data corresponding to the interface of the target application as data to be displayed according to the current value of the first display attribute and the cache data; and
the screen projection receiving end displaying the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed.
Optionally, the apparatus is further configured to:
in response to a change in the display state of the navigation bar and/or status bar on the first physical screen, recalculate, through the native framework, the display size and display position of the target application on the first physical screen; and
update the first display attribute of the target application stored in the object with the recalculated display size and display position, and trigger execution of the function callback so as to obtain the current value of the first display attribute through the object.
The application screen projection apparatus disclosed in the embodiments of the present application is used to implement the application screen projection method disclosed in the embodiments of the present application. The specific implementation of each module of the apparatus is not repeated; reference may be made to the specific implementation of the corresponding steps in the method embodiments.
The apparatus is applied to a screen projection sending end. The apparatus starts a screen projection service in response to a screen projection operation being performed on a target application displayed in a first physical screen of the screen projection sending end; creates a virtual screen corresponding to the first physical screen through the screen projection service; registers a listening function for monitoring a first display attribute of the target application, where the first display attribute includes the display size and display position of the target application on the first physical screen; and sends the cache data corresponding to the virtual screen and the current value of the first display attribute monitored by the listening function to a screen projection receiving end, so that the receiving end performs picture synthesis according to the cache data and the current value of the first display attribute and displays the interface of the target application. This eliminates the black edges displayed when the receiving end shows the interface of the target application and improves the screen projection display effect.
Furthermore, the application screen projection apparatus disclosed in the embodiments of the present application accurately displays the projected application interface and performs reverse control of the projected application according to the correspondence of display position and size between the interface displayed at the screen projection receiving end and the interface displayed at the screen projection sending end, thereby improving the accuracy of reverse control.
Correspondingly, the embodiments of the present application also disclose an application screen projection apparatus applied to a screen projection receiving end. As shown in fig. 10, the apparatus includes:
a to-be-displayed data obtaining module 1010, configured to obtain, in response to the cache data corresponding to the virtual screen and the current value of the first display attribute of the target application sent by the screen projection sending end, the cache data corresponding to the interface of the target application as data to be displayed; and
a screen projection display module 1020, configured to display the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed.
The cache data and the current value of the first display attribute of the target application are sent by the screen projection sending end as follows:
the screen projection sending end starts a screen projection service in response to a screen projection operation being performed on the target application displayed in a first physical screen of the screen projection sending end;
the screen projection sending end creates a virtual screen corresponding to the first physical screen through the screen projection service;
the screen projection sending end registers a listening function for monitoring the first display attribute of the target application; and
the screen projection sending end sends the cache data corresponding to the virtual screen and the current value of the first display attribute monitored by the listening function to the screen projection receiving end.
The first display attribute includes the display size and display position of the target application on the first physical screen. The cache data is the display data of the screen projection layer where the interface of the target application is located, the screen projection layer being the layer of the interface of the target application on the first physical screen.
Optionally, the screen projection display module 1020 is further configured to:
modify, according to the first display attribute, the display size used by the interface synthesis service SurfaceFlinger to compose the target application; and
display the interface of the target application on the second physical screen based on the set display size and the data to be displayed.
The application screen projection apparatus disclosed in the embodiments of the present application is used to implement the application screen projection method disclosed in the embodiments of the present application. The specific implementation of each module of the apparatus is not repeated; reference may be made to the specific implementation of the corresponding steps in the method embodiments.
The apparatus obtains, in response to the cache data corresponding to the virtual screen and the current value of the first display attribute of the target application sent by the screen projection sending end, the cache data corresponding to the interface of the target application as data to be displayed, and displays the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed. The apparatus eliminates the black edges displayed when the screen projection receiving end shows the interface of the target application, and improves the screen projection display effect.
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts reference may be made between the embodiments. Since the apparatus embodiments are substantially similar to the method embodiments, their description is relatively brief, and reference may be made to the description of the method embodiments for the relevant points.
The foregoing describes in detail the application screen projection method and apparatus provided by the present application. Specific examples are used herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is intended only to help understand the method of the present application and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope in accordance with the ideas of the present application; accordingly, the content of this specification should not be construed as limiting the present application.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiments. Those of ordinary skill in the art can understand and implement the present application without undue effort.
The various component embodiments of the present application may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in an electronic device according to the embodiments of the present application may in practice be implemented using a microprocessor or a digital signal processor (DSP). The present application may also be implemented as an apparatus or device program (for example, a computer program and a computer program product) for performing part or all of the methods described herein. Such a program implementing the present application may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
For example, fig. 11 shows an electronic device in which the method according to the present application may be implemented. The electronic device may be a PC, a mobile terminal, a personal digital assistant, a tablet computer, or the like. The electronic device conventionally includes a processor 1110, a memory 1120, and program code 1130 stored on the memory 1120 and executable on the processor 1110; the processor 1110 implements the method described in the above embodiments when the program code 1130 is executed. The memory 1120 may be a computer program product or a computer-readable medium, such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. The memory 1120 has a memory space 11201 for the program code 1130 of a computer program for performing any of the method steps described above. For example, the memory space 11201 for the program code 1130 may include individual computer programs each implementing one of the steps in the above methods. The program code 1130 is computer-readable code. These computer programs may be read from or written to one or more computer program products, which comprise a program code carrier such as a hard disk, a compact disc (CD), a memory card, or a floppy disk. The computer program comprises computer-readable code which, when run on an electronic device, causes the electronic device to perform the method according to the above embodiments.
The embodiments of the present application also disclose a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of the application screen projection method according to the embodiments of the present application.
Such a computer program product may be a computer-readable storage medium, which may have memory segments, memory spaces, and the like arranged similarly to the memory 1120 in the electronic device shown in fig. 11. The program code may be stored in the computer-readable storage medium, for example, in a suitable form. The computer-readable storage medium is typically a portable or fixed storage unit as described with reference to fig. 12. The storage unit comprises computer-readable code 1130', i.e. code that can be read by a processor; when executed by the processor, this code implements the steps of the method described above.
Reference herein to "one embodiment," "an embodiment," or "one or more embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Furthermore, it is noted that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (11)
1. An application screen projection method, applied to a screen projection sending end, the method comprising:
starting a screen projection service in response to a screen projection operation being performed on a target application displayed in a first physical screen of the screen projection sending end;
creating a virtual screen corresponding to the first physical screen through the screen projection service;
registering a listening function for monitoring a first display attribute of the target application; and
sending cache data corresponding to the virtual screen and a current value of the first display attribute monitored by the listening function to a screen projection receiving end, so that the screen projection receiving end performs picture synthesis according to the cache data and the current value of the first display attribute and displays an interface of the target application at the screen projection receiving end;
wherein the first display attribute comprises a display size and a display position of the target application on the first physical screen, and the cache data is display data of a screen projection layer where the interface of the target application is located.
2. The method according to claim 1, wherein before starting the screen projection service in response to the screen projection operation being performed on the target application displayed in the first physical screen of the screen projection sending end, the method further comprises:
in response to the target application starting to be displayed on the first physical screen, defining an object for storing the first display attribute of the target application in a native system framework of the screen projection sending end; and
synchronously updating the current value of the first display attribute of the target application stored in the object based on a preset display state change event.
3. The method according to claim 2, wherein registering the listening function for monitoring the first display attribute of the target application comprises:
creating a function callback, wherein the function callback is executed when the first display attribute of the target application changes, so as to obtain the current value of the first display attribute through the object.
4. The method according to claim 1, wherein the screen projection receiving end performing picture synthesis according to the cache data and the current value of the first display attribute, and displaying the interface of the target application at the screen projection receiving end, comprises:
the screen projection receiving end obtaining the cache data corresponding to the interface of the target application as data to be displayed according to the current value of the first display attribute and the cache data; and
the screen projection receiving end displaying the interface of the target application on a second physical screen according to the first display attribute and the data to be displayed.
5. The method according to claim 3, further comprising:
in response to a change in the display state of a navigation bar and/or status bar on the first physical screen, recalculating, through the native framework, the display size and display position of the target application on the first physical screen; and
updating the first display attribute of the target application stored in the object with the recalculated display size and display position, and triggering execution of the function callback so as to obtain the current value of the first display attribute through the object.
6. An application screen projection method, applied to a screen projection receiving end, the method comprising:
in response to cache data corresponding to a virtual screen and a current value of a first display attribute of a target application sent by a screen projection sending end, obtaining the cache data corresponding to an interface of the target application as data to be displayed; and
displaying the interface of the target application on a second physical screen according to the first display attribute and the data to be displayed;
wherein the first display attribute comprises a display size and a display position of the target application on a first physical screen, and the cache data is display data of a screen projection layer where the interface of the target application is located.
7. The method according to claim 6, wherein displaying the interface of the target application on the second physical screen according to the first display attribute and the data to be displayed comprises:
modifying, according to the first display attribute, the display size used by an interface synthesis service to compose the target application; and
performing picture synthesis based on the set display size and the data to be displayed, and displaying the interface of the target application on the second physical screen.
8. An application screen projection apparatus, applied to a screen projection sending end, the apparatus comprising:
a screen projection service starting module, configured to start a screen projection service in response to a screen projection operation being performed on a target application displayed in a first physical screen of the screen projection sending end;
a virtual screen creation module, configured to create a virtual screen corresponding to the first physical screen through the screen projection service;
a listener registration module, configured to register a listening function for monitoring a first display attribute of the target application; and
a screen projection data sending module, configured to send cache data corresponding to the virtual screen and a current value of the first display attribute monitored by the listening function to a screen projection receiving end, so that the screen projection receiving end performs picture synthesis according to the cache data and the current value of the first display attribute and displays an interface of the target application at the screen projection receiving end;
wherein the first display attribute comprises a display size and a display position of the target application on the first physical screen, and the cache data is display data of a screen projection layer where the interface of the target application is located.
9. An application screen-projection apparatus, applied to a screen-projection receiving end, the apparatus comprising:
a to-be-displayed data acquisition module, configured to acquire, in response to receiving cache data corresponding to a virtual screen and a current value of a first display attribute of a target application sent by a screen-projection sending end, the cache data corresponding to the interface of the target application as data to be displayed;
a screen-projection display module, configured to display the interface of the target application on a second physical screen according to the first display attribute and the data to be displayed;
wherein the first display attribute includes: a display size and a display position of the target application on the first physical screen; and the cache data is: display data of the screen-projection layer on which the interface of the target application is located.
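The receiver-side modules of claim 9 can be modeled in the same hedged style; the payload format and the `frame` dictionary standing in for a composited output are assumptions, not the patent's data structures.

```python
# Illustrative model of the claim-9 receiver modules (names invented).

class ProjectionReceiver:
    def __init__(self, screen_size):
        self.screen_size = screen_size   # size of the second physical screen
        self.frame = None

    def on_payload(self, payload):
        # To-be-displayed data acquisition module: take the cached layer
        # data for the target application's interface as data to display.
        attr, data = payload["attr"], payload["buffer"]
        # Screen-projection display module: composite at the position and
        # size reported by the sender's first display attribute.
        self.frame = {"at": attr["pos"], "size": attr["size"], "data": data}


receiver = ProjectionReceiver(screen_size=(1920, 1080))
receiver.on_payload({"attr": {"pos": (0, 0), "size": (540, 1200)},
                     "buffer": b"\x00" * 16})
print(receiver.frame["size"])   # → (540, 1200)
```

A production receiver would hand `frame` to its compositor; the sketch only records what would be drawn.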
10. An electronic device comprising a memory, a processor, and program code stored in the memory and executable on the processor, wherein the processor, when executing the program code, implements the application screen-projection method of any one of claims 1 to 7.
11. A computer-readable storage medium having program code stored thereon, wherein the program code, when executed by a processor, implements the steps of the application screen-projection method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310982461.2A CN117008860A (en) | 2023-08-04 | 2023-08-04 | Application screen projection method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310982461.2A CN117008860A (en) | 2023-08-04 | 2023-08-04 | Application screen projection method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117008860A true CN117008860A (en) | 2023-11-07 |
Family
ID=88561523
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310982461.2A Pending CN117008860A (en) | 2023-08-04 | 2023-08-04 | Application screen projection method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117008860A (en) |
- 2023-08-04 CN CN202310982461.2A patent/CN117008860A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8042094B2 (en) | Architecture for rendering graphics on output devices | |
JP5149411B2 (en) | System and method for a unified synthesis engine in a graphics processing system | |
EP2715529B1 (en) | Global composition system | |
US8898577B2 (en) | Application sharing with occlusion removal | |
CN111240626A (en) | Method and system for double-screen interaction of intelligent cabin operating system based on Hypervisor | |
JP5166552B2 (en) | Multi-buffer support for off-screen surfaces in graphics processing systems | |
CN114741044B (en) | Cross-operation environment display output sharing method based on heterogeneous rendering | |
CN113032080B (en) | Page implementation method, application program, electronic device and storage medium | |
US9563971B2 (en) | Composition system thread | |
CN114741081B (en) | Cross-operation environment display output sharing method based on heterogeneous cache access | |
WO2021227688A1 (en) | Screen extension method and apparatus, and terminal device and computer-readable storage medium | |
EP2997547B1 (en) | Primitive-based composition | |
WO2018120992A1 (en) | Window rendering method and terminal | |
CN116672702A (en) | Image rendering method and electronic equipment | |
JP2014135013A (en) | Image transfer method, server apparatus, and program | |
CN113918366B (en) | Information processing method, information processing device, electronic equipment and storage medium | |
CN116821040B (en) | Display acceleration method, device and medium based on GPU direct memory access | |
CN112532896A (en) | Video production method, video production device, electronic device and storage medium | |
JP2010026297A (en) | Image composition system, display control method, drawing processing apparatus, and control program | |
US10733689B2 (en) | Data processing | |
US20160098811A1 (en) | Apparatus and method for combining video frame and graphics frame | |
CN115546410A (en) | Window display method and device, electronic equipment and storage medium | |
CN111752505A (en) | Real-time image capturing method, system and storage medium for VR | |
CN117008860A (en) | Application screen projection method and device, electronic equipment and storage medium | |
CN113784075B (en) | Screen video reading method, system and computing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||