WO2021213120A1 - Screen projection method and apparatus, and electronic device - Google Patents

Screen projection method and apparatus, and electronic device

Info

Publication number
WO2021213120A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
interface
control
user
layout information
Prior art date
Application number
PCT/CN2021/082506
Other languages
English (en)
Chinese (zh)
Inventor
王勇
王欢
李�杰
李英浩
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021213120A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • This application relates to the technical field of smart terminals, and in particular to a screen projection method and apparatus, and an electronic device.
  • In some scenarios, the user projects the interface of a first electronic device (such as a mobile phone) onto a second electronic device (such as a PC) for display.
  • the first electronic device and the second electronic device may include, but are not limited to, mobile terminals (mobile phones), PADs, PCs, wearable devices, electronic large screens, televisions, car central controls, etc.
  • At present, the screen projection method between electronic devices mainly adopts the mirror mode, as shown in FIG. 1, that is, the interface on the first electronic device (e.g., a mobile phone) is projected in its entirety onto the second electronic device (e.g., a PC).
  • When projected in this way, the interface displayed by the first electronic device (such as a mobile phone) may be deformed or cannot be displayed completely on the second electronic device; the screen projection effect is poor, and the user experience is poor.
  • the embodiments of the present application provide a screen projection method, device, and electronic equipment, which can improve the screen projection effect and enhance the user experience.
  • an embodiment of the present application provides a screen projection method, including:
  • the first electronic device displays the various levels of controls of the source interface to the user, and obtains the controls selected by the user from these levels of controls; the first electronic device also displays to the user a virtual screen generated based on the screen information of the second electronic device, displays the controls selected by the user on the virtual screen, and allows the user to edit the controls displayed on the virtual screen;
  • after detecting the user's confirmation operation on the interface layout on the virtual screen, the first electronic device sends the interface layout information of the virtual screen to the second electronic device; the interface layout information includes the first control identifier and the layout information of the first control placed on the virtual screen, so that the second electronic device displays the projection interface of the source interface according to the interface layout information.
  • the above-mentioned first electronic device or second electronic device may be a mobile terminal (mobile phone), a smart screen, a drone, an intelligent connected vehicle (ICV), a smart/intelligent car, an on-board device, or other equipment.
  • This method makes the layout of the projection interface displayed on the second electronic device consistent with the interface layout on the virtual screen at the time of the user's confirmation operation, so that the visual effect of the projection interface is closer to that of a native application interface of the second electronic device, rather than merely a mirrored, copied, stretched, or scaled version of the source interface of the first electronic device, thereby providing the user with a more natural and native screen projection experience.
  • the first electronic device displays to the user various levels of controls of the source interface, including:
  • the first electronic device extracts first data from the interface data of the source interface, and the first data records the control identifiers of the controls in the active interface, the hierarchical relationship between the controls, and the control drawing instructions of the controls;
  • the first electronic device displays various levels of controls of the source interface to the user according to the first data.
  • the first electronic device extracts the first data from the interface data of the source interface, including:
  • the first electronic device extracts the view tree from the interface data of the source interface, where the view tree records the control identifiers of the controls in the active interface and the hierarchical relationship between the controls, and extracts from the interface data the control drawing instructions corresponding to the control identifiers recorded in the view tree.
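The extraction step above can be sketched as follows. This is an illustrative sketch only, not the patent's actual implementation: the structure of the interface data, and all names in it, are assumptions.

```python
def extract_first_data(interface_data):
    """Return (view_tree, draw_instructions) for the controls in the source interface.

    The view tree records control IDs and their hierarchy; the drawing
    instructions are then filtered to exactly the IDs the view tree records.
    """
    view_tree = interface_data["view_tree"]            # control IDs + hierarchy
    all_instructions = interface_data["instructions"]  # control ID -> drawing ops

    def collect_ids(node, ids):
        # Walk the tree depth-first, gathering every control identifier.
        ids.append(node["id"])
        for child in node.get("children", []):
            collect_ids(child, ids)
        return ids

    ids = collect_ids(view_tree, [])
    # Keep only the drawing instructions whose control IDs appear in the view tree.
    draw_instructions = {cid: all_instructions[cid]
                         for cid in ids if cid in all_instructions}
    return view_tree, draw_instructions
```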
  • the first electronic device displays to the user various levels of controls of the source interface according to the first data, including:
  • the first electronic device uses the control drawing instructions of the controls to draw each level of controls of the source interface according to the hierarchical relationship between the controls, and shows the drawn controls of each level to the user; and/or,
  • the first electronic device displays the control identification of the control and the hierarchical relationship between the controls to the user.
  • the screen of the first electronic device is divided into a first display area and a second display area;
  • the first electronic device displays to the user various levels of controls of the source interface, including:
  • the first electronic device displays various levels of controls of the source interface to the user in the first display area
  • the first electronic device shows the user a virtual screen generated based on the screen information of the second electronic device, including:
  • the first electronic device displays a virtual screen to the user in the second display area.
  • the first electronic device marks, according to the first control identifier in the interface layout information, the corresponding first control identifier in the first data of the interface data of the source interface sent to the second electronic device.
  • the first electronic device sends the acquired first control identifier and layout information to the second electronic device, including:
  • the first electronic device carries the acquired interface layout information in the interface data of the source interface and sends it to the second electronic device.
  • an embodiment of the present application provides a screen projection method, including:
  • the second electronic device receives the interface layout information sent by the first electronic device; the interface layout information includes the first control identifier and the layout information, and is the interface layout information of the virtual screen at the time when the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen;
  • the second electronic device receives the interface data of the source interface sent by the first electronic device
  • the second electronic device displays the projection interface of the source interface according to the interface data and the interface layout information.
  • This aspect makes the layout of the projection interface displayed on the second electronic device consistent with the interface layout on the virtual screen at the time of the user's confirmation operation, so that the visual effect of the projection interface is closer to that of a native application interface of the second electronic device, rather than merely a mirrored, copied, stretched, or zoomed version of the source interface of the first electronic device, thereby providing the user with a more natural and native screen projection experience.
  • the second electronic device displays the projection interface of the source interface according to the interface data and the interface layout information, including:
  • the second electronic device obtains the first control identifier and the layout information corresponding to the first control identifier from the interface layout information; the second electronic device obtains, from the interface data and according to the acquired first control identifier, the control drawing instruction corresponding to the first control identifier;
  • the second electronic device draws the control corresponding to the first control identifier using the control drawing instruction corresponding to the first control identifier according to the layout information corresponding to the first control identifier.
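The drawing flow described above (look up each first control identifier in the layout information, fetch its drawing instruction from the interface data, and draw it at the recorded position and size) can be sketched as follows. The data structures, the `Canvas` stand-in, and all field names are assumptions for illustration; the patent does not prescribe them.

```python
class Canvas:
    """Minimal stand-in for the second device's drawing surface."""
    def __init__(self):
        self.ops = []

    def draw(self, instruction, x, y, w, h):
        # Record the drawing call; a real canvas would rasterize the control.
        self.ops.append((instruction, x, y, w, h))


def render_projection(interface_data, interface_layout_info, canvas):
    """Draw each first control according to its layout information."""
    for entry in interface_layout_info:
        cid = entry["control_id"]          # first control identifier
        layout = entry["layout"]           # position and size on the projection interface
        instruction = interface_data["instructions"][cid]
        canvas.draw(instruction, layout["x"], layout["y"], layout["w"], layout["h"])
```

Controls not referenced by the layout information are simply never drawn, which is what lets the projection interface differ from a plain mirror of the source interface.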
  • the first control identifier in the interface data has a mark
  • the second electronic device displays the projection interface of the source interface according to the interface data and the interface layout information, including:
  • the second electronic device obtains the first control identifier and the layout information corresponding to the first control identifier from the interface layout information;
  • the second electronic device obtains, from the interface data, a control drawing instruction corresponding to the marked first control identifier
  • the second electronic device draws the control corresponding to the first control identifier using the control drawing instruction corresponding to the first control identifier according to the layout information corresponding to the first control identifier.
  • an embodiment of the present application provides a first electronic device, including:
  • the controls displayed on the virtual screen can be edited by the user;
  • the interface layout information includes: the first control identifier of the first control placed on the virtual screen and the layout information, so that the second electronic device displays the projection interface of the source interface according to the interface layout information.
  • the step of displaying the various levels of controls of the source interface to the user includes:
  • the first data records the control identifiers of the controls in the active interface, the hierarchical relationship between the controls, and the control drawing instructions of the controls;
  • the step of extracting the first data from the interface data of the source interface includes:
  • the view tree is extracted from the interface data of the source interface, and the view tree records the control identifiers of the controls in the active interface and the hierarchical relationship between the controls; the control drawing instructions corresponding to the control identifiers recorded in the view tree are extracted from the interface data.
  • the step of displaying the various levels of controls of the source interface to the user according to the first data includes:
  • using the control drawing instructions of the controls to draw each level of controls of the source interface according to the hierarchical relationship between the controls, and showing the drawn controls of each level to the user; and/or,
  • the screen of the first electronic device is divided into a first display area and a second display area;
  • the step of displaying the various levels of controls of the source interface to the user includes:
  • displaying the various levels of controls of the source interface to the user in the first display area;
  • the step of displaying to the user the virtual screen generated based on the screen information of the second electronic device includes:
  • the virtual screen is shown to the user in the second display area.
  • when the instruction is executed by the first electronic device, the first electronic device further executes the following step:
  • marking, according to the first control identifier in the interface layout information, the corresponding first control identifier in the first data of the interface data of the source interface sent to the second electronic device.
  • the step of sending the acquired first control identifier and layout information to the second electronic device includes:
  • the obtained interface layout information is carried in the interface data of the source interface and sent to the second electronic device.
  • an embodiment of the present application provides a second electronic device, including:
  • the interface layout information includes: a first control identifier and layout information; the interface layout information is the interface layout information of the virtual screen at the time when the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen;
  • the projection interface of the source interface is displayed according to the interface layout information.
  • the step of displaying the screen projection interface of the source interface according to the interface data and the interface layout information includes:
  • the control corresponding to the first control identifier is drawn using the control drawing instruction corresponding to the first control identifier.
  • the first control identifier in the interface data carries a mark; when the instruction is executed by the second electronic device, the step of displaying the screen projection interface of the source interface according to the interface data and the interface layout information includes:
  • the control corresponding to the first control identifier is drawn using the control drawing instruction corresponding to the first control identifier.
  • an embodiment of the present application provides a computer-readable storage medium in which a computer program is stored, and when it runs on a computer, the computer executes the method of the first aspect.
  • an embodiment of the present application provides a computer-readable storage medium in which a computer program is stored, and when the computer program is run on a computer, the computer executes the method of the second aspect.
  • this application provides a computer program, which is used to execute the method of the first aspect or the second aspect when the computer program is executed by a computer.
  • the program in the seventh aspect may be stored in whole or in part on a storage medium packaged with the processor, or may be stored in whole or in part in a memory not packaged with the processor.
  • FIG. 1 is an example diagram of a screen projection method in mirror mode in the prior art;
  • FIG. 2A is an example diagram of a screen projection function provided by the system according to an embodiment of the application.
  • FIG. 2B is an example diagram of the control hierarchy of the embodiment of this application.
  • FIG. 2C is an example diagram of instruction extraction according to an embodiment of the application.
  • FIG. 2D is an example diagram of a view tree according to an embodiment of the application.
  • FIG. 3 is a flowchart of an embodiment of the screen projection method of this application.
  • FIG. 4 is a flowchart of another embodiment of the screen projection method of this application.
  • FIG. 5 is a flowchart of another embodiment of the screen projection method according to this application.
  • FIG. 6 is a flowchart of another embodiment of the screen projection method according to this application.
  • FIG. 7 is a flowchart of another embodiment of the screen projection method according to this application.
  • FIG. 8 is a structural diagram of an embodiment of the screen projection device of this application.
  • FIG. 9 is a structural diagram of another embodiment of a screen projection device according to the present application.
  • FIG. 10 is a schematic structural diagram of an embodiment of an electronic device of this application.
  • Screen projection in the embodiments of the present application refers to transferring interface data on one electronic device to another electronic device for display.
  • In the following, the above-mentioned "one electronic device" is referred to as the "first electronic device", and the above-mentioned "another electronic device" is referred to as the "second electronic device";
  • the interface that needs to be projected is called the "source interface", and the interface displayed on the second electronic device after projection by the first electronic device is called the "projection interface".
  • the screen projection function may be provided by the system or by the application, which is not limited in the embodiment of the present application.
  • FIG. 2A shows an example of the screen projection function provided by the system.
  • For example, the user can realize screen projection between two electronic devices through the following operations: the user pulls down the system notification panel on the first electronic device, selects more shortcut tools, finds the wireless projection or mirroring entry, and selects a connectable device; the selected connectable electronic device serves as the second electronic device in the embodiment of the present application.
  • the two electronic devices that perform screen projection can be directly connected, for example, through Bluetooth, WiFi, etc.; alternatively, the two electronic devices can each be connected to another electronic device, such as a cloud server, to achieve an indirect connection.
  • the connection between the two electronic devices can be switched between direct connection and indirect connection, which is not limited in the embodiment of the present application.
  • the application program interface in the embodiment of the present application is a medium interface for interaction and information exchange between the application program and the user; it realizes the conversion between the internal form of information and a form acceptable to the user.
  • the commonly used form of application program interface is the graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner.
  • the interface can be composed of visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets; these visual interface elements may be called controls.
  • the controls in the application interface are hierarchical. For example, the entire application interface can be considered the lowest-level control, that is, the first-level control; the higher the level, the finer the controls and the greater their number. For example, referring to FIG. 2B, suppose an interface includes button group 1 and button group 2, and each button group includes three buttons; then the interface can be considered the lowest-level control, button group 1 and button group 2 are the second-level controls, and buttons 11-13 and 21-23 are the third-level controls.
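The three-level hierarchy in the example above can be written out as a small tree. This is a minimal sketch with assumed field names, purely to make the level structure concrete.

```python
# The interface is the first-level control, the two button groups are
# second-level controls, and the six buttons are third-level controls.
interface = {
    "id": "interface", "level": 1, "children": [
        {"id": "button_group_1", "level": 2, "children": [
            {"id": f"button_1{i}", "level": 3, "children": []} for i in (1, 2, 3)]},
        {"id": "button_group_2", "level": 2, "children": [
            {"id": f"button_2{i}", "level": 3, "children": []} for i in (1, 2, 3)]},
    ],
}


def count_controls(node):
    """Count every control in the hierarchy, the node itself included."""
    return 1 + sum(count_controls(c) for c in node["children"])
```

Counting the example gives 9 controls in total: 1 interface, 2 button groups, and 6 buttons.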
  • the interface data of the source interface records the controls of the active interface and the hierarchical relationship between the controls.
  • the skia instructions can be extracted from the instruction layer skcanvas; the skia instructions record the view tree (View Tree) and the control drawing instructions, where:
  • the view tree records the Views and the hierarchical relationships between them.
  • View is the base class of all controls in the Android system; it is an abstraction of interface-layer controls, and a View usually represents one control.
  • ViewGroup is a special kind of View that represents the combination relationship between different Views. Normally, the entire interface of the application is a ViewGroup, in which each control is a separate View. Views and ViewGroups can be combined and nested; after combination and nesting, a complete View tree is formed.
  • the View tree thus records the controls at different levels in the application interface and the hierarchical relationship between the controls. The View tree generally identifies a control by its control identifier (ID); see the screenshot of the view tree shown in FIG. 2D (View and ViewGroup are not distinguished below).
  • In FIG. 2D, DecorView is the first-level view; the IDs corresponding to the action mode bar stub (ViewStub) and content (FrameLayout) controls are located at the second level; the IDs corresponding to the split action bar (HWToolBarMenuContainer), conversation list (FrameLayout), and toolbar container layout (RelativeLayout) controls are located at the third level; and the IDs corresponding to the start new conversation button (ImageView) and toolbar hwtoolbar (HwToolbar) controls are located at the fourth level. Therefore, the view tree records the controls at different levels in the application interface and their hierarchical relationship through the control identifiers and the hierarchical relationship between the control identifiers.
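How a view tree encodes controls through IDs and their hierarchical relationship can be sketched by walking such a tree and recording the level of each control identifier. The tree representation below is an assumption; only the ID-to-level idea comes from the description above.

```python
def levels_by_id(node, level=1, out=None):
    """Map each control identifier in the view tree to its hierarchy level."""
    out = {} if out is None else out
    out[node["id"]] = level
    for child in node.get("children", []):
        levels_by_id(child, level + 1, out)
    return out
```

Applied to a tree shaped like the FIG. 2D example, DecorView would map to level 1 and the start new conversation button to level 4.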
  • Control drawing instructions are associated with the view tree, recording how the visual graphics of each control in the view tree are drawn, and recording the control display content of each control.
  • the control may only include visual graphics or control display content, or may include both visual graphics and control display content.
  • the electronic device can draw each level of the control in the view tree based on the control drawing instruction to obtain the visual graphics of each control and/or the display content of the control, thereby forming an application program interface.
  • the display content of the controls in the embodiments of the present application refers to data such as numbers and texts that need to be displayed for each control in the application program interface in addition to the visual graphics.
  • the control drawing instruction can draw the visual graphics of the control, such as the box representing the button in FIG. 2B, and can also draw the text displayed on the control, such as button 11, button 12, etc. at the same time.
  • the display content of the control may be constant or may be constantly changing.
  • When the display content of a control changes, the first electronic device may send the control identifier of the control whose display content has changed, together with the changed display content, to the second electronic device, so that the second electronic device updates the projection interface accordingly to keep the content displayed on the source interface and the projection interface consistent.
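The update step above amounts to a diff-and-apply cycle: the first device determines which control identifiers have changed display content and sends only those, and the second device merges them into its projection state. A hedged sketch, with all names and structures assumed:

```python
def diff_display_content(old, new):
    """Return {control_id: new_content} for every control whose content changed."""
    return {cid: content for cid, content in new.items()
            if old.get(cid) != content}


def apply_update(projection, changes):
    """Merge the changed display content into the projection interface state."""
    projection.update(changes)
    return projection
```

Sending only the changed entries keeps the source interface and the projection interface consistent without retransmitting the whole interface data on every change.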
  • When starting screen projection, the first electronic device sends the interface data of the source interface to the second electronic device. During the screen projection process, the source interface of the first electronic device may change; for example, when the user clicks a button in the source interface to open another application interface, the source interface becomes the newly opened application interface. At this time, the first electronic device can re-send the interface data of the source interface to the second electronic device to keep the content displayed on the projection interface and the source interface the same.
  • At present, the screen projection method between electronic devices mainly adopts the mirror mode, as shown in FIG. 1, that is, the interface on the first electronic device (e.g., a mobile phone) is projected in its entirety onto the second electronic device (e.g., a PC).
  • Take the Android-based Miracast and the iOS-based AirPlay as examples: the basic principle of both is that the interface data of the first electronic device is audio- and video-encoded and then sent to the second electronic device.
  • the second electronic device performs decoding processing to obtain the interface data of the first electronic device for display.
  • In this mode, the projection interface can only be stretched, zoomed, and cropped according to the screen size of the second electronic device, so the projection interface displayed on the second electronic device is deformed or cannot be displayed completely; the projection effect is poor, and the user experience is poor.
  • embodiments of the present application provide a screen projection method, device, and electronic equipment, which can improve the screen projection effect and enhance user experience.
  • Fig. 3 is a flowchart of an embodiment of a screen projection method according to this application. As shown in Fig. 3, the method may include:
  • Step 301 The first electronic device displays each level of control of the source interface to the user, and obtains the control selected by the user from each level of control.
  • when the first electronic device displays each level of control of the source interface to the user, it may display the controls themselves and/or the hierarchical relationship between the control identifiers.
  • the controls in the application program interface are displayed as visual graphics in the application program interface.
  • most controls have the function of executing a function, or of triggering code to run and complete a response, through an "event"; this function can be triggered by user operations such as clicking.
  • the first electronic device displays the various levels of controls of the source interface to the user mainly to make it convenient for the user to intuitively select the required controls. Therefore, in the embodiment of the present application, the first electronic device may display only the visual graphics of the controls and/or the display content of the controls, without providing response operations for the functions of the controls.
  • Take button 11 in FIG. 2B as an example, and suppose the control display content of button 11 is "Next page". In the source interface, when the user clicks it with the mouse, the application can be triggered to open the application interface corresponding to the next page (that is, to execute the function of the "Next page" button). In this step, the control shown to the user may present only the visual graphic of button 11 and the "Next page" display content; the user can select the visual graphic by clicking with the mouse, but the response operation of opening the application interface corresponding to the next page may no longer be provided.
  • Control identifiers, such as those in the view tree, generally belong to the internal properties of the interface and do not intuitively reflect specific controls. Therefore, for most users who are not skilled in the art, showing the controls themselves to the user is more intuitive and provides a better user experience.
  • Step 302 The first electronic device displays a virtual screen generated based on the screen information of the second electronic device to the user, and displays the controls selected by the user on the virtual screen, and the controls displayed on the virtual screen can be edited by the user.
  • the screen information of the second electronic device may include: the horizontal size and the vertical size (or aspect ratio) of the display screen of the second electronic device.
  • the screen information of the second electronic device may also include: interactive mode and so on.
  • the interaction mode may include: whether the display screen supports touch control, etc.
  • the editing operation performed by the user on the control may include, but is not limited to: position movement, size change, display direction change, and/or deletion.
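The editing operations listed above (position movement, size change, display-direction change, and deletion) applied to a control placed on the virtual screen can be sketched as follows. The control representation and operation names are assumptions for illustration.

```python
def edit_control(controls, cid, op, **args):
    """Apply one user editing operation to the control `cid` on the virtual screen."""
    if op == "delete":
        controls.pop(cid, None)                       # remove from the virtual screen
    elif op == "move":
        controls[cid]["x"], controls[cid]["y"] = args["x"], args["y"]
    elif op == "resize":
        controls[cid]["w"], controls[cid]["h"] = args["w"], args["h"]
    elif op == "rotate":
        controls[cid]["direction"] = args["direction"]  # display direction change
    return controls
```

Whatever remains in `controls` after the user's edits corresponds to the "first controls" whose identifiers and layout are later sent to the second electronic device.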
  • the display of the virtual screen to the user in step 302 can be executed at the same time as the display of each level of control of the source interface to the user in step 301, that is, the first electronic device simultaneously shows the user each level of control and the virtual screen; alternatively, step 302 can also be executed after step 301, which is not limited in the embodiment of the present application.
  • the controls displayed on the virtual screen may also be the visual graphics of the control and/or the display content of the control, without providing a response operation for the function of the control.
  • Step 303 The first electronic device detects the user's confirmation operation for the interface layout on the virtual screen, and sends the interface layout information of the virtual screen to the second electronic device.
  • the interface layout information includes the first control identifier and the layout information of the first control placed on the virtual screen, so that the second electronic device displays the projection interface of the source interface according to the interface layout information.
  • the first control refers to the control placed on the virtual screen, that is, the control that is still placed on the virtual screen after the user performs a series of editing operations on the control selected by the user displayed on the virtual screen in step 302.
  • Each first control corresponds to a first control identifier and layout information; the layout information may include information such as the position, size, and display direction of the control on the virtual screen.
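The interface layout information can be pictured as one entry per first control, pairing its control identifier with its layout (position, size, display direction) on the virtual screen; a layout configuration file could simply serialize that list. The field names and the JSON serialization below are assumptions; the patent does not prescribe a concrete format.

```python
import json


def build_interface_layout_info(virtual_screen_controls):
    """Collect one (identifier, layout) entry per first control on the virtual screen."""
    return [
        {"control_id": cid,
         "layout": {"x": c["x"], "y": c["y"], "w": c["w"], "h": c["h"],
                    "direction": c.get("direction", 0)}}
        for cid, c in virtual_screen_controls.items()
    ]


def to_layout_file(info):
    """Serialize the interface layout information as a layout configuration file."""
    return json.dumps(info)
```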
  • a layout configuration file may be generated according to the foregoing interface layout information, and the layout configuration file may be sent to the second electronic device.
  • the first electronic device may directly send the interface layout information to the second electronic device; or, the first electronic device may carry the interface layout information in the interface data of the source interface and send it to the second electronic device.
  • after the first electronic device sends the interface layout information to the second electronic device, it can mark, according to the first control identifier in the interface layout information, the first control identifier in the first data of the interface data of the source interface sent to the second electronic device, so that the second electronic device can draw the controls in the projection interface according to the marked first control identifiers when displaying the projection interface.
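A minimal sketch of the marking step above, assuming the first data is a flat list of control records keyed by `id` (a hypothetical shape, not the patent's actual data format):

```python
def mark_first_controls(first_data, layout_ids):
    """Mark every control record whose identifier appears in the interface
    layout information, so the receiving device can find it directly."""
    chosen = set(layout_ids)
    for record in first_data:
        record["marked"] = record["id"] in chosen
    return first_data
```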
  • the first electronic device sends the interface layout information to the second electronic device, and the second electronic device can store the interface layout information.
  • the second electronic device can display the projection interface of the source interface according to the interface layout information. If the user wants to modify the interface layout of the projection interface of the source interface on the second electronic device, the user can trigger the above-mentioned method of the embodiment of the present application to re-edit the interface layout on the virtual screen and re-send the interface layout information to the second electronic device.
  • the second electronic device updates the locally stored interface layout information corresponding to the source interface, so that the user can modify the interface layout of the projection interface of the source interface in the second electronic device.
  • the first electronic device displays the controls in the source interface selected by the user on the virtual screen, and the user edits the controls displayed on the virtual screen.
  • the interface layout information of the virtual screen is sent to the second electronic device, and the second electronic device displays the projection interface according to the interface layout information of the virtual screen, so that the layout of the projection interface displayed on the second electronic device is consistent with the interface layout on the virtual screen at the time of the user's confirmation operation. The visual effect of the projection interface is therefore closer to that of a native application interface of the second electronic device, rather than merely mirroring, copying, stretching, or zooming the source interface of the first electronic device, thereby providing users with a more natural and native screen projection experience.
  • FIG. 4 is a flowchart of another embodiment of the screen projection method according to this application. As shown in FIG. 4, the method may include:
  • Step 401 The second electronic device receives the interface layout information sent by the first electronic device, where the interface layout information includes: the first control identifier and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen;
  • Step 402 The second electronic device receives the interface data of the source interface sent by the first electronic device;
  • Step 403 The second electronic device displays the screen projection interface of the source interface according to the received interface data of the source interface and the interface layout information.
  • the second electronic device may first obtain the first control identifier and the layout information corresponding to the first control identifier from the interface layout information, then obtain from the interface data, according to the first control identifier obtained from the interface layout information, the control drawing instruction corresponding to the first control identifier; and, according to the layout information corresponding to the first control identifier in the interface layout information, use the control drawing instruction corresponding to the first control identifier to draw the control corresponding to the first control identifier.
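Under the assumption that the interface data exposes a dictionary of control drawing instructions keyed by control identifier, step 403 could be sketched like this (all structure and names are hypothetical, not the patent's actual API):

```python
def render_projection(interface_layout, interface_data):
    """Look up each first control identifier's drawing instruction in the
    interface data and pair it with the layout position it is drawn at."""
    drawn = []
    for entry in interface_layout:  # entries: {"id", "x", "y"}
        instruction = interface_data["draw_instructions"][entry["id"]]
        drawn.append((entry["id"], entry["x"], entry["y"], instruction))
    return drawn
```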
  • the screen projection interface displayed by the second electronic device only displays the control corresponding to the first control identifier included in the interface layout information, and the layout of the screen projection interface displayed by the second electronic device will be consistent with the interface layout on the virtual screen. Only the control display content in each control may be different (the content of the second device changes in real time with the interface of the first device).
  • the first control identifier in the first data of the interface data may have a mark.
  • the second electronic device does not need to obtain the first control identifiers from the interface layout information first; it can more conveniently find, directly in the interface data, each marked first control identifier and the control drawing instruction corresponding to it.
  • step 403 may include: the second electronic device obtains the first control identifier and the layout information corresponding to the first control identifier from the interface layout information; the second electronic device obtains from the interface data the control drawing instruction corresponding to the marked first control identifier; and the second electronic device draws, according to the layout information corresponding to the first control identifier, the control corresponding to the first control identifier using the corresponding control drawing instruction.
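With marks in place, the lookup in step 403 can be sketched as a simple filter over the interface data records; the record shape (`id`, `draw`, `marked`) is an assumption for illustration:

```python
def find_marked_instructions(interface_data):
    """Collect the drawing instruction of every marked first control
    identifier directly from the interface data records."""
    return {rec["id"]: rec["draw"] for rec in interface_data if rec.get("marked")}
```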
  • the layout of the projection interface displayed on the second electronic device is consistent with the interface layout on the virtual screen at the time of the user's confirmation operation, so that the visual effect of the projection interface is closer to that of a native application interface of the second electronic device, rather than merely mirroring, copying, stretching, or zooming the source interface of the first electronic device, thereby providing users with a more natural and native screen projection experience.
  • FIG. 5 is a flowchart of another embodiment of the projection method of this application.
  • the first electronic device is a mobile phone and the second electronic device is a head-up display (HUD, Head Up Display) in a car, as an example.
  • the source interface that needs to be screened in the first electronic device is the navigation interface.
  • the screen of the first electronic device (such as a mobile phone) is divided into a first display area 51 and a second display area 52.
  • the first electronic device displays the source interface to the user in the first display area 51.
  • the control selected by the user is displayed on the virtual screen.
  • the aspect ratio of the generated virtual screen 521 and the display screen of the second electronic device may be the same.
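One way the first electronic device might size a virtual screen that keeps the second device's aspect ratio while fitting inside the second display area is a letterbox-style computation; this formula is an assumption for illustration, not one prescribed by the patent:

```python
def virtual_screen_size(target_w, target_h, area_w, area_h):
    """Return a virtual-screen size with the second device's aspect ratio
    (target_w : target_h) that fits inside the available display area."""
    scale = min(area_w / target_w, area_h / target_h)
    return round(target_w * scale), round(target_h * scale)
```

For example, previewing a wide 1600x400 HUD inside an 800x800 display area would yield an 800x200 virtual screen.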
  • the first electronic device displaying each level of control of the source interface to the user may include:
  • the first electronic device extracts first data from the interface data of the source interface, and the first data includes: the control identifiers of the controls in the source interface, the hierarchical relationship between the controls, and the control drawing instructions of the controls;
  • using the control drawing instructions of the controls, draw each level of control according to the hierarchical relationship between the controls, and show the drawn controls of each level to the user.
  • the above-mentioned first data may include: a view tree and a control drawing instruction corresponding to a control identifier recorded in the view tree.
  • the first electronic device extracting the first data from the interface data of the source interface may include: the first electronic device extracts a view tree from the interface data of the source interface, where the view tree records the control identifiers of the controls in the active interface and the hierarchical relationship between the controls, and extracts from the interface data the control drawing instructions corresponding to the control identifiers recorded in the view tree.
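The view tree described above can be pictured as nested records of control identifiers, and a depth-first walk then yields each level of control in drawing order. The identifiers below are hypothetical stand-ins for the navigation example discussed later:

```python
# Hypothetical view tree for the navigation example: the navigation
# interface is the first-level control, the two groups are second-level,
# and the text box and turning arrow are third-level controls.
view_tree = {
    "id": "navigation_interface",
    "children": [
        {"id": "nav_info_group_511", "children": [
            {"id": "turn_distance_text", "children": []},
            {"id": "turn_arrow", "children": []},
        ]},
        {"id": "map_route_512", "children": []},
    ],
}

def walk(node, level=0, out=None):
    """Traverse the view tree depth-first, recording (level, control id)
    so each level of control can be drawn per the hierarchical relationship."""
    if out is None:
        out = []
    out.append((level, node["id"]))
    for child in node["children"]:
        walk(child, level + 1, out)
    return out
```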
  • the user can select a control in the first display area 51 and drag the control to display on the virtual screen.
  • the first electronic device obtains the control selected by the user by detecting the above-mentioned operation of the user on the control.
  • the user may perform a preset operation on the control in the first display area.
  • the foregoing preset operation may include, but is not limited to: double-clicking the control displayed in the first display area, etc.; the first electronic device obtains the control selected by the user by detecting the above operation of the user on the control.
  • in the navigation interface, the navigation interface itself is the first-level control, the navigation information display control group 511 on the left is a second-level control, and the map route display control 512 on the right is also a second-level control; the navigation information display control group 511 includes five third-level controls: the four text boxes and the turning arrow shown by the dotted lines. Here, take as an example that the controls selected by the user are the text box showing the turning distance and the turning arrow.
  • the user can perform editing operations on the controls displayed on the virtual screen to obtain the edited screen projection interface layout of the user.
  • in part 520, take as an example the user's editing operations, such as zooming in and moving the text box showing the turning distance and the turning arrow.
  • after the user edits the controls displayed on the virtual screen, the user performs a confirmation operation on the interface layout on the virtual screen; accordingly, when the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen, it sends the interface layout information of the virtual screen to the second electronic device (such as the HUD), and the second electronic device stores the received interface layout information.
  • the first electronic device sends the interface data of the source interface to the second electronic device; the second electronic device receives the interface data of the source interface, and according to the interface data, displays the projection screen of the source interface according to the interface layout information interface.
  • the interface data of the source interface sent by the first electronic device to the second electronic device records the control identifiers of all controls in the active interface, the hierarchical relationship between the controls, the control drawing instructions, etc., and the interface layout information records the first control identifier and the layout information corresponding to the first control identifier.
  • the hierarchical relationship between the first control identifiers and the control drawing instructions corresponding to the first control identifiers can be found from the interface data; according to the layout information corresponding to a first control identifier, the control corresponding to that first control identifier can be drawn using the corresponding control drawing instruction.
  • the control corresponding to each first control identifier in the interface layout information is displayed on the screen projection interface according to the hierarchical relationship according to the above method, and the screen projection interface is obtained.
  • the displayed screen projection interface will be similar to the interface displayed on the virtual screen when the user performed the above confirmation operation, except that the display content shown in each control may differ.
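A hedged sketch of the drawing flow just described: traverse the control hierarchy recorded in the interface data and draw only the controls that appear in the interface layout information (all structures and names here are hypothetical):

```python
def project(view_tree, layout_by_id, instructions):
    """Walk the control hierarchy and emit (id, layout, instruction) for
    every control that the interface layout information retained."""
    out, stack = [], [view_tree]
    while stack:
        node = stack.pop()
        if node["id"] in layout_by_id:
            out.append((node["id"], layout_by_id[node["id"]], instructions[node["id"]]))
        stack.extend(node["children"])
    return out
```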
  • the screen of the first electronic device is divided into a third display area and a fourth display area.
  • the first electronic device displays the hierarchical relationship between control identifiers in the third display area, and displays the visual graphics of each level of control in the fourth display area (that is, the first electronic device displays each level of control of the source interface to the user).
  • a check box is set for each control identifier.
  • the user can select the control identifier by selecting the check box.
  • the first electronic device obtains the control identifier selected by the user by detecting the above-mentioned selection operation on the check box (that is, it obtains the control selected by the user from each level of control).
  • the first electronic device displays a virtual screen on the screen, and displays the visual graphics of the control corresponding to the control identifier selected by the user on the virtual screen; the visual graphics of the control displayed on the virtual screen can be edited by the user operate.
  • the screen of the first electronic device is divided into a fifth display area and a sixth display area.
  • the first electronic device displays the hierarchical relationship between the control identifiers in the fifth display area (that is, the first electronic device displays each level of control of the source interface to the user), displays the virtual screen in the sixth display area, and displays on the virtual screen the visual graphics and/or control display content of the controls corresponding to the control identifiers selected by the user;
  • the visual graphics and/or the display content of the control can be edited by the user.
  • a check box is set for each control identifier, and the user can select the control identifier by selecting the check box.
  • the first electronic device obtains the control identifier selected by the user by detecting the above-mentioned selection operation on the check box (that is, it obtains the control selected by the user from each level of control).
  • the visual graphic of the control displayed on the virtual screen can be drawn on the virtual screen by the first electronic device according to the control identifier selected by the user and using the control drawing instruction corresponding to the control identifier.
  • the method of the embodiment of the present application shows the user a virtual screen generated based on the screen information of the second electronic device.
  • the user can select, according to personal needs, the controls from the source interface to be displayed on the virtual screen and lay them out, thereby optimizing and editing the projection interface displayed on the second electronic device. In particular, when the second electronic device has a special-shaped display screen, targeted optimization can be performed for that screen, so that the visual effect of the projection interface is closer to that of a native application interface of the second electronic device, rather than merely mirroring, copying, stretching, or zooming the source interface of the first electronic device, thereby providing users with a more natural and native screen projection experience. Moreover, the clarity of the projection interface is relatively better and will not be blurred by zooming in. For example, when the music playing interface of a phone is projected to a watch, because the watch's display screen is relatively small, only the three buttons of play, previous song, and next song may be displayed on the projection interface; or when the interface of a navigation application in the phone is projected on the HUD display, only key buttons such as direction indications may be displayed on the projection interface.
  • FIG. 8 is a schematic structural diagram of an embodiment of a screen projection device according to this application. As shown in FIG. 8, the device 80 may include:
  • the first display unit 81 is used to display each level of control of the source interface to the user, and obtain the control selected by the user from each level of control;
  • the second display unit 82 is configured to display a virtual screen generated based on the screen information of the second electronic device to the user, and display the control selected by the user on the virtual screen, and the control displayed on the virtual screen can be edited by the user;
  • the sending unit 83 is configured to send the interface layout information of the virtual screen to the second electronic device after detecting the user's confirmation operation on the interface layout on the virtual screen.
  • the interface layout information includes: the first control identifier and the layout information of the first control placed on the virtual screen, so that the second electronic device displays the projection interface of the source interface according to the interface layout information.
  • the first display unit 81 may be specifically used for:
  • extract the first data from the interface data of the source interface; the first data records the control identifiers of the controls in the active interface, the hierarchical relationship between the controls, and the control drawing instructions of the controls;
  • the first display unit 81 may be specifically used for:
  • the view tree is extracted from the interface data of the source interface, the view tree records the control identifiers of the controls in the active interface and the hierarchical relationship between the controls, and the control drawing instructions corresponding to the control identifiers recorded in the view tree are extracted from the interface data.
  • the first display unit 81 may be specifically used for:
  • using the control drawing instructions of the controls, draw each level of control of the source interface according to the hierarchical relationship between the controls, and show the drawn controls of each level to the user; and/or,
  • the screen of the device is divided into a first display area and a second display area;
  • the first display unit 81 may be specifically used to: display various levels of controls of the source interface to the user in the first display area;
  • the second display unit 82 may be specifically used for: the first electronic device displays a virtual screen to the user in the second display area.
  • the sending unit 83 may be further configured to mark the first control identifier in the interface data sent to the source interface of the second electronic device according to the control identifier of the control placed on the virtual screen.
  • the sending unit 83 may be specifically configured to carry the acquired interface layout information in the interface data of the source interface and send it to the second electronic device.
  • FIG. 9 is a schematic structural diagram of another embodiment of a screen projection device according to the present application. As shown in FIG. 9, the device 90 may include:
  • the receiving unit 91 is configured to: receive the interface layout information sent by the first electronic device, where the interface layout information includes a first control identifier and layout information, and is the interface layout information of the virtual screen when the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen; and receive the interface data of the source interface sent by the first electronic device;
  • the display unit 92 is configured to display the projection interface of the source interface according to the interface data and the interface layout information.
  • the display unit 92 may be specifically configured to: obtain the first control identifier and the layout information corresponding to the first control identifier from the interface layout information; obtain, from the interface data and according to the acquired first control identifier, the control drawing instruction corresponding to the first control identifier; and, according to the layout information corresponding to the first control identifier, use the control drawing instruction corresponding to the first control identifier to draw the control corresponding to the first control identifier.
  • the first control identifier in the interface data has a mark
  • the display unit 92 may be specifically used to: obtain the first control identifier and the layout information corresponding to the first control identifier from the interface layout information; obtain from the interface data the control drawing instruction corresponding to the marked first control identifier; and, according to the layout information corresponding to the first control identifier, draw the control corresponding to the first control identifier using the corresponding control drawing instruction.
  • the above units may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (Application Specific Integrated Circuit; hereinafter referred to as ASIC), or one or more digital signal processors (Digital Signal Processor; hereinafter referred to as DSP), or one or more field programmable gate arrays (Field Programmable Gate Array; hereinafter referred to as FPGA), etc.
  • these units can be integrated together and implemented in the form of a System-On-a-Chip (hereinafter referred to as SOC).
  • FIG. 10 is a schematic structural diagram of an embodiment of an electronic device of this application. As shown in FIG. 10, the above-mentioned electronic device may include: a display screen; one or more processors; a memory; and one or more computer programs.
  • the above-mentioned electronic equipment can be a mobile terminal (mobile phone), a smart screen, a drone, an intelligent connected vehicle (Intelligent Connected Vehicle; hereinafter referred to as ICV), a smart/intelligent car, in-vehicle equipment, or other such equipment.
  • the above-mentioned one or more computer programs are stored in the above-mentioned memory, and the above-mentioned one or more computer programs include instructions.
  • the above-mentioned instructions are executed by the above-mentioned device, the above-mentioned device executes the methods shown in FIGS. 3-7.
  • the foregoing electronic device may be the first electronic device described in the embodiments of the application, and when the foregoing instruction is executed by the first electronic device, the first electronic device is caused to perform the following steps:
  • the controls displayed on the virtual screen can be edited by the user;
  • the interface layout information includes: the first control identifier and the layout information of the first control placed on the virtual screen, so that the second electronic device displays the projection interface of the source interface according to the interface layout information.
  • the step of showing the user each level of control of the source interface includes:
  • the first data records the control identifiers of the controls in the active interface, the hierarchical relationship between the controls, and the control drawing instructions of the controls;
  • the step of extracting the first data from the interface data of the source interface includes:
  • the view tree is extracted from the interface data of the source interface, and the view tree records the control identifiers of the controls in the active interface and the hierarchical relationship between the controls; the control drawing instructions corresponding to the control identifiers recorded in the view tree are extracted from the interface data.
  • the step of showing the user each level of control of the source interface according to the first data includes:
  • using the control drawing instructions of the controls, draw each level of control of the source interface according to the hierarchical relationship between the controls, and show the drawn controls of each level to the user; and/or,
  • the screen of the first electronic device is divided into a first display area and a second display area;
  • the steps of showing the user each level of control of the source interface include:
  • the various levels of controls of the source interface are displayed to the user;
  • the step of showing the user the virtual screen generated based on the screen information of the second electronic device includes:
  • the virtual screen is shown to the user in the second display area.
  • the first electronic device when the above instruction is executed by the first electronic device, the first electronic device further executes the following steps:
  • according to the first control identifier in the interface layout information, the first control identifier in the first data of the interface data of the source interface sent to the second electronic device is marked.
  • the step of sending the acquired control identifier and layout information to the second electronic device includes:
  • the obtained interface layout information is carried in the interface data of the source interface and sent to the second electronic device.
  • the above electronic device may be the second electronic device described in the embodiment of the application, and when the above instruction is executed by the second electronic device, the second electronic device is caused to perform the following steps:
  • the interface layout information includes: a first control identifier and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen;
  • the projection interface of the source interface is displayed according to the interface layout information.
  • the step of displaying, according to the interface data, the screen projection interface of the source interface according to the first control identifier and layout information includes:
  • the control corresponding to the first control identifier is drawn using the control drawing instruction corresponding to the first control identifier.
  • the first control identifier in the interface data has a mark; when the above instruction is executed by the second electronic device, the step of displaying, according to the interface data, the screen projection interface of the source interface according to the first control identifier and layout information includes:
  • the control corresponding to the first control identifier is drawn using the control drawing instruction corresponding to the first control identifier.
  • the electronic device shown in FIG. 10 may be a terminal device or a circuit device built in the aforementioned terminal device.
  • the device can be used as the above-mentioned first electronic device or second electronic device to perform the functions/steps in the methods provided in the embodiments shown in FIG. 3 to FIG. 7 of this application.
  • the electronic device 1000 may include a processor 1010, an external memory interface 1020, an internal memory 1021, a universal serial bus (USB) interface 1030, a charging management module 1040, a power management module 1041, a battery 1042, an antenna 1, an antenna 2, a mobile communication module 1050, a wireless communication module 1060, an audio module 1070, a speaker 1070A, a receiver 1070B, a microphone 1070C, an earphone jack 1070D, a sensor module 1080, buttons 1090, a motor 1091, an indicator 1092, a camera 1093, a display 1094, a subscriber identification module (SIM) card interface 1095, etc.
  • the sensor module 1080 can include a pressure sensor 1080A, a gyroscope sensor 1080B, an air pressure sensor 1080C, a magnetic sensor 1080D, an acceleration sensor 1080E, a distance sensor 1080F, a proximity light sensor 1080G, a fingerprint sensor 1080H, a temperature sensor 1080J, a touch sensor 1080K, an ambient light sensor 1080L, a bone conduction sensor 1080M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 1000.
  • the electronic device 1000 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1010 may include one or more processing units.
  • the processor 1010 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 1010 to store instructions and data.
  • the memory in the processor 1010 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 1010. If the processor 1010 needs to use the instruction or data again, it can be directly called from the memory. Repeated access is avoided, the waiting time of the processor 1010 is reduced, and the efficiency of the system is improved.
  • the processor 1010 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 1010 may include multiple sets of I2C buses.
  • the processor 1010 may be coupled to the touch sensor 1080K, charger, flash, camera 1093, etc., respectively through different I2C bus interfaces.
  • the processor 1010 may couple the touch sensor 1080K through an I2C interface, so that the processor 1010 and the touch sensor 1080K communicate through the I2C bus interface to realize the touch function of the electronic device 1000.
  • the I2S interface can be used for audio communication.
  • the processor 1010 may include multiple sets of I2S buses.
  • the processor 1010 may be coupled with the audio module 1070 through an I2S bus to implement communication between the processor 1010 and the audio module 1070.
  • the audio module 1070 can transmit audio signals to the wireless communication module 1060 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 1070 and the wireless communication module 1060 may be coupled through a PCM bus interface.
  • the audio module 1070 may also transmit audio signals to the wireless communication module 1060 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
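As a rough illustration of the sampling, quantization, and encoding a PCM interface performs on an analog signal, the sketch below converts a sine tone into 8-bit PCM codes. The sample rate and bit depth are assumed example values, not parameters from this patent:

```python
import math

SAMPLE_RATE = 8000       # samples per second (telephony-style rate, assumed)
BITS = 8
LEVELS = 2 ** BITS       # 256 quantization levels

def sample_and_quantize(freq_hz, n_samples):
    """Sample an analog sine tone and quantize each sample to an 8-bit code."""
    codes = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE
        x = math.sin(2 * math.pi * freq_hz * t)   # analog value in [-1, 1]
        code = round((x + 1) / 2 * (LEVELS - 1))  # map to the 0..255 code range
        codes.append(code)
    return codes

pcm = sample_and_quantize(440, 16)   # 16 samples of a 440 Hz tone
```

Each code is one quantized sample; a real PCM interface would then clock these codes out over the bus.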
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 1010 and the wireless communication module 1060.
  • the processor 1010 communicates with the Bluetooth module in the wireless communication module 1060 through the UART interface to realize the Bluetooth function.
  • the audio module 1070 may transmit audio signals to the wireless communication module 1060 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
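The serial/parallel conversion a UART performs can be sketched as follows. The framing chosen here (one start bit, eight data bits sent LSB-first, one stop bit, i.e. a common 8N1 configuration) is an illustrative assumption, not a configuration stated in this patent:

```python
def uart_frame(byte):
    """Parallel-to-serial: turn one byte into a framed bit stream."""
    assert 0 <= byte <= 0xFF
    data_bits = [(byte >> i) & 1 for i in range(8)]   # LSB first
    return [0] + data_bits + [1]                      # start bit 0, stop bit 1

def uart_unframe(bits):
    """Serial-to-parallel: recover the byte from a framed bit stream."""
    assert bits[0] == 0 and bits[-1] == 1             # check start/stop framing
    return sum(b << i for i, b in enumerate(bits[1:9]))

frame = uart_frame(0x41)            # the byte 'A' as a 10-bit serial frame
assert uart_unframe(frame) == 0x41  # round-trips back to the original byte
```

The sender serializes with `uart_frame` and the receiver reassembles with `uart_unframe`, which is the two-way conversion between serial and parallel data the text describes.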
  • the MIPI interface can be used to connect the processor 1010 with the display screen 1094, the camera 1093 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 1010 and the camera 1093 communicate through a CSI interface to implement the shooting function of the electronic device 1000.
  • the processor 1010 and the display screen 1094 communicate through a DSI interface to realize the display function of the electronic device 1000.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 1010 with the camera 1093, the display screen 1094, the wireless communication module 1060, the audio module 1070, the sensor module 1080, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 1030 is an interface that complies with the USB standard specifications, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 1030 can be used to connect a charger to charge the electronic device 1000, and can also be used to transfer data between the electronic device 1000 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely illustrative, and does not constitute a structural limitation of the electronic device 1000.
  • the electronic device 1000 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 1040 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 1040 may receive the charging input of the wired charger through the USB interface 1030.
  • the charging management module 1040 may receive the wireless charging input through the wireless charging coil of the electronic device 1000. While the charging management module 1040 charges the battery 1042, it can also supply power to the electronic device through the power management module 1041.
  • the power management module 1041 is used to connect the battery 1042, the charging management module 1040 and the processor 1010.
  • the power management module 1041 receives input from the battery 1042 and/or the charging management module 1040, and supplies power to the processor 1010, internal memory 1021, display screen 1094, camera 1093, and wireless communication module 1060.
  • the power management module 1041 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 1041 may also be provided in the processor 1010.
  • the power management module 1041 and the charging management module 1040 may also be provided in the same device.
  • the wireless communication function of the electronic device 1000 can be realized by the antenna 1, the antenna 2, the mobile communication module 1050, the wireless communication module 1060, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 1000 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 1050 can provide wireless communication solutions including 2G/3G/4G/5G and the like applied to the electronic device 1000.
  • the mobile communication module 1050 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 1050 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 1050 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 1050 may be provided in the processor 1010.
  • at least part of the functional modules of the mobile communication module 1050 and at least part of the modules of the processor 1010 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 1070A, a receiver 1070B, etc.), or displays an image or video through the display screen 1094.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 1010 and be provided in the same device as the mobile communication module 1050 or other functional modules.
  • the wireless communication module 1060 can provide wireless communication solutions applied to the electronic device 1000, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and so on.
  • the wireless communication module 1060 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1060 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1010.
  • the wireless communication module 1060 can also receive the signal to be sent from the processor 1010, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the electronic device 1000 is coupled with the mobile communication module 1050, and the antenna 2 is coupled with the wireless communication module 1060, so that the electronic device 1000 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the electronic device 1000 implements a display function through a GPU, a display screen 1094, and an application processor.
  • the GPU is a microprocessor for image processing, which is connected to the display screen 1094 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 1010 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 1094 is used to display images, videos, and so on.
  • the display screen 1094 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 1000 may include one or N display screens 1094, and N is a positive integer greater than one.
  • the electronic device 1000 can realize a shooting function through an ISP, a camera 1093, a video codec, a GPU, a display 1094, and an application processor.
  • the ISP is used to process the data fed back by the camera 1093. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 1093.
  • the camera 1093 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
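The format conversion mentioned above can be illustrated with a full-range BT.601 RGB-to-YUV conversion. The exact formulas, coefficients, and value ranges a given DSP uses are device-specific, so this is only a sketch of the kind of transform involved:

```python
def rgb_to_yuv(r, g, b):
    """Convert one full-range RGB pixel (0..255) to YUV using BT.601 weights."""
    y = 0.299 * r + 0.587 * g + 0.114 * b            # luma
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128       # blue-difference chroma
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128        # red-difference chroma
    return round(y), round(u), round(v)

print(rgb_to_yuv(255, 255, 255))  # white -> (255, 128, 128): full luma, neutral chroma
```

Neutral grays map to chroma values of 128, which is why YUV is convenient for video codecs: the luma channel carries most of the visible detail.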
  • the electronic device 1000 may include 1 or N cameras 1093, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 1000 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
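The "Fourier transform on the energy of a frequency point" can be illustrated with a single-bin discrete Fourier transform, which measures the energy of one frequency in a sampled signal. The signal length and bin indices below are arbitrary example values:

```python
import math

def bin_energy(samples, k):
    """Energy of DFT bin k of the signal: |X[k]|^2 computed directly."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return re * re + im * im

N = 64
# A pure sine at 5 cycles per window puts all its energy into bin 5.
tone = [math.sin(2 * math.pi * 5 * i / N) for i in range(N)]
assert bin_energy(tone, 5) > 100 * bin_energy(tone, 3)
```

For a unit-amplitude sine, the matching bin's magnitude is N/2 (here 32, so energy 1024), while non-matching bins are essentially zero; this is how a DSP can decide whether a given frequency point carries energy.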
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 1000 may support one or more video codecs. In this way, the electronic device 1000 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • through the NPU, applications such as intelligent cognition of the electronic device 1000 can be implemented, for example, image recognition, face recognition, speech recognition, text understanding, and so on.
  • the external memory interface 1020 may be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 1000.
  • the external memory card communicates with the processor 1010 through the external memory interface 1020 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 1021 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 1021 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, an application program (such as a sound playback function, an image playback function, etc.) required by at least one function, and the like.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 1000.
  • the internal memory 1021 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the processor 1010 executes various functional applications and data processing of the electronic device 1000 by running instructions stored in the internal memory 1021 and/or instructions stored in a memory provided in the processor.
  • the electronic device 1000 can implement audio functions through an audio module 1070, a speaker 1070A, a receiver 1070B, a microphone 1070C, a headphone interface 1070D, and an application processor. For example, music playback, recording, etc.
  • the audio module 1070 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 1070 can also be used to encode and decode audio signals.
  • the audio module 1070 may be provided in the processor 1010, or part of the functional modules of the audio module 1070 may be provided in the processor 1010.
  • the speaker 1070A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 1000 can listen to music through the speaker 1070A, or listen to a hands-free call.
  • the receiver 1070B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 1000 answers a call or a voice message, the receiver 1070B can be brought close to the human ear to receive the voice.
  • the microphone 1070C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 1070C to input a sound signal into the microphone 1070C.
  • the electronic device 1000 may be provided with at least one microphone 1070C. In other embodiments, the electronic device 1000 may be provided with two microphones 1070C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the electronic device 1000 may also be provided with three, four or more microphones 1070C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 1070D is used to connect wired earphones.
  • the earphone interface 1070D may be a USB interface 1030, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 1080A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 1080A may be provided on the display screen 1094.
  • the capacitive pressure sensor may include at least two parallel plates with conductive materials.
  • the electronic device 1000 may also calculate the touched position according to the detection signal of the pressure sensor 1080A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example: when a touch operation whose intensity of the touch operation is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
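The threshold-based dispatch described above can be sketched as follows. The threshold value and instruction names are illustrative assumptions; the patent only specifies that pressure relative to a first threshold selects between viewing and creating a short message:

```python
FIRST_PRESSURE_THRESHOLD = 0.5   # assumed normalized pressure threshold

def on_message_icon_touch(pressure):
    """Same touch position, different instruction depending on touch pressure."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"        # light press: view the message
    return "create_new_short_message"      # firm press: create a new message

assert on_message_icon_touch(0.2) == "view_short_message"
assert on_message_icon_touch(0.8) == "create_new_short_message"
```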
  • the gyro sensor 1080B may be used to determine the movement posture of the electronic device 1000.
  • in some embodiments, the angular velocity of the electronic device 1000 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 1080B.
  • the gyro sensor 1080B can be used for shooting anti-shake.
  • the gyro sensor 1080B detects the jitter angle of the electronic device 1000, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the jitter of the electronic device 1000 through reverse movement to achieve anti-shake.
  • the gyroscope sensor 1080B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 1080C is used to measure air pressure.
  • the electronic device 1000 calculates the altitude based on the air pressure value measured by the air pressure sensor 1080C to assist positioning and navigation.
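A common way to compute altitude from a barometric reading is the standard barometric formula, sketched below with an assumed sea-level reference pressure; the patent does not specify the exact formula the device uses:

```python
SEA_LEVEL_HPA = 1013.25   # assumed standard sea-level reference pressure

def altitude_m(pressure_hpa):
    """Estimate altitude in meters from air pressure via the barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / SEA_LEVEL_HPA) ** (1.0 / 5.255))

print(altitude_m(1000.0))  # roughly 111 m above sea level at 1000 hPa
```

The result can then be fused with GNSS data to assist positioning and navigation, as the text describes.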
  • the magnetic sensor 1080D includes a Hall sensor.
  • the electronic device 1000 can use the magnetic sensor 1080D to detect the opening and closing of the flip holster.
  • based on the detected opening or closing state of the leather case or the flip, features such as automatic unlocking upon opening the flip cover can be set.
  • the acceleration sensor 1080E can detect the magnitude of the acceleration of the electronic device 1000 in various directions (generally three axes). When the electronic device 1000 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and apply to applications such as horizontal and vertical screen switching, pedometers, and so on.
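Horizontal/vertical screen switching from accelerometer data can be sketched as a comparison of gravity's projection on the device axes. This is a simplified illustration, not the device's actual posture-recognition algorithm:

```python
def orientation(ax, ay, az):
    """Pick portrait vs. landscape from a 3-axis accelerometer reading (m/s^2).

    When the device is upright, gravity projects mostly onto the y axis;
    when it is turned sideways, mostly onto the x axis. The z axis is
    ignored in this simplified two-way decision.
    """
    if abs(ax) > abs(ay):
        return "landscape"
    return "portrait"

assert orientation(0.1, 9.7, 0.5) == "portrait"    # gravity mostly along y
assert orientation(9.7, 0.1, 0.5) == "landscape"   # gravity mostly along x
```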
  • the distance sensor 1080F is used to measure distance. The electronic device 1000 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 1000 may use the distance sensor 1080F to measure the distance to achieve fast focusing.
  • the proximity light sensor 1080G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 1000 emits infrared light to the outside through the light emitting diode.
  • the electronic device 1000 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 1000. When insufficient reflected light is detected, the electronic device 1000 can determine that there is no object near the electronic device 1000.
  • the electronic device 1000 can use the proximity light sensor 1080G to detect that the user holds the electronic device 1000 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 1080G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
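The reflected-light decision rule described above can be sketched as a simple threshold test. The threshold value is an assumed placeholder, since real thresholds are calibrated per device:

```python
REFLECTION_THRESHOLD = 50   # assumed ADC counts from the photodiode

def object_nearby(reflected_light):
    """Sufficient reflected infrared light implies an object near the device."""
    return reflected_light >= REFLECTION_THRESHOLD

assert object_nearby(200)     # strong reflection: e.g. the phone is at the ear
assert not object_nearby(5)   # weak reflection: nothing nearby
```

When `object_nearby` is true during a call, the device can turn off the screen to save power and prevent accidental touches, per the text above.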
  • the ambient light sensor 1080L is used to perceive the brightness of the ambient light.
  • the electronic device 1000 can adaptively adjust the brightness of the display screen 1094 according to the perceived brightness of the ambient light.
  • the ambient light sensor 1080L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 1080L can also cooperate with the proximity light sensor 1080G to detect whether the electronic device 1000 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 1080H is used to collect fingerprints.
  • the electronic device 1000 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 1080J is used to detect temperature.
  • the electronic device 1000 uses the temperature detected by the temperature sensor 1080J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 1080J exceeds a threshold value, the electronic device 1000 executes to reduce the performance of the processor located near the temperature sensor 1080J, so as to reduce power consumption and implement thermal protection.
  • in some other embodiments, when the temperature is lower than another threshold, the electronic device 1000 heats the battery 1042 to avoid abnormal shutdown of the electronic device 1000 due to low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the electronic device 1000 boosts the output voltage of the battery 1042 to avoid abnormal shutdown caused by low temperature.
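The temperature processing strategy described in the preceding bullets can be sketched as follows. The threshold values are assumed placeholders, since the actual thresholds are device-specific:

```python
HIGH_T, LOW_T, VERY_LOW_T = 45.0, 0.0, -10.0   # degrees Celsius, assumed

def thermal_action(temp_c):
    """Map a reported temperature to the strategy described in the text."""
    if temp_c > HIGH_T:
        return "reduce_processor_performance"   # cut power, thermal protection
    if temp_c < VERY_LOW_T:
        return "boost_battery_output_voltage"   # avoid low-temperature shutdown
    if temp_c < LOW_T:
        return "heat_battery"                   # avoid abnormal shutdown
    return "normal"

assert thermal_action(50.0) == "reduce_processor_performance"
assert thermal_action(-5.0) == "heat_battery"
assert thermal_action(-20.0) == "boost_battery_output_voltage"
```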
  • the touch sensor 1080K is also called a "touch panel".
  • the touch sensor 1080K can be disposed on the display screen 1094; the touch sensor 1080K and the display screen 1094 together form a touch screen, also called a "touchscreen".
  • the touch sensor 1080K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 1094.
  • the touch sensor 1080K may also be disposed on the surface of the electronic device 1000, which is different from the position of the display screen 1094.
  • the bone conduction sensor 1080M can acquire vibration signals.
  • the bone conduction sensor 1080M can obtain the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 1080M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 1080M may also be provided in the earphone, combined with the bone conduction earphone.
  • the audio module 1070 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 1080M, and realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 1080M, and realize the heart rate detection function.
  • the button 1090 includes a power-on button, a volume button, and so on.
  • the button 1090 may be a mechanical button. It can also be a touch button.
  • the electronic device 1000 may receive key input, and generate key signal input related to user settings and function control of the electronic device 1000.
  • the motor 1091 can generate vibration prompts.
  • the motor 1091 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 1094 can also correspond to different vibration feedback effects of the motor 1091.
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 1092 can be an indicator light, which can be used to indicate the charging status, power change, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 1095 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 1095 or pulled out from the SIM card interface 1095 to achieve contact and separation with the electronic device 1000.
  • the electronic device 1000 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 1095 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 1095 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 1095 can also be compatible with different types of SIM cards.
  • the SIM card interface 1095 can also be compatible with external memory cards.
  • the electronic device 1000 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 1000 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 1000 and cannot be separated from the electronic device 1000.
  • the electronic device 1000 shown in FIG. 10 can implement various processes of the methods provided in the embodiments shown in FIGS. 3 to 7 of this application.
  • the operation and/or function of each module in the electronic device 1000 are respectively for implementing the corresponding process in the foregoing method embodiment.
  • the processor 1010 in the electronic device 1000 shown in FIG. 10 may be a system on chip (SoC); the processor 1010 may include a central processing unit (CPU), and may further include other types of processors, for example, a graphics processing unit (GPU), etc.
  • each part of the processors or processing units inside the processor 1010 can cooperate to implement the previous method flow, and the corresponding software programs of each part of the processors or processing units can be stored in the internal memory 1021.
  • the device includes a storage medium and a central processing unit.
  • the storage medium may be a non-volatile storage medium.
  • a computer executable program is stored in the storage medium.
  • the central processing unit is connected to the non-volatile storage medium and executes the computer executable program to implement the method provided by the embodiments shown in FIG. 3 to FIG. 7 of this application.
  • the processors involved may include, for example, a CPU, a DSP, a microcontroller, or a digital signal processor, and may further include a GPU, an embedded neural-network processing unit (NPU), and an image signal processor (ISP); the processors may also include necessary hardware accelerators or logic processing hardware circuits, such as an application-specific integrated circuit (ASIC), or one or more integrated circuits used to control the execution of the technical solutions of this application, etc.
  • the processor may have a function of operating one or more software programs, and the software programs may be stored in a storage medium.
  • an embodiment of the present application also provides a computer-readable storage medium that stores a computer program which, when running on a computer, causes the computer to execute the method provided by the embodiments shown in FIGS. 3 to 7 of the present application.
  • the embodiments of the present application also provide a computer program product.
  • the computer program product includes a computer program that, when running on a computer, causes the computer to execute the method provided by the embodiments shown in FIGS. 3 to 7 of the present application.
  • "at least one" refers to one or more.
  • "multiple" refers to two or more.
  • And/or describes the association relationship of the associated objects, indicating that there can be three types of relationships, for example, A and/or B, which can mean that A exists alone, A and B exist at the same time, and B exists alone. Among them, A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship.
  • "at least one of the following items" and similar expressions refer to any combination of these items, including any combination of a single item or plural items.
  • for example, at least one of a, b, and c can represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where each of a, b, and c can be singular or plural.
  • if any function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A screen projection method and apparatus, and an electronic device. In the screen projection method, a first electronic device displays the various levels of controls of a source interface to a user and obtains the control the user selects from among those levels of controls; the first electronic device displays to the user a virtual screen generated on the basis of screen information of a second electronic device and displays the selected control on the virtual screen, where the control displayed on the virtual screen can be edited by the user; and, after detecting a user operation confirming the interface layout on the virtual screen, the first electronic device sends the interface layout information of the virtual screen to the second electronic device, so that the second electronic device displays a screen projection interface of the source interface according to the control identifiers and the layout information, thereby improving the screen projection effect and the user experience.
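The handoff the abstract describes, where the first device sends control identifiers plus the edited virtual-screen layout and the second device rebuilds the projection interface from them, can be sketched as a simple message exchange. This is a hypothetical illustration, not the patented implementation: the function names, the JSON field names, and the flat id/rect representation are all assumptions, and a real screen-projection stack would carry much richer control state.

```python
import json

def build_layout_message(selected_controls):
    """First device: serialize the user's confirmed virtual-screen layout.
    Each entry pairs a control identifier from the source interface with
    its edited position and size on the virtual screen."""
    return json.dumps({
        "type": "layout_confirm",
        "controls": [
            {
                "id": c["id"],                # identifies the control in the source interface
                "x": c["x"], "y": c["y"],     # top-left position on the virtual screen
                "width": c["width"],
                "height": c["height"],
            }
            for c in selected_controls
        ],
    })

def apply_layout_message(message, source_controls):
    """Second device: look up each control's content by its identifier and
    place it according to the received interface layout information."""
    layout = json.loads(message)
    screen = {}
    for entry in layout["controls"]:
        content = source_controls[entry["id"]]   # fetch the control's content by id
        screen[entry["id"]] = {
            "content": content,
            "rect": (entry["x"], entry["y"], entry["width"], entry["height"]),
        }
    return screen
```

For example, projecting a single hypothetical control: `apply_layout_message(build_layout_message([{"id": "play_btn", "x": 0, "y": 0, "width": 120, "height": 48}]), {"play_btn": "Play"})` places the "Play" control at the confirmed rectangle on the second device's screen.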
PCT/CN2021/082506 2020-04-23 2021-03-24 Screen projection method and apparatus, and electronic device WO2021213120A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010328653.8A CN111443884A (zh) 2020-04-23 2020-04-23 Screen projection method, apparatus and electronic device
CN202010328653.8 2020-04-23

Publications (1)

Publication Number Publication Date
WO2021213120A1 true WO2021213120A1 (fr) 2021-10-28

Family

ID=71654354

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/082506 WO2021213120A1 (fr) 2020-04-23 2021-03-24 Screen projection method and apparatus, and electronic device

Country Status (2)

Country Link
CN (1) CN111443884A (fr)
WO (1) WO2021213120A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115599335A (zh) * 2022-12-13 2023-01-13 Jiaying Technology Co., Ltd. (CN) Method and system for sharing layout-format files in multi-screen mode
CN116089256A (zh) * 2022-05-13 2023-05-09 Honor Device Co., Ltd. Terminal testing method and apparatus, and storage medium
CN117156189A (zh) * 2023-02-27 2023-12-01 Honor Device Co., Ltd. Screen projection display method and electronic device
CN117707715A (zh) * 2023-05-25 2024-03-15 Honor Device Co., Ltd. Application management method and electronic device

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111443884A (zh) * 2020-04-23 2020-07-24 Huawei Technologies Co., Ltd. Screen projection method, apparatus and electronic device
CN114363678A (zh) * 2020-09-29 2022-04-15 Huawei Technologies Co., Ltd. Screen projection method and device
CN112000410B (zh) * 2020-08-17 2024-03-19 Nubia Technology Co., Ltd. Screen projection control method, device and computer-readable storage medium
CN114816294A (zh) 2020-09-02 2022-07-29 Huawei Technologies Co., Ltd. Display method and device
CN112286477B (zh) * 2020-11-16 2023-12-08 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Screen projection display method and related products
CN114816158A (zh) * 2021-01-11 2022-07-29 Huawei Technologies Co., Ltd. Interface control method and apparatus, electronic device and readable storage medium
CN113438526A (zh) * 2021-06-25 2021-09-24 Vivo Mobile Communication Co., Ltd. Screen content sharing method, display method, apparatus, device and storage medium
CN115686401A (zh) * 2021-07-28 2023-02-03 Huawei Technologies Co., Ltd. Screen projection method, electronic device and system
CN113778360B (zh) * 2021-08-20 2022-07-22 Honor Device Co., Ltd. Screen projection method and electronic device
CN116887005B (zh) * 2021-08-27 2024-05-03 Honor Device Co., Ltd. Screen projection method, electronic device and computer-readable storage medium
CN115016702B (zh) * 2021-09-10 2023-10-27 Honor Device Co., Ltd. Control method and system for selecting an application's display screen in extended screen mode
CN113805827B (zh) * 2021-09-14 2024-05-07 Beijing Baidu Netcom Science and Technology Co., Ltd. Screen projection display method and apparatus, electronic device and storage medium
CN115914700A (zh) * 2021-09-30 2023-04-04 Shanghai Qinggan Intelligent Technology Co., Ltd. Screen projection processing method, system, electronic device and storage medium
WO2023103948A1 (fr) * 2021-12-08 2023-06-15 Huawei Technologies Co., Ltd. Display method and electronic device
CN114579231A (zh) * 2022-02-15 2022-06-03 Beijing Youku Technology Co., Ltd. Page display method and apparatus, and electronic device
CN114610434A (zh) * 2022-03-28 2022-06-10 Lenovo (Beijing) Co., Ltd. Output control method and electronic device
CN114884990A (zh) * 2022-05-06 2022-08-09 Ecarx (Hubei) Tech Co., Ltd. Virtual-screen-based screen projection method and device
CN115209213B (zh) * 2022-08-23 2023-01-20 Honor Device Co., Ltd. Wireless screen projection method and mobile device
CN117850715A (zh) * 2022-09-30 2024-04-09 Huawei Technologies Co., Ltd. Screen projection display method, electronic device and system
CN115562539A (zh) * 2022-11-09 2023-01-03 Vivo Mobile Communication Co., Ltd. Control display method and apparatus, electronic device and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105915978A (zh) * 2015-12-14 2016-08-31 Le Shi Zhi Xin Electronic Technology (Tianjin) Co., Ltd. Vehicle-mounted display control method and apparatus
CN107273083A (zh) * 2017-06-30 2017-10-20 Baidu Online Network Technology (Beijing) Co., Ltd. Interaction method, apparatus, device and storage medium between terminal devices
CN110381195A (zh) * 2019-06-05 2019-10-25 Huawei Technologies Co., Ltd. Screen projection display method and electronic device
CN110688042A (zh) * 2019-09-29 2020-01-14 Baidu Online Network Technology (Beijing) Co., Ltd. Interface display method and apparatus
CN111443884A (zh) * 2020-04-23 2020-07-24 Huawei Technologies Co., Ltd. Screen projection method, apparatus and electronic device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103176797B (zh) * 2013-02-21 2015-12-09 Yonyou Network Technology Co., Ltd. Interface layout apparatus and interface layout method
CN103645906B (zh) * 2013-12-25 2018-04-10 Shanghai Feixun Data Communication Technology Co., Ltd. Method and system for re-laying out an interface based on a fixed interface layout file
CN109753315A (zh) * 2018-11-22 2019-05-14 Guangzhou Xiaoji Kuaipao Network Technology Co., Ltd. Method for implementing interactive content editing on a smart device, and storage medium
CN110554816B (zh) * 2019-07-25 2024-05-07 Huawei Technologies Co., Ltd. Interface generation method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105915978A (zh) * 2015-12-14 2016-08-31 Le Shi Zhi Xin Electronic Technology (Tianjin) Co., Ltd. Vehicle-mounted display control method and apparatus
CN107273083A (zh) * 2017-06-30 2017-10-20 Baidu Online Network Technology (Beijing) Co., Ltd. Interaction method, apparatus, device and storage medium between terminal devices
CN110381195A (zh) * 2019-06-05 2019-10-25 Huawei Technologies Co., Ltd. Screen projection display method and electronic device
CN110688042A (zh) * 2019-09-29 2020-01-14 Baidu Online Network Technology (Beijing) Co., Ltd. Interface display method and apparatus
CN111443884A (zh) * 2020-04-23 2020-07-24 Huawei Technologies Co., Ltd. Screen projection method, apparatus and electronic device

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116089256A (zh) * 2022-05-13 2023-05-09 Honor Device Co., Ltd. Terminal testing method and apparatus, and storage medium
CN116089256B (zh) * 2022-05-13 2024-03-12 Honor Device Co., Ltd. Terminal testing method and apparatus, and storage medium
CN115599335A (zh) * 2022-12-13 2023-01-13 Jiaying Technology Co., Ltd. (CN) Method and system for sharing layout-format files in multi-screen mode
CN115599335B (zh) * 2022-12-13 2023-08-22 Jiaying Technology Co., Ltd. Method and system for sharing layout-format files in multi-screen mode
CN117156189A (zh) * 2023-02-27 2023-12-01 Honor Device Co., Ltd. Screen projection display method and electronic device
CN117707715A (zh) * 2023-05-25 2024-03-15 Honor Device Co., Ltd. Application management method and electronic device

Also Published As

Publication number Publication date
CN111443884A (zh) 2020-07-24

Similar Documents

Publication Publication Date Title
WO2021213120A1 (fr) Screen projection method and apparatus, and electronic device
WO2020259452A1 (fr) Full-screen display method for a mobile terminal, and apparatus
WO2021017889A1 (fr) Video call display method applied to an electronic device, and related apparatus
WO2020168965A1 (fr) Method for controlling an electronic device with a folding screen, and electronic device
WO2021213164A1 (fr) Method for interaction between application interfaces, electronic device, and computer-readable storage medium
WO2021000807A1 (fr) Processing method and apparatus for a waiting scenario in an application
WO2021052214A1 (fr) Hand-gesture interaction method and apparatus, and terminal device
CN112399390B (zh) Bluetooth reconnection method and related apparatus
WO2021036585A1 (fr) Flexible-screen display method and electronic device
WO2021036770A1 (fr) Split-screen processing method and terminal device
WO2020029306A1 (fr) Image capture method and electronic device
CN113885759A (zh) Notification message processing method, device, system and computer-readable storage medium
WO2021208723A1 (fr) Full-screen display method and apparatus, and electronic device
WO2021180089A1 (fr) Interface switching method and apparatus, and electronic device
WO2020118490A1 (fr) Automatic screen-splitting method, graphical user interface, and electronic device
WO2022001258A1 (fr) Multi-screen display method and apparatus, terminal device, and storage medium
WO2020056684A1 (fr) Method and device using multiple TWS earphones connected in relay mode to achieve automatic interpretation
WO2021057626A1 (fr) Image processing method, apparatus, device, and computer storage medium
WO2022022319A1 (fr) Image processing method and system, electronic device, and chip system
CN114115770A (zh) Display control method and related apparatus
WO2023241209A9 (fr) Desktop wallpaper configuration method and apparatus, electronic device, and readable storage medium
WO2020062304A1 (fr) File transmission method and electronic device
WO2021037034A1 (fr) Application state switching method and terminal device
CN112532508B (zh) Video communication method and video communication apparatus
WO2023029916A1 (fr) Annotation display method and apparatus, terminal device, and readable storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21792808

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 21792808

Country of ref document: EP

Kind code of ref document: A1