CN111443884A - Screen projection method and device and electronic equipment - Google Patents

Screen projection method and device and electronic equipment

Info

Publication number
CN111443884A
Authority
CN
China
Prior art keywords
interface
control
electronic device
layout information
user
Prior art date
Legal status
Pending
Application number
CN202010328653.8A
Other languages
Chinese (zh)
Inventor
王勇
王欢
李�杰
李英浩
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010328653.8A priority Critical patent/CN111443884A/en
Publication of CN111443884A publication Critical patent/CN111443884A/en
Priority to PCT/CN2021/082506 priority patent/WO2021213120A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a screen projection method, a screen projection apparatus, and an electronic device. In the screen projection method, a first electronic device presents the hierarchical controls of a source interface to a user and obtains the controls the user selects from them. The first electronic device then presents to the user a virtual screen generated based on the screen information of a second electronic device, and displays the selected controls on the virtual screen, where the controls displayed on the virtual screen can be edited by the user. After detecting that the user confirms the interface layout on the virtual screen, the first electronic device sends the interface layout information of the virtual screen to the second electronic device, so that the second electronic device displays a screen projection interface of the source interface according to the control identifications and the layout information. This improves the screen projection effect and the user experience.

Description

Screen projection method and device and electronic equipment
Technical Field
The application relates to the technical field of intelligent terminals, in particular to a screen projection method and device and electronic equipment.
Background
In order to achieve various purposes such as large-screen display and entertainment, a user may project an interface of a first electronic device (e.g., a mobile phone) onto a second electronic device (e.g., a PC) for display. The first electronic device and the second electronic device may include, but are not limited to, a mobile terminal (mobile phone), a tablet (PAD), a PC, a wearable device, a large electronic screen, a television, an automobile central control screen, and the like.
At present, screen projection between electronic devices mainly adopts a mirror mode, as shown in fig. 1; that is, the interface on a first electronic device (e.g., a mobile phone) is projected in its entirety onto a second electronic device (e.g., a PC). In this mirror-mode screen projection, the interface displayed by the first electronic device can only be stretched, scaled, or cropped to fit the screen size of the second electronic device, so the screen projection interface displayed on the second electronic device suffers from deformation, incomplete display, and similar problems; the screen projection effect is poor, and the user experience is poor.
Disclosure of Invention
The embodiment of the application provides a screen projection method and device and electronic equipment, which can improve the screen projection effect and improve user experience.
In a first aspect, an embodiment of the present application provides a screen projection method, including:
the method comprises the steps that first electronic equipment displays all hierarchical controls of a source interface to a user and obtains controls selected by the user from all the hierarchical controls; the first electronic equipment displays a virtual screen generated based on the screen information of the second electronic equipment to a user, and displays a control selected by the user on the virtual screen, wherein the control displayed on the virtual screen can be edited by the user;
after detecting that the user confirms the interface layout on the virtual screen, the first electronic device sends the interface layout information of the virtual screen to the second electronic device, wherein the interface layout information includes: the identification of a first control placed on the virtual screen and the layout information of the first control, so that the second electronic device displays the screen projection interface of the source interface according to the interface layout information.
The first electronic device or the second electronic device may be a mobile terminal (mobile phone), a smart screen, an unmanned aerial vehicle, an intelligent connected vehicle (ICV), an intelligent/smart car, or a vehicle-mounted device.
By this method, the layout of the screen projection interface displayed on the second electronic device is consistent with the interface layout on the virtual screen at the moment the user confirms it, so that the visual effect of the screen projection interface is closer to that of a native application interface on the second electronic device, rather than a mere mirrored, copied, stretched, or zoomed version of the source interface on the first electronic device, providing the user with a more natural, native screen projection experience.
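The interface layout information described above can be pictured as a small message pairing the identification of each placed control with its layout on the virtual screen. The following sketch is illustrative only; the patent specifies no wire format, and every name and field in it is hypothetical:

```python
# Illustrative sketch: the patent describes "control identification + layout
# information" but no concrete format, so this message shape is hypothetical.

def build_layout_message(selected):
    """Build the interface layout information sent to the second device:
    one entry per control placed on the virtual screen, pairing the control
    identification with its position and size on the virtual screen."""
    return {
        "interface_layout": [
            {"control_id": cid, "layout": {"x": x, "y": y, "w": w, "h": h}}
            for cid, (x, y, w, h) in selected.items()
        ]
    }

# Example: the user keeps only a title and a play button, repositioned
# for the second device's screen (control names are made up).
msg = build_layout_message({
    "title_text": (40, 20, 400, 60),
    "play_button": (40, 100, 200, 80),
})
```

The second electronic device only needs this message plus the source interface data to redraw the chosen controls at the confirmed positions.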
Wherein the first electronic device presenting the hierarchical controls of the source interface to the user includes:
the method comprises the steps that first electronic equipment extracts first data from interface data of a source interface, and control identification of controls in the source interface, hierarchical relation among the controls and control drawing instructions of the controls are recorded in the first data;
and the first electronic equipment displays each hierarchical control of the source interface to a user according to the first data.
The method for extracting the first data from the interface data of the source interface by the first electronic device includes:
the first electronic device extracts a view tree from interface data of a source interface, the view tree records control identification of controls in the source interface and hierarchical relation among the controls, and extracts a control drawing instruction corresponding to the control identification recorded in the view tree from the interface data.
Wherein the first electronic device presenting the hierarchical controls of the source interface to the user according to the first data includes:
the first electronic device draws the hierarchical controls of the source interface according to the hierarchical relationships among the controls by using the control drawing instructions of the controls, and presents the drawn hierarchical controls to the user; and/or,
the first electronic device presents the control identifications of the controls and the hierarchical relationships among the controls to the user.
The screen of the first electronic equipment is divided into a first display area and a second display area;
a first electronic device presenting hierarchical controls of a source interface to a user, comprising:
the method comprises the steps that the first electronic equipment displays all hierarchical controls of a source interface to a user in a first display area;
the first electronic device presents a virtual screen generated based on screen information of the second electronic device to a user, including:
the first electronic device presents the virtual screen to the user in the second display area.
Wherein, the method further includes:
the first electronic device marks, according to the first control identification in the interface layout information, the first control identification in the first data of the interface data of the source interface sent to the second electronic device.
The method for sending the acquired first control identification and layout information to the second electronic device by the first electronic device includes:
the first electronic device carries the acquired interface layout information in the interface data of the source interface and sends it to the second electronic device.
In a second aspect, an embodiment of the present application provides a screen projection method, including:
the second electronic device receives interface layout information sent by the first electronic device, wherein the interface layout information comprises: a first control identification and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic equipment detects the confirmation operation of the user on the interface layout on the virtual screen;
the second electronic equipment receives interface data of a source interface sent by the first electronic equipment;
and the second electronic device displays the screen projection interface of the source interface according to the interface data and the interface layout information.
According to this aspect, the layout of the screen projection interface displayed on the second electronic device is consistent with the interface layout on the virtual screen at the moment the user confirms it, so that the visual effect of the screen projection interface is closer to that of a native application interface on the second electronic device, rather than a mere mirrored, copied, stretched, or zoomed version of the source interface on the first electronic device, providing the user with a more natural, native screen projection experience.
Wherein the second electronic device displaying the screen projection interface of the source interface according to the interface data and the interface layout information includes:
the second electronic equipment acquires the first control identification and layout information corresponding to the first control identification from the interface layout information; the second electronic equipment acquires a control drawing instruction corresponding to the first control identification from the interface data according to the acquired first control identification;
and the second electronic equipment uses the control drawing instruction corresponding to the first control identification to draw the control corresponding to the first control identification according to the layout information corresponding to the first control identification.
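A minimal sketch of this second-device drawing flow, assuming hypothetical dict shapes for the layout information and interface data (the patent prescribes the lookup-by-control-ID logic, not these structures):

```python
# Hypothetical sketch of the second device's side: for each first control,
# pair its drawing instruction with the layout confirmed on the virtual screen.

def render_projection(interface_layout, interface_data):
    """For each first-control entry in the layout information, fetch the
    matching control drawing instruction from the interface data and record
    a draw call at the laid-out position."""
    draw_calls = []
    for entry in interface_layout:
        cid = entry["control_id"]
        instruction = interface_data["draw_instructions"][cid]
        draw_calls.append({
            "control_id": cid,
            "instruction": instruction,
            "layout": entry["layout"],
        })
    return draw_calls

# Only "play_button" was placed on the virtual screen, so only it is drawn;
# "title_text" exists in the interface data but is skipped.
layout_info = [{"control_id": "play_button",
                "layout": {"x": 40, "y": 100, "w": 200, "h": 80}}]
data = {"draw_instructions": {
    "play_button": "drawRoundRect(...); drawText('Play')",
    "title_text": "drawText('Song A')",
}}
calls = render_projection(layout_info, data)
```

Controls the user did not place on the virtual screen are never looked up, which is how the screen projection interface ends up showing only the selected subset of the source interface.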
Wherein the first control identification in the interface data carries a mark, and the second electronic device displaying the screen projection interface of the source interface according to the interface data and the interface layout information includes:
the second electronic equipment acquires the first control identification and layout information corresponding to the first control identification from the interface layout information;
the second electronic equipment acquires a control drawing instruction corresponding to the first control identification with the mark from the interface data;
and the second electronic equipment uses the control drawing instruction corresponding to the first control identification to draw the control corresponding to the first control identification according to the layout information corresponding to the first control identification.
In a third aspect, an embodiment of the present application provides a first electronic device, including:
a display screen; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the first electronic device, cause the first electronic device to perform the steps of:
displaying each level control of the source interface to a user, and acquiring a control selected by the user from each level control; displaying a virtual screen generated based on the screen information of the second electronic device to the user, and displaying a control selected by the user on the virtual screen, wherein the control displayed on the virtual screen can be edited by the user;
after detecting that a user confirms the operation of the interface layout on the virtual screen, sending the interface layout information of the virtual screen to the second electronic device, wherein the interface layout information comprises: and the first control identification and the layout information of the first control are placed on the virtual screen, so that the second electronic equipment can display the screen projection interface of the source interface according to the interface layout information.
Wherein the instructions, when executed by the first electronic device, cause the step of presenting the respective hierarchical controls of the source interface to the user to comprise:
extracting first data from interface data of a source interface, wherein the first data records control identification of controls in the source interface, hierarchical relation among the controls and control drawing instructions of the controls;
and displaying each hierarchical control of the source interface to a user according to the first data.
Wherein the instructions, when executed by the first electronic device, cause the step of extracting the first data from the interface data of the source interface to comprise:
extracting a view tree from interface data of a source interface, wherein the view tree records control identification of controls in the source interface and hierarchical relation among the controls; and extracting a control drawing instruction corresponding to the control identification recorded in the view tree from the interface data.
Wherein the instructions, when executed by the first electronic device, cause the step of presenting the hierarchical controls of the source interface to the user according to the first data to comprise:
drawing the hierarchical controls of the source interface according to the hierarchical relationships among the controls by using the control drawing instructions of the controls, and presenting the drawn hierarchical controls to the user; and/or,
and showing the control identification of the controls and the hierarchical relation among the controls to the user.
The screen of the first electronic equipment is divided into a first display area and a second display area;
the instructions, when executed by the first electronic device, cause the step of exposing the hierarchical controls of the source interface to the user to comprise:
displaying each level control of the source interface to a user in a first display area;
the instructions, when executed by the first electronic device, cause the step of presenting a virtual screen generated based on screen information of the second electronic device to a user to comprise:
the virtual screen is presented to the user in the second display area.
Wherein the instructions, when executed by the first electronic device, cause the first electronic device to further perform the steps of:
and marking the first control identification in the first data of the interface data of the source interface sent to the second electronic equipment according to the first control identification in the interface layout information.
Wherein the instructions, when executed by the first electronic device, cause the step of sending the acquired first control identification and layout information to the second electronic device to include:
and carrying the acquired interface layout information in the interface data of the source interface and sending the interface layout information to the second electronic equipment.
In a fourth aspect, an embodiment of the present application provides a second electronic device, including:
a display screen; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the second electronic device, cause the second electronic device to perform the steps of:
receiving interface layout information sent by first electronic equipment, wherein the interface layout information comprises: a first control identification and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic equipment detects the confirmation operation of the user on the interface layout on the virtual screen;
receiving interface data of a source interface sent by first electronic equipment;
and displaying the screen projection interface of the source interface according to the interface layout information according to the interface data.
Wherein the instructions, when executed by the second electronic device, cause the step of displaying the screen projection interface of the source interface according to the interface data and the interface layout information to include:
acquiring a first control identification and layout information corresponding to the first control identification from the interface layout information; acquiring a control drawing instruction corresponding to the first control identification from the interface data according to the acquired first control identification;
and according to the layout information corresponding to the first control identification, drawing the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
Wherein the first control identification in the interface data carries a mark; and the instructions, when executed by the second electronic device, cause the step of displaying the screen projection interface of the source interface according to the interface data and the interface layout information to include:
acquiring a first control identification and layout information corresponding to the first control identification from the interface layout information; acquiring a control drawing instruction corresponding to a first control identification with a mark from interface data;
and according to the layout information corresponding to the first control identification, drawing the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, which, when run on a computer, causes the computer to perform the method of the first aspect.
In a sixth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, which, when run on a computer, causes the computer to perform the method of the second aspect.
In a seventh aspect, the present application provides a computer program for performing the method of the first or second aspect when the computer program is executed by a computer.
In a possible design, the program in the seventh aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
FIG. 1 is a diagram illustrating a screen projection mode in a prior art mirror mode;
FIG. 2A is a diagram illustrating a system providing a screen projection function according to an embodiment of the present application;
FIG. 2B is a diagram illustrating an exemplary control hierarchy in accordance with an embodiment of the present disclosure;
FIG. 2C is a diagram illustrating an example of instruction extraction according to an embodiment of the present application;
FIG. 2D is a diagram illustrating a view tree according to an embodiment of the present application;
FIG. 3 is a flowchart of one embodiment of a screen projection method of the present application;
FIG. 4 is a flowchart of another embodiment of a screen projection method of the present application;
FIG. 5 is a flowchart of another embodiment of a screen projection method of the present application;
FIG. 6 is a flowchart of another embodiment of a screen projection method of the present application;
FIG. 7 is a flowchart of another embodiment of a screen projection method of the present application;
FIG. 8 is a block diagram of one embodiment of a screen projection device of the present application;
FIG. 9 is a block diagram of another embodiment of a screen projection apparatus of the present application;
fig. 10 is a schematic structural diagram of an embodiment of an electronic device according to the present application.
Detailed Description
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
The following description explains, by way of example and not limitation, the terminology used in the embodiments of the present application.
In the embodiment of the present application, "screen projection" refers to transmitting interface data on one electronic device to another electronic device for display. For convenience of description, in the embodiments of the present application, the above-mentioned "one electronic device" is referred to as a "first electronic device"; the "another electronic device" described above is referred to as a "second electronic device"; an interface needing screen projection in the first electronic equipment is called a source interface, and an interface displayed in the second electronic equipment after the first electronic equipment is subjected to screen projection is called a screen projection interface. It should be noted that the screen projection function may be provided by the system or by the application, and the embodiment of the present application is not limited.
Referring to fig. 2A, an exemplary diagram of the screen projection function provided by the system is shown. Specifically, the user can implement screen projection between two electronic devices through the following operations: the user pulls down the system notification panel on the first electronic device, opens more shortcut tools, finds the wireless projection or mirroring entry, and selects a connectable device; the selected device serves as the second electronic device in the embodiment of the present application.
In the embodiment of the application, two pieces of electronic equipment for projecting screens can be directly connected, for example, the two pieces of electronic equipment can be directly connected through bluetooth, WiFi and the like; alternatively, the two electronic devices may be connected to each other through a connection with another electronic device, such as a cloud server, to achieve indirect connection. In the process of projecting the screen, the connection between the two electronic devices can be switched between a direct connection and an indirect connection, and the embodiment of the application is not limited.
The application program interface in the embodiment of the present application is the medium for interaction and information exchange between an application program and the user; it converts between the internal form of information and a form acceptable to the user. A common presentation form of an application interface is the Graphical User Interface (GUI), a user interface that is displayed graphically and related to computer operations. An interface may be composed of visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets, which may be referred to as the controls of the application interface.
The controls in an application program interface are hierarchical. For example, the entire application interface can be regarded as the control of the lowest hierarchy, that is, the control of the first level; the higher the level, the finer-grained the controls and the greater their number. For example, referring to fig. 2B, assume an interface includes a button group 1 and a button group 2, each of which includes 3 buttons. The interface can then be regarded as the control of the lowest level, button group 1 and button group 2 are the controls of the second level, and buttons 11-13 and 21-23 are the controls of the third level.
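The two-button-group example can be modeled as a small tree in which each node is a control; the structure below is a hypothetical illustration, not a format from the patent:

```python
# Illustrative model of the fig. 2B hierarchy: interface -> button groups -> buttons.

def controls_by_level(node, level=1, out=None):
    """Group control IDs by hierarchy level (level 1 = the whole interface)."""
    out = out if out is not None else {}
    out.setdefault(level, []).append(node["id"])
    for child in node.get("children", []):
        controls_by_level(child, level + 1, out)
    return out

interface = {
    "id": "interface",
    "children": [
        {"id": "button_group_1",
         "children": [{"id": "button_11"}, {"id": "button_12"}, {"id": "button_13"}]},
        {"id": "button_group_2",
         "children": [{"id": "button_21"}, {"id": "button_22"}, {"id": "button_23"}]},
    ],
}
levels = controls_by_level(interface)
```

Walking such a tree level by level is exactly what lets the first device present "each hierarchical control" for the user to pick from.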
It should be noted that the interface data of the source interface records the controls of the source interface and the hierarchical relationships between the controls. For example, in the Android system, Skia drawing instructions reside in the Android framework layer of the interface data of an application program interface (Activity); as shown in fig. 2C, the Skia instructions can be extracted from the instruction layer SkCanvas, and a view tree (View Tree) and control drawing instructions are recorded in the Skia instructions; wherein,
the View tree records the hierarchical relationship between views and views, wherein a View is a base class of all controls in the Android system and is an abstraction of the controls of the interface layer, a View usually represents a control, and a View group (View group) is a special View which represents a combined relationship between different views, and typically, the entire interface of an application is a View group, wherein each control is a separate View, View and View group can be combined/nested, and a complete View tree is formed after combination/nesting, and the View tree records the hierarchical relationship between the controls at different levels in the application interface and the controls, for example, the control Identification (ID) in the View tree identifies the controls, see the View tree screenshot shown in fig. 2D (no longer distinguishes between View and View group) which can be considered that a DecorView is a View at a first level, the ID is an action model (View) and the View tree control (no longer distinguishes between View and View control) at a level, and the View tree control (no longer distinguishes View and View control) at a View tree corresponding to a View tree at a second level, and a View tree corresponding control (button) which is located in a new View tree, and a View tree corresponding to a View tree at a View tree (diagram) at a new View tree, and a View tree corresponding to a View tree at a new View at a View tree, and a View at a View tree, wherein the same level, and a View tree, are identified by a View tree, and a View tree, and a View, wherein the View tree, and a View, and a.
The control drawing instructions are associated with the view tree: they record how the visual graphic of each control in the view tree is drawn, and record the control display content of each control. A given control may include only a visual graphic or only control display content, or it may include both. Based on the control drawing instructions, the electronic device can draw the controls of each level in the view tree to obtain the visual graphic and/or display content of each control, thereby forming the application program interface.
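A toy illustration of this association (a hypothetical data model; real drawing instructions are recorded SkCanvas commands, not dicts): a depth-first pass applies each control's instruction to obtain its visual graphic and/or display content:

```python
# Hypothetical sketch: each instruction may carry a visual graphic, display
# content (text/numbers), or both, matching the cases the description lists.

def render_tree(node, instructions):
    """Depth-first pass over the view tree: for every control, apply its
    drawing instruction to obtain visual graphic and/or display content."""
    instr = instructions[node["id"]]
    return {
        "id": node["id"],
        "graphic": instr.get("graphic"),   # None if the control has no graphic
        "content": instr.get("content"),   # None if the control has no content
        "children": [render_tree(c, instructions) for c in node.get("children", [])],
    }

tree = {"id": "button_group_1", "children": [{"id": "button_11"}]}
instructions = {
    "button_group_1": {"graphic": "rounded box"},               # graphic only
    "button_11": {"graphic": "box", "content": "Button 11"},    # graphic + text
}
rendered = render_tree(tree, instructions)
```

The pass mirrors the description: the group control here contributes only a graphic, while the button contributes both a graphic and display content.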
The control display content in the embodiment of the present application refers to the data, such as numbers and characters, that needs to be displayed in an application interface apart from the visual graphic of each control. As shown in fig. 2B, a control drawing instruction may draw the visual graphic of a control, for example the box representing a button in fig. 2B, and may also draw the characters shown on the control, for example "button 11", "button 12", and so on. The control display content may be constant or may change continuously. When the display content of a control changes, the first electronic device may send the control ID of the changed control together with its new display content to the second electronic device, or may resend the interface data of the source interface containing the changed display content; the second electronic device then updates the screen projection interface accordingly to keep the display content of the source interface and the screen projection interface consistent.
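The first of those two update options, sending only the changed control ID plus its new content, can be sketched as follows; the content map and function names below are hypothetical stand-ins for the real protocol:

```python
# Hypothetical sketch of an incremental display-content update on the
# second device; only the changed control's content is replaced.

def apply_content_update(projection_contents, control_id, new_content):
    """Update the display content of a single control on the screen
    projection interface, leaving every other control untouched."""
    updated = dict(projection_contents)
    updated[control_id] = new_content
    return updated

# A clock control ticks over; the song title is unchanged and not resent.
projection = {"clock_text": "12:00", "song_title": "Track A"}
projection = apply_content_update(projection, "clock_text", "12:01")
```

Sending only the changed ID and content avoids re-transmitting the full interface data for every small change.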
When screen projection is started, the first electronic equipment sends interface data of a source interface to the second electronic equipment; in the screen projection process, a source interface of the first electronic device may change, for example, when a user clicks a certain button in the source interface to open another application interface, the source interface changes into a newly opened application interface, and at this time, the first electronic device may send interface data of the source interface to the second electronic device again to keep the content displayed by the screen projection interface and the content displayed by the source interface consistent.
The screen projection method in the embodiment of the present application is explained below.
In the prior art, the screen projection mode between electronic devices is mainly a mirror mode, as shown in fig. 1, that is, the interface on a first electronic device (e.g., a mobile phone) is projected onto a second electronic device (e.g., a PC) in its entirety. Taking the Android-based Miracast and the iOS-based AirPlay as examples, the basic principle of both is as follows: the interface data of the first electronic device is audio- and video-encoded and sent to the second electronic device, and the second electronic device decodes it to obtain the interface data of the first electronic device for display. In this mode, the screen projection interface can only be stretched, zoomed, and cropped to fit the screen size of the second electronic device, so the screen projection interface displayed on the second electronic device may be deformed or incompletely displayed; the screen projection effect is poor, and so is the user experience.
Therefore, embodiments of the present application provide a screen projection method, a screen projection apparatus, and an electronic device, which can improve the screen projection effect and the user experience.
Fig. 3 is a flowchart of an embodiment of a screen projection method according to the present application, and as shown in fig. 3, the method may include:
step 301: and the first electronic equipment displays all the hierarchical controls of the source interface to a user and acquires the control selected by the user from all the hierarchical controls.
When the first electronic device presents the hierarchical controls of the source interface to the user, what is presented may be: the controls themselves, and/or the hierarchical relationships between the control identifiers.
It should be noted that, although the controls in an application program interface are displayed as visual graphics, most controls also have the function of executing an action, or of causing code to run and complete a response, through an "event"; such a function may be triggered by a user's click or the like. In the embodiment of the application, the main purpose of presenting the hierarchical controls of the source interface to the user is to let the user intuitively select the required controls. Therefore, when presenting a control, the first electronic device may present only the visual graphic and/or control display content of the control, without providing a response operation for the control's function. For example, assume the control display content of the button 11 in fig. 2B is "next page", and that in the source application clicking it with the mouse triggers the application to open the application program interface corresponding to the next page (that is, executes the function of the "next page" button). The control presented to the user in this step may display only the visual graphic of the button 11 and the control display content "next page"; the user can click it with the mouse to select it, but no response operation of opening the next-page application program interface is provided.
It should also be noted that a hierarchical relationship between control identifiers, such as a view tree, generally belongs to the internal attributes of an interface and does not intuitively reflect the specific controls. Therefore, for most users, who are not skilled in the art, presenting the controls themselves is more intuitive and provides a better user experience.
Step 302: the first electronic equipment displays a virtual screen generated based on the screen information of the second electronic equipment to a user, and displays a control selected by the user on the virtual screen, wherein the control displayed on the virtual screen can be edited by the user.
Wherein, the screen information of the second electronic device may include: the lateral dimension and the longitudinal dimension (or aspect ratio) of the display screen of the second electronic device. The screen information of the second electronic device may further include: interactive mode, etc. The interaction mode can comprise: whether the display screen supports touch control, etc.
Wherein, the editing operation performed by the user on the control can include but is not limited to: position shifting, size changing, display orientation changing, and/or deleting, etc.
In step 302, the presentation of the virtual screen to the user may be executed simultaneously with the presentation of the hierarchical controls of the source interface in step 301, that is, the first electronic device presents the hierarchical controls and the virtual screen to the user at the same time; alternatively, the presentation of the virtual screen in step 302 may be executed after step 301. This is not limited in this embodiment of the application.
Optionally, the controls presented on the virtual screen may likewise present only the visual graphics and/or control display content of the controls, without providing response operations for the controls' functions.
Step 303: the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen, and sends the interface layout information of the virtual screen to the second electronic device, where the interface layout information includes: the first control identifiers of the first controls placed on the virtual screen and the layout information of the first controls, so that the second electronic device can display the screen projection interface of the source interface according to the interface layout information.
The first control is a control placed on the virtual screen, that is, the control still placed on the virtual screen after the user performs a series of editing operations on the control selected by the user and displayed on the virtual screen in step 302. Each first control corresponds to: a first control identification, and layout information; the layout information may include: and the position, the size, the display direction and other information of the control on the virtual screen.
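A possible shape for the interface layout information of step 303 is one entry per first control left on the virtual screen, pairing a first control identifier with its layout information (all field names and values below are assumptions for illustration, not the actual encoding):

```python
# Hypothetical encoding of the interface layout information: each first
# control on the virtual screen gets its ID plus position, size, and
# display orientation, mirroring the layout information listed above.
interface_layout = [
    {"control_id": "turn_distance", "x": 10, "y": 10,
     "width": 200, "height": 80, "orientation": "horizontal"},
    {"control_id": "turn_arrow", "x": 220, "y": 10,
     "width": 80, "height": 80, "orientation": "horizontal"},
]

def first_control_ids(layout):
    """The control IDs the second device must draw on the projection interface."""
    return [entry["control_id"] for entry in layout]
```

Serialized (for example as a layout configuration file), this is everything the second electronic device needs besides the interface data itself.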
Optionally, a layout configuration file may be generated according to the interface layout information, and the layout configuration file may be sent to the second electronic device.
Optionally, the first electronic device may directly send the interface layout information to the second electronic device; alternatively, the sending, by the first electronic device, the interface layout information to the second electronic device may include: the first electronic equipment carries the interface layout information in the interface data of the source interface and sends the interface layout information to the second electronic equipment.
Optionally, after the first electronic device sends the interface layout information to the second electronic device, the first electronic device may mark the first control identifier in the first data of the interface data of the source interface sent to the second electronic device according to the first control identifier in the interface layout information, so that the second electronic device can conveniently draw a control in the screen-projecting interface according to the first control identifier with the mark when the screen-projecting interface is displayed.
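The optional marking step above can be sketched as tagging the outgoing interface data with a flag for every control ID that also appears in the interface layout information (data shapes are hypothetical; real control drawing instructions are opaque render commands, represented here as strings):

```python
# Sketch: flag, in the outgoing interface data, every control ID that also
# appears in the interface layout information, so the second device can
# pick out the first controls directly when drawing the projection interface.
def mark_first_controls(interface_data, interface_layout):
    first_ids = {entry["control_id"] for entry in interface_layout}
    return {
        cid: {"instruction": instruction, "marked": cid in first_ids}
        for cid, instruction in interface_data.items()
    }

marked = mark_first_controls(
    {"turn_arrow": "draw_arrow()", "map_route": "draw_map()"},
    [{"control_id": "turn_arrow"}],
)
```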
It should be noted that the first electronic device sends the interface layout information to the second electronic device, the second electronic device may store the interface layout information, and when the first electronic device subsequently projects the source interface to the second electronic device, the second electronic device may display the screen projection interface of the source interface according to the interface layout information. If the user wants to modify the interface layout information of the screen projection interface of the source interface in the second electronic device, the method of the embodiment of the application may be triggered, the interface layout on the virtual screen is re-edited, the interface layout information is re-sent to the second electronic device, and the second electronic device updates the interface layout information corresponding to the locally stored source interface, so as to achieve the purpose that the user modifies the interface layout of the screen projection interface of the source interface in the second electronic device.
In the method shown in fig. 3, the first electronic device displays the controls of the source interface selected by the user on a virtual screen, and the user performs editing operations on the displayed controls. After the user finishes editing and performs a confirmation operation, the interface layout information of the virtual screen is sent to the second electronic device, and the second electronic device displays the screen projection interface according to that interface layout information. The layout of the screen projection interface displayed on the second electronic device is therefore consistent with the interface layout on the virtual screen at the moment of the user's confirmation operation, and the visual effect of the screen projection interface is closer to that of a native application program interface of the second electronic device, rather than merely a mirrored, copied, stretched, and zoomed version of the source interface on the first electronic device, thereby providing the user with a more natural and more native screen projection experience.
Fig. 4 is a flowchart of another embodiment of a screen projection method of the present application. Referring to fig. 4, the method may include:
step 401: the second electronic device receives interface layout information sent by the first electronic device, wherein the interface layout information comprises: a first control identification and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic equipment detects the confirmation operation of the user on the interface layout on the virtual screen;
step 402: the second electronic equipment receives interface data of a source interface sent by the first electronic equipment;
step 403: and the second electronic equipment displays the screen projection interface of the source interface according to the received interface data of the source interface and the interface layout information.
Specifically, the second electronic device may first obtain the first control identifier and the layout information corresponding to the first control identifier from the interface layout information, and then obtain the control drawing instruction corresponding to the first control identifier from the interface data according to the first control identifier obtained from the interface layout information; and according to the layout information corresponding to the first control identification in the interface layout information, drawing the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
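Step 403 can be sketched as a join between the interface layout information and the interface data on the first control identifier (names and shapes below are assumptions for illustration; real drawing instructions are opaque render commands, represented here as strings):

```python
# Sketch of step 403: pair each first control ID from the interface layout
# information with its drawing instruction from the interface data, then
# "draw" it at the laid-out position. Controls absent from the layout
# information are simply not drawn on the screen projection interface.
def build_projection(interface_layout, interface_data):
    drawn = []
    for entry in interface_layout:
        cid = entry["control_id"]
        instruction = interface_data.get(cid)  # control drawing instruction
        if instruction is not None:
            drawn.append((cid, entry["x"], entry["y"], instruction))
    return drawn

layout = [{"control_id": "turn_distance", "x": 10, "y": 10},
          {"control_id": "turn_arrow", "x": 220, "y": 10}]
data = {"turn_distance": "draw_text('500 m')",
        "turn_arrow": "draw_arrow()",
        "map_route": "draw_map()"}  # not in the layout -> skipped
projection = build_projection(layout, data)
```

Note how the map route control, though present in the interface data, never reaches the projection, which is exactly the filtering behavior described in the next paragraph.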
In this way, only the controls corresponding to the first control identifiers included in the interface layout information are displayed in the screen projection interface of the second electronic device, and the layout of that screen projection interface is consistent with the interface layout on the virtual screen; only the display content of the controls may differ (the content on the second device changes in real time with the interface on the first device).
Optionally, the first control identifier in the first data of the interface data may have a mark, and accordingly, the second electronic device may more conveniently find each first control identifier and the control drawing instruction corresponding to the first control identifier from the interface data without first obtaining the first control identifier from the interface layout information, where step 403 may include: the second electronic equipment acquires the first control identification and layout information corresponding to the first control identification from the interface layout information; the second electronic equipment acquires a control drawing instruction corresponding to the first control identification with the mark from the interface data; and the second electronic equipment uses the control drawing instruction corresponding to the first control identification to draw the control corresponding to the first control identification according to the layout information corresponding to the first control identification.
In the method shown in fig. 4, the layout of the screen projection interface displayed on the second electronic device is consistent with the interface layout on the virtual screen at the moment of the user's confirmation operation, so the visual effect of the screen projection interface is closer to that of a native application program interface of the second electronic device, rather than merely a mirrored, copied, stretched, and zoomed version of the source interface on the first electronic device, thereby providing the user with a more natural and more native screen projection experience.
Hereinafter, the above screen projection method will be exemplarily described.
Fig. 5 is a flowchart of another embodiment of the screen projecting method of the present application, and in fig. 5, the first electronic device is a mobile phone, and the second electronic device is a Head Up Display (HUD) in an automobile. The source interface needing screen projection in the first electronic equipment is a navigation interface.
Referring to part 510 in fig. 5, the screen of the first electronic device (e.g., a mobile phone) is divided into a first display area 51 and a second display area 52. The first electronic device presents the hierarchical controls of the source interface to the user in the first display area 51 for the user to select controls, and acquires the controls selected by the user. The first electronic device presents, in the second display area 52, a virtual screen 521 generated based on the screen information of the second electronic device, and presents the user-selected controls on the virtual screen 521.
Optionally, in order to ensure that the screen projection interface displayed by the second electronic device fits the screen size of the second electronic device, the generated virtual screen 521 may have the same aspect ratio as the display screen of the second electronic device.
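Matching the virtual screen's aspect ratio to the second device's display can be sketched as follows (the fixed editing width of 600 px is an assumed value for illustration, not something specified by the embodiment):

```python
# Sketch: size the virtual screen so it keeps the aspect ratio of the
# second device's display, fitted into an assumed editing width of 600 px.
def virtual_screen_size(device_width, device_height, edit_width=600):
    return edit_width, round(edit_width * device_height / device_width)

# A wide 1920x720 HUD strip maps to a 600x225 virtual screen.
hud_virtual = virtual_screen_size(1920, 720)
```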
Optionally, the first electronic device presenting to the user hierarchical controls of the source interface may comprise:
the first electronic equipment extracts first data from interface data of a source interface, wherein the first data comprises: the method comprises the following steps that control identification of controls in a source interface, hierarchical relation among the controls and control drawing instructions of the controls are obtained;
and drawing the control of each level according to the level relation among the controls by using the control drawing instruction of the control, and displaying the drawn control of each level to a user.
Specifically, the first data may include: and identifying corresponding control drawing instructions by the view tree and the controls recorded in the view tree. The first electronic device extracting first data from interface data of the source interface may include: the first electronic device extracts a view tree from interface data of a source interface, the view tree records control identification of controls in the source interface and hierarchical relation among the controls, and extracts a control drawing instruction corresponding to the control identification recorded in the view tree from the interface data.
Optionally, the user may select a certain control in the first display area 51, drag the control to the virtual screen for display, and the first electronic device obtains the control selected by the user by detecting the operation of the user on the control.
Optionally, the user may perform a preset operation on the control in the first display area, where the preset operation may include, but is not limited to: and double clicking a control and the like displayed in the first display area, and acquiring the control selected by the user by the first electronic equipment by detecting the operation of the user on the control.
Referring to the portion 510 in fig. 5, the navigation interface is a first-level control, the navigation information display control group 511 on the left is a second-level control, and the map route display control 512 on the right is also a second-level control. The navigation information display control group 511 includes 5 third-level controls, shown in the dotted-line box: 4 text boxes and a turning arrow. Take the controls selected by the user as the text box showing the turning distance and the turning arrow as an example.
Referring to part 520 in fig. 5, a user may perform an editing operation on a control displayed on the virtual screen, so as to obtain a screen-casting interface layout edited by the user.
In part 520, for example, the user performs editing operations such as enlarging the text box showing the turning distance and the turning arrow, and moving their positions.
Referring to part 530 in fig. 5, after the user finishes editing the control displayed on the virtual screen, a confirmation operation for the interface layout on the virtual screen is performed; correspondingly, the first electronic device detects the confirmation operation of the user on the interface layout on the virtual screen, and sends the interface layout information on the virtual screen to the second electronic device (such as HUD); and the second electronic equipment stores the received interface layout information.
The embodiment of the present application is not limited to the specific implementation of the above-mentioned confirmation operation, for example, as shown in part 530 in fig. 5, a user clicks a confirmation button provided on a screen as an example.
Referring to part 540 in fig. 5, the first electronic device sends interface data of the source interface to the second electronic device; and the second electronic equipment receives the interface data of the source interface and displays the screen projection interface of the source interface according to the interface data and the interface layout information.
Specifically, the interface data of the source interface sent by the first electronic device to the second electronic device records the control identifiers of the controls in the source interface, the hierarchical relationships between the controls, the control drawing instructions, and the like; the interface layout information records the first control identifiers and the layout information corresponding to each first control identifier. Through the first control identifiers recorded in the interface layout information, the hierarchical relationships between the first control identifiers and the control drawing instructions of the first control identifiers can be found in the interface data, and the control corresponding to each first control identifier can then be drawn, using its control drawing instruction, according to its corresponding layout information. By displaying the control corresponding to each first control identifier in the interface layout information on the screen projection interface according to the hierarchical relationships in this way, the screen projection interface is obtained.
Here, the displayed screen-casting interface will be similar to the interface presented on the virtual screen when the user performs the above-described confirmation operation, except that the display content of the widget presented on each widget may be different.
Different from fig. 5, which presents controls by drawing their visual graphics, the embodiment of the present application illustrated in fig. 6 also displays the hierarchical relationship between the control identifiers of the controls. Specifically,
referring to portion 610 of fig. 6, the screen of the first electronic device is divided into a third display area and a fourth display area, and the first electronic device displays the hierarchical relationship between the control identifications in the third display area and displays the visual graphics of the hierarchical controls in the fourth display area (i.e., the first electronic device displays the hierarchical controls of the source interface to the user).
Optionally, in part 610, a check box is set for each control identifier, and the user can select the control identifier by checking the check box, and accordingly, the first electronic device obtains the control identifier selected by the user (i.e., obtains the control selected by the user from the hierarchical controls) by detecting the above-mentioned selection operation for the check box.
Referring to part 620 in fig. 6, the first electronic device displays a virtual screen on the screen, and displays a visual graph of the control selected by the user and identifying the corresponding control on the virtual screen; the visual graphics of the controls presented on the virtual screen may be used by the user to perform editing operations.
Different from fig. 5, which presents controls by drawing their visual graphics, and from fig. 6, which presents controls by drawing their visual graphics and also displays the hierarchical relationship between the control identifiers, the embodiment of the present application shown in fig. 7 displays only the hierarchical relationship between the control identifiers; when the user selects a control identifier, the visual graphic of the corresponding control is drawn directly on the virtual screen. Specifically,
referring to a portion 710 in fig. 7, a screen of the first electronic device is divided into a fifth display area and a sixth display area, where the first electronic device displays a hierarchical relationship between control identifiers in the fifth display area (that is, the first electronic device displays hierarchical controls of the source interface to the user), displays a virtual screen in the sixth display area, and displays a visual graph and/or control display content of a control corresponding to a control identifier selected by the user on the virtual screen; the visual graphics and/or control display content of the controls presented on the virtual screen may be used by a user to perform editing operations.
Optionally, in part 710, a check box is set for each control identifier, and the user can select the control identifier by checking the check box, and accordingly, the first electronic device obtains the control identifier selected by the user (i.e., obtains the control selected by the user from the hierarchical controls) by detecting the above-mentioned selection operation for the check box.
At this time, the visual graph of the control displayed on the virtual screen can be drawn on the virtual screen by the first electronic device according to the control identifier selected by the user by using the control drawing instruction corresponding to the control identifier.
It should be noted that the method of the embodiment of the present application presents to the user a virtual screen generated based on the screen information of the second electronic device; the user can select the controls to be shown on the virtual screen from the source interface according to personal needs and lay them out, thereby indirectly optimizing and editing the screen projection interface displayed on the second electronic device. In particular, when the second electronic device has a specially-shaped display screen, targeted optimization can be performed for that screen, so that the visual effect of the screen projection interface is closer to that of a native application program interface of the second electronic device, rather than merely a mirrored, copied, stretched, and zoomed version of the source interface on the first electronic device; this provides the user with a more natural and more native screen projection experience. Moreover, the definition of the screen projection interface is relatively better and is not blurred by enlargement. For example, if a music playing interface of a mobile phone is projected onto a watch, because the display screen of the watch is relatively small, only the three buttons for play, previous song, and next song may be selected for display on the screen projection interface; or, if the interface of a navigation application of a mobile phone is projected onto a HUD display, only key controls such as direction indications may be displayed on the screen projection interface; and so on.
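The watch example above amounts to a simple selection of first controls (the control names are hypothetical, for illustration only):

```python
# Illustration of the watch example: of a full music-player interface, only
# the three playback buttons are kept for the projection layout.
music_controls = ["cover_art", "lyrics", "progress", "play", "prev", "next"]
kept = {"play", "prev", "next"}  # the three buttons that fit a watch screen
selected = [c for c in music_controls if c in kept]
```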
It is to be understood that some or all of the steps or operations in the above-described embodiments are merely examples, and the embodiments of the present application may perform other operations or variations of the various operations. Further, the various steps may be performed in an order different from that presented in the above-described embodiments, and it is possible that not all of the operations in the above-described embodiments are performed.
Fig. 8 is a schematic structural diagram of an embodiment of the projection screen device of the present application, and as shown in fig. 8, the device 80 may include:
the first display unit 81 is configured to display each hierarchical control of the source interface to a user, and acquire a control selected by the user from each hierarchical control;
the second display unit 82 is configured to display a virtual screen generated based on the screen information of the second electronic device to the user, and display a control selected by the user on the virtual screen, where the control displayed on the virtual screen can be edited by the user;
a sending unit 83, configured to send interface layout information of the virtual screen to the second electronic device after detecting the user's confirmation operation on the interface layout on the virtual screen, where the interface layout information includes: the first control identifiers of the first controls placed on the virtual screen and the layout information of the first controls, so that the second electronic device can display the screen projection interface of the source interface according to the interface layout information.
Optionally, the first display unit 81 may be specifically configured to:
extracting first data from interface data of a source interface, wherein the first data records control identification of controls in the source interface, hierarchical relation among the controls and control drawing instructions of the controls;
and displaying each hierarchical control of the source interface to a user according to the first data.
Optionally, the first display unit 81 may be specifically configured to:
and extracting a view tree from the interface data of the source interface, wherein the view tree records the control identification of the control in the source interface and the hierarchical relationship between the controls, and extracting a control drawing instruction corresponding to the control identification recorded in the view tree from the interface data.
Optionally, the first display unit 81 may be specifically configured to:
drawing each hierarchical control of the source interface according to the hierarchical relationships between the controls by using the control drawing instructions of the controls, and presenting the drawn hierarchical controls to the user; and/or,
and showing the control identification of the controls and the hierarchical relation among the controls to the user.
Optionally, the screen of the device is divided into a first display area and a second display area;
the first display unit 81 may be specifically configured to: displaying each level control of the source interface to a user in a first display area;
the second display unit 82 may be specifically configured to: the first electronic device presents the virtual screen to the user in the second display area.
Optionally, the sending unit 83 may further be configured to: and marking the first control identification in the interface data of the source interface sent to the second electronic equipment according to the control identification of the control placed on the virtual screen.
Optionally, the sending unit 83 may specifically be configured to: and carrying the acquired interface layout information in the interface data of the source interface and sending the interface layout information to the second electronic equipment.
Fig. 9 is a schematic structural diagram of another embodiment of the screen projection device of the present application, and as shown in fig. 9, the device 90 may include:
a receiving unit 91, configured to receive interface layout information sent by a first electronic device, where the interface layout information includes: a first control identification and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic equipment detects the confirmation operation of the user on the interface layout on the virtual screen; receiving interface data of a source interface sent by first electronic equipment;
and the display unit 92 is configured to display the screen projection interface of the source interface according to the interface data and the interface layout information.
Optionally, the display unit 92 may be specifically configured to: acquire the first control identifiers and the layout information corresponding to the first control identifiers from the interface layout information; acquire the control drawing instructions corresponding to the first control identifiers from the interface data according to the acquired first control identifiers; and draw the control corresponding to each first control identifier, using the control drawing instruction corresponding to that first control identifier, according to the layout information corresponding to that first control identifier.
Optionally, the first control identifiers in the interface data have marks, and the display unit 92 may be specifically configured to: acquire the first control identifiers and the layout information corresponding to the first control identifiers from the interface layout information; acquire the control drawing instructions corresponding to the marked first control identifiers from the interface data; and draw the control corresponding to each first control identifier, using the control drawing instruction corresponding to that first control identifier, according to the layout information corresponding to that first control identifier.
The apparatuses provided by the embodiments shown in fig. 8 and fig. 9 can be used to implement the technical solutions of the method embodiments shown in fig. 3 to fig. 7 of the present application; for their implementation principles and technical effects, reference may be made to the related descriptions in the method embodiments.
It should be understood that the division of the devices shown in fig. 8 and fig. 9 into units is merely a division of logical functions; in an actual implementation, the units may be wholly or partially integrated into one physical entity, or may be physically separate. These units may be implemented entirely in the form of software invoked by a processing element, entirely in the form of hardware, or partly in the form of software invoked by a processing element and partly in the form of hardware. For example, the display unit may be a separately disposed processing element, or may be integrated into a chip of the electronic device; the other units are implemented similarly. In addition, all or some of the units may be integrated together or implemented independently. In implementation, the steps of the above methods or units may be completed by integrated logic circuits of hardware in a processor element or by instructions in the form of software.
For example, the above units may be one or more integrated circuits configured to implement the above methods, such as one or more application specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field programmable gate arrays (FPGAs). For another example, these units may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 10 is a schematic structural diagram of an embodiment of an electronic device according to the present application, and as shown in fig. 10, the electronic device may include: a display screen; one or more processors; a memory; and one or more computer programs.
The electronic device may be a mobile terminal (mobile phone), a smart screen, an unmanned aerial vehicle, an intelligent connected vehicle (ICV), a smart/intelligent car, or a vehicle-mounted device.
Wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the device, cause the device to perform the methods shown in fig. 3 to fig. 7.
Specifically, the electronic device may be the first electronic device described in this embodiment, and when the instruction is executed by the first electronic device, the first electronic device is enabled to perform the following steps:
displaying each level control of the source interface to a user, and acquiring a control selected by the user from each level control; displaying a virtual screen generated based on the screen information of the second electronic device to the user, and displaying a control selected by the user on the virtual screen, wherein the control displayed on the virtual screen can be edited by the user;
after detecting the user's confirmation of the interface layout on the virtual screen, sending the interface layout information of the virtual screen to the second electronic device, where the interface layout information includes the identification and the layout information of the first control placed on the virtual screen, so that the second electronic device can display the screen projection interface of the source interface according to the interface layout information.
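The message assembled on the first device upon the user's confirmation could look like the following sketch. The field names ("control_id", "x", "y", "width", "height") and the dict shape are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of assembling the interface layout information that the
# first device sends after the user confirms the layout on the virtual screen.
# Every control placed on the virtual screen contributes its identification
# and its layout information (position and size). Field names are assumptions.

def build_layout_message(virtual_screen_controls):
    """Collect identification and layout info for each placed control."""
    return {
        "interface_layout_information": [
            {
                "control_id": c["id"],
                "layout": {"x": c["x"], "y": c["y"],
                           "width": c["w"], "height": c["h"]},
            }
            for c in virtual_screen_controls
        ]
    }

msg = build_layout_message([
    {"id": "btn_play", "x": 10, "y": 20, "w": 100, "h": 40},
    {"id": "txt_title", "x": 10, "y": 70, "w": 200, "h": 30},
])
```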
When the instruction is executed by the first electronic device, the step of showing each level control of the source interface to the user comprises the following steps:
extracting first data from interface data of a source interface, wherein the first data records control identification of controls in the source interface, hierarchical relation among the controls and control drawing instructions of the controls;
and displaying each hierarchical control of the source interface to a user according to the first data.
When the instruction is executed by the first electronic device, the step of extracting the first data from the interface data of the source interface comprises the following steps:
extracting a view tree from interface data of a source interface, wherein the view tree records control identification of controls in the source interface and hierarchical relation among the controls; and extracting a control drawing instruction corresponding to the control identification recorded in the view tree from the interface data.
When the instruction is executed by the first electronic device, the step of displaying each level control of the source interface to the user according to the first data comprises the following steps:
drawing each hierarchical control of the source interface according to the hierarchical relation among the controls by using the control drawing instructions of the controls, and displaying the drawn controls to the user; and/or,
and showing the control identification of the controls and the hierarchical relation among the controls to the user.
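One simple way to realize "showing the control identifications and the hierarchical relation among the controls to the user" is an indented text tree, as in this sketch; the dict-of-children representation is an assumption, not the patent's implementation.

```python
# Hypothetical sketch: render the control hierarchy as an indented text tree,
# so the user sees each control identification and its hierarchical relation.
# The dict-of-children shape of the hierarchy is an illustrative assumption.

def format_hierarchy(hierarchy, root, depth=0):
    """Return one indented line per control identification."""
    lines = ["  " * depth + root]
    for child in hierarchy.get(root, []):
        lines += format_hierarchy(hierarchy, child, depth + 1)
    return lines

tree = {"frame": ["btn_play", "txt_title"], "btn_play": [], "txt_title": []}
lines = format_hierarchy(tree, "frame")
```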
The screen of the first electronic equipment is divided into a first display area and a second display area;
when the instructions are executed by the first electronic device, the step of showing the hierarchical controls of the source interface to the user comprises the following steps:
displaying each level control of the source interface to a user in a first display area;
when the instruction is executed by the first electronic device, the step of displaying the virtual screen generated based on the screen information of the second electronic device to the user comprises the following steps:
the virtual screen is presented to the user in the second display area.
When the instruction is executed by the first electronic device, the first electronic device further executes the following steps:
marking, according to the first control identification in the interface layout information, the first control identification in the first data of the interface data of the source interface sent to the second electronic device.
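The marking step can be sketched as follows; the boolean "marked" map is just one illustrative representation of the mark, and the field names are assumptions.

```python
# Hypothetical sketch of marking, in the first data, those control
# identifications that appear in the interface layout information, before the
# first data is sent to the second device. The "marked" map is an assumed
# representation of the mark, chosen for illustration.

def mark_first_controls(first_data, interface_layout_information):
    """Flag each control identification that the user placed on the virtual screen."""
    selected = {e["control_id"] for e in interface_layout_information}
    first_data["marked"] = {cid: (cid in selected)
                            for cid in first_data["control_ids"]}
    return first_data

first_data = {"control_ids": ["btn_play", "txt_title", "img_cover"]}
layout_info = [{"control_id": "btn_play"}, {"control_id": "img_cover"}]
marked = mark_first_controls(first_data, layout_info)
```

The second device can then draw only the marked controls when composing the screen projection interface.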
When the instruction is executed by the first electronic device, the step of sending the acquired control identification and layout information to the second electronic device includes:
carrying the acquired interface layout information in the interface data of the source interface and sending it to the second electronic device.
Specifically, the electronic device may be a second electronic device described in this embodiment, and when the instruction is executed by the second electronic device, the second electronic device executes the following steps:
receiving interface layout information sent by first electronic equipment, wherein the interface layout information comprises: a first control identification and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic equipment detects the confirmation operation of the user on the interface layout on the virtual screen;
receiving interface data of a source interface sent by first electronic equipment;
and displaying the screen projection interface of the source interface according to the interface layout information and the interface data.
When the instruction is executed by the second electronic device, the step of displaying the screen projection interface of the source interface according to the interface data, the first control identification, and the layout information comprises the following steps:
acquiring a first control identification and layout information corresponding to the first control identification from the interface layout information; acquiring a control drawing instruction corresponding to the first control identification from the interface data according to the acquired first control identification;
and according to the layout information corresponding to the first control identification, drawing the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
Wherein the first control identification in the interface data has a mark; when the instruction is executed by the second electronic device, the step of displaying the screen projection interface of the source interface according to the interface data, the first control identification, and the layout information comprises the following steps:
acquiring a first control identification and layout information corresponding to the first control identification from the interface layout information; acquiring a control drawing instruction corresponding to a first control identification with a mark from interface data;
and according to the layout information corresponding to the first control identification, drawing the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
The electronic device shown in fig. 10 may be a terminal device or a circuit device built in the terminal device. The device may be used as the first electronic device or the second electronic device to execute the functions/steps of the methods provided by the embodiments shown in fig. 3 to 7 of the present application.
The electronic device 1000 may include a processor 1010, an external memory interface 1020, an internal memory 1021, a Universal Serial Bus (USB) interface 1030, a charge management module 1040, a power management module 1041, a battery 1042, an antenna 1, an antenna 2, a mobile communication module 1050, a wireless communication module 1060, an audio module 1070, a speaker 1070A, a receiver 1070B, a microphone 1070C, a headset interface 1070D, a sensor module 1080, a button 1090, a motor 1091, an indicator 1092, a camera 1093, a display 1094, a Subscriber Identity Module (SIM) card interface 1095, and the like. The sensor module 1080 may include a pressure sensor 1080A, a gyroscope sensor 1080B, a barometric pressure sensor 1080C, a magnetic sensor 1080D, an acceleration sensor 1080E, a distance sensor 1080F, a proximity light sensor 1080G, a fingerprint sensor 1080H, a temperature sensor 1080J, a touch sensor 1080K, an ambient light sensor 1080L, a bone conduction sensor 1080M, and the like.
It is to be understood that the illustrated structure of the embodiment of the invention is not intended to limit the electronic device 1000. In other embodiments of the present application, the electronic device 1000 may include more or fewer components than illustrated, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 1010 may include one or more processing units, such as: processor 1010 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processor (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 1010 for storing instructions and data. In some embodiments, the memory in the processor 1010 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 1010. If processor 1010 needs to reuse the instruction or data, it may be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 1010, thereby increasing the efficiency of the system.
In some embodiments, processor 1010 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 1010 may include multiple sets of I2C buses. The processor 1010 may be coupled to the touch sensor 1080K, the charger, the flash, the camera 1093, and the like through different I2C bus interfaces. For example, the processor 1010 may be coupled to the touch sensor 1080K through an I2C interface, so that the processor 1010 and the touch sensor 1080K communicate through the I2C bus interface, thereby implementing the touch function of the electronic device 1000.
The I2S interface may be used for audio communication. In some embodiments, processor 1010 may include multiple sets of I2S buses. The processor 1010 may be coupled to the audio module 1070 via an I2S bus to enable communication between the processor 1010 and the audio module 1070. In some embodiments, the audio module 1070 can transmit audio signals to the wireless communication module 1060 through the I2S interface, so that calls can be answered through a Bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 1070 and the wireless communication module 1060 may be coupled by a PCM bus interface. In some embodiments, the audio module 1070 can also transmit the audio signal to the wireless communication module 1060 through the PCM interface, so as to implement the function of receiving the call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 1010 and the wireless communication module 1060. For example: the processor 1010 communicates with the bluetooth module in the wireless communication module 1060 through the UART interface to implement the bluetooth function. In some embodiments, the audio module 1070 may transmit the audio signal to the wireless communication module 1060 through the UART interface, so as to implement the function of playing music through the bluetooth headset.
The MIPI interface may be used to connect the processor 1010 with peripheral devices such as the display screen 1094 and the camera 1093. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 1010 and camera 1093 communicate via a CSI interface to implement the capture functionality of electronic device 1000. The processor 1010 and the display screen 1094 communicate via the DSI interface to implement the display function of the electronic device 1000.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 1010 with the camera 1093, the display 1094, the wireless communication module 1060, the audio module 1070, the sensor module 1080, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 1030 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 1030 may be used to connect a charger to charge the electronic device 1000, to transmit data between the electronic device 1000 and a peripheral device, or to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 1000. In other embodiments of the present application, the electronic device 1000 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 1040 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 1040 may receive charging input from a wired charger via the USB interface 1030. In some wireless charging embodiments, the charging management module 1040 may receive a wireless charging input through a wireless charging coil of the electronic device 1000. The charging management module 1040 may also supply power to the electronic device through the power management module 1041 while charging the battery 1042.
The power management module 1041 is used for connecting the battery 1042, the charging management module 1040 and the processor 1010. The power management module 1041 receives input from the battery 1042 and/or the charging management module 1040, and provides power to the processor 1010, the internal memory 1021, the display 1094, the camera 1093, and the wireless communication module 1060. The power management module 1041 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), and the like. In some other embodiments, the power management module 1041 may also be disposed in the processor 1010. In other embodiments, the power management module 1041 and the charging management module 1040 may be disposed in the same device.
The wireless communication function of the electronic device 1000 may be implemented by the antenna 1, the antenna 2, the mobile communication module 1050, the wireless communication module 1060, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 1000 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 1050 may provide solutions for wireless communication applied to the electronic device 1000, including 2G/3G/4G/5G. The mobile communication module 1050 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 1050 may receive electromagnetic waves from the antenna 1, perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 1070A, the receiver 1070B, etc.) or displays images or video through the display screen 1094. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 1010, and may be located in the same device as the mobile communication module 1050 or other functional modules.
The wireless communication module 1060 may provide solutions for wireless communication applied to the electronic device 1000, including a wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), bluetooth (BT), a global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
In some embodiments, the antenna 1 of the electronic device 1000 is coupled to the mobile communication module 1050, and the antenna 2 is coupled to the wireless communication module 1060, so that the electronic device 1000 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 1000 implements a display function through the GPU, the display screen 1094, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 1094 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 1010 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 1094 may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible light-emitting diode (FLED) display, a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED) display, or the like.
The electronic device 1000 may implement a shooting function through the ISP, the camera 1093, the video codec, the GPU, the display screen 1094, the application processor, and the like.
The ISP is used for processing data fed back by the camera 1093. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 1093.
The camera 1093 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 1000 may include 1 or N cameras 1093, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 1000 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 1000 may support one or more video codecs. In this way, the electronic device 1000 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it processes input information quickly and can also learn continuously by itself. Applications such as intelligent cognition of the electronic device 1000, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 1020 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 1000. The external memory card communicates with the processor 1010 through the external memory interface 1020 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 1021 may be used to store computer-executable program code, which includes instructions. The internal memory 1021 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, phone book, etc.) created during use of the electronic device 1000, and the like. In addition, the internal memory 1021 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 1010 executes various functional applications and data processing of the electronic device 1000 by executing instructions stored in the internal memory 1021 and/or instructions stored in a memory provided in the processor.
The electronic device 1000 may implement audio functions through the audio module 1070, the speaker 1070A, the receiver 1070B, the microphone 1070C, the headphone interface 1070D, the application processor, and the like. Such as music playing, recording, etc.
The audio module 1070 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 1070 may also be used to encode and decode audio signals. In some embodiments, the audio module 1070 may be disposed in the processor 1010, or some functional modules of the audio module 1070 may be disposed in the processor 1010.
The speaker 1070A, also called a "horn", is used to convert electrical audio signals into sound signals. The electronic device 1000 may listen to music or to a hands-free call through the speaker 1070A.
A receiver 1070B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 1000 receives a call or voice information, it can receive a voice by placing the receiver 1070B close to the ear of a person.
The microphone 1070C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 1070C by speaking close to it. The electronic device 1000 may be provided with at least one microphone 1070C. In other embodiments, the electronic device 1000 may be provided with two microphones 1070C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 1000 may be provided with three, four, or more microphones 1070C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headphone interface 1070D is used to connect wired headphones. The headset interface 1070D may be the USB interface 1030, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 1080A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 1080A may be disposed on the display screen 1094. There are many types of pressure sensors 1080A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 1080A, the capacitance between the electrodes changes, and the electronic device 1000 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 1094, the electronic device 1000 detects the intensity of the touch operation through the pressure sensor 1080A, and can also calculate the touched position from the detection signal of the pressure sensor 1080A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
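The threshold behavior described above, where the same touch position triggers different instructions depending on touch intensity, can be sketched as below. The threshold value and the normalized pressure scale are illustrative assumptions.

```python
# Sketch of pressure-dependent dispatch on the short message application icon:
# a light press views the short message, a firm press creates a new one.
# The threshold value and pressure scale are assumptions, not from the patent.

FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure, assumed scale

def sms_icon_action(pressure):
    """Map touch intensity on the SMS icon to an operation instruction."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"   # intensity below the first pressure threshold
    return "new_sms"        # intensity at or above the threshold
```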
The gyroscope sensor 1080B may be used to determine the motion posture of the electronic device 1000. In some embodiments, the angular velocity of the electronic device 1000 about three axes (i.e., the x, y, and z axes) may be determined through the gyroscope sensor 1080B. The gyroscope sensor 1080B may be used for image stabilization during shooting. Illustratively, when the shutter is pressed, the gyroscope sensor 1080B detects the shake angle of the electronic device 1000, calculates the distance that the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 1000 through reverse motion, thereby achieving image stabilization. The gyroscope sensor 1080B can also be used in navigation and motion-sensing game scenarios.
The air pressure sensor 1080C is used to measure air pressure. In some embodiments, the electronic device 1000 calculates altitude, aiding in positioning and navigation from barometric pressure values measured by the barometric pressure sensor 1080C.
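The pressure-to-altitude conversion that supports positioning and navigation is commonly done with the international barometric formula; the sketch below uses the standard-atmosphere constants (44330 m, exponent 1/5.255, sea-level pressure 1013.25 hPa), which are general reference values, not taken from the patent.

```python
# Illustrative conversion from barometric pressure to altitude using the
# international barometric formula with standard-atmosphere constants.
# These constants are general reference values, not from the patent text.

def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Approximate altitude in meters from pressure in hPa."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

At sea-level pressure the formula yields 0 m, and a reading of 900 hPa corresponds to roughly 1 km of altitude.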
The magnetic sensor 1080D includes a Hall effect sensor. The electronic device 1000 can detect the opening and closing of a flip holster using the magnetic sensor 1080D. In some embodiments, when the electronic device 1000 is a flip phone, the electronic device 1000 can detect the opening and closing of the flip cover according to the magnetic sensor 1080D. Features such as automatic unlocking upon flipping open can then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 1080E can detect the magnitude of the acceleration of the electronic device 1000 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the electronic device 1000 is stationary. It can also be used to recognize the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and similar applications.
The distance sensor 1080F is used to measure distance. The electronic device 1000 may measure distance by infrared or laser. In some embodiments, in shooting scenarios, the electronic device 1000 may use the distance sensor 1080F to measure distance for fast focusing.
The proximity light sensor 1080G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 1000 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 1000; when insufficient reflected light is detected, the electronic device 1000 may determine that there is no object nearby.
The ambient light sensor 1080L is used to sense the ambient light brightness. The electronic device 1000 can adaptively adjust the brightness of the display screen 1094 according to the sensed ambient light brightness. The ambient light sensor 1080L can also be used to automatically adjust the white balance during photographing, and can cooperate with the proximity light sensor 1080G to detect whether the electronic device 1000 is in a pocket, so as to prevent accidental touches.
The fingerprint sensor 1080H is used to collect fingerprints. The electronic device 1000 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint-based call answering, and the like.
The temperature sensor 1080J is used to detect temperature. In some embodiments, the electronic device 1000 implements a temperature handling strategy using the temperature detected by the temperature sensor 1080J. For example, when the temperature reported by the temperature sensor 1080J exceeds a threshold, the electronic device 1000 reduces the performance of a processor located near the temperature sensor 1080J in order to lower power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 1000 heats the battery 1042 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is lower than a further threshold, the electronic device 1000 boosts the output voltage of the battery 1042 to avoid an abnormal shutdown caused by low temperature.
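The temperature handling strategy above can be sketched as a set of threshold checks mapping a reported temperature to device actions (a minimal sketch; the threshold values, function name, and action strings are hypothetical, as the disclosure names no concrete values):

```python
# Hypothetical thresholds in degrees Celsius; the disclosure specifies none.
OVERHEAT_THRESHOLD = 45.0
HEATER_THRESHOLD = 0.0
BOOST_THRESHOLD = -10.0

def thermal_policy(temp_c: float) -> list[str]:
    """Return the actions taken for a temperature reported by sensor 1080J."""
    actions = []
    if temp_c > OVERHEAT_THRESHOLD:
        # Thermal protection: throttle the processor near the sensor.
        actions.append("throttle_nearby_processor")
    if temp_c < HEATER_THRESHOLD:
        # Heat the battery to avoid a cold-induced abnormal shutdown.
        actions.append("heat_battery")
    if temp_c < BOOST_THRESHOLD:
        # Boost the battery output voltage at even lower temperatures.
        actions.append("boost_battery_voltage")
    return actions
```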
The touch sensor 1080K is also referred to as a "touch panel." The touch sensor 1080K may be disposed on the display screen 1094; together, the touch sensor 1080K and the display screen 1094 form a touchscreen. The touch sensor 1080K is used to detect a touch operation applied on or near it, and can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 1094. In other embodiments, the touch sensor 1080K may be disposed on a surface of the electronic device 1000 at a position different from that of the display screen 1094.
The bone conduction sensor 1080M may acquire vibration signals. In some embodiments, the bone conduction sensor 1080M may acquire the vibration signal of a vibrating bone of the human vocal part. The bone conduction sensor 1080M can also contact the human pulse to receive a blood-pressure beat signal. In some embodiments, the bone conduction sensor 1080M may also be disposed in a headset to form a bone conduction headset. The audio module 1070 may parse out a voice signal based on the vibration signal of the vocal-part bone acquired by the bone conduction sensor 1080M, thereby implementing a voice function. The application processor may parse out heart rate information based on the blood-pressure beat signal acquired by the bone conduction sensor 1080M, thereby implementing a heart rate detection function.
The keys 1090 include a power key, volume keys, and the like. The keys 1090 may be mechanical keys or touch keys. The electronic device 1000 may receive key input and generate key signal input related to user settings and function control of the electronic device 1000.
The motor 1091 may generate vibration prompts. The motor 1091 may be used both for incoming-call vibration prompts and for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 1094 may also correspond to different vibration feedback effects, as may different application scenarios (e.g., time reminders, receiving messages, alarm clocks, games). The touch vibration feedback effect may also support customization.
The indicator 1092 may be an indicator light, and may be used to indicate a charging status or a change in battery level, or to indicate a message, a missed call, a notification, and the like.
The SIM card interface 1095 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device 1000 by inserting it into or pulling it out of the SIM card interface 1095. The electronic device 1000 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 1095 may support a Nano-SIM card, a Micro-SIM card, a standard SIM card, and so on. Multiple cards can be inserted into the same SIM card interface 1095 at the same time; the types of the cards may be the same or different. The SIM card interface 1095 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 1000 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 1000 uses an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 1000 and cannot be separated from it.
It should be understood that the electronic device 1000 shown in fig. 10 is capable of implementing the various processes of the methods provided by the embodiments shown in fig. 3 to 7 of the present application. The operations and/or functions of the modules in the electronic device 1000 serve to implement the corresponding flows in the above-described method embodiments. Reference may be made to the description of the method embodiments illustrated in fig. 3 to 7 of the present application; a detailed description is omitted here where appropriate to avoid redundancy.
It should be understood that the processor 1010 in the electronic device 1000 shown in fig. 10 may be a system-on-chip (SoC). The processor 1010 may include a central processing unit (CPU) and may further include other types of processors, such as a graphics processing unit (GPU).
In summary, the processors or processing units within the processor 1010 can cooperate to implement the foregoing method flows, and the corresponding software programs of these processors or processing units can be stored in the internal memory 121.
The present application further provides an electronic device, where the device includes a storage medium and a central processing unit, the storage medium may be a non-volatile storage medium, a computer executable program is stored in the storage medium, and the central processing unit is connected to the non-volatile storage medium and executes the computer executable program to implement the method provided in the embodiment shown in fig. 3 to 7 of the present application.
In the above embodiments, the processor may include, for example, a CPU, a DSP, or a microcontroller, and may further include a GPU, an embedded neural-network processing unit (NPU), and an image signal processor (ISP). The processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of programs according to the technical solutions of the present application. Further, the processor may have the function of operating one or more software programs, which may be stored in the storage medium.
Embodiments of the present application further provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is enabled to execute the method provided by the embodiments shown in fig. 3 to 7 of the present application.
Embodiments of the present application further provide a computer program product, which includes a computer program, when the computer program runs on a computer, the computer executes the method provided by the embodiments shown in fig. 3 to fig. 7 of the present application.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that A and B exist simultaneously, or that B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single items or plural items. For example, "at least one of a, b, and c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where each of a, b, and c may be singular or plural.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or a combination of the two. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, or the portion thereof that substantially contributes over the prior art, may be embodied in the form of a software product that is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or various other media capable of storing program code.
The above description is only for the specific embodiments of the present application, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present disclosure, and all the changes or substitutions should be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (22)

1. A screen projection method, comprising:
a first electronic device displays the controls at each hierarchy of a source interface to a user and obtains the controls selected by the user from among those hierarchical controls; the first electronic device displays to the user a virtual screen generated based on screen information of a second electronic device and displays the user-selected controls on the virtual screen, wherein the controls displayed on the virtual screen are editable by the user;
after detecting the user's confirmation operation on the interface layout on the virtual screen, the first electronic device sends the interface layout information of the virtual screen to the second electronic device, wherein the interface layout information comprises: a first control identification and layout information of a first control placed on the virtual screen, so that the second electronic device can display the screen projection interface of the source interface according to the interface layout information.
2. The method of claim 1, wherein the first electronic device presents hierarchical controls of the source interface to a user, comprising:
the first electronic device extracts first data from the interface data of the source interface, wherein the first data records the control identifications of the controls in the source interface, the hierarchical relationships among the controls, and the control drawing instructions of the controls;
and the first electronic device displays the controls at each hierarchy of the source interface to the user according to the first data.
3. The method of claim 2, wherein the first electronic device extracts first data from the interface data of the source interface, comprising:
the first electronic device extracts a view tree from the interface data of the source interface, the view tree records control identification of the controls in the source interface and hierarchical relation among the controls, and extracts a control drawing instruction corresponding to the control identification recorded in the view tree from the interface data.
4. The method of claim 2, wherein the first electronic device presents to the user hierarchical controls of the source interface according to the first data, comprising:
the first electronic device draws the controls at each hierarchy of the source interface according to the hierarchical relationships among the controls by using the control drawing instructions of the controls, and displays the drawn hierarchical controls to the user; and/or,
the first electronic device presents the control identifications of the controls and the hierarchical relationships among the controls to the user.
5. The method according to any one of claims 1 to 4, wherein the screen of the first electronic device is divided into a first display area and a second display area;
the first electronic device presents the hierarchical controls of the source interface to a user, including:
the first electronic equipment displays all hierarchical controls of the source interface to the user in the first display area;
the first electronic device presents a virtual screen generated based on screen information of a second electronic device to the user, including:
the first electronic device presents the virtual screen to the user in the second display area.
6. The method of any of claims 1 to 4, further comprising:
the first electronic device marks the first control identification in the interface data of the source interface sent to the second electronic device according to the first control identification in the interface layout information.
7. The method according to any one of claims 1 to 4, wherein sending, by the first electronic device, the obtained first control identification and layout information to the second electronic device comprises:
the first electronic device carrying the obtained interface layout information in the interface data of the source interface and sending it to the second electronic device.
8. A screen projection method, comprising:
the second electronic device receives interface layout information sent by a first electronic device, wherein the interface layout information comprises: a first control identification and layout information, the interface layout information being the interface layout information of the virtual screen at the time the first electronic device detects the user's confirmation operation on the interface layout on the virtual screen;
the second electronic device receives interface data of a source interface sent by the first electronic device;
and the second electronic device displays, according to the interface data and the interface layout information, a screen projection interface of the source interface.
9. The method of claim 8, wherein the second electronic device displaying, according to the interface data and the interface layout information, a screen projection interface of the source interface comprises:
the second electronic device obtains, from the interface layout information, a first control identification and the layout information corresponding to the first control identification; the second electronic device obtains, from the interface data according to the obtained first control identification, the control drawing instruction corresponding to the first control identification;
and the second electronic device draws, according to the layout information corresponding to the first control identification, the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
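The rendering steps recited in claim 9 amount to a lookup keyed by control identification: for each first control identification in the interface layout information, fetch its drawing instruction from the interface data and draw it at the laid-out position (a minimal sketch; the record shapes and all names are hypothetical illustrations, not the disclosed format):

```python
def render_projection(layout_info: dict, interface_data: dict) -> list[str]:
    """For each first control identification in the layout information,
    look up its control drawing instruction in the interface data and
    'draw' it at the position given by the layout information."""
    rendered = []
    for control_id, layout in layout_info.items():
        instruction = interface_data["instructions"][control_id]
        # A real device would execute the drawing instruction; here we
        # record instruction and position to show the join by identifier.
        rendered.append(f"{instruction}@{layout['x']},{layout['y']}")
    return rendered
```

Note that controls present in the interface data but absent from the layout information (e.g. controls the user did not place on the virtual screen) are simply never drawn.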
10. The method of claim 8, wherein the first control identification in the interface data carries a mark, and the second electronic device displaying, according to the interface data and the interface layout information, a screen projection interface of the source interface comprises:
the second electronic device obtains, from the interface layout information, a first control identification and the layout information corresponding to the first control identification;
the second electronic device obtains, from the interface data, the control drawing instruction corresponding to the marked first control identification;
and the second electronic device draws, according to the layout information corresponding to the first control identification, the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
11. A first electronic device, comprising:
a display screen; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the first electronic device, cause the first electronic device to perform the steps of:
displaying each level control of a source interface to a user, and acquiring a control selected by the user from the level controls; displaying a virtual screen generated based on screen information of second electronic equipment to the user, and displaying a control selected by the user on the virtual screen, wherein the control displayed on the virtual screen can be edited by the user;
after detecting the user's confirmation operation on the interface layout on the virtual screen, sending the interface layout information of the virtual screen to the second electronic device, wherein the interface layout information comprises: a first control identification and layout information of a first control placed on the virtual screen, so that the second electronic device can display the screen projection interface of the source interface according to the interface layout information.
12. The first electronic device of claim 11, wherein the instructions, when executed by the first electronic device, cause the step of exposing the hierarchical controls of the source interface to a user to comprise:
extracting first data from the interface data of the source interface, wherein the first data records control identification of the controls in the source interface, hierarchical relation among the controls and control drawing instructions of the controls;
and displaying each level control of the source interface to the user according to the first data.
13. The first electronic device of claim 12, wherein the instructions, when executed by the first electronic device, cause the step of extracting first data from the interface data of the source interface to comprise:
extracting a view tree from the interface data of the source interface, wherein the view tree records control identification of controls in the source interface and hierarchical relation among the controls; and extracting a control drawing instruction corresponding to the control identification recorded in the view tree from the interface data.
14. The first electronic device of claim 12, wherein the instructions, when executed by the first electronic device, cause the step of presenting to the user the hierarchical controls of the source interface according to the first data to comprise:
drawing the controls at each hierarchy of the source interface according to the hierarchical relationships among the controls by using the control drawing instructions of the controls, and displaying the drawn hierarchical controls to the user; and/or,
presenting the control identifications of the controls and the hierarchical relationships among the controls to the user.
15. The first electronic device according to any one of claims 11 to 14, wherein a screen of the first electronic device is divided into a first display area and a second display area;
the instructions, when executed by the first electronic device, cause the step of presenting to a user the hierarchical controls of the source interface to comprise:
displaying hierarchical controls of the source interface to the user in the first display area;
the instructions, when executed by the first electronic device, cause the step of presenting the user with a virtual screen generated based on screen information of a second electronic device to comprise:
and displaying the virtual screen to the user in the second display area.
16. The first electronic device of any of claims 11-14, wherein the instructions, when executed by the first electronic device, cause the first electronic device to further perform the steps of:
and marking the first control identification in the first data of the interface data of the source interface sent to the second electronic equipment according to the first control identification in the interface layout information.
17. The first electronic device of any of claims 11-14, wherein the instructions, when executed by the first electronic device, cause the step of sending the retrieved control identification and layout information to the second electronic device to comprise:
and carrying the acquired interface layout information in the interface data of the source interface and sending the interface layout information to the second electronic equipment.
18. A second electronic device, comprising:
a display screen; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the second electronic device, cause the second electronic device to perform the steps of:
receiving interface layout information sent by first electronic equipment, wherein the interface layout information comprises: a first control identification and layout information; the interface layout information is the interface layout information of the virtual screen when the first electronic equipment detects the confirmation operation of the user on the interface layout on the virtual screen;
receiving interface data of a source interface sent by the first electronic equipment;
and displaying, according to the interface data and the interface layout information, the screen projection interface of the source interface.
19. The second electronic device of claim 18, wherein the instructions, when executed by the second electronic device, cause the step of displaying, according to the interface data and the first control identification and layout information, a screen projection interface of the source interface to comprise:
acquiring a first control identification and layout information corresponding to the first control identification from the interface layout information; acquiring a control drawing instruction corresponding to the first control identification from the interface data according to the acquired first control identification;
and according to the layout information corresponding to the first control identification, drawing the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
20. The second electronic device of claim 18, wherein the first control identification in the interface data carries a mark; and the instructions, when executed by the second electronic device, cause the step of displaying, according to the interface data and the first control identification and layout information, a screen projection interface of the source interface to comprise:
acquiring a first control identification and layout information corresponding to the first control identification from the interface layout information; acquiring a control drawing instruction corresponding to a first control identification with a mark from the interface data;
and according to the layout information corresponding to the first control identification, drawing the control corresponding to the first control identification by using the control drawing instruction corresponding to the first control identification.
21. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1 to 7.
22. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 8 to 10.
CN202010328653.8A 2020-04-23 2020-04-23 Screen projection method and device and electronic equipment Pending CN111443884A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010328653.8A CN111443884A (en) 2020-04-23 2020-04-23 Screen projection method and device and electronic equipment
PCT/CN2021/082506 WO2021213120A1 (en) 2020-04-23 2021-03-24 Screen projection method and apparatus, and electronic device

Publications (1)

Publication Number Publication Date
CN111443884A 2020-07-24

Family

ID=71654354

Country Status (2)

Country Link
CN (1) CN111443884A (en)
WO (1) WO2021213120A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112000410A (en) * 2020-08-17 2020-11-27 努比亚技术有限公司 Screen projection control method and device and computer readable storage medium
CN112286477A (en) * 2020-11-16 2021-01-29 Oppo广东移动通信有限公司 Screen projection display method and related product
CN113438526A (en) * 2021-06-25 2021-09-24 维沃移动通信有限公司 Screen content sharing method, screen content display device, screen content equipment and storage medium
WO2021213120A1 (en) * 2020-04-23 2021-10-28 华为技术有限公司 Screen projection method and apparatus, and electronic device
CN113805827A (en) * 2021-09-14 2021-12-17 北京百度网讯科技有限公司 Screen projection display method and device, electronic equipment and storage medium
CN113891167A (en) * 2021-08-27 2022-01-04 荣耀终端有限公司 Screen projection method and electronic equipment
CN114363678A (en) * 2020-09-29 2022-04-15 华为技术有限公司 Screen projection method and equipment
CN114579231A (en) * 2022-02-15 2022-06-03 北京优酷科技有限公司 Page display method and device and electronic equipment
CN114610434A (en) * 2022-03-28 2022-06-10 联想(北京)有限公司 Output control method and electronic equipment
CN114661258A (en) * 2020-12-23 2022-06-24 华为技术有限公司 Adaptive display method, electronic device, and storage medium
CN114816158A (en) * 2021-01-11 2022-07-29 华为技术有限公司 Interface control method and device, electronic equipment and readable storage medium
CN114884990A (en) * 2022-05-06 2022-08-09 亿咖通(湖北)技术有限公司 Screen projection method and device based on virtual screen
CN115016702A (en) * 2021-09-10 2022-09-06 荣耀终端有限公司 Control method and system for selecting application program display screen in extended screen mode
CN115209213A (en) * 2022-08-23 2022-10-18 荣耀终端有限公司 Wireless screen projection method and mobile device
WO2023005900A1 (en) * 2021-07-28 2023-02-02 华为技术有限公司 Screen projection method, electronic device, and system
WO2023020025A1 (en) * 2021-08-20 2023-02-23 荣耀终端有限公司 Screen projection method and electronic device
WO2023050546A1 (en) * 2021-09-30 2023-04-06 上海擎感智能科技有限公司 Screen projection processing method and system, and electronic device and storage medium
WO2023103948A1 (en) * 2021-12-08 2023-06-15 华为技术有限公司 Display method and electronic device
US11947998B2 (en) 2020-09-02 2024-04-02 Huawei Technologies Co., Ltd. Display method and device
WO2024067052A1 (en) * 2022-09-30 2024-04-04 华为技术有限公司 Screen mirroring display method, electronic device, and system
WO2024099102A1 (en) * 2022-11-09 2024-05-16 维沃移动通信有限公司 Control display method and apparatus, electronic device, and readable storage medium
WO2024139138A1 (en) * 2022-12-30 2024-07-04 中兴通讯股份有限公司 Screen mirroring method, electronic device, and computer-readable medium
US12073071B2 (en) 2020-07-29 2024-08-27 Huawei Technologies Co., Ltd. Cross-device object drag method and device

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN116089256B (en) * 2022-05-13 2024-03-12 荣耀终端有限公司 Terminal testing method, device and storage medium
CN115599335B (en) * 2022-12-13 2023-08-22 佳瑛科技有限公司 Method and system for sharing layout files based on multi-screen mode
CN117156189B (en) * 2023-02-27 2024-08-13 荣耀终端有限公司 Screen projection display method and electronic device
CN117707715A (en) * 2023-05-25 2024-03-15 荣耀终端有限公司 Application management method and electronic device

Citations (5)

Publication number Priority date Publication date Assignee Title
CN103176797A (en) * 2013-02-21 2013-06-26 用友软件股份有限公司 Interface layout device and interface layout method
CN103645906A (en) * 2013-12-25 2014-03-19 上海斐讯数据通信技术有限公司 Method and system for re-laying out an interface based on a fixed interface layout document
CN109753315A (en) * 2018-11-22 2019-05-14 广州小鸡快跑网络科技有限公司 Smart device interactive content editing implementation method and storage medium
CN110381195A (en) * 2019-06-05 2019-10-25 华为技术有限公司 Screen projection display method and electronic device
CN110554816A (en) * 2019-07-25 2019-12-10 华为技术有限公司 Interface generation method and equipment

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN105915978A (en) * 2015-12-14 2016-08-31 乐视致新电子科技(天津)有限公司 Vehicle-mounted display control method and device
CN107273083B (en) * 2017-06-30 2020-05-26 百度在线网络技术(北京)有限公司 Method, apparatus, device, and storage medium for interaction between terminal devices
CN110688042A (en) * 2019-09-29 2020-01-14 百度在线网络技术(北京)有限公司 Interface display method and device
CN111443884A (en) * 2020-04-23 2020-07-24 华为技术有限公司 Screen projection method and device and electronic equipment

Cited By (29)

Publication number Priority date Publication date Assignee Title
WO2021213120A1 (en) * 2020-04-23 2021-10-28 华为技术有限公司 Screen projection method and apparatus, and electronic device
US12073071B2 (en) 2020-07-29 2024-08-27 Huawei Technologies Co., Ltd. Cross-device object drag method and device
CN112000410B (en) * 2020-08-17 2024-03-19 努比亚技术有限公司 Screen projection control method, device and computer readable storage medium
CN112000410A (en) * 2020-08-17 2020-11-27 努比亚技术有限公司 Screen projection control method and device and computer readable storage medium
US11947998B2 (en) 2020-09-02 2024-04-02 Huawei Technologies Co., Ltd. Display method and device
CN114363678A (en) * 2020-09-29 2022-04-15 华为技术有限公司 Screen projection method and equipment
CN112286477B (en) * 2020-11-16 2023-12-08 Oppo广东移动通信有限公司 Screen projection display method and related product
CN112286477A (en) * 2020-11-16 2021-01-29 Oppo广东移动通信有限公司 Screen projection display method and related product
WO2022100237A1 (en) * 2020-11-16 2022-05-19 Oppo广东移动通信有限公司 Screen projection display method and related product
CN114661258A (en) * 2020-12-23 2022-06-24 华为技术有限公司 Adaptive display method, electronic device, and storage medium
CN114816158A (en) * 2021-01-11 2022-07-29 华为技术有限公司 Interface control method and device, electronic equipment and readable storage medium
CN113438526A (en) * 2021-06-25 2021-09-24 维沃移动通信有限公司 Screen content sharing method, screen content display device, screen content equipment and storage medium
WO2023005900A1 (en) * 2021-07-28 2023-02-02 华为技术有限公司 Screen projection method, electronic device, and system
WO2023020025A1 (en) * 2021-08-20 2023-02-23 荣耀终端有限公司 Screen projection method and electronic device
CN113891167A (en) * 2021-08-27 2022-01-04 荣耀终端有限公司 Screen projection method and electronic equipment
CN113891167B (en) * 2021-08-27 2023-07-04 荣耀终端有限公司 Screen projection method and electronic equipment
CN115016702A (en) * 2021-09-10 2022-09-06 荣耀终端有限公司 Control method and system for selecting application program display screen in extended screen mode
CN115016702B (en) * 2021-09-10 2023-10-27 荣耀终端有限公司 Control method and system for selecting application program display screen in extended screen mode
CN113805827A (en) * 2021-09-14 2021-12-17 北京百度网讯科技有限公司 Screen projection display method and device, electronic equipment and storage medium
CN113805827B (en) * 2021-09-14 2024-05-07 北京百度网讯科技有限公司 Screen projection display method and device, electronic equipment and storage medium
WO2023050546A1 (en) * 2021-09-30 2023-04-06 上海擎感智能科技有限公司 Screen projection processing method and system, and electronic device and storage medium
WO2023103948A1 (en) * 2021-12-08 2023-06-15 华为技术有限公司 Display method and electronic device
CN114579231A (en) * 2022-02-15 2022-06-03 北京优酷科技有限公司 Page display method and device and electronic equipment
CN114610434A (en) * 2022-03-28 2022-06-10 联想(北京)有限公司 Output control method and electronic equipment
CN114884990A (en) * 2022-05-06 2022-08-09 亿咖通(湖北)技术有限公司 Screen projection method and device based on virtual screen
CN115209213A (en) * 2022-08-23 2022-10-18 荣耀终端有限公司 Wireless screen projection method and mobile device
WO2024067052A1 (en) * 2022-09-30 2024-04-04 华为技术有限公司 Screen mirroring display method, electronic device, and system
WO2024099102A1 (en) * 2022-11-09 2024-05-16 维沃移动通信有限公司 Control display method and apparatus, electronic device, and readable storage medium
WO2024139138A1 (en) * 2022-12-30 2024-07-04 中兴通讯股份有限公司 Screen mirroring method, electronic device, and computer-readable medium

Also Published As

Publication number Publication date
WO2021213120A1 (en) 2021-10-28

Similar Documents

Publication Publication Date Title
WO2021213120A1 (en) Screen projection method and apparatus, and electronic device
CN114679537B (en) Shooting method and terminal
CN112130742B (en) Full screen display method and device of mobile terminal
CN112231025B (en) UI component display method and electronic equipment
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
CN110536004B (en) Method for applying multiple sensors to electronic equipment with flexible screen and electronic equipment
WO2020000448A1 (en) Flexible screen display method and terminal
WO2020029306A1 (en) Image capture method and electronic device
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN112751954B (en) Operation prompting method and electronic equipment
WO2021258814A1 (en) Video synthesis method and apparatus, electronic device, and storage medium
CN113535284A (en) Full-screen display method and device and electronic equipment
CN110138999B (en) Certificate scanning method and device for mobile terminal
CN114089932B (en) Multi-screen display method, device, terminal equipment and storage medium
CN113254409A (en) File sharing method, system and related equipment
CN115967851A (en) Quick photographing method, electronic device and computer readable storage medium
CN116048358B (en) Method and related device for controlling suspension ball
WO2023241209A9 (en) Desktop wallpaper configuration method and apparatus, electronic device and readable storage medium
CN113970888A (en) Household equipment control method, terminal equipment and computer readable storage medium
CN111492678A (en) File transmission method and electronic equipment
CN115115679A (en) Image registration method and related equipment
CN112449101A (en) Shooting method and electronic equipment
CN114697543B (en) Image reconstruction method, related device and system
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
CN115730091A (en) Comment display method and device, terminal device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200724