CN116708645A - Screen projection method and mobile phone - Google Patents

Screen projection method and mobile phone

Info

Publication number
CN116708645A
CN116708645A CN202310695573.XA
Authority
CN
China
Prior art keywords
electronic device
mobile phone
application
screen
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310695573.XA
Other languages
Chinese (zh)
Inventor
祁国强
谷贺瑾
牛思月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310695573.XA priority Critical patent/CN116708645A/en
Publication of CN116708645A publication Critical patent/CN116708645A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72412User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

The present application provides a screen projection method and a mobile phone, relates to the field of terminals, and can conveniently and rapidly project data associated with an application program on an electronic device to another electronic device with which a wireless communication connection has been established, for playback. The method comprises the following steps: in response to a first operation of a user, acquiring positions of a first electronic device and a second electronic device; determining positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device; acquiring a thumbnail of a first application program and a thumbnail of a second application program, where the first application program and the second application program are application programs running on the mobile phone; displaying the electronic device icons and the application program thumbnails on the multi-task interface; and in response to a second operation of the user, projecting multimedia data associated with the first application program to the first electronic device.

Description

Screen projection method and mobile phone
Technical Field
The application relates to the field of terminals, in particular to a screen projection method and a mobile phone.
Background
With the development of smart home technology, a user or a household often owns a plurality of electronic devices capable of communicating with each other. Different electronic devices generally have their own device characteristics: for example, a mobile phone is more portable, a television has a better display, and a speaker has better sound quality. To make full use of the characteristics of different electronic devices, the devices can exchange multimedia data with one another through functions such as screen projection, AirPlay, or Bluetooth.
Taking the screen projection function as an example, by installing screen projection software on the mobile phone, a user can send multimedia data such as photos and videos on the mobile phone to another controlled device that supports screen projection (such as a smart television) for display. In one implementation, after the user selects the smart television as the controlled device, the mobile phone sends the display data on the mobile phone screen to the smart television in real time for display; that is, the display content of the mobile phone is identical to that of the smart television, and this projection mode easily leads to leakage of private information on the mobile phone. In another implementation, after the user selects the smart television as the controlled device, the mobile phone may prompt the user, through a list or other form, to select the application to be projected; after the user selects a specific application, the mobile phone projects the application selected by the user onto the smart television for display. This projection mode requires the user to manually set the specific projection content before each projection, so the projection process is cumbersome to operate.
Disclosure of Invention
The present application provides a screen projection method and a mobile phone, which can conveniently and rapidly project data associated with an application program to an electronic device shown on a multi-task interface for playback, reducing the operations required during screen projection, improving the efficiency of human-computer interaction, and enhancing the intelligence of the electronic device and the friendliness of human-computer interaction.
In a first aspect, a screen projection method is provided, comprising: in response to a first operation of a user, acquiring positions of a first electronic device and a second electronic device; determining positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device, where the first electronic device icon identifies the first electronic device, the second electronic device icon identifies the second electronic device, and the multi-task interface comprises a first display area used to display the first electronic device icon and the second electronic device icon; acquiring a thumbnail of a first application program and a thumbnail of a second application program, where the first application program and the second application program are application programs running on the electronic device, and the multi-task interface further comprises a second display area used to display the thumbnail of the first application program and the thumbnail of the second application program; displaying the multi-task interface; and in response to a second operation of the user, projecting multimedia data associated with the first application program to the first electronic device.
Therefore, on the one hand, the screen projection method provided by the embodiments of the present application can further determine the positional relationship (such as direction and distance) between the mobile phone and other devices on the premise that a wireless communication connection has been established between the devices, thereby improving the precision and efficiency with which the user selects a receiving device; on the other hand, through the above operation, the user can select, on the multi-task interface of the mobile phone, the application program and the electronic device that will receive the projection, thereby projecting the application to the electronic device and improving the efficiency of the projection operation.
In one possible implementation manner, the acquiring the positions of the first electronic device and the second electronic device is specifically: and acquiring the positions of the first electronic device and the second electronic device through an ultra-wideband positioning technology or an antenna array technology.
That is, on the premise that a wireless communication connection has been established between devices, nearby electronic devices can be further screened through an ultra-wideband positioning technology or an antenna array technology, which reduces the number of electronic devices discovered by the mobile phone, spares the user the trouble of searching for and identifying the target device among a large number of devices over a wide range, avoids selecting the wrong device during projection, and improves the accuracy and efficiency with which the user selects a receiving device.
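The screening step described above can be illustrated with a short sketch (an illustrative assumption, not the patent's implementation; the function name, field names, and thresholds are invented for the example): devices whose ranging results fall outside a distance limit or outside the sector the phone points at are dropped.

```python
def screen_nearby(devices, max_distance_m=8.0, half_fov_deg=60.0):
    """Keep only connected devices whose UWB/antenna-array ranging result
    lies within a distance limit and within the pointing sector.
    Each device is a dict with 'distance_m' and 'azimuth_deg' keys
    (azimuth 0 = directly in front of the phone)."""
    return [
        d for d in devices
        if d["distance_m"] <= max_distance_m
        and abs(d["azimuth_deg"]) <= half_fov_deg
    ]
```

Only the devices that survive this filter would then be rendered as icons on the multi-task interface.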
According to a first aspect, in a possible implementation manner, the first display area includes a coordinate system, and the positions of the first electronic device icon and the second electronic device icon on the multitasking interface are determined according to the positions of the first electronic device and the second electronic device, specifically: and determining the positions of the first electronic equipment icon and the second electronic equipment icon in a coordinate system in the multi-task interface according to the positions of the first electronic equipment and the second electronic equipment.
The coordinate system may include a center point, a horizontal axis, and a vertical axis, where the center point may represent the mobile phone, the horizontal axis represents the left-right direction relative to the mobile phone, and the vertical axis represents the front-back direction relative to the mobile phone. The positions of the first electronic device icon and the second electronic device icon in the coordinate system of the multi-task interface are determined according to the positions of the first electronic device and the second electronic device; that is, the position/direction of each electronic device relative to the mobile phone is displayed, and the position of each electronic device icon indicates the relative position of the corresponding electronic device and the mobile phone. In addition, the distance between an electronic device icon and the center point indicates the distance between the corresponding electronic device and the mobile phone: the closer the icon is to the center point, the closer the corresponding electronic device is to the mobile phone.
That is, on the one hand, screening by the positional relationship (such as direction and distance) between the mobile phone and other devices reduces the number of electronic devices discovered by the mobile phone, which in turn improves the accuracy and efficiency with which the user selects a receiving device; on the other hand, since the positions of the first electronic device icon and the second electronic device icon in the coordinate system of the multi-task interface are determined according to the positions of the first electronic device and the second electronic device, the user can see the relative positions and directions of nearby devices on the multi-task interface of the mobile phone, which also improves the accuracy and efficiency of selecting a receiving device.
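As a minimal sketch of the mapping just described (the function name, pixel radius, and maximum range are assumptions for illustration), a device's measured distance and azimuth can be converted to icon coordinates in the interface's coordinate system, with the phone at the center point:

```python
import math

def icon_position(distance_m, azimuth_deg, max_range_m=10.0, radius_px=400):
    """Map a device's distance and azimuth (0 deg = straight ahead) to icon
    coordinates: x along the horizontal (left-right) axis, y along the
    vertical (front-back) axis, phone at the origin. Closer devices yield
    icons closer to the center point."""
    r = min(distance_m / max_range_m, 1.0) * radius_px  # clamp to display radius
    theta = math.radians(azimuth_deg)
    return round(r * math.sin(theta)), round(r * math.cos(theta))
```

For example, `icon_position(5.0, 90.0)` places a device 5 m to the phone's right at `(200, 0)` in this sketch's coordinate system.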
In one possible implementation manner, the second operation is specifically dragging the thumbnail of the first application program to the first electronic device icon.
That is, through the above operation, the user can select, on the multi-task interface of the mobile phone, the application and the electronic device at which the mobile phone is pointed, thereby completing the projection of the application to the electronic device, making the whole human-computer interaction process more natural and friendly and improving the user experience.
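One plausible way to resolve this drag gesture (again a sketch; the hit-test radius and data layout are assumptions) is to test the point where the thumbnail is released against each device icon:

```python
def drop_target(drop_x, drop_y, device_icons, hit_radius_px=60):
    """Return the name of the device icon the thumbnail was dropped on,
    or None if the drop point is not on any icon (no projection starts)."""
    for dev in device_icons:
        dx, dy = drop_x - dev["x"], drop_y - dev["y"]
        if dx * dx + dy * dy <= hit_radius_px * hit_radius_px:
            return dev["name"]
    return None
```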
In a possible implementation manner, after projecting the multimedia data associated with the first application to the first electronic device, the method further includes: in response to a third operation of the user, stopping projecting the multimedia data associated with the first application to the first electronic device, and projecting multimedia data associated with the second application to the first electronic device.
That is, the user can switch the projected application with a single operation on the multi-task interface, without performing multiple steps (for example, first opening the first application, selecting to stop the projection, then opening the second application, and starting the projection), so the projected content can be replaced very conveniently and the user's privacy is protected from leakage.
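This one-step switch can be modeled by a small projection-state table (a toy model, not the patent's implementation; the class and method names are invented): switching stops the old route and starts the new one in a single call.

```python
class CastSession:
    """Tracks which application's multimedia data is routed to which devices."""

    def __init__(self):
        self.routes = {}  # application name -> set of receiving device names

    def start(self, app, device):
        self.routes.setdefault(app, set()).add(device)

    def stop(self, app, device=None):
        if app in self.routes:
            if device is None:
                del self.routes[app]  # stop projecting to all devices
            else:
                self.routes[app].discard(device)

    def switch(self, old_app, new_app, device):
        """Third-operation analogue: replace old_app with new_app on device."""
        self.stop(old_app, device)
        self.start(new_app, device)
```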
In a possible implementation manner according to the first aspect, the method further includes: and in response to a fourth operation of the user, projecting multimedia data associated with the second application to the second electronic device.
That is, after the mobile phone has projected the first application to the first electronic device, the user may further project the second application on the mobile phone to the second electronic device through the fourth operation, thereby conveniently projecting multiple applications to multiple electronic devices from the multi-task interface. At this time, the first application on the mobile phone outputs its multimedia data to the first electronic device for playback, and the second application on the mobile phone outputs its multimedia data to the second electronic device for playback. The user can thus simultaneously enjoy the multimedia data played by the different electronic devices.
In a possible implementation manner according to the first aspect, the method further includes: in response to a fifth operation of the user, stopping projecting the multimedia data associated with the first application to the first electronic device.
That is, the user can stop the projection with a single operation on the multi-task interface, without performing multiple steps (for example, first opening the first application and then selecting to stop the projection), so the projection can be ended very conveniently and rapidly, improving the user experience.
In one possible implementation manner, after the second operation of the user, the multi-task interface includes a control for stopping projection; and the fifth operation is specifically dragging the thumbnail of the first application program onto the control for stopping projection.
That is, the user can stop the projection by dragging the thumbnail of the application program onto the control for stopping projection on the multi-task interface, without performing multiple steps (for example, opening the first application and then selecting to stop the projection), so the projection can be ended very conveniently.
In a possible implementation manner according to the first aspect, the method further includes: in response to a sixth operation of the user, projecting multimedia data associated with the first application to the first electronic device and the second electronic device.
That is, in a scenario where a single application needs to be projected to a plurality of devices, the user can project one application to a plurality of electronic devices with a single operation on the multi-task interface, and the plurality of electronic devices may be electronic devices at which the mobile phone is pointed and with which wireless communication connections have been established; on the premise of limiting the devices to be projected to, this makes projecting to multiple devices more convenient and faster.
In a possible implementation manner according to the first aspect, the sixth operation is specifically sliding the thumbnail of the first application program towards the first direction.
The first direction may be upward or other directions, and the sliding may be a two-finger or three-finger sliding operation.
That is, in a scenario where a single application program needs to be projected to a plurality of devices, the user can project one application to a plurality of electronic devices simply by sliding the thumbnail of the application program on the multi-task interface in the first direction, and the plurality of electronic devices may be electronic devices at which the mobile phone is pointed and with which wireless communication connections have been established; on the premise of limiting the devices to be projected to, this makes projecting to multiple devices more convenient and faster.
In a possible implementation manner according to the first aspect, the method further includes: in response to a seventh operation of the user, stopping projecting the multimedia data associated with the first application to the first electronic device and the second electronic device.
In other words, in a scenario where projection of a single application program to a plurality of devices needs to be ended, the user can stop projecting one application program to the plurality of devices with a single operation on the multi-task interface, making it more convenient and faster to end projection of the application program to multiple devices.
In a possible implementation manner according to the first aspect, the seventh operation is specifically sliding the thumbnail of the first application program towards the second direction.
Wherein the second direction may be downward or other direction and the sliding may be a two-finger or three-finger sliding operation.
In other words, in a scenario where projection of a single application program to a plurality of devices needs to be ended, the user can stop projecting one application program to the plurality of devices simply by sliding the thumbnail of the first application program on the multi-task interface in the second direction, making it more convenient and faster to end projection of the application program to multiple devices.
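The sixth and seventh operations together amount to classifying a multi-finger swipe on an application thumbnail; a minimal sketch (direction convention, finger count, and threshold are assumptions) could be:

```python
def classify_swipe(dy_px, fingers, threshold_px=80):
    """Map a multi-finger vertical swipe on an app thumbnail to an action:
    upward (negative dy on a screen whose y axis grows downward) projects
    the application to all nearby devices; downward stops all of them."""
    if fingers < 2 or abs(dy_px) < threshold_px:
        return "none"  # too few fingers or too short a swipe
    return "cast_to_all" if dy_px < 0 else "stop_all"
```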
In a second aspect, the present application provides a mobile phone, comprising: a touch screen, a communication interface, a positioning device, one or more processors, a memory, and one or more computer programs; wherein the processor is coupled to the touch screen, the communication interface, the positioning device, and the memory, the one or more computer programs are stored in the memory, and when the mobile phone runs, the processor executes the one or more computer programs stored in the memory to cause the mobile phone to perform the screen projection method described in any one of the above.
In a third aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the screen projection method according to any one of the first aspect.
In a fourth aspect, the present application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the screen projection method according to any one of the first aspect.
It can be appreciated that the mobile phone according to the second aspect, the computer storage medium according to the third aspect, and the computer program product according to the fourth aspect are all configured to perform the corresponding methods provided above; therefore, for the advantageous effects they can achieve, reference may be made to the advantageous effects of the corresponding methods provided above, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a communication system according to an embodiment of the present application;
fig. 2A is a schematic diagram of the structure of an electronic device according to an embodiment of the present application;
fig. 2B is a schematic diagram of an electronic device with an antenna array according to an embodiment of the present application;
fig. 3 is a schematic diagram of the software structure of an electronic device according to an embodiment of the present application;
fig. 4A is a first schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 4B is a second schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 4C is a third schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 4D is a fourth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 4E is a fifth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 4F is a sixth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 5 is a seventh schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 6A is an eighth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 6B is a ninth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 7A is a tenth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 7B is an eleventh schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 8A is a twelfth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 8B is a thirteenth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 9A is a fourteenth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 9B is a fifteenth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 9C is a sixteenth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 9D is a seventeenth schematic diagram of an application scenario of a screen projection method according to an embodiment of the present application;
fig. 10 is a first schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 11 is a second schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 12 is a third schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 13 is a fourth schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 14 is a fifth schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 15 is a sixth schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 16 is a seventh schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 17 is an eighth schematic flowchart of a screen projection method according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in the present application refers to and encompasses any and all possible combinations of one or more of the listed items.
Embodiments of an electronic device, a graphical user interface for such an electronic device, and methods for using such an electronic device are described below. In some embodiments, the electronic device may be a portable electronic device that also includes other functions such as personal digital assistant and/or music player functions, for example a mobile phone, a tablet computer, or a wearable electronic device with wireless communication capability (e.g., a smart watch). Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running various operating systems. The portable electronic device may also be another type of portable electronic device, such as a laptop computer (Laptop) with a touch-sensitive surface or touch panel. It should also be understood that in other embodiments, the electronic device may not be a portable electronic device but a desktop computer with a touch-sensitive surface or touch panel.
The term "user interface (UI)" in the description, claims, and drawings of the present application is a medium interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface of an application program is source code written in a specific computer language such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the terminal device, and is finally presented as content that the user can recognize, such as pictures, text, and buttons. Controls, also known as widgets, are the basic elements of a user interface; typical controls include toolbars, menu bars, text boxes, buttons, scrollbars, pictures, and text. The attributes and content of controls in an interface are defined by tags or nodes; for example, XML specifies the controls contained in an interface through nodes such as <Textview>, <ImgView>, and <VideoView>. A node corresponds to a control or attribute in the interface, and after being parsed and rendered, the node is presented as content visible to the user. In addition, the interfaces of many applications, such as hybrid applications, typically contain web pages. A web page, also referred to as a page, can be understood as a special control embedded in an application interface; it is source code written in a specific computer language such as hypertext markup language (HTML), cascading style sheets (CSS), or JavaScript (JS), and the web page source code can be loaded and displayed as user-recognizable content by a browser or a web page display component with browser-like functionality.
The specific content contained in a web page is also defined by tags or nodes in the web page source code; for example, HTML defines the elements and attributes of a web page by <p>, <img>, <video>, and <canvas>.
A commonly used presentation form of the user interface is a graphical user interface (GUI), which refers to a user interface, displayed in a graphical manner, that is related to computer operations. It may be an interface element such as an icon, a window, or a control displayed on the display screen of the electronic device, where controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and Widgets.
The following embodiments of the present application provide a screen projection method, a graphical user interface, an electronic device, and a system, so that the electronic device can quickly project a screen from the multi-task interface and can search for nearby devices for projection using information such as direction and distance, thereby improving the screen projection efficiency of the electronic device.
In the following embodiments of the present application, screen projection may be a service or function provided by the electronic device, and may support the electronic device in transmitting data to other devices in a specified direction. In some embodiments, screen projection may support the electronic device in transmitting data to nearby devices through one or more of Bluetooth, Wi-Fi direct (wireless fidelity direct), Wi-Fi software access point (software access point, softAP), millimeter wave (mmWave), and ultra wideband (UWB) technologies. Nearby devices are devices discovered by the electronic device through one or more of Bluetooth, Wi-Fi direct (e.g., Wi-Fi P2P), Wi-Fi softAP, Wi-Fi LAN, millimeter wave (mmWave), and ultra wideband (UWB).
It should be understood that "screen projection" is merely a term used in the embodiments; its meaning has been described herein, and the name itself should not constitute a limitation on the embodiments.
The following describes a method for directional screen projection according to an embodiment of the present application, which may be applied to a communication system 10 shown in fig. 1.
Fig. 1 illustrates a communication system 10 for directional screen projection according to an embodiment of the present application. The communication system 10 may include an electronic device 100 and one or more nearby electronic devices, such as a first electronic device 101, a second electronic device 102, a third electronic device 103, and so on.
The electronic device 100 may be a portable electronic device such as a mobile phone or a tablet computer. Specifically, the electronic device 100 may have one or more of a Bluetooth module, a WLAN module, a millimeter wave (mmWave) antenna module, and an ultra wideband (UWB) antenna module. The electronic device 100 may detect, scan, or directionally detect/scan devices in its vicinity by transmitting signals through one or more of these modules. Thus, the electronic device 100 may discover, or directionally discover, nearby devices using one or more short-range wireless communication protocols among Bluetooth, WLAN, millimeter wave (mmWave), and ultra wideband (UWB), establish a wireless communication connection with a nearby device, and share data with the nearby device through one or more of those protocols.
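Once nearby devices have been discovered with direction and distance information, selecting projection targets reduces to a simple geometric filter. The following sketch is purely illustrative (the sector model, field names, and thresholds are assumptions, not the actual discovery logic of device 100):

```python
def in_sector(bearing_deg, center_deg, width_deg):
    """Return True if a device bearing lies inside a scan sector centered
    on center_deg with total angular width width_deg. Angles wrap at 360."""
    diff = ((bearing_deg - center_deg + 180.0) % 360.0) - 180.0
    return abs(diff) <= width_deg / 2.0

def filter_directional(devices, center_deg, width_deg, max_distance_m):
    """Keep only devices inside the pointing sector and within range.
    `devices` is a list of (name, bearing_deg, distance_m) tuples."""
    return [name for name, bearing, dist in devices
            if in_sector(bearing, center_deg, width_deg) and dist <= max_distance_m]

# Hypothetical discovery results: the tablet is roughly straight ahead,
# the TV is behind the user, the speaker is ahead but out of range.
nearby = [("tablet", 5.0, 2.0), ("tv", 170.0, 3.5), ("speaker", 12.0, 12.0)]
```

With a 30-degree sector pointing at 0 degrees and a 10 m range, only the tablet would be offered as a projection target.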
The first electronic device 101, the second electronic device 102, or the third electronic device 103 may be an electronic device such as a mobile phone, a tablet computer, or a personal computer having one or more of a bluetooth module, a WLAN module, a millimeter wave (mmWave) antenna module, and an Ultra Wideband (UWB) antenna module, or may be another device such as a printer, a projector, a display, or a speaker having one or more of a bluetooth module, a WLAN module, a millimeter wave (mmWave) antenna module, and an Ultra Wideband (UWB) antenna module.
It will be appreciated that the configuration shown in this embodiment is not a particular limitation of communication system 10. In other embodiments of the present application, communication system 10 may include more or fewer devices than shown. For example, communication system 10 may also include other electronic devices having one or more of a Bluetooth module, a WLAN module, a millimeter wave (mmWave) antenna module, and an ultra-wideband (UWB) antenna module.
Fig. 2A shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided, reducing the latency of the processor 110 and thus improving the efficiency of the electronic device 100.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments, the electronic device 100 may also employ different interfaces in the above embodiments, or a combination of interfaces.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
Fifth-generation (5G) communication technology reached large-scale commercial use around 2020, and the new communication technology places new requirements on mobile device antennas and base station antennas. The frequency bands used in 5G include the millimeter wave band (mmWave, 30 GHz-100 GHz). Therefore, to support millimeter wave communication, a terminal supporting 5G communication is provided with a millimeter wave antenna. Compared with traditional microwave bands, millimeter wave signals propagate in a quasi-optical manner: propagation is dominated by the direct path and reflected paths, with no obvious scattering or diffraction, so signals travel along a small number of spatially discrete paths and the number of effective paths is very limited. On millimeter wave channels lacking scattering, diversity gain can be obtained by means of space division multiplexing (e.g., MU-MIMO technology), so that signal quality is improved remarkably.
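The band above is called "millimeter wave" because the free-space wavelength at 30-100 GHz falls in the millimeter range. A one-line check:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_mm(freq_hz):
    """Free-space wavelength in millimeters for a given carrier frequency."""
    return C / freq_hz * 1000.0

# At the band edges: roughly 10 mm at 30 GHz, roughly 3 mm at 100 GHz.
low_edge = wavelength_mm(30e9)
high_edge = wavelength_mm(100e9)
```

These millimeter-scale wavelengths are what make the short antennas and dense antenna arrays described next physically feasible.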
The shorter the wavelength of the millimeter wave signal, the shorter the antenna. To meet transmission rate requirements, a 5G terminal may have multiple millimeter wave antenna modules. Fig. 2B illustrates an electronic terminal with a millimeter wave antenna array, which may be the electronic device 100 of fig. 1. In fig. 2B, the electronic terminal 100 has 8 antennas forming an antenna array: two or more single antennas operating at the same frequency are fed and arranged in space to form an array, giving the 5G terminal beamforming capability. By adjusting the weight of each antenna module, the waveforms of the multiple antennas interfere constructively and destructively, forming a lobe pattern with nulls and a directivity maximum, so that the phased antenna array can transmit or receive signals in a specified direction. The multiple antennas on the electronic device may be laid out as a uniform linear array, a circular array, a rectangular array, or the like.
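The weight-and-superpose behavior described above can be sketched numerically. The model below is a hypothetical 8-element uniform linear array with half-wavelength spacing (not the actual antenna layout of device 100); per-element phase weights steer the directivity maximum toward a chosen angle:

```python
import cmath
import math

def array_factor(theta_deg, steer_deg, n_elements=8, spacing_wavelengths=0.5):
    """Normalized array factor of a uniform linear array whose per-element
    phase weights steer the main lobe toward steer_deg.
    Returns a value near 1.0 at the directivity maximum and near 0 at a null."""
    theta = math.radians(theta_deg)
    steer = math.radians(steer_deg)
    # Phase progression across elements: geometry term minus steering weight.
    phase_step = 2.0 * math.pi * spacing_wavelengths * (math.sin(theta) - math.sin(steer))
    total = sum(cmath.exp(1j * n * phase_step) for n in range(n_elements))
    return abs(total) / n_elements

# Steered to 30 degrees: coherent addition at 30 degrees, cancellation broadside.
peak = array_factor(30.0, 30.0)
off_axis = array_factor(0.0, 30.0)
```

At the steering angle all element contributions add in phase (array factor 1.0); away from it they cancel, which is exactly the lobe pattern with nulls and a maximum mentioned above.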
Current 5G millimeter wave antenna arrays are generally based on phased arrays. Specific implementations can be categorized as AoB (Antenna on Board), where the antenna array is located on the system motherboard; AiP (Antenna in Package), where the antenna array is located within the package of the chip; and AiM (Antenna in Module), where the antenna array and the RFIC form a module.
In order to achieve wider spatial coverage, a millimeter wave antenna array is generally designed by combining antenna types with complementary radiation beams (such as broadside radiation and end-fire radiation), for example patch antennas (patch antennas) and quasi-Yagi antennas (Quasi-Yagi antennas), and by appropriately designing the antenna feed points to achieve dual-polarized coverage, thereby greatly improving the range and coverage of millimeter wave signals.
The antenna 2 may comprise two or more antennas operating at the same frequency, forming an antenna array.
The invention is applicable to electronic terminals with antenna arrays, in particular to electronic terminals with millimeter wave antenna arrays.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), ultra Wideband (UWB), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via an antenna, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via an antenna. Illustratively, the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, or the like.
In some embodiments, a portion of the antenna of the electronic device 100 is coupled to the mobile communication module 150 and another portion of the antenna is coupled to the wireless communication module 160 so that the electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), millimeter wave (mmWave), BT, GNSS, WLAN, NFC, FM, UWB, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
In some embodiments, a Bluetooth (BT) module, WLAN module included in wireless communication module 160 may transmit signals to detect or scan for nearby devices of electronic device 100, such that electronic device 100 may discover nearby devices using wireless communication techniques such as bluetooth or WLAN, and establish wireless communication connections with nearby devices, and share data to nearby devices over the connections. Among other things, a Bluetooth (BT) module may provide a solution that includes one or more of classical bluetooth or bluetooth low energy (Bluetooth low energy, BLE) bluetooth communications. The WLAN module may provide a solution that includes one or more WLAN communications of Wi-Fi direct, wi-Fi LAN, or Wi-Fi softAP.
In some embodiments, a millimeter wave (mmWave) module included in the mobile communication module 150 may transmit or receive signals in a specified direction to detect or scan nearby devices of the electronic device 100 in the specified direction, such that the electronic device 100 may discover nearby devices in the specified direction using mobile communication technologies such as millimeter waves and establish mobile communication connections or wireless communication connections such as WLAN direct connections with nearby devices and share data to nearby devices through the connections.
The electronic device 100 may implement display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
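The RGB-to-YUV conversion mentioned above can be illustrated per pixel. The sketch below uses the BT.601 coefficients, which are one common definition; the codec on device 100 may use other variants:

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel to YUV using BT.601 luma coefficients,
    illustrating the format conversion the DSP performs on the
    digital image signal."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: weighted brightness
    u = 0.492 * (b - y)                      # blue-difference chroma
    v = 0.877 * (r - y)                      # red-difference chroma
    return y, u, v

# White has full luma and no chroma; black has neither.
white = rgb_to_yuv(255, 255, 255)
black = rgb_to_yuv(0, 0, 0)
```

Separating luma from chroma in this way is what lets later stages subsample color without visibly degrading brightness detail.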
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
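The frequency-bin analysis described above can be sketched with a direct discrete Fourier transform. This O(n^2) form is for illustration only; a DSP would use a fast (FFT) implementation:

```python
import cmath
import math

def bin_energies(samples):
    """Energy per frequency bin via a direct discrete Fourier transform,
    the kind of per-bin analysis delegated to the digital signal processor."""
    n = len(samples)
    energies = []
    for k in range(n):
        x_k = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        energies.append(abs(x_k) ** 2)
    return energies

# A pure tone occupying bin 3 of a 16-sample window.
tone = [math.sin(2 * math.pi * 3 * t / 16) for t in range(16)]
```

For a pure tone, essentially all the signal energy lands in the matching bin (and its mirror), which is how the processor identifies the dominant frequency.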
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, data such as music, photos, videos, etc. are stored in an external memory card.
The internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the electronic device 100 to perform the data sharing method provided in some embodiments of the present application, as well as various functional applications, data processing, and the like, by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area can store an operating system; the storage area may also store one or more applications (e.g., gallery, contacts, etc.), and so forth. The storage data area may store data (e.g., photos, contacts, etc.) created during use of the electronic device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or a voice message, voice may be received by placing the receiver 170B close to the human ear.
Microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The headset interface 170D may be a USB interface 130 or a 3.5mm open mobile electronic device platform (open mobile terminal platform, OMTP) standard interface, a american cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensor 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
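The threshold comparison in the short-message example above amounts to a small dispatch function. The threshold value below is an arbitrary placeholder, not a parameter specified by the embodiments:

```python
def dispatch_touch(intensity, first_pressure_threshold=0.5):
    """Map a touch on the messaging icon to an instruction by comparing
    the detected pressure intensity against the first pressure threshold,
    as in the example above. The threshold value is illustrative."""
    if intensity < first_pressure_threshold:
        return "view message"
    # Intensity greater than or equal to the threshold: a "firm press".
    return "create message"
```

The same touch position thus yields different instructions purely as a function of measured intensity.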
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby implementing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
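One common way to derive altitude from a barometric reading is the international barometric formula; whether the electronic device 100 uses this exact formula is not specified in the text, so the sketch below is a hedged illustration, and the sea-level reference pressure is an assumed constant.

```python
SEA_LEVEL_HPA = 1013.25  # standard sea-level pressure (assumed reference)

def altitude_from_pressure(pressure_hpa: float) -> float:
    """Approximate altitude in metres from a barometric reading in hPa,
    using the international barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / SEA_LEVEL_HPA) ** (1.0 / 5.255))
```

Lower measured pressure yields a higher estimated altitude, which is the quantity used to aid positioning and navigation.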
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Then, according to the detected opening/closing state of the holster or flip, characteristics such as automatic unlocking upon opening the flip are set.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, and is applied in landscape/portrait screen switching, pedometers, and similar applications.
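The landscape/portrait switching mentioned above can be sketched minimally from the gravity components reported when the device is still. The axis convention (y along the long edge of the screen) is an assumption for this example; real handsets additionally apply hysteresis and use the third axis.

```python
def orientation(ax: float, ay: float) -> str:
    """Classify device posture from gravity along the x (short edge) and
    y (long edge) axes of the stationary device."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```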
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, the electronic device 100 may use the distance sensor 180F to measure distance in order to achieve quick focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode, and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint-based unlocking, application-lock access, photographing, incoming call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
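The layered thresholds above can be sketched as a simple policy function. All threshold values are assumptions for illustration, and the returned action names are placeholders rather than real driver calls.

```python
HIGH_C, LOW_C, VERY_LOW_C = 45.0, 0.0, -10.0  # hypothetical thresholds (deg C)

def thermal_policy(temp_c: float) -> str:
    """Pick a temperature-processing action for a reported temperature."""
    if temp_c > HIGH_C:
        return "throttle_nearby_processor"     # reduce performance, cut power
    if temp_c < VERY_LOW_C:
        return "boost_battery_output_voltage"  # counter cold-induced shutdown
    if temp_c < LOW_C:
        return "heat_battery"                  # avoid abnormal low-temp shutdown
    return "normal"
```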
The touch sensor 180K may also be referred to as a touch panel or touch-sensitive surface. The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touchscreen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from the display screen 194.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive the blood-pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood-pressure beat signal obtained by the bone conduction sensor 180M, so as to implement a heart rate detection function.
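Once beat timestamps have been extracted from the blood-pressure beat signal, the heart rate calculation itself is straightforward. The sketch below assumes such timestamps are already available; the signal processing that produces them is outside the scope of this illustration.

```python
def heart_rate_bpm(beat_times_s: list) -> float:
    """Average heart rate (beats per minute) from beat timestamps in seconds."""
    intervals = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```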
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration alerts as well as touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate the charging state and changes in battery level, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to make contact with or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time, and the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The electronic device 100 illustrated in the example of fig. 2A may display various graphical user interfaces described in various embodiments below through the display 194. The electronic apparatus 100 may detect a touch operation in each of the graphical user interfaces, such as a click operation (e.g., a touch operation on an icon, a double click operation) in each of the graphical user interfaces, a slide operation up or down in each of the graphical user interfaces, or an operation to perform a circle gesture, etc., through the touch sensor 180K. In some embodiments, the electronic device 100 may detect a motion gesture performed by the user holding the electronic device 100, such as shaking the electronic device, through the gyroscope sensor 180B, the acceleration sensor 180E, and the like. In some embodiments, the electronic device 100 may detect a non-touch gesture operation through the camera 193 (e.g., 3D camera, depth camera).
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the invention, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 3 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example, the management of call states (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the system top status bar in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, an indicator light blinks, and so on.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The software system shown in fig. 3 involves application presentation using the screen-casting capability (e.g., a video library, the multi-task interface); the application framework layer provides WLAN services and Bluetooth services, and the kernel layer and lower layers provide WLAN and Bluetooth capabilities and basic communication protocols.
The workflow of the electronic device 100 software and hardware is illustrated below in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking a touch operation whose corresponding control is the camera application icon as an example: the camera application invokes an interface of the application framework layer to start the camera application, which in turn starts the camera driver by invoking the kernel layer and captures a still image or video through the camera 193.
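The kernel-to-framework flow above can be modeled in a simplified, hypothetical way: the kernel layer wraps a touch into a raw input event, and the framework layer maps its coordinates to a control. The hit-box coordinates and control names below are invented for illustration and do not come from the actual system.

```python
import time

ICON_HITBOXES = {"camera_icon": (80, 120, 160, 200)}  # (x0, y0, x1, y1), assumed

def make_raw_event(x: int, y: int) -> dict:
    """Kernel layer: package a touch into a raw input event with coordinates
    and a timestamp."""
    return {"x": x, "y": y, "timestamp": time.time()}

def identify_control(event: dict) -> str:
    """Framework layer: find which control the event's coordinates fall in."""
    for name, (x0, y0, x1, y1) in ICON_HITBOXES.items():
        if x0 <= event["x"] <= x1 and y0 <= event["y"] <= y1:
            return name
    return "none"
```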
The following describes an application scenario related to the present application and some embodiments for implementing "multitasking interface" on a mobile phone, taking the mobile phone as an example of the electronic device 100.
When a user uses the mobile phone, an application in the mobile phone may be switched from running in the foreground to running in the background, and the user can control the mobile phone to display a multi-task management interface. The multi-task management interface is a graphical user interface for displaying and managing applications running in the background on electronic devices such as smartphones and tablet computers; it can display thumbnails of the currently running applications and may also be called a multi-task interface.
Fig. 4A illustrates a graphical user interface on a cell phone. The graphical user interface may be a Home screen (Home screen) for a mobile phone, and may include: status bars, calendar indicators, weather indicators, trays with commonly used application icons, and the like.
A Home key 11 is disposed below the mobile phone screen, and may be used to receive a user command. The Home key 11 may be a physical key or a virtual key.
As shown in fig. 4A and 4B, in response to a first operation by the user, the mobile phone calls up the multi-task interface 30. Specifically, the first operation may be the user pressing the Home key 11 twice, after which the mobile phone displays the multi-task interface. The manner of triggering the mobile phone to call up the multi-task interface is not limited here, and existing approaches may be used.
Fig. 4B illustrates a multi-tasking interface 30 displayed on a cell phone. The multi-tasking interface may support a variety of operations by a user on an application currently in operation, such as screen casting, screen casting withdrawal, application switching, viewing an operating application, closing an operating application, and the like.
As shown in fig. 4B, the multi-task interface 30 may include a first display area 31 and a second display area 32. The first display area may include a first electronic device icon 101', a second electronic device icon 102', and a third electronic device icon 103'; the second display area may include a thumbnail 201 of a first application, a thumbnail 202 of a second application, and a thumbnail 203 of a third application. The electronic devices here are devices that the mobile phone points at and with which a wireless communication connection has been established, and an application thumbnail is formed from a screenshot of the application's interface captured when the application is switched to background operation.
For example, the icons corresponding to the electronic devices within the pointing range of the mobile phone may be ordered according to a predetermined ordering rule, such as the first electronic device icon 101', the second electronic device icon 102', and the third electronic device icon 103'. In some embodiments, the predetermined ordering rule may be based on the distance between each electronic device and the mobile phone. For example, when three electronic device icons are displayed in the first display area 31, their order represents the distances between the corresponding electronic devices and the mobile phone: first electronic device icon 101' < second electronic device icon 102' < third electronic device icon 103'. Furthermore, in other embodiments, different electronic device icons may have different display effects in the first display area 31. For example, the first electronic device icon 101' may differ in appearance (e.g., color, size, etc.) from the other icons to indicate that its corresponding nearby device is closest to the electronic device, or that the signal strength from that nearby device is highest. In other embodiments, the predetermined ordering rule may be another rule, such as alphabetical order of the first letter of the device ID, which is not limited in this embodiment.
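The distance-based ordering rule can be sketched as follows: sort discovered devices nearest-first and flag the nearest one for a distinct display effect. The device names and distances are illustrative only.

```python
def order_device_icons(distances: dict) -> list:
    """Return (device, is_nearest) pairs sorted nearest-first, where the
    nearest device is flagged for a distinct display effect."""
    ranked = sorted(distances, key=distances.get)
    return [(name, i == 0) for i, name in enumerate(ranked)]
```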
In this way, by using the positioning between the mobile phone and other electronic devices to identify the electronic devices within the mobile phone's pointing range, the target device to which the user wants to cast can be quickly locked, implementing directional screen casting. This spares the user the trouble of searching for and identifying the target device among a large number of devices over a wide area, and avoids selecting the wrong device when casting.
It should be noted that the positioning and identification between the mobile phone and other electronic devices are implemented in different ways depending on the hardware. In some embodiments, position identification between the mobile phone and other electronic devices can be implemented through UWB chips pre-installed in the mobile phone and the other electronic devices.
It will be appreciated that the electronic device icon may also take other forms, such as the text "printer". In addition, the electronic device icon may also include the account number of the user using the device, e.g., a mobile phone icon including the text "MAC", where "MAC" is the user account. In other embodiments, the electronic device icon may also display the model number or other custom information of the electronic device, such as "HUAWEI P30".
It should be noted that there may be multiple electronic devices that the mobile phone points at and with which wireless communication connections have been established. Limited by the mobile phone's display interface, only some of the electronic device icons may be displayed in the multi-task interface, and the user may trigger the mobile phone to display the other electronic device icons through a sliding operation.
Similarly, there may be multiple applications in the background running state. Limited by the display interface of the electronic device, only the thumbnails of some applications may be displayed in the multi-task interface, and the user may trigger the electronic device to display the thumbnails of the other applications through a sliding operation.
In this embodiment, as shown in fig. 4C, in the first display area 31, a plurality of electronic device icons and the relative positions of the corresponding electronic devices and the mobile phone may be displayed. Wherein the first display area 31 may display a coordinate system 110, which may include a center point 1111 and a horizontal axis 1112 and a vertical axis 1113. The center point 1111 may represent a mobile phone, the horizontal axis 1112 may represent a left-right direction with respect to the mobile phone, and the vertical axis 1113 may represent a front-back direction with respect to the mobile phone. The multitasking interface may also display a first electronic device icon 101', a second electronic device icon 102', and a third electronic device icon 103' to represent nearby electronic devices found by the mobile phone, and display positions/directions of the electronic devices relative to the mobile phone, that is, positions of different electronic device icons indicate relative positions of the corresponding electronic devices and the mobile phone. For example, the electronic device icon 103' is located at the upper right of the center point 1111, which indicates that the actual electronic device 103 is located at the upper right of the cell phone. In addition, the distance between the electronic device icon and the center point 1111 indicates the distance between the corresponding electronic device and the mobile phone, and the closer the icon is to the center point, the closer the corresponding electronic device is to the mobile phone. It will be appreciated that the electronic device icons may also take other forms, such as numbers or other information customized by the device user. 
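Placing an icon so that its on-screen direction and distance from the center point 1111 mirror the device's real position relative to the phone can be sketched as a simple linear mapping. The scale factor is an assumed display parameter, not a value from the embodiment.

```python
SCALE_PX_PER_M = 10.0  # assumed pixels per metre

def icon_offset(dx_m: float, dy_m: float) -> tuple:
    """Screen offset from the centre point for a device located (dx, dy)
    metres from the phone; closer devices land closer to the centre."""
    return (dx_m * SCALE_PX_PER_M, dy_m * SCALE_PX_PER_M)
```

Under this mapping, a device half a metre away produces an icon nearer the center point than a device two metres away, matching the behavior described above.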
In other embodiments, the icons of the electronic devices closer to the center point may have different display effects than the icons of the electronic devices farther from the center point, e.g., the icons closer to the center point may differ in appearance (e.g., icon size, color, etc.) from the icons farther from the center point.
In addition, in some implementations, the electronic device icon in the first display area may also display device information of the discovered electronic device, such as "HUAWEI P30" or "Tom's notebook".
As shown in fig. 4D, the user performs a second operation, specifically dragging the thumbnail 201 of the first application onto the first electronic device icon 101'. The second operation is used to start the screen-casting function of the mobile phone and cast the application the user needs onto the selected electronic device.
As an extension of this embodiment of the present application, as shown in fig. 4E, the second operation may also be that the user first clicks the thumbnail 201 of the first application and then clicks the first electronic device icon 101', thereby starting the screen-casting function of the mobile phone. Alternatively, the user may first click the first electronic device icon 101' and then click the thumbnail 201 of the first application. The click may be a single click, a double click, or a long press, which is not limited in this embodiment of the application.
As an extension of the above embodiment, as shown in fig. 4F, the second operation may also be that the user first clicks the start-directional-casting icon 120 and then slides the thumbnail 201 of the first application toward the first electronic device 101 that is to receive the cast, thereby starting the screen-casting function of the mobile phone. The first display area 31 may give arrow prompts according to the electronic devices detected in different directions. In some embodiments, the mobile phone determines the contact positions at the beginning and end of the user's slide, and determines the direction of the slide from those two contact positions. It will be appreciated that the manner of determining the sliding direction is not limited to determining the contact positions at the beginning and end of the slide; the mobile phone may also determine the sliding direction by recording the finger's sliding track, etc., which is not limited in this embodiment.
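Determining the slide direction from the contact positions at the beginning and end of the slide can be sketched as below, under assumed screen coordinates (origin at the top left, y growing downward); the four-way classification is an illustrative simplification.

```python
def swipe_direction(start: tuple, end: tuple) -> str:
    """Classify a slide into left/right/up/down from its start and end
    contact positions (x, y) in screen coordinates."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

The detected direction can then be matched against the directions of the discovered electronic devices to select the cast target.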
As shown in fig. 5, in response to the user's second operation, the mobile phone starts the screen-casting function and casts the first application to the first electronic device 101. Further, the thumbnail of the dragged first application returns to its pre-drag position in the second display area and displays the text "screen-casting", and the third display area 33 of the multi-task interface is updated to display a control 301 ("stop screen-casting"). Furthermore, the mobile phone can display information indicating that the first application is being cast to the first electronic device, so that the user can conveniently view the current screen-casting state.
As shown in fig. 6A, if the user needs to switch the cast from the first application to the second application after the first application has been cast to the first electronic device 101 (as shown in fig. 5), a similar operation may be performed again on the multi-task interface to switch the cast application. Specifically, the user performs a third operation, which may be dragging the thumbnail 202 of the second application onto the first electronic device icon 101'.
It can be appreciated that if, after the first application has been cast to the first electronic device 101, the user needs to switch the first application's cast to the second device 102, the thumbnail 201 of the first application may be dragged onto the second electronic device icon 102' on the multi-task interface.
As shown in fig. 6B, in response to the user's third operation, the mobile phone may instruct the first device 101 to stop playing the first application, and cast the second application to the first electronic device 101. Specifically, the mobile phone may first send a "stop screen-casting" end signal to the first device 101, and then transmit the multimedia data output by the second application to the first device 101. In this way, playing the first application on the first electronic device is switched to playing the second application on the first electronic device; that is, the user can complete the switch of the cast application with a single operation on the multi-task interface, without a multi-step procedure (for example, first opening the first application, choosing to stop casting, then opening the second application, and starting casting). This makes it convenient to change the cast content, protects the user's privacy from leakage, and improves the user experience.
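The two-step signalling just described can be sketched as a hypothetical message sequence: first an end signal for the current cast, then the new application's media stream to the same device. The message names are invented; the real signalling protocol is not specified in the text.

```python
def switch_cast(sent_messages: list, new_app: str) -> None:
    """Append the two-step message sequence for switching the cast
    application on the same target device: stop signal, then new stream."""
    sent_messages.append("stop_casting")
    sent_messages.append("stream:" + new_app)
```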
As shown in fig. 7A, the user performs a fourth operation, specifically dragging the thumbnail 201 of the first application onto the second electronic device icon 102'. Next, as shown in fig. 7B, in response to the user's fourth operation, the mobile phone casts the first application to the second electronic device 102. Thus, after the mobile phone has cast the second application to the first electronic device, the user can cast the first application on the mobile phone to the second electronic device through the fourth operation, realizing convenient casting of multiple applications to multiple electronic devices from the multi-task interface. At this time, the first application on the mobile phone outputs its multimedia data to the second device for playing, and the second application on the mobile phone outputs its multimedia data to the first device for playing. The user can enjoy the multimedia data played by the different electronic devices at the same time, improving the user experience.
As shown in fig. 8A, the user performs a fifth operation, specifically dragging the thumbnail 201 of the first application onto the control 501 ("stop screen-casting"). Before the fifth operation, the first application has been cast to the second electronic device 102.
As an extension of this embodiment, the fifth operation may be a sliding operation that slides down or up; it may be an operation of first selecting an application thumbnail and then performing a circle-drawing gesture in the second display area; it may be an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device; or it may be a voice control operation, i.e., the user speaks the voice instruction "stop screen-casting". The specific form of the fifth operation is not limited in this embodiment.
As shown in fig. 8B, in response to the user's fifth operation, the mobile phone instructs the second electronic device 102 to stop playing the first application. Alternatively, the mobile phone may stop sending the display data generated when running the first application to the second electronic device. In addition, after the second electronic device stops running the first application, the mobile phone can switch the first application back to the mobile phone for continued display, so that the first application achieves a seamless handover between the mobile phone and the second electronic device.
As a refinement of this embodiment, after the fifth operation and before casting of the first application actually stops, the mobile phone may pop up a dialog box asking "stop casting the first application?", which contains two buttons, "yes" and "no". If the user selects "yes", the mobile phone stops casting the first application; if the user selects "no", the mobile phone continues casting it.
Illustratively, in one scenario (e.g., a multi-person meeting), a user needs to quickly and easily cast an application to multiple electronic devices. As shown in fig. 9A, the user performs a sixth operation, specifically long-pressing the thumbnail 201 of the first application with two fingers and then sliding upward continuously for a distance L greater than a predetermined length, so that the electronic device can recognize and determine the direction of the slide. The sixth operation starts the mobile phone's multi-device casting function, which casts the application selected by the user to a plurality of electronic devices.
Note that the two fingers in the sixth operation may instead be three fingers; the long press may be a single click or a double click; the upward slide may be customized to a slide in another direction, a slide into the first display area, and so on; this embodiment is not limited in this regard.
In some embodiments, the mobile phone determines the contact positions at the beginning and end of the user's slide, and determines the slide direction from those two positions. It will be appreciated that the slide direction need not be determined only from the start and end contact positions; the mobile phone may also determine it by, for example, recording the full finger trajectory, and this embodiment is not limited in this regard.
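The start-and-end method just described can be sketched as follows. This is a minimal illustration; the function name and the minimum-distance threshold (playing the role of the predetermined length L) are assumptions for illustration, not from the patent.

```python
# Hypothetical sketch: classify the slide direction from the contact
# positions at the start and end of the slide. Touch coordinates use
# the usual screen convention (y grows downward).

def swipe_direction(start, end, min_distance=50):
    """Return 'up', 'down', 'left', 'right', or None if the slide is
    shorter than `min_distance` (the predetermined length L)."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if (dx * dx + dy * dy) ** 0.5 < min_distance:
        return None  # slide too short to recognize reliably
    if abs(dy) >= abs(dx):
        return "up" if dy < 0 else "down"
    return "left" if dx < 0 else "right"
```

A production gesture recognizer would also track pointer count and press duration (two-finger long press), but the direction test reduces to this comparison of the two contact points.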
As shown in fig. 9B, in response to the sixth operation, the mobile phone starts the multi-device casting function and casts the first application to the first electronic device 101, the second electronic device 102, and the third electronic device 103, respectively. In some embodiments, the devices receiving the cast may be all electronic devices pointed at by the mobile phone that have established a wireless communication connection with it, or the electronic devices corresponding to all electronic device icons displayed in the first display area of the multi-task interface.
Therefore, in a scenario where a single application needs to be cast to multiple devices, the user can cast one application to multiple electronic devices with a one-step operation on the multi-task interface. Because the target devices can be limited to the connected devices pointed at by the mobile phone, multi-device casting is both quick and constrained to the devices that actually need it.
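The fan-out started by the sixth operation and ended by the seventh can be sketched as follows. The `send` callback and message fields are illustrative assumptions, not the patent's actual casting protocol.

```python
# Hypothetical sketch of multi-device casting: the sixth operation
# casts the selected application to every target device at once; the
# seventh operation stops all of those casts at the same time.

def start_multi_cast(app, devices, send):
    """Cast `app` to each device via the `send(device, message)`
    callback; return the set of devices now playing the application."""
    casting = set()
    for device in devices:
        send(device, {"action": "start_cast", "app": app})
        casting.add(device)
    return casting

def stop_multi_cast(app, casting, send):
    """Instruct every casting device to stop, then clear the session set."""
    for device in casting:
        send(device, {"action": "stop_cast", "app": app})
    casting.clear()
```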
As shown in fig. 9C, the user performs a seventh operation, specifically long-pressing the thumbnail 201 of the first application with two fingers and then sliding downward continuously for a distance L greater than a predetermined length, so that the electronic device can recognize and determine the direction of the slide. The seventh operation ends multi-device casting: casting of the selected application to all of the devices is stopped at the same time.
As shown in fig. 9D, in response to the seventh operation, the mobile phone instructs the first electronic device 101, the second electronic device 102, and the third electronic device 103 to stop playing the first application. Alternatively, the mobile phone may stop sending the display data generated when running the first application to those devices.
Therefore, in a scenario where casting of a single application to multiple devices needs to end, the user can stop all of the casts with a one-step operation on the multi-task interface, making it quicker and more convenient to end multi-device casting.
Referring to fig. 10, fig. 10 is a schematic flowchart of a multi-task interface screen casting method according to an embodiment of the present application, applied to the electronic device 100 shown in fig. 1, with a mobile phone taken as an example of the electronic device 100. As shown in the figure, the method comprises the following steps:
S401, in response to a first operation of a user, acquiring the positions of the first electronic device and the second electronic device.
The first operation opens the multi-task interface of the mobile phone. For example, after the user double-presses the Home key, the mobile phone displays the multi-task interface; for another example, after the user touches the multi-task button in the navigation bar, the mobile phone displays the multi-task interface. The manner of triggering the multi-task interface is not limited here and may follow existing approaches.
It will be appreciated that the wireless communication connection between the mobile phone and the other electronic devices may be established before the first operation, via at least one of Wi-Fi, Bluetooth, ultra-wideband (UWB), infrared data transmission, a wireless local area network, and the like. After the user performs the first operation, the mobile phone can measure the distance to each electronic device through an antenna array, UWB ranging, or the like, determine each electronic device whose distance to the mobile phone is smaller than a distance threshold to be a candidate electronic device, and display the candidates in the first display area. The mobile phone can sort the discovered nearby devices according to a preset sorting rule and display the device information of the sorted devices in the first display area.
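The candidate selection just described (distance threshold, then a preset sorting rule) can be sketched as follows. The function name and the nearest-first sorting rule are illustrative assumptions; the patent leaves the concrete rule open.

```python
# Hypothetical sketch: keep only devices whose measured distance
# (e.g. from UWB ranging) is below a threshold, then order them for
# display in the first display area, nearest first.

def candidate_devices(measurements, threshold_m=5.0):
    """`measurements` maps device name -> distance in meters; return
    the candidate device names ordered nearest-first."""
    nearby = [(d, name) for name, d in measurements.items() if d < threshold_m]
    return [name for d, name in sorted(nearby)]
```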
After the mobile phone discovers the first electronic device and the second electronic device, the mobile phone determines the relative positions of the nearby electronic devices and the mobile phone, and a specific process can be seen in fig. 17 and the description thereof.
S402, determining the positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, the second electronic device icon is used for identifying the second electronic device, and the multi-task interface comprises a first display area which is used for displaying the first electronic device icon and the second electronic device icon.
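One plausible way to carry out S402 is to map each device's bearing relative to the mobile phone onto a horizontal coordinate in the first display area, so the icons mirror the physical layout. The field-of-view angle and pixel width below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: place a device icon horizontally according to
# the device's bearing (0 = straight ahead of the phone, negative =
# to the left); bearings outside the field of view clamp to the edges.

def icon_x(bearing_deg, area_width=1080, fov_deg=120):
    """Map a bearing in degrees onto an x coordinate in the first
    display area of width `area_width` pixels."""
    half = fov_deg / 2
    clamped = max(-half, min(half, bearing_deg))
    return round((clamped + half) / fov_deg * area_width)
```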
Further, the devices corresponding to the first electronic device icon and the second electronic device icon displayed in the first display area of the multi-task interface are the first electronic device and the second electronic device that have established a wireless communication connection with the mobile phone and that have a certain positional relationship with it.
Further, if the pointing direction (posture) of the mobile phone changes, the relative positions of the nearby electronic devices with respect to the mobile phone change, and the electronic devices shown in the first display area are updated accordingly. The user can control the pointing direction of the mobile phone; among the nearby devices discovered in all directions, the mobile phone may give the device information of those in the pointed direction a different display effect, or may display only the device information of the nearby electronic devices in the pointed direction.
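The direction-based filtering just described can be sketched as follows; the tolerance angle and names are illustrative assumptions.

```python
# Hypothetical sketch: of the discovered devices, keep those whose
# absolute bearing lies within a tolerance of the phone's current
# heading, wrapping correctly around 360 degrees.

def pointed_devices(devices, heading_deg, tolerance_deg=30):
    """`devices` maps name -> absolute bearing in degrees; return the
    sorted names of devices in the pointed direction."""
    def diff(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)  # smallest angle between two bearings
    return sorted(n for n, b in devices.items()
                  if diff(b, heading_deg) <= tolerance_deg)
```

As the heading changes, re-running this filter yields the updated device set for the first display area.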
It should be noted that there may be many connected electronic devices pointed at by the mobile phone; limited by the display interface of the mobile phone, only some of them may be shown in the multi-task interface, and the user can trigger the mobile phone to display the others through a sliding operation.
S403, acquiring a thumbnail of the first application program and a thumbnail of the second application program; the first application program and the second application program are application programs running on the electronic equipment; the multi-task interface further comprises a second display area, wherein the second display area is used for displaying the thumbnail of the first application program and the thumbnail of the second application program.
Further, the application thumbnails displayed in the second display area of the multi-task interface are interface screenshots captured when the applications were switched to background operation.
It should be noted that there may be many applications running in the background; limited by the multi-task interface of the mobile phone, only the thumbnails of some of them may be shown, and the user can trigger the mobile phone to display the other thumbnails through a sliding operation.
S404, displaying the multi-task interface.
Specifically, the multi-task interface includes a first display area and a second display area. The first display area displays a first electronic device icon and a second electronic device icon according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, and the second electronic device icon is used for identifying the second electronic device. The second display area displays a thumbnail of the first application program and a thumbnail of the second application program; the thumbnail of the first application program is used for identifying the first application, and the thumbnail of the second application program is used for identifying the second application.
S405, in response to a second operation of the user, casting the multimedia data associated with the first application to the first electronic device.
The second operation is specifically dragging the thumbnail of the first application in the second display area to the first electronic device icon in the first display area, after which the mobile phone casts the multimedia data associated with the first application to the first electronic device.
As an extension to this embodiment, the second operation may be a downward or upward sliding operation; an operation of first selecting an application thumbnail and then drawing a circle gesture in the second display area; an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device; or a voice control operation, that is, the user speaking the voice instruction "cast". The specific form of the second operation is not limited in this embodiment.
As a refinement of this embodiment, after the second operation ends and before the casting function starts, the mobile phone may pop up a dialog box asking "cast the first application to the first electronic device?", which contains two buttons, "yes" and "no". If the user selects "yes", the mobile phone starts the casting function; if the user selects "no", it does not.
It should be noted that, during casting between the mobile phone and an electronic device in this embodiment, the transmitted data may include, but is not limited to, at least one of audio data, video data, picture data, web page data, and text data. For example, the mobile phone can send the display data output while running the WeChat application to a smart television through the casting function, and the smart television continues displaying the interface output by the WeChat application. For another example, if the WeChat application is also installed on the smart television, the mobile phone can send the smart television an instruction to open the WeChat application, so that the smart television opens its locally installed WeChat application in response to the instruction, and the WeChat application on the mobile phone continues running on the smart television. Moreover, when the mobile phone sends the instruction to open the WeChat application, the instruction can also carry the running data of the current WeChat application, for example, data indicating that the application is currently on the chat interface of the contact Sam. In this way, after the smart television opens its locally installed WeChat application, it can automatically jump to the chat interface of the contact Sam according to the running data, so that the multimedia data continues to play seamlessly when cast among multiple devices.
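The open-application instruction with carried running state can be sketched as a small message exchange. This is a minimal illustration under assumed field and function names, not the patent's actual wire format.

```python
import json

# Hypothetical sketch: the phone builds an "open application"
# instruction that may carry running data (e.g. which chat page is
# open); the target device opens its local copy of the application
# and jumps to that state, or reports that it must fall back to
# mirroring display data.

def build_open_instruction(app_id, running_data=None):
    msg = {"action": "open_app", "app": app_id}
    if running_data:
        msg["state"] = running_data  # e.g. current page and contact
    return json.dumps(msg)

def handle_instruction(raw, installed_apps):
    """Target-device side: return (app, state) to resume, or None if
    the app is not installed (caller falls back to display mirroring)."""
    msg = json.loads(raw)
    if msg["app"] not in installed_apps:
        return None
    return (msg["app"], msg.get("state"))
```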
After the mobile phone casts the multimedia data output by the first application to the first electronic device for playing, the mobile phone can continue running the first application and outputting the corresponding multimedia data. Alternatively, the mobile phone may stop running the first application. Of course, after the mobile phone casts the first application to the first electronic device, the user may also perform various operations on the first application on the first electronic device, which is not limited in the embodiments of the present application.
As a refinement of this embodiment, in response to the second operation, the mobile phone starts the casting function; the thumbnail of the first application that the user dragged to the first display area springs back to the second display area and displays the text "casting", and at the same time the third display area of the multi-task interface is updated to display a control ("stop casting").
In this way, the user can select, in the multi-task interface, both an application and an electronic device pointed at by the mobile phone, and thereby cast the application to that device; the whole human-computer interaction process is more natural and friendly, which improves the use experience.
Referring to fig. 11, fig. 11 is a schematic flowchart of a multi-task interface screen casting method according to an embodiment of the present application, applied to the electronic device 100 shown in fig. 1, with a mobile phone taken as an example of the electronic device 100. As shown in the figure, the method comprises the following steps:
S501, in response to a first operation of a user, acquiring the positions of the first electronic device and the second electronic device.
The first operation opens the multi-task interface of the mobile phone. For example, after the user double-presses the Home key, the mobile phone displays the multi-task interface; for another example, after the user touches the multi-task button in the navigation bar, the mobile phone displays the multi-task interface. The manner of triggering the multi-task interface is not limited here and may follow existing approaches.
It will be appreciated that the wireless communication connection between the mobile phone and the other electronic devices may be established before the first operation, via at least one of Wi-Fi, Bluetooth, ultra-wideband (UWB), infrared data transmission, a wireless local area network, and the like. After the user performs the first operation, the mobile phone can measure the distance to each electronic device through an antenna array, UWB ranging, or the like, determine each electronic device whose distance to the mobile phone is smaller than a distance threshold to be a candidate electronic device, and display the candidates in the first display area. The mobile phone can sort the discovered nearby devices according to a preset sorting rule and display the device information of the sorted devices in the first display area.
After the mobile phone discovers the first electronic device and the second electronic device, the mobile phone determines the relative positions of the nearby electronic devices and the mobile phone, and a specific process can be seen in fig. 17 and the description thereof.
S502, determining the positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, the second electronic device icon is used for identifying the second electronic device, and the multi-task interface comprises a first display area which is used for displaying the first electronic device icon and the second electronic device icon.
Further, the devices corresponding to the first electronic device icon and the second electronic device icon displayed in the first display area of the multi-task interface are the first electronic device and the second electronic device that have established a wireless communication connection with the mobile phone and that have a certain positional relationship with it.
Further, if the pointing direction (posture) of the mobile phone changes, the relative positions of the nearby electronic devices with respect to the mobile phone change, and the electronic devices shown in the first display area are updated accordingly. The user can control the pointing direction of the mobile phone; among the nearby devices discovered in all directions, the mobile phone may give the device information of those in the pointed direction a different display effect, or may display only the device information of the nearby electronic devices in the pointed direction.
It should be noted that there may be many connected electronic devices pointed at by the mobile phone; limited by the display interface of the mobile phone, only some of them may be shown in the multi-task interface, and the user can trigger the mobile phone to display the others through a sliding operation.
S503, acquiring a thumbnail of a first application program and a thumbnail of a second application program; the first application program and the second application program are application programs running on the electronic equipment; the multi-task interface further comprises a second display area, wherein the second display area is used for displaying the thumbnail of the first application program and the thumbnail of the second application program.
Further, the application thumbnails displayed in the second display area of the multi-task interface are interface screenshots captured when the applications were switched to background operation.
It should be noted that there may be many applications running in the background; limited by the display interface of the mobile phone, only the thumbnails of some of them may be shown in the multi-task interface, and the user can trigger the mobile phone to display the other thumbnails through a sliding operation.
S504, displaying the multi-task interface.
Specifically, the multi-task interface includes a first display area and a second display area. The first display area displays a first electronic device icon and a second electronic device icon according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, and the second electronic device icon is used for identifying the second electronic device. The second display area displays a thumbnail of the first application program and a thumbnail of the second application program; the thumbnail of the first application program is used for identifying the first application, and the thumbnail of the second application program is used for identifying the second application.
S505, in response to a second operation of the user, casting the multimedia data associated with the first application to the first electronic device.
The second operation is specifically dragging the thumbnail of the first application in the second display area to the first electronic device icon in the first display area, after which the mobile phone casts the multimedia data associated with the first application to the first electronic device.
As an extension to this embodiment, the second operation may be a downward or upward sliding operation; an operation of first selecting an application thumbnail and then drawing a circle gesture in the second display area; an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device; or a voice control operation, that is, the user speaking the voice instruction "cast". The specific form of the second operation is not limited in this embodiment.
As a refinement of this embodiment, after the second operation ends and before the casting function starts, the mobile phone may pop up a dialog box asking "cast the first application to the first electronic device?", which contains two buttons, "yes" and "no". If the user selects "yes", the mobile phone starts the casting function; if the user selects "no", it does not.
It should be noted that, during casting between the mobile phone and an electronic device in this embodiment, the transmitted data may include, but is not limited to, at least one of audio data, video data, picture data, web page data, and text data. For example, the mobile phone can send the display data output while running the WeChat application to a smart television through the casting function, and the smart television continues displaying the interface output by the WeChat application. For another example, if the WeChat application is also installed on the smart television, the mobile phone can send the smart television an instruction to open the WeChat application, so that the smart television opens its locally installed WeChat application in response to the instruction, and the WeChat application on the mobile phone continues running on the smart television. Moreover, when the mobile phone sends the instruction to open the WeChat application, the instruction can also carry the running data of the current WeChat application, for example, data indicating that the application is currently on the chat interface of the contact Sam. In this way, after the smart television opens its locally installed WeChat application, it can automatically jump to the chat interface of the contact Sam according to the running data, so that the multimedia data continues to play seamlessly when cast among multiple devices.
After the mobile phone casts the multimedia data output by the first application to the first electronic device for playing, the mobile phone can continue running the first application and outputting the corresponding multimedia data. Alternatively, the mobile phone may stop running the first application. Of course, after the mobile phone casts the first application to the first electronic device, the user may also perform various operations on the first application on the first electronic device, which is not limited in the embodiments of the present application.
As a refinement of this embodiment, in response to the second operation, the mobile phone starts the casting function; the thumbnail of the first application that the user dragged to the first display area springs back to the second display area and displays the text "casting", and at the same time the third display area of the multi-task interface is updated to display a control ("stop casting").
In this way, the user can select, in the multi-task interface, both an application and an electronic device pointed at by the mobile phone, and thereby cast the application to that device; the whole human-computer interaction process is more natural and friendly, which improves the use experience.
S506, in response to a third operation of the user, stopping casting the multimedia data associated with the first application to the first electronic device, and casting the multimedia data associated with the second application to the first electronic device.
Specifically, the third operation is dragging the thumbnail of the second application in the second display area to the first electronic device icon in the first display area, after which the mobile phone casts the multimedia data of the second application to the first electronic device in place of the first application.
As a refinement of this embodiment, after the third operation ends and before the casting function starts, the mobile phone may pop up a dialog box asking "cast the second application to the first electronic device?", which contains two buttons, "yes" and "no". If the user selects "yes", the mobile phone starts the casting function; if the user selects "no", it does not.
After the mobile phone casts the multimedia data output by the second application to the first electronic device for playing, the mobile phone can continue running the second application and outputting the corresponding multimedia data. Alternatively, the mobile phone may stop running the second application. Of course, after the mobile phone casts the second application to the first electronic device, the user may also perform various operations on the second application on the first electronic device, which is not limited in the embodiments of the present application.
As a refinement of this embodiment, in response to the third operation, the mobile phone starts the casting function; the thumbnail of the second application that the user dragged to the first display area springs back to the second display area and displays the text "casting", and at the same time the third display area of the multi-task interface is updated to display a control ("stop casting").
Therefore, the user can switch the cast application with a one-step operation on the multi-task interface, rather than a multi-step operation (for example, first opening the first application, selecting stop casting, then opening the second application, and starting casting). The cast content can thus be replaced very conveniently, which also protects the user's privacy from leakage and improves the user experience.
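The one-step switch of S506 can be sketched as follows: casting a new application to a device that is already playing another one stops the old cast and starts the new one in a single operation. All names and message fields are illustrative assumptions.

```python
# Hypothetical sketch: per-device cast sessions, where dragging a new
# application onto an occupied device icon replaces the old cast.

def switch_cast(sessions, device, new_app, send):
    """`sessions` maps device -> currently cast app (or is missing the
    key if nothing is cast). Returns the app that was replaced, if any."""
    old_app = sessions.get(device)
    if old_app is not None:
        send(device, {"action": "stop_cast", "app": old_app})
    send(device, {"action": "start_cast", "app": new_app})
    sessions[device] = new_app
    return old_app
```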
Referring to fig. 12, fig. 12 is a schematic flowchart of a multi-task interface screen casting method according to an embodiment of the present application, applied to the electronic device 100 shown in fig. 1, with a mobile phone taken as an example of the electronic device 100. As shown in the figure, the method comprises the following steps:
S601, in response to a first operation of a user, acquiring the positions of the first electronic device and the second electronic device.
The first operation opens the multi-task interface of the mobile phone. For example, after the user double-presses the Home key, the mobile phone displays the multi-task interface; for another example, after the user touches the multi-task button in the navigation bar, the mobile phone displays the multi-task interface. The manner of triggering the multi-task interface is not limited here and may follow existing approaches.
It will be appreciated that the wireless communication connection between the mobile phone and the other electronic devices may be established before the first operation, via at least one of Wi-Fi, Bluetooth, ultra-wideband (UWB), infrared data transmission, a wireless local area network, and the like. After the user performs the first operation, the mobile phone can measure the distance to each electronic device through an antenna array, UWB ranging, or the like, determine each electronic device whose distance to the mobile phone is smaller than a distance threshold to be a candidate electronic device, and display the candidates in the first display area. The mobile phone can sort the discovered nearby devices according to a preset sorting rule and display the device information of the sorted devices in the first display area.
After the mobile phone discovers the first electronic device and the second electronic device, the mobile phone determines the relative positions of the nearby electronic devices and the mobile phone, and a specific process can be seen in fig. 17 and the description thereof.
S602, determining the positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, the second electronic device icon is used for identifying the second electronic device, and the multi-task interface comprises a first display area which is used for displaying the first electronic device icon and the second electronic device icon.
Further, the first electronic device and the second electronic device corresponding to the first electronic device icon and the second electronic device icon displayed in the first display area of the multi-task interface are devices that have an established wireless communication connection with the mobile phone and have a certain positional relationship with the mobile phone.
Further, if the pointing direction (attitude) of the mobile phone changes, the relative positions of the nearby electronic devices and the mobile phone change, and the electronic devices in the first display area are updated accordingly. The user can control the pointing direction (attitude) of the mobile phone; among the nearby devices discovered in all directions, the mobile phone can apply a different display effect to the device information corresponding to the nearby devices in the pointed direction, or can display only the device information corresponding to the nearby electronic devices in the pointed direction.
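The direction-based filtering described above can be sketched as a sector test: a device is "in the pointed direction" when its bearing from the phone falls within some angle of the phone's heading. The 30-degree half-angle and the bearing values are illustrative assumptions.

```python
# Sketch: keep only devices whose bearing lies within a sector around the
# phone's pointing direction. Half-angle and bearings are assumed values.
SECTOR_HALF_ANGLE_DEG = 30.0

def angular_diff(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def devices_in_pointed_direction(devices, heading_deg,
                                 half_angle=SECTOR_HALF_ANGLE_DEG):
    """devices: iterable of (name, bearing_deg); bearing is the direction
    from the phone to the device."""
    return [name for name, bearing in devices
            if angular_diff(bearing, heading_deg) <= half_angle]

devices = [("TV", 10.0), ("speaker", 95.0), ("tablet", 350.0)]
print(devices_in_pointed_direction(devices, heading_deg=0.0))
# ['TV', 'tablet']
```

Note the wrap-around handling in `angular_diff`: a device at bearing 350 degrees is only 10 degrees from a heading of 0 degrees, so it stays in the sector.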
It should be noted that there may be a plurality of electronic devices with an established wireless communication connection in the direction pointed to by the mobile phone. Limited by the display interface of the mobile phone, only some of these electronic devices may be displayed in the multi-task interface, and the user may trigger the mobile phone to display the other electronic devices through a sliding operation.
S603, acquiring a thumbnail of the first application program and a thumbnail of the second application program; the first application program and the second application program are application programs running on the electronic equipment; the multi-task interface further comprises a second display area, wherein the second display area is used for displaying the thumbnail of the first application program and the thumbnail of the second application program.
Further, the application thumbnails displayed in the second display area of the multi-task interface are generated from interface screenshots captured when the application programs are switched to running in the background.
It should be noted that there may be a plurality of applications in the background running state. Limited by the display interface of the mobile phone, only the thumbnails of some applications may be displayed in the multi-task interface, and the user may trigger the mobile phone to display the thumbnails of the other applications through a sliding operation.
S604, displaying the multi-task interface.
Specifically, the multi-task interface includes a first display area and a second display area. The first display area displays a first electronic device icon and a second electronic device icon according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, and the second electronic device icon is used for identifying the second electronic device. The second display area displays a thumbnail of the first application program and a thumbnail of the second application program; the thumbnail of the first application program is used for identifying the first application, and the thumbnail of the second application program is used for identifying the second application.
And S605, responding to a second operation of the user, and projecting multimedia data associated with the first application to the first electronic device.
The second operation is specifically dragging the thumbnail of the first application program in the second display area onto the first electronic device icon in the first display area, after which the mobile phone casts the multimedia data associated with the first application to the first electronic device.
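The drag-to-cast operation reduces to a hit test: when the drag ends, the drop point is checked against each device icon's bounds to decide which device receives the cast. The coordinates and icon layout below are illustrative assumptions, not values from this embodiment.

```python
# Sketch: decide which device icon a dragged application thumbnail was
# dropped onto by hit-testing the drop point against icon rectangles.
def hit_test(drop_point, icon_rects):
    """drop_point: (x, y); icon_rects: dict name -> (x, y, width, height).
    Returns the name of the icon containing the point, or None."""
    px, py = drop_point
    for name, (x, y, w, h) in icon_rects.items():
        if x <= px <= x + w and y <= py <= y + h:
            return name
    return None

# Hypothetical layout of the first display area.
icons = {"first device": (40, 80, 96, 96), "second device": (160, 80, 96, 96)}
target = hit_test((70, 120), icons)
if target is not None:
    print("cast first application to", target)
```

If the drop point lands outside every icon, no cast is started and the thumbnail can simply spring back to the second display area.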
As an extension of this embodiment, the second operation may be a sliding operation that slides up or down, an operation of first selecting an application thumbnail and then performing a circle-drawing gesture in the second display area, an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device, or a voice control operation, that is, the user speaking a voice instruction such as "cast screen". The specific form of the second operation is not limited in this embodiment.
As a refinement of this embodiment, after the second operation ends and before the screen-casting function is started, the mobile phone may pop up a dialog box asking "Cast the first application to the first electronic device?", where the dialog box further includes two buttons, "yes" and "no". If the user selects "yes", the mobile phone starts the screen-casting function; if the user selects "no", the mobile phone does not start the screen-casting function.
It should be noted that, in the casting process between the mobile phone and the electronic device in this embodiment, the transmitted data may support, but is not limited to, at least one of audio data, video data, picture data, web page data, and text data. For example, the mobile phone can send the display data output when running the WeChat application program to the smart television through the screen-casting function, and the smart television continues to display the display interface output by the WeChat application program. For another example, if the WeChat application program is installed on the smart television, the mobile phone can send an instruction for opening the WeChat application program to the smart television, so that the smart television opens its locally installed WeChat application program in response to the instruction, and the WeChat application program on the mobile phone is cast to the smart television to continue running. Alternatively, when the mobile phone sends the instruction for opening the WeChat application program to the smart television, the instruction may also carry the running data of the current WeChat application program, for example, running data indicating that the current WeChat application program is on the chat interface of the contact Sam. In this way, after the smart television opens its locally installed WeChat application program, it can automatically jump to the chat interface of the contact Sam according to the running data, so that the multimedia data continues to play seamlessly when cast between multiple devices.
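The instruction-plus-running-data variant can be sketched as a small message builder. The field names and the JSON encoding below are illustrative assumptions; this embodiment does not define a concrete message format.

```python
import json

# Sketch: a cast instruction telling the receiving device to open its own
# copy of an application and resume at the sender's current state. The
# "action"/"running_state" field names are hypothetical, not a protocol
# defined by this embodiment.
def build_cast_instruction(app_name, running_state):
    return json.dumps({
        "action": "open_application",
        "application": app_name,
        "running_state": running_state,  # lets the receiver resume seamlessly
    })

msg = build_cast_instruction("WeChat", {"page": "chat", "contact": "Sam"})
print(msg)
```

On receipt, the smart television would open its locally installed application and use `running_state` to jump directly to the indicated page.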
After the mobile phone casts the multimedia data output by the first application to the first electronic device for playback, the mobile phone may continue running the first application to output the corresponding multimedia data. Alternatively, the mobile phone may stop running the first application. Of course, after the mobile phone casts the first application to the first electronic device, the user may also perform various operations on the first application on the first electronic device, which is not limited in the embodiments of the present application.
As a refinement of this embodiment, in response to the second operation of the user, the mobile phone starts the screen-casting function, the thumbnail of the first application program dragged to the first display area by the user springs back to the second display area and displays the text "casting", and at the same time the third display area of the multi-task interface is updated to display a "stop casting" control.
Therefore, through the above operation modes, the user can select, in the multi-task interface of the mobile phone, an application and an electronic device pointed to by the mobile phone, thereby completing the casting of the application to the electronic device. The whole human-computer interaction process is more natural and friendly, improving the user experience.
And S606, responding to a fourth operation of the user, and projecting the multimedia data associated with the second application to the second electronic equipment. Specifically, the fourth operation is dragging the thumbnail 202 of the second application to the second electronic device icon 102'.
Therefore, after the mobile phone has cast the first application to the first electronic device, the user can cast the second application on the mobile phone to the second electronic device through the fourth operation, realizing the convenient casting of a plurality of applications to a plurality of electronic devices from the multi-task interface. At this time, the first application in the mobile phone outputs multimedia data to the first electronic device for playback, and the second application in the mobile phone outputs multimedia data to the second electronic device for playback. The user can simultaneously enjoy the multimedia data played by the different electronic devices, improving the user experience.
Referring to fig. 13, fig. 13 is a schematic flow chart of a method for screen casting on a multi-task interface according to an embodiment of the present application, which is applied to the electronic device 100 shown in fig. 1, where a mobile phone is taken as an example of the electronic device 100; as shown in the figure, the multi-task interface screen-casting method comprises the following steps:
S701, responding to a first operation of a user, and acquiring positions of the first electronic device and the second electronic device.
S702, determining the positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, the second electronic device icon is used for identifying the second electronic device, and the multi-task interface comprises a first display area which is used for displaying the first electronic device icon and the second electronic device icon.
S703, acquiring a thumbnail of the first application program and a thumbnail of the second application program; the first application program and the second application program are application programs running on the electronic equipment; the multi-task interface further comprises a second display area, wherein the second display area is used for displaying the thumbnail of the first application program and the thumbnail of the second application program.
And S704, displaying the multi-task interface.
And S705, responding to a second operation of the user, and casting the multimedia data associated with the first application to the first electronic device. The second operation is specifically dragging the thumbnail of the first application program in the second display area onto the first electronic device icon in the first display area, after which the mobile phone casts the multimedia data associated with the first application to the first electronic device.
As an extension of this embodiment, the second operation may be a sliding operation that slides up or down, an operation of first selecting an application thumbnail and then performing a circle-drawing gesture in the second display area, an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device, or a voice control operation, that is, the user speaking a voice instruction such as "cast screen". The specific form of the second operation is not limited in this embodiment.
As a refinement of this embodiment, after the second operation ends and before the screen-casting function is started, the mobile phone may pop up a dialog box asking "Cast the first application to the first electronic device?", where the dialog box further includes two buttons, "yes" and "no". If the user selects "yes", the mobile phone starts the screen-casting function; if the user selects "no", the mobile phone does not start the screen-casting function.
After the mobile phone casts the multimedia data output by the first application to the first electronic device for playback, the mobile phone may continue running the first application to output the corresponding multimedia data. Alternatively, the mobile phone may stop running the first application. Of course, after the mobile phone casts the first application to the first electronic device, the user may also perform various operations on the first application on the first electronic device, which is not limited in the embodiments of the present application.
As a refinement of this embodiment, in response to the second operation of the user, the mobile phone starts the screen-casting function, the thumbnail of the first application program dragged to the first display area by the user springs back to the second display area and displays the text "casting", and at the same time the third display area of the multi-task interface is updated to display a "stop casting" control.
Therefore, through the above operation modes, the user can select, in the multi-task interface of the mobile phone, an application and an electronic device pointed to by the mobile phone, thereby completing the casting of the application to the electronic device. The whole human-computer interaction process is more natural and friendly, improving the user experience.
And S706, responding to a fifth operation of the user, and stopping casting the multimedia data associated with the first application to the first electronic device. Specifically, after responding to the second operation of the user, the multi-task interface includes a control for stopping the casting, and the fifth operation is specifically dragging the thumbnail of the first application program onto the control for stopping the casting. Before the fifth operation, the first application is in a casting state.
Further, the mobile phone may stop sending the display data generated while the first application is running to the first electronic device, or may send a signal for ending the casting to the first electronic device. In addition, after the first electronic device stops running the first application, the mobile phone may switch the first application back to the mobile phone for continued display, so that the first application achieves a seamless handover between the mobile phone and the first electronic device.
As an extension of this embodiment, the fifth operation may be a sliding operation that slides down or up, an operation of first selecting an application thumbnail and then performing a circle-drawing gesture in the second display area, an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device, or a voice control operation, that is, the user speaking a voice instruction such as "stop casting". The specific form of the fifth operation is not limited in this embodiment.
As a refinement of this embodiment, after the fifth operation ends and before the casting is stopped, the mobile phone may pop up a dialog box asking "Stop casting the first application?", where the dialog box further includes two buttons, "yes" and "no". If the user selects "yes", the mobile phone stops casting the first application; if the user selects "no", the mobile phone continues casting the first application.
Therefore, the user can stop the casting with only a one-step operation on the multi-task interface, without performing a multi-step operation (for example, first opening the first application and then selecting to stop the casting), so that the operation can be completed very conveniently, improving the user experience.
Referring to fig. 14, fig. 14 is a schematic flow chart of a method for screen casting on a multi-task interface according to an embodiment of the present application, which is applied to the electronic device 100 shown in fig. 1, where a mobile phone is taken as an example of the electronic device 100; as shown in the figure, the multi-task interface screen-casting method comprises the following steps:
S801, acquiring positions of a first electronic device and a second electronic device in response to a first operation of a user.
S802, determining the positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, the second electronic device icon is used for identifying the second electronic device, and the multi-task interface comprises a first display area which is used for displaying the first electronic device icon and the second electronic device icon.
S803, acquiring a thumbnail of the first application program and a thumbnail of the second application program; the first application program and the second application program are application programs running on the electronic equipment; the multi-task interface further comprises a second display area, wherein the second display area is used for displaying the thumbnail of the first application program and the thumbnail of the second application program.
S804, displaying the multi-task interface.
And S805, in response to a second operation of the user, casting the multimedia data associated with the first application to the first electronic device. The second operation is specifically dragging the thumbnail of the first application program in the second display area onto the first electronic device icon in the first display area, after which the mobile phone casts the multimedia data associated with the first application to the first electronic device.
As an extension of this embodiment, the second operation may be a sliding operation that slides up or down, an operation of first selecting an application thumbnail and then performing a circle-drawing gesture in the second display area, an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device, or a voice control operation, that is, the user speaking a voice instruction such as "cast screen". The specific form of the second operation is not limited in this embodiment.
As a refinement of this embodiment, after the second operation ends and before the screen-casting function is started, the mobile phone may pop up a dialog box asking "Cast the first application to the first electronic device?", where the dialog box further includes two buttons, "yes" and "no". If the user selects "yes", the mobile phone starts the screen-casting function; if the user selects "no", the mobile phone does not start the screen-casting function.
After the mobile phone casts the multimedia data output by the first application to the first electronic device for playback, the mobile phone may continue running the first application to output the corresponding multimedia data. Alternatively, the mobile phone may stop running the first application. Of course, after the mobile phone casts the first application to the first electronic device, the user may also perform various operations on the first application on the first electronic device, which is not limited in the embodiments of the present application.
As a refinement of this embodiment, in response to the second operation of the user, the mobile phone starts the screen-casting function, the thumbnail of the first application program dragged to the first display area by the user springs back to the second display area and displays the text "casting", and at the same time the third display area of the multi-task interface is updated to display a "stop casting" control.
Therefore, through the above operation modes, the user can select, in the multi-task interface of the mobile phone, an application and an electronic device pointed to by the mobile phone, thereby completing the casting of the application to the electronic device. The whole human-computer interaction process is more natural and friendly, improving the user experience.
And S806, responding to a sixth operation of the user, and casting the multimedia data associated with the first application to the first electronic device and the second electronic device. Specifically, the sixth operation may be sliding the thumbnail of the first application program in a first direction, or long-pressing the thumbnail 201 of the first application program with two fingers and continuing to slide in the first direction for a distance L, where L is greater than a predetermined length so that the electronic device can recognize and determine the direction in which the user slides.
The first direction may be upward or other directions, and the sliding may be a two-finger or three-finger sliding operation.
The sixth operation is used to start the multi-device casting function of the mobile phone, so that the application selected by the user can be cast to a plurality of electronic devices.
In some embodiments, the mobile phone determines the contact positions at the beginning and the end of the user's slide, respectively, and determines the direction of the user's slide based on those contact positions. It will be appreciated that the manner of determining the sliding direction is not limited to determining the contact positions at the beginning and the end of the slide; the mobile phone may also determine the sliding direction by recording the finger's sliding track, or the like, and this embodiment is not limited in this regard.
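The begin/end contact-point approach can be sketched as follows. Screen coordinates grow downward, so a decreasing y means an upward slide; the 150-pixel minimum length is an assumption standing in for the "predetermined length" L of this embodiment.

```python
import math

# Sketch: classify a slide from the contact positions at its beginning and
# end, and check that the slide exceeds a predetermined length before it is
# recognized. The 150 px minimum length is an assumed value.
def sliding_direction(start, end):
    """start, end: (x, y) contact positions at slide begin and slide end."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # y grows downward on screens

def slide_recognized(start, end, min_length_px=150):
    """The slide only counts if its length exceeds a predetermined length."""
    return math.hypot(end[0] - start[0], end[1] - start[1]) > min_length_px

start, end = (200, 900), (210, 300)
if slide_recognized(start, end):
    print(sliding_direction(start, end))  # 'up'
```

A track-recording variant would accumulate intermediate touch points instead of using only the two endpoints, as the text notes.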
That is, in a scenario where a single application program needs to be cast to a plurality of devices, the user can cast one application to a plurality of electronic devices simply by sliding the thumbnail of the application program in the multi-task interface in the first direction, and the plurality of electronic devices can be the electronic devices with an established wireless communication connection that the mobile phone points to; on the premise of limiting the devices to be cast to, this makes multi-device casting more convenient and faster.
Referring to fig. 15, fig. 15 is a schematic flow chart of a method for screen casting on a multi-task interface according to an embodiment of the present application, which is applied to the electronic device 100 shown in fig. 1, where a mobile phone is taken as an example of the electronic device 100; as shown in the figure, the multi-task interface screen-casting method comprises the following steps:
S901, acquiring positions of a first electronic device and a second electronic device in response to a first operation of a user.
S902, determining positions of a first electronic device icon and a second electronic device icon on a multi-task interface according to the positions of the first electronic device and the second electronic device; the first electronic device icon is used for identifying the first electronic device, the second electronic device icon is used for identifying the second electronic device, and the multi-task interface comprises a first display area which is used for displaying the first electronic device icon and the second electronic device icon.
S903, acquiring a thumbnail of the first application program and a thumbnail of the second application program; the first application program and the second application program are application programs running on the electronic equipment; the multi-task interface further comprises a second display area, wherein the second display area is used for displaying the thumbnail of the first application program and the thumbnail of the second application program.
And S904, displaying the multi-task interface.
And S905, responding to a second operation of the user, and casting the multimedia data associated with the first application to the first electronic device. The second operation is specifically dragging the thumbnail of the first application program in the second display area onto the first electronic device icon in the first display area, after which the mobile phone casts the multimedia data associated with the first application to the first electronic device.
As an extension of this embodiment, the second operation may be a sliding operation that slides up or down, an operation of first selecting an application thumbnail and then performing a circle-drawing gesture in the second display area, an operation of selecting an application thumbnail within a fixed time (e.g., 1 second) after shaking the electronic device, or a voice control operation, that is, the user speaking a voice instruction such as "cast screen". The specific form of the second operation is not limited in this embodiment.
As a refinement of this embodiment, after the second operation ends and before the screen-casting function is started, the mobile phone may pop up a dialog box asking "Cast the first application to the first electronic device?", where the dialog box further includes two buttons, "yes" and "no". If the user selects "yes", the mobile phone starts the screen-casting function; if the user selects "no", the mobile phone does not start the screen-casting function.
After the mobile phone casts the multimedia data output by the first application to the first electronic device for playback, the mobile phone may continue running the first application to output the corresponding multimedia data. Alternatively, the mobile phone may stop running the first application. Of course, after the mobile phone casts the first application to the first electronic device, the user may also perform various operations on the first application on the first electronic device, which is not limited in the embodiments of the present application.
As a refinement of this embodiment, in response to the second operation of the user, the mobile phone starts the screen-casting function, the thumbnail of the first application program dragged to the first display area by the user springs back to the second display area and displays the text "casting", and at the same time the third display area of the multi-task interface is updated to display a "stop casting" control.
Therefore, through the above operation modes, the user can select, in the multi-task interface of the mobile phone, an application and an electronic device pointed to by the mobile phone, thereby completing the casting of the application to the electronic device. The whole human-computer interaction process is more natural and friendly, improving the user experience.
And S906, responding to a seventh operation of the user, and stopping casting the multimedia data associated with the first application to the first electronic device and the second electronic device.
Specifically, the seventh operation may be sliding the thumbnail of the first application program in a second direction, or long-pressing the thumbnail 201 of the first application program with two fingers and continuing to slide in the second direction for a distance L, where L is greater than a predetermined length so that the electronic device can recognize and determine the direction in which the user slides.
Wherein the second direction may be downward or other direction and the sliding may be a two-finger or three-finger sliding operation.
The seventh operation is used to end the casting to the plurality of devices; the casting of the application selected by the user to the plurality of devices may be stopped simultaneously.
In some embodiments, the mobile phone determines the contact positions at the beginning and the end of the user's slide, respectively, and determines the direction of the user's slide based on those contact positions. It will be appreciated that the manner of determining the sliding direction is not limited to determining the contact positions at the beginning and the end of the slide; the mobile phone may also determine the sliding direction by recording the finger's sliding track, or the like, and this embodiment is not limited in this regard.
Therefore, in a scenario where the casting of a single application to a plurality of devices needs to be ended, the user can stop casting one application to the plurality of devices with only a one-step operation on the multi-task interface, so that ending multi-device casting is more convenient and faster.
In some embodiments, after the mobile phone starts the screen-casting function, it establishes a communication connection with the electronic device receiving the cast, and the data of the application program selected for casting starts to be transmitted after the communication connection is established. The transmitted data may support, but is not limited to, at least one of audio data, video data, picture data, web page data, and text data.
Fig. 16 illustrates a process in which, after the electronic device opens the multi-task interface, all nearby devices are discovered, the relative positions of the nearby devices and the electronic device 100 are determined, and a connection is established with the selected receiving device and data is transmitted. As shown in fig. 16, when the electronic device 100 detects that the multi-task interface is opened, the electronic device 100 is triggered to discover nearby devices. The user may perform the first operation in the graphical user interface shown in fig. 4A to open the multi-task interface.
The electronic device 100 may discover nearby devices in several ways:
Mode 1: discovering nearby devices using the Wi-Fi Direct wireless communication technology
In some embodiments, the electronic device 100 may broadcast a probe request (probe request) outwards. After a nearby device receives the probe request, it may respond with a probe response (probe response) to announce its presence. In other embodiments, nearby devices may periodically send out Beacon (Beacon) frames, and the electronic device 100 may discover nearby devices by listening for the Beacon frames they send.
That is, the electronic device 100 may actively discover nearby devices or passively discover nearby devices.
Mode 2: discovering nearby devices using the Bluetooth wireless communication technology
In some embodiments, bluetooth devices (e.g., cell phones with bluetooth modules, tablet computers, printers, etc.) in the vicinity of the electronic device 100 may make bluetooth broadcasts. The electronic device 100 may perform a bluetooth scan to scan broadcast frames broadcast by nearby bluetooth devices to discover nearby bluetooth devices.
Mode 3: discovering devices in the same Wi-Fi LAN
In some embodiments, electronic device 100 may determine the IP address range of the LAN based on its own IP address and subnet mask, and then may discover devices in the LAN by unicast polling. Without being limited thereto, the electronic device 100 may also discover devices in the LAN by broadcasting or multicasting messages in the LAN.
The electronic device is not limited to the above-mentioned modes of discovering nearby devices; in practical applications, the electronic device may also discover nearby devices in other manners based on communication technologies such as Bluetooth, Wi-Fi Direct, Wi-Fi LAN, or millimeter wave, which is not limited in this application.
During the nearby-device discovery process, the electronic device 100 obtains the identification information of a nearby device from the message received from that device (such as a probe response, a Beacon frame, or a Bluetooth broadcast frame) and then obtains the device information through the identification information; alternatively, the electronic device 100 may obtain the device information of the nearby device directly from the received message.
After the electronic device 100 discovers the first electronic device 101, the second electronic device 102, and the third electronic device 103, the electronic device 100 determines the relative position of each nearby electronic device with respect to the electronic device 100. The relative position includes at least information such as an angle of arrival (AoA). The specific process by which the electronic device 100 determines the angle of arrival (AoA) of a nearby device may be as shown in fig. 17. The electronic device 100 first sends a first request signal 1011 to the first electronic device 101, the first request signal 1011 being used to request the first electronic device 101 to send a first report signal 1012 for angle of arrival (AoA) measurement. After receiving the first request signal 1011, the first electronic device 101 transmits a first report signal 1012 to the electronic device 100. The electronic device 100 receives the first report signal 1012 using an antenna array and estimates the angle of arrival (AoA) of the first electronic device 101 from the phase differences of the first report signal 1012 as received by the different antennas. In some embodiments, the first report signal 1012 may specifically include a fixed-frequency extension signal, which may be a data string consisting of a series of switch slots and sampling slots appended to an ordinary data packet. When receiving the first report signal 1012, the receiving end switches between different receiving antennas in a certain sequence and determines the angle of arrival (AoA) from the phase differences of the fixed-frequency extension signal received by the different antennas.
For example, if the millimeter wave antenna array of the electronic device 100 includes 8 millimeter wave antennas A1-A8, then after the first request signal 1011 is transmitted, only antenna A1 is used to receive the first report signal 1012 at time t1, only antenna A2 is used to receive it at time t2, and so on. After the 8 antennas have each received the first report signal 1012, the electronic device 100 calculates the angle of arrival (AoA) of the nearby device, that is, the relative angle between the nearby device and the electronic device 100, from the phase differences of the first report signals 1012 received by the 8 antennas.
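The phase-difference computation can be illustrated for a uniform linear array: a plane wave arriving at angle θ from broadside produces an inter-antenna phase difference of 2πd·sin(θ)/λ, which can be inverted to recover θ. The sketch below simulates the 8-antenna example; the wavelength, spacing, and source angle are assumed values, not parameters given in this application.

```python
import numpy as np

def estimate_aoa(phases, d, lam):
    """Estimate the angle of arrival for a uniform linear array.

    phases : per-antenna phase of the received report signal (radians)
    d      : antenna spacing (metres), at most lam/2 to avoid ambiguity
    lam    : carrier wavelength (metres)
    """
    # Average the adjacent-antenna phase differences, wrapped to (-pi, pi].
    diffs = np.angle(np.exp(1j * np.diff(phases)))
    dphi = diffs.mean()
    # Invert dphi = 2*pi*d*sin(theta)/lam for theta.
    return np.arcsin(dphi * lam / (2 * np.pi * d))

# Simulate the 8-antenna example with a device at 30 degrees from broadside.
lam = 0.005                 # assumed 60 GHz millimetre wave -> 5 mm wavelength
d = lam / 2                 # half-wavelength spacing
true_theta = np.deg2rad(30.0)
antenna_idx = np.arange(8)  # antennas A1..A8
phases = 2 * np.pi * d * np.sin(true_theta) * antenna_idx / lam

theta = estimate_aoa(phases, d, lam)  # recovers ~30 degrees
```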
In other embodiments, the electronic device 100 and the first electronic device 101 may employ an angle of departure (AoD) measurement method to determine the relative angle.
Next, using the same procedure as for measuring the angle of arrival (AoA) of the first electronic device 101, the electronic device 100 uses the antenna array to determine, in different time intervals, the angle of arrival (AoA) of the second electronic device 102 and of the third electronic device 103. In other embodiments, the electronic device 100 may determine the angles of arrival (AoA) of multiple or all of the nearby devices simultaneously.
In other embodiments, the electronic device 100 may configure the antenna array for beamforming so as to receive broadcast messages sent by nearby devices in only one direction, or to send probe signals in that direction; nearby devices that receive the probe signals send probe responses to the electronic device 100, thereby enabling discovery of nearby devices in that direction. The electronic device 100 acquires the device information of these nearby devices and associates them with the direction. Next, the electronic device 100 reconfigures the antenna array for beamforming to find nearby devices in another direction. That is, by polling the various directions in turn, the electronic device 100 completes the discovery of nearby devices and obtains their relative angles.
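Direction polling with a beamformed array can be sketched as steering the array over a grid of angles and picking the direction with the strongest combined response. The wavelength, spacing, grid step, and source angle below are assumed values for illustration.

```python
import numpy as np

def steering_vector(theta, n_antennas, d, lam):
    """Phase weights that point a uniform linear array towards angle theta."""
    n = np.arange(n_antennas)
    return np.exp(-1j * 2 * np.pi * d * n * np.sin(theta) / lam)

def beam_power(snapshot, theta, d, lam):
    """Combined received power when the array is steered towards theta."""
    w = steering_vector(theta, len(snapshot), d, lam)
    return np.abs(np.sum(w * snapshot)) ** 2

# A source at 20 degrees produces this per-antenna snapshot (noise-free).
lam, d, n = 0.005, 0.0025, 8
source_theta = np.deg2rad(20.0)
snapshot = np.exp(1j * 2 * np.pi * d * np.arange(n)
                  * np.sin(source_theta) / lam)

# Poll a grid of directions; the power peaks at the source direction.
grid = np.deg2rad(np.arange(-90, 91, 5))
powers = [beam_power(snapshot, th, d, lam) for th in grid]
best = np.rad2deg(grid[int(np.argmax(powers))])  # ~20 degrees
```

Polling each direction in turn like this both discovers the devices in that direction and yields their relative angle as a by-product.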
In other embodiments, the relative position may also include distance information. During the nearby-device discovery process, the electronic device 100 may determine its distance from a discovered nearby device based on the received signal strength indicator (RSSI) of messages from that device, for example, the RSSI of broadcast messages or response messages. Other existing ranging methods, such as ToF, ToA, or TDoA ranging, may also be used to determine the distance between the electronic device 100 and the discovered nearby devices. This embodiment does not impose any limitation on this.
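RSSI-based ranging is commonly modeled with the log-distance path-loss model. A sketch follows; the 1 m reference power and the path-loss exponent are calibration assumptions, not values given in this application.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate distance from RSSI via the log-distance path-loss model.

    rssi_dbm      : measured received signal strength (dBm)
    tx_power_dbm  : expected RSSI at 1 m (a calibration constant)
    path_loss_exp : environment-dependent exponent (~2 in free space,
                    roughly 2.7-4 indoors)

    Solves RSSI(d) = tx_power_dbm - 10 * n * log10(d) for d (metres).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

With the assumed calibration, an RSSI of -40 dBm maps to about 1 m and -60 dBm to about 10 m; in practice the estimate is coarse because RSSI fluctuates with multipath and obstacles.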
After detecting the first operation of the user, the electronic device 100 determines the relative positions of the first electronic device 101, the second electronic device 102, and the third electronic device 103, and updates and displays the multi-task interface with reference to fig. 4B or fig. 4C. The first display area of the multi-task interface includes the device information of the nearby devices, namely the first electronic device 101, the second electronic device 102, and the third electronic device 103, and the relative positions of these devices and the electronic device 100. Further, in the multi-task interface shown in fig. 4C, the closer a nearby device is to the electronic device, the closer its corresponding icon is to the center point 1111. The electronic device 100 listens for the user's selection, in the multi-task interface shown in fig. 4B or fig. 4C, of the application to be projected and the electronic device that is to receive the projection.
In other embodiments, during the nearby-device discovery process the electronic device 100 may determine the signal strength (RSSI) of signals received from nearby devices, for example, of broadcast messages or response messages. After determining the relative angle between a nearby device and the electronic device 100 and detecting the user's operation to share data, the electronic device 100 updates the multi-task interface to include the device information, relative angle, and signal strength of the nearby device. Further, in the multi-task interface shown in fig. 4C, the stronger the signal strength of a nearby device, the closer its corresponding icon is to the center point 1111.
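The mapping from relative angle and signal strength to icon placement around the center point 1111 might be sketched as follows. The screen coordinates, RSSI bounds, and maximum radius are illustrative assumptions, not values from this application.

```python
import math

def icon_position(angle_deg, rssi_dbm, center=(540, 400), max_radius=300,
                  rssi_min=-90.0, rssi_max=-30.0):
    """Map a nearby device's relative angle and RSSI to an icon position.

    The icon is placed along the device's relative angle; a stronger RSSI
    (i.e. a closer device) places the icon nearer to the centre point.
    """
    # Normalise RSSI to [0, 1]: 1 = strongest, 0 = weakest.
    strength = (rssi_dbm - rssi_min) / (rssi_max - rssi_min)
    strength = min(max(strength, 0.0), 1.0)
    radius = max_radius * (1.0 - strength)   # stronger -> closer to centre
    theta = math.radians(angle_deg)
    x = center[0] + radius * math.sin(theta)  # 0 degrees = straight ahead
    y = center[1] - radius * math.cos(theta)
    return (round(x), round(y))
```

A device with the strongest assumed RSSI lands exactly on the center point, while a weak device at 90 degrees is pushed to the full radius on the right.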
As shown in fig. 16, the user may select, through the second operation, the first electronic device 101 as the receiving device for projecting the first application. Upon receiving the second operation of the user, the electronic device 100 establishes a communication connection with the first electronic device 101 through a wireless technology such as Bluetooth or Wi-Fi and starts transmitting data. The specific process of establishing the communication connection may be as follows: the electronic device 100 sends a connection-establishment request message to the first electronic device 101; after receiving the request message, the first electronic device 101 replies with a connection response message to the electronic device 100, thereby establishing the communication connection between the two devices. The electronic device 100 then transmits data of the first application selected by the user, such as images, video, documents, an application installer, or application information, to the first electronic device 101.
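The request/response connection establishment followed by data transfer can be sketched with TCP sockets on the loopback interface. The message strings and roles here are illustrative stand-ins, not the actual protocol exchanged by the devices.

```python
import socket
import threading

def receiving_device(server_sock, received):
    """First electronic device 101: accept a connection, answer the
    connection-establishment request, then collect the projected data."""
    conn, _ = server_sock.accept()
    with conn:
        request = conn.recv(1024)
        if request == b"CONNECT_REQUEST":
            conn.sendall(b"CONNECT_RESPONSE")      # connection established
            while chunk := conn.recv(4096):        # then receive app data
                received.extend(chunk)

def project(app_data: bytes, address) -> bool:
    """Electronic device 100: request a connection and, once the response
    arrives, transmit the selected application's data."""
    with socket.create_connection(address) as sock:
        sock.sendall(b"CONNECT_REQUEST")
        if sock.recv(1024) != b"CONNECT_RESPONSE":
            return False
        sock.sendall(app_data)
    return True

# Run both roles on the loopback interface for demonstration.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
received = bytearray()
t = threading.Thread(target=receiving_device, args=(server, received))
t.start()
ok = project(b"video-frame-bytes", server.getsockname())
t.join()
server.close()
```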
In other embodiments, if the user indicates direction information when opening the multi-task interface, then among the nearby devices that the electronic device can discover in all directions, the device information corresponding to the nearby devices in the indicated direction may be given a different display effect, or only the device information corresponding to the nearby devices in the indicated direction may be displayed in the multi-task interface shown in fig. 4B or fig. 4C.
In the above embodiments, a mobile phone is used as the electronic device 100, and the above multi-task interface is provided on the mobile phone for illustration. It can be understood that the electronic device may alternatively be any device in a device group, such as a tablet computer or a notebook computer, which is not limited in the embodiments of the present application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk).
Those of ordinary skill in the art will appreciate that all or part of the above-described method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may perform the steps of the above-described method embodiments. The aforementioned storage medium includes various media that can store program code, such as ROM, random access memory (RAM), magnetic disks, or optical disks.

Claims (14)

1. A screen projection method, comprising:
in response to a first operation of a user, acquiring, by a mobile phone, positions of a first electronic device and a second electronic device;
the mobile phone determines the positions of the first electronic equipment icon and the second electronic equipment icon on a multi-task interface according to the positions of the first electronic equipment and the second electronic equipment; the first electronic device icon is used for identifying the first electronic device, the second electronic device icon is used for identifying the second electronic device, and the multi-task interface comprises a first display area which is used for displaying the first electronic device icon and the second electronic device icon;
the mobile phone acquires a thumbnail of a first application program and a thumbnail of a second application program; the first application program and the second application program are application programs running on the mobile phone; the multi-task interface further comprises a second display area, wherein the second display area is used for displaying the thumbnail of the first application program and the thumbnail of the second application program;
the mobile phone displays the multi-task interface;
in response to a second operation of the user, projecting, by the mobile phone, multimedia data associated with the first application program onto the first electronic device.
2. The method of claim 1, wherein the obtaining, by the mobile phone, the location of the first electronic device and the second electronic device is specifically:
the mobile phone obtains the positions of the first electronic equipment and the second electronic equipment through an ultra-wideband positioning technology or an antenna array technology.
3. The method of claim 1, wherein the first display area includes a coordinate system, and the mobile phone determines positions of the first electronic device icon and the second electronic device icon on the multi-task interface according to positions of the first electronic device and the second electronic device, specifically:
and the mobile phone determines the positions of the first electronic equipment icon and the second electronic equipment icon in a coordinate system of the multi-task interface according to the positions of the first electronic equipment and the second electronic equipment.
4. The method of claim 1, wherein the second operation is specifically dragging a thumbnail of the first application to the first electronic device icon.
5. The method of any of claims 1-4, wherein after the mobile phone projects the multimedia data associated with the first application onto the first electronic device, the method further comprises:
in response to a third operation of the user, stopping, by the mobile phone, projecting the multimedia data associated with the first application onto the first electronic device, and projecting, by the mobile phone, multimedia data associated with the second application onto the first electronic device.
6. The method according to any one of claims 1-4, further comprising:
in response to a fourth operation of the user, projecting, by the mobile phone, multimedia data associated with the second application onto the second electronic device.
7. The method according to any one of claims 1-4, further comprising:
in response to a fifth operation of the user, stopping, by the mobile phone, projecting the multimedia data associated with the first application onto the first electronic device.
8. The method of claim 7, wherein the multi-task interface further comprises, in response to the second operation of the user, a control for stopping screen projection;
the fifth operation is specifically dragging the thumbnail of the first application program onto the control for stopping screen projection.
9. The method according to claim 1, wherein the method further comprises:
in response to a sixth operation of the user, projecting, by the mobile phone, the multimedia data associated with the first application onto the first electronic device and the second electronic device.
10. The method according to claim 9, wherein the sixth operation is specifically sliding the thumbnail of the first application in a first direction.
11. The method of any one of claims 1-4 or 8-10, further comprising:
in response to a seventh operation of the user, stopping, by the mobile phone, projecting the multimedia data associated with the first application onto the first electronic device and the second electronic device.
12. The method according to claim 11, wherein the seventh operation is specifically sliding the thumbnail of the first application in a second direction.
13. A mobile phone, comprising:
a touch screen, wherein the touch screen comprises a touch sensitive surface and a display screen;
one or more processors;
one or more memories;
and one or more computer programs, wherein the one or more computer programs are stored in the one or more memories, the one or more computer programs comprising instructions that, when executed by the mobile phone, cause the mobile phone to implement the screen projection method according to any one of claims 1-12.
14. A computer readable storage medium having instructions stored therein, which, when run on a mobile phone, cause the mobile phone to perform the screen projection method according to any one of claims 1-12.
CN202310695573.XA 2020-05-25 2020-05-25 Screen throwing method and mobile phone Pending CN116708645A (en)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310695573.XA CN116708645A (en) 2020-05-25 2020-05-25 Screen throwing method and mobile phone
CN202010448218.9A CN113794796B (en) 2020-05-25 2020-05-25 Screen projection method and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010448218.9A Division CN113794796B (en) 2020-05-25 2020-05-25 Screen projection method and electronic equipment

Publications (1)

Publication Number Publication Date
CN116708645A true CN116708645A (en) 2023-09-05


