CN114115770A - Display control method and related device

Display control method and related device

Info

Publication number
CN114115770A
Authority
CN
China
Prior art keywords
electronic device
user interface
interface
application
screen
Prior art date
Legal status
Pending
Application number
CN202010901550.6A
Other languages
Chinese (zh)
Inventor
叶灵洁
杨文彬
阚彬
Current Assignee
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Device Co Ltd filed Critical Huawei Device Co Ltd
Priority application: CN202010901550.6A
PCT application: PCT/CN2021/112365 (published as WO2022042326A1)
Publication: CN114115770A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides a display control method, which includes the following steps: a master electronic device detects a first operation; if the first operation is an operation on the first user interface, then in response to the first operation the master electronic device displays an interface that invokes and runs an application of the master electronic device; and if the first operation is an operation on the second user interface, then in response to the first operation the master electronic device displays an interface that invokes and runs an application of the slave electronic device. Implementing this embodiment prevents the size of the application interface from changing with the size of the screen-projection interface when the master electronic device is operated, which improves convenience of use.

Description

Display control method and related device
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a display control method and a related device.
Background
With the development of science and technology, electronic devices have grown in both variety and functionality, and a user who works with several electronic devices has to switch back and forth between them, which is inconvenient. To reduce this switching cost, device-cooperation technology has gradually developed.
Device cooperation means that after two electronic devices are connected (for example, over Bluetooth), the interface content of the slave electronic device is projected onto and displayed by the master electronic device. The slave electronic device can then be operated from the screen-projection interface of the master electronic device; for example, a user can invoke an application of the slave electronic device through that interface. However, the application's running interface always adapts to the size of the screen-projection interface: if the user shrinks the screen-projection interface, the application's running interface shrinks with it, which is highly inconvenient.
Disclosure of Invention
The present application provides a display control method and a related device that can identify the input source of a first operation and display the application interface corresponding to that input source. This prevents the size of the application interface from changing with the size of the screen-projection interface when the user operates on the master electronic device, makes the device more convenient to use, and takes full advantage of the master electronic device's configuration.
In a first aspect, the present application provides a display control method applied to a master electronic device. The master electronic device displays a first user interface, which includes the interface content of a second user interface displayed by a slave electronic device and projected onto the master electronic device. The master electronic device detects a first operation. If the first operation is an operation on the first user interface, then in response to the first operation the master electronic device displays a third user interface, which includes an interface that invokes and runs an application of the master electronic device. If the first operation is an operation on the second user interface, then in response to the first operation the master electronic device displays a fourth user interface, which includes an interface that invokes and runs an application of the slave electronic device.
By implementing the method provided in the first aspect, the master electronic device can identify the input source of the first operation and display the application interface corresponding to that input source. This prevents the size of the application interface from changing with the size of the screen-projection interface when the user operates on the master electronic device, makes the device more convenient to use, and takes full advantage of the master electronic device's configuration.
The first operation includes a touch operation for starting or invoking an application. For example, the first operation may be the user tapping the search box of a browser to invoke an input method for text entry, or tapping the icon of a camera application on the home screen to invoke the camera.
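As a minimal sketch of this dispatch rule in plain Java (the enum, class, and method names are illustrative assumptions, not the patent's implementation):

```java
// Illustrative only: the patent does not prescribe this API.
enum OperationSource { MASTER_USER_INTERFACE, SLAVE_USER_INTERFACE }

class DisplayControl {
    /** Routes the detected first operation according to its input source. */
    void onFirstOperation(OperationSource source, String app) {
        if (source == OperationSource.MASTER_USER_INTERFACE) {
            showThirdUserInterface(app);   // invoke and run the master's own application
        } else {
            showFourthUserInterface(app);  // invoke and run the slave's application
        }
    }

    private void showThirdUserInterface(String app)  { /* launch locally on the master */ }
    private void showFourthUserInterface(String app) { /* ask the slave to run the app */ }
}
```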
With reference to the first aspect, in some embodiments, the first user interface includes a screen-projection area containing the interface content of the second user interface displayed by the slave electronic device and projected onto the master electronic device, the first operation is an operation on the screen-projection area of the first user interface, and the method further includes: in response to a detected second operation acting on the third user interface, the master electronic device sends the execution result of the application to the slave electronic device. In this way, when the screen-projection area on the master electronic device is operated, the running result of the invoked master application is sent directly to the slave electronic device; the user does not need to send the result manually, which reduces user operations.
The second operation includes an operation on the invoked application. For example, the second operation may be the user tapping text to input on the input method interface, or tapping the capture control on the camera application interface.
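A sketch of this result forwarding, assuming a generic cooperation channel between the two devices (the channel interface is a hypothetical stand-in; the patent does not specify a transport):

```java
import java.io.IOException;

// Hypothetical transport between master and slave (Bluetooth, Wi-Fi, etc.).
interface CooperationChannel {
    void send(byte[] payload) throws IOException;
}

class ResultForwarder {
    private final CooperationChannel channel;

    ResultForwarder(CooperationChannel channel) {
        this.channel = channel;
    }

    /** Sends the invoked application's execution result to the slave automatically. */
    void forwardResult(byte[] executionResult) throws IOException {
        channel.send(executionResult); // no manual sending by the user
    }
}
```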
With reference to the first aspect, in some embodiments, the method further includes: the master electronic device detects a third operation for turning on a function of distinguishing distributed input devices and, in response to the third operation, turns on that function. With the function on, the master electronic device invokes and runs an application of the master electronic device if it detects that the first operation is on the master electronic device, or invokes and runs an application of the slave electronic device if the first operation is on the slave electronic device. In this way, the user can turn the function of distinguishing distributed input devices on or off as needed.
In some embodiments, the method further includes: the first user interface includes a window area, and the third operation includes an operation, detected by the master electronic device, acting on a first interactive element in the window area; the first interactive element monitors for the operation that turns on the function of distinguishing distributed input devices. When the function is not turned on, the first interactive element is displayed in the window area.
In some embodiments, the method further includes: after the function of distinguishing distributed input devices is turned on, the master electronic device refreshes the window area, and a second interactive element is displayed in the refreshed window area; the master electronic device detects a fourth operation on the second interactive element and, in response to the fourth operation, turns off the function. This makes it easy for the user to turn the function of distinguishing distributed input devices on or off.
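A sketch of the toggle behavior described in these embodiments; the field and method names are assumptions:

```java
class DistributedInputSettings {
    private boolean distinguishEnabled = false;

    /** Third operation: acts on the first interactive element (shown while off). */
    void onEnableTapped() {
        distinguishEnabled = true;
        refreshWindowArea(); // the refreshed window area shows the second interactive element
    }

    /** Fourth operation: acts on the second interactive element (shown while on). */
    void onDisableTapped() {
        distinguishEnabled = false;
        refreshWindowArea(); // back to showing the first interactive element
    }

    boolean isDistinguishEnabled() {
        return distinguishEnabled;
    }

    private void refreshWindowArea() { /* redraw the window area of the first UI */ }
}
```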
With reference to the first aspect, in some embodiments, the first user interface includes a screen-projection area containing the interface content of the second user interface displayed by the slave electronic device and projected onto the master electronic device, and the method further includes: after the function of distinguishing distributed input devices is turned off, the master electronic device detects a fifth operation; if the fifth operation is an operation on the screen-projection area on the master electronic device, then in response to the fifth operation the master electronic device displays a fifth user interface, which includes an interface that invokes and runs an application of the slave electronic device. That is, if the user does not want to use the function of distinguishing distributed input devices, the existing scheme, i.e., the generic screen-projection function, remains available to select. The function of distinguishing distributed input devices in the present application is thus compatible with the generic screen-projection function.
With reference to the first aspect, in some embodiments, if the first operation is an operation on a text box on the first user interface, the third user interface includes an input method interface that invokes and runs an input method application of the master electronic device. In this way, when entering text in the screen-projection area of the master electronic device, the user can use the master electronic device's own input method, which makes text entry more convenient.
In some embodiments, the second operation is an input operation acting on a text box in the screen-projection area of the third user interface, and the method further includes: the master electronic device refreshes the third user interface, and the screen-projection area of the refreshed third user interface includes the input result of the input operation.
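A sketch of this text-input flow under the same assumptions (the callback used to reach the slave is hypothetical):

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

class TextInputScenario {
    private final Consumer<byte[]> sendToSlave; // hypothetical channel callback

    TextInputScenario(Consumer<byte[]> sendToSlave) {
        this.sendToSlave = sendToSlave;
    }

    /** First operation: a tap on a text box inside the projection area. */
    void onTextBoxTapped() {
        showMasterInputMethod(); // the master's own IME, not scaled with the projection
    }

    /** Second operation: the user commits text through the master's input method. */
    void onTextCommitted(String text) {
        sendToSlave.accept(text.getBytes(StandardCharsets.UTF_8));
        // The slave applies the text to its second UI; the refreshed projection
        // area then shows the input result on the master as well.
    }

    private void showMasterInputMethod() { /* open the local input method interface */ }
}
```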
With reference to the first aspect, in some embodiments, if the first operation is an operation acting on a camera application of the first user interface, the third user interface includes a first preview screen of the camera application invoked and run on the master electronic device, the first preview screen includes a first image, and the third user interface further includes at least one enhanced-function option provided by the master electronic device. In response to a detected sixth operation acting on a target enhanced-function option, the master electronic device displays a second preview screen; the second preview screen includes a second image, which is the first image after the enhancement processing corresponding to the target enhanced-function option, and the at least one enhanced-function option includes the target enhanced-function option. The sixth operation includes a tap on the target enhanced-function option.
In some embodiments, the first user interface includes a screen-projection area containing the interface content of the second user interface displayed by the slave electronic device and projected onto the master electronic device, the first operation is an operation on the screen-projection area of the first user interface, the second operation is an operation acting on a capture control in the third user interface, and the method further includes: the master electronic device refreshes the third user interface, and the refreshed third user interface includes the shooting result of the capture operation, where the shooting result includes the picture taken with the target enhanced function applied. In this way, when operating the screen-projection area of the master electronic device, the user can invoke the master electronic device's camera application, shoot with the master electronic device's enhanced function, and obtain the shooting result on the slave electronic device without using the slave electronic device's camera application.
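A sketch of the camera flow, treating an enhancement option as an image transform; the transform, callback, and names are assumptions, not the patent's design:

```java
import java.util.function.Consumer;
import java.util.function.UnaryOperator;

class CameraScenario {
    private UnaryOperator<byte[]> enhancement = UnaryOperator.identity();
    private final Consumer<byte[]> sendToSlave; // hypothetical channel callback

    CameraScenario(Consumer<byte[]> sendToSlave) {
        this.sendToSlave = sendToSlave;
    }

    /** Sixth operation: the user selects the target enhanced-function option. */
    void onEnhancementSelected(UnaryOperator<byte[]> target) {
        enhancement = target; // the second preview screen now shows the processed image
    }

    /** Second operation: the user taps the capture control. */
    void onCaptureTapped(byte[] firstImage) {
        byte[] secondImage = enhancement.apply(firstImage);
        sendToSlave.accept(secondImage); // the slave receives the shooting result
    }
}
```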
With reference to the first aspect, in some embodiments, if the first operation is an operation acting on a playback application icon of the first user interface, the third user interface includes a media object selection interface that invokes and runs a playback application of the master electronic device, and the media object selection interface includes at least one media object identifier, including identifiers of media historically played on the master electronic device. In response to a detected seventh operation acting on a target media object identifier, the master electronic device displays a playing screen of the media corresponding to the target media object identifier; the playing screen includes the playback information of that media as historically played on the master electronic device, and the at least one media object identifier includes the target media object identifier.
In some embodiments, the playback information includes one or more of the following: playing progress information, user comment information, and media sharing information.
In some embodiments, the first user interface includes a screen-projection area containing the interface content of the second user interface displayed by the slave electronic device and projected onto the master electronic device, the first operation is an operation on the screen-projection area of the first user interface, and if the second operation is an operation on a close control of the media playing screen, the method further includes: in response to the detected second operation acting on the media playing screen, the master electronic device closes the media playing screen and sends the playback information of the media playing screen to the slave electronic device. In this way, the user can invoke the master electronic device's playback application when operating the screen-projection area, making better use of the master electronic device's larger screen; and because the playback information is sent to the slave electronic device, the user can conveniently continue watching on the slave electronic device when the usage scenario changes.
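A sketch of the playback hand-back; the PlaybackInfo fields mirror the examples in the text, and everything else is an assumption:

```java
import java.util.function.Consumer;

class PlaybackScenario {
    /** Playback state handed back to the slave: progress, comments, sharing info. */
    record PlaybackInfo(String mediaId, long progressMillis,
                        String userComments, String sharingInfo) { }

    private final Consumer<PlaybackInfo> sendToSlave; // hypothetical channel callback

    PlaybackScenario(Consumer<PlaybackInfo> sendToSlave) {
        this.sendToSlave = sendToSlave;
    }

    /** Second operation: the user taps the close control of the playing screen. */
    void onCloseTapped(PlaybackInfo current) {
        sendToSlave.accept(current); // the slave can resume from this progress
        closePlayingScreen();
    }

    private void closePlayingScreen() { /* dismiss the master's player UI */ }
}
```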
With reference to the first aspect, in some embodiments, the master electronic device includes a touch screen and a graphical user interface, and the slave electronic device includes a touch screen and a graphical user interface. In this way, the user can operate the electronic device directly by touching its screen and obtain the response results through the graphical user interface.
In a second aspect, the present application provides an electronic device comprising: a display screen, a touch sensor, a memory, one or more processors, a plurality of applications, and one or more programs; wherein the one or more programs are stored in the memory; the one or more processors, when executing the one or more programs, cause the electronic device to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
In a third aspect, the present application provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program when executed by the processor causes the computer device to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
In a fourth aspect, the present application provides an electronic device including means for performing the method as described in the first aspect and any possible implementation manner of the first aspect.
In a fifth aspect, the present application provides a computer program product containing instructions that, when run on an electronic device, cause the electronic device to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
In a sixth aspect, the present application provides a computer-readable storage medium, which includes instructions that, when executed on an electronic device, cause the electronic device to perform the method described in the first aspect and any possible implementation manner of the first aspect.
In a seventh aspect, the present application provides a communication system, including a master electronic device, further including a slave electronic device; the master electronic device performs the method as described in the first aspect and any possible implementation manner of the first aspect.
In summary, with the above technical solutions, the master electronic device can identify the input source of the first operation and display the application interface corresponding to that input source, which prevents the size of the application interface from changing with the size of the screen-projection interface when the user operates on the master electronic device, makes the device more convenient to use, and takes full advantage of the master electronic device's configuration.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
Figs. 2A-2I are schematic diagrams illustrating the establishment of a cooperative relationship between two electronic devices according to an embodiment of the present application;
Figs. 3A-3B are schematic diagrams of a first way of enabling the function of distinguishing distributed input devices provided by an embodiment of the present application;
Figs. 4A-4I are schematic diagrams of human-computer interaction provided in a text input scenario according to an embodiment of the present application;
Figs. 5A-5F are schematic diagrams of further human-computer interactions provided in an image capture scenario according to an embodiment of the present application;
Figs. 6A-6I are schematic diagrams of human-computer interaction provided in a video playing scenario according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
Fig. 8 is a flowchart of a method for an electronic device to implement the function of distinguishing distributed input devices according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below with reference to the drawings. The terminology used in the description of the embodiments herein is for the purpose of describing particular embodiments herein only and is not intended to be limiting of the application.
First, the application scenarios related to the embodiments of the present application are described. The present application involves at least two electronic devices, each having a touch screen and a graphical user interface, including but not limited to a mobile phone, a tablet, and a smart screen. In the embodiments of the present application, a cooperative relationship is established between these electronic devices. Cooperation among electronic devices uses distributed technology to achieve cross-device, cross-system collaboration: after several electronic devices (including but not limited to mobile phones, computers, and tablets) are connected, functions such as resource sharing and cooperative operation are realized.
The following describes the process of establishing a cooperative relationship between two electronic devices, for example, a first electronic device and a second electronic device. Specifically, the first electronic device and the second electronic device are first connected; the connection methods include, but are not limited to, Bluetooth connection, code-scanning connection, and Near Field Communication (NFC) connection. The interface content of the first electronic device can then be projected onto the interface of the second electronic device for display. In this application, the first electronic device may be called the slave electronic device and the second electronic device the master electronic device. The projected interface content of the slave electronic device is displayed on the master electronic device's interface as the screen-projection interface, and the rest of the master electronic device's interface is called the non-screen-projection interface.
In the embodiments of the present application, when a user performs a touch operation on the screen of the master electronic device, an application program of the master electronic device is invoked to implement the corresponding function, regardless of whether the operation falls on the screen-projection interface or the non-screen-projection interface. When the user performs a touch operation on the screen of the slave electronic device, an application program of the slave electronic device is invoked to implement the corresponding function.
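Combining this rule with the result forwarding described earlier, a sketch of how the master might handle touches on its own screen: every touch runs a master application, and a hit test on the screen-projection interface only decides whether the outcome should later be forwarded to the slave (the bounds and names are assumptions):

```java
class MasterTouchHandler {
    // Hypothetical bounds of the screen-projection interface on the master's display.
    private final int left = 100, top = 80, right = 700, bottom = 1000;

    void onMasterTouch(int x, int y, String targetApp) {
        boolean inProjection = x >= left && x < right && y >= top && y < bottom;
        runMasterApplication(targetApp); // always the master's own application
        if (inProjection) {
            markResultForForwarding(targetApp); // the outcome will be sent to the slave
        }
    }

    private void runMasterApplication(String app)    { /* launch locally */ }
    private void markResultForForwarding(String app) { /* queue the result for the slave */ }
}
```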
An exemplary electronic device 100 provided in the following embodiments of the present application is first described below.
Fig. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
In this embodiment, the processor 110 may be configured to determine whether the function of distinguishing the distributed input devices is activated when the master electronic device 100 and the other slave electronic devices are working together.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone, and to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In the embodiments of the present application, the display screen 194 may be used to display a user interface, and the controls in the user interface may be used to monitor the user's touch operations originating from different source devices and at different positions on the electronic device. The display screen 194 may also be used to display the electronic device's response to a touch operation.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise-reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
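The short-message example amounts to a threshold dispatch on touch intensity; a sketch, with a placeholder threshold value:

```java
class PressureDispatcher {
    private static final float FIRST_PRESSURE_THRESHOLD = 0.6f; // placeholder value

    /** Dispatches by intensity for a touch on the short message application icon. */
    void onMessageIconTouch(float intensity) {
        if (intensity < FIRST_PRESSURE_THRESHOLD) {
            viewShortMessages();     // lighter press: view the short message
        } else {
            createNewShortMessage(); // firmer press: create a new short message
        }
    }

    private void viewShortMessages()     { /* open the message list */ }
    private void createNewShortMessage() { /* open the message composer */ }
}
```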
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover through the magnetic sensor 180D. Features such as automatic unlocking on flip-open can then be set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of the electronic device 100's acceleration in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, for applications such as landscape/portrait switching and pedometers.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the device close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically lock and unlock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to unlock with the fingerprint, access the application lock, take a photo with a fingerprint press, answer an incoming call with a fingerprint press, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature-handling strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to keep the low temperature from shutting the electronic device 100 down abnormally. In still other embodiments, when the temperature is below a further, lower threshold, the electronic device 100 boosts the output voltage of the battery 142, likewise to avoid an abnormal low-temperature shutdown.
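A sketch of this layered temperature strategy; the threshold values are placeholders, not taken from the patent:

```java
class ThermalPolicy {
    private static final float HIGH_THRESHOLD = 45.0f;   // placeholder
    private static final float LOW_THRESHOLD = 0.0f;     // placeholder
    private static final float LOWER_THRESHOLD = -10.0f; // placeholder, below LOW_THRESHOLD

    void onTemperatureReported(float celsius) {
        if (celsius > HIGH_THRESHOLD) {
            throttleNearbyProcessor();   // reduce performance to cut power and heat
        } else if (celsius < LOWER_THRESHOLD) {
            boostBatteryOutputVoltage(); // guard against abnormal low-temperature shutdown
        } else if (celsius < LOW_THRESHOLD) {
            heatBattery();               // warm the battery 142 before it gets critical
        }
    }

    private void throttleNearbyProcessor()   { /* lower clocks near the sensor */ }
    private void heatBattery()               { /* enable battery heating */ }
    private void boostBatteryOutputVoltage() { /* raise the battery output voltage */ }
}
```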
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, or a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, etc. Multiple cards may be inserted into the same SIM card interface 195 at the same time, and the types of these cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
Some exemplary user interfaces (UIs) provided by the main electronic device 100 are described below. The term "user interface" in the description, claims, and drawings of the present application refers to a media interface for interaction and information exchange between an application or operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (GUI), that is, a user interface related to computer operations and displayed in a graphical manner. It may include interface elements such as icons, windows, and controls displayed in the display screen of the electronic device, where the controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and Widgets.
Fig. 2A illustrates an exemplary user interface 21 on the electronic device 100 for presenting applications installed on the electronic device 100.
The user interface 21 may include: a status bar 201, a tray 202 with common application icons, and other application icons. Wherein:
the status bar 201 may include: one or more of a wireless fidelity (Wi-Fi) signal strength indicator 201A, a bluetooth capability indicator 201B, a battery status indicator 201C, and a time indicator 201D.
The tray 202 with the common application icons may show: album icon 204A, notes icon 204B, mail icon 204C, file management icon 204D.
Other application icons may be, for example: camera icon 203, microblog (Weibo) icon 204, phone icon 205, calendar icon 206, browser icon 207, WeChat (Wechat) icon 208, settings icon 209, reading icon 210, video icon 211, music icon 212, contacts icon 213, and information icon 214. The user interface 21 may also include a page indicator 215. The other application icons may be distributed across multiple pages, and the page indicator 215 may be used to indicate which page of applications the user is currently browsing. The user may slide the area of the other application icons from side to side to browse the application icons in other pages.
In some embodiments, the user interface 21 illustratively shown in FIG. 2A may be a Home screen (Home Screen).
It is understood that fig. 2A merely illustrates a user interface on the electronic device 100, and should not be construed as a limitation to the embodiments of the present application.
The following describes a process of establishing a cooperative relationship between two electronic devices.
As shown in fig. 2A, when the primary electronic device 100 detects a slide-down gesture on the status bar 201, the primary electronic device 100 may display a window 216 on the user interface 21 in response to the gesture. As shown in fig. 2B, a "multi-screen collaboration" switch control 216A may be displayed in the window 216, along with switch controls for other functions (e.g., Wi-Fi, Bluetooth, flashlight, etc.). When an operation on the switch control 216A in the window 216 (e.g., a touch operation on the switch control 216A) is detected, the main electronic device 100 may turn on the "multi-screen collaboration" function in response to the operation. As shown in fig. 2C, after the main electronic device 100 turns on the "multi-screen collaboration" function, the status bar of the user interface displays a "multi-screen collaboration" function indicator 201E, and the user interface displays a "multi-screen collaboration" window 217, which prompts the user to connect a slave electronic device (e.g., a mobile phone) to the main electronic device 100 to establish a cooperative relationship.
The slave electronic device 200 may connect with the master electronic device 100 in a plurality of connection manners to establish a cooperative relationship, including a Bluetooth connection, a code-scanning connection, an NFC connection, and the like. The present embodiment describes the specific connection procedure for the Bluetooth connection manner. As shown in fig. 2D, when the slave electronic device 200 detects a slide-down gesture on the status bar 218, the slave electronic device 200 may display a window 219 on the user interface 22 in response to the gesture. As shown in fig. 2E, a Bluetooth switch control 219A and switch controls for other functions (e.g., Wi-Fi, mobile data, etc.) may be displayed in the window 219. When an operation on the switch control 219A in the window 219 (such as a touch operation on the switch control 219A) is detected, the slave electronic device 200 may turn on Bluetooth in response to the operation. As shown in fig. 2F, when the slave electronic device 200 and the master electronic device 100 are within Bluetooth detection range of each other, the user interface of the slave electronic device 200 displays a connection confirmation window 220; when an operation on the control 220A (confirming "connect") in the window 220 is detected (such as a touch operation on the control 220A), the user interface of the master electronic device 100 may display a connection confirmation window in response to the operation.
As shown in fig. 2G, the graphical user interface 21 of the master electronic device 100 displays a connection confirmation window 221; when an operation on the control 221A (confirming "allow" the connection) in the window 221 is detected (such as a touch operation on the control 221A), the slave electronic device 200 and the master electronic device 100 establish a connection in response to the operation. As shown in fig. 2H, the graphical user interface 21 of the master electronic device 100 displays a window 222 indicating that a connection is being made with the slave electronic device 200, and the graphical user interface 22 of the slave electronic device 200 displays a window 223 indicating that a connection is being made with the master electronic device 100. When the slave electronic device 200 and the master electronic device 100 are successfully connected, as shown in fig. 2I, the graphical user interface 21 on the master electronic device 100 displays the screen projection interface 300 of the slave electronic device 200.
In the embodiments of the present application, a cooperative relationship is established between a master electronic device and a slave electronic device, and the master electronic device has the ability to distinguish the input device. Specifically, when a touch screen operation on the master electronic device is detected, an application of the master electronic device is invoked. Optionally, when the touch screen operation is performed on the screen projection interface of the master electronic device, the response result is displayed on the screen projection interface of the master electronic device; further optionally, the response result is also sent to the slave electronic device and displayed there. When the touch screen operation is performed on a non-screen-projection interface of the master electronic device, the response result is displayed on that non-screen-projection interface. When a touch screen operation on the slave electronic device is detected, an application of the slave electronic device is invoked, and the response result is displayed both on the slave electronic device and on the screen projection interface of the master electronic device. For convenience of description, in the following embodiments, the above-described function is referred to as the differentiated distributed input device function.
Optionally, in order to be compatible with the general screen projection function, the present application may switch between the differentiated distributed input device function and the existing general screen projection function through a switch control. Under the general screen projection function, when the user performs a touch screen operation on a non-screen-projection interface of the master electronic device, an application of the master electronic device is invoked and the master electronic device responds to the operation; when the user performs a touch screen operation on the screen projection interface of the master electronic device, an application of the slave electronic device is invoked, the slave electronic device responds to the operation, and the response result is displayed on the screen projection interface of the master electronic device.
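For illustration only, the difference between the two functions can be summarized as the following dispatch decision. This is a minimal Java sketch; the enum and method names are assumptions of this illustration, not interfaces of the embodiment.

```java
// Minimal sketch contrasting the differentiated distributed input device
// function with the general screen projection function. Names are illustrative.
enum InputSource { MASTER, SLAVE }
enum TouchArea { PROJECTION_INTERFACE, NON_PROJECTION_INTERFACE }

class DispatchPolicy {
    boolean differentiatedInputEnabled; // state of the "function switch"

    /** Decides which device's application responds to a touch operation. */
    InputSource resolveResponder(InputSource source, TouchArea area) {
        if (differentiatedInputEnabled) {
            // Differentiated function: the device the touch came from responds.
            return source;
        }
        // General screen projection: touches on the projection interface, and
        // all touches on the slave device, are handled by the slave's application.
        if (source == InputSource.SLAVE || area == TouchArea.PROJECTION_INTERFACE) {
            return InputSource.SLAVE;
        }
        return InputSource.MASTER;
    }
}
```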
The process of enabling the differentiated distributed input device function is described below through two alternative embodiments.
(1) First alternative embodiment: the differentiated distributed input device function is turned on or off through a "function switch" control.
Fig. 2A and 2B illustrate an operation of the electronic device 100 to turn on the differentiated distributed input device function through a "function switch" control.
The user may make a slide-down gesture on the status bar 201 to open the window 216, and may click the "function switch" control 216B in the window 216 to conveniently turn on the "function switch." When the "function switch" is already on, clicking the control 216B again turns it off. The "function switch" control 216B may be represented in the form of text or an icon.
In the embodiments of the present application, after the "function switch" is turned on through the operations shown in fig. 2A and fig. 2B, the electronic device 100 turns on the differentiated distributed input device function. That is, when the master electronic device 100 and a slave electronic device cooperate, if the touch operation comes from the master electronic device, an application on the master electronic device is invoked to respond to it; if the touch operation comes from the slave electronic device, an application on the slave electronic device is invoked to respond to it.
When the "function switch" is turned off, the electronic device 100 turns off the function of distinguishing the distributed input devices and calls the general screen projection function. That is, when the master electronic device 100 and the slave electronic device cooperate, a touch operation is performed on a non-screen-projection interface of the master electronic device, and an application on the master electronic device is called to respond to the touch operation; and performing touch operation on the screen projection interface of the slave electronic equipment or the master electronic equipment, and calling the application on the slave electronic equipment to respond to the touch operation.
The "function switch" may be turned on by default or turned off by default, and a user may set the "function switch" autonomously to turn on or turn off, which is not limited in the embodiments of the present application.
(2) Second alternative embodiment: the differentiated distributed input device function is turned on or off through a "switch device" control.
Fig. 3A and 3B illustrate an operation of the main electronic device 100 to turn the differentiated distributed input device function on or off using a "switch device" control.
In some embodiments, the master electronic device 100 may turn on the differentiated distributed input device function by default. When the master electronic device 100 and the slave electronic device 200 have established a collaborative relationship, as shown in fig. 3A, the notification bar of the user interface of the master electronic device 100 may display a window 301, which prompts the user that the differentiated distributed input device function can be turned off in order to switch back to the original device and invoke an application of the original device to respond to touch operations on the screen projection interface, that is, to turn on the general screen projection function. Prompt 301A, prompt 301B, a collapse-notification control 301C, a "disconnect" control 301D, and a "switch device" control 301E may be displayed in the window 301.
The prompt 301A may be used to indicate that the main electronic device 100 is connected to another electronic device and has established a collaborative relationship, and to display the connection duration.
The prompt 301B may be used to indicate the connection object of the main electronic device 100, prompting the user that the main electronic device 100 has established a collaborative relationship with an electronic device of the corresponding model.
Control 301C may be used to listen for operations (e.g., touch operations) to collapse window 301, and control 301C may be represented in the form of an icon or textual information.
Control 301D may be used to listen for an operation (e.g., a touch operation) to disconnect the devices. When an operation acting on the control 301D is detected, the master electronic device 100 is disconnected from the slave electronic device 200, and the two electronic devices can no longer cooperate to share data.
Control 301E may be used to listen for an operation (e.g., a touch operation) that turns off the differentiated distributed input device function. When an operation on control 301E is detected, for example clicking "switch device," the main electronic device 100 turns off the differentiated distributed input device function and uses the general screen projection function. At this time, as shown in fig. 3B, the notification bar of the user interface of the main electronic device 100 displays window 302.
Window 302 differs from window 301 in that control 301E is replaced by control 302A. The control 302A may be used to listen for an operation (e.g., a touch operation) that turns the differentiated distributed input device function back on. When an operation acting on control 302A is detected, such as clicking "use default device," the master electronic device 100 re-enables the differentiated distributed input device function. At this time, as shown in fig. 3A, the notification bar of the user interface of the main electronic device 100 displays the window 301 again.
In the following, some embodiments in which a user uses the differentiated distributed input device function while two electronic devices work cooperatively are described with reference to three different usage scenarios.
(I) Usage scenario I: text input scenario
The embodiments shown in fig. 4A-4I take a text input scenario as an example to illustrate the process of implementing the differentiated distributed input device function on an electronic device. In a text input scenario, the user interface of the electronic device may display a "text input interface" provided by an application.
In some embodiments, the "text input interface" is used to display a text box in which text can be input, which displays an input focus in response to an operation (touch operation or the like) by the user, waiting for a text input result by the user.
The scenarios exemplarily illustrated in fig. 4A-4I may be one implementation of the "text input scenario." This scenario includes two electronic devices, a master electronic device 100 and a slave electronic device 200; the interface content of the slave electronic device is projected onto the master electronic device and displayed in the screen projection interface 300 of the master electronic device.
As shown in fig. 4A, the user interface of the screen projection interface 300 on the main electronic device 100 is a "text input interface," which may be provided by an application (e.g., a browser) that has a text input scenario. This may be the user interface in which the user clicks a browser icon in the screen projection interface 300 and prepares to enter a search target in the browser search bar 401.
As shown in fig. 4A, when an operation (e.g., a touch operation) acting on the browser search bar 401 in the screen projection interface 300 on the main electronic device 100 is detected, the browser search bar 401 of the screen projection interface 300 displays an input focus 402 as shown in fig. 4B, the input method application of the main electronic device is invoked, an input method interface 403 is displayed, and a user input is awaited.
As shown in fig. 4C, in response to an input result submission operation by the user, the input result 404 is displayed in the browser search bar of the screen-projection interface 300 and also displayed in the browser search bar of the slave electronic device 200.
The touch operation is not limited to the blank position on the left of the browser search bar 401 of the screen projection interface 300; the touch operation may also be performed at other positions of the browser search bar 401, such as the middle or the blank position on the right.
As shown in fig. 4D, when an operation (e.g., a touch operation) acting on the browser search bar 405 of the slave electronic device 200 is detected, the browser search bar 405 of the slave electronic device 200 displays an input focus 406 as shown in fig. 4E, the input method application of the slave electronic device 200 is invoked, and an input method interface 407 is displayed, awaiting user input. This input method interface is also displayed on the screen projection interface 300. As shown in fig. 4F, in response to the input result submission operation by the user, the input result 408 is displayed in the browser search bar of the slave electronic device 200 and also in the browser search bar of the screen projection interface 300. The touch operation is not limited to the blank position on the left of the browser search bar 405 of the slave electronic device 200; the touch operation may also be performed at another position of the browser search bar 405, for example, the middle or the blank position on the right.
As shown in fig. 4G, when an operation (e.g., a touch operation) on the browser search bar 409 of a non-screen-projection interface on the main electronic device 100 is detected, the browser search bar of the main electronic device displays the input focus 410 as shown in fig. 4H, the input method application of the main electronic device is invoked, the input method interface 411 is displayed, and a user input is awaited. As shown in fig. 4I, in response to an input result submission operation by the user, an input result 412 is displayed in the browser search bar of the main electronic device.
Through the embodiments exemplarily shown in fig. 4A-4I, the master electronic device may respond to input operations using the differentiated distributed input device function in a text input scenario: when a touch operation is performed on the screen projection interface of the master electronic device, the input method of the master electronic device is invoked; when a touch operation is performed on the slave electronic device, the input method of the slave electronic device is invoked to respond. That is, the electronic device may distinguish the input source and invoke the input method corresponding to that source to respond in the text input scenario. Therefore, on the master electronic device, even if the screen projection interface is stretched, the input method interface responding to touch operations on the screen projection interface is not stretched along with the window; instead, the input method on the master electronic device is used and the size of its interface is unchanged, giving a good user experience. Illustratively, the slave electronic device 200 is a mobile phone and the master electronic device 100 is a tablet computer; when the user performs a touch operation on the screen projection interface on the tablet computer to input text, the input method on the tablet computer can be invoked, whose interface is larger than that of the mobile phone's input method and does not change in size with the screen projection interface, so text can be input conveniently.
The page layout of the "text input interface" may also take other forms, which are not limited to those shown in fig. 4A to 4I, and this is not limited by the embodiment of the present application. Not limited to browser applications, other applications (e.g., notes, qq, etc.) may also provide similar "text input interfaces," where the applications provide a "text input interface," a master electronic device in conjunction with other slave electronic devices may also provide differentiated distributed input device functionality in a manner similar to that described above with respect to fig. 4A-4I.
(II) Usage scenario II: image capture scenario
The following embodiments of fig. 5A to 5F take an image capture scenario as an example to describe the process of implementing the differentiated distributed input device function on an electronic device.
The scenes exemplarily illustrated in fig. 5A-5F may be one implementation of the "image capture scenario." This scenario includes two electronic devices, a master electronic device 100 and a slave electronic device 200, with the interface content of the slave electronic device projected onto and displayed on the master electronic device.
As shown in fig. 5A, the user interface of the screen-projection interface 300 on the main electronic device 100 may be a user interface where the user is ready to click on the camera icon in the figure.
As shown in fig. 5A, when an operation (e.g., a touch operation) on the camera icon 501 in the screen projection interface 300 on the main electronic device 100 is detected, the camera of the main electronic device 100 is invoked and the camera application is started.
As shown in fig. 5B, after the camera application is started, a camera application interface 502 is displayed. This interface may include a preview screen of the camera of the main electronic device and other controls, including a photographing control 502A, an album control 502B, and the like. When an operation (for example, a touch operation) on the photographing control 502A of the camera application interface is detected, the master electronic device invokes its camera to take a picture; the album control 502B of the camera application displays the taken picture, which is also sent to the slave electronic device 200. Illustratively, when the slave electronic device 200 is a mobile phone and the master electronic device 100 is a tablet computer, the user can place the mobile phone on a desk and take pictures through the tablet computer that is in a cooperative relationship with the mobile phone. In this implementation scenario, since it is the camera of the main electronic device 100 that is invoked to take the picture, the properties of the taken picture (such as sharpness) are determined by the shooting parameters of the camera of the main electronic device 100.
In some embodiments, the camera application of the master electronic device 100 also has other functions (e.g., enhanced functions, which may be a beautification function, a skin-smoothing function, a body-shaping function, etc.), while the camera of the slave electronic device 200 does not have these functions. When an operation on a camera icon in the screen projection interface 300 of the main electronic device 100 is detected, the camera application of the main electronic device 100 is started and a camera application interface is displayed. The camera application interface may include the preview image and other controls of the main electronic device's camera, and may also include enhanced function options. When an operation selecting an enhanced function option is detected, the camera application interface is refreshed to display the preview picture processed by the selected enhanced function. When an operation on the photographing control is detected, the camera is invoked to take a picture, and the enhanced function corresponding to the selected option is invoked, so that the photographing result has the corresponding enhancement applied.
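For illustration only, the enhanced-function option flow described above may be sketched as follows; CameraApp, Enhancement, and Frame are hypothetical names introduced for this sketch and are not part of the embodiment.

```java
// Sketch of the enhanced-function flow: selecting an option refreshes the
// preview with the processed picture, and capturing applies the same
// enhancement to the photographing result. All names are illustrative.
interface Enhancement { Frame apply(Frame in); } // e.g. beautification, skin smoothing

class Frame { /* image data */ }

class CameraApp {
    private Enhancement selected; // null when no enhanced function option is chosen

    void onEnhancementSelected(Enhancement option) {
        selected = option;
        refreshPreview(); // the camera application interface is refreshed
    }

    Frame processPreview(Frame raw) {
        return selected == null ? raw : selected.apply(raw);
    }

    Frame onShutterPressed(Frame captured) {
        // The photographing result realizes the selected enhanced function.
        return selected == null ? captured : selected.apply(captured);
    }

    private void refreshPreview() { /* redraw the preview via processPreview() */ }
}
```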
As shown in fig. 5C, when an operation (e.g., a touch operation) acting on the camera icon 503 on the slave electronic device 200 is detected, the camera of the slave electronic device is invoked and the camera application is started. As shown in fig. 5D, a camera application interface 504 is displayed, which includes the preview screen of the slave electronic device's camera and is also displayed by the screen projection interface 300 on the master electronic device 100. When an operation (for example, a touch operation) on the photographing control 504A of the camera application interface of the slave electronic device 200 is detected, the slave electronic device invokes its camera to take a picture; the album control 504B of the camera application displays the taken picture, and the album control of the camera application in the screen projection interface 300 also displays it. The properties of the taken picture, such as sharpness, are determined by the photographing parameters of the camera of the slave electronic device 200.
As shown in fig. 5E, when an operation (e.g., a touch operation) on the camera icon 505 on the main electronic device 100 is detected, the camera of the main electronic device is invoked and the camera application is started. As shown in fig. 5F, a camera application interface 506 of the main electronic device is displayed, which includes the preview screen of the main electronic device's camera. When an operation (e.g., a touch operation) on the photographing control 506A of the camera application interface of the main electronic device 100 is detected, the album control 506B of the camera application displays the taken photograph, which is stored in the main electronic device 100. The properties of the taken picture, such as sharpness, are determined by the photographing parameters of the camera of the main electronic device 100.
Through the embodiments exemplarily shown in fig. 5A-5F, the master electronic device may use the differentiated distributed input device function in an image capture scenario: when a touch operation is performed on the screen projection interface of the master electronic device, the camera of the master electronic device is invoked to respond; when a touch operation is performed on the slave electronic device, the camera of the slave electronic device is invoked to respond. That is, the electronic device may distinguish the input source and invoke the camera of the corresponding source to respond in the image capture scenario. Therefore, when operating the screen projection interface on the master electronic device, the user does not need to switch to the slave electronic device; the camera of the master electronic device directly responds to the operation, and the properties of the taken picture are not limited by the slave electronic device but determined by the master electronic device, which is convenient and provides a good user experience.
The user interface of "image capturing scene" may also take other forms, which is not limited to the form shown in fig. 5A to 5F, and this is not limited in this embodiment of the application. Not limited to the camera application, other applications (e.g., a video conference application, a live application, etc.) may also provide similar image capture scenes, and when these applications provide image capture scenes, the master electronic device in cooperation with other slave electronic devices may also provide a distributed input device recognition function in a similar manner to that described above with respect to fig. 5A-5F.
(III) Usage scenario III: video playing scenario
The following embodiments of fig. 6A to 6I take a video playing scenario as an example to describe a method for implementing the differentiated distributed input device function on an electronic device.
The scenes exemplarily shown in fig. 6A-6I may be one implementation of the "video playing scenario." This scenario includes two electronic devices, a master electronic device 100 and a slave electronic device 200, with the interface content of the slave electronic device projected onto and displayed on the master electronic device.
As shown in fig. 6A, the user interface of the screen-projection interface 300 on the main electronic device 100 may be a user interface where the user is ready to click on a video icon in the figure.
As shown in fig. 6A, when an operation (e.g., a touch operation) on the video icon 601 in the screen projection interface 300 on the main electronic device 100 is detected, the player of the main electronic device is invoked and the video application is started.
As shown in fig. 6B, after the video application is started, a video selection interface 602 is displayed. The video selection interface may display at least one candidate video, from which the user may select the video to play. Further optionally, the video selection interface 602 may also display the user's playback history on the player of the main electronic device. The user may also click a search control to enter search text and select a video to play by searching. When an operation on the selected video 602A is detected, the selected video is played.
As shown in fig. 6C, when a video is played, a video playing interface 603 is displayed. This interface may include a video playing screen, a pause/play control, a close control, and a progress bar control. In response to the user's operation on the close control 603A, the playing information is saved and sent to the slave electronic device 200, and the video playing interface 603 is closed. The slave electronic device 200 receives the playing information, updates the playing information stored in its video application, and thus preserves personalized content for the user. The playing information may include information such as the video playing record and the playing progress, and may also include information such as user comment information and media sharing information; this is not limited in the embodiments of the present application.
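For illustration only, the playing information sent back to the slave electronic device may be sketched as the following data structure; the field names are assumptions based on the items listed above, not a format defined by the embodiment.

```java
import java.util.List;

// Sketch of the playing information saved when the playing interface is
// closed on the master device and merged into the slave device's video
// application. Field names are illustrative assumptions.
class PlaybackInfo {
    String mediaId;        // which video was played (playing record)
    long progressMillis;   // playing progress to resume from
    long closedAtMillis;   // when playback was closed on the master device
    List<String> comments; // optional user comment information
    List<String> sharedTo; // optional media sharing information
}
```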
As shown in fig. 6D, when an operation (e.g., a touch operation) on the video icon 604 on the slave electronic device 200 is detected, the player of the slave electronic device is invoked and the video application is started. As shown in fig. 6E, after the video application is launched, a video selection interface 605 is displayed. Upon detecting an operation on the selected video 605A, the selected video is played and a video playing interface 606 is displayed, as shown in fig. 6F. The interface content of the slave electronic device is also displayed on the screen projection interface 300 of the master electronic device 100.
As shown in fig. 6G, when an operation (e.g., a touch operation) on the video icon 607 on the main electronic device 100 is detected, the player of the main electronic device is invoked and the video application is started. As shown in fig. 6H, after the video application is launched, a video selection interface 608 is displayed. Upon detecting an operation on a selected video, the selected video is played and a video playing interface 609 is displayed, as shown in fig. 6I.
Through the embodiments exemplarily shown in fig. 6A to 6I, the master electronic device may use the differentiated distributed input device function in a video playing scenario: when a touch operation is performed on the screen projection interface of the master electronic device, the player of the master electronic device is invoked to respond; when a touch operation is performed on the slave electronic device, the player of the slave electronic device is invoked to respond. That is, the electronic device may distinguish the input source and invoke the player corresponding to that source to respond in the video playing scenario. Therefore, when operating the screen projection interface on the master electronic device, the user does not need to switch to the slave electronic device: the player of the master electronic device directly plays the video from the slave electronic device, the larger screen of the master electronic device provides a better viewing experience, and playing information such as the playing progress on the slave electronic device is updated accordingly, so the user can conveniently continue viewing after changing devices, giving a good user experience.
The user interface of "video playing scene" may also take other forms, which is not limited to the user interface shown in fig. 6A to 6I, and this is not limited in this embodiment of the application. Not limited to video applications, other applications (e.g., beepli, etc.) may also provide similar video playback scenes, and when these applications provide video playback scenes, the master electronic device in conjunction with other slave electronic devices may also provide distributed input device identification functionality in a manner similar to that described above with respect to fig. 6A-6I.
In this application, the user interface displayed on the master electronic device on which the user performs the first operation may be referred to as the first user interface. Exemplary implementations of the first user interface may include the user interface of the master electronic device 100 shown in fig. 4A, 4G, 5A, 5E, 6A, or 6G.
In some embodiments of the present application, the user interface displayed on the slave electronic device on which the user performs the first operation may be referred to as the second user interface. Exemplary implementations of the second user interface may include the user interface of the slave electronic device 200 shown in fig. 4D, 5C, or 6D.
In some embodiments of the present application, a touch operation for opening or calling an application may be referred to as a first operation. Exemplary implementations of the first operation may include an operation in which the user clicks a text box to prepare for invoking an input method, or an operation in which the user clicks a camera application icon to prepare for invoking a camera application, or an operation in which the user clicks a video application icon to prepare for invoking a video application.
In some embodiments of the present application, the interface displayed on the master electronic device for presenting the invoked application of the master electronic device may be referred to as the third user interface. Exemplary implementations of the third user interface may include the user interface of the master electronic device 100 shown in fig. 4B, 4H, 5B, 5F, 6B, or 6H.
In some embodiments of the present application, the interface displayed on the master electronic device for presenting the invoked application of the slave electronic device may be referred to as the fourth user interface. Exemplary implementations of the fourth user interface may include the user interface of the master electronic device 100 shown in fig. 4E, 5D, or 6E.
In some embodiments of the present application, an exemplary implementation of a screen projection area may include the screen projection interface 300 shown in fig. 4A, fig. 5A, or fig. 6A.
In some embodiments of the present application, an operation on the invoked application may be referred to as the second operation. Exemplary implementations of the second operation may include the user entering text via the input method interface, the user clicking the photographing control of the camera application interface, or the user clicking the close control of the video playing interface.
In some embodiments of the present application, an operation for turning on the differentiated distributed input device function of the present application may be referred to as the third operation. Exemplary implementations of the third operation may include the user clicking the "function switch" control in the status bar, or the user clicking the "use default device" control in the notification bar window. An operation for turning off the differentiated distributed input device function of the present application may be referred to as the fourth operation. An exemplary implementation of the fourth operation may include the user clicking the "switch device" control in the notification bar window.
In some embodiments of the present application, the control for listening for the operation that turns on the differentiated distributed input device function may be referred to as the first interactive element. An exemplary implementation of the first interactive element may include the "use default device" control 302A shown in fig. 3B. The control for listening for the operation that turns off the differentiated distributed input device function may be referred to as the second interactive element. An exemplary implementation of the second interactive element may include the "switch device" control 301E shown in fig. 3A.
In some embodiments of the present application, after the differentiated distributed input device function of the present application is turned off, a touch operation for invoking an application may be referred to as the fifth operation. Exemplary implementations of the fifth operation may include the user clicking a text box to invoke an input method, the user clicking a camera application icon to invoke the camera application, or the user clicking a video application icon to invoke the video application. The interface displayed on the master electronic device for presenting the invoked application of the slave electronic device may then be referred to as the fifth user interface.
In some embodiments of the present application, the user interface 502 shown in fig. 5B may be referred to as the first preview screen of the camera application, and the enhanced function options may include a beautification function option, a skin-smoothing function option, a body-shaping function option, and so on.
In some embodiments of the present application, the user interface 602 shown in fig. 6B may be referred to as a media object selection interface of a playing application, the identifier shown as 602A may be referred to as a target media object identifier, the seventh operation may include a click operation on the target media object identifier, and the user interface 603 shown in fig. 6C may be referred to as a media playing screen of the playing application.
Fig. 7 is a block diagram of a software configuration of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the layered architecture is divided into three layers, which are, from top to bottom, an application layer, a framework layer, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 7, the application packages contain various applications that respond to user touch operations. For example, in a text input scenario, the application packages may contain an input method application, and may also contain other applications that provide a text input scenario (e.g., a browser application, a notes application, etc.); by displaying a text input interface, these applications can notify the system to invoke the input method application. The embodiments of the present application do not limit the specific scenario or the applications installed in the electronic device 100.
The application packages are not limited thereto, and may also include applications such as Camera, Gallery, Calendar, Phone, Maps, Navigation, WLAN, Bluetooth, Music, Video, and Messages.
The framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer, and includes a number of predefined functions.
As shown in fig. 7, the framework layer may include the input event management framework and the display framework provided by the embodiments of the present application. The input event management framework includes the event hub EventHub, which can acquire the underlying touch events. In this application, a device-recording Flag is added to EventHub to distinguish the input source electronic device. The value of the Flag used to distinguish devices may be referred to in this application as the first parameter. Touch event information acquired by EventHub is dispatched by touch point and transmitted to the window management system in the display framework; after processing, the relevant application at the application layer is invoked, thereby completing the response to the touch event.
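For illustration only, the idea of the device-recording Flag (the first parameter) may be sketched as follows. The actual EventHub belongs to the native input stack rather than the Java layer; this sketch only illustrates how tagging each raw event with its source device enables the later dispatch decision, and all names and flag values are assumptions.

```java
// Sketch of a raw touch event carrying the device-recording Flag that
// EventHub uses to distinguish the input source. Values are illustrative.
class RawTouchEvent {
    static final int FLAG_FROM_MASTER = 0; // assumed flag value
    static final int FLAG_FROM_SLAVE = 1;  // assumed flag value

    final int deviceFlag; // the "first parameter" recorded per event
    final float x, y;     // touch coordinates

    RawTouchEvent(int deviceFlag, float x, float y) {
        this.deviceFlag = deviceFlag;
        this.x = x;
        this.y = y;
    }
}
```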
The framework layer may further include a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like, which is not limited in the embodiments of the present application. The content provider is used to store and retrieve data and make it accessible to applications. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and may be used to build applications. The phone manager is used to provide the communication functions of the electronic device 100, such as management of call status (including connected, disconnected, etc.). The resource manager provides applications with various resources, such as localized strings, icons, and pictures. The notification manager allows an application to display notification information in the status bar and can be used to convey notification-type messages, for example to notify of download completion, message alerts, and the like.
The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
As shown in fig. 7, the kernel layer may be configured to report touch events of the device nodes to the framework layer; a touch event may come from the master electronic device or from the slave electronic device. After receiving a touch event, the framework layer determines the input source and invokes the application of the corresponding source.
The following describes exemplary workflows of the software and hardware of the electronic device 100 in connection with the text input scenario mentioned in the embodiments of the present application.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a touch event (including information such as the touch coordinates and the touch source) and stores it. The touch event is reported to the framework layer, which identifies the source device of the touch event according to the device-recording Flag in EventHub: when the Flag corresponds to the master electronic device, an application on the master electronic device is invoked; when the Flag corresponds to the slave electronic device, an application on the slave electronic device is invoked. The display target of the response result is then selected according to the position of the touch event: when the touch event occurs on the screen projection interface of the master electronic device, the response result is sent to the slave electronic device and displayed on both the screen projection interface of the master electronic device and the slave electronic device; when the touch event occurs on a non-screen-projection interface of the master electronic device, the response result is displayed on the non-screen-projection interface of the master electronic device.
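For illustration only, this report-and-dispatch workflow may be sketched as follows; all names are assumptions of this illustration rather than the actual framework interfaces.

```java
// Sketch of the framework-layer routing described above: the device Flag
// picks the responding application, and the touch position picks where the
// response result is displayed. All names are illustrative.
class TouchEventRouter {
    static final int FLAG_FROM_MASTER = 0; // assumed flag values
    static final int FLAG_FROM_SLAVE = 1;

    void onTouchEvent(int deviceFlag, boolean onProjectionInterface) {
        if (deviceFlag == FLAG_FROM_MASTER) {
            invokeMasterApp(); // application on the master device responds
            if (onProjectionInterface) {
                // Shown on the master's projection interface and sent to the slave.
                sendResultToSlave();
            } else {
                showOnMasterNonProjectionInterface();
            }
        } else {
            // Touch originated on the slave device: its own application responds
            // and the result is mirrored to the master's projection interface.
            invokeSlaveApp();
            mirrorToProjectionInterface();
        }
    }

    private void invokeMasterApp() { /* call application layer on the master */ }
    private void invokeSlaveApp() { /* call application layer on the slave */ }
    private void sendResultToSlave() { /* transport the response result */ }
    private void showOnMasterNonProjectionInterface() { /* local display */ }
    private void mirrorToProjectionInterface() { /* update projection window */ }
}
```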
For example, suppose the touch operation is a single-tap touch operation located in the browser search bar of the screen projection interface 300 of the main electronic device 100; the interface of the framework layer is then called to start the input method application of the main electronic device. When the user's text input operation is completed, the input text result is transmitted to the slave electronic device and displayed on the screen projection interface 300 of the master electronic device.
Based on the software structure of the electronic device shown in fig. 7, fig. 8 is a flowchart of a method for implementing the differentiated distributed input device function on the electronic device. For the implementation of each step in fig. 8, reference may be made to the related description of fig. 7; details are not repeated here.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …," "after …," "in response to determining …," or "in response to detecting …," depending on the context. Similarly, depending on the context, the phrase "when it is determined that …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined that …," "in response to determining …," "upon detecting (a stated condition or event)," or "in response to detecting (a stated condition or event)."
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.

Claims (17)

1. A method of display control, the method comprising:
the method comprises the steps that a master electronic device displays a first user interface, wherein the first user interface comprises interface content of a second user interface displayed by a slave electronic device, which is projected on the master electronic device;
the master electronic device detecting a first operation;
if the first operation is an operation on the first user interface, responding to the first operation, and displaying a third user interface by the main electronic equipment, wherein the third user interface comprises an interface for calling and running an application of the main electronic equipment;
and if the first operation is an operation on the second user interface, responding to the first operation, and displaying a fourth user interface by the main electronic equipment, wherein the fourth user interface comprises an interface for calling and running the application of the slave electronic equipment.
2. The method of claim 1, wherein the first user interface includes a screen-cast region including interface content of a second user interface displayed by the slave electronic device that is cast on the master electronic device, the first operation being an operation on the screen-cast region of the first user interface, the method further comprising:
in response to the detected second operation acting on the third user interface, the master electronic device sends the running result of the application to the slave electronic device.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
the master electronic device detects a third operation for starting a function of distinguishing a distributed input device, and in response to the third operation, the function of distinguishing the distributed input device is started, wherein the function of distinguishing the distributed input device is used for calling and running an application of the master electronic device when the master electronic device detects that the first operation is at the master electronic device, or calling and running an application of the slave electronic device when the first operation is at the slave electronic device.
4. The method of claim 3, wherein the method further comprises:
the first user interface comprises a window area, the third operation comprises an operation which is detected by the main electronic equipment and acts on a first interactive element in the window area, and the first interactive element is used for monitoring the operation of starting the function of the differentiated distributed input equipment; and when the function of distinguishing the distributed input devices is not started, the first interactive element is displayed in the window area.
5. The method of claim 4, wherein the method further comprises:
after the function of distinguishing the distributed input devices is started, the main electronic device refreshes the window area, and second interactive elements are displayed in the refreshed window area;
the main electronic device detects a fourth operation acting on the second interactive element, and responds to the fourth operation to close the function of distinguishing the distributed input devices.
6. The method of claim 5, wherein the first user interface includes a screen-cast area including interface content of a second user interface displayed by the slave electronic device that is cast on the master electronic device, the method further comprising:
after the function of distinguishing the distributed input devices is closed, the main electronic device detects a fifth operation;
if the fifth operation is an operation on the screen projection area on the master electronic device, responding to the fifth operation, and displaying a fifth user interface by the master electronic device, wherein the fifth user interface comprises an interface for calling and running an application of the slave electronic device.
7. The method of any of claims 1-6, wherein if the first operation is an operation that acts on a text box on the first user interface, the third user interface comprises an input method interface that invokes and runs an input method application of the host electronic device.
8. The method of claim 2, wherein the second operation is an input operation that acts on a text box in the screen-projection area, the method further comprising:
and the main electronic equipment refreshes the third user interface, and the screen projection area of the refreshed third user interface comprises the input result of the input operation.
9. The method of any of claims 1-6, wherein if the first operation is an operation of a camera application acting on the first user interface, the third user interface comprises a first preview screen of the camera application that invokes and runs the host electronic device, the first preview screen comprises a first image, and the third user interface further comprises at least one enhanced functionality option provided with the host electronic device;
in response to a detected sixth operation acting on a target enhancement function option, the electronic device displays a second preview picture, where the second preview picture includes a second image, the second image is an image obtained by performing enhancement function processing corresponding to the target enhancement function option on the first image, and the at least one enhancement function option includes the target enhancement function option.
10. The method of claim 9, wherein the first user interface includes a screen-cast region that includes interface content of a second user interface displayed by the slave electronic device that is cast on the master electronic device, the first operation is an operation on the screen-cast region of the first user interface, the second operation is an operation that acts on a capture control in the third user interface, the method further comprising:
the master electronic equipment refreshes the third user interface, the refreshed third user interface comprises a shooting result of the shooting operation, and the shooting result is sent to the slave electronic equipment; the shooting result comprises a shooting picture after the target enhancement function is called.
11. The method of any of claims 1-6, wherein if the first operation is an operation of a playback application icon acting on the first user interface, the third user interface comprises a media object selection interface that invokes and runs a playback application of the host electronic device, the media object selection interface comprising at least one media object identifier; the media object selection interface comprises media object identification played by the host electronic equipment in a historical way;
in response to the detected seventh operation acting on the target media object identifier, the electronic device displays a playing picture of media corresponding to the target media object identifier, the playing picture includes playing information of the media which is played on the main electronic device in a historical manner, and the at least one media object identifier includes the target media object identifier.
12. The method of claim 11, wherein the playback information includes one or more of the following: the system comprises playing progress information, user comment information and media sharing information.
13. The method of claim 11 or 12, wherein the first user interface includes a screen-casting area, the screen-casting area includes interface content of a second user interface displayed by the slave electronic device, the first operation is an operation on the screen-casting area of the first user interface, and if the second operation is an operation on a close control of the media playback screen, the method further includes:
and responding to the detected second operation acting on the media playing picture, closing the media playing picture by the main electronic equipment, and sending the playing information of the media playing picture to the slave electronic equipment.
14. The method of any of claims 1-13, wherein the master electronic device comprises a touch screen and a graphical user interface, and the slave electronic device comprises a touch screen and a graphical user interface.
15. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, causes the computer device to implement the method of any one of claims 1 to 14.
16. An electronic device, characterized in that the electronic device comprises means for performing the method of any of claims 1 to 14.
17. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-14.
CN202010901550.6A 2020-08-31 2020-08-31 Display control method and related device Pending CN114115770A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010901550.6A CN114115770A (en) 2020-08-31 2020-08-31 Display control method and related device
PCT/CN2021/112365 WO2022042326A1 (en) 2020-08-31 2021-08-12 Display control method and related apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010901550.6A CN114115770A (en) 2020-08-31 2020-08-31 Display control method and related device

Publications (1)

Publication Number Publication Date
CN114115770A (en) 2022-03-01

Family

ID=80354560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010901550.6A Pending CN114115770A (en) 2020-08-31 2020-08-31 Display control method and related device

Country Status (2)

Country Link
CN (1) CN114115770A (en)
WO (1) WO2022042326A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115580677A * 2022-09-26 2023-01-06 Honor Device Co., Ltd. Device control method, electronic device, and storage medium

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117480487A * 2022-05-30 2024-01-30 BOE Technology Group Co., Ltd. Screen information synchronization method and system
CN115599335B * 2022-12-13 2023-08-22 Jiaying Technology Co., Ltd. Method and system for sharing layout files based on multi-screen mode
CN115827207B * 2023-02-16 2023-07-14 Honor Device Co., Ltd. Application switching method and electronic device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150061970A1 (en) * 2013-08-29 2015-03-05 Samsung Electronics Co., Ltd. Method for sharing screen and electronic device thereof
US20150113432A1 (en) * 2013-10-23 2015-04-23 Samsung Electronics Co., Ltd. Method and device for transmitting data, and method and device for receiving data
CN110221798A * 2019-05-29 2019-09-10 Huawei Technologies Co., Ltd. Screen projection method, system, and related apparatus
US10474416B1 * 2019-01-03 2019-11-12 Capital One Services, LLC System to facilitate interaction during a collaborative screen sharing session
CN110515579A * 2019-08-28 2019-11-29 Beijing Xiaomi Mobile Software Co., Ltd. Screen projection method, apparatus, terminal, and storage medium
US20200042274A1 * 2018-07-31 2020-02-06 Samsung Electronics Co., Ltd. Electronic device and method for executing application using both display of electronic device and external display
CN111078819A * 2019-12-31 2020-04-28 Vivo Mobile Communication Co., Ltd. Application sharing method and electronic device
CN111314549A * 2020-02-10 2020-06-19 Lenovo (Beijing) Co., Ltd. Transmission apparatus and processing method
CN111327769A * 2020-02-25 2020-06-23 Beijing Xiaomi Mobile Software Co., Ltd. Multi-screen interaction method and device, and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10613739B2 (en) * 2016-06-11 2020-04-07 Apple Inc. Device, method, and graphical user interface for controlling multiple devices in an accessibility mode
JP6803581B2 * 2018-09-28 2020-12-23 Panasonic Intellectual Property Management Co., Ltd. Display control device, display control method, and display control system
CN110333814A * 2019-05-31 2019-10-15 Huawei Technologies Co., Ltd. Content sharing method and electronic device
CN110471639B * 2019-07-23 2022-10-18 Huawei Technologies Co., Ltd. Display method and related apparatus
CN111309419A * 2020-01-22 2020-06-19 Vivo Mobile Communication Co., Ltd. Split-screen display method and electronic device

Also Published As

Publication number Publication date
WO2022042326A1 (en) 2022-03-03

Similar Documents

Publication Publication Date Title
WO2021017889A1 Display method of video call applied to electronic device and related apparatus
CN112231025B (en) UI component display method and electronic equipment
WO2021213164A1 (en) Application interface interaction method, electronic device, and computer readable storage medium
WO2020000448A1 (en) Flexible screen display method and terminal
WO2022042326A1 (en) Display control method and related apparatus
US20220179827A1 (en) File Sharing Method of Mobile Terminal and Device
CN113961157B (en) Display interaction system, display method and equipment
CN114125130B (en) Method for controlling communication service state, terminal device and readable storage medium
CN110602312B (en) Call method, electronic device and computer readable storage medium
CN112543447A (en) Device discovery method based on address list, audio and video communication method and electronic device
CN112751954A (en) Operation prompting method and electronic equipment
CN114995715B (en) Control method of floating ball and related device
WO2023241209A9 (en) Desktop wallpaper configuration method and apparatus, electronic device and readable storage medium
CN114089932A (en) Multi-screen display method and device, terminal equipment and storage medium
CN113141483B (en) Screen sharing method based on video call and mobile device
WO2022143180A1 (en) Collaborative display method, terminal device, and computer readable storage medium
CN111492678B (en) File transmission method and electronic equipment
CN113438366B (en) Information notification interaction method, electronic device and storage medium
CN112532508B (en) Video communication method and video communication device
EP4293997A1 (en) Display method, electronic device, and system
CN113923372B (en) Exposure adjusting method and related equipment
CN113050864A (en) Screen capturing method and related equipment
WO2024114212A1 (en) Cross-device focus switching method, electronic device and system
WO2024067037A1 (en) Service calling method and system, and electronic device
WO2022042774A1 (en) Profile picture display method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination