CN116204145A - Method for projecting screen of electronic equipment, communication device and electronic equipment - Google Patents


Info

Publication number
CN116204145A
CN116204145A (application CN202211625815.XA)
Authority
CN
China
Prior art keywords: electronic device, screen, application, area, electronic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211625815.XA
Other languages
Chinese (zh)
Inventor
肖冬 (Xiao Dong)
盛琦阳 (Sheng Qiyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Device Co Ltd
Priority to CN202211625815.XA
Publication of CN116204145A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1407: General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • G06F 3/1454: Involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The application provides a method for projecting a screen of an electronic device, a communication device, and an electronic device. The method is applied to a first electronic device that includes a touch screen, with a second electronic device placed on a first area of the touch screen, and comprises the following steps: the first electronic device obtains a first parameter, the first parameter comprising at least one of: the change in the dielectric constant of the first area of the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device; and when the first parameter meets a preset condition, displaying, on the first electronic device, at least one application running on the second electronic device. With this method, an application running on a mobile phone can be rapidly projected onto a large-screen device, which avoids the cumbersome connection steps of the related art and the long interaction path caused by having to reopen, on the large-screen device, an application already running on the phone, thereby improving user experience.

Description

Method for projecting screen of electronic equipment, communication device and electronic equipment
Technical Field
The application relates to the technical field of screen display, and in particular to a method for projecting a screen of an electronic device, a communication device, and an electronic device.
Background
With the development of application display technology, more and more electronic devices support screen projection. Screen projection means projecting an application interface launched on an electronic device with a smaller display screen (such as a mobile phone) onto another electronic device with a larger display screen (such as a large-screen device), achieving a better viewing effect and allowing the application interface to be operated with the input devices of the second electronic device.
At present, after a mobile phone and a large-screen device establish a screen-projection connection, the phone can project the icons of the applications on its display interface to the large-screen device, which displays them on its own interface. The user can then operate the mouse of the large-screen device to click an icon displayed on its interface and thereby open the corresponding application on the phone.
Therefore, in the related screen-projection technology, the steps for establishing a connection between the mobile phone and the large-screen device are complex, and even after the connection is made, an application already running on the phone cannot be projected directly onto the large-screen device; the user must perform further operations on the large-screen device to open the running application, resulting in a long interaction path and a poor user experience.
Disclosure of Invention
By recognizing the change in the dielectric constant of the touch screen of a large-screen device when a mobile phone is placed on it, the application running on the phone is quickly projected onto the large-screen device. This addresses the complex connection-establishment steps of the related art and the long interaction path caused by having to restart the running application on the large-screen device, thereby improving user experience.
In a first aspect, a method for projecting a screen of an electronic device is provided. The method may be executed by a first electronic device, or by a chip integrated in the first electronic device, where the first electronic device includes a touch screen and a second electronic device is placed on a first area of the touch screen, and the method includes: the first electronic device obtains a first parameter, the first parameter comprising at least one of: the change in the dielectric constant of the first area of the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device; and when the first parameter meets a preset condition, displaying, on the first electronic device, at least one application running on the second electronic device.
According to the method provided in the first aspect, the first electronic device determines whether the change in the dielectric constant of the first area of its touch screen, the area of the first area, or the weight of the second electronic device meets a preset condition. When the condition is met, a connection between the first and second electronic devices is established quickly, avoiding the cumbersome connection steps of the related art. Once connected, at least one application running on the second electronic device can be displayed directly on the first electronic device, removing the long interaction path caused by having to reopen the running application on the first electronic device, and thereby improving user experience.
In a possible implementation manner of the first aspect, the preset condition includes at least one of the following: the change in the dielectric constant of the first area of the touch screen of the first electronic device satisfies a preset first condition, the area of the first area satisfies a preset second condition, or the weight of the second electronic device satisfies a preset third condition.
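The patent leaves the concrete first, second, and third conditions unspecified. A minimal sketch of how the first electronic device might evaluate them, with all threshold values and names hypothetical, is:

```python
from dataclasses import dataclass
from typing import Optional

# All thresholds are hypothetical illustrations; the patent does not
# specify concrete values for the preset conditions.
DIELECTRIC_DELTA_MIN = 5.0        # minimum change in relative permittivity
CONTACT_AREA_MIN_CM2 = 80.0       # roughly the footprint of a phone
WEIGHT_RANGE_G = (120.0, 300.0)   # plausible phone weight range in grams

@dataclass
class FirstParameter:
    dielectric_delta: Optional[float] = None  # change measured in the first area
    contact_area_cm2: Optional[float] = None  # area of the first area
    weight_g: Optional[float] = None          # weight of the second device

def meets_preset_condition(p: FirstParameter) -> bool:
    """True if at least one available measurement satisfies its condition."""
    checks = []
    if p.dielectric_delta is not None:
        checks.append(p.dielectric_delta >= DIELECTRIC_DELTA_MIN)
    if p.contact_area_cm2 is not None:
        checks.append(p.contact_area_cm2 >= CONTACT_AREA_MIN_CM2)
    if p.weight_g is not None:
        checks.append(WEIGHT_RANGE_G[0] <= p.weight_g <= WEIGHT_RANGE_G[1])
    return any(checks)
```

Because the conditions are joined by "at least one of," a single satisfied measurement is enough to trigger projection, which the `any(checks)` expresses directly.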
In a possible implementation manner of the first aspect, when the first parameter meets the preset condition, before displaying, on the first electronic device, the application running on the second electronic device, the method further includes: when the first parameter meets the preset condition, the first electronic device determines how long the second electronic device has been placed on the touch screen; and when that time is greater than or equal to a preset first threshold, the application running on the second electronic device is displayed on the first electronic device. In this implementation manner, determining the placement time reduces false recognition by the first electronic device and improves screen-projection accuracy.
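The dwell-time check above can be sketched as follows; the threshold value and class name are hypothetical, since the patent does not fix the "preset first threshold":

```python
import time

DWELL_THRESHOLD_S = 1.5  # hypothetical value of the "preset first threshold"

class PlacementTimer:
    """Tracks how long the second device has rested on the touch screen,
    so that a brief accidental contact does not trigger projection."""

    def __init__(self):
        self._placed_at = None

    def on_contact(self, now=None):
        """Called when the first parameter first meets the preset condition."""
        if self._placed_at is None:
            self._placed_at = time.monotonic() if now is None else now

    def on_release(self):
        """Called when the device is lifted off the screen."""
        self._placed_at = None

    def should_project(self, now=None) -> bool:
        """True once the dwell time reaches the first threshold."""
        if self._placed_at is None:
            return False
        now = time.monotonic() if now is None else now
        return now - self._placed_at >= DWELL_THRESHOLD_S
```

Using a monotonic clock rather than wall-clock time keeps the dwell measurement immune to system clock adjustments.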
In a possible implementation manner of the first aspect, when the first parameter meets the preset condition, before displaying, on the first electronic device, at least one application running on the second electronic device, the method further includes: the first electronic device sends an authorization request to the second electronic device; and upon receiving an authorization message from the second electronic device, displays the application running on the second electronic device. In this implementation manner, the authorization confirmation between the first and second electronic devices ensures that the screen is projected only when the user actually wants it, further protecting user privacy and improving screen-projection accuracy.
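The authorization exchange can be sketched as two message-passing endpoints; the message types and queue-based transport are hypothetical stand-ins for whatever channel the devices actually use:

```python
import queue
import threading

def first_device_flow(to_second: queue.Queue, from_second: queue.Queue) -> bool:
    """First device: send the authorization request, project only if granted."""
    to_second.put({"type": "AUTH_REQUEST"})
    reply = from_second.get(timeout=5)
    return reply.get("type") == "AUTH_GRANTED"

def second_device_flow(from_first: queue.Queue, to_first: queue.Queue,
                       user_accepts: bool) -> None:
    """Second device: surface a consent prompt and send back the user's answer."""
    msg = from_first.get(timeout=5)
    if msg.get("type") == "AUTH_REQUEST":
        answer = "AUTH_GRANTED" if user_accepts else "AUTH_DENIED"
        to_first.put({"type": answer})
```

Running the two flows concurrently (here with a thread standing in for the second device) shows the projection proceeding only after an explicit grant.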
In a possible implementation manner of the first aspect, the touch screen includes a first screen area and a second screen area, and when the first electronic device displays the applications running on the second electronic device in the second screen area, the method further includes: receiving a first operation by the user on a first application, where the first application is any one of the at least one application; and, in response to the first operation, the first electronic device displays the first application in the first screen area. In this implementation manner, when multiple applications are displayed as small windows in the second screen area, the first application can be enlarged into the first screen area without affecting the display of the other applications, further improving user experience.
In a possible implementation manner of the first aspect, the first electronic device displays at least one application running on the second electronic device in a superimposed or tiled manner.
Alternatively, the windows of each application may be independent of each other, and the user may operate each window on the first electronic device, e.g. the user may control closing of each window individually.
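The two-region layout and the independent windows described above can be sketched together; all class and method names are hypothetical illustrations, not part of the patent:

```python
class SplitScreenProjector:
    """Sketch of the two-region layout: small independent windows in the
    second screen area, and an enlarged 'focused' app in the first area."""

    def __init__(self, running_apps):
        self.second_region = list(running_apps)  # apps shown as small windows
        self.first_region = None                 # currently enlarged app, if any

    def focus(self, app: str) -> None:
        """A user tap (the 'first operation') enlarges the app into the
        first screen area without disturbing the other small windows."""
        if app in self.second_region:
            self.first_region = app

    def close(self, app: str) -> None:
        """Windows are independent: closing one leaves the rest untouched."""
        if app in self.second_region:
            self.second_region.remove(app)
        if self.first_region == app:
            self.first_region = None
```

Note that `focus` leaves the second region unchanged, matching the requirement that enlarging one application does not affect the display of the others.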
In a possible implementation manner of the first aspect, the first electronic device displays a multi-level interface of a first application running on the second electronic device, and the first application is any one of the at least one application.
In a second aspect, a method for projecting a screen of an electronic device is provided. The method may be executed by a second electronic device, or by a chip integrated in the second electronic device, where the second electronic device is placed on a touch screen of a first electronic device, and the method includes: the second electronic device detects the contact surface with the first electronic device and determines whether the contact surface is an electronic screen, the contact surface being the surface of the touch screen of the first electronic device that is in contact with the second electronic device; and when the contact surface is an electronic screen, the second electronic device displays its running at least one application on the first electronic device.
According to the method provided in the second aspect, the second electronic device detects the surface it is in contact with, and when that surface is an electronic screen, a connection between the first and second electronic devices is established quickly, avoiding the cumbersome connection steps of the related art. Once connected, at least one application running on the second electronic device can be displayed directly on the first electronic device, removing the long interaction path caused by having to reopen the running application on the first electronic device, and thereby improving user experience.
In a possible implementation manner of the second aspect, before the second electronic device displays the running at least one application on the first electronic device, the method further includes: the second electronic device sends a connection request to the first electronic device; and, in response to a connection-established message from the first electronic device, the second electronic device displays the running at least one application on the first electronic device. In this implementation manner, the authorization confirmation between the first and second electronic devices ensures that the screen is projected only when the user actually wants it, further protecting user privacy and improving screen-projection accuracy.
In a possible implementation manner of the second aspect, the second electronic device detects the contact surface of the first electronic device by using an infrared camera.
It should be understood that the infrared camera detects and recognizes the content of the captured picture through its ISO sensitivity and focal length. When the second electronic device is placed on a surface that is not an electronic screen, no light enters the camera and the ISO reading may be zero. The infrared camera can therefore be used to judge whether the contact surface is an electronic screen.
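The screen-detection heuristic described above reduces to a simple check on the camera reading; the threshold, function names, and callback-based flow below are hypothetical illustrations:

```python
ISO_DARK_THRESHOLD = 0.0  # hypothetical: an opaque surface admits no light

def contact_surface_is_screen(iso_reading: float) -> bool:
    """An emissive electronic screen leaks light into the rear infrared
    camera, so the reading is nonzero; on a desk or table it stays at zero."""
    return iso_reading > ISO_DARK_THRESHOLD

def second_device_autoproject(read_iso, start_projection) -> bool:
    """Second-aspect flow: probe the contact surface with the infrared
    camera, and start projecting only if it is an electronic screen."""
    if contact_surface_is_screen(read_iso()):
        start_projection()
        return True
    return False
```

Here `read_iso` stands in for whatever sensor API exposes the camera's light reading, and `start_projection` for the projection path described in the second aspect.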
In a third aspect, a communication device is provided, the communication device comprising means for performing the steps in the above first aspect or any of the possible implementation manners of the first aspect, or means for performing the steps in the above second aspect or any of the possible implementation manners of the second aspect.
In a fourth aspect, a communications apparatus is provided, the communications apparatus comprising at least one processor and a memory coupled to the processor, the memory storing program instructions that, when executed by the processor, perform the method of the above first aspect or any of the possible implementations of the first aspect.
In a fifth aspect, a communication device is provided, the communication device comprising at least one processor and interface circuitry, the at least one processor being configured to perform the method of the above first aspect or any of the possible implementations of the first aspect, or to perform the method of the above second aspect or any of the possible implementations of the second aspect.
In a sixth aspect, there is provided an electronic device comprising the communication apparatus provided in the third aspect, or the electronic device comprising the communication apparatus provided in the fourth aspect, or the electronic device comprising the communication apparatus provided in the fifth aspect.
In a seventh aspect, a computer program product is provided, comprising a computer program which, when executed by a processor, performs the method of the above first aspect or any of the possible implementations of the first aspect, or the method of the above second aspect or any of the possible implementations of the second aspect.
In an eighth aspect, a computer readable storage medium is provided, in which a computer program is stored which, when executed, carries out the method of the above first aspect or any of the possible implementations of the first aspect, or the method of the above second aspect or any of the possible implementations of the second aspect.
In a ninth aspect, there is provided a chip comprising: a processor for calling and running a computer program from a memory, causing a communication device on which the chip is installed to perform the method of the above first aspect or any of the possible implementations of the first aspect, or the method of the above second aspect or any of the possible implementations of the second aspect.
Drawings
Fig. 1 is a schematic diagram of an interface for establishing a cooperative connection between a first electronic device and a second electronic device in the related art.
Fig. 2 shows an interface schematic diagram of a first electronic device and a second electronic device in the related art.
Fig. 3 shows a schematic structural diagram of an electronic device 300.
Fig. 4 shows a software architecture block diagram of an electronic device 300 according to an embodiment of the present application.
Fig. 5 is a schematic diagram illustrating a display interface of an electronic device according to an embodiment of the present application.
Fig. 6 shows a schematic diagram of the change of the dielectric constant of the capacitive touch panel.
Fig. 7 shows an interface display schematic diagram of a first electronic device provided in an embodiment of the present application.
Fig. 8 shows an interface display schematic diagram of a first electronic device provided in an embodiment of the present application.
Fig. 9 shows an interface display schematic diagram of a first electronic device provided in an embodiment of the present application.
Fig. 10 is a schematic diagram of a display interface of an electronic device according to an embodiment of the present application.
Fig. 11 is a schematic diagram of a display interface of another electronic device according to an embodiment of the present application.
Fig. 12 is a schematic diagram of a display interface of another electronic device according to an embodiment of the present application.
Fig. 13 is a schematic diagram of a display interface of another electronic device according to an embodiment of the present application.
Fig. 14 is a schematic view of a display interface of another electronic device according to an embodiment of the present application.
Fig. 15 shows a flowchart of an implementation of a method 1500 for screen projection of an electronic device according to an embodiment of the present application.
Fig. 16 shows a flowchart of an implementation of a method 1600 for screen projection of another electronic device according to an embodiment of the present application.
Fig. 17 shows a schematic diagram of a chip system provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include plural forms such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in embodiments of the present application, "one or more" refers to one, two, or more than two; "and/or" describes an association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the embodiments of the present application, "a plurality of" means two or more. It should be noted that, in the description of the embodiments of the present application, the terms "first," "second," and the like are used only for distinguishing between descriptions and are not to be understood as indicating or implying relative importance or a sequential order.
Furthermore, various aspects or features of the present application may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques. The term "article of manufacture" as used in the embodiments of the present application encompasses a computer program accessible from any computer-readable device, carrier, or medium. For example, computer-readable media may include, but are not limited to, magnetic storage devices (e.g., hard disks, floppy disks, or magnetic strips), optical discs (e.g., compact disc (CD), digital versatile disc (DVD)), smart cards, and flash memory devices (e.g., erasable programmable read-only memory (EPROM), cards, sticks, or key drives). Additionally, various storage media described herein can represent one or more devices and/or other machine-readable media for storing information. The term "machine-readable medium" can include, without being limited to, wireless channels and various other media capable of storing, containing, and/or carrying instructions and/or data.
With the development of internet technology, screen projection is widely used. Screen projection refers to projecting the interface content displayed on a first electronic device onto a second electronic device, so that the second electronic device synchronously displays the content shown by the first. Through screen projection, the interface of a device with a smaller display screen (such as a mobile phone or tablet computer) can be projected onto a large-screen display device (such as a television or a vehicle-mounted multimedia display), achieving a better viewing effect and allowing the user to operate conveniently with the input equipment of the large-screen device.
It will be appreciated that before the first electronic device projects its screen to the second electronic device, a connection must be established between them. Typically, the user establishes a cooperative connection by bringing the first electronic device near the touch pad or keyboard of the second electronic device via near field communication (Near Field Communication, NFC), and a window of the first electronic device is then projected onto the interface of the second electronic device.
In the following, a first electronic device is taken as a mobile phone, a second electronic device is taken as a notebook computer, and how to establish cooperative connection between the first electronic device and the second electronic device and how to screen a window of the first electronic device on an interface of the second electronic device in the related art is described.
For example, fig. 1 shows a schematic diagram of the interfaces through which a first electronic device and a second electronic device establish a cooperative connection in the related art. As shown in fig. 1 (a), before the notebook computer 110 and the mobile phone 120 establish a cooperative connection, the user needs to click a "nearby discovery" button in the computer-management application of the notebook 110 to trigger discovery, while Bluetooth on the mobile phone 120 is enabled. When the notebook 110 discovers the phone 120, a popup window 1102 appears on the notebook's display 1101 asking "whether to establish a connection with the first electronic device," and the user can select the "Allow" or "Reject" button on the popup as required. When the user clicks "Allow," a connection is established between the notebook 110 and the phone 120, and the interface 1201 of the phone 120 can then be projected onto the display 1101 of the notebook; as shown in fig. 1 (b), only the main interface of the phone 120 is projected into the phone window on the notebook's display. As can be seen from fig. 1 (a), although a video application is running in the background of the phone 120, its window cannot be displayed directly on the display of the notebook 110 after projection.
Fig. 2 shows an interface schematic diagram of the first electronic device and the second electronic device in the related art. As shown in fig. 2 (a), the main interface 1201 of the mobile phone 120 is displayed on the display 1101 of the notebook computer 110; although a video application is running in the background of the phone 120, the video playback interface is not displayed directly on the notebook. Therefore, when the user wants to watch the video on the notebook 110, the video application must be found and opened. As shown in fig. 2 (b), after the user clicks the video application, the projected interface 1201 of the phone shown on the display 1101 of the notebook 110 is switched to the display interface 1202.
In summary, establishing a connection from the first electronic device to the second electronic device involves relatively many interface operations, and when the first electronic device projects its screen, only its main interface is displayed on the second electronic device, so the user must open the applications of the first electronic device one by one on the interface of the second electronic device.
In view of this, the present application provides a method for projecting a screen of an electronic device. The method is applied to a first electronic device having a touch screen, with a second electronic device placed on the touch screen. The first electronic device acquires a first parameter, the first parameter including at least one of the following: the change in the dielectric constant of the touch screen of the first electronic device, the area over which the dielectric constant of the touch screen changes, or the weight of the second electronic device; and when the first parameter meets a preset condition, an application running on the second electronic device is displayed on the first electronic device. With this method, the second electronic device can rapidly project its screen onto the first electronic device, further improving user experience.
Before introducing the method for projecting the screen of an electronic device, the electronic device provided by the application is described. The embodiment of the application provides an electronic device for executing the screen projection method, and the electronic device generally refers to one with a larger display screen. For example, the electronic device in the embodiments of the present application may be a smart television, a projection device that can project onto a wall, a computer, various portable notebook computers, various tablet computers, and other electronic devices with display screens. Optionally, the electronic device may also be a personal digital assistant (personal digital assistant, PDA) with a larger display screen, a handheld device with a larger display screen, a computing device, a vehicle-mounted device, a wearable device, an electronic device in a 5G network, or an electronic device in an evolved public land mobile network (public land mobile network, PLMN), etc., which the embodiments of the present application do not limit.
By way of example, fig. 3 shows a schematic structural diagram of an electronic device 300. The electronic device 300 may include a processor 310, an external memory interface 320, an internal memory 321, a universal serial bus (universal serial bus, USB) interface 330, a charge management module 340, a power management module 341, a battery 342, an antenna 1, an antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone jack 370D, a sensor module 380, keys 390, a motor 391, an indicator 392, a camera 393, a display screen 394, and a subscriber identity module (subscriber identity module, SIM) card interface 395, among others. The sensor module 380 may include a pressure sensor 380A, a gyroscope sensor 380B, a barometric pressure sensor 380C, a magnetic sensor 380D, an acceleration sensor 380E, a distance sensor 380F, a proximity sensor 380G, a fingerprint sensor 380H, a gravity sensor 380I, a temperature sensor 380J, a touch sensor 380K, an ambient light sensor 380L, a bone conduction sensor 380M, etc.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 300. In other embodiments of the present application, electronic device 300 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 310 may include one or more processing units, such as: the processor 310 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 300, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 310 for storing instructions and data. In some embodiments, the memory in the processor 310 is a cache memory. The memory may hold instructions or data that the processor 310 has just used or uses cyclically. If the processor 310 needs to use the instructions or data again, it may call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 310, thereby improving system efficiency.
In some embodiments, processor 310 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 310 may contain multiple sets of I2C buses. The processor 310 may be coupled to the touch sensor 380K, a charger, a flash, the camera 393, and the like, respectively, via different I2C bus interfaces. For example, the processor 310 may be coupled to the touch sensor 380K through an I2C interface, so that the processor 310 communicates with the touch sensor 380K through the I2C bus interface, implementing the touch function of the electronic device 300.
The I2S interface may be used for audio communication. In some embodiments, the processor 310 may contain multiple sets of I2S buses. The processor 310 may be coupled to the audio module 370 via an I2S bus to enable communication between the processor 310 and the audio module 370. In some embodiments, the audio module 370 may communicate audio signals to the wireless communication module 360 via the I2S interface to enable answering calls via the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 370 and the wireless communication module 360 may be coupled by a PCM bus interface. In some embodiments, the audio module 370 may also transmit audio signals to the wireless communication module 360 via the PCM interface to enable phone answering via the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 310 with the wireless communication module 360. For example, the processor 310 communicates with a Bluetooth module in the wireless communication module 360 through a UART interface to implement a Bluetooth function. In some embodiments, the audio module 370 may transmit an audio signal to the wireless communication module 360 through the UART interface to implement the function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 310 to peripheral devices such as the display screen 394, the camera 393, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 310 and camera 393 communicate through a CSI interface, implementing the photographing function of electronic device 300. The processor 310 and the display screen 394 communicate via a DSI interface to implement the display functions of the electronic device 300.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect processor 310 with camera 393, display 394, wireless communication module 360, audio module 370, sensor module 380, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 330 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 330 may be used to connect a charger to charge the electronic device 300, or to transfer data between the electronic device 300 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 300. In other embodiments of the present application, the electronic device 300 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 340 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 340 may receive a charging input from a wired charger through the USB interface 330. In some wireless charging embodiments, the charge management module 340 may receive a wireless charging input through a wireless charging coil of the electronic device 300. While charging the battery 342, the charge management module 340 may also supply power to the electronic device through the power management module 341.
The power management module 341 is configured to connect the battery 342, the charge management module 340, and the processor 310. The power management module 341 receives input from the battery 342 and/or the charge management module 340 to supply power to the processor 310, the internal memory 321, the external memory, the display screen 394, the camera 393, the wireless communication module 360, and the like. The power management module 341 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health status (leakage, impedance). In some other embodiments, the power management module 341 may alternatively be disposed in the processor 310, or the power management module 341 and the charge management module 340 may be disposed in the same device.
The wireless communication function of the electronic device 300 may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 300 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 350 may provide a solution for wireless communication, including 2G/3G/4G/5G, etc., applied on the electronic device 300. The mobile communication module 350 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 350 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 350 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be disposed in the processor 310. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be provided in the same device as at least some of the modules of the processor 310.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speaker 370A, receiver 370B, etc.), or displays images or video through display screen 394. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 350 or other functional module, independent of the processor 310.
The wireless communication module 360 may provide solutions for wireless communication applied to the electronic device 300, including wireless local area network (wireless local area networks, WLAN) (such as a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR), and other technologies. The wireless communication module 360 may be one or more devices integrating at least one communication processing module. The wireless communication module 360 receives electromagnetic waves via the antenna 2, performs frequency demodulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 310. The wireless communication module 360 may also receive a to-be-sent signal from the processor 310, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 350 of electronic device 300 are coupled, and antenna 2 and wireless communication module 360 are coupled, such that electronic device 300 may communicate with a network and other devices via wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TDSCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 300 implements display functions through a GPU, a display screen 394, an application processor, and the like. The GPU is a microprocessor for image processing, connected to the display screen 394 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 310 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 394 is used for displaying images, videos, and the like. The display screen 394 includes a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 300 may include 1 or N display screens 394, N being a positive integer greater than 1.
Electronic device 300 may implement capture functionality through an ISP, camera 393, video codec, GPU, display 394, and application processor, among others.
The ISP is used to process the data fed back by the camera 393. For example, during photographing, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP may also perform algorithm optimization on the noise, brightness, and skin tone of the image. The ISP may also optimize parameters such as the exposure and color temperature of a photographing scene. In some embodiments, the ISP may be disposed in the camera 393.
Camera 393 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 300 may include 1 or N cameras 393, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 300 is selecting a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 300 may support one or more video codecs, so that the electronic device 300 can play or record videos in multiple encoding formats, such as moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the electronic device 300 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 320 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 300. The external memory card communicates with the processor 310 through an external memory interface 320 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 321 may be used to store computer executable program code comprising instructions. The processor 310 executes various functional applications of the electronic device 300 and data processing by executing instructions stored in the internal memory 321. The internal memory 321 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 300 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 321 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 300 may implement audio functions, such as music playing and recording, through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the headset interface 370D, the application processor, and the like.
The audio module 370 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be disposed in the processor 310, or some of the functional modules of the audio module 370 may be disposed in the processor 310.
The speaker 370A, also referred to as a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device 300 may listen to music or answer a hands-free call through the speaker 370A.
The receiver 370B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 300 answers a call or a voice message, the receiver 370B may be placed close to the human ear to receive the voice.
The microphone 370C, also referred to as a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user may speak with the mouth close to the microphone 370C to input a sound signal into the microphone 370C. The electronic device 300 may be provided with at least one microphone 370C. In some other embodiments, the electronic device 300 may be provided with two microphones 370C, which may implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 300 may also be provided with three, four, or more microphones 370C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The headset interface 370D is used to connect a wired headset. The headset interface 370D may be the USB interface 330, or may be a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 380A is configured to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 380A may be disposed on the display screen 394. There are many types of pressure sensors 380A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates made of a conductive material. When a force is applied to the pressure sensor 380A, the capacitance between the electrodes changes, and the electronic device 300 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 394, the electronic device 300 detects the intensity of the touch operation according to the pressure sensor 380A. The electronic device 300 may also calculate the touch position based on the detection signal of the pressure sensor 380A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the Messages application icon, an instruction to view a message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the Messages application icon, an instruction to create a new message is executed.
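The intensity-threshold dispatch described above can be sketched as follows. This is an illustrative model only; the threshold value, icon name, and action names are hypothetical and do not come from the patent.

```python
def dispatch_touch(app_icon: str, intensity: float,
                   first_pressure_threshold: float = 0.5) -> str:
    """Map a touch on the Messages icon to an instruction by touch intensity.

    Hypothetical sketch: a light press views a message, a firm press
    creates a new one; other icons simply open their application.
    """
    if app_icon != "messages":
        return "open_app"
    if intensity < first_pressure_threshold:
        return "view_message"
    return "create_message"
```

The same touch position thus yields different instructions purely from the measured intensity.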
The gyro sensor 380B may be used to determine the motion posture of the electronic device 300. In some embodiments, the angular velocity of the electronic device 300 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 380B. The gyro sensor 380B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 380B detects the shake angle of the electronic device 300, calculates, according to the angle, the distance that the lens module needs to compensate, and lets the lens counteract the shake of the electronic device 300 through reverse motion, thereby implementing image stabilization. The gyro sensor 380B may also be used in navigation and motion-sensing game scenarios.
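As a rough illustration of the compensation step, a small-angle model can estimate the lens displacement from the detected shake angle. The formula and focal-length parameter are assumptions made for this sketch, not the patent's algorithm.

```python
import math

def shake_compensation_mm(shake_angle_deg: float, focal_length_mm: float) -> float:
    """Approximate lens displacement needed to cancel a small shake angle.

    Assumed model: the image shifts by roughly focal_length * tan(angle),
    and the lens is moved the same distance in the opposite direction.
    """
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

A larger shake angle or a longer focal length requires a larger compensating motion, which matches the qualitative description above.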
The air pressure sensor 380C is used to measure air pressure. In some embodiments, the electronic device 300 calculates altitude from barometric pressure values measured by the barometric pressure sensor 380C, aiding in positioning and navigation.
The magnetic sensor 380D includes a Hall effect sensor. The electronic device 300 may detect the opening and closing of a flip cover or holster using the magnetic sensor 380D. In some embodiments, when the electronic device 300 is a clamshell device, the electronic device 300 may detect the opening and closing of the flip cover according to the magnetic sensor 380D, and then set features such as automatic unlocking upon flip opening according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 380E may detect the magnitude of the acceleration of the electronic device 300 in various directions (typically along three axes). When the electronic device 300 is stationary, the magnitude and direction of gravity may be detected. It may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
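The landscape/portrait switching mentioned above can be sketched by comparing the gravity components reported by the acceleration sensor; the axis convention (y along the long edge of the screen) is an assumption for this sketch.

```python
def detect_orientation(ax: float, ay: float) -> str:
    """Classify landscape vs. portrait from gravity components (m/s^2).

    Illustrative only: when the device is upright, gravity dominates the
    y axis; when it is held sideways, gravity dominates the x axis.
    """
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```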
A distance sensor 380F for measuring distance. The electronic device 300 may measure the distance by infrared or laser. In some embodiments, the electronic device 300 may range using the distance sensor 380F to achieve fast focus.
The proximity light sensor 380G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 300 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 300; when insufficient reflected light is detected, the electronic device 300 may determine that there is no object near it. The electronic device 300 may use the proximity light sensor 380G to detect that the user is holding the electronic device 300 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 380G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
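The ear-proximity screen-off behavior can be sketched as a simple threshold check; the reflected-light threshold and the in-call condition are illustrative assumptions, not values from the patent.

```python
def should_turn_off_screen(reflected_light: float, in_call: bool,
                           threshold: float = 0.8) -> bool:
    """Turn the screen off when enough reflected IR is detected during a call.

    reflected_light is a normalized detector reading in [0, 1];
    the 0.8 threshold is a hypothetical placeholder.
    """
    return in_call and reflected_light >= threshold
```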
The ambient light sensor 380L is used to sense ambient light level. The electronic device 300 may adaptively adjust the brightness of the display screen 394 based on the perceived ambient light level. The ambient light sensor 380L may also be used to automatically adjust white balance during photographing. The ambient light sensor 380L may also cooperate with the proximity light sensor 380G to detect if the electronic device 300 is in a pocket to prevent false touches.
The fingerprint sensor 380H is used to collect a fingerprint. The electronic device 300 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The gravity sensor 380I detects the weight of an object placed on the electronic device 300. In this embodiment, the gravity sensor is located in the display screen and is used to collect the weight of the second electronic device. When the gravity sensor 380I collects the weight of the second electronic device, it uploads the detected data to the processor of the electronic device 300, and the processor determines whether the weight of the second electronic device meets a preset third condition.
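A minimal sketch of this weight check, assuming the third preset condition is a plausible weight range for the second electronic device; the text does not specify the condition, so the bounds below are hypothetical placeholders.

```python
def weight_meets_third_condition(weight_g: float,
                                 min_g: float = 80.0,
                                 max_g: float = 400.0) -> bool:
    """Check whether the measured weight falls within a phone-like range.

    min_g and max_g are hypothetical bounds standing in for the patent's
    unspecified "preset third condition".
    """
    return min_g <= weight_g <= max_g
```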
The temperature sensor 380J is used to detect temperature. In some embodiments, the electronic device 300 executes a temperature processing strategy using the temperature detected by the temperature sensor 380J. For example, when the temperature reported by the temperature sensor 380J exceeds a threshold, the electronic device 300 reduces the performance of a processor located near the temperature sensor 380J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 300 heats the battery 342 to prevent the low temperature from causing the electronic device 300 to shut down abnormally. In some other embodiments, when the temperature is lower than still another threshold, the electronic device 300 boosts the output voltage of the battery 342 to avoid abnormal shutdown caused by low temperature.
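The three temperature thresholds described above can be sketched as a single policy function; all numeric thresholds are hypothetical, since the text does not specify them.

```python
def thermal_policy(temp_c: float) -> str:
    """Pick a protective action from the measured temperature.

    Thresholds (45, -10, 0 degrees C) are hypothetical placeholders for
    the three unspecified thresholds in the description.
    """
    if temp_c > 45.0:    # overheating: throttle the nearby processor
        return "reduce_performance"
    if temp_c < -10.0:   # deep cold: boost the battery output voltage
        return "boost_voltage"
    if temp_c < 0.0:     # cold: heat the battery
        return "heat_battery"
    return "normal"
```

Note the deep-cold check precedes the cold check so the more severe condition wins.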
The touch sensor 380K is also referred to as a "touch panel". The touch sensor 380K may be disposed on the display screen 394, and the touch sensor 380K and the display screen 394 form a touchscreen, also referred to as a "touch-controlled screen". The touch sensor 380K is configured to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 394. In other embodiments, the touch sensor 380K may alternatively be disposed on a surface of the electronic device 300 at a position different from that of the display screen 394.
In this embodiment of this application, when the second electronic device is placed on the touch screen of the first electronic device, the touch IC of the touch screen may detect the amount of change in the dielectric constant on the touch screen and the area over which the dielectric constant changes. The touch IC then sends the amount of change in the dielectric constant and the change area to the processor, and the processor determines whether the amount of change in the dielectric constant on the touch screen meets a preset first condition and whether the change area of the dielectric constant on the touch screen meets a preset second condition.
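A minimal sketch of the first and second preset conditions, assuming the first is a threshold on the dielectric-constant change and the second is a device-sized range for the change area; all numeric values are hypothetical, since the text leaves the conditions unspecified.

```python
def device_detected(delta_eps: float, change_area_cm2: float,
                    eps_threshold: float = 1.5,
                    area_range: tuple = (20.0, 120.0)) -> bool:
    """Decide whether an object on the touch screen looks like a phone.

    first condition:  dielectric-constant change exceeds a threshold
    second condition: change area falls within a phone-sized range
    (threshold and range are hypothetical placeholders)
    """
    first = delta_eps >= eps_threshold
    second = area_range[0] <= change_area_cm2 <= area_range[1]
    return first and second
```

Both conditions must hold before the processor treats the change as a second electronic device rather than, say, a finger or a stray object.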
The bone conduction sensor 380M may acquire a vibration signal. In some embodiments, the bone conduction sensor 380M may acquire a vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 380M may also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 380M may also be disposed in a headset, combined into a bone conduction headset. The audio module 370 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part obtained by the bone conduction sensor 380M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 380M, so as to implement a heart rate detection function.
The keys 390 include a power on key, a volume key, etc. Key 390 may be a mechanical key. Or may be a touch key. The electronic device 300 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 300.
The motor 391 may generate a vibration alert. The motor 391 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 391 may also correspond to different vibration feedback effects by touch operations applied to different areas of the display screen 394. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
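The mapping from applications and display areas to vibration feedback effects described above can be sketched as a lookup table with a default; the table entries and effect names are illustrative only, since the text only states that different applications and areas may correspond to different effects and that effects are customizable.

```python
# Hypothetical (application, area) -> vibration effect table.
VIBRATION_EFFECTS = {
    ("camera", "shutter"): "short_pulse",
    ("audio", "play"): "double_tick",
    ("alarm", "ring"): "long_buzz",
}

def feedback_effect(app: str, area: str) -> str:
    """Look up the vibration effect for a touch in a given app/screen area."""
    return VIBRATION_EFFECTS.get((app, area), "default_click")
```

Customization then amounts to letting the user edit entries in this table.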
The indicator 392 may be an indicator light, which may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 395 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 395 or removed from the SIM card interface 395 to achieve contact with and separation from the electronic device 300. The electronic device 300 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 395 may support a Nano-SIM card, a Micro-SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 395 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 395 may also be compatible with different types of SIM cards. The SIM card interface 395 may also be compatible with an external memory card. The electronic device 300 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 300 uses an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the electronic device 300 and cannot be separated from the electronic device 300.
The software system of the electronic device 300 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 300 is illustrated.
Fig. 4 shows a software architecture block diagram of an electronic device 300 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android Runtime and system libraries, and the kernel layer. The application layer may include a series of application packages.
As shown in fig. 4, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 4, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
The telephony manager is used to provide the communication functions of the electronic device 300, for example, management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to deliver message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system screen, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and so on.
The Android Runtime includes core libraries and a virtual machine. The Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the function libraries that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
For easy understanding, the following embodiments of the present application will take an electronic device having a structure shown in fig. 3 and fig. 4 as an example, and specifically describe a method for projecting a screen of the electronic device provided in the embodiments of the present application in conjunction with the accompanying drawings and application scenarios.
In one possible application scenario, a user wants to quickly project a display interface on a mobile phone onto a large-screen device; the method for projecting the screen of an electronic device provided by the present application can be used.
In another possible application scenario, a user wants to quickly project a plurality of application programs running on a mobile phone onto a large-screen device; the method for projecting the screen of an electronic device provided by the present application can likewise be used.
The first electronic device provided by the embodiments of the present application may be an electronic device with a relatively large display screen that includes a touch screen, for example, a large-screen device, a tablet computer, a PC, an in-vehicle head unit, a smart screen, a folding screen device, and the like.
The method for projecting the screen of an electronic device provided by the present application is specifically described below by taking the first electronic device as a folding screen device and the second electronic device as a mobile phone. The folding screen device comprises a first screen area, a folding area, and a second screen area, where the first screen area and the second screen area are touch screen areas, and the mobile phone can be placed on either the first screen area or the second screen area.
Fig. 5 is a schematic diagram illustrating an example of a display interface of an electronic device according to an embodiment of the present application. As shown in fig. 5 (a), a mobile phone is placed in a first area on the second screen area of the folding screen device, and it is assumed that application A, application B, and application C are running on the mobile phone. When the mobile phone is placed on the second screen area of the folding screen device, the first electronic device obtains at least one of the following parameters: the amount of change of the dielectric constant of a first area of the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device. When the amount of change of the dielectric constant of the first area meets a preset first condition, the area of the first area meets a preset second condition, or the weight of the second electronic device meets a preset third condition, application A, application B, and application C running on the second electronic device can be automatically projected onto the first electronic device. That is, as shown by the interface in fig. 5 (b), the second screen area of the first electronic device can display application A, application B, and application C running on the second electronic device, and the user may operate an application running on the second electronic device within the second screen area of the first electronic device.
The following describes how the first electronic device obtains the first parameter, and how the first electronic device determines whether the first parameter meets the preset condition.
Optionally, when the first parameter is at least one of the amount of change of the dielectric constant of the first area of the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device, the first electronic device determines whether the amount of change of the dielectric constant of the first area of the touch screen meets the preset first condition, whether the area of the first area meets the preset second condition, or whether the weight of the second electronic device meets the preset third condition.
First, when the second electronic device is placed on the second screen area of the first electronic device (the folding screen device), the touch screen in the second screen area can sense a change in current transmission, and the touch IC on the touch screen can detect the dielectric constant signal in the second screen area. When the amount of change of the dielectric constant of the first area of the touch screen detected by the first electronic device meets the preset first condition, this indicates that the object placed on the second screen area is an electronic device.
The preset first condition may be, for example, 1.5ε to 2.2ε. Of course, the preset first condition may also be another range of values, which is not limited in the embodiments of the present application.
The principle of the dielectric constant change when the second electronic device is placed on the touch screen of the first electronic device will be described below.
Fig. 6 shows a schematic diagram of the change in dielectric constant of a capacitive touch panel. As shown in fig. 6, when a user touches a capacitive screen, a coupling capacitance is formed between the user's finger and the working surface because of the electric field of the human body. Since a high-frequency signal is applied to the working surface, the finger draws a small current, which flows out through the electrodes at the four corners of the screen. In theory, the current flowing through each of the four electrodes is proportional to the distance from the finger to the corresponding corner, so the exact position at which the user touches the capacitive screen can be obtained by precisely calculating the ratio of the four currents.
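As an illustration of this principle, the touch position can be recovered from the ratio of the four corner currents. The following sketch uses a deliberately simplified linear model; the function name, parameters, and weighting scheme are illustrative assumptions, not taken from any real touch-controller firmware.

```python
def touch_position(i_tl, i_tr, i_bl, i_br, width, height):
    """Estimate touch coordinates on a surface-capacitive panel from the
    currents drawn through the four corner electrodes (simplified model).

    i_tl, i_tr, i_bl, i_br: currents at the top-left, top-right,
    bottom-left, and bottom-right electrodes.
    """
    total = i_tl + i_tr + i_bl + i_br
    # The fraction of current flowing through the right-hand electrodes
    # grows as the finger moves right; likewise the fraction through the
    # bottom electrodes grows as the finger moves down.
    x = width * (i_tr + i_br) / total
    y = height * (i_bl + i_br) / total
    return x, y
```

Under this model, equal currents at all four corners place the touch at the center of the panel.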
Similarly, any article whose dielectric constant is greater than zero is a non-insulator, and when any non-insulator is placed on the touch screen, the electric field of the capacitive touch screen changes; that is, the dielectric constant on the touch screen changes. When an article such as a mobile phone, an earphone, or a plastic case is placed on the screen, the capacitive screen senses a change in current transmission, which causes a change in the dielectric constant on the touch screen. The change in dielectric constant on the touch screen can therefore be used to identify the object placed on it.
Therefore, in the embodiments of the present application, when the mobile phone is placed directly on the second screen area, the amount of change of the dielectric constant of the first area of the touch screen of the first electronic device can be obtained through the touch IC, and based on this amount of change it is determined whether the object placed on the second screen area is an electronic device.
For example, what contacts the touch screen is the mobile phone housing, which is usually made of glass, metal, plastic, or the like. When the touch IC determines that the amount of change of the dielectric constant of the first area of the touch screen of the first electronic device corresponds to the dielectric constant of a material such as glass, metal, or plastic, it can be judged from this amount of change whether the object placed on the second screen area is an electronic device.
Based on the principle of the change of the dielectric constant, in the embodiment of the present application, the size of the second electronic device may be determined by using the area of the first area of the touch screen of the first electronic device, and then whether the object is an electronic device may be determined according to the size.
The area over which the dielectric constant of the touch screen of the first electronic device changes corresponds to the size of the second electronic device placed on the first electronic device, so the size of the object can be used to further judge whether the object is an electronic device. When the area over which the dielectric constant of the touch screen of the first electronic device changes meets the preset second condition, this indicates that the object is an electronic device.
The preset second condition may be, for example, 82.28 cm² to 300 cm². Of course, the preset second condition may also be another range of values, which is not limited in the embodiments of the present application.
In the embodiments of the present application, the first electronic device may also determine whether the object is an electronic device by measuring the weight of the second electronic device placed on the second screen area. That is, when the weight of the object meets a preset third condition, this indicates that the object is an electronic device.
Specifically, the touch screen of the first electronic device includes a gravity sensor, the gravity sensor can detect the weight of the object, and the first electronic device judges whether the object is the electronic device according to the weight detected by the gravity sensor.
The preset third condition may be, for example, 175 g to 2000 g. Of course, the preset third condition may also be another range of values, which is not limited in the embodiments of the present application.
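Putting the three example ranges together, the placement check described above can be sketched as follows. This is a minimal illustration assuming the "at least one condition holds" semantics; the function names and the treatment of the dielectric change as a plain numeric value are assumptions made for the sketch.

```python
# Example ranges taken from the preset conditions described in the text.
PERMITTIVITY_RANGE = (1.5, 2.2)    # change in dielectric constant, in units of epsilon
AREA_RANGE = (82.28, 300.0)        # area of the first (contact) area, in cm^2
WEIGHT_RANGE = (175.0, 2000.0)     # weight of the placed object, in grams

def _in_range(value, bounds):
    # A missing parameter (None) never satisfies its condition.
    return value is not None and bounds[0] <= value <= bounds[1]

def is_electronic_device(delta_eps=None, area_cm2=None, weight_g=None):
    """True when at least one acquired parameter falls in its preset range."""
    return (_in_range(delta_eps, PERMITTIVITY_RANGE)
            or _in_range(area_cm2, AREA_RANGE)
            or _in_range(weight_g, WEIGHT_RANGE))
```

When this check passes, the first electronic device would proceed to display the applications running on the second electronic device, as described above.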
In summary, when at least one of the following holds: the amount of change of the dielectric constant of the first area of the touch screen of the first electronic device satisfies the preset first condition, the area of the first area satisfies the preset second condition, or the weight of the second electronic device satisfies the preset third condition, the first electronic device may switch from the interface shown in fig. 5 (a) to the interface shown in fig. 5 (b). At least one application running on the second electronic device, such as application A, application B, and application C, is then displayed within the second screen area of the first electronic device.
In some embodiments, in order to improve the accuracy of screen projection and protect the user's privacy, when at least one of the following holds: the amount of change of the dielectric constant of the first area of the touch screen of the first electronic device meets the preset first condition, the area of the first area meets the preset second condition, or the weight of the second electronic device meets the preset third condition, the first electronic device sends an authorization request to the second electronic device, and only when the second electronic device agrees to the authorization can at least one application running on the second electronic device be displayed on the first electronic device.
Fig. 7 is an exemplary schematic diagram of an interface display of a first electronic device provided in an embodiment of the present application. The second electronic device is placed in the first area of the second screen area of the first electronic device, and it is assumed that application A, application B, and application C are running on the second electronic device. When the first parameter acquired by the first electronic device meets the preset condition, as shown in fig. 7 (b), a multi-screen authorization popup window is displayed on the second electronic device, prompting the user that, after permission is confirmed, the folding screen device will display the application interfaces of the mobile phone. When the user clicks to agree, the interface of the first electronic device switches from the interface shown in fig. 7 (a) to the interface shown in fig. 7 (b); that is, application A, application B, and application C running on the second electronic device are displayed in the second screen area of the first electronic device.
In other embodiments, when at least one of the following holds: the amount of change of the dielectric constant of the first area of the touch screen of the first electronic device meets the preset first condition, the area of the first area meets the preset second condition, or the weight of the second electronic device meets the preset third condition, the first electronic device may, in order to further improve the accuracy of screen projection, also obtain the time for which the second electronic device has been placed. When the time for which the second electronic device has been placed on the touch screen of the first electronic device is greater than or equal to a preset first threshold, at least one application running on the second electronic device is displayed on the first electronic device.
It should be understood that the preset first threshold may be set according to a specific situation, and embodiments of the present application are not limited.
Specifically, the weight of the second electronic device may be obtained in real time by using a gravity sensor integrated in the touch screen of the first electronic device. When the value of the gravity sensor remains at a fixed value over a certain time period, this indicates that the second electronic device has been resting on the first electronic device throughout that period. The length of this time period can then be compared with the preset first threshold to judge whether the threshold is met.
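The dwell-time check described above can be sketched as follows. This is a hypothetical illustration: the sampling format and the tolerance used to treat a reading as "fixed" are assumptions, not details taken from the embodiment.

```python
def placement_duration(samples, tolerance_g=2.0):
    """Given (timestamp_s, weight_g) gravity-sensor samples, return how long
    the latest reading has stayed within tolerance_g of a fixed value,
    i.e., how long the object has been resting steadily on the touch screen."""
    if not samples:
        return 0.0
    last_t, last_w = samples[-1]
    start = last_t
    # Walk backwards while the reading stays essentially constant.
    for t, w in reversed(samples[:-1]):
        if abs(w - last_w) > tolerance_g:
            break
        start = t
    return last_t - start

def dwell_met(samples, threshold_s):
    """True when the stable-placement duration reaches the preset first threshold."""
    return placement_duration(samples) >= threshold_s
```

A brief fluctuation (e.g., the phone being picked up and put back) resets the measured duration, which is what filters out accidental contact.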
Optionally, as a possible implementation manner, when the first electronic device determines that the time for which the second electronic device is placed is greater than or equal to a preset first threshold, the first electronic device actively sends an authorization request to the second electronic device, and when the second electronic device agrees to the authorization, an application running on the second electronic device may be displayed on the first electronic device.
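The sequence in this implementation, dwell threshold first and authorization second, can be sketched as below. The names are illustrative, and `ask_user` merely stands in for the authorization popup on the second device.

```python
def maybe_project(dwell_seconds, threshold_s, ask_user):
    """Decide what the first electronic device does after a placement.

    ask_user: callable modelling the authorization popup on the second
    device; returns True when the user agrees to the authorization.
    """
    if dwell_seconds < threshold_s:
        return "not_yet"            # keep waiting; the placement may be accidental
    if ask_user():
        return "display_applications"
    return "declined"
```

Only when both gates pass are the applications running on the second electronic device displayed on the first.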
In some embodiments, when a plurality of applications running on the second electronic device are displayed on the second screen area of the first electronic device, the user may open any one of the applications in the second screen area so that it is displayed in an enlarged manner. Fig. 8 is a schematic diagram illustrating an interface display of a first electronic device according to an embodiment of the present application. As shown in fig. 8 (a), application A, application B, and application C are displayed on the first electronic device; assume that application A is a WeChat application, application B is a video application, and application C is a photo application. When the user wants to view the video enlarged on the first electronic device, the user may click on application B, after which the second screen area of the first electronic device switches from the interface shown in fig. 8 (a) to the interface shown in fig. 8 (b); that is, the video can be enlarged in the second screen area for the user to view.
It will be appreciated that, since the second electronic device is physically placed on the second screen area of the first electronic device, when the user views the video in an enlarged window in the second screen area, the window is partially blocked by the device once it is enlarged to a certain size. Moreover, since other applications are also displayed in the second screen area, displaying the enlarged window there affects the user experience to a certain extent.
Therefore, in other embodiments of the present application, when a plurality of application programs running on the second electronic device are displayed in the second screen area of the first electronic device, the user may display any opened application program in the first screen area, thereby implementing a multi-screen collaboration function. In this way, viewing a particular application does not affect the display of the other applications, which improves the user experience. As an example, fig. 9 shows a schematic diagram of an interface display of a first electronic device provided in an embodiment of the present application. As shown in fig. 9 (a), application A, application B, and application C are displayed on the first electronic device; assume that application A is a WeChat application, application B is a video application, and application C is a photo application. When the user wants to view application B enlarged without affecting the other applications, the user may drag application B upward, and when application B is dragged into the first screen area, the interface of the first electronic device switches from the interface shown in fig. 9 (a) to the interface shown in fig. 9 (b); that is, the interface of application B is displayed in the first screen area of the first electronic device.
According to the above method, it is judged whether the amount of change of the dielectric constant on the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device meets the preset condition, and when the preset condition is met, the first electronic device and the second electronic device are connected quickly, which alleviates the problem of cumbersome connection steps in the related art. Moreover, after the first electronic device and the second electronic device are connected, at least one application program running on the second electronic device can be displayed directly on the first electronic device, which further solves the problem in the related art of long interaction paths caused by having to re-open on the first electronic device the applications running on the second electronic device, thereby improving the user experience.
The present application also provides another method for projecting the screen of an electronic device. When the second electronic device is placed on the touch screen of the first electronic device, the camera of the second electronic device detects the first electronic device; when it detects that the surface of the object facing it is an electronic screen, it sends a connection request to the first electronic device, and after the first electronic device agrees to the connection request, a connection between the first electronic device and the second electronic device is established. The applications running on the second electronic device can then be automatically projected onto the first electronic device, which alleviates the problem of cumbersome connection steps in the related art. After the first electronic device and the second electronic device are connected, at least one application running on the second electronic device can be displayed directly on the first electronic device, which further solves the problem in the related art of long interaction paths caused by having to re-open those applications on the first electronic device, thereby improving the user experience.
Fig. 5 to fig. 9 describe the specific process of establishing a screen-projection connection between the first electronic device and the second electronic device. After the connection is established, at least one application program on the second electronic device is projected onto the first electronic device and may then be displayed in various forms, described in detail below with reference to fig. 10 to fig. 14.
Fig. 10 is a schematic diagram of a display interface of an electronic device provided in an embodiment of the present application. As shown in fig. 10 (a), a plurality of application programs running on the second electronic device may be displayed on the first electronic device in a stacked manner; as shown in fig. 10 (b), they may also be displayed on the first electronic device in a tiled manner. Moreover, as can be seen from fig. 10, the windows corresponding to the application programs are independent of each other, and the user can operate each window on the first electronic device; for example, the user can close each window individually.
Fig. 11 is a schematic diagram of a display interface of another electronic device provided in an embodiment of the present application. As shown in fig. 11 (a), when a plurality of application programs running on the second electronic device are displayed on the first electronic device in a stacked manner, the user can click any function in a window to display that window's secondary interface on the same screen. For example, when the second screen area of the first electronic device displays window 1, window 2, and window 3, the user may click any function in window 3 to display the secondary interface of window 3 on the screen. As shown in fig. 11 (b), the user may also click any function in the secondary interface to display the tertiary interface of the window on the same screen; that is, the user can click any function in the secondary interface of window 3 and have the tertiary interface of window 3 displayed on the same screen.
Fig. 12 is a schematic diagram of a display interface of another electronic device provided in an embodiment of the present application. As shown in fig. 12 (a), when a plurality of application programs running on the second electronic device are displayed on the first electronic device in a tiled manner, the user can click any function in a window to display that window's secondary interface on the same screen. For example, when the second screen area of the first electronic device displays window 1 through window 7, the user may click any function in window 7 to display the secondary interface of window 7 on the screen. As shown in fig. 12 (b), the user may also click any function in the secondary interface to display the tertiary interface of the window on the same screen; that is, the user can click any function in the secondary interface of window 7 and have the tertiary interface of window 7 displayed on the same screen.
In some possible implementations, to facilitate comparing different content of the same page, a plurality of windows of the same application may be displayed on the first electronic device. Fig. 13 is a schematic diagram of a display interface of another electronic device according to an embodiment of the present application. As shown in fig. 13 (a), when a plurality of applications running on the second electronic device are displayed on the first electronic device in a stacked manner, the user can display multiple identical windows on the same screen by opening a window again. For example, when the second screen area of the first electronic device displays window 1, window 2, and window 3, the user may click window 3 again, and two primary interfaces of window 3 are shown on the same screen. As shown in fig. 13 (b), when the applications are displayed in a tiled manner, the user can likewise display multiple identical windows by opening a window again; that is, the user can click window 7 again, and two primary interfaces of window 7 are shown on the same screen.
In some possible implementations, to facilitate comparative analysis of different windows, sub-pages of more than one window may be compared on the same screen, or different sub-pages of the same window may be compared. Fig. 14 is a schematic diagram of a display interface of another electronic device according to an embodiment of the present application. As shown in fig. 14 (a), when window 1 and window 2 running on the second electronic device are displayed on the first electronic device, the user may click any item in the primary interface of window 1 to open the secondary interface of window 1, and click any item in the primary interface of window 2 to open the secondary interface of window 2, so that the two secondary interfaces can be compared. As shown in fig. 14 (b), when window 1 and window 2 are displayed on the first electronic device, the user may click any item in the primary interface of window 2 to open the secondary interface of window 2, and compare the primary interface of window 2 with its secondary interface.
It should be understood that the above examples and the terminal interfaces, user operations, and the like are merely illustrative and do not constitute a specific limitation on the embodiments of the present application. For example, in other embodiments of the present application, the interfaces displayed by the respective terminals may include more or fewer icons than those shown in any of the foregoing figures, some icons may be combined or split, or different icons may be displayed. The embodiments of the present application are not limited in this regard.
It should also be understood that any of the above examples, or the solution shown in any of the figures, may serve as a separate solution, and any combination of the above examples or of the solutions shown in the figures may also serve as a separate solution, which is not limited herein.
Based on the above application scenario, the electronic devices shown in fig. 3 and fig. 4, and the above examples, the following describes a step flow of a method for projecting a screen of an electronic device provided in the present application. A specific implementation is described below in conjunction with fig. 15.
Fig. 15 shows a flowchart of an implementation of a method 1500 for screen projection of an electronic device according to an embodiment of the present application. It should be understood that the steps shown in fig. 15 may be implemented by the first electronic device or a chip configured in the first electronic device. Specifically, the method 1500 includes: step S1510 to step S1520.
S1510, the first electronic device obtains a first parameter, where the first parameter includes at least one of: the change amount of the dielectric constant of the first area of the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device.
It should be noted that, for the specific manner in which the first electronic device obtains the amount of change of the dielectric constant of the first area of its touch screen, the area of the first area, or the weight of the second electronic device, reference may be made to the description corresponding to fig. 5 (a), which is not repeated here.
S1520, when the first parameter meets the preset condition, at least one application program running on the second electronic device is displayed on the first electronic device.
In one possible implementation, the preset condition may be at least one of the following: the amount of change of the dielectric constant of the first area of the touch screen of the first electronic device meets a preset first condition, the area of the first area meets a preset second condition, or the weight of the second electronic device meets a preset third condition.
For the specific manner in which the first electronic device determines whether the variation of the dielectric constant of the first area of its touch screen meets the preset first condition, whether the area of the first area meets the preset second condition, or whether the weight of the second electronic device meets the preset third condition, reference may be made to the description corresponding to diagram (a) in fig. 5, which is not repeated herein.
According to the method provided by this embodiment of the present application, it is determined whether the change amount of the dielectric constant on the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device meets the preset condition. When the preset condition is met, a connection between the first electronic device and the second electronic device is quickly established, which alleviates the problem of cumbersome connection steps in the related art. After the connection is established, at least one application program running on the second electronic device can be displayed directly on the first electronic device, which further solves the problem in the related art of a long interaction path caused by having to re-open, on the first electronic device, an application already running on the second electronic device, thereby improving the user experience.
In some exemplary embodiments, when the first parameter satisfies the preset condition, before the application running on the second electronic device is displayed on the first electronic device, the method further includes: when the first parameter meets the preset condition, the first electronic device determines how long the second electronic device has been placed on the touch screen; and when that time is greater than or equal to a preset first threshold, the application running on the second electronic device is displayed on the first electronic device. In this implementation, determining how long the second electronic device has been placed on the first electronic device reduces misrecognition by the first electronic device and improves the accuracy of screen projection.
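The placement-time check described above is essentially a debounce. A minimal sketch, assuming a hypothetical 1.5-second first threshold and invented method names:

```python
import time

DWELL_THRESHOLD_S = 1.5  # hypothetical value of the "preset first threshold"

class PlacementTimer:
    """Tracks how long the second device has rested on the touch screen."""

    def __init__(self):
        self._placed_at = None  # monotonic timestamp of placement, or None

    def on_placed(self, now=None):
        # Record when the second device was set down on the screen.
        self._placed_at = time.monotonic() if now is None else now

    def on_removed(self):
        # Lifting the device cancels the pending projection.
        self._placed_at = None

    def should_project(self, now=None) -> bool:
        """Project only if the device has stayed put long enough."""
        if self._placed_at is None:
            return False
        now = time.monotonic() if now is None else now
        return (now - self._placed_at) >= DWELL_THRESHOLD_S
```

A brief touch (a device slid across the screen, or set down and immediately picked up) never reaches the threshold, which is how this step reduces misrecognition.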
In some exemplary embodiments, when the first parameter satisfies the preset condition, before the at least one application running on the second electronic device is displayed on the first electronic device, the method further includes: the first electronic device sends an authorization request to the second electronic device; and upon receiving an authorization message sent by the second electronic device, the application running on the second electronic device is displayed on the first electronic device. For an example in which the first electronic device sends the authorization request, receives the authorization message, and then displays the application running on the second electronic device, reference may be made to the display interface shown in fig. 7. In this implementation, through the authorization confirmation between the first electronic device and the second electronic device, the screen is projected only when the user actually intends to project it, which further protects the user's privacy and improves the accuracy of screen projection.
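The authorization exchange above can be sketched as a simple request/reply over whatever transport the two devices share. Message names (`AUTH_REQUEST`, `AUTH_GRANTED`) and the `send`/`receive` callables are illustrative assumptions, not part of the application.

```python
def request_projection(send, receive) -> bool:
    """First device asks the second device for permission to mirror its apps.

    `send` and `receive` stand in for the devices' shared transport.
    """
    send({"type": "AUTH_REQUEST"})        # hypothetical message name
    reply = receive()
    return reply.get("type") == "AUTH_GRANTED"

def run_projection_handshake(outbox, inbox) -> str:
    # Using plain lists as a stand-in channel for illustration.
    if request_projection(outbox.append, inbox.pop):
        return "display remote apps"
    return "do nothing"
```

Only a positive reply leads to projection; any other reply (or a denial) leaves the first device's screen unchanged, which is the privacy property the paragraph describes.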
In some exemplary embodiments, the touch screen includes a first screen area and a second screen area, and when the first electronic device displays the application running on the second electronic device in the second screen area, the method further includes: receiving a first operation of a user on a first application, the first application being any one of the at least one application; and in response to the first operation, displaying, by the first electronic device, the first application in the first screen area. For example, for receiving the first operation on the first application, reference may be made to the display interface shown in diagram (a) of fig. 9, and for displaying the first application in the first screen area in response to the first operation, reference may be made to the display interface shown in diagram (b) of fig. 9; fig. 9 takes a video application as an example. In this implementation, when a plurality of applications are displayed in the second screen area as small windows, the first application can be displayed enlarged in the first screen area without affecting the display of the other applications, thereby improving the user experience.
In some exemplary embodiments, the first electronic device displays the at least one application running on the second electronic device in a superimposed or tiled manner. For example, reference may be made to the display interfaces shown in diagrams (a) and (b) of fig. 10.
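As an illustration of the "tiled" arrangement, a sketch that splits a display region into equal side-by-side rectangles, one per projected application (the layout rule is an assumption; the application does not specify how tiles are computed):

```python
def tile_windows(n_apps: int, region_w: int, region_h: int):
    """Return (x, y, w, h) rectangles tiling the region left to right,
    one rectangle per projected application window."""
    if n_apps == 0:
        return []
    w = region_w // n_apps  # equal-width columns; purely illustrative
    return [(i * w, 0, w, region_h) for i in range(n_apps)]
```

A superimposed layout would instead return overlapping rectangles stacked with a small offset; the choice between the two is a presentation decision, as fig. 10 shows.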
In some exemplary embodiments, the first electronic device displays a multi-level interface of a first application running on the second electronic device, the first application being any one of the at least one application. For example, reference may be made to the interfaces shown in diagrams (a) and (b) of fig. 11, diagrams (a) and (b) of fig. 12, diagrams (a) and (b) of fig. 13, and diagrams (a) and (b) of fig. 14.
Fig. 16 shows a flowchart of an implementation of a method 1600 for screen projection of another electronic device according to an embodiment of the present application. It should be understood that the steps shown in fig. 16 may be implemented by the second electronic device or a chip integrated in the second electronic device. Specifically, the method 1600 includes: step S1610 to step S1620.
S1610, the second electronic device detects the contact surface of the first electronic device and determines whether the contact surface is an electronic screen, where the contact surface is the surface at which the touch screen of the first electronic device contacts the second electronic device.
In this embodiment of the present application, the second electronic device may detect the contact surface of the first electronic device, and determine whether the contact surface is an electronic screen.
The embodiment of the present application does not limit the manner in which the second electronic device detects the contact surface of the first electronic device; for example, the contact surface may be identified by a camera mounted on the second electronic device.
It should also be noted that the type of the camera is not limited in the embodiment of the present application.
Optionally, as a possible implementation manner, when the second electronic device is placed on the first electronic device, an infrared camera on the second electronic device may be used to detect a contact surface of the first electronic device that contacts the second electronic device, and determine whether the contact surface is an electronic screen.
It should be understood that the infrared camera detects and recognizes the content of the captured picture through its ISO sensitivity and focal length. When the second electronic device is placed on a surface that is not an electronic screen, no light enters the camera, so the ISO reading is effectively zero. The infrared camera can therefore be used to judge whether the contact surface is an electronic screen.
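The heuristic above reduces to a single threshold test: a lit electronic screen emits light toward the camera, while a non-electronic surface does not. A minimal sketch, in which the sensor reading and the zero threshold are illustrative assumptions:

```python
ISO_DARK_THRESHOLD = 0  # hypothetical: no measurable light implies a non-electronic surface

def contact_surface_is_screen(iso_reading: int) -> bool:
    """Treat any measurable light from the contact surface as evidence that
    the second device is lying face-down on a lit electronic screen."""
    return iso_reading > ISO_DARK_THRESHOLD
```

A real implementation would read `iso_reading` from the infrared camera's exposure metering and would likely need additional checks (e.g. a screen that is currently off would also read dark), which the application does not detail.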
S1620, when the contact surface is an electronic screen, the second electronic device displays the running at least one application program on the first electronic device.
According to the method for projecting a screen of an electronic device provided in this embodiment of the present application, the second electronic device detects the surface in contact with the first electronic device, and when that surface is an electronic screen, a connection between the first electronic device and the second electronic device is quickly established, which alleviates the problem of cumbersome connection steps in the related art. After the connection is established, at least one application running on the second electronic device can be displayed directly on the first electronic device, which further solves the problem in the related art of a long interaction path caused by having to re-open, on the first electronic device, an application already running on the second electronic device, thereby improving the user experience.
Optionally, in one possible implementation, when the second electronic device determines that the contact surface is an electronic screen, the second electronic device first sends a connection request to the first electronic device; after receiving the connection request, the first electronic device sends a connection establishment message to the second electronic device; and based on the connection establishment message, the second electronic device displays the running at least one application on the first electronic device. In this implementation, through the confirmation between the first electronic device and the second electronic device, the screen is projected only when the user intends to project it, which further protects the user's privacy and improves the accuracy of screen projection.
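The connect-then-project flow above, seen from the second electronic device's side, can be sketched as a small state progression. Message names and the `send`/`receive` callables are hypothetical stand-ins for the devices' actual transport.

```python
def second_device_flow(is_electronic_screen: bool, send, receive) -> bool:
    """Run S1610/S1620 from the second device's perspective.

    Returns True when projection starts, False otherwise.
    """
    if not is_electronic_screen:
        return False                       # contact surface is not a screen: do nothing
    send({"type": "CONNECT_REQUEST"})      # hypothetical message name
    reply = receive()
    if reply.get("type") != "CONNECTION_ESTABLISHED":
        return False                       # first device declined or failed
    send({"type": "PROJECT_RUNNING_APPS"}) # begin displaying running apps remotely
    return True
```

Projection happens only after both the screen-detection result and the first device's connection establishment message are positive, mirroring the two-step confirmation described above.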
It should be understood that the foregoing is only intended to assist those skilled in the art in better understanding the embodiments of the present application and is not intended to limit the scope of the embodiments of the present application. It will be apparent to those skilled in the art from the foregoing examples that various equivalent modifications or variations are possible; for example, some steps of the methods described above may be unnecessary, some steps may be added, or any two or more of the above embodiments may be combined. Such modifications, variations, or combinations are also within the scope of the embodiments of the present application.
It should also be understood that the manner, condition, class and division of the embodiments in the embodiments of the present application are for convenience of description only and should not be construed as being particularly limited, and the various manners, classes, conditions and features of the embodiments may be combined without contradiction.
It should also be understood that the various numbers referred to in the embodiments of the present application are merely for convenience of description and are not intended to limit the scope of the embodiments of the present application. The sequence numbers of the above processes do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
It should also be understood that the foregoing description of embodiments of the present application focuses on highlighting differences between the various embodiments and that the same or similar elements not mentioned may be referred to each other and are not described in detail herein for brevity.
Embodiments of the present application also provide a chip system, as shown in fig. 17, that includes at least one processor 1701 and at least one interface circuit 1702. The processor 1701 and the interface circuit 1702 may be interconnected by wires. For example, the interface circuit 1702 may be used to receive signals from other devices, such as a memory of any of the electronic devices described above. For another example, the interface circuit 1702 may be used to send signals to other devices, such as the processor 1701. The interface circuit 1702 may, for example, read instructions stored in a memory and send the instructions to the processor 1701. The instructions, when executed by the processor 1701, may cause the electronic device to perform the steps performed by any of the electronic devices (e.g., mobile phone, large-screen device, tablet computer, in-vehicle device, PC, etc.) in the above embodiments. Of course, the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
It should also be understood that the division of the units in the above apparatus is merely a division by logical function; in actual implementation, the units may be fully or partially integrated into one physical entity or may be physically separate. The units in the apparatus may all be implemented in the form of software invoked by a processing element, or all in hardware, or some units may be implemented as software invoked by a processing element while others are implemented in hardware. For example, each unit may be a separately established processing element, may be integrated into a chip of the apparatus, or may be stored in a memory in the form of a program whose function is invoked and executed by a processing element of the apparatus. The processing element, which may also be referred to herein as a processor, may be an integrated circuit with signal-processing capability. In implementation, each step of the above method, or each of the above units, may be implemented by an integrated logic circuit of hardware in a processor element or in the form of software invoked by a processing element. In one example, the units in any of the above apparatuses may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (application specific integrated circuit, ASIC), one or more digital signal processors (digital signal processor, DSP), one or more field programmable gate arrays (field programmable gate array, FPGA), or a combination of at least two of these integrated-circuit forms. For another example, when the units in the apparatus are implemented in the form of a program scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (central processing unit, CPU), or another processor that can invoke the program.
For another example, the units may be integrated together and implemented in the form of a system-on-a-chip (SOC).
The embodiment of the application also provides an apparatus, which is included in an electronic device (for example, a first electronic device or a second electronic device), and has a function of implementing the behavior of the electronic device in any of the above embodiments. The functions can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes at least one module or unit corresponding to the functions described above.
The application also provides electronic equipment (for example, a first electronic equipment or a second electronic equipment), which comprises the device provided by the embodiment of the application.
Embodiments of the present application also provide a computer-readable storage medium storing computer program code, where the computer program includes instructions for performing the steps performed by an electronic device (e.g., the first electronic device or the second electronic device) in any of the embodiments provided herein. The readable medium may be a read-only memory (ROM) or a random access memory (random access memory, RAM), which is not limited in the embodiments of the present application.
Embodiments of the present application also provide a computer program product for causing an electronic device to perform the steps performed by the electronic device in any of the embodiments described above when the computer program product is run on the electronic device.
Embodiments of the present application also provide a graphical user interface on an electronic device with a display screen, a camera, a memory, and one or more processors to execute one or more computer programs stored in the memory, the graphical user interface comprising a graphical user interface displayed by the electronic device when performing steps performed by the electronic device as in any of the embodiments described above.
It will be appreciated that, in order to implement the above-mentioned functions, the above-mentioned terminal device includes a hardware structure and/or a software module for executing the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the present application may divide the functional modules of the terminal device and the like according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above. The specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not described herein.
The functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard disk, read-only memory, random access memory, magnetic or optical disk, and the like.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A method for projecting a screen of an electronic device, the method being applied to a first electronic device, the first electronic device including a touch screen, a second electronic device being disposed on a first area of the touch screen, the method comprising:
the first electronic device obtains a first parameter, wherein the first parameter comprises at least one of the following: the change amount of the dielectric constant of a first area of the touch screen of the first electronic device, the area of the first area, or the weight of the second electronic device;
and when the first parameter meets a preset condition, displaying at least one application program running on the second electronic device on the first electronic device.
2. The method of claim 1, wherein the preset condition includes at least one of the following: the change amount of the dielectric constant of the first area of the touch screen of the first electronic device meets a preset first condition, the area of the first area meets a preset second condition, or the weight of the second electronic device meets a preset third condition.
3. The method according to claim 1 or 2, wherein when the first parameter meets a preset condition, the method further comprises, before displaying at least one application running on the second electronic device on the first electronic device:
the first electronic device determines the time for placing the second electronic device on the touch screen;
and when the time for placing the second electronic equipment on the touch screen is greater than or equal to a preset first threshold value, displaying an application program running on the second electronic equipment on the first electronic equipment.
4. The method of any of claims 1-3, wherein when the first parameter meets a preset condition, the method further comprises, prior to displaying at least one application running on the second electronic device on the first electronic device:
the first electronic device sends an authorization request to the second electronic device;
and receiving an authorization message sent by the second electronic equipment, and displaying an application program running on the second electronic equipment on the first electronic equipment.
5. The method of any of claims 1-4, wherein the touch screen includes a first screen area and a second screen area, the first electronic device when the second screen area displays an application running on the second electronic device, the method further comprising:
receiving a first operation of a user on a first application program, wherein the first application program is any one of the at least one application program;
in response to the first operation, the first electronic device displays the first application on the first screen region.
6. The method of any of claims 1-5, wherein the first electronic device displays at least one application running on the second electronic device in a superimposed or tiled manner.
7. The method of any of claims 1-6, wherein the first electronic device displays a multi-level interface of a first application running on the second electronic device, the first application being any of the at least one application.
8. A method for projecting a screen of an electronic device, the method being applied to a second electronic device, the second electronic device being placed on a touch screen of a first electronic device, the method comprising:
the second electronic device detects a contact surface of the first electronic device, and determines whether the contact surface is an electronic screen, wherein the contact surface is a surface of the touch screen of the first electronic device, which is in contact with the second electronic device;
and when the contact surface is an electronic screen, the second electronic device displays the running at least one application program on the first electronic device.
9. The method of claim 8, wherein before the second electronic device displays the running at least one application on the first electronic device, the method further comprises:
the second electronic equipment sends a connection request to the first electronic equipment;
and responding to the connection establishment message sent by the first electronic device, and displaying the running at least one application program on the first electronic device by the second electronic device.
10. The method of claim 8 or 9, wherein the second electronic device detects the contact surface of the first electronic device using an infrared camera.
11. A communication device comprising means for performing the steps of the method according to any of claims 1 to 7 or means for performing the steps of the method according to any of claims 8 to 10.
12. An electronic device comprising a processor and a memory for storing instructions, the processor for reading the instructions to perform the method of any one of claims 1 to 7 or for performing the method of any one of claims 8 to 10.
13. A computer readable storage medium, in which a computer program is stored, the computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 7 or for performing the method of any one of claims 8 to 10.
14. A chip, comprising: a processor for calling and running a computer program from a memory, causing a communication device on which the chip is mounted to perform the method of any one of claims 1 to 7 or for performing the method of any one of claims 8 to 10.
CN202211625815.XA 2022-12-13 2022-12-13 Method for projecting screen of electronic equipment, communication device and electronic equipment Pending CN116204145A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211625815.XA CN116204145A (en) 2022-12-13 2022-12-13 Method for projecting screen of electronic equipment, communication device and electronic equipment


Publications (1)

Publication Number Publication Date
CN116204145A true CN116204145A (en) 2023-06-02

Family

ID=86513779


Country Status (1)

Country Link
CN (1) CN116204145A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination