CN115567630A - Management method of electronic equipment, electronic equipment and readable storage medium - Google Patents

Management method of electronic equipment, electronic equipment and readable storage medium

Info

Publication number
CN115567630A
CN115567630A
Authority
CN
China
Prior art keywords
function
electronic device
interface
display
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210013508.XA
Other languages
Chinese (zh)
Other versions
CN115567630B (en)
Inventor
于志新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210013508.XA (granted as CN115567630B)
Priority to PCT/CN2022/143843 (published as WO2023131070A1)
Publication of CN115567630A
Application granted
Publication of CN115567630B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423: Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephone Function (AREA)

Abstract

Embodiments of this application provide a method for managing an electronic device, an electronic device, and a readable storage medium, and belong to the field of terminal technologies. In the method, a first electronic device detects a first operation on the first electronic device, where the first operation requests to turn on a first function of the first electronic device, and the first function causes a first display screen and a second display screen of the first electronic device to display the same picture. In response to the first operation, the first electronic device determines whether a second function of the first electronic device is currently on, where the second function projects the current display interface of the first electronic device onto a second electronic device, and the current display interface of the first electronic device is the current display interface of the first display screen or of the second display screen. When the second function is off, the first function is turned on. In this way, multiple functions of the electronic device can be managed in different usage scenarios, and the application scenarios are optimized.

Description

Management method of electronic equipment, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a management method for an electronic device, an electronic device, and a readable storage medium.
Background
With the continuous development of electronic devices, electronic devices with folding screens have appeared. Because its screen can be folded, an electronic device with a folding screen can provide multiple functions, and how to manage these functions in different usage scenarios is a problem that currently needs to be solved.
Disclosure of Invention
The embodiment of the application provides a management method of electronic equipment, the electronic equipment and a readable storage medium.
In a first aspect, an embodiment of this application provides a method for managing an electronic device, where the method is performed by a first electronic device that includes at least two display screens. The method includes: detecting a first operation on the first electronic device, where the first operation requests to turn on a first function of the first electronic device, the first function causes a first display screen and a second display screen of the first electronic device to display the same picture, and the first display screen and the second display screen are any two of the at least two display screens that are located on different surfaces; in response to the first operation, determining whether a second function of the first electronic device is currently on, where the second function projects the current display interface of the first electronic device onto a second electronic device, and the current display interface of the first electronic device is the current display interface of the first display screen or of the second display screen; and turning on the first function when the second function is off.
Based on the above technical solution, when a user requests to turn on the first function of the first electronic device, the first function is turned on if the second function is off, and is not turned on if the second function is on. By managing the first function and the second function of the first electronic device in this way, the first electronic device can remain in a good running state in different usage scenarios, scenario optimization is achieved, and user experience is improved.
In some embodiments, the picture displayed on the first display screen and the picture displayed on the second display screen may be displayed at the same scale, i.e., with the same aspect ratio.

In some embodiments, the picture displayed on the second display screen is a mirror image of the picture displayed on the first display screen.
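The core check in the first aspect can be sketched as a small state machine. This is an illustrative sketch, not the patent's implementation; the class and attribute names (`FunctionManager`, `mirror_display_on`, `projection_on`) are invented for clarity.

```python
# Hypothetical sketch of the first-aspect logic: the mirror-display function
# ("first function") is turned on only when the projection function
# ("second function") is off. All names are illustrative, not from the patent.

class FunctionManager:
    def __init__(self):
        self.mirror_display_on = False  # first function: both screens show the same picture
        self.projection_on = False      # second function: cast the current interface to another device

    def request_mirror_display(self) -> bool:
        """Handle the first operation: turn on the first function only if
        the second function is currently off."""
        if self.projection_on:
            return False                # second function on: refuse the request
        self.mirror_display_on = True   # second function off: turn on the first function
        return True

mgr = FunctionManager()
assert mgr.request_mirror_display() is True    # projection off, so mirror display opens

busy = FunctionManager()
busy.projection_on = True
assert busy.request_mirror_display() is False  # projection on, so the request is refused
```

In the patent's terms, the `False` branch corresponds to the case where the first function is not turned on because the second function is in the on state.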
With reference to the first aspect, in certain implementations of the first aspect, detecting the first operation on the first electronic device includes: detecting a camera opening operation and starting a camera application; and detecting the first operation on the first electronic device in the camera application.
With reference to the first aspect and the foregoing implementation manners, in some implementation manners of the first aspect, an application interface of the camera application displays an open button of the first function, and the first operation is a click operation on the open button.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, after determining whether the second function of the first electronic device is currently on, the method further includes: when the second function is on, displaying a first interface, where the first interface guides the user to close the second function; detecting an operation of closing the second function; and in response to the operation of closing the second function, turning on the first function after the second function is turned off.
Based on this solution, when the first electronic device does not support an operation requested by the user, the user can be prompted on how to proceed, so that the user's needs are met and user experience is improved.
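The guided flow above (show a first interface, wait for the user to close the second function, then turn on the first) can be sketched as follows. The function name, return shape, and prompt string are assumptions for illustration only.

```python
# Hedged sketch of the guided flow: if projection is on when mirror display is
# requested, show a guide interface; only after the user closes projection is
# mirror display turned on. Names and strings are illustrative.

def handle_mirror_request(projection_on: bool, user_closes_projection: bool):
    """Return (mirror_on, projection_on, prompts) after the first operation."""
    prompts = []
    if not projection_on:
        return True, False, prompts   # second function already off: turn on directly
    # second function on: display the first interface to guide the user
    prompts.append("first interface: please close the projection function")
    if user_closes_projection:
        return True, False, prompts   # second function closed, then first turned on
    return False, True, prompts       # user declined: states unchanged

mirror, projection, prompts = handle_mirror_request(True, True)
assert mirror and not projection and len(prompts) == 1
```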
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, after the first function is turned on, the method further includes: detecting a second operation on the first electronic device, where the second operation requests to turn on the second function; and in response to the second operation, turning on the second function after the first function is turned off.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, in response to the second operation, turning on the second function after the first function is turned off includes: in response to the second operation, determining whether the first function is currently on; and when the first function is currently on, turning on the second function after the first function is turned off.
That is, when the first function and the second function are requested to be turned on at the same time, the second function is used preferentially.
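The priority rule above (the second function preempts the first) can be sketched as a single transition function; the dictionary keys are illustrative names, not the patent's.

```python
# Illustrative sketch of the priority rule: when the second function
# (projection) is requested while the first function (mirror display) is on,
# the first function is turned off before the second is turned on.

def request_projection(state: dict) -> dict:
    new_state = dict(state)
    if new_state.get("mirror_on"):      # judge whether the first function is currently on
        new_state["mirror_on"] = False  # turn off the first function
    new_state["projection_on"] = True   # then turn on the second function
    return new_state

s = request_projection({"mirror_on": True, "projection_on": False})
assert s == {"mirror_on": False, "projection_on": True}
```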
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, when the first function is currently in an on state, the method further includes: displaying a second interface, where the second interface guides the user to confirm turning off the first function; detecting an operation in which the user confirms turning off the first function; and in response to the operation, turning off the first function.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, when the first function is currently in an on state, the method further includes: turning off the first function when it is detected that the battery level of the first electronic device is lower than a first threshold.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, when the first function is currently in an on state, the method further includes: turning off the first function when it is detected that the temperature of the first electronic device is higher than a second threshold.
When the first electronic device is in a scenario such as low battery or abnormal temperature, the first function can be turned off automatically, so that the system of the first electronic device can run normally.
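The automatic-shutdown conditions above can be sketched as a simple predicate. The patent only names a "first threshold" (battery) and a "second threshold" (temperature); the concrete values below are assumptions for illustration.

```python
# Minimal sketch of the automatic shutdown conditions. Threshold values are
# hypothetical; the patent does not specify them.

LOW_BATTERY_PCT = 10.0   # assumed first threshold, percent
HIGH_TEMP_C = 45.0       # assumed second threshold, degrees Celsius

def mirror_should_auto_close(battery_pct: float, temp_c: float) -> bool:
    """Turn off the first function on low battery or abnormal temperature."""
    return battery_pct < LOW_BATTERY_PCT or temp_c > HIGH_TEMP_C

assert mirror_should_auto_close(5.0, 30.0) is True    # low battery
assert mirror_should_auto_close(80.0, 50.0) is True   # high temperature
assert mirror_should_auto_close(80.0, 30.0) is False  # normal conditions
```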
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, turning off the first function includes: displaying a third interface, where the third interface prompts the user whether to turn off the first function; detecting a third operation on the first electronic device, where the third operation requests to turn off the first function; and in response to the third operation, turning off the first function.
The third interface may display a Yes button and a No button. The user can click the Yes button to confirm turning off the first function, or click the No button to keep the first function on. In this way, when the first electronic device is in a scenario such as low battery or abnormal temperature, the user can choose either to continue using the first function or to turn it off, which improves user experience.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, when the first function is currently in an on state, the method further includes: receiving a call request from a third electronic device; and turning off the first function.
It can be understood that the first function may not be needed during a call. Therefore, when the user uses the first electronic device to make a call, the first electronic device may turn off the first function, thereby reducing the number of processes running on the first electronic device, saving power, and improving user experience.
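The call-handling behavior above can be sketched as an event handler; the event name and state keys are illustrative assumptions, not from the patent.

```python
# Sketch of the behavior above: an incoming call request from a third device
# turns off the mirror-display function. Names are illustrative.

def on_event(state: dict, event: str) -> dict:
    new_state = dict(state)
    if event == "incoming_call" and new_state.get("mirror_on"):
        new_state["mirror_on"] = False  # turn off the first function during the call
    return new_state

s = on_event({"mirror_on": True}, "incoming_call")
assert s["mirror_on"] is False
s2 = on_event({"mirror_on": True}, "other_event")
assert s2["mirror_on"] is True
```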
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, when the first function is currently in an on state, the method further includes: turning off the first function when it is detected that the camera application of the first electronic device cannot run.
In a second aspect, the present application provides an electronic device, including: at least two display screens; one or more processors; one or more memories; and a module in which a plurality of applications are installed. The memory stores one or more programs that, when executed by the processor, cause the electronic device to perform the following steps: detecting a first operation on the electronic device, where the first operation requests to turn on a first function of the electronic device, the first function causes a first display screen and a second display screen of the electronic device to display the same picture, and the first display screen and the second display screen are any two display screens of the electronic device that are located on different surfaces; in response to the first operation, determining whether a second function of the electronic device is currently on, where the second function projects the current display interface of the electronic device onto a second electronic device, and the current display interface of the electronic device is the current display interface of the first display screen or of the second display screen; and turning on the first function when the second function is off.
With reference to the second aspect, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following steps: detecting a camera opening operation and starting a camera application; and detecting the first operation on the electronic device in the camera application.
An application interface of the camera application displays an open button of a first function, and the first operation is a click operation on the open button.
With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following steps: when the second function is on, displaying a first interface, where the first interface guides the user to close the second function; detecting an operation of closing the second function; and in response to the operation of closing the second function, turning on the first function after the second function is turned off.
With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following steps: detecting a second operation on the electronic device, where the second operation requests to turn on the second function; and in response to the second operation, turning on the second function after the first function is turned off.
With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following steps: in response to the second operation, determining whether the first function is currently on; and when the first function is currently on, turning on the second function after the first function is turned off.
With reference to the second aspect and the implementations described above, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the steps of: displaying a second interface, wherein the second interface is used for guiding a user to confirm that the first function is closed; detecting an operation that a user confirms to close the first function; in response to confirming the operation to turn off the first function, the first function is turned off.
With reference to the second aspect and the implementations described above, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following step: turning off the first function when it is detected that the battery level of the electronic device is lower than a first threshold.
With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following step: turning off the first function when it is detected that the temperature of the electronic device is higher than a second threshold.
With reference to the second aspect and the implementations described above, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following steps: displaying a third interface, where the third interface prompts the user whether to turn off the first function; detecting a third operation on the electronic device, where the third operation requests to turn off the first function; and in response to the third operation, turning off the first function.
With reference to the second aspect and the implementations described above, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the steps of: receiving a call request from a third electronic device; the first function is turned off.
With reference to the second aspect and the implementations described above, in some implementations of the second aspect, the one or more programs, when executed by the processor, cause the electronic device to perform the following step: turning off the first function when it is detected that the application program of the electronic device cannot run.
In a third aspect, the present application provides an apparatus, which is included in an electronic device, and has a function of implementing the behavior of the electronic device in the foregoing aspects and possible implementations of the foregoing aspects. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above-described functions. Such as a display module or unit, a detection module or unit, a processing module or unit, etc.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to perform any one of the possible electronic device management methods in the first aspect.
In a fifth aspect, the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to perform the method for managing an electronic device according to any one of the possible implementations of the first aspect.
In the embodiments of this application, the first electronic device includes at least two display screens, and the first display screen and the second display screen are any two of the at least two display screens that are located on different surfaces. The first electronic device detects a first operation on the first electronic device, where the first operation requests to turn on a first function of the first electronic device, and the first function causes the first display screen and the second display screen of the first electronic device to display the same picture. In response to the first operation, the first electronic device determines whether a second function of the first electronic device is currently on, where the second function projects the current display interface of the first electronic device onto the second electronic device, and the current display interface of the first electronic device is the current display interface of the first display screen or of the second display screen. When the second function is off, the first function is turned on.
Based on this technical solution, when a user requests to turn on the first function of the first electronic device, the first electronic device turns on the first function if the second function is off. By managing the first function and the second function, the first electronic device avoids the conflict that would arise when it cannot display the picture of the first display screen on the second display screen and on the second electronic device at the same time. As a result, the first electronic device can remain in a good running state in different usage scenarios, application scenarios are optimized, and user experience is improved.
Drawings
FIG. 1 is a schematic diagram of an example of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of the front and back sides of the mobile phone 100 when the display is unfolded and the display is folded according to an embodiment of the present application;
fig. 3 is a schematic diagram illustrating an example of the first function of the mobile phone 100 when the first function is turned on and off according to the embodiment of the present disclosure;
FIG. 4 is a flowchart illustrating an example of using a first function according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram illustrating an example of the mobile phone 100 according to the embodiment of the present disclosure when a second function is turned on;
fig. 6 is a schematic flowchart of an example of a management method for an electronic device according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an example of a prompt interface provided in the embodiments of the present application;
fig. 8 is a schematic flowchart of a management method for an electronic device according to still another embodiment of the present application;
fig. 9 is a block diagram of a software structure of an exemplary mobile phone 100 according to an embodiment of the present disclosure;
FIG. 10 is an interactive schematic diagram showing cooperation between the various software structures of FIG. 9 to implement the method of the present application;
FIG. 11 is another interactive diagram illustrating cooperation between the various software structures of FIG. 9 to implement the method of the present application;
FIG. 12 is a schematic diagram of still another interaction between the various software structures of FIG. 9 to cooperate in implementing the method of the present application;
FIG. 13 is a schematic diagram of still another interaction between the various software structures of FIG. 9 to cooperate in implementing the method of the present application;
FIG. 14 is a further interaction diagram illustrating cooperation between the various software structures of FIG. 9 to implement the method of the present application;
FIG. 15 is a schematic diagram of still another interaction between the various software structures of FIG. 9 to implement the method of the present application;
fig. 16 is a schematic diagram showing still another interaction between the software structures in fig. 9 to implement the method of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application are described below clearly and completely with reference to the accompanying drawings. It is obvious that the described embodiments are only some, rather than all, of the embodiments of this application.
It should be understood that reference to "a plurality" in this application means two or more. In the description of this application, "/" generally indicates an "or" relationship between associated objects; for example, A/B may indicate A or B. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate the following cases: only A exists, both A and B exist, or only B exists. In addition, to clearly describe the technical solutions of this application, words such as "first" and "second" are used to distinguish between identical or similar items whose functions and effects are substantially the same. Those skilled in the art will appreciate that the words "first", "second", and the like do not limit quantity or execution order, nor do they denote any particular importance.
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may store instructions or data that the processor 110 has just used or cycled. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be separately coupled to the touch sensor 180K, the charger, the flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus, enabling communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, to sample, quantize, and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement the function of answering a call through a Bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms. In some embodiments, the UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through the UART interface to implement a Bluetooth function. In some embodiments, the audio module 170 may transmit audio signals to the wireless communication module 160 through the UART interface, so as to implement the function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured to carry control signals or data signals. In some embodiments, the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and peripheral devices. It may also be used to connect earphones and play audio through the earphones. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationships between the modules in the embodiments of the present invention are merely illustrative and do not constitute a structural limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt an interface connection manner different from those in the above embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G/6G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves through the antenna 2 for radiation.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1, and the display screen 194 may be foldable.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it processes input information quickly and can also continuously perform self-learning. Applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 performs various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The earphone interface 170D is used to connect wired earphones. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
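As an illustrative sketch only, the mapping from touch intensity to operation instruction described above can be expressed as follows; the numeric threshold and the instruction names are assumptions for illustration, since the application does not fix concrete values:

```python
# Assumed normalized intensity threshold; the application only states that a
# "first pressure threshold" exists, not its value.
FIRST_PRESSURE_THRESHOLD = 0.5

def short_message_icon_action(touch_intensity):
    """Map the intensity of a touch on the short message icon to an instruction."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"      # light press: view the message
    return "new_short_message"           # firm press: create a new message
```

The same position thus yields different instructions depending only on the detected intensity.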
The gyroscope sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement, thereby implementing anti-shake. The gyroscope sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and the like.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some shooting scenarios, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
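The reflected-light decision described above can be sketched as a simple threshold test; the threshold value, units, and function names below are illustrative assumptions:

```python
# Assumed reflected-light threshold in arbitrary ADC units (not specified by the application).
REFLECTION_THRESHOLD = 100

def object_nearby(reflected_light):
    """Sufficient reflected infrared light implies an object near the device."""
    return reflected_light >= REFLECTION_THRESHOLD

def screen_state_during_call(reflected_light):
    """During a call, turn the screen off when the device is held to the ear."""
    return "off" if object_nearby(reflected_light) else "on"
```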
The ambient light sensor 180L is used to sense ambient light brightness. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
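The adaptive brightness adjustment can be illustrated by a simple mapping from sensed illuminance to a brightness level; the linear curve and all numeric parameters below are assumptions, as the application does not specify the adjustment function:

```python
def display_brightness(ambient_lux, min_level=10, max_level=255, max_lux=1000):
    """Map sensed ambient illuminance to a display brightness level (linear sketch).

    Illuminance is clamped to [0, max_lux]; brightness spans [min_level, max_level].
    """
    lux = max(0.0, min(float(ambient_lux), float(max_lux)))
    return round(min_level + (max_level - min_level) * lux / max_lux)
```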
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, take a photo with a fingerprint, answer an incoming call with a fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
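The three-threshold temperature processing strategy can be sketched as follows; the application only states that the thresholds exist in this relative order, so the numeric values and action names are illustrative assumptions:

```python
# Assumed threshold values in degrees Celsius (the application does not fix them).
PERF_LIMIT_TEMP = 45.0      # above this, throttle the nearby processor
HEAT_BATTERY_TEMP = 0.0     # below this, heat the battery 142
BOOST_VOLTAGE_TEMP = -10.0  # below this, also boost the battery output voltage

def thermal_actions(temperature_c):
    """Return the list of actions the temperature processing strategy would take."""
    actions = []
    if temperature_c > PERF_LIMIT_TEMP:
        actions.append("reduce_processor_performance")   # thermal protection
    if temperature_c < HEAT_BATTERY_TEMP:
        actions.append("heat_battery")                   # avoid cold shutdown
    if temperature_c < BOOST_VOLTAGE_TEMP:
        actions.append("boost_battery_voltage")          # avoid cold shutdown
    return actions
```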
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor can transfer the detected touch operation to the application processor to determine the touch event type. For example, the touch sensor 180K detects a touch operation in which the user taps the collaborative photographing function button, and then transfers the operation to the processor 110, and the processor 110 determines that the touch event is "turn on the collaborative photographing function". When the user taps the collaborative photographing function button again, the touch sensor 180K transfers the operation to the processor 110, and the processor 110 determines that the touch event is "turn off the collaborative photographing function". In some embodiments, visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194. This is not limited in the present application.
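The tap-to-toggle flow above can be modeled as a small state machine; the class, button identifier, and returned event strings are illustrative assumptions rather than an actual implementation of the application:

```python
class CollaborativeShootingController:
    """Toy model of the touch-event flow: each tap on the button toggles the function."""
    def __init__(self):
        self.enabled = False

    def on_touch_event(self, target):
        """Called with the UI element hit by the touch, as resolved by the processor."""
        if target != "collaborative_shooting_button":
            return None                      # touches elsewhere are ignored here
        self.enabled = not self.enabled      # each tap flips the function state
        if self.enabled:
            return "turn on the collaborative photographing function"
        return "turn off the collaborative photographing function"
```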
The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a vibrating bone block of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive blood pressure beating signals. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone, combined into a bone conduction earphone. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone block of the vocal part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards, and may further be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The method of the embodiment of the present application is applicable to any multi-screen electronic device having a first function and a second function, where the multi-screen electronic device has at least two display screens, and the at least two display screens are installed on two different sides of the electronic device, such as a folding screen mobile phone, a tablet computer, and the like, and the present application does not limit this. For convenience of description, the electronic device 100 will be described as the folding-screen mobile phone 100, and the folding-screen mobile phone 100 will be simply referred to as the mobile phone 100.
For example, (a) in fig. 2 shows a schematic front view of the mobile phone 100 when its display is unfolded. When the foldable display of the mobile phone 100 is unfolded, the display may become a complete plane, and when the mobile phone 100 is laid flat, the plane of the display may be parallel to the horizontal plane. The display screen shown in (a) in fig. 2 may be referred to as the inner screen. When the mobile phone 100 is in the unfolded state, the inner screen may be divided into an inner screen area A-1 and an inner screen area A-2, which are separated by a rotation axis. The camera P2 may be disposed in the inner screen area A-1 or the inner screen area A-2 of the mobile phone 100, which is not limited in this application.
Fig. 2 (b) shows a schematic back view of the mobile phone 100 when its display is unfolded. The back includes an outer screen B and a phone housing, and cameras may be provided on both. The outer screen B is disposed opposite the inner screen area A-1 and the phone housing is disposed opposite the inner screen area A-2; alternatively, the outer screen B is disposed opposite the inner screen area A-2 and the phone housing is disposed opposite the inner screen area A-1. A camera P1 is provided on the outer screen B. The phone housing is located on one side of the outer screen B, and a camera P3 is provided on the housing; camera P3 includes one or more cameras, for example three cameras.
Fig. 2 (c) shows a schematic diagram of the mobile phone 100 when the display screen is folded. The inner screen area A-1 rotates toward the inner screen area A-2, or the inner screen area A-2 rotates toward the inner screen area A-1, so that the mobile phone 100 is folded. When the mobile phone 100 is in the folded state, the inner screen areas A-1 and A-2 are no longer visible to the user, and the outer screen B faces the user.
In the embodiment of the application, the first function is used to make a first display screen and a second display screen of the mobile phone display the same picture. The first display screen is the outer screen B, and the second display screen may be the complete inner screen A formed by the inner screen area A-1 and the inner screen area A-2, the inner screen area A-1 alone, or the inner screen area A-2 alone.
The following description takes the camera application in the folding-screen mobile phone 100 as an example. It can be understood that the technical solution of the present application is also applicable to other applications on electronic devices and is not limited to camera applications.
For convenience of description, the following assumes that the photographer faces the inner screen A and the subject faces the outer screen B; correspondingly, the camera P2 is referred to as a front camera, and the cameras P1 and P3 are referred to as rear cameras. Alternatively, when the inner screen A is in the unfolded state, the camera P1 is not activated; correspondingly, the camera P2 is referred to as a front camera and the camera P3 as a rear camera.
It is understood that the method of the present application is also applicable to a scenario in which the photographer faces the outer screen B and the subject faces the inner screen A. In this case, correspondingly, the camera P2 is referred to as a rear camera and the cameras P1 and P3 as front cameras. Alternatively, camera P1 is not enabled, camera P2 is referred to as a rear camera, and camera P3 as a front camera. Alternatively, camera P3 is not enabled, camera P2 is referred to as a rear camera, and camera P1 as a front camera.
For example, fig. 3 (a) shows a schematic diagram of the mobile phone 100 with the first function turned on. When the photographer 10 photographs the person 20 with the rear camera P1 of the mobile phone 100, the shot picture 101a captured by the camera P1 is displayed on the inner screen A facing the photographer, and a mirror image 101b of the shot picture 101a is displayed on the outer screen B facing the person 20. In this way, the information the subject 20 sees on the mirror image 101b of the outer screen B, such as the posture of the subject 20 and the position of the subject 20 in the overall shot picture, is the same as what the photographer 10 sees on the shot picture 101a of the inner screen. The mirror image displayed by the outer screen B can be obtained by copying or mirroring the shot picture displayed by the inner screen A. In addition, in order to display a complete mirror image of the shot picture 101a on the outer screen B, the size ratio (e.g., aspect ratio) of the shot picture 101a and the mirror image 101b may be the same. In this way, when a photographer uses the folding-screen mobile phone 100 to photograph a subject, the subject can intuitively see the shooting effect on the picture displayed by the outer screen B and can directly adjust his or her posture and position according to that picture to obtain a better shooting effect.
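The relationship described above (the outer screen B showing a copy or mirror of the inner-screen picture, with the two pictures keeping the same aspect ratio) can be sketched as follows. This is a minimal illustration under assumptions, not the patent's implementation: the frame is modeled as rows of pixel values, and both function names are hypothetical.

```python
def mirror_frame(frame):
    """Horizontally flip a frame (a list of pixel rows) to produce the
    mirror image shown on the outer screen B; a plain copy of the rows
    would model the 'copying' alternative mentioned in the text."""
    return [list(reversed(row)) for row in frame]


def same_aspect_ratio(w_a, h_a, w_b, h_b):
    """Check that the inner-screen picture (w_a x h_a) and its outer-screen
    mirror (w_b x h_b) keep the same aspect ratio; cross-multiplication
    avoids floating-point comparison."""
    return w_a * h_b == w_b * h_a


# A 2x3 toy frame: each row is reversed left-to-right in the mirror.
frame = [[1, 2, 3],
         [4, 5, 6]]
mirrored = mirror_frame(frame)
```

Whether the outer screen shows a literal copy or a left-right mirror is left open by the text ("copying or mirroring"); the aspect-ratio check matches either choice.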
Fig. 3 (b) shows a schematic diagram of the mobile phone 100 with the first function turned off. In this case, the inner screen A displays the shot preview picture, and the outer screen B may display information such as the current time, network status, and battery level, or may display a default lock screen, desktop, and the like, which is not limited in this application.
Fig. 3 (c) and (d) show a camera application interface of the mobile phone 100, which includes the currently shot picture and function buttons located at the two sides of the interface. The function buttons include a cooperative photographing button for starting the cooperative photographing function, such as the cooperative photographing button 02 in the off state. The cooperative photographing button 02 in the off state indicates that the cooperative photographing function is currently off, in which case the outer screen B displays information such as the time. The cooperative photographing button 03 in the on state indicates that the cooperative photographing function is currently on.
In this embodiment of the application, a user may execute a first operation in the camera application interface to start the collaborative photographing function, where the first operation may be to click a button related to collaborative photographing in the camera application interface, or may also be a voice instruction, and the like, which is not limited herein.
For example, fig. 4 shows the process by which the mobile phone 100 turns on the cooperative photographing function. As shown in fig. 4 (a), the user clicks the camera application on the application program interface to open it. After the camera application is opened, the inner screen A displays the camera application interface. The user clicks the cooperative photographing button 02 in the off state, and in response to the click operation, the mobile phone 100 starts the cooperative photographing function.
At this time, as shown in (b) in fig. 4, the cooperative photographing button 02 in the off state is switched to the cooperative photographing button 03 in the on state. As shown in fig. 4 (c), the outer screen B of the mobile phone 100 displays a mirror image 101b of the shot picture 101a on the inner screen A.
When the user clicks the cooperative photographing button 03 in the on state again, the mobile phone 100 turns off the cooperative photographing function in response to the click operation, and the button 03 is switched back to the cooperative photographing button 02 in the off state. At this time, as shown in (d) in fig. 4, the outer screen B of the mobile phone 100 displays the time.
The camera application includes a plurality of shooting modes to meet the shooting requirements. It should be noted that, in some shooting modes, the mobile phone 100 supports the use of a collaborative shooting function, such as a video recording mode and a movie mode. In other photographing modes, the cellular phone 100 does not support the use of a cooperative photographing function, such as a self-photographing mode.
The camera application determines whether the current camera mode supports the cooperative photographing function according to the shooting mode selected by the user and the camera that is enabled. For example, when the current camera mode is the movie mode, the cooperative photographing function is supported; when the current camera mode is the self-timer mode, it is not. As another example, when the camera application has enabled a rear camera, the cooperative photographing function is supported; when the front camera is enabled, it is not.
If, after the cooperative photographing function is turned on, the user selects a camera mode that does not support it, the mobile phone 100 automatically turns off the cooperative photographing function. For example, when the user switches the shooting mode from the movie mode to the self-timer mode, the outer screen B stops displaying the preview picture shown on the inner screen A.
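The mode check and automatic shutoff described above can be sketched as a small decision function. The mode names and the state layout are assumptions for illustration; the patent only names movie mode, self-timer mode, and the front/rear camera distinction as examples.

```python
# Modes assumed to support cooperative photographing, following the
# examples in the text (the actual set is implementation-defined).
SUPPORTED_MODES = {"photo", "portrait", "video", "movie"}


def supports_cooperative_photographing(mode: str, active_camera: str) -> bool:
    """Per the text: the function is available only in supported modes
    and only when a rear camera is enabled (self-timer mode and the
    front camera do not qualify)."""
    return mode in SUPPORTED_MODES and active_camera == "rear"


def on_mode_switch(state: dict, new_mode: str) -> dict:
    """When the user switches to an unsupported mode (e.g. movie mode to
    self-timer mode), the function is turned off automatically and the
    outer screen B stops mirroring the preview."""
    state = dict(state, mode=new_mode)
    if state["coop_on"] and not supports_cooperative_photographing(
            new_mode, state["camera"]):
        state["coop_on"] = False
    return state
```

For example, switching a state with `coop_on=True` from `"movie"` to `"selfie"` clears `coop_on`, while switching between two supported modes leaves it set.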
In one implementation, the inner screen area a-1 and the inner screen area a-2 may display different contents, for example, when taking a picture, the inner screen area a-1 and the outer screen B simultaneously display a taken preview picture, and the inner screen area a-2 may display an album. The first function applied to the camera application described above is referred to as a cooperative photographing function.
In the embodiment of the present application, the second function is used to enable the current display screen of the mobile phone 100 to be projected on the second electronic device, which may be the computer 200. The second function may be referred to as a system cooperation function hereinafter.
For example, fig. 5 shows a schematic diagram of the mobile phone 100 with the second function turned on. The mobile phone 100 and the computer 200 may be connected by NFC, code scanning, bluetooth, or the like. For example, sliding upwards from the bottom of the screen of the mobile phone 100 may reveal a "multi-device cooperation" button; clicking it causes the mobile phone 100 to display a list of electronic devices with which a cooperative connection can be established. When the list includes the computer 200, clicking the computer 200 causes the mobile phone 100 to establish a cooperative connection with it. After the connection is established, a mobile phone window appears on the desktop of the computer 200 and displays the current display interface of the inner screen A or the outer screen B of the mobile phone 100. The second function of the mobile phone 100 can be used in either the folded state or the unfolded state. If the first function is turned on while the mobile phone 100 is in the folded state, the user may be guided to unfold the display screen of the mobile phone 100.
In the embodiment of the present application, the cooperative photographing function of the mobile phone 100 distributes the picture displayed by the inner screen A to the outer screen B, and the system cooperation function of the mobile phone 100 distributes the picture displayed by the inner screen A to the computer 200. The user may request to start the system cooperation function while the cooperative photographing function is on, or request to start the cooperative photographing function while the system cooperation function is on. When the mobile phone 100 does not support distributing the picture displayed on the inner screen A to the outer screen B and the computer 200 at the same time, the cooperative photographing function and the system cooperation function cannot be used simultaneously.
Fig. 6 illustrates a management method of an electronic device according to an embodiment of the present application. When the mobile phone 100 receives a user request, it determines whether the requested operation starts the cooperative photographing function or the system cooperation function. When the requested operation starts the cooperative photographing function, the mobile phone 100 determines whether the system cooperation function is currently on; if the system cooperation function is off, the mobile phone 100 starts the cooperative photographing function. If the system cooperation function is on, the cooperative photographing function is unavailable.
In one implementation, when the system cooperation function is in the on state, the mobile phone 100 may send a prompt to the user to turn off the system cooperation function. Exemplarily, (a) in fig. 7 shows a prompt interface. When the system cooperation function is on and the user requests to start the cooperative photographing function, a prompt box 01 pops up. The prompt box 01 indicates that the cooperative photographing function cannot be started and that the system cooperation function must be closed first, and displays a "go to close" button and a "cancel" button. The user may click the "go to close" button to perform the operation of closing the system cooperation function, or click the "cancel" button to give up starting the cooperative photographing function.
When the requested operation starts the system cooperation function, the mobile phone 100 determines whether the cooperative photographing function is currently on; if the cooperative photographing function is off, the mobile phone 100 starts the system cooperation function. If the cooperative photographing function is on, the mobile phone 100 first turns off the cooperative photographing function and then turns on the system cooperation function.
In one implementation, the mobile phone 100 may send a prompt to the user that the cooperative photographing function is about to be turned off. Exemplarily, (b) in fig. 7 shows a prompt interface. When the cooperative photographing function is on and the user requests to start the system cooperation function, a prompt box 02 pops up. The prompt box 02 indicates that the cooperative photographing function will be closed and asks whether to start the system cooperation function, displaying a "yes" button and a "no" button. The user may click the "yes" button, and the mobile phone 100 starts the system cooperation function; alternatively, the user may click the "no" button to give up starting the system cooperation function.
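The mutual-exclusion policy described around figs. 6 and 7 can be summarized as two request handlers: starting cooperative photographing is refused while system cooperation is on, whereas starting system cooperation takes priority and closes cooperative photographing first. A minimal sketch with assumed state-field names; the actual behavior is carried by the control logic in the software structure described later.

```python
def request_cooperative_photographing(state: dict) -> dict:
    """Co-photographing cannot start while system cooperation is on;
    the user is instead prompted to close system cooperation first
    (prompt box 01)."""
    if state["system_collab_on"]:
        return dict(state, prompt="close the system cooperation function first")
    return dict(state, coop_photo_on=True, prompt=None)


def request_system_collaboration(state: dict) -> dict:
    """System cooperation takes priority: co-photographing (if on) is
    closed first, then system cooperation starts. Closing an already-off
    function is a no-op, so both cases collapse into one assignment."""
    return dict(state, coop_photo_on=False, system_collab_on=True, prompt=None)
```

Each handler returns a new state dict rather than mutating its argument, which keeps the two request paths easy to test independently.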
Fig. 8 illustrates a management method for an electronic device according to an embodiment of the present application. When the cooperative photographing function of the mobile phone 100 is in an on state, the power and temperature of the mobile phone 100 may change during the use of the mobile phone 100. When the mobile phone 100 detects that the power of the mobile phone 100 is lower than the first threshold, for example, the power of the mobile phone 100 is lower than 30%, or when the mobile phone 100 detects that the temperature of the mobile phone 100 is higher than the second threshold, the mobile phone 100 turns off the cooperative photographing function.
In one implementation, when the cooperative photographing function of the mobile phone 100 is on, and the mobile phone 100 detects that its power is lower than the first threshold or its temperature is higher than the second threshold, the mobile phone 100 may send a prompt to the user indicating that the current usability of the mobile phone 100 is poor. Illustratively, when the power of the mobile phone 100 is low, the mobile phone 100 pops up a prompt box displaying "the current power is low, turn off the cooperative photographing function?" together with a "yes" button and a "no" button. The user can click the "yes" button, and the mobile phone 100 turns off the cooperative photographing function in response to the operation. Alternatively, the user can click the "no" button, and the mobile phone 100 keeps the cooperative photographing function on in response to the operation. The interface displayed by the mobile phone 100 that includes the prompt box corresponds to the third interface, and the operation of clicking the "yes" button corresponds to the third operation.
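The threshold checks above can be sketched as a single function that decides whether to prompt the user. The 30% battery figure comes from the example in the text; the temperature value is purely an assumption, since the patent does not specify the second threshold.

```python
LOW_BATTERY_THRESHOLD = 30    # percent; the example first threshold in the text
HIGH_TEMP_THRESHOLD = 43.0    # degrees C; hypothetical second threshold


def check_abnormal_conditions(battery_pct: int, temp_c: float,
                              coop_on: bool):
    """Return the prompt text to show (the 'third interface' dialog) when
    cooperative photographing is on and an abnormal condition is detected,
    or None when no prompt is needed."""
    if not coop_on:
        return None
    if battery_pct < LOW_BATTERY_THRESHOLD:
        return "the current power is low, turn off cooperative photographing?"
    if temp_c > HIGH_TEMP_THRESHOLD:
        return "the temperature is high, turn off cooperative photographing?"
    return None
```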
In addition, the mobile phone 100 may receive a call request from a third electronic device. When the mobile phone 100 receives a call request while the cooperative photographing function is on, the mobile phone 100 automatically turns off the cooperative photographing function.
In one implementation, when the cooperative photographing function of the mobile phone 100 is on and the mobile phone 100 receives a call request, it determines whether the user accepts the call request. If the user accepts the call request and the mobile phone 100 establishes a call connection with the third electronic device, the mobile phone 100 automatically turns off the cooperative photographing function. If the user does not accept the call request, the mobile phone 100 keeps the cooperative photographing function on. After the call ends, the mobile phone 100 may restart the cooperative photographing function and restore the camera application to its state before the call.
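The call handling with save-and-restore described above might be sketched as a small class. The class and method names are illustrative only; the patent attributes this behavior to the call control logic in the camera application.

```python
class CallControl:
    """Sketch of the call control logic: cooperative photographing is
    turned off when a call connection is established and restored when
    the call ends (if it was on before the call)."""

    def __init__(self):
        self.coop_on = False
        self._was_on_before_call = False

    def on_call_accepted(self):
        # The user accepted the call: remember the pre-call state and
        # automatically turn the function off.
        if self.coop_on:
            self._was_on_before_call = True
            self.coop_on = False

    def on_call_ended(self):
        # Restore the camera application to its state before the call.
        if self._was_on_before_call:
            self.coop_on = True
            self._was_on_before_call = False
```

A rejected call never reaches `on_call_accepted`, so the function simply stays on in that case, matching the text.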
It should be noted that turning off the cooperative photographing function merely disables one function of the camera application; after the cooperative photographing function is turned off, the camera may continue to run in the foreground.
In addition, when the mobile phone 100 detects that the camera application has failed and cannot run, or the camera application automatically moves to the background, the mobile phone 100 automatically turns off the cooperative photographing function.
In order to more clearly understand the implementation details of the above method on the mobile phone 100, the following describes the process of implementing the above method by the cooperation of the software/hardware components in the mobile phone 100 with reference to fig. 9 to 16.
Fig. 9 is a block diagram of a software structure of the mobile phone 100 according to the embodiment of the present application, in which a layered framework divides software into a plurality of layers, each layer has a clear role and division of labor, and the layers communicate with each other through a software interface. In some embodiments, the system of the handset 100 may be divided into an application layer 10, an application framework layer 20, and a hardware layer 30.
The application layer 10 may include a series of application packages, for example, an application package may include a camera application including a function module, a logic module, and a camera management class.
The functional modules include a plurality of shooting modes of the camera application, such as a photographing mode 11, a portrait mode 12, a video recording mode 13, a movie mode 14, a self-timer mode, and the like. The photographing mode 11, the portrait mode 12, the video recording mode 13, and the movie mode 14 all support the collaborative photographing function 1105.
It is understood that in some embodiments, the photographing mode 11 may include a smart object recognition function 1101, an artificial intelligence (AI) photography function 1102, a filter function 1103, a flash function 1104, and the collaborative photographing function 1105, and the user may select these functions as desired to achieve the corresponding purpose.
For example, in the photographing mode 11, the user may turn on the smart object recognition function 1101 to recognize the object in the image. Specifically, after the user starts the smart object recognition function in the photographing mode 11, the user only needs to aim the camera at the photographed object; when the mobile phone 100 recognizes the object, the name (e.g., "azalea") or the category (e.g., "dog") of the object is displayed on the inner screen A of the mobile phone. In other embodiments, the user may also turn on the filter function 1103 to beautify the photographed object. This is not limited by the present application.
In some embodiments, portrait mode 12 may include a collaborative capture function 1105, and may also include other functions, such as a beauty function, a special effects function (not shown), and so forth. When the user selects the portrait mode 12 to perform photographing, the portrait mode enables the face of the person to be photographed to be kept clear while blurring the background, thereby highlighting the subject.
In some embodiments, the recording mode 13 may include a multi-view recording function 1301, the flash function 1104, the filter function 1103, the collaborative shooting function 1105, and the like. When the user turns on the multi-view recording function 1301, the front and rear cameras of the mobile phone 100 are turned on at the same time, so that the user can record a plurality of objects simultaneously with the front and rear cameras. For example, after starting the multi-view recording function 1301, the picture captured by the front camera P2 and the picture captured by the rear camera P3 can be recorded at the same time, recording the scene around the user in detail. In addition, the user may also turn on the flash function 1104 to improve the imaging effect in a dark scene; for example, when shooting in a dark place, the flash function 1104 can be turned on to increase the exposure duration and improve the imaging effect of the photographed object.
In some embodiments, the movie mode 14 may include a color lookup table (LUT) function 1401, a 4K high-dynamic-range (HDR) function 1402, a slow motion function 1403, the flash function 1104, the collaborative photographing function 1105, and the like. The color lookup function 1401 essentially adjusts the colors of the object photographed by the user so that the picture has richer colors. In the 4K HDR function 1402, 4K refers to the resolution (4096 pixels × 2160 pixels), and HDR is an image rendering technology that simulates the reflection and refraction of scene illumination to make objects appear more realistic; 4K HDR is thus used to improve the quality of the shot picture, increasing image resolution and realism. The slow motion function 1403 adds aesthetic appeal and a sense of realism to the captured picture, increasing the interest of shooting.
In addition, the camera application realizes the turning on and off of the above functions and controls the cooperative display of the inner and outer screens of the mobile phone 100 through corresponding control logic in the logic module. The logic module turns on or off the cooperative photographing function when the state of the mobile phone 100 conforms to the preset control logic, for example, the logic module includes a power control logic 1501, a temperature control logic 1502, a call control logic 1503, a fault control logic 1504, a system change control logic 1505, and a cooperative photographing switch control logic 1506.
When the power of the mobile phone 100 is too low, the camera application turns off the cooperative photographing function through the power control logic 1501. When the temperature of the cell phone 100 is too high, the camera application turns off the cooperative photographing function through the temperature control logic 1502. When the mobile phone 100 receives an incoming call, the camera application turns off the cooperative photographing function through the call control logic 1503. When the camera of the cell phone 100 is not operational, the camera application turns off the collaborative photographing function through the fault control logic 1504.
The camera application coordinates the use of the first and second functions through system change control logic 1505. In the state that the cooperative photographing function is turned on, when the user turns on the system cooperative function, the system change control logic 1505 is triggered to turn off the cooperative photographing function. In addition, when the cooperative photographing function is turned on, the system change control logic 1505 is triggered to determine whether the system cooperative function is turned on.
The camera application detects the on or off of the cooperative photographing switch through the cooperative photographing switch control logic 1506, and controls the on or off of the cooperative photographing function according to the on or off of the cooperative photographing switch.
In some embodiments of the present application, the cooperative photographing switch may be the cooperative photographing button 02 or the cooperative photographing button 03 shown in fig. 3. The user may trigger the cooperative photographing switch control logic 1506 by clicking the cooperative photographing button 02 in the off state on the mobile phone inner screen a as shown in fig. 3 (d) to turn on the cooperative photographing function 1105 of the camera application. In addition, the user may trigger the cooperative photographing switch control logic 1506 by clicking the cooperative photographing button 03 in an on state on the screen a in the mobile phone as shown in (c) in fig. 3 to turn off the cooperative photographing function 1105 of the camera application.
It is understood that in some embodiments, the cooperative photographing switch may also be another type of switch, and the display state of the cooperative photographing switch may be changed according to the on and off of the cooperative photographing function. For example, when the cooperative photographing function is turned on, the cooperative photographing switch is in a convex state, and when the cooperative photographing function is turned off, the cooperative photographing switch is restored to the original state. This is not limited by the present application.
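The switch behavior described above (button 02 in the off state and button 03 in the on state, with the button display tracking the function state) can be sketched as a single toggle. The state layout and labels are illustrative assumptions.

```python
def toggle_cooperative_switch(state: dict) -> dict:
    """One click of the cooperative photographing switch: flip the
    function on/off and refresh the button's display state so it always
    mirrors the function state (02 = off, 03 = on)."""
    coop_on = not state["coop_on"]
    return {"coop_on": coop_on,
            "button": "03 (on)" if coop_on else "02 (off)"}
```

Toggling twice returns to the original state, matching the click sequence shown in fig. 4.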
The camera management class is a module that manages functions of a camera application. The abnormal scenario processing module 16 receives the abnormal information of the handset 100 to trigger the control logic in the logic module. The function opening module 17 and the function closing module 18 are respectively used for implementing the cooperative photographing function 1105 of opening the camera application and the cooperative photographing function 1105 of closing the camera application.
It is understood that the application layer 10 may also include other applications, such as a video application, a chat application, etc., and the present application is not limited in this regard.
The application framework layer 20 provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. In the embodiment of the present application, the framework layer includes a camera API21, a data stream switching module 22, a camera service module 23, an information receiving module 24, a display management module (DisplayManager) 25, a cooperative display management module 26, a cooperative monitoring module 27, and a system management module 28.
The camera API21 is a program interface of the camera module 31; the camera application can call the camera module 31 to shoot by calling the camera API 21. The camera API21 may also provide an interface for other applications. For example, the album application is connected to the camera API, so a user can enter the album from the interface of the camera application while using the camera application.
The data stream switching module 22 is used to switch various data streams of the camera, such as switching a photographing data stream to a video data stream.
The camera service module 23 may monitor the operation of the camera application and pass this information to the logic module when the camera application is not operational, thereby triggering the fault control logic 1504 to shut down the cooperative photographing function.
The information receiving module 24 includes a battery information receiving module (BatteryInfoReceiver) and a temperature information receiving module (OverHeatReceiver), and is configured to receive information such as the battery power and temperature of the mobile phone 100 and, after determining that the battery power is low or the temperature of the mobile phone 100 is too high, transmit the information to the abnormal scene processing module 16 of the camera application to trigger the corresponding control logic in the logic module. The information receiving module 24 further includes a call information receiving module configured to receive call information of the mobile phone 100 and, upon receiving it, transmit the information to the abnormal scene processing module 16 of the camera application to trigger the corresponding control logic in the logic module.
In addition, the display management module 25 monitors the use states of the inner screen a and the outer screen B of the mobile phone 100 in real time to monitor whether the cooperative photographing function and the system cooperative function of the mobile phone 100 are in an on state. The camera application acquires the state of the system cooperative function from the display management module 25, and then determines whether to start the cooperative photographing function according to the acquired state and a preset control logic. For example, when the user starts the cooperative photographing function, the camera management class obtains the state of the system cooperative function from the display management module 25, when the system cooperative function is currently in an on state, triggers the system change control logic 1505 to execute the logic that the cooperative photographing function cannot be started, and when the system cooperative function is currently in an off state, executes the logic that the cooperative photographing function is started.
After the cooperative photographing function is started, the display management module 25 notifies the camera application in a state callback mode when sensing that the outer screen is lighted, so that the camera application refreshes the display state of the cooperative photographing switch.
The cooperative monitoring module 27 is configured to monitor whether the cooperative photographing switch in the camera application is in an on state or an off state.
The cooperative display management module 26 is configured to display a preview picture of the inner screen on the outer screen, so as to implement cooperative display of the inner screen a and the outer screen B, for example, the cooperative display of the inner screen a and the outer screen B may be implemented by creating a split screen management class and an interface base class. In some embodiments, the cooperative display management module 26 may create a corresponding display layer (map) for the external screen B, and display the layer on the external screen B of the mobile phone as a carrier of a mirror image of the captured image, and display the mirror image of the captured image using the layer.
The cooperative display management module 26 may provide a relevant interface for the cooperative photographing function 1105 of the camera application, and after the cooperative photographing function 1105 is started, the cooperative display of the inner screen a and the outer screen B may be implemented by calling the relevant interface of the cooperative display management module 26. After the cooperative photographing function 1105 is closed, the cooperative display window of the external screen B may be closed by calling the relevant interface of the cooperative display management module 26.
The system management module 28 is used for controlling on and off of the system cooperative function. When the system cooperation function is requested to be turned on, the system management module 28 acquires the state of the cooperation photographing function from the display management module 25, and when the cooperation photographing function is currently in the on state, executes logic for turning on the system cooperation function after turning off the cooperation photographing function. And executing logic for starting the system cooperative function when the cooperative photographing function is in the closed state currently.
It is understood that the application framework layer 20 may also include other modules, such as a view manager (not shown) and a wireless fidelity (Wi-Fi) module (not shown), which is not limited in this application.
The hardware layer 30 includes a camera module 31, an Image Signal Processor (ISP) driver 32, a display port driver 33, a sensor module 34, and a communication module 35.
In the present embodiment, the camera module 31 includes the camera P1, the camera P2, and the camera P3 described above. In other embodiments, other cameras may also be included.
The ISP driver 32 is used to process signals transmitted by the image sensor of the mobile phone 100, for example, converting the electrical signal of an image fed back by the image sensor into an image visible to the naked eye. The ISP driver 32 may also perform algorithmic optimization on image noise, brightness, and the like, and may optimize parameters such as the exposure and color temperature of a shooting scene or perform anti-shake processing on an image to enhance the display effect of the image.
The DP driver 33 is an interface for the external display to access the mobile phone 100, and the mobile phone 100 can be connected to the external display through the DP driver 33 and synchronously display the image displayed on the mobile phone 100 on the external display.
In the embodiment of the present application, the sensor module 34 includes a temperature sensor for detecting the temperature of the mobile phone 100 and a power management module. The power management module is used for monitoring the electric quantity of the battery.
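The checks that the temperature sensor and the power management module feed into the abnormal-scene handling described below can be sketched as follows. The embodiment only speaks of a "first threshold" for the battery level and a "second threshold" for the temperature; the concrete values below are assumptions for illustration.

```java
// Illustrative sketch of the threshold checks performed by the power
// management module (battery level) and the temperature sensor. The
// concrete threshold values are assumptions; the embodiment names only
// a "first threshold" and a "second threshold".
public class DeviceMonitor {
    static final int FIRST_THRESHOLD_PERCENT = 5;    // assumed battery threshold
    static final int SECOND_THRESHOLD_CELSIUS = 45;  // assumed temperature threshold

    // A battery level below the first threshold triggers closing the
    // cooperative photographing function.
    static boolean batteryTooLow(int batteryPercent) {
        return batteryPercent < FIRST_THRESHOLD_PERCENT;
    }

    // A temperature above the second threshold likewise triggers closing
    // the cooperative photographing function.
    static boolean temperatureTooHigh(int temperatureCelsius) {
        return temperatureCelsius > SECOND_THRESHOLD_CELSIUS;
    }
}
```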
The communication module 35 includes a mobile communication module and a wireless communication module, so that the mobile phone 100 can receive a call request from a third electronic device and communicate with the third electronic device. In addition, the mobile phone 100 can establish a communication connection with the computer 200 to implement the second function.
It is understood that the software framework of the mobile phone 100 may further include a kernel layer (not shown) and a hardware abstraction layer (not shown).
It is understood that the above software structure is only exemplary and does not limit the software structure of the mobile phone 100. In other embodiments, the mobile phone 100 may have more or fewer components, which is not limited in this application.
To more intuitively understand how the above software modules cooperate to implement the method, the following describes, taking the interaction diagram shown in fig. 10 as an example, the process of turning off the cooperative photographing function when the battery level is too low while the cooperative photographing function is in the on state.
In step 1001, the power management module (LowBatteryController) detects that the battery level of the mobile phone 100 is lower than the first threshold.
The power management module includes a power supply management module and a battery information receiving module. The power supply management module monitors the battery level, and the battery information receiving module determines whether the battery level is lower than the first threshold.
In step 1002, the power management module transmits the detection result that the battery level is lower than the first threshold to the camera management class (AppUtil). After receiving the detection result, the camera management class sends it to the abnormal scene processing module (handleAbnormalSceneExitCollaborateMode) of the camera application, so that the abnormal scene processing module handles the low-battery condition.
In step 1003, the abnormal scene processing module triggers the power control logic 1501, which determines to close the cooperative photographing function.
In step 1004, the abnormal scene processing module sends an instruction to close the cooperative photographing function to the function closing module (exitCollaborateMode), and the function closing module sends the instruction to the cooperative display management module (CollaboratePresentation).
In step 1005, the cooperative display management module closes the window for cooperative display on the outer screen.
In step 1006, after the window for cooperative display is closed, the cooperative display management module uses a callback function (onDisplayChanged) to report to the display management module that the window for cooperative display on the outer screen of the mobile phone 100 has been closed.
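The four abnormal-scene flows (low battery, over-temperature, camera failure, and incoming call, figs. 10 to 13) share the same closing path, which can be modeled as below. The class and enum names are illustrative assumptions; only handleAbnormalScene and onDisplayChanged echo names that appear in the embodiments.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative model of the shared closing path of figs. 10-13: any
// abnormal scene closes the cooperative-display window on the outer
// screen, then the display management module is notified via a
// callback. Names are assumptions, not the actual implementation.
public class AbnormalSceneProcessor {
    enum Scene { LOW_BATTERY, OVER_TEMPERATURE, CAMERA_ERROR, INCOMING_CALL }

    private boolean collaborateModeOn = true;     // function starts in the on state
    private final List<String> callbackLog = new ArrayList<>();

    // Dispatch point: each control logic decides to close the cooperative
    // photographing function, then the function closing module and the
    // cooperative display management module act.
    public void handleAbnormalScene(Scene scene) {
        if (!collaborateModeOn) {
            return;                               // nothing to close
        }
        collaborateModeOn = false;                // close the outer-screen window
        onDisplayChanged("collaborate window closed: " + scene);
    }

    // Stand-in for the callback to the display management module.
    private void onDisplayChanged(String info) {
        callbackLog.add(info);
    }

    public boolean isCollaborateModeOn() { return collaborateModeOn; }
    public List<String> getCallbackLog() { return callbackLog; }
}
```

A second abnormal scene arriving after the window is already closed produces no further callback, mirroring that the flows only run while the function is in the on state.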
To more intuitively understand how the software modules cooperate to implement the method, the following describes, taking the interaction diagram shown in fig. 11 as an example, the process of turning off the cooperative photographing function when the temperature of the mobile phone 100 is too high while the cooperative photographing function is in the on state.
In step 1101, the temperature management module (OverTemperatureController) detects that the temperature of the mobile phone 100 is higher than the second threshold.
The temperature management module comprises a temperature sensor and a temperature information receiving module. The temperature sensor detects the temperature of the mobile phone 100. The temperature information receiving module determines whether the temperature of the mobile phone 100 is higher than a second threshold.
In step 1102, the temperature management module transmits the detection result that the temperature of the mobile phone 100 is higher than the second threshold to the camera management class (AppUtil). After receiving the detection result, the camera management class sends it to the abnormal scene processing module (handleAbnormalSceneExitCollaborateMode) of the camera application, so that the abnormal scene processing module handles the over-temperature condition.
In step 1103, the abnormal scene processing module triggers the temperature control logic 1502, which determines to close the cooperative photographing function.
In step 1104, the abnormal scene processing module sends an instruction to close the cooperative photographing function to the function closing module (exitCollaborateMode), and the function closing module sends the instruction to the cooperative display management module (CollaboratePresentation).
In step 1105, the cooperative display management module closes the window for cooperative display on the outer screen.
In step 1106, after the window for cooperative display is closed, the cooperative display management module uses a callback function (onDisplayChanged) to report to the display management module that the window for cooperative display on the outer screen of the mobile phone 100 has been closed.
To more intuitively understand how the software modules cooperate to implement the method, the following describes, taking the interaction diagram shown in fig. 12 as an example, the process of turning off the cooperative photographing function when the camera application fails to run while the cooperative photographing function is in the on state.
In step 1201, the camera service module (CameraOnErrorProcessor) detects that the camera application fails to run.
In step 1202, the camera service module transmits the detection result that the camera application fails to run to the camera management class. After receiving the detection result, the camera management class sends it to the abnormal scene processing module of the camera application.
In step 1203, the abnormal scene processing module triggers the fault control logic 1504, which determines to close the cooperative photographing function.
In step 1204, the abnormal scene processing module sends an instruction to close the cooperative photographing function to the function closing module, and the function closing module sends the instruction to the cooperative display management module.
In step 1205, the cooperative display management module closes the window for cooperative display on the outer screen.
In step 1206, after the window for cooperative display is closed, the cooperative display management module uses a callback function (onDisplayChanged) to report to the display management module that the window for cooperative display on the outer screen of the mobile phone 100 has been closed.
To more intuitively understand how the software modules cooperate to implement the method, the following describes, taking the interaction diagram shown in fig. 13 as an example, the process of turning off the cooperative photographing function when the mobile phone 100 receives a call request while the cooperative photographing function is in the on state.
In step 1301, the communication processing module detects that the mobile phone 100 receives a call request.
The communication processing module comprises a communication module and a call information receiving module.
In step 1302, the communication processing module transmits the detection result that the mobile phone 100 has received a call request to the camera management class (AppUtil). After receiving the detection result, the camera management class sends it to the abnormal scene processing module of the camera application.
In step 1303, the abnormal scene processing module triggers the call control logic 1503, which determines to close the cooperative photographing function.
In step 1304, the abnormal scene processing module sends an instruction to close the cooperative photographing function to the function closing module, and the function closing module sends the instruction to the cooperative display management module.
In step 1305, the cooperative display management module closes the window for cooperative display on the outer screen.
In step 1306, after the window for cooperative display is closed, the cooperative display management module uses a callback function (onDisplayChanged) to report to the display management module that the window for cooperative display on the outer screen of the mobile phone 100 has been closed.
To more intuitively understand how the software modules cooperate to implement the method, the following describes an implementation process of turning on the cooperative photographing function, taking the interaction diagram shown in fig. 14 as an example.
In step 1401, the user opens a camera application.
For example, a user may open a camera application by clicking on an application icon of the camera application. Of course, the camera application may also be opened in other ways, such as by a voice command or a gesture command.
In step 1402, the camera application starts and notifies the cooperative monitoring module.
In step 1403, the cooperative monitoring module (CollaborateOnClickListener) detects that the cooperative photographing function of the camera application was in the on state when the camera application was last closed.
In step 1404, the cooperative monitoring module sends an instruction to start the cooperative photographing function to the camera management class, and the camera management class sends the instruction to the function starting module (enterCollaborateMode).
In step 1405, the function starting module sends a request to obtain the current state of the system cooperation function to the query module (isCasingState), and the query module sends the request to the display management module.
The request is used to obtain the current state of the system cooperation function.
In step 1406, the display management module returns a response including the current state of the system cooperation function to the query module, and the query module returns the response to the function starting module.
In step 1407, when the system cooperation function is in the on state, a prompt message is displayed to the user.
As shown in fig. 7 (a), the mobile phone 100 prompts the user that the system cooperation function is currently in the on state and that, if the cooperative photographing function needs to be turned on, the system cooperation function needs to be turned off first.
In step 1408, when the system cooperation function is in the off state, the cooperative photographing function is turned on.
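The query-then-open sequence of steps 1404 to 1408 can be modeled as below. The interface and class names are assumptions (the embodiments give only module names such as enterCollaborateMode); the sketch only captures that the cooperative photographing function opens when the system cooperation function is off, and that a prompt is produced otherwise.

```java
// Illustrative model of steps 1404-1408: the cooperative photographing
// function opens only after the query module reports that the system
// cooperation function is off. Names are assumptions, not the actual
// implementation.
public class CollaborateMonitor {
    // Stand-in for the query module / display management module pair.
    interface SystemStateQuery { boolean isSystemCooperationOn(); }

    private final SystemStateQuery query;
    private boolean photoCollaborationOn = false;
    private String lastPrompt = null;

    CollaborateMonitor(SystemStateQuery query) { this.query = query; }

    // Ask for the current state of the system cooperation function, then
    // either open the cooperative photographing function or record the
    // fig. 7 (a)-style prompt for the user.
    public boolean enterCollaborateMode() {
        if (query.isSystemCooperationOn()) {
            lastPrompt = "Turn off the system cooperation function first";
            return false;
        }
        photoCollaborationOn = true;
        return true;
    }

    public boolean isPhotoCollaborationOn() { return photoCollaborationOn; }
    public String getLastPrompt() { return lastPrompt; }
}
```

The same sequence also covers the button-triggered variant of fig. 15, which differs only in how the opening instruction reaches the cooperative monitoring module.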
To more intuitively understand how the software modules cooperate to implement the method, the following describes another implementation process of turning on the cooperative photographing function, taking the interaction diagram shown in fig. 15 as an example.
Step 1501: the user opens the camera application.
Step 1502: the camera application starts.
In step 1503, the cooperative monitoring module detects that the cooperative photographing function of the camera application was in the off state when the camera application was last closed.
In step 1504, the user clicks the cooperative photographing button.
For example, as shown in (d) of fig. 3, the user clicks the cooperative photographing button 02 in the off state.
After the user clicks the cooperative photographing button, the cooperative monitoring module detects an opening instruction for the cooperative photographing button and starts the cooperative photographing function according to the opening instruction.
In step 1505, the cooperative monitoring module sends the instruction to start the cooperative photographing function to the camera management class, and the camera management class sends the instruction to the function starting module.
In step 1506, the function starting module sends a request to obtain the current state of the system cooperation function to the query module (isCasingState), and the query module sends the request to the display management module.
In step 1507, the display management module returns a response including the current state of the system cooperation function to the query module, and the query module returns the response to the function starting module.
In step 1508, when the system cooperation function is in the on state, a prompt message is displayed to the user.
In step 1509, when the system cooperation function is in the off state, the cooperative photographing function is turned on.
To more intuitively understand how the software modules cooperate to implement the method, the following describes an implementation process of turning on the system cooperation function, taking the interaction diagram shown in fig. 16 as an example.
In step 1601, the user requests to start the system cooperation function.
In step 1602, the system management module (MainViewPage) sends a request for obtaining the current status of the collaborative photographing function to the display management module.
In step 1603, the display management module returns a response including the current state of the cooperative photographing function to the system management module.
In step 1604, the system management module starts the system cooperation function when the cooperation shooting function is in the off state.
In step 1605, when the cooperative photographing function is in the on state, the system management module sends information that the system cooperation function is to be turned on to the camera management class. After receiving the information, the camera management class sends it to the abnormal scene processing module (handleAbnormalSceneExitCollaborateMode) of the camera application, so that the abnormal scene processing module processes the information.
In step 1606, the abnormal scene processing module triggers the system change control logic 1505, which determines to turn off the cooperative photographing function.
In step 1607, the abnormal scene processing module sends an instruction to close the cooperative photographing function to the function closing module (exitCollaborateMode), and the function closing module sends the instruction to the cooperative display management module (CollaboratePresentation).
In step 1608, the cooperative display management module closes the window for cooperative display on the outer screen.
In step 1609, after the window for cooperative display is closed, the cooperative display management module uses a callback function (onDisplayChanged) to report to the display management module that the window for cooperative display on the outer screen of the mobile phone 100 has been closed. The display management module then sends information that the cooperative photographing function of the mobile phone 100 has been closed to the system management module.
In step 1610, the system management module turns on the system cooperation function.
In this embodiment, the electronic device may be divided into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic and is only one kind of logical function division; there may be other division manners in actual implementation.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The electronic device provided by this embodiment is configured to execute the above management method of the electronic device, and can therefore achieve the same effects as the implementation methods above. Where an integrated unit is employed, the electronic device may include a processing module, a storage module, and a communication module. The processing module may be configured to control and manage the actions of the electronic device, for example, to support the electronic device in executing the steps executed by the processing unit. The storage module may be configured to support the electronic device in storing program code, data, and the like. The communication module may be configured to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein. A processor may also be a combination of computing components, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In an embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 1.
The present embodiment also provides a computer-readable storage medium, in which computer instructions are stored, and when the computer instructions are executed on an electronic device, the electronic device executes the above related method steps to implement the method in the above embodiment.
The present embodiment also provides a computer program product, which when running on a computer, causes the computer to execute the relevant steps described above, so as to implement the method in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the method in the above method embodiments.
The electronic device, the computer-readable storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer-readable storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is only one type of logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to" determining "or" in response to detecting ". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used to distinguish between descriptions and are not intended to indicate relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A method for managing an electronic device, wherein the method is performed by a first electronic device including at least two display screens, and the method comprises:
detecting a first operation on the first electronic device, wherein the first operation requests to start a first function of the first electronic device, the first function is used for enabling a first display screen and a second display screen of the first electronic device to display the same picture, and the first display screen and the second display screen are any two display screens of the at least two display screens which are positioned on different surfaces;
responding to the first operation, and judging whether a second function of the first electronic device is currently in an on state, wherein the second function is used for projecting a current display interface of the first electronic device onto a second electronic device, and the current display interface of the first electronic device is the current display interface of the first display screen or the current display interface of the second display screen;
and when the second function is currently in an off state, starting the first function.
2. The method of claim 1, wherein detecting the first operation on the first electronic device comprises:
detecting a camera opening operation, and starting a camera application;
a first operation on the first electronic device is detected in the camera application.
3. The method according to claim 2, wherein an application interface of the camera application displays an open button of the first function, and the first operation is a click operation on the open button.
4. The method according to any one of claims 1 to 3, wherein after determining whether the second function of the first electronic device is currently in an on state, the method further comprises:
when the second function is in an on state, displaying a first interface, wherein the first interface is used for guiding a user to close the second function;
detecting an operation of turning off the second function;
and responding to the operation of closing the second function, and opening the first function after closing the second function.
5. The method of any of claims 1 to 4, wherein after the first function is turned on, the method further comprises:
detecting a second operation on the first electronic device, wherein the second operation requests to start the second function;
and responding to the second operation, and after the first function is closed, opening the second function.
6. The method of claim 5, wherein turning on the second function after turning off the first function in response to the second operation comprises:
responding to the second operation, and judging whether the first function is currently in an on state;
and when the first function is currently in the on state, opening the second function after closing the first function.
7. The method of claim 6, wherein when the first function is currently in an on state, the method further comprises:
displaying a second interface for guiding a user to confirm closing of the first function;
detecting an operation of confirming to close the first function by a user;
in response to the operation of confirming to turn off the first function, turning off the first function.
8. The method of any of claims 1 to 7, wherein when the first function is currently in an on state, the method further comprises:
and when it is detected that the battery level of the first electronic device is lower than a first threshold, closing the first function.
9. The method of any of claims 1 to 8, wherein when the first function is currently in an on state, the method further comprises:
turning off the first function when the temperature of the first electronic device is detected to be higher than a second threshold.
10. The method of claim 8 or 9, wherein said shutting down the first function comprises:
displaying a third interface, wherein the third interface is used for prompting a user whether to close the first function;
detecting a third operation on the first electronic device, wherein the third operation requests to close the first function;
in response to the third operation, turning off the first function.
11. The method of any of claims 1 to 10, wherein when the first function is currently in an on state, the method further comprises:
receiving a call request, wherein the call request is from a third electronic device;
and after receiving a call request, closing the first function.
12. The method of any of claims 1 to 11, wherein when the first function is currently in an on state, the method further comprises:
and when it is detected that the camera application of the first electronic device cannot run, closing the first function.
13. An electronic device, comprising: the display screen comprises at least two display screens, wherein a first display screen and a second display screen in the at least two display screens are any two display screens positioned on different surfaces; one or more processors; one or more memories; a module installed with a plurality of applications; the memory stores one or more programs that, when executed by the processor, cause the electronic device to perform the method of managing an electronic device of any one of claims 1 to 12.
14. A computer-readable storage medium storing computer instructions which, when run on an electronic device, cause the electronic device to perform the method of managing the electronic device according to any one of claims 1 to 12.
15. A computer program product which, when run on a computer, causes the computer to perform the method of managing an electronic device according to any one of claims 1 to 12.
CN202210013508.XA 2022-01-06 2022-01-06 Electronic equipment management method, electronic equipment and readable storage medium Active CN115567630B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210013508.XA CN115567630B (en) 2022-01-06 2022-01-06 Electronic equipment management method, electronic equipment and readable storage medium
PCT/CN2022/143843 WO2023131070A1 (en) 2022-01-06 2022-12-30 Electronic device management method, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210013508.XA CN115567630B (en) 2022-01-06 2022-01-06 Electronic equipment management method, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN115567630A true CN115567630A (en) 2023-01-03
CN115567630B CN115567630B (en) 2023-06-16

Family

ID=84736528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210013508.XA Active CN115567630B (en) 2022-01-06 2022-01-06 Electronic equipment management method, electronic equipment and readable storage medium

Country Status (2)

Country Link
CN (1) CN115567630B (en)
WO (1) WO2023131070A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117111796A (en) * 2023-04-28 2023-11-24 荣耀终端有限公司 Collaborative display method, equipment and medium
CN117555658A (en) * 2023-10-11 2024-02-13 荣耀终端有限公司 Method for managing application running and electronic equipment
CN117687586A (en) * 2023-07-17 2024-03-12 荣耀终端有限公司 Electronic device, display method, chip system and readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110242392A1 (en) * 2004-11-15 2011-10-06 Kuo-Ching Chiang Portable Image Capturing Device with Embedded Projector
CN105554537A (en) * 2015-12-08 2016-05-04 青岛海信电器股份有限公司 Control method and control device
CN107766023A (en) * 2017-10-19 2018-03-06 广东欧珀移动通信有限公司 Method for information display, device, terminal and storage medium
CN110221798A (en) * 2019-05-29 2019-09-10 华为技术有限公司 Screen projection method, system, and related apparatus
CN110381195A (en) * 2019-06-05 2019-10-25 华为技术有限公司 Screen projection display method and electronic device
CN111263005A (en) * 2020-01-21 2020-06-09 华为技术有限公司 Display method and related device for a folding screen
WO2020211735A1 (en) * 2019-04-19 2020-10-22 华为技术有限公司 Method for using enhanced function of electronic device and related apparatus
CN113590059A (en) * 2020-04-30 2021-11-02 青岛海信移动通信技术股份有限公司 Screen projection method and mobile terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005005909A (en) * 2003-06-10 2005-01-06 Sony Ericsson Mobile Communications Japan Inc Competition management program, storage medium storing a competition management program, competition management method, and electronic apparatus
CN108174042A (en) * 2018-01-23 2018-06-15 北京珠穆朗玛移动通信有限公司 Image pickup method, mobile terminal and the device of mobile terminal
CN111124342A (en) * 2019-12-27 2020-05-08 西安万像电子科技有限公司 Screen projection method and equipment
CN111324327B (en) * 2020-02-20 2022-03-25 华为技术有限公司 Screen projection method and terminal equipment



Also Published As

Publication number Publication date
WO2023131070A1 (en) 2023-07-13
CN115567630B (en) 2023-06-16

Similar Documents

Publication Publication Date Title
CN110401766B (en) Shooting method and terminal
CN110119295B (en) Display control method and related device
CN110602315B (en) Electronic device with foldable screen, display method and computer-readable storage medium
CN113810600B (en) Terminal image processing method and device and terminal equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN112492193B (en) Method and equipment for processing callback stream
CN115567630B (en) Electronic equipment management method, electronic equipment and readable storage medium
CN114125130B (en) Method for controlling communication service state, terminal device and readable storage medium
WO2020118490A1 (en) Automatic screen-splitting method, graphical user interface, and electronic device
CN113448382A (en) Multi-screen display electronic device and multi-screen display method of electronic device
CN114489533A (en) Screen projection method and device, electronic equipment and computer readable storage medium
WO2023241209A1 (en) Desktop wallpaper configuration method and apparatus, electronic device and readable storage medium
CN114115770A (en) Display control method and related device
CN114095666A (en) Photographing method, electronic device and computer-readable storage medium
CN114089932A (en) Multi-screen display method and device, terminal equipment and storage medium
CN115967851A (en) Quick photographing method, electronic device and computer readable storage medium
CN114863494A (en) Screen brightness adjusting method and device and terminal equipment
CN113535284A (en) Full-screen display method and device and electronic equipment
CN114500901A (en) Double-scene video recording method and device and electronic equipment
CN112449101A (en) Shooting method and electronic equipment
CN112532508B (en) Video communication method and video communication device
CN113923372B (en) Exposure adjusting method and related equipment
CN114257737A (en) Camera shooting mode switching method and related equipment
CN114827098A (en) Method and device for close shooting, electronic equipment and readable storage medium
CN112422814A (en) Shooting method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant