CN116048436B - Application interface display method, electronic device and storage medium - Google Patents

Application interface display method, electronic device and storage medium

Info

Publication number
CN116048436B
CN116048436B
Authority
CN
China
Prior art keywords
screen
application
display
electronic device
application interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210693404.8A
Other languages
Chinese (zh)
Other versions
CN116048436A (en)
Inventor
孔德敏
孙祺
张东
李建武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210693404.8A
Publication of CN116048436A
Application granted
Publication of CN116048436B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display, display composed of modules, e.g. video walls
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Abstract

The embodiments of the present application provide an application interface display method, an electronic device, and a storage medium, and relate to the field of terminal technologies. The method is applied to an electronic device that includes a device state management system and comprises the following steps: in response to an operation of a user on a first application, the first application sends a query instruction to the device state management system; if it is determined from the feedback information of the device state management system that the electronic device is a folding-screen electronic device, that the electronic device currently displays using a first screen, and that it is in the unfolded state, a first application interface of the first application is displayed on the first screen, where the first application interface includes a first control and the folding-screen electronic device includes the first screen and a second screen opposite to the first screen; and in response to an operation of the user on the first control, a second application interface, obtained after the first application interface is adjusted according to the display size of the second screen, is displayed on the second screen. The embodiments of the present application can use the characteristics of the hardware to provide richer camera shooting scenes.

Description

Application interface display method, electronic device and storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an application interface display method, an electronic device, and a storage medium.
Background
Electronic devices such as mobile phones and tablet computers can install applications and display the application interfaces of the installed applications on their display screens, making it convenient for users to use the functions the applications provide. To prevent an oversized display screen from reducing convenience, some manufacturers have applied flexible screens to electronic devices, producing electronic devices with folding screens. In terms of hardware, an electronic device with a folding screen includes at least two display screens and a plurality of cameras. In the prior art, the same application displays the same application interface on an electronic device without a folding screen and on an electronic device with a folding screen, and no richer functions are provided to users based on the hardware characteristics of the folding-screen electronic device.
Disclosure of Invention
In view of the foregoing, it is necessary to provide an application interface display method, an electronic device, and a storage medium that provide the user with richer camera shooting scenes based on the hardware characteristics of an electronic device with a folding screen.
In a first aspect, an embodiment of the present application provides an application interface display method applied to an electronic device, where the electronic device includes a device state management system. The method includes: in response to a first operation of a user on a first application, the first application sends a query instruction to the device state management system, where the first application is a third-party application installed on the electronic device; if it is determined, according to feedback information from the device state management system, that the electronic device is a folding-screen electronic device, that the electronic device currently displays using a first screen, and that the first screen is in the unfolded state, a first application interface of the first application is displayed on the first screen, where the first application interface includes a first control, and the folding-screen electronic device includes the first screen and a second screen opposite to the first screen; and in response to a second operation of the user on the first control, a second application interface of the first application is displayed on the second screen, where the second application interface is an interface obtained after the first application interface is adjusted according to the display size of the second screen. With this technical solution, when the electronic device has a first screen that is a folding screen and a second screen whose display direction is opposite to that of the first screen in the unfolded state, and the first screen is currently in the unfolded physical state, a hardware call control for calling the display screen is shown on the application interface displayed on the first screen, and richer camera shooting scenes can be provided to the user through this hardware call control.
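For illustration only, the following Kotlin sketch shows one way the flow of the first aspect could be organized. The interface and type names (DeviceStateManager, DeviceFeedback, and so on) are assumptions made for this sketch; the patent does not define a concrete API.

```kotlin
// Hypothetical sketch of the first-aspect flow. DeviceStateManager, DeviceFeedback,
// and the render functions are illustrative names, not APIs defined by this patent
// or by the Android SDK.

enum class Screen { FIRST, SECOND }

data class DeviceFeedback(
    val isFoldable: Boolean,   // whether the device is a folding-screen device
    val activeScreen: Screen,  // which screen the system currently displays on
    val isUnfolded: Boolean    // physical state of the folding screen
)

interface DeviceStateManager {
    fun query(): DeviceFeedback
}

class FirstApplication(private val stateManager: DeviceStateManager) {

    // Invoked in response to the user's first operation on the application.
    fun onFirstOperation() {
        val feedback = stateManager.query()   // the "query instruction" to the state manager
        val showControl = feedback.isFoldable &&
                feedback.activeScreen == Screen.FIRST &&
                feedback.isUnfolded
        renderFirstInterface(withFirstControl = showControl)
    }

    // Invoked in response to the user's second operation on the first control:
    // the interface is re-laid out for the second screen's display size.
    fun onFirstControlSelected(secondScreenWidth: Int, secondScreenHeight: Int) {
        renderSecondInterface(secondScreenWidth, secondScreenHeight)
    }

    private fun renderFirstInterface(withFirstControl: Boolean) {
        println("First interface on first screen, first control shown = $withFirstControl")
    }

    private fun renderSecondInterface(width: Int, height: Int) {
        println("Second interface on second screen, adjusted to ${width}x${height}")
    }
}
```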
In some embodiments, in response to the first operation, invoking a first camera to take a photograph; and responding to the second operation, calling a second camera to shoot, wherein the first camera and the second camera are different. Through the technical scheme, the called cameras can be switched, and richer camera shooting scenes are provided for users.
In some embodiments, the method further comprises: registering a first event in the device state management system, where the first event is used to control the device state management system to monitor the system display state of the electronic device; and, in response to a display state switching operation of the user, switching the system display state from displaying on the first screen to displaying on the second screen, where the device state management system, after detecting that the system display state changes from displaying on the first screen to displaying on the second screen, notifies the first application of the display state change. This technical solution enables the first application to monitor the system display state.
In some embodiments, the first application sets the first control to a non-selectable mode after receiving the display state change. With this technical solution, when the calling scene of the hardware call control is no longer met, the hardware call control is set to a non-selectable mode, which avoids the situation in which the user clicks the hardware call control but its corresponding function cannot be performed, and correspondingly reduces the resource consumption of the electronic device.
In some embodiments, the method further comprises: registering a second event in the device state management system, where the second event is used to control the device state management system to monitor the physical state of the first screen; and, in response to a physical state adjustment operation of the user on the first screen, changing the physical state of the first screen from the unfolded state to a non-unfolded state, where the device state management system, after detecting that the physical state of the first screen changes from the unfolded state to the non-unfolded state, notifies the first application of the physical state change. This technical solution enables the first application to monitor the physical state of the first screen.
In some embodiments, the first application sets the first control to a non-selectable mode after receiving the physical state change. With this technical solution, when the calling scene of the hardware call control is no longer met, the hardware call control is set to a non-selectable mode, which avoids the situation in which the user clicks the hardware call control but its corresponding function cannot be performed, and correspondingly reduces the resource consumption of the electronic device.
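A minimal Kotlin sketch of the two event registrations and the non-selectable-mode handling described above is given below. The listener interface, the stub state manager, and the event names are assumed for illustration and are not defined by the patent.

```kotlin
// Assumed listener-based sketch: the first application registers for the display-state
// event (first event) and the physical-state event (second event), and disables the
// first control when either change makes the calling scene invalid.

enum class DisplayState { FIRST_SCREEN, SECOND_SCREEN }
enum class PhysicalState { UNFOLDED, NOT_UNFOLDED }

interface StateListener {
    fun onDisplayStateChanged(newState: DisplayState)   // first event callback
    fun onPhysicalStateChanged(newState: PhysicalState) // second event callback
}

class StateManagerStub {
    private val listeners = mutableListOf<StateListener>()
    fun register(listener: StateListener) { listeners += listener }
    fun notifyDisplay(state: DisplayState) = listeners.forEach { it.onDisplayStateChanged(state) }
    fun notifyPhysical(state: PhysicalState) = listeners.forEach { it.onPhysicalStateChanged(state) }
}

class FirstControl {
    var selectable: Boolean = true
}

class FirstApp(stateManager: StateManagerStub) {
    private val firstControl = FirstControl()

    init {
        // Register both events with the device state management system.
        stateManager.register(object : StateListener {
            override fun onDisplayStateChanged(newState: DisplayState) {
                // Display switched away from the first screen: calling scene no longer met.
                if (newState == DisplayState.SECOND_SCREEN) firstControl.selectable = false
            }
            override fun onPhysicalStateChanged(newState: PhysicalState) {
                // First screen left the unfolded state: disable the hardware call control.
                if (newState == PhysicalState.NOT_UNFOLDED) firstControl.selectable = false
            }
        })
    }
}
```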
In some embodiments, the first application is a third-party application with a shooting function, the first control is a rear self-timer control, and, in response to a selection operation of the user on the rear self-timer control, the second application interface is displayed on the second screen and the rear camera on the same side as the second screen is called to shoot. With this technical solution, a front self-timer effect can be achieved with the rear camera (the photographed user can directly view the real-time shooting effect on the display screen), which makes shooting more convenient. Meanwhile, because the shooting capability of the rear camera is better than that of the front camera, a better shooting effect than an ordinary front self-timer can be achieved while still providing the front self-timer experience.
In some embodiments, display of the first application interface on the first screen is stopped, and a prompt is displayed on the first screen, where the prompt is used to instruct the user to turn the electronic device over. With this technical solution, the user can be prompted to flip the phone over to carry out the rear self-timer operation.
In some embodiments, the first screen includes a second control, and, in response to a selection operation of the user on the second control, screen-off processing is performed on the first screen. With this technical solution, the first screen can be turned off, correspondingly reducing the energy consumption of the electronic device.
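The following Kotlin sketch strings together the rear self-timer steps described in the last three paragraphs: display the adjusted interface on the second screen, call the rear camera on the same side, prompt the user to flip the device, and turn the first screen off. All helper objects and names are stand-ins invented for this sketch, not real platform calls.

```kotlin
// Illustrative sequence for the rear self-timer control under assumed stub helpers.

enum class ScreenId { FIRST, SECOND }
enum class CameraId { FRONT_FIRST, FRONT_SECOND, REAR }

object DisplayStub {
    fun show(screen: ScreenId, content: String) = println("[$screen] $content")
    fun turnOff(screen: ScreenId) = println("[$screen] screen off")
}

object CameraStub {
    fun open(camera: CameraId) = println("open $camera")
}

// Second operation: the user selects the rear self-timer control on the first interface.
fun onRearSelfieControlSelected() {
    DisplayStub.show(ScreenId.SECOND, "second application interface (resized for second screen)")
    CameraStub.open(CameraId.REAR)  // rear camera on the same side as the second screen
    DisplayStub.show(ScreenId.FIRST, "prompt: flip the device over")  // replaces the first interface
}

// Selecting the second control turns the first screen off to reduce energy consumption.
fun onSecondControlSelected() {
    DisplayStub.turnOff(ScreenId.FIRST)
}
```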
In some embodiments, the first application is a third-party application with a video communication function, the first control is a collaborative shooting control, and the second application interface is displayed on the second screen in response to a selection operation of the user on the collaborative shooting control. With this technical solution, the application interface can be displayed on the second screen at the same time, so that both the person shooting and the person being shot can observe the shooting effect, and a better shooting effect is obtained.
In some embodiments, the first application is a third-party application with a video communication function, the first control is a multi-directional shooting control, and, in response to a selection operation of the user on the multi-directional shooting control, a first front camera on the first screen and a second front camera on the second screen are called to shoot simultaneously; or the first front camera on the first screen and a rear camera whose shooting direction is opposite to that of the first front camera are called to shoot simultaneously. With this technical solution, at least two cameras with different shooting directions can be called during shooting, multi-path shooting is realized, and richer camera shooting scenes are provided for the user.
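As a hedged illustration of the multi-directional shooting option, the Kotlin sketch below selects one of the two camera combinations named above and starts a capture path for each. The enum values and the startMultiCapture function are hypothetical, not a real camera API.

```kotlin
// Sketch of the multi-directional shooting choice: either both front cameras, or the
// first front camera plus the rear camera facing the opposite direction.

enum class Lens { FRONT_ON_FIRST_SCREEN, FRONT_ON_SECOND_SCREEN, REAR_OPPOSITE_FIRST_FRONT }

fun startMultiCapture(lenses: List<Lens>) {
    require(lenses.size >= 2) { "multi-path shooting needs at least two cameras" }
    lenses.forEach { println("streaming from $it") }   // each camera contributes one capture path
}

fun onMultiDirectionalControlSelected(useBothFrontCameras: Boolean) {
    val lenses = if (useBothFrontCameras) {
        listOf(Lens.FRONT_ON_FIRST_SCREEN, Lens.FRONT_ON_SECOND_SCREEN)
    } else {
        listOf(Lens.FRONT_ON_FIRST_SCREEN, Lens.REAR_OPPOSITE_FIRST_FRONT)
    }
    startMultiCapture(lenses)
}
```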
In a second aspect, embodiments of the present application provide an electronic device including a memory and a processor; the memory is used for storing program instructions; the processor reads the program instructions stored in the memory to implement the application interface display method as described above.
In a third aspect, embodiments of the present application provide a computer readable storage medium having stored therein computer readable instructions that when executed by a processor implement an application interface display method as described above.
In addition, for the technical effects of the second aspect and the third aspect, reference may be made to the descriptions of the corresponding method designs above; details are not repeated here.
Drawings
Fig. 1 is a schematic product form of an electronic device with an invaginated folding screen according to an embodiment of the present application.
Fig. 2 is a schematic product form of an electronic device with an invaginated folding screen according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 4A is a schematic diagram of a folding screen in an electronic device according to an embodiment of the present application.
Fig. 4B is a schematic diagram of a geographic coordinate system according to an embodiment of the present application.
Fig. 5 is a software structural block diagram of an electronic device according to an embodiment of the present application.
Fig. 6 is a flowchart of an application interface display method according to an embodiment of the present application.
Fig. 7 is a flowchart of an application interface display method according to an embodiment of the present application.
Fig. 8 is a flowchart of an application interface display method according to an embodiment of the present application.
Fig. 9 is a flowchart of an application interface display method according to an embodiment of the present application.
Fig. 10 is a flowchart of a method for calling a function of a hardware call control according to an embodiment of the present application.
Fig. 11 is a schematic view of a scenario of an electronic device display application interface according to an embodiment of the present application.
Fig. 12 is a schematic view of a scenario of an electronic device display application interface according to an embodiment of the present application.
Fig. 13 is a schematic view of an application interface of a video communication application according to an embodiment of the present application.
Detailed Description
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In describing embodiments of the present application, words such as "exemplary", "or", and "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete fashion.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to limit the application. It should be understood that, unless otherwise indicated herein, "/" means "or"; for example, A/B may represent A or B. The term "and/or" in this application merely describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may represent: A exists alone, A and B exist simultaneously, or B exists alone. "At least one" means one or more. "Plurality" means two or more. For example, "at least one of a, b, or c" may represent any of seven cases: a; b; c; a and b; a and c; b and c; a, b and c. It will be appreciated that the order of the steps shown in the flowcharts herein may be changed and some steps may be omitted.
Electronic devices such as mobile phones and tablet computers can install applications and display the application interfaces of the installed applications on their display screens, making it convenient for users to use the functions the applications provide. To prevent an oversized display screen from reducing convenience, some manufacturers have applied flexible screens to electronic devices, producing electronic devices with folding screens. In terms of hardware, an electronic device with a folding screen includes at least two display screens and a plurality of cameras. In the prior art, the same application displays the same application interface on an electronic device without a folding screen and on an electronic device with a folding screen, so the functions provided to users are the same; richer functions cannot be provided based on the hardware characteristics of the folding-screen electronic device (such as the at least two display screens and the plurality of cameras), and richer shooting scenes cannot be provided to users.
To solve the above technical problem that an application cannot provide users with richer functions based on the hardware characteristics of an electronic device with a folding screen, the embodiments of the present application provide an application interface display method. When applied to an electronic device with a folding screen, the method can provide richer functions for the user based on the hardware characteristics of that device.
The folding screen of the electronic device can be folded to form at least two screens. For example, the folding screen may be folded along a folding edge or folding axis to form a first screen and a second screen. Folding screens on electronic devices can be divided into two types by folding manner: one type folds outward (referred to as an outward folding screen for short), and the other type folds inward (referred to as an inward folding screen for short). Take as an example a folding screen that can be folded to form a first screen and a second screen. After an outward folding screen is folded, the display direction of the first screen is opposite to the display direction of the second screen. After an inward folding screen is folded, the first screen and the second screen face each other.
In this embodiment of the present application, an electronic device having a folding screen is described by taking an inward folding screen as an example (it can be understood that this embodiment of the present application may also be implemented on an electronic device having an outward folding screen); that is, after folding, the display direction of the first screen faces the display direction of the second screen, and a front camera (hereinafter referred to as a first front camera) is disposed on the inward folding screen. For example, please refer to fig. 1, which illustrates a schematic product form of an electronic device with an inward folding screen according to an embodiment of the present application. In fig. 1, (a) is a schematic view of the inward folding screen in the fully unfolded form. The inward folding screen may be folded along the folding edge in the directions 11a and 11b shown in (a) of fig. 1 to form the half-folded form with the A screen and the B screen shown in (b) of fig. 1. After the inward folding screen is folded and divided into an A screen and a B screen, the first front camera may be disposed on the A screen or the B screen. Starting from the A screen and the B screen shown in (b) of fig. 1, the inward folding screen may continue to be folded along the folding edge in the directions 12a and 12b shown in (b) of fig. 1 to form the fully folded form shown in (c) of fig. 1. As shown in (c) of fig. 1, after the inward folding screen of the electronic device is fully folded, the A screen and the B screen face each other and are not visible to the user.
For example, a C screen (i.e., a third screen) may be disposed on the back of the B screen (i.e., the second screen) shown in (b) of fig. 1; the C screen is not shown in (b) of fig. 1. After the inward folding screen is fully folded, the B screen faces the A screen, the C screen is visible to the user, and the display direction of the C screen is opposite to the shooting direction of the rear camera on the electronic device. It can be appreciated that, for an electronic device with such an inward folding screen, when the inward folding screen is fully folded, the display direction of the C screen is consistent with the shooting direction that the rear camera on the electronic device has when the inward folding screen is unfolded.
In fig. 1, the folding screen of the electronic device is folded longitudinally, that is, the folding screen is folded into a left screen and a right screen (i.e., an A screen and a B screen) along the longitudinal folding edge on the folding screen.
In this embodiment of the present application, the inward folding screen of the electronic device may also be folded laterally, that is, folded into an upper screen and a lower screen (i.e., an A screen and a B screen) along a lateral folding edge on the inward folding screen. For example, as shown in (a) of fig. 2, the inward folding screen is in the unfolded form; folding it along the lateral folding edge of the folding screen forms the form shown in (b) of fig. 2, and continuing to fold forms the form shown in (c) of fig. 2.
It should be noted that the back side of the inward folding screen of the electronic device according to the embodiment of the present application further includes a display screen, which may be referred to as a third screen. The third screen may be disposed on the back of the first screen or of the second screen formed by folding the inward folding screen. For example, in (c) of fig. 2, a C screen (third screen) may be disposed on the back of the B screen (i.e., the second screen); the C screen is not shown in (c) of fig. 2. The B screen faces the A screen, the C screen faces in the direction opposite to the B screen and is visible to the user, and after the inward folding screen is completely folded, the display direction of the C screen is opposite to the shooting direction of the rear camera on the electronic device.
When the inward folding screen is completely folded, the display direction of the third screen is opposite to the shooting direction of the rear camera on the electronic device, and a front camera (hereinafter referred to as a second front camera) is disposed on the third screen. The electronic device with the inward folding screen in the embodiment of the present application may therefore include at least three cameras, for example, a first front camera, a second front camera, and a rear camera.
In the embodiment of the present application, the included angle α between the A screen and the B screen of the inward folding screen of the electronic device is in the range [0°, 180°]. If α ∈ [0°, P1], the electronic device can determine that the inward folding screen is in the fully folded form (folded state for short); if α ∈ (P1, P2), the electronic device can determine that the inward folding screen is in the half-folded form (half-folded state for short); if α ∈ [P2, 180°], the electronic device can determine that the inward folding screen is in the fully unfolded form (unfolded state for short). Here, 0° < P1 < P2 < 180°. P1 and P2 are preset angle thresholds; they can be determined according to the usage habits of a large number of users of inward folding screens, or they may be set by the user in the electronic device.
In some embodiments, according to the usage habits of most users, when the included angle α between the A screen and the B screen is greater than 100°, the user most likely wants to use the A screen and the B screen as a whole (i.e., as one complete display screen). When the included angle α between the A screen and the B screen is smaller than 80°, the user most likely wants to use the C screen alone and is unlikely to use the A screen and the B screen, and the inward folding screen can be determined to be in the folded state. When the included angle α between the A screen and the B screen is between 80° and 100°, the user is likely to want to use the A screen and the B screen, and the inward folding screen can be determined to be in the half-folded state.
Therefore, in the embodiment of the present application, the preset angle threshold P1 may be in the range (0°, 80°], and the preset angle threshold P2 may be in the range [100°, 180°). For example, the preset angle threshold P1 may be 75° and the preset angle threshold P2 may be 105°. The above examples are merely intended to explain the embodiments of the present application and are not to be construed as limiting.
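A small worked example of the state classification above, written in Kotlin and assuming the example thresholds P1 = 75° and P2 = 105° mentioned in the text:

```kotlin
// Classify the fold state from the included angle α between the A screen and the B screen.
// P1 and P2 default to the example values given in the text; they are thresholds, not fixed.

enum class FoldState { FOLDED, HALF_FOLDED, UNFOLDED }

fun classifyFoldState(alphaDegrees: Double, p1: Double = 75.0, p2: Double = 105.0): FoldState =
    when {
        alphaDegrees <= p1 -> FoldState.FOLDED        // α ∈ [0°, P1]
        alphaDegrees < p2  -> FoldState.HALF_FOLDED   // α ∈ (P1, P2)
        else               -> FoldState.UNFOLDED      // α ∈ [P2, 180°]
    }

fun main() {
    println(classifyFoldState(10.0))    // FOLDED
    println(classifyFoldState(90.0))    // HALF_FOLDED
    println(classifyFoldState(160.0))   // UNFOLDED
}
```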
It should be noted that the at least two screens formed after the inward folding screen according to the embodiment of the present application is folded or unfolded may be multiple independently existing screens, or may be one complete screen of integral structure that is merely folded into at least two portions.
For example, the inward folding screen may be a flexible folding screen that includes a folding edge made of a flexible material; part or all of the flexible folding screen is made of flexible material. The at least two screens formed after the flexible folding screen is folded are one complete screen of integral structure that is merely folded into at least two sections.
For another example, the inward folding screen may be a multi-screen folding screen. The multi-screen folding screen may include multiple (two or more) screens. The plurality of screens is a plurality of individual display screens. The plurality of screens may be connected in turn by a folding shaft. Each screen can rotate around a folding shaft connected with the screen, so that the folding of the multi-screen folding screen is realized.
In the following embodiments of the present application, an example in which the folded-in folding screen is a flexible folding screen that can be folded transversely will be described.
By way of example, the electronic device in embodiments of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device. The embodiment of the application does not particularly limit the specific form of the electronic device.
The implementation of the embodiments of the present application will be described in detail below with reference to the accompanying drawings. Fig. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. As shown in fig. 3, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identification module (SIM) card interface 195, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). The I2S interface may be used for audio communication.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied on the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 implements display functions through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1. The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like. The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM). In the embodiment of the present application, the internal memory 121 may also be referred to as a memory. The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. For example, in an embodiment of the present application, the processor 110 may include a storage program area and a storage data area by executing instructions stored in the internal memory 121, and the internal memory 121 may include a storage program area and a storage data area.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "microphone" or "microphone", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can sound near the microphone 170C through the mouth, inputting a sound signal to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates made of conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
In the embodiment of the present application, if the electronic device 100 has an invaginated folding screen and the invaginated folding screen is foldable to form a plurality of screens, a gyro sensor 180B may be included in each screen for measuring the orientation (i.e., the directional vector of the orientation) of the corresponding screen. The electronic device 100 may determine the included angle between adjacent screens based on the measured angular change in the orientation of each screen. For example, as shown in fig. 1 (B), after the folded-in folding screen is folded, a first screen and a second screen are formed, and gyro sensors 180B are disposed in the first screen and the second screen, so that the orientations of the first screen and the second screen can be measured, respectively. The electronic device 100 determines an included angle between the first screen and the second screen according to the measured change of the orientation angle of each screen.
For example, the inward folding screen is folded to form a first screen (the A screen shown in fig. 4A), in which a gyro sensor A is disposed, and a second screen (the B screen shown in fig. 4A), in which a gyro sensor B is disposed. The principle by which gyro sensor A measures the orientation of the A screen (i.e., the direction vector of its orientation), gyro sensor B measures the orientation of the B screen (i.e., the direction vector of its orientation), and the electronic device 100 calculates the included angle α between the A screen and the B screen from the orientation of the A screen and the orientation of the B screen is briefly explained here.
The coordinate system of the gyro sensor is a geographic coordinate system. As shown in fig. 4B, the origin O of the geographic coordinate system is located at the point where the carrier (i.e., the device containing the gyro sensor, such as the electronic device 100) is located, the x-axis points east along the local line of latitude, the y-axis points north along the local meridian, and the z-axis points upward along the local geographic vertical and forms a right-handed rectangular coordinate system with the x-axis and the y-axis.
The electronic device 100 can measure the direction vector of the orientation of each screen in the coordinate system of the gyro sensor disposed in it, using the gyro sensor 180B disposed in each screen. For example, referring to the side view of the electronic device 100 shown in fig. 4A, the electronic device 100 measures that the direction vector of the orientation of the A screen in the coordinate system of gyro sensor A is a vector z1, and that the direction vector of the orientation of the B screen in the coordinate system of gyro sensor B is a vector z2. The electronic device 100 can calculate the angle θ between the vector z1 and the vector z2 from the two vectors.
As can be seen from fig. 4A, since the vector z1 is perpendicular to the A screen and the vector z2 is perpendicular to the B screen, the included angle between the A screen and the B screen is α = 180° − θ. That is, the electronic device 100 can determine the included angle α between the A screen and the B screen from the measured direction vector of the A screen in the coordinate system of gyro sensor A (i.e., the vector z1) and the measured direction vector of the B screen in the coordinate system of gyro sensor B (i.e., the vector z2).
Although the positions of the gyro sensors disposed in the A screen and the B screen do not coincide, that is, the origins of the two gyro sensors' coordinate systems do not coincide, the x-, y-, and z-axes of the two coordinate systems are parallel, so the coordinate systems of the gyro sensors disposed in the A screen and the B screen can be considered parallel. Thus, although the vector z1 and the vector z2 are not in the same coordinate system, the corresponding axes of the two coordinate systems are parallel, so the angle θ between the vector z1 and the vector z2 can still be calculated from the vector z1 and the vector z2.
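As a numerical illustration of the relation α = 180° − θ, the Kotlin sketch below computes θ from the dot product of two sample orientation vectors z1 and z2 and then derives α. The vector values are made up for the example.

```kotlin
import kotlin.math.acos
import kotlin.math.sqrt

// Angle in degrees between two 3D vectors, obtained from the dot product.
fun angleBetween(z1: DoubleArray, z2: DoubleArray): Double {
    val dot = z1[0] * z2[0] + z1[1] * z2[1] + z1[2] * z2[2]
    val norm1 = sqrt(z1[0] * z1[0] + z1[1] * z1[1] + z1[2] * z1[2])
    val norm2 = sqrt(z2[0] * z2[0] + z2[1] * z2[1] + z2[2] * z2[2])
    return Math.toDegrees(acos(dot / (norm1 * norm2)))   // θ in degrees
}

fun main() {
    val z1 = doubleArrayOf(0.0, 0.0, 1.0)        // sample normal of the A screen
    val z2 = doubleArrayOf(0.7071, 0.0, 0.7071)  // sample normal of the B screen (about 45° away)
    val theta = angleBetween(z1, z2)
    val alpha = 180.0 - theta                    // included angle between the A screen and the B screen
    println("theta = $theta, alpha = $alpha")    // theta ≈ 45°, alpha ≈ 135°
}
```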
In some embodiments, the included angle α between the A screen and the B screen may also be measured in cooperation with one or more other sensors.
For example, an acceleration sensor 180E may be disposed in each screen of the inward folding screen. The electronic device 100 (e.g., the processor 110) may use the acceleration sensors to measure the motion acceleration of each screen as it is rotated, and then calculate, from the measured motion acceleration, the angle through which one screen has rotated relative to the other, i.e., the included angle α between the A screen and the B screen.
In other embodiments, the gyro sensor 180B may be a virtual gyro sensor formed by combining a plurality of other sensors. The virtual gyroscope sensor can be used for calculating the included angle between adjacent screens of the inward folding screen, namely the included angle alpha between the screen A and the screen B.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip cover or protective case. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set characteristics such as automatic unlocking on flip-open according to the detected open or closed state of the case or the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to identify the gesture of the electronic device 100, and may be used in applications such as horizontal-vertical screen switching, pedometers, and the like.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is for detecting temperature.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
In this embodiment, the electronic device 100 may detect a touch operation input by a user on the touch screen through the touch sensor 180K, and collect one or more of a touch position, a touch area, a touch direction, a touch time and the like of the touch operation on the touch screen. In some embodiments, the electronic device 100 may determine the touch location of a touch operation on the touch screen by combining the touch sensor 180K and the pressure sensor 180A.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the vibrating bone mass of a human vocal part. The bone conduction sensor 180M may also contact the pulse of the human body to receive the blood pressure pulsation signal.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to achieve contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. A plurality of cards may be inserted into the same SIM card interface 195 at the same time. The plurality of cards may be of the same type or of different types. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The methods in the following embodiments may be implemented in the electronic device 100 having the above-described hardware structure.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiments of the present application, an Android™ system with a layered architecture is taken as an example to illustrate the software architecture of the electronic device 100. The layered architecture may divide the software into several layers, each layer having clear roles and a division of work. The layers communicate with each other through software interfaces. In some embodiments, the Android™ system is divided into four layers, namely an application layer, an application framework layer, a system library, and a kernel layer from top to bottom.
Referring to fig. 5, fig. 5 is a software architecture block diagram of an electronic device 100 provided by an exemplary embodiment of the present application.
The application layer (which may be abbreviated as application layer) may comprise a series of applications. For example, the application packages may include applications such as lock screens, gallery, calendar, map, desktop, bluetooth, music, text messages, beauty shows, weChat, etc.
The application framework layer provides an application programming interface (Application Programming Interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. For example, the application framework layer may include an information processing module, a window manager (windowmanager service), a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The information processing module may be configured to receive an instruction sent by the application layer, and send the received instruction to the system library. The window manager is used for managing window programs. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc. The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The telephony manager is used to provide the communication functions of the electronic device 100. The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like. The notification manager allows the application to display notification information in a status bar.
The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, capture the screen, and the like. Each time an application runs an Activity, the window manager may create a corresponding application window. The window manager periodically refreshes the display content and window parameters (e.g., size, location, etc. of the application window) in the application window. In addition, the window manager may create a WindowState corresponding to each application window. The window manager marks the application window with the WindowState, and uses the WindowState to store, query, and control the state of the application window.
For example, the window manager may query the WindowState of window 1 to determine whether window 1 is in a full-screen state, and if not, the window manager may further query the WindowState of window 1 for parameters such as the aspect ratio (e.g., 16:9 or 4:3).
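For ease of understanding, an illustrative Java sketch of this WindowState bookkeeping is given below. The real window manager is far more involved; the class, field, and method names here are assumptions of this description, not the framework's API.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: one WindowState per application window, stored and queried by
// a simplified window manager. Names and fields are assumptions for illustration only.
public final class WindowManagerSketch {

    static final class WindowState {
        boolean fullScreen;
        double aspectRatio; // e.g. 16.0 / 9.0 or 4.0 / 3.0

        WindowState(boolean fullScreen, double aspectRatio) {
            this.fullScreen = fullScreen;
            this.aspectRatio = aspectRatio;
        }
    }

    // One WindowState per application window, keyed by a window id.
    private final Map<String, WindowState> states = new HashMap<>();

    void onActivityStarted(String windowId, boolean fullScreen, double aspectRatio) {
        states.put(windowId, new WindowState(fullScreen, aspectRatio));
    }

    // The query described above: is the window full screen, and if not, which aspect
    // ratio should be used? Returns a negative value when no ratio applies.
    double queryAspectRatioIfNotFullScreen(String windowId) {
        WindowState s = states.get(windowId);
        if (s == null || s.fullScreen) {
            return -1.0;
        }
        return s.aspectRatio;
    }
}
```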
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. Such as a device state management system (Device State Manager Service), an angle calculation module, a surface manager (surface manager), a Media library (Media Libraries), a three-dimensional graphics processing library (e.g., openGL ES), a two-dimensional graphics engine (e.g., SGL), etc.
The device state management system may be configured to receive underlying sensor data (e.g., hinge angle data, distance sensor data, etc.), determine a current device state of the electronic device (e.g., a folded state, a semi-folded state, an unfolded state, etc.), and determine, according to the device state of the electronic device, whether to use the first screen or the second screen for display. The device state management system can determine, based on the device information of the electronic device, whether the electronic device has an inward folding screen, and can also be used for determining the current system display state of the electronic device and switching the system display state of the electronic device. The angle calculation module is used for determining the form of the display screen. When the electronic device is an electronic device having a folding screen, the angle calculation module may obtain angle data from a sensor driver of the kernel layer, calculate an opening angle of the folding screen, and determine a form of the folding screen, such as a folded state, a semi-folded state, or an unfolded state, according to the opening angle. The device state management system can query the form of the folding screen from the angle calculation module, and can determine the physical folding and unfolding changes of the folding screen of the electronic device based on the data transmitted by the angle calculation module. The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The three-dimensional graphics processing library is used for implementing three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
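As an illustration of the angle calculation module's decision, the following Java sketch classifies the opening angle into the three forms described above. The threshold values are assumptions chosen for illustration only and are not taken from the patent.

```java
// Illustrative sketch of the angle calculation module: classify the folding screen's
// form from the opening angle. Thresholds are assumptions for illustration only.
public final class AngleCalculationSketch {

    enum FoldForm { FOLDED, HALF_FOLDED, UNFOLDED }

    static FoldForm classify(double openingAngleDegrees) {
        if (openingAngleDegrees < 30) {         // assumed threshold for the folded state
            return FoldForm.FOLDED;
        } else if (openingAngleDegrees < 150) { // assumed threshold for the semi-folded state
            return FoldForm.HALF_FOLDED;
        } else {
            return FoldForm.UNFOLDED;
        }
    }
}
```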
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver. The display driver is configured to receive notifications from the device state management system and may control the display on the electronic device based on those notifications, the display including a desktop display, an application interface display, and the like. The sensor driver is used to drive the sensors to detect data. When the acceleration sensor 180E and the gyro sensor 180B detect angle data, a corresponding hardware interrupt is issued to the kernel layer. The sensor driver of the kernel layer sends the angle data to the angle calculation module in the system library, and the angle calculation module calculates the physical form change of the electronic device and can be used for determining the physical form change of the inward folding screen of the electronic device. The angle calculation module can send the determined physical form change of the inward folding screen of the electronic device to the device state management system, and the device state management system can notify the window manager to change the resolution of the current display screen according to the physical form change of the inward folding screen. For example, when the physical form of the inward folding screen changes from the semi-folded state to the unfolded state, the device state management system notifies the window manager that the resolution of the current display screen changes from 1920×1080 to 3840×2160, that is, the effective display area size of the current display screen changes to 3840×2160. Subsequently, when an application in the application layer calls the window manager to create a corresponding application window, the window manager can set window parameters such as the size and the position of the application window according to the updated screen size, so that the opened application can adapt to folding screens in different physical forms.
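The following Java sketch illustrates the notification chain just described: the angle calculation module reports a form change, the device state management system tells the window manager to update the effective display size, and later window creation uses the new size. The example resolutions (1920×1080 and 3840×2160) follow the text; the interfaces themselves are assumptions of this description.

```java
// Illustrative sketch of the fold-form to display-size notification chain.
// Interface names are assumptions, not framework APIs.
public final class DisplayStateChainSketch {

    interface WindowManagerLike {
        void onDisplaySizeChanged(int width, int height);
    }

    static final class DeviceStateManagerSketch {
        private final WindowManagerLike windowManager;

        DeviceStateManagerSketch(WindowManagerLike windowManager) {
            this.windowManager = windowManager;
        }

        // Called by the angle calculation module when the physical form changes.
        void onFoldFormChanged(boolean unfolded) {
            if (unfolded) {
                windowManager.onDisplaySizeChanged(3840, 2160); // unfolded: full inner screen
            } else {
                windowManager.onDisplaySizeChanged(1920, 1080); // semi-folded: reduced area
            }
        }
    }
}
```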
It will be appreciated that the layers and components contained in the layers in the software structure shown in fig. 5 do not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer layers than shown, and more or fewer components may be included in each layer, as the present application is not limited.
The application interface display method applied to the electronic device provided in the embodiment of the present application is described below with a specific example based on the software architecture diagram shown in fig. 5. Fig. 6 is a flowchart of an application interface display method according to an embodiment of the present application. As shown in fig. 6, the electronic device includes an application layer and a device state management system, and the method includes:
The application layer receives 601 a first operation performed by a user on an application.
The application may be any third-party application installed in the mobile phone, such as an instant messaging application (e.g., WeChat), an image processing application (e.g., a beauty show), and the like, and resides in the application layer. A third-party application is an application that is not pre-installed when the electronic device leaves the factory, and includes applications that are installed and run by downloading an application package. The first operation includes a start operation for an application. The user may perform the start operation of opening the application from a desktop, a negative-one-screen menu, a pull-up menu, a pull-down menu, or any shortcut menu. For example, the user may perform a first operation on a beauty show icon on the desktop to open the beauty show. The first operation further includes an operation of opening a function of the application, the function including a function related to shooting, such as a video function, a video recording function, a photographing function, and the like. The user may perform a first operation on a function of the application at an application interface of the application to open the function. For example, the user may perform a first operation on a function icon, which is a video call, in a chat interface of WeChat to open the video call.
The application layer sends 602 a device type query instruction to the device state management system.
In response to a first operation of a user on a three-party application in the application layer, the three-party application sends a device type query instruction to the device state management system. The device state management system is provided with an interface for three-party applications to query. Through this interface, data interaction between the three-party application and the device state management system can be implemented. If the device state management system does not have such an interface, the three-party application cannot communicate with the device state management system directly. The three-party application can obtain related information of the electronic device through the interface. For example, the three-party application sends a device type query instruction through a query interface preset on the device state management system, and obtains the device type information fed back by the device state management system through the query interface. It can be appreciated that the interaction between the three-party application and the device state management system can be performed through a query interface preset on the device state management system.
The device type query instruction is used for querying device information of the electronic device, where the device information may include display screen information. The display screen information may be used to indicate information related to the display screens, such as indicating whether the electronic device includes multiple display screens, indicating whether the electronic device includes an inward folding screen, indicating a positional relationship between the display screens when the electronic device has multiple display screens, and so on.

603, the device state management system sends the first feedback information to the application layer.
In some embodiments of the present application, after receiving the device type query instruction, the device state management system generates first feedback information based on the device information of the electronic device, and sends the generated first feedback information to the application layer, where the first feedback information may include the device information. Whether the electronic device is a folding screen electronic device can be determined according to the first feedback information. The folding screen electronic device comprises a first screen and a second screen arranged opposite to the first screen. The first screen may be an inward folding screen. In some embodiments of the present application, the display direction of the second screen may be opposite to the display direction of the first screen in the unfolded state.
604, the application layer determines whether the electronic device is a folding screen electronic device according to the first feedback information. If it is determined that the electronic device is not a folding screen electronic device, the flow of the application interface display method ends. In some embodiments of the present application, if it is determined that the electronic device is not a folding screen electronic device, an application interface of the application may be displayed according to a conventional display manner. If it is determined that the electronic device is a folding screen electronic device, 605 is executed to send a system display state query instruction to the device state management system.
The device state management system sends 606 the second feedback information to the application layer.
In some embodiments of the present application, after receiving the system display state query instruction, the device state management system generates second feedback information based on display information of the electronic device, and sends the generated second feedback information to the application layer, where the second feedback information is used to indicate the system display state of the electronic device, and the system display state includes a first screen display and a second screen display. The display information is used for indicating the current system display state of the electronic device. The system display state indicates which display screen the electronic device is currently using for display: the first screen display indicates that the first screen is being used for display, and the second screen display indicates that the second screen is being used for display.
And 607, the application layer determines whether the system display state of the electronic device is the first screen display according to the second feedback information.
If the system display state of the electronic equipment is not the first screen display, ending the flow of the application interface display method. In some embodiments of the present application, if it is determined that the system display state of the electronic device is not the first screen display, an application interface of the application may be displayed in a conventional display manner.
If it is determined that the system display status of the electronic device is the first screen display, then 608 is performed to send a physical status query instruction to the device status management system. The physical state query instruction is configured to query the physical state of the first screen, where the physical state may include an unfolded state, a semi-folded state, and a folded state, and the relevant description of the physical state of the folded screen may be referred to above, and is not repeated herein. In the embodiments of the present application, the semi-folded state and the folded state may be collectively referred to as a non-unfolded state. The physical state of the first screen may be used as a physical state of the electronic device, for example, when the physical state of the first screen is an expanded state, it is determined that the electronic device is in the expanded state; the electronic device is determined to be in a non-expanded state when the physical state of the first screen is in the non-expanded state.
609, the device state management system sends the third feedback information to the application layer.
In some embodiments of the present application, after receiving a physical state query instruction, the device state management system generates third feedback information based on sensor information of the electronic device, and sends the generated third feedback information to the application layer, where the third feedback information is used to indicate a physical state of the first screen of the electronic device, the physical state includes a non-expanded state and an expanded state, and the sensor information includes acceleration sensor information and gyroscope sensor information. In some embodiments of the present application, as shown in fig. 5, the angle calculating module may obtain sensor information (such as acceleration sensor information and gyro sensor information) uploaded by the sensor driving module, and determine a physical state of a folding screen of the electronic device (a physical state of the first screen) according to the uploaded sensor information. The angle calculation module sends the determined physical state of the first screen of the electronic device to the device state management system. The device state management system generates third feedback information based on the physical state of the electronic device folding screen.
610, the application layer determines whether the current physical state of the first screen is the expanded state based on the third feedback information.
And if the current physical state of the first screen is not the unfolded state, ending the flow of the application interface display method. In some embodiments of the present application, if it is determined that the current physical state of the first screen is not the expanded state, an application interface of the application may be displayed in a conventional display manner.
If it is determined that the current physical state of the first screen is the expanded state, 611 is executed, and an application interface of the application is displayed on the first screen, where the application interface includes a hardware call control. The hardware call control can be used for calling a display screen of the electronic device and has the function of calling the display screen. The name of the hardware call control is used for illustration only; the name itself does not carry any meaning and does not constitute any limitation, and in some embodiments of the present application it may also be referred to as a first control. After the application layer loads the application interface of the application, a first application interface corresponding to the application is displayed on the first screen of the electronic device. After the user clicks the hardware call control, a second application interface corresponding to the application is displayed on the second screen. The second application interface is an interface obtained by adjusting the first application interface according to the display size of the second screen. The elements presented by the second application interface may be the same as the elements presented by the first application interface, e.g., both present the picture captured by the camera and both present the same controls. It will be appreciated that the positions at which the same elements appear on the first screen and the second screen may differ due to the different display sizes of the first screen and the second screen.
The elements presented by the second application interface may also differ from the elements presented by the first application interface. In some embodiments of the present application, the second application interface further includes a restore control, where the restore control is configured to restore the display scene that the electronic device had before the hardware call control was clicked, for example, displaying the first application interface of the application on the first screen of the electronic device. In response to the user clicking the restore control, the second application interface stops being displayed on the second screen, and the first application interface is displayed on the first screen. By setting the restore control on the second application interface, the display scene before the user clicked the hardware call control can be restored, which makes it convenient for the user to select a suitable mode to use the electronic device and improves the ease of control of the application interface display method.
The hardware call control can also be used for calling a camera of the electronic equipment, and has the function of calling the camera. In some embodiments of the present application, in response to a first operation of a user, invoking a first camera to perform shooting; and when responding to a second operation of the user, invoking a second camera to shoot. The shooting directions of the first camera and the second camera are opposite. It may be understood that the shooting directions of the first camera and the second camera are opposite to each other, which means that when the physical state of the first screen is the unfolded state, the shooting directions of the first camera and the second camera are opposite to each other.
It will be appreciated that the hardware call control included on the application interface is not displayed on an electronic device that does not have a folding screen. That is, when the application interface display method provided by this embodiment is applied both to an electronic device without an inward folding screen and to an electronic device whose first screen is an inward folding screen and whose second screen has a display direction opposite to that of the first screen in the unfolded state, the displayed application interfaces are different: the hardware call control is displayed on the application interface only on the electronic device whose first screen is an inward folding screen, whose second screen has a display direction opposite to that of the first screen in the unfolded state, and whose first screen is currently in the unfolded state.
According to the application interface display method applied to the electronic device provided in the embodiments of the present application, when the electronic device has a first screen that is an inward folding screen and a second screen whose display direction is opposite to that of the first screen in the unfolded state, and the current physical state of the first screen is the unfolded state, a hardware call control capable of invoking a display screen is displayed on the application interface shown on the first screen, and richer shooting scenarios are provided for the user through the hardware call control.
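As an illustration of the query-and-decide flow of fig. 6 as seen from the application layer, the following Java sketch is provided. The DeviceStateClient interface stands in for the query interface exposed by the device state management system; all names are assumptions of this description.

```java
// Illustrative sketch of the fig. 6 flow: query device type, system display state, and
// physical state, and only then display the interface with the hardware call control.
public final class InterfaceDisplayFlowSketch {

    interface DeviceStateClient {
        boolean isFoldingScreenDevice(); // 602/603: device type query and first feedback
        boolean isFirstScreenDisplay();  // 605/606: system display state query and second feedback
        boolean isFirstScreenUnfolded(); // 608/609: physical state query and third feedback
    }

    interface Ui {
        void showConventionalInterface();
        void showInterfaceWithHardwareCallControl(); // 611
    }

    static void onFirstOperation(DeviceStateClient client, Ui ui) {
        if (!client.isFoldingScreenDevice()
                || !client.isFirstScreenDisplay()
                || !client.isFirstScreenUnfolded()) {
            ui.showConventionalInterface(); // any failed check ends the flow with conventional display
            return;
        }
        ui.showInterfaceWithHardwareCallControl();
    }
}
```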
In some embodiments of the present application, as shown in fig. 7, the method further includes: the application layer registers 701 a first event on the device state management system. The first event is used for controlling the device state management system to monitor the system display state of the electronic device, for example determining whether the display state is switched from the first screen display to the second screen display, or from the second screen display to the first screen display. When the device state management system detects that the system display state of the electronic device changes (such as switching from the first screen display to the second screen display, or from the second screen display to the first screen display), the device state management system notifies the application layer of the change of the system display state of the electronic device, and an application running in the application layer can receive the notification of the change of the system display state. The user can adjust the system display state of the electronic device by performing a display state switching operation on the electronic device. The display state switching operation includes operations for adjusting display settings of the electronic device, such as clicking a system display state switching control. After the user performs the display state switching operation, the system display state of the electronic device is switched from the first screen display to the second screen display, or from the second screen display to the first screen display.
The application layer registers 702 a second event on the device state management system. The second event is used for controlling the device state management system to monitor the physical state of the first screen and to determine whether the physical state of the first screen changes, for example whether the physical state of the first screen changes from a non-unfolded state to an unfolded state, or from an unfolded state to a non-unfolded state, and the like. When the device state management system detects that the physical state of the electronic device changes, the device state management system notifies the application layer of the change of the physical state of the electronic device (such as changing from a non-unfolded state to an unfolded state, or from an unfolded state to a non-unfolded state), and a running application in the application layer can receive the notification of the physical state change of the first screen. The user may adjust the physical state of the first screen by performing a physical state adjustment operation on the first screen.
701 and 702 may be executed at the same time or at different times; the embodiments of the present application do not limit the execution timing of 701 and 702. For example, 701 may be executed after 611 in fig. 6. According to the above technical solution, by monitoring the system display state of the electronic device and the physical folding and unfolding changes of the first screen, such changes can be detected in a timely manner.
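For ease of understanding, a minimal Java sketch of the registration in 701 and 702 is given below. The listener and manager interfaces are assumptions of this description standing in for the device state management system's actual interface.

```java
// Illustrative sketch of 701/702: register the first event (system display state changes)
// and the second event (physical state changes of the first screen). Names are assumptions.
public final class StateEventRegistrationSketch {

    interface DisplayStateListener { void onDisplayStateChanged(boolean firstScreenDisplay); }

    interface PhysicalStateListener { void onPhysicalStateChanged(boolean unfolded); }

    interface DeviceStateManager {
        void registerDisplayStateListener(DisplayStateListener listener);   // "first event"
        void registerPhysicalStateListener(PhysicalStateListener listener); // "second event"
    }

    // The execution order of the two registrations is not limited, as noted above.
    static void registerEvents(DeviceStateManager manager) {
        manager.registerDisplayStateListener(firstScreenDisplay ->
                System.out.println("display state changed, first screen display = " + firstScreenDisplay));
        manager.registerPhysicalStateListener(unfolded ->
                System.out.println("first screen physical state changed, unfolded = " + unfolded));
    }
}
```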
In some embodiments of the present application, when the application layer receives a change in the system display state of the electronic device (for example, the system display is switched from the first screen display to the second screen display) and determines that the current system display state of the electronic device is not the first screen display, or when the application layer receives a change in the physical state of the first screen (for example, a change from the expanded state to a non-expanded state) and determines that the physical state of the first screen is not the expanded state, the hardware call control included on the application interface is set to a non-selectable mode. If the user clicks a hardware call control set to the non-selectable mode, the function corresponding to the hardware call control cannot be invoked. The non-selectable modes may include an icon grayed-out mode and an icon hidden mode. According to this embodiment, when the system display state of the electronic device is not the first screen display or when the physical state of the first screen is not the expanded state, the hardware call control included on the application interface is set to the non-selectable mode in a timely manner, which avoids the situation in which the user clicks the hardware call control when the system display state is not the first screen display or when the physical state of the first screen is not the expanded state but the corresponding function cannot be realized, and accordingly reduces the resource loss of the electronic device.
In some embodiments of the present application, when the user clicks a hardware call control set to the non-selectable mode, a corresponding prompt is displayed on the current display interface according to a preset prompt rule, where the prompt is used to prompt the user to set the system display state of the electronic device to the first screen display and to change the physical state of the first screen to the expanded state. The application layer registers the first event and the second event on the device state management system; for descriptions of the first event and the second event, reference may be made to the above. When the application layer determines that the system display state of the electronic device is the first screen display and the physical state of the first screen is the expanded state, the non-selectable mode of the hardware call control is canceled. At this time, the user can trigger the function of the hardware call control by clicking it. According to this embodiment, when the user clicks a hardware call control set to the non-selectable mode, the application scenario conditions corresponding to the hardware call control are presented, which makes it convenient for the user to adjust to those conditions.
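The following Java sketch illustrates the selectable/non-selectable gating and the prompt on click just described. The ControlView interface and the prompt text are assumptions used only for illustration and do not name any framework API.

```java
// Illustrative sketch: toggle the hardware call control between selectable and
// non-selectable modes and prompt the user when it is tapped while non-selectable.
public final class HardwareCallControlGateSketch {

    interface ControlView {
        void setSelectable(boolean selectable); // e.g. gray out or hide the icon
        void showPrompt(String message);
    }

    private boolean firstScreenDisplay = true;
    private boolean firstScreenUnfolded = true;
    private final ControlView control;

    HardwareCallControlGateSketch(ControlView control) {
        this.control = control;
    }

    void onDisplayStateChanged(boolean firstScreenDisplay) {
        this.firstScreenDisplay = firstScreenDisplay;
        refresh();
    }

    void onPhysicalStateChanged(boolean unfolded) {
        this.firstScreenUnfolded = unfolded;
        refresh();
    }

    private void refresh() {
        // Selectable only when both application scenario conditions are satisfied.
        control.setSelectable(firstScreenDisplay && firstScreenUnfolded);
    }

    void onControlClicked() {
        if (!(firstScreenDisplay && firstScreenUnfolded)) {
            control.showPrompt("Switch to the first screen display and unfold the first screen to use this feature.");
        }
    }
}
```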
Fig. 8 is a flowchart of an application interface display method according to an embodiment of the present application. As shown in fig. 8, following step 607 in fig. 6, the method further includes: if it is determined that the system display state of the electronic device is not the first screen display, executing 801, in which the application layer loads an application interface of the application, where the application interface includes a hardware call control set to the non-selectable mode. After the application layer loads the application interface of the application, the application interface of the application is displayed on the folding screen of the electronic device.
For a description of the non-selectable mode, reference may be made to the description of the related modes above, and the description thereof will not be repeated here.
The application layer registers the first event and the second event on the device state management system 802.
For a description of the first event and the second event, reference may be made to the description of fig. 7, which is not repeated here.
803, when the application layer determines that the system display state of the electronic device is the first screen display and the first screen is in the expanded state, the non-selectable mode of the hardware call control is canceled.
In some embodiments of the present application, when a user clicks a hardware call control set to a non-selectable mode, a corresponding prompt is displayed on a current display interface according to a preset prompt rule, where the prompt is used to prompt the user to set a system display state of the electronic device to a first screen display state and change a physical state of the first screen to an expanded state. Reference may be made to the related descriptions above for some specific implementations of this embodiment, and no further description is given here.
According to the above technical solution, when the system display state of the electronic device is not the first screen display, a hardware call control set to the non-selectable mode is displayed on the application interface of the application; the system display state of the electronic device and the changes in the physical state of the first screen are monitored, and the non-selectable mode of the hardware call control is canceled when the application scenario conditions corresponding to the hardware call control are satisfied, which makes it convenient for the user to use the hardware call control to execute the corresponding function.
Fig. 9 is a flowchart of an application interface display method according to an embodiment of the present application. As shown in fig. 9, after 610 in fig. 6, the method further includes: if it is determined that the physical state of the first screen of the electronic device is not the expanded state, 901, displaying an application interface of the application on the first screen, where the application interface includes a hardware call control set to the non-selectable mode. After the application layer loads the application interface of the application, the application interface of the application is displayed on the first screen of the electronic device.
For a description of the non-selectable mode, reference may be made to the description of the related modes above, and the description thereof will not be repeated here.
The application layer registers the first event and the second event on the device state management system 902.
For a description of the first event and the second event, reference may be made to the description of fig. 7, which is not repeated here.
903, when the application layer determines that the system display state of the electronic device is the first screen display and the first screen is in the expanded state, canceling the non-selectable mode of the hardware call control.
In some embodiments of the present application, when a user clicks a hardware call control set to a non-selectable mode, a corresponding prompt is displayed on a current display interface according to a preset prompt rule, where the prompt is used to prompt the user to set a system display state of the electronic device to a first screen display state and change a physical state of the first screen to an expanded state. Reference may be made to the related descriptions above for some specific implementations of this embodiment, and no further description is given here.
According to the embodiment, when the application layer determines that the physical state of the first screen of the electronic device is not the unfolded state, the hardware call control set to the unselected mode is displayed on the application interface of the application, the system display state of the electronic device and the physical state change of the first screen are monitored, and when the application scene condition corresponding to the hardware call control is met, the unselected mode of the hardware call control is canceled, so that a user can conveniently use the hardware call control to execute a corresponding function.
In some embodiments of the present application, when the user clicks a three-party application including a shooting function, the hardware call control displayed on the application interface of the three-party application is a rear self-timer control, and the rear self-timer control is used for calling a rear camera in a self-timer scene, so that self-timer shooting is implemented by using the rear camera. Fig. 10 is a flowchart of a method for invoking a function of a hardware call control according to an embodiment of the present application. As shown in fig. 10, the method includes:
101, when the user clicks the rear self-timer control, the application layer sends a system display state switching instruction to the device state management system, where the system display state switching instruction is used to switch the system display state to the second screen display.
After receiving the system display state switching instruction, the device state management system switches the system display state of the electronic device from the first screen display to the second screen display. After the system display state is switched, the application interface of the application changes from being displayed on the first screen to being displayed on the second screen. Before the user clicks the rear self-timer control, the first application interface of the application is displayed on the first screen; after the user clicks the rear self-timer control, the second application interface of the application is displayed on the second screen. For descriptions of the first application interface and the second application interface, reference may be made to the description of fig. 6, which is not repeated herein. In some embodiments of the present application, the device state management system may notify the application layer of the switching result of the system display state of the electronic device.
In some embodiments of the present application, when the application layer determines that the system display state of the electronic device has switched from the first screen display to the second screen display, the display of the first application interface on the first screen may be stopped, and a flip phone prompt is displayed on the first screen, where the flip phone prompt is used to prompt the user to flip the electronic device. For example, the flip phone prompt may be: please flip the phone and shoot with the rear camera.
In some embodiments of the present application, when the application layer determines that the system display state of the electronic device has switched from the first screen display to the second screen display, a screen-off control is displayed on the first screen. The screen-off control may be displayed together with the flip phone prompt or displayed separately. When the user clicks the screen-off control, the first screen is turned off.
102, the application layer stops calling the front-facing camera on the first screen, and calls the rear-facing camera on the same side as the second screen to shoot.
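For ease of understanding, the following Java sketch strings together the rear self-timer flow of 101 and 102: switching the display to the second screen, stopping the front camera, invoking the rear camera, and offering the flip prompt and screen-off control on the first screen. All interfaces and the prompt text are assumptions standing in for the device state management system and camera control.

```java
// Illustrative sketch of the rear self-timer flow (101/102 above). Names are assumptions.
public final class RearSelfTimerFlowSketch {

    interface DeviceStateManager { void switchToSecondScreenDisplay(); }

    interface CameraController { void stopFrontCamera(); void startRearCamera(); }

    interface FirstScreenUi {
        void stopShowingApplicationInterface();
        void showFlipPrompt(String message);
        void showScreenOffControl(Runnable onClick);
        void turnScreenOff();
    }

    static void onRearSelfTimerClicked(DeviceStateManager dsm, CameraController cam, FirstScreenUi ui) {
        dsm.switchToSecondScreenDisplay();                    // 101: system display state switching instruction
        ui.stopShowingApplicationInterface();                 // first screen no longer shows the interface
        ui.showFlipPrompt("Please flip the phone and shoot with the rear camera.");
        ui.showScreenOffControl(ui::turnScreenOff);           // clicking the control turns the first screen off
        cam.stopFrontCamera();                                // 102: stop the front camera on the first screen
        cam.startRearCamera();                                // invoke the rear camera on the second-screen side
    }
}
```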
Fig. 11 is a schematic view of a scenario in which an electronic device displays an application interface according to an embodiment of the present application. Before the user clicks the rear self-timer control in the application interface of the three-party application, the application interface of the application is displayed on the first screen, which is an inward folding screen, as shown in (a) of fig. 11; at this time, the system display state is the first screen display, the application interface of the application is displayed on the first screen, and the camera invoked is the front camera on the first screen. After the user clicks the rear self-timer control, the system display state of the electronic device is switched from the first screen display to the second screen display, and the application interface stops being displayed on the first screen. As shown in (b) of fig. 11, the application interface of the application stops being displayed on the first screen, and a flip phone prompt and a screen-off control (here, the screen-off control is the cancel control shown in the figure) are generated on the first screen. When the user clicks the cancel control, the first screen may be turned off, as shown in (c) of fig. 11, and nothing is displayed on the first screen. After the user clicks the rear self-timer control, the application interface of the application is displayed on the second screen, as shown in (d) of fig. 11; at this time, the system display state is the second screen display, the application interface of the application is displayed on the second screen, and the camera invoked here is the rear camera on the same side as the second screen when the physical state of the first screen is the expanded state.
The above embodiment can be applied to shooting scenes such as live broadcast and self-timer shooting. Through this embodiment, a front self-timer effect can be achieved with the rear camera (the photographed user can directly view the real-time shooting effect on the display screen), which facilitates shooting for the user. Meanwhile, because the shooting capability of the rear camera is better than that of the front camera, a better shooting effect than ordinary front-camera self-timer shooting can be achieved while the front self-timer experience is preserved.
In some embodiments of the present application, when a user clicks a three-way application including a video communication function, a hardware call control displayed on an application interface of the three-way application is a collaborative shooting control, where the collaborative shooting control is used to display the same application interface on two display screens with opposite display directions in a shooting process.
When the user clicks the collaborative shooting control, the application layer sends a multi-screen display instruction to the equipment state management system, wherein the multi-screen display instruction is used for informing the equipment state management system that the application interface displayed on the first screen is displayed on the second screen.
In some embodiments of the present application, after receiving the multi-screen display instruction, the device state management system controls to display the application interface displayed on the first screen on the second screen. It will be appreciated that the first screen displays a first application interface of the application while the second screen displays a second application interface of the application. For a description of the first application interface and the second application interface, reference is made to the description above.
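The following Java sketch illustrates the collaborative shooting path described above: the application layer sends a multi-screen display instruction and the first-screen interface is also presented on the second screen, adjusted to that screen's display size. The interface names are assumptions of this description.

```java
// Illustrative sketch of the collaborative shooting path and the multi-screen display
// instruction. Names are assumptions, not framework APIs.
public final class CollaborativeShootingSketch {

    interface DeviceStateManager {
        // Multi-screen display instruction: also show the first screen's interface on the second screen.
        void requestMultiScreenDisplay();
    }

    interface InterfaceRenderer {
        void showFirstApplicationInterfaceOnFirstScreen();
        void showSecondApplicationInterfaceOnSecondScreen(); // adjusted to the second screen's display size
    }

    static void onCollaborativeShootingClicked(DeviceStateManager dsm, InterfaceRenderer renderer) {
        dsm.requestMultiScreenDisplay();
        renderer.showFirstApplicationInterfaceOnFirstScreen();
        renderer.showSecondApplicationInterfaceOnSecondScreen();
    }
}
```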
Fig. 12 is a schematic view of a scenario in which an electronic device displays an application interface according to an embodiment of the present application. Before the user clicks the collaborative shooting control in the application interface of the three-party application, the application interface of the application is displayed on the first screen, which is an inward folding screen, as shown in (a) of fig. 12; at this time, the system display state is the first screen display, nothing is displayed on the second screen, and the camera invoked may be the front camera on the first screen. The scene at this time is: the shooting party uses the front camera on the first screen to shoot the shot object, the shot object is in the display direction of the first screen, the shooting party is in the display direction of the second screen, and the application interface is displayed on the first screen, so the shooting party cannot watch the shooting effect, which leads to situations in which the shot object is not within the viewfinder frame when being shot, and the shooting effect is poor. In another embodiment, the rear camera on the same side as the second screen is invoked, and the scene at this time is: the shooting party shoots the shot object using the rear camera of the electronic device, the shot object is in the display direction of the second screen, the shooting party is in the display direction of the first screen, and the application interface is displayed on the first screen. After the user clicks the collaborative shooting control, the application interface of the application may be displayed on the first screen and the second screen at the same time, as shown in (b) of fig. 12, with the first screen displaying the first user interface and the second screen displaying the second user interface.
The embodiment can be applied to shooting scenes such as video call, live broadcast, video recording, shooting and the like. Through the embodiment, the application interfaces of the application can be displayed on the first screen and the second screen simultaneously, so that shooting effects can be observed by both a shooting object and a shot object conveniently, and better shooting effects can be obtained.
In some embodiments of the present application, when the user clicks a three-party application including a photographing function, the hardware call control displayed on the application interface of the three-party application is a multi-directional shooting control. The multi-directional shooting control is used for entering a multi-directional shooting mode, in which at least two cameras with different shooting directions can be invoked during shooting. For example, when the user clicks the multi-directional shooting control, a first front camera on the first screen, which is an inward folding screen, and a second front camera on the second screen are invoked at the same time; or the first front camera on the first screen and a rear camera on the same side as the second screen are invoked at the same time, where the shooting direction of the rear camera is opposite to that of the first front camera. By invoking two cameras with opposite shooting directions, multi-path shooting can be realized, and the shot scenes are enriched.
In some embodiments of the present application, the camera that needs to be invoked may be determined according to the camera corresponding to the multidirectional shooting mode preset by the user. Or after the user clicks the hardware call control, the camera information may be displayed on the current application interface, where the camera information may include a position (such as a first screen and a second screen) of the camera in the electronic device, and may also include photographic parameter information of the camera. After receiving a selection instruction triggered by a user on the camera information, determining a camera selected by the user, and calling the camera selected by the user as a camera required to be used in the shooting process.
In some embodiments of the present application, after the electronic device starts the multi-directional shooting mode, the application layer sends a multi-screen display instruction to the device state management system, where the multi-screen display instruction is used to control to display an application interface displayed on the first screen on the second screen, for example, the first screen displays a first application interface, and the second screen displays a second application interface. When at least two cameras with different shooting directions are called to shoot, application interfaces of applications are displayed on display screens (a first screen and a second screen) in the two shooting directions at the same time, so that a user in the first screen display direction and a user in the second screen display direction can both watch the application interfaces, and shot scenes are enriched.
In some embodiments of the present application, displaying the application interface of the application on the first screen and the second screen simultaneously includes: if a face is detected in the video image acquired by the second front camera or the rear camera, displaying the first application interface on the first screen and displaying the second application interface on the second screen. Considering that if no face can be detected in the video image acquired by the second front camera or the rear camera, it indicates that the user located within the image acquisition range of that camera is far away from the electronic device, has temporarily left, or is not looking at the electronic device but is looking elsewhere, that is, the user is temporarily not watching the display content on the second screen; therefore, in order to reduce the energy consumption of the electronic device, the application interface is not displayed on the second screen.
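The following Java sketch illustrates this energy-saving rule: the second application interface is shown on the second screen only when a face is detected in frames from the second front camera or the rear camera. The face detector and the other interfaces are assumptions used purely for illustration.

```java
// Illustrative sketch: gate the second-screen display on face detection to save power.
public final class SecondScreenGateSketch {

    interface FaceDetector { boolean hasFace(byte[] frame); }

    interface SecondScreenUi { void showSecondApplicationInterface(); void hide(); }

    private final FaceDetector detector;
    private final SecondScreenUi secondScreen;

    SecondScreenGateSketch(FaceDetector detector, SecondScreenUi secondScreen) {
        this.detector = detector;
        this.secondScreen = secondScreen;
    }

    // Called for each preview frame from the second front camera or the rear camera.
    void onPreviewFrame(byte[] frame) {
        if (detector.hasFace(frame)) {
            secondScreen.showSecondApplicationInterface();
        } else {
            secondScreen.hide(); // the user is not watching the second screen: save energy
        }
    }
}
```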
Fig. 13 is a schematic view of an application interface of a video communication application according to an embodiment of the present application. During a video call, after the user clicks the multi-directional shooting control in the application interface of the three-party application and the first front camera on the first screen and the second front camera on the second screen are invoked, the application interface of the three-party application is displayed on the first screen, as shown in (a) of fig. 13, where the A display area shows an image shot by the first front camera, the B display area shows an image shot by the second front camera, and the C display area shows an image shot by a camera of another electronic device in a video call with the electronic device. In some embodiments of the present application, after the user clicks the multi-directional shooting control, the application interface of the three-party application is displayed on the first screen and the second screen simultaneously, as shown in (b) of fig. 13, with the second application interface displayed on the second screen at the same time.
In other embodiments of the present application, after the user clicks the multi-directional shooting control and the first front camera on the first screen and the rear camera whose shooting direction is opposite to that of the first front camera are invoked, the application interface displayed on the first screen is as shown in (c) of fig. 13, where the A display area shows an image shot by the first front camera, the B display area shows an image shot by the rear camera, and the C display area shows an image shot by a camera of another electronic device in a video call with the electronic device. As shown by the B display area in (a) of fig. 13 and the B display area in (c) of fig. 13, the shooting directions of the rear camera and the second front camera are the same, and shooting with the rear camera can capture a wider shooting area than shooting with the second front camera. In some embodiments of the present application, after the user clicks the multi-directional shooting control, the application interface of the three-party application is displayed on the first screen and the second screen simultaneously, as shown in (d) of fig. 13, with the second application interface displayed on the second screen at the same time.
The embodiment can be applied to shooting scenes such as video call, live broadcast, video recording, shooting and the like. Through the embodiment, at least two cameras with different shooting directions can be called to shoot in the shooting process, multi-path shooting is realized, and richer camera shooting scenes are provided for users.
The present embodiment also provides a computer storage medium, in which computer instructions are stored, which when executed on an electronic device, cause the electronic device to execute the above-mentioned related method steps to implement the application interface display method in the above-mentioned embodiments.
The present embodiment also provides a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the application interface display method in the above-mentioned embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer-executed instructions, and when the device is operated, the processor can execute the computer-executed instructions stored in the memory, so that the chip executes the application interface display method in the method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division of the modules or units is merely a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and a part displayed as a unit may be one physical unit or a plurality of physical units, which may be located in one place or distributed across a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or another medium capable of storing program code.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that the technical solutions of the present application may still be modified or equivalently substituted without departing from the spirit and scope of the technical solutions of the present application.

Claims (12)

1. An application interface display method applied to a first device, wherein the first device comprises a device state management system, the method comprising:
in response to a first operation of a user on a video call icon in a first application, starting a video call, and sending a query instruction to the device state management system through the first application, wherein the first application is a third-party application that is installed on the first device and comprises a video communication function; if it is determined, according to feedback information from the device state management system, that the first device is a folding screen electronic device, that the first device is currently displaying using a first screen, and that the first device is in an unfolded state, displaying a first application interface of the first application on the first screen, wherein the first application interface comprises a first control, and the folding screen electronic device comprises the first screen and a second screen arranged opposite to the first screen;
and in response to a second operation of the user on the first control, simultaneously invoking a first camera and a second camera to shoot, and displaying a second application interface of the first application on the second screen, wherein the second application interface is an interface obtained by adjusting the first application interface according to the display size of the second screen, the shooting directions of the first camera and the second camera are opposite, and the second application interface comprises the control displayed on the first application interface, the content shot by the first camera, the content shot by the second camera, and the content shot by a second device that is in a video call with the first device.
2. The application interface display method according to claim 1, characterized in that the method further comprises:
registering a first event in the device state management system, wherein the first event is used for controlling the device state management system to monitor a system display state of the first device;
and in response to a display state switching operation of the user, switching the system display state from displaying on the first screen to displaying on the second screen, wherein, after the device state management system detects that the system display state changes from displaying on the first screen to displaying on the second screen, the device state management system notifies the first application of the display state change.
3. The application interface display method according to claim 2, wherein the first control is set to a non-selectable mode after the first application receives the display state change.
4. The application interface display method according to claim 1, characterized in that the method further comprises:
registering a second event in the device state management system, wherein the second event is used for controlling the device state management system to monitor a physical state of the first screen;
and in response to a physical state adjustment operation of the user on the first screen, changing the physical state of the first screen from an unfolded state to a non-unfolded state, wherein, after the device state management system detects that the physical state of the first screen changes from the unfolded state to the non-unfolded state, the device state management system notifies the first application of the physical state change.
5. The application interface display method of claim 4, wherein the first control is set to a non-selectable mode after the first application receives the physical state change.
6. The application interface display method according to claim 1, wherein the first application is a third-party application comprising a shooting function, the first control is a rear-camera selfie control, and, in response to a selection operation of the user on the rear-camera selfie control, the second application interface is displayed on the second screen and a rear camera on the same side as the second screen is invoked to shoot.
7. The application interface display method according to claim 6, wherein the display of the first application interface on the first screen is stopped, and a prompt message instructing the user to turn the first device over is displayed on the first screen.
8. The application interface display method according to claim 6, wherein the first screen includes a second control, and the first screen is subjected to screen-off processing in response to a selection operation of a user on the second control.
9. The application interface display method according to claim 1, wherein the first control is a collaborative shooting control, and the second application interface is displayed on the second screen in response to a selection operation of a user on the collaborative shooting control.
10. The application interface display method according to claim 1, wherein the first control is a multi-directional shooting control, and, in response to a selection operation of the user on the multi-directional shooting control, a first front-facing camera on the first screen and a second front-facing camera on the second screen are invoked to shoot simultaneously; or the first front-facing camera on the first screen and a rear camera whose shooting direction is opposite to that of the first front-facing camera are invoked to shoot simultaneously.
11. An electronic device comprising a memory and a processor;
the memory is used for storing program instructions;
the processor is configured to read the program instructions stored in the memory to implement the application interface display method according to any one of claims 1 to 10.
12. A computer readable storage medium having stored therein computer readable instructions which when executed by a processor implement the application interface display method of any one of claims 1 to 10.
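By way of a non-limiting illustration of the display-state and fold-state monitoring referred to in claims 2 to 5, the sketch below shows how an application could observe fold-posture changes through Android's public Jetpack WindowManager library (androidx.window). This library is assumed here merely as a publicly documented stand-in for the device state management system recited in the claims; the function name observeFoldState is hypothetical.

```kotlin
// Assumed dependency: androidx.window:window (e.g. 1.2.0) plus lifecycle-runtime-ktx.
import androidx.activity.ComponentActivity
import androidx.lifecycle.lifecycleScope
import androidx.window.layout.FoldingFeature
import androidx.window.layout.WindowInfoTracker
import kotlinx.coroutines.launch

// Observe fold-posture changes and report them to the application, roughly analogous
// to the first application being notified by the device state management system.
fun observeFoldState(activity: ComponentActivity, onChange: (FoldingFeature.State?) -> Unit) {
    activity.lifecycleScope.launch {
        WindowInfoTracker.getOrCreate(activity)
            .windowLayoutInfo(activity)
            .collect { layoutInfo ->
                val fold = layoutInfo.displayFeatures
                    .filterIsInstance<FoldingFeature>()
                    .firstOrNull()
                // fold?.state is FLAT or HALF_OPENED; null when no fold is currently reported.
                onChange(fold?.state)
            }
    }
}
```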
CN202210693404.8A 2022-06-17 2022-06-17 Application interface display method, electronic device and storage medium Active CN116048436B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210693404.8A CN116048436B (en) 2022-06-17 2022-06-17 Application interface display method, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN116048436A CN116048436A (en) 2023-05-02
CN116048436B true CN116048436B (en) 2024-03-08

Family

ID=86113889

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210693404.8A Active CN116048436B (en) 2022-06-17 2022-06-17 Application interface display method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN116048436B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116820229A (en) * 2023-05-17 2023-09-29 荣耀终端有限公司 XR space display method, XR equipment, electronic equipment and storage medium

Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1929538A (en) * 2005-09-09 2007-03-14 Lg电子株式会社 Image capturing and displaying method and system
CN203574726U (en) * 2013-08-22 2014-04-30 深圳富泰宏精密工业有限公司 Portable electronic device
WO2016052814A1 (en) * 2014-09-30 2016-04-07 Lg Electronics Inc. Mobile terminal and method of controlling the same
CN105657239A (en) * 2015-04-27 2016-06-08 宇龙计算机通信科技(深圳)有限公司 Image processing method and device
CN106657460A (en) * 2016-11-17 2017-05-10 上海斐讯数据通信技术有限公司 Method and device for adopting rear camera for selfie
US9686497B1 (en) * 2015-10-29 2017-06-20 Crater Group Co. Video annotation and dynamic video call display for multi-camera devices
CN106933620A (en) * 2017-02-14 2017-07-07 珠海市魅族科技有限公司 A kind of filming control method and system
CN107317993A (en) * 2017-08-08 2017-11-03 维沃移动通信有限公司 A kind of video call method and mobile terminal
CN107528938A (en) * 2017-07-26 2017-12-29 维沃移动通信有限公司 A kind of video call method, terminal and computer-readable recording medium
CN107613196A (en) * 2017-09-05 2018-01-19 珠海格力电器股份有限公司 A kind of self-timer method and its device, electronic equipment
CN108540596A (en) * 2018-03-08 2018-09-14 华勤通讯技术有限公司 A kind of terminal device
CN108781271A (en) * 2016-02-02 2018-11-09 三星电子株式会社 Method and apparatus for providing images serve
CN110278298A (en) * 2019-06-19 2019-09-24 上海摩软通讯技术有限公司 A kind of folding terminal device, Folding display method and device
CN110401766A (en) * 2019-05-22 2019-11-01 华为技术有限公司 A kind of image pickup method and terminal
WO2019227752A1 (en) * 2018-05-28 2019-12-05 华为技术有限公司 Method for operating electronic device and electronic device
CN110839094A (en) * 2018-08-17 2020-02-25 西安中兴新软件有限责任公司 Terminal, information processing method and computer readable storage medium
CN111124561A (en) * 2019-11-08 2020-05-08 华为技术有限公司 Display method applied to electronic equipment with folding screen and electronic equipment
CN111770273A (en) * 2020-06-29 2020-10-13 维沃移动通信有限公司 Image shooting method and device, electronic equipment and readable storage medium
CN112136100A (en) * 2018-05-28 2020-12-25 华为技术有限公司 Shooting method and electronic equipment
WO2021013134A1 (en) * 2019-07-22 2021-01-28 华为技术有限公司 Method for controlling lifting of camera and electronic device
CN112689990A (en) * 2018-11-09 2021-04-20 深圳市柔宇科技股份有限公司 Photographing control method, electronic device, and computer-readable storage medium
CN112799522A (en) * 2021-02-24 2021-05-14 北京小米移动软件有限公司 Mobile terminal control method and device, mobile terminal and storage medium
CN112822427A (en) * 2020-12-30 2021-05-18 维沃移动通信有限公司 Video image display control method and device and electronic equipment
KR20210102010A (en) * 2020-02-10 2021-08-19 삼성전자주식회사 Method and apparatus for operating of electronic device having flexible display
CN113553013A (en) * 2020-04-23 2021-10-26 北京小米移动软件有限公司 Data transmission method and device and multi-screen terminal equipment
CN113824878A (en) * 2021-08-20 2021-12-21 荣耀终端有限公司 Shooting control method based on foldable screen and electronic equipment
CN114040144A (en) * 2021-12-01 2022-02-11 展讯通信(天津)有限公司 Video call method and electronic equipment
CN114401373A (en) * 2022-03-24 2022-04-26 荣耀终端有限公司 Method for displaying on two screens simultaneously, electronic equipment and readable storage medium
CN114401340A (en) * 2021-12-31 2022-04-26 荣耀终端有限公司 Collaborative shooting method, electronic device and medium thereof
WO2022105803A1 (en) * 2020-11-20 2022-05-27 华为技术有限公司 Camera calling method and system, and electronic device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140072927A (en) * 2012-11-15 2014-06-16 엘지전자 주식회사 Mobile terminal and method of controlling the same
KR102072509B1 (en) * 2013-06-03 2020-02-04 삼성전자주식회사 Group recording method, machine-readable storage medium and electronic device
KR20160013748A (en) * 2014-07-28 2016-02-05 엘지전자 주식회사 Protable electronic device and control method thereof
KR102377277B1 (en) * 2015-02-27 2022-03-23 삼성전자주식회사 Method and apparatus for supporting communication in electronic device
KR20200140609A (en) * 2019-06-07 2020-12-16 삼성전자주식회사 Foldable electronic device and method for displaying information in the foldable electronic device
CN114567744A (en) * 2019-06-10 2022-05-31 聚好看科技股份有限公司 Video call interface switching method on smart television

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Wu, H.Y. et al. User-defined gestures for dual-screen mobile interaction. International Journal of Human-Computer Interaction, 2020, full text. *
Control method and apparatus for multi-display electronic devices; Liu Shubiao; China Plant Engineering (Issue 13); full text *
Design of one-key video chat machine software with remote control; Duan Shuo; China Master's Theses Full-text Database, Information Science and Technology Series; full text *

Also Published As

Publication number Publication date
CN116048436A (en) 2023-05-02

Similar Documents

Publication Publication Date Title
EP4084450B1 (en) Display method for foldable screen, and related apparatus
US11669242B2 (en) Screenshot method and electronic device
WO2020168965A1 (en) Method for controlling electronic device having folding screen, and electronic device
WO2020259452A1 (en) Full-screen display method for mobile terminal, and apparatus
CN110536004B (en) Method for applying multiple sensors to electronic equipment with flexible screen and electronic equipment
WO2020000448A1 (en) Flexible screen display method and terminal
WO2021155808A1 (en) Display control method and electronic device
WO2021104008A1 (en) Method for displaying folding screen and related apparatus
WO2021052279A1 (en) Foldable screen display method and electronic device
WO2021213164A1 (en) Application interface interaction method, electronic device, and computer readable storage medium
WO2021036770A1 (en) Split-screen processing method and terminal device
CN110798568B (en) Display control method of electronic equipment with folding screen and electronic equipment
CN111182614B (en) Method and device for establishing network connection and electronic equipment
US20220174143A1 (en) Message notification method and electronic device
CN112351156B (en) Lens switching method and device
WO2022001258A1 (en) Multi-screen display method and apparatus, terminal device, and storage medium
WO2021208723A1 (en) Full-screen display method and apparatus, and electronic device
WO2023103951A1 (en) Display method for foldable screen and related apparatus
CN114257671B (en) Image display method and electronic equipment
US20240094858A1 (en) Window display method and electronic device
CN116048436B (en) Application interface display method, electronic device and storage medium
CN116048358A (en) Method and related device for controlling suspension ball
CN116723256A (en) Display method of electronic equipment with folding screen
CN110737916A (en) Communication terminal and processing method
CN115941836B (en) Interface display method, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant