CN116048686A - Display method and folding screen device

Display method and folding screen device

Info

Publication number
CN116048686A
Authority
CN
China
Prior art keywords
screen
interface
display
folding
application
Prior art date
Legal status
Granted
Application number
CN202211038392.1A
Other languages
Chinese (zh)
Other versions
CN116048686B (en)
Inventor
孔德敏
孙祺
李建武
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211038392.1A (granted as CN116048686B)
Priority to CN202311464896.4A (published as CN117992159A)
Publication of CN116048686A
Application granted
Publication of CN116048686B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

An embodiment of the present application provides a display method and a folding screen device. The method is applied to a folding screen device that includes a first screen and a second screen, where the first screen is a folding screen. The folding screen device displays a first interface of a target application on the first screen; when the first screen is in the unfolded state, a first control is displayed on the first interface. In response to an operation on the first control, the folding screen device continues to display the first interface on the first screen and displays a second interface of the target application on the second screen. The second interface is related to the function indicated by the first control, or to a secondary-screen co-display function provided by the target application. In this way, the two display screens of the folding screen device can simultaneously display different interfaces of the same application, which enriches the display functions and application scenarios of the folding screen device and improves the user experience.

Description

Display method and folding screen device
Technical Field
The present application relates to the technical field of intelligent terminals, and in particular to a display method and a folding screen device.
Background
With the continued development of electronic devices and display screens and the growing demands of daily life, electronic devices with folding screens have emerged, allowing users to fold or unfold the display screen and thereby meeting users' needs for different screen sizes.
Most folding screen devices are provided with other display screens in addition to the folding screen, so as to meet the needs of different users. How to enrich the display functions of the folding screen device and improve the user experience is therefore a problem to be solved.
Disclosure of Invention
To solve this technical problem, an embodiment of the present application provides a display method and a folding screen device. In the method, the two display screens of the folding screen device can simultaneously display different interfaces of the same application, which enriches the display functions and application scenarios of the folding screen device and improves the user experience.
In a first aspect, an embodiment of the present application provides a display method. The method is applied to a folding screen device, wherein the folding screen device comprises a first screen and a second screen, and the first screen is a folding screen. Wherein the method comprises the following steps:
the folding screen device displays a first interface of a target application on the first screen; the first screen is in the unfolded state, and a first control is displayed on the first interface;
in response to an operation on the first control, the folding screen device continues to display the first interface on the first screen and displays a second interface of the target application on the second screen; the second interface is related to the function indicated by the first control, or the second interface is related to a secondary-screen co-display function provided by the target application.
The target application may be any application, including a system application and a third party application.
The first control is used to invoke the second screen to display an interface. For example, the first control may be a secondary-screen collaboration control, a control for turning on the secondary screen, or the like.
In this way, the two display screens of the folding screen device can simultaneously display different interfaces of the same application, which enriches the display functions and application scenarios of the folding screen device and improves the user experience.
According to the first aspect, displaying a first control on the first interface includes: if the target application queries that the folding screen device currently uses the first screen to display the interface of the target application and the first screen is in the unfolded state, displaying the first control on the first interface.
In this way, when the folding screen device currently uses the folding inner screen for interface display and the inner screen is in the unfolded state, a first control for invoking the outer-screen display is displayed on the inner screen, so that richer display functions and application scenarios are provided to the user through the first control.
According to the first aspect, or any implementation manner of the first aspect, displaying a first control on the first interface includes: if the target application queries that the folding screen device currently uses the first screen to display the interface of the target application, the first screen is in the unfolded state, and it is determined that a preset condition is currently met, displaying the first control on the first interface.
The preset condition relates to a function provided by the target application, for example, a geofence condition and the like.
In this way, the display of the first control depends on the display state and fold posture of the folding screen device as well as other preset conditions, so that a third-party application can decide, according to its own functions, when to display the first control, in a manner that better matches the specific application scenario.
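The following is a minimal Java sketch of the logic described above; the types FoldState, DeviceStateQuery and Screen, the preset-condition flag, and all method names are illustrative assumptions rather than the patent's or any platform's actual API.

    // Hedged sketch: all types and method names here are assumptions for illustration.
    public class FirstControlExample {

        enum FoldState { FOLDED, HALF_FOLDED, UNFOLDED }

        interface DeviceStateQuery {
            boolean isDisplayingOnFirstScreen();  // is the target application shown on the inner screen?
            FoldState foldState();                // current posture of the first screen
        }

        interface Screen {
            void show(String interfaceName);
        }

        private final DeviceStateQuery query;
        private final Screen firstScreen;
        private final Screen secondScreen;
        private boolean presetConditionMet;       // e.g. an app-specific geofence condition

        FirstControlExample(DeviceStateQuery query, Screen firstScreen, Screen secondScreen) {
            this.query = query;
            this.firstScreen = firstScreen;
            this.secondScreen = secondScreen;
        }

        void onPresetConditionChanged(boolean met) {
            this.presetConditionMet = met;
        }

        // The first control is shown only when the target application is displayed on the
        // first (inner) screen, the screen is unfolded, and the preset condition holds.
        boolean shouldShowFirstControl() {
            return query.isDisplayingOnFirstScreen()
                    && query.foldState() == FoldState.UNFOLDED
                    && presetConditionMet;
        }

        // Operating the first control keeps the first interface on the first screen and
        // additionally shows the second interface on the second (outer) screen.
        void onFirstControlOperated() {
            firstScreen.show("first interface of the target application");
            secondScreen.show("second interface of the target application");
        }
    }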
According to the first aspect, or any implementation manner of the first aspect, first content is displayed in the first interface, and the second interface includes a first area and a second area, where the first area displays the first content and the second area is for the user to fill in content.
The first content may be, for example, writing content, drawing content, or the like.
For example, the target application may be a drawing application, the first interface may be an interface generated by a user drawing on the folding screen, and the second interface may display the user's drawing result from the folding screen while providing a drawing area for the outer-screen user, so that the folding-screen user and the outer-screen user can draw at the same time.
In a specific application scenario, a user can watch, on the outer screen of the folding screen device, the content displayed on the folding screen while also operating the outer screen, so that different users can operate the folding screen and the outer screen at the same time, which enriches the usage scenarios.
According to the first aspect, or any implementation manner of the first aspect, the second content is displayed in the first interface, and progress information of the second content is synchronously displayed in the second interface.
Wherein the second content may be, for example, a game, an event, etc.
For example, the target application may be a game application, a live-streaming application, or the like; the first interface may be a game interface, an event live-stream interface, or the like; and the second interface may display commentary information about the content of the first interface, such as the game progress or the event progress. The commentary information may be text.
In a specific application scenario, a user can play a game or watch an event on the folding screen while other users watch the game or event progress on the outer screen, which enriches the display functions of the folding screen device and improves the user experience.
According to the first aspect or any implementation manner of the first aspect, the third content is displayed in the first interface, the user state information corresponding to the third content is displayed in the second interface, and the second interface is a screen-off display interface.
The third content may be video content, text content, or the like.
For example, the target application may be a video application, an online-class application, or the like; the first interface may be a video interface; and the second interface may display user status information corresponding to the content of the first interface, such as "in an online class", "studying", or "do not disturb".
In this way, in a specific application scenario, when the user uses the folding screen for work or study, the outer screen can display user status information such as "in class", "studying", or "do not disturb", so that other users can learn of this in time from the outer-screen display. Moreover, displaying the user status information in the screen-off display mode effectively reduces the power consumption of the outer screen.
According to the first aspect, or any implementation manner of the first aspect, the preset condition is a geofence detection condition; the first interface is a video interface, and the second interface displays two-dimensional code information.
By way of example, the two-dimensional code may be a personal health electronic code or the like.
For example, the geofence detection condition may be a determination that the user entered a public place based on geofence detection.
Therefore, in a specific application scenario, when a user is watching a video or in a video chat on the folding screen and the folding screen device meets the geofence detection condition, the two-dimensional code information can be displayed on the outer screen without exiting the video interface displayed on the folding screen, which improves the user experience.
According to the first aspect, or any implementation manner of the first aspect, after the folding screen device displays the second interface of the target application on the second screen, the method further includes:
the folding screen device cancels the display of the first control on the first interface and displays the second control; the second control is used for closing the auxiliary screen collaborative display function provided by the target application.
In this way, the user can restore the display condition of the folding screen device conveniently, and the display mode is selected according to personal requirements.
According to the first aspect, or any implementation manner of the first aspect, the method further includes:
and responding to the operation of the second control, stopping displaying the interface on the second screen, and carrying out screen extinguishing processing on the second screen.
According to the first aspect, or any implementation manner of the first aspect, after the folding screen device displays the first control on the first interface, the method further includes:
if the target application queries that the screen currently used by the folding screen device has switched from the first screen to the second screen, or that the first screen has switched from the unfolded state to the folded state, switching the first control from a controllable mode to an uncontrollable mode.
In this way, when the outer-screen display can no longer be invoked through the first control, the control is switched from the controllable mode to the uncontrollable mode; this avoids the situation where the user clicks the first control but the outer-screen display cannot be invoked, and correspondingly reduces the resource consumption of the folding screen device.
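A minimal Java sketch of this mode switching, together with the second control described earlier, under assumed types (ActiveScreen, FoldState, OuterScreen); it is an illustration, not the patent's implementation.

    // Hedged sketch: the types and callbacks below are illustrative assumptions.
    public class ControlModeExample {

        enum ActiveScreen { FIRST_SCREEN, SECOND_SCREEN }
        enum FoldState { FOLDED, HALF_FOLDED, UNFOLDED }

        interface OuterScreen {
            void show(String interfaceName);
            void turnOff();   // screen-off processing for the second screen
        }

        private final OuterScreen secondScreen;
        private boolean firstControlControllable = true;

        ControlModeExample(OuterScreen secondScreen) {
            this.secondScreen = secondScreen;
        }

        // Called whenever the system display state or the fold posture changes.
        void onDeviceStateChanged(ActiveScreen activeScreen, FoldState foldState) {
            // If the display has moved to the second screen, or the first screen has folded,
            // the first control can no longer invoke the outer-screen display.
            firstControlControllable =
                    activeScreen == ActiveScreen.FIRST_SCREEN && foldState == FoldState.UNFOLDED;
        }

        // The second control closes the secondary-screen co-display function.
        void onSecondControlOperated() {
            secondScreen.turnOff();   // stop displaying on the second screen and switch it off
        }

        boolean isFirstControlControllable() {
            return firstControlControllable;
        }
    }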
According to a first aspect, or any implementation manner of the first aspect, the folding screen device includes a device state management system, and the method further includes:
the target application registers a first event with the device state management system, where the first event is used to instruct the device state management system to monitor the system display state of the folding screen device and to notify the target application of changes in the display state;
the target application registers a second event with the device state management system, where the second event is used to instruct the device state management system to monitor posture information of the first screen and to notify the target application of changes in the posture information.
In this way, the target application can monitor changes in the system display state of the folding screen device and in the posture of the folding screen, and thus learn in time whether the conditions for the first control to be controllable are met.
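The registration of the two events might look like the following Java sketch; DeviceStateManager, DisplayStateListener and PostureListener are assumed stand-ins for the device state management system's interface, not an actual Android or vendor API.

    // Hedged sketch: the listener and manager interfaces are assumptions for illustration.
    public class EventRegistrationExample {

        interface DisplayStateListener { void onDisplayStateChanged(String activeScreen); }
        interface PostureListener { void onPostureChanged(double hingeAngleDegrees); }

        interface DeviceStateManager {
            void registerDisplayStateEvent(DisplayStateListener listener);  // the "first event"
            void registerPostureEvent(PostureListener listener);            // the "second event"
        }

        // The target application registers both events so that it learns in time whether
        // the conditions for the first control being controllable are still met.
        static void registerEvents(DeviceStateManager manager) {
            manager.registerDisplayStateEvent(
                    activeScreen -> System.out.println("system display switched to: " + activeScreen));
            manager.registerPostureEvent(
                    angle -> System.out.println("first-screen hinge angle is now " + angle + " degrees"));
        }
    }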
In a second aspect, an embodiment of the present application provides a folding screen device. The folding screen device includes: one or more processors; a memory; and one or more computer programs, where the one or more computer programs are stored in the memory and, when executed by the one or more processors, cause the folding screen device to perform the display method of the first aspect or any implementation manner of the first aspect.
Any implementation manner of the second aspect corresponds to the first aspect or an implementation manner of the first aspect. For the technical effects of the second aspect and any implementation manner thereof, reference may be made to the technical effects of the corresponding implementation of the first aspect, which are not repeated here.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium. The computer-readable storage medium includes a computer program which, when run on an electronic device, causes the electronic device to perform the display method of the first aspect or any implementation manner of the first aspect.
Any implementation manner of the third aspect corresponds to the first aspect or an implementation manner of the first aspect. For the technical effects of the third aspect and any implementation manner thereof, reference may be made to the technical effects of the corresponding implementation of the first aspect, which are not repeated here.
In a fourth aspect, an embodiment of the present application provides a computer program product including a computer program which, when executed, causes a computer to perform the display method of the first aspect or any implementation manner of the first aspect.
Any implementation manner of the fourth aspect corresponds to the first aspect or an implementation manner of the first aspect. For the technical effects of the fourth aspect and any implementation manner thereof, reference may be made to the technical effects of the corresponding implementation of the first aspect, which are not repeated here.
In a fifth aspect, the present application provides a chip including a processing circuit and transceiver pins. The transceiver pins and the processing circuit communicate with each other over an internal connection path, and the processing circuit performs the display method of the first aspect or any implementation manner of the first aspect to control a receive pin to receive signals and a transmit pin to send signals.
Any implementation manner of the fifth aspect corresponds to the first aspect or an implementation manner of the first aspect. For the technical effects of the fifth aspect and any implementation manner thereof, reference may be made to the technical effects of the corresponding implementation of the first aspect, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a product form of an electronic device with an inward-folding screen according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a product form of an electronic device with an inward-folding screen according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a hardware structure of an exemplary electronic device;
Fig. 4a is a schematic diagram of the calculation of the included angle of a folding screen of an electronic device;
Fig. 4b is a schematic diagram of an exemplary geographic coordinate system;
Fig. 5a is a schematic diagram of a software architecture of an exemplary electronic device;
Fig. 5b is a schematic diagram of exemplary module interactions;
Figs. 6a to 6c are schematic diagrams of an exemplary application scenario;
Figs. 7a to 7b are schematic diagrams of an exemplary application scenario;
Figs. 8a to 8b are schematic diagrams of an exemplary application scenario;
Figs. 9a to 9b are schematic diagrams of an exemplary application scenario;
Figs. 10a to 10b are schematic diagrams of an exemplary application scenario;
Fig. 11a is a schematic flowchart of a display method according to an embodiment of the present application;
Fig. 11b is a schematic diagram of an exemplary application scenario;
Fig. 12 is a schematic flowchart of a display method according to an embodiment of the present application;
Fig. 13 is a schematic flowchart of a display method according to an embodiment of the present application;
Fig. 14 is a schematic flowchart of a display method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may represent the following three cases: only A exists, both A and B exist, or only B exists.
The terms first and second and the like in the description and in the claims of embodiments of the present application are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, the first target object and the second target object, etc., are used to distinguish between different target objects, and are not used to describe a particular order of target objects.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
Electronic devices such as mobile phones and tablet computers can install applications and display, on their display screens, the application interfaces corresponding to the installed applications, so that users can conveniently use the functions provided by the applications. To prevent an overly large display screen from affecting the portability of the electronic device, some manufacturers have applied flexible screens to electronic devices, resulting in electronic devices with folding screens. In terms of hardware, an electronic device with a folding screen includes at least two display screens.
In the prior art, the same application displays the same application interface and provides the same functions on an electronic device without a folding screen and on an electronic device with a folding screen. As a result, the application cannot provide the user with richer functions or richer application scenarios based on the hardware characteristics of the electronic device with a folding screen (for example, having at least two display screens and multiple cameras).
To solve the technical problem that an application cannot provide the user with richer functions based on the hardware characteristics of an electronic device with a folding screen, an embodiment of the present application provides a display method. When the method is applied to an electronic device with a folding screen, richer functions can be provided to the user based on the hardware characteristics of the device.
A folding screen (which may be referred to as a first screen) of an electronic device may be folded to form at least two display units. For example, the folding screen may be folded along a folding edge or folding axis to form a first display unit and a second display unit. Folding screens on electronic devices can be divided into two types: screens that fold outward (outward-folding screens for short) and screens that fold inward (inward-folding screens for short). In both cases, the first display unit and the second display unit are formed by folding the folding screen. After an outward-folding screen is folded, the display direction of the first display unit is opposite to the display direction of the second display unit. After an inward-folding screen is folded, the first display unit and the second display unit face each other.
In the embodiments of the present application, an electronic device with an inward-folding screen is taken as an example (it can be understood that the embodiments of the present application may also be implemented on an electronic device with an outward-folding screen); that is, after folding, the first display unit and the second display unit face each other, and a front camera (hereinafter referred to as a first front camera) is disposed on the inward-folding screen. For example, refer to fig. 1, which is a schematic diagram of a product form of an electronic device with an inward-folding screen according to an embodiment of the present application. In fig. 1, (a) is a schematic diagram of the inward-folding screen in the fully unfolded state. The inward-folding screen may be folded along the folding edge in the directions 11a and 11b shown in fig. 1 (a) to form the A screen and the B screen in the half-folded state shown in fig. 1 (b). After the inward-folding screen is folded into the A screen and the B screen, the first front camera may be disposed on the A screen or on the B screen. Starting from the A screen and the B screen shown in fig. 1 (b), the inward-folding screen may continue to be folded along the folding edge in the directions 12a and 12b shown in fig. 1 (b) to reach the fully folded state shown in fig. 1 (c). As shown in fig. 1 (c), after the inward-folding screen of the electronic device is fully folded, the A screen and the B screen face each other and are not visible to the user.
For example, a C screen (i.e., a second screen) may be disposed on the back of the A screen (i.e., the first display unit) shown in fig. 1 (b). After the inward-folding screen is completely folded, the A screen faces the B screen, and the C screen is visible to the user. The C screen, relative to the inward-folding (inner) screen, may be referred to as the outer screen. A front camera (hereinafter referred to as a second front camera) may be disposed on the C screen. It can be understood that, for an electronic device with such an inward-folding screen, when the inward-folding screen is in the fully folded state, the display direction of the C screen is opposite to the shooting direction of a rear camera (not shown in the figure) of the electronic device; when the inward-folding screen is in the fully unfolded state, the display direction of the C screen is consistent with the shooting direction of the rear camera.
In fig. 1, the folding screen of the electronic device is folded longitudinally; that is, the folding screen is folded into a left screen and a right screen (i.e., the A screen and the B screen) along a longitudinal folding edge on the folding screen.
In the embodiments of the present application, the inward-folding screen of the electronic device may also be folded transversely; that is, it is folded into an upper screen and a lower screen (i.e., the A screen and the B screen) along a transverse folding edge on the inward-folding screen. For example, as shown in fig. 2 (a), the unfolded inward-folding screen is folded along the folding edge in its transverse direction; the folded states are shown in fig. 2 (b) and fig. 2 (c).
It should be noted that the back of the inward-folding screen of the electronic device according to the embodiments of the present application further includes a display screen, which may be referred to as a second screen. The second screen may be disposed on the back of the first display unit or of the second display unit formed by folding the inward-folding screen. For example, in fig. 2 (c), the back of the A screen (i.e., the first display unit) may be provided with a C screen (i.e., the second screen); after the inward-folding screen is completely folded, the B screen faces the A screen, the C screen is on the back of the A screen, and the C screen is visible to the user. For example, when the back of the A screen is provided with both the C screen and a rear camera, the display direction of the C screen is consistent with the shooting direction of the rear camera of the electronic device. The second screen may also be provided with a front camera (not shown in the figure), which may be referred to as a second front camera.
The electronic device with an inward-folding screen in the embodiments of the present application may therefore include at least two screens, such as a folding screen (i.e., the first screen, which can be divided into a first display unit and a second display unit) and an outer screen (i.e., the second screen), and at least three cameras, such as a first front camera, a second front camera, and a rear camera.
In the embodiments of the present application, the included angle α between the A screen and the B screen of the inward-folding screen is in the range [0°, 180°]. If α ∈ [0°, P1], the electronic device can determine that the inward-folding screen is in the fully folded state (folded state for short); if α ∈ (P1, P2), the electronic device can determine that the inward-folding screen is in the half-folded state; if α ∈ [P2, 180°], the electronic device can determine that the inward-folding screen is in the fully unfolded state (unfolded state for short). Here, 0° < P1 < P2 < 180°. P1 and P2 may be preset angle thresholds, which can be determined according to the usage habits of a large number of users of inward-folding screens; alternatively, P1 and P2 may be set by the user on the electronic device.
In some embodiments, according to the usage habits of most users, when the included angle α between the A screen and the B screen is greater than 100°, the user is likely to want to use the A screen and the B screen as a whole (i.e., as one complete display screen). When the included angle α between the A screen and the B screen is smaller than 80°, the user is likely to want to use the C screen alone rather than the A screen and the B screen, and the inward-folding screen can be determined to be in the folded state. When the included angle α between the A screen and the B screen is between 80° and 100°, the user may want to use the A screen and the B screen, and the inward-folding screen can be determined to be in the half-folded state.
Therefore, in the embodiments of the present application, the preset angle threshold P1 may be in the range (0°, 80°] and the preset angle threshold P2 may be in the range [100°, 180°). For example, the preset angle threshold P1 may be 75° and the preset angle threshold P2 may be 105°. The above examples are merely intended to explain the embodiments of the present application and shall not be construed as limiting.
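The posture classification above can be illustrated by the following Java sketch, which uses the example thresholds P1 = 75° and P2 = 105° (configurable example values, not fixed requirements):

    // Hedged sketch of the posture classification described above.
    public class FoldPostureExample {

        enum Posture { FOLDED, HALF_FOLDED, UNFOLDED }

        static final double P1 = 75.0;   // upper bound of the folded range (example value)
        static final double P2 = 105.0;  // lower bound of the unfolded range (example value)

        // alpha is the included angle between the A screen and the B screen, in [0, 180].
        static Posture classify(double alpha) {
            if (alpha <= P1) {
                return Posture.FOLDED;        // alpha in [0, P1]
            } else if (alpha < P2) {
                return Posture.HALF_FOLDED;   // alpha in (P1, P2)
            } else {
                return Posture.UNFOLDED;      // alpha in [P2, 180]
            }
        }

        public static void main(String[] args) {
            System.out.println(classify(30.0));   // FOLDED
            System.out.println(classify(90.0));   // HALF_FOLDED
            System.out.println(classify(160.0));  // UNFOLDED
        }
    }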
It should be noted that the at least two screens formed after the inward-folding screen of the embodiments of the present application is folded or unfolded may be multiple independent screens, or may be one complete screen of an integral structure that is merely folded into at least two portions.
For example, the inward-folding screen may be a flexible folding screen whose folding edge is made of a flexible material. Part or all of the flexible folding screen is made of flexible material. The at least two screens formed after the flexible folding screen is folded are one complete screen of an integral structure, merely folded into at least two portions.
For another example, the inward-folding screen may be a multi-screen folding screen. The multi-screen folding screen may include multiple (two or more) screens, which are multiple independent display screens. The screens may be connected in sequence by folding shafts, and each screen can rotate around the folding shaft connected to it, thereby folding the multi-screen folding screen.
In the following embodiments of the present application, the inward-folding screen is described by taking a transversely foldable flexible folding screen as an example.
For example, the electronic device in the embodiments of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device. The embodiments of the present application do not specifically limit the form of the electronic device.
Fig. 3 is a schematic structural diagram of the electronic device 100. Alternatively, the electronic device 100 may be a terminal, which may also be referred to as a terminal device, and the terminal may be a cellular phone (cellular phone) or a tablet computer (pad), which is not limited in this application. It should be noted that the schematic structural diagram of the electronic device 100 may be applied to the folding screen mobile phone in fig. 1-2. It should be understood that the electronic device 100 shown in fig. 3 is only one example of an electronic device, and that the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have different component configurations. The various components shown in fig. 3 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor, a gyroscope sensor, an acceleration sensor, a temperature sensor, a motion sensor, a barometric sensor, a magnetic sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121, for example, to cause the electronic device 100 to implement the display method in the embodiment of the present application. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A. In some embodiments, the electronic device 100 may be provided with a plurality of speakers 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electric signal. In some embodiments, the pressure sensor may be provided on the display screen 194. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor.
The gyroscopic sensor may be used to determine a motion pose of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by a gyroscopic sensor.
In the embodiment of the present application, if the electronic device 100 has an invaginated folding screen and the invaginated folding screen is foldable to form a plurality of screens, a gyro sensor may be included in each screen for measuring the orientation (i.e., the directional vector of the orientation) of the corresponding screen. The electronic device 100 may determine the included angle between adjacent screens based on the measured angular change in the orientation of each screen. For example, as shown in fig. 1 (b), after the inward folding screen is folded, a first display unit and a second display unit are formed, and gyro sensors are respectively arranged in the first display unit and the second display unit, so that the directions of the first display unit and the second display unit can be measured respectively. The electronic device 100 determines an included angle between the first display unit and the second display unit according to the measured change of the orientation angle of each screen.
For example, the inward-folding screen is folded to form the first display unit (the A screen shown in the figure) and the second display unit (the B screen shown in the figure) shown in fig. 4a, where a gyroscope sensor A is disposed in the A screen and a gyroscope sensor B is disposed in the B screen. The following briefly explains how the gyroscope sensor A measures the orientation of the A screen (i.e., the direction vector of its orientation), how the gyroscope sensor B measures the orientation of the B screen (i.e., the direction vector of its orientation), and how the electronic device 100 calculates the included angle α between the A screen and the B screen from the orientations of the two screens.
The coordinate system of the gyroscope sensor is a geographic coordinate system. As shown in fig. 4b, the origin O of the geographic coordinate system is located at the point where the carrier (i.e., the device containing the gyroscope sensor, such as the electronic device 100) is located; the x-axis points east (E) along the local latitude line, the y-axis points north (N) along the local meridian, and the z-axis points upward along the local geographic vertical, forming a right-handed rectangular coordinate system with the x-axis and the y-axis. The plane formed by the x-axis and the y-axis is the local horizontal plane, and the plane formed by the y-axis and the z-axis is the local meridian plane. It can therefore be understood that the coordinate system of the gyroscope sensor is: with the gyroscope sensor as the origin O, the x-axis points east along the local latitude line, the y-axis points north along the local meridian, and the z-axis points upward along the local geographic vertical (i.e., the direction opposite to the geographic plumb line).
The electronic device 100 can measure a direction vector of the orientation of each screen in the coordinate system of the gyro sensor provided therein, using the gyro sensor provided in each screen. For example, referring to the side view of the electronic device 100 shown in fig. 4a, the electronic device 100 measures that the direction vector of the orientation of the a-screen in the coordinate system of the gyro sensor a is a vector Z1, and the direction vector of the orientation of the B-screen in the coordinate system of the gyro sensor B is a vector Z2. The electronic device 100 can calculate the angle θ between the vector Z1 and the vector Z2 according to the vector Z1 and the vector Z2.
As can be seen from fig. 4a, since the vector Z1 is perpendicular to the a-screen and the vector Z2 is perpendicular to the B-screen, the included angle α=180° - θ between the a-screen and the B-screen can be obtained. That is, the electronic device 100 can determine the included angle α between the a-screen and the B-screen according to the measured direction vector (i.e., the vector Z1) of the a-screen in the coordinate system of the gyro sensor a and the measured direction vector (i.e., the vector Z2) of the B-screen in the coordinate system of the gyro sensor B.
Although the positions of the gyro sensors provided in the a-screen and the B-screen do not overlap, that is, the origins of the coordinate systems of the gyro sensors of the a-screen and the B-screen do not overlap, the x-axis, the y-axis, and the z-axis of the two coordinate systems are parallel, and thus the coordinate systems of the gyro sensors provided in the a-screen and the B-screen can be considered to be parallel. Thus, although the vector Z1 and the vector Z2 are not in the same coordinate system, the corresponding axes of the two coordinate systems are in parallel, so the angle θ between the vector Z1 and the vector Z2 can be calculated by the vector Z1 and the vector Z2.
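As a concrete illustration of the relationship α = 180° - θ derived above, the following Java sketch (an assumption-based illustration, not code from the patent) computes θ from two screen-normal vectors and derives α:

    // Hedged sketch: computes the angle theta between the screen normals Z1 and Z2 and
    // derives the included angle alpha = 180 - theta described above.
    public class FoldAngleExample {

        static double angleBetweenDegrees(double[] z1, double[] z2) {
            double dot = z1[0] * z2[0] + z1[1] * z2[1] + z1[2] * z2[2];
            double n1 = Math.sqrt(z1[0] * z1[0] + z1[1] * z1[1] + z1[2] * z1[2]);
            double n2 = Math.sqrt(z2[0] * z2[0] + z2[1] * z2[1] + z2[2] * z2[2]);
            double cos = Math.max(-1.0, Math.min(1.0, dot / (n1 * n2)));  // clamp against rounding
            return Math.toDegrees(Math.acos(cos));
        }

        public static void main(String[] args) {
            double[] z1 = {0.0, 0.0, 1.0};  // example orientation (normal) of the A screen
            double[] z2 = {Math.sin(Math.toRadians(60)), 0.0, Math.cos(Math.toRadians(60))};
            double theta = angleBetweenDegrees(z1, z2);  // approximately 60 degrees
            double alpha = 180.0 - theta;                // included angle: approximately 120 degrees
            System.out.println("theta = " + theta + " degrees, alpha = " + alpha + " degrees");
        }
    }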
In some embodiments, the included angle α between the A screen and the B screen may also be measured with the aid of one or more other sensors.
For example, an acceleration sensor may be disposed in each screen of the inward-folding screen. The electronic device 100 (e.g., the processor 110) may use the acceleration sensors to measure the motion acceleration of each screen as it is rotated, and then calculate, from the measured motion acceleration, the angle through which one screen has rotated relative to the other, i.e., the included angle α between the A screen and the B screen.
In other embodiments, the gyro sensor may be a virtual gyro sensor formed by combining a plurality of other sensors. The virtual gyroscope sensor can be used for calculating the included angle between adjacent screens of the inward folding screen, namely the included angle alpha between the screen A and the screen B.
The acceleration sensor may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The acceleration sensor may detect the magnitude and direction of gravity when the electronic device 100 is stationary. The acceleration sensor can also be used for recognizing the gesture of the electronic equipment, and is applied to the applications such as horizontal and vertical screen switching, pedometers and the like.
Touch sensors, also known as "touch panels". The touch sensor may be disposed on the display screen 194, and the touch sensor and the display screen 194 form a touch screen, which is also referred to as a "touch screen". The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type.
In this embodiment of the present application, the electronic device 100 may detect a touch operation input by a user on a touch screen through a touch sensor, and collect one or more of a touch position, a touch area, a touch direction, a touch time, and the like of the touch operation on the touch screen. In some embodiments, the electronic device 100 may determine the touch location of a touch operation on the touch screen by combining a touch sensor and a pressure sensor.
The keys 190 include a power-on key (or power key), a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 5a is a software architecture block diagram of the electronic device 100 according to an embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 5a, the application packages may include applications such as Camera, Gallery, Calls, Calendar, Drawing, Games, Maps, Navigation, Music, Video, and WeChat.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 5a, the application framework layer may include a device state management system (Device State Manager Service), a window manager (Window Manager Service), a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The device state management system may be used to provide device state information, including but not limited to device basic information and device pose information, etc., to applications in the application layer.
The device state management system may be used to receive underlying sensor data (e.g., hinge angle data, distance sensor data, etc.) and determine the current device posture of the electronic device (e.g., folded, semi-folded, unfolded, etc.), and may also determine whether the electronic device currently displays using the first screen (e.g., the inner folding screen) or the second screen (e.g., the outer screen). The device state management system can determine, based on the basic information of the electronic device, whether the electronic device is provided with an inward-folding screen, can also be used to determine the current system display state of the electronic device, and can switch the system display state of the electronic device. The device state management system can also query the angle calculation module for the form of the folding screen, and can determine changes in the folding posture of the folding screen of the electronic device based on the data transmitted by the angle calculation module.
In an embodiment of the present application, referring to fig. 5b, the device state management system provides a query interface for applications (including system applications and third-party applications). The device state management system can receive a query instruction sent by an application program and send feedback information to the application program based on the query instruction.
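Purely as an illustration, the query interface and feedback described above could be organized as in the following minimal sketch; the class name, method names, constants and thresholds are assumptions made for this example and are not taken from the embodiment.

```java
// Illustrative-only sketch of a query interface a device state management system
// might expose to applications; all names, constants and thresholds are assumptions.
public final class DeviceStateManagerService {

    public static final int DISPLAY_FIRST_SCREEN = 1;   // inner folding screen
    public static final int DISPLAY_SECOND_SCREEN = 2;  // outer screen

    public static final int POSTURE_FOLDED = 0;
    public static final int POSTURE_HALF_FOLDED = 1;
    public static final int POSTURE_UNFOLDED = 2;

    private volatile boolean foldableDevice = true;
    private volatile int systemDisplayState = DISPLAY_FIRST_SCREEN;
    private volatile int posture = POSTURE_UNFOLDED;

    /** Feedback for a device type query: whether the device has an inward folding screen. */
    public boolean isFoldableDevice() {
        return foldableDevice;
    }

    /** Feedback for a system display state query: which screen is currently used for display. */
    public int getSystemDisplayState() {
        return systemDisplayState;
    }

    /** Feedback for a posture (gesture) query of the folding screen. */
    public int getPosture() {
        return posture;
    }

    /** Called internally when the angle calculation module reports a new hinge angle. */
    void onHingeAngleChanged(float hingeAngleDegrees) {
        if (hingeAngleDegrees < 30f) {          // thresholds are illustrative, not from the text
            posture = POSTURE_FOLDED;
        } else if (hingeAngleDegrees < 150f) {
            posture = POSTURE_HALF_FOLDED;
        } else {
            posture = POSTURE_UNFOLDED;
        }
    }
}
```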
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
Each time an application runs an Activity, the window manager may create a corresponding application window. The window manager periodically refreshes the display content and the window parameters (e.g., the size and position of the application window) in the application window. In addition, the window manager may create a window state (WindowState) corresponding to each application window; the window manager marks the application window with the WindowState, and uses the WindowState to store, query and control the state of the application window.
For example, the window manager may query the WindowState of window 1 to determine whether window 1 is in a full-screen state, and if not, the window manager may further query the WindowState of window 1 for parameters such as the aspect ratio (e.g., 16:9 or 4:3).
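As a toy illustration of the per-window bookkeeping described above (WindowState is named in the text; the fields and accessors below are assumptions for this example only):

```java
// Illustrative-only sketch of a WindowState-style record and the kind of query
// described above; field and method names are assumptions.
import java.util.HashMap;
import java.util.Map;

final class WindowStateSketch {
    boolean fullScreen;
    int widthPx;
    int heightPx;

    double aspectRatio() {
        return heightPx == 0 ? 0.0 : (double) widthPx / heightPx;
    }
}

final class WindowRegistrySketch {
    private final Map<String, WindowStateSketch> states = new HashMap<>();

    void onWindowCreated(String windowId, int widthPx, int heightPx, boolean fullScreen) {
        WindowStateSketch s = new WindowStateSketch();
        s.widthPx = widthPx;
        s.heightPx = heightPx;
        s.fullScreen = fullScreen;
        states.put(windowId, s);
    }

    /** e.g. check whether "window1" is full screen and, if not, report its aspect ratio. */
    String query(String windowId) {
        WindowStateSketch s = states.get(windowId);
        if (s == null) {
            return windowId + ": no WindowState recorded";
        }
        return s.fullScreen ? windowId + ": full screen"
                            : windowId + ": aspect ratio " + s.aspectRatio();
    }
}
```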
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar and can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification information is used to indicate that a download is complete, to give a message alert, and the like. The notification information may also be a notification that appears in the top system status bar in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: an angle calculation module, a surface manager (surface manager), a Media library (Media Libraries), a three-dimensional graphics processing library (e.g., openGL ES), a 2D graphics engine (e.g., SGL), etc.
The angle calculation module can be used for calculating the included angle between the two display units of the folding screen.
In an embodiment of the present application, referring to fig. 5b, when the electronic device is an electronic device with a folding screen, the angle calculation module may obtain angle data from the sensor driver of the kernel layer, calculate the included angle between the two display units of the folding screen, and send the included angle of the folding screen to the device state management system, so that the device state management system may determine the posture of the folding screen, for example, the folded state, the semi-folded state, or the unfolded state, according to the included angle of the folding screen.
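A minimal sketch of one way such an included angle could be approximated from the gravity readings of two per-panel accelerometers is shown below; the 0°/180° convention and the use of accelerometer data alone are assumptions for this illustration (a real module would typically also fuse gyroscope data).

```java
// Illustrative-only sketch of deriving a fold angle from per-panel accelerometer
// readings taken while the device is at rest.
final class AngleCalculator {

    /** Angle in degrees between two 3-axis vectors. */
    static double angleBetweenDegrees(float[] a, float[] b) {
        double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
        double na = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
        double nb = Math.sqrt(b[0] * b[0] + b[1] * b[1] + b[2] * b[2]);
        if (na == 0 || nb == 0) {
            return 0;
        }
        double cos = Math.max(-1.0, Math.min(1.0, dot / (na * nb)));
        return Math.toDegrees(Math.acos(cos));
    }

    /**
     * Approximate the hinge angle of the folding screen from the gravity vectors
     * measured by the accelerometers of the two display units.
     * 180 degrees means the two panels lie flat (unfolded); 0 degrees means they
     * face each other (folded). The mapping is illustrative.
     */
    static double hingeAngleDegrees(float[] gravityPanelA, float[] gravityPanelB) {
        return 180.0 - angleBetweenDegrees(gravityPanelA, gravityPanelB);
    }
}
```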
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver. The display driver is configured to receive notifications from the device state management system, and the display on the electronic device may be controlled based on such a notification; the display includes a desktop display, an application interface display, and the like. The sensor driver is used to drive the sensors to detect data. When the acceleration sensor and the gyroscope sensor detect angle data, a corresponding hardware interrupt is sent to the kernel layer. The sensor driver of the kernel layer sends the angle data to the angle calculation module in the system library; the angle calculation module is used to calculate the physical form change of the electronic device and can be used to determine the physical form change of the inward-folding screen of the electronic device. The angle calculation module can send the determined physical form change of the inward-folding screen of the electronic device to the device state management system, and the device state management system can notify the window manager to change the resolution of the current display screen according to the physical form change of the inward-folding screen. For example, when the inward-folding screen changes from the semi-folded state to the unfolded state, the device state management system notifies the window manager that the resolution of the current display screen is changed from 1920×1080 to 3840×2160, that is, the effective display area of the current display screen is changed to 3840×2160. Subsequently, when an application in the application layer calls the window manager to create a corresponding application window, the window manager can set window parameters such as the size and position of the application window according to the updated screen size, so that the opened application can adapt to the folding screen in different physical forms.
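As an illustration of the resolution update described above, the following sketch assumes a simple callback interface between the device state management system and the window manager; the interface name is an assumption, and the example resolutions merely mirror the description.

```java
// Illustrative-only sketch of propagating a fold posture change as a change of the
// effective display size; names are assumptions for this example.
final class DisplaySizeUpdater {

    interface WindowManagerLike {
        void onDisplaySizeChanged(int widthPx, int heightPx);
    }

    private final WindowManagerLike windowManager;

    DisplaySizeUpdater(WindowManagerLike windowManager) {
        this.windowManager = windowManager;
    }

    /** Called by a device state management system when the inward-folding screen posture changes. */
    void onPostureChanged(boolean unfolded) {
        if (unfolded) {
            // unfolded inner screen: the full panel becomes the effective display area
            windowManager.onDisplaySizeChanged(3840, 2160);
        } else {
            windowManager.onDisplaySizeChanged(1920, 1080);
        }
    }
}
```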
It will be appreciated that the layers and components contained in the layers in the software structure shown in fig. 5a do not constitute a specific limitation of the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer layers than shown, and more or fewer components may be included in each layer, as the present application is not limited.
It may be understood that, in order to implement the display method in the embodiments of the present application, the electronic device includes corresponding hardware and/or software modules that perform each function. The steps of an algorithm for each example described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
The embodiment of the application provides a display method. In the method, the folding screen and the external screen of a folding screen mobile phone can simultaneously display interfaces of the same application program, and the interfaces may be the same or different. The interface displayed on the folding screen may be referred to as a main interface of the application program, and the interface displayed on the external screen may be referred to as a sub-interface of the application program.
In the embodiment of the application, when the folding screen of the folding screen device is in an unfolded state, the folding screen device displays a first interface of the target application on the folding screen, and a first control (such as a secondary screen (i.e. an external screen) collaborative control) is displayed on the first interface. And responding to the operation of the user on the first control, the folding screen equipment continuously displays a first interface on the folding screen, and displays a second interface of the target application on the external screen. The second interface may have the same content as the first interface or may be different from the first interface. Wherein the content displayed in the second interface is related to the function indicated by the first control or is related to a co-display function provided by the application program.
In the embodiment of the application, the display of the first control may be triggered by the folded screen gesture (i.e., the folded screen is in an unfolded state), or may be triggered by the folded screen gesture and at least one other condition (e.g., a geofence, etc.).
Further, after the folding screen device, in response to the operation of the user on the first control, continues to display the first interface on the folding screen and displays the second interface of the target application on the external screen, a second control (such as a "cancel secondary screen collaboration" control) can be displayed on the first interface, and the second control is used for cancelling the display of the second interface on the external screen.
In the following, a folding screen mobile phone is used as an example of the folding screen device, and the display method provided in the embodiments of the application is explained through several specific application scenarios.
Scene one
In this scenario, the display of the first control is triggered by the collapsed screen gesture and at least one other condition (e.g., a geofence, etc.). The second interface of the target application is displayed on the external screen of the folding screen mobile phone, is related to the function of the first control, and is completely unrelated to the first interface of the target application displayed on the folding screen.
Illustratively, the user uses a first application (such as the WeChat application) on the folding screen of the folding screen mobile phone, and the display interface of the first application may be a video chat interface, which may be shown in (1) in fig. 6a. At this time, the folding screen of the folding screen mobile phone is in the unfolded state, and the external screen of the mobile phone is in the screen-off state, which may be shown in (2) in fig. 6a. In this case, if the geofence of the user changes, for example, the user enters a public place (such as a mall, a station, etc.), the first application determines, according to the geofence detection result, that the user has entered a public place, and queries that the folding screen mobile phone currently uses the folding screen to display its interface and that the folding screen is in the unfolded state; the first application then pops up a personal code one-touch opening window, as shown in (1) in fig. 6b. With continued reference to (1) in fig. 6b, a plurality of controls are displayed in the personal code one-touch opening window, such as a "know" control, a "direct open" control, and a "secondary screen open" control. At this time, the external screen of the mobile phone still remains in the screen-off state, as shown in (2) in fig. 6b.
In another case, the user uses the first application on the external screen of the folding screen mobile phone, and the display interface of the first application may be a video chat interface. If the first application determines, according to the geofence detection result, that the user has entered a public place, and queries that the folding screen mobile phone currently uses the external screen to display its interface and that the folding screen is in the folded state, the personal code one-touch opening window is popped up. At this time, only the "know" control and the "direct open" control may be displayed in the personal code one-touch opening window, that is, the "secondary screen open" control is in a hidden mode; alternatively, the "know" control, the "direct open" control, and the "secondary screen open" control may all be displayed, where the "know" and "direct open" controls are in a controllable mode and the "secondary screen open" control is in an uncontrollable mode (e.g., the icon is greyed out).
In yet another case, the folding screen of the folding screen mobile phone displays the interface shown in (1) in fig. 6b, and if the user performs a folding operation on the folding screen so that the folding screen is in the folded state, the "secondary screen open" control in the personal code one-touch opening window displayed on the interface is switched from the controllable mode to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
With continued reference to (1) in fig. 6b, the user clicks the "secondary screen open" control 201. In response to the user operation, the folding screen mobile phone continues to display the video chat interface on the folding screen (as shown in (1) in fig. 6c), and displays a personal health electronic code display interface on the external screen (as shown in (2) in fig. 6c). At this time, the user may perform related operations on the external screen, which is not limited in this embodiment.
In an alternative embodiment, the user clicks the "secondary screen open" control 201, and in response to the user operation, the folding screen mobile phone continues to display the video chat interface on the folding screen and displays a "cancel secondary screen collaboration" control in that interface. For example, if the user has completed the related operations (such as code scanning) based on the external screen display interface, the user may click the "cancel secondary screen collaboration" control; in response to the user operation, the first application exits the interface displayed on the external screen, and the external screen is turned off.
In another alternative embodiment, the user clicks the "secondary screen open" control 201, and in response to the user operation, the folding screen mobile phone displays the personal health electronic code display interface on the external screen and displays a "cancel secondary screen collaboration" control in that interface. For example, if the user has completed the related operations (such as code scanning) based on the external screen display interface, the user may click the "cancel secondary screen collaboration" control; in response to the user operation, the first application exits the interface displayed on the external screen, and the external screen is turned off.
Thus, the same application program can display different application program interfaces on the folding screen and the external screen of the folding screen mobile phone, respectively, which is convenient for the user. For example, when the user is in a video call or watching a video, the user does not need to exit before using other functions of the application program, which improves the user experience.
It should be noted that, in this scenario, the personal health electronic code display interface is merely an exemplary example, and this embodiment does not limit the secondary screen collaboration interface displayed on the external screen.
Scene two
In this scenario, the display of the first control is triggered by the folding screen gesture. The second interface of the target application is displayed on the external screen of the folding screen mobile phone, and the second interface is related to a secondary screen collaborative display function provided by an application program and is related to the first interface of the target application displayed on the folding screen. Wherein, the second interface can display part of the content in the first interface.
Illustratively, the user uses a second application (such as a drawing application) on the folding screen of the folding screen mobile phone, and the display interface of the second application may be a drawing interface, which may be shown in (1) in fig. 7a. The user can draw by handwriting in the drawing interface. At this time, the folding screen of the folding screen mobile phone is in the unfolded state, and the external screen of the mobile phone is in the screen-off state, which may be shown in (2) in fig. 7a. In this case, since the second application can determine that the folding screen mobile phone currently uses the folding screen to display its interface and that the folding screen is in the unfolded state, the secondary screen collaboration control 301 can be displayed in the folding screen display interface, as shown in (1) in fig. 7a.
In another case, the user uses the second application on the external screen of the folding screen mobile phone, and the display interface of the second application may be a drawing interface. Because the second application can determine that the folding screen mobile phone currently uses the external screen to display its interface and that the folding screen is in the folded state, the secondary screen collaboration control can be set to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
In yet another case, the folding screen of the folding screen mobile phone displays the interface shown in (1) in fig. 7a, and if the user performs a folding operation on the folding screen so that the folding screen is in the folded state, the secondary screen collaboration control displayed on the interface is switched from the controllable mode to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
With continued reference to (1) in fig. 7a, the user clicks the secondary screen collaboration control 301. In response to the user operation, the folding screen mobile phone continues to display the drawing interface on the folding screen (as shown in (1) in fig. 7b), and displays a drawing collaboration interface on the external screen (as shown in (2) in fig. 7b).
As shown in (2) in fig. 7b, the drawing collaboration interface displayed on the external screen of the folding screen mobile phone may be divided into a region 303 and a region 304. The region 303 displays the details of what the user draws in the drawing interface displayed on the folding screen, and the region 304 may be used for drawing operations together with other users (such as children).
Illustratively, with continued reference to (1) in fig. 7a, the user clicks the secondary screen collaboration control 301, and in response to the user operation, the folding screen mobile phone continues to display the drawing interface on the folding screen and displays a "cancel secondary screen collaboration" control 302 in that interface. If the user clicks the "cancel secondary screen collaboration" control 302, then in response to the user operation, the second application exits the drawing collaboration interface displayed on the external screen, and the external screen is turned off.
Thus, the same application program can display different application program interfaces on the folding screen and the external screen of the folding screen mobile phone, respectively, so that multiple users can use them at the same time. Moreover, the operations performed by the user on the folding screen can be clearly seen on the external screen.
Scene three
In this scenario, the display of the first control is triggered by the folding screen gesture. The second interface of the target application is displayed on the external screen of the folding screen mobile phone; the second interface is related to a secondary screen collaborative display function provided by the application program and is related to the first interface of the target application displayed on the folding screen. The second interface may display summary information about the content of the first interface, such as the game progress, event progress, and the like.
Illustratively, the user uses a third application (such as a game application or a live broadcast application) on the folding screen of the folding screen mobile phone, and the display interface of the third application may be a game interface, which may be shown in (1) in fig. 8a. The user may perform related game operations in the interface. At this time, the folding screen of the folding screen mobile phone is in the unfolded state, and the external screen of the mobile phone is in the screen-off state, which may be shown in (2) in fig. 8a. In this case, since the third application can determine that the folding screen mobile phone currently uses the folding screen to display its interface and that the folding screen is in the unfolded state, the secondary screen collaboration control 401 can be displayed in the folding screen display interface, as shown in (1) in fig. 8a.
In another case, the user uses the third application on the external screen of the folding screen mobile phone, and the display interface of the third application may be a game interface. Because the third application can determine that the folding screen mobile phone currently uses the external screen to display its interface and that the folding screen is in the folded state, the secondary screen collaboration control can be set to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
In yet another case, the folding screen of the folding screen mobile phone displays the interface shown in (1) in fig. 8a, and if the user performs a folding operation on the folding screen so that the folding screen is in the folded state, the secondary screen collaboration control displayed on the interface is switched from the controllable mode to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
With continued reference to (1) in fig. 8a, the user clicks the secondary screen collaboration control 401. In response to the user operation, the folding screen mobile phone continues to display the game interface on the folding screen (as shown in (1) in fig. 8b), and displays a game progress interface on the external screen (as shown in (2) in fig. 8b). The game progress interface displayed on the external screen may show the current game progress, including but not limited to various game statistics and the like.
Illustratively, with continued reference to (1) in fig. 8a, the user clicks the secondary screen collaboration control 401, and in response to the user operation, the folding screen mobile phone continues to display the game interface on the folding screen and displays a "cancel secondary screen collaboration" control 402 in that interface. If the user clicks the "cancel secondary screen collaboration" control 402, then in response to the user operation, the third application exits the game progress interface displayed on the external screen.
Thus, the same application program can display different application program interfaces on the folding screen and the external screen of the folding screen mobile phone, respectively, so that an overview of the user's operations on the folding screen, or of the content displayed on the folding screen, can be clearly seen on the external screen.
Scene four
In this scenario, the display of the first control is triggered by the folding screen gesture. The second interface of the target application is displayed on the external screen of the folding screen mobile phone; the second interface is related to a secondary screen collaborative display function provided by the application program and is related to the first interface of the target application displayed on the folding screen. The second interface can be used to display the working and/or learning state of the user of the folding screen, and the second interface can be presented in an AOD (Always on Display, i.e., off-screen display) mode.
Illustratively, the user uses a fourth application (such as a video application or an online-course application) on the folding screen of the folding screen mobile phone, and the display interface of the fourth application may be a video interface, which may be shown in (1) in fig. 9a. The user can view the relevant content in the interface. At this time, the folding screen of the folding screen mobile phone is in the unfolded state, and the external screen of the mobile phone is in the screen-off state, which may be shown in (2) in fig. 9a. In this case, since the fourth application can determine that the folding screen mobile phone currently uses the folding screen to display its interface and that the folding screen is in the unfolded state, the secondary screen collaboration control 501 can be displayed in the folding screen display interface, as shown in (1) in fig. 9a.
In another case, the user uses the fourth application on the external screen of the folding screen mobile phone, and the display interface of the fourth application may be a video interface. Because the fourth application can determine that the folding screen mobile phone currently uses the external screen to display its interface and that the folding screen is in the folded state, the secondary screen collaboration control can be set to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
In yet another case, the folding screen of the folding screen mobile phone displays the interface shown in (1) in fig. 9a, and if the user performs a folding operation on the folding screen so that the folding screen is in the folded state, the secondary screen collaboration control displayed on the interface is switched from the controllable mode to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
With continued reference to (1) in fig. 9a, the user clicks the secondary screen collaboration control 501. In response to the user operation, the folding screen mobile phone continues to display the video interface on the folding screen (as shown in (1) in fig. 9b), and displays an AOD interface on the external screen, for example, displaying the text "in class" in the AOD interface, as shown in (2) in fig. 9b. The text displayed in the AOD interface may be determined by the application program, which is not limited in this embodiment.
Illustratively, with continued reference to (1) in fig. 9a, the user clicks the secondary screen collaboration control 501, and in response to the user operation, the folding screen mobile phone continues to display the video interface on the folding screen and displays a "cancel secondary screen collaboration" control 502 in that interface. If the user clicks the "cancel secondary screen collaboration" control 502, then in response to the user operation, the fourth application exits the AOD interface displayed on the external screen.
Therefore, the same application program can display different application program interfaces on the folding screen and the external screen of the folding screen mobile phone, respectively, so that the state in which the user is using the folding screen can be clearly seen on the external screen.
It should be emphasized that the above-mentioned several scenes are merely exemplary examples, and the technical solution provided in the embodiments of the present application may be extended to other application scenes, so that the same application program may display different application program interfaces on the folding screen and the external screen of the folding screen mobile phone, so as to facilitate user operation.
It should be noted that the same application program may also display application program interfaces of the same content on the folding screen and the external screen of the folding screen mobile phone, so that the user can view through the folding screen or the external screen.
Fig. 10 a-10 b illustrate another application scenario.
Illustratively, the user uses a fifth application (such as a video call application) on the folding screen of the folding screen mobile phone, and the display interface of the fifth application may be a video chat interface, which may be shown in (1) in fig. 10a. At this time, the folding screen of the folding screen mobile phone is in the unfolded state, and the external screen of the mobile phone is in the screen-off state, which may be shown in (2) in fig. 10a. In this case, since the fifth application can determine that the folding screen mobile phone currently uses the folding screen to display its interface and that the folding screen is in the unfolded state, the secondary screen collaboration control can be displayed in the folding screen display interface, as shown in (1) in fig. 10a.
In another case, the user uses the fifth application on the external screen of the folding screen mobile phone, and the display interface of the fifth application may be a video chat interface. Because the fifth application can determine that the folding screen mobile phone currently uses the external screen to display its interface and that the folding screen is in the folded state, the secondary screen collaboration control can be set to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
In still another case, the folding screen of the folding screen mobile phone displays the interface shown in (1) in fig. 10a, and if the user performs a folding operation on the folding screen so that the folding screen is in the folded state, the secondary screen collaboration control displayed on the interface is switched from the controllable mode to an uncontrollable mode, such as a hidden mode or an icon-greyed mode.
With continued reference to (1) in fig. 10a, the user clicks the secondary screen collaboration control. In response to the user operation, the folding screen mobile phone continues to display the video chat interface on the folding screen (see (1) in fig. 10b), and simultaneously displays the video chat interface on the external screen (see (2) in fig. 10b).
Therefore, the video chat interface can be displayed on the folding screen and the external screen simultaneously, so that both the person shooting and the person being shot can observe the shooting effect, which improves the user experience.
In order to implement the technical solution provided by the embodiments of the application, the device state management system provides a query interface for the application program, so that the application program can query which screen the folding screen mobile phone uses to display the application interface and the posture information of the folding screen, and further determine whether to display the first control.
Fig. 11a is a schematic flow chart of a display method according to an embodiment of the present application. As shown in fig. 11a, the steps of the display method specifically include:
S601, the application program receives a first operation of a user.
The application program may be any third-party application installed in the folding screen mobile phone, such as an instant messaging application (e.g., WeChat), an image processing application (e.g., a beauty show application), and the like, and may also be a system application (i.e., an application that comes with the electronic device when it is initially set up). A third-party application is an application that does not come with the electronic device when it is initially set up, including an application that is run by downloading an application package.
The first operation may be an opening operation of the application program. The user may perform a start operation of opening the application from a desktop, a negative one-screen menu, a pull-up menu, a pull-down menu, or any shortcut menu. For example, a user may perform a first operation on an application icon on a desktop to launch the application.
The first operation may also be an operation of opening a certain function in the application program; the function may be, for example, a function related to shooting, such as a video function, a video recording function, a shooting function, and the like. The user may perform the first operation on a function of the application in an application interface of the application to open the function. For example, the user may perform the first operation on the video call function icon in a WeChat chat interface to open a video call.
S602, the application program sends a device type query instruction to the device state management system.
In response to a first operation of a user on a certain third-party application, the third-party application sends a device type query instruction to the device state management system. The device state management system is provided with an interface for third-party applications to query. Through the interface, data interaction between the third-party application and the device state management system can be realized. If the device state management system does not have the interface, the third-party application cannot directly communicate with the device state management system. The third-party application can acquire related information of the electronic device through the interface. For example, the third-party application sends a device type query instruction through the query interface preset on the device state management system, and obtains the device type information fed back by the device state management system through the query interface. It can be appreciated that the interaction between the third-party application and the device state management system can be performed through the query interface preset on the device state management system.
The device type query instruction is used for querying device information of the electronic device, where the device information may include display screen information. The display screen information may be used to indicate information related to the display screens, such as whether the electronic device includes multiple display screens, whether the electronic device includes an inward-folding screen, and the positional relationship between the display screens when the electronic device has multiple display screens.
S603, the device state management system sends the first feedback information to the application program.
In some embodiments of the present application, after receiving a device type query instruction, the device state management system generates first feedback information based on device information of the electronic device, and sends the generated first feedback information to an application program, where the first feedback information may include the device information. The first feedback information may be used to determine whether the electronic device is a folding screen electronic device. The folding screen electronic device comprises a first screen and a second screen arranged opposite to the first screen. The first screen may be an inwardly folded screen. In some embodiments of the present application, the display direction of the second screen may be opposite to the display direction of the first screen in the unfolded state.
S604, the application program determines whether the electronic device is a folding screen electronic device according to the first feedback information, if so, S605 is executed.
If the electronic equipment is determined not to be the folding screen electronic equipment, the flow of the display method is ended. In some embodiments of the present application, if it is determined that the electronic device is not a folding screen electronic device, an application interface of the application may be displayed according to a conventional display manner.
S605, the application program sends a system display state query instruction to the device state management system.
S606, the device state management system sends second feedback information to the application program.
In some embodiments of the present application, after receiving the system display state query instruction, the device state management system generates second feedback information based on the display information of the electronic device, and sends the generated second feedback information to the application program, where the second feedback information is used to indicate the system display state of the electronic device, and the system display state includes a first screen display and a second screen display. The display information is used to indicate the current system display state of the electronic device. The system display state is used to indicate which display screen the electronic device is currently using: the first screen display indicates that the electronic device currently displays using the first screen, and the second screen display indicates that the electronic device currently displays using the second screen.
S607, the application program determines whether the system display state of the electronic device is the first screen display according to the second feedback information, if yes, then S608 is executed.
If the system display state of the electronic equipment is not the first screen display, ending the flow of the application interface display method. In some embodiments of the present application, if it is determined that the system display state of the electronic device is not the first screen display, an application interface of the application may be displayed in a conventional display manner.
S608, the application program sends a gesture query instruction to the device state management system.
The gesture query instruction is configured to query a gesture of the first screen, where the gesture may include an unfolded state, a semi-folded state, and a folded state, and the relevant description about the gesture of the folded screen may be referred to the relevant description above, which is not repeated herein. In the embodiments of the present application, the semi-folded state and the folded state may be collectively referred to as a non-unfolded state. The gesture of the first screen may be used as a gesture of the electronic device, for example, when the gesture of the first screen is in the unfolded state, it is determined that the electronic device is in the unfolded state; when the gesture of the first screen is in the non-unfolding state, the electronic device is determined to be in the non-unfolding state.
S609, the device state management system sends the third feedback information to the application.
In some embodiments of the present application, after receiving the gesture query instruction, the device state management system generates third feedback information based on the sensor information of the electronic device, and sends the generated third feedback information to the application program. The third feedback information is used for indicating the gesture of the first screen of the electronic device, the gesture comprises a non-unfolding state and an unfolding state, and the sensor information comprises acceleration sensor information and gyroscope sensor information.
In some embodiments of the present application, as shown in fig. 5b, the angle calculating module may obtain sensor information (such as acceleration sensor information and gyroscope sensor information) uploaded by the sensor driving module, and determine an included angle of the folding screen of the electronic device according to the uploaded sensor information. The angle calculation module sends the determined included angle information of the folding screen of the electronic device to the device state management system so that the device state management system determines the gesture of the folding screen (namely the gesture information of the first screen), and generates third feedback information based on the gesture information of the folding screen of the electronic device.
S610, the application program determines whether the current gesture of the first screen is an unfolding state according to the third feedback information, and if not, S611 is executed.
And if the current gesture of the first screen is not the unfolding state, ending the flow of the application interface display method. In some embodiments of the present application, if it is determined that the current gesture of the first screen is not the expanded state, an application interface of the application may be displayed in a conventional display manner.
S611, the folding screen mobile phone displays an application interface of the application program on the first screen, wherein the application interface comprises a first control.
In some applications, the display of the first control is triggered by other conditions (such as a geofence) besides the gesture of the folding screen, and the first control is displayed in an application interface displayed on the first screen only when the conditions are met.
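Tying the queries of S602–S611 together on the application side, a minimal sketch against the hypothetical DeviceStateManagerService sketched earlier might look as follows; the class name and the extraConditionsMet parameter (e.g., a geofence result) are assumptions for this illustration.

```java
// Illustrative-only sketch of the application-side checks of S602-S611, written
// against the hypothetical query interface sketched earlier in this description.
final class SecondaryScreenGate {

    private final DeviceStateManagerService stateService;

    SecondaryScreenGate(DeviceStateManagerService stateService) {
        this.stateService = stateService;
    }

    /** Returns true if the first control (secondary screen collaboration) should be shown. */
    boolean shouldShowFirstControl(boolean extraConditionsMet) {
        if (!stateService.isFoldableDevice()) {
            return false;                                   // S604: not a folding screen device
        }
        if (stateService.getSystemDisplayState()
                != DeviceStateManagerService.DISPLAY_FIRST_SCREEN) {
            return false;                                   // S607: not displayed on the first screen
        }
        if (stateService.getPosture()
                != DeviceStateManagerService.POSTURE_UNFOLDED) {
            return false;                                   // S610: first screen not unfolded
        }
        return extraConditionsMet;                          // e.g. a geofence condition, if any
    }
}
```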
In an embodiment of the present application, the first control may be used to invoke a second screen (i.e., an external screen) of the folding screen device, so that the first screen and the second screen of the folding screen device cooperatively display.
After the application layer loads the application interface of the application, a first interface corresponding to the application is displayed on a first screen of the electronic equipment. After the user clicks the first control, a first interface corresponding to the application is continuously displayed on the first screen, and a second interface corresponding to the application is displayed on the second screen, which can be shown in fig. 11 b. The content of the second interface may be the same as the first interface or may be different from the first interface. The content displayed in the second interface is related to the function indicated by the first control or to a co-display function provided by the application program.
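The embodiment does not specify how the target application places the second interface on the second screen; purely as an illustration, a similar effect can be achieved on stock Android with the multi-display APIs, as sketched below. SecondaryInterfaceActivity is a hypothetical activity of the target application and would need to be declared in the manifest of a real application.

```java
// Illustration only: showing a second activity on another display with stock
// Android multi-display APIs; not asserted to be the embodiment's mechanism.
import android.app.Activity;
import android.app.ActivityOptions;
import android.content.Intent;
import android.hardware.display.DisplayManager;
import android.view.Display;

/** Hypothetical activity rendering the second interface; must be declared in the manifest. */
class SecondaryInterfaceActivity extends Activity {
}

final class SecondaryScreenLauncher {

    /** Launches the second interface on the first non-default display, if one exists. */
    static void showSecondInterface(Activity current) {
        DisplayManager displayManager = current.getSystemService(DisplayManager.class);
        for (Display display : displayManager.getDisplays()) {
            if (display.getDisplayId() != Display.DEFAULT_DISPLAY) {
                ActivityOptions options = ActivityOptions.makeBasic();
                options.setLaunchDisplayId(display.getDisplayId());
                Intent intent = new Intent(current, SecondaryInterfaceActivity.class)
                        .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                current.startActivity(intent, options.toBundle());
                return;
            }
        }
    }
}
```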
In some embodiments of the present application, after the user clicks the first control, a first interface corresponding to the application is continuously displayed on the first screen, where the first interface may further include a second control (e.g. cancel the secondary screen collaboration control). And if the user clicks the second control, stopping displaying the second application interface on the second screen in response to the user operation, and continuously displaying the first interface corresponding to the application on the first screen. Therefore, by setting the second control, the display scene before the user clicks the first control can be restored, the user can conveniently select a proper mode to use the electronic equipment, and the control convenience of the application interface display method is improved.
In some embodiments of the present application, as shown in fig. 12, the display method further includes:
S701, the application registers a first event on the device state management system.
The first event is used for controlling the device state management system to monitor the system display state of the electronic device, such as determining whether the display state is switched from the first screen display to the second screen display, and determining whether the display state is switched from the second screen display to the first screen display.
When the device state management system monitors that the system display state of the electronic device changes (such as switching from the first screen display to the second screen display and switching from the second screen display to the first screen display), the device state management system notifies an application program of the change of the system display state of the electronic device. The user can adjust the system display state of the electronic equipment by performing display state switching operation on the electronic equipment. The display state switching operation includes related operations of adjusting a display setting of the electronic device, such as clicking a system display state switching control, and the like. After the user performs the display state switching operation, the system display state of the electronic device may be switched from the first screen display to the second screen display, or the system display state of the electronic device may be switched from the second screen display to the first screen display.
S702, the application registers a second event on the device state management system.
The second event is used for controlling the equipment state management system to monitor the gesture of the first screen, determining whether the gesture of the first screen changes, if so, changing from a non-unfolding state to a unfolding state, changing from the unfolding state to the non-unfolding state, and the like. When the device state management system monitors that the gesture of the electronic device changes, the device state management system notifies the application program of the change of the gesture of the electronic device (such as changing from a non-unfolding state to a unfolding state and from the unfolding state to the non-unfolding state). The user can adjust the posture of the first screen by performing a posture adjustment operation on the first screen.
The execution times of S701 and S702 may be the same or different; in the embodiment of the present application, the execution times of S701 and S702 are not limited. For example, S701 may be executed after S611 in fig. 11a. According to the above technical solution, by monitoring the system display state of the electronic device and the folding and unfolding changes of the first screen, such changes can be detected in a timely manner.
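For illustration, the first and second events could be modeled as listener registrations on the device state management system; the listener and method names below are assumptions for this sketch.

```java
// Illustrative-only sketch of the first and second events of S701/S702 as listener
// registrations; names are assumptions for this example.
final class DeviceStateEvents {

    interface DisplayStateListener {            // "first event": system display state changes
        void onSystemDisplayStateChanged(int newDisplayState);
    }

    interface PostureListener {                 // "second event": first screen posture changes
        void onPostureChanged(int newPosture);
    }

    private DisplayStateListener displayStateListener;
    private PostureListener postureListener;

    void registerDisplayStateListener(DisplayStateListener listener) {
        this.displayStateListener = listener;
    }

    void registerPostureListener(PostureListener listener) {
        this.postureListener = listener;
    }

    // Called internally when the monitored display state changes.
    void notifyDisplayStateChanged(int state) {
        if (displayStateListener != null) {
            displayStateListener.onSystemDisplayStateChanged(state);
        }
    }

    // Called internally when the monitored posture changes.
    void notifyPostureChanged(int posture) {
        if (postureListener != null) {
            postureListener.onPostureChanged(posture);
        }
    }
}
```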
In some embodiments of the present application, when the application receives a change in the system display state of the electronic device (e.g., a switch from the first screen display to the second screen display) and determines that the current system display state of the electronic device is not the first screen display, or when the application receives a change in the first screen gesture (e.g., a change from the unfolded state to a non-unfolded state) and determines that the gesture of the first screen is not the unfolded state, the first control included on the application interface is set to an uncontrollable mode. If the user clicks the first control set in the uncontrollable mode, the function corresponding to the first control cannot be invoked. Uncontrollable modes may include an icon-greyed mode and an icon-hidden mode.
In the above embodiment, when the system display state of the electronic device is not the first screen display or when the gesture of the first screen is not the unfolded state, the first control included on the application interface is set to the uncontrollable mode in time, so that the situation that when the system display state is not the first screen display or when the gesture of the first screen is not the unfolded state, the user clicks the first control but cannot realize the corresponding function of the first control is avoided, and resource loss of the electronic device can be correspondingly reduced.
In some embodiments of the present application, when a user clicks a first control set in an uncontrollable mode, a corresponding prompt is displayed on a current display interface according to a preset prompt rule, where the prompt is used to prompt the user to set a system display state of the electronic device to a first screen display state and change a gesture of the first screen to an expanded state.
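As an illustration only, the uncontrollable mode (icon greying or hiding) and the prompt shown when the greyed-out control is tapped could be implemented with standard Android view and Toast APIs, as in the sketch below; the embodiment does not state that it uses these APIs, and the prompt text is a placeholder.

```java
// Illustration only: an "uncontrollable mode" for the first control and a prompt,
// using standard Android APIs; not asserted to be the embodiment's mechanism.
import android.content.Context;
import android.view.View;
import android.widget.Toast;

final class FirstControlMode {

    /** Icon-greyed mode: the control stays visible but is dimmed and disabled. */
    static void setGreyedOut(View firstControl) {
        firstControl.setEnabled(false);
        firstControl.setAlpha(0.4f);   // dimming value is illustrative
    }

    /** Icon-hidden mode: the control is removed from the layout. */
    static void setHidden(View firstControl) {
        firstControl.setVisibility(View.GONE);
    }

    /** Controllable mode restored, e.g. when the first screen is displayed and unfolded again. */
    static void setControllable(View firstControl) {
        firstControl.setVisibility(View.VISIBLE);
        firstControl.setEnabled(true);
        firstControl.setAlpha(1.0f);
    }

    /** Prompt the application may show when the control is tapped while uncontrollable. */
    static void showPrompt(Context context) {
        Toast.makeText(context,
                "Switch display to the inner screen and unfold it to use this function",
                Toast.LENGTH_SHORT).show();
    }
}
```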
The application registers the first event and the second event on the device state management system, and the description of the first event and the second event may be found in the above description. And when the application program determines that the system display state of the electronic device is the first screen display state and the gesture of the first screen is the unfolding state, canceling the uncontrollable mode of the first control. At the moment, the user can realize the function of the auxiliary screen collaborative display by clicking the first control. According to the embodiment, when the user clicks the first control set in the uncontrollable mode, the application scene condition corresponding to the first control is displayed, so that the user can conveniently adjust the application scene condition.
In some embodiments of the present application, as shown in fig. 13, after S607, if it is determined that the system display state of the electronic device is not the first screen display, the display method may further include:
S801, the application program loads an application interface, and the application interface includes a first control set to an uncontrollable mode.
After the application program loads the application interface, the corresponding application interface is displayed on the folding screen of the electronic device. For a description of the uncontrollable modes, reference may be made to the description above, which is not repeated here.
S802, the application program registers a first event and a second event on the device state management system.
For a description of the first event and the second event, reference may be made to the description of fig. 12 above, and details thereof are not repeated here.
S803, when the application program determines that the system display state of the electronic device is the first screen display state and the first screen is the unfolded state, the uncontrollable mode of the first control is canceled.
In some embodiments of the present application, when a user clicks a first control set in an uncontrollable mode, a corresponding prompt is displayed on a current display interface according to a preset prompt rule, where the prompt is used to prompt the user to set a system display state of the electronic device to a first screen display, and change a gesture of the first screen to an expanded state. Reference may be made to the related descriptions above for some specific implementations of this embodiment, and no further description is given here.
According to the technical scheme, when the system display state of the electronic equipment is not the first screen display, the first control set to the uncontrollable mode is displayed on the application interface of the application, the system display state of the electronic equipment and the gesture change of the first screen are monitored, the uncontrollable mode of the first control is canceled when the application scene condition corresponding to the first control is met, and a user can conveniently use the first control to execute a corresponding function.
In some embodiments of the present application, as shown in fig. 14, after S610, if it is determined that the gesture of the first screen of the electronic device is not the unfolded state, the display method may further include:
S901, an application interface of the application is displayed on the first screen, wherein the application interface includes a first control set to an uncontrollable mode.
After the application program loads the application interface, the corresponding application interface is displayed on the folding screen of the electronic device. For a description of the uncontrollable modes, reference may be made to the description above, which is not repeated here.
S902, the application registers the first event and the second event on the device state management system.
For a description of the first event and the second event, reference may be made to the description of fig. 13, which is not repeated herein.
S903, when the application program determines that the system display state of the electronic device is the first screen display state and the first screen is the unfolded state, the uncontrollable mode of the first control is canceled.
In some embodiments of the present application, when a user clicks a first control set in an uncontrollable mode, a corresponding prompt is displayed on a current display interface according to a preset prompt rule, where the prompt is used to prompt the user to set a system display state of the electronic device to a first screen display, and change a gesture of the first screen to an expanded state. Reference may be made to the related descriptions above for some specific implementations of this embodiment, and no further description is given here.
According to the embodiment, when the application program determines that the gesture of the first screen of the electronic device is not in the unfolded state, the first control set to the uncontrollable mode is displayed on the application interface of the application, the system display state of the electronic device and the gesture change of the first screen are monitored, and when the application scene condition corresponding to the first control is met, the uncontrollable mode of the first control is canceled, so that a user can conveniently use the first control to execute a corresponding function.
The present embodiment also provides a computer storage medium having stored therein computer instructions which, when executed on an electronic device, cause the electronic device to perform the above-described related method steps to implement the display method in the above-described embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described relevant steps to implement the display method in the above-described embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is configured to store computer-executable instructions, and when the device is running, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the display method in the above method embodiments.
The electronic device (such as a folding screen mobile phone), the computer storage medium, the computer program product, and the chip provided in this embodiment are all configured to execute the corresponding method provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding method provided above, which are not repeated here.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A display method applied to a folding screen apparatus including a first screen and a second screen, the first screen being a folding screen, the method comprising:
displaying a first interface of a target application on the first screen; the first screen is in an unfolding state, and a first control is displayed on the first interface;
in response to the operation of the first control, continuing to display the first interface on the first screen, and displaying a second interface of the target application on the second screen; the second interface is related to a function indicated by the first control, or the second interface is related to a co-display function of a secondary screen provided by the target application.
2. The method of claim 1, wherein displaying a first control on the first interface comprises:
if the target application queries that the folding screen device currently uses the first screen to display the interface of the target application and the first screen is in the unfolding state, displaying the first control on the first interface.
3. The method of claim 1, wherein displaying a first control on the first interface comprises:
if the target application queries that the folding screen device currently uses the first screen to display the interface of the target application, the first screen is in the unfolding state, and it is determined that a preset condition is currently met, displaying the first control on the first interface.
4. The method of claim 2, wherein the first interface displays first content and the second interface includes a first region displaying the first content and a second region for user-filled content.
5. The method of claim 2, wherein a second content is displayed in the first interface, and wherein progress information of the second content is synchronously displayed in the second interface.
6. The method of claim 2, wherein a third content is displayed in the first interface, user status information corresponding to the third content is displayed in the second interface, and the second interface is a screen-off display interface.
7. A method according to claim 3, wherein the preset condition is a geofence detection condition; the first interface is a video interface, and the second interface displays two-dimensional code information.
8. The method of claim 1, further comprising, after displaying the second interface of the target application on the second screen:
canceling display of the first control on the first interface and displaying a second control; the second control is used for closing the co-display function of the secondary screen provided by the target application.
9. The method as recited in claim 8, further comprising:
in response to the operation of the second control, stopping displaying an interface on the second screen, and performing screen-off processing on the second screen.
10. A method according to claim 2 or 3, further comprising, after displaying a first control on the first interface:
and if the target application inquires that the current screen used by the folding screen device is switched from the first screen to the second screen or the first screen is switched from an unfolding state to a folding state, the first control is switched from a controllable mode to an uncontrollable mode.
11. A method according to claim 2 or 3, wherein the folding screen device comprises a device state management system, the method further comprising:
the target application registers a first event with the device state management system, wherein the first event is used for controlling the device state management system to monitor the system display state of the folding screen device and to notify the target application of a change in the display state;
the target application registers a second event with the device state management system, wherein the second event is used for controlling the device state management system to monitor gesture information of the first screen and to notify the target application of a change in the gesture information.
12. A folding screen apparatus, comprising:
one or more processors;
a memory;
and one or more computer programs, wherein the one or more computer programs are stored on the memory, which when executed by the one or more processors, cause the folding screen device to perform the display method of any of claims 1-11.
13. A computer readable storage medium comprising a computer program, characterized in that the computer program, when run on an electronic device, causes the electronic device to perform the display method according to any one of claims 1-11.
CN202211038392.1A 2022-08-29 2022-08-29 Display method and folding screen device Active CN116048686B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211038392.1A CN116048686B (en) 2022-08-29 2022-08-29 Display method and folding screen device
CN202311464896.4A CN117992159A (en) 2022-08-29 2022-08-29 Display method and folding screen device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211038392.1A CN116048686B (en) 2022-08-29 2022-08-29 Display method and folding screen device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311464896.4A Division CN117992159A (en) 2022-08-29 2022-08-29 Display method and folding screen device

Publications (2)

Publication Number Publication Date
CN116048686A 2023-05-02
CN116048686B 2023-11-24

Family

ID=86113882

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211038392.1A Active CN116048686B (en) 2022-08-29 2022-08-29 Display method and folding screen device
CN202311464896.4A Pending CN117992159A (en) 2022-08-29 2022-08-29 Display method and folding screen device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311464896.4A Pending CN117992159A (en) 2022-08-29 2022-08-29 Display method and folding screen device

Country Status (1)

Country Link
CN (2) CN116048686B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917956A (en) * 2019-02-22 2019-06-21 华为技术有限公司 It is a kind of to control the method and electronic equipment that screen is shown
CN111949345A (en) * 2019-05-14 2020-11-17 华为技术有限公司 Application display method and electronic equipment
CN114237530A (en) * 2020-01-21 2022-03-25 华为技术有限公司 Display method and related device of folding screen
CN114401373A (en) * 2022-03-24 2022-04-26 荣耀终端有限公司 Method for displaying on two screens simultaneously, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN116048686B (en) 2023-11-24
CN117992159A (en) 2024-05-07

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant