CN115002879B - Method and device for displaying camera interface - Google Patents
Method and device for displaying a camera interface
- Publication number
- CN115002879B CN115002879B CN202210259205.6A CN202210259205A CN115002879B CN 115002879 B CN115002879 B CN 115002879B CN 202210259205 A CN202210259205 A CN 202210259205A CN 115002879 B CN115002879 B CN 115002879B
- Authority
- CN
- China
- Prior art keywords
- interface
- control module
- screen
- camera
- cooperative function
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W52/00—Power management, e.g. TPC [Transmission Power Control], power saving or power classes
- H04W52/02—Power saving arrangements
- H04W52/0209—Power saving arrangements in terminal devices
- H04W52/0261—Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level
- H04W52/0274—Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level by switching on or off the equipment or parts thereof
- H04W52/028—Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level by switching on or off the equipment or parts thereof switching on or off only a part of the equipment circuit blocks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
- H04M1/72439—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Environmental & Geological Engineering (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Studio Devices (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application relates to the field of terminals and provides a method and a device for displaying a camera interface. The method is applied to a foldable terminal device that includes an inner screen and an outer screen, and comprises the following steps: receiving a first operation for opening a camera APP; in response to the first operation, displaying a first interface of the camera APP on the inner screen, where the first interface includes a cooperative function control used to turn the cooperative function on or off; receiving a second operation performed by the user on the cooperative function control, where the second operation turns on the cooperative function; in response to the second operation, displaying a second interface of the camera APP on the outer screen; and when the terminal device is detected to be in the closed state, closing the first interface on the inner screen and the second interface on the outer screen, and displaying a third interface of the camera APP on the outer screen. The method can reduce the power consumption of the foldable terminal device when the collaborative shooting function is used.
Description
The present application claims priority to the Chinese patent application entitled "Method and apparatus for displaying a camera interface", filed with the China National Intellectual Property Administration on December 31, 2021, under application number 202111679509.X, the entire contents of which are incorporated herein by reference.
Technical Field
The application relates to the field of terminals, in particular to a method and a device for displaying a camera interface.
Background
Terminal devices typically include a display screen through which a user interacts with the device, and some terminal devices include multiple display screens located on opposite surfaces of the device. When the collaborative shooting function is enabled, the terminal device can display the same or different user interfaces (UIs) on multiple display screens, providing a rich shooting experience.
Some terminal devices can be folded and can therefore assume multiple states, such as an unfolded state, a closed state, and a half-folded state; terminal devices in different states suit different scenarios. Because collaborative shooting is a relatively power-hungry function, how to reduce the power consumption of a foldable terminal device while the collaborative shooting function is in use is a problem that currently needs to be solved.
Disclosure of Invention
Embodiments of the present application provide a method, an apparatus, a computer-readable storage medium, and a computer program product for displaying a camera interface, which can reduce power consumption when a foldable terminal device uses a collaborative shooting function.
In a first aspect, a method for displaying a camera interface is provided. The method is applied to a foldable terminal device that includes an inner screen and an outer screen, and comprises: receiving a first operation for opening a camera application (APP) while the terminal device is in the unfolded state; in response to the first operation, displaying a first interface of the camera APP on the inner screen, where the first interface includes a cooperative function control used to turn the cooperative function on or off; receiving a second operation performed by the user on the cooperative function control, where the second operation turns on the cooperative function; in response to the second operation, displaying a second interface of the camera APP on the outer screen; and when the terminal device is detected to be in the closed state, closing the first interface on the inner screen and the second interface on the outer screen, and displaying a third interface of the camera APP on the outer screen.
After taking a picture, the user usually wants to see the result, so when the user folds the terminal device, it may simply be to hold the device more conveniently while viewing the shooting result, not to close the camera APP. If the terminal device closed the camera APP upon receiving the folding operation (e.g., if the activity of the camera APP were destroyed), the user would need to restart the camera APP to view the result, and the camera APP would have to reload resources such as executable files and dynamic libraries, consuming more power. With the method provided by the application, the terminal device does not close the camera APP after detecting the folding operation but displays a third interface of the camera APP (which may be the same as, or different from, the first or second interface) on the outer screen; when the user wants to view the shooting result, the camera APP need not be restarted, which reduces power consumption. The user can also check the shooting result quickly, so the method improves user experience.
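The fold-handling flow of the first aspect can be sketched as follows. This is an illustrative model, not part of the patent: the interface names and class are hypothetical, and Python is used only for concreteness.

```python
from enum import Enum

class FoldState(Enum):
    EXPANDED = 1
    HALF_FOLDED = 2
    CLOSED = 3

class CameraDisplayController:
    """Hypothetical sketch of the first-aspect flow: folding closes both
    interfaces but keeps the camera APP alive, so no reload of executables
    or dynamic libraries is needed."""

    def __init__(self):
        self.app_running = False
        self.inner = None  # interface currently shown on the inner screen
        self.outer = None  # interface currently shown on the outer screen

    def open_camera_app(self):
        # First operation: the first interface appears on the inner screen.
        self.app_running = True
        self.inner = "first_interface"

    def enable_cooperative_function(self):
        # Second operation: a second interface is shown on the outer screen.
        if self.inner == "first_interface":
            self.outer = "second_interface"

    def on_fold_change(self, state):
        # Key point: close both interfaces, keep the APP running, and show
        # the third interface on the outer screen.
        if state is FoldState.CLOSED and self.app_running:
            self.inner = None
            self.outer = "third_interface"

ctrl = CameraDisplayController()
ctrl.open_camera_app()
ctrl.enable_cooperative_function()
ctrl.on_fold_change(FoldState.CLOSED)
print(ctrl.inner, ctrl.outer, ctrl.app_running)
# → None third_interface True
```

Because `app_running` stays `True` across the fold, a subsequent unfold can restore the interfaces without a cold start, which is where the power saving comes from.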
In one implementation, closing a first interface on an inner screen and a second interface on an outer screen includes: the folding state control module sends a first message to the cooperative function control module, wherein the first message instructs the cooperative function control module to close the first preview frame data stream, and the first preview frame data stream is a preview frame data stream associated with the inner screen; and the cooperative function control module responds to the first message and calls the camera service to close the first preview frame data stream.
When a user shoots with the cooperative function, the camera UI is first displayed on the inner screen, so the preview frame data stream (session) that is initially established is associated with the inner screen, and the camera UI on the outer screen reuses that stream through a surface sharing mechanism. After the terminal device is folded, the inner screen is no longer used and a preview frame data stream associated with the outer screen must be established; the cooperative function control module therefore calls the camera service to close the first preview frame data stream so that a second preview frame data stream (the one associated with the outer screen) can be established.
In one implementation, the method further comprises: and the cooperative function control module responds to the first message and calls the display module to perform power-off processing on the inner screen.
After the terminal device is folded, the inner screen is occluded, so powering it off reduces power consumption.
In one implementation, closing the first interface on the inner screen and the second interface on the outer screen further comprises: the folding state control module sends a second message to the cooperative function control module, and the second message instructs the cooperative function control module to close the cooperative function; and the cooperative function control module responds to the second message and closes the cooperative function switch on the first interface through the interface control module.
After the terminal device is folded, the inner screen is occluded and the collaborative shooting scenario no longer applies. Turning off the collaborative function switch on the first interface lets the user perceive the current collaborative shooting state, improving user experience.
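The message handling described in the two implementations above can be sketched as a small dispatcher. The message identifiers and flag names below are hypothetical; the patent names the messages but does not specify their encoding.

```python
# Hypothetical identifiers for the messages sent by the folding state
# control module to the cooperative function control module.
MSG_CLOSE_FIRST_STREAM = "first_message"
MSG_CLOSE_COOPERATIVE = "second_message"

class CooperativeFunctionControlModule:
    """Illustrative sketch: react to fold-time messages by closing the
    inner-screen preview stream, powering off the inner screen, and
    turning off the cooperative function switch on the first interface."""

    def __init__(self):
        self.stream_open = True          # first preview frame data stream
        self.inner_screen_powered = True
        self.cooperative_switch_on = True

    def handle(self, msg):
        if msg == MSG_CLOSE_FIRST_STREAM:
            # Call the camera service to close the stream and the display
            # module to power off the occluded inner screen.
            self.stream_open = False
            self.inner_screen_powered = False
        elif msg == MSG_CLOSE_COOPERATIVE:
            # Call the interface control module so the user can perceive
            # the state change on the first interface.
            self.cooperative_switch_on = False

module = CooperativeFunctionControlModule()
for msg in (MSG_CLOSE_FIRST_STREAM, MSG_CLOSE_COOPERATIVE):
    module.handle(msg)
print(module.stream_open, module.inner_screen_powered, module.cooperative_switch_on)
# → False False False
```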
In one implementation, displaying a third interface of a camera APP on an outer screen includes: the folding state control module sends a preview request to the camera service; the camera service responds to the preview request and acquires a second preview frame data stream; the camera service sends a second preview frame data stream to the interface control module, wherein the second preview frame data stream is a preview frame data stream associated with the outer screen; and the interface control module displays a third interface on the outer screen according to the second preview frame data stream.
After the terminal device is folded, the inner screen is no longer used and the first preview frame data stream has been closed, so a preview frame data stream associated with the outer screen (the second preview frame data stream) must be established for the outer screen to display the third interface normally.
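The stream handover above (close the inner-screen stream, then open the outer-screen stream) can be sketched as follows. The `CameraService` class and its method names are illustrative, not the patent's actual camera service API.

```python
class CameraService:
    """Hypothetical sketch: only one preview frame data stream is active
    at a time, so the first stream must be closed before the second can
    be established."""

    def __init__(self):
        self.active_stream = None

    def open_preview_stream(self, screen):
        assert self.active_stream is None, "close the current stream first"
        self.active_stream = f"preview_stream:{screen}"
        return self.active_stream

    def close_preview_stream(self):
        self.active_stream = None

service = CameraService()
first = service.open_preview_stream("inner")   # created when the camera APP opens
# Fold detected: tear down the inner-screen stream, then establish the
# outer-screen stream that drives the third interface.
service.close_preview_stream()
second = service.open_preview_stream("outer")
print(second)
# → preview_stream:outer
```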
In one implementation, the terminal device further includes an outer screen camera, and the third interface includes an image captured by the outer screen camera and a cooperative function control.
Because the third interface contains a cooperative function control, the user can enable the cooperative function directly from the outer screen. Compared with a scheme in which the cooperative function can only be enabled through the first interface on the inner screen, this embodiment provides an additional entry to the cooperative function, which is more convenient and improves user experience.
In one implementation, the method further comprises: when the terminal device is detected to be in the unfolded state, displaying the first interface on the inner screen; receiving a third operation performed by the user on the cooperative function control, where the third operation turns on the cooperative function; in response to the third operation, displaying the second interface on the outer screen; when no new operation is detected within a preset time period after the third operation, the interface control module sends a third message to the cooperative function control module, where the third message instructs the cooperative function control module to turn off the cooperative function; and the cooperative function control module, in response to the third message, calls the display module to power off the outer screen and turn off (screen-off) the inner screen.
If no new operation is received within the preset time period, the user is most likely no longer using the terminal device, so powering off the outer screen and turning off the inner screen saves power. In addition, an inner screen in the screen-off (rather than powered-off) state can respond quickly to a wake-up operation, improving user experience.
In one implementation, the method further comprises: and the cooperative function control module responds to the third message and calls the camera service to close the current preview frame data stream.
After the inner screen is turned off and the outer screen is powered off, there is no longer any point in continuing to acquire the preview frame data stream, so closing it reduces power consumption.
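The idle-timeout teardown described in the last two implementations can be sketched as follows. The class, the 30-second timeout, and the flag names are hypothetical; the patent only specifies "a preset time period".

```python
class IdleTimeoutHandler:
    """Illustrative sketch: when no operation arrives within the preset
    period, power off the outer screen, screen-off the inner screen
    (keeping it powered for fast wake-up), and close the preview stream."""

    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.outer_powered = True
        self.inner_lit = True
        self.stream_open = True

    def on_idle(self, idle_s):
        if idle_s >= self.timeout_s:
            self.outer_powered = False  # power-off: outer screen unused
            self.inner_lit = False      # screen-off only: quick wake-up
            self.stream_open = False    # no consumer left for preview frames

handler = IdleTimeoutHandler(timeout_s=30)  # hypothetical preset period
handler.on_idle(45)
print(handler.outer_powered, handler.inner_lit, handler.stream_open)
# → False False False
```

The asymmetry (outer screen powered off, inner screen merely screen-off) mirrors the trade-off stated above: maximum savings on the unused screen, fast wake-up on the one the user will tap.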
In one implementation, the method further comprises: when the interface control module detects a wake-up operation, the interface control module sends a fourth message to the cooperative function control module, where the fourth message instructs the cooperative function control module to turn on the cooperative function; the cooperative function control module, in response to the fourth message, calls the display module to power on the outer screen and light up the inner screen; the cooperative function control module, in response to the fourth message, calls the camera service to acquire a third preview frame data stream; and the interface control module processes the third preview frame data stream, displays the first interface on the inner screen, and displays the second interface on the outer screen.
In this embodiment, the interface control module records the camera state (the collaborative shooting state) before the screen is turned off, so that after receiving the wake-up operation it can instruct the cooperative function control module to turn the cooperative function back on without requiring any user operation, thereby improving user experience.
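The record-then-restore behavior above can be sketched as follows. The class and interface names are hypothetical placeholders for the patent's interface control module.

```python
class InterfaceControlModule:
    """Illustrative sketch: record the collaborative-shooting state before
    screen-off and restore both interfaces on wake-up without any user
    operation."""

    def __init__(self):
        self.saved_cooperative_state = None
        self.inner = None
        self.outer = None

    def on_screen_off(self, cooperative_enabled):
        # Remember the camera state before the screens go dark.
        self.saved_cooperative_state = cooperative_enabled
        self.inner = None
        self.outer = None

    def on_wakeup(self):
        # Fourth message: re-enable the cooperative function automatically.
        if self.saved_cooperative_state:
            self.inner = "first_interface"
            self.outer = "second_interface"

ui = InterfaceControlModule()
ui.on_screen_off(cooperative_enabled=True)
ui.on_wakeup()
print(ui.inner, ui.outer)
# → first_interface second_interface
```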
In one implementation manner, before the interface control module detects the wake-up operation, the method further includes: the interface control module calls the display module to display prompt information on the inner screen, and the prompt information is used for prompting a user to click the inner screen to wake up the camera APP.
The prompt information lets the user perceive the current camera state and avoids misoperation, thereby improving user experience.
In a second aspect, another method for displaying a camera interface is provided, where the method is applied to a foldable terminal device, where the terminal device includes an inner screen and an outer screen, and the method includes: receiving a first operation of a user, wherein the first operation is used for opening a camera APP; responding to the first operation, displaying an interface of the camera APP according to a folded state of the terminal device, wherein the interface of the camera APP comprises a first interface, when the folded state is a closed state or a semi-folded state, the first interface is displayed on the outer screen, and the inner screen is controlled to be turned off, and the first interface comprises a cooperative work button which is used for controlling whether the terminal device displays the first interface on the inner screen.
When the terminal device is in the closed state, the inner screen is completely occluded; when it is in the half-folded state, the inner screen is partially occluded. Neither condition is conducive to collaborative operation of the camera APP, i.e., the experience of shooting with the inner and outer screens simultaneously is poor. Therefore, when the terminal device is in the closed or half-folded state, displaying the camera APP interface on both screens is pointless; the terminal device displays the interface of the camera APP only on the outer screen, and displays it on both the inner and outer screens only after the device is unfolded, thereby reducing the power consumed by displaying the camera UI.
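A simplified reading of the second-aspect display rule can be sketched as a single mapping function. This is illustrative only (the patent also describes transition-dependent variants, e.g. keeping both screens active when half-folding from the unfolded state), and the state and interface names are placeholders.

```python
def interfaces_for_state(fold_state):
    """Return (inner_screen, outer_screen) contents under the second
    aspect: in the closed or half-folded state only the outer screen
    shows the first interface and the inner screen is off (None)."""
    if fold_state in ("closed", "half_folded"):
        return (None, "first_interface")
    # Unfolded with cooperative work on: first interface on the inner
    # screen, second interface on the outer screen.
    return ("first_interface", "second_interface")

print(interfaces_for_state("closed"))
# → (None, 'first_interface')
print(interfaces_for_state("unfolded"))
# → ('first_interface', 'second_interface')
```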
In one implementation, the method further comprises:
receiving a second operation of the user on the first interface, wherein the second operation is used for opening the cooperative work button;
in response to the second operation, displaying a collaborative work prompt interface on the outer screen, where the collaborative work prompt interface guides the user to unfold the terminal device;
when the terminal device is switched from the closed state or the semi-folded state to the unfolded state, displaying a second interface of the camera APP on the outer screen, and displaying the first interface on the inner screen.
In one implementation, the first interface is a control interface of the camera APP, and the second interface is a preview interface of the camera APP.
In one implementation, the unfolded state is a state when a folding angle of the terminal device is greater than or equal to a first angle threshold.
In one implementation, the method further comprises:
when the terminal device is switched from the unfolded state to the half-folded state, displaying the first interface on the inner screen and displaying the second interface on the outer screen;
when the terminal equipment is switched from the semi-folding state to the closing state, the first interface is displayed on the outer screen, and the inner screen is controlled to be turned off.
In one implementation, the method further comprises:
when the terminal equipment is switched from the unfolding state to the semi-folding state or the closing state, the first interface is displayed on the outer screen, and the inner screen is controlled to be turned off.
In one implementation, the displaying the second interface of the camera APP on the outer screen includes:
the camera APP receives a preview picture from a camera service;
and displaying the second interface containing the preview picture on the outer screen.
In one implementation, the displaying the first interface on the outer screen includes:
the method comprises the steps that a camera APP sends a preview request to an image processing engine IPE through a camera service, and the preview request is used for requesting to obtain a preview picture;
the camera APP receives the preview screen from the IPE through the camera service;
and displaying the first interface containing the preview picture on the outer screen.
In one implementation manner, the closed state is a state when the folding angle of the terminal device is smaller than or equal to a second angle threshold, the semi-folded state is a state when the folding angle of the terminal device is greater than the second angle threshold and smaller than a first angle threshold, and the first angle threshold is greater than the second angle threshold.
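The three-state classification defined by the two angle thresholds above can be expressed directly as a function. The concrete threshold values are hypothetical; the patent fixes only their ordering (first threshold greater than second).

```python
FIRST_ANGLE_THRESHOLD = 150.0   # hypothetical degrees; not fixed by the patent
SECOND_ANGLE_THRESHOLD = 30.0   # hypothetical; must be < FIRST_ANGLE_THRESHOLD

def classify_fold_state(angle):
    """Map a folding angle to a state per the definitions above."""
    if angle >= FIRST_ANGLE_THRESHOLD:
        return "unfolded"      # angle >= first angle threshold
    if angle <= SECOND_ANGLE_THRESHOLD:
        return "closed"        # angle <= second angle threshold
    return "half_folded"       # strictly between the two thresholds

print(classify_fold_state(170), classify_fold_state(90), classify_fold_state(10))
# → unfolded half_folded closed
```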
In a third aspect, an apparatus for displaying a camera interface is provided that includes means for performing any of the methods of the first or second aspect. The apparatus may be a terminal device or a chip in a terminal device, and may include an input unit and a processing unit.
When the apparatus is a terminal device, the processing unit may be a processor, and the input unit may be a communication interface; the terminal device may further comprise a memory for storing computer program code which, when executed by the processor, causes the terminal device to perform the method of any of the first or second aspects.
When the apparatus is a chip in a terminal device, the processing unit may be a logic processing unit inside the chip, and the input unit may be an output interface, a pin, a circuit, or the like; the chip may also include a memory, which may be a memory within the chip (e.g., registers, cache, etc.) or a memory external to the chip (e.g., read-only memory, random access memory, etc.); the memory is adapted to store computer program code which, when executed by the processor, causes the chip to perform any of the methods of the first or second aspects.
In a fourth aspect, there is provided a computer-readable storage medium storing computer program code which, when executed by an apparatus that displays a camera interface, causes the apparatus to perform any one of the methods of the first or second aspect.
In a fifth aspect, there is provided a computer program product comprising computer program code which, when run by an apparatus that displays a camera interface, causes the apparatus to perform any one of the methods of the first or second aspect.
Drawings
FIG. 1 is a schematic diagram of a hardware architecture of a device suitable for use in the present application;
FIG. 2 is a schematic diagram of a software architecture of an apparatus suitable for use in the present application;
FIG. 3 is a schematic illustration of an application scenario suitable for use in the present application;
fig. 4 is a schematic diagram of an unfolded state of the folding-screen terminal device;
fig. 5 is a schematic diagram of a half-folded state and a closed state of the folding-screen terminal device;
FIG. 6 is a schematic diagram of a camera UI for an external screen display;
FIG. 7 is a schematic illustration of a guidance interface provided herein;
FIG. 8 is a schematic diagram of a camera UI for an in-screen display;
FIG. 9 is a schematic diagram of another camera UI for an exterior screen display;
FIG. 10 is a flow chart of a method of displaying a camera interface provided herein;
FIG. 11 is a schematic diagram illustrating rules for displaying a camera interface provided herein;
FIG. 12 is a flow chart of another method of displaying a camera interface provided herein;
FIG. 13 is a schematic diagram of an inner screen camera UI provided herein;
FIG. 14 is a schematic diagram of an external screen camera UI provided by the present application;
FIG. 15 is a flow chart of another method of displaying a camera interface provided herein;
FIG. 16 is a schematic view of another guidance interface provided herein.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 shows a hardware structure of an apparatus suitable for the present application.
The apparatus 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and the like, and the embodiment of the present application does not limit the specific type of the apparatus 100.
The device 100 may include a processor 110, an internal memory 120, a display 130, a sensor module 140, a camera 150, a charge management module 160, a power management module 161, a battery 162, and the like. The display 130 may include two display screens (e.g., the display screen 131 and the display screen 132), or may include more display screens; the sensor module 140 may include a pressure sensor 141, a touch sensor 142, and the like.
The configuration shown in fig. 1 is not intended to specifically limit the apparatus 100. In other embodiments of the present application, the apparatus 100 may include more or fewer components than those shown in FIG. 1, or the apparatus 100 may include a combination of some of the components shown in FIG. 1, or the apparatus 100 may include sub-components of some of the components shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
The controller can generate an operation control signal according to the instruction operation code and the timing signal, thereby controlling instruction fetching and execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of the following interfaces: an inter-integrated circuit (I2C) interface, a Mobile Industry Processor Interface (MIPI), and a general-purpose input/output (GPIO) interface.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 142, the camera 150, the charger, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 142 via an I2C interface, such that the processor 110 and the touch sensor 142 communicate via an I2C bus interface to implement the touch function of the apparatus 100.
A MIPI interface may be used to connect processor 110 with peripheral devices such as display screen 130 and camera 150. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 150 communicate via CSI to enable the capture functionality of apparatus 100. In other embodiments, processor 110 and display screen 130 communicate via DSI to implement display functions of device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal interface and may also be configured as a data signal interface. In some embodiments, a GPIO interface may be used to connect the processor 110 with the display screen 130 and the sensor module 140. The GPIO interface may also be configured as an I2C interface or a MIPI interface.
The connection relationship between the modules shown in fig. 1 is merely illustrative and does not limit the connection relationship between the modules of the apparatus 100. Alternatively, the modules of the apparatus 100 may also adopt a combination of the connection manners in the above embodiments.
The device 100 may implement display functionality via the GPU, the display screen 130, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 130 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 130 may be used to display images or video. The display screen 130 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini light-emitting diode (Mini LED), a Micro light-emitting diode (Micro LED), a Micro OLED (Micro OLED), or a quantum dot light-emitting diode (QLED). In some embodiments, the apparatus 100 may include N display screens 130, N being a positive integer greater than 1.
The device 100 may implement a photographing function through the ISP, the camera 150, the video codec, the GPU, the display screen 130, and the application processor, etc.
The ISP is used to process the data fed back by the camera 150. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can optimize the algorithm of the noise, brightness and color of the image, and can also optimize the parameters of exposure, color temperature and the like of the shooting scene. In some embodiments, the ISP may be provided in camera 150.
The camera 150 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into a standard Red Green Blue (RGB), YUV, or the like format image signal. In some embodiments, apparatus 100 may include 1 or N cameras 150, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; it can process digital image signals as well as other digital signals. For example, when the apparatus 100 selects a frequency point, the digital signal processor is configured to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The apparatus 100 may support one or more video codecs. In this way, the apparatus 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a processor modeled on the structure of biological neural networks; for example, it can rapidly process input information by emulating the transfer mode between human brain neurons, and it can also continuously self-learn. The NPU can implement intelligent-cognition functions of the apparatus 100, such as image recognition, face recognition, speech recognition, and text understanding.
Internal memory 120 may be used to store computer-executable program code, including instructions. The internal memory 120 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, an application program required for at least one function (e.g., a sound playing function and an image playing function). The storage data area may store data (e.g., audio data and a phonebook) created during use of the device 100. In addition, the internal memory 120 may include a high-speed random access memory, and may also include a nonvolatile memory such as: at least one magnetic disk storage device, a flash memory device, and a universal flash memory (UFS), and the like. The processor 110 performs various processing methods of the apparatus 100 by executing instructions stored in the internal memory 120 and/or instructions stored in a memory provided in the processor.
The pressure sensor 141 is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 141 may be disposed on the display screen 130. The pressure sensor 141 may be of a wide variety, and may be, for example, a resistive pressure sensor, an inductive pressure sensor, or a capacitive pressure sensor. The capacitive pressure sensor may be a sensor including at least two parallel plates having conductive materials, and when a force is applied to the pressure sensor 141, the capacitance between the electrodes changes, and the apparatus 100 determines the intensity of the pressure based on the change in capacitance. When a touch operation is applied to the display screen 130, the device 100 detects the touch operation according to the pressure sensor 141. The apparatus 100 may also calculate the touched position based on the detection signal of the pressure sensor 141. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example: when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message; and when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
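The intensity-dependent dispatch described above can be sketched as a small rule, where the threshold value, icon name, and instruction names are hypothetical placeholders, not values from this application:

```python
# Illustrative sketch of intensity-dependent touch dispatch; the threshold
# value and the icon/instruction names are hypothetical, not from the patent.
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized intensity

def dispatch_touch(icon, intensity):
    """Return the instruction for a touch on `icon` with the given intensity,
    following the rule described above for the short message icon."""
    if icon == "short_message":
        if intensity < FIRST_PRESSURE_THRESHOLD:
            return "view_short_message"   # lighter press: view the message
        return "new_short_message"        # firmer press: create a new message
    return "default_action"
```

Touches at the same position thus map to different instructions purely by intensity, as the paragraph above describes.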
The touch sensor 142 is also referred to as a touch device. The touch sensor 142 may be disposed on the display screen 130; together they form a touch screen, also called a touch-controlled screen. The touch sensor 142 is used to detect a touch operation applied on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 130. In other embodiments, the touch sensor 142 may be disposed on a surface of the device 100 at a location different from that of the display screen 130.
The charge management module 160 is used to receive power from the charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 160 may receive the current of a wired charger. In some wireless charging embodiments, the charging management module 160 may receive electromagnetic waves through a wireless charging coil of the device 100. The charging management module 160 may also provide power to the device 100 via the power management module 161 while charging the battery 162.
The power management module 161 is used to connect the battery 162, the charging management module 160 and the processor 110. The power management module 161 receives input from the battery 162 and/or the charging management module 160 and provides power to the processor 110, the internal memory 120, the display 130, the camera 150, and the like. The power management module 161 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery state of health (e.g., leakage, impedance). Alternatively, the power management module 161 may be disposed in the processor 110, or the power management module 161 and the charging management module 160 may be disposed in the same device.
The hardware system of the apparatus 100 has been described in detail above; the software system of the apparatus 100 is described below. The software system may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture; the embodiment of the present application exemplarily describes the software system of the apparatus 100 by taking the layered architecture as an example.
As shown in fig. 2, the software system adopting the layered architecture is divided into a plurality of layers, and each layer has a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the software system may be divided into four layers, an application layer, an application framework layer, a Hardware Abstraction Layer (HAL), and a kernel layer from top to bottom.
The application layer may include applications such as a camera. The camera APP may call an interface provided by the Android Open Source Project (AOSP) and communicate with the camera service of the application framework layer through the binder (inter-process communication) mechanism to implement a specific function (e.g., a photographing function).
The application framework (framework) layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer may include some predefined functions.
For example, the application framework layer includes a camera service (CameraService), a display module, a window manager, a resource manager, a view system, and the like.
The camera service is configured to process data streams related to the camera APP, for example, the camera service may instruct the camera 150 to start working based on a call command of the camera APP, acquire a photographing data stream, a preview frame data stream, or a video data stream generated by the camera 150, and send the data streams to the camera APP.
The display module is used for controlling the state of the display screen 130, for example, the camera APP may call the display module to control the display screen 130, so that the display screen 130 is in a state of turning off the screen, turning on the screen, powering off, powering on, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and so on.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, and video files.
The view system includes visual controls such as controls to display text and controls to display pictures. The view system may be used to build a display interface for an application, such as an interface for a camera APP.
The HAL defines preset communication protocols and interfaces with the framework layer, while the specific architecture of the HAL can be implemented by the manufacturer. For example, the HAL may contain an image processing engine (IPE) that can process a preview frame request from the camera service, invoke the camera driver in the kernel layer to acquire a preview frame data stream, and forward the preview frame data stream to the camera service.
The kernel layer is a layer between hardware and software. The kernel layer can comprise driving modules such as a display driver, a camera driver and a sensor driver, and the driving modules can directly control hardware to work.
The software system shown in fig. 2 may further include more functional modules, for example, the software system may further include a system library and an Android Runtime (Android Runtime).
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL for embedded systems, OpenGL ES), and a 2D graphics engine (e.g., the Skia Graphics Library, SGL).
The surface manager is used for managing the display subsystem and providing fusion of the 2D layer and the 3D layer for a plurality of application programs.
The media library supports playback and recording of multiple audio and video formats, as well as still image files. The media library may support a variety of audio-visual coding formats, such as MPEG4, H.264, Moving Picture Experts Group audio layer III (MP3), Advanced Audio Coding (AAC), adaptive multi-rate (AMR), Joint Photographic Experts Group (JPEG), and Portable Network Graphics (PNG).
The three-dimensional graphics processing library may be used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The Android Runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part contains the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files, and performs the functions of object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The workflow of the software system and the hardware system of the apparatus 100 is exemplarily described below in connection with a scenario of opening the camera APP.
When a user performs a touch operation on the touch sensor 142, a corresponding hardware interrupt is sent to the kernel layer, and the sensor driver of the kernel layer processes the touch operation into an original input event containing information such as the touch coordinates and the timestamp of the touch operation. The original input event is stored in the kernel layer; the application framework layer acquires the original input event from the kernel layer, identifies the control corresponding to the event, and notifies the application (APP) corresponding to that control. For example, if the touch operation is a click and the APP corresponding to the control is the camera APP, then after the camera APP is awakened by the click, it may call the display driver of the kernel layer through the API, and the UI of the camera APP is displayed on the display screen 130 through the display driver.
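The touch-to-APP path above can be sketched as three stages; the module split, field names, and the hit-test geometry below are purely illustrative, not the Android implementation:

```python
import time

# Minimal sketch of the input path described above (kernel -> framework -> APP);
# the function names, event fields, and hit-test region are hypothetical.
def make_raw_input_event(x, y):
    """Kernel-layer sensor driver packages a touch into a raw input event
    carrying the touch coordinates and a timestamp."""
    return {"x": x, "y": y, "timestamp": time.time()}

def find_control(event):
    """Framework layer identifies the control at the touch coordinates
    (hypothetical hit test: the camera icon occupies x, y < 100)."""
    if event["x"] < 100 and event["y"] < 100:
        return "camera_icon"
    return None

def dispatch(event):
    """Notify the APP owning the control; a click on the camera icon wakes
    the camera APP, which then draws its UI via the display driver."""
    control = find_control(event)
    if control == "camera_icon":
        return "camera_app_awakened"
    return "ignored"
```

A click inside the icon's region wakes the camera APP; any other touch is ignored by this sketch.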
Fig. 3 shows an application scenario applicable to the present application, in which a photographer uses a folding-screen terminal device to take a photo: the screen facing the photographer is the inner screen, the screen facing the photographed subject is the outer screen, and both the photographer and the subject can see a preview image. The terminal devices applicable to the present application are not limited to this; other terminal devices that enable the photographer and the subject to see the preview screen at the same time are also applicable.
Fig. 4 is a schematic diagram of an unfolded state of the folding-screen terminal device, and fig. 5 is a schematic diagram of a half-folded state and a closed state of the folding-screen terminal device. The closed state is a state when the folding angle of the folding screen terminal device is smaller than or equal to a second angle threshold, the semi-folded state is a state when the folding angle of the folding screen terminal device is larger than the second angle threshold and smaller than a first angle threshold, and the unfolded state is a state when the folding angle of the folding screen terminal device is larger than or equal to the first angle threshold. The first angle threshold is greater than the second angle threshold, e.g., the first angle threshold is 135 degrees and the second angle threshold is 45 degrees.
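The three-state classification above can be written as a simple mapping from folding angle to state; the threshold values are the example values given in the text (135 and 45 degrees):

```python
# Sketch of the fold-state classification described above; the two angle
# thresholds use the example values from the text.
FIRST_ANGLE_THRESHOLD = 135   # degrees
SECOND_ANGLE_THRESHOLD = 45   # degrees

def fold_state(angle):
    """Map a folding angle (in degrees) to closed / half-folded / unfolded."""
    if angle <= SECOND_ANGLE_THRESHOLD:
        return "closed"
    if angle < FIRST_ANGLE_THRESHOLD:
        return "half-folded"
    return "unfolded"
```

Note the boundary handling: an angle equal to the second threshold is still "closed", and an angle equal to the first threshold is already "unfolded", matching the inequalities in the text.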
Fig. 6 is a schematic diagram of a camera UI displayed on an external screen. When the terminal device is in a closed state, the user clicks the "cooperation" button, and the terminal device may display the interface shown in fig. 7 on the external screen to guide the user to expand the terminal device. In the unfolding process of the terminal equipment, the outer screen can always display the interface shown in fig. 7, and the inner screen can always keep the screen-off state.
When the terminal device is switched from the semi-folded state to the unfolded state, the inner screen may display the camera UI shown in fig. 8, and the outer screen may display the camera UI shown in fig. 9, where it should be noted that the camera UIs displayed by the inner screen and the outer screen may be the same or different.
The user may click a "collaborative photographing" button on the camera UI shown in fig. 8, and close the collaborative photographing function, so that the external screen no longer displays the preview screen.
Fig. 10 is a flowchart of a method for displaying a camera interface provided by the present application.
S1001, the folding state control module of the camera APP detects that the mobile phone is unfolded.
The folding state control module can acquire the folding state of the mobile phone from the camera service, where the camera service can invoke the sensor driver to acquire the folding state of the mobile phone and notify the folding state control module of it.
The user can enable "automatically open the camera APP after unfolding the mobile phone" on the mobile phone (folding-screen terminal device); the folding state control module can then perform the following steps after detecting that the mobile phone has been unfolded.
S1002, the folded state control module sends a preview request to a camera service in an application Framework (FWK) layer, requesting to obtain a preview frame of the camera APP.
S1003, the folding state control module forwards the preview request to IPE in HAL.
The IPE may call the camera driver of the kernel layer, control the camera 150 to work, and obtain the preview frame.
S1004, the IPE forwards the preview frame to the camera service.
S1005, the camera service forwards the preview frame to the UI control module.
S1006, the UI control module displays the inner screen camera UI.
Displaying the preview frame on the inner screen is a preset function: the folding state control module sends a preview request to the camera service, and after receiving the preview frame from the camera service, the UI control module can display the preview frame on the inner screen (that is, display the inner-screen camera UI).
S1007, the cooperation function control module detects an operation of turning on the cooperation switch.
For example, the user may click a "take a photo in coordination" button on the interface shown in fig. 8, turning on the coordination switch.
S1008, the cooperation function control module sends a message to create an extrascreen camera UI to the camera service.
S1009, the camera service transmits the preview frame to the UI control module.
The outer screen can share the preview frame associated with the inner screen through the surface share mechanism, so that the camera service does not need to send a preview request to the IPE.
S1010, the UI control module displays the outer screen camera UI.
The outer screen camera UI is shown in fig. 9.
S1011, the cooperative function control module detects an operation of turning off the cooperative switch.
For example, the user may click on the "take a photo collaboratively" button on the interface shown in fig. 8, turning off the collaborate switch.
S1012, the cooperation function controlling module sends a message to close the UI of the external screen camera to the UI controlling module.
After receiving the message, the UI control module closes the camera UI displayed on the external screen, and optionally, the mobile phone may continue to display the UI shown in fig. 5 on the external screen.
Alternatively, if the user does not turn off the cooperative switch but folds the handset, the handset may perform the following steps.
S1013, the folding state control module detects that the mobile phone enters the half-folded state, and the outer-screen camera UI is not closed.
S1014, the folding state control module detects that the mobile phone enters a closing state.
S1015, the folding state control module sends a message of closing the cooperative function to the cooperative function control module.
S1016, the cooperative function control module sends a message to the UI control module to close the outer-screen camera UI.
After receiving the message, the UI control module closes the camera UI displayed on the outer screen.
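The S1001-S1016 message flow of fig. 10 can be condensed into a small simulation; the classes, method names, and frame value below are illustrative stand-ins for the modules named above, not a real implementation:

```python
# Condensed sketch of the fig. 10 flow between the folding state control
# module, camera service, and UI control module; all names are illustrative.
class UIControlModule:
    def __init__(self):
        self.inner_ui = False
        self.outer_ui = False

    def show_inner(self, frame):   # S1006: display inner-screen camera UI
        self.inner_ui = True

    def show_outer(self, frame):   # S1010: display outer-screen camera UI
        self.outer_ui = True

    def close_outer(self):         # S1012 / S1016: close outer-screen UI
        self.outer_ui = False

class CameraService:
    def __init__(self, ui):
        self.ui = ui
        self.frame = None

    def start_preview(self):       # S1002-S1005, IPE round-trip collapsed
        self.frame = "preview-frame"
        self.ui.show_inner(self.frame)

    def create_outer_ui(self):     # S1008-S1009: surface share reuses the
        self.ui.show_outer(self.frame)  # inner-screen frame, no new request

ui = UIControlModule()
svc = CameraService(ui)
svc.start_preview()      # S1001: unfold detected -> inner-screen camera UI
svc.create_outer_ui()    # S1007: cooperation switch turned on -> outer UI
ui.close_outer()         # S1014-S1016: phone closed -> outer UI closed
```

After the walk-through, the inner UI remains up while the outer UI has been closed, mirroring the end state of the fig. 10 flow.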
FIG. 11 is a schematic diagram illustrating rules for displaying a camera interface provided herein.
When the mobile phone is in the closed state, the "cooperation" switch defaults to the off state, and the user can start the cooperative function by clicking the "cooperation" switch. After the switch is clicked, the outer screen may display the guidance interface shown in fig. 7, guiding the user to unfold the terminal device. During unfolding, the outer screen can keep displaying the interface shown in fig. 7, and the inner screen can remain off. The camera APP displays the "cooperation" switch only in the photographing, portrait, video recording, and movie modes, which are the modes that support the cooperative function.
When the mobile phone is in the unfolded state, if the user performs no active operation, the "cooperation" switch defaults to the off state and the outer screen does not display a preview picture; if the unfolded state was reached from the closed state under the prompt of the guidance interface, the "cooperation" switch defaults to the on state, and the outer screen displays the same preview picture as the inner screen. The camera APP displays the "cooperation" switch only in the photographing, portrait, video recording, and movie modes, which are the modes that support the cooperative function.
When the mobile phone is in the half-folded state, the "cooperation" switch is in the off state and cannot be clicked, and the cooperative function is closed, that is, the preview picture is no longer displayed on the inner screen.
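The fig. 11 rules above can be encoded as a lookup; the state names, the `via_guide` flag, and the returned labels are illustrative encodings of the text, not values from the application:

```python
# Sketch of the fig. 11 display rules; state names and return values are
# illustrative encodings of the text above.
def cooperation_switch_state(fold_state, via_guide=False):
    """Default 'cooperation' switch state per fold state; `via_guide` models
    reaching the unfolded state via the guidance interface of fig. 7."""
    if fold_state == "closed":
        return "off"            # user may still turn it on by clicking
    if fold_state == "unfolded":
        return "on" if via_guide else "off"
    if fold_state == "half-folded":
        return "off-unclickable"
    raise ValueError(fold_state)

# Modes in which the camera APP displays the switch, i.e. the modes that
# support the cooperative function per the text:
SUPPORTED_MODES = {"photographing", "portrait", "video", "movie"}

def switch_visible(mode):
    return mode in SUPPORTED_MODES
```

The only state in which the switch defaults to on is the unfolded state reached through the guidance interface; in the half-folded state it is off and not clickable.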
Two other methods of displaying a camera UI provided by the present application are described below.
Fig. 12 is a display process of the camera UI during the process of unfolding the mobile phone to closing the mobile phone, and the method includes the following contents.
And S1201, the folding state control module detects that the mobile phone is unfolded.
The folding state control module may acquire a folding state of the mobile phone from the camera service, wherein the camera service may call the sensor drive to acquire the folding state of the mobile phone.
The user can set "automatically open the camera APP after unfolding the mobile phone" on the mobile phone, and then the folding state control module can execute the following steps after detecting that the mobile phone is unfolded.
S1202, the folded state control module sends a preview request to the camera service in the FWK to request to acquire a preview frame of the camera APP.
S1203, the folding state control module forwards the preview request to the IPE in the HAL.
The IPE may call the camera driver of the kernel layer, control the camera 150 to work, and obtain the preview frame.
S1204, the IPE forwards the preview frame to the camera service.
S1205, the camera service forwards the preview frame to the UI control module.
It should be noted that the operation of the camera 150 is continuous, so the preview frames acquired by the IPE from the camera 150 are continuous, and the preview frames forwarded by the IPE to the UI control module through the camera service are also continuous, that is, the preview frames in S1204 and S1205 are actually the preview frame data stream.
S1206, the UI control module displays the inner screen camera UI.
Displaying the preview frame on the inner screen is a preset function, and after receiving the preview frame from the camera service, the UI control module may invoke a display (display) module in the FWK to process the preview frame, and display the camera UI (i.e., the first interface) on the inner screen. At this time, the cooperative function is not yet turned on, and therefore, the cooperative function switch on the first interface is in an off state, as shown in fig. 13.
It should be noted that the UI shown in fig. 13 is schematic, and the cooperative function switch in the off state may take other forms; for example, the "cooperative shooting" button may be displayed in a dimmed (grayed-out) state to indicate that the cooperative function switch is off.
S1207, the cooperation function control module detects an operation of turning on the cooperation switch.
For example, the user may click the "take a photo collaboratively" button on the interface shown in fig. 13 to turn on the cooperative switch. Optionally, the user may also turn on the cooperative switch by voice-controlling the camera APP; the specific manner of turning on the camera APP's cooperative switch is not limited in this application.
S1208, the cooperative function control module sends a message to the camera service to create the extrascreen camera UI.
S1209, the camera service transmits a preview frame to the UI control module.
The outer screen can share the preview frame associated with the inner screen through the surface share mechanism, so that the camera service does not need to send a preview request to the IPE, and can directly send the preview frame acquired from the IPE to the UI control module.
S1210, the UI control module displays the outer screen camera UI.
After receiving the preview frame from the camera service, the UI control module may invoke a display (display) module to process the preview frame, and display the camera UI (i.e., the second interface) on the outer screen.
The second interface may contain only preview frames, as shown in FIG. 9; the second interface may also contain a preview frame and a co-function switch in the on state as shown in fig. 6. After the cooperative function is turned on, the cooperative function switch in the internal screen camera UI is switched from the off state to the on state, as shown in fig. 8.
After finishing shooting with the mobile phone, the user usually needs to check whether the shooting result meets expectations, and sometimes needs to edit it. When editing the shooting result, the user needs to hold the mobile phone with one hand and operate on the screen with the other; therefore, the user usually folds the mobile phone so that it can be held in one hand.
S1211, the folding state control module detects that the mobile phone enters the closed state.
The folding state control module can acquire the folding state of the mobile phone from the camera service, where the camera service can invoke the sensor driver to acquire the folding state of the mobile phone.
S1212, the folding state control module sends a message to the cooperative function control module, instructing it to close the cooperative function and the preview frame.
The folding state control module may instruct the cooperative function control module to close the cooperative function and the preview frame through one message, or may instruct the cooperative function control module to close the cooperative function and the preview frame through two messages, respectively, which is not limited in the present application.
S1213, the cooperative function control module sends a message to the UI control module, instructing it to turn off the cooperative switch.
After receiving the message to turn off the cooperative switch, the UI control module calls the display module to update the camera UI and sets the cooperative switch to the off state.
S1214, the cooperation function control module sends a message to the camera service instructing the camera service to close the preview frame.
S1215, the camera service sends a message to the IPE instructing the IPE to close the preview frame.
When a user shoots with the cooperative function, the camera UI is first displayed on the inner screen, so the preview frame data stream (session) established initially is the one associated with the inner screen; when the camera UI is displayed on the outer screen, this inner-screen preview frame data stream is reused through the surface-share mechanism. After the mobile phone is folded, the inner screen is no longer used and the mobile phone needs to establish a preview frame data stream associated with the outer screen; therefore, the cooperative function control module needs to call the camera service to close the preview frame (i.e., close the first preview frame data stream).
Optionally, the cooperative function control module may further call the display module to power off the inner screen. After the mobile phone is folded, the inner screen is blocked, and powering it off can reduce power consumption.
S1216, the folding state control module sends a preview request to the camera service, requesting to acquire a preview frame.
After the terminal device is folded, the inner screen is not used any more, and the first preview frame data stream is disconnected, so that a preview frame data stream (second preview frame data stream) associated with the outer screen needs to be established, so that the outer screen can normally display the first interface.
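The session teardown and re-establishment of S1211-S1219 can be sketched as follows; the class, method, and session names are illustrative, not part of the described system:

```python
# Sketch of S1211-S1219: the first (inner-screen) preview frame data stream
# is closed after folding and a second, outer-screen stream is established.
# All names here are illustrative.
class PreviewSessionManager:
    def __init__(self):
        self.sessions = []          # currently open preview data streams

    def open_session(self, screen):
        session = screen + "-session"
        self.sessions.append(session)
        return session

    def close_session(self, session):
        self.sessions.remove(session)

mgr = PreviewSessionManager()
first = mgr.open_session("inner")   # created when the inner-screen UI opens
# While unfolded, the outer screen shares `first` via the surface-share
# mechanism, so no second session exists yet.
mgr.close_session(first)            # S1214-S1215: fold -> close first stream
second = mgr.open_session("outer")  # S1216-S1219: establish outer stream
```

After folding, only the outer-screen stream remains open, which is why the outer screen can continue to display the camera UI normally.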
S1217, the camera service forwards the preview request to the IPE.
After receiving the preview request, the IPE calls the camera 150 to obtain the preview frame through the camera drive.
S1218, the IPE sends a preview frame to the camera service.
S1219, the camera service transmits the preview frame to the UI control module.
S1220, the UI control module displays the outer screen camera UI based on the preview frame.
After receiving the preview frame from the camera service, the UI control module may invoke a display (display) module to process the preview frame, and display a camera UI on the external screen, where the UI displayed on the external screen may be a UI including a collaborative function switch, as shown in fig. 14; alternatively, the UI displayed on the external screen at this time may be a UI that does not include a cooperative function switch, as shown in fig. 9. The UI shown in fig. 9 and 14 is an example and not a limitation, and the UI displayed on the outer screen may be another form of UI at this time.
Note that the first interface in S1220 may differ from the first interface in S1206 due to the difference in screen size; in fig. 12, any UI containing the preview frame picture and the cooperative function switch in the off state may be referred to as the first interface.
As can be seen from the above, in the method shown in fig. 12, the camera APP is not closed after the mobile phone is folded, but the camera UI is continuously displayed on the external screen, and the beneficial effects of the method are as follows.
After taking a picture, the user usually wants to see the shooting result, so when the user folds the terminal device, the intent may simply be to hold the device conveniently while viewing the result, not to close the camera APP. If the terminal device closed the camera APP upon receiving the folding operation (e.g., if the activity of the camera APP were destroyed), the user would need to restart the camera APP to check the shooting result, and the camera APP would need to reload resources such as executable files and dynamic libraries, which consumes more power. With the method provided by the present application, the terminal device does not close the camera APP after detecting the folding operation but displays the interface of the camera APP on the outer screen; when the user needs to check the shooting result, the camera APP does not have to be restarted, which reduces power consumption. In addition, the user can quickly check the shooting result, so the method provided by the present application improves the user experience.
Fig. 15 shows a display process of the camera UI during the process of unfolding the mobile phone into the closed state, and the method includes the following steps.
S1501, the folding state control module detects that the mobile phone is unfolded.
The folding state control module can acquire the folding state of the mobile phone from the camera service, where the camera service can invoke the sensor driver to acquire the folding state of the mobile phone.
The user can set "automatically open the camera APP after unfolding the mobile phone" on the mobile phone, and then the folding state control module can execute the following steps after detecting that the mobile phone is unfolded.
S1502, the folding state control module sends a preview request to the camera service in the FWK, requesting to obtain a preview frame of the camera APP.
S1503, the folding state control module forwards the preview request to the IPE in the HAL.
The IPE may call the camera driver of the kernel layer, control the camera 150 to work, and obtain the preview frame.
S1504, the IPE forwards the preview frame to the camera service.
S1505, the camera service forwards the preview frame to the UI control module.
It should be noted that the operation of the camera 150 is continuous, so the preview frames acquired by the IPE from the camera 150 are continuous, and the preview frames forwarded by the IPE to the UI control module through the camera service are also continuous, that is, the preview frames in S1504 and S1505 are actually the preview frame data stream.
S1506, the UI control module displays the in-screen camera UI.
Displaying the preview frame on the inner screen is a preset function, and after receiving the preview frame from the camera service, the UI control module may invoke a display (display) module in the FWK to process the preview frame, and display the camera UI (i.e., the first interface) on the inner screen. At this time, the cooperative function is not yet turned on, and therefore, the cooperative function switch on the first interface is in an off state, as shown in fig. 13.
It should be noted that the UI shown in fig. 13 is schematic, and the cooperative function switch in the off state may take other forms; for example, the "cooperative shooting" button may be grayed out to indicate that the cooperative function switch is off.
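The flow in S1501-S1506 can be sketched as a chain of message passing between the modules. The following is a minimal Python sketch; all class and method names (`Ipe`, `CameraService`, `UiControlModule`, `on_unfolded`) are illustrative placeholders, not the actual FWK/HAL interfaces of the terminal device.

```python
class Ipe:
    """Stand-in for the image-processing engine (IPE) in the HAL."""
    def start_preview(self):
        # The real IPE would call the kernel-layer camera driver to start
        # camera 150; here we just emit a short, finite stream of dummy frames.
        return iter(["frame-%d" % i for i in range(3)])

class CameraService:
    """Stand-in for the camera service in the FWK."""
    def __init__(self, ipe):
        self.ipe = ipe
    def request_preview(self):
        # S1502/S1503: the preview request is forwarded down to the IPE.
        return self.ipe.start_preview()

class UiControlModule:
    def __init__(self):
        self.inner_screen_ui = None
    def show_inner_ui(self, frames):
        # S1504-S1506: frames come back up through the camera service and the
        # first interface is rendered on the inner screen.
        self.inner_screen_ui = list(frames)

def on_unfolded(camera_service, ui):
    # S1501: the folding state control module detected the unfold event.
    ui.show_inner_ui(camera_service.request_preview())

ui = UiControlModule()
on_unfolded(CameraService(Ipe()), ui)
```

In the real system the stream is continuous rather than a finite list; the sketch only shows the direction of the requests and of the returned frames.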
S1507, the cooperation function control module detects an operation of turning on the cooperation switch.
For example, the user may click the "cooperative shooting" button on the interface shown in fig. 13 to turn on the cooperative switch. Optionally, the user may also turn on the cooperative switch by controlling the camera APP by voice; the specific manner of turning on the cooperative switch of the camera APP is not limited in this application.
S1508, the cooperative function control module sends a message to the camera service to create the outer-screen camera UI.
S1509, the camera service transmits the preview frame to the UI control module.
Through the surface-share mechanism, the outer screen can share the preview frame data stream associated with the inner screen; therefore, the camera service does not need to send a new preview request to the IPE and can directly send the preview frames acquired from the IPE to the UI control module.
S1510, the UI control module displays the outer screen camera UI.
After the UI control module receives the preview frame from the camera service, it may invoke a display (display) module to process the preview frame, and display the camera UI (i.e., the second interface) on the outer screen.
The second interface may contain only the preview frame, as shown in fig. 9; the second interface may also contain the preview frame and a cooperative function switch in the on state, as shown in fig. 6.
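The surface-share idea in S1509 can be illustrated as a single stream fanned out to several consumers, so that attaching the outer screen does not produce a second request to the IPE. The class and field names below (`SharedPreviewStream`, `ipe_requests`) are assumptions made for illustration, not the real surface-share API.

```python
class SharedPreviewStream:
    """One IPE-backed preview stream fanned out to several display surfaces."""
    def __init__(self):
        self.consumers = []
        self.ipe_requests = 1  # only the original request, made for the inner screen
    def attach(self, consumer):
        # Sharing: attaching another surface does NOT trigger a new IPE request.
        self.consumers.append(consumer)
    def push(self, frame):
        # Every frame from the IPE is delivered to all attached surfaces.
        for consumer in self.consumers:
            consumer.append(frame)

inner_ui, outer_ui = [], []
stream = SharedPreviewStream()
stream.attach(inner_ui)  # created when the first interface was displayed
stream.attach(outer_ui)  # S1509: the outer screen shares the same stream
stream.push("frame-0")
```

Both surfaces receive the same frame while the request count toward the IPE stays at one, which is the power-saving point of the sharing mechanism.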
S1511, the UI control module sends a message to the cooperative function control module, instructing the cooperative function control module to close the cooperative function if no new operation is detected within 30 s.
For example, after the user clicks the "cooperative shooting" button on the interface shown in fig. 13, if no operation is performed on the mobile phone for more than 30 s, the UI control module sends a third message to the cooperative function control module, instructing it to close the cooperative function.
The above-mentioned 30 s is an example rather than a limitation; the UI control module may also instruct the cooperative function control module to turn off the cooperative function according to another preset time period.
Optionally, the UI control module may invoke the display module to display a prompt message on the inner screen, as shown in fig. 16, where the prompt message is used to prompt the user to click the inner screen to wake up the camera APP.
The prompt information enables the user to perceive the current camera state and avoids misoperation, thereby improving user experience.
S1512, the cooperative function control module, in response to the message from the UI control module, controls the inner screen to turn off and controls the outer screen to power off.
After the cooperative function control module receives the message from the UI control module, it may call the display module to power off the outer screen and turn off the inner screen.
If no new operation is received within the preset time period, the user is most likely no longer using the mobile phone. Powering off the outer screen and turning off the inner screen saves power; in addition, the inner screen in the screen-off state can quickly respond to a wake-up operation, thereby improving user experience.
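The inactivity timeout in S1511-S1512 can be sketched with a fake clock instead of a real timer. This is a minimal sketch under the assumption that the UI control module tracks the time of the last user operation; the names (`PRESET_TIMEOUT_S`, `tick`, `close_cooperative_function`) are placeholders for the module interfaces described above.

```python
PRESET_TIMEOUT_S = 30.0  # 30 s is the example in the text, not a limitation

class CooperativeFunctionControl:
    def __init__(self):
        self.outer_screen_powered = True
        self.inner_screen_on = True
    def close_cooperative_function(self):
        # S1512: power off the outer screen and turn off the inner screen.
        self.outer_screen_powered = False
        self.inner_screen_on = False

class UiControlModule:
    def __init__(self, coop, timeout=PRESET_TIMEOUT_S):
        self.coop = coop
        self.timeout = timeout
        self.last_operation_at = 0.0
    def on_user_operation(self, now):
        self.last_operation_at = now
    def tick(self, now):
        # S1511: send the "close" message once the preset period elapses
        # with no new operation.
        if now - self.last_operation_at >= self.timeout:
            self.coop.close_cooperative_function()

coop = CooperativeFunctionControl()
ui = UiControlModule(coop)
ui.on_user_operation(now=0.0)   # the user turned on the cooperative switch
ui.tick(now=31.0)               # more than 30 s with no new operation
```

Making the timeout a constructor parameter mirrors the remark that another preset time period may be used instead of 30 s.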
S1513, the cooperative function control module, in response to the message from the UI control module, sends a message to the camera service to close the preview frame.
After the inner screen is turned off and the outer screen is powered off, continuing to acquire the preview frame data stream is no longer meaningful, so the cooperative function control module calls the camera service to close the preview frame data stream, thereby reducing power consumption.
S1514, the camera service sends a message to the IPE closing the preview frame.
After receiving the message, the IPE controls the camera 150 to stop working through the camera driver.
S1515, the UI control module detects a wake-up operation.
The user may wake up the camera APP by clicking the UI shown in fig. 16, or by voice; the specific manner of waking up the camera APP is not limited in this application.
S1516, the UI control module sends a message (fourth message) to start the cooperative function to the cooperative function control module.
S1517, the cooperative function control module, in response to the message for starting the cooperative function, controls the inner screen to light up and the outer screen to power up.
S1518, the cooperative function control module, in response to the message for starting the cooperative function, sends a preview request to the camera service to acquire a preview frame.
S1519, the camera service sends a preview request to the IPE.
The IPE may call the camera driver of the kernel layer, control the camera 150 to work, and obtain the preview frame.
S1520, the camera service acquires the preview frame from the IPE.
S1521, the camera service forwards the preview frame to the UI control module.
S1522, the UI control module controls the outer screen and the inner screen to display the camera UI, respectively.
After receiving the preview frames from the camera service, the UI control module may invoke the display module to process them and display the camera UI on the outer screen and the inner screen, respectively.
In the method shown in fig. 15, the UI control module records the camera state (the cooperative function being on) before the screen is turned off. Therefore, after the UI control module receives the wake-up operation, it instructs the cooperative function control module to start the cooperative function, and the user does not need to start the cooperative function by an additional operation, thereby improving user experience.
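The state-recording idea above can be sketched as follows: the UI control module stores whether the cooperative function was on before screen-off, and the wake-up path reads that record to decide whether the outer screen comes back. This is an illustrative sketch; `CameraStateStore` and the method names are assumptions, not the actual module interfaces.

```python
class CameraStateStore:
    """Remembers the camera state across screen-off."""
    def __init__(self):
        self.cooperative_on_before_sleep = False

class UiControlModule:
    def __init__(self, store):
        self.store = store
        self.inner_ui_visible = False
        self.outer_ui_visible = False
    def on_inactivity(self, cooperative_on):
        # S1511-S1512: before the screens go dark, record whether the
        # cooperative function was on.
        self.store.cooperative_on_before_sleep = cooperative_on
        self.inner_ui_visible = self.outer_ui_visible = False
    def on_wake(self):
        # S1515-S1522: the recorded state decides whether the outer screen
        # is restored too, with no extra user operation.
        self.inner_ui_visible = True
        self.outer_ui_visible = self.store.cooperative_on_before_sleep

ui = UiControlModule(CameraStateStore())
ui.on_inactivity(cooperative_on=True)
ui.on_wake()
```

If the cooperative function had been off before the screens went dark, only the inner screen UI would be restored on wake-up, which matches the "record then restore" behavior described for fig. 15.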
The present application also provides a computer program product which, when executed by a processor, implements the method of any of the method embodiments of the present application.
The computer program product may be stored in a memory and, after preprocessing, compiling, assembling and linking, is eventually transformed into an executable object file that can be executed by a processor.
The code of the computer program product may also be solidified in a chip. The present application does not limit the specific form of the computer program product.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a computer, implements the method of any of the method embodiments of the present application. The computer program may be a high-level language program or an executable object program.
The computer-readable storage medium may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes and the generated technical effects of the above-described apparatuses and devices may refer to the corresponding processes and technical effects in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the disclosed system, apparatus and method can be implemented in other ways. For example, some features of the method embodiments described above may be omitted, or not performed. The above-described embodiments of the apparatus are merely exemplary, the division of the unit is only one logical function division, and there may be other division ways in actual implementation, and a plurality of units or components may be combined or integrated into another system. In addition, the coupling between the units or the coupling between the components may be direct coupling or indirect coupling, and the coupling includes electrical, mechanical or other connections.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the processes do not mean the execution sequence, and the execution sequence of the processes should be determined by the functions and the inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Additionally, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein is only one kind of association relationship describing the association object, and means that there may be three kinds of relationships, for example, a and/or B, and may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship.
In short, the above description is only a preferred embodiment of the present application and is not intended to limit the protection scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (12)
1. A method for displaying a camera interface, which is applied to a foldable terminal device, wherein the terminal device comprises an inner screen and an outer screen, the method comprising:
receiving a first operation, wherein the first operation is used for opening a camera application, and the terminal device is in an unfolded state;
responding to the first operation, and displaying a first interface of the camera application on the inner screen, wherein the first interface comprises a cooperative function control, and the cooperative function control is used for turning a cooperative function on or off;
receiving a second operation of the cooperative function control by the user, wherein the second operation is used for turning on the cooperative function;
displaying a second interface of the camera application on the outer screen in response to the second operation;
and when the terminal device is detected to be in a closed state, closing the first interface on the inner screen and the second interface on the outer screen, and displaying a third interface of the camera application on the outer screen.
2. The method of claim 1, wherein said closing the first interface on the inner screen and the second interface on the outer screen comprises:
the folding state control module sends a first message to the cooperative function control module, wherein the first message instructs the cooperative function control module to close a first preview frame data stream, and the first preview frame data stream is a preview frame data stream associated with the inner screen;
and the cooperative function control module responds to the first message and calls a camera service to close the first preview frame data stream.
3. The method of claim 2, further comprising:
and the cooperative function control module responds to the first message and calls a display module to perform power-off processing on the inner screen.
4. The method of claim 2 or 3, wherein said closing the first interface on the inner screen and the second interface on the outer screen further comprises:
the folding state control module sends a second message to the cooperative function control module, and the second message instructs the cooperative function control module to close the cooperative function;
and the cooperative function control module responds to the second message and closes the cooperative function switch on the first interface through the interface control module.
5. The method of claim 2 or 3, wherein displaying the third interface of the camera application on the external screen comprises:
the folding state control module sends a preview request to the camera service;
the camera service responds to the preview request and acquires a second preview frame data stream, wherein the second preview frame data stream is a preview frame data stream associated with the outer screen;
the camera service sends the second preview frame data stream to the interface control module;
and the interface control module displays the third interface on the external screen according to the second preview frame data stream.
6. The method according to any one of claims 1 to 3, wherein the terminal device further comprises an outer-screen camera, and the third interface comprises an image captured by the outer-screen camera and the cooperative function control.
7. The method of any of claims 1 to 3, further comprising:
when the terminal device is detected to be in the unfolded state, displaying the first interface on the inner screen;
receiving a third operation of the cooperative function control by the user, wherein the third operation is used for turning on the cooperative function;
displaying the second interface on the outer screen in response to the third operation;
when no new operation is detected within a preset time period after the third operation, the interface control module sends a third message to the cooperative function control module, wherein the third message instructs the cooperative function control module to close the cooperative function;
and the cooperative function control module responds to the third message, calls a display module to perform power-off processing on the external screen and performs screen-off processing on the internal screen.
8. The method of claim 7, further comprising:
and the cooperative function control module responds to the third message and calls a camera service to close the current preview frame data stream.
9. The method of claim 8, further comprising:
when the interface control module detects a wake-up operation, the interface control module sends a fourth message to the cooperative function control module, and the fourth message instructs the cooperative function control module to start a cooperative function;
the cooperative function control module responds to the fourth message, calls the display module to control the outer screen to be powered on and controls the inner screen to be lightened;
the cooperative function control module responds to the fourth message and calls a camera service to acquire a third preview frame data stream;
the interface control module processes the third preview frame data stream, displays the first interface on the inner screen, and displays the second interface on the outer screen.
10. The method of claim 9, wherein before the interface control module detects the wake-up operation, further comprising:
the interface control module calls the display module to display prompt information on the inner screen, wherein the prompt information is used for prompting a user to click the inner screen to wake up the camera application.
11. An apparatus for displaying a camera interface, comprising a processor and a memory, the processor and the memory coupled, the memory for storing a computer program that, when executed by the processor, causes the apparatus to perform the method of any of claims 1 to 10.
12. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes an apparatus comprising the processor to perform the method of any of claims 1 to 10.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111679509X | 2021-12-31 | ||
CN202111679509 | 2021-12-31 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115002879A CN115002879A (en) | 2022-09-02 |
CN115002879B true CN115002879B (en) | 2023-04-07 |
Family
ID=83023654
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210259205.6A Active CN115002879B (en) | 2021-12-31 | 2022-03-16 | Method and device for displaying camera interface |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115002879B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117768772A (en) * | 2022-09-16 | 2024-03-26 | 荣耀终端有限公司 | Interaction method and device of camera application interface |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110401766A (en) * | 2019-05-22 | 2019-11-01 | 华为技术有限公司 | A kind of image pickup method and terminal |
CN111124561A (en) * | 2019-11-08 | 2020-05-08 | 华为技术有限公司 | Display method applied to electronic equipment with folding screen and electronic equipment |
CN111263005A (en) * | 2020-01-21 | 2020-06-09 | 华为技术有限公司 | Display method and related device of folding screen |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109766053B (en) * | 2019-01-15 | 2020-12-22 | Oppo广东移动通信有限公司 | User interface display method, device, terminal and storage medium |
- 2022-03-16 CN CN202210259205.6A patent/CN115002879B/en active Active
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||