CN116744106A - Control method of camera application and terminal equipment - Google Patents


Info

Publication number
CN116744106A
CN116744106A (application number CN202211310555.7A)
Authority
CN
China
Prior art keywords
camera
time
camera module
period
application
Prior art date
Legal status
Granted
Application number
CN202211310555.7A
Other languages
Chinese (zh)
Other versions
CN116744106B (en)
Inventor
雷雨 (Lei Yu)
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211310555.7A
Publication of CN116744106A
Application granted
Publication of CN116744106B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Abstract

The present application provides a control method for a camera application and a terminal device. The method includes: detecting a first operation performed by a user on the camera application, where the first operation is used to open the camera application or to switch the camera mode of the camera application; enabling a first camera module at a first time in response to the first operation; and enabling a PHY device at a second time, where the second time is later than the first time and falls within a first period, the first period being the period in which the power-on pins of the first camera module are all at a high level. The method helps reduce the probability of a black screen when the camera application is opened or the camera mode is switched.

Description

Control method of camera application and terminal equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a control method for a camera application and a terminal device.
Background
Terminal devices generally include a camera application, which the user can use to take photos and record videos. At present, when a terminal device runs the camera application, the following problem may occur:
When the camera application is opened or the camera mode is switched (for example, from photo mode to video mode or from video mode to photo mode), the terminal device may display a black screen, which seriously affects the user experience.
Disclosure of Invention
The present application provides a control method for a camera application and a terminal device, which help reduce the probability of a black screen when the camera application is opened or the camera mode is switched.
In a first aspect, the present application provides a control method for a camera application, which may be applied to a terminal device that includes a first camera module and a physical-layer (PHY) device. The method includes: detecting a first operation performed by a user on the camera application, where the first operation is used to open the camera application or to switch the camera mode of the camera application; enabling the first camera module at a first time in response to the first operation; and enabling the PHY device at a second time, where the second time is later than the first time and falls within a first period, the first period being the period in which the power-on pins of the first camera module are all at a high level.
The first operation may be a click operation, a slide operation, or a double-click operation performed by the user on the camera application; the embodiment of the present application does not limit this.
The first camera module is the camera module that serves the first operation.
For example, the terminal device includes three camera modules: camera module 1, camera module 2, and camera module 3. The terminal device may use camera module 1 when the camera application is opened, camera module 2 in the portrait photographing mode, and camera module 3 in the video recording mode. If the first operation is used to open the camera application, the terminal device, upon detecting the first operation, may enable camera module 1 at the first time. If the first operation is used to switch to the portrait photographing mode, the terminal device may enable camera module 2 at the first time. If the first operation is used to switch to the video recording mode, the terminal device may enable camera module 3 at the first time.
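The mapping in the example above can be sketched as a simple lookup; the operation names, module numbering, and function name are hypothetical illustrations, not identifiers from the patent:

```python
# Hypothetical mapping from a detected first operation to the camera
# module that serves it, following the three-module example in the text:
# module 1 opens the app, module 2 serves portrait mode, module 3 serves
# video recording.
OPERATION_TO_MODULE = {
    "open_camera_app": 1,
    "switch_to_portrait": 2,
    "switch_to_video": 3,
}

def select_first_camera_module(operation: str) -> int:
    """Return the number of the camera module serving the operation."""
    return OPERATION_TO_MODULE[operation]
```

A device with a different module layout would simply carry a different table.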
In response to the first operation, the terminal device may power up the first camera module; the power-up sequence of the first camera module is LP00, LP10, and then LP11. The first time falls within this power-up sequence.
Enabling the first camera module at the first time may specifically include: when the terminal device detects that the first time has arrived, it sends an enable instruction to the first camera module, where the enable instruction instructs the first camera module to enable itself. After receiving the enable instruction, the first camera module can enable itself based on that instruction. At this point, the first camera module has the capability to acquire image data but does not yet acquire any.
The second time being later than the first time may also be described as the second time being after the first time; the embodiment of the present application does not distinguish between these formulations. Because the second time is later than the first time, the terminal device enables the first camera module first and then enables the PHY device.
Enabling the PHY device at the second time may specifically include: when the terminal device detects that the second time has arrived, it sends an enable instruction to the PHY device, where the enable instruction instructs the PHY device to enable itself. After receiving the enable instruction, the PHY device can enable itself based on that instruction.
The first period is the period in which the power-on pins of the first camera module are all at a high level, i.e., the LP11 period. When the PHY device is enabled at this point, the power-up sequence of the first camera module is within LP11, so the PHY device has the capability to receive image data.
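The ordering described above (power the module on, enable it at the start of LP10, then enable the PHY device once LP11 has begun) can be sketched as a timeline; the function name, event names, and millisecond durations are placeholders for illustration:

```python
def control_sequence(lp00_ms: float, lp10_ms: float) -> list[str]:
    """Sketch of the proposed ordering for one camera module.

    The module is powered on at t = 0 (LP00 begins), enabled when LP10
    begins (after lp00_ms), and the PHY device is enabled when LP11
    begins (after lp00_ms + lp10_ms).
    """
    events = [
        ("power_on_module", 0.0),                 # LP00 begins
        ("enable_camera_module", lp00_ms),        # start of LP10
        ("enable_phy", lp00_ms + lp10_ms),        # start of LP11
    ]
    return [name for name, _t in sorted(events, key=lambda e: e[1])]
```

Note that, unlike the conventional flow described later, the camera module is enabled before the PHY device here.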
With the control method provided by the present application, upon detecting the user's first operation on the camera application, the terminal device can, in response, enable the camera module first and then enable the PHY device. Because the power-up sequence of the camera module is already within LP11 when the PHY device is enabled, the PHY device has the capability to receive image data as soon as it is enabled. This prevents the camera interface from freezing on a black screen and improves the user experience.
With reference to the first aspect, in some implementations of the first aspect, the first time is the start time of a second period, the difference between the second time and the first time is greater than or equal to a first duration, and the difference is less than or equal to a second duration, where the first duration is the duration of the second period, the second duration is the sum of the duration of the second period and the duration of the first period, and the second period is the period in which, among the power-on pins of the first camera module, a first pin is at a high level and a second pin is at a low level.
The second period is the period in which, among the power-on pins of the first camera module, the first pin is at a high level and the second pin is at a low level, i.e., LP10. The first time may be the start time of the second period, i.e., the start of LP10. The first duration is the duration of the second period, i.e., the LP10 duration, so the difference between the second time and the first time is greater than or equal to the LP10 duration.
The first period is LP11, and the second duration is the sum of the duration of the second period and the duration of the first period, i.e., (LP10+LP11). The difference between the second time and the first time is therefore less than or equal to (LP10+LP11).
In other words, the difference between the second time and the first time is greater than or equal to the LP10 duration and less than or equal to (LP10+LP11).
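The timing constraint just derived, LP10 ≤ (second time − first time) ≤ LP10 + LP11, can be written as a small validity check; the function name and millisecond units are assumptions for illustration:

```python
def second_time_valid(t1_ms: float, t2_ms: float,
                      lp10_ms: float, lp11_ms: float) -> bool:
    """Check the constraint from the text: the PHY enable time (t2)
    must trail the module enable time (t1) by at least the LP10
    duration and at most the LP10 + LP11 duration."""
    diff = t2_ms - t1_ms
    return lp10_ms <= diff <= lp10_ms + lp11_ms
```

A second time chosen this way necessarily falls inside LP11, the window in which the PHY device can receive image data.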
With reference to the first aspect, in some implementations of the first aspect, the terminal device includes first configuration information, where the first configuration information is used to indicate a camera module in the terminal device that needs to adjust an enabling time of the PHY device; the camera module indicated by the first configuration information comprises a first camera module.
The first configuration information may take the form of a configuration file. It indicates which camera modules in the terminal device require the enable time of the PHY device to be adjusted, and it may include the names or identifiers of those camera modules.
If the camera modules indicated by the first configuration information do not include the first camera module, that is, the first camera module does not require the enable time of the PHY device to be adjusted, the terminal device need not execute the subsequent steps provided in the embodiment of the present application and may start the first camera module using a prior-art method.
With the control method provided by the present application, the first configuration information indicates which camera modules in the terminal device require the enable time of the PHY device to be adjusted, and only those modules are adjusted. Compared with adjusting all camera modules in the terminal device, this requires fewer changes.
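A minimal sketch of the membership test implied by the first configuration information; the set contents and all names are hypothetical, not taken from the patent:

```python
# Hypothetical "first configuration information": the names of the
# camera modules whose PHY enable time must be adjusted.
ADJUST_PHY_MODULES = {"camera_module_1", "camera_module_2"}

def needs_adjusted_sequence(module_name: str) -> bool:
    """True if the module should use the adjusted (module-first)
    sequence; modules not listed fall back to the prior-art flow."""
    return module_name in ADJUST_PHY_MODULES
```

In practice such a list might live in a device configuration file rather than in code.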
With reference to the first aspect, in some implementations of the first aspect, the terminal device further includes a second camera module, where the second camera module is not included in the camera module indicated by the first configuration information; the method further comprises the steps of: detecting a second operation of the camera application by the user, the second operation being for switching a camera mode of the camera application, the second operation being different from the first operation; enabling the PHY device at a third time in response to the second operation; enabling the second camera module at a fourth time, wherein the fourth time is later than the third time.
The second camera module is not included among the camera modules indicated by the first configuration information, that is, the second camera module does not require the enable time of the PHY device to be adjusted.
The second operation is different from the first operation. If the first operation opens the camera application, the second operation may switch the camera mode of the camera application. If the first operation switches the camera mode, the second operation may also switch the camera mode, but to a different mode than the first operation.
Because the second camera module does not require the enable time of the PHY device to be adjusted, when the terminal device detects a second operation that starts the second camera module, it enables the PHY device at a third time in response to the second operation and enables the second camera module at a fourth time later than the third time. That is, the terminal device enables the PHY device before enabling the second camera module.
The terminal device may employ prior art methods to enable the PHY device prior to enabling the second camera module.
With the control method provided by the present application, when the second camera module does not require the enable time of the PHY device to be adjusted and the terminal device detects a second operation that starts the second camera module, the PHY device can be enabled before the second camera module. The device can thus switch between the two control methods depending on the camera module, which provides high flexibility.
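The per-module branch between the two flows can be sketched as a dispatcher; the configuration contents and every name are illustrative assumptions:

```python
# Hypothetical configuration: modules listed here use the adjusted
# sequence (camera module first, then PHY device); all other modules
# use the conventional sequence (PHY device first).
ADJUST_PHY_ENABLE_TIME = {"first_camera_module"}

def enable_order(module_name: str) -> list[str]:
    """Return the enable order the terminal device applies to a module."""
    if module_name in ADJUST_PHY_ENABLE_TIME:
        return ["enable_camera_module", "enable_phy"]
    return ["enable_phy", "enable_camera_module"]
```

This keeps the prior-art path intact for unlisted modules while applying the new ordering only where the configuration demands it.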
With reference to the first aspect, in some implementations of the first aspect, the third time and the fourth time are both in a third period, where the third period is a period in which power-on pins of the second camera module are both at a high level.
The third period is the period in which the power-on pins of the second camera module are all at a high level, i.e., the LP11 period.
In response to the second operation, the terminal device may first power up the second camera module, whose power-up sequence may be LP11. While the power-up sequence of the second camera module is within LP11, the terminal device may enable the PHY device first and then enable the second camera module. Because the power-up sequence of the second camera module is within LP11 when the PHY device is enabled, the PHY device has the capability to receive image data as soon as it is enabled.
With the control method provided by the present application, the PHY device can be enabled before the second camera module within the third period, without adjusting the enable time of the PHY device, which improves the efficiency of starting the second camera module.
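Since the text states that the second module's power-up sequence may consist of LP11 directly, a check that the third and fourth times both fall within the third period could look like the following; it assumes LP11 begins at power-on (t = 0) and millisecond units, both of which are illustrative assumptions:

```python
def both_in_third_period(t3_ms: float, t4_ms: float,
                         lp11_ms: float) -> bool:
    """True if the PHY enable time (t3) and the module enable time (t4)
    both fall within LP11 and t4 is later than t3, as required."""
    return 0.0 <= t3_ms < t4_ms <= lp11_ms
```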
In a second aspect, the present application provides a terminal device, including a detection module and a processing module. The detection module is configured to detect a first operation performed by a user on the camera application, where the first operation is used to open the camera application or to switch the camera mode of the camera application. The processing module is configured to: enable the first camera module at a first time in response to the first operation; and enable the PHY device at a second time, where the second time is later than the first time and falls within a first period, the first period being the period in which the power-on pins of the first camera module are all at a high level.
With reference to the second aspect, in some implementations of the second aspect, the first time is the start time of a second period, the difference between the second time and the first time is greater than or equal to a first duration, and the difference is less than or equal to a second duration, where the first duration is the duration of the second period, the second duration is the sum of the duration of the second period and the duration of the first period, and the second period is the period in which, among the power-on pins of the first camera module, a first pin is at a high level and a second pin is at a low level.
With reference to the second aspect, in some implementations of the second aspect, the terminal device includes first configuration information, where the first configuration information is used to indicate a camera module in the terminal device that needs to adjust an enabling time of the PHY device; the camera module indicated by the first configuration information comprises a first camera module.
With reference to the second aspect, in some implementations of the second aspect, the terminal device further includes a second camera module, where the second camera module is not included in the camera module indicated by the first configuration information; the detection module is also used for: detecting a second operation of the camera application by the user, the second operation being for switching a camera mode of the camera application, the second operation being different from the first operation; the processing module is also used for: enabling the PHY device at a third time in response to the second operation; enabling the second camera module at a fourth time, wherein the fourth time is later than the third time.
With reference to the second aspect, in some implementations of the second aspect, the third time and the fourth time are both located in a third period, where the third period is a period in which power-up pins of the second camera module are both at a high level.
In a third aspect, the present application provides a terminal device, which may also be referred to as a terminal, user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with wireless transceiver functions, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, or the like.
The terminal device includes: a processor and a memory; the memory stores computer-executable instructions; the processor executes computer-executable instructions stored in the memory to cause the terminal device to perform a method as in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the method of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when run, causes a computer to perform the method of the first aspect.
In a sixth aspect, the present application provides a chip comprising a processor configured to invoke a computer program in memory to perform the method of the first aspect.
It should be understood that the second through sixth aspects of the present application correspond to the technical solution of the first aspect; the advantages of each aspect and its possible embodiments are similar and are not repeated here.
Drawings
FIG. 1 is a schematic diagram of a camera application open;
FIG. 2 is a schematic diagram of a camera mode switch;
FIG. 3 is a schematic diagram of an image data transmission;
FIG. 4 is a schematic diagram of the timing of an LP mode;
fig. 5 is a hardware configuration diagram of a terminal device according to an embodiment of the present application;
fig. 6 is a block diagram of a software architecture of a terminal device to which the embodiment of the present application is applicable;
FIG. 7 is a schematic flow chart of a control method of a camera application provided by an embodiment of the application;
fig. 8 is a timing diagram of a control method of a camera application according to an embodiment of the present application;
FIG. 9 is a schematic flow chart diagram of another control method for a camera application provided by an embodiment of the present application;
fig. 10 is a schematic block diagram of a terminal device according to an embodiment of the present application;
fig. 11 is a schematic block diagram of another terminal device provided in an embodiment of the present application.
Detailed Description
The technical scheme of the application will be described below with reference to the accompanying drawings.
Terminal devices generally include a camera application, which the user can use to take photos and record videos. At present, when a terminal device runs the camera application, the following problem may occur: when the camera application is opened or the camera mode is switched (for example, from photo mode to video mode or from video mode to photo mode), the terminal device may display a black screen, which seriously affects the user experience.
In the embodiments of the present application, a mobile phone is used as an example of the terminal device to describe in detail the scenarios in which the camera application shows a black screen.
In one possible scenario, the user opens the camera application and the phone shows a black screen. Fig. 1 shows a schematic diagram of opening the camera application. As shown in interface a of Fig. 1, the desktop of the mobile phone includes icons of a plurality of applications, including an icon of the camera application. The user may open the camera application by clicking its icon. When the mobile phone detects the user's click on the camera application icon, it may respond by displaying the camera interface, i.e., interface b in Fig. 1. As shown in interface b of Fig. 1, the camera interface shows a black screen, or displays an image only after some period of time. This seriously affects the user experience.
In another possible scenario, the phone shows a black screen when the user switches camera modes. Fig. 2 shows a schematic diagram of switching camera modes. As shown in interface a of Fig. 2, the camera application of the mobile phone may provide the user with a video mode, a photo mode, and a portrait mode, and the camera interface may display video, photo, and portrait options. In interface a of Fig. 2, the camera application is in photo mode. The user can switch from photo mode to portrait mode by clicking the portrait option. When the mobile phone detects the user's click on the portrait option, it switches to portrait mode in response and may display interface b of Fig. 2. As shown in interface b of Fig. 2, the mobile phone has switched to portrait mode, but the interface is a black screen. This seriously affects the user experience.
When the user opens the camera application or switches camera modes and the mobile phone shows a black screen, the image data obtained by the phone's camera is not being displayed. To determine why the mobile phone cannot display this image data, the embodiments of the present application studied how the image data obtained by the camera is transmitted.
By way of example, Fig. 3 shows a schematic diagram of image data transmission. As shown in Fig. 3, the mobile phone includes a camera module, a processor, and a display device. The camera module acquires image data. The processor includes a physical-layer (PHY) device. The processor may be a Qualcomm processor (a Qualcomm processor may also be referred to as a Qualcomm platform), but embodiments of the present application are not limited thereto. The display device displays the image data. The PHY device receives the image data sent by the camera module and may also be referred to as a PHY chip; the embodiment of the present application does not limit the name.
The processor detects the user's operation of opening the camera application or switching camera modes and, in response, may instruct the camera module to acquire image data. The processor may receive the image data transmitted by the camera module through the PHY device using the mobile industry processor interface (MIPI) protocol, and may control the display device to display the image data acquired from the PHY device.
Data transmission between the PHY device and the camera module must meet the timing requirements specified by the MIPI protocol. The MIPI protocol specifies two modes: a low-power (LP) mode and a high-speed (HS) mode. The LP mode is the preparation phase before image data transmission, and the HS mode is the image data transmission phase. The LP modes may include LP00, LP10, and LP11. LP00 indicates that the power-on pins of the camera module are all at low level 0; LP10 indicates that one power-on pin is at low level 0 and the other is at high level 1; LP11 indicates that the power-on pins are all at high level 1. The power-on pins of the camera module may include a PWDN (power down) pin and a RESETB pin.
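The three LP pin states just described can be encoded compactly; the mode name simply spells out the two pin levels. Which physical pin maps to which digit is an assumption here, not something the text fixes:

```python
# LP-mode names encode the levels of the camera module's two power-on
# pins (e.g., PWDN and RESETB). The pin-to-digit assignment below is
# illustrative.
LP_MODES = {
    "LP00": (0, 0),  # both power-on pins at low level
    "LP10": (1, 0),  # one pin at high level, the other at low level
    "LP11": (1, 1),  # both power-on pins at high level
}

def lp_mode(pin_a: int, pin_b: int) -> str:
    """Name the LP mode for a given pair of power-on pin levels."""
    return f"LP{pin_a}{pin_b}"
```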
Camera modules produced by different manufacturers use different LP modes. Some manufacturers produce camera modules whose power-up sequence is LP00, LP10, and then LP11. The durations of LP00, LP10, and LP11 are set by the manufacturer at production time.
The processor controls the camera module by powering it on first, then enabling it, and finally transmitting image data. If the power-up sequence of the camera module is LP00, LP10, and LP11, then after the processor powers on the camera module, the module's timing is in LP00; when the LP00 duration elapses and the module enters LP10, the processor controls the camera module to enable; and after the LP11 duration, image data is transmitted in the timing corresponding to the HS mode. Controlling the camera module to enable can be understood as the stage in which the camera is ready to start working: it can acquire image data but does not yet acquire any.
At present, when the processor detects the user's operation of opening the camera application or switching camera modes, it responds by powering on the camera, enabling the PHY device first, then enabling the camera module, and finally receiving through the PHY device the image data acquired by the camera module and displaying it.
If the power-up sequence of the camera module is LP00, LP10, and LP11, then when the processor controls the PHY device to enable, the camera module's timing is in LP00, and when the processor controls the camera module to enable, the timing is in LP10. The PHY device can only receive the image data sent by the camera module when the camera module's timing is in LP11; therefore, in this implementation, the PHY device cannot work normally when it is enabled and must wait until the camera module's timing reaches LP11 before it can receive image data. The mobile phone is in a black-screen state during the period between the PHY device being enabled and the PHY device becoming able to receive the image data sent by the camera module. It should be noted that when the PHY device is enabled and the camera module's timing is in LP11, the PHY device merely has the capability to receive image data; it does not receive any at that moment.
Illustratively, Fig. 4 shows a schematic diagram of the timing of one LP mode. The power-up sequence of the camera module is LP00, LP10, and LP11, with the duration of each set by the manufacturer at production time. The processor detects the user's operation of opening the camera application or switching camera modes and, in response, may power on the camera module, whose timing is then in LP00. During the LP00 period, the processor controls the PHY device to enable; at this point the camera module's timing is not in LP11, so the PHY device cannot work normally, i.e., it does not have the capability to receive image data. When the LP00 duration elapses and the timing is in LP10, the processor controls the camera module to enable. During the LP10 period, the PHY device still does not have the capability to receive image data. When the LP10 duration elapses and the timing is in LP11, the PHY device has the capability to receive image data, and after the LP11 duration, it receives the image data acquired by the camera module in the timing corresponding to the HS mode. In this implementation, the PHY device must wait for a period of time before it can receive image data and send it to the display device for display, which causes the black screen on the camera interface.
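The black-screen window in the conventional flow of Fig. 4 can be quantified with a small helper; the function name and millisecond units are assumptions for illustration:

```python
def legacy_blackout_ms(lp00_ms: float, lp10_ms: float,
                       phy_enable_ms: float) -> float:
    """Length of the black-screen window in the conventional flow.

    The PHY device is enabled at phy_enable_ms (during LP00), but it
    only gains the capability to receive image data once LP11 begins,
    i.e., after LP00 + LP10 have elapsed since power-on.
    """
    lp11_start_ms = lp00_ms + lp10_ms
    return max(0.0, lp11_start_ms - phy_enable_ms)
```

The earlier the PHY device is enabled within LP00, the longer this idle window; the proposed method removes it by deferring the PHY enable into LP11.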
In summary, the terminal device shows a black screen when the user opens the camera application or switches camera modes because the timing of the camera module after power-on is LP00, LP10, and then LP11: when the device in the processor that receives image data (such as the PHY device) is enabled, the camera module's timing is in LP00, and the PHY device only gains the capability to receive image data once the camera module's timing reaches LP11.
In view of this, the embodiment of the application provides a control method and terminal equipment for a camera application. The method detects an operation of opening the camera application or switching the camera mode by a user, and in response to the operation powers on the camera module, enables the camera module first, and then enables the PHY device, so that when the PHY device is enabled the time sequence of the camera module is already in LP11. That is, after the camera module is powered on, the camera module is controlled to enable when its time sequence is in LP10, and the PHY device is controlled to enable when the time sequence is in LP11. The PHY device therefore has the capability of receiving image data as soon as it is enabled, which is beneficial to reducing the probability of a black screen on the camera interface and improving the user experience.
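The reordered enable sequence described above can be sketched as a small timing simulation. This is purely illustrative: the state names follow the LP00/LP10/LP11 sequence in the text, but the function and data structure are assumptions, not the patent's implementation.

```python
# Illustrative sketch: the camera module is enabled during LP10 and the
# PHY device during LP11, so the PHY can receive image data as soon as
# it is enabled. State names follow the text; the code is hypothetical.

POWER_ON_SEQUENCE = ["LP00", "LP10", "LP11"]  # power-on timing states

def enable_events(sequence):
    """Return (state, device) pairs describing when each device is enabled."""
    events = []
    for state in sequence:
        if state == "LP10":
            events.append((state, "camera_module"))  # first time: enable camera
        elif state == "LP11":
            events.append((state, "phy_device"))     # second time: enable PHY
    return events

events = enable_events(POWER_ON_SEQUENCE)
# The camera module is enabled strictly before the PHY device, and the
# PHY is enabled while the timing is already in LP11.
assert events == [("LP10", "camera_module"), ("LP11", "phy_device")]
```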
The method provided by the embodiment of the application can be applied to any terminal equipment including camera application, such as mobile phones, tablet computers, personal computers (personal computer, PC) and the like. The terminal device may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), etc. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with wireless transceiving function, a Virtual Reality (VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in unmanned driving (self-driving), a wireless terminal in teleoperation (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), or the like. The embodiment of the application does not limit the specific technology and the specific equipment form adopted by the terminal equipment.
In order to better understand the embodiments of the present application, the following describes a hardware structure of the terminal device according to the embodiments of the present application. Fig. 5 is an exemplary hardware configuration diagram of a terminal device according to an embodiment of the present application. As shown in fig. 5, the terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, keys 190, an indicator 192, a camera 193, a display 194, and the like.
Alternatively, the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or fewer components than illustrated, or certain components may be combined, or certain components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The embodiment of the present application focuses only on the processor 110, the camera 193, and the display 194 of the terminal device. The terminal device shown in fig. 5 may acquire image data through the camera 193, receive the image data through the PHY device in the processor 110, and finally display the image data on the display 194.
Processor 110 may include one or more processing units. For example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. It will be appreciated that the processor 110 may be the processor in the example shown in fig. 3 described above, may include a PHY device, and may have the functions shown in fig. 3 described above.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it may be called from memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The terminal device realizes the display function through the GPU, the display screen 194, the AP, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the AP. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used for displaying images, displaying videos, receiving sliding operations, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display 194, an AP, and the like.
Wherein the ISP is used to process the data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
Video codecs are used to compress or decompress digital video. The terminal device may support one or more video codecs. In this way, the terminal device may play or record video in multiple encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The AP outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or another functional module.
The software system of the terminal device can adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture or a cloud architecture. The layered architecture may adopt an Android (Android) system, an apple (IOS) system, or other operating systems, which is not limited in the embodiment of the present application. Taking an Android system with a layered architecture as an example, a software structure of the terminal device is illustrated.
Fig. 6 is a software architecture block diagram of a terminal device to which the embodiment of the present application is applicable. The layered architecture divides the software system of the terminal device into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided, in order from top to bottom, into an application layer (applications), an application framework layer (application framework), the Android runtime (Android runtime) and system libraries, a hardware abstraction layer, and a kernel layer (kernel). The embodiment of the application improves only the hardware abstraction layer.
The application layer may include a series of application packages that run applications by calling an application program interface (application programming interface, API) provided by the application framework layer. As shown in FIG. 6, the application package may include applications for cameras, gallery, calendars, memos, and maps.
The application framework layer provides APIs and programming frameworks for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 6, the application framework layer may include a window manager, a content provider, and a view system, among others.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc. The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and management of the Android system during runtime. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection. The system library may contain modules for a number of functions, such as: a surface manager, media libraries, a three-dimensional graphics processing library, etc.
The surface manager is used to manage the display subsystem and provides fusion of two-dimensional and three-dimensional layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files such as JPG and PNG. The three-dimensional graphics processing library is used for realizing three-dimensional graphics drawing, image rendering, synthesis, layer processing, and the like.
The hardware abstraction layer includes a camera services module. The camera service module is used for providing service for camera application. The camera service module may include a power-up module, an enable module, and a configuration module. The configuration module includes configuration information indicating a camera module that needs to adjust an enable time of the PHY device. The configuration module may be configured to determine whether the camera module started by the user is a camera module that needs to adjust the enabling time of the PHY device, and send indication information to the power-up module based on the determination result. The power-on module can power on the camera module contained in the terminal equipment based on the indication information sent by the configuration module. The camera power-on time sequence may include LP00, LP10 and LP11, or the camera power-on time sequence may be LP11. The power-up module may also determine the enable time of the PHY device and the camera module and send these information to the enable module. The enabling module may enable the PHY device and the camera module based on information sent by the power-on module.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving the hardware so that the hardware works. The kernel layer at least comprises a display driver, a camera driver, a bluetooth driver and the like, which is not limited in the embodiment of the application.
In order to clearly describe the technical solution of the embodiments of the present application, in the embodiments of the present application, the words "first", "second", etc. are used to distinguish the same or similar items having substantially the same function and effect. It will be appreciated by those of skill in the art that the words "first", "second", and the like do not limit the number or order of execution, and that items described as "first", "second", and the like are not necessarily different.
In the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Furthermore, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of" and similar expressions mean any combination of the listed items, including any combination of single items or plural items. For example, at least one of a, b, and c may represent: a, b, or c, or a and b, or a and c, or b and c, or a, b, and c, where a, b, and c may each be singular or plural.
Illustratively, fig. 7 shows a schematic flow chart of a control method 700 of a camera application. The method 700 may be performed by a terminal device, e.g., a cell phone, including a first camera module and a PHY device. The hardware structure of the terminal device may be as shown in fig. 5, and the software structure of the terminal device may be as shown in fig. 6, but the embodiment of the application is not limited thereto. The method 700 may be applied to the scenarios illustrated in fig. 1 and 2 described above, but embodiments of the present application are not limited thereto. It should be noted that the PHY device may be included in a processor of the terminal device, and the method provided in the embodiment of the present application may also be specifically executed by the processor of the terminal device, such as a Qualcomm processor in the terminal device.
As shown in fig. 7, the method 700 may include the steps of:
s701, detecting a first operation of the camera application by the user, where the first operation is used to open the camera application, or the first operation is used to switch a camera mode of the camera application.
The first operation may be a click operation, a slide operation, or a double click operation applied to the camera by the user, which is not limited in the embodiment of the present application.
If the first operation is used to open the camera application, the first operation may be the operation shown in fig. 1 described above, but the embodiment of the application is not limited thereto. If the first operation is used to switch the camera mode of the camera application, the first operation may be the operation shown in fig. 2, but the embodiment of the application is not limited thereto.
S702, in response to the first operation, enabling the first camera module at a first time.
The first camera module is a camera module serving the first operation.
The terminal device comprises, for example, 3 camera modules, namely a camera module 1, a camera module 2 and a camera module 3. The terminal device may use the camera module 1 when the camera application is opened, the camera module 2 in the portrait photographing mode, and the camera module 3 in the video recording mode. If the first operation is used to open the camera application, the terminal device detects the first operation, and the terminal device may enable the camera module 1 at the first time. If the first operation is for switching to the portrait photographing mode of the camera application, the terminal device may enable the camera module 2 at the first moment. If the first operation is for switching to the video recording mode of the camera application, the terminal device may enable the camera module 3 at the first moment.
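The mapping from a user operation to the camera module that serves it, as in the three-module example above, can be sketched as a simple lookup. The module names and operation keys below are hypothetical illustrations, not identifiers from the patent.

```python
# Hypothetical mapping from a user operation to the camera module that
# serves it, mirroring the three-camera-module example in the text.
OPERATION_TO_MODULE = {
    "open_camera_app": "camera_module_1",   # opening the camera application
    "portrait_mode": "camera_module_2",     # switching to portrait photographing mode
    "video_mode": "camera_module_3",        # switching to video recording mode
}

def module_for(operation):
    """Return the 'first camera module' serving the given first operation."""
    return OPERATION_TO_MODULE[operation]

assert module_for("open_camera_app") == "camera_module_1"
assert module_for("video_mode") == "camera_module_3"
```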
The terminal device responds to the first operation and can power up the first camera module, and the power up time sequence of the first camera module is LP00, LP10 and LP11. The first time is within the power-up sequence.
The enabling of the terminal device at the first moment by the first camera module may specifically include: and when the terminal equipment detects that the first moment arrives, sending an enabling instruction to the first camera module, wherein the enabling instruction is used for indicating the first camera module to enable. After the first camera module receives the enabling instruction, enabling can be performed based on the enabling instruction. At this time, the first camera module has the capability of acquiring image data, but does not acquire image data.
S703, enabling the PHY device at a second time, wherein the second time is later than the first time, and the second time is located in a first period, and the first period is a period in which power-on pins of the first camera module are all at high level.
The second time is later than the first time, which may also be referred to as the second time being after the first time; this is not limited in the embodiment of the present application. Since the second time is later than the first time, the terminal device enables the first camera module first and then enables the PHY device.
The terminal device enables the PHY device at the second time, which may specifically include: when the terminal device detects that the second time arrives, an enabling instruction is sent to the PHY device, and the enabling instruction is used for indicating the PHY device to enable. After receiving the enable instruction, the PHY device may perform enabling based on the enable instruction.
The first period is a period when the power-on pins of the first camera module are all at a high level, namely, the LP11 period. At this time, the PHY device is enabled while the power-on time sequence of the first camera module is in the LP11 period, so the PHY device has the capability of receiving image data.
Optionally, the first time may be located at a start time of a second period, where a difference between the second time and the first time is greater than or equal to a first duration, and the difference is less than or equal to a second duration, where the first duration is a duration of the second period, and the second duration is a sum of a duration of the second period and a duration of the first period, and the second period is a period in which a first pin is at a high level and a second pin is at a low level in a powered pin of the first camera module.
The second period is a period in which, among the power-on pins of the first camera module, the first pin is at a high level and the second pin is at a low level, i.e., LP10. The first time may be at the start time of the second period, that is, at the start time of LP10. The first duration is the duration of the second period, i.e., the LP10 duration, so the difference between the second time and the first time is greater than or equal to the LP10 duration.

The first period is LP11, and the second duration is the sum of the duration of the second period and the duration of the first period, i.e., (LP10 + LP11), so the difference between the second time and the first time is less than or equal to (LP10 + LP11).

In other words, the difference between the second time and the first time is greater than or equal to the first duration and less than or equal to the second duration, that is, greater than or equal to the LP10 duration and less than or equal to (LP10 + LP11).
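The constraint on the second time can be checked numerically. The durations below are made-up values for illustration; only the inequality LP10 ≤ (second time − first time) ≤ (LP10 + LP11) comes from the text.

```python
# Sketch of the constraint on the PHY enable time (second time) relative
# to the camera-module enable time (first time): the difference must be
# at least the LP10 duration and at most LP10 + LP11, which places the
# second time inside the LP11 period. Durations are hypothetical.
LP10_MS = 2.0  # second-period (LP10) duration in ms, assumed
LP11_MS = 3.0  # first-period (LP11) duration in ms, assumed

def second_time_valid(first_time_ms, second_time_ms):
    delta = second_time_ms - first_time_ms
    # LP10 <= (t2 - t1) <= LP10 + LP11
    return LP10_MS <= delta <= LP10_MS + LP11_MS

assert second_time_valid(0.0, 2.5)      # inside LP11: allowed
assert not second_time_valid(0.0, 1.0)  # still in LP10: too early
assert not second_time_valid(0.0, 6.0)  # past LP11: too late
```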
Illustratively, fig. 8 shows a timing diagram of a control method of a camera application. The power-on time sequence of the first camera module is LP00, LP10 and LP11, and the duration of LP00, the duration of LP10 and the duration of LP11 are set by the manufacturer during production. The terminal device detects the operation of opening the camera application or switching the camera mode by a user and, in response to the operation, may power on the first camera module, whose power-on time sequence is then in LP00. When the LP00 duration ends and the time sequence is in LP10, the terminal device controls the first camera module to enable at the first time (namely, the start time of LP10). When the LP10 duration ends and the time sequence is in LP11, the terminal device controls the PHY device to enable at the second time (i.e., during the LP11 period). At this time, the PHY device has the capability of receiving image data. In the time sequence corresponding to the HS mode, the terminal device acquires the image data through the first camera module, receives the image data acquired by the first camera module through the PHY device, and finally displays the image data through the display device.
According to the control method for the camera application provided by the embodiment of the application, upon detecting the first operation of the camera application by the user, the camera module is enabled first and then the PHY device is enabled in response to the first operation. After the PHY device is enabled, the power-on time sequence of the camera module is in LP11, so the PHY device has the capability of receiving image data as soon as it is enabled, which helps avoid a black screen on the camera interface and improves the user experience.
As an alternative embodiment, the terminal device may include first configuration information, where the first configuration information is used to indicate a camera module in the terminal device that needs to adjust an enabling time of the PHY device; before enabling the first camera module at the first time in S702, the method 700 may further include: judging whether the camera module indicated by the first configuration information comprises a first camera module or not, and enabling the first camera module at a first moment if the camera module indicated by the first configuration information comprises the first camera module.
The first configuration information may be represented in the form of a configuration file. The first configuration information is used for indicating a camera module in the terminal device, which needs to adjust the enabling time of the PHY device, and the first configuration information may include a name or an identifier of the camera module, which needs to adjust the enabling time of the PHY device.
For example, the camera modules produced by different manufacturers may be different in name or identifier, and before the terminal device leaves the factory, the developer may preset the name of the manufacturer producing the camera module that needs to adjust the enabling time of the PHY device in the terminal device with the first configuration information. The terminal device detects an operation of starting the camera module, for example, the first operation, and the terminal device may determine whether the first configuration information includes the started camera module, for example, the first camera module. If the first configuration information includes the started camera module, enabling the first camera module at the first moment by the terminal equipment.
If the camera module indicated by the first configuration information does not include the first camera module, that is, the first camera module is not the camera module that needs to adjust the enabling time of the PHY device, the terminal device may not execute the subsequent steps provided in the embodiment of the present application, and may start the first camera module by using a method in the prior art.
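The configuration check described above can be sketched as a membership test against the first configuration information. The module identifiers below are hypothetical; only the decision logic (adjust the PHY enable time if and only if the module is listed) comes from the text.

```python
# Sketch of the first-configuration-information check: only camera
# modules listed in the configuration get the adjusted PHY enable time;
# any other module is started with the prior-art sequence.
# All identifiers are hypothetical placeholders.
FIRST_CONFIG_MODULES = {"vendorA_cam_front", "vendorA_cam_wide"}

def needs_adjusted_phy_enable(module_name):
    """True if the started module must delay the PHY enable to LP11."""
    return module_name in FIRST_CONFIG_MODULES

assert needs_adjusted_phy_enable("vendorA_cam_front")
assert not needs_adjusted_phy_enable("vendorB_cam_tele")  # prior-art start-up
```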
According to the camera application control method provided by the embodiment of the application, the first configuration information indicates which camera modules in the terminal device need the enabling time of the PHY device to be adjusted, and only the camera modules indicated by the first configuration information are adjusted, so the change to the terminal device is small.
As an optional embodiment, the terminal device further includes a second camera module, where the second camera module is not included in the camera module indicated by the first configuration information; the method may further include: detecting a second operation of the camera application by the user, the second operation being for switching a camera mode of the camera application, the second operation being different from the first operation; enabling the PHY device at a third time in response to the second operation; enabling the second camera module at a fourth time, wherein the fourth time is later than the third time.
The second camera module is not included in the camera module indicated by the first configuration information, that is, the second camera module is not a camera module that needs to adjust the enabling time of the PHY device.
The second operation is different from the first operation. If the first operation is to open the camera application, the second operation may be to switch camera modes of the camera application. If the first operation is used to switch the camera mode of the camera application, the second operation may also be used to switch the camera mode of the camera application, but the camera mode switched by the first operation is different from the camera mode switched by the second operation.
The second camera module is not a camera module that needs to adjust the enabling time of the PHY device, and when the terminal device detects a second operation to start the second camera module, the PHY device is enabled at a third time in response to the second operation, and the second camera module is enabled at a fourth time, which is later than the third time, that is, the terminal device is set to enable the PHY device before the second camera module is enabled.
The terminal device may employ prior art methods to enable the PHY device prior to enabling the second camera module.
According to the camera application control method provided by the embodiment of the application, when the second camera module is not the camera module which needs to adjust the enabling time of the PHY device and the terminal equipment detects the second operation of starting the second camera module, the PHY device can be enabled before the second camera module is enabled. The two control methods can be switched back and forth according to different camera modules, and the flexibility is high.
Optionally, the third time and the fourth time may be both in a third period, where the third period is a period in which the power-on pins of the second camera module are both at a high level.
The third period is a period in which the power-on pins of the second camera module are all at high level, namely, the period of LP11.
The terminal device may first power up the second camera module in response to the second operation, and the power-up timing sequence of the second camera module may be LP11. When the power-on time sequence of the second camera module is in LP11, the terminal device may enable the PHY device first and then enable the second camera module. Since the power-on time sequence of the second camera module is already in LP11 when the PHY device is enabled, the PHY device has the capability of receiving image data after being enabled.
According to the camera application control method provided by the embodiment of the application, in the third period, the PHY device can be enabled first and then the second camera module is enabled, the enabling time of the PHY device is not required to be adjusted, and the efficiency of starting the second camera module is improved.
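The two enable orders discussed above can be contrasted in a short branch. This is an illustrative sketch of the decision only; the function name and return values are assumptions.

```python
# Sketch contrasting the two enable orders: for a module indicated by the
# first configuration information, the camera module is enabled before the
# PHY device (first/second time); for any other module, the PHY device is
# enabled before the camera module (third/fourth time, both within LP11).
def enable_order(module_in_config):
    if module_in_config:
        return ["camera_module", "phy_device"]  # adjusted order (method 700)
    return ["phy_device", "camera_module"]      # prior-art order

assert enable_order(True) == ["camera_module", "phy_device"]
assert enable_order(False) == ["phy_device", "camera_module"]
```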
The terminal equipment provided by the embodiment of the application can comprise a configuration module, a power-on module and an enabling module. The control method of the camera application provided by the embodiment of the application can be executed by the configuration module, the power-on module and the enabling module.
Illustratively, fig. 9 shows a schematic flow chart of a control method 900 of a camera application. As shown in fig. 9, the method 900 may include the steps of:
S901, when a first operation of a user on the camera application is detected, the configuration module determines whether the first configuration information includes the first camera module, where the first operation is used for starting the first camera module.
The configuration module includes first configuration information, which is used for indicating the camera modules that require the enable time of the PHY device to be adjusted.
S902, if the first configuration information includes the first camera module, the configuration module sends a first power-on instruction to the power-on module, and correspondingly, the power-on module receives the first power-on instruction. The first power-on instruction is used for instructing the power-on module to power on the first camera module with a power-on timing sequence that includes LP00, LP10, and LP11.
If the first configuration information does not include the first camera module, the configuration module may send a second power-on instruction to the power-on module, and correspondingly, the power-on module receives the second power-on instruction. The second power-on instruction is used for instructing the power-on module to power on the first camera module with a power-on timing sequence of LP11.
S903, the power-on module powers on the first camera module based on the first power-on instruction, and determines the enable times of the first camera module and the PHY device.
Based on the first power-on instruction, the power-on module may determine that the enable time of the first camera module is a first time and the enable time of the PHY device is a second time. The second time is later than the first time and falls within the first period, where the first period is a period in which the power-on pins of the first camera module are all at a high level.
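The window in which the second time may fall can be written out numerically using the durations that claim 2 pins down: with the first time at the start of the second period (LP10), the difference between the two enable times must lie between the duration of the second period and that duration plus the duration of the first period (LP11). A sketch, with variable names assumed for illustration:

```python
def phy_enable_window(t1: float, lp10_duration: float,
                      lp11_duration: float) -> tuple[float, float]:
    """Return the (earliest, latest) allowed PHY enable time.

    t1 is the first time (module enable, at the start of the LP10
    period); the PHY enable time t2 must satisfy
        lp10_duration <= t2 - t1 <= lp10_duration + lp11_duration
    so that t2 falls inside the LP11 period.
    """
    earliest = t1 + lp10_duration
    latest = t1 + lp10_duration + lp11_duration
    return earliest, latest

# Example with assumed durations: LP10 lasts 2 units, LP11 lasts 5.
assert phy_enable_window(0.0, 2.0, 5.0) == (2.0, 7.0)
```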
S904, the power-on module may send an enabling instruction to the enabling module, and correspondingly, the enabling module receives the enabling instruction. The enabling instruction may include the enable times of the first camera module and the PHY device, and is used for instructing the enabling module to perform enabling at the determined times.
S905, based on the enabling instruction, the enabling module enables the first camera module at the first time and enables the PHY device at the second time.
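Steps S901–S905 can be sketched as plain control flow. The event-log representation and function name are illustrative assumptions; only the branching and the ordering of power-on and enable operations follow the text.

```python
def method_900(first_module: str, first_config_info: set[str],
               events: list[str]) -> list[str]:
    """Sketch of method 900 (S901-S905)."""
    # S901: check whether the first camera module appears in the first
    # configuration information.
    if first_module in first_config_info:
        # S902/S903: power on with the LP00 -> LP10 -> LP11 sequence;
        # S904/S905: enable the module at the first time, then the PHY
        # at the second time (inside the LP11 period).
        events += ["power_on(LP00,LP10,LP11)",
                   f"enable({first_module})",
                   "enable(PHY)"]
    else:
        # Power-on timing is LP11, so the PHY can be enabled first.
        events += ["power_on(LP11)",
                   "enable(PHY)",
                   f"enable({first_module})"]
    return events

log = method_900("cam0", {"cam0"}, [])
assert log == ["power_on(LP00,LP10,LP11)", "enable(cam0)", "enable(PHY)"]
```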
According to the camera application control method provided by the embodiment of the application, through the cooperation of the configuration module, the power-on module, and the enabling module, the PHY device has the capability of receiving image data as soon as it is enabled, so that a black screen on the camera interface can be avoided and the user experience can be improved.
The sequence numbers of the processes in the above embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application.
The method provided by the embodiment of the present application is described in detail above with reference to fig. 1 to 9, and the apparatus provided by the embodiment of the present application will be described in detail below with reference to fig. 10 and 11.
Fig. 10 shows a schematic block diagram of a terminal device 1000 according to an embodiment of the present application. The terminal device 1000 includes a detection module 1010 and a processing module 1020. The detection module 1010 is configured to detect a first operation of a user on a camera application, where the first operation is used for opening the camera application or for switching a camera mode of the camera application. The processing module 1020 is configured to: enable the first camera module at a first time in response to the first operation; and enable the PHY device at a second time, where the second time is later than the first time and falls within a first period, the first period being a period in which the power-on pins of the first camera module are all at a high level.
It should be understood that terminal device 1000 herein is embodied in the form of functional modules. The term module herein may refer to an application specific integrated circuit (application specific integrated circuit, ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an alternative example, it will be understood by those skilled in the art that the terminal device 1000 may be specifically a terminal device in the foregoing method embodiment, or the functions of the terminal device in the foregoing method embodiment may be integrated in the terminal device 1000, and the terminal device 1000 may be used to execute each flow and/or step corresponding to the terminal device in the foregoing method embodiment, which is not repeated herein for avoiding repetition.
The terminal device 1000 has a function of implementing the corresponding steps executed by the terminal device in the method embodiment; the above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In an embodiment of the present application, terminal device 1000 in fig. 10 may also be a chip or a chip system, for example: system on chip (SoC).
Fig. 11 is a schematic block diagram of another terminal device 1100 provided by an embodiment of the present application. The terminal apparatus 1100 includes: a processor 1110, a transceiver 1120, and a memory 1130. Wherein the processor 1110, the transceiver 1120 and the memory 1130 are in communication with each other through an internal connection path, the memory 1130 is configured to store instructions, and the processor 1110 is configured to execute the instructions stored in the memory 1130 to control the transceiver 1120 to transmit signals and/or receive signals.
It should be understood that the terminal device 1100 may be specifically a terminal device in the above-described method embodiment, or the functions of the terminal device in the above-described method embodiment may be integrated in the terminal device 1100, and the terminal device 1100 may be configured to perform the steps and/or flows corresponding to the terminal device in the above-described method embodiment. The memory 1130 may optionally include read-only memory and random access memory, and provide instructions and data to the processor 1110. A portion of memory 1130 may also include non-volatile random access memory. For example, the memory 1130 may also store information of the type of device. The processor 1110 may be configured to execute instructions stored in the memory 1130, and when the processor 1110 executes the instructions, the processor 1110 may perform the steps and/or processes corresponding to the terminal device in the above-described method embodiments.
It should be appreciated that, in embodiments of the application, the processor 1110 may be a central processing unit (CPU), or may be another general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor executes instructions in the memory to perform the steps of the method described above in conjunction with its hardware. To avoid repetition, a detailed description is not provided herein.
The application also provides a computer readable storage medium, which stores a computer program for implementing the method corresponding to the terminal device in the method embodiment.
The application also provides a chip system, configured to support the terminal device in implementing the functions shown in the foregoing method embodiments.
The present application also provides a computer program product comprising a computer program (which may also be referred to as code, or instructions) which, when run on a computer, is adapted to perform the method corresponding to the terminal device shown in the above-mentioned method embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the part of the technical solution of the present application that is essential, or that contributes to the prior art, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely a specific implementation of the present application, but the scope of the embodiments of the present application is not limited thereto; any changes or substitutions that can be readily conceived by a person skilled in the art within the technical scope of the embodiments of the present application shall fall within the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A control method for a camera application, the method being applied to a terminal device, the terminal device including a first camera module and a physical layer PHY device, the method comprising:
detecting a first operation of a user on the camera application, wherein the first operation is used for opening the camera application or for switching a camera mode of the camera application;
enabling the first camera module at a first time in response to the first operation; and
enabling the PHY device at a second time, wherein the second time is later than the first time, the second time falls within a first period, and the first period is a period in which power-on pins of the first camera module are all at a high level.
2. The method of claim 1, wherein the first time is at a start time of a second period, a difference between the second time and the first time is greater than or equal to a first duration and less than or equal to a second duration, the first duration is the duration of the second period, the second duration is the sum of the duration of the second period and the duration of the first period, and the second period is a period in which, among the power-on pins of the first camera module, a first pin is at a high level and a second pin is at a low level.
3. The method according to claim 1 or 2, wherein the terminal device includes first configuration information for indicating a camera module in the terminal device that needs to adjust an enable time of a PHY device;
the camera module indicated by the first configuration information comprises the first camera module.
4. The method of claim 3, wherein the terminal device further comprises a second camera module, the second camera module not included in the camera module indicated by the first configuration information;
the method further comprises the steps of:
detecting a second operation of the user on the camera application, wherein the second operation is used for switching a camera mode of the camera application, and the second operation is different from the first operation;
enabling the PHY device at a third time in response to the second operation;
enabling the second camera module at a fourth time, wherein the fourth time is later than the third time.
5. The method of claim 4, wherein the third time and the fourth time both fall within a third period, and the third period is a period in which power-on pins of the second camera module are all at a high level.
6. A terminal device, comprising:
a detection module, configured to detect a first operation of a user on a camera application, wherein the first operation is used for opening the camera application or for switching a camera mode of the camera application; and
a processing module, configured to: enable the first camera module at a first time in response to the first operation; and enable the PHY device at a second time, wherein the second time is later than the first time, the second time falls within a first period, and the first period is a period in which the power-on pins of the first camera module are all at a high level.
7. The terminal device according to claim 6, wherein the terminal device includes first configuration information for indicating a camera module in the terminal device that needs to adjust an enable time of a PHY device;
the camera module indicated by the first configuration information comprises the first camera module.
8. A terminal device, comprising: a processor and a memory;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to cause the terminal device to perform the method of any one of claims 1 to 5.
9. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 5.
10. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any one of claims 1 to 5.
CN202211310555.7A 2022-10-25 2022-10-25 Control method of camera application and terminal equipment Active CN116744106B (en)

Publications (2)

Publication Number Publication Date
CN116744106A true CN116744106A (en) 2023-09-12
CN116744106B CN116744106B (en) 2024-04-30


