CN117156270A - Photographing processing method and related device - Google Patents

Photographing processing method and related device

Info

Publication number
CN117156270A
Authority
CN
China
Prior art keywords
camera
photographing
photographing mode
cameras
terminal equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310121408.3A
Other languages
Chinese (zh)
Inventor
李炳炳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310121408.3A priority Critical patent/CN117156270A/en
Publication of CN117156270A publication Critical patent/CN117156270A/en
Pending legal-status Critical Current
Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions

Abstract

An embodiment of the present application provides a photographing processing method and a related device, relating to the field of terminal technologies. The method includes: at a first moment, a terminal device receives a first operation directed to a camera application; in response to the first operation, the terminal device displays a first interface, where the first interface includes a first photographing mode, a second photographing mode, and a third photographing mode, all displayed in a state in which photographing can be triggered; at a second moment, later than the first moment, the terminal device again receives a second operation directed to the camera application; in response to the second operation, the terminal device displays a second interface, where the second interface includes the first photographing mode and the third photographing mode, both displayed in a state in which photographing can be triggered, and the second interface does not include the second photographing mode. In this way, the camera application does not display an abnormal photographing mode, so that the camera application can be used normally.

Description

Photographing processing method and related device
Technical Field
The application relates to the technical field of terminals, in particular to a photographing processing method and a related device.
Background
To meet users' photographing needs, a terminal device may include a plurality of cameras, thereby improving the user's photographing experience.
In some implementations, the terminal device may use multiple cameras in cooperation with one another to form different photographing modes, and the user may photograph with these different modes to improve the photographing effect. However, when a user attempts to take a picture, the camera application may fail to open; or, when the user taps a certain photographing mode, the camera application may crash, freeze, or become stuck, which degrades the user's photographing experience.
Disclosure of Invention
According to the photographing processing method and related device provided by the embodiments of the present application, when a certain photographing mode cannot be used because a certain physical camera has failed, the camera interface either does not display that photographing mode or displays it in a state in which photographing cannot be triggered. The camera application can therefore be used normally, avoiding problems such as failure to open, crashing, a frozen screen, or becoming stuck.
In a first aspect, a photographing processing method provided by an embodiment of the present application includes:
at a first moment, a terminal device receives a first operation directed to a camera application; in response to the first operation, the terminal device displays a first interface, where the first interface includes a first photographing mode, a second photographing mode, and a third photographing mode, all displayed in a state in which photographing can be triggered; at a second moment, later than the first moment, the terminal device again receives a second operation directed to the camera application; in response to the second operation, the terminal device displays a second interface, where the second interface includes the first photographing mode and the third photographing mode, both displayed in a state in which photographing can be triggered, and the second interface does not include the second photographing mode. In this way, the camera application can judge whether a certain photographing mode can be used normally and, when a photographing mode is abnormal, refrain from displaying it, so that the camera application can be used normally.
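The interface behavior described in the first aspect can be sketched as follows. This is a hypothetical illustration only: the mode names, camera names, and the mode-to-camera mapping are assumptions for the sketch, not the patent's actual implementation.

```python
# Hypothetical sketch: the camera application displays only those
# photographing modes whose required physical cameras are all functional.

# Assumed mapping from each photographing mode to the cameras it needs.
MODE_REQUIREMENTS = {
    "first_mode": {"wide"},
    "second_mode": {"wide", "tele"},   # depends on the telephoto camera
    "third_mode": {"ultra_wide"},
}

def displayable_modes(failed_cameras):
    """Return the modes whose required cameras have not failed."""
    return [
        mode for mode, required in MODE_REQUIREMENTS.items()
        if not (required & failed_cameras)
    ]

# First moment: no failures, so all three modes are shown.
print(displayable_modes(set()))
# Second moment: the telephoto camera has failed, so the second mode is hidden.
print(displayable_modes({"tele"}))
```

With no failures the first interface lists all three modes; once the telephoto camera fails, the second interface omits the second mode, matching the behavior above.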
In a possible implementation, the first interface further includes first focal segment information, and the second interface further includes second focal segment information, where the magnification range in the second focal segment information is smaller than the magnification range in the first focal segment information. Thus, when a certain focal segment cannot be used, the camera application does not display the unusable focal segment, preventing the abnormal behavior that would occur if the user tapped an invalid focal segment, which would affect the user's experience.
In a possible implementation, before the terminal device displays the second interface in response to the second operation, the method may include: in response to the second operation, the terminal device displays a third interface, where the third interface includes the first photographing mode and the third photographing mode, both displayed in a state in which photographing can be triggered, and also includes the second photographing mode displayed in a state in which photographing cannot be triggered; the terminal device receives, on the third interface, a third operation directed to the second photographing mode; and, in response to the third operation, the terminal device displays a fourth interface, which may include information prompting that the second photographing mode is unavailable and/or identification information of the failed camera in the terminal device. In this way, through a popup window, the camera application promptly lets the user know that a certain photographing mode or camera cannot be used, so that the user can repair or replace the terminal device in time.
In a possible implementation, the fourth interface may further include information prompting whether to cancel the display of the second photographing mode, together with a target button; the method may further include: the terminal device receives a fourth operation directed to the target button; and, in response to the fourth operation, the terminal device displays the second interface. In this way, it can be determined whether the photographing interface continues to display the unusable photographing mode, improving the user experience.
In a possible implementation, the terminal device includes an ultra-wide-angle camera, a wide-angle camera, and/or a telephoto camera; before the terminal device again receives the second operation directed to the camera application, the method may further include: the terminal device determines that the ultra-wide-angle camera, the wide-angle camera, and/or the telephoto camera has failed. When the ultra-wide-angle camera fails, the second focal segment information does not include the 0.1X-1X magnification range; when the wide-angle camera fails, the second focal segment information does not include the 1X-10X magnification range; when the telephoto camera fails, the second focal segment information does not include the 10X-100X magnification range. In this way, the camera application can update the valid focal segments of each photographing mode according to camera failures, so that invalid focal segments are not displayed on the photographing interface and the camera application can be used normally.
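The focal-segment update described above can be sketched as follows, using the magnification ranges stated in the text (0.1X-1X, 1X-10X, 10X-100X). The camera names and the data structure are illustrative assumptions.

```python
# Hypothetical sketch: each physical camera covers one magnification range,
# and the range of a failed camera is removed from the focal segment
# information reported to the camera application.
CAMERA_RANGES = {
    "ultra_wide": (0.1, 1.0),   # 0.1X-1X
    "wide": (1.0, 10.0),        # 1X-10X
    "tele": (10.0, 100.0),      # 10X-100X
}

def available_ranges(failed_cameras):
    """Magnification ranges still offered after removing failed cameras."""
    return [rng for cam, rng in CAMERA_RANGES.items()
            if cam not in failed_cameras]

# Telephoto camera failed: the 10X-100X range is no longer shown.
print(available_ranges({"tele"}))
```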
In a possible implementation, the terminal device includes a plurality of cameras, and N of them are required when the second photographing mode operates, where N is a positive integer; before the terminal device again receives the second operation directed to the camera application, the method may include: the terminal device determines that M target cameras have failed, where M is a positive integer and M is less than or equal to N; the terminal device determines that the type and/or number of the target failed cameras meets a preset condition, where the target failed cameras are the cameras common to both the M target cameras and the N cameras; and the terminal device disables the second photographing mode. This decouples the failed physical cameras in a terminal device configured with a multi-camera strategy: the terminal device can disable the second photographing mode, ensuring that the camera application remains usable.
In a possible implementation, the terminal device determining that the type and/or number of the target failed cameras meets the preset condition may include: the terminal device determines that a target failed camera is the main camera, and/or the terminal device determines that the number of target failed cameras is greater than or equal to half of N. Because the terminal device determines the disabled photographing modes from this preset condition, it does not need to enumerate the permutations and combinations of the remaining working cameras; this simplifies the implementation logic of the multi-camera configuration strategy, reduces computation, and retains each photographing mode to the greatest extent.
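The preset condition above (main camera failed, or at least half of the N required cameras failed) can be sketched as a single decision function. The camera names and the default main camera are assumptions for illustration.

```python
def should_disable_mode(failed_cameras, required_cameras, main_camera="wide"):
    """Hypothetical sketch of the preset condition: disable a photographing
    mode when its failed cameras include the main camera, or when the number
    of target failed cameras is at least half of N (the cameras the mode
    requires)."""
    # "Target failed cameras": cameras common to the failed set and the
    # mode's required set.
    overlap = failed_cameras & required_cameras
    n = len(required_cameras)
    return (main_camera in overlap) or (len(overlap) >= n / 2)
```

For example, a mode requiring four cameras survives the failure of one non-main camera, but a two-camera mode is disabled when either of its cameras fails.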
In a possible implementation, the terminal device disabling the second photographing mode may include: the camera driver of the terminal device updates the photographing modes and reports the updated information to the camera application, where the updated information does not include the second photographing mode. The camera application can then display the corresponding photographing modes according to the updated information reported by the camera driver, avoiding problems such as failure to open, crashing, a frozen screen, or becoming stuck that an abnormal photographing mode would otherwise cause.
In a possible implementation, before the M target cameras fail, the terminal device uses the target failed cameras when photographing in the third photographing mode; after the M target cameras fail, the terminal device does not use the target failed cameras when photographing in the third photographing mode. Thus, even when some cameras have failed, photographing modes that previously used the failed cameras can still be used, retaining each photographing mode to the greatest extent.
In a second aspect, another photographing processing method provided by an embodiment of the present application includes:
the terminal device determines a failed camera; the camera driver of the terminal device updates the photographing modes and/or focal segments and reports the updated information to the camera application, where the updated information does not include any photographing mode and/or focal segment rendered unusable by the failed camera; and the camera application displays each photographing mode and/or zoomable focal segment according to the updated information reported by the camera driver. In this way, the camera application displays only the photographing modes and/or focal segments reported by the camera driver, so that it can be used normally, avoiding problems such as failure to open, crashing, a frozen screen, or becoming stuck caused by an abnormal photographing mode and/or focal segment.
In a possible implementation, the terminal device determining the failed camera includes: the chip-platform driver and/or the camera driver of the terminal device detects whether each camera of the terminal device is in place, thereby identifying the failed camera. In this way, the terminal device can decouple the failed physical camera, ensuring the normal use of the camera application.
In a possible implementation, each physical camera and each photographing mode in the terminal device corresponds to a respective camera identifier; after the failed camera is identified, the method may further include: the chip-platform driver updates the camera identifiers of all physical cameras and all photographing modes in the terminal device and reports the updated camera identifiers to the multi-camera policy matching module of the terminal device, where the updated camera identifiers include neither the identifier of the failed camera nor the identifiers of the photographing modes rendered unusable by the failed camera; the multi-camera policy matching module manages the correspondence between photographing modes and the physical cameras they use. In this way, the multi-camera policy matching module can output a multi-camera decision according to the updated camera identifiers from the chip-platform driver, thereby controlling the corresponding physical cameras to capture images and keeping the camera usable after a certain physical camera fails.
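The identifier update reported to the multi-camera policy matching module can be sketched as below. The identifier strings and the mode-to-camera mapping are invented for illustration; the patent does not specify them.

```python
# Hypothetical sketch of the chip-platform driver update: the identifiers of
# failed physical cameras, and of photographing modes that can no longer be
# composed from working cameras, are dropped before reporting.
def updated_camera_ids(physical_ids, mode_to_cameras, failed_ids):
    """Return the identifier list after removing failed cameras and the
    modes that depend on them."""
    physical = [cid for cid in physical_ids if cid not in failed_ids]
    modes = [mid for mid, cams in mode_to_cameras.items()
             if not (set(cams) & failed_ids)]
    return physical + modes

physical = ["cam_ultra_wide", "cam_wide", "cam_tele"]
modes = {"mode_portrait": ["cam_wide", "cam_tele"],
         "mode_macro": ["cam_ultra_wide"]}
# Telephoto camera failed: its ID and the portrait mode's ID disappear.
print(updated_camera_ids(physical, modes, {"cam_tele"}))
```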
In a possible implementation, the camera driver of the terminal device updating the photographing modes and/or focal segments includes: the camera driver updates L photographing modes to Q photographing modes and/or updates the first focal segment information to the second focal segment information, where the Q photographing modes are the L photographing modes with the modes rendered unusable by the failed camera removed, the magnification range in the second focal segment information is smaller than that in the first focal segment information, L and Q are positive integers, and L is greater than or equal to Q. The camera application can then display the corresponding photographing modes and valid zoom focal segments according to the updated information reported by the camera driver, so that invalid photographing modes and/or focal segments do not appear on the photographing interface, improving the user experience.
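The driver-side update from L modes to Q modes, together with the shrinking focal segment information, can be sketched in one step. All names and mappings here are assumptions for the sketch.

```python
# Hypothetical sketch of the driver update from the second aspect: L
# photographing modes are reduced to Q modes (Q <= L) by removing the modes
# that depend on a failed camera, and the focal segment information shrinks
# in the same step.
def driver_update(mode_requirements, focal_segments, failed):
    """Return (remaining modes, remaining focal segments) after a failure."""
    modes = [m for m, cams in mode_requirements.items()
             if not (cams & failed)]
    segments = {cam: rng for cam, rng in focal_segments.items()
                if cam not in failed}
    return modes, segments

l_modes = {"photo": {"wide"},
           "portrait": {"wide", "tele"},
           "super_macro": {"ultra_wide"}}
segments = {"ultra_wide": (0.1, 1.0), "wide": (1.0, 10.0),
            "tele": (10.0, 100.0)}
q_modes, new_segments = driver_update(l_modes, segments, {"tele"})
assert len(q_modes) <= len(l_modes)  # L >= Q, as the text requires
```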
In a third aspect, an embodiment of the present application provides a device for photographing, where the device may be a terminal device, or may be a chip or a chip system in the terminal device. The apparatus may include a processing unit and a display unit. The processing unit is configured to implement any method related to processing performed by the terminal device in the first aspect or any possible implementation manner of the first aspect. The display unit is configured to implement any method related to display performed by the terminal device in the first aspect or any possible implementation manner of the first aspect. When the apparatus is a terminal device, the processing unit may be a processor. The apparatus may further comprise a storage unit, which may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the terminal device implements the method described in the first aspect or any one of possible implementation manners of the first aspect. When the apparatus is a chip or a system of chips within a terminal device, the processing unit may be a processor. The processing unit executes the instructions stored by the storage unit to cause the terminal device to implement the method described in the first aspect or any one of the possible implementations of the first aspect. The memory unit may be a memory unit (e.g., a register, a cache, etc.) in the chip, or a memory unit (e.g., a read-only memory, a random access memory, etc.) located outside the chip in the terminal device.
The processing unit is configured to receive a first operation directed to the camera application and to receive a second operation directed to the camera application. The display unit is configured to display a first interface, where the first interface includes a first photographing mode, a second photographing mode, and a third photographing mode; and is further configured to display a second interface, where the second interface includes the first photographing mode and the third photographing mode.
In a possible implementation, the first interface further includes first focal segment information, and the second interface further includes second focal segment information, where the magnification range in the second focal segment information is smaller than the magnification range in the first focal segment information.
In a possible implementation, the display unit is configured to display a third interface, where the third interface includes the first photographing mode and the third photographing mode displayed in a state in which photographing can be triggered, and the second photographing mode displayed in a state in which photographing cannot be triggered; the display unit is further configured to display a fourth interface. The processing unit is configured to receive a third operation directed to the second photographing mode.
In a possible implementation, the processing unit is configured to receive a fourth operation for the target button. And the display unit is used for displaying the second interface.
In a possible implementation, the processing unit is configured to determine that the ultra-wide-angle camera, the wide-angle camera, and/or the telephoto camera has failed.
In a possible implementation manner, the processing unit is configured to determine that the M target cameras fail, and further determine that the type of the target failed cameras and/or the number of the target failed cameras meet a preset condition, and further is configured to fail the second photographing mode.
In a possible implementation, the processing unit is configured to determine that the type of the target failed camera is the main camera, and/or that the number of the target failed cameras is greater than or equal to half of N.
In a possible implementation manner, the processing unit is configured to update the photographing mode and report updated information to the camera application.
In a possible implementation, the processing unit is configured to use the target failed cameras when photographing in the third photographing mode before the M target cameras fail, and not to use the target failed cameras when photographing in the third photographing mode after the M target cameras fail.
In a fourth aspect, an embodiment of the present application provides a device for photographing, where the device may be a terminal device, or may be a chip or a chip system in the terminal device. The apparatus may include a processing unit and a display unit. The processing unit is configured to implement any method related to processing performed by the terminal device in the second aspect or any possible implementation manner of the second aspect. The display unit is configured to implement any method related to display performed by the terminal device in the second aspect or any possible implementation manner of the second aspect. When the apparatus is a terminal device, the processing unit may be a processor. The apparatus may further comprise a storage unit, which may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the terminal device implements the method described in the second aspect or any one of possible implementation manners of the second aspect. When the apparatus is a chip or a system of chips within a terminal device, the processing unit may be a processor. The processing unit executes instructions stored by the storage unit to cause the terminal device to implement the method described in the second aspect or any one of the possible implementations of the second aspect. The memory unit may be a memory unit (e.g., a register, a cache, etc.) in the chip, or a memory unit (e.g., a read-only memory, a random access memory, etc.) located outside the chip in the terminal device.
The processing unit is configured to determine the failed camera, update the photographing modes and/or focal segments, and report the updated information to the camera application. The display unit is configured to display each photographing mode and/or zoomable focal segment.
In a possible implementation, the processing unit is configured to detect whether each camera of the terminal device is in place.
In a possible implementation manner, the processing unit is configured to update a camera identifier of each physical camera in the terminal device and a camera identifier of each photographing mode, and report the updated camera identifier to the multi-photographing policy matching module of the terminal device.
In a possible implementation manner, the processing unit is configured to update the L photographing modes to Q photographing modes, and/or update the first focal segment information to the second focal segment information.
In a fifth aspect, an embodiment of the present application provides a terminal device, including a processor and a memory, where the memory is configured to store code instructions, and the processor is configured to execute the code instructions to perform the first aspect or any one of the possible implementation manners of the first aspect, and/or the photographing processing method described in the second aspect or any one of the possible implementation manners of the second aspect.
In a sixth aspect, embodiments of the present application provide a computer readable storage medium, in which a computer program or instructions are stored which, when run on a computer, cause the computer to perform the first aspect or any one of the possible implementations of the first aspect, and/or the photographing processing method described in the second aspect or any one of the possible implementations of the second aspect.
In a seventh aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the first aspect or any one of the possible implementations of the first aspect and/or the photographing processing method described in the second aspect or any one of the possible implementations of the second aspect.
In an eighth aspect, the present application provides a chip or chip system applied to a terminal device. The chip or chip system includes one or more processors configured to detect the status of the cameras of the terminal device, determine the failed camera, update the camera identifier of each physical camera and of each photographing mode in the terminal device, and report the updated camera identifiers within the terminal device.
In a ninth aspect, the present application provides a chip or chip system comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by wires, the at least one processor being adapted to execute a computer program or instructions to perform the first aspect or any one of the possible implementations of the first aspect and/or the photographic processing method described in the second aspect or any one of the possible implementations of the second aspect. The communication interface in the chip can be an input/output interface, a pin, a circuit or the like.
In a possible implementation, the chip or chip system described above further includes at least one memory, where the at least one memory stores instructions. The memory may be a memory unit within the chip, such as a register or a cache, or may be a memory unit located outside the chip in the terminal device (e.g., a read-only memory, a random access memory, etc.).
It should be understood that, the third aspect to the ninth aspect of the present application correspond to the solutions of the first aspect and/or the second aspect of the present application, and the advantages obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 2 is a schematic software structure of a terminal device according to an embodiment of the present application;
fig. 3 is a schematic view of a photographing interface according to an embodiment of the present application;
fig. 4 is a schematic diagram of a process flow of decoupling a multi-camera logical camera according to an embodiment of the present application;
fig. 5 is a schematic diagram of a photographing interface provided by an embodiment of the present application without displaying a certain photographing mode;
fig. 6 is a schematic view of popup window prompt when a photographing mode is abnormal according to an embodiment of the present application;
fig. 7 is a schematic diagram of a photographing interface provided by an embodiment of the present application without displaying a certain focal segment;
fig. 8 is a schematic diagram of a photographing processing method according to an embodiment of the present application;
fig. 9 is a schematic diagram of another photographing processing method according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
To facilitate a clear description of the technical solutions of the embodiments of the present application, the following briefly describes some terms and techniques involved in the embodiments of the present application:
1. terminology
In the embodiments of the present application, the words "first," "second," and the like are used to distinguish between identical or similar items having substantially the same function and effect. For example, a first chip and a second chip are distinguished merely as different chips, with no ordering implied. Those skilled in the art will appreciate that the words "first," "second," and the like do not limit the number or execution order, nor do they necessarily indicate a difference.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of" the following items means any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
2. Terminal equipment
The terminal device of the embodiments of the present application may be an electronic device of any form; for example, the electronic device may include a handheld device with a photographing function, a vehicle-mounted device, and the like. For example, some electronic devices are: a mobile phone, a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, a terminal device in a future communication network or a public land mobile network (PLMN), and the like; the present application is not limited thereto.
By way of example and not limitation, in the embodiments of the present application, the electronic device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. Wearable devices are not merely hardware devices; they realize powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used in combination with other devices such as smartphones, for example, various smart bracelets and smart jewelry for physical sign monitoring.
In addition, in the embodiment of the application, the electronic equipment can also be a terminal device in an internet of things (internet of things, IoT) system. IoT is an important component of the development of future information technology; its main technical characteristic is connecting articles to a network through communication technology, thereby realizing man-machine interconnection and an intelligent network of interconnected things.
The terminal device in the embodiment of the present application may also be referred to as: a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user equipment, etc.
In the embodiment of the application, the terminal device or each network device comprises a hardware layer, an operating system layer running on the hardware layer, and an application layer running on the operating system layer. The hardware layer includes hardware such as a central processing unit (central processing unit, CPU), a memory management unit (memory management unit, MMU), and a memory (also referred to as a main memory). The operating system may be any one or more computer operating systems that implement business processes through processes (processes), such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a Windows operating system. The application layer comprises applications such as a browser, an address book, word processing software, and instant messaging software.
By way of example, fig. 1 shows a schematic diagram of a terminal device.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, and/or a USB interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only illustrative, and does not limit the structure of the terminal device. In other embodiments of the present application, the terminal device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the terminal device (such as audio data, phonebook, etc.), etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the terminal device and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor. For example, the method of the embodiments of the present application may be performed.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information. The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
Fig. 2 is a software configuration block diagram of a terminal device according to an embodiment of the present application. The layered architecture divides the software into several layers, each with clear roles and divisions of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and system libraries, a hardware abstraction layer, and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 2, the application packages may include applications such as camera, music, and phone. Applications may include system applications and third-party applications.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a resource manager, a notification manager, a content provider, a view system, and the like.
The window manager is used for managing window programs. The window manager may obtain the display screen size, determine whether there is a status bar, lock the screen, touch the screen, drag the screen, capture a screenshot, etc.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, give message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the terminal device vibrates, an indicator light flashes, etc.
The content provider is used for realizing the function of data sharing among different application programs, allowing one program to access the data in the other program, and simultaneously ensuring the safety of the accessed data.
The view system may be responsible for interface rendering and event handling for the application.
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.

The core library consists of two parts: one part is the performance functions that the Java language needs to call, and the other part is the core library of Android.

The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used for executing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: media libraries (media libraries), function libraries (function libraries), graphics processing libraries (e.g., openGL ES), etc.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media library may support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The function library provides multiple service API interfaces for the developer, and is convenient for the developer to integrate and realize various functions quickly.
The graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The hardware abstraction layer is an abstracted layer of structure between the kernel layer and the Android runtime. The hardware abstraction layer may be a package for hardware drivers that provides a unified interface for invocation by upper-layer applications. For example, in the embodiment of the application, the hardware abstraction layer can perform photographing mode matching and zoom capability matching when a camera fails, can report photographing capability to the camera application, can perform multi-shot policy matching, and can instruct the terminal device to open the corresponding camera and output images, etc.
The kernel layer is a layer between hardware and software. The kernel layer may include a driving layer, where the driving layer may include a camera driver, a chip platform driver, an audio driver, a display driver, and the like. The camera driver and the chip platform driver can be used for detecting the in-place condition of the camera and the like.
The data transmission flow related to photographing in the layered Android framework is illustrated below in combination with the photographing process of the terminal device.
In the driving layer of the terminal equipment, when the chip platform driver and the camera driver in the terminal equipment are started, the in-place condition of the physical camera can be detected.
If the camera driver detects that a certain physical camera fails, the camera driver can record the camera failure information, update corresponding photographing mode information, dynamically load corresponding zooming capability and dynamically load configuration files according to the physical camera failure information. The camera driver can also report updated information to the camera application in the application layer, and report the dynamically loaded configuration file to the multi-shot policy matching module in the hardware abstraction layer.
If the chip platform driver detects that a certain physical camera fails, the chip platform driver can modify the multi-shot configuration strategy of the terminal equipment and report the multi-shot configuration strategy to a multi-shot strategy matching module in the hardware abstraction layer.
In the application layer of the terminal device, after the camera application receives the update information reported by the camera driver of the driving layer, the camera application can display the corresponding photographing modes according to the photographing mode information and can display the effective zoom focal segments according to the zoom capability.
In the hardware abstraction layer of the terminal device, the multi-shot policy matching module can output multi-shot decisions according to the photographing mode, zoom capability, dynamically loaded configuration file, and the like reported by the camera driver and the chip platform driver of the driving layer, so as to open the corresponding physical cameras and control them to output images, completing normal photographing use after a certain physical camera fails.
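The multi-shot decision described above can be sketched as follows. This is a minimal illustration only, not the actual HAL implementation; the module, mode, and camera names are assumed for the example.

```python
# Hypothetical sketch of a multi-shot policy matcher: given a photographing
# mode (logical camera) and the set of working physical cameras reported by
# the drivers, decide which physical cameras to open.

WORKING = {"wide", "front", "tele"}  # assume the ultra-wide camera failed

# Each photographing mode (logical camera) lists its member physical cameras.
LOGICAL_CAMERAS = {
    "mode_e": ["wide", "ultra_wide", "tele"],
    "mode_g": ["wide", "tele"],
}

def match_multi_shot(mode: str) -> list:
    """Return the physical cameras to open for `mode`, skipping failed ones."""
    members = LOGICAL_CAMERAS.get(mode, [])
    return [cam for cam in members if cam in WORKING]

print(match_multi_shot("mode_e"))  # ['wide', 'tele']
```

In this sketch the surviving members of a mode are simply opened as-is; the real module also weighs zoom capability and the dynamically loaded configuration file, which are omitted here.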
It should be noted that the embodiment of the present application is only illustrated using the Android system; in other operating systems (such as a Windows system, an iOS system, etc.), the scheme of the present application can be implemented as long as the functions implemented by each functional module are similar to those of the embodiment of the present application.
In order to meet the requirement of a user on photographing, the terminal device may include a plurality of cameras, so as to improve photographing experience of the user.
In some implementations, the terminal device may use multiple cameras to mutually cooperate to form different photographing modes, and the user may use the different photographing modes to photograph, so as to improve photographing effects. For example, the photographing mode may include a normal photographing mode, a portrait mode, a large aperture mode, a night view mode, a normal video mode, a macro mode, and the like.
However, when a certain physical camera in the terminal device fails, the photographing modes depending on that camera cannot be used. As a result, the camera application may fail to open, or may flash back, freeze, or become stuck when the user clicks such a photographing mode, affecting the photographing experience of the user.
Therefore, the embodiment of the application provides a photographing processing method: when a certain photographing mode cannot be used due to the failure of a certain physical camera, the camera interface may not display that photographing mode, or may display it in a state in which it cannot be triggered to enter photographing. In this way, the camera application can be used normally, and problems such as the camera failing to open, flashing back, freezing, or becoming stuck are avoided.
Different physical camera combinations can be used for each photographing mode in the terminal equipment. Physical cameras include, for example, wide angle cameras, ultra wide angle cameras, tele cameras, depth cameras, and the like, which may also be referred to as physical camera modules. For example, the portrait mode may use a wide-angle camera and an ultra-wide-angle camera. It will be appreciated that the photographing mode may also be referred to as a logical camera, and that such a manner of photographing using a logical camera may be referred to as multi-shot photographing.
When a certain photographing mode uses a plurality of physical cameras to photograph and output images, the photographing mode can fuse the images captured by the physical cameras into one picture. For example, the portrait mode can use a wide-angle camera and an ultra-wide-angle camera; when photographing, each camera outputs its own image, and the photographing mode fuses the two images into one picture.
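The patent does not specify the fusion algorithm, so as a purely illustrative stand-in, a trivial per-pixel average of two same-sized grayscale frames can show the shape of the operation:

```python
# Illustrative only: a trivial per-pixel average of two grayscale frames
# stands in for the real multi-frame fusion performed by a photographing mode.
# Frames are represented as nested lists of pixel values.

def fuse_frames(frame_a, frame_b):
    """Fuse two same-sized grayscale frames by averaging each pixel."""
    return [
        [(pa + pb) // 2 for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

wide_frame = [[100, 120], [140, 160]]        # e.g. from the wide-angle camera
ultra_wide_frame = [[80, 100], [120, 140]]   # e.g. from the ultra-wide camera
print(fuse_frames(wide_frame, ultra_wide_frame))  # [[90, 110], [130, 150]]
```

A production fusion pipeline would first align the two fields of view and weight regions differently; none of that is described in the text and is omitted here.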
It can be understood that when the terminal device photographs, corresponding focal segment information can be used. Since different physical cameras can correspond to different photographing focal segments, the terminal device can switch between different physical cameras as the focal segment changes, improving the photographing effect and experience. The photographing focal segment may also be referred to as the photographing magnification.
As shown in fig. 3, when the user clicks the camera application icon 301 in a of fig. 3, the photographing interface 302 of the camera application in b of fig. 3 may be opened. The photographing interface 302 may include a photographing mode 303 and a focal segment adjustment 304 of the photographing mode. The photographing mode 303 may include a large aperture mode, a portrait mode, a photographing mode, a video mode, and the like, and the focal segment adjustment 304 may include different focal segments, where the default focal segment may be 1X.
When the user clicks on the focal segment adjustment 304 of a certain photographing mode in the camera interface in b of fig. 3, the camera interface may display the detailed focal segment 305. Illustratively, the detailed focal segment 305 may include photographing magnifications of [0.1X, 50X]. When the user selects a focal segment in the [0.1X, 1X] magnification range to take a picture, the photographing mode can use the ultra-wide-angle camera; when the user selects a focal segment in the [1X, 10X] magnification range, the photographing mode can use the wide-angle camera; when the user selects a focal segment in the [10X, 50X] magnification range, the photographing mode can use the telephoto camera.
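The focal-segment dispatch above can be sketched directly. The boundary values come from the example ranges in the text; since the stated ranges share their endpoints, the assignment of a boundary magnification (e.g. exactly 1X) to the higher range is an assumption made here for the sketch:

```python
# Sketch of mapping a photographing magnification (focal segment) to the
# physical camera used. Ranges follow the example in the text:
# [0.1X, 1X] ultra-wide, [1X, 10X] wide, [10X, 50X] telephoto.
# Which camera handles the shared endpoints is assumed, not specified.

def select_camera(zoom: float) -> str:
    """Map a photographing magnification to the physical camera used."""
    if 0.1 <= zoom < 1.0:
        return "ultra_wide"
    if 1.0 <= zoom < 10.0:
        return "wide"
    if 10.0 <= zoom <= 50.0:
        return "tele"
    raise ValueError(f"zoom {zoom} outside supported range [0.1X, 50X]")

print(select_camera(0.5))   # ultra_wide
print(select_camera(1.0))   # wide
print(select_camera(25.0))  # tele
```

This mapping is what lets the terminal device switch physical cameras transparently as the user drags the focal segment control.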
It can be appreciated that during use of the terminal device, a physical camera may be damaged, which may cause some photographing modes or focal segments to be unusable. Therefore, in the embodiment of the application, the physical cameras and the photographing modes can each be identified, so that the physical cameras and the photographing modes (also called logical cameras) have respective camera identifications (camera identification, cameraid), and the terminal device can update the photographing modes and/or the focal segments in combination with the cameraids of the physical cameras and the cameraids of the photographing modes.
For example, the correspondence between the cameras and the cameraids may be:

the physical camera a may include a wide-angle camera, and the corresponding cameraid may be 0.

The physical camera b may include a front camera, and the corresponding cameraid may be 1.

The physical camera c may include an ultra-wide-angle camera, and the corresponding cameraid may be 2.

The physical camera d may include a telephoto camera, and the corresponding cameraid may be 3.

The photographing mode e can use the wide-angle camera, the ultra-wide-angle camera, and the telephoto camera, and the corresponding cameraid may be 4.

The photographing mode f can use the wide-angle camera and the ultra-wide-angle camera, and the corresponding cameraid may be 5.

The photographing mode g can use the wide-angle camera and the telephoto camera, and the corresponding cameraid may be 6.
Illustratively, table 1 summarizes the cameraids of the cameras and photographing modes described above.
TABLE 1

Camera type          cameraid    Cameras included
Physical camera a    0           Wide angle
Physical camera b    1           Front
Physical camera c    2           Ultra wide angle
Physical camera d    3           Telephoto
Logical camera e     4           Wide angle, ultra wide angle, and telephoto
Logical camera f     5           Wide angle and ultra wide angle
Logical camera g     6           Wide angle and telephoto
It may be understood that the cameraid corresponding to each camera in the terminal device may be set by the terminal device in a self-defined manner; the cameraid corresponding to the same camera may differ between terminal devices, and the value of the cameraid corresponding to a specific camera is not limited in the embodiment of the present application.

In addition, the sorting numbers of the cameraids can be set by the terminal device in a self-defined manner, and the sorting numbers of the cameraids may differ between terminal devices. For example, the sorting numbers of the cameraids may increase sequentially from 0, increase sequentially from 1, or increase or decrease sequentially from a certain number according to a certain rule; the embodiment of the present application does not limit the manner of assigning the sorting numbers of the cameraids.
When the terminal device is started, the chip platform driver in the terminal device can detect the in-place condition of each physical camera. If a certain physical camera fails, the chip platform driver can modify the multi-camera configuration policy of the terminal device, reorder the cameraids, and determine, according to the cameraid reordering rules, whether to report the corresponding logical camera to the multi-camera policy matching module of the HAL layer of the terminal device.
In a possible implementation, taking as an example a logical camera composed of two or three physical cameras with the wide-angle camera as the main camera, the cameraid rearrangement rules when a certain camera fails are described, and specifically include:
(1) When the logical camera includes the wide-angle main camera and the main camera fails, no matter how many physical cameras the logical camera includes, the chip platform driver may not report the logical camera to the multi-camera policy matching module.
(2) When the logical camera includes three physical cameras, if one non-main camera is damaged, the chip platform driver may still report the logical camera to the multi-camera policy matching module; if two non-main cameras are damaged, the chip platform driver may not report the logical camera to the multi-camera policy matching module.
(3) When the logical camera includes two physical cameras and either one of them is damaged, the chip platform driver may not report the logical camera to the multi-camera policy matching module.
(4) When a physical camera or logical camera is not reported, the cameraids of the other cameras in the terminal device need to be rearranged; for the specific arrangement rules, refer to the related description of the sorting numbers of the cameraids, which is not repeated here.
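Rules (1)-(4) can be expressed as a short runnable sketch. The data structures and names are hypothetical; the camera inventory follows table 1, and for a single failed camera the sketch reproduces the renumbering the embodiment describes:

```python
# Hypothetical sketch of cameraid rearrangement rules (1)-(4).
# Physical cameras and logical cameras (with their main camera and members)
# follow the wide-angle-as-main example in the text.

PHYSICAL = {"a": "wide", "b": "front", "c": "ultra_wide", "d": "tele"}
LOGICAL = {  # name -> (main camera, member physical cameras)
    "e": ("wide", ["wide", "ultra_wide", "tele"]),
    "f": ("wide", ["wide", "ultra_wide"]),
    "g": ("wide", ["wide", "tele"]),
}

def reassign_ids(failed: str) -> dict:
    """Return the cameraid table after `failed` physical camera is dropped."""
    # Failed physical cameras are simply not reported.
    reported = [name for name, cam in PHYSICAL.items() if cam != failed]
    for name, (main, members) in LOGICAL.items():
        broken = sum(1 for m in members if m == failed)
        if main == failed:
            continue  # rule (1): main camera failed -> do not report
        if len(members) == 3 and broken >= 2:
            continue  # rule (2): two non-main cameras broken -> do not report
        if len(members) == 2 and broken:
            continue  # rule (3): either of two cameras broken -> do not report
        reported.append(name)
    # Rule (4): renumber the surviving cameras consecutively from 0.
    return {name: i for i, name in enumerate(reported)}

print(reassign_ids("ultra_wide"))
# {'a': 0, 'b': 1, 'd': 2, 'e': 3, 'g': 4}
```

Running it with the ultra-wide-angle camera failed yields the same cameraids as table 2 below: physical camera c and logical camera f disappear, and the telephoto camera and logical cameras e and g move up to cameraids 2, 3, and 4.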
For example, taking the failure of the ultra-wide-angle camera among the physical cameras as an example, as shown in table 2 below, after the ultra-wide-angle camera fails, the correspondence between the cameras and the cameraids in the multi-camera configuration policy may be:
the physical camera a uses the wide-angle camera; because it is not affected by the failure of the ultra-wide-angle camera, the cameraid corresponding to the physical camera a is still 0.

The physical camera b uses the front camera; because the front camera is not affected by the failure of the ultra-wide-angle camera, the cameraid corresponding to the physical camera b is still 1.

The physical camera c uses the ultra-wide-angle camera; because the ultra-wide-angle camera has failed, the chip platform driver may not report the physical camera c, so the multi-camera configuration policy may not include the physical camera c.

The physical camera d uses the telephoto camera and is not affected by the failure of the ultra-wide-angle camera, but its cameraid can be rearranged according to cameraid rearrangement rule (4), so the cameraid corresponding to the physical camera d can change from 3 to 2.

The logical camera e includes the wide-angle, ultra-wide-angle, and telephoto cameras; because the failed ultra-wide-angle camera is a non-main camera, according to cameraid rearrangement rules (2) and (4), the chip platform driver can continue to report the logical camera e, and the cameraid corresponding to the logical camera e can be 3.

The logical camera f includes the wide-angle and ultra-wide-angle cameras; because the ultra-wide-angle camera has failed, according to cameraid rearrangement rule (3), the chip platform driver may not report the logical camera f, and the multi-camera configuration policy may not include the logical camera f.

The logical camera g includes the wide-angle and telephoto cameras and is not affected by the failure of the ultra-wide-angle camera, but its cameraid can be rearranged according to cameraid rearrangement rule (4), and the corresponding cameraid can be 4.
TABLE 2

Camera type          cameraid    Cameras included
Physical camera a    0           Wide angle
Physical camera b    1           Front
Physical camera d    2           Telephoto
Logical camera e     3           Wide angle and telephoto
Logical camera g     4           Wide angle and telephoto
Optionally, under the aforementioned cameraid rearrangement rule (1), if the main camera fails, the logical camera may also use another camera to replace the main camera, so that the chip platform driver can still report the logical camera to the multi-camera policy matching module. For example, if the main camera fails, the logical camera may use a physical camera such as the ultra-wide-angle or telephoto camera as the main camera, or may use a plurality of physical cameras such as the ultra-wide-angle and telephoto cameras cooperating with each other as the logical camera; how the logical camera uses the remaining normal physical cameras is not limited by the embodiment of the present application.
According to the embodiment of the application, when a certain physical camera of the terminal device fails, the multi-camera policy matching module can update the multi-camera configuration policy without recombining the remaining normal cameras into new combinations. This simplifies the implementation logic of the multi-camera configuration policy, reduces computation, retains each photographing mode and zoom focal segment to the maximum extent, and allows the camera to be used normally, avoiding situations such as the camera application failing to open or flashing back.
Fig. 4 shows a process flow of multi-shot logical camera decoupling.
When the terminal device is started, the chip platform driver in the terminal device can detect the in-place condition of each physical camera. If a certain physical camera fails, the chip platform driver can modify the multi-shot configuration policy of the terminal device, reorder the cameraids, and determine, according to the cameraid reordering rules, whether to report the corresponding photographing mode to the multi-shot policy matching module of the HAL layer of the terminal device.
For example, the common photographing mode may include a wide-angle camera, an ultra-wide-angle camera, and a telephoto camera, where the cameraid of the wide-angle camera may be 0, the cameraid of the ultra-wide-angle camera may be 2, and the cameraid of the telephoto camera may be 3; the physical cameras of the common photographing mode are thus ordered as 0, 2, and 3. If the ultra-wide-angle camera is damaged, the multi-camera configuration policy can rearrange the physical cameras and update the ordering of the cameraids: the cameraid of the telephoto camera moves up and changes to 2, and the ordering of the physical cameras in the common photographing mode changes to 0 and 2.
When the terminal device is started, the camera driver of the terminal device driving layer can also detect the in-place condition of each physical camera. When detecting that a certain physical camera is not in place, the camera driver can record the absence information, update the corresponding photographing mode and the cameraid corresponding to the photographing mode according to the absence information of the physical camera, and dynamically load the corresponding zoom_cap zoom capability. The camera driver may also report the updated information to the camera application.
For example, if the camera driver detects that the ultra-wide-angle camera fails, the camera driver can modify the modes involving the ultra-wide-angle camera; if the camera driver detects that the telephoto camera fails, it can modify the modes involving the telephoto camera; if the camera driver detects that the front camera fails, it can modify the modes involving the front camera. Therefore, the embodiment of the application can decouple the physical cameras regardless of whether the terminal device configures the multi-camera strategy for the front or rear cameras, ensuring normal use of the camera application.
In a possible implementation, the chip platform driver in the terminal device or the camera driver of the terminal device driving layer can detect the in-place condition of the physical camera according to the power-on condition of the physical camera, the enabling state of the physical camera, and other modes. For example, if the terminal device can normally power up a certain physical camera, the physical camera is in place; if the terminal equipment cannot power up a certain physical camera, the physical camera is out of position. Or if the enabling state of a certain physical camera is 1, the physical camera is in place; if the enabling state of a certain physical camera is 0, the physical camera is out of position. The embodiment of the application is not limited by the specific way of detecting the in-place condition of the physical camera.
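One of the detection methods mentioned above (enable state 1 means in place, 0 means not in place) can be sketched as follows. The enable-state table is a hypothetical stand-in for whatever register or driver state the real chip platform driver reads:

```python
# Hedged sketch: detecting the in-place condition of each physical camera
# from a hypothetical enable-state table, per the convention in the text
# (enable state 1 = in place, 0 = not in place).

ENABLE_STATE = {"wide": 1, "front": 1, "ultra_wide": 0, "tele": 1}

def detect_in_place() -> dict:
    """Return the in-place status (True/False) for every physical camera."""
    return {cam: state == 1 for cam, state in ENABLE_STATE.items()}

status = detect_in_place()
print([cam for cam, ok in status.items() if not ok])  # ['ultra_wide']
```

In the embodiment, the result of this detection is what the chip platform driver and camera driver record and synchronize with each other before updating photographing modes and cameraids.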
When detecting the in-place condition of the physical cameras, the chip platform driver in the terminal device and the camera driver in the driving layer of the terminal device can each perform the detection independently, and the detection information about the in-place condition of the physical cameras can also be exchanged between the chip platform driver and the camera driver. For example, the chip platform driver in the terminal device may synchronize its detection result to the camera driver in the driving layer after detecting the in-place condition of a physical camera, or the camera driver in the driving layer may synchronize its detection result to the chip platform driver after detecting the in-place condition of a physical camera.
In a possible implementation, besides detecting the in-place condition of the physical cameras when the terminal device is started, the chip platform driver or the camera driver in the terminal device can also detect the in-place condition of the physical cameras periodically, or the detection can be triggered when initialization of the camera application is completed. The embodiment of the application does not limit the specific time at which the in-place condition of the physical cameras is detected.
After receiving the updated information reported by the camera driver, the camera application can display the corresponding photographing modes according to the photographing modes, cameras, and the like, and can display the effective zoom focal segments according to the zoom_cap zoom capability.
For example, if the logical camera of the portrait mode includes a wide-angle camera and an ultra-wide-angle camera, when the camera driver detects that the ultra-wide-angle camera is not in place, the camera driver may modify the modes involving the ultra-wide angle. According to the above-mentioned camera id rearrangement rule (3), the camera driver may not report the portrait mode to the camera application, and the camera interface then does not display the portrait mode.
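A simplified sketch of this mode filtering follows, assuming, as an illustration only, that any mode requiring an out-of-place camera is withheld from the camera application; the mode and camera names are hypothetical, and the actual rearrangement rules of the embodiment are more nuanced.

```python
# Hypothetical mode-to-camera table; the real mapping lives in the driver.
MODE_CAMERAS = {
    "portrait": {"wide", "ultra_wide"},
    "normal": {"wide"},
    "night": {"wide"},
}

def reportable_modes(failed):
    """Return the modes the driver would still report, under the simplifying
    assumption that a mode needing any failed camera is withheld."""
    return [m for m, cams in MODE_CAMERAS.items() if not (cams & failed)]

modes = reportable_modes({"ultra_wide"})
# portrait is withheld because its logical camera needs the ultra-wide camera
```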
As shown in fig. 5, a normal camera photographing interface may be as shown in the interface in a of fig. 5, and the camera photographing interface may include a portrait mode 501. When the camera driver detects that the ultra-wide-angle camera is not in place, the camera driver may not report the portrait mode 501 to the camera application, and the camera interface may be as shown in the interface in b of fig. 5, which does not display the portrait mode.
Therefore, the camera application can refresh the photographing modes of the photographing interface when a physical camera fails, so that an invalid photographing mode is not displayed on the photographing interface. This prevents the camera from failing to open, lagging, or freezing after an invalid photographing mode is clicked, which would affect the use experience of the user.
Optionally, when the camera driver does not report the portrait mode to the camera application, the camera interface may instead display the portrait mode in a state in which photographing cannot be triggered.
As shown in fig. 6, a normal camera photographing interface may be as shown in the interface in a of fig. 6, and the camera photographing interface may include a portrait mode 601. When the camera driver detects that the ultra-wide-angle camera is out of place, the camera driver may not report the portrait mode 601 to the camera application, and the camera interface may display the portrait mode in a non-photographable state as in the interface in b of fig. 6, where the non-photographable state may be distinguished by differences in font color, size, gray scale, and the like. The specific display of the non-photographable state is not limited herein.
Optionally, the user may still click the portrait mode 601. When the user clicks the portrait mode 601, the photographing interface may display a popup window to prompt that the portrait mode is unavailable, and the user decides whether to continue displaying the portrait mode. The popup window may also be called a prompt box or a popup frame.
As shown in c of fig. 6, a popup window 602 may be displayed on the interface of the terminal device, and a prompt title 603, prompt information 604, an ok button 605, and a cancel button 606 may be displayed in the popup window 602. The prompt title 603 may include "Cancel displaying portrait mode?", and the prompt information 604 may include "This photographing mode is disabled due to a camera failure." The specific content of the prompt title 603 and the prompt information 604 is not limited by the embodiment of the application.
When the terminal device receives the operation of the user triggering the ok button 605, the terminal device may cancel the display of the popup window 602, and the portrait mode is not displayed on the interface, as shown in the interface in d of fig. 6. When the terminal device does not receive a trigger operation from the user for a long time, or the terminal device receives the operation of the user triggering the cancel button 606, the terminal device may cancel the display of the popup window 602, and the interface displays the portrait mode in the non-photographable state.
Therefore, when a physical camera fails, the camera application can use the popup window to let the user know in time why a certain photographing mode cannot be used, and let the user decide whether the photographing interface continues to display the unusable photographing mode, thereby improving user experience.
For example, the zoom_cap zoom capability of the normal photographing mode may include [0.1X, 50X], where the focal segment below 1X requires the ultra-wide-angle camera. When the camera driver detects that the ultra-wide-angle camera is out of place, the [0.1X, 1X] part of the zoom_cap zoom capability cannot be used. The camera driver can modify the focal segment involving the ultra-wide angle and not report the focal segment corresponding to the ultra-wide angle to the camera application, so that the zoom_cap zoom capability of the normal photographing mode on the camera interface becomes [1X, 50X].
As shown in fig. 7, in the photographing mode interface of the camera, the normal focal segment selection interface may be as shown by focal segment selection 701 in a of fig. 7, and the zoom_cap zoom capability of the photographing mode may include [0.1X, 50X]. When the camera driver detects that the ultra-wide-angle camera is out of place, the camera driver may not report the [0.1X, 1X] focal segment corresponding to the ultra-wide angle to the camera application, and the camera interface may change the zoom_cap zoom capability of the photographing mode to [1X, 50X], as shown by focal segment selection 702 in b of fig. 7.
Therefore, the camera application can update the effective focal segments of each photographing mode when a physical camera fails, so that invalid focal segments are not displayed on the photographing interface. This prevents the camera from failing to open, lagging, or freezing after an invalid focal segment is clicked, which would affect the use experience of the user.
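The zoom_cap trimming in the example above can be sketched as follows. The [0.1X, 50X] range and the 1X boundary of the ultra-wide segment come from the text; representing zoom_cap as a (low, high) tuple and the function name are illustrative assumptions.

```python
# Sketch: when the ultra-wide camera is out of place, drop the sub-1X part
# of the zoom capability. Tuple representation is an assumption.
FULL_ZOOM_CAP = (0.1, 50.0)
ULTRA_WIDE_SEGMENT = (0.1, 1.0)  # focal segment served by the ultra-wide camera

def trim_zoom_cap(zoom_cap, ultra_wide_failed):
    lo, hi = zoom_cap
    if ultra_wide_failed:
        lo = max(lo, ULTRA_WIDE_SEGMENT[1])  # everything below 1X is unusable
    return (lo, hi)

trimmed = trim_zoom_cap(FULL_ZOOM_CAP, ultra_wide_failed=True)
# the reported capability shrinks from [0.1X, 50X] to [1X, 50X]
```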
In addition, the camera driver can dynamically load a configuration file IdMap according to the out-of-place information of a physical camera. The configuration file IdMap can record the camera id and the name corresponding to each physical camera. When the camera driver detects that a physical camera has failed, the camera ids in the configuration file IdMap can be reordered accordingly, for use when the multi-shot policy matching module of the HAL layer of the terminal device performs multi-shot rule matching.
The multi-shot policy matching module can determine the matching of the multi-shot policy according to the photographing mode of the camera, the camera ordering, the dynamically loaded IdMap configuration file, and the like, and output a multi-shot decision, thereby opening the corresponding physical cameras and controlling the corresponding physical camera images, so that the camera can still be used normally for photographing after a physical camera fails. That is, when the camera is about to take a picture, the multi-shot policy matching module may determine, according to the multi-shot policy, which physical cameras need to be opened and which physical cameras take the picture.
For example, when a user opens the camera application and takes a picture in the default normal photographing mode, the multi-shot policy matching module can determine that the default focal length of the current normal photographing mode is 1X, so the main camera can be used to capture the image. The terminal device can then open the main physical camera, and the user can click the photographing button to take the picture.
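A minimal sketch of the multi-shot decision for this default case follows; the IdMap contents, the policy table, and the segment boundaries are illustrative assumptions, not the actual HAL-layer data.

```python
# Hypothetical dynamically loaded IdMap: camera id -> camera name.
ID_MAP = {0: "wide", 1: "ultra_wide", 2: "tele"}

# Hypothetical policy table: (mode, zoom lower bound inclusive,
# zoom upper bound exclusive, camera id to open).
POLICY = [
    ("normal", 0.1, 1.0, 1),    # ultra-wide serves below 1X
    ("normal", 1.0, 10.0, 0),   # main (wide) camera serves 1X..10X
    ("normal", 10.0, 100.0, 2), # tele serves 10X and above
]

def match_camera(mode, zoom, id_map):
    """Output the multi-shot decision: which physical camera to open."""
    for m, lo, hi, cam_id in POLICY:
        if m == mode and lo <= zoom < hi and cam_id in id_map:
            return id_map[cam_id]
    return None  # no usable camera for this mode/zoom

cam = match_camera("normal", 1.0, ID_MAP)
# at the default 1X focal length, the main (wide) camera is opened
```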
The method according to the embodiment of the present application will be described in detail by way of specific examples. The following embodiments may be combined with each other or implemented independently, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 8 shows a photographing processing method according to an embodiment of the present application. The method comprises the following steps:
S801, at a first moment, the terminal device receives a first operation for a camera application.
In the embodiment of the present application, the first moment may be the moment at which the terminal device receives the first operation of opening the camera application. The first operation may be an operation of the user clicking the camera application icon, or an operation of the user instructing the terminal device by voice to open the camera application; the embodiment of the application does not limit the specific implementation of the first operation of opening the camera application.
S802, in response to the first operation, the terminal device displays a first interface, wherein the first interface comprises a first photographing mode, a second photographing mode and a third photographing mode; the first photographing mode, the second photographing mode and the third photographing mode are all displayed in a state in which photographing can be triggered.
In the embodiment of the present application, the first interface may be understood as a photographing interface of a camera application or an interface of a camera application displaying a photographing mode. The photographing mode may include a normal photographing mode, a portrait mode, a large aperture mode, a night view mode, a normal video mode, a macro mode, and the like in the above embodiments. The first photographing mode, the second photographing mode, and the third photographing mode may be different photographing modes.
S803, at a second moment, the terminal equipment receives a second operation for the camera application again; the second time is later than the first time.
In the embodiment of the present application, the second moment may be the moment at which the terminal device receives the second operation of performing photographing. The second operation may be an operation of the user clicking a photographing button of the camera application, or an operation of the user instructing the camera application by voice to take a picture; the embodiment of the application does not limit the specific implementation of the second operation of performing photographing.
S804, in response to the second operation, the terminal device displays a second interface, wherein the second interface comprises the first photographing mode and the third photographing mode, and the first photographing mode and the third photographing mode are both displayed in a state in which photographing can be triggered; the second interface does not include the second photographing mode.
In the embodiment of the application, the second interface may be a photographing interface of the camera application or an interface of the camera application displaying a photographing mode.
It is understood that the types of photographing modes in the second interface may be less than the types of photographing modes in the first interface, e.g. the first interface comprises the second photographing mode but the second interface does not comprise the second photographing mode.
The type of photographing mode in the second interface may be equal to the type of photographing mode in the first interface, but the photographing mode state in the second interface may be different from the photographing mode state in the first interface. For example, the second photographing mode in the first interface may be a triggerable photographing state, and the second photographing mode in the second interface may be a non-triggerable photographing state. The display differences of the different shooting states may refer to the related descriptions in the embodiment corresponding to b of fig. 6, and will not be described again.
In the embodiment of the application, the camera application can judge whether a certain photographing mode can be used normally, and when a certain photographing mode is abnormal, the camera application can refrain from displaying that photographing mode, so that the camera application can be used normally without problems such as failure to open, crashing, freezing, or lagging.
Optionally, on the basis of the embodiment corresponding to fig. 8, the first interface may further include first focal segment information, and the second interface may further include second focal segment information, where the magnification range in the second focal segment information is smaller than the magnification range in the first focal segment information.
In the embodiment of the application, the camera application can judge whether each focal segment in a certain photographing mode can be used normally, and when a certain focal segment cannot be used, the camera application can refrain from displaying the unusable focal segment. This prevents the camera application from behaving abnormally after an invalid focal segment is clicked, which would affect the use experience of the user.
Optionally, on the basis of the embodiment corresponding to fig. 8, before the terminal device displays the second interface in response to the second operation in S804, the method may include: in response to the second operation, the terminal device displays a third interface, wherein the third interface comprises the first photographing mode and the third photographing mode, the first photographing mode and the third photographing mode are both displayed in a state in which photographing can be triggered, and the third interface comprises the second photographing mode displayed in a state in which photographing cannot be triggered; the terminal device receives a third operation on the second photographing mode at the third interface; in response to the third operation, the terminal device displays a fourth interface, which may include: information for prompting that the second photographing mode is unavailable and/or identification information of the failed camera in the terminal device.
In the embodiment of the application, the third interface may be a photographing interface of the camera application or an interface of the camera application displaying photographing modes, but the third interface may include a photographing mode in a state in which photographing cannot be triggered. The specific third interface may refer to the related description in the embodiment corresponding to b in fig. 6, which is not repeated.
The third operation may be an operation in which the user clicks a photographing mode that cannot be triggered to enter a photographing state.
The fourth interface may be a popup window interface for prompting the user that the photographing mode cannot be used, and the specific fourth interface may refer to the related description in the embodiment corresponding to c in fig. 6, which is not described herein.
In the embodiment of the application, the camera application can use the popup window to let the user know in time that a certain photographing mode or camera cannot be used, so that the user can repair or replace the terminal device in time.
Optionally, on the basis of the embodiment corresponding to fig. 8, the fourth interface further includes information for prompting to cancel the display of the second photographing mode, and a target button; the method may further comprise: the terminal equipment receives a fourth operation aiming at the target button; and responding to the fourth operation, and displaying the second interface by the terminal equipment.
In the embodiment of the present application, the specific fourth interface may refer to the related description in the embodiment corresponding to c in fig. 6, which is not repeated. The fourth operation may be an operation instructing the camera application not to display the second photographing mode; for example, the fourth operation may be an operation of the user clicking the ok button of the popup window. After receiving the fourth operation, the camera application may display the second interface, that is, the second photographing mode may not be displayed in the interface of the camera application.
In the embodiment of the application, the camera application can let the user determine whether the photographing interface continues to display the unusable photographing mode, which can improve user experience.
Optionally, on the basis of the embodiment corresponding to fig. 8, the terminal device includes an ultra-wide-angle camera, a wide-angle camera and/or a tele camera; before the terminal device receives the second operation for the camera application in S803, the method may further include: the terminal device determines that the ultra-wide-angle camera, the wide-angle camera and/or the tele camera has failed; when the ultra-wide-angle camera fails, the second focal segment information does not include the 0.1X-1X magnification range; when the wide-angle camera fails, the second focal segment information does not include the 1X-10X magnification range; when the tele camera fails, the second focal segment information does not include the 10X-100X magnification range.
In the embodiment of the application, the camera application can update the effective focal segments of each photographing mode according to the failure condition of the cameras, so that invalid focal segments are not displayed on the photographing interface, which prevents the camera from failing to open, lagging, freezing, and the like after an invalid focal segment is clicked.
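The per-camera magnification ranges listed above can be sketched as follows; which range each camera serves follows the text, while the data layout and camera names are assumptions for illustration.

```python
# Magnification range served by each camera, per the ranges given in the text.
CAMERA_SEGMENTS = {
    "ultra_wide": (0.1, 1.0),
    "wide": (1.0, 10.0),
    "tele": (10.0, 100.0),
}

def remaining_segments(failed_cameras):
    """Focal segments still usable after the given cameras fail."""
    return {cam: seg for cam, seg in CAMERA_SEGMENTS.items()
            if cam not in failed_cameras}

segs = remaining_segments({"tele"})
# the 10X-100X range is dropped; the 0.1X-10X ranges remain available
```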
Optionally, on the basis of the embodiment corresponding to fig. 8, the terminal device includes a plurality of cameras, where N cameras among the plurality of cameras are required when the second photographing mode operates, N being a positive integer; before the terminal device receives the second operation for the camera application again, the method may include: the terminal device determines that M target cameras have failed, M being a positive integer with M less than or equal to N; the terminal device determines that the type and/or the number of the target failure cameras meet a preset condition, where the target failure cameras are the cameras that appear both among the M target cameras and among the N cameras; and the terminal device disables the second photographing mode.
In the embodiment of the application, if the cameras used in the second photographing mode include a camera that the terminal device has determined to be failed, the terminal device can disable the second photographing mode when the failed cameras meet a certain preset condition. The specific preset condition may refer to the related description of the camera rearrangement rule applied when a camera fails, which is not repeated.
The terminal device determines the unusable photographing modes according to the failed cameras, so that decoupling of the physical cameras can be achieved when the terminal device is configured with a multi-shot strategy, and by disabling the second photographing mode the terminal device can ensure normal use of the camera application.
Optionally, on the basis of the embodiment corresponding to fig. 8, the determining, by the terminal device, that the type of the target failure camera and/or the number of the target failure cameras meet the preset condition may include: the terminal equipment determines that the type of the target failure camera is the main camera, and/or the terminal equipment determines that the number of the target failure cameras is greater than or equal to half of N.
In the embodiment of the present application, the preset condition may refer to the related description of the above-mentioned camera rearrangement rule applied when a camera fails, which is not repeated. The terminal device determines the disabled photographing modes according to the preset condition, which avoids having to permute and combine the remaining normal cameras, thereby simplifying the implementation logic of the multi-shot configuration strategy, reducing computation, and retaining each photographing mode to the maximum extent.
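The preset condition described above can be sketched as follows, assuming for illustration that the main camera is the wide-angle camera; the camera names and data types are hypothetical.

```python
# Sketch of the preset condition: disable a mode when the failed cameras
# include the main camera, or when at least half of the mode's N required
# cameras have failed. Camera names are illustrative assumptions.

def should_disable(required, failed, main_camera="wide"):
    """required: cameras the mode needs; failed: cameras found out of place."""
    overlap = set(required) & set(failed)  # the "target failure cameras"
    if main_camera in overlap:
        return True
    return len(overlap) >= len(required) / 2

# Portrait needs wide + ultra_wide; losing ultra_wide alone is half of N=2.
flag = should_disable({"wide", "ultra_wide"}, {"ultra_wide"})
# flag == True: the mode is disabled
```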
Optionally, on the basis of the embodiment corresponding to fig. 8, the disabling, by the terminal device, of the second photographing mode may include:
The camera driver of the terminal device updates the photographing modes and reports the updated information to the camera application, where the updated information does not include the second photographing mode.
In the embodiment of the present application, the camera driving and updating the photographing mode and reporting the updated information to the camera application may refer to the related description in the process flow of decoupling the multi-camera logical camera in fig. 4, which is not repeated.
The camera application can display the corresponding photographing modes according to the updated information reported by the camera driver, so that the camera application can be used normally without problems such as failure to open, crashing, freezing, or lagging caused by an abnormal photographing mode.
Optionally, on the basis of the embodiment corresponding to fig. 8, before the M target cameras fail, the terminal device uses the target failure camera when taking a picture in the third photographing mode; after the M target cameras fail, the terminal equipment does not use the target failure cameras when shooting in the third shooting mode.
In the embodiment of the present application, the cameras used in the third photographing mode may include a target failure camera, where the third photographing mode satisfies the condition in camera rearrangement rule (2) of the above embodiment under which the logical camera may still be reported, so that the third photographing mode may remain in a state in which photographing can be triggered on the camera interface.
The terminal equipment can determine according to the camera rearrangement rule, and under the condition that some cameras fail, certain photographing modes using the failed cameras can still be used, so that each photographing mode can be reserved to the maximum extent.
Fig. 9 shows another photographing processing method according to an embodiment of the present application. The method comprises the following steps:
S901, the terminal device determines a failed camera.
In the embodiment of the present application, the terminal device may determine the failed camera according to the power-up condition of a physical camera, the enabling state of a physical camera, and the like, which may refer to the related description in the embodiment corresponding to fig. 4 and is not repeated.
S902, the camera driver of the terminal device updates the photographing modes and/or focal segments, and reports the updated information to the camera application, where the updated information does not include the photographing modes and/or focal segments that cannot be used because of the failed camera.
In the embodiment of the present application, the camera driving and updating the photographing mode and reporting the updated information to the camera application may refer to the related description in the process flow of decoupling the multi-camera logical camera in fig. 4, which is not repeated.
S903, the camera application displays each photographing mode and/or zoom focal segment according to the updated information reported by the camera driver.
In the embodiment of the application, the camera application can display the corresponding photographing modes and/or focal segments according to the updated information reported by the camera driver, so that the camera application can be used normally without problems such as failure to open, crashing, freezing, or lagging caused by an abnormal photographing mode and/or focal segment.
Optionally, on the basis of the embodiment corresponding to fig. 9, the determining of the failed camera by the terminal device in S901 may include: the chip platform driver and/or the camera driver of the terminal device detect the in-place condition of the cameras of the terminal device to obtain the failed camera.
In the embodiment of the present application, the terminal device may detect the in-place situation of the camera of the terminal device by using the chip platform driver and/or the camera driver according to the power-on situation of a certain physical camera, the mode of obtaining the enabling state of a certain physical camera, etc., and the specific description may refer to the related description in the embodiment corresponding to fig. 4, and will not be repeated. In this way, the terminal device can decouple the failed physical camera, thereby ensuring the normal use of the camera application.
Optionally, on the basis of the embodiment corresponding to fig. 9, each physical camera and each photographing mode in the terminal device corresponds to a respective camera identification. After the failed camera is obtained, the method may further include: the chip platform driver updates the camera identifications of the physical cameras and the camera identifications of the photographing modes in the terminal device, and reports the updated camera identifications to the multi-shot policy matching module of the terminal device, where the updated camera identifications do not include the identifications of the failed cameras or the identifications of the photographing modes that cannot be used because of the failed cameras, and the multi-shot policy matching module is used to manage the correspondence between the photographing modes and the physical cameras used in the photographing modes.
In the embodiment of the present application, the chip platform driver updates the camera identifier of each physical camera and the camera identifier of each photographing mode, which can refer to the related description in the embodiment corresponding to the table 2, and will not be described in detail.
The multi-camera strategy matching module of the terminal equipment can output a multi-camera decision according to the updated camera identifications driven by the chip platform, so that corresponding physical cameras are controlled to take pictures, and normal shooting use of the cameras after a certain physical camera fails is completed.
Optionally, on the basis of the embodiment corresponding to fig. 9, the updating of the photographing modes and/or focal segments by the camera driver of the terminal device in S902 may include: the camera driver of the terminal device updates L photographing modes to Q photographing modes and/or updates the first focal segment information to the second focal segment information; the Q photographing modes are the L photographing modes with the unusable photographing modes caused by the failed camera removed, the magnification range in the second focal segment information is smaller than the magnification range in the first focal segment information, L is a positive integer, Q is a positive integer, and L is greater than or equal to Q.
In the embodiment of the present application, the camera driving and updating the photographing mode and/or the focal segment may refer to the related description in the process flow of decoupling the multi-camera logical camera in fig. 4, which is not repeated.
The camera application in the terminal device can display the corresponding photographing modes and effective zoom focal segments according to the updated photographing modes and/or focal segments reported by the camera driver, so that invalid photographing modes and/or focal segments are not displayed on the photographing interface. This prevents the camera from failing to open, lagging, or freezing after an invalid photographing mode and/or focal segment is clicked, which would affect the use experience of the user.
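The update of step S902 can be sketched as follows; reducing L photographing modes to Q modes and narrowing the focal segment information follow the text, while the mode names and the low-end trimming rule are illustrative assumptions.

```python
# Sketch of the S902 update: drop unusable modes (L -> Q) and trim the
# reported focal range. Mode names and the trimming rule are hypothetical.

def update_report(modes, unusable, focal_info, failed_segment):
    """Return (Q photographing modes, second focal segment information)."""
    q_modes = [m for m in modes if m not in unusable]  # L modes -> Q modes
    lo, hi = focal_info
    seg_lo, seg_hi = failed_segment
    if lo >= seg_lo:  # assume the failed segment sits at the low end
        lo = max(lo, seg_hi)
    return q_modes, (lo, hi)

modes, focal = update_report(
    ["normal", "portrait", "night"],  # L = 3 modes before the failure
    {"portrait"},                     # mode unusable due to the failed camera
    (0.1, 50.0),                      # first focal segment information
    (0.1, 1.0),                       # segment served by the failed camera
)
# Q = 2 modes remain and the focal range narrows to [1X, 50X]
```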
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and provide corresponding operation entries for the user to select authorization or rejection.
The foregoing description of the solution provided by the embodiments of the present application has been mainly presented in terms of a method. To achieve the above functions, it includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the present application may be implemented in hardware or a combination of hardware and computer software, as the method steps of the examples described in connection with the embodiments disclosed herein. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the device for realizing the method according to the method example, for example, each functional module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
Embodiments of the present application also provide a chip or chip system applied to a terminal device. The chip or chip system may include one or more processors configured to: detect the in-place status of the cameras of the terminal device and determine any failed camera; and update the camera identifiers of the physical cameras and of the photographing modes in the terminal device and report the updated camera identifiers to the terminal device, where the updated camera identifiers do not include identifiers of failed cameras or identifiers of photographing modes rendered unusable by the failed cameras.
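The chip-side update described above can be illustrated with a minimal sketch. All names here (camera IDs, the mode-to-camera table, the function name) are illustrative assumptions, not identifiers from the patent; the sketch only shows the filtering idea: drop failed cameras from the reported camera list, and drop any mode that depends on a failed camera.

```python
# Hypothetical mapping of photographing modes to the physical cameras they
# require (illustrative names, not from the patent).
MODE_REQUIREMENTS = {
    "photo": {"wide"},
    "portrait": {"wide", "tele"},
    "super_macro": {"ultra_wide"},
}

def update_camera_ids(all_cameras, failed_cameras):
    """Return (usable camera IDs, usable mode IDs) after removing failures."""
    failed = set(failed_cameras)
    # Failed cameras are excluded from the identifiers reported upward.
    usable_cameras = [c for c in all_cameras if c not in failed]
    # A mode becomes unusable if any camera it requires has failed.
    usable_modes = [
        mode for mode, required in MODE_REQUIREMENTS.items()
        if not (required & failed)
    ]
    return usable_cameras, usable_modes

cameras, modes = update_camera_ids(
    ["ultra_wide", "wide", "tele"], failed_cameras={"tele"}
)
# cameras -> ["ultra_wide", "wide"]; "portrait" is dropped from the modes
```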
For the chip or chip system, reference may be made to the related description in the embodiment corresponding to fig. 4; details are not repeated here.
In the embodiments of the present application, the terminal device can output a multi-shot decision according to the updated camera identifiers from the chip or chip system, thereby controlling the corresponding physical cameras to take pictures and keeping the camera usable for normal photographing after a physical camera fails.
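A multi-shot decision of this kind can be sketched as follows. The function and data names are assumptions for illustration: the terminal resolves a requested mode against the camera identifiers the chip reported, and only opens physical cameras that are still present, consistent with claim 9's behavior of no longer using a failed camera in a mode that survives.

```python
def multi_shot_decision(mode, mode_to_cameras, reported_camera_ids):
    """Pick the physical cameras to open for a mode, skipping absent IDs."""
    wanted = mode_to_cameras.get(mode, [])
    return [cam for cam in wanted if cam in reported_camera_ids]

open_list = multi_shot_decision(
    "photo",
    {"photo": ["wide", "tele"]},          # tele may assist, e.g. at high zoom
    reported_camera_ids={"ultra_wide", "wide"},  # tele failed, so not reported
)
# open_list -> ["wide"]: the shot proceeds on the remaining camera
```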
Fig. 10 is a schematic structural diagram of a chip according to an embodiment of the present application. The chip 1000 includes one or more processors 1001, a communication line 1002, a communication interface 1003, and a memory 1004.
In some implementations, the memory 1004 stores the following elements: executable modules or data structures, or a subset or extended set thereof.
The methods described in the above embodiments may be applied to, or implemented by, the processor 1001. The processor 1001 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above methods may be performed by integrated logic circuits of hardware in the processor 1001 or by instructions in the form of software. The processor 1001 may be a general-purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate, transistor logic, or a discrete hardware component, and the processor 1001 may implement or perform the methods, steps, and logical blocks disclosed in the embodiments of the present application.
The steps of the methods disclosed in connection with the embodiments of the present application may be embodied directly as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well established in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 1004; the processor 1001 reads information from the memory 1004 and performs the steps of the above methods in combination with its hardware.
The processor 1001, the memory 1004, and the communication interface 1003 may communicate with each other via a communication line 1002.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
Embodiments of the present application also provide a computer program product comprising one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible by a computer, such as a semiconductor medium (e.g., a solid state disk (SSD)).
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
As one possible design, the computer-readable medium may include compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk storage; the computer-readable medium may also include disk storage or other magnetic storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber-optic cable, twisted pair, DSL, or those wireless technologies are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

Claims (17)

1. A photographing processing method, characterized in that the method comprises:
at a first moment, a terminal device receives a first operation for a camera application;
in response to the first operation, the terminal device displays a first interface, wherein the first interface comprises a first photographing mode, a second photographing mode, and a third photographing mode, and the first photographing mode, the second photographing mode, and the third photographing mode are all displayed in a state in which photographing can be triggered;
at a second moment, the terminal device receives a second operation for the camera application again, the second moment being later than the first moment;
in response to the second operation, the terminal device displays a second interface, wherein the second interface comprises the first photographing mode and the third photographing mode, the first photographing mode and the third photographing mode are both displayed in a state in which photographing can be triggered, and the second interface does not comprise the second photographing mode.
2. The method of claim 1, wherein the first interface further comprises first focal segment information and the second interface further comprises second focal segment information, wherein a magnification range in the second focal segment information is smaller than a magnification range in the first focal segment information.
3. The method according to claim 1 or 2, wherein before the terminal device displays the second interface in response to the second operation, the method comprises:
in response to the second operation, the terminal device displays a third interface, wherein the third interface comprises the first photographing mode and the third photographing mode, the first photographing mode and the third photographing mode are both displayed in a state in which photographing can be triggered, and the third interface comprises the second photographing mode displayed in a state in which photographing cannot be triggered;
the terminal device receives, on the third interface, a third operation for the second photographing mode;
in response to the third operation, the terminal device displays a fourth interface, wherein the fourth interface comprises information for prompting that the second photographing mode is unavailable and/or identification information of a failed camera in the terminal device.
4. The method of claim 3, wherein the fourth interface further comprises information for prompting cancellation of display of the second photographing mode, and a target button; the method further comprises:
the terminal device receives a fourth operation for the target button;
in response to the fourth operation, the terminal device displays the second interface.
5. The method of claim 2, wherein the terminal device comprises an ultra-wide-angle camera, a wide-angle camera, and/or a telephoto camera; before the terminal device receives the second operation for the camera application again, the method further comprises:
the terminal device determines that the ultra-wide-angle camera, the wide-angle camera, and/or the telephoto camera has failed;
wherein when the ultra-wide-angle camera has failed, the second focal segment information does not comprise the 0.1X-1X magnification range;
when the wide-angle camera has failed, the second focal segment information does not comprise the 1X-10X magnification range;
and when the telephoto camera has failed, the second focal segment information does not comprise the 10X-100X magnification range.
6. The method according to any one of claims 1-5, wherein the terminal device comprises a plurality of cameras, and N cameras among the plurality of cameras are used when the second photographing mode operates, N being a positive integer; before the terminal device receives the second operation for the camera application again, the method comprises:
the terminal device determines that M target cameras have failed, M being a positive integer with M less than or equal to N;
the terminal device determines that the type of the failed target cameras and/or the number of the failed target cameras meets a preset condition, wherein the failed target cameras are the cameras common to the M target cameras and the N cameras;
and the terminal device disables the second photographing mode.
7. The method according to claim 6, wherein the terminal device determining that the type of the failed target cameras and/or the number of the failed target cameras meets a preset condition comprises:
the terminal device determines that the type of a failed target camera is a main camera, and/or the terminal device determines that the number of the failed target cameras is greater than or equal to half of the N cameras.
8. The method according to claim 6 or 7, wherein the terminal device disabling the second photographing mode comprises:
a camera driver of the terminal device updates the photographing modes and reports updated information to the camera application, wherein the updated information does not comprise the second photographing mode.
9. The method according to any one of claims 6-8, wherein before the M target cameras fail, the terminal device uses the failed target cameras when taking a picture in the third photographing mode;
and after the M target cameras fail, the terminal device does not use the failed target cameras when taking a picture in the third photographing mode.
10. A photographing processing method, characterized in that the method comprises:
the terminal device determines a failed camera;
a camera driver of the terminal device updates the photographing modes and/or focal segments and reports updated information to a camera application, wherein the updated information does not comprise any photographing mode and/or focal segment rendered unusable by the failed camera;
and the camera application displays each available photographing mode and/or zoomable focal segment according to the updated information reported by the camera driver.
11. The method of claim 10, wherein the terminal device determining a failed camera comprises: detecting, by a chip platform driver and/or the camera driver of the terminal device, the in-place status of the cameras of the terminal device to obtain the failed camera.
12. The method of claim 11, wherein each physical camera and each photographing mode in the terminal device corresponds to a respective camera identifier; after the failed camera is obtained, the method further comprises:
the chip platform driver updates the camera identifiers of the physical cameras and of the photographing modes in the terminal device, and reports the updated camera identifiers to a multi-shot policy matching module of the terminal device, wherein the updated camera identifiers do not comprise identifiers of the failed camera or identifiers of photographing modes rendered unusable by the failed camera, and the multi-shot policy matching module is configured to manage the correspondence between the photographing modes and the physical cameras they use.
13. The method according to any one of claims 10-12, wherein the camera driver of the terminal device updating the photographing modes and/or focal segments comprises: the camera driver of the terminal device updates L photographing modes to Q photographing modes and/or updates first focal segment information to second focal segment information; wherein the Q photographing modes are the L photographing modes with the photographing modes rendered unusable by the failed camera removed, the magnification range in the second focal segment information is smaller than the magnification range in the first focal segment information, L and Q are positive integers, and L is greater than or equal to Q.
14. A chip system applied to a terminal device, the chip system comprising one or more processors configured to:
detect the in-place status of the cameras of the terminal device and determine a failed camera;
and update the camera identifiers of the physical cameras and of the photographing modes in the terminal device and report the updated camera identifiers to the terminal device, wherein the updated camera identifiers do not comprise identifiers of the failed camera or identifiers of photographing modes rendered unusable by the failed camera.
15. A terminal device, comprising: a memory for storing a computer program and a processor for executing the computer program to perform the method of any of claims 1-13.
16. A computer readable storage medium storing instructions that, when executed, cause a computer to perform the method of any one of claims 1-13.
17. A computer program product comprising a computer program which, when run, causes a terminal device to perform the method of any of claims 1-13.
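The focal-segment update of claims 5 and 13 and the preset condition of claim 7 can be sketched together. The per-camera magnification ranges below mirror the claim text (0.1X-1X for ultra-wide, 1X-10X for wide, 10X-100X for telephoto); the function names, data layout, and the choice of "wide" as the main camera are illustrative assumptions.

```python
# Magnification ranges per physical camera, taken from claim 5.
FOCAL_SEGMENTS = {
    "ultra_wide": (0.1, 1.0),   # 0.1X-1X
    "wide": (1.0, 10.0),        # 1X-10X
    "tele": (10.0, 100.0),      # 10X-100X
}

def second_focal_segment(failed_cameras):
    """Claims 5/13: the second focal segment information omits the
    magnification ranges covered by failed cameras, so its overall
    magnification range is smaller than the first."""
    return {cam: seg for cam, seg in FOCAL_SEGMENTS.items()
            if cam not in failed_cameras}

def mode_disabled(failed_target_cameras, mode_cameras, main_camera="wide"):
    """Preset condition of claim 7: disable the mode if the main camera
    failed, or if at least half of the mode's N cameras failed."""
    overlap = set(failed_target_cameras) & set(mode_cameras)
    return main_camera in overlap or 2 * len(overlap) >= len(mode_cameras)

# Telephoto failure: the 10X-100X segment disappears from the second interface,
# and a two-camera mode that used the telephoto camera is disabled (1 of 2 >= half).
remaining = second_focal_segment({"tele"})
disabled = mode_disabled({"tele"}, ["wide", "tele"])
```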
CN202310121408.3A 2023-02-03 2023-02-03 Photographing processing method and related device Pending CN117156270A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310121408.3A CN117156270A (en) 2023-02-03 2023-02-03 Photographing processing method and related device


Publications (1)

Publication Number Publication Date
CN117156270A true CN117156270A (en) 2023-12-01

Family

ID=88899420

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310121408.3A Pending CN117156270A (en) 2023-02-03 2023-02-03 Photographing processing method and related device

Country Status (1)

Country Link
CN (1) CN117156270A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003005263A (en) * 2001-06-20 2003-01-08 Olympus Optical Co Ltd Camera constituted of a plurality of units
CN114401340A (en) * 2021-12-31 2022-04-26 荣耀终端有限公司 Collaborative shooting method, electronic device and medium thereof
CN114554096A (en) * 2022-02-28 2022-05-27 联想(北京)有限公司 Processing method and device and electronic equipment
CN114726950A (en) * 2022-02-28 2022-07-08 荣耀终端有限公司 Opening method and device of camera module



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination