CN116887047A - Focusing method, electronic equipment and storage medium - Google Patents

Focusing method, electronic equipment and storage medium

Info

Publication number
CN116887047A
CN116887047A
Authority
CN
China
Prior art keywords
focusing
camera
original
instruction
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311147338.5A
Other languages
Chinese (zh)
Other versions
CN116887047B (en)
Inventor
马靖煊 (Ma Jingxuan)
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202311147338.5A
Publication of CN116887047A
Application granted
Publication of CN116887047B
Legal status: Active
Anticipated expiration


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The present application provides a focusing method, an electronic device, and a storage medium, relates to the field of terminal devices, and enables an electronic device to respond in time to a user's photographing operation. The method comprises the following steps: the electronic device generates an original focusing instruction in response to a photographing operation performed by the user on a photographing interface of a target application; when the focusing mode corresponding to the original focusing instruction is the auto-focus mode and the electronic device is in a dark light environment, the electronic device converts the original focusing instruction into a target focusing instruction, which instructs focusing in a preset focusing mode; in the preset focusing mode, the camera module of the electronic device obtains a focusing result within a preset duration after focusing starts; based on the target focusing instruction, the electronic device controls the camera module to focus in the preset focusing mode so as to obtain a first focusing result within the preset duration, and performs a response operation for the photographing operation based on the first focusing result.

Description

Focusing method, electronic equipment and storage medium
Technical Field
The present application relates to the field of terminal devices, and in particular, to a focusing method, an electronic device, and a storage medium.
Background
With the technical development of electronic devices, the photographing function of the electronic devices is used more and more frequently, and users have higher and higher requirements on photographing experience. In the prior art, after a user opens an application program (for example, a camera application or a third party application with a photographing function) with a photographing function, a camera module (including a flash lamp, a camera, a focusing motor, etc.) of an electronic device can be focused by triggering a photographing key to obtain a photograph.
Currently, some applications with a photographing function on electronic devices use an auto-focus mode. In the auto-focus mode, if focusing fails, the camera module of the electronic device keeps refocusing in a loop; only after focusing succeeds does it take a picture, obtain a photo, and return the photo to the corresponding application, which then presents the photo to the user. As a result, the photographing process can incur a large delay, the user cannot perceive the photographing result for a long time, and the user experience is seriously degraded.
Disclosure of Invention
The embodiment of the application provides a focusing method, electronic equipment and a storage medium, which can enable a user to timely obtain the response of the electronic equipment in the process of photographing by using a target application in a dim light environment and improve the use experience of the user.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical scheme:
in a first aspect, the present application provides a focusing method, applied to an electronic device, where the electronic device includes a target application having a photographing function. The method may include: the electronic equipment responds to photographing operation implemented by a user on a photographing interface of a target application, and generates an original focusing instruction; the original focusing instruction is used for indicating to focus by adopting a focusing mode corresponding to the target application; when the focusing mode corresponding to the original focusing instruction is an automatic focusing mode and the electronic equipment is in a dark light environment, the electronic equipment converts the original focusing instruction into a target focusing instruction; the target focusing instruction is used for indicating to focus in a preset focusing mode; in the automatic focusing mode, the camera module of the electronic equipment carries out cyclic focusing, and a focusing result indicating successful focusing is obtained under the condition of successful focusing; under a preset focusing mode, a camera module of the electronic equipment obtains a focusing result within a preset time period after focusing starts; the electronic equipment controls the camera module to focus in a preset focusing mode based on the target focusing instruction so as to obtain a first focusing result in a preset time period, and executes response operation for photographing operation based on the first focusing result.
Based on the technical scheme provided by this embodiment, the electronic device generates the original focusing instruction after receiving the photographing operation performed by the user on the photographing interface of the target application. The electronic device then judges whether the focusing mode corresponding to the original focusing instruction is the auto-focus mode and whether the device is in a dark light environment. When the focusing mode corresponding to the original focusing instruction is determined to be the auto-focus mode and the current environment is dark, the original focusing instruction can be rewritten into a target focusing instruction for focusing in a preset focusing mode (such as a continuous focusing mode). The electronic device then focuses in the mode corresponding to the target focusing instruction, so as to obtain a focusing result (namely, the first focusing result) in time. The focusing result indicates either successful or failed focusing. The electronic device then performs corresponding processing based on the focusing result, thereby giving timely response or feedback to the user's photographing operation. This avoids the prior-art problem that, when photographing with the target application in a dim light environment, the user obtains no response from the electronic device for a long time after the photographing operation, causing a large photographing delay; the electronic device can thus respond to the user's photographing operation in time, improving the user experience.
In one possible design manner of the first aspect, the method further includes: the electronic device acquires the ambient light brightness detected by an ambient light sensor; and in a case where the ambient light brightness is less than a preset brightness value, the electronic device determines that it is in a dark light environment.
Based on this technical scheme, the electronic device can accurately determine whether it is in a dim light environment, which provides data support for the subsequent judgment of whether to convert the original focusing instruction, so that the focusing method can be implemented smoothly.
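The dark-light determination in this design can be sketched as follows. This is a minimal illustration: the class name, method name, and threshold value are assumptions invented here, not identifiers from the application; the application itself only specifies comparing the sensed ambient light brightness against a preset value, with darkness corresponding to brightness below that value.

```java
// Minimal sketch of the dark-light-environment check. All names and the
// threshold value are illustrative assumptions, not from the application.
public class AmbientLightChecker {
    // Hypothetical preset brightness value, in lux.
    static final float PRESET_BRIGHTNESS_LUX = 10.0f;

    /** Returns true when the sensed brightness indicates a dark light environment. */
    public static boolean isDarkEnvironment(float ambientLux) {
        return ambientLux < PRESET_BRIGHTNESS_LUX;
    }
}
```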
In one possible design manner of the first aspect, the electronic device performing a response operation for the photographing operation based on the first focusing result includes: in a case where the first focusing result indicates successful focusing, the electronic device acquires a first original image from the camera module; the electronic device obtains a first target image based on the first original image; and the electronic device displays the first target image on a photo preview interface of the target application.
Based on this technical scheme, in the case of successful focusing, the electronic device can promptly display the photo (namely, the target image) for the user to view after the photographing operation, improving the user experience.
In one possible design manner of the first aspect, when the focusing mode corresponding to the original focusing instruction is not an automatic focusing mode or the electronic device is not in a dark environment, the electronic device controls the camera module to focus in the focusing mode corresponding to the target application based on the original focusing instruction, and after the second focusing result is obtained, performs a response operation for the photographing operation based on the second focusing result.
Based on the technical scheme, the electronic equipment can normally focus based on the original focusing instruction and obtain a second focusing result when the focusing mode is not an automatic focusing mode or is not in a dark light environment. And further, response operation for photographing operation of the user can be timely executed based on the second focusing result, and use experience of the user is improved.
In one possible design manner of the first aspect, the generating, by the electronic device, an original focusing instruction in response to a photographing operation performed by a user on a photographing interface of the target application includes: the method comprises the steps that a target application of the electronic equipment responds to photographing operation implemented by a user on a photographing interface of the target application, and an original focusing request is sent to camera service of the electronic equipment; the original focusing request is used for requesting focusing by adopting a focusing mode corresponding to the target application; the camera service of the electronic device receives the original focus request from the target application and generates an original focus instruction based on the original focus request.
Based on the technical scheme, the electronic equipment can smoothly generate the original focusing instruction, so that the follow-up electronic equipment can carry out follow-up focusing flow based on the original focusing instruction.
In one possible design manner of the first aspect, when the focusing mode corresponding to the original focusing instruction is an auto-focusing mode and the electronic device is in a dark environment, the electronic device converts the original focusing instruction into the target focusing instruction, including: the camera service of the electronic equipment sends an original focusing instruction to a camera hardware abstraction module of the electronic equipment; a camera hardware abstraction module of the electronic equipment receives an original focusing instruction from a camera service; the camera hardware abstraction module of the electronic equipment converts the original focusing instruction into a target focusing instruction under the condition that the focusing mode corresponding to the original focusing instruction is determined to be an automatic focusing mode and the electronic equipment is determined to be in a dark light environment.
Because the primary role of the camera service in the framework layer of the electronic device is to translate the requests of the target application into instructions that the camera HAL (i.e., the camera hardware abstraction module) in the HAL layer can recognize, it serves as a bridge between the upper-layer application and the hardware. The camera HAL is the module that directly calls and controls the hardware of the camera module of the electronic device. Therefore, based on the above technical scheme, the electronic device can use the camera HAL to judge the original focusing instruction and the ambient light brightness, and, when the focusing mode corresponding to the original focusing instruction is the auto-focus mode and the device is in a dark environment (the ambient light brightness is less than the preset threshold), rewrite the original focusing instruction into the target focusing instruction. In this way, when the electronic device is in a dark light environment and the original focusing instruction indicates focusing in the auto-focus mode, the camera HAL can instruct the camera module to focus in the preset focusing mode, so that the device obtains a focusing result in time and responds to the photographing operation based on that result, improving the user experience.
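The instruction rewriting performed in the camera HAL can be sketched as follows; the class, enum, and method names are illustrative assumptions, not identifiers from the application.

```java
// Sketch of the camera HAL's instruction conversion. Names are illustrative.
public class FocusInstructionConverter {
    public enum FocusMode { AUTO, CONTINUOUS, MANUAL }

    /**
     * Rewrites an auto-focus instruction into the preset (continuous) focusing
     * mode when the device is in a dark light environment; any other instruction
     * passes through unchanged.
     */
    public static FocusMode convert(FocusMode requested, boolean inDarkEnvironment) {
        if (requested == FocusMode.AUTO && inDarkEnvironment) {
            return FocusMode.CONTINUOUS; // the preset focusing mode in this design
        }
        return requested;
    }
}
```

When either condition fails, the original instruction is left untouched, matching the design in which focusing proceeds normally based on the original focusing instruction.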
In one possible design of the first aspect, the preset focusing mode is a continuous focusing mode.
In one possible design manner of the first aspect, the flash of the camera module of the electronic device is in an off state.
In practice, if the flash of the electronic device is on, then even if the device is in a dark light environment and the focusing mode corresponding to the original focusing instruction is the auto-focus mode, the device can complete focusing with the illumination provided by the flash, and the technical problem addressed by this application would not arise. Therefore, in the scenario of the technical scheme provided by this application, the flash of the camera module of the electronic device is in the off state.
In a second aspect, the present application provides an electronic device comprising: the device comprises an acquisition module, a processing module and a control module. The acquisition module is used for responding to photographing operation implemented by a user on a photographing interface of the target application and generating an original focusing instruction; the original focusing instruction is used for indicating to focus by adopting a focusing mode corresponding to the target application. The processing module is used for converting the original focusing instruction into a target focusing instruction when the focusing mode corresponding to the original focusing instruction generated by the acquisition module is an automatic focusing mode and the electronic equipment is in a dark light environment; the target focusing instruction is used for indicating to focus in a preset focusing mode; in the automatic focusing mode, the camera module of the electronic equipment carries out cyclic focusing, and a focusing result indicating successful focusing is obtained under the condition of successful focusing; under a preset focusing mode, the camera module of the electronic equipment obtains a focusing result within a preset time period after focusing starts. The control module is used for controlling the camera module to focus in a preset focusing mode based on the target focusing instruction obtained by the conversion of the processing module so as to obtain a first focusing result within a preset duration, and executing response operation for photographing operation based on the first focusing result.
In a possible design of the second aspect, the processing module is further configured to: acquire the ambient light brightness detected by an ambient light sensor; and in a case where the ambient light brightness is less than a preset brightness value, determine that the electronic device is in a dark light environment.
In one possible design manner of the second aspect, the electronic device further includes a display module, and the control module is specifically configured to: in a case where the first focusing result indicates successful focusing, acquire a first original image from the camera module; obtain a first target image based on the first original image; and control the display module to display the first target image on a photo preview interface of the target application.
In one possible design manner of the second aspect, the processing module is further configured to instruct, based on the original focusing instruction, the control module to control the camera module to perform focusing in a focusing mode corresponding to the target application when the focusing mode corresponding to the original focusing instruction generated by the obtaining module is not an automatic focusing mode or the electronic device is not in a dark light environment, and after obtaining the second focusing result, perform a response operation for the photographing operation based on the second focusing result.
In a possible design manner of the second aspect, the acquisition module includes a first unit and a second unit. The first unit is used for responding to photographing operation implemented by a user on a photographing interface of a target application and sending an original focusing request to a camera service of the electronic equipment; the original focusing request is used for requesting focusing by adopting a focusing mode corresponding to the target application. And the second unit is used for receiving the original focusing request from the first unit and generating an original focusing instruction based on the original focusing request.
In a possible embodiment of the second aspect, the processing module comprises a third unit and a fourth unit. The third unit is used for sending an original focusing instruction to a camera hardware abstraction module of the electronic device. The fourth unit is used for receiving the original focusing instruction from the third unit; the fourth unit is further configured to convert the original focus instruction into a target focus instruction when it is determined that the focus mode corresponding to the original focus instruction is an auto focus mode and it is determined that the electronic device is in a dark environment.
In one possible design of the second aspect, the preset focusing mode is a continuous focusing mode.
In one possible design manner of the second aspect, the flash of the camera module of the electronic device is in an off state.
In a third aspect, the present application provides an electronic device comprising a display screen, a memory, and one or more processors; the display screen and the memory are coupled to the processors; the memory stores computer program code comprising computer instructions which, when executed by the processors, cause the electronic device to perform the focusing method as provided by the first aspect and any one of its possible designs.
In a fourth aspect, the present application provides a computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform a focusing method as provided by the first aspect and any one of its possible designs.
In a fifth aspect, the application provides a computer program product for, when run on an electronic device, causing the electronic device to perform a focusing method as provided by the first aspect and any one of its possible designs.
In a sixth aspect, there is provided an apparatus (e.g. the apparatus may be a system-on-a-chip) comprising a processor for supporting an electronic device to implement the functionality referred to in the second aspect above. In one possible design, the apparatus further includes a memory for storing program instructions and data necessary for the electronic device. When the device is a chip system, the device can be formed by a chip, and can also comprise the chip and other discrete devices.
It may be appreciated that the advantages achieved by the technical solutions provided in the second aspect to the sixth aspect may refer to the advantages in any one of the possible designs of the first aspect, and are not described herein.
Drawings
FIG. 1 is a schematic diagram of a photographing process according to the prior art;
fig. 2 is a schematic view of a shooting scene of a target application according to an embodiment of the present application;
fig. 3 is a schematic diagram of a focusing method according to an embodiment of the present application;
fig. 4 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 5 is a schematic software architecture diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic flow chart of a focusing method according to an embodiment of the present application;
fig. 7 is a schematic view of a photographing operation according to an embodiment of the present application;
fig. 8 is a schematic diagram of a flash setting scenario provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a dark environment determination process according to an embodiment of the present application;
fig. 10 is a flowchart of another focusing method according to an embodiment of the present application;
fig. 11 is a flowchart of another focusing method according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that "/" means "or"; e.g., A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the described embodiments of the application may be combined with other embodiments.
The terms "first" and "second" in the following embodiments of the present application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. In the description of embodiments of the application, unless otherwise indicated, "a plurality" means two or more.
In the description of embodiments of the present application, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
First, technical terms related to the present disclosure will be described:
Autofocus (or auto-focus mode): autofocus is a mode in which the camera module (comprising a flash, a camera, a focusing motor, and other components) focuses when the electronic device takes a photo. In the auto-focus mode, if focusing fails, the camera module keeps refocusing in a loop; only when focusing succeeds is the focusing result returned to the upper-layer application whose photographing function the user is using, and a photo is then taken.
Continuous focus (or continuous-focus mode): continuous focus is another mode in which the camera module focuses when the electronic device takes a photo. In the continuous-focus mode, within a preset duration after focusing starts, the camera module returns a focusing result to the upper-layer application whether focusing succeeds or fails. After receiving the focusing result, the upper-layer application can, according to its preset processing logic, decide whether to directly acquire an image from the camera module as the photo, remind the user to adjust the photographing mode, automatically adjust the focusing mode, or the like.
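The bounded behaviour of the continuous-focus mode can be illustrated with a small sketch; an attempt counter stands in for the preset duration, and all names are assumptions invented for illustration.

```java
import java.util.function.BooleanSupplier;

// Sketch of continuous-focus behaviour: a result is reported after a bounded
// number of attempts (standing in for the preset duration), whether or not
// focusing succeeded. Names are illustrative.
public class FocusSimulator {
    /** Returns the focusing result obtained within the bounded window. */
    public static boolean continuousFocus(BooleanSupplier tryFocus, int maxAttempts) {
        for (int i = 0; i < maxAttempts; i++) {
            if (tryFocus.getAsBoolean()) {
                return true;  // focus succeeded within the window
            }
        }
        return false;         // window elapsed: report failure instead of looping forever
    }
}
```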
Overriding: overriding refers to a subclass redefining, in an inheritance relationship, a method of its parent class so that the method has a different implementation in the subclass. Through overriding, a subclass can customize a method inherited from its parent class, modifying or extending it according to its own needs. When a subclass overrides a parent-class method and the method is called on a subclass object, the program executes the redefined method in the subclass instead of the original method in the parent class. Overriding is an important concept in object-oriented programming: it allows subclasses to achieve polymorphism by inheriting and modifying parent-class methods, providing flexibility and code reuse without altering the parent class's implementation.
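A minimal Java example of the overriding concept defined above (the class and method names are invented for illustration):

```java
// Parent class with a default implementation.
class Parent {
    String focusMode() { return "auto"; }
}

// Subclass overriding the inherited method with its own implementation.
class Child extends Parent {
    @Override
    String focusMode() { return "continuous"; }
}
```

Calling focusMode() on a Child instance, even through a Parent reference, executes the subclass's redefined method; this is the polymorphism the passage describes.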
When the electronic device takes a photo, the photo can only be captured after focusing is completed. Currently, some applications with a photographing function on electronic devices (typically third-party applications other than the native camera application) use an auto-focus mode. In the auto-focus mode, if focusing fails, the camera module keeps refocusing in a loop; only after focusing succeeds does it take a picture, obtain a photo, and return the photo to the corresponding application, which then presents it to the user.
For example, in the case that the focusing mode of a certain target application with a photographing function in the electronic device is set by the user or defaults to an auto-focusing mode, a photographing procedure of the user when the user needs to use the target application in the electronic device to photograph may be as shown in fig. 1, where the photographing procedure specifically may include the following steps:
step 1, when a user has a photographing requirement, a photographing interface of a target application can be opened, and photographing operation is implemented.
For example, the photographing operation may be a triggering operation (e.g., a tap) on a photographing control (also called a photographing button) in the photographing interface. The target application may be a third-party application, such as Enterprise WeChat (WeCom).
Step 2: in response to the photographing operation performed by the user, the target application sends to the camera service/framework (camera service) an auto-focus request corresponding to the auto-focus mode that the user set or that the target application uses by default.
The auto-focus request is used to request the camera service to focus in the auto-focus mode. More precisely, because the focusing action is performed by specific hardware, the camera module, the auto-focus request here may be understood as a request for the camera service to instruct the camera module to focus in the auto-focus mode.
Step 3: because the camera service is a module in the framework layer of the electronic device's software architecture, and modules in the framework layer cannot interact with the hardware directly, interaction with the hardware must go through the hardware abstraction layer (HAL). Upon receiving the auto-focus request issued by the target application, the camera service responds by sending a first auto-focus instruction to the camera HAL (also referred to as the camera hardware abstraction module) in the HAL layer.
The first auto-focus instruction is used to instruct the camera HAL to focus in the auto-focus mode. More precisely, because the focusing action is performed by the camera module hardware, the first auto-focus instruction may be understood as requesting the camera HAL to instruct the camera module to focus in the auto-focus mode.
Step 4: the camera HAL receives the first auto-focus instruction from the camera service and, in response to it, sends a second auto-focus instruction to the camera module (or camera hardware).
The second autofocus instruction is used to instruct the camera module to focus in the autofocus mode.
Step 5, the camera module receives the second autofocus instruction from the camera HAL and, in response to the second autofocus instruction, focuses in the autofocus mode.
In the autofocus mode, after focusing succeeds, the camera module returns a focus-success result to the target application through the camera HAL and the camera service.
In the autofocus mode, if the camera module fails to focus, it repeatedly attempts to focus in a loop until focusing succeeds, and then returns the focus-success result to the target application through the camera HAL and the camera service.
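The retry behavior described above can be sketched as a simple loop. This is an illustrative model only, not a real driver API; the names `autofocus` and `try_focus` are invented, and the optional `max_attempts` cap exists only so the sketch can terminate (the document describes an unbounded loop when focusing keeps failing).

```python
def autofocus(try_focus, max_attempts=None):
    """Loop until a single focus attempt succeeds.

    try_focus: callable returning True when a focus attempt succeeds.
    max_attempts: optional cap so the sketch can terminate; the document
    describes an unbounded loop when focusing keeps failing.
    """
    attempts = 0
    while True:
        attempts += 1
        if try_focus():
            # Only a success result is propagated up through the camera
            # HAL and camera service to the target application.
            return {"result": "focus_success", "attempts": attempts}
        if max_attempts is not None and attempts >= max_attempts:
            return {"result": "focus_failed", "attempts": attempts}

# Example: focus succeeds on the third attempt.
outcomes = iter([False, False, True])
print(autofocus(lambda: next(outcomes)))  # → {'result': 'focus_success', 'attempts': 3}
```

The sketch makes the problem discussed later visible: with no cap, a module that can never focus (e.g., in a dark environment) loops forever and the caller never receives a result.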
Step 6, the target application receives the focus-success result and, in response to the focus result, sends a photographing request to the camera HAL through the camera service.
Wherein the photographing request is used for requesting the camera HAL to acquire an image from the camera module.
Step 7, the camera HAL receives the photographing request and, in response to the photographing request, obtains an original image from the camera module.
Specifically, the camera HAL may send an image acquisition request to the camera module to request the original image.
In some embodiments, the camera module updates its image in real time based on what the camera is currently capturing (this image may be the preview image presented in the photographing interface), so when the camera HAL obtains the original image from the camera module, it obtains the latest image the camera module has updated at that moment.
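The "latest frame wins" behavior can be sketched as a one-slot buffer: the module overwrites a single frame slot as it captures, and the HAL's acquisition request simply reads that slot. The class and method names here are illustrative, not part of any real HAL interface.

```python
class CameraModule:
    """Illustrative model: a camera module holding only the most recent frame."""

    def __init__(self):
        self._latest_frame = None

    def update_frame(self, frame):
        # Called continuously as the sensor captures new content;
        # each new frame replaces the previous one.
        self._latest_frame = frame

    def get_original_image(self):
        # The camera HAL's image acquisition request returns whatever
        # frame is latest at that moment.
        return self._latest_frame


module = CameraModule()
for frame in ("frame_1", "frame_2", "frame_3"):
    module.update_frame(frame)
print(module.get_original_image())  # → frame_3
```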
Step 8, the camera module returns the original image to the camera HAL.
Step 9, the camera HAL processes the original image to obtain a target image.
In practice, the original image obtained by the camera module may have the following problems. First, the original image may contain hardware-related defects such as lens distortion, chromatic aberration, or uneven illumination. The camera module may correct or compensate for these defects at the hardware level and then pass the resulting image (i.e., the original image) to the camera HAL, which may further correct or process it to ensure image quality and accuracy. Second, the format of the original image obtained by the camera module may not match the format required by the application: the camera module may transmit image data in one particular format while the target application needs another format to process the image, so the camera HAL may convert the format of the original image data to suit the needs of the target application. Third, the target application may require certain specific image processing algorithms or functions, such as noise reduction, sharpening, or color correction; the camera HAL may apply these algorithms or functions after acquiring the original image, to improve the quality of the target image ultimately presented to the user or to meet the needs of the target application.
In summary, in step 9, the camera HAL processes the original image captured by the camera module to address image quality problems, format conversion requirements, and the specific needs of the application. By processing the original image, the camera HAL can provide image data better adapted to the needs of the application.
Based on this, the processing of the original image by the camera HAL includes, but is not limited to, the following:
image format conversion: the camera HAL may convert the image data of the original image from a camera module specific format to a format supported by the target application, such as a JPEG or YUV format.
Autofocus and exposure control: the camera HAL may perform autofocus and auto-exposure control on the original image according to the capabilities of the camera module, to ensure that the captured image is clear and has appropriate brightness.
Image correction: the camera HAL may correct the original image to eliminate image distortion caused by the camera sensor or lens.
Image processing: the camera HAL may apply some image processing algorithms such as noise reduction, sharpening, color correction, etc. to improve the image quality or to meet the needs of the application.
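The processing operations listed above can be sketched as a small pipeline in which the original image passes through one stage per operation. The stage functions below are placeholders standing in for the actual HAL algorithms, and the image is modeled as a plain dictionary purely for illustration.

```python
def convert_format(img):
    # Stand-in for format conversion, e.g. sensor RAW -> JPEG/YUV.
    return dict(img, format="JPEG")

def correct_distortion(img):
    # Stand-in for lens/sensor distortion correction.
    return dict(img, distortion_corrected=True)

def enhance(img):
    # Stand-in for noise reduction, sharpening, color correction, etc.
    return dict(img, denoised=True, sharpened=True)

def process_original_image(original):
    """Run the original image through each processing stage in order."""
    target = original
    for stage in (convert_format, correct_distortion, enhance):
        target = stage(target)
    return target


original = {"format": "RAW", "data": b"..."}
target = process_original_image(original)
print(target["format"])  # → JPEG
```

The pipeline shape matches the step-9 description: the HAL takes one original image in and hands one target image back, with each concern (format, correction, enhancement) kept in its own stage.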
Step 10, the camera HAL returns the target image to the target application through the camera service.
Step 11, the target application displays the film preview interface and displays the target image in the film preview interface.
After step 11 is executed, the user can view the target image generated by photographing in the film preview interface.
For example, as shown in fig. 2 (a), the photographing interface of the target application may include a preview area 201, which displays the content currently captured by the camera module of the electronic device, and may further include a photographing control 202. After the user triggers the photographing control, the target application can take a photograph according to the photographing flow shown in fig. 1.
When the electronic device is in an environment with sufficient illumination (i.e., not a dark environment), or the user has turned on the flash of the photographing function in the target application (in which case, as shown in fig. 2 (a), a flash-on identifier 203 may be displayed in the preview interface), or the camera module of the electronic device is provided with a component capable of completing focusing in a dark environment (for example, a laser focusing component), the camera module can complete focusing smoothly and rapidly, so that the target application obtains the target image and displays it in the film preview interface for the user to view. Illustratively, the film preview interface may be as shown in fig. 2 (b). The film preview interface 204 includes a save control 205. After the save control 205 is triggered, the target application may store the target image presented in the film preview interface 204 in a gallery or any feasible storage space.
However, in the prior art, due to cost limitations, the camera modules of some electronic devices have limited hardware performance; for example, a camera module without a laser focusing component cannot complete focusing in a dark environment. In this case, if the environment in which the user uses the electronic device is a dark environment (the ambient brightness is less than a preset threshold) and the user has actively turned off the flash of the camera module (or the camera module has no flash), focusing will fail when the electronic device photographs using the flow shown in fig. 1. For example, when the user actively turns off the flash of the camera module, the photographing interface of the target application may include a flash-off identifier 206 as shown in fig. 2 (c). When the camera module in the electronic device fails to focus in the autofocus mode, the electronic device keeps displaying the photographing interface shown in fig. 2 (c) and cannot present the photographed picture to the user in time.
In this case, the camera module keeps attempting to focus in a loop, and the whole photographing flow is blocked at step 5 for a long time. The target application cannot obtain the target image in time, and the user cannot view it in time. As a result, photographing with the electronic device incurs a large delay, the user cannot perceive a photographing result for a long time, and the user experience is seriously affected.
In view of the above problems, referring to fig. 3, the present application provides a focusing method applicable to an electronic device with a photographing function. In the method, in response to a user performing a photographing operation in the photographing interface of a target application, the electronic device invokes the target application to send an original focus instruction to the camera hardware abstraction layer (HAL) in the electronic device (specifically, the original focus instruction may be sent to the camera HAL through the camera service). When the camera HAL in the electronic device determines that the original focus instruction is an autofocus instruction for focusing in the autofocus mode and that the current environment is a dark environment, it rewrites the original focus instruction into a target focus instruction for focusing in a continuous focus mode. The camera HAL then sends the target focus instruction to the camera module, so that the module focuses in the continuous focus mode and returns a focus result to the camera HAL a preset time after focusing starts. The focus result indicates either focus success or focus failure. The camera HAL can then return the focus result to the target application, which needs it to determine whether to continue photographing; based on the focus result, the target application can perform corresponding processing in time, thereby giving a timely response or feedback to the photographing operation performed by the user.
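The core of the method, the camera HAL's instruction rewrite, can be sketched as follows. This is a minimal model under assumed names: the mode constants, the lux threshold, and the function name are all invented for illustration; the document only specifies that "dark environment" means ambient brightness below a preset threshold.

```python
# Illustrative focus-mode constants and threshold (not real HAL values).
AUTO_FOCUS = "auto"
CONTINUOUS_FOCUS = "continuous"
DARK_THRESHOLD_LUX = 10.0  # assumed preset ambient-brightness threshold

def rewrite_focus_instruction(original_instruction, ambient_lux):
    """Return the focus instruction the camera HAL forwards to the module.

    An autofocus instruction in a dark environment is rewritten into a
    continuous-focus instruction, so the module reports a focus result
    after a preset time instead of looping indefinitely; any other
    instruction passes through unchanged.
    """
    if (original_instruction == AUTO_FOCUS
            and ambient_lux < DARK_THRESHOLD_LUX):
        return CONTINUOUS_FOCUS   # target focus instruction
    return original_instruction   # original focus instruction, unchanged


print(rewrite_focus_instruction(AUTO_FOCUS, 2.5))    # dark -> continuous
print(rewrite_focus_instruction(AUTO_FOCUS, 300.0))  # bright -> auto
```

The key design point is that the rewrite happens entirely inside the camera HAL: neither the target application (which requested autofocus) nor the camera module needs to change for the method to work.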
This avoids the prior-art problem that, when photographing with the target application in a dark environment, the user cannot obtain a response from the electronic device for a long time after performing the photographing operation, which causes a large photographing delay. The electronic device can thus respond to the user's photographing operation in time, improving the user experience.
The technical scheme provided by the embodiment of the application is described in detail below with reference to the accompanying drawings.
The technical solution provided by the present application can be applied to an electronic device with a photographing function (i.e., with a camera module). In some embodiments, the electronic device may be a mobile phone, a tablet computer, a handheld computer, a personal computer (personal computer, PC), an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a cellular phone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR) device, a virtual reality (virtual reality, VR) device, an artificial intelligence (artificial intelligence, AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device provided with a photographing function; the embodiment of the present application does not particularly limit the type of the electronic device.
Taking a mobile phone as an example of the electronic device, fig. 4 shows a schematic structural diagram of the electronic device according to an embodiment of the present application.
Referring to fig. 4, the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a display 193, a subscriber identity module (subscriber identification module, SIM) card interface 194, a camera 195, and the like. The sensor module 180 may include, among other things, a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device. The controller can generate an operation control signal according to the instruction operation code and the timing signal, to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The charge management module 140 is configured to receive a charge input from a power supply device (e.g., a charger, notebook power, etc.). The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device.
The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142. The battery 142 may specifically be a plurality of batteries connected in series.
The power management module 141 is used to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the display 193, the camera 195, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery voltage, current, battery cycle count, and battery state of health (leakage, impedance). In other embodiments, the power management module 141 may also be provided in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device; for example, they may be different functional modules in the same chip.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM). The random access memory can be read and written directly by the processor 110; it may be used to store executable programs (e.g., machine instructions) of the operating system or other running programs, and may also be used to store data of users and applications. The non-volatile memory may likewise store executable programs and data of users and applications, which may be loaded into the random access memory in advance for the processor 110 to read and write directly.
The touch sensor is also known as a "touch device". The touch sensor may be disposed on the display screen 193, and the touch sensor and the display screen 193 form a touch screen, also called a "touchscreen". The touch sensor is used to detect touch operations acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 193. In other embodiments, the touch sensor may also be disposed on a surface of the electronic device other than where the display 193 is located.
The ambient light sensor is used to sense ambient light brightness. For example, the ambient light sensor may measure the light intensity of four channels of ambient light and output the measured intensities to the processor 110. The processor 110 may process (e.g., integrate) the four channel intensities to obtain the light intensity of the ambient light (e.g., an illuminance value, or an illuminance value and a color temperature value). In the bright-screen state (including the bright screen after unlocking and the bright screen under the lock screen), the electronic device can adaptively adjust the display brightness according to the obtained ambient light intensity: when the ambient light is darker, the screen brightness is reduced to prevent glare; when the ambient light is brighter, the screen brightness is raised so that the display is clearer. The ambient light sensor may also be used to automatically adjust the white balance when the photographing function is implemented. When the electronic device is in the bright-screen state or the photographing function is in use, the processor 110 controls the ambient light sensor to be turned on; when the screen is off, the processor 110 controls the ambient light sensor to be turned off. The ambient light sensor may also be used to determine whether to turn off the screen when the electronic device is in a call state: for example, when the ambient light sensor determines that the ambient light intensity is below a certain threshold, the electronic device may consider itself to be in an enclosed space and turn off the screen.
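The channel-integration and brightness-adjustment behavior just described can be sketched as below. The channel weights, the brightness curve, and the lux-to-brightness scale are invented for illustration; the document only says the processor combines the four channel intensities into an ambient light intensity and adjusts screen brightness accordingly.

```python
def illuminance_from_channels(channels, weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine ("integrate") four ambient-light channel intensities into
    a single illuminance value. Equal weights are an assumption."""
    return sum(c * w for c, w in zip(channels, weights))

def screen_brightness(lux, min_b=0.05, max_b=1.0):
    """Map illuminance to display brightness: darker ambient light gives a
    lower brightness (anti-glare), brighter ambient light a higher one,
    clamped to [min_b, max_b]. The 500-lux scale is illustrative."""
    return max(min_b, min(max_b, lux / 500.0))


lux = illuminance_from_channels((120.0, 80.0, 100.0, 100.0))
print(lux, screen_brightness(lux))  # → 100.0 0.2
```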
The pressure sensor is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be provided on the display 193. There are many kinds of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. When a touch operation acts on the display screen 193, the electronic device detects the intensity of the touch operation using the pressure sensor. The electronic device may also calculate the touch location based on the detection signal of the pressure sensor. In some embodiments, touch operations acting on the same touch location but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
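The short-message example above reduces to a single threshold comparison. The threshold value and the action names below are illustrative stand-ins, not values from the document.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized intensity threshold

def sms_icon_action(touch_intensity):
    """Map touch intensity on the short message icon to an instruction:
    below the first pressure threshold, view messages; at or above it,
    create a new message."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"
    return "new_sms"


print(sms_icon_action(0.2))  # → view_sms
print(sms_icon_action(0.8))  # → new_sms
```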
In some embodiments, the electronic device may include 1 or N cameras 195, N being a positive integer greater than 1. In an embodiment of the present application, the type of the camera 195 may be distinguished according to hardware configuration and physical location. For example, the plurality of cameras included in the camera 195 may be disposed on the front and back of the electronic device: a camera disposed on the side of the display screen 193 may be referred to as a front camera, and a camera disposed on the side of the rear cover may be referred to as a rear camera. For another example, among the cameras included in the camera 195, a camera with a short focal length and a larger viewing angle may be referred to as a wide-angle camera, and a camera with a long focal length and a smaller viewing angle may be referred to as a normal camera. The content captured by different cameras differs: the front camera captures scenery facing the front of the electronic device, while the rear camera captures scenery facing the back; the wide-angle camera can capture a larger area within a shorter shooting distance, and at the same shooting distance the scenery it captures occupies a smaller portion of the picture than an image of the same scenery captured with a normal lens. Focal length and viewing angle are relative concepts, not limited to specific parameters; accordingly, the wide-angle camera and the normal camera are also relative concepts, which can be distinguished according to physical parameters such as focal length and viewing angle.
In the embodiment of the present application, the camera 195 may further include a flash, a focusing motor, and other components, and the combination of these components may be referred to as a photographing module.
The electronic device implements display functions through the GPU, the display screen 193, the application processor, and the like. The GPU is a microprocessor for image processing and is connected to the display 193 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The electronic device may implement photographing functions through an ISP, a camera 195, a video codec, a GPU, a display screen 193, an application processor, and the like.
The ISP is used to process the data fed back by the camera 195. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise and brightness of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be located in the camera 195. The camera 195 is used to capture still images or video. In some embodiments, the electronic device may include 1 or N cameras, N being a positive integer greater than 1. The camera 195 may be a front camera or a rear camera.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, and so on.
The display 193 is used to display images, videos, and the like. The display 193 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 193, N being a positive integer greater than 1.
In an embodiment of the application, the display 193 may be used to display an interface of the electronic device (e.g., a camera preview interface, a video preview interface, a film preview interface, etc.) and display images captured from any one or more cameras 195 in the interface.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on an electronic device. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 193. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. for application on an electronic device. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The SIM card interface 194 is used to connect to a SIM card. The SIM card may be inserted into the SIM card interface 194, or removed from the SIM card interface 194 to effect contact and separation with the electronic device. The electronic device may support one or more SIM card interfaces. The SIM card interface 194 may support a Nano SIM card, micro SIM card, etc. The same SIM card interface 194 may be used to insert multiple cards simultaneously. The SIM card interface 194 may also be compatible with external memory cards. The electronic equipment interacts with the network through the SIM card, so that the functions of communication, data communication and the like are realized. One SIM card corresponds to one subscriber number.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
It will be understood, of course, that the above illustration of fig. 4 is merely exemplary of the case where the electronic device is in the form of a cellular phone. If the electronic device is a tablet computer, a handheld computer, a PC, a PDA, a wearable device (e.g., a smart watch, a smart bracelet), etc., the electronic device may include fewer structures than those shown in fig. 4, or may include more structures than those shown in fig. 4, which is not limited herein.
It will be appreciated that, in general, implementation of electronic device functions requires hardware support and software coordination. The software system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, an Android system with a layered architecture is taken as an example, and the software structure of the electronic equipment is illustrated by an example.
Fig. 5 is a schematic diagram of a layered architecture of a software system of an electronic device according to an embodiment of the present application. The layered architecture divides the software into several layers, each with distinct roles and branches. The layers communicate with each other through a software interface (e.g., API).
In some examples, referring to fig. 5, in an embodiment of the present application, the software of the electronic device is divided into five layers, from top to bottom: an application layer, a framework layer (or referred to as an application framework layer), a system library and Android runtime layer, a hardware abstraction layer (hardware abstraction layer, HAL layer), and a driver layer (or referred to as a kernel layer).
The application layer may include a series of applications. As shown in fig. 5, the application layer may include Applications (APP) such as camera, gallery, calendar, map, WLAN, Bluetooth, music, video, short message, phone, navigation, instant messaging, etc.
In some embodiments, the user may obtain a desired photograph by operating the photographing function of certain applications in the application layer. Specifically, taking the target application as an example, the target application may display a photographing interface in response to the user's operation of opening the photographing function. The photographing interface may be as shown in fig. 2 (a). Then, the electronic device may respond to the user's triggering operation (i.e., photographing operation) on the photographing control in the photographing interface by calling the target application to execute the photographing flow, so as to complete photographing. In executing the photographing flow, the target application completes the flow through modules in other layers of the software architecture in the electronic device.
After receiving a photographing operation of the user, the target application may send a corresponding original focusing request to the camera service in the framework layer based on a default focusing mode or a focusing mode set by the user, so that the camera service sends a corresponding focusing instruction to the camera module through the camera HAL of the HAL layer to instruct the camera module to focus according to the corresponding focusing mode. After the camera module finishes focusing, the camera module returns a focusing result to the camera service through the camera HAL. After receiving the focusing result, the camera service returns the focusing result to the target application, so that the target application performs the subsequent photographing process based on the focusing result.
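The layered round trip described above can be sketched as a chain of objects, one per layer. This is a minimal illustrative model, not the actual Android framework API; every class and method name here is hypothetical.

```python
# Hypothetical sketch of the flow: the target application sends an original
# focusing request down through the camera service and camera HAL to the
# camera module, and the focusing result travels back up the same path.

class CameraModule:                       # hardware layer
    def focus(self, mode):
        # a real module would drive the lens motor; here focusing always succeeds
        return {"mode": mode, "success": True}

class CameraHal:                          # HAL layer
    def __init__(self, module):
        self.module = module
    def handle_focus_instruction(self, instruction):
        return self.module.focus(instruction["mode"])

class CameraService:                      # framework layer
    def __init__(self, hal):
        self.hal = hal
    def handle_focus_request(self, request):
        # translate the app-level request into a HAL-recognizable instruction
        instruction = {"mode": request["mode"]}
        return self.hal.handle_focus_instruction(instruction)

class TargetApp:                          # application layer
    def __init__(self, service):
        self.service = service
    def on_shutter_pressed(self, mode="auto"):
        result = self.service.handle_focus_request({"mode": mode})
        # on success the app would go on to request capture of the image
        return "capture" if result["success"] else "ask-user"

app = TargetApp(CameraService(CameraHal(CameraModule())))
print(app.on_shutter_pressed())
```

Because each layer only talks to its immediate neighbor, the application never touches the hardware directly, which mirrors the framework-layer/HAL-layer separation the text describes.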
For example, if the focusing result is success, the target application may send a photographing request to the camera HAL through the camera service, so that the camera HAL obtains an original image from the camera module, processes the original image to obtain a target image, and returns the target image to the target application through the camera service. After receiving the target image, the target application can display the target image in the film preview interface for the user to view.
For another example, if the focusing result is failure, the target application may prompt the user to decide whether to continue focusing or to take a picture directly. If the user decides to take a picture directly, the target application may acquire the target image according to the same subsequent procedure as when focusing succeeds, and display the target image to the user (the target image may be blurred because focusing was unsuccessful). If the user decides to continue focusing, the target application may again send the original focusing request to the camera service so that the camera module continues focusing, and the subsequent procedure is repeated.
In some possible embodiments, the target application may be an enterprise WeChat.
The framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. For example, the application framework layer may include an activity manager, a window manager, a content provider, an audio service, a view system, a telephony manager, a resource manager, a notification manager, etc., to which embodiments of the present application are not limited in any way.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is for providing communication functions of the electronic device. For example, the telephony manager may manage the call state (including initiate, connect, hang-up, etc.) of the call application.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay, without user interaction. For example, the notification manager is used to notify that a download is complete, to provide message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system screen, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and the like.
In the embodiment of the application, the framework layer may also include a camera service. The main role of the camera service is to provide a unified interface and functionality for applications to access and operate the camera device. The following are some of the main roles of the camera service:
Camera access: the camera service provides an interface to access camera hardware to enable applications to communicate with the camera device. The method abstracts the details of the camera driver and hardware at the bottom layer, conceals the difference of the realization at the bottom layer, and enables the application program to uniformly use the camera functions on different devices.
Camera control: camera services provide camera setup and control functions such as adjusting parameters of exposure, focus, flash, etc., and switching camera operations before and after. The application program can flexibly control and configure the camera through the camera service so as to meet different shooting requirements.
For example, the camera service provides an image capturing function that allows applications to capture the image data collected by the camera. It can provide different image formats, resolutions and frame rate options to suit the needs of different application scenarios. The camera service may also support image processing functions such as real-time filters, image beautification, face recognition, etc. These functions can be used in conjunction with the camera module to process or analyze image data in real time, so as to provide more visual effects or functions. The camera service can also provide a callback mechanism for camera (i.e., camera module) states and events, so that an application can learn in time of state changes of the camera module or events such as shooting completion. Through callbacks, the application can respond to camera operations in time, for example by updating the UI interface or executing other logic.
In summary, the camera service plays a role of a bridge in the framework layer of the software system architecture, encapsulates the hardware details of the camera at the bottom layer, provides a unified interface and function, and provides the application with capabilities of convenient camera access, control, image capturing, processing, and the like.
In the embodiment of the application, the camera service can convert the original focusing request sent by the target application in the application layer into an original focusing instruction that can be recognized by the camera HAL in the HAL layer below, so that the camera HAL can, through interaction with the camera module, make the camera module focus according to the corresponding focusing mode. After receiving the focusing result that the camera HAL obtains from the camera module, the camera service can also convert the focusing result into content that the target application can recognize and send it to the target application, so that the target application executes the subsequent photographing procedure based on the focusing result.
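The two translations the camera service performs can be sketched as a pair of mapping functions. This is an illustrative assumption only: the mode names, numeric HAL codes, and status convention below are made up for the sketch and are not taken from the actual framework.

```python
# Sketch of the camera service's two conversions: an app-level focus request is
# turned into a HAL-recognizable instruction, and a HAL-level focusing result is
# turned back into app-recognizable content.

# hypothetical app-level focus-mode names mapped onto hypothetical HAL mode codes
MODE_TO_HAL = {"auto": 1, "continuous": 2, "manual": 3}
HAL_TO_MODE = {v: k for k, v in MODE_TO_HAL.items()}

def request_to_instruction(request):
    """Translate an original focusing request (from the app) into an original
    focusing instruction (for the camera HAL)."""
    return {"hal_mode": MODE_TO_HAL[request["focus_mode"]]}

def result_to_app(hal_result):
    """Translate a HAL focusing result into content the target app recognizes
    (assuming status code 0 means focusing succeeded)."""
    return {"focus_mode": HAL_TO_MODE[hal_result["hal_mode"]],
            "focused": hal_result["status"] == 0}

print(request_to_instruction({"focus_mode": "auto"}))       # {'hal_mode': 1}
print(result_to_app({"hal_mode": 1, "status": 0}))          # {'focus_mode': 'auto', 'focused': True}
```

The point of the design is that neither side needs to know the other's vocabulary: the app speaks only in focus-mode names, the HAL only in mode codes, and the camera service owns the mapping.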
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (Media Libraries), openGL ES, SGL, etc. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. OpenGL ES is used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing, among others. SGL is the drawing engine for 2D drawing.
The Android runtime (Android Runtime) includes a core library and the ART virtual machine. The Android runtime is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the ART virtual machine. The ART virtual machine executes the Java files of the application layer and the application framework layer as binary files. The ART virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The HAL layer is an interface layer between the operating system kernel and the hardware circuitry that aims at abstracting the hardware. It hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, so that the operating system is hardware-independent and can be ported to various platforms. The HAL layer provides standard interfaces that expose device hardware functionality to the higher-level Java API framework (i.e., the framework layer). The HAL layer contains a plurality of library modules, each of which implements an interface for a particular type of hardware component, such as: an audio HAL module, a Bluetooth HAL module, a camera HAL module (also referred to as the camera HAL or the camera hardware abstraction module), and a sensors HAL module (or referred to as the Isensor service).
In the embodiment of the application, the camera HAL (i.e., the camera hardware abstraction module) of the hardware abstraction layer mainly plays a bridging role between the upper and lower layers. It can provide its methods (also called functions, or APIs) to the camera service through the HIDL interface of the HAL layer, so that the camera service can communicate with the underlying driver (that is, instructions from the camera service can be transmitted to the camera module, so that the camera module works according to the instructions).
In the embodiment of the application, in the case where the user has actively turned off the flash of the camera module through the target application, after receiving an original focusing instruction from the camera service, the camera HAL judges the original focusing instruction and judges the ambient brightness of the electronic device. When the focusing mode corresponding to the original focusing instruction is the auto-focusing mode and the ambient brightness of the electronic device is smaller than a preset threshold (i.e., the electronic device is in a dark environment), the camera HAL overwrites the original focusing instruction as a continuous focusing instruction and sends the continuous focusing instruction to the camera module. In this way, the camera HAL can return the focusing result to the target application in time in a dark environment, so that the target application can respond to the user's photographing operation in time, improving the user's experience.
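The rewrite condition above is a three-way check: flash off, auto-focus mode requested, and ambient brightness below the preset threshold. A minimal sketch of that decision, with an entirely hypothetical threshold value and instruction representation:

```python
# Sketch of the camera HAL's instruction-rewrite decision. The lux threshold
# and the instruction dictionaries are illustrative assumptions, not values
# from the actual implementation.

AMBIENT_LUX_THRESHOLD = 5.0   # hypothetical "dark environment" threshold

def rewrite_focus_instruction(instruction, ambient_lux, flash_off):
    """Return the instruction the camera HAL should forward to the camera module.

    The original auto-focus instruction is overwritten as a continuous-focus
    instruction only when the flash is off AND the device is in a dark
    environment (ambient brightness below the preset threshold); in every
    other case the original instruction is forwarded unchanged.
    """
    if (flash_off
            and instruction["mode"] == "auto"
            and ambient_lux < AMBIENT_LUX_THRESHOLD):
        return {"mode": "continuous"}
    return instruction

# dark scene, flash off, auto-focus requested -> rewritten
print(rewrite_focus_instruction({"mode": "auto"}, 1.0, True))    # {'mode': 'continuous'}
# bright scene -> original instruction forwarded unchanged
print(rewrite_focus_instruction({"mode": "auto"}, 300.0, True))  # {'mode': 'auto'}
```

Keeping the rewrite inside the HAL, rather than in the app or the camera service, matches the text's observation that the HAL is the layer that both sees the instruction and can query the ambient light sensor.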
In the embodiment of the application, the sensor module (or referred to as the sensor HAL) of the hardware abstraction layer is mainly used for interacting with the camera HAL, so that the camera HAL can acquire the detection data of the sensors when needed. Specifically, when the camera HAL receives the original focusing instruction, it may acquire, through the sensor HAL, the detection data of the ambient light sensor (i.e., data such as the ambient light brightness), so as to determine whether the ambient brightness of the electronic device is greater than the preset threshold.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, a battery driver, and the like, which is not limited in the present application. The sensor driver may specifically include a driver for each sensor included in the electronic device, for example, an ambient light sensor driver. For example, the ambient light sensor driver may, in response to an indication or instruction from the sensor module to obtain detection data, send the detection data of the ambient light sensor to the sensor module in a timely manner.
Based on the above hardware architecture and software structure, the focusing method provided by the embodiment of the present application is described in detail below with reference to fig. 6, taking the electronic device as a mobile phone as an example. Fig. 6 is a flowchart of a focusing method according to an embodiment of the present application. Referring to fig. 6, the focusing method may include S601-S616:
S601, the mobile phone receives an opening operation of opening a photographing function of the target application by a user, and displays a photographing interface of the target application in response to the opening operation.
The target application may be any application with a photographing function in the mobile phone, for example, the camera, WeChat®, QQ®, etc. Of course, since the native camera application of the mobile phone is a dedicated photographing application, the adverse effect that photographing cannot be successfully performed in a dark environment is generally fully considered, and specific logic can be set to avoid affecting the user experience when focusing is unsuccessful, so the problems described in the background of the present application generally do not arise. By contrast, a third-party application that takes photographing only as a secondary function may not sufficiently consider the various adverse effects of a focusing failure when photographing in a dark environment, so the technical problems mentioned in the background of the present application may exist. Based on this, in the embodiment of the present application, the target application may be a third-party application with a photographing function in the mobile phone.
When the user needs to take a picture by using the photographing function of the target application, the user first performs a triggering operation (e.g., clicking operation) on the icon of the target application on the desktop of the mobile phone. And the mobile phone responds to the triggering operation, so that the target application can be operated, and a main interface of the target application is displayed. And then, according to the default configuration condition of the target application (namely, the interface and the position of the opening control which are preset by each function), the user can enable the mobile phone to open the photographing function of the target application through at least one triggering operation, and the photographing interface of the target application is displayed. In this process, a series of operations in which the user opens the photographing function of the target application may be referred to as an opening operation; of course, the last trigger operation in the series of operations (e.g., the trigger operation on the photographing function control) may also be referred to as an open operation.
For example, taking the target application as an instant messaging application, the mobile phone may display a desktop 701 as shown in fig. 7 (a). An application icon 702 of the target application may be included in the desktop 701. In a case where the user needs to take a picture using the photographing function of the target application, the user performs a triggering operation (e.g., a clicking operation) on the application icon 702 of the target application in the desktop 701. The mobile phone may run or open the target application in response to the triggering operation, and present the main interface 703 of the target application as shown in fig. 7 (b). The main interface includes a plurality of chat records. If the user wishes to send an image of the real scene currently in front of the user to Zhang San, the user may perform a triggering operation (e.g., a clicking operation) on the chat record 704 corresponding to Zhang San among the plurality of chat records. In response to the triggering operation, the mobile phone may display a chat interface 705 corresponding to Zhang San as shown in fig. 7 (c).
An expansion control 706 may be included in the chat interface 705. The user may implement a triggering operation on the expansion control 706. In response to the trigger operation, the mobile phone may then display a function block 707 as shown in fig. 7 (d). A capture functionality control 708 may then be included in this function block 707. In the event that a user needs to take a picture, a trigger operation (e.g., a click operation) may be performed on the capture functionality control 708. In response to the triggering operation, the mobile phone may then start the camera (the rear camera and/or the front camera) to start shooting, and display a shooting interface 801 as shown in fig. 8 (a). A photographing control 802 may be included in the photographing interface 801.
In the above-mentioned user operation process, from the time when the user opens the target application to the time when the user implements the triggering operation of the photographing function control 708, the whole may be used as the opening operation when the user opens the photographing function of the target application; alternatively, the triggering operation performed by the user on the photographing function control 708 may be regarded as an opening operation by which the user opens the photographing function of the target application.
After the mobile phone displays the photographing interface of the target application, the user can complete photographing by implementing photographing operation in the photographing interface.
It should be noted that, in the application scenario corresponding to the technical problem solved by the technical solution provided by the embodiment of the present application, the flash of the camera module of the electronic device is turned off. The flash of the camera module is turned off before the subsequent step S602 is performed. The flash may be turned off by default when the photographing function of the target application is turned on; or the user may actively turn off the flash after the photographing function of the target application is started.
Based on this, in some embodiments, if the flash is turned off actively by the user, then between the time when the user opens the photographing function of the target application and the time when the user performs the photographing operation (i.e., between S601 and S602), the user also performs a closing operation on the flash in the photographing interface, and in response to the closing operation, the mobile phone may turn off the flash of the camera module so that the flash does not work in the subsequent photographing process. Illustratively, taking the photographing interface 801 shown in fig. 8 (a) as an example, a flash control 803 may be further included in the photographing interface. In the case where the flash is turned on, the icon style of the flash control 803 may be style one as shown in fig. 8 (a). If the user needs to turn off the flash, the user may perform a triggering operation on the flash control 803 in the photographing interface 801. In response to the triggering operation, the target application of the mobile phone may display a flash mode selection box 804 as shown in fig. 8 (b). The flash mode selection box 804 includes a plurality of flash-on mode options and a flash-off option 805. The plurality of flash-on modes include: an automatic mode, an always-on mode, and an on mode. In the automatic mode, the camera module automatically determines, according to the ambient light brightness during photographing, whether to fire the flash once after the user performs the photographing operation. In the always-on mode, the camera module keeps the flash on throughout the photographing process. In the on mode, the camera module fires the flash once after the user performs the photographing operation.
The mobile phone may receive and respond to the user's triggering operation on the flash-off option 805, turn off the flash of the camera module, and adjust the style of the flash control 803 to style two as shown in fig. 8 (c). Of course, if the user performs a triggering operation on an option of one of the other flash-on modes, the style of the flash control may take any other possible form.
S602, the mobile phone receives photographing operation implemented by a user in a photographing interface of the target application, and the target application of the mobile phone sends an original focusing request to a camera service in response to the photographing operation.
In a case where the user needs to take a picture, the user may perform a photographing operation in the photographing interface of the target application displayed by the mobile phone. In response to the photographing operation, the target application of the mobile phone can complete the calling or control of the camera module through the camera service of the framework layer and the camera HAL of the HAL layer in sequence, so as to realize focusing and photographing.
Taking the photographing interface 801 shown in fig. 8 (a) as an example, the photographing operation performed by the user may specifically be a click operation on the photographing control 802 in the photographing interface. In response to the clicking operation, the target application in the mobile phone can then send a corresponding focusing request to the camera service, so that the camera service instructs the camera module to focus through the camera HAL. Of course, in other possible embodiments, the photographing operation may be any other feasible operation, such as a voice command, a gesture, and the like.
In practice, in order to make the photographing effect better, the camera module of the mobile phone needs to be focused when photographing. The focusing mode adopted by the camera module during focusing can be indicated by an application calling the camera module to realize the photographing function. Based on the above, after the user performs the photographing operation on the photographing interface of the target application, the target application of the mobile phone may send an original focusing request to the camera service, where the original focusing request can request focusing by adopting a focusing mode corresponding to the target application. The focusing mode corresponding to the target application can be set by a user or by a default of the target application.
In some other embodiments, the original focus request may also be request information for requesting the camera service to instruct the camera module to focus in a focus mode corresponding to the target application.
S603, the camera service of the mobile phone receives an original focusing request from a target application, and after generating an original focusing instruction based on the original focusing request, the camera service sends the original focusing instruction corresponding to the original focusing request to a camera HAL (or called a camera hardware abstraction module).
Wherein the original focus instruction is generated by the camera service based on the original focus request.
In some embodiments, the original focus instruction is used to instruct focusing with a focus mode corresponding to the target application. In other embodiments, the original focus instruction may also be request instruction information for instructing the camera HAL to instruct the camera module to focus in a focus mode corresponding to the target application.
Because the camera service is a module in the framework layer of the mobile phone software architecture, it cannot directly call the camera module of the mobile phone hardware to work, and interaction with the camera module needs to be completed through the corresponding module of the HAL layer. Therefore, after the camera service receives the original focusing request from the target application, the camera service may convert the original focusing request into an original focusing instruction that the camera HAL can recognize, so that the camera HAL instructs the camera module to focus.
S604, the camera HAL of the mobile phone receives an original focusing instruction from a camera service.
The technical solution provided by the application mainly aims to solve the following problem: when the camera module adopts the auto-focusing mode in a dark environment and cannot focus successfully, it cannot send the focusing result in time to the target application that calls the camera module to work, so the target application cannot receive the focusing result in time and thus cannot respond to the user's photographing operation in time. To solve this problem, the camera module should not use the auto-focusing mode when focusing, but instead use a preset focusing mode with timeout protection (e.g., a continuous focusing mode). In the case where the camera module focuses using the preset focusing mode, whether focusing succeeds or fails, the focusing result is returned to the target application through the camera HAL and the camera service after focusing for a certain period of time.
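The timeout protection described above can be sketched as a bounded retry loop: the module reports a result after a bounded number of attempts whether or not focusing succeeded, so the caller is never left waiting indefinitely. This is an illustrative stand-in (attempt counts in place of wall-clock time; all names are hypothetical), not the actual module firmware.

```python
# Sketch of timeout-protected focusing: unlike plain auto-focus in the dark,
# which may never report, the preset mode always yields a result by a bound.

def focus_with_timeout(try_focus_once, max_attempts=20):
    """Attempt focusing up to max_attempts times, then report the result
    regardless of success, so the caller always receives a focusing result.

    try_focus_once() -> True on a successful focus attempt, False otherwise.
    """
    for attempt in range(1, max_attempts + 1):
        if try_focus_once():
            return {"success": True, "attempts": attempt}   # focused in time
    return {"success": False, "attempts": max_attempts}     # bound hit: report failure anyway

# a dark scene where focusing never succeeds still yields a (failure) result
print(focus_with_timeout(lambda: False))   # {'success': False, 'attempts': 20}
# a scene where focusing succeeds immediately returns early
print(focus_with_timeout(lambda: True))    # {'success': True, 'attempts': 1}
```

The key property is the guaranteed return: the failure path produces a result just as the success path does, which is exactly what lets the target application respond to the user instead of hanging.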
Based on this, in the technical scheme provided by the application, in the process of calling the camera module by the target application (specifically, the target application can call the camera module through the camera service and the camera HAL), the module involved in the middle can judge the original focusing instruction and the ambient light brightness, and then when the mobile phone is in a dim light environment (the ambient light brightness is smaller than a preset threshold value) and the original focusing instruction is used for indicating to focus by adopting the automatic focusing mode, the camera module is indicated to focus by adopting the preset focusing mode.
Because the primary role of the camera service in the framework layer is to translate the requests of the target application into instructions that the camera HAL in the HAL layer can recognize, it plays a bridging role between the upper and lower layers, while the camera HAL is the module that directly calls and controls the camera module hardware of the mobile phone. In some embodiments of the application, the logic of judging the original focusing instruction and the ambient light brightness may be executed in the camera HAL. That is, in order to solve the technical problem addressed by the embodiment of the present application, after receiving the original focusing instruction from the camera service, the camera HAL may judge the original focusing instruction and the ambient light brightness, so as to determine whether the mobile phone is currently in a dark environment and whether the original focusing instruction is used for indicating focusing in the auto-focusing mode. That is, S604, S605 and S606 are performed.
S605, the camera HAL of the mobile phone judges whether the focusing mode corresponding to the original focusing instruction is an automatic focusing mode.
In some embodiments, the original focus instruction may carry a feature parameter indicating a focus mode of the target application, based on which the camera HAL may determine what focus mode the original focus instruction corresponds to.
If the focusing mode corresponding to the target application is the auto-focusing mode, the original focusing instruction is used to indicate focusing in the auto-focusing mode, and the focusing mode corresponding to the original focusing instruction is then the auto-focusing mode. In this case, because the camera module of the mobile phone focuses in the auto-focusing mode, the camera module may be unable to focus successfully for a long time in a dark environment. Therefore, in order to avoid the camera module failing to focus because the mobile phone is in a dark environment and thus failing to return a focusing result in time, if the focusing mode corresponding to the original focusing instruction is the auto-focusing mode, the camera HAL also needs to determine whether the mobile phone is currently in a dark environment, so as to determine whether the original focusing instruction needs to be changed into a target focusing instruction for focusing in the preset focusing mode. That is, the subsequent S606 is performed.
If the focusing mode of the target application is an optional focusing mode other than the auto-focusing mode, the original focusing instruction is used to indicate focusing in that optional focusing mode, and the focusing mode corresponding to the original focusing instruction is then the optional focusing mode. Because the optional focusing mode, unlike the auto-focusing mode, does not delay returning the focusing result due to a focusing failure, the camera HAL can directly send the original focusing instruction to the camera module at this time, so that the camera module focuses according to the focusing mode corresponding to the target application. That is, the subsequent S612 is performed.
Note that, in practice, an explicit judging step like S605 may not be performed as such; it suffices that S606 is performed when the focusing mode corresponding to the original focusing instruction is the auto-focusing mode, and S612 is performed when it is not.
S606, the camera HAL of the mobile phone judges whether the mobile phone is in a dim light environment.
If the camera HAL of the mobile phone determines that the mobile phone is in a dark environment, it can overwrite the original focusing instruction as a target focusing instruction so that the camera module can return a focusing result in time. The target focusing instruction is used to instruct the camera module to focus in the preset focusing mode. In the preset focusing mode, the camera module returns a focusing result to the camera HAL after focusing for a certain period of time (e.g., 2 s), regardless of whether focusing succeeds or fails. The preset focusing mode may be, for example, the continuous focusing mode. After that, the camera HAL of the mobile phone can send the target focusing instruction to the camera module, so that the camera module returns a first focusing result to the camera HAL after focusing for a certain period of time; the camera HAL can then return the first focusing result to the target application through the camera service, and the target application can, according to its set logic, take a corresponding action based on the focusing result as a response to the photographing operation performed by the user.
That is, after S606 is executed, if the camera HAL of the mobile phone determines that the focusing mode corresponding to the original focusing instruction is the auto-focusing mode and the mobile phone is in the dim light environment, the following S607-S611 are executed.
If the camera HAL of the mobile phone determines that the mobile phone is not in a dim light environment, then whether the camera module of the mobile phone focuses in the auto-focusing mode or another focusing mode, it can focus quickly and successfully because the ambient light is sufficiently bright, and it returns a second focusing result indicating successful focusing to the camera HAL. The camera HAL can then return the second focusing result to the target application via the camera service, so that the target application can, according to its set processing logic, obtain a photographing result based on the second focusing result and display the photographing result to the user. That is, after S606 is executed, if the camera HAL of the mobile phone determines that the focusing mode corresponding to the original focusing instruction is the auto-focusing mode but the mobile phone is not in a dim light environment, S612-S616 are executed.
Note that, in practice, a determination step like S606 may be omitted: S607 may be performed when the mobile phone is in a dim light environment, and S612 may be performed when it is not.
In addition, the order of S605 and S606 is not specifically limited in the present application. The camera HAL may first determine whether the mobile phone is in a dim light environment and then, if it is, determine whether the focusing mode corresponding to the original focusing instruction is the auto-focusing mode; alternatively, the camera HAL may first determine whether the focusing mode corresponding to the original focusing instruction is the auto-focusing mode and then, if it is, determine whether the mobile phone is in a dim light environment.
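The combined determination of S605 and S606 can be summarized in a short sketch. This is a hypothetical illustration, not code from the patent; the mode names and function name are invented for clarity.

```python
# Hypothetical sketch of the S605/S606 decision logic in the camera HAL.
# Mode names are invented; the patent's "preset focusing mode" is taken
# here to be continuous focus, as the text suggests by example.
AUTO_FOCUS = "auto"
CONTINUOUS_FOCUS = "continuous"

def select_focus_instruction(original_mode: str, is_dim_light: bool) -> str:
    """Return the focus mode the camera module should actually be instructed with.

    The instruction is rewritten only when the target application asked for
    auto focus AND the phone is in a dim light environment (S607); in every
    other case the original instruction is forwarded unchanged (S612).
    """
    if original_mode == AUTO_FOCUS and is_dim_light:
        return CONTINUOUS_FOCUS  # target focusing instruction
    return original_mode         # original focusing instruction
```

As the text notes, the two checks may be performed in either order; the sketch's single `and` expresses that only their conjunction triggers the rewrite.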
In some embodiments, the camera HAL of the mobile phone may determine whether the mobile phone is in a dim light environment with the help of the sensor HAL. Specifically, the camera HAL may obtain the ambient light level from the sensor HAL and determine whether the mobile phone is in a dim light environment based on the ambient light level. Based on this, in some embodiments, referring to fig. 9, S606 may specifically include S901-S906:
S901, the camera HAL of the mobile phone sends an ambient light level request to the sensor HAL.
Wherein the ambient light level request is for requesting ambient light level detected by the ambient light sensor.
S902, the sensor HAL of the mobile phone receives an ambient light level request from the camera HAL, and obtains the ambient light level detected by the ambient light sensor in response to the ambient light level request.
In some possible implementations, the sensor HAL may detect ambient light by invoking the ambient light sensor driver to drive the ambient light sensor, thereby obtaining the ambient light level. The manner in which the sensor HAL invokes the ambient light sensor driver may be any feasible manner; for example, the sensor HAL registers a callback function with the ambient light sensor driver so that the driver returns the detection result to the sensor HAL when it drives the ambient light sensor to detect the ambient light.
S903, the sensor HAL of the mobile phone transmits the ambient light level to the camera HAL.
After the sensor HAL of the mobile phone acquires the ambient light level, it can return the ambient light level to the camera HAL so that the camera HAL can determine, based on it, whether the mobile phone is in a dim light environment. That is, S904 is performed after S903.
S904, the camera HAL of the mobile phone judges whether the ambient light brightness is greater than a preset brightness value.
In some embodiments, the ambient light brightness may be represented by luxIndex, where a greater luxIndex represents a darker environment. For example, the preset brightness value may be 450.
Based on this, in the case that the ambient light brightness is greater than the preset brightness value, the camera HAL of the mobile phone may determine that the mobile phone is in a dim light environment, i.e., perform S905; in the case that the ambient light brightness is less than the preset brightness value, the camera HAL of the mobile phone may determine that the mobile phone is not in a dim light environment, i.e., perform S906.
In the embodiment of the present application, the case where the ambient light brightness is equal to the preset brightness value may be attributed either to the case where it is less than the preset brightness value or to the case where it is greater than the preset brightness value. The present application merely takes attributing the equal-to case to the less-than case as an example, which does not limit the actual implementation.
S905, determining that the mobile phone is in a dim light environment by the camera HAL of the mobile phone.
S906, the camera HAL of the mobile phone determines that the mobile phone is not in a dim light environment.
It should be noted that a determination step like S904 may not exist in practice; the camera HAL of the mobile phone may directly determine that the mobile phone is in a dim light environment when the ambient light brightness is greater than the preset brightness value, and determine that it is not in a dim light environment when the ambient light brightness is less than the preset brightness value.
Based on the technical scheme corresponding to S901-S906, the camera HAL of the mobile phone can accurately determine whether the mobile phone is currently in a dim light environment, which provides data support for deciding whether to rewrite the original focusing instruction and for the execution of the subsequent steps, ensuring smooth implementation of the focusing method provided by the present application.
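The dim-light check of S904-S906 reduces to a single threshold comparison. The following is a hypothetical sketch, not the patent's implementation; it assumes the luxIndex convention described above, where a larger value means a darker scene, and uses the example preset value of 450.

```python
# Sketch of the S904-S906 dim-light determination (hypothetical).
# Per the document's luxIndex convention, a LARGER index means a DARKER
# environment, so "brightness greater than the preset value" means dim.
PRESET_BRIGHTNESS_VALUE = 450  # example preset value from the text

def is_dim_light(lux_index: int, threshold: int = PRESET_BRIGHTNESS_VALUE) -> bool:
    # The equal-to case is attributed here to "not dim", matching the
    # example choice in the document; an implementation may choose otherwise.
    return lux_index > threshold
```

A sensor HAL callback would feed `lux_index` into this check before the camera HAL decides whether to rewrite the original focusing instruction.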
S607, the camera HAL of the mobile phone overwrites the original focusing instruction as a target focusing instruction and sends the target focusing instruction to the camera module.
The target focusing instruction is used for instructing the camera module to focus in a preset focusing mode, which may be, for example, a continuous focusing mode. In the preset focusing mode, the camera module returns a focusing result to the camera HAL within a certain focusing duration (for example, 2 s), no matter whether focusing succeeds or fails. Therefore, the camera HAL can return the focusing result to the target application through the camera service in time, and the target application can, according to its set processing logic, make a corresponding action based on the focusing result as a response to the photographing operation performed by the user. That is, the subsequent S608-S611 are performed.
S608, the camera module of the mobile phone receives a target focusing instruction from the camera HAL, responds to the target focusing instruction, focuses in a preset focusing mode, and sends a first focusing result to the camera HAL within a preset time after focusing starts.
The preset duration may be, for example, 2 s. The first focusing result is used for indicating whether focusing succeeded or failed.
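The essential contrast between the two modes in S608 is that auto focus may loop indefinitely on failure while the preset mode always reports within a bounded effort. The sketch below is purely illustrative (all names invented); a count of attempts stands in for the document's roughly 2 s time budget.

```python
# Hypothetical contrast between the auto-focus loop and the preset
# focusing mode of S608. An attempt count stands in for the ~2 s limit.

def focus_auto(try_focus, max_loops=100_000):
    """Loop until focus succeeds; in dim light this may never produce a
    timely result, which is the symptom the rewritten instruction avoids."""
    for _ in range(max_loops):
        if try_focus():
            return "success"
    return None  # no focusing result was ever reported upward

def focus_preset(try_focus, attempts=3):
    """Always report success OR failure after a bounded effort, so the
    camera HAL receives a first focusing result within the preset duration."""
    for _ in range(attempts):
        if try_focus():
            return "success"
    return "failure"  # still a result, and it arrives in time
```

The key property is that `focus_preset` never returns `None`: even a failure is a reportable first focusing result.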
S609, the camera HAL of the mobile phone receives the first focusing result from the camera module and sends the first focusing result to the camera service.
Here, the camera HAL needs to convert the data format of the first focusing result into a data format that can be recognized by the camera service, and then send the first focusing result to the camera service.
S610, the camera service of the mobile phone receives the first focusing result from the camera HAL and sends the first focusing result to the target application.
S611, the target application of the mobile phone receives a first focusing result from the camera service, and executes a response operation for the photographing operation based on the first focusing result.
In one possible implementation, if the first focusing result indicates that focusing succeeded, then after receiving the first focusing result, the target application of the mobile phone may acquire an original image from the camera module through the camera service and the camera HAL and process it to obtain a target image (i.e., the photo obtained by photographing, or photographing result) that can finally be presented to the user. This whole flow of obtaining the original image from the camera module through the camera service and the camera HAL, processing the original image into the target image, and presenting the target image to the user is the response operation.
Illustratively, in such a possible implementation, referring to fig. 10 in conjunction with fig. 6, S611 may specifically include S1001-S1007:
S1001, the target application of the mobile phone receives a first focusing result from the camera service.
The first focusing result is used for indicating successful focusing.
S1002, a target application of the mobile phone sends a photographing request to a camera HAL through a camera service.
Wherein the photographing request is used for requesting the camera HAL to acquire an image from the camera module.
In some embodiments, the target application of the handset may first send a photographing request to the camera service, so that the camera service converts the photographing request into content that the camera HAL can recognize and then forwards the content to the camera HAL.
S1003, the camera HAL receives a photographing request, and acquires an original image from the camera module in response to the photographing request.
S1004, the camera module of the mobile phone returns the original image to the camera HAL.
S1005, processing the original image by a camera HAL of the mobile phone to obtain a target image.
S1006, the camera HAL of the mobile phone returns the target image to the target application through the camera service.
In some embodiments, the camera HAL of the handset may first send the target image to the camera service, so that the camera service converts the target image into content recognizable by the target application (e.g., into image data in JPEG format) and then forwards the content to the target application.
S1007, displaying a film preview interface by the target application of the mobile phone, and displaying a target image in the film preview interface.
Illustratively, the film preview interface may be as shown in fig. 2 (b).
Specific implementations of S1003-S1007 may refer to the related expressions of step 7-step 11 in the foregoing existing photographing procedure, and will not be described herein.
Based on the technical scheme corresponding to S1001-S1007, in the case of successful focusing, the mobile phone can display the photo (i.e., the target image) for the user to view in time after the user performs the photographing operation, improving the user experience.
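The S1002-S1007 capture flow can be sketched as a small pipeline. This is a hypothetical illustration, not the patent's code; all function names are invented, and the camera module, processing step, and preview display are represented by caller-supplied callables.

```python
# Minimal sketch of the S1002-S1007 response operation (hypothetical names):
# on a successful first focusing result, fetch the original image from the
# camera module, process it into the target image, and display it on the
# film preview interface.

def handle_focus_success(result, capture_raw, process, display):
    if result != "success":
        return None            # not this branch; see the failure handling below
    raw = capture_raw()        # S1003-S1004: original image from the camera module
    target = process(raw)      # S1005: e.g. convert into displayable JPEG data
    display(target)            # S1006-S1007: show on the film preview interface
    return target              # the photographing result presented to the user
```

The conversion performed by the camera service (for example, into JPEG data the target application can recognize) is folded into the `process` step here for brevity.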
In another possible implementation, if the first focusing result indicates focusing failure, the target application of the mobile phone may output prompt information in the photographing interface after receiving the first focusing result. On the one hand, the prompt information can tell the user that focusing failed; on the other hand, it can guide the user to turn on the flash, decide whether to refocus, or confirm whether to acquire a photographing result despite the focusing failure. After receiving the user's corresponding operation on the prompt information, the target application of the mobile phone can execute the corresponding operation in a targeted manner.
For example, if the user performs the operation of turning on the flash, the target application of the mobile phone may send a target focus request to the camera service. The target focus request is used for indicating that focusing should be performed in the preset focusing mode. After receiving the target focus request, the camera service may send a target focusing instruction to the camera HAL, after which the mobile phone executes S608-S611. This time, because the flash is on, the first focusing result of the camera module indicates successful focusing. The mobile phone can then execute S1001-S1007 so that the target application can present the final target image to the user.
For another example, if the user performs an operation to determine refocusing, the target application of the handset may send a target focus request to the camera service. The target focusing request is used for indicating focusing by adopting a preset focusing mode. After receiving the target focus request, the camera service may then send a target focus instruction to the camera HAL. Thereafter, the handset may then execute S608-S611.
For another example, if the user performs the operation of acquiring a photographing result despite the focusing failure, the mobile phone may execute S1002-S1007. In this case, because focusing was unsuccessful, the target application may finally show the user a blurred target image.
Of course, in practice, if the first focusing result indicates that focusing fails, the response made by the target application of the mobile phone may be any other feasible response that can be perceived by the user, which is not particularly limited by the present application.
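The three example responses to a focusing failure described above can be summarized in a small dispatch table. This is a hypothetical sketch with invented names; the strings merely name the step sequences the document associates with each user choice.

```python
# Hypothetical dispatch for the focus-failure prompt in S611's failure branch.
# Each user choice maps to the step sequence described in the document.

def on_focus_failure(user_choice: str) -> str:
    actions = {
        "flash":   "target focus request, then S608-S611 and S1001-S1007",
        "refocus": "target focus request, then S608-S611",
        "accept":  "S1002-S1007; the final target image may be blurred",
    }
    # Any other feasible, user-perceptible response is also permitted.
    return actions.get(user_choice, "other feasible response")
```

The table form makes it easy to add further responses, matching the document's note that any other feasible, user-perceptible response is acceptable.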
S612, the camera HAL of the mobile phone sends an original focusing instruction to the camera module.
The original focusing instruction is used for instructing the camera module to focus in the focusing mode corresponding to the target application.
S613, the camera module of the mobile phone receives an original focusing instruction from the camera HAL, responds to the original focusing instruction, focuses by adopting a focusing mode corresponding to the target application, and sends a second focusing result to the camera HAL after the second focusing result is obtained.
Specifically, when the camera module adopts a focusing mode corresponding to the target application to focus, a focusing result can be obtained according to the corresponding focusing logic and returned to the camera HAL.
In some embodiments, if the focusing mode corresponding to the target application is the auto-focusing mode, the camera module of the mobile phone will focus in the auto-focusing mode. When focusing in the auto-focusing mode, the camera module focuses in a loop until focusing succeeds, thereby obtaining the second focusing result.
The second focusing result is used for indicating successful focusing or failed focusing.
In one possible scenario, when S613 is executed, the environment in which the mobile phone is located is not a dim light environment, so even if the camera module performs cyclic focusing in the auto-focusing mode, focusing succeeds quickly and the second focusing result is obtained. Therefore, in this case, the camera module of the mobile phone can send the second focusing result to the camera HAL in a short time, and the subsequent target application will not be unable to respond to the photographing operation performed by the user for lack of a timely focusing result.
In another possible scenario, when S613 is executed, the focusing mode corresponding to the target application is not the auto-focusing mode, so the camera module will not keep focusing in a loop on focusing failure but can obtain a focusing result as soon as possible. The camera HAL can thus obtain the focusing result in time, and the subsequent target application can make a response operation to the user's photographing operation based on the focusing result in time.
S614, the camera HAL of the mobile phone receives the second focusing result from the camera module and sends the second focusing result to the camera service.
It should be noted that, here, the camera HAL needs to convert the data format of the second focusing result into a data format that can be recognized by the camera service, and then send the second focusing result to the camera service.
S615, the camera service of the mobile phone receives the second focusing result from the camera HAL and sends the second focusing result to the target application.
S616, the target application of the mobile phone receives a second focusing result from the camera service and executes a response operation for the photographing operation based on the second focusing result.
The specific implementation of S616 may refer to the specific implementation of S611 in the foregoing embodiment, which is not described herein.
In the embodiment of the present application, no matter what focusing mode is adopted by the camera module of the mobile phone for focusing, focusing can be performed based on an Auto Focus (AF) algorithm.
Based on the technical scheme provided by the embodiment of the present application, after receiving a photographing operation performed by the user in the photographing interface of the target application, the mobile phone can generate an original focusing instruction. The mobile phone can then judge whether the focusing mode corresponding to the original focusing instruction is the auto-focusing mode and whether the mobile phone is in a dim light environment. When it determines that the focusing mode corresponding to the original focusing instruction is the auto-focusing mode and that the current environment is a dim light environment, it can rewrite the original focusing instruction into a target focusing instruction for focusing in a preset focusing mode (such as a continuous focusing mode). The mobile phone can then focus based on the focusing mode corresponding to the target focusing instruction, thereby obtaining a focusing result (i.e., the first focusing result) in time. The focusing result is used for indicating whether focusing succeeded or failed. The mobile phone can then perform corresponding processing based on the focusing result, so that a response or feedback is given to the user's photographing operation in time. This avoids the problem in the prior art that, when photographing with the target application in a dim light environment, the user gets no response from the mobile phone for a long time after performing the photographing operation, resulting in a large photographing delay; the mobile phone can thus respond to the user's photographing operation in time, improving the user experience.
In order to facilitate understanding, a focusing method provided by the embodiment of the present application is described below with reference to fig. 11. As shown in fig. 11, the method may include S1101-S1103:
S1101, the electronic device generates an original focusing instruction in response to a photographing operation performed by a user on a photographing interface of the target application.
The original focusing instruction is used for indicating to focus in a focusing mode corresponding to the target application.
In some embodiments, S1101 may specifically include: the method comprises the steps that a target application of the electronic equipment responds to photographing operation implemented by a user on a photographing interface of the target application, and an original focusing request is sent to camera service of the electronic equipment; the original focusing request is used for requesting focusing by adopting a focusing mode corresponding to the target application; the camera service of the electronic device receives the original focus request from the target application and generates an original focus instruction based on the original focus request.
The specific implementation of S1101 may refer to the related expressions of S601-S603 in the foregoing embodiments, which are not repeated here.
S1102, when the focusing mode corresponding to the original focusing instruction is an automatic focusing mode and the electronic equipment is in a dark light environment, the electronic equipment converts the original focusing instruction into a target focusing instruction.
The target focusing instruction is used for indicating to focus in a preset focusing mode; in the automatic focusing mode, the camera module of the electronic equipment carries out cyclic focusing, and a focusing result indicating successful focusing is obtained under the condition of successful focusing; under a preset focusing mode, the camera module of the electronic equipment obtains a focusing result within a preset time period after focusing starts. The preset focus mode may be a continuous focus mode, for example.
In some embodiments, the manner in which the electronic device determines whether itself is in a dim light environment may be: the electronic equipment acquires the ambient light brightness detected by the ambient light sensor; and under the condition that the ambient light brightness is larger than a preset brightness value, the electronic equipment determines that the electronic equipment is in a dark light environment. In this embodiment, the specific implementation of the electronic device to determine whether the electronic device is in the dark environment may refer to the relevant expressions S901-S906 in the foregoing embodiment, which are not repeated herein.
In some embodiments, S1102 may include: the camera service of the electronic equipment sends an original focusing instruction to a camera hardware abstraction module of the electronic equipment; a camera hardware abstraction module of the electronic equipment receives an original focusing instruction from a camera service; the camera hardware abstraction module of the electronic equipment converts the original focusing instruction into a target focusing instruction under the condition that the focusing mode corresponding to the original focusing instruction is determined to be an automatic focusing mode and the electronic equipment is determined to be in a dark light environment.
The specific implementation of S1102 may refer to the relevant expressions of S604-S607 in the foregoing embodiments, which are not repeated here.
S1103, the electronic device controls the camera module to focus in a preset focusing mode based on the target focusing instruction so as to obtain a first focusing result within a preset duration, and executes response operation for photographing operation based on the first focusing result.
For example, the specific implementation of S1103 may refer to the relevant expressions of S608-S611 in the foregoing embodiments, which are not repeated herein.
In some embodiments, S1103 may include: under the condition that the first focusing result is used for indicating successful focusing, the electronic equipment acquires a first original image from the camera module; the electronic equipment obtains a first target image based on the first original image; the electronic device displays a first target image on a tile preview interface of the target application. In this embodiment, the specific implementation of S1103 may refer to the relevant expressions of S1001 to S1007 in the foregoing embodiment, which are not described here again.
It should be noted that, when the focusing method provided by the embodiment of the present application is performed, the flash of the camera module of the electronic device is in an off state.
Based on the technical scheme corresponding to S1101-S1103, after receiving a photographing operation performed by the user in the photographing interface of the target application, the electronic device can generate an original focusing instruction. The electronic device can then judge whether the focusing mode corresponding to the original focusing instruction is the auto-focusing mode and whether the electronic device is in a dim light environment. When it determines that the focusing mode corresponding to the original focusing instruction is the auto-focusing mode and that the current environment is a dim light environment, it can rewrite the original focusing instruction into a target focusing instruction for focusing in a preset focusing mode (such as a continuous focusing mode). The electronic device can then focus based on the focusing mode corresponding to the target focusing instruction, thereby obtaining a focusing result (i.e., the first focusing result) in time. The focusing result is used for indicating whether focusing succeeded or failed. The electronic device can then perform corresponding processing based on the focusing result, so that a response or feedback is given to the user's photographing operation in time. This avoids the problem in the prior art that, when photographing with the target application in a dim light environment, the user gets no response from the electronic device for a long time after performing the photographing operation, resulting in a large photographing delay; the electronic device can thus respond to the user's photographing operation in time, improving the user experience.
In some embodiments, when the focusing mode corresponding to the original focusing instruction is not an automatic focusing mode or the electronic device is not in a dark environment, the electronic device controls the camera module to focus by adopting the focusing mode corresponding to the target application based on the original focusing instruction, and after a second focusing result is obtained, response operation for photographing operation is executed based on the second focusing result.
For example, the specific implementation of this embodiment may refer to the relevant expressions S612 to S616 in the foregoing embodiment, which are not described herein.
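The full S1101-S1103 flow, including the fallback branch just described, can be condensed into a compact end-to-end sketch. All names here are hypothetical; the dim-light threshold of 450 is the document's example value, and the actual focusing is represented by a caller-supplied callable.

```python
# Compact end-to-end sketch of S1101-S1103 plus the fallback branch
# (hypothetical names). `focus_with(mode)` stands in for the camera
# module focusing in the given mode and returning a focusing result.

def handle_shutter(app_mode, lux_index, focus_with):
    is_dim = lux_index > 450          # dim-light check (example threshold)
    if app_mode == "auto" and is_dim:
        mode = "continuous"           # rewritten target focusing instruction
    else:
        mode = app_mode               # original focusing instruction forwarded
    return mode, focus_with(mode)     # result drives the response operation
```

Usage: in the dim auto-focus case the returned mode is the rewritten one, so the focusing result is guaranteed to arrive within the preset duration; in all other cases the application's own mode is used unchanged.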
It will be appreciated that, in order to achieve the above-mentioned functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the application can divide the functional modules of the electronic device according to the method example, for example, each functional module can be divided corresponding to each function, or two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
In the case of dividing each functional module by corresponding each function, referring to fig. 12, an embodiment of the present application provides an electronic device, including: an acquisition module 1201, a processing module 1202 and a control module 1203. The acquiring module 1201 is configured to generate an original focusing instruction in response to a photographing operation performed by a user on a photographing interface of a target application; the original focusing instruction is used for indicating to focus by adopting a focusing mode corresponding to the target application. The processing module 1202 is configured to convert the original focus instruction into a target focus instruction when the focus mode corresponding to the original focus instruction generated by the obtaining module 1201 is an auto focus mode and the electronic device is in a dark environment; the target focusing instruction is used for indicating to focus in a preset focusing mode; in the automatic focusing mode, the camera module of the electronic equipment carries out cyclic focusing, and a focusing result indicating successful focusing is obtained under the condition of successful focusing; under a preset focusing mode, the camera module of the electronic equipment obtains a focusing result within a preset time period after focusing starts. The control module 1203 is configured to control the camera module to focus in a preset focusing mode based on the target focusing instruction converted by the processing module 1202, so as to obtain a first focusing result within a preset duration, and execute a response operation for the photographing operation based on the first focusing result.
Optionally, the processing module 1202 is further configured to: acquiring the ambient light brightness detected by an ambient light sensor; and under the condition that the ambient light brightness is larger than a preset brightness value, determining that the electronic equipment is in a dark light environment.
Optionally, the electronic device further includes a display module 1204; the control module 1203 is specifically configured to: under the condition that the first focusing result is used for indicating successful focusing, the electronic equipment acquires a first original image from the camera module; obtaining a first target image based on the first original image; the control display module 1204 displays the first target image on a film preview interface of the target application.
Optionally, the processing module 1202 is further configured to, when the focus mode corresponding to the original focus instruction generated by the obtaining module 1201 is not an auto focus mode or the electronic device is not in a dark environment, instruct the control module 1203 to control the camera module to focus in a focus mode corresponding to the target application based on the original focus instruction, and execute a response operation for the photographing operation based on the second focus result after the second focus result is obtained.
Optionally, the acquisition module 1201 includes a first unit 12011 and a second unit 12012. The first unit 12011 is configured to send an original focusing request to a camera service of the electronic device in response to a photographing operation performed by a user on a photographing interface of the target application; the original focusing request is used for requesting focusing by adopting a focusing mode corresponding to the target application. The second unit 12012 is configured to receive the original focus request from the first unit 12011, and generate an original focus instruction based on the original focus request.
Optionally, the processing module 1202 includes a third unit 12021 and a fourth unit 12022. The third unit 12021 is configured to send an original focusing instruction to a camera hardware abstraction module of the electronic device. The fourth unit 12022 is configured to receive an original focusing instruction from the third unit 12021; the fourth unit 12022 is further configured to convert the original focus instruction into the target focus instruction when it is determined that the focus mode corresponding to the original focus instruction is an auto focus mode and it is determined that the electronic device is in a dark environment.
With respect to the electronic device in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the foregoing embodiments of the focusing method, and is not repeated here. Likewise, for the beneficial effects of the electronic device, reference may be made to the beneficial effects of the focusing method, which are also not repeated here.
An embodiment of the present application further provides an electronic device, including: a display screen, a memory, and one or more processors, where the display screen and the memory are coupled to the processor. The memory stores computer program code including computer instructions that, when executed by the processor, cause the electronic device to perform the focusing method provided by the foregoing embodiments. For the specific structure of the electronic device, reference may be made to the structure of the electronic device shown in fig. 4.
Embodiments of the present application also provide a computer-readable storage medium including computer instructions that, when executed on an electronic device, cause the electronic device to perform a focusing method as provided by the foregoing embodiments.
Embodiments of the present application also provide a computer program product containing executable instructions that, when run on an electronic device, cause the electronic device to perform a focusing method as provided by the previous embodiments.
An embodiment of the present application further provides a chip system. As shown in fig. 13, the chip system 1300 includes at least one processor 1301 and at least one interface circuit 1302, which may be interconnected by wires. For example, the interface circuit 1302 may be used to receive signals from other devices (e.g., a memory of the electronic device). For another example, the interface circuit 1302 may be used to send signals to other devices (e.g., the processor 1301).
Illustratively, the interface circuit 1302 may read instructions stored in the memory and send the instructions to the processor 1301. The instructions, when executed by the processor 1301, may cause the electronic device to perform the steps of the embodiments described above. Of course, the chip system may also include other discrete devices, which are not specifically limited in the embodiments of the present application.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the functional modules described above is illustrated. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other manners. For example, the apparatus/device embodiments described above are merely illustrative: the division into modules or units is merely a logical function division, and there may be other divisions in actual implementation; for instance, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses or units, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or multiple physical units; they may be located in one place or distributed across multiple different places. Some or all of the units may be selected according to actual needs to achieve the purposes of the solutions of the embodiments.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing descriptions are merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any change or substitution within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A focusing method, characterized in that it is applied to an electronic device, the electronic device including a target application having a photographing function, the method comprising:
the electronic equipment responds to photographing operation implemented by a user on a photographing interface of the target application, and an original focusing instruction is generated; the original focusing instruction is used for indicating to focus in a focusing mode corresponding to the target application;
when the focusing mode corresponding to the original focusing instruction is an automatic focusing mode and the electronic equipment is in a dark light environment, the electronic equipment converts the original focusing instruction into a target focusing instruction; the target focusing instruction is used for indicating to focus in a preset focusing mode; in the automatic focusing mode, the camera module of the electronic equipment carries out cyclic focusing, and a focusing result indicating successful focusing is obtained under the condition of successful focusing; under the preset focusing mode, a camera module of the electronic equipment obtains a focusing result within a preset time length after focusing starts;
and the electronic equipment controls the camera module to focus in the preset focusing mode based on the target focusing instruction so as to obtain a first focusing result within the preset time length, and executes a response operation for the photographing operation based on the first focusing result.
2. The method according to claim 1, wherein the method further comprises:
the electronic equipment acquires the ambient light brightness detected by the ambient light sensor;
and under the condition that the ambient light brightness is smaller than a preset brightness value, the electronic equipment determines that the electronic equipment is in a dark light environment.
3. The method of claim 1, wherein the electronic equipment performing the response operation for the photographing operation based on the first focusing result comprises:
under the condition that the first focusing result is used for indicating successful focusing, the electronic equipment acquires a first original image from the camera module;
the electronic equipment obtains a first target image based on the first original image;
and the electronic equipment displays the first target image on a film preview interface of the target application.
4. The method of claim 1, wherein when the focusing mode corresponding to the original focusing instruction is not the automatic focusing mode or the electronic equipment is not in a dark light environment, the electronic equipment controls the camera module to focus in the focusing mode corresponding to the target application based on the original focusing instruction, and after a second focusing result is obtained, executes a response operation for the photographing operation based on the second focusing result.
5. The method of any one of claims 1-4, wherein the electronic equipment generating the original focusing instruction in response to the photographing operation performed by the user on the photographing interface of the target application comprises:
the method comprises the steps that a target application of the electronic equipment responds to photographing operation implemented by a user on a photographing interface of the target application, and an original focusing request is sent to a camera service of the electronic equipment; the original focusing request is used for requesting focusing by adopting a focusing mode corresponding to the target application;
the camera service of the electronic device receives the original focus request from the target application and generates an original focus instruction based on the original focus request.
6. The method of claim 5, wherein the electronic equipment converting the original focusing instruction into a target focusing instruction when the focusing mode corresponding to the original focusing instruction is an automatic focusing mode and the electronic equipment is in a dark light environment comprises:
the camera service of the electronic equipment sends the original focusing instruction to a camera hardware abstraction module of the electronic equipment;
a camera hardware abstraction module of the electronic device receives the original focusing instruction from the camera service;
and the camera hardware abstraction module of the electronic equipment converts the original focusing instruction into a target focusing instruction under the condition that the focusing mode corresponding to the original focusing instruction is determined to be an automatic focusing mode and the electronic equipment is determined to be in a dark light environment.
7. The method of any one of claims 1-4, wherein the preset focus mode is a continuous focus mode.
8. The method of any of claims 1-4, wherein a flash of a camera module of the electronic device is in an off state.
9. An electronic device comprising a display screen, a memory, and one or more processors; the display screen and the memory are coupled with the processor; wherein the memory has stored therein computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the focusing method according to any of claims 1-8.
10. A computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the focusing method of any one of claims 1-8.
CN202311147338.5A 2023-09-07 2023-09-07 Focusing method, electronic equipment and storage medium Active CN116887047B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311147338.5A CN116887047B (en) 2023-09-07 2023-09-07 Focusing method, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN116887047A true CN116887047A (en) 2023-10-13
CN116887047B CN116887047B (en) 2024-03-29

Family

ID=88272109

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311147338.5A Active CN116887047B (en) 2023-09-07 2023-09-07 Focusing method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116887047B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63138317A (en) * 1986-11-29 1988-06-10 Kyocera Corp Focus adjusting device for automatic focusing type single lens reflex camera
JPS63172227A (en) * 1987-01-12 1988-07-15 Canon Inc Auto-focusing device
US20050104993A1 (en) * 2003-09-30 2005-05-19 Hisayuki Matsumoto Auto focusing device for camera and method used in auto focusing device for camera for determining whether or not to emit auxiliary light
US20080049137A1 (en) * 2006-08-28 2008-02-28 Fujifilm Corporation Imaging device capable of reducing power consumption
CN103428416A (en) * 2012-05-18 2013-12-04 华为终端有限公司 Automatic terminal focusing mode switching method and terminal
CN107124545A (en) * 2012-05-18 2017-09-01 华为终端有限公司 A kind of method and terminal of automatic switchover terminal focal modes


Similar Documents

Publication Publication Date Title
EP3893491A1 (en) Method for photographing the moon and electronic device
CN113132620B (en) Image shooting method and related device
CN112231025B (en) UI component display method and electronic equipment
CN114095666B (en) Photographing method, electronic device, and computer-readable storage medium
CN116996762B (en) Automatic exposure method, electronic equipment and computer readable storage medium
US11816494B2 (en) Foreground element display method and electronic device
CN113630558B (en) Camera exposure method and electronic equipment
CN113703894A (en) Display method and display device of notification message
CN113723397B (en) Screen capturing method and electronic equipment
CN114546969A (en) File sharing method and device and electronic equipment
CN116055856B (en) Camera interface display method, electronic device, and computer-readable storage medium
CN116887047B (en) Focusing method, electronic equipment and storage medium
CN115017498B (en) Method for operating applet and electronic device
CN113590346B (en) Method and electronic equipment for processing service request
CN113891008B (en) Exposure intensity adjusting method and related equipment
CN115550556B (en) Exposure intensity adjusting method and related device
CN115359105A (en) Depth-of-field extended image generation method, depth-of-field extended image generation device, and storage medium
CN116156044A (en) Equipment cooperation method and related device
CN115686182A (en) Processing method of augmented reality video and electronic equipment
CN116723382B (en) Shooting method and related equipment
CN116723410B (en) Method and device for adjusting frame interval
CN116055872B (en) Image acquisition method, electronic device, and computer-readable storage medium
CN116048323B (en) Image processing method and electronic equipment
CN114245011B (en) Image processing method, user interface and electronic equipment
CN116347217A (en) Image processing method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant