WO2023160224A9 - Photographing method and related device

Photographing method and related device

Info

Publication number
WO2023160224A9
WO2023160224A9 (PCT application PCT/CN2022/142907)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
state
camera
application
ideal
Prior art date
Application number
PCT/CN2022/142907
Other languages
English (en)
Chinese (zh)
Other versions
WO2023160224A1 (fr)
Inventor
林梦然
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Publication of WO2023160224A1
Publication of WO2023160224A9

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951: Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N 5/00: Details of television systems
    • H04N 5/14: Picture signal circuitry for video frequency region

Definitions

  • the present application relates to the field of terminal technology, and in particular, to a photographing method and related equipment.
  • Shutter lag, also known as shutter delay, refers to the time from when the shutter is pressed to when the photo is taken.
  • Shutter delay reflects the processing speed of the electronic device during the period from when the shutter is pressed to when shooting is completed. When the shutter delay is too long, the terminal responds more slowly, which degrades the user's shooting experience.
  • This application provides a shooting method and related equipment.
  • the electronic device can acquire image frames based on the ideal shooting parameters.
  • the ideal shooting parameters are the shooting parameters obtained by the electronic device when the 3A algorithm has converged. In this way, the electronic device does not need to wait until the 3A state changes from the ideal parameter state to the parameter locked state before acquiring image frames based on the ideal shooting parameters. In other words, the electronic device can obtain the photographed image faster.
  • This shooting method shortens the shutter delay and improves the response speed of electronic devices. At the same time, it also improves the user's photography experience.
  • this application provides a photographing method.
  • the method includes: an electronic device displays a first interface, where the first interface includes a shutter control and is a photographing interface of a camera application; in response to a first operation acting on the shutter control, the electronic device executes the 3A algorithm to adjust the shooting parameters and updates the 3A state, where the 3A state is used to represent the execution of the 3A algorithm; if the 3A state is an ideal parameter state and the electronic device is in a stable state, the electronic device acquires image frames based on ideal shooting parameters, where the ideal parameter state indicates that the 3A algorithm has converged, and the ideal shooting parameters are the shooting parameters obtained by the electronic device when the 3A algorithm converges.
  • In this way, the electronic device does not need to wait until the 3A state changes from the ideal parameter state to the parameter locked state before acquiring image frames based on the ideal shooting parameters.
  • That is, as long as the 3A state is the ideal parameter state and the electronic device is in a stable state, image frames can be obtained based on the ideal shooting parameters, i.e., the shooting parameters obtained by the electronic device when the 3A algorithm has converged. This also means that the electronic device can obtain photographed images faster.
  • The above-mentioned shooting method shortens the shutter delay, improves the response speed of the electronic device, and also improves the user's photography experience.
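  • The following sketch summarizes the state machine implied above. It is an editorial illustration: the enum restates the three 3A states named in this application, and the identifier names are not the patent's own terminology.

    /** Illustrative summary of the 3A states discussed above (names editorial). */
    enum ThreeAState {
        PARAMETER_ADJUSTMENT, // 3A algorithm running, not yet converged
        IDEAL_PARAMETER,      // 3A algorithm converged; ideal shooting parameters obtained
        PARAMETER_LOCKED      // conventional flow waits for this state before capturing
    }

    // Conventional flow: PARAMETER_ADJUSTMENT -> IDEAL_PARAMETER -> PARAMETER_LOCKED -> capture
    // Claimed flow:      PARAMETER_ADJUSTMENT -> IDEAL_PARAMETER -> (device stable?) -> capture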
  • the first operation may be click, voice control, gesture, etc., which is not limited by this application.
  • the image frame may be the first image in the embodiments described below.
  • the method may further include: the electronic device determines, based on state monitoring data acquired by a gyroscope sensor, whether the electronic device is in the stable state; the state monitoring data includes the angular velocity and/or shaking angle of the electronic device around the x-axis, y-axis, and z-axis.
  • the electronic device can determine the motion status of the electronic device through the status monitoring data obtained by the gyroscope sensor.
  • the camera HAL module in the electronic device can continuously receive status monitoring data from the gyroscope sensor, and determine whether the electronic device is in a stable state based on the status monitoring data.
  • the electronic device can obtain the motion status of the device in real time. There is no need to wait until the 3A state becomes the ideal parameter state and then judge whether the electronic device is in a stable state based on the acquired image. In this way, the electronic device can respond to taking photos faster, improving the user's shooting experience.
  • the state monitoring data includes the angular velocity and/or shaking angle of the electronic device around the x-axis, y-axis, and z-axis.
  • the electronic device can determine whether it is in a stable state by comparing the angular velocity with a preset angular velocity, and/or comparing the shaking angle with a preset angle.
  • For example, if the angular velocity is less than the preset angular velocity, and/or the shaking angle is less than the preset angle, the electronic device is in a stable state.
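  • As an illustration of such a check on Android, the sketch below compares gyroscope samples against preset thresholds. The threshold values, the class name, and the per-sample shake-angle computation are assumptions made for illustration; the patent does not specify them.

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    /** Illustrative stability check from gyroscope data; thresholds are assumed. */
    public class StabilityMonitor implements SensorEventListener {
        private static final float PRESET_ANGULAR_VELOCITY = 0.05f; // rad/s (assumed)
        private static final float PRESET_ANGLE = 0.01f;            // rad (assumed)

        private long lastTimestampNs;
        private volatile boolean stable;

        public void start(SensorManager sm) {
            Sensor gyro = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
            sm.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // event.values[0..2] hold angular velocity around the x, y, z axes (rad/s).
            boolean ok = true;
            if (lastTimestampNs != 0) {
                float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
                for (int i = 0; i < 3; i++) {
                    float angle = Math.abs(event.values[i]) * dt; // per-sample shake angle
                    if (Math.abs(event.values[i]) >= PRESET_ANGULAR_VELOCITY
                            || angle >= PRESET_ANGLE) {
                        ok = false;
                    }
                }
            }
            lastTimestampNs = event.timestamp;
            stable = ok;
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        public boolean isStable() { return stable; }
    }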
  • the method further includes: if the electronic device is not in the stable state, the electronic device receives new state monitoring data obtained by the gyroscope sensor, and re-determines whether the electronic device is in a stable state based on the new state monitoring data.
  • the electronic device can continuously determine its motion status based on the state monitoring data obtained by the gyroscope sensor. It is understandable that if the electronic device is not in a stable state, it keeps judging based on new state monitoring data until it is in a stable state. In this way, the electronic device neither needs to send shooting data through the camera HAL module to obtain an image, nor needs to judge whether it is in a stable state based on that image; it only needs to judge based on the received state monitoring data. This saves time (no shooting data to send, no camera-collected images to upload) and improves the speed at which the electronic device responds to taking photos.
  • before the electronic device acquires image frames based on the ideal shooting parameters, the method further includes: the electronic device acquires N frames of images; if M frames of images among the N frames are 3A stable images, the electronic device is in the stable state, where the ratio of M to N is greater than a first threshold; a 3A stable image is an image obtained by the electronic device when a confidence level is greater than a second threshold; the confidence level is used to measure the stability of the 3A state, and the greater the confidence level, the more stable the 3A state; both M and N are positive integers.
  • when the 3A state is the ideal parameter state, the electronic device can determine the stability of the 3A state based on the proportion of 3A stable images among the acquired images, thereby determining whether the electronic device is in a stable state. It is understandable that in the process of the 3A state changing from the ideal parameter state to the parameter locked state, the electronic device needs to continuously acquire images (here, preview images); only after the acquired images meet a certain threshold can the 3A state change from the ideal parameter state to the parameter locked state, and this process takes a lot of time. Through the above method, the electronic device can obtain the photographed image frame without waiting for the 3A state to change to the parameter locked state, which saves time and allows the user to obtain the photographed image faster.
  • In some embodiments, if the ratio of M to N is greater than the first threshold, the electronic device is in a stable state.
  • In other embodiments, if the number M itself is greater than the first threshold, the electronic device is in a stable state.
  • the method further includes: if M frames of 3A stable images do not exist among the N frames of images, the electronic device continues to acquire L frames of images, and determines whether the electronic device is in the stable state according to the number of 3A stable images included in the L frames of images.
  • That is, if the electronic device determines based on the N frames of images that it is not in a stable state, it can continue to acquire L frames of images, and when the 3A stable images included in the L frames reach a certain number (or the proportion reaches a certain value), determine that the electronic device is in a stable state. In this way, the electronic device can obtain the photographed image frame without waiting for the 3A state to change to the parameter locked state, saving time and allowing the user to obtain the photographed image faster.
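  • A minimal sketch of the M-out-of-N check described above, with illustrative threshold values (the patent leaves the first and second thresholds to be set according to actual needs; all names here are editorial):

    /** Sketch of the N/M stable-frame check; thresholds and names are illustrative. */
    public class StableFrameCounter {
        private static final double FIRST_THRESHOLD = 0.8;  // assumed M/N ratio bound
        private static final double SECOND_THRESHOLD = 0.9; // assumed confidence bound

        /** A frame counts as "3A stable" when its confidence exceeds the second threshold. */
        static boolean isStableFrame(double confidence) {
            return confidence > SECOND_THRESHOLD;
        }

        /** Returns true when the stable frames M out of N frames exceed the first threshold. */
        static boolean deviceStable(double[] frameConfidences) {
            int n = frameConfidences.length;
            int m = 0;
            for (double c : frameConfidences) {
                if (isStableFrame(c)) m++;
            }
            return n > 0 && (double) m / n > FIRST_THRESHOLD;
        }
    }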
  • the method further includes: the electronic device sends a 3A trigger instruction to the hardware abstraction layer (HAL) through the camera application; the 3A trigger instruction is used to trigger the electronic device to execute the 3A algorithm.
  • the electronic device acquiring image frames based on the ideal shooting parameters specifically includes: if the 3A state is the ideal parameter state and the electronic device is in the stable state, the electronic device sends the current 3A state to the camera application through the camera HAL module; when the camera application determines that the current 3A state is the ideal parameter state, the electronic device sends a photographing request to the camera HAL module through the camera application; based on the photographing request, the electronic device sends the ideal shooting parameters to the camera through the camera HAL module, and uses the camera to collect the image frames according to the ideal shooting parameters.
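  • At the public Camera2 API level, the final capture step might look like the following sketch. The camera HAL module's state reporting is vendor-internal, so the idealParameterState and deviceStable parameters here are placeholders for the reported 3A state and the stability judgment, and the class name is illustrative:

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CaptureRequest;
    import android.os.Handler;

    class FastShutterSketch {
        /** Capture as soon as the reported 3A state is ideal and the device is
         *  stable; no wait for a parameter locked state (hedged sketch). */
        static void maybeTakePicture(boolean idealParameterState, boolean deviceStable,
                                     CameraCaptureSession session,
                                     CaptureRequest.Builder stillRequestBuilder,
                                     CameraCaptureSession.CaptureCallback callback,
                                     Handler handler) throws CameraAccessException {
            if (idealParameterState && deviceStable) {
                session.capture(stillRequestBuilder.build(), callback, handler);
            }
        }
    }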
  • the method further includes: the electronic device displays a second interface; the second interface includes a gallery shortcut control; the gallery shortcut control displays a thumbnail of the image frame.
  • the electronic device can save the acquired image taken this time to the gallery application.
  • the gallery shortcut control included in the photo-taking interface can display the thumbnail of the image taken this time.
  • the present application provides an electronic device.
  • the electronic device may include a display, memory, and one or more processors.
  • the memory can store a computer program, and the processor can call the computer program.
  • the display may be used to display a first interface, where the first interface includes a shutter control and is a photographing interface of a camera application; the processor may be used to: in response to a first operation acting on the shutter control, execute the 3A algorithm to adjust the shooting parameters and update the 3A state, where the 3A state is used to represent the execution of the 3A algorithm; if the 3A state is an ideal parameter state and the electronic device is in a stable state, acquire image frames based on ideal shooting parameters, where the ideal parameter state indicates that the 3A algorithm has converged, and the ideal shooting parameters are the shooting parameters obtained by the electronic device when the 3A algorithm converges.
  • before being used to acquire image frames based on the ideal shooting parameters, the processor is also used to: determine, based on state monitoring data obtained by the gyroscope sensor, whether the electronic device is in the stable state; the state monitoring data includes the angular velocity and/or shaking angle of the electronic device around the x-axis, y-axis, and z-axis.
  • after being used to determine whether the electronic device is in the stable state based on the state monitoring data obtained by the gyroscope sensor, the processor is also used to: if the electronic device is not in the stable state, receive new state monitoring data obtained by the gyroscope sensor, and re-determine whether the electronic device is in a stable state based on the new state monitoring data.
  • before being used to acquire image frames based on ideal shooting parameters, the processor is also used to: acquire N frames of images; if M frames of images among the N frames are 3A stable images, the electronic device is in the stable state, where the ratio of M to N is greater than the first threshold; a 3A stable image is an image obtained by the electronic device when the confidence level is greater than the second threshold; the confidence level is used to measure the stability of the 3A state, and the greater the confidence level, the more stable the 3A state; both M and N are positive integers.
  • after being used to acquire N frames of images, the processor is also used to: if M frames of 3A stable images do not exist among the N frames of images, continue to acquire L frames of images, and determine whether the electronic device is in the stable state according to the number of 3A stable images included in the L frames of images.
  • after responding to the first operation acting on the shutter control, the processor is also configured to: send a 3A trigger instruction to the hardware abstraction layer (HAL) through the camera application; the 3A trigger instruction is used to trigger the electronic device to execute the 3A algorithm.
  • the processor being configured to acquire image frames based on the ideal shooting parameters if the 3A state is an ideal parameter state and the electronic device is in a stable state specifically includes: if the 3A state is the ideal parameter state and the electronic device is in the stable state, sending the current 3A state to the camera application through the camera HAL module; when the camera application determines that the current 3A state is the ideal parameter state, issuing a photographing request to the camera HAL module through the camera application; and, based on the photographing request, issuing the ideal shooting parameters to the camera through the camera HAL module.
  • the electronic device may also be provided with a camera. The camera is used to collect the image frames according to the ideal shooting parameters.
  • the display screen may be used to display a second interface; the second interface includes a gallery shortcut control; the gallery shortcut control displays a thumbnail of the image frame.
  • the present application provides a computer storage medium that includes instructions that, when run on an electronic device, cause the electronic device to execute any of the possible implementations of the first aspect.
  • embodiments of the present application provide a chip that is applied to an electronic device.
  • the chip includes one or more processors, and the processor is used to call computer instructions to cause the electronic device to execute any one of the possible implementations of the first aspect.
  • embodiments of the present application provide a computer program product containing instructions, which when the computer program product is run on an electronic device, causes the electronic device to execute any of the possible implementations of the first aspect.
  • the electronic device provided by the second aspect, the computer storage medium provided by the third aspect, the chip provided by the fourth aspect, and the computer program product provided by the fifth aspect are all used to execute any one of the possible implementations of the first aspect. Therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of any possible implementation of the first aspect, which will not be described again here.
  • Figure 1 is a schematic diagram of a shutter delay provided by an embodiment of the present application.
  • Figures 2A-2C are a set of user interfaces provided by embodiments of the present application.
  • Figure 3 is a flow chart of a shooting method provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of the software structure of an electronic device provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of another shutter delay provided by an embodiment of the present application.
  • Reference to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application.
  • The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are they separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
  • the electronic device can be a smartphone, a smart TV, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device, etc.
  • the embodiment of the present application does not place any special restrictions on the specific type of the electronic device.
  • 3A refers to autofocus (AF), automatic exposure (AE), and automatic white balance (AWB).
  • Autofocus is the process of adjusting the focal length of a camera to automatically obtain a clear image.
  • Automatic exposure is the process of automatically adjusting exposure parameters so that the photosensitive device obtains the appropriate exposure.
  • Automatic white balance is the process of automatically adjusting the white balance gain so that the color of the shot is close to the real color of the object.
  • the 3A algorithm is actually implemented at the hardware abstraction layer of electronic devices.
  • the camera application located at the application layer needs to interact with the hardware abstraction layer (HAL) to implement the 3A algorithm.
  • the electronic device not only needs to spend time executing the 3A algorithm, but also needs to spend time completing the interaction between the camera application and the hardware abstraction layer.
  • electronic devices include camera applications.
  • the camera app's shooting interface includes shutter controls.
  • the electronic device can detect user actions on the shutter control.
  • the camera application can generate a 3A trigger command.
  • the 3A trigger instruction is used to trigger the 3A algorithm.
  • the camera application can send the 3A trigger command to the HAL layer.
  • the camera HAL module processes instructions in sequence.
  • when processing the 3A trigger instruction, the camera HAL module can determine that the instruction includes information indicating that the 3A algorithm should be turned on (for example, a TRIGGER_START entry), and provide an interface to implement the 3A algorithm.
  • At this time, the 3A state is the parameter adjustment state.
  • the electronic device continuously adjusts the shooting parameters until the 3A algorithm converges and the ideal parameters are obtained. Once the 3A algorithm converges, the electronic device adjusts the 3A state to the ideal parameter state. The electronic device can acquire multiple frames of images under the ideal parameters. After the 3A state remains stable, the 3A state is adjusted from the ideal parameter state to the parameter locked state. At this time, the camera HAL module can report the current 3A state (i.e., the parameter locked state) to the camera application. After the camera application obtains the current 3A state, it can issue a photographing request. The camera HAL module receives the photographing request sent by the camera application and can then select and output frames.
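  • For contrast, this conventional wait for the parameter locked state can be approximated with the public Camera2 metadata, as in the sketch below. This is only an analogy using public API constants, not the patent's HAL-level implementation; issueStillCapture() is a hypothetical helper.

    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.CaptureResult;
    import android.hardware.camera2.TotalCaptureResult;

    class LockedStateWaitSketch extends CameraCaptureSession.CaptureCallback {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session,
                                       CaptureRequest request, TotalCaptureResult result) {
            Integer af = result.get(CaptureResult.CONTROL_AF_STATE);
            Integer ae = result.get(CaptureResult.CONTROL_AE_STATE);
            // Conventional flow: capture only after the locked states are reported.
            if (af != null && af == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED
                    && ae != null && ae == CaptureResult.CONTROL_AE_STATE_LOCKED) {
                issueStillCapture(); // hypothetical helper that sends the photo request
            }
        }

        void issueStillCapture() { /* send the still-capture request here */ }
    }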
  • the electronic device can generate a 3A cancellation command and send the 3A cancellation command to the HAL layer.
  • the 3A cancel instruction is used to stop execution of the 3A algorithm.
  • After the 3A cancellation instruction is sent to the HAL layer, it is temporarily stored in the buffer and waits for processing.
  • When the camera HAL module processes the 3A cancellation instruction, it determines that the instruction includes information to stop executing the 3A algorithm (for example, a TRIGGER_CANCEL entry), and thereby resets the shooting parameters.
  • the adjusted shooting parameters will be restored to their default values.
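  • On the public Camera2 API side, a cancel instruction carrying a TRIGGER_CANCEL entry corresponds to the CONTROL_AF_TRIGGER_CANCEL value. A minimal sketch (the class and method names are illustrative):

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CameraMetadata;
    import android.hardware.camera2.CaptureRequest;
    import android.os.Handler;

    class AfCancelSketch {
        /** Stop the 3A adjustment; the HAL-level TRIGGER_CANCEL entry maps to this trigger. */
        static void cancelAutofocus(CameraCaptureSession session,
                                    CaptureRequest.Builder builder,
                                    Handler handler) throws CameraAccessException {
            builder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                        CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
            session.capture(builder.build(), null, handler);
            // Return the trigger to idle so later requests do not re-send the cancel.
            builder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                        CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
        }
    }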
  • the shooting parameters mentioned in this application include focus parameters, exposure parameters and white balance parameters.
  • the electronic device can determine whether the electronic device is in a good focus state based on the focus parameters.
  • Focus parameters may include focal length.
  • Exposure parameters can include aperture size, shutter speed, and sensitivity parameters.
  • White balance parameters may include RGB values.
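  • As a rough illustration, the shooting parameters enumerated above could be grouped as follows. The field names and types are assumptions for illustration; the patent does not define such a data structure:

    /** Illustrative grouping of the shooting parameters named above (fields assumed). */
    class ShootingParameters {
        float focalLength;          // focus parameter
        float apertureSize;         // exposure parameter
        long shutterSpeedNs;        // exposure parameter (shutter speed)
        int sensitivityIso;         // exposure parameter (sensitivity)
        float[] whiteBalanceGains;  // white balance parameter (RGB gains)
    }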
  • the shutter delay includes the channel delay, the 3A algorithm execution time, and the data reporting delay.
  • the channel delay is the time the 3A trigger instruction waits for processing after it reaches the HAL layer, i.e., delay 1 shown in Figure 1.
  • the 3A algorithm execution time is the time required to adjust the shooting parameters to the ideal parameters.
  • the data reporting delay is the time spent in the ideal parameter state, i.e., delay 2 shown in Figure 1. This also means that the shooting response time of the electronic device is long and the response speed is slow, which affects the user's shooting experience.
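  • Stated as an equation (an editorial restatement of the decomposition above, with delay 1 and delay 2 as in Figure 1):

    $T_{\mathrm{shutter}} = T_{\mathrm{channel}} + T_{\mathrm{3A}} + T_{\mathrm{report}}$

    where $T_{\mathrm{channel}}$ is delay 1 (channel delay), $T_{\mathrm{3A}}$ is the 3A algorithm execution time, and $T_{\mathrm{report}}$ is delay 2 (data reporting delay).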
  • this application provides a shooting method and related equipment.
  • the electronic device can detect the user operation on the shutter control of the camera application.
  • the camera application can generate a 3A trigger instruction and send the 3A trigger instruction to the camera HAL module.
  • the camera HAL module can provide an interface to implement the 3A algorithm.
  • the 3A status may change.
  • the camera HAL module in the electronic device can directly report the status to the camera application; the camera application then issues a photographing instruction, the camera HAL module selects the frame, and finally the acquired image is displayed on the display screen.
  • This method reduces the shutter delay, reduces the shooting response time of the electronic device, makes the response speed faster, and improves the user's shooting experience.
  • Controls of a graphical user interface (GUI) can include icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and other visual interface elements.
  • FIG. 2A illustrates a user interface 210 of an electronic device.
  • the user interface 210 displays a page on which application icons are placed.
  • the page may include multiple application icons (for example, a weather application icon, a calendar application icon, a photo album application icon, a note application icon, an email application icon, an application store application icon, Set application icon, etc.).
  • Page indicators may also be displayed below the multiple application icons to indicate the positional relationship between the currently displayed page and other pages.
  • There are multiple application icons below the page indicator (e.g., a camera application icon 211, a browser application icon, an information application icon, and a dialing application icon). These app icons remain displayed when switching pages.
  • the camera application icon 211 is an icon of the camera application.
  • the camera application icon 211 can be used to trigger the launch of the camera application.
  • a camera application is an image shooting application on electronic devices such as smartphones and tablets. The present application does not limit the name of that application.
  • the electronic device can detect user operations on the camera application icon 211. In response to the user operation, the electronic device may display the user interface 220 shown in FIG. 2B.
  • the user interface 220 may include: a setting area 221, a preview area 222, a camera mode option 223, a gallery shortcut control 224, a shutter control 225, and a camera flip control 226. Specifically:
  • Settings area 221 may include setting controls and flash controls.
  • the setting controls can be used to adjust the parameters of taking photos (such as resolution, filters, etc.) and turn on or off some methods for taking photos (such as timed photos, smile captures, voice-activated photos, etc.). It can be understood that the setting control can also be used to set more other shooting functions, which is not limited in the embodiments of the present application.
  • the flash control can be used to turn the flash on or off.
  • the preview area 222 can be used to display images captured by the camera in real time.
  • the electronic device can refresh the display content in real time to facilitate the user to preview the image currently collected by the camera.
  • One or more shooting mode options may be displayed in the camera mode option 223.
  • the one or more shooting mode options may include: a night scene mode option, a portrait mode option, a photo mode option, a video mode option and more options.
  • These one or more shooting mode options can be displayed as text information on the interface, such as "night scene", "portrait", "photo", "video recording", and "more".
  • the one or more shooting mode options can also be represented as icons or other forms of interactive elements (IEs) on the interface.
  • the electronic device can further display more other shooting mode options, such as slow-motion shooting mode options, etc., which can display richer camera functions to the user. Not limited to what is shown in Figure 2B, more options may not be displayed in the camera mode options, and the user can browse other shooting mode options by sliding left/right in the camera mode options.
  • the gallery shortcut control 224 can be used to open the gallery application.
  • In response to a user operation (such as a click operation) on the gallery shortcut control 224, the electronic device can open the gallery application.
  • the gallery application is a picture management application on electronic devices such as smartphones and tablets. It can also be called a "photo album". This embodiment does not limit the name of the application.
  • the gallery application can support users to perform various operations on pictures stored on electronic devices, such as browsing, editing, deleting, selecting and other operations.
  • the shutter control 225 can be used to monitor user operations that trigger taking pictures.
  • the electronic device can detect a user operation on the shutter control 225, and in response to the operation, the electronic device can acquire an image through the camera and save it as a picture in the gallery application.
  • the electronic device can also display thumbnails of the saved images in the gallery shortcut control 224 . That is, the user can click the shutter control 225 to trigger taking a picture.
  • the shutter control 225 may be a button or other form of control.
  • Camera flip control 226 may be used to listen for user operations that trigger flipping the camera.
  • the electronic device can detect a user operation, such as a click operation, on the camera flip control 226, and in response to the operation, the electronic device can flip the camera, such as switching a rear camera to a front camera. At this time, the image captured by the front camera may be displayed in the preview area 222 .
  • the electronic device may detect user operations on shutter control 225 .
  • the camera application can generate a 3A trigger command and send the 3A trigger command to the HAL layer.
  • the relevant modules of the HAL layer can process the 3A trigger command and provide an interface for the relevant processing unit of the electronic device to execute the 3A algorithm.
  • the relevant modules of the HAL layer of the electronic device can update the 3A status.
  • when the 3A state is an ideal parameter state, the relevant module of the HAL layer of the electronic device can report the current 3A state to the camera application.
  • the camera application determines that the current 3A status is the ideal parameter status, it can issue a photo request.
  • the electronic device can capture an image through a camera and save the image to a gallery application.
  • the electronic device may display the user interface 230 shown in Figure 2C.
  • User interface 230 includes substantially the same controls as user interface 220. The differences are that the preview image displayed in the preview area 231 changes, and the image displayed in the gallery shortcut control 232 changes. It can be understood that the image displayed by the gallery shortcut control 232 is the image obtained by the electronic device through the camera and saved in the gallery application as described above.
  • a photographing method provided by an embodiment of the present application will be introduced below with reference to FIG. 3 .
  • S301: The electronic device displays the first interface.
  • the first interface is the photo interface of the camera application.
  • the first interface includes shutter controls.
  • the user can click the camera application icon, and in response to the click operation, the electronic device can start the camera application (i.e., the camera application program) and display the first interface.
  • the first interface is the photographing interface of the camera application.
  • the first interface includes shutter controls. Users can take photos by triggering the shutter control.
  • the first interface may be the user interface 220 shown in FIG. 2B.
  • S302: The electronic device detects a user operation on the shutter control, and in response to the user operation, the camera application generates a 3A trigger instruction. The 3A trigger instruction is used to trigger the electronic device to execute the 3A algorithm.
  • the user can click on the shutter control.
  • the electronic device can detect a click operation on the shutter control, and in response to the click operation, the camera application in the electronic device can generate a 3A trigger instruction.
  • the user can also trigger the shutter control through sound, gestures, etc., and this application does not limit this. That is to say, this application does not limit the specific form of user operations.
  • the 3A trigger instruction is used to trigger the electronic device to execute the 3A algorithm.
  • the 3A trigger instruction may include instruction information that triggers the 3A algorithm.
  • the 3A trigger instruction may include startup instruction information that triggers the AF algorithm.
  • For example, the 3A trigger instruction may include ANDROID_CONTROL_AF_TRIGGER_START, which is used to trigger an autofocus scan.
  • the 3A trigger instruction may also include a 3A mode.
  • the 3A mode can include an OFF mode (i.e., a shutdown mode), an AUTO mode (i.e., an automatic mode), etc.
  • In OFF mode, the individual autofocus (AF), automatic exposure (AE), and automatic white balance (AWB) modes are effectively turned off, and none of the shooting controls are overridden by the 3A routines.
  • In AUTO mode, the AF, AE, and AWB modes all run their own independent algorithms and have their own mode, state, and trigger metadata entries.
  • the 3A trigger command may include AF_MODE_AUTO.
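  • At the application framework level, issuing a trigger that carries the TRIGGER_START entry together with the AUTO mode might look like the following sketch using the public Camera2 API; the class and method names are illustrative, and the HAL-level entries named above are the lower-level counterparts of these constants:

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CameraMetadata;
    import android.hardware.camera2.CaptureRequest;
    import android.os.Handler;

    class AfTriggerSketch {
        /** Start an autofocus scan in AUTO mode (public-API analogy of the 3A trigger). */
        static void triggerAutofocus(CameraCaptureSession session,
                                     CaptureRequest.Builder builder,
                                     Handler handler) throws CameraAccessException {
            builder.set(CaptureRequest.CONTROL_AF_MODE,
                        CameraMetadata.CONTROL_AF_MODE_AUTO);    // AUTO mode (AF_MODE_AUTO)
            builder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                        CameraMetadata.CONTROL_AF_TRIGGER_START); // TRIGGER_START entry
            session.capture(builder.build(), null, handler);
        }
    }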
  • S303: The camera application in the electronic device sends the 3A trigger instruction to the HAL layer.
  • the camera application in the electronic device can send the 3A trigger command to the HAL layer of the electronic device so that the relevant modules of the HAL layer can process it.
  • the camera HAL module located at the HAL layer in the electronic device can process the 3A trigger instruction.
  • the camera HAL module processes requests in order. Therefore, after the 3A trigger command is transmitted to the HAL layer, it needs to enter the buffer and queue up to wait for processing by the camera HAL module.
  • S304: The camera HAL module in the electronic device determines that the 3A trigger instruction includes instruction information for triggering the 3A algorithm, and provides an interface to execute the 3A algorithm.
  • the camera HAL module in the electronic device can process related requests sent to the HAL layer in sequence.
  • the camera HAL module can determine that the 3A trigger command includes instruction information for triggering the 3A algorithm, and provide a corresponding interface to execute the 3A algorithm.
  • the camera HAL module can connect the upper-layer software in the camera framework to the underlying camera driver and hardware.
  • the camera subsystem may include implementation in camera pipeline components, such as 3A algorithm processing controls.
  • the camera HAL module provides interfaces for implementing these components.
  • the camera HAL module can provide an interface to support the application framework in triggering the 3A algorithm and then call the kernel layer driver, so that the image signal processor (ISP) can schedule the 3A algorithm library and execute the 3A algorithm to adjust the shooting parameters.
  • the 3A algorithm includes AF algorithm, AE algorithm and AWB algorithm.
  • the shooting parameters mentioned in this application include focus parameters, exposure parameters and white balance parameters.
  • S305: The electronic device updates the 3A state to the parameter adjustment state.
  • the 3A state is used to represent the execution of the 3A algorithm.
  • the camera HAL module in the electronic device can provide an interface for the processing unit (for example, the ISP) to execute the 3A algorithm, and update the 3A state to the parameter adjustment state.
  • Before the 3A algorithm is executed, the shooting parameters are initial values. Once the ISP starts executing the 3A algorithm, the 3A state changes to the parameter adjustment state.
  • the parameter adjustment status indicates that the 3A algorithm is in a running state and has not yet converged.
  • the electronic device can continuously adjust the shooting parameters.
  • In the parameter adjustment state, the metadata entry of the autofocus setting is AF_STATE_PASSIVE_SCAN, the metadata entry of the automatic exposure setting is AE_STATE_SEARCHING, and the metadata entry of the automatic white balance setting is AWB_STATE_SEARCHING.
  • the camera HAL module in the electronic device can update the 3A state from the parameter adjustment state to the ideal parameter state.
  • the ideal parameter state indicates that the 3A algorithm has converged. That is to say, when the 3A state is the ideal parameter state, the electronic device has determined the best shooting parameters, that is, the ideal shooting parameters.
  • In the ideal parameter state, the metadata entry of the autofocus setting is AF_STATE_PASSIVE_FOCUSED, the metadata entry of the automatic exposure setting is AE_STATE_CONVERGED, and the metadata entry of the automatic white balance setting is AWB_STATE_CONVERGED.
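  • Using the public Camera2 metadata, detecting that all three entries have reached the values listed above might look like the sketch below; onIdealParameterState() is a hypothetical hook, and the class name is illustrative:

    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.CaptureResult;
    import android.hardware.camera2.TotalCaptureResult;

    class IdealStateDetector extends CameraCaptureSession.CaptureCallback {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session,
                                       CaptureRequest request, TotalCaptureResult result) {
            Integer af = result.get(CaptureResult.CONTROL_AF_STATE);
            Integer ae = result.get(CaptureResult.CONTROL_AE_STATE);
            Integer awb = result.get(CaptureResult.CONTROL_AWB_STATE);
            boolean ideal =
                    af != null && af == CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED
                    && ae != null && ae == CaptureResult.CONTROL_AE_STATE_CONVERGED
                    && awb != null && awb == CaptureResult.CONTROL_AWB_STATE_CONVERGED;
            if (ideal) {
                onIdealParameterState(); // hypothetical hook: proceed to capture
            }
        }

        void onIdealParameterState() { /* e.g., issue the photographing request */ }
    }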
  • the camera HAL module in the electronic device can report the current 3A state to the camera application to notify the camera application that the 3A algorithm has converged and the shooting parameters have reached the optimum.
  • the electronic device when the 3A state is the ideal parameter state, the electronic device continuously acquires images through the camera. If the confidence level of the continuous multi-frame images obtained by the electronic device through the camera is high under the ideal parameter state, the camera HAL module in the electronic device can report the current 3A state (i.e., the ideal parameter state) to the camera application. That is to say, if under the ideal parameter state, the number of 3A stable image frames acquired by the electronic device through the camera reaches the first threshold, the camera HAL module in the electronic device can report the current 3A state (i.e., the ideal parameter state) to the camera application.
  • the 3A stable image is an image obtained when the confidence level is greater than the second threshold.
  • Confidence is used to measure the stability of the 3A state. It is understandable that the greater the confidence, the more stable the 3A state is.
  • the 3A state may switch back and forth between different states. For example, the 3A state continuously changes from the parameter adjustment state to the ideal parameter state, and then from the ideal parameter state to the parameter adjustment state.
  • the 3A state is stable, the 3A state is basically stable and remains in one state, and will not switch back and forth between different states.
  • the first threshold and the second threshold can be set according to actual needs, and this application does not limit this.
  • In this case, the camera HAL module in the electronic device can report the current 3A state (i.e., the ideal parameter state) to the camera application.
  • the electronic device can determine whether the electronic device is stable through status monitoring data. Among them, the status monitoring data can be used to determine the motion status of the electronic device.
  • the state monitoring data may be the angular velocity and/or shaking angle of the electronic device around three axes (i.e., the x, y, and z axes) sent by the gyroscope sensor.
  • the gyroscope sensor can detect the motion state of the electronic device and send its motion state to the camera HAL module of the electronic device. That is, the status monitoring data may include the motion status of the electronic device.
  • the motion state of the electronic device may include a stable state (i.e., a stationary state) and a shaking state.
  • the gyroscope sensor continuously sends status monitoring data to the camera HAL module. In some embodiments of the present application, after the 3A trigger command is issued to the HAL layer of the electronic device, the gyroscope sensor then sends status monitoring data to the camera HAL module.
  • After determining that the current 3A state is the ideal parameter state, the camera application in the electronic device issues a photographing request, obtains the first image through the camera, and saves the first image.
  • the first image is an image obtained by taking a photo.
  • the camera application in the electronic device can determine that the current 3A state is the ideal parameter state, issue a photographing request, obtain the first image through the camera, and save the first image to the gallery application. It can be understood that after the photographing request is issued, the image acquired by the electronic device (i.e., the first image) is a photographed image, not a preview image.
  • the electronic device displays the second interface.
  • the second interface includes gallery shortcut controls.
  • the gallery shortcut control can display a thumbnail of the first image.
  • the electronic device can display a second interface, and the second interface can include a gallery shortcut control.
  • the electronic device can display the thumbnail of the first image through the gallery shortcut control.
  • the second interface may be the user interface 230 shown in FIG. 2C.
  • FIG. 4 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
  • the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • It can be understood that the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the figures, or some components may be combined, some components may be split, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller can be the nerve center and command center of the electronic device.
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the electronic device can execute the shooting method through the processor 110 .
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device, and can also be used to transmit data between the electronic device and peripheral devices. It can also be used to connect headphones to play audio through them. This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 charges the battery 142, it can also provide power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, internal memory 121, external memory, display screen 194, camera 193, wireless communication module 160, etc.
  • the wireless communication function of the electronic device can be realized through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (Low Noise Amplifier, LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the wireless communication module 160 can provide wireless communication solutions applied to electronic devices, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device can communicate with the network and other devices through wireless communication technology.
  • the electronic device implements display functions through the GPU, display screen 194, and application processor.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), etc.
  • the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • The electronic device can implement the shooting function through the ISP, camera 193, video codec, GPU, display screen 194, and application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the light signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image or video visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • ISP may be used to execute the 3A algorithm.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a Charge Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP for conversion into a digital image or video signal.
  • ISP outputs digital images or video signals to DSP for processing.
  • DSP converts digital images or video signals into standard RGB, YUV and other formats.
  • the electronic device may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the electronic device can use N cameras 193 to acquire images of multiple exposure coefficients.
  • the electronic device can synthesize HDR images through HDR technology based on the images of multiple exposure coefficients.
  • a digital signal processor is used to process digital signals. In addition to processing digital images or video signals, it can also process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic devices may support one or more video codecs. In this way, electronic devices can play or record videos in multiple encoding formats, such as: Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (Neural-Network, NN) computing processor.
  • Intelligent cognitive applications of electronic devices can be realized through NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function. Such as saving music, videos, etc. files in external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the stored program area can store an operating system, at least one application program required for a function (such as a sound playback function, an image video playback function, etc.), etc.
  • the storage data area can store data created during the use of electronic equipment (such as audio data, phone books, etc.).
  • the electronic device can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • Speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • Microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the electronic device may be provided with at least one microphone 170C.
  • the headphone interface 170D is used to connect wired headphones.
  • the sensor module 180 may include one or more sensors, which may be of the same type or different types. It can be understood that the sensor module 180 shown in FIG. 4 is only an exemplary division method, and other division methods are possible, and this application is not limited thereto.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the electronic device detects the strength of the touch operation according to the pressure sensor 180A.
  • the electronic device can also calculate the location of the touch based on the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (ie, x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B can be used for image stabilization.
  • Air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • Magnetic sensor 180D includes a Hall sensor.
  • the electronic device can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • The acceleration sensor 180E can detect the acceleration of the electronic device in various directions (generally along three axes). When the electronic device is stationary, the magnitude and direction of gravity can be detected. The sensor can also be used to identify the posture of the electronic device, and is used in applications such as landscape/portrait switching and pedometers.
  • Distance sensor 180F is used to measure distance.
  • Electronic devices can measure distance via infrared or laser. In some embodiments, when shooting a scene, the electronic device can utilize the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • Electronic devices emit infrared light through light-emitting diodes.
  • Electronic devices use photodiodes to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device can determine that there is an object near it. When insufficient reflected light is detected, the electronic device can determine that there is no object near it.
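The decision rule described above amounts to a simple threshold comparison. A minimal sketch follows; the class, method, and parameter names are hypothetical:

```java
/** Minimal sketch of the proximity decision described above. The names and
 *  threshold here are illustrative assumptions, not a vendor API. */
public final class ProximityLogic {
    private ProximityLogic() {}

    /**
     * @param reflectedIntensity photodiode reading of reflected infrared light
     * @param threshold          minimum intensity considered "sufficient"
     * @return true if an object is judged to be near the device
     */
    public static boolean isObjectNear(float reflectedIntensity, float threshold) {
        // Sufficient reflected light means an object is near; otherwise it is not.
        return reflectedIntensity >= threshold;
    }
}
```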
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device can adaptively adjust the brightness of the display screen 194 based on perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • The ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device is in a pocket, to prevent accidental touches.
  • the ambient light sensor 180L in the electronic device may be used to obtain the ambient brightness and transmit it to the corresponding processing module (eg, the processor 110, etc.).
  • Fingerprint sensor 180H is used to acquire fingerprints.
  • Temperature sensor 180J is used to detect temperature.
  • Touch sensor 180K is also called a "touch panel".
  • The touch sensor 180K can be disposed on the display screen 194.
  • The touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through the display screen 194.
  • The touch sensor 180K may also be disposed on the surface of the electronic device at a location different from that of the display screen 194.
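As a rough illustration of the path from touch detection to touch event type, the following sketch uses the public Android MotionEvent API at the application layer; the listener class itself is an assumption for illustration, not the framework's internal dispatch code:

```java
import android.view.MotionEvent;
import android.view.View;

// Hedged sketch: classifying touch event types once the framework has
// delivered the touch operation to the application layer.
public class TouchClassifier {
    public static View.OnTouchListener listener() {
        return (view, event) -> {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    // A finger touched down on the view.
                    return true;
                case MotionEvent.ACTION_MOVE:
                    // The finger is moving across the view.
                    return true;
                case MotionEvent.ACTION_UP:
                    // The finger lifted: treat it as a tap.
                    view.performClick();
                    return true;
                default:
                    return false;
            }
        };
    }
}
```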
  • Bone conduction sensor 180M can acquire vibration signals.
  • The buttons 190 include a power button, volume buttons, and the like.
  • A button 190 may be a mechanical button or a touch button.
  • The electronic device can receive button input and generate button signal input related to user settings and function control of the electronic device.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • Touch operations on different applications can correspond to different vibration feedback effects.
  • The motor 191 can also produce different vibration feedback effects for touch operations on different areas of the display screen 194.
  • Different application scenarios (such as time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • The touch vibration feedback effect can also be customized.
  • The indicator 192 may be an indicator light, which may be used to indicate charging status and battery level changes, or to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • The SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to bring it into contact with, or separate it from, the electronic device.
  • the electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • Electronic devices interact with the network through SIM cards to implement functions such as calls and data communications.
  • the electronic device uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device and cannot be separated from the electronic device.
  • the software structure of electronic devices can adopt layered architecture, event-driven architecture, microkernel architecture, microservice architecture, or cloud architecture.
  • FIG. 5 is a schematic diagram of the software structure of an electronic device provided by an embodiment of the present application.
  • the software framework of the electronic device involved in this application may include an application layer, an application framework layer (framework, FWK), a system library, an Android runtime, a hardware abstraction layer and a kernel layer (kernel).
  • The application layer can include a series of application packages, such as camera, gallery, calendar, call, WLAN, music, and video applications (also called apps). Among them, the camera is used to acquire images and videos. For other applications of the application layer, refer to the introduction and description in conventional technologies; this application does not elaborate on them.
  • The application on the electronic device may be a native application (for example, an application installed on the electronic device when the operating system is installed before the electronic device leaves the factory), or may be a third-party application (for example, an application downloaded and installed by the user through an application store); the embodiments of this application are not limited in this regard.
  • the application framework layer provides application programming interfaces (Application Programming Interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications.
  • The data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
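To make the view-composition idea concrete, here is a minimal sketch that builds such an interface programmatically from one picture view and one text view, as in the text-message-notification example above; the activity name, icon, and text are assumptions:

```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.ImageView;
import android.widget.LinearLayout;
import android.widget.TextView;

// Hedged sketch: a display interface composed of a picture view and a text view.
public class NotificationItemActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        LinearLayout row = new LinearLayout(this);
        row.setOrientation(LinearLayout.HORIZONTAL);

        ImageView icon = new ImageView(this);          // view for displaying a picture
        icon.setImageResource(android.R.drawable.sym_action_chat);

        TextView text = new TextView(this);            // view for displaying text
        text.setText("New message received");

        row.addView(icon);
        row.addView(text);
        setContentView(row);
    }
}
```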
  • The phone manager is used to provide communication functions of the electronic device, for example, call status management (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • The notification manager can also present notifications that appear in the status bar at the top of the system in the form of graphs or scroll-bar text (such as notifications of applications running in the background), or notifications that appear on the screen in the form of a dialog interface. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or the indicator light flashes.
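As one concrete, hedged illustration of the download-completion notification mentioned above, a minimal sketch using the public Android notification API might look as follows; the channel id, icon, and texts are assumptions:

```java
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.content.Context;

// Hedged sketch: posting a "download complete" notification to the status bar.
public final class DownloadNotifier {
    private DownloadNotifier() {}

    public static void notifyDownloadComplete(Context context, String fileName) {
        NotificationManager nm =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);

        // Channels are required on Android 8.0+; the id and name are assumptions.
        NotificationChannel channel = new NotificationChannel(
                "downloads", "Downloads", NotificationManager.IMPORTANCE_DEFAULT);
        nm.createNotificationChannel(channel);

        Notification notification = new Notification.Builder(context, "downloads")
                .setSmallIcon(android.R.drawable.stat_sys_download_done)
                .setContentTitle("Download complete")
                .setContentText(fileName + " has been saved")
                .build();
        nm.notify(1, notification);
    }
}
```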
  • The Android runtime includes core libraries and a virtual machine. The runtime is responsible for scheduling and management of the system.
  • The core library contains two parts: one part is the function methods that need to be called by the programming language (for example, the Java language), and the other part is the core library of the system.
  • the application layer and application framework layer run in virtual machines.
  • The virtual machine executes the programming files (for example, Java files) of the application layer and the application framework layer by converting them into binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: Surface Manager (Surface Manager), Media Libraries (Media Libraries), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of two-dimensional (2-Dimensional, 2D) and three-dimensional (3-Dimensional, 3D) layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the hardware abstraction layer is the interface layer between the operating system kernel and upper-layer software. Its purpose is to abstract the hardware.
  • the hardware abstraction layer is an abstract interface of the device kernel driver, which is used to provide application programming interfaces for accessing underlying devices to higher-level Java API frameworks.
  • The HAL contains multiple library modules, such as the camera HAL module, display, Bluetooth, and audio modules. Each of these library modules implements an interface for a specific type of hardware component.
  • The Android operating system loads the corresponding library module for the hardware component.
  • the kernel layer is the foundation of the Android operating system, and the final functions of the Android operating system are completed through the kernel layer.
  • the kernel layer at least includes display driver, camera driver, audio driver, sensor driver, and virtual card driver.
  • the following embodiments take the software structure of the electronic device shown in FIG. 5 as an example to specifically describe the technical solution provided by the embodiments of the present application.
  • FIG. 6 is a schematic diagram of a shutter delay provided by an embodiment of the present application.
  • the electronic device may include a camera application.
  • The shooting interface of the camera application includes a shutter control.
  • The electronic device can detect a user operation acting on the shutter control.
  • the camera application can generate a 3A trigger command.
  • the 3A trigger instruction is used to trigger the 3A algorithm.
  • the camera application can send the 3A trigger command to the HAL layer.
  • The camera HAL module processes instructions in sequence. When processing the 3A trigger instruction, it can determine that the instruction includes information indicating that the 3A algorithm should be started (for example, a TRIGGER_START entry), and it provides an interface to execute the 3A algorithm.
  • While the 3A algorithm is being executed, the 3A state is the parameter adjustment state.
  • After the 3A algorithm converges, the electronic device adjusts the 3A state to the ideal parameter state.
  • The camera HAL module can determine whether the electronic device is stable based on the number of 3A-stable image frames and the status monitoring data (for the specific method, refer to the aforementioned embodiments). If the electronic device is in a stable state, the camera HAL module in the electronic device can report the current 3A state (that is, the ideal parameter state) to the camera application. After the camera application obtains the current 3A state, it can issue a photo request. After receiving the photo request, the camera HAL module can perform frame selection and post-processing. It can be understood that the image obtained after post-processing is the image obtained by this photographing.
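The reporting logic just described can be summarized schematically. The sketch below uses hypothetical names throughout (ThreeAState, onFrameProcessed, and reportToCameraApp are not real HAL symbols), and the required stable-frame count is an assumed value:

```java
// Hypothetical rendering of the decision described above, not a real HAL.
enum ThreeAState { PARAMETER_ADJUSTING, IDEAL_PARAMETER, PARAMETER_LOCKED }

final class CameraHalSketch {
    private static final int REQUIRED_STABLE_FRAMES = 3; // assumed value

    private ThreeAState state = ThreeAState.PARAMETER_ADJUSTING;
    private int stableFrameCount = 0;

    /** Called once per preview frame after the 3A statistics are updated. */
    void onFrameProcessed(boolean converged, boolean motionDataStable) {
        if (converged) {
            state = ThreeAState.IDEAL_PARAMETER;
            stableFrameCount = motionDataStable ? stableFrameCount + 1 : 0;
            // Key point of the method: report as soon as the ideal parameter
            // state is reached and the device is stable -- without waiting
            // for the state to become PARAMETER_LOCKED.
            if (stableFrameCount >= REQUIRED_STABLE_FRAMES) {
                reportToCameraApp(state);
            }
        } else {
            state = ThreeAState.PARAMETER_ADJUSTING;
            stableFrameCount = 0;
        }
    }

    private void reportToCameraApp(ThreeAState s) { /* IPC to the application layer */ }
}
```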
  • the camera HAL module can upload this image to the application layer.
  • the electronic device can save it to the gallery application.
  • Afterwards, the electronic device can generate a 3A cancel instruction and send the 3A cancel instruction to the HAL layer.
  • When the camera HAL module processes the 3A cancel instruction, it determines that the instruction includes information indicating that execution of the 3A algorithm should stop (for example, a TRIGGER_CANCEL entry), and thereby resets the shooting parameters.
  • the adjusted shooting parameters will be restored to their default values.
  • The shutter delay includes delay 1 and the execution time of the 3A algorithm.
  • In this method, the camera HAL module in the electronic device does not need to wait for the 3A state to change to the parameter locked state before reporting the current 3A state.
  • This method shortens the shutter delay and the shooting response time of the electronic device, making the response faster and improving the user's shooting experience.
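For readers familiar with the public camera2 API, a loosely analogous application-layer pattern (not the patent's internal HAL interface) is to trigger the still capture as soon as the AF/AE results report convergence, rather than waiting for a locked state; issueStillCapture below is a hypothetical helper:

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;

// Hedged analogy at the application layer: capture once 3A has converged,
// without insisting on the AF state reaching FOCUSED_LOCKED first.
public class EarlyCaptureCallback extends CameraCaptureSession.CaptureCallback {
    private boolean captured = false;

    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        Integer af = result.get(CaptureResult.CONTROL_AF_STATE);
        Integer ae = result.get(CaptureResult.CONTROL_AE_STATE);
        boolean afConverged = af != null
                && (af == CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED
                 || af == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED);
        boolean aeConverged = ae != null
                && ae == CaptureResult.CONTROL_AE_STATE_CONVERGED;
        if (afConverged && aeConverged && !captured) {
            captured = true;
            issueStillCapture(session); // hypothetical helper submitting the photo request
        }
    }

    private void issueStillCapture(CameraCaptureSession session) {
        // Build and submit a still-capture request here (omitted for brevity).
    }
}
```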

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

This application relates to a photographing method and a related device. An electronic device can detect a user operation acting on a shutter control of a camera application, and in response to the user operation, the camera application can generate a 3A trigger instruction and deliver the 3A trigger instruction to a camera HAL module. The camera HAL module can provide an interface for executing a 3A algorithm. While the 3A algorithm is being executed, a 3A state may change. When the 3A state is in an ideal parameter state, the camera HAL module in the electronic device can report the state to the camera application, and the camera application then issues a photographing instruction so as to acquire an image, and stores the image. With this method, a camera HAL module in an electronic device does not need to wait for a 3A state to become a parameter locked state before reporting the current 3A state. In this way, the shutter delay is reduced, so that the photographing response time of the electronic device is shortened, the response speed is increased, and the user's photographing experience is thereby improved.
PCT/CN2022/142907 2022-02-28 2022-12-28 Procédé pour photographier et dispositif associé WO2023160224A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210185836.8A CN116723382B (zh) 2022-02-28 2022-02-28 一种拍摄方法及相关设备
CN202210185836.8 2022-02-28

Publications (2)

Publication Number Publication Date
WO2023160224A1 WO2023160224A1 (fr) 2023-08-31
WO2023160224A9 true WO2023160224A9 (fr) 2023-10-19

Family

ID=87764662

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/142907 WO2023160224A1 (fr) 2022-02-28 2022-12-28 Procédé pour photographier et dispositif associé

Country Status (2)

Country Link
CN (1) CN116723382B (fr)
WO (1) WO2023160224A1 (fr)

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0851571A (ja) * 1994-08-03 1996-02-20 Olympus Optical Co Ltd 電子的撮像装置
KR101692399B1 (ko) * 2010-10-14 2017-01-03 삼성전자주식회사 감성 기반의 영상을 얻을 수 있는 디지털 영상 처리 장치 및 디지털 영상 처리 방법
JP6390002B2 (ja) * 2014-03-10 2018-09-19 パナソニックIpマネジメント株式会社 撮像装置
WO2016059877A1 (fr) * 2014-10-17 2016-04-21 ソニー株式会社 Dispositif de commande, procédé de commande, et dispositif de véhicule de vol
JP6584122B2 (ja) * 2015-04-21 2019-10-02 キヤノン株式会社 画像処理装置及び画像処理方法
CN105827942A (zh) * 2015-09-24 2016-08-03 维沃移动通信有限公司 一种快速拍照方法及电子设备
CN106559615B (zh) * 2015-09-29 2020-04-03 宁波舜宇光电信息有限公司 摄像模组光学防抖系统的校正设备及其校正方法
KR102407624B1 (ko) * 2015-10-06 2022-06-10 삼성전자주식회사 전자 장치의 영상 처리 방법 및 그 전자 장치
CN106686305B (zh) * 2016-12-22 2020-09-15 南昌黑鲨科技有限公司 电子设备的图像处理方法及电子设备
CN108777767A (zh) * 2018-08-22 2018-11-09 Oppo广东移动通信有限公司 拍照方法、装置、终端及计算机可读存储介质
KR20200101180A (ko) * 2019-02-19 2020-08-27 삼성전자주식회사 이미지 안정화를 위한 전자 장치 및 그의 동작 방법
CN112532859B (zh) * 2019-09-18 2022-05-31 华为技术有限公司 视频采集方法和电子设备
JP7379061B2 (ja) * 2019-10-04 2023-11-14 キヤノン株式会社 カメラシステムおよびその制御方法
CN113497879B (zh) * 2020-03-18 2023-04-07 Oppo广东移动通信有限公司 一种拍照补光方法、装置、终端及存储介质
CN111526314B (zh) * 2020-04-24 2022-04-05 荣耀终端有限公司 视频拍摄方法及电子设备
CN113572993B (zh) * 2020-04-27 2022-10-11 华为技术有限公司 一种视频处理方法及移动终端
CN112954201B (zh) * 2021-01-28 2022-09-27 维沃移动通信有限公司 拍摄控制方法、装置和电子设备
CN113114933A (zh) * 2021-03-30 2021-07-13 维沃移动通信有限公司 图像拍摄方法、装置、电子设备和可读存储介质
CN113727016A (zh) * 2021-06-15 2021-11-30 荣耀终端有限公司 一种拍摄方法及电子设备
CN113965693B (zh) * 2021-08-12 2022-12-13 荣耀终端有限公司 一种视频拍摄方法、设备和存储介质
CN113727035B (zh) * 2021-10-15 2023-05-12 Oppo广东移动通信有限公司 图像处理方法、系统、电子设备及存储介质

Also Published As

Publication number Publication date
WO2023160224A1 (fr) 2023-08-31
CN116723382B (zh) 2024-05-03
CN116723382A (zh) 2023-09-08

Similar Documents

Publication Publication Date Title
WO2020168956A1 (fr) Procédé pour photographier la lune, et dispositif électronique
WO2020233553A1 (fr) Procédé de photographie et terminal
WO2021093793A1 (fr) Procédé de capture et dispositif électronique
WO2021052232A1 (fr) Procédé et dispositif de photographie à intervalle de temps
US11669242B2 (en) Screenshot method and electronic device
WO2021147482A1 (fr) Procédé de photographie au téléobjectif et dispositif électronique
US20230007186A1 (en) Video Shooting Method and Electronic Device
KR102577396B1 (ko) 녹화 프레임 레이트 제어 방법 및 관련 장치
WO2021213164A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
WO2021213341A1 (fr) Procédé de photographie vidéo et dispositif électronique
JP7355941B2 (ja) 長焦点シナリオにおける撮影方法および端末
WO2023273323A1 (fr) Procédé de mise au point et dispositif électronique
CN113891009B (zh) 曝光调整方法及相关设备
EP4199499A1 (fr) Procédé de capture d'image, interface graphique utilisateur et dispositif électronique
CN113630558B (zh) 一种摄像曝光方法及电子设备
WO2023056795A1 (fr) Procédé de photographie rapide, dispositif électronique, et support de stockage lisible par ordinateur
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
WO2022267608A1 (fr) Procédé de réglage d'intensité d'exposition et appareil associé
WO2023160230A9 (fr) Procédé photographique et dispositif associé
US20230412929A1 (en) Photographing Method and Related Apparatus
WO2021204103A1 (fr) Procédé de prévisualisation d'images, dispositif électronique et support de stockage
WO2023160224A9 (fr) Procédé pour photographier et dispositif associé
WO2023169237A1 (fr) Procédé de capture d'écran, dispositif électronique, et système
WO2024036998A1 (fr) Procédé d'affichage, support de stockage et dispositif électronique
WO2022228010A1 (fr) Procédé de génération de couverture, et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22928464

Country of ref document: EP

Kind code of ref document: A1